On December 30, Rita Levi-Montalcini, one of the most important scientists in the field of neuroscience, died in Turin. She had been awarded the Nobel Prize in Physiology or Medicine in 1986, together with the American Stanley Cohen, for the discovery of nerve growth factor. This protein, identified through her work beginning in 1947, fulfills an essential function in living beings: it guarantees the survival of certain neurons as they grow and connect with other tissues of the organism during the development of the central nervous system.
Neurons and the brain have always been a mystery to researchers. The geologist and paleontologist Henry Fairfield Osborn said that “the human brain is the most wonderful and mysterious object in the whole universe.” Perhaps the intrigue this organ’s workings arouse among scientists has led other branches of technology to look to neuroscience for inspiration. This is the case with third-generation computing which, as Wang coined the term in 2002, draws on the study of the brain to create so-called “cognitive computing”.
The mixture of nanotechnology, neuroscience and supercomputing has driven the development of the field of artificial intelligence. Dharmendra Modha, head of cognitive computing at IBM, commented on the birth of the new cognitive computers, which, according to the American company, “did not seek to build a brain, but to be inspired by this organ to develop new computing advances”.
After the launch of Watson, within the DeepQA project, it was clear that a new stage was opening in the era of computing. IBM has demonstrated this again: last December it presented the “5 in 5” project (#ibm5in5), with its five predictions for the next five years in the field of cognitive computing.
The idea is simple: to build a computer capable of feeling, adapting and learning, one that resembles the functioning of a human brain more than classic computer designs do. Like the protein discovered by Levi-Montalcini, which helped neurons grow, IBM wants to relaunch the development of computing, making artificial intelligence even more real.
Equipping computers with five senses may seem utopian. But the truth is that if, in the coming years, we manage to give computers the ability to touch, see, smell, taste or hear, we will drive progress in other areas such as medicine, haute cuisine, the textile industry, and even agriculture.
The fact is that current computers remain imperfect systems, unable to manage the full volume of information in their environment. If they could interact with it, they could offer much more accurate answers to the questions at hand.
https://www.youtube.com/watch?v=wXkfrBJqVcQ
From an evolutionary point of view, the human being’s sensory organs are a powerful advantage, one that has undoubtedly allowed our species to colonize every terrestrial habitat. Through sight we relate to the environment, interpreting visible light and perceiving colors with great visual acuity and a good field of vision. As for hearing, other animal species surpass us in efficiency, power and acuity; we are beings of only moderate hearing capacity. The remaining senses are not as developed as these first two, especially in comparison with other animals, yet they are extremely important in our daily routines.
In parallel, just as the senses confer certain evolutionary advantages on humans over other species, new cognitive computers could be adaptively superior to current ones. Greater precision and responsiveness in solving computational problems are, without a doubt, features that the new computing could make real sooner than expected.
We already enjoy touch devices, and with them we can feel and differentiate the texture of different fabrics through distinct vibrations. If we manage to develop computers with a sense of sight, we will achieve better interpretation of medical images and more accurate diagnoses. New hearing devices could distinguish a baby’s cries according to its physiological needs, or even act as sensors to prevent environmental disasters. In the future, thanks to information technology and the decomposition of flavors, we will be able to design personalized diets or achieve far more sustainable agriculture. And by evaluating odors, electronic devices could protect us from possible infections in hospitals, or even warn us when we are about to get sick.
When Rita Levi-Montalcini collected her Nobel Prize in 1986, she surely did not imagine that the computer age would reach this far, expanding through a “growth factor” made of zeros and ones. The arrival of cognitive computers, closer to the human brain in all its extreme complexity, will bear out the Italian scientist’s phrase: “above all, do not fear difficult moments, the best comes from them”. Thanks to the advances in third-generation computing and these new, almost human computers, it is clear that the best is yet to come.
Image / Australia News Limited