In 2008, a robot in Japan walked in response to the brain signals of Idoya, a monkey in North Carolina. This was achieved by the team of Dr Miguel A.L. Nicolelis, a neuroscientist at Duke University, and demonstrated that monkeys can use their thoughts to control a robot.

The Cog and Kismet projects at MIT involve humanoid robots that bring together several sub-fields of human and artificial intelligence to reproduce the behaviour of a two-year-old child. Cog has a head, torso, arms and hands, and a flexible spine, but no legs. Its sensor system feeds an artificial brain built from several interconnected processing components.

British scientists have developed a robot that is controlled by a blob of human brain cells. According to New Scientist, a team at the University of Reading has already used rat brain cells to steer a simple wheeled robot. The aim is to investigate differences in the behaviour of robots controlled by rat neurons and those controlled by human neurons.

According to a report in the Proceedings of the National Academy of Sciences, scientists from the Max Planck Institute for Biochemistry in Munich have created an artificial nerve network interfaced with semiconductor chips.

In 2008, IBM set out to build a brain-like computer under a government-funded collaboration to make circuits that mimic the human brain. Called cognitive computing, the effort brings together neurobiologists, computer scientists, materials scientists and psychologists. The project, funded by DARPA, reverse-engineers the structure, function and behaviour of the brain, according to IBM scientist Dharmendra Modha.

A team at Intel Corp is working on a new technology that will directly interpret words as they are thought, unlike current brain-controlled computing, which requires users to imagine making physical movements to control a cursor on a screen.

A touch screen that senses your hand movements is being developed by scientists at Microsoft Corp's lab in Britain. They have built an interactive touch screen that can see and recognise hand movements and anything near its surface.

How the brain encodes the visual information that enables us to recognise places and scenes has long been a mystery. Now, as reported by the Daily Mail, researchers from the University of Glasgow claim to have decoded brain signals related to vision: they can see how the brain tunes into different brain-wave patterns to encode distinct visual features.

Yishay Mansour, a researcher at Tel Aviv University's Blavatnik School of Computer Science, has developed a program intended to make computers recognise their mistakes, operate faster, predict events and even exhibit a form of emotion. By measuring the difference between the desired outcome and reality, the machine learns a sense of regret and how to minimise it, making fewer mistakes and working more efficiently.
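The article does not name the algorithm, but regret minimisation is a well-studied idea in learning theory: regret is the gap between the learner's cumulative loss and that of the single best fixed action in hindsight. A minimal sketch of the classic multiplicative-weights ("Hedge") method, which keeps this gap small, might look like the following (the loss values and learning rate are purely illustrative, not Mansour's actual program):

```python
import math

def hedge(losses, eta=0.5):
    """Multiplicative-weights learner over a loss table.

    losses[t][i] is the loss (between 0 and 1) of action i at round t.
    Returns (total_loss, regret), where regret compares the learner's
    cumulative expected loss to the best single fixed action in hindsight.
    """
    n = len(losses[0])
    weights = [1.0] * n        # one weight per candidate action
    total_loss = 0.0
    cum = [0.0] * n            # cumulative loss of each fixed action
    for round_losses in losses:
        z = sum(weights)
        probs = [w / z for w in weights]
        # learner's expected loss this round under its current mixture
        total_loss += sum(p * l for p, l in zip(probs, round_losses))
        # exponentially down-weight actions that just incurred loss
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, round_losses)]
        cum = [c + l for c, l in zip(cum, round_losses)]
    best = min(cum)            # best fixed action in hindsight
    return total_loss, total_loss - best
```

The "regret" returned here is exactly the quantity the paragraph describes: the difference between what the machine achieved and what it could have achieved, which the update rule steadily drives down.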

Dr Jagannathan Sarangapani of the Missouri University of Science and Technology has developed a new feedback system that allows robots to operate with minimal supervision. This could eventually lead to autonomous robots that think for themselves, learn, adapt and use an internal critique of their own performance to work unsupervised. Yet they would not have consciousness.
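The report gives no implementation detail, but the core idea of self-critiquing feedback control can be shown in a toy sketch: the robot acts, a critic scores the result, and the controller tunes itself from that score with no human in the loop. Everything below (the integrator plant, gain cap and learning rate) is an invented illustration, not Sarangapani's system:

```python
def adaptive_controller(setpoint, steps=50, lr=0.01):
    """Toy self-tuning feedback loop (illustrative only).

    A proportional controller drives the state toward the setpoint;
    a 'critic' scores each step by squared error, and the gain is
    nudged upward while the critique stays high.
    """
    state, gain = 0.0, 0.1
    for _ in range(steps):
        error = setpoint - state
        state += gain * error                # plant: simple integrator
        critique = (setpoint - state) ** 2   # critic's score (lower is better)
        # adapt: raise the gain while error persists, capped for stability
        gain = min(gain + lr * critique, 1.0)
    return state, gain
```

The critique signal shrinks as tracking improves, so adaptation naturally tapers off: exactly the unsupervised learn-and-adapt behaviour the paragraph describes, in miniature.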

If we can have a brain with a conscience and consciousness, all we need is a vehicle such as a robot to put it to work. Researchers are working on different technologies to communicate between the brain and the robot, as well as between the robot and the external world. Who knows where this search will lead?

Russian billionaire Dmitry Itskov is funding research on the implantation of human minds into everlasting robotic bodies, or cyborgs.


The author contributed some unique stories in the past on subjects such as Intelligent Buildings and Electronics in Concept Cars. Unfortunately, he passed away recently.
