Editorial

Movie Review: Iron Man

Iron Man, starring Robert Downey Jr., is the movie that started it all. Unlike superheroes gifted with superpowers, Tony Stark is a character who developed his own abilities by building a super-suit in a cave with a box of scraps, and it is the Iron Man suit that gives him superhuman abilities. He may be a genius, billionaire, playboy, and philanthropist without the suit, but with it, he can kill powerful supervillains with the snap of a finger (quite literally!).

Replicating an Iron Man suit in reality, however, seems impossible. An armored suit with weaponry might still be feasible, but flight, repulsor beams, and an infinite power source like the arc reactor? Those may never become a reality. Moreover, the suit seems to read Tony's mind: we rarely see him explicitly commanding JARVIS to fly or fire. So how does the suit get its instructions? It probably uses something like a mind-machine interface, in which the suit interprets signals directly from the brain and acts on them. As impossible as it sounds, this technology already exists!

Brain-computer interface (BCI) technology is a computer-based system that acquires brain signals, analyzes them, and translates them into commands relayed to an output device that carries out the desired action. Does that mean we can have telepathic powers like Professor X? Not really: a BCI most certainly does not read minds or extract information from unwilling users. Instead, the user, often after a period of training, generates brain signals encoding an intention, which the BCI decodes and translates into commands for an output device that accomplishes that intention. A typical BCI therefore requires components for signal acquisition, feature extraction, feature translation, and, finally, device output.
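To make those four stages concrete, here is a minimal sketch in Python run on synthetic data. It is not drawn from any real BCI product: the sampling rate, the alpha-band feature, the threshold, and every function name are illustrative assumptions.

import numpy as np

FS = 250  # assumed sampling rate, in Hz

def acquire_signal(n_seconds=2):
    # Stage 1: signal acquisition -- here we simply synthesize two seconds of "EEG":
    # background noise plus a 10 Hz rhythm whose strength stands in for the user's intent.
    t = np.arange(n_seconds * FS) / FS
    intent_strength = np.random.uniform(0.0, 2.0)
    return np.sin(2 * np.pi * 10 * t) * intent_strength + np.random.randn(t.size)

def extract_features(signal):
    # Stage 2: feature extraction -- average power in the 8-12 Hz (alpha) band.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def translate_features(band_power, threshold=2000.0):
    # Stage 3: feature translation -- map the feature onto a discrete command.
    # The threshold is arbitrary and tuned only for this synthetic signal.
    return "MOVE_CURSOR" if band_power > threshold else "IDLE"

def device_output(command):
    # Stage 4: device output -- a real system would drive a cursor, speller, or prosthesis.
    print("Command sent to device:", command)

device_output(translate_features(extract_features(acquire_signal())))

In a real system the translation stage would be a trained decoder rather than a fixed threshold, which is one reason users typically need that period of training before the BCI can reliably recognize their intentions.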

Any type of brain signal could, in principle, be used to control a BCI system. The most commonly studied signals are the electrical signals produced mainly by changes in neuronal postsynaptic membrane polarity, which occur when voltage-gated or ligand-gated ion channels are activated.

In simpler words, neurons, the basic working units of the brain, have three parts: dendrites, which receive signals; a cell body, which processes them; and an axon, which sends signals out. Neurons connect to each other through junctions called synapses. To communicate, a neuron generates an electrical signal known as an action potential, which causes its synapses to release neurotransmitters. These small molecules bind to receptors on the next neuron's dendrites, opening channels that let current flow across that neuron's membrane. When a neuron receives the right combination of synaptic input, it initiates an action potential of its own, and so the transmission continues. These electrical signals can be recorded from the brain.
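As a toy picture of that "fire once the input adds up" behavior, the Python sketch below simulates a leaky integrate-and-fire neuron: synaptic input charges the membrane, and an action potential is emitted once the voltage crosses a threshold. The parameter values are illustrative, not physiological measurements.

import numpy as np

dt = 1.0          # time step, in ms
tau = 20.0        # membrane time constant, ms
v_rest = -70.0    # resting potential, mV
v_thresh = -55.0  # spike threshold, mV
v = v_rest

np.random.seed(0)
synaptic_input = np.random.uniform(0.0, 3.0, size=200)  # random excitatory drive, mV per step

for step, inp in enumerate(synaptic_input):
    # The membrane voltage leaks back toward rest and is pushed up by synaptic input.
    v += dt * (-(v - v_rest) / tau) + inp
    if v >= v_thresh:
        print(f"Action potential at t = {step} ms")
        v = v_rest  # reset after the spike before integrating input again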

These signals can be measured in invasive or non-invasive ways. The main non-invasive method is EEG (electroencephalography), which records signals from the scalp. It is safe, easy, and inexpensive, but a major disadvantage is the loss of important information: the signals are significantly attenuated as they pass through the dura, skull, and scalp. This is where invasive intracranial recordings come in. ECoG (electrocorticography) places an electrode plate in direct contact with the brain's surface to measure the activity of the cerebral cortex; to achieve this, a surgeon must open part of the skull and expose the brain. ECoG grids can record from larger areas of the brain than intracortical microelectrode arrays, which are implanted within the cortex itself, cover smaller regions, and likewise require a craniotomy.
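To see why that attenuation matters, the short Python sketch below treats the skull and scalp as a crude damping, smoothing, and noise stage applied to a simulated cortical signal. The numbers are purely illustrative, but they show how a fast rhythm that is obvious at the cortex all but vanishes by the time it reaches the scalp.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000  # assumed sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
cortical = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 80 * t)  # slow + fast rhythm

# Skull and scalp modeled as attenuation plus a low-pass filter plus background noise.
b, a = butter(4, 40 / (fs / 2), btype="low")
scalp = 0.1 * filtfilt(b, a, cortical) + 0.05 * np.random.randn(t.size)

def band_power(x, lo, hi):
    # Total power between lo and hi Hz, from the signal's spectrum.
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

# The fast 80 Hz rhythm is clear at the cortex but nearly gone at the scalp.
print("80 Hz power at the cortex:", band_power(cortical, 75, 85))
print("80 Hz power at the scalp: ", band_power(scalp, 75, 85))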

The idea of a BCI surfaced in the 1960s, when an experiment with monkeys showed that signals from single cortical neurons could be used to control a meter needle. In 1973, Vidal's Brain-Computer Interface Project set out to evaluate the feasibility of using neuronal signals in a person-computer dialogue, turning the computer into a prosthetic extension of the brain. After several milestones over the years, a BCI system in 2006 enabled a patient to open simulated emails, operate a television, open and close a prosthetic hand, and perform rudimentary actions with a robotic arm. Many organizations are now working on BCIs: the startup NextMind lets you send commands to your computer with your visual attention, and another startup, BrainCo, has developed headbands that track a user's level of focus and engagement. The most talked-about recent project is Elon Musk's Neuralink, an implantable device that would let you control a computer or mobile device for starters, and could eventually be used to control a prosthetic limb or help with medical conditions such as memory loss, depression, seizures, and brain damage.

So the future may not let you mentally command JARVIS to deploy your Iron Man suit, but you might be able to ask Alexa to play Iron Man on the TV with your thoughts.

