Brain implants take a 180-degree turn: it’s all thanks to AI


The innovative system developed by UCLA researchers can help people with paralysis.

When we think of artificial intelligence, the first things that come to mind are conversational bots and virtual assistants: ChatGPT, Siri, Claude, Gemini, Google Assistant, Copilot, Meta AI, Grok and Alexa, to name the most popular.

These tools, as the companies that develop them keep assuring us, are designed to make our lives easier: they can automate repetitive, tedious tasks, analyse large amounts of data, answer (almost) any question and generate content of every kind.

However, the use of AI goes beyond the chatbots of big tech companies. It is being used to make robots ‘smarter’, create autonomous vehicles that do not need a human being behind the wheel, and find out how to live longer.

Artificial intelligence is transforming the way diseases are researched, diagnosed and treated. In assisted medical diagnosis, algorithms can detect tumours, fractures and other abnormalities with great precision, and they can also flag the risk of heart disease, diabetes or even Alzheimer’s early by analysing medical records.

With artificial intelligence, it is possible to offer patients more effective treatments, tailored to each individual after analysing their DNA and genomic data. And, in what sounds like something out of a science fiction film, it can help paralysed patients control artificial or robotic arms with their thoughts. All that is required is a brain interface powered by this technology to give those affected a better quality of life.

Using robotics and AI to give paralysed people back some autonomy

A team of scientists at the University of California, Los Angeles (UCLA), a higher education institution renowned for its scientific research programmes in computer science, engineering, health studies and life sciences, has designed a new portable, non-invasive brain-computer interface (BCI) system that uses artificial intelligence to make life easier for people with physical disabilities.

A brain-computer interface is a system that allows direct communication between the brain and an external device, such as a computer or a prosthesis, without the need to use muscles or the peripheral nervous system. The device records the brain’s electrical activity, processes it with algorithms, and converts it into commands to control the device or improve the natural functions of the nervous system.
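The record-decode-command loop described above can be sketched in a few lines. This is purely illustrative: the function names, the 8-12 Hz "mu band" feature and the hand-set threshold are assumptions for the sake of the example, not the decoding method UCLA used; a real BCI relies on trained decoders rather than a fixed cut-off.

```python
# Illustrative sketch of a BCI loop: record brain activity, extract a
# feature with an algorithm, convert it into a device command.
# Names and thresholds are invented for illustration only.
import numpy as np

def band_power(window: np.ndarray, fs: float, low: float, high: float) -> float:
    """Average spectral power of an EEG window within [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return float(power[mask].mean())

def decode(window: np.ndarray, fs: float = 250.0) -> str:
    """Toy decoder: map mu-band (8-12 Hz) power to a cursor command."""
    mu = band_power(window, fs, 8.0, 12.0)
    return "move" if mu > 1.0 else "rest"

# Simulated one-second EEG window containing a strong 10 Hz rhythm.
fs = 250.0
t = np.arange(0, 1.0, 1.0 / fs)
window = 0.5 * np.sin(2 * np.pi * 10 * t)

print(decode(window, fs))  # strong mu activity -> "move"
```

In practice the "algorithm" step is a machine-learning model trained on each user's recordings, but the overall shape of the pipeline, signal in, command out, is the same.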

In the BCI developed at UCLA, the AI acts as a kind of co-pilot, working alongside paralysed users to help them control a robotic arm or a computer mouse cursor. This innovative system opens the door to other technologies that allow people with physical disabilities to manipulate objects. Jonathan Kao, study leader and associate professor of electrical and computer engineering at UCLA’s Samueli School of Engineering, explained:

By using artificial intelligence to complement brain-computer interface systems, we are seeking much less risky and invasive avenues. Ultimately, we want to develop AI-BCI systems that offer shared autonomy, allowing people with movement disorders, such as paralysis or ALS, to regain some independence for everyday tasks.

The system developed by UCLA is so innovative because, until now, the most advanced BCI devices required risky and expensive neurosurgery, and often the advantages of the technology were overshadowed by how invasive the procedure was.


Portable BCIs, while safer, have often lacked the reliability needed for practical use. The university’s proposal solves this problem by combining an electroencephalography (EEG) cap, which records brain activity, with a camera-based artificial intelligence platform.

The researchers also developed custom algorithms to decode brain signals from the EEG cap. The camera-based AI platform then interprets the user’s intention in real time and guides actions such as moving a computer mouse cursor or a robotic arm.
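One simple way to picture this "shared autonomy" is as a blend: the final movement command mixes the user's decoded intent with a pull toward the goal the AI has inferred. The function below is a hedged sketch of that idea; the blending weight `alpha` and the whole formulation are assumptions for illustration, not UCLA's actual fusion method.

```python
# Illustrative shared-autonomy blend: combine the user's decoded velocity
# with an AI co-pilot's velocity toward its inferred goal.
# alpha and the linear blend are assumptions, not the published method.
import numpy as np

def copilot_blend(decoded_vel, goal_pos, cursor_pos, alpha=0.5, gain=1.0):
    """Return a velocity mixing user intent (weight alpha) with AI assistance."""
    to_goal = np.asarray(goal_pos, float) - np.asarray(cursor_pos, float)
    norm = np.linalg.norm(to_goal)
    ai_vel = gain * to_goal / norm if norm > 0 else np.zeros_like(to_goal)
    return alpha * np.asarray(decoded_vel, float) + (1 - alpha) * ai_vel

# The user decodes a rough rightward push; the AI infers a target up and
# to the right, so the blended command curves toward it.
v = copilot_blend(decoded_vel=[1.0, 0.0], goal_pos=[3.0, 4.0], cursor_pos=[0.0, 0.0])
print(v)  # [0.8 0.4]
```

The appeal of this arrangement is that noisy EEG decoding does not have to be perfect on its own: even a rough, partially correct intent signal becomes usable once the co-pilot nudges it toward a plausible goal.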

The trial involved four participants: three with no motor disabilities and one who was paralysed. In the two-task test, which involved moving a cursor to eight targets and using a robotic arm to move four blocks, all participants completed the tasks much faster with the assistance of artificial intelligence. The paralysed participant completed the robotic arm task in approximately six and a half minutes; without the AI co-pilot, he had not been able to complete it at all.

Johannes Lee, co-lead author of the study and a doctoral candidate in electrical engineering and computer science at UCLA, said that ‘the next steps for AI-BCI systems could include the development of more advanced “co-pilots” that move robotic arms with greater speed and precision, offering a deft touch that adapts to the object the user wants to grasp.’