Friday, December 12, 2025

Teen Builds Working Mind-Controlled Prosthetic In His Garage

While mind-controlled prosthetics still rely on expensive amplifiers and electrode arrays, a 16-year-old has built a functional, thought-driven prosthetic at home, challenging assumptions about who can innovate in neural technology.

Aarav and his innovation

Aarav, a high school student, started exploring brain-computer interfaces while learning robotics and artificial intelligence (AI). “I was fascinated by how the brain could control machines,” he says. “I wanted to find out if it’s possible to build one outside a lab, using tools anyone could access.”

How to capture brain signals

At the heart of the project is open-source hardware from OpenBCI: a 16-channel EEG headset that records neural signals from the frontal lobe, the region of the brain responsible for planning and executing movement. Aarav 3D-printed a custom headset frame so that the electrodes sat closer together over that region.

“I did not want to go for eight channels because they would not collect enough data from the parts of the brain that control movement. At the same time, 32 or 64 channels would be too expensive and complicated for a student project. Sixteen channels gave me a good balance between cost, coverage, and accuracy,” he said.

The headset streams the brain signals to a computer, where they are used to train a machine learning (ML) model. The program classifies the signals into three categories:

  • Open (command to open the robotic hand)
  • Close (command to close it)
  • Neutral (no action)

“When I started, I did not have any EEG datasets that matched my headset. So I recorded my own brain signals for weeks, just thinking of hand movements, over and over, and labelling all that data manually,” he said.
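A classification pipeline like the one Aarav describes could be sketched roughly as below. This is a hypothetical illustration, not his actual code: the use of scikit-learn, the band-power features, and the 250 Hz sampling rate are all assumptions, and random noise stands in for real labelled recordings.

```python
# Hypothetical sketch: classify EEG windows into open / close / neutral
# using per-channel band power as features and an SVM classifier.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

FS = 250        # assumed sampling rate in Hz
CHANNELS = 16   # 16-channel headset
LABELS = {0: "neutral", 1: "open", 2: "close"}

def band_power(window, lo, hi):
    """Average power in the [lo, hi] Hz band for each channel
    of a (channels, samples) EEG window."""
    freqs, psd = welch(window, fs=FS, nperseg=min(256, window.shape[-1]))
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=1)

def features(window):
    """Concatenate mu (8-13 Hz) and beta (13-30 Hz) band power per channel."""
    return np.concatenate([band_power(window, 8, 13),
                           band_power(window, 13, 30)])

# Train on manually labelled 1-second windows (random stand-in data here).
rng = np.random.default_rng(0)
X = np.array([features(rng.standard_normal((CHANNELS, FS))) for _ in range(90)])
y = np.repeat([0, 1, 2], 30)

clf = SVC(kernel="rbf").fit(X, y)

# Classify a fresh window into one of the three commands.
new_window = rng.standard_normal((CHANNELS, FS))
command = LABELS[int(clf.predict([features(new_window)])[0])]
print(command)
```

In practice the features and classifier would be tuned to the recorded data; the point is only the shape of the pipeline: window the raw EEG, extract features, and map each window to one of the three commands.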

Once the model is trained, the classified commands are sent wirelessly to a Raspberry Pi, which acts as the controller for the 3D-printed arm. The Pi drives servo motors connected to the fingers, translating thought into motion in under a second.

Building a working robotic arm

Once the brain signals were classified, the next step was to translate them into movement. First, he designed his robotic arm in Autodesk Fusion 360, using a 3D scan of his own hand as a guide for dimensions. “It was not about making it look real,” he says. “It was about understanding if a simple, open-source mechanism could move in sync with my thoughts.”

The arm was 3D-printed in multiple iterations, each revealing small calibration problems. Some holes for the tendons were too tight; others expanded during printing. “Thermal expansion messes with tolerances,” he explains. “Even a 0.2mm shift can jam the movement.”

Each joint is driven by servo motors, controlled by a Raspberry Pi connected wirelessly to the computer running the BCI program. The setup follows simple logic: if the signal is 1, open the hand. If the signal is 0, close it.
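The Pi-side logic described above could be sketched as follows. This is a hypothetical illustration: the angle values are assumptions, and `set_angle()` is a stub standing in for whatever PWM servo driver the real build uses (e.g. a library such as gpiozero on the Pi).

```python
# Hypothetical sketch of the Pi-side control loop:
# map the classifier's output (1 = open, 0 = close) to servo positions.
OPEN_ANGLE = 0     # assumed fully-open servo position, in degrees
CLOSE_ANGLE = 90   # assumed fully-closed position

def set_angle(angle):
    # Stub: on real hardware this would set the servo's PWM duty cycle.
    print(f"servo -> {angle} deg")
    return angle

def handle_signal(signal):
    """Translate one classified brain signal into a hand command."""
    if signal == 1:
        return set_angle(OPEN_ANGLE)    # thought: open the hand
    elif signal == 0:
        return set_angle(CLOSE_ANGLE)   # thought: close the hand
    return None                         # neutral: no action

# Example stream of classified signals arriving from the computer.
for s in [1, 0, None]:
    handle_signal(s)
```

The tendon design means one servo (or one servo per joint in earlier versions) is enough; the loop simply reacts to each incoming classification as it arrives.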

It is basic but effective. The delay between thought and movement is usually less than a second, which is enough to prove the idea works outside lab conditions.

Not right on the first try

The early prototype used five separate servo motors, one for each finger. Later versions replaced them with a single high-torque motor, using a tendon system to move all fingers simultaneously. “It reduced the cost and wiring complexity,” Aarav says.

He also plans to replace the Raspberry Pi with an ESP32 controller. It costs less than $5, runs on lower power, and includes built-in Wi-Fi. “It is not as powerful, but it is more practical if someone wants to replicate it,” he adds.

Funding and support

Neurotechnology hardware is not cheap: the OpenBCI headset alone costs around US$5,000. Aarav received it through OpenBCI’s Innovator Fellowship, which supports independent projects.

He then raised an additional US$8,000 through small international grants and sponsorships, which he used for 3D printing materials, circuit components, and testing. By comparison, a traditional research-grade BCI system can cost more than US$30,000 in total equipment.

What comes next

Aarav is now testing multi-class support vector machine models that account for more parameters, including left, right, and grip pressure, and experimenting with higher electrode density over the motor cortex. His long-term goal is to make the system modular, so that patients with paralysis of a single limb (monoplegia) can easily adopt the arm and its control system.


Janarthana Krishna Venkatesan
As a tech journalist at EFY, Janarthana Krishna Venkatesan explores the science, strategy, and stories driving the electronics and semiconductor sectors.
