A new chip uses light instead of electricity to run AI calculations. It processes data in trillionths of a second and may reduce the power needed for AI systems.

Researchers at the University of Sydney have developed a nanophotonic chip prototype that performs artificial intelligence calculations using light instead of electricity. The chip processes information with photons rather than electronic signals, and because light propagates through the device at such high speed, each operation completes in trillionths of a second.
The prototype was built at the Sydney Nano Institute. The work explores hardware approaches that could support the computing needs of artificial intelligence systems. Instead of sending electrical signals through circuits, the chip guides light through nanoscale structures built into the device. As light moves through these structures, calculations take place inside the chip.
Researchers say this approach may help address one of the central problems in AI infrastructure: energy use. Data centers that run AI models consume large amounts of electricity and require extensive cooling to keep their silicon processors working.
In this design, the layout of the nanostructures functions like artificial neurons. As light moves through the network, the chip can perform tasks such as pattern recognition and classification.
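A rough way to picture this is as a toy model, not the actual Sydney chip design: light amplitudes entering the chip are mixed by the nanostructure layout (modeled here as hypothetical complex transfer matrices), and photodetectors at the output measure intensity, with the brightest detector indicating the recognized class. All matrix values and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def photonic_layer(amplitudes, transfer_matrix):
    """Propagate complex light amplitudes through one nanostructured layer.

    The fixed transfer matrix stands in for the trained layout of the
    nanostructures, which act like the weights of artificial neurons.
    """
    return transfer_matrix @ amplitudes

def detect(amplitudes):
    """Photodetectors read out intensity |amplitude|^2 per output channel."""
    return np.abs(amplitudes) ** 2

# Encode a tiny 4-pixel "image" as input light amplitudes (illustrative).
x = np.array([0.2, 0.9, 0.1, 0.4], dtype=complex)

# Two cascaded layers with random complex couplings stand in for a
# trained nanostructure layout; a real device would have these fixed
# by design rather than drawn at random.
T1 = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
T2 = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))

intensities = detect(photonic_layer(photonic_layer(x, T1), T2))
predicted_class = int(np.argmax(intensities))  # brightest detector wins
```

The key point the sketch illustrates is that classification happens passively as light traverses the structures, with the detector readout as the only conversion back to electronics.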
The study shows that neural computations can be performed using light instead of electrical signals, an approach that could enable AI accelerators that are faster, smaller, and less power-hungry. More broadly, it points toward computing hardware in which photonics itself forms the processing system for artificial intelligence.
To test the prototype, the research team trained the chip to classify more than 10,000 biomedical images. The dataset included MRI scans of the breast, chest, and abdomen.
Both simulations and laboratory experiments showed that the photonic neural network identified images with an accuracy between 90 percent and 99 percent. Each calculation took place on the picosecond scale, meaning the operations were completed in trillionths of a second as light passed through the nanostructures.
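The reported figures are classification accuracy: the fraction of images whose predicted class matches the ground-truth label. A minimal sketch of that metric, using synthetic labels rather than the study's MRI data (the 5 percent error rate below is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ground truth: 10,000 images across 3 hypothetical classes.
true_labels = rng.integers(0, 3, size=10_000)

# Flip about 5% of predictions to a wrong class to simulate errors.
errors = rng.random(10_000) < 0.05
predicted = np.where(errors, (true_labels + 1) % 3, true_labels)

# Accuracy = fraction of predictions matching the ground truth.
accuracy = float(np.mean(predicted == true_labels))
```

With a 5 percent simulated error rate, the accuracy lands near 95 percent, inside the 90-99 percent range the experiments reported.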
The results suggest that neural network models can be built into nanoscale photonic structures rather than run as software on processors.
Photonic computing could reduce energy demand because photons do not suffer the resistive losses that electrons experience in wires. Less resistance means less waste heat, lowering both energy use and cooling needs compared with electronic chips.
Future work will focus on scaling the design to photonic neural networks that can process larger datasets. If the technology scales, photonic chips could complement or replace conventional processors for some AI workloads.
