The variety of artificial intelligence projects and the range of applications are growing at an alarming rate. A report of scientists growing miniature brains in Petri dishes and using them to perform AI tasks sounds like science fiction. It is not. And it has to be asked: have they not watched Terminator?
The pursuit of designing computers that can mimic human brains has captivated major technology companies worldwide. However, current AI systems are expensive and energy-intensive. Some researchers believe that biological computing could offer a more energy-efficient approach to artificial intelligence.
A team from Indiana University demonstrated this concept by growing miniature brain organoids in the lab and using them to train an AI algorithm capable of recognizing speech. While silicon-based computers excel at processing numbers, human brains still hold an advantage when it comes to complex information processing and understanding. The researchers wanted to explore whether a hybrid system combining biological and electronic processing was possible.
They developed a setup called Brainoware, consisting of a biological element (cerebral organoids) and a technological component (a microelectrode array). Placing the miniature brains on the electrode array allowed the two systems to interact: when electrical impulses were sent to the organoids, they responded by altering their neural connections.
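A rough way to picture this interaction is the stimulate-and-record loop sketched below. This is purely illustrative Python, assuming the organoid can be treated as an unknown nonlinear system driven through the electrode array; the channel count, the random weights, and the function name are invented for the example, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N_ELECTRODES = 32   # assumed channel count, purely for illustration

# Fixed random weights stand in for the organoid's unknown internal wiring.
W = rng.normal(scale=1.0 / np.sqrt(N_ELECTRODES),
               size=(N_ELECTRODES, N_ELECTRODES))

def stimulate_and_record(pulses, state):
    """Send one frame of electrical pulses in, read the evoked activity out.
    The response depends on both the new input and the system's current state."""
    return np.tanh(W @ state + pulses)

# Drive the mock organoid with a short train of random pulse frames.
state = np.zeros(N_ELECTRODES)
for _ in range(10):
    state = stimulate_and_record(rng.normal(size=N_ELECTRODES), state)

print("evoked activity on the first five electrodes:", np.round(state[:5], 3))
```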
To test the system's ability to process data, the team converted 240 recordings of eight people speaking Japanese vowels into electrical signals and fed them through Brainoware. The goal was for the network to identify a specific speaker's voice; it managed only around 30-40% accuracy at first, but over time improved to roughly 70-80%.
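To give a feel for what such a speaker-identification task involves, here is a self-contained sketch on synthetic data. It swaps the organoid for a fixed random nonlinear transform and trains a conventional linear readout, so it is not the study's method, only the general shape of the pipeline: clips become sequences of stimulation frames, the "reservoir" produces a response, and accuracy is measured on held-out clips. Every size, weight, and name below is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SPEAKERS, N_CLIPS, N_FRAMES, N_CHANNELS, N_RESERVOIR = 8, 30, 20, 12, 100

# Synthetic stand-in data: each of the 8 "speakers" gets a characteristic
# bias so that 8 x 30 = 240 clips carry a learnable speaker signature.
speaker_bias = rng.normal(size=(N_SPEAKERS, N_CHANNELS))
X = np.stack([
    speaker_bias[s] + rng.normal(size=(N_FRAMES, N_CHANNELS))
    for s in range(N_SPEAKERS) for _ in range(N_CLIPS)
])
y = np.repeat(np.arange(N_SPEAKERS), N_CLIPS)

# Fixed random weights stand in for the organoid's processing.
W_in = rng.normal(scale=0.5, size=(N_RESERVOIR, N_CHANNELS))
W_res = rng.normal(scale=1.0 / np.sqrt(N_RESERVOIR),
                   size=(N_RESERVOIR, N_RESERVOIR))

def reservoir_features(clip):
    """Run one clip through the fixed nonlinear 'reservoir', frame by frame,
    and return its final state as a feature vector."""
    state = np.zeros(N_RESERVOIR)
    for frame in clip:
        state = np.tanh(W_in @ frame + W_res @ state)
    return state

features = np.array([reservoir_features(clip) for clip in X])

# Train/test split and a least-squares linear readout onto one-hot labels.
idx = rng.permutation(len(y))
train, test = idx[:180], idx[180:]
targets = np.eye(N_SPEAKERS)[y]
readout, *_ = np.linalg.lstsq(features[train], targets[train], rcond=None)

pred = features[test] @ readout
accuracy = np.mean(pred.argmax(axis=1) == y[test])
print(f"held-out speaker-identification accuracy: {accuracy:.0%}")
```

In this sketch the readout is the only part that is trained, which hints at the appeal of the hybrid approach: the heavy nonlinear processing is delegated to the reservoir, whether that reservoir is simulated or biological.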
This unsupervised learning approach demonstrated that the miniature brains were indeed capable of learning: when the cultured cells were treated with a drug that blocks the formation of new synapses, the accuracy did not improve, suggesting the gains depended on the organoids rewiring themselves. However, Brainoware is still in its early stages and much slower than traditional computing methods.
While this research represents a significant proof of concept, many challenges remain before biological computers become a practical reality. If they do, the approach could pave the way for more energy-efficient computing in the future.
Or the end of the world?