
Argonne Is Using AI to Map the Brain’s Connections 

Most people don’t think about just how miraculous the human brain is. This organ contains about 80 billion neurons, each of which is connected to as many as 10,000 other neurons. Mapping the neurons themselves is a challenging endeavor, but trying to understand the connections between them is nothing short of a herculean task.

Thomas Uram, Data Science and Workflows Team Lead
Credit: Argonne National Laboratory

While fully mapping the human brain will take many more years of hard work, scientists at Argonne National Laboratory are laying the foundation for future explorations. The project is led by Argonne's Nicola Ferrier, Senior Computer Scientist in the Mathematics and Computer Science Division.

To learn more about this amazing work, we spoke with Thomas Uram, a Computer Scientist in the Argonne Leadership Computing Facility, who is also working on the project.

“(The brain) is one of the most complex things on the planet,” Uram said. “It’s certainly the most complex thing in our bodies, and we don’t totally understand how it works. What we’re trying to do is reconstruct its structure and connectivity.”

While Uram’s interest in this work stems from a desire to uncover the unknown, there are also important practical incentives to understanding the brain’s connections. Learning more could help researchers better understand human behavior, as well as provide insight into neurodegenerative diseases.

A Cubic Centimeter of Brain

Research that maps the connections within an organism’s nervous system falls under the umbrella of connectomics. Considering the complexity of the brain’s structure, the connectomics study Uram and his colleagues are pursuing focuses on samples of brain tissue that are a cubic millimeter in size.

The samples are prepared by cutting thousands of 30-nanometer-thick slices from residual human brain tissue removed during surgery. The scientists then mount the slices on a tape that is sent off to be imaged by an electron microscope. Each section is imaged individually as a collection of tiles and then reassembled into a larger section.

Once these sections are fully reconstructed, they are aligned with their neighbors so that features within them match up. Then, the researchers use a neural network to trace objects through that stack of images. Specifically, Uram said the team uses a neural network developed by Google, called a Flood-Filling Network (FFN), to do the reconstruction.
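At a high level, that pipeline runs stitch, align, segment, in that order. The Python sketch below strings those stages together on toy data; the function bodies are placeholders of my own (the article does not describe the team's actual stitching, alignment, or FFN code), so treat it as the shape of the workflow rather than the real thing.

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Placeholder: assemble a grid of image tiles into one 2D section.
    Real stitching also handles tile overlap and distortion correction."""
    rows, cols = grid_shape
    strips = [np.concatenate(tiles[r * cols:(r + 1) * cols], axis=1) for r in range(rows)]
    return np.concatenate(strips, axis=0)

def align_to_neighbor(section, reference):
    """Placeholder: register a section to the one above it.
    Real pipelines compute an actual transform; this sketch is a no-op."""
    return section

def segment_volume(volume):
    """Placeholder for the flood-filling network (FFN) segmentation step."""
    return np.zeros(volume.shape, dtype=np.uint32)  # per-voxel neuron IDs

# Toy data: 4 sections, each imaged as a 2x2 grid of 512x512 tiles
tiles_per_section = [[np.random.rand(512, 512) for _ in range(4)] for _ in range(4)]

sections = [stitch_tiles(t, grid_shape=(2, 2)) for t in tiles_per_section]
aligned = [sections[0]] + [align_to_neighbor(s, r) for s, r in zip(sections[1:], sections)]
volume = np.stack(aligned)              # z, y, x image stack
segmentation = segment_volume(volume)   # per-voxel neuron IDs
```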

With 80 billion neurons, each having as many as 10,000 connections to other neurons, mapping the connectomics of the brain is an extremely difficult task.

FFNs are a specialized type of convolutional neural network (CNN) designed for neuron segmentation in connectomics: distinguishing neurons from other objects in electron microscopy images. CNNs are widely used for image tasks, such as separating an object from its background (a cow from a field, for example), generating captions that describe the objects in an image, or even generating new images.

FFNs apply this same CNN approach to separate neurons from one another and from the other structures found in brain tissue. A main part of the challenge is identifying the many neurons packed into even a small tissue volume.
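The "flood-filling" in the name refers to how the network grows one object at a time: starting from a seed point, it predicts which nearby voxels belong to the same object, then moves its field of view outward and repeats. The toy Python sketch below imitates that region-growing loop with a simple brightness threshold standing in for the CNN; it is an illustration of the idea only, not Google's FFN implementation.

```python
import numpy as np
from collections import deque

def predict_mask(fov):
    """Stand-in for the FFN's CNN: here, a simple brightness threshold.
    The real network predicts a per-voxel object-probability map."""
    return fov > 0.5

def flood_fill_segment(volume, seed, r=2):
    """Grow one object outward from a seed voxel, flood-fill style."""
    segment = np.zeros(volume.shape, dtype=bool)
    queue, visited = deque([seed]), {seed}
    while queue:
        z, y, x = queue.popleft()
        # Skip fields of view that would fall outside the volume
        if not (r <= z < volume.shape[0] - r and
                r <= y < volume.shape[1] - r and
                r <= x < volume.shape[2] - r):
            continue
        fov = volume[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1]
        mask = predict_mask(fov)
        if not mask[r, r, r]:
            continue  # the center voxel is not part of this object
        segment[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1] |= mask
        # Re-center the field of view on each face of the current one
        for dz, dy, dx in [(r, 0, 0), (-r, 0, 0), (0, r, 0),
                           (0, -r, 0), (0, 0, r), (0, 0, -r)]:
            nxt = (z + dz, y + dy, x + dx)
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return segment

# Toy volume: one bright blob in a dark background
vol = np.zeros((20, 20, 20))
vol[5:15, 5:15, 5:15] = 1.0
obj = flood_fill_segment(vol, seed=(10, 10, 10))
print(obj.sum(), "voxels assigned to the object")
```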

Even with such a relatively small sample, studying every connection is a major computational challenge.

That cubic millimeter of tissue, imaged at a lateral resolution of four nanometers, generates about two petabytes of data. As Uram explained, that’s a huge problem, even for the most powerful machines we currently have.
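The two-petabyte figure is roughly what the voxel arithmetic gives. Assuming 4-nanometer pixels, 30-nanometer sections, and about one byte stored per voxel (the byte-per-voxel figure is my assumption, not the article's):

```python
# Rough data-volume estimate for 1 mm^3 of tissue imaged at 4 nm laterally
# with 30 nm sections, assuming about one byte per voxel.
mm = 1e-3                      # meters
pixel = 4e-9                   # 4 nm lateral resolution
slice_thickness = 30e-9        # 30 nm sections

pixels_per_section = (mm / pixel) ** 2       # ~6.25e10 pixels
sections = mm / slice_thickness              # ~33,000 sections
voxels = pixels_per_section * sections       # ~2.1e15 voxels
petabytes = voxels / 1e15                    # ~2 PB at 1 byte per voxel
print(f"{petabytes:.1f} PB per cubic millimeter")
```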

With the neural networks the lab currently uses to segment objects, Uram and his colleagues could segment a cubic millimeter of tissue in a few days using the entirety of Aurora’s computing power. And the problem grows dramatically as the scientists look to scale up this research.

“Looking into the future, if we wanted to reconstruct a whole mouse brain – that’s a cubic centimeter of data,” Uram said. “It's a thousand times as much data. That would take about 3,000 days on all of Aurora. That’s pushing like 9 or 10 years, on all of Aurora. We won't have access to all of Aurora for ten years straight. So clearly, we need much more computing than we have available now.”

What’s even more striking is how much computing power would be needed to map an entire human brain. Uram noted that a human brain is roughly 1,000 times larger than a mouse brain’s single cubic centimeter. That multiplies the compute requirement by another factor of 1,000 and would require all of Aurora’s resources for about 3 million days straight.
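Those numbers follow from simple volume scaling, taking "a few days" per cubic millimeter on all of Aurora as the baseline (three days is my stand-in for "a few"):

```python
# Scaling segmentation time with tissue volume, using the article's figures.
days_per_mm3 = 3               # "a few days" for 1 mm^3 on all of Aurora
mouse_brain_mm3 = 1_000        # a mouse brain is roughly 1 cm^3
human_vs_mouse = 1_000         # a human brain is ~1,000x a mouse brain

mouse_days = days_per_mm3 * mouse_brain_mm3   # ~3,000 days (roughly 8-10 years)
human_days = mouse_days * human_vs_mouse      # ~3,000,000 days
print(mouse_days, "days for a mouse brain;", human_days, "days for a human brain")
```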

What the Future Holds

Clearly, using all the resources of one of the most powerful computers in the world for 3 million days straight isn’t possible. Uram admitted that we’ll need to create more powerful machines before we can even begin to seriously consider mapping an entire human brain.

The technology may not be ready to map the connections of an entire human brain, but the work done by Uram and his colleagues is laying the foundation for this future work.

However, he also pointed out that the solution isn’t simply to build machines that are 3 million times larger than what we currently have.

“More likely is that we will see significant advances in the tech that we're using,” Uram said. “If we can significantly speed up the neural network in terms of segmentation, then I think we could do much better on the machines that we currently have and that we expect to have in the next generation or two of machines.”

Uram mentioned that, at this point, most people are familiar with the kinds of errors produced by models like ChatGPT. Similar errors arise when scientists try to segment finely structured neurons, and the resulting output requires extensive human proofreading.

He specifically mentioned a different project that worked to map a fly's brain. These researchers estimated that the human time involved in correcting the fly segmentation is on the order of thousands of hours.

Beyond cutting down the time spent on human proofreading, the scientists also have a storage problem to solve. Right now, the researchers are working with petabytes of data from their cubic-millimeter brain samples. For the larger work they want to do, the storage requirements would extend well beyond exabytes of data. Exactly how that data is stored and moved around will demand new innovations.
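The storage arithmetic scales the same way as the compute, starting from the roughly two petabytes per cubic millimeter mentioned earlier:

```python
# Storage scaling based on the ~2 PB per mm^3 figure from the article.
pb_per_mm3 = 2
mouse_pb = pb_per_mm3 * 1_000    # ~2,000 PB, i.e. ~2 exabytes for a cm^3
human_pb = mouse_pb * 1_000      # ~2,000,000 PB, well beyond exabytes
print(f"mouse brain ~{mouse_pb / 1_000:.0f} EB, human brain ~{human_pb / 1_000:.0f} EB")
```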

This is clearly a difficult task, and progressing toward the mapping of a full human brain will only present more obstacles. However, Uram seems up for the challenge.

“I have always been interested in the big questions of life,” Uram said. “How the brain works is a complex and vexing question. It’s a great unknown.”

AIwire