
If physicians and bioscientists want to take a detailed look inside our bodies, they can use a variety of imaging technologies. They can find out how big our liver is, whether it changes when we take medication, whether the kidney is inflamed, or whether a tumor is present in the brain and has already metastasized. However, these analyses are very time-consuming, given that a great deal of data has to be sifted through and interpreted.

“The analysis of three-dimensional imaging processes is very complicated,” explains Oliver Schoppe from the Technical University of Munich. In collaboration with an interdisciplinary research team, he has developed self-learning algorithms that can help analyze biological imaging data and significantly simplify the evaluation process in the future. These artificial neural networks take only seconds to assess whole-body scans of mice. Instead of displaying organs in various shades of gray, they display them in color and can also segment them.

At the heart of the AIMOS (AI-based Mouse Organ Segmentation) software lie artificial neural networks that are capable of learning, just like the human brain. “You used to have to tell computer programs exactly what you wanted them to do,” Schoppe explains. “Neural networks don’t need such instructions. It’s sufficient to train them by presenting a problem and a solution multiple times. Gradually, the algorithms start to recognize the relevant patterns and are able to find the right solutions themselves.”
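This “problem plus solution, many times over” idea is standard supervised training. The following minimal sketch, with placeholder data and a deliberately tiny stand-in model written in PyTorch, is only an illustration of that principle and not the actual AIMOS code or architecture:

```python
# Minimal sketch of supervised training from (scan, annotation) pairs.
# Synthetic placeholder data; the real AIMOS network and training setup differ.
import torch
import torch.nn as nn

n_organs = 5  # e.g. stomach, kidney, liver, spleen, brain

# Toy 3D "scans" (1 channel) and voxel-wise organ labels (0 = background).
scans = torch.rand(10, 1, 32, 32, 32)
labels = torch.randint(0, n_organs + 1, (10, 32, 32, 32))

# A tiny stand-in for a segmentation network: one score per class per voxel.
model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv3d(8, n_organs + 1, kernel_size=3, padding=1),
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# "Presenting a problem and a solution multiple times":
for epoch in range(5):
    optimizer.zero_grad()
    prediction = model(scans)           # (batch, classes, D, H, W)
    loss = loss_fn(prediction, labels)  # compare with the expert annotation
    loss.backward()                     # nudge the weights toward the solution
    optimizer.step()
```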

Training self-learning algorithms

The algorithms in the AIMOS project were trained with images of mice. They had to learn to assign the pixels from 3D whole-body scans to specific organs, such as the stomach, kidney, liver, spleen or brain. Based on that mapping, the program can pinpoint the exact location and shape of each organ.
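Once every voxel carries an organ label, location and shape follow directly from the labeled volume. The short sketch below assumes a hypothetical NumPy label map rather than the actual AIMOS output format:

```python
# Sketch: derive location and shape of each organ from a voxel-wise label map.
# `label_volume` is a hypothetical 3D array of integer organ IDs (0 = background).
import numpy as np

label_volume = np.random.randint(0, 6, size=(64, 64, 64))  # placeholder data
organ_names = {1: "stomach", 2: "kidney", 3: "liver", 4: "spleen", 5: "brain"}

for organ_id, name in organ_names.items():
    mask = label_volume == organ_id    # binary mask of this organ
    voxels = np.argwhere(mask)         # coordinates of all its voxels
    if voxels.size == 0:
        continue
    centroid = voxels.mean(axis=0)     # location of the organ
    bbox_min, bbox_max = voxels.min(axis=0), voxels.max(axis=0)  # rough extent
    volume = mask.sum()                # size in voxels
    print(f"{name}: centroid={centroid.round(1)}, "
          f"bbox={bbox_min}-{bbox_max}, volume={volume} voxels")
```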

“We were lucky enough to have access to several hundred images of mice from a different research project, all of which had already been interpreted by two biologists,” Schoppe recalls. What’s more, the researchers used 3D fluorescence microscopy scans from the Institute of Tissue Engineering and Regenerative Medicine at the Helmholtz Center Munich.

Using a special technique, the researchers were able to completely remove the color from mice that had already died, leaving their bodies transparent. “The transparent bodies could be scanned step by step and layer by layer with a microscope.”

The distances between these measuring points were only six micrometers, which corresponds to the size of a cell. Biologists had also pinpointed the organs in these data sets.

Greater accuracy thanks to artificial intelligence

The new algorithms learned faster than expected, Schoppe reported after a presentation at TranslaTUM. “We only needed around ten whole-body scans before the software was able to successfully analyze the imaging data on its own – and within a matter of seconds. It takes a human hours to do this.”

The researchers then verified the reliability of the AI using another 200 whole-body scans of mice. “The result shows that self-learning algorithms are not only faster at analyzing biological image data than humans, but also more accurate,” Prof. Bjoern Menze sums up. He is head of the Image-Based Biomedical Modeling Group at TranslaTUM at the Technical University of Munich (TUM).
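A common way to quantify such agreement between an algorithm’s segmentation and a human annotation is the Dice overlap score (1.0 means perfect agreement). The sketch below uses placeholder masks and is only an illustration of the metric, not the study’s exact evaluation protocol:

```python
# Sketch: comparing an algorithm's segmentation with a human annotation.
import numpy as np

def dice_score(prediction: np.ndarray, reference: np.ndarray) -> float:
    """Dice overlap between two binary masks of the same organ."""
    intersection = np.logical_and(prediction, reference).sum()
    total = prediction.sum() + reference.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Placeholder masks standing in for AI output and a biologist's annotation.
ai_mask = np.zeros((64, 64, 64), dtype=bool)
human_mask = np.zeros((64, 64, 64), dtype=bool)
ai_mask[20:40, 20:40, 20:40] = True
human_mask[22:42, 20:40, 20:40] = True

print(f"Dice overlap: {dice_score(ai_mask, human_mask):.2f}")
```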

“Saves a lot of time”

The intelligent software will be used primarily in basic research in the future. “Images of mice are vital, for example, for investigating the effects of new medications before they are given to humans. Using self-learning algorithms for this analysis will save a lot of time in the future,” Menze emphasizes.

The researchers received support for their project from the German Federal Ministry of Education and Research (BMBF) as part of the Software Campus Initiative, from the German Research Foundation (DFG) through the Munich Cluster for Systems Neurology (SyNergy) Cluster of Excellence and a research grant, and from the TUM Institute for Advanced Study with funding from the Excellence Initiative and the European Union. Their work, which was published in the academic journal Nature Communications, was also funded by the Fritz Thyssen Foundation and supported by NVIDIA through its GPU Grant Program.