4D-Brains

Extracting Activity from Large 4D Whole-Brain Image Datasets

Started: July 1, 2021
Status: Completed

Abstract

Whole-brain recordings hold promise to revolutionize neuroscience. In the last decade, innovations in fast 3D microscopy, protein engineering, genetics, and microfluidics have allowed brain researchers to read out calcium activity at high temporal resolution from a large number of neurons in the brains of Caenorhabditis elegans, Danionella translucida, Hydra, and zebrafish simultaneously. This technology is considered to be a game changer for neuroscience as it leaves far fewer variables hidden than when only a small fraction of neuronal activities could be recorded. Many fundamental and challenging questions of neuroscience can now be pursued. For example: What global brain activity determines an organism's responses to stimuli? How are decisions computed by networks of neurons? What is the idle activity of an unstimulated brain?

The field suffers from a critical bottleneck. Neuronal activities are recorded as local intensity changes in 4D microscopy images. Extracting this information for a moving animal is very labor-intensive and requires expertise. The promise of whole-brain recordings cannot be fully realized unless the image analysis problem is solved.

There are several challenges: A) 3D images are generally difficult to annotate manually. B) The worm moves, rotates, bends, and compresses quickly. C) To avoid motion blur, the exposure time must be kept short, which limits image quality. D) The resolution in the z-direction is low.

People

Collaborators

SDSC Team:
Corinne Jones
Isinsu Katircioglu
Benjamin Béjar Haro
Guillaume Obozinski

PI | Partners:

EPFL, Laboratory of the Physics of Biological Systems:

  • Prof. Sahand Rahi
  • Dr. Elif Gençtürk
  • Alice Gross
  • Mahsa Barzegarkeshteli
  • Matthieu Schmidt

Motivation

Tracking cells and freely moving animals in timelapse recordings is critical to many areas of biology but continues to involve time-consuming manual labor. In our specific application in C. elegans neuroscience, novel genetically encoded calcium indicators and 4D microscopy techniques would make it possible to record neuronal activity at single-neuron resolution in the brains of naturally behaving C. elegans, provided the imaged neurons could be reliably segmented and tracked. While this 3D visualization technology opens up new avenues for investigating brain computations, segmenting and tracking imaged neurons over time (see Figure 1) is extremely challenging. The goals of the collaboration are to identify specific neurons across 4D images (segmentation and tracking), to map every pixel in 4D images onto a 3D reference (registration), and to speed up these tasks enough to provide real-time feedback to the animal.

Proposed Approach / Solution

In this project, we present an approach based on graph matching for tracking fluorescently marked neurons in freely moving C. elegans. Each frame is represented by a complete graph whose nodes are the somas and (parts of) neurites. Our algorithm learns to use the node features (the locations and shapes of the objects) and the edge features (the distances between objects) to match the nodes in each frame with the nodes of a reference frame. Because neurites (and sometimes neurons) can be oversegmented during preprocessing, the algorithm allows several segments to match the same reference neuron or neurite. In addition, we propose to leverage generative deep learning models to create realistic synthetic datasets that are annotated automatically: starting from a single manually annotated image of C. elegans, we deform the image non-linearly to produce a large number of images of the same worm in different anatomical conformations, while preserving the image texture and fluorescence and transferring the annotations. Simplified sketches of both ideas are given below.
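As a rough illustration of the matching step only, the sketch below builds a node cost from 3D centroid positions and an edge cost from each object's profile of pairwise distances within its own frame, then solves a linear assignment. The function name match_to_reference, the fixed weight alpha, and the one-to-one assignment are simplifications introduced here: the actual algorithm learns how to weight node and edge features and allows several segments to map to the same reference object.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_to_reference(ref_xyz, frame_xyz, alpha=0.5):
    """Match detected objects in one frame to objects in the reference frame.

    ref_xyz   : (M, 3) array of reference centroids (somas / neurite pieces)
    frame_xyz : (N, 3) array of centroids detected in the current frame
    alpha     : weight between the node cost and the edge (pairwise-distance) cost
    Returns a list of (frame_index, reference_index) pairs.
    """
    # Node cost: Euclidean distance between each detection and each reference object.
    node_cost = cdist(frame_xyz, ref_xyz)

    # Edge cost: compare each object's sorted vector of distances to the other
    # objects in its own graph, a rough, deformation-tolerant structural signature.
    ref_sig = np.sort(cdist(ref_xyz, ref_xyz), axis=1)
    frame_sig = np.sort(cdist(frame_xyz, frame_xyz), axis=1)
    k = min(ref_sig.shape[1], frame_sig.shape[1])
    edge_cost = cdist(frame_sig[:, :k], ref_sig[:, :k])

    cost = alpha * node_cost + (1.0 - alpha) * edge_cost
    rows, cols = linear_sum_assignment(cost)  # one-to-one assignment (simplification)
    return list(zip(rows.tolist(), cols.tolist()))
```

The synthetic-data idea can likewise be illustrated with a generic elastic warp that applies one smooth random displacement field to an annotated volume and to its label map, so the annotations travel with the deformed image. This is only a stand-in for the project's learned, worm-specific deformations; the function elastic_deform and its parameters (max_shift, smoothness) are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(volume, labels, max_shift=5.0, smoothness=8.0, seed=0):
    """Warp an image volume and its integer label map with the same random smooth field."""
    rng = np.random.default_rng(seed)
    grid = np.meshgrid(*[np.arange(s) for s in volume.shape], indexing="ij")
    warped_coords = []
    for axis_coords in grid:
        # Smooth random displacement along this axis, bounded by max_shift voxels.
        field = gaussian_filter(rng.standard_normal(volume.shape), smoothness)
        field *= max_shift / (np.abs(field).max() + 1e-8)
        warped_coords.append(axis_coords + field)
    warped_image = map_coordinates(volume, warped_coords, order=1)   # interpolate intensities
    warped_labels = map_coordinates(labels, warped_coords, order=0)  # nearest-neighbour labels
    return warped_image, warped_labels
```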

Impact

Efficient image analysis techniques would reduce the burden of manual annotation and unleash the growth of the field. Faster image analysis means that more, and more diverse, experiments can be performed and more animals can be analyzed, making results more statistically rigorous. Moreover, high-throughput neuroscience with freely moving animals will become possible, and new questions will become accessible: for example, individual differences between animals could be studied in a statistically rigorous way.

Figure 1: We aim to identify pieces of neurons or whole neurons in 3D images and track them in time. This can be done by mapping 3D images from different time points onto the same reference 3D image.

Annexe

Publications

  • C. F. Park, M. B. Keshteli, K. Korchagina, A. Delrocq, V. Susoy, C. L. Jones, A. D. T. Samuel, S. J. Rahi. Automated neuron tracking inside moving and deforming C. elegans using deep learning and targeted augmentation. In Nature Methods, 21, 142–149 (2024).
  • C. Jones, M. Barzegar-Keshteli, A. Gross, G. Obozinski, S. J. Rahi. A Graph Matching Approach to Tracking Neurons in Freely-Moving C. elegans. In bioRxiv 2023.11.30.569341 (2023).

Additional resources

Bibliography

  1. K. Chaitanya, N. Karani, C. F. Baumgartner, E. Erdil, A. Becker, O. Donati and E. Konukoglu. Semi-supervised task-driven data augmentation for medical image segmentation. In Medical Image Analysis, vol. 68, p. 101934, 2021.
  2. S. Chaudhary, S. A. Lee, Y. Li, D. S. Patel, and H. Lu. Graphical-model framework for automated annotation of cell identities in dense cellular images. In eLife 10:e60321, Feb. 2021.
  3. S. Kato, H. S. Kaplan, T. Schrödel, S. Skora, T. H. Lindsay, E. Yemini, S. Lockery, and M. Zimmer. Global brain dynamics embed the motor command sequence of Caenorhabditis elegans. In Cell, 163 (3):656–669, Oct. 2015.
  4. J. Mu, S. De Mello, Z. Yu, N. Vasconcelos, X. Wang, J. Kautz and S. Liu. CoordGAN: Self-Supervised Dense Correspondences Emerge from GANs. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022.
  5. J. P. Nguyen, A. N. Linder, G. S. Plummer, J. W. Shaevitz, and A. M. Leifer. Automatically tracking neurons in a moving and deforming brain. In PLOS Computational Biology, 13(5):1–19, May 2017.
  6. W. Peebles, J.-Y. Zhu, R. Zhang, A. Torralba, A. Efros, and E. Shechtman. GAN-Supervised Dense Visual Alignment. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2022.
  7. X. Yu, M. S. Creamer, F. Randi, A. K. Sharma, S. W. Linderman, and A. M. Leifer. Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training. In eLife, 10:e66410, Jul. 2021.
