Abstract
Organs-on-a-Chip (OOCs) represent a sophisticated approach for exploring biological mechanisms and developing therapeutic agents. In conjunction with high-quality time-lapse microscopy (TLM), OOCs allow the visualization of complex reconstituted biological processes, such as multi-cell-type migration and cell–cell interactions. In this context, increasing the frame rate is desirable in order to accurately reconstruct cell-interaction dynamics. However, a trade-off between image resolution and the information content carried is needed to keep the overall data volume manageable. Moreover, high frame rates increase photobleaching and phototoxicity. As a possible solution to these problems, we report a new hybrid-imaging paradigm based on the integration of OOC/TLM with a Multi-scale Generative Adversarial Network (GAN) that predicts interleaved video frames, with the aim of providing high-throughput videos. We tested the predictive capability of the GAN on synthetic videos, as well as on real OOC experiments dealing with tumor–immune cell interactions. The proposed approach makes it possible to acquire a reduced number of high-quality TLM images without any major loss of information about the phenomena under investigation.
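The paper's multi-scale GAN architecture is not reproduced here; purely as an illustration of the interleaved-frame prediction idea described in the abstract, the following is a minimal single-scale generator sketch in PyTorch that predicts a missing middle frame from its two acquired neighbours. All names (`FrameInterpolator`, `f0`, `f2`) and layer choices are hypothetical and are not taken from the published model.

```python
import torch
import torch.nn as nn


class FrameInterpolator(nn.Module):
    """Toy generator: predicts the missing middle frame from the two
    neighbouring time-lapse frames, concatenated along the channel axis.
    This is an illustrative stand-in, not the paper's multi-scale GAN."""

    def __init__(self, channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),  # assumes frames normalised to [0, 1]
        )

    def forward(self, frame_prev: torch.Tensor, frame_next: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([frame_prev, frame_next], dim=1))


# Usage: interpolate one frame between two acquired microscopy frames.
gen = FrameInterpolator()
f0 = torch.rand(1, 1, 128, 128)   # acquired frame at time t
f2 = torch.rand(1, 1, 128, 128)   # acquired frame at time t + 2
f1_pred = gen(f0, f2)             # predicted (interleaved) frame at time t + 1
print(f1_pred.shape)              # torch.Size([1, 1, 128, 128])
```

In the hybrid-imaging scheme described above, such a generator would be trained adversarially so that every other frame can be predicted rather than acquired, halving the effective acquisition rate and the associated photobleaching and phototoxicity.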
Original language | English |
---|---|
Pages (from-to) | 3671-3689 |
Number of pages | 19 |
Journal | Neural Computing and Applications |
Volume | 33 |
Issue number | 8 |
DOIs | |
Publication status | Published - 1 Apr 2021 |