XR Life Science

3D/VR Life Sciences Imaging.

Supported by the MRC, BBSRC and the Wellcome Trust through the Translational Research Initiative, Prof Craig Daly is leading pioneering research on the use of confocal laser scanning microscopy (CLSM) to develop 3D assets from research data. CLSM captures highly detailed 3D structures of biological tissues at the cellular and sub-cellular level, which can be used to construct images of real organs, tissues and other biological structures. As part of this work, he obtains serial sections of tissues and imports the CLSM data into sophisticated animation software to create 3D images of the structures of interest. In recent years the project has created over 65 models, stored in the XR Life Science Sketchfab collection (Figure 1), and incorporated them into Virtual Reality (VR) technology to deliver unique teaching and public engagement resources. The workflow can also be applied to MRI data.
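
To give a rough sense of the first step, the sketch below shows how a stack of serially numbered CLSM sections could be assembled into a single 3D image volume in Python using the tifffile and NumPy libraries. The folder layout and file names are illustrative assumptions; the project itself uses dedicated image-processing and animation software, which is not shown here.

    # Minimal sketch: assemble serial CLSM sections into a 3D image volume.
    # File names and folder layout are hypothetical examples.
    import glob
    import numpy as np
    import tifffile

    # Load serially numbered optical sections, e.g. section_000.tif, section_001.tif, ...
    section_paths = sorted(glob.glob("clsm_sections/section_*.tif"))
    sections = [tifffile.imread(p) for p in section_paths]

    # Stack the 2D sections along a new z-axis to form a (z, y, x) volume.
    volume = np.stack(sections, axis=0)
    print(volume.shape, volume.dtype)

    # Save the assembled volume for downstream segmentation and rendering.
    tifffile.imwrite("clsm_volume.tif", volume)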

Virtual Reality and the related techniques of Computer-Generated Imagery have, to date, been largely confined to the games and film industries because of the high cost and complexity of the hardware and software required. Recent technological advances, however, have significantly reduced these costs, making consumer-level systems more affordable and accessible. Inspired by the educational potential of 3D animations of real tissues, structures and organs, Prof Daly collects CLSM and MRI data obtained for a variety of research purposes and converts them into visual representations, opening up new avenues for teaching, film and special effects, video games, and virtual reality.

Figure 1. Examples of constructs that can be designed from CLSM data. The XR Life Science Sketchfab collection currently features 65 3D assets. 

These animations could transform conventional teaching: students could link theoretical knowledge to visual representations of real structures and interact with them in 3D, with close-ups, zooming and alternative perspectives of the target tissues. Educational materials would no longer need to rely on artists' impressions and could instead use 'real' images of cells. Connecting visual and theoretical elements in this way makes learning more engaging and gives students a fuller understanding of the structures being visualised. This is especially valuable in the life sciences, where content with many interacting elements can be difficult to grasp and natural tissues and organoids are hard to picture because of their scale and complexity. For educators, the approach supports different learning styles (visual, auditory, read-write and kinesthetic) in the design of new instructional materials, offering an optimised, inclusive and effective way of teaching.

3D modelling is also valuable for research. The XR Life Science project allows researchers to supply their own 3D data and have it turned into animations, 3D assets, 3D prints, VR scenes and interactive apps, which can then serve as content for research, teaching, outreach or public engagement. Outputs of this work could also be applied in the gaming industry and VR headsets, or incorporated into the “Metaverse”, opening up new commercial avenues and bringing science into other aspects of everyday life.

The XR Life Science workflow requires images to pass through a series of image-processing and games software packages, manual interventions and machine learning algorithms (Figure 2). In the future, the team aims to expand the image collection and the services provided to researchers, including tutorials on image analysis linked to the XR facility at the University of Glasgow.

Figure 2. XR-Life Science basic workflow. Images are acquired by confocal microscopy (as shown), MRI or 3D block EM (a); image volumes are created (b); features of interest are segmented (c); and the data are processed further to enable photorealistic rendering, 3D printing, animation and VR viewing (d).
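
As a rough illustration of steps (c) and (d), the sketch below segments a feature of interest from an assembled image volume and exports it as a surface mesh that animation, 3D-printing and VR packages can import. This uses open-source scikit-image as a stand-in for the project's actual software chain: a simple Otsu threshold replaces the manual interventions and machine learning algorithms of the real workflow, and the file names and voxel spacing are assumptions.

    # Minimal sketch of steps (c) and (d): segment a feature from the volume
    # and convert it to a triangle mesh. Illustrative only; threshold choice,
    # file names and voxel spacing are assumptions.
    import numpy as np
    import tifffile
    from skimage.filters import threshold_otsu
    from skimage.measure import marching_cubes

    volume = tifffile.imread("clsm_volume.tif").astype(np.float32)

    # (c) Segment: a global Otsu threshold stands in for the manual and
    # machine-learning segmentation used in the real workflow.
    mask = volume > threshold_otsu(volume)

    # (d) Extract a surface mesh from the binary mask; 'spacing' would be set
    # from the real voxel dimensions (an anisotropic z-step is assumed here).
    verts, faces, normals, _ = marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=(2.0, 1.0, 1.0))

    # Write a Wavefront OBJ file (1-based vertex indices) for import into
    # rendering, animation or VR software.
    with open("feature_mesh.obj", "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for tri in faces:
            f.write(f"f {tri[0] + 1} {tri[1] + 1} {tri[2] + 1}\n")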

The project's outputs are entirely digital and have significant educational and commercial potential. The Sketchfab collection has contributions from 16 research groups, and Prof Daly’s YouTube channel, featuring image analysis tips and tutorials, has more than 4,000 subscribers and over 500,000 views. In the future, this work could be used in multiple applications, including:

  • An educational tool for students at all stages.
  • Image analysis tutorials offering training and professional development for users of all backgrounds.
  • A tool to help researchers visualise their work, for use in conferences, public engagement and outreach activities.
  • Applications in the gaming market, VR headsets, virtual productions and “metaverse” technologies.

In addition to his 3D assets collection, Prof Daly has used his 3D images and his expertise in designing interactive educational content to create a Virtual Reality game intended to excite school-age students about physiology as a potential career pathway. Funded by the Physiological Society, the game invites users to enter the lab of a famous Physiology Professor following his mysterious disappearance and to solve a variety of puzzles in order to discover his secret work. The game aims to educate users in a fun and interactive way and is now open to the public. To learn more, or to download the game, click here.

Internal Funding Sources

This project has been supported by a variety of internal awards. Early stages of the project were funded by the MRC Proximity to Discovery Industry Engagement Fund (P2D) in 2018, followed by a BBSRC Excellence with Impact award in 2019 and a Wellcome Trust Translational Partnership Award in 2021.

External Funding Sources

The development of the Virtual Reality game was supported by a grant from the Physiological Society in 2020.