The ideas of VR (Virtual Reality), MR (Mixed Reality) and AR (Augmented Reality) are familiar to most people by now. However, it is not uncommon for people to perceive these technologies as useful mostly for entertainment. In reality, these technologies, collectively termed XR (Extended Reality), hold massive potential across the sciences, be it innovation, development, or education. Scientists all over the world have already begun to bring these ideas to fruition, with applications such as VR serving as a training ground for medical students and surgeons, or AR-enabled microscopes (developed by Google) that can detect cancerous cells in real-time. In 2018, Springer Nature conducted a ‘Hackathon’ where participants used AR/VR headsets to access millions of data points through multiple APIs. The teams then attempted to solve real-world problems by visualising data in AR/VR across scientific fields. Programs such as these, and other innovations aided by XR, show that the technology can help scientists progress in their fields by providing enhanced immersion and context.
The realm of mixed reality promises to usher in a new age of innovation in scientific laboratories across the world. For the uninitiated, mixed reality (MR) is a technology that attempts to remove the boundaries between real and virtual interactions: through occlusion, computer-generated objects and real physical objects co-exist and interact in real-time. Essentially, it is a combination of VR and AR. Using mixed reality, users can control virtual objects with hand gestures, voice commands, and even eye movements. Scientists in laboratories can therefore manipulate their lab benches without so much as touching them.
Holo4Labs attempts to revolutionise the laboratory by integrating MR into this space. Developed by Solution4Labs, Holo4Labs is a mixed reality variant of a Laboratory Information Management System (LIMS), made specifically for the Microsoft HoloLens 2. It lets scientists merge data processing and information management software with the day-to-day workings of a lab. The goal of Holo4Labs is to simplify data management in labs and streamline operations. Think of the technology as an extra set of hands for scientists, one that, being computer-based, also reduces the room for human error. One of the main goals of Holo4Labs is to free scientists from mundane tasks in the lab so that they can focus on innovation.
Holo4Labs runs Thermo Fisher Scientific’s SampleManager LIMS, which lets the mixed reality system tap into laboratory-specific applications. This is especially true in industries such as fuel, pharmaceuticals, food, biobanking and metallurgy, which are already well-integrated with the software. Compared to traditional AR, mixed reality allows scientists to interact with overlays seen through the glasses, and these interactions affect the physical lab equipment. Using the technology, scientists can quickly analyse samples, get the required data and store it for the future, all hands-free! The software also lets scientists communicate with peers across the world about their current projects. Add to this audio and 2K visualisation of models, samples and more, and you have an interactive workspace that fuels innovation.
In more otherworldly matters, NASA scientists have been tapping into the potential of VR to redefine our understanding of our galaxy and how it works. The scientists use a customised, 3D virtual reality simulation that brings to life the speed and direction of about 4 million stars in the Milky Way. Astronomer Marc Kuchner and researcher Susan Higashio stated that the virtual reality program, dubbed PointCloudsVR and developed by Matthew Brandt, has helped them gain a new perspective on stars’ paths and improve their understanding of star groupings. The program could help correct inconsistencies such as stars being classified in the wrong groups, as well as shed light on known star groups that may actually belong to larger groupings.
The findings derived from the program were presented at the annual American Geophysical Union (AGU) conference in December 2019. Kuchner and Higashio, from NASA’s Goddard Space Flight Center in Greenbelt, Maryland, also plan to publish their findings next year, after extensively using the novel program along with engineer Matthew Brandt. As per NASA, Higashio stated that the associations between groups of stars “became more intuitive inside the artificial cosmos found within the VR headset”. The 3D visualisations help Goddard scientists understand how the local star neighbourhood was formed. Using the program, the scientists often discovered groups of young stars moving together, which suggests that they could have formed at the same time and in the same place. The PointCloudsVR software has also been officially released and open-sourced on NASA’s GitHub: https://dgit.in/PointCloudsVR. Other star-tracking solutions, as well as virtual hands-on applications for NASA engineers, have also been developed to help with exploration and satellite servicing missions. Kuchner believes that VR headsets will become commonplace among scientific tools in the future, and sees VR as a research tool that should be present on every astrophysicist’s desk.
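PointCloudsVR itself renders the catalogue inside a headset, but the core idea behind spotting co-moving groups, clustering stars by how similar their velocity vectors are, can be sketched in a few lines. The snippet below is a toy illustration of that general technique, not NASA’s actual code: the catalogue is synthetic, and the `comoving_groups` helper, its `tol` threshold and all numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic catalogue: two co-moving groups plus scattered
# field stars. Positions in parsecs, velocities in km/s (toy values only).
pos = np.vstack([
    rng.normal([10, 0, 0], 1.0, size=(50, 3)),   # group A positions
    rng.normal([-15, 5, 2], 1.0, size=(40, 3)),  # group B positions
    rng.uniform(-50, 50, size=(60, 3)),          # field star positions
])
vel = np.vstack([
    rng.normal([20, 5, 0], 0.5, size=(50, 3)),   # group A: shared motion
    rng.normal([-10, 15, 3], 0.5, size=(40, 3)), # group B: shared motion
    rng.uniform(-40, 40, size=(60, 3)),          # field: random motion
])

def comoving_groups(velocities, tol=2.0):
    """Friends-of-friends grouping in velocity space: stars whose
    velocity vectors differ by less than `tol` km/s end up in the
    same group (transitively linked)."""
    n = len(velocities)
    labels = -np.ones(n, dtype=int)
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        # flood fill over the "velocity distance < tol" graph
        stack, labels[i] = [i], current
        while stack:
            j = stack.pop()
            near = np.linalg.norm(velocities - velocities[j], axis=1) < tol
            for k in np.nonzero(near)[0]:
                if labels[k] == -1:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels

labels = comoving_groups(vel)
sizes = np.bincount(labels)
# The two seeded groups should dominate; most field stars stay singletons.
print(sorted(sizes[sizes >= 10], reverse=True))
```

Real pipelines work with millions of stars from surveys such as Gaia and far more careful statistics, but the payoff is the same one Higashio describes: stars that share a velocity likely share a birthplace, and seeing them move together (in VR, all at once) makes those associations intuitive.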
NASA also utilises VR to train astronauts, who are required to complete rescue scenarios using VR technology in NASA’s VRL (Virtual Reality Laboratory), an immersive training facility providing real-time graphics and motion simulations. In the VRL, astronauts also learn the basics of mass handling and robotics operations. NASA additionally uses Charlotte, a VR robot built by McDonnell Douglas, which allows astronauts to feel how the mass of an object would behave in space.
AR and VR are poised to transform the STEM (Science, Technology, Engineering and Mathematics) curriculum and how it is taught. These immersive technologies can enhance learning through experiential simulations, modelling and spatial representation of data, and a heightened sense of presence and context. In 2017, Stanford’s Virtual Human Interaction Lab released a VR simulation dubbed The Stanford Ocean Acidification Experience, which allows students to visit simulated oceans and experience the issue of ocean acidification first-hand. The project was developed in collaboration with Stanford marine biologists. Students used HTC Vives to observe the effect of CO2 on marine life and even collect samples from the ocean floor.
VR technology is also used at the University of Michigan-Ann Arbor, where 3D VR building models help students better understand the structural integrity of engineering projects. Students as well as faculty members use the MIDEN (Michigan Immersive Digital Experience Nexus) for architectural walkthroughs, virtual reconstructions, and training scenarios for dangerous sites. Yet another example comes from a variety of projects at Texas A&M, where students work in the Immersive Mechanics Visualization Lab, which depicts computational and experimental data in AR and VR environments. Here, 3D CAD models can be manipulated and tweaked during the development stages.
XR is even entering the classroom through software platforms such as Labster. The platform offers a suite of advanced lab simulations that encourage students to investigate and learn freely. It even has a gamified experience that lets students take on the CSI-style role of a forensics analyst solving crimes within the lab. Additionally, these simulations provide access to prohibitively expensive equipment, such as NGS (Next-Generation Sequencing) machines and electron microscopes, that many universities cannot afford.