The launch of private space flights and the pending launch of the metaverse have been the technology highlights of 2021. My prediction: now that the rich boys have had their day in near-space, private space travel will go into hibernation, or will soon become inconsequential to the larger world.
The time for the metaverse, meanwhile, is a couple of years away, when realistic 3D avatars, realistic 3D spaces, edge-compute-enabled responsive interactions, augmented reality, next-gen communication, next-gen sensors and actuators, artificial intelligence, and social computing will combine to give us the first usable metaverse platforms for social and business interactions.
This will happen because Meta and Microsoft, among others, will pour resources into developing the metaverse, and there is no stopping them. In the next two years, we will see the growth of AI into every corner of the technology world. Some examples include:
AI-based Immersive Entertainment with an Intelligent and Interactive Artificial Cast: Microsoft Comic Chat, later renamed Microsoft Chat, was released in 1996. It rendered online conversations as a comic strip that unfolded live as the chat progressed. The concept has come a long way since, but still has ground to cover before realising its full potential.
Released in 2018, Black Mirror: Bandersnatch was Netflix’s first interactive offering, in which audiences could make choices and drive the direction of the plot. The future possibility is immersive entertainment in which individuals become part of the plot, alongside other real and computer-generated characters. The plot itself would be dynamic, adapting to the real-world participants and the roles they play. This will be the beginning of the metaverse.
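At its simplest, a viewer-driven plot of the Bandersnatch kind is a graph of scenes where each choice selects the next node. A minimal sketch, with invented scene names and choices purely for illustration:

```python
# Toy branching-narrative graph: each scene maps to (description, choices),
# where choices link a viewer decision to the next scene. All names are
# hypothetical; a real interactive title would add state, cast, and media.
STORY = {
    "opening":  ("A stranger offers you two doors.", {"left": "garden", "right": "library"}),
    "garden":   ("The garden path forks again.",     {"fountain": "ending_a", "gate": "ending_b"}),
    "library":  ("A book glows on the shelf.",       {"open": "ending_b", "leave": "ending_a"}),
    "ending_a": ("You find your way home.",          {}),
    "ending_b": ("The story loops back on itself.",  {}),
}

def play(choices):
    """Walk the story graph following a list of viewer choices."""
    scene, path = "opening", ["opening"]
    for choice in choices:
        _, options = STORY[scene]
        if choice not in options:
            break  # invalid choice or a terminal scene: stop here
        scene = options[choice]
        path.append(scene)
    return path

# play(["left", "fountain"]) -> ["opening", "garden", "ending_a"]
```

The AI-driven version described above would go further: instead of a fixed graph, a model would generate scenes and cast behaviour on the fly in response to the participants.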
AI and Bio-Modelling to predict and prevent pandemics: Close to 5.5 million people have died due to Covid-19 so far. In comparison, the Spanish Flu of 1918 took 50 million lives, and the bubonic plague of the 14th century took 200 million. According to the WHO, more than 1,500 new pathogens have been discovered since 1970, and some have caused significant human loss, such as the Ebola virus in 1976 and HIV in 1983.
With better modelling (including the use of AI) of disease-causing germs and their spread, it becomes possible to predict and contain pandemics and epidemics. With new genetic approaches, we can detect new germs and germ variants early and develop countermeasures. On the other hand, the possibility of climate change thawing out frozen pathogens of the past is now a reality.
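The classical starting point for the spread modelling mentioned above is the compartmental SIR model, in which a population moves between susceptible, infected, and recovered states. A minimal sketch with illustrative parameters (the AI-based approaches in question would layer learned behaviour and real-world data on top of models like this):

```python
# Minimal SIR epidemic model, advanced with simple Euler steps.
# beta = infectious contacts per person per day, gamma = recovery rate.
# Parameter values below are illustrative, not fitted to any real disease.
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """Advance the S, I, R compartments by one time step."""
    n = s + i + r
    new_infections = beta * s * i / n * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(s0, i0, r0, beta, gamma, days):
    """Run the model for the given number of days; return daily (s, i, r)."""
    s, i, r = float(s0), float(i0), float(r0)
    history = [(s, i, r)]
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma)
        history.append((s, i, r))
    return history

# One initial case in a population of 1,000,000; beta=0.3, gamma=0.1
# gives a basic reproduction number R0 = beta/gamma = 3.
hist = simulate(999_999, 1, 0, beta=0.3, gamma=0.1, days=200)
peak_infected = max(i for _, i, _ in hist)
```

Even this toy model reproduces the familiar epidemic curve: exponential growth, a peak, and decline as the susceptible pool is depleted.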
AI, Chaos-based techniques to predict natural patterns and disasters: Natural disasters have precursors and patterns, but monitoring them, tracking them, and generating warnings demands massive computing resources. Chaos theory describes systems whose state evolves over time with dynamics so sensitive to initial conditions that small perturbations can grow exponentially; weather is the classic example of a chaotic system. Hybrid computing (harnessing the power of millions of edge devices and their data alongside the power of the cloud), combined with advances in AI techniques and Chaos theory, will make it possible to issue natural disaster warnings well ahead of the event, with very low false-positive rates. The challenge is that progress in applying Chaos theory has been terribly slow. Edge data and AI might be the difference in making Chaos mainstream!
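The sensitivity to initial conditions described above can be seen in a few lines using the logistic map, a textbook chaotic system (x → r·x·(1−x) with r = 4), which is why long-range prediction of systems like weather is so hard:

```python
# Demonstrate sensitive dependence on initial conditions with the
# logistic map at r=4, a well-known chaotic regime.
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate x -> r*x*(1-x) from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturb the start by one part in ten billion
divergence = [abs(x - y) for x, y in zip(a, b)]
# The tiny initial gap grows roughly exponentially until the two
# trajectories become effectively uncorrelated.
```

This is exactly the effect that limits deterministic forecasting horizons: however precise the measurement of today's state, the error eventually dominates, which is why the article's hope rests on denser edge data and AI rather than on raw simulation alone.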
AI-based Augmented Reality for enhanced data visualisation: Three-dimensional (3D) visualisation and the ability to work interactively with image data have been available in industry for some time. However, in the return-on-investment battle against 2D visualisation, 2D won: 3D visualisation required specialised computing and even dedicated visualisation ‘caves’. With the resources behind the metaverse, industrial and health 3D visualisation will become a beneficiary, leading to better factory yields and health outcomes.
I have been writing about the future of technology since 2008, and some of the technologies I have written about have since become mainstream. The AI applications mentioned in this article have been languishing, waiting for other technologies and market needs to catch up. That time is at hand.