The VR Doctor: Where VR Meets AI for Healthcare


03 Mar

In the third instalment of this feature series, Dr Raphael Olaiya, an NHS doctor and medical education academic who works with the NHS on virtual reality (VR) immersive training programmes for doctors and nurses, offers a glimpse into the meeting point of VR and AI for healthcare.

We can see it, feel it, touch it, taste it and even smell it: virtual reality is very tangible, and its sophistication and fidelity are increasing at a rate where anyone who self-identifies as a technology fan must chase updates biweekly to avoid being left behind. An accepted consensus is that by 2030 VR/AR will be the fundamental platform for mobile communication and will be firmly embedded as a mainstay within the industries of energy, entertainment, education/training, travel, e-commerce, trade and healthcare.

Conversely, artificial intelligence (AI), from the perspective of the average self-identified technology fan, can only be perceived through the depictions Hollywood directors spoon-feed us. It is an optimistic and plausible feat all the same; however, actualising it seems much more a lucid dream than a reality for the masses. When Siri, Cortana and Amazon Echo publicly hail themselves as having artificially intelligent capability, it throws the mass of technology hobbyists into confusion, because the only thing they see are cool voice-recognition search tools. AI, to the masses, means on-demand adaptive learning and android tendencies, whether physical, emotional or intellectual, stemming from Hollywood's portrayal of characters including Jarvis from Iron Man, Skynet, Bicentennial Man, Ex Machina and Star Wars, amongst others. This highlights a problem with how AI is defined and classified, which has led to a degree of trivialisation of commercial AI and reduced its appeal, because the recurrent depictions are simply not realistic technology any time soon.

Delving deeper into AI technicalities, AI is certainly alive and kicking. IBM Watson at the forefront, alongside Google DeepMind and Microsoft Oxford, is currently changing the way businesses trade, the way doctors diagnose and how engineers construct, using deep learning algorithms able to resynthesize formulae for future tasks based on the results of previous formulae. However, the reach of IBM Watson and its cousins seems somewhat disconnected from the masses. Why? This boils down to the following reasons:

1.    The current level of cutting-edge AI in 2017 focuses on area-specific big-data processing (e.g. the IBM Watson and Sloan Kettering partnership, focusing on more effective and efficient cancer diagnosis and treatment) and delivers directed, specific answers only after arduous instruction and direction from subject-matter experts, i.e. world-renowned oncologists. So the focus of technological advancement is not on AI for consumer tech but rather on area-specific specialist research and study.

2.    There is not yet a strong enough tie between the organisations/companies with mass consumer reach and the tech communities working hard on the breakthrough expert-level programming needed to adapt up-to-date artificial intelligence tech to a use that serves the masses directly. This disconnect will soon be bridged.

3.    Tangible applications that channel currently available AI tech into “perceived real AI” have not yet taken off in the open market. Eager AI fans wait for the anticipated killer/golden app, robot or other product: the app that will utilise AI technology in a way that bridges away from the island of the gimmick and enters the realm of the value-add, whether by saving time, saving money or enhancing entertainment.

VR/AR represents a golden-ticket opportunity to present to the mass consumer market an application of artificial intelligence that anybody can relate to and get excited about. This highlights the importance of the user experience and the perception of AI. For example, one AI application field in which billions of dollars are currently being invested is an AI medical diagnosis tool for patients with no, or delayed, access to a doctor. It doesn’t take a controlled experiment for us to agree that the user experience would be more successful with a realistic CGI, android-esque, voice-recognising VR/AR AI nurse/doctor than with a messenger-style chat box, despite both having the exact same level of AI sophistication.


Whether or not we define AI as algorithmic formulae clever enough to generate more formulae, build upon past experience and learn deeply, coupled with a powerful search function, does not take away from the fact that how we interact with it is the most important feature of all! After all, AI is for the betterment of our lives: to increase joy and pleasure, ease suffering and further the realisation of our ability as humans to supersede ourselves.

Virtual reality and AI intertwining in a multidisciplinary approach to serve us more holistically is the goal. Healthcare, a universal priority, is a worthwhile focal point for this blending.

Applications of AI and VR/AR for healthcare (non-exhaustive):

1.    Hospital or healthcare facility management and workflow visualisation, planning and implementation.

2.    On-demand patient healthcare triage system to prioritise medical or surgical emergencies.

3.    On-demand patient home diagnosis (limited without clinical judgement and examination).

4.    Artificially intelligent VR/AR clinical assistants for medical doctors to reduce errors.

5.    Medical education and simulation training adaptive to the user’s own learning style, to optimise learning and knowledge retention.

6.    Demonstrating the neural pathway of the artificially intelligent system in use: there will come a point where the AI used in healthcare reaches a level of processing that we as humans cannot comprehend. VR/AR represents a method by which these processes can be explained to us more effectively than code, and we can work to reverse engineer the AI’s discoveries to apply them even further.
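To make the triage idea (item 2) a little more concrete, the back end of such a system could be imagined as a priority queue that orders cases by a severity score an AI classifier might assign. The sketch below is purely illustrative: the `SEVERITY` table, the scores and the `triage` function are invented for this example, not part of any real product.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical severity scores (1 = most urgent) that an AI
# classifier might assign to a patient's presenting complaint.
SEVERITY = {"cardiac arrest": 1, "suspected stroke": 2, "fracture": 3, "mild rash": 5}

@dataclass(order=True)
class Case:
    priority: int
    patient: str = field(compare=False)
    complaint: str = field(compare=False)

def triage(cases):
    """Return patient names ordered from most to least urgent."""
    heap = [Case(SEVERITY.get(complaint, 4), patient, complaint)
            for patient, complaint in cases]
    heapq.heapify(heap)  # min-heap: lowest priority number pops first
    return [heapq.heappop(heap).patient for _ in range(len(heap))]

print(triage([("Alice", "mild rash"),
              ("Bob", "cardiac arrest"),
              ("Carol", "fracture")]))
# → ['Bob', 'Carol', 'Alice']
```

In a real VR/AR front end, the same ordering logic would sit behind the virtual nurse the patient actually interacts with; the queue itself is the simple part, while the hard problem remains assigning the severity scores reliably.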

So, just as this article began: to see it, feel it, touch it, taste it and even smell it is to make AI a human experience.

Dr Raphael Olaiya, Published on VRFocus.com
