Artificial intelligence (AI) is poised to become a transformational force in healthcare. From chronic diseases and cancer to radiology and risk assessment, there are nearly endless opportunities to leverage technology to deliver more precise, efficient, and impactful interventions at exactly the right moment in a patient's care. AI offers a number of benefits over traditional analytics and clinical decision-making techniques. Learning algorithms can become more precise and accurate as they interact with training data, allowing humans to gain unprecedented insights into diagnostics, care processes, treatment variability, and patient outcomes (1).
 
Using computers to communicate is not a new idea by any means, but creating direct interfaces between technology and the human mind without the need for keyboards, mice, and monitors is a cutting-edge area of research with significant applications for some patients. Neurological diseases and trauma to the nervous system can take away patients' abilities to speak, move, and interact meaningfully with people and their environments. Brain-computer interfaces (BCIs) backed by artificial intelligence could restore those fundamental experiences to those who feared them lost forever. BCIs could drastically improve quality of life for patients with ALS, strokes, or locked-in syndrome, as well as for the 500,000 people worldwide who experience spinal cord injuries every year (2).
 
Radiological images obtained from MRI machines, CT scanners, and x-rays offer non-invasive visibility into the inner workings of the human body. But many diagnostic processes still rely on physical tissue samples obtained through biopsies, which carry risks, including the potential for infection. Experts predict that AI will enable the next generation of radiology tools that are accurate and detailed enough to replace the need for tissue samples in some cases. A major challenge will be bringing the diagnostic imaging team together with the surgeon and the pathologist (3).
 
Succeeding in this pursuit may allow clinicians to develop a more accurate understanding of how tumours behave as a whole instead of basing treatment decisions on the properties of a small segment of the malignancy. Providers may also be able to better define the aggressiveness of cancers and target treatments more appropriately. Artificial intelligence is helping to enable "virtual biopsies" and advance the innovative field of radiomics, which focuses on harnessing image-based algorithms to characterize the phenotypes and genetic properties of tumours (1).
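To make the idea of radiomics more concrete, here is a minimal sketch of the kind of image-based computation involved: extracting simple first-order intensity features from a segmented tumour region. The arrays, feature set, and region of interest are illustrative placeholders rather than a clinical pipeline; real radiomics toolkits such as PyRadiomics compute hundreds of standardized features.

```python
import numpy as np

def first_order_features(image: np.ndarray, mask: np.ndarray) -> dict:
    """Compute simple first-order radiomic features over a tumour ROI.

    image: 2-D or 3-D array of scan intensities (e.g., one CT slice).
    mask:  boolean array of the same shape marking the segmented tumour.
    """
    roi = image[mask]                       # intensities inside the tumour only
    counts, _ = np.histogram(roi, bins=64)  # intensity histogram for entropy
    p = counts / counts.sum()
    p = p[p > 0]
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "skewness": float(((roi - roi.mean()) ** 3).mean() / roi.std() ** 3),
        "entropy": float(-(p * np.log2(p)).sum()),  # intensity heterogeneity
    }

# Illustrative usage with synthetic data standing in for a real scan:
rng = np.random.default_rng(0)
image = rng.normal(100, 15, size=(128, 128))
mask = np.zeros(image.shape, dtype=bool)
mask[40:80, 40:80] = True                   # placeholder tumour segmentation
print(first_order_features(image, mask))
```

Features of this kind, computed at scale, are what radiomics models correlate with tumour phenotypes and genetic properties.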
 
Shortages of trained healthcare providers, including ultrasound technicians and radiologists, can significantly limit access to life-saving care in developing nations around the world. AI could help mitigate the impact of this severe deficit of qualified clinical staff by taking over some of the diagnostic duties typically allocated to humans (4).
 
 For example, AI imaging tools can screen chest x-rays for signs of tuberculosis, often achieving a level of accuracy comparable to humans. This capability could be deployed through an app available to providers in low-resource areas, reducing the need for a trained diagnostic radiologist on site.
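As a rough sketch of how such a screening tool might be packaged, the snippet below runs a single chest x-ray through a trained classifier and applies a decision threshold. The model file (tb_screen_model.pt), single-logit output head, and preprocessing choices are all assumptions for illustration, not a validated diagnostic system.

```python
import torch
from torchvision import transforms
from PIL import Image

# Hypothetical TB screening model exported with TorchScript (an assumption;
# any trained chest x-ray classifier could slot in here).
model = torch.jit.load("tb_screen_model.pt")
model.eval()

preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # x-rays are single-channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def screen_xray(path: str, threshold: float = 0.5) -> str:
    """Return a coarse TB/normal flag for one chest x-ray image."""
    batch = preprocess(Image.open(path)).unsqueeze(0)  # add batch dimension
    with torch.no_grad():
        prob = torch.sigmoid(model(batch)).item()      # assumes one-logit head
    if prob >= threshold:
        return f"TB suspected (p={prob:.2f}); refer for confirmation"
    return f"No TB signs detected (p={prob:.2f})"

print(screen_xray("patient_0042_cxr.png"))
```

In a low-resource deployment, a wrapper like this could sit behind a mobile app, with flagged studies routed to a remote radiologist for confirmation.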
 
However, algorithm developers must be careful to account for the fact that different ethnic groups or residents of different regions may have unique physiologies and environmental factors that influence the presentation of disease. The course of a disease and the population it affects may look very different in India than in the US. As these algorithms are developed, it is critical to ensure that the data represents a diversity of disease presentations and populations; we cannot simply develop an algorithm on a single population and expect it to work as well on others (1).
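One concrete safeguard is to report model performance stratified by population rather than only in aggregate, since a model that looks accurate overall can still underperform badly on a subgroup it rarely saw during training. The sketch below shows this with scikit-learn; the labels, predictions, and region tags are placeholder data.

```python
import numpy as np
from sklearn.metrics import recall_score

def stratified_sensitivity(y_true, y_pred, groups):
    """Report sensitivity (recall) separately for each population subgroup."""
    results = {}
    for g in np.unique(groups):
        idx = groups == g
        results[str(g)] = recall_score(y_true[idx], y_pred[idx])
    return results

# Placeholder data: ground truth, model predictions, and a region tag per patient.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([1, 0, 0, 1, 0, 1, 0, 0])
groups = np.array(["US", "US", "US", "US", "India", "India", "India", "India"])

# Reveals per-population gaps that an aggregate accuracy figure would hide.
print(stratified_sensitivity(y_true, y_pred, groups))
```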
 
 Electronic health records (EHRs) have played an instrumental role in the healthcare industry’s journey towards digitalization, but the switch has brought myriad problems associated with cognitive overload, endless documentation, and user burnout. EHR developers are now using AI to create more intuitive interfaces and automate some of the routine processes that consume so much of a user’s time. Users spend the majority of their time on three tasks: clinical documentation, order entry, and sorting through the in-basket (5).
 
Voice recognition and dictation are helping to improve the clinical documentation process, but natural language processing (NLP) tools might not be going far enough. Video recording a clinical encounter, paired with AI and machine learning to index those videos for future information retrieval, could take documentation further. Much as Siri and Alexa are used in the home, the future will bring virtual assistants to the bedside, with embedded intelligence that clinicians can use for order entry (5). AI may also help to process routine requests from the in-basket, such as medication refills and result notifications.
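As an illustration of how such in-basket automation might begin, the sketch below triages messages with hand-written keyword rules. A deployed system would use a trained NLP classifier rather than patterns like these; the queues and phrases here are assumptions, and anything unrecognized deliberately falls through to human review.

```python
import re

# Illustrative routing rules; a production system would learn these from data.
ROUTES = {
    "medication_refill": re.compile(r"\b(refill|renew(al)?)\b", re.IGNORECASE),
    "scheduling": re.compile(r"\b(appointment|reschedule|cancel)\b", re.IGNORECASE),
    "results_question": re.compile(r"\b(lab|result|report)\b", re.IGNORECASE),
}

def triage(message: str) -> str:
    """Route an in-basket message to a queue, defaulting to clinician review."""
    for queue, pattern in ROUTES.items():
        if pattern.search(message):
            return queue
    return "clinician_review"  # safe default: a human sees anything unmatched

inbox = [
    "Can I get a refill on my lisinopril?",
    "I need to reschedule my appointment next week.",
    "Are my lab results back yet?",
    "I've had chest pain since yesterday.",
]
for msg in inbox:
    print(f"{triage(msg):18} <- {msg}")
```

The design choice that matters most here is the default: routine requests can be automated away, but anything ambiguous or clinical must land in front of a clinician.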