"Patient empowerment through digital health, ehealth, connected health, patient portals, EHR, Health IT, digital hospitals, artificial intelligence (AI) & health │medicine" by VAB Traduction
Rescooped by VAB Traductions from E-HEALTH - E-SANTE - PHARMAGEEK

Infographic Digital Health and Care in the EU 

This infographic gives an overview of the European Commission policy on transformation of health care in the Digital Single Market. It was published together with the Communication and Staff Working Document on this topic.

Via Lionel Reichardt / le Pharmageek
Scooped by VAB Traductions

What is the Role of Natural Language Processing in Healthcare?

Natural language processing may be the key to effective clinical decision support, but there are many problems to solve before the healthcare industry can make good on NLP's promises.


For many providers, the healthcare landscape is looking more and more like a shifting quagmire of regulatory pitfalls, financial quicksand, and unpredictable eruptions of acrimony from overwhelmed clinicians on the edge of revolt.


The industry is currently hanging in suspense between the anticipated end of the EHR Incentive Programs and the implementation of the MACRA framework, a transition that may not end up being as smooth as CMS could hope for.


Despite the uncertain atmosphere – or, in some cases, because of it – healthcare providers are taking the opportunity to beef up their big data defenses and develop the technological infrastructure required to meet the impending challenges of value-based reimbursement, population health management, and the unstoppable tide of chronic disease.


Analytics are already playing a major part in helping providers navigate this transition, especially when it comes to the revenue and utilization challenges of moving away from the fee-for-service payment environment. 


But clinical analytics and population health management have been a trickier mountain to climb.  Dissatisfaction with electronic health records remains at a fever pitch, and is unlikely to cool off as developers and regulators try to stuff more and more patient safety features, quality measures, and reporting requirements into the same old software.


Providers often lack access to the socioeconomic, behavioral, and environmental data that would help to create truly actionable analytics at the point of care, and consumer excitement over Internet of Things devices and patient-generated health data is only further complicating the question of how to bring meaningful results to end-users without hopelessly cluttering the computer screen.


While it may be tempting to shut off the laptop, silence the smartphone, and return to a simpler time when the consult room only contained the patient, the provider, and a pad of paper, healthcare won’t solve its insight problems by limiting the amount of data that users have to work with. 


Instead, the old quandary of how to turn big data into smart data will be answered by bigger, smarter computers that can analyze a huge variety of data sources more intelligently, and deliver intuitive, streamlined reports to providers so they can focus on using the information for quality patient care.


Natural language processing (NLP) is at the root of this complicated mission.  The ability to analyze and extract meaning from narrative text or other unstructured data sources is a major piece of the big data puzzle, and drives many of the most advanced and innovative health IT tools on the market.


Natural language processing is the overarching term used to describe the use of computer algorithms to identify key elements in everyday language and extract meaning from unstructured spoken or written input.  NLP is a discipline of computer science that requires skills in artificial intelligence, computational linguistics, and other machine learning disciplines.


Some NLP efforts are focused on beating the Turing test by creating algorithmically-based entities that can mimic human-like responses to queries or conversations.  Others try to understand human speech through voice recognition technology, such as the automated customer service applications used by many large companies. 


Still others are centered on providing data to users by identifying and extracting key details from enormously large bodies of information, like super-human speed readers with nearly limitless memory capacity.


Specific tasks for NLP systems may include:

  • Summarizing lengthy blocks of narrative text, such as a clinical note or academic journal article, by identifying key concepts or phrases present in the source material


  • Mapping data elements present in unstructured text to structured fields in an electronic health record in order to improve clinical data integrity


  • Converting data in the other direction from machine-readable formats into natural language for reporting and educational purposes


  • Answering unique free-text queries that require the synthesis of multiple data sources


  • Engaging in optical character recognition to turn images, like PDF documents or scans of care summaries and imaging reports, into text files that can then be parsed and analyzed


  • Conducting speech recognition to allow users to dictate clinical notes or other information that can then be turned into text
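The first task in the list above, summarizing narrative text by identifying key concepts, can be illustrated with a deliberately minimal sketch: score words by frequency after dropping stopwords. Real summarizers use far richer models; the stopword list and sample note here are purely illustrative.

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "with", "for", "on", "was", "is"}

def key_phrases(text: str, top_n: int = 3) -> list:
    """Rank words in a note by frequency, ignoring stopwords. A crude
    stand-in for the concept-identification step of a summarizer."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top_n)]

note = ("Patient reports chest pain on exertion. Chest pain resolves at rest. "
        "No prior history of chest pain or cardiac disease.")
print(key_phrases(note))  # 'chest' and 'pain' dominate the note
```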


Many natural language processing systems “learn” over time, reabsorbing the results of previous interactions as feedback about which results were accurate and which did not meet expectations.


These machine learning programs can operate based on statistical probabilities, which weigh the likelihood that a given piece of data is actually what the user has requested.  Based on whether or not that answer meets approval, the probabilities can be adjusted in the future to meet the evolving needs of the end-user.
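That feedback loop can be sketched in a few lines: each candidate answer carries a weight, accepted answers are nudged up, rejected ones down, and weights are renormalized into probabilities for the next ranking. The class and learning rate below are hypothetical, not any vendor's API.

```python
class FeedbackRanker:
    """Toy illustration of feedback-driven scoring: each candidate answer
    carries a weight that is nudged up when the user accepts it and down
    when they reject it, so future rankings reflect past interactions."""

    def __init__(self, candidates, lr=0.2):
        self.weights = {c: 1.0 for c in candidates}
        self.lr = lr

    def rank(self):
        # Normalize weights into probabilities, highest first.
        total = sum(self.weights.values())
        return sorted(((w / total, c) for c, w in self.weights.items()), reverse=True)

    def feedback(self, candidate, accepted):
        self.weights[candidate] *= (1 + self.lr) if accepted else (1 - self.lr)

ranker = FeedbackRanker(["discharge summary", "lab report"])
ranker.feedback("lab report", accepted=True)
ranker.feedback("discharge summary", accepted=False)
print(ranker.rank()[0][1])  # "lab report" now ranks first
```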




In the healthcare industry, natural language processing has many potential applications.  NLP can enhance the completeness and accuracy of electronic health records by translating free text into standardized data.  It can fill data warehouses and semantic data lakes with meaningful information accessed by free-text query interfaces.  It may be able to make documentation requirements easier by allowing providers to dictate their notes, or generate tailored educational materials for patients ready for discharge.


Computer-assisted coding with an NLP foundation received a great deal of attention during the drawn-out ICD-10 conversion process, when it was viewed as a possible silver bullet for the problems of adding sufficient detail and specificity to clinical documentation.


But perhaps of greatest interest right now, especially to providers in desperate need of point-of-care solutions for incredibly complex patient problems, NLP can be – and is being – used for clinical decision support.


The most famous example of a machine learning NLP whiz-kid in the healthcare industry is IBM Watson, which has dominated headlines in recent months due to its voracious appetite for academic literature and its growing expertise in clinical decision support (CDS) for precision medicine and cancer care.


In 2014, just before IBM set up its dedicated Watson Health division, the Jeopardy!-winning supercomputer partnered with EHR developer Epic and the Carilion Clinic in Virginia to investigate how NLP and machine learning could be used to flag patients with heart disease, the first step for helping clinicians take the right actions for patient care.


“Using unstructured data was found to be important in this project,” explained Paul Hake, who worked for the IBM Smarter Care Analytics Group at the time.  “When physicians are recording information, they’ll just prefer to type everything in one place into the notes section of the EMR.  And so this information is kind of lost.  It’s then almost a manual process to map this unstructured information back into the EMR system so that we can then use it for analytics.” 


“We can run natural language processing algorithms against this data and automatically extract these features or risk factors from the notes in the medical record.”


The system didn’t stop at highlighting pertinent clinical data.  It also identified social and behavioral factors recorded in the clinical note that didn’t make it into the structured templates of the EHR.


“Those are some of the factors that are significant in terms of the risk factors,” said Hake.  “Is the patient depressed?  What’s the living status of the patient?  Are they homeless?  These are some of the factors that turn out to be important in the model, but they are also things that can be missed from a traditional analysis that doesn’t consider this sort of unstructured data.”


The pilot program successfully identified 8500 patients who were at risk of developing congestive heart failure within the year.  Watson ran through a whopping 21 million records in just six short weeks, and achieved an 85 percent accuracy rate for patient identification.


More recently, Watson has moved up the difficulty ladder to attack cancer and advanced genomics, which involve even larger data sets.  A new partnership with the New York Genome Center, as well as previous work with some of the biggest clinical and cancer care providers in the country, are prepping the cognitive computing superstar for a career in CDS.


“Cancer is a natural choice to focus on, because of the number of patients and the available proof points in the space,” said Vanessa Michelini, Distinguished Engineer and Master Inventor leading the genomics division of IBM Watson Health.


“There’s this explosion of data – not just genomic data, but all sorts of data – in the healthcare space, and the industry needs to find the best ways to extract what’s relevant and bring it together to help clinicians make the best decisions for their patients.”


In 2014 alone, there were 140,000 academic articles related to the detection and treatment of cancer, she added.  No human being could possibly read, understand, and remember all that data, let alone distill it into concrete recommendations about what course of therapy has been most successful for treating patients with similar demographics and comorbidities.


Watson has made a name for itself doing just that, but IBM certainly doesn’t have the NLP world all to itself.  Numerous researchers and academic organizations have been exploring the potential of natural language processing for risk stratification, population health management, and decision support, especially over the last decade or so. 


A 2009 article from the Journal of Biomedical Informatics made the case for proactive CDS systems and intelligent data-driven alerts before the EHR Incentive Programs pushed electronic records into the majority of healthcare organizations, and pointed out the vital role that NLP technology would play in making that concept a reality.


“In some cases the facts that should activate a CDS system can be found only in the free text,” wrote three authors from the National Institutes of Health and University of Pittsburgh.  “Notably, medical history, physical examination, and chest radiography results are routinely obtained in free-text form. Indications for further tuberculosis screening could be identified in these clinical notes using NLP methods at no additional cost.”


“In principle, natural language processing could extract the facts needed to actuate many kinds of decisions rules. In theory, NLP systems might also be able to represent clinical knowledge and CDS interventions in standardized formats.”


Since then, those theories have been put into action.  A few of the many examples of natural language processing in the clinical decision support and risk stratification realms include:


  • In 2013, the Department of Veterans Affairs used NLP techniques to review more than 2 billion EHR documents for indications of PTSD, depression, and potential self-harm in veteran patients.  The pilot was 80 percent accurate at identifying the difference between records of screenings for suicide and mentions of actual past suicide attempts.


  • Researchers at MIT in 2012 were able to attain a 75 percent accuracy rate for deciphering the semantic meaning of specific clinical terms contained in free-text clinical notes, using a statistical probability model to assess surrounding terms and put ambiguous terms into context. 


  • Natural language processing was able to take the speech patterns of schizophrenic patients and identify which were likely to experience an onset of psychosis with 100 percent accuracy.  The small proof-of-concept study employed an NLP system with “a novel combination of semantic coherence and syntactic assays as predictors of psychosis transition.”


  • At the University of California Los Angeles, researchers analyzed electronic free text to flag patients with cirrhosis.  By combining natural language processing of radiology reports with ICD-9 codes and lab data, the algorithm attained incredibly high levels of sensitivity and specificity.


  • Researchers from the University of Alabama found that NLP identification of reportable cancer cases was 22.6 percent more accurate and precise than manual review of medical records.  The system helped to separate cancer patients whose conditions should be reported to the Cancer Registry Control Panel from cases that did not have to be included in the registry.
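The semantic-coherence idea in the psychosis study can be approximated in miniature. The published work used embedding-based coherence and syntactic measures; a word-overlap score between adjacent sentences conveys the gist, and the sample transcripts below are invented for illustration.

```python
import re

def word_set(sentence: str) -> set:
    return set(re.findall(r"[a-z]+", sentence.lower()))

def coherence(transcript: list) -> float:
    """Mean Jaccard overlap between adjacent sentences. A crude proxy for
    the embedding-based semantic coherence measures used in the study."""
    pairs = zip(transcript, transcript[1:])
    scores = [len(word_set(a) & word_set(b)) / len(word_set(a) | word_set(b))
              for a, b in pairs]
    return sum(scores) / len(scores)

focused = ["The dog chased the ball.", "The ball rolled under the dog's bed."]
tangential = ["The dog chased the ball.", "Trains run on Tuesdays usually."]
print(coherence(focused) > coherence(tangential))  # True: topic drift scores lower
```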


Natural language processing technology is already embedded in products from some electronic health record vendors, including Epic Systems, but unstructured clinical notes and narrative text still present a major problem for computer scientists.


True reliability and accuracy are still in the works, and certain problems such as word disambiguation and fragmented “doctor speak” can stump even the smartest NLP algorithms.


“[Clinical text]… is often ungrammatical, consists of ‘bullet point’ telegraphic phrases with limited context, and lacks complete sentences,” pointed out Hilary Townsend, MSI, in the Journal of AHIMA in 2013. “Clinical notes make heavy use of acronyms and abbreviations, making them highly ambiguous.”


Up to a third of clinical abbreviations in the Unified Medical Language System (UMLS) Metathesaurus have multiple meanings, and more than half of terms, acronyms, or abbreviations typically used in clinical notes are puzzlingly ambiguous, Townsend added.


“For example, ‘discharge’ can signify either bodily excretion or release from a hospital; ‘cold’ can refer to a disease, a temperature sensation, or an environmental condition,” she explained. “Similarly, the abbreviation ‘MD’ can be interpreted as the credential for ‘Doctor of Medicine’ or as an abbreviation for ‘mental disorder.’”
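Townsend's "discharge" example suggests how context-based disambiguation works: compare the words surrounding the ambiguous term against cue words associated with each sense, and pick the sense with the most overlap. The cue lists below are hand-made stand-ins for what a real system would learn from annotated corpora.

```python
CUE_WORDS = {
    # Hypothetical cue lists; a real system would learn these from corpora.
    ("discharge", "release_from_hospital"): {"home", "instructions", "follow", "hospital"},
    ("discharge", "bodily_excretion"): {"purulent", "wound", "drainage", "nasal"},
}

def disambiguate(term: str, context: str) -> str:
    """Pick the sense whose cue words overlap most with the surrounding
    text: the context-of-surrounding-words idea, in miniature."""
    tokens = set(context.lower().split())
    best_sense, best_score = None, -1
    for (t, sense), cues in CUE_WORDS.items():
        if t == term:
            score = len(cues & tokens)
            if score > best_score:
                best_sense, best_score = sense, score
    return best_sense

print(disambiguate("discharge", "purulent discharge noted at the wound site"))
# -> bodily_excretion
```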


While the human brain can usually decipher these types of differences by relying on the context of the surrounding words for clues, NLP technology still has a long way to go before it can reach the same reliability threshold as the typical flesh-and-blood reader.


In addition to the questionable validity of certain results, EHR developers are having a hard time figuring out how to display clinical decision support data within the workflow.  Inconsequential CDS alerts are already the bane of the majority of physicians, and there is no industry standard for how to create a support tool that will deliver pertinent, meaningful information without disrupting the patient-provider relationship.


Using NLP to fill in the gaps of structured data on the back end is also a challenge.  Poor standardization of data elements, insufficient data governance policies, and infinite variation in the design and programming of electronic health records have left NLP experts with a big job to do.




Even though natural language processing is not entirely up to snuff just yet, the healthcare industry is willing to put in the work to get there.  Cognitive computing and semantic big data analytics projects, both of which typically rely on NLP for their development, are seeing major investments from some recognizable names.


Financial analysts are bullish on the opportunities for NLP and its associated technologies over the next few years.  Allied Market Research predicts that the cognitive computing market will be worth $13.7 billion across multiple industries by 2020, representing a 33.1 percent compound annual growth rate (CAGR) over current levels.
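As a quick sanity check on those figures (assuming the forecast compounds from a 2015 baseline, which the article does not state), the implied current market size follows directly from the CAGR formula:

```python
# Back-of-envelope check of the Allied Market Research figure: if the
# cognitive computing market compounds at 33.1% a year and reaches
# $13.7B in 2020, the implied base depends on the starting year
# (assumed here to be 2015).
target, cagr, years = 13.7, 0.331, 2020 - 2015
base = target / (1 + cagr) ** years
print(f"Implied 2015 market size: ${base:.1f}B")  # Implied 2015 market size: $3.3B
```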


In 2014, natural language processing accounted for 40 percent of the total market revenue, and will continue to be a major opportunity within the field.  Healthcare is already the biggest user of these technologies, and will continue to snap up NLP tools through the rest of the decade.


The same firm also projects $6.5 billion in spending on text analytics by the year 2020.  Predictive analytics drawn from unstructured data will be a significant area of growth.  Potential applications include consumer behavior modeling, disease tracking, and financial forecasting.


MarketsandMarkets is similarly optimistic about the global NLP spend.  The company predicts that natural language processing will be worth $16.07 billion by 2021 all on its own, and also names healthcare as a key vertical.


Eventually, natural language processing tools may be able to bridge the gap between the unfathomable amount of data generated on a daily basis and the limited cognitive capacity of the human mind.  From the most cutting-edge precision medicine applications to the simple task of coding a claim for billing and reimbursement, NLP has nearly limitless potential to turn electronic health records from burden to boon.


The key to its success will be to develop algorithms that are accurate, intelligent, and healthcare-specific – and to create the user interfaces that can display clinical decision support data without turning users’ stomachs.  If the industry meets these dual goals of extraction and presentation, there is no telling what big data doors could be opened in the future.

Scooped by VAB Traductions

Natural Language Processing, Voice Tools Offer Solutions to EHR Woes


Natural language processing and voice-based documentation tools may be able to reduce pain points when interacting with the EHR.



May 21, 2018 - Artificial intelligence and machine learning are relatively new additions to the tools and applications that now power a large portion of the healthcare industry, but some providers have been using aspects of these strategies for much longer than they may realize. 


Voice recognition tools, which are primarily used to dictate reports and clinical notes into the EHR, have become a mainstay technology for providers in certain diagnostic disciplines, including radiology and pathology. 


Speech-to-text applications are also rapidly growing in popularity among clinicians practicing in the inpatient setting and in primary care.


These applications rely upon natural language processing (NLP), a type of machine learning, to turn sound into text. 


Machine learning is also employed to identify meaningful data elements within that text, such as the name of a medication and the numerical dosage that is associated with that drug. 
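Extracting a drug name and its dosage from free text can be sketched with a single pattern, though production systems use trained named-entity models plus drug lexicons rather than one regex. The pattern and sample note below are illustrative.

```python
import re

# Hypothetical pattern: a real system would use a trained NER model and a
# drug lexicon, not a single regular expression.
MED_PATTERN = re.compile(r"(?P<drug>[A-Za-z]+)\s+(?P<dose>\d+)\s*(?P<unit>mg|mcg|g)\b",
                         re.IGNORECASE)

def extract_medications(text: str):
    """Pull (drug, dose, unit) triples out of free text so they can be
    stored as discrete, structured fields."""
    return [(m["drug"], int(m["dose"]), m["unit"].lower())
            for m in MED_PATTERN.finditer(text)]

note = "Started amoxicillin 500 mg three times daily; continue lisinopril 10 mg."
print(extract_medications(note))
# [('amoxicillin', 500, 'mg'), ('lisinopril', 10, 'mg')]
```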




For many healthcare providers, including those practicing at Pennsylvania-based WellSpan Health, natural language processing tools can be a game changing option for interacting with the electronic health record.


“Voice recognition can help move the EHR from its necessary function as a documentation tool for the business of medicine into a communication tool for the practice of medical care,” said R. Hal Baker, MD, Chief Information Officer and Senior VP of Clinical Improvement at WellSpan.


“The two functions are intertwined, but they are also distinctly different.  Voice tools and natural language processing make sure that our providers can convey meaning and context using the full breadth of the English language without succumbing to many of the challenges we often see with EHR use.” 


No matter how slick the interface or how intuitive the workflow is designed to be, there is something of an inherent flaw in asking nurses, physicians, and other providers to undertake multiple complex tasks at the same time. 


Providers are expected to give their undivided attention to the patient while simultaneously synthesizing the input into a diagnosis or treatment protocol and transcribing everything into a perfect narrative – all within the ten or fifteen minutes scheduled to address what often turns out to be multiple concerns.




“Everyone’s capacity for attention is limited,” stressed Baker.  “That’s why we tell people not to text and drive.  You simply cannot stay focused on both tasks at once, and one is a lot more critical for safety and getting where you’re going than the other.”


“Texting and treating is exactly the same.  You’re asking a provider to manage two discordant tasks at the same time.  They compete for focus, and as a result you’re going to miss important parts of both.”


Few other professions require such competence with multitasking, he added.


“I know very few board chairs or senior executives who try to type notes when they’re running a business meeting,” Baker said.  “It’s not what they’re in the room to do.  It’s the same with healthcare providers – being able to sit at the keyboard and type notes is very rarely what attracted these people to medical school or nursing school.”


The result is often disillusionment with healthcare as a calling, and a daily battle with burnout. 




Nationally, burnout is reaching epidemic levels, with one recent survey indicating that 83 percent of healthcare organizations are struggling with how to keep their providers from feeling overwhelmed by documentation requirements and administrative burdens.


“Time has turned into the currency of healthcare in the modern era,” Baker said.  “Right now, the amount of time spent looking at a screen and clicking a mouse is becoming unsustainable.  We need to start employing new strategies to solve the problem.”


Voice recognition tools can be an important component of that new approach.  “The promise of voice recognition goes beyond dictating clinical notes as a replacement for typing or hand-writing them,” he said. 


“In a perfect world, we’ll be able to have a narrative conversation without even thinking about how it’s being recorded.”


At the moment, natural language processing tools are not quite sophisticated enough to completely replace traditional interactions with the EHR, but Baker believes they are not all that far off. 


“Products like Alexa, Siri, and Google Home have shown us that voice recognition with AI behind it can do a pretty good job of following verbal instructions,” he asserted. 


“As the industry refines those capabilities, it’s becoming much less of a leap to think that I’ll be able to say, ‘We are going to put Mrs. Smith on 500mg of amoxicillin, four times a day for seven days.  Send that prescription to the Walgreens on Queens Street.’”


The meaning of those two sentences is relatively simple, and existing virtual assistants may be able to carry out such a directive – assuming they achieve HIPAA compliance in the near future.
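Baker's dictated order hints at what the NLP layer must do: map the sentence onto the discrete fields an e-prescribing system expects. A toy parser covering only this sentence shape (the patterns and word lists are hypothetical):

```python
import re

FREQ_WORDS = {"once": 1, "twice": 2, "three": 3, "four": 4}
NUM_WORDS = {"seven": 7, "ten": 10, "fourteen": 14}

def parse_order(utterance: str) -> dict:
    """Turn a dictated prescription sentence into discrete order fields.
    A hypothetical grammar that handles only this phrasing."""
    order = {}
    if m := re.search(r"(\d+)\s*mg of (\w+)", utterance, re.IGNORECASE):
        order["dose_mg"], order["drug"] = int(m[1]), m[2].lower()
    if m := re.search(r"(\w+) times a day", utterance, re.IGNORECASE):
        order["times_per_day"] = FREQ_WORDS.get(m[1].lower())
    if m := re.search(r"for (\w+) days", utterance, re.IGNORECASE):
        order["duration_days"] = NUM_WORDS.get(m[1].lower())
    return order

print(parse_order("We are going to put Mrs. Smith on 500mg of amoxicillin, "
                  "four times a day for seven days."))
# {'dose_mg': 500, 'drug': 'amoxicillin', 'times_per_day': 4, 'duration_days': 7}
```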


“But if I were to enter that order into the EHR myself, it would take me somewhere between 8 and 15 clicks and several keystrokes,” Baker pointed out. 


“It would be a major benefit to my relationship with my patient if I could simply say that sentence out loud, confirm it with the patient, and continue our conversation without losing my focus on the person in front of me.”


The industry is only at the very beginning of being able to achieve that vision, but voice tools and NLP are already improving the patient-provider relationship and reducing frustrations with the electronic health record.


At WellSpan, the combination of Nuance voice recognition technologies and an Epic EHR installation is fostering more natural and equitable conversations between patients and their clinicians.


“The goal is to do things with the patient, not to them,” said Baker.  “I’m a primary care provider by background, and when I dictate my notes in front of the patient, he or she gets to hear what I’m saying and make sure that it’s correct.  If I’m wrong, I can just go back and fix the error right there with their confirmation.”


“It’s a much more cooperative approach – not to mention a more efficient one.  I can talk to both the record and the patient at the same time, so I don’t have to walk out of the room and recount the entire visit again at some later time.  That lets me spend a greater percentage of my time in the patient’s presence.”


WellSpan complements its collaborative approach by participating in OpenNotes, which gives patients the opportunity to access their entire record through a patient portal at their convenience.


“We find that being transparent with the patient from the beginning of the documentation process is a significant benefit,” noted Baker.  “People feel more invested in their care, and even more confident in their provider and their data because they participated in the process of creating their own record and they have experienced their provider listening to them.”


“Patients have a baseline expectation that they’re being listened to, but there are a lot of situations where that isn’t completely evident.  It’s very clear that they are the provider’s priority when they’re hearing their story repeated back to them.  It’s a much different experience than asking the patient to wait quietly while the provider puts his head down and types for five minutes.”


Providers and patients aren’t the only ones who benefit from voice-based dictation tools. The documentation itself is often of higher quality, and may be more useful for analytics downstream. 


“When you ask someone to fit a story into a form, you are going to lose part of the essence of the narrative.  There are times when point-and-click is very good for collecting data, but it’s not the only way we should be creating documentation,” he said.


“No one writes a novel by pointing and clicking through a template, and I doubt anyone would want to read one that was written that way.  It constrains the readability and accuracy of the ideas you’re trying to convey, and it doesn’t allow for the provider to share their thought process with the next person who’s interacting with that documentation and that patient.”


Allowing natural language processing tools to identify important elements within the text and extrapolate those into structured data formats allows providers to interact more naturally with the EHR without sacrificing on data quality, Baker said.


What about human medical scribes?  They too have been growing in popularity among providers who want to reduce the cognitive strain of multitasking without waiting for virtual assistants to gain more advanced abilities.


The American College of Medical Scribe Specialists estimates that the profession is poised to see explosive growth. 


Approximately 15,000 scribes were working in hospitals and ambulatory settings in 2015, the organization said.  But by 2020, that number is anticipated to rise to around 100,000 as clinicians seek extra help with documentation requirements.


“Scribes can be a very viable option,” Baker readily acknowledged, “especially because humans still have a better ability to interpret subtleties of language than a virtual assistant.  Scribes have been used effectively in several settings to improve the efficiency of providers, and they can play a valuable role.”


“But I believe the patient-provider dynamic subtly changes when there’s a third person in the room,” he added.  “The sense of confidentiality changes, through no fault of the scribe themselves.  You could think about utilizing a human scribe remotely, through video or audio, but then you are running into new questions of data privacy and security, not to mention infrastructure investment.”


Scribes are also only human, he continued, and run the same risks of being distracted as the provider.


“In contrast to people, computers are eternally vigilant,” said Baker.  “They don’t accidentally tune out; they don’t think about what’s for lunch.  Computers might make mistakes, but we can go back into the records and look at exactly what the mistake was and why it was made – and we can improve their programming so that they won’t make that mistake again.”


“It would be a very different world if we could do that with humans, but we can’t.  So there’s an advantage there to using virtual assistants or ambient computing devices that can take some of the variability out of the equation.”


The healthcare industry still has some work to do before voice-activated virtual assistants are a routine member of the care team, but natural language processing tools are already helping many providers have less stressful interactions with their EHRs. 


“Voice has untapped potential to keep improving the provider experience, as well as the patient experience,” said Baker. 


“I believe this is a very good place to be putting the creative energy of healthcare, because provider exhaustion and burnout are affecting nurses, physicians, and just about everyone else involved in care right now.  We need creative solutions, and I firmly believe voice-based tools are going to be a major part of that process.”

Scooped by VAB Traductions

The Difference Between Big Data and Smart Data in Healthcare

The healthcare system’s digital makeover has aimed to help providers work smarter instead of harder, but there are plenty of stakeholders who firmly believe that the electronic health record revolution has completely missed the mark. 

The disappointments stemming from barely usable and scarcely interoperable EHRs are well known by now: a lack of actionable data for patient care; convoluted workflows that put patient safety at risk; hours added on to beleaguered physicians’ long and difficult days just to complete quality reporting and documentation requirements.

Providers feel chronically overwhelmed with an endless stream of high priority tasks for improving quality, managing populations, and scraping savings from new efficiencies, yet they are also woefully uninformed, as CMS Acting Administrator Andy Slavitt recently pointed out.

“Physicians are baffled by what feels like the ‘physician data paradox,’” Slavitt said earlier this spring. 

“They are overloaded on data entry and yet rampantly under-informed. And physicians don’t understand why their computer at work doesn’t allow them to track what happens when they refer a patient to a specialist when their computer at home connects them everywhere.”

Spotty health information exchange and insufficient workflow integration are two of the major concerns when it comes to accessing the right data at the right time within the EHR. 

A new survey from Quest Diagnostics and Inovalon found that 65 percent of providers do not have the ability to view and utilize all the patient data they need during an encounter, and only 36 percent are satisfied with the limited abilities they have to integrate big data from external sources into their daily routines.

On the surface, more data sharing appears to be the solution.  If every organization across the care continuum allows its partners to view all of its data, shouldn’t providers feel better equipped to make informed decisions about the next steps for their patients?

Yes and no.  As the vast majority of providers have already learned to their cost, more data isn’t always better data – and big data isn’t always smart data.  Even when providers have access to health information exchange, the data that comes through the pipes isn’t always very organized, or may not be in a format they can easily use. 

Transparency is Key for Clinical Decision Support, Machine Learning Tools


"Clinical decision support and machine learning vendors should prioritize transparency of the data, methodologies, and algorithms underpinning the recommendations of their tools."


Top 12 Ways Artificial Intelligence Will Impact Healthcare


"Artificial intelligence is poised to become a transformational force in healthcare. How will providers and patients benefit from the impact of AI-driven tools?"


Success of AI in Healthcare Relies on User Trust in Data, Algorithms


"Developing trusted, non-biased, and clinically validated artificial intelligence tools will be an ongoing challenge for data scientists and healthcare providers."



May 07, 2018 - From prediction to diagnostics to population health management, artificial intelligence holds unprecedented promise to revolutionize the way healthcare providers and patients interact with data.


Some diagnostic and clinical decision support algorithms that leverage machine learning, deep learning, neural networks, and other AI strategies are already making headlines by competing with human providers in terms of accuracy. 


These early-stage tools are heavily curated and hawkishly supervised – and few have yet reached broad implementation in the real-world clinical setting.


For artificial intelligence to flourish in the wild, however, developers must establish a firm foundation of trust in their algorithms’ accuracy, objectivity, and reliability.


That might present some challenges, said experts from Partners Healthcare, presenting at the recent World Medical Innovation Forum on Artificial Intelligence.


READ MORE: Top 12 Ways Artificial Intelligence Will Impact Healthcare


While the panelists firmly believe that the industry is close to making the leap from pilot to practice, ensuring that AI in healthcare is transparent, appropriately regulated, and implemented in a meaningful manner will be one of the industry’s most pressing concerns.


“Where we are with AI right now reminds me of the early days of genetics,” said Anthony Rosenzweig, MD, Chief of the Cardiology Division at Massachusetts General Hospital.  “There were a lot of genetic associations or hypotheses that didn’t replicate.  There were a lot of false positives, and a learning curve to figure out what the standards should be.  We are very much in that same phase.”


“We need to figure out how to test these algorithms, and what hoops they need to jump through in order to be validated so we can avoid that same lack of reproducibility that we had in early genetics.  We need some sort of vetting process so we know which tools we want to take forward into clinical applications.”


Understanding exactly how these algorithms are making decisions is the first step in that process of validation and vetting.  It’s also one of the most difficult.


Bias is a problem in all analytical tasks.  Data is often selected based on specific criteria, and those criteria may exclude or oversample certain features based on the curator’s unconscious leanings towards an expected solution or hypothesis.


READ MORE: How AI Can Cut Costs, Uncover Hidden Opportunities in Healthcare


Because AI tools must use training datasets designed by humans to feed frameworks for future decision-making, those unintentional biases can creep into the results and may become enhanced as the algorithm reinforces its own learning.
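The way an unintentional sampling bias can masquerade as a clinical finding can be sketched in a few lines of Python. The scenario, groups, and response rates below are entirely hypothetical:

```python
# Hypothetical illustration: a "model" that predicts treatment response
# by memorizing per-group response rates from its training sample.
# Group B is under-represented, so its estimated rate is unreliable.

def train_response_rates(records):
    """Estimate P(responds) per group from labeled training records."""
    totals, hits = {}, {}
    for group, responded in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if responded else 0)
    return {g: hits[g] / totals[g] for g in totals}

# True response rate is 60% in BOTH groups, but the curated training
# set contains 100 group-A patients and only 5 group-B patients.
training = [("A", i % 5 < 3) for i in range(100)]   # 60 of 100 respond
training += [("B", False)] * 4 + [("B", True)]      # 1 of 5 responds

rates = train_response_rates(training)
print(rates["A"])  # 0.6 -- matches reality
print(rates["B"])  # 0.2 -- a sampling artifact that looks like "non-responders"
```

An algorithm trained on such a table would conclude that group B rarely benefits from the therapy, exactly the kind of self-reinforcing disparity Rosenzweig warns about.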


“If we start feeding data in, including genomic data which obviously contains race and ethnicity information, there’s the possibility that the algorithms could spit out a result saying, ‘people who look like this don’t respond to a particular type of therapy,’” said Rosenzweig.


“That could just reinforce some of the disparities in healthcare related to socioeconomic status or other biases in the system. The hope of AI is that we can overcome those disparities and improve the distribution of quality care more widely.  We just have to be attentive to the fact that in comparison to traditional statistical modeling, the processes within something like a neural network are really hidden from view.”


So-called “black box” tools are difficult to avoid in the artificial intelligence world, where the inner workings of algorithms are exceedingly complex and not always easily explicable to anyone other than a highly trained data scientist.


That leaves clinical end-users with the difficult task of balancing skepticism with confidence.  Many times, they must do so without the benefit of understanding exactly what training data was used or how to gauge the reliability of the end result according to some agreed-upon standard.
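One common way to probe a black-box scorer without access to its internals is to perturb one input at a time and watch how the output moves. The toy risk model and features below are hypothetical illustrations, not a real clinical algorithm; production explainability tools include permutation importance, SHAP, and LIME:

```python
# Sensitivity probe of a "black box" risk scorer: nudge each feature
# and measure the change in the score.

def black_box_risk(patient):
    # Opaque to the clinician; a hidden weighted sum stands in for the demo.
    return 0.5 * patient["age_decades"] + 2.0 * patient["smoker"] + 0.1 * patient["bmi"]

def sensitivity(model, patient, feature, delta=1.0):
    """Score change when one feature is nudged by `delta`."""
    perturbed = dict(patient)
    perturbed[feature] += delta
    return model(perturbed) - model(patient)

p = {"age_decades": 6, "smoker": 1, "bmi": 28}
for feature in p:
    print(feature, round(sensitivity(black_box_risk, p, feature), 2))
# "smoker" moves the score most per unit change in this toy example
```

Probes like this do not open the black box, but they give end-users a rough, model-agnostic picture of which inputs dominate a recommendation.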


READ MORE: Medical Imaging is Healthcare’s Artificial Intelligence Bellwether


“There are currently no measures to indicate that a result is biased or how much it might be biased,” explained Keith Dreyer, DO, PhD, Chief Data Science Officer at Partners Healthcare and Vice Chairman of Radiology at Massachusetts General Hospital. 


“We need to explain the dataset these answers came from, how accurate we can expect them to be, where they work and where they don’t work.  When a number comes back, what does it really mean?  What’s the difference between a seven and an eight or a two?”


The challenge becomes even more complicated when moving into areas of medical innovation that have direct connections with patients, Dreyer added.

“If an algorithm that can detect melanoma is available on a patient’s smartphone, and it gives them a risk score of some type, what do they do next?” he asked. 


“This is changing healthcare so much that we really need to rethink not just the number it spits out, but also how we deliver care, how we pull data back into the system, analyze it, and make sure it’s accurate and has some value and meaning.”


Trust in data is equally important for medical devices that may be used to monitor patients in the inpatient setting, in the home, or even from inside the body. 

“The level of trust required is going to depend on the context,” said Rosenzweig. 


“If you’re using AI to help you debate whether to put an implantable cardioverter defibrillator (ICD) in someone who might be at risk of cardiac arrest, the level of certainty you need to have is relatively high.”


Many medical devices now include “smart” features that adapt and learn to predict risks and alert providers to potential adverse events, said Calum MacRae, Vice Chair for Scientific Innovation and Chief Executive of the One Brave Idea team at Brigham and Women’s Hospital.


“There are lots of ways in which devices impact care as sensors and delivery mechanisms, as well as distributed storage and computational platforms,” said MacRae. 


“There are fairly sophisticated algorithms that are already present in most conventional, implantable devices – at least in cardiology.  You can imagine how those might end up being platforms for much broader implementation of different technologies.”


If the analytics powering these devices is compromised by bias – or worse yet, by a security flaw that allows a malicious entity to inject a fault into the AI’s reasoning – patient safety may be at risk.

“As we expand the Internet of Things through all of our biomedical devices, there is an obligation on the manufacturer’s side to work with us on the security implications of that,” asserted Gregg Meyer, Partners Healthcare’s Chief Clinical Officer. 


“The algorithms, the hardware, the software – all of them can have vulnerabilities that need patching up.  It’s important to say that collaborating around that is going to be an expectation that we all have to have of each other moving forward.”


Software for medical devices, as well as applications for other clinical functions, tends to be upgraded much more frequently than hardware, Dreyer said.  While there are regulatory processes surrounding software upgrades for devices, AI might throw a wrench into the current paradigm.


“I’m not sure the FDA is ready for self-learning devices: software that can update itself continuously, or update itself differently at multiple locations depending on the perceived need,” he said.  “We’re clearly missing regulatory requirements that are necessary to manage some of this innovation.”


“Anyone can create an algorithm right now.  Is the FDA going to take on the challenge of saying that they’re safe and effective?  There are a lot of broad questions here that need answers.”


The FDA won’t be the only one that has to undertake some self-examination in the age of artificial intelligence, pointed out Anne Klibanski, MD, Chief Academic Officer at Partners Healthcare and Co-Chair of the 2018 World Medical Innovation Forum.


“What we have right now is probably a very unrealistic set of expectations based on what these things can do,” she said.


“There is an assumption right now for many people that if you’re going to be doing this type of work, it’s going to be 100 percent accurate – and it’s going to replace every other type of decision-making.  I’m not sure that’s realistic, or that it is ever going to be realistic.”


When AI falls short of that lofty bar, it can immediately erode trust – sometimes to the point of abandoning an initiative altogether, she continued.


“For example, accidents happen all the time,” she said.  “There are a lot of them.  But if an accident happens with something that is driven by AI…that is it for a lot of people.  That’s the end of the story.”


“And that creates some challenges around innovation.  So we have to be reasonable about expectations in terms of diagnostics based on what people can do and the expectations around an algorithm.”


Creating the right level of expectation, stamping out unintentional bias, and fostering a sense of trust in transparent and clinically validated artificial intelligence tools will help to develop an ecosystem that successfully integrates AI tools into reliable decision-making.


“In reality, the bar for AI is not very high, in one sense,” said MacRae.  “Right now, there’s a 12 to 15 year implementation cycle from the time something becomes a clinical guideline to the time it is more or less uniformly adopted.”


“Once we start to recognize that even in the best organizations, the current level of implementation is woefully inaccurate, then we start to realize that just having a system of standardized decision-making – any system – is probably a unique advantage.”


“As we start to think about how we can improve on that state of affairs, it’s easy to see how AI might impact clinical care in a very real way and in a very short amount of time.  We just have to make sure we’re improving on what already exists.”




How Do Artificial Intelligence, Machine Learning Differ in Healthcare?


If the futurists, visionaries, and venture capitalists are to be believed, artificial intelligence is right on the cusp of becoming the most important breakthrough for healthcare since penicillin. 

Self-driving cars, scary-smart advertising, and virtual home assistants are just the beginning, they proclaim.  Hospitals must brace themselves for a complete transformation – a revolution – a total makeover of every aspect of patient care.

If the amount of money riding on “artificial intelligence” breakthroughs is any guide, it won’t be long before the first chat bot bursts through the barrier of the Turing Test.

A new report from MarketsandMarkets pins the healthcare artificial intelligence sector at $7.98 billion by 2022, expanding at a striking compound annual growth rate (CAGR) of 52.68 percent over the forecast period.
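For readers unfamiliar with the metric, a compound annual growth rate compounds like interest. A short sketch, using illustrative numbers rather than the report's own base-year figures, shows how quickly 52.68 percent per year adds up:

```python
# Compound annual growth: end = start * (1 + CAGR) ** years.
# The starting value of 1.0 below is an arbitrary illustration.

def project(start, cagr, years):
    """Market size after `years` of compound growth at rate `cagr`."""
    return start * (1 + cagr) ** years

# A market growing at 52.68% per year roughly octuples in five years:
print(round(project(1.0, 0.5268, 5), 2))  # about 8.3x the starting size
```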

Machine learning powerhouses like Google, IBM, and Microsoft will continue to stretch their lead in the lucrative healthcare AI market, the report predicts, as they develop and refine the deep learning techniques that are already being applied to pathology, predictive analytics, and precision cancer care.

All three industry leaders have recently made headlines for innovative machine learning and artificial intelligence projects focused on specific healthcare use cases.

Microsoft is tackling cancer, vision problems, and imaging analytics, while Google recently published research on the role of machine learning in pathology and cancer diagnosis. 

IBM has committed extensive cognitive computing resources to imaging analytics, genomics, pharmaceuticals, and population health management.

These veterans may face some stiff competition from up-and-coming companies receiving millions in venture capital investments, however.  According to a 2016 report from CB Insights, healthcare AI startups are beating out companies in every other industry in terms of the volume of completed deals.


How Healthcare Can Prep for Artificial Intelligence, Machine Learning


The term “artificial intelligence” often conjures up visions of apocalyptic landscapes decimated by hyper-intelligent machines with a penchant for destroying societies foolish enough to place their trust in autonomous robots and android workers. 

While this bleak vision of the future is still firmly in the realm of science fiction novels and summer blockbuster movies, recent advances in artificial intelligence (AI) and machine learning are leaving some to wonder if Isaac Asimov’s Three Laws of Robotics are going to become applicable to everyday life sooner rather than later.

Self-driving cars, implantable medical devices, the ubiquity of smartphones and wearables, the first hints of programs that can pass the Turing test, and the financial incentive to drive automation into every imaginable business process are all bringing excitement and optimism – and more than a little trepidation – to developers across every economic sector.

The healthcare industry represents a particularly significant opportunity for machine learning to prove its value.  The sheer volume of available medical knowledge has long since outstripped even the most intelligent clinician, requiring supercomputers just to keep up with the latest best practices and big data breakthroughs in genomics, predictive analytics, population health management, and clinical decision support. 

Machine learning, natural language processing (NLP), and artificial intelligence are quickly becoming foundational components of the quest to keep ahead of the data tsunami while adhering to the most important law for robots and human healthcare practitioners alike: first, do no harm.

How are these tools already helping providers to produce better outcomes for patients, how will they evolve in the near future, and what steps should the industry take to integrate AI into the care process without fearing a disastrous big data backlash?


Video: Exploring the Promises of Artificial Intelligence in Healthcare


Few data science concepts have brought as much excitement and anticipation to healthcare as artificial intelligence and machine learning. 

With an almost limitless capacity to revolutionize how patients, providers, payers, and other stakeholders make decisions and interact with each other, AI is poised to become a transformative technology within a very short timeframe.

In the clinical environment, AI-driven decision support tools are already improving patient safety and finding their way into radiology, pathology, oncology, and other data-rich specialties. 

On the administrative and operational fronts, smart algorithms are identifying new opportunities for greater efficiencies, creating stronger patient-provider relationships, and alleviating some of the burdens of health IT use.

How will artificial intelligence evolve to help meet the needs of patients and providers in an increasingly complex healthcare environment?

At the 2018 World Medical Innovation Forum presented by Partners HealthCare, experts from across the healthcare industry shared their visions for the future of artificial intelligence with HealthITAnalytics.com.


What Is the Role of Connected Health in Patient Engagement?

Connected health and digital health tools have the power to enhance patient engagement, patient access to care, and patient empowerment.
By Sara Heath

As healthcare technology continues to permeate the medical industry, experts are continuing to evaluate how connected health tools can impact patient engagement and the overall patient experience.

As noted in a 2013 study in the International Journal of Medicine, connected health includes a broad set of definitions.

“Connected Health encompasses terms such as wireless, digital, electronic, mobile, and tele-health and refers to a conceptual model for health management where devices, services or interventions are designed around the patient’s needs, and health related data is shared, in such a way that the patient can receive care in the most proactive and efficient manner possible,” the researchers explained.

Top connected health tools include telehealth, remote patient monitoring tools, wearable technology, secure messaging tools, mobile apps, and other digital tools that help connect patients to their providers.

Connected health is useful because it bridges the gaps between patient, provider, and patient health. Connected health has received much praise in recent years, with healthcare experts saying these tools can revolutionize the patient experience with care. Entire connected health departments have emerged in hospitals across the country, and increasingly organizations are investing in their own connected health tools.

But how exactly does connected health impact the patient? What are the benefits these tools have on patient care and patient experience?

Although connected health touches on a large number of patient care points, providers can look to two central patient experience domains: patient access to care and patient empowerment. Through various modalities, connected health tools improve patient care access and self-efficacy, working to deliver on central tenets of patient engagement.

Connected health supporting patient access to care

Foremost, connected health allows patients to connect with their medical providers more quickly and conveniently than ever before. One top tool for digital patient care access is telehealth.

Patients and providers alike can use telehealth through two central modalities. First, patients can use direct-to-consumer telehealth, through which they speak to their providers via video conference. From there, providers conduct an assessment and usually provide a diagnosis and treatment plan.

Second, providers can connect with one another to share consults, expertise, and knowledge during patient care. For example, a provider in a rural, free-standing emergency department might consult a clinician in a larger, city-based hospital. The hospital clinician may have knowledge, equipment, or specialist expertise that the rural provider does not, making the telehealth consult ideal for patient care.

Patients have reported high satisfaction with telehealth, as the technology makes it easier for them to access basic healthcare needs on convenient technology platforms. In some cases, it allows patients to access care when they otherwise would have gone without.

In fact, patients are asking for access to telehealth more and more, according to a June 2017 survey from the Advisory Board. Seventy-seven percent of patients have requested access to some sort of telehealth because the tools increase their access to care.

Although telehealth is growing in popularity, it is certainly not the only form of connected health that supports better patient access to care. Secure messaging tools (many of which are hosted on patient portals) allow patients to message their providers when they have a medical question.

Secure messages are not stand-ins for provider care, but they can mitigate patient concerns. In some cases, secure messages can let a patient know that they do not necessarily need to visit the doctor’s office, saving both patient and medical industry time and money.

Driving patient empowerment with connected health technology

Benefits of connected health go beyond the logistical factors – they also improve the way patients interact with and perceive their healthcare. When patients manage their own health using connected health tools, patient empowerment and self-efficacy increase.

Improved patient empowerment is usually the result of using wearables, remote patient monitoring devices, or diet and fitness apps (although self-efficacy improvements are not limited to those tools).

The above-mentioned technologies put patients in charge of their own care. For example, a fitness wearable helps a patient set her own fitness goals and track her progress toward those goals. A remote patient monitoring system will alert a diabetic patient when his sugars are too high, allowing the patient to make his own adjustments.
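At its simplest, the alerting pattern described here reduces to a threshold check on incoming readings. The thresholds and readings below are hypothetical placeholders for illustration, not clinical guidance:

```python
# Minimal sketch of a remote-monitoring alert: classify each glucose
# reading against simple, patient-specific thresholds (mg/dL).

def check_glucose(reading_mg_dl, low=70, high=180):
    """Return an alert label for one glucose reading."""
    if reading_mg_dl < low:
        return "ALERT: low"
    if reading_mg_dl > high:
        return "ALERT: high"
    return "in range"

for reading in (65, 110, 240):
    print(reading, check_glucose(reading))
```

A real remote patient monitoring system layers trend analysis, provider notification, and device integration on top of checks like this, but the patient-facing feedback loop is the same: a reading arrives, a rule fires, and the patient can act.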

Increases in patient empowerment occur because these tools directly connect patients with their providers’ efforts to improve wellness, according to David Albert, MD, founder and CEO of AliveCor, a mobile ECG technology developer.

“We’re entering a new phase of patient centricity,” Albert said in an interview with mHealthIntelligence.com. “Patients are becoming more and more responsible for their own healthcare, and that creates opportunities for engagement.”

Connected health tools make patients accountable for their own health actions by providing an immediate view of where patients are falling short on their wellness path. Connected health can also help patients adjust their efforts.

Surveys have found that patients are intrigued by connected health tools because they help promote self-efficacy. A 2017 Accenture survey found that patients want convenient access to care, better patient education, and avenues to manage their care on their own.

Patients say providers are in charge of delivering these tools but are currently falling short. Only 21 percent of providers have offered their patients access to connected health, patient respondents said. This comes despite the fact that 44 percent of patients said they’d use connected health at their provider’s behest. Thirty-one percent of patients said the same of payer encouragement.

But as healthcare professionals continue to see the benefits of connected health, they are continuing to adopt these tools into their practices. Seventy-one percent of providers have adopted telehealth, for example. More providers are looking into how patient-facing tools can enhance patient care experiences as well.

With the rising tide of consumerism in healthcare, providers are likely to continue their foray into connected health. These tools will allow patients to more easily and conveniently access their care while playing a key role as an arbiter of wellness.


Can Clinicians Test Patient Health Literacy in Patient Portal Use?

The test will help providers target which patient portal clinician notes yield low patient comprehension, ultimately gauging patient health literacy.
By Sara Heath

Researchers from the University of Massachusetts Medical School have developed a tool to help healthcare professionals test digital patient health literacy, as reported in a new study published in the Journal of Medical Internet Research.

Patient health literacy is an important component of patient engagement as it allows patients to read, understand, and use the various terminology related to their health. Health literacy as it relates to personal health records (PHRs) and patient portal use is essential, the UMass researchers said.

“Patients with limited health literacy may struggle to understand the content of their medical notes, which can include visit summaries with medical terms, lab reports, and terms and phrases that are not common outside of medicine,” the research team wrote. “A patient’s health literacy can have an impact on their desire to engage with their own PHR.”

However, patient health literacy is at a serious low, the researchers pointed out. Only 12 percent of patients have at least proficient health literacy levels, per HHS statistics. About 50 percent of patients have said they do not understand at least one term on their medical record problem lists, the UMass investigators reported.

Low patient health literacy can have serious consequences, including a lower probability that a patient will use a digital health tool. Considering the financial reimbursements tied to patient portal uptake, this could have negative impacts on hospitals.

“Low health literacy can impact a patient’s ability to communicate with their health care providers and to navigate and understand complex EHR information,” the researchers said. “Given the prevalence of low health literacy in the population, tools that effectively assess a patient’s health literacy are needed for both research and practice.”

While healthcare professionals work to improve patient education and health literacy, they must also create tools for targeting those efforts – it is not always helpful to deliver literacy education to a patient who already has high health literacy. As a result, medical professionals have developed numerous tests to ascertain patient health literacy levels.

But most of those tests do not relate to patient portal or PHR use, the UMass researchers said. A more specific tool is necessary to fully improve patient digital health literacy.

The UMass researchers have filled that gap by creating a viable patient health literacy test. While that test will eventually help measure improving levels of patient health literacy, the researchers have thus far only developed the tool.

The team began its work by identifying six common diseases or conditions for the hospital’s population, including heart failure, diabetes, cancer, hypertension, chronic obstructive pulmonary disease (COPD), and liver failure.

From there, the team narrowed down the top EHR notes that flag the most patient comprehension errors. The researchers used those notes to create a 55-item test, entitled ComprehENotes, that would help to identify where patients had patient portal health literacy issues.

“These questions are general enough to be applicable to a wide variety of individuals while still being grounded in specific medical concepts as a result of the hierarchical clustering process,” the researchers pointed out. “In contrast with existing tests of health literacy, ComprehENotes was developed by generating questions directly from real patient de-identified EHR notes. Key concepts from the notes were identified by physicians and medical researchers as part of the question generation process.”

As noted above, there are numerous tests analyzing patient health literacy levels, the researchers conceded. However, ComprehENotes is the first test of its kind to directly assess how patients understand and use the medical jargon in their patient portals. Specifically, the test is well-positioned to pick up on abnormally low levels of health literacy.

“The test is most informative at low levels of ability, which is consistent with our long-term goal of identifying patients with low EHR note comprehension ability,” the researchers said. “Most of the questions have low difficulty estimates, which makes the test appropriate for screening for low health literacy.”
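The researchers' point that the test "is most informative at low levels of ability" is a standard property in item response theory: under a Rasch (one-parameter logistic) model, an item yields the most information about examinees whose ability sits near the item's difficulty. A brief sketch with hypothetical difficulty values:

```python
import math

# Rasch (1PL) model: the probability of answering an item correctly
# depends on the gap between examinee ability and item difficulty.

def p_correct(ability, difficulty):
    """Probability of a correct answer under the Rasch model."""
    return 1 / (1 + math.exp(-(ability - difficulty)))

def item_information(ability, difficulty):
    """Fisher information of one item at a given ability level."""
    p = p_correct(ability, difficulty)
    return p * (1 - p)

easy_item = -2.0  # a low-difficulty question, like most on this test
print(round(item_information(-2.0, easy_item), 3))  # peaks where ability == difficulty
print(round(item_information(+2.0, easy_item), 3))  # little information for high ability
```

Because information peaks where ability equals difficulty, a test built mostly from low-difficulty items measures low-literacy patients precisely, which is exactly the screening goal the researchers describe.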

As noted above, the researchers have thus far succeeded in developing the health literacy test. Going forward, they hope to introduce the test to real patients and create some disease-specific assessments.

“This work is a first step toward being able to evaluate patients’ understanding of their health based on information directly contained in their own EHR,” the researchers concluded. “These personalized questions can be administered to patients to evaluate their ability to read and comprehend their own notes.”

Scooped by VAB Traductions

Consumers Support AI in Healthcare More than Other Industries

April 16, 2018 - While the general public is more at ease with artificial intelligence (AI) being used in healthcare than in other industries, many consumers still worry about lack of human interaction and data security, according to a survey from SAS.  

When presented with a variety of real-world scenarios, a large proportion of the 500 participants was more comfortable with AI in medical settings than in banking or retail domains.

Sixty percent of survey respondents said that they would be comfortable with their doctors using AI to analyze their medical data to inform treatment decisions.
Scooped by VAB Traductions

Artificial Intelligence Promises a New Paradigm for Healthcare


Artificial Intelligence is poised to overhaul the healthcare industry, bringing success to those who can adapt quickly to the new care delivery paradigm. 


The speed with which artificial intelligence has become an inescapable component of every new technology offering is either highly encouraging or deeply troubling, depending on the observer’s zeal or trepidation around the notion of integrating data-rich algorithms into the complex and highly personal practice of caring for patients. 


Unease over AI is still common, and perhaps somewhat justified as researchers start to turn well-controlled pilots into commercialized deployments of diagnostic tools, clinical decision support systems, and workflow optimization aids.


Many of these offerings must still earn the trust of clinicians, especially those who question the underlying integrity and potential biases of the data upon which these algorithms were trained.


Yet many more are showing truly astonishing results that are leaving providers clamoring for access, eager to take advantage of more intuitive workflows while gaining the ability to harness the knowledge of thousands of their colleagues and the experiences of millions of patients.


Regardless of where any individual or institution falls along the enthusiasm spectrum, it is becoming increasingly clear that nothing is going to stay the same once the healthcare industry hits its AI event horizon – and that moment may be coming very soon.


“We’re seeing an ever-increasing number of cases that have the potential to truly transform the way providers care for patients,” said Samuel Aronson, Executive Director of IT at Partners Personalized Medicine.


“And there is a great deal of activity around open-source health innovation platforms that are focused on making it cheaper and more efficient to build, secure, validate, deploy, share, and ultimately network these applications across institutions.”


“What we’re really focused on now is the last-mile problem associated with enabling the clinical process transformations required to get algorithmic-enabled care more broadly rolled out in healthcare.”


A staggering number of those potentially game-changing innovations were on display at the World Medical Innovation Forum (WMIF) on Artificial Intelligence, presented by Partners HealthCare. 


From interpreting lab tests, simplifying in-vitro fertilization, and personalizing cancer care to monitoring surgical video in real-time or aiding in the diagnosis of pneumothorax, the creativity and ingenuity evident in the flourishing AI research community is both astounding and heartening.


If brought to scale and implemented appropriately, each of these concrete solutions for specific use cases is likely to save dozens, hundreds, or thousands of lives over time. 


Such accomplishments will certainly be nothing to sneeze at.  But artificial intelligence promises the healthcare system something that cannot be counted on the scale of individuals.


Machine learning, deep learning, neural networks, natural language processing, and all of the other components of the AI ecosystem are poised to bring about a complete change in the paradigm, from how doctors are trained to how they make decisions to how they deliver care.


The AI revolution is already well underway in other industries, panelists at the three-day event pointed out repeatedly. 


Mentions of autonomous vehicles popped up in nearly every session, while references to how consumer technology giants like Apple, Amazon, Uber, and Google integrate AI into their smartphones, search functions, apps, and other offerings were nearly as frequent.


Innovation in these sectors is moving at lightning speed due to high demand and fierce competition to become the lifestyle platform of choice for consumers craving convenience and on-demand services.


“In the consumer space, if you pull out your smartphone and search for an object inside your pictures, you can identify thousands of different features,” explained Keith Dreyer, DO, PhD, Chief Data Science Officer at Partners. “We don’t think of the fact that artificial intelligence is doing that.  We just see a smarter phone.  When AI works, you don’t call it AI anymore.”


Healthcare is uniquely positioned to take advantage of what AI has to offer, and many members of the care continuum are eager to see data-driven changes to the way they are now delivering care. 


Few clinicians argue with the assertion that the recent explosion of health-related data has turned every decision into a minefield of uncertainty.  Providers must agonize over what they might be missing among the gigabytes of information they’re supposed to consider. 


And no matter how much data seems to be available, it is difficult to be sure that the information on the screen is complete, up-to-date, and accurate.


Coupled with outcomes-based reimbursement structures, increasing pressure on their time and attention from documentation requirements, and the ever-present threat of legal action when a decision goes awry, these demands make bringing some algorithmic intelligence into the process seem like a natural fit.


But the healthcare industry is structured much differently than the environment in which a handful of purely profit-driven behemoths now roll out comprehensive AI-driven updates at their user conferences and trade shows. 


Healthcare’s mission is different, and its tech culture is, too.


There is no such thing as a “healthcare 3.0” operating system that can be force-installed into every workflow during an overnight system update.


HHS and the Office of the National Coordinator have made some attempt to bring such an approach to the industry through the Certified EHR Technology (CEHRT) program, which requires participants in meaningful use – now known as Promoting Interoperability – to adopt the same set of core health IT functionalities from approved vendors.


But that process has taken years, and the version known as the 2015 Edition won’t even start to be required until 2019 – a lag in adoption that would be nearly unthinkable in any other industry.


“Getting to the point of complete [artificial intelligence] ubiquity is going to take a long time in healthcare,” Dreyer predicted, and he wasn’t alone in expressing reservations about the speed at which AI will be able to take hold. 


Many healthcare organizations perennially lack the “activation energy” to adopt new workflows, said Aronson, due to the daily demands of simply getting through the day.


“Everyone is running flat out,” he said.  “When you talk about bringing AI or any other technology into the organization, clinicians have to work with you to define the new workflow, and they have to validate it – often site by site – before they implement it.” 


“If they are working totally flat out, it’s difficult for them to find the time to invest in that, even if they know it’s going to save them time after the process is complete.”


Free time isn’t the only problem, added Neil de Crescenzo, CEO of Change Healthcare.


“In addition to the activation issue, there are some institutions that just don’t have the resources,” he said.  “Even if they could free up the time, they simply don’t have the expertise in-house to make it happen.” 


“At the same time, we’ve seen an incredible surge in the size of institutions over the past few years,” de Crescenzo continued.  “Many [health systems] now have fifty, seventy-five, or a hundred ambulatory offices all over the community.” 


Those newly expanded networks may still be struggling to create cohesion among disparate communities that could be using a dozen or more different electronic health records.  


Working to unify that fragmented environment is typically the top priority, and many organizations are simply not prepared yet to take the next step.


“In order to start thinking about how to deploy AI and machine learning consistently across all of those component entities, you need to be a sophisticated, mature organization.  Right now, it’s not practical for a lot of these institutions.”


Many of these organizations could benefit from partnering with technology companies that offer out-of-the-box services and comprehensive platforms that require little to no additional development from the client, he added. 


“What we’re seeing is that they’re asking us to start fishing for them instead of teaching them to fish,” he said.  “Maybe only for a little while – maybe in five years or ten years, they’re going to want to take it in-house.”  


“But right now, if they’re going to take advantage of what AI can bring them, they’re going to have to give it to someone else to do the things they don’t have the time or skill to do right now.”


Outsourcing the heavy lifting to a dedicated business partner could be a promising way for many organizations to reap the benefits of artificial intelligence without investing in becoming miniature innovation factories.


The convergence of value-based reimbursement, changing consumer attitudes, and an epidemic of clinician burnout has left providers who cannot develop their own solutions desperate for tools that will recharge their “activation energy” and restore a sense of joy to patient care.


Market analysts predict enormous growth opportunities for artificial intelligence products: a recent Research and Markets report forecast a 60 percent compound annual growth rate (CAGR) for the healthcare AI sector until 2022.
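To put that growth rate in perspective, a 60 percent CAGR compounds very quickly. A short sketch illustrates the math; the $1.0 billion base-year market size here is a hypothetical placeholder, not a figure from the Research and Markets report:

```python
# Illustrate how a 60% compound annual growth rate (CAGR) compounds.
# The $1.0B starting market size is a hypothetical placeholder.
cagr = 0.60
market = 1.0  # billions of dollars, hypothetical base year

for year in range(1, 5):
    market *= 1 + cagr
    print(f"Year {year}: ${market:.2f}B")
```

At that rate a market multiplies by roughly 6.5x over four years, which is why investor interest in the sector is so intense.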


In the first quarter of 2018 alone, data analytics and clinical decision support tools, many of which include a machine learning or AI component, attracted nearly $1.2 billion from investors eager to pounce on the next “unicorn” company to rear its head above the herd.


In such a feverish environment, as non-traditional players enter the market and nearly every health IT vendor and developer under the sun seems to be offering an AI-driven product, it can be difficult for organizations to cut through the hype and exuberant marketing.


A healthy skepticism when evaluating new tools and platforms is essential, but so too is the recognition that it’s only a matter of time before the majority of health IT tools have at least some sort of AI thread running through them.


That future state of affairs is nothing to dread, the experts stressed.  Physicians and other care providers shouldn’t harbor too much trepidation about being made obsolete by omniscient, infallible robot colleagues who never take vacation days.


“Sometimes we do talk in extremes when we’re discussing AI and machine learning,” said Dan Burton, CEO of Health Catalyst. 


“It’s either ‘we’re not going to have any physicians at all anymore – it’s just going to be the machines,’ or we swing the other way towards ‘it’s all horrible and we should never use machines.’  Of course, the truth is almost always somewhere in the middle.”


Uncertainty, and even fear, is a natural response to a concept as fundamentally revolutionary as artificial intelligence is promising to be, agreed Anne Klibanski, MD, Chief Academic Officer at Partners.


“But if we look back in history to the first Industrial Revolution, the exact same themes were there,” she observed.  “Machines will ruin everything; jobs will go away; people will never be the same.”


“And when we started exploring genetics and genomics, there were many, many people who voiced considerable concerns about what it would mean to be human when we had that kind of knowledge.”


The human experience has certainly changed since the Age of Steam, and most would argue that the progress has been, on the whole, a positive development. 


Even in the few short years since the mapping of the genome, science has accelerated drastically – not necessarily to fundamentally alter what it means to be alive, but to offer new hope for patients and families around dozens of previously incurable or undiagnosable diseases.


Within the workflows of physicians and care teams, artificial intelligence brings similar opportunities to augment, not supplant, the irreplaceable value of the clinical mind.


And in the administrative and operational spheres, hopes are incredibly high that AI can smooth out some of the most jagged pain points afflicting providers, patients, payers, and the care delivery process.


The trick to achieving a positive AI outcome will be to develop trustworthy, accurate algorithms that can be widely deployed to address major shortcomings in the status quo.


Many of those problem areas generate huge amounts of financial waste and leave patients with significant gaps in their care, said A.G. Breitenstein, Partner at Optum Ventures.


“AI is already out there to some degree, but it’s not great AI.  It’s being used in all sorts of consumer contexts with variable success.  We’ve all seen the problem of patients going to Doctor Google and scaring themselves with what they find, for example.”


“That’s part of a $50 billion problem of overutilization of the ED and underutilization of primary care,” she pointed out. 


“The solution to that isn’t necessarily telling people they should go to primary care instead of the ED.  The solution is intercepting the patient at the first point of contact – when they sit down to their computer – and using better algorithms to give them more clinically validated, more rigorous data in the places where they’re looking for answers.”


Using AI to get upstream of developing problems is a common theme across the research and development ecosystem. 


Preventing adverse drug events, alerting providers to patient deterioration, stratifying patients by risk for population health management programming, supporting chronic disease management, identifying metastasizing cancers, and predicting kidney function decline are just a small handful of the hundreds of use cases for predictive AI.


In the life sciences arena, artificial intelligence has the potential to carve out a piece of the $500 billion spent on wasted interventions, said Colin Hill, CEO of GNS Healthcare, to HealthITAnalytics.com during the WMIF event.


“There’s at least half a trillion dollars wasted when patients aren’t getting the right drug, or they’re not in the right care management program, or they didn’t get the right procedure or medical device,” he said.


“That’s a staggering cost, and it’s a massive opportunity to change health outcomes by getting down to the fundamentals of slowing disease progression, reducing hospitalizations, and optimizing therapeutic effectiveness.”


Precision healthcare will be “the most important application” of AI, he asserted.


“If we can better match health interventions to individual patients, we can improve outcomes and lower the total cost of care.  Part of that requires the discovery and development of new drugs.  And it also means discovering the biomarkers that can help stratify patients into responding and non-responding subpopulations.”


“We need to understand what works for whom, what the underlying mechanism is, how a biological system operates, and how an intervention impacts that.  Machine learning is critical for that process, because it helps us advance from grappling with the little pieces of what we think we know and presents more insights than humans can typically uncover.”


Genomic data and electronic health record data will be critical for that process, added Noga Leviner, CEO of Picnic Health.


“Genomics and the EHR are a perfect use case for AI,” she said.  “You’ve got huge volume, but you also have some weak signals within that.  Humans are just not going to be as good as algorithms are at teasing those signals out.” 


“As we layer more and more types of data on top of that, it’s only going to become more obvious that doctors using their brains alone aren’t going to be as good as brains combined with algorithms.” 


AI could cut even more costs and free up human brain power on the administrative side of the industry, said Change Healthcare’s de Crescenzo.


“I’m not sure anyone knows exactly how many people work in utilization management for payers and providers,” he said, “but no one has ever argued when I estimate it’s at least 100,000 people.  They’ve got an average salary of about $80,000 a year.  That’s $8 billion spent on managing utilization.”


“What if we could take a conservative estimate of 30 percent of those people and use AI to free them up to work on other things?  Many of them are nurses or other clinicians or caregivers – that’s $2.4 billion in direct savings, not to mention the savings from bringing more clinical expertise back to patients without adding net-new personnel.”
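De Crescenzo’s back-of-the-envelope arithmetic is easy to verify; all of the inputs below are his stated estimates, not audited figures:

```python
# Back-of-the-envelope utilization-management (UM) savings estimate.
# All inputs are the speaker's rough assumptions, not audited data.
staff = 100_000          # estimated UM workers across payers and providers
avg_salary = 80_000      # estimated average annual salary, USD
freed_fraction = 0.30    # conservative share of roles AI might free up

total_spend = staff * avg_salary               # total annual UM payroll
direct_savings = total_spend * freed_fraction  # payroll freed by AI

print(f"Total UM spend: ${total_spend / 1e9:.1f}B")   # $8.0B
print(f"Direct savings: ${direct_savings / 1e9:.1f}B")  # $2.4B
```

The $2.4 billion figure captures only the direct payroll side; the quote’s larger point is that many of those workers are clinicians whose time could return to patient care.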


Routine tasks like patient scheduling, supply chain management, and booking operating rooms or testing equipment could also benefit from some automation, said Katherine Andriole, PhD, Director of Research Strategy and Operations at the MGH & BWH Center for Clinical Data Science (CCDS).


“An MRI scanner is a very expensive piece of equipment, and letting it sit idle is like throwing money down the drain,” she said.


“There are more occasions than we’d like where a patient shows up to have an MRI, but it turns out they’re contraindicated. Right now, the process for gathering that information before the appointment is to call them.  Sometimes the patient doesn’t know, or gets something wrong, and sometimes we don’t actually connect with them at all – but we still keep that appointment booked.”


“Why can’t we use data to start identifying the patients likely to have a contraindication before we put them through that entire process?  If we can use AI to predict a problem before we waste resources and time on both sides of the patient-provider relationship, we’re going to create better experiences for everyone.”

Improving the patient experience is an imperative that healthcare providers cannot afford to ignore, agreed Breitenstein.


Leaving a positive impression on consumers will be especially important as healthcare moves out of the hospital and into the ambulatory setting, the retail clinic, the smartphone, or the home.


“The original hospitals were places where people went to die.  They were the last stop.  We started to build care around that, which is at the root of a lot of our systemic problems,” she explained. 


“As we start to think about turning the hospital model into one that centers on managing health over time, that need for the physical plant changes dramatically.  The central role of the hospital is starting to break apart, which enables an entirely new wellness environment – an entirely new driver for preemptive care."


“We don’t have to wait for the patient to get sick and present themselves anymore.  Now, we can intervene before they end up in the ED or go to see the specialist.  Illness is usually detectable to an algorithm before it is detectable to a patient.  The fact that we wait long enough for someone to acknowledge that they should get some help is an artifact of the traditional notions we have about healthcare.”


Flexible care delivery and the ability to get further and further upstream of costly diseases that reduce quality of life for patients may help to salvage some of the consumer goodwill that has drained away due to cumbersome interactions with their providers and mounting costs.


“One of the places we find the most stored-up kinetic energy is around consumer frustration,” Breitenstein said.  “To be honest, satisfaction with the healthcare system is about on par with the prison system.  In addition to that, we have people paying $1,000 or $1,200 out of pocket every time something goes wrong.”


“When you have a very information-aggressive cohort of young people starting to age into their first experiences with illness on that scale, it’s going to drive a lot of change.  We need to move towards a more on-demand system that allows for easy payment, access to scheduling, and more of a self-guided experience of the healthcare system.  AI can help us enable that.”


As artificial intelligence becomes more deeply integrated into these areas of opportunity, healthcare providers will have to make at least a few changes to the way they interact with their technologies and their patients, the experts cautioned.


“Remember that only a few short years ago, biostatistics and data analytics were considered new skills for doctors to learn,” said Klibanski. 


“AI is going to be a fundamental core curriculum that physicians are going to need to know.  They need to know what to trust and what not to trust.  And they need to really understand what they’re being asked to do.  Without those capabilities, they’re either going to trust everything or deny everything, and we need to be somewhere in between.”


Finding the middle ground between blind faith and outright skepticism will be critical for ensuring that clinical decision support tools are as effective as possible. 


When users of these tools find that optimal balance, the impact could be staggering, said Burton of Health Catalyst.


“Some of the most interesting products coming out are clinical decision support tools that inform the perspective of someone like a radiologist,” he said.


Every diagnostician consumes information slightly differently based on their unique experiences and training, which could lead to disagreements in complex cases or the need to consult colleagues when an unusual presentation appears, he explained.


“How much could those conversations be improved with input from algorithms that are trained on 100,000 patients over here or 200,000 over there – and then 50,000 patients that had a slightly different set of characteristics that present a third option you might not have thought of?” Burton queried. 


“To me, that is something that marries the decision-making strength of a clinician with the scale that technology can bring.”


Integrating AI results into consults with colleagues might only be the first step in changing the foundations of clinical practice.


Klibanski even hinted that the entire medical education system might be in line for a makeover as artificial intelligence starts to reveal novel associations between genetics, biological systems, and the interventions applied to patients.


“We have traditionally trained primary care doctors, specialists, and subspecialists as different areas and different disciplines,” she said. “Everything is very organ-focused or disease-specific.”


“But once we have a very broad and perhaps different understanding of diseases and disease pathways, we might actually think about training physicians in a whole different way. Some of the traditional disciplines may not accurately match with the way we’re going to approach diseases in five or ten or twenty years.”


Healthcare providers won’t just have to interact differently with their data, their textbooks, and their colleagues, said Joshua Gluck, VP of Global Healthcare Technology Strategy at Pure Storage. 


Whether developing their own tools or purchasing platforms or services, healthcare organizations will need to reevaluate their partnerships with legacy technology providers to achieve the level of scale required to succeed.


“It’s important to find a strategic partner that can really support these discoveries and do it in an innovative way,” Gluck said.  “Having an infrastructure and a data platform to build upon and scale to the magnitude of data points and disparate datasets that they need to combine to generate those insights is vastly important.”


“Some of the traditional players in that market space are taking the technology they’ve developed over the past several years and trying to get that to scale out."


"You can do that for a short amount of time, but just the amount of data and the amount of systems that have to be integrated to support some of these workflows and pipelines, especially in the genomics space…if you don’t rethink the way these structures work, they just won’t scale.”


Organizations that commit to overhauling their infrastructure, in addition to embracing innovative workflows and evolving relationships, are likely to be among a select group of entities walking along the path to success, said Aronson from Partners HealthCare. 


“There are two different futures in front of us,” he said.  “In one future, we take these tools that are incredibly powerful, and we develop them based on traditional business models.  That leads us to new infrastructure that does represent a substantial jump forward."  


"But once that happens, I believe we will just wind up at a different plateau based on new proprietary systems that will replace the old proprietary systems.  Innovation will slow down again.”


“The other future depends on everyone putting in the significant amount of effort required to fund and deploy open business models."


Those models may be more complex, and Aronson acknowledged that they may be riskier for businesses to engage in. 


However, "I do believe they will also be much more profitable," he said.


“If we can figure that out, then we wind up with a scenario where we will all be able to innovate on top of each other’s work,” he envisioned.  “We’ll be able to use as much open data as possible, and we’ll start to be able to think about a truly continuously learning healthcare system.”


Healthcare stakeholders have the opportunity, right now, to decide which future will win out, he stressed.


“The future depends on who’s in the room and what they decide,” he said. 


“It depends on as many of us as possible standing up and advocating for open, innovative models.  They might require more time to put in place, and they will require a lot of funding, but I truly believe they could bring some amazing benefits to humanity.”


Rescooped by VAB Traductions from Doctors Hub

Fitbit and Google Team Up on Digital Health Initiative


The partnership between Fitbit and Google has the potential to revolutionize digital healthcare by enabling patients to link personal data with medical records.

Via Philippe Marchal
Rescooped by VAB Traductions from PATIENT EMPOWERMENT & E-PATIENT

Can the healthcare industry really expect to drive patient engagement? 

Increasing patient engagement as a tool for improving care and outcomes has been a tough code to crack for the healthcare industry. This is particularly true in comparison to the consumer engagement models adopted by companies such as Facebook, Amazon, and Netflix.
“It's unclear whether or not we'll ever achieve comparable levels of success,” said Paul LeVine, VP for analytic services for TrialCard, speaking at the Formulary, Copay and Access Summit in San Francisco April 11-12. During his presentation, “If patient engagement is the holy grail we all seek, what does it really provide us?,” he urged the time has come to adjust expectations and design targeted, timely, and simple interventions to address the inherent barriers to patient engagement.
“It is not necessarily a reasonable comparison to say we should be as good as Facebook,” LeVine explained. “We are never going to get that. If we can adjust our expectations, be smart, and design the kind of targeted approaches that play to our strengths and where the patient needs it the most, we are probably going to be better off.”
LeVine opened his presentation with a quote from Health Affairs that defines patient engagement: “Engagement generally captures the notion that patients are involved in the process of their care — actively processing information, deciding how best to fit care into their lives, and acting on their decisions.”
However, there is no universally agreed upon definition of patient engagement, which may be part of the challenge facing the healthcare sector. IBM cites 22 different reasons that compel patients to engage. These include health conditions, health cost planning, accessibility and availability, medical management, and social interaction. Athenahealth offers a more streamlined perspective: “Patient engagement is active collaboration between patients and providers.”
There is evidence the use of portals is increasing, LeVine noted. 
74% of patients are able to pay their bills through portals, compared to only 56% in 2013
45% of patients scheduled appointments through portals, compared to 31% in 2013
44% of patients now refill their prescriptions through portals, compared to only 30% in 2013
63% of patients communicate with providers through portals, compared with 55% in 2014
37% of portals provide patient-generated health data, versus 14% in 2013 
While the portal adoption rate is increasing, it is only when one compares it to engagement statistics from consumer platforms such as Facebook, Netflix, and Amazon that the disparity becomes clear. For example:
67% of Facebook users check the social media platform at least once daily (Statista, 2017)
23% of Netflix users stream something every day (Leichtman, 2017)
35% of Amazon Prime users shop on the site every week (Walker Sands, 2016)
“Those are some real engagement numbers,” said LeVine. “There are major issues to explore there.”
Patient versus consumer engagement


Patient engagement differs significantly from consumer engagement. One dilemma facing the healthcare industry is that patients' engagement is episodic by nature, noted LeVine. The consulting firm Deloitte identifies this in its 2015 consumer engagement report: Patients newly diagnosed are ravenous about finding out information about their condition, but that interest tails off once they are better educated. Similarly, functions such as choosing a PCP or a health plan – typical measures of engagement – are usually sporadic, rather than consistent ones.    
There are other hurdles to strong patient engagement, too. “Patients don't necessarily want to be reminded about their health condition. A lot of people want to push it away for a while and not be engaged,” explained LeVine.
Also, healthcare providers and payers have to be mindful of the unintentional impediments they put in the way of engaging with patients. LeVine cited a case in which a woman was listed as having a high-risk pregnancy. She became “the designated high-risk pregnancy lady,” and every time she checked her portal those were the first words she would see. This type of labeling can be a strong deterrent to an individual's engagement with a payer's or provider's portal. 
The overarching question LeVine posed is, “Should we hold patient engagement to the same standard we use for consumer engagement?”
Despite some interesting outliers, there's generally strong support for promoting engagement. According to Health Affairs, patient engagement is one strategy to achieve the "triple aim" of improved health outcomes, better patient care, and lower costs.
In a study conducted by Judith Hibbard at the University of Oregon, patients with the lowest activation scores (those with the fewest skills and the least confidence in actively engaging in their own care) incurred costs up to 21% higher than patients with the highest activation levels.
“Given its inherent limitations, what would we consider ideal attributes of a ‘patient engagement' intervention?” LeVine asked.
He described a case study on the underuse of co-pay cards and medication adherence. Conducted in 2016, the program used email and telephone interventions to re-engage diabetes patients. Using its QuickPath information platform, which connects HCPs, pharmacists, and payers with the patient through a variety of channels, including smartphones, wearables, text, EHRs, and eWallet platforms, TrialCard was able to significantly improve the percentage of patients activating and using a co-pay card, from 19% to 34.5%.
Yet despite those improvements, LeVine believes more can be done. “We're still not reaching enough patients because our methods aren't fully aligned with the reasons for their lack of engagement,” he explained.
So how can payers and providers promote higher levels of activation from patients? LeVine pointed to the model used by the entertainment industry. Netflix, for example, has been highly successful in identifying consumer preferences and behavior through analytics. In fact, more than 80% of the movies and TV shows users watch on Netflix are discovered through the platform's recommendation system, LeVine said, citing an article from Wired.
Netflix is not alone in using sophisticated analytics to improve engagement. According to McKinsey & Co., 35% of Amazon's revenue is generated by its recommendation engine.
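Recommendation engines of the kind Netflix and Amazon use are typically built on collaborative filtering. As a toy sketch (not either company's actual system; all item names and viewing histories below are invented), an item-based variant scores unseen items by how often they co-occur with items a user has already watched:

```python
from collections import defaultdict
from itertools import combinations

def build_cooccurrence(histories):
    """Count how often each pair of items appears in the same user's history."""
    co = defaultdict(int)
    for items in histories:
        for a, b in combinations(sorted(set(items)), 2):
            co[(a, b)] += 1
            co[(b, a)] += 1
    return co

def recommend(user_items, histories, top_n=3):
    """Rank items the user hasn't seen by total co-occurrence with items they have."""
    co = build_cooccurrence(histories)
    scores = defaultdict(int)
    for seen in user_items:
        for (a, b), count in co.items():
            if a == seen and b not in user_items:
                scores[b] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical viewing histories for three users
histories = [
    {"drama_a", "doc_b", "comedy_c"},
    {"drama_a", "doc_b"},
    {"doc_b", "comedy_c", "thriller_d"},
]
print(recommend({"drama_a"}, histories))
```

Production systems layer far more signal on top of this idea (ratings, recency, learned embeddings), but the co-occurrence intuition is the same.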
“We need to get better behavioral analytics,” LeVine said. “These are the key models for understanding what motivates patients to act. We have to get a whole lot better at using some of those learning models (patient activation model, transtheoretical model, motivational interviewing, and others) as ways of improving that type of engagement.”
“The healthcare industry,” stressed LeVine, “is considerably less advanced in deploying these types of behavioral analytic strategies than the consumer sector – but that doesn't mean we don't know a lot about what motivates patient behavior. We just have more we need to learn.”

Via Plus91, Lionel Reichardt / le Pharmageek
No comment yet.
Rescooped by VAB Traductions from PATIENT EMPOWERMENT & E-PATIENT

The Empowered Patient: Myth or Reality? Transcript + stats – #digitalhealth

Weds 2nd May, 10pm UK/Ire, 5pm ET. Co-host Marie Ennis O’Connor aka @JBBC. Transcript; Stats; WHO defines empowerment as “a process through which people gain greater control over decision…

Via Lionel Reichardt / le Pharmageek
No comment yet.
Rescooped by VAB Traductions from eHealth mHealth HealthTech innovations - Marketing Santé innovant

Artificial Intelligence in Healthcare – Do the Benefits Outweigh the Challenges?

As healthcare professionals, it seems we can’t escape the buzz and hype of artificial intelligence (AI) today. However, unlike other industries, healthcare’s adoption of AI is still in its infancy, in part, due to many providers still updating their tools and processes for the digital age. According to an Accenture analysis, growth in the AI health market is expected to reach $6.6 billion by 2021 and key clinical health AI applications can potentially create $150 billion in annual savings for the US healthcare economy by 2026.

AI provides industry players with a unique opportunity to not only offer tools and insights that can vastly improve patient care, but that also improve their bottom line. However, despite all the benefits and advantages of AI that we hear about, some remain skeptical and hesitant to jump on board, and quite frankly, are concerned about the challenges of AI in healthcare and just how much it will impact the industry.

Addressing and Overcoming AI Concerns

One concern among some healthcare providers and professionals relates to AI’s data collection and accuracy, as we are keenly aware that AI is only as good as the data it collects, and that AI must be implemented correctly in order to reach its full potential. The concern here is that since AI is built on deep learning, a technique in which computers learn through example and work to better understand and process complex forms of data, there is no real way to determine its inner workings, so providers have to rely on trust. Some providers on the other side of the fence, however, argue that AI is actually much faster and more accurate than humans.

Other AI fears relate to providers losing their jobs, especially radiologists. However, radiologists’ duties and responsibilities extend well beyond the tasks AI is being used for, and many argue that AI solutions are simply a supplement to their workflow. Embracing technology that supports patient outcomes is the benefit.

AI’s Aim to Improve Patients’ Lives

In the medical field, AI has the potential to diagnose diseases and illnesses through deep learning. According to Breastcancer.org, about one in eight U.S. women will develop invasive breast cancer during her lifetime; however, two-thirds of women have the potential to be saved through early detection and progressive treatments. In response, many medical facilities are turning to Digital Breast Tomosynthesis (DBT) technology solutions as their preferred method for screening and diagnostic mammography in order to do just that – detect and diagnose women with early-stage breast cancer.

Via Dominique Godefroy
No comment yet.
Scooped by VAB Traductions

How to Test Digital Health Literacy in Older Adult Patients


"Conducting digital health literacy assessments via the telephone yields more stable results for older adult patient populations.

October 30, 2017 - Administering digital health literacy assessments via the telephone presents an opportunity to understand older patient internet use and online health information seeking, according to a recent study in the Journal of Medical Internet Research.

More patients, especially those over the age of 50, are using the internet to access information about their own health than ever before. However, the breadth of digital information increases the likelihood that a patient will encounter misinformation. It is thus critical for healthcare professionals to understand the internet use habits and abilities of older patients.

“Greater Internet adoption has increased the availability of health information for consumers, yet disparities in access to relevant online health information persist, especially among users with insufficient skills to discriminate between credible and fraudulent online health information,” the researchers explained.

Administering a patient health literacy assessment is useful for understanding these patient characteristics. eHealth literacy assessments, such as the 8-item eHEALS assessment, look at patient abilities to seek, find, understand, and appraise health information from online sources.
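For context, the eHEALS instrument mentioned here scores its eight items on a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree), yielding a total between 8 and 40. A minimal scorer might look like the sketch below; the function name is illustrative, not part of any published tooling:

```python
def score_eheals(responses):
    """Sum eight Likert responses (1-5) into a total eHEALS score (8-40)."""
    if len(responses) != 8:
        raise ValueError("eHEALS has exactly 8 items")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("each item is rated on a 1-5 Likert scale")
    return sum(responses)

# Hypothetical responses from one patient
total = score_eheals([4, 4, 3, 5, 4, 3, 4, 4])
print(total)  # 31
```

How a given total is interpreted (e.g., what counts as "adequate" eHealth literacy) varies by study and is not part of the instrument itself.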

But efforts to obtain answers from older patients often hit roadblocks. Older patients don’t usually like to complete survey items on an electronic form or on a paper questionnaire.


Instead, conducting these surveys via the telephone proves effective, the research team has found.

In a test of nearly 300 patients over the age of 50, the research team found that telephone eHEALS assessments are as effective as the survey administered via other media.

“Assessing consumer comfort and self-efficacy in using technology to access online health resources can help identify skill gaps and gauge the likelihood that users will be successful when using the Internet to access relevant health information,” the researchers observed.

“Results from this study suggest that administering eHEALS to older adults via telephone produces a reliable measure with scores that possess sufficient construct validity evidence.”

The researchers also pointed out a few specific findings about the patient characteristics tested in eHEALS. Notably, the team found that three scales for eHealth literacy are correlated with one another.


Those three scales relate to utility of health information for patients and confidence in using health information. The correlated survey features include the following:

- I know how to use the health information I find on the internet to help me

- I know how to use the internet to answer my questions about health

- I feel confident in using information from the internet to make health decisions

“These three factors showed moderate to high correlations with one another,” the researchers explained. “The relationship between personal motivations for health information seeking and an individual’s perceived capability to use digital technologies can be affected by online environments with socially persuasive forms of media.”
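The inter-item correlations the researchers describe are ordinary Pearson coefficients computed across respondents. The sketch below shows the calculation on invented Likert responses (1 to 5) to the three items; none of the data reflects the actual study:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length response lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical Likert responses (1-5) to the three items, five respondents
use_info = [4, 3, 5, 2, 4]   # "know how to use the health information I find"
answer_q = [4, 3, 4, 2, 5]   # "know how to use the internet to answer my questions"
confident = [5, 3, 4, 2, 4]  # "feel confident using information to make decisions"

print(round(pearson(use_info, answer_q), 2),
      round(pearson(use_info, confident), 2))
```

Values near 1 indicate the items move together, which is what a "moderate to high" correlation among related survey factors means in practice.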

The eHEALS assessment also tends to capture more information about older patients with high digital health literacy than about those with low digital health literacy, the team found. Adding survey measures that glean information about patients with limited digital health literacy could help better target interventions to improve online information seeking.

“Among older adults, however, there is potential for additional underlying subscales to measure older adults’ confidence to locate, use, and evaluate online health information,” the researchers concluded.

“As older Internet users continue to visit online support groups and discussion forums to find new information about health care perspectives and experiences, it will be important to consider modifying the original eHEALS to adequately measure online health information-seeking behaviors in older populations.”


With more health information available via the internet, it will be important for healthcare professionals to understand how patients interact with that information.

A 2013 report from Pew Research Center found that 35 percent of patients have used an online resource to understand their health or obtain a diagnosis. Forty-six percent of those patients said their online self-diagnosis led them to seek medical attention, while 38 percent said they allowed the symptoms to run their course, Pew found.

These well-meaning online searches can have negative consequences. Patients with limited ability to discern quality online health information may misdiagnose themselves. And if those patients are part of the 38 percent who allow online diagnoses to run their course, they may face serious health risks.

Healthcare professionals must understand patient online search habits and how patients vet their online sources to counter these online health search risks. Providers can then equip patients with better search habits and resources by establishing a baseline assessment about patient digital health literacy."

No comment yet.
Scooped by VAB Traductions

Patient Engagement Technology, EHRs Influence Patient Satisfaction

"Use of patient engagement technology, access to full medical records via EHRs, and other digital communication tools are significant determinants of high patient satisfaction, according to a recent survey from Black Book.

Specifically, the way in which healthcare organizations use their inpatient EHRs to share patient health data and communicate with patients outside of the hospital is essential for a positive patient experience, the survey showed.

"Involvement with healthcare consumers through technologies is proving to be a significant element of patient satisfaction," said Black Book Research’s managing partner Doug Brown. "Healthcare consumers more frequently interact through electronic media in 2018, and while they value contact with their providers, they don't have the patience for lacks in hospital interoperability, incorrect billing and access to scheduling and results."

When medical organizations do not use their EHRs in a meaningful way, patients take note. Eighty-nine percent of patients under the age of 40 stated that they are currently dissatisfied with their organizations’ use of patient engagement technology.

Another 84 percent of patients said they are specifically looking for providers who use advanced health IT that helps patients communicate with their doctors and engage with their own health data.

Ninety-two percent of patients said that inability to immediately access their own medical records hindered patient satisfaction, and 85 percent said no access to telehealth was a dissatisfier.

"Patients expect and want to interact more with hospitals through digital channels like email, apps and social media rather than interacting on a traditionally personal level with clinical and financial back office staff," Brown explained.

Although patient engagement technology and effective EHR use are significant influencers for patient satisfaction, few hospitals are doing enough to support health IT. Seventy-eight percent of hospital respondents said they have not set their budgets to prioritize revamping their health IT offerings, including improvements in patient engagement, interoperability, and patient communications.

Hospitals are also falling short when it comes to revenue cycle management technologies, the report revealed. Revenue cycle solutions play an important role in the patient interaction because they can make or break the patient discharge and billing processes.

"The revenue cycle management channel of healthcare IT systems had the lowest positive experience," Brown noted. "Hospitals are taking steps to improve it, but they have a way to go."

Sixty-nine percent of healthcare consumers said the discharge process and insurance interactions can ruin an otherwise positive patient experience, the report noted.

"Part of this is probably due in part to patient expectations that have been set beyond most hospital's technological capabilities for interoperability with both other providers and payors," Brown pointed out.

For example, when patients pay in retail spaces, the checkout process is relatively easy. Patients know how much they will pay and can make the monetary exchange relatively pain-free. In healthcare, lacking price transparency and underdeveloped payment processes can cause a headache for patients.

When this happens, patients cannot distinguish between a faulty technology and incompetent hospital workers. Eighty-eight percent of patients said poor patient payment experiences are the fault of the hospital, not the EHR or the revenue cycle management tool.

Healthcare organizations face an increasing imperative to deliver a positive patient experience. As patients continue to assume more financial responsibility for their own healthcare, they will likely shop around to find providers who fit patient needs.

Additionally, amidst national calls for better patient engagement, medical organizations need to ensure they have stable means for communication and patient data sharing. Using an EHR tool that offers patients access to their own health data as well as communication with their providers will be essential to meeting national benchmarks."

No comment yet.
Scooped by VAB Traductions

Are digital pills really the new 'Big Brother'?

In November 2017, the Food and Drug Administration of the United States approved the first pill with a digital sensor. ‘Abilify MyCite’, used to treat schizophrenia, bipolar disorder episodes, and depression, comes with a sensor that digitally tracks whether patients have ingested their medication.

The pill is particularly useful for people with poor memory, and is hence likely to reduce the costs incurred through non-adherence to prescriptions. According to research, non-compliance with medication costs about $100 billion every year as patients continue to fall sicker and need more money to cover hospitalization costs. The digital pill is touted to have “the potential to improve public health,” said Ameet Sarpatwari, an expert from Harvard Medical School, in an interview with the New York Times.

George Savage, co-founder of Proteus Biomedical, the company that developed the technology, says the company’s drive is to address medical problems that stem from drug non-compliance. Savage adds that about 40% of hospital readmissions for heart failure happen because patients fail to take their prescribed medications properly.

Each pill contains an ingestible event marker (IEM), a microchip with a thin-film battery that is activated once ingested. Once swallowed, the non-toxic IEM sends a high-frequency electrical current through the body’s tissues.

However, certain experts have questioned the ethical ambiguity of the pill, given its ‘Big Brother’-like tendency. Patients who agree to take the medication can sign consent forms permitting up to four people, including family members along with their doctors, to get electronic data showing the date and time pills are taken.

Over the years, Proteus has raised about $400 million from investors, including Novartis and Medtronic, and has brought its sensor to commercial use.

AiCure, a smartphone-based visual recognition system that aims to “optimize patient behaviour and medication adherence,” has reported progress with tuberculosis patients. Similarly, Etect Rx, a company from Florida, has been working on another ingestible sensor, the ID-Cap, which is being tested with opioids and H.I.V. medication. Etect Rx aims to fit readers into watchbands or cellphone cases, with a receiver worn around the neck. With so much investment in digital pills and the technologies surrounding them, digital medications, while ethically questionable to some, seem to be the future of public healthcare.
No comment yet.
Scooped by VAB Traductions

ONC Issues Guidebook for Patient Access to Health Information


April 04, 2018 - HHS and the Office of the National Coordinator for Health IT (ONC) have issued a new guidebook to assist patient access to health information. The guidebook comes as part of the HHS and CMS MyHealthEData initiative.

The ONC Guide to Getting and Using Your Health Records is an online, patient-facing document that helps patients overcome the challenges they face in accessing their medical records. The guide covers obtaining a patient health record, checking the health record for accuracy and completeness, and using health records and data sharing for better patient engagement.

This guidebook, in conjunction with the MyHealthEData initiative announced last month, supports HHS’s goals for better patient engagement. Patients become more activated in their own care and are empowered to make healthcare decisions when they can obtain access to their own medical records.

MyHealthEData and patient access to health data also align with the 21st Century Cures Act. Patient access to their own medical records is a measure of interoperability, HHS and ONC contend, as is the ability of a patient to send her own medical data to another healthcare provider.

“It’s important that patients and their caregivers have access to their own health information so they can make decisions about their care and treatments,” ONC head Don Rucker, MD, said in a statement. “This guide will help answer some of the questions that patients may have when asking for their health information.”

No comment yet.
Scooped by VAB Traductions

Selfie Medicine: Phone Apps Push People to Take Their Pills - The New York Times

SEATTLE — Take two tablets and a selfie? Your doctor's orders may one day include a smartphone video to make sure you took your medicine.

Smartphone apps that monitor pill-taking are now available, and researchers are testing how well they work when medication matters. Experts praise the efficiency, but some say the technology raises privacy and data security concerns.

Selfie medicine works like this: Open an app on your phone, show your pills, put them in your mouth and swallow. Don't forget to show your empty mouth to the camera to prove today's dose is on its way. Then upload the video proof to the clinic.
No comment yet.
Scooped by VAB Traductions

Using Patient-Reported Outcomes Measures to Improve Engagement


"Healthcare organizations can address technology and cultural barriers to leverage patient-reported outcomes measures in their clinical workflows."


"Between the various quality reporting and reimbursement requirements in the value-based care landscape, it seems as though there is no element of healthcare left unmeasured. But as industry professionals embrace patient-centricity, many are turning to patient-reported outcomes (PROs) and patient-reported outcomes measures (PROMs) to examine how patients actually experience their care and outcomes.

Patient-reported outcomes are the metrics patients self-report about their own health outcomes, quality of life, or functional status. These reports come directly from patients rather than through a physician’s recounting of the patient’s health complaints or challenges.

PROs are “any report of the status of a patient’s health condition that comes directly from the patient, without interpretation of the patient’s response by a clinician or anyone else,” according to the National Quality Forum (NQF). “In other words, PRO tools measure what patients are able to do and how they feel by asking questions.”

Patient-reported outcomes measures are the mechanisms by which the healthcare industry can collect and analyze PROs in a shared, standardized way."

No comment yet.
Scooped by VAB Traductions

Do Health Data Security Concerns Influence Patient Data Sharing?


"December 05, 2017 - Patients need better assurances of PHI and health data security before opting into a health information exchange or other patient data sharing model, according to a study published in the Journal of Medical Internet Research.

Patient data sharing and HIEs are important aspects of healthcare, particularly as digital patient records have become the norm.

HIEs are tools that allow providers to access patient information from other, disparate providers for the purposes of treating a patient. They are especially useful in medical emergencies, when patients and providers do not have time to manually obtain patient records.

HIEs rely on patient buy-in, the research team from California State University Long Beach noted. Both opt-in and opt-out HIEs require patients to make conscious decisions about whether to share their PHI with other providers via an HIE.

“Patients’ information cannot be shared, unless patients agree to share via an HIE,” the research team explained. “The value of HIE, therefore, is directly related to the relative ease of sharing among providers, payers, and patients.”


While it is difficult to measure how many patients have or have not participated in an HIE due to variable statewide regulations, the researchers maintained that patients who do not participate in an HIE may experience care quality issues.

“Patients’ decisions not to share may result in medical errors and undesired health outcomes,” the researchers explained. “Our aspiration to understand the psychology behind patients’ decision comes from our desire to address barriers to sharing and enhance motivators of sharing to help patients make better choices for their own health.”

The research team looked at over 1,600 patient responses to the Health Information National Trends Survey (HINTS) to gain a better understanding of reasons why patients do not share their health data. The team combed through the data for patient responses about security concerns, patient activation, patient-provider relationships, and issue involvement, or how relevant a specific issue is to the patient.

The data showed that patient privacy was the biggest concern among patients considering sharing their health information. Patient activation, issue involvement, and patient-provider relationships also had impacts on patient decisions to share with an HIE, but to a lesser degree than security concerns.

Patient security concerns should reframe how healthcare providers approach patient education about data sharing, the researchers said.


“This finding provides practical implications to health care providers and policy makers of the significance of this concern,” the research team explained. “Health care providers and policy makers should prioritize their efforts and focus on addressing individuals’ privacy concerns. In addition, health care providers should invest in educating people on the privacy policies that protect patients’ information and privacy.”

There is currently a push toward educating patients about the benefits of patient data sharing, the researchers noted. Patient education campaigns usually focus on the patient safety and care coordination advantages of data sharing, which, while important, do not cover the full scope of necessary patient education.

“Our study shows that there should be a shift in patient education, with a more salient focus on addressing privacy concerns,” the researchers pointed out. “By making patients more aware of existing privacy policies and security measures in place, the health care providers are creating an environment where the patients are more likely to share their PHI, and therefore still able to achieve cost and error reduction benefits.”

The researchers also acknowledged the importance of patient activation during data sharing and HIE discussions. Patient activation is tied to propensity to share in an HIE, and is a static measure. Assessing patient activation at the start of a care encounter will help providers better target other efforts, the researchers said.

Overall, the researchers acknowledged the importance of both the patient and the provider being more informed during the data sharing discussion. Providers must better understand patient activation levels. This will help providers better target motivation strategies for opting into an HIE.


Likewise, providers need to expand patient education efforts to include information about data privacy on an HIE and other data sharing platforms. This will arm the patient with more information needed to make a decision.

“We suggest that physician education is as important as patient education,” the research team concluded. “Physicians who are aware of the dimensions of the patient-physician relationship can improve the said relationship, leaving the patient more prone to PHI sharing, achieving better medical decisions, reduction in medical errors, and cost benefits.”"

No comment yet.
Curated by VAB Traductions
ENG to FR freelance translator focused on patient empowerment through: ehealth - digital health - health IT - AI & medicine/healthcare - health literacy - health education - patient education - global health www.linkedin.com/in/VABtraductions