'Bad Bot' (part II) by @DrMurphy11

The story of a dangerously flawed #eHealth AI Chatbot.

A THREAD about #HealthTech #Hype, #AI #Governance & #MedicalDevice #PatientSafety.

Read on, or see the single tweet summary👇0/45
This thread follows on from a prior thread ('Bad Bot' - part I) & picks up the Babylon story from the #BabylonAITest event which was held on the 27th June 2018.

'Bad Bot' (part I) - 👇 1/45
The #BabylonAITest event was held at @RCPLondon & promised to reveal to the world Babylon's latest "ground-breaking research" & "the most exciting advances in Artificial Intelligence".

Dr Murphy was expecting Big Things from @babylonhealth...

2/45
The #BabylonAITest event kicked off with Ali Parsa talking about the company's 'mission statement'.

IMO - the company's focus on accessibility & affordability has been to the detriment of ensuring #PatientSafety - which is the focus of this thread.

3/45
Dr Butt took the stage for the BIG #BabylonAITest reveal.

Babylon claimed their #AI #Chatbot was more capable of passing the diagnostic component of the post-graduate @rcgp exam than a typical UK GP trainee.

Sounds good, doesn't it?
Maybe too good to be true?

Dr Butt 👇4/45
During the event, Babylon also demonstrated the science behind their AI, using some highly advanced wavy lines.

The message was clear. If you want to have good health, you had better follow the guidance of the Babylon AI...

Scientific wavy lines👇5/45
Ali Parsa (Babylon CEO) closed the #BabylonAITest event stating that the Babylon AI Chatbot could diagnose patients as reliably & safely as a doctor, and that the 'diagnostic' Chatbot was live & available to all...

Ali Parsa's sales pitch 👇6/45
Babylon took to @Twitter with their claim that the @babylonhealth AI Chatbot had the "safety & accuracy of a real-life doctor!"

Babylon also released a 'paper' that outlined the 'research' presented at the #BabylonAITest event.


7/45
On reviewing the @babylonhealth 'research paper' - significant flaws in the methodology were evident.

Was the 'diagnostic' AI Chatbot actually any good, or was the #BabylonAITest just an elaborate PR exercise?

Dr Murphy's non-academic review 👇8/45
Other Twitter users shared their opinion of the #BabylonAITest 'paper'.

Including Professor @EnricoCoiera (trained in medicine with a computer science PhD in Artificial Intelligence), author of "Guide to Health Informatics".

The Enrico review 👇9/45
Dr @EricTopol (currently leading The Topol Review: preparing the healthcare workforce to deliver the digital future) also provided his opinion.

Eric Topol's tweets 👇10/45
Was the new, much hyped, Babylon 'diagnostic' AI Chatbot as reliable & safe as @babylonhealth claimed?

Or was this as misleading as their prior '100% safe' Chatbot claim?

Dr Murphy put the AI Chatbot to the test.

A prior misleading claim👇 11/45
Test 2 #NoseBleed triage.

Chatbot diagnosis = it REALLY doesn't know what to do with a nosebleed.

Result = AI fail.

Bizarre #NoseBleed triage 👇13/45
Test 3 #PE triage.

Chatbot diagnosis = #Costochondritis & #BrokenAnkle.

Result = AI fail.

Another #DeathByChatbot 👇 14/45
The #BabylonAITest was quickly becoming a PR Car Crash.

A statement from the Royal College of General Practitioners @rcgp dismissed Babylon's claims as "dubious" - & the online media hype was replaced with more objective reporting.

Media coverage 👇15/45
mobihealthnews.com/content/uk-pra…
Despite the negative coverage - the Babylon PR machine continued to disseminate the over-hyped promotional claims.

"Babylon’s AI is on par with doctors"

Misleading Babylon promotional email 👇16/45
There was further embarrassment for @babylonhealth when @RCPLondon - who, according to Babylon, had collaborated in the #BabylonAITest event - also released a statement to distance themselves from the dubious claims.

Another bad day for Babylon 👇17/45
Babylon's PR Car Crash continued - with their Chief Scientist clarifying that the data presented at the #BabylonAITest event was in fact 'preliminary' & from a 'pilot study' - hence NOT practice-changing. 🤔

More red faces at Babylon👇18/45
It appeared that neither the diagnostic abilities of the Babylon AI Chatbot nor the AI Governance within @babylonhealth were robust.

Of concern, it appeared they had also cut corners with the @babylrwanda AI system deployed in Rwanda.

BBC Click 👇19/45
Then - on the 11th July, the @DailyMailUK dropped this bombshell...

"The new Health Secretary, @MattHancock - uses a controversial smartphone app, @babylonhealth - that offers virtual consultations with a GP."

Dr Murphy needs counselling 😕 20/45
Babylon were also in the news due to their involvement in the controversial @George_Osborne @EveningStandard - “money-can’t-buy” #FutureLondon advertorial scheme.

Babylon's PR panic👇21/45
However, even the well-oiled Babylon PR machine couldn't keep the safety concerns regarding the AI Chatbot out of the mainstream news media.

The @FinancialTimes reported on the flawed reality of the unvalidated @babylonhealth AI Chatbot...

👇22/45
Babylon's response to the reported #PatientSafety concerns was both dismissive & infantile.

@babylonhealth claimed the company was the victim of Twitter #Trolls motivated by vested interests...

#PatientSafety #Troll 👇23/45
At least the Babylon PR team could still rely on the @EveningStandard for some positive media coverage.

The benefits of an Advertorial deal 👇 24/45
Although Babylon had claimed their 'diagnostic' AI Chatbot was 'on par' with GPs, in reality the @babylonhealth Chatbot was flawed.

Some of the Chatbot triage algorithms could cause patients unnecessary distress.

e.g. Flawed #UTI triage👇 25/45
Other flawed AI Chatbot triage algorithms were amusing enough to make it into the @MailOnline.

e.g. Flawed #NoseBleed triage 👇 26/45
Worryingly, 18 months after #PatientSafety concerns had first been raised with the @CareQualityComm & @MHRAdevices, dangerously flawed AI triage algorithms continued to be a risk to patient safety.

e.g. #Cardiac #ChestPain triage👇27/45
Given the obvious flaws in the @babylonhealth #ChestPain algorithm - some of the media reports were a little alarming.

#DeathByChatbot👇 28/45
The clinical errors in the Babylon AI algorithms were so fundamental, they raised questions regarding the Corporate & Clinical Governance within @babylonhealth.

Could they be trusted to deliver healthcare?

#DeathByChatbot 👇 29/45
Although Ali Parsa talked of the need for 'testing of AI technology to ensure people aren't harmed', he hadn't tested his own Chatbot!

Bizarrely, he also intimated @MHRAdevices regulations & @CareQualityComm were a 'barrier'. 🤔

Listen to Ali👇30/45
The Babylon story then went from Bizarre to Surreal...

The new Health Secretary, @MattHancock, used his first speech to promote #HealthTech & specifically @babylonhealth @GPatHand 😲

Matt Hancock supports Babylon's Bad Bot👇31/45 bbc.com/news/health-44…
However, even with @MattHancock's support - the Babylon AI Chatbot remained dangerously flawed.

Sudden onset #ChestPain & #Breathlessness = #DeathByChatbot

Another Bad Chatbot triage👇32/45
In some cases, the @babylonhealth AI Chatbot would suggest completely random diagnoses.

For example: Productive Cough = #MultipleSclerosis. 🙄

This Babylon AI Chatbot was an #eHealth liability.

Would you trust this Bot?👇33/45
Concerns regarding the safety of the Babylon AI Chatbot were raised in Parliament - with @sarahwollaston questioning @MattHancock regarding the safety & regulation of #eHealth Apps.

The Bad Bot debates👇34/45
Thread to be continued...

Usual T&Cs apply 👇


