With one in two people experiencing trauma or adversity related to the COVID-19 pandemic, the global toll on mental health is – and will continue to be – profound.
That’s a key finding of the Inaugural Mental State of the World 2020 Report from Sapien Labs, which used the Mental Health Quotient online assessment tool to obtain data from roughly 49,000 people in the U.S., Canada, U.K., Australia, New Zealand, South Africa, India and Singapore.
The study found that the percentage of respondents with clinical-level risk jumped from 15 percent in 2019 to 26 percent in 2020, and that the U.S. and Canada have experienced the most severe mental health impact from COVID-19.
This suggests “a long-term fallout from the pandemic on the mental health front” with younger generations expected to be hit hardest, stated Sapien Labs founder and chief scientist Dr. Tara Thiagarajan.
It’s a bleak forecast, yet a bright spot is on the horizon: With the ongoing work of innovative Canadian start-ups like Montreal’s Aifred Health and Toronto’s Pentavere Research Group Inc., artificial intelligence is quickly changing how Canadian clinicians understand, diagnose and even treat mental health.
“Every day in Canada, 200 people wake up and try to commit suicide and 11 people succeed, and yet we know very little because we have very little data on these people,” said Aaron Leibtag, CEO and co-founder of Toronto-based Pentavere, a rapidly growing company that is applying AI to ensure data utilization in healthcare is on par with other industry sectors like finance.
“We know very little from a data perspective about patients who are dealing with horrible depression, patients who are dealing with suicidal ideation, patients who are going further and further down the hole of addiction, and the reason for that is so much of the data we have isn’t in drop-down boxes, isn’t in registries, it’s in the clinical notes of the psychiatrist,” said Leibtag.
Launched in 2016, Pentavere is on a mission to “quickly and economically” extract data from electronically inaccessible documents such as clinical notes, transcription texts, lab tests and diagnostic and pathology reports, allowing information to be more easily aggregated, analyzed and digested to support good decision making in mental health.
The idea was born in response to a devastating personal tragedy suffered by Pentavere CTO and co-founder Steven Aviv, whose mother died during a routine medical procedure because life-saving information was buried in a clinical note.
Five years later, the company’s proprietary AI engine – called DARWEN™ to reference both the evolution of health care (Darwin) and the need for real world evidence (RWE) to drive decision making – is successfully applying several AI models and methodologies to extract clinical information from digital sources with a very high degree of accuracy.
Emerging as an AI-as-a-Service model that can be applied to any healthcare discipline, DARWEN is empowering researchers and clinicians to gain a better understanding of patient populations without having to incur the time, cost and inaccuracies associated with manual chart review.
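Pentavere has not published DARWEN's internals, but the general technique of pulling structured fields out of free-text clinical notes can be illustrated with a minimal rule-based extractor. Everything below — the patterns, field names and sample note — is invented for illustration; a production engine would rely on trained NLP models, negation handling and clinical vocabularies rather than hand-written patterns:

```python
import re

# Illustrative patterns only -- not DARWEN's actual approach. A real engine
# would use trained language models and vocabularies such as SNOMED CT.
PATTERNS = {
    "medication": re.compile(r"\b(sertraline|fluoxetine|citalopram)\s+(\d+)\s*mg", re.I),
    "phq9_score": re.compile(r"\bPHQ-?9\D{0,10}(\d{1,2})", re.I),
}

def extract_fields(note: str) -> dict:
    """Turn one free-text clinical note into a flat, analyzable record."""
    record = {}
    med = PATTERNS["medication"].search(note)
    if med:
        record["medication"] = med.group(1).lower()
        record["dose_mg"] = int(med.group(2))
    phq = PATTERNS["phq9_score"].search(note)
    if phq:
        record["phq9"] = int(phq.group(1))
    return record

note = "Pt reports low mood. PHQ-9 today: 14. Continue sertraline 50 mg daily."
print(extract_fields(note))  # {'medication': 'sertraline', 'dose_mg': 50, 'phq9': 14}
```

Run over thousands of notes, records like this one can be aggregated into the kind of tabular dataset the researchers describe below — the hard part, which the sketch glosses over, is doing it accurately across messy, inconsistent real-world documents.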
A project under way in Hamilton, Ontario, is using the platform to extract information from roughly 30,000 medical records to better understand the impact of COVID-19 on mental health.
Dr. Zena Samaan, a psychiatrist at St. Joseph’s Healthcare and associate professor at McMaster University, said it would “not have been humanly possible” to comb through that many records to unlock data with the same degree of accuracy and consistency as DARWEN.
Researchers will be looking to understand what transpired over the past year by analyzing patient demographics; treatments received; changes in treatment; whether substance abuse started, ended or relapsed; suicide attempts or suicidal thoughts; and whether or not psychotherapy was received, for example.
“I can’t wait to get my hands on this data to see what is really happening,” said Dr. Samaan. “Did COVID make people’s depression worse? Did it make suicidal thoughts worse? Did they start to use more cannabis and alcohol? We think it did, but imagine the confidence you have in your conclusions when you’re basing it on thousands of people who are real people, not an experimental model.”
Researchers will also be looking to gather information from medication records, lab results, imaging and radiology in a tabular form that provides a rich picture of a patient’s healthcare journey, including the treatments they received and how they got to where they are today, she added. The information will be used to help design healthcare delivery models and improve outcomes for patients, and to look for predictors of relapse or death by suicide.
“It’s usually years before you can get even 10 percent that size of data collection,” said Dr. Samaan. “(With DARWEN) we can generate and answer much bigger questions than we could have answered if we had an individual human opening each file separately, reading what’s in it, and taking down the notes.”
Since the start of the pandemic, Dr. Samaan has been seeing a growing number of referrals and requests for mental health services, particularly for depression. The increase is not just in new-onset depression cases due to distress over loss of finances or employment stability, but also in known cases that have destabilized during the crisis. Substance use has also increased.
She’s excited about the possibilities ahead as DARWEN helps to paint a clearer picture of the impact of COVID.
“At the moment, the whole world is suffering from this pandemic and the effect of stress could be different in different people. But if those people have one thing in common – which is depression disorder – we want to see what kept the people who didn’t decline better,” she said. “If we can get a fast result while we are still going through this crisis, can we do differently? Can we do better? I’m very optimistic that when we find something actionable, services would be responsive to change.”
To better understand the uses and trends of AI and machine learning in mental health in Canada, the Mental Health Commission of Canada (MHCC) undertook an environmental scan, literature review and stakeholder map with the Canadian Agency for Drugs and Technologies in Health (CADTH) in 2019.
Maureen Abbott, manager of the MHCC’s Access to Quality Mental Health Services team, said the goal was to learn “more about the effectiveness and safety of AI and how best to support its integration into mental health.”
One finding that stood out from the collaboration is that algorithms used in AI are only as good as the data that forms them, and many of the existing training algorithms don’t reflect specific groups such as older adults, refugees, immigrants, ethnocultural or racialized populations, First Nations, Inuit and Métis people, she said.
“We’ve seen stories about face recognition software that doesn’t work with different ethnic groups … so you can imagine the same thing happening when you’re trying to predict a response to a treatment,” explained Chris Kamel, CADTH director of Health Technology Assessment and Rapid Response. “If the training data of the algorithm doesn’t reflect the diversity of the possibilities, then it’s less likely to be as effective for that diverse use in clinical practice.”
At the time of its assessment, CADTH found that most AI applications in mental health were still at the research and development stage, but rapidly evolving. The study included conversational agents or chatbots designed to support treatments like cognitive behavioural therapy, as well as emerging decision support tools or tools to predict responsiveness to treatment.
Abbott identified Kids Help Phone as a leader in the area, pointing to the organization’s recently launched navigation technology called Kip as an example. Kip is a chatbot designed to make it easier for anyone seeking mental health resources to explore and find support at their own pace.
“Organizations like Kids Help Phone use AI to detect words in chats and text messages that are then correlated to suicidal thoughts, and they’re then able to triage clients in that way,” she explained, noting that much of the work takes place behind the scenes. “Many of us have virtual assistants at home, so I think it’s only natural that AI for mental health is also happening and helping to make our processes, knowledge and prevention easier.”
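The triage idea Abbott describes — scanning incoming messages for language associated with suicide risk and prioritizing accordingly — can be sketched in its simplest possible form as phrase flagging. The phrase lists and tier names below are invented for illustration; real services like Kids Help Phone use trained models plus human counsellors, not a hard-coded list:

```python
# Simplest possible risk flagging -- illustrative only. Real services use
# trained language models and route everything through human counsellors.
HIGH_RISK_PHRASES = ("end my life", "kill myself", "no reason to live")
ELEVATED_PHRASES = ("hopeless", "can't sleep", "overwhelmed")

def triage(message: str) -> str:
    """Assign an incoming message to a queue tier based on its wording."""
    text = message.lower()
    if any(p in text for p in HIGH_RISK_PHRASES):
        return "urgent"        # connect to a counsellor immediately
    if any(p in text for p in ELEVATED_PHRASES):
        return "priority"      # move up the queue
    return "standard"

print(triage("I feel hopeless and overwhelmed"))  # priority
```

The value of the machine-learned version over this sketch is exactly what Kamel warns about below: language shifts quickly, and a static list misses phrasing it has never seen.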
One of the challenges in evaluating and understanding the effectiveness of AI in mental health is the speed at which the technology is evolving, added Kamel. “The development cycle is quite rapid,” he said. “The volume of different technologies and the pace with which they change will be a challenge for the healthcare system broadly.”
Rapid change is something Aifred Health can attest to. Not that long ago, the company’s four founders were McGill University students who came together as a club to look at the issue of how mental healthcare gets delivered in Canada. Now they are poised to start clinical trials of a first-of-its-kind clinical decision support tool in mental healthcare that uses AI to support better treatment management, matching patients to the right treatment.
And it’s all because a professor urged them to enter the global IBM Watson AI XPRIZE in 2016, a competition to accelerate AI technologies that are solving societal grand challenges.
“So, this very young team of brilliant McGill students entered the competition and the only reason they even incorporated in August 2017 was because it was a requirement of the XPRIZE,” said Aifred Health CEO Marina Massingham, who came on board in 2019 to assist the startup with financing and commercialization.
The original four founders – Dr. David Benrimoh, Sonia Israel, Kelly Perlman and Robert Fratila – all have leadership roles as the company forges ahead on its journey, now one of three global finalists in the competition with the $5 million prize to be awarded this June. Regardless of the outcome, Aifred is already making a difference, said Massingham.
“Today 90 percent of patients within Canada who have major depression only ever get to see their family doctor; they will not get to see a specialist or psychiatrist,” she said. “Family doctors are wonderful, but they are not well equipped in the treatment of complex mental health conditions. Our tool is designed to help physicians understand where the patient is in the disease cycle and, based on the behavioural information of that patient, what to do next.”
Aifred Health has been running clinical pilot projects in Quebec since October 2020. North American clinical trials are expected to start later this year, as the company begins the process of obtaining both FDA and Health Canada approval as a class II medical device. Aifred replaces the traditional ‘trial and error’ approach to treatment selection by using a deep learning model to predict the optimal recommended treatment for an individual patient. Predictive models are trained on high-quality and reliable clinical data sets and are continuously fed de-identified patient outcomes to further refine and improve their predictive capability.
Core to the solution is a clinician-patient application that works in a browser on any device, with a downloadable version expected soon.
Patients use the app to answer standard questionnaires about their mental state and quality of life, and their unique inputs are combined with the company’s proprietary clinical treatment algorithms to guide treating physicians as they decide the best course of treatment.
“It’s like having a consult with your colleagues where you need it: at the point of care,” said Dr. Fanny Hersson-Edery, a family doctor in Montreal and an early user of Aifred.
Like Dr. Samaan, Dr. Hersson-Edery is seeing a rise in the number of patients reporting anxious and depressive symptoms related to the pandemic.
Using Aifred in her practice gives her a better understanding of the effectiveness of treatment options, according to a patient’s specific symptomatology, and enables her to digest a patient’s responses to the questionnaires ahead of their scheduled appointment. An added benefit is that the tool keeps patients interested in their diagnosis and care.
“The decision-aiding tool is very helpful in closing gaps in terms of knowledge for the patient and the physician, and the fact that it’s a tool that you’re using together really engages the patient in a different way that wasn’t previously possible,” said Dr. Hersson-Edery.
“I think mental health sometimes suffers from this aspect of it’s not really a medical diagnosis but more of a social construct; ‘You didn’t try hard enough’ or ‘Pull yourself up,’” she added. “Having a tool that uses best evidence, as well as objective data, helps to reiterate the point that mental health is a medical diagnosis.”