In early 2018, Vesela Kovacheva, MD, PhD, an anesthesiologist for high-risk pregnancies at the Brigham, met with Erin McKenna, Brigham Ignite’s program director, at Mass General Hospital. The meeting was to discuss the artificial intelligence project that Kovacheva had been working on, one that could eventually affect a broad swath of patient care.
AI was in its early days then, in stark contrast to the significant strides it has made today. One question loomed large for Kovacheva: Would there be interest in her project? She had never met with anyone from the Mass General Brigham Innovation staff to talk about her years-long work, and she wondered how the conversation would land.
“I tried to explain my project and wondered if it had a chance,” Kovacheva remembers. As she laid out her project, which drew directly on her area of expertise (an AI-based system for automating the administration and dosing of vasopressor medications during Cesarean deliveries), Kovacheva knew that the data she had been gathering from her innovation could someday help unravel many of research’s mysteries and inefficiencies.
At the end of their meeting, McKenna showed interest—lots of it.
The advent of the electronic health record (EHR) has improved data mining for researchers, who sort through sets of medical information to identify patterns and relationships that inform their work. But at Mass General Brigham, as everywhere, obtaining the Big Data needed for AI research is time-consuming and resource-intensive. For example, the same medication can be represented by several trade names, a generic name, and several different formulations; some of these may be written with capital or lower-case letters. To design accurate AI tools, all of these data must be clean, meaning the same medication needs to appear under a consistent name throughout the system.
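The kind of cleaning this describes can be pictured with a minimal sketch (the mapping and function below are hypothetical illustrations, not MERLIN’s actual code): every trade name, generic name, and capitalization variant of a vasopressor collapses to one canonical label.

```python
# Hypothetical illustration: collapsing the many ways one medication can
# appear in EHR records to a single consistent name.
CANONICAL_NAMES = {
    # trade names, generic names, and formulations all map to one label
    "norepinephrine": "norepinephrine",
    "levophed": "norepinephrine",
    "norepinephrine bitartrate 4 mg/4 ml": "norepinephrine",
    "phenylephrine": "phenylephrine",
    "neo-synephrine": "phenylephrine",
}

def normalize_medication(raw_name: str) -> str:
    """Return a consistent medication name regardless of case or variant spelling."""
    key = raw_name.strip().lower()
    return CANONICAL_NAMES.get(key, key)  # fall back to the cleaned string

print(normalize_medication("LEVOPHED"))  # -> "norepinephrine"
```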
Kovacheva knew that this process of cleaning and preparing data from many sources can be demanding and time-consuming. A universal platform for gathering biomedical research data was sorely needed. Yet most of the computational work she knew of was performed by large teams of engineers in industries with robust resources, something unavailable at most hospitals, which have smaller IT capabilities.
With limited resources, Kovacheva and Raphael Cohen, a senior machine-learning scientist in her lab, began designing an AI platform called the Medical Record Longitudinal Information AI System, or MERLIN, which would become the foundation of her research and, in fact, can be used for AI research outside of medicine. Working with several open-source tools (software that is not proprietary and can be modified or built upon), she and her team developed MERLIN as a highly scalable, collaborative, and resource-efficient health care AI platform, reducing the time from idea to completed research.
MERLIN can ingest, store, and process electronic health record data, live data, and real-time high-resolution device data in a HIPAA-compliant manner; moreover, clinical expertise is integrated at every stage. The platform’s core advantage is the early translation of data into atomic representations, meaning every data element is stored in its smallest components. In this way, the data are always clean and can be used immediately in any downstream project. This approach allows efficient data processing, enabling near-real-time generation of datasets for analysis and for the development of AI models. The platform is interactive and includes machine-learning tools that allow teams to work in parallel on multiple projects.
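One simplified way to picture these atomic representations (the field names below are assumptions for illustration, not the platform’s actual schema): a medication order is broken once into its smallest structured components, so downstream projects never have to re-parse free text.

```python
from dataclasses import dataclass

# Simplified illustration of an "atomic" record: a single medication order
# stored as its smallest structured components rather than as free text.
# These fields are assumptions for the example, not MERLIN's schema.
@dataclass(frozen=True)
class MedicationEvent:
    patient_id: str
    drug: str          # canonical drug name (see the normalization sketch above)
    dose_mg: float     # numeric dose in a fixed unit
    route: str         # e.g., "IV"
    timestamp: str     # ISO-8601 time of administration

event = MedicationEvent(
    patient_id="12345",
    drug="phenylephrine",
    dose_mg=0.1,
    route="IV",
    timestamp="2018-03-01T09:15:00",
)
```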
MERLIN is project-agnostic and is designed to scale with data and compute needs; it relies on a microservices architecture with multiple containerized components that enable data processing, dataset generation, model development, and interaction with applications. This approach creates opportunities for faster AI integration in clinical care to meaningfully impact patient outcomes.
“Initially, we made a few modules of it. Due to the microservices architecture, multiple modules can work together to accomplish a task. If we need to get the data for 100,000 patients, and one module of it can get 1,000 patients, making multiple copies of that module can accomplish the task much faster, for example, in an hour instead of days or weeks,” she said. “This allows us to be very efficient with a small but highly dedicated team.”
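The scaling idea in that example can be sketched in a few lines (a toy illustration of the pattern, not MERLIN’s actual modules): if one worker handles 1,000 patients, running several copies in parallel covers a much larger cohort far sooner.

```python
from concurrent.futures import ProcessPoolExecutor

# Toy illustration of the scaling pattern Kovacheva describes: replicate a
# module that handles 1,000 patients so many copies run at once.
# fetch_batch is a stand-in, not a MERLIN API.
def fetch_batch(patient_ids):
    # In a real module this would query the EHR store for each patient.
    return [{"patient_id": pid, "records": []} for pid in patient_ids]

def chunk(ids, size=1000):
    return [ids[i:i + size] for i in range(0, len(ids), size)]

if __name__ == "__main__":
    all_ids = [str(i) for i in range(100_000)]
    results = []
    with ProcessPoolExecutor(max_workers=8) as pool:
        for batch in pool.map(fetch_batch, chunk(all_ids)):
            results.extend(batch)
    print(f"processed {len(results)} patients")
```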
McKenna saw the value in this work and awarded Kovacheva a Brigham Ignite Development Award focused on developing AI models using MERLIN. In addition, Kovacheva received a Mass General Brigham Innovation Discovery Grant, along with grants from the Brigham Research Institute Shark Tank, the Anesthesia Patient Safety Foundation, the Mass Life Sciences Center, the Connors Center, the National Institutes of Health, and others, totaling $1.5 million over four years.
The significance of MERLIN, however, is not limited to research. For instance, the team has created an algorithm to predict who is at risk of developing preeclampsia in pregnancy, with promising preliminary results. Using MERLIN, the team also identified racial disparities in care and demonstrated that for AI tools to perform well, high-quality data from all patients must be used.
Kovacheva still recalls the meeting with McKenna. “I still have that presentation I made to her, and I probably didn’t do a great job. But I remember some of her words and her encouragement. It made me realize that this project would be valuable, and that many patients would benefit if I persisted with it.”
Credit also goes to Glenn Miller, PhD, strategic innovation leader, who helped Kovacheva present her innovation to funders in a way that highlighted the value of investigating blood pressure changes in pregnancy, a research area that is not commonly studied; pregnancy research historically has been underfunded. He also helped develop relationships with industry, define priorities, and provide insights on intellectual property and regulatory compliance issues.
Thanks also go to Kalpana Kamath, PhD, program manager for Brigham Ignite, who offered continuing advice, broke down bottlenecks, and came up with solutions that had not previously been considered. She also connected Kovacheva to internal resources and provided insightful feedback on presentations.
Kovacheva also recognizes the support of Chris Coburn, chief innovation officer at Mass General Brigham, who was instrumental from the beginning of her work.
Kovacheva remains grateful for all the assistance, saying it has helped her grow as a scientist and innovator. “I find it personally extremely gratifying for this opportunity to collaborate and help others with groundbreaking research.”