Protecting Maternal Health in Rwanda


The world faces a maternal health crisis. According to the World Health Organization, about 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the main causes of maternal death is infected cesarean section wounds.

An interdisciplinary team of doctors and researchers from MIT, Harvard University, and Partners in Health (PIH) in Rwanda has developed a solution to this problem: a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict infection in cesarean section wounds with approximately 90% accuracy.

“Early detection of infection is a significant problem worldwide, but in low-resource areas such as rural Rwanda, the problem is even more serious due to the lack of qualified doctors and the high prevalence of antibiotic-resistant bacterial infections,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, a mechanical engineering researcher at MIT and the team’s technology lead. “Our idea was to use cell phones that could be used by community health workers visiting new mothers in their homes to inspect their wounds for infection.”

This summer, the team, led by Harvard Medical School professor Bethany Hedt-Gauthier, won the $500,000 top prize in the NIH Technology Accelerator Challenge for Maternal Health.

“The lives of women giving birth by cesarean section in developing countries are compromised by both limited access to quality surgery and postpartum care,” adds PIH team member Fredrick Kateera. “Using mobile health technologies for the early identification and plausibly accurate diagnosis of people with surgical site infections within these communities would be an evolutionary game-changer in optimizing women’s health.”

Training algorithms to detect infection

The project grew out of several fortuitous encounters. In 2017, Fletcher and Hedt-Gauthier crossed paths on the Washington Metro during a meeting of NIH investigators. Hedt-Gauthier, who by then had been working on research projects in Rwanda for five years, was looking for a solution to the cesarean section problem that she and her collaborators had encountered in their research. Specifically, she wanted to explore the use of cell phone cameras as a diagnostic tool.

Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID lab and has spent decades applying mobile phones, machine learning algorithms, and other mobile technologies to global health, was a natural candidate for the project.

“Once we realized that these kinds of image-based algorithms could support home care for women after cesarean delivery, we approached Dr. Fletcher as a collaborator, given his extensive experience in developing mHealth technologies in low- and middle-income settings,” says Hedt-Gauthier.

During that same trip, Hedt-Gauthier happened to sit next to Audace Nakeshimana ’20, then a new MIT student from Rwanda who would later join Fletcher’s team at MIT. With Fletcher’s mentorship, during his senior year, Nakeshimana founded Insightiv, a Rwandan startup that applies AI algorithms to clinical image analysis, and was a top winner in the annual MIT IDEAS competition in 2020.

The first step of the project was to compile a database of images of wounds taken by community health workers in rural areas of Rwanda. They collected over 1,000 images of infected and uninfected wounds and then trained an algorithm using that data.

A central problem emerged with this first set of data, collected between 2018 and 2019. Many photographs were of poor quality.

“The quality of wound images collected by health workers was highly variable and a lot of manual work was required to crop and resample the images. Since these images are used to train the machine learning model, the quality and variability of the image fundamentally limits the performance of the algorithm,” says Fletcher.

To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.

Improving image quality with real-time image processing

To encourage community health workers to take higher-quality images, Fletcher and the team revised the mobile wound-screening app and paired it with a simple paper frame. The frame contains a printed color calibration pattern and another optical pattern that guides the app’s computer vision software.

Health workers place the frame around the wound and open the app, which provides real-time feedback on camera placement. The app uses augmented reality to display a green check when the phone is within the correct range. Once in range, the rest of the computer vision software automatically color balances the image, crops it, and applies transformations to correct for parallax.
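The article does not include the team’s code, but the steps it describes correspond to standard computer vision operations. The Python/OpenCV snippet below is a minimal, hypothetical sketch of that correction stage, assuming the four corners of the paper frame have already been located by the app’s tracker; the names correct_wound_photo, frame_corners, and white_patch are illustrative, not the team’s actual implementation.

```python
# Hypothetical sketch (not the team's code) of frame-guided image correction:
# a perspective warp to remove parallax, a standardized crop, and a simple
# white-balance step driven by a printed reference patch.
import cv2
import numpy as np

def correct_wound_photo(image, frame_corners, out_size=512, white_patch=None):
    """image: BGR photo from the phone camera.
    frame_corners: four detected corners of the paper frame, ordered
        top-left, top-right, bottom-right, bottom-left (assumed found upstream).
    white_patch: mean BGR value of the printed white calibration patch, if available."""
    # Map the detected frame corners onto a fixed square: this both crops the
    # image consistently around the wound and corrects perspective (parallax).
    src = np.asarray(frame_corners, dtype=np.float32)
    dst = np.float32([[0, 0], [out_size, 0], [out_size, out_size], [0, out_size]])
    H = cv2.getPerspectiveTransform(src, dst)
    warped = cv2.warpPerspective(image, H, (out_size, out_size))

    # Color balance: scale each channel so the reference white patch
    # (or, failing that, the overall image mean) becomes neutral.
    if white_patch is not None:
        ref = np.asarray(white_patch, dtype=np.float32)
    else:
        ref = warped.reshape(-1, 3).mean(axis=0)
    gain = ref.mean() / np.maximum(ref, 1e-6)
    balanced = np.clip(warped.astype(np.float32) * gain, 0, 255).astype(np.uint8)
    return balanced
```

Warping the detected frame onto a fixed square gives every photo the same scale and viewpoint, while the reference patch lets each image be balanced to the same neutral white before it reaches the classifier.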

“By using real-time computer vision at the time of data collection, we are able to generate beautiful, sharp, uniform, color-balanced images that can then be used to train our machine learning models, without the need to manually clean or post-process the data,” says Fletcher.

Using convolutional neural network (CNN) machine learning models, along with a method called transfer learning, the software was able to predict infection in cesarean section wounds with approximately 90% accuracy within 10 days of childbirth. Women who are predicted to have an infection through the app are then referred to a clinic, where they can undergo diagnostic bacterial testing and be prescribed life-saving antibiotics as needed.
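For readers unfamiliar with transfer learning, the sketch below shows the general pattern in PyTorch: start from a CNN pretrained on a large generic image dataset, freeze its feature extractor, and retrain only a new final layer on the wound images. The specific model (ResNet-18), framework, and training settings here are assumptions for illustration; the article does not specify the team’s architecture.

```python
# Hypothetical sketch of transfer learning for wound classification; the model
# choice, data layout, and training details are assumptions, not the team's setup.
import torch
import torch.nn as nn
from torchvision import models

def build_wound_classifier(num_classes=2):
    # Start from a CNN pretrained on ImageNet and replace the final layer so it
    # outputs "infected" vs. "not infected" (the transfer learning step).
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    for p in model.parameters():
        p.requires_grad = False          # freeze the pretrained feature extractor
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head
    return model

model = build_wound_classifier()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
# Training would then loop over batches of the standardized wound images:
# loss = criterion(model(images), labels); loss.backward(); optimizer.step()
```

Because the pretrained layers already encode general visual features, this approach can reach useful accuracy from a relatively small dataset, on the order of the roughly 1,000 wound images the team collected.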

The app has been well received by women and community health workers in Rwanda.

“The trust that women have in community health workers, who were strong promoters of the app, helped the mHealth tool gain acceptance among women in rural areas,” adds Anne Niyigena of PIH.

Using thermal imaging to address algorithmic bias

One of the biggest obstacles to scaling this AI-based technology to a more global audience is algorithmic bias. When trained on a relatively homogeneous population, such as that of rural Rwanda, the algorithm works as expected and can successfully predict infection. But when images of patients with different skin colors are introduced, the algorithm is less effective.

To solve this problem, Fletcher used thermal imaging. Simple thermal camera modules, designed to attach to a cell phone, cost around $200 and can be used to capture infrared images of wounds. Algorithms can then be trained using the heat patterns of infrared wound images to predict infection. A study released last year showed over 90% prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
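As an illustration of why thermal input sidesteps skin-tone bias, the hypothetical sketch below converts a per-pixel temperature map from a clip-on thermal camera into a normalized image that the same kind of CNN can consume. The temperature range and three-channel replication are assumptions for illustration, not details from the study.

```python
# Hypothetical sketch of preparing a thermal frame for the same CNN pipeline.
import numpy as np

def thermal_to_input(temps_c, t_min=28.0, t_max=40.0):
    """temps_c: 2-D array of per-pixel skin temperatures (deg C) from the
    clip-on thermal camera. Returns an 8-bit, 3-channel image whose intensity
    encodes heat rather than skin color, so the classifier learns from the
    wound's heat pattern instead of pigmentation."""
    norm = np.clip((temps_c - t_min) / (t_max - t_min), 0.0, 1.0)
    gray = (norm * 255).astype(np.uint8)
    return np.stack([gray, gray, gray], axis=-1)  # match the CNN's RGB input shape
```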

Although more expensive than simply using the phone’s camera, the thermal imaging approach could be used to adapt the team’s mHealth technology to a more diverse global population.

“We give health staff two options: in a homogeneous population, like rural Rwanda, they can use their standard phone camera, using the model that has been trained with data from the local population. Otherwise, they can use the more general model that requires the thermal camera to be attached,” says Fletcher.

While the current generation of the mobile app uses a cloud-based algorithm to run the infection prediction model, the team is now working on a standalone mobile app that does not require internet access and that covers all aspects of maternal health, from pregnancy to postpartum.

In addition to developing the library of wound images used in the algorithms, Fletcher is working closely with former student Nakeshimana and his team at Insightiv on app development, using locally made Android phones in Rwanda. PIH will then conduct user testing and field validation in Rwanda.

As the team seeks to develop the comprehensive maternal health app, privacy and data protection are a top priority.

“As we develop and refine these tools, special attention must be paid to the privacy of patient data. More details on data security should be incorporated so that the tool fills the gaps it is meant to fill and maximizes user trust, which will eventually drive its wider adoption,” says Niyigena.

Members of the winning team include: Bethany Hedt-Gauthier of Harvard Medical School; Richard Fletcher of MIT; Robert Riviello of Brigham and Women’s Hospital; Adeline Boatin of Massachusetts General Hospital; Anne Niyigena, Fredrick Kateera, Laban Bikorimana, and Vincent Cubaka of PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.
