Data Solutions
  • Articles
  • April 2026

Finding Her Smile: AI expertise sparks a medical breakthrough

By
  • Jeff Heaton
  • John Agliata
In Brief

When RGA Vice President of AI Innovation Jeff Heaton watched his wife, Tracy, lose movement in half her face after surgery to remove a benign brain tumor, he couldn’t accept “wait and see” as the only answer. Jeff collaborated with one of the world’s top facial reanimation surgeons, applying his AI expertise to build an innovation that has evolved into Dynaface, an open-source platform now used in medical research to quantify facial paralysis recovery.

Key takeaways

  • A personal crisis sparked a breakthrough. Jeff Heaton applied his AI expertise to develop Dynaface, now a validated tool in facial paralysis research.
  • AI revealed what eyes couldn’t see. Dynaface captured subtle, dynamic facial changes and synkinesis patterns previously unmeasurable in clinical settings.
  • Human connection drove innovation. Collaboration with Johns Hopkins began because a surgeon listened to a family’s story — and recognized the potential in their data.

 

What she didn’t expect on that swampy Missouri day in August 2021 was the complete lack of movement on the left side of her face.

That’s when the fear arrived, she said. 

Tracy awoke from an eight-hour surgery to remove a benign acoustic neuroma – a noncancerous tumor that had been growing on the nerve controlling her hearing and balance – expecting relief. Instead, she felt the stillness. Half her face refused to move. She couldn’t blink. She couldn’t lift her eyebrow. Her smile — the one the middle school Spanish students she teaches knew, the one her family loved — had vanished.

“It wasn’t until a couple weeks later that it hit me: This is probably really bad,” she recalled. The realization wasn’t loud. It was slow and heavy, settling in with each day that nothing changed.

Jeff stayed beside her through every quiet, terrifying moment. Tracy remembers looking at him and seeing the steadiness she needed. “He’s like a rock,” she said. “That helped me not to panic.”

But beneath that calm, Jeff was carrying his own fear — quietly, analytically, determined not to let it control the room.

Surgeons told them to wait. Facial paralysis, they explained, was not uncommon after this type of procedure. Perhaps the nerve would recover, they said.

But days turned into weeks with no flicker of movement. Tracy avoided mirrors. She began a daily ritual, taking a photo of her face — sometimes one, sometimes three or four — hoping to capture some improvement in the facial muscles she could no longer feel. Most days, she saw nothing but the same stillness staring back.

“There were some really bleak days,” she said.

By the fall of 2021, Jeff had consumed every medical paper on his wife’s condition that he could find. That’s when he discovered Dr. Kofi Boahene, a leading global expert in facial reanimation at Johns Hopkins Medicine. The hospital’s staff quickly scheduled an appointment, and the Heatons flew to Baltimore in October 2021 for a consultation with Dr. Boahene. 

“It gave us hope hearing Dr. Boahene explain the reanimation options,” Tracy said. “He was so patient, and I felt like I was in good hands because he had developed many of the modern procedures I needed.”

In the months after the Heatons returned to St. Louis, Tracy’s facial nerve did not recover on its own. So they returned to Baltimore in January 2022, and Dr. Boahene performed a nerve transfer on Tracy meant to eventually awaken her smile.

The surgery gave them hope, but the waiting dragged on.

The moment AI stepped in

As the months passed, Tracy kept taking photos — not because she saw progress, but because she needed to believe recovery was possible.

Jeff looked at the photos differently. He didn’t just see the face of his best friend and partner. He saw data points.

“I wanted a number you could plot on a chart,” he said. “As I’d heard time and again from the actuaries I work with, if you can’t measure it, it’s not real. So I adapted AI to automatically calculate several symmetry measures created by Dr. Boahene.”
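The idea of reducing a face to “a number you could plot on a chart” can be sketched in a few lines. The example below is purely illustrative, not Dr. Boahene’s actual clinical measures: it assumes hypothetical (x, y) landmark coordinates, mirrors the right-side landmarks across the face’s vertical midline, and reports the mean distance to their left-side counterparts.

```python
import numpy as np

def symmetry_score(left_pts, right_pts, midline_x):
    """Illustrative left-right symmetry measure (not Dynaface's actual metric).

    Mirrors right-side landmarks across the vertical midline and returns
    the mean distance to their left-side counterparts. A lower score
    means a more symmetric face; a perfectly mirrored face scores 0.
    """
    left = np.asarray(left_pts, dtype=float)
    right = np.asarray(right_pts, dtype=float)
    mirrored = right.copy()
    mirrored[:, 0] = 2.0 * midline_x - mirrored[:, 0]  # reflect x across midline
    return float(np.mean(np.linalg.norm(left - mirrored, axis=1)))

# Perfectly symmetric mouth corners score 0.0:
print(symmetry_score([[40, 100]], [[60, 100]], midline_x=50))  # -> 0.0
```

Computed daily from the same landmarks, a score like this becomes exactly the kind of single number that can be plotted over time to show gradual recovery.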

Jeff began testing prebuilt facial-recognition models in the hope that one could reveal the minute progress the human eye couldn’t detect. His goal was to assure his wife that she was, indeed, making progress. At the time, he had no idea these measurements could be used for medical research and possibly direct future care for those with a similar condition.

Each model failed. None had been trained on faces with paralysis. They straightened Tracy’s asymmetry, “fixing” her expressions in ways that erased exactly what Jeff was trying to measure.

“This is the bias we deal with a lot in medical data at RGA. The models were biased towards non-paralyzed faces,” Jeff said. “It’s all they had ever seen. You have to correct for that bias.”

So Jeff did what he does at RGA when an existing model doesn’t accurately reflect truth: He built something better.

His version unwound the bias. He wrote new code to track landmark facial points. He stabilized every photo by aligning Tracy’s pupils. And then he spent night after night testing, calculating, adapting his model, and retesting.
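The pupil-alignment step can be sketched with a similarity transform: rotate, scale, and translate each photo so that both pupils land at fixed canonical positions, making every day’s image directly comparable. This is a minimal sketch with hypothetical coordinates, not the actual Dynaface code.

```python
import numpy as np

def pupil_align(points, left_pupil, right_pupil,
                target_left=(300.0, 400.0), target_right=(700.0, 400.0)):
    """Map 2D landmark points with a similarity transform (rotation +
    uniform scale + translation) so the detected pupils land at fixed
    target positions. Aligning every photo this way stabilizes a series
    of daily images for comparison."""
    src = np.asarray(right_pupil, float) - np.asarray(left_pupil, float)
    dst = np.asarray(target_right, float) - np.asarray(target_left, float)
    # A complex multiply encodes 2D rotation and scale in one step.
    s = complex(dst[0], dst[1]) / complex(src[0], src[1])
    p = np.asarray(points, float)
    z = (p[:, 0] + 1j * p[:, 1]) - complex(*left_pupil)
    z = z * s + complex(*target_left)
    return np.column_stack([z.real, z.imag])

# A tilted face, pupils at (100, 100) and (200, 200), is rotated and
# scaled so the pupils land level at the canonical positions:
aligned = pupil_align([[100, 100], [200, 200]], (100, 100), (200, 200))
```

The same transform applied to every landmark in a photo removes head tilt and camera distance as sources of noise, leaving only the facial movement itself.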

One evening, months into his venture, the breakthrough came. His AI and the painstaking manual measurements he’d done in Photoshop produced the exact same result.

“That was a moment,” he said – a moment when the impossible began to feel measurable.

Tracy and Jeff Heaton

A surgeon sees the future

At Tracy’s next appointment, Dr. Boahene noted some improvements and asked about them. Jeff, with the fluency of someone who had lived inside the data for months, described the exact progression of her symmetry in precise numbers and explained that he had been using AI to measure her facial symmetry.

The surgeon was stunned.

Jeff showed him the time-lapse video he’d created — hundreds of Tracy’s daily images aligned and stabilized into a haunting, beautiful progression of microscopic change.

Tracy with Dr. Boahene over a year

 

“Oh my goodness,” Dr. Boahene said. “I never have had the opportunity to see day by day what actually happens. This is a treasure trove.”

A follow-up call turned into a collaboration. Dr. Boahene provided guidance for Jeff to expand the initial measurements into richer, clinically significant metrics. With this help, Jeff expanded his code to analyze video, track 95 facial landmarks, and quantify subtle movements such as blinking or involuntary eye narrowing. Johns Hopkins validated the technology and named it Dynaface.
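One common way to quantify blinking from tracked landmarks is the eye aspect ratio: the eye’s vertical opening divided by its width, which drops sharply during a blink. Whether Dynaface uses this exact formulation is an assumption here; the sketch below uses hypothetical landmark coordinates.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """Eye aspect ratio from six eye landmarks ordered roughly as
    [outer corner, upper 1, upper 2, inner corner, lower 2, lower 1]
    (the convention of common 68-point landmark models).
    Around 0.3 for an open eye; drops toward 0 during a blink."""
    eye = np.asarray(eye, float)
    v1 = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    v2 = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal eye width
    return (v1 + v2) / (2.0 * h)

open_eye = [[0, 0], [10, 5], [20, 5], [30, 0], [20, -5], [10, -5]]
closed_eye = [[0, 0], [10, 1], [20, 1], [30, 0], [20, -1], [10, -1]]
print(eye_aspect_ratio(open_eye) > eye_aspect_ratio(closed_eye))  # True
```

Evaluated frame by frame over a video, a ratio like this turns blinking, or involuntary eye narrowing, into a time series that can be measured rather than merely observed.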

The tool that began in Jeff’s home office — built out of love, fear, and data — became an open-source platform used by a growing number of clinicians around the world to help chart progress and plan treatment for patients with conditions similar to Tracy’s.

From personal crisis to published science

Dynaface led to a peer-reviewed medical journal publication that objectively measured oral-ocular synkinesis, a condition where the eye narrows as the mouth moves, and revealed expression-specific patterns that had previously been impossible to quantify. Puckering as if you had just sucked on a lemon, for example, emerged as the most sensitive diagnostic expression.

“It allows us to scan somebody’s face and develop what I call a fingerprint,” said Dr. Boahene. “This is precision medicine for fixing paralysis of the face.”

Fig 1. Dynaface analyzes a video of Tracy Heaton blinking.

Jeff even had the opportunity to present the work at a Johns Hopkins Grand Rounds session, a high-level weekly educational conference dating back to 1889 where medical experts present complex, real-life patient cases, new research, or treatment guidelines. Jeff fielded questions from surgeons, neurologists, and researchers — all eager to understand how AI could illuminate what the human eye could not.

“It was intimidating,” he said, “but they had a lot of interest.”

Dynaface has since been presented in international medical settings, including a recent conference near Davos, Switzerland. There, Dr. Boahene shared the tool and Tracy’s video, which has been viewed by more than 300,000 people since it was uploaded to YouTube in May 2023 – people searching for something Tracy once longed for: a sign that recovery is possible.

 

Jeff and Tracy in the kitchen

A smile that offers hope

Five years after the paralysis began, Tracy’s smile has returned — not suddenly, not perfectly, but undeniably.

“It felt like a miracle,” she said.

As she started to notice improvements, she sent the photos of her new smile to her sister. She showed her mother. She looked in the mirror and finally recognized herself again — or, more accurately, the new version of herself she had fought to become.

Her daily photos — the ones she once took in despair — became the raw material powering a tool that offers hope to others. What the Heatons didn’t expect was the response.

Once Johns Hopkins shared Tracy’s time‑lapse publicly, comments began pouring in — people from around the world living with paralysis, people awaiting surgery, people terrified they would never smile again.

Many wrote versions of a similar message: Your video gives me hope.

Others asked: How long did it take? Should I keep waiting? Should I travel to Johns Hopkins? Who is your surgeon?

Jeff responded to many of them personally — pointing them toward Johns Hopkins, sharing what he knew, and offering reassurance when fear was all they could feel.

For Tracy, the messages were deeply moving. People confided in her about their own paralysis, their loneliness, their uncertainty. She understood all of it.

“It feels like something good came out of the suffering,” she said. “That’s what makes it worthwhile. I wouldn’t wish this on anyone, but the good that’s come from it has made it redeemable.”

The doctor’s lesson

To Dr. Boahene, the Heatons’ story captures what can happen when technical expertise joins forces with the emotional truth of a patient’s lived experience. The collaboration didn’t start in a lab or with a research team. It started with a husband and wife sitting in a hospital recovery room, trying to make sense of the unthinkable.

 “It all began with listening to a family,” he said. “That’s such an important message for me, for my colleagues: Conversations with our patients can open the door.”

Dynaface continues to evolve through ongoing work with Johns Hopkins. And while its scientific impact is growing, the Heatons know its human impact may matter even more.

Because the tool isn’t just measuring smiles.

It’s helping people believe they can get theirs back.



Meet the Authors & Experts

Jeff Heaton
Author
Vice President, AI Innovation

John Agliata
Author
Senior Writer, RGA Enterprise Marketing & Communications

References

  • Renne, A., Heaton, J., Derakhshan, A., Nellis, J. C., Desai, S. C., & Boahene, K. D. (2025). Use of dynamic, automated facial analysis in quantifying oral-ocular synkinesis. Facial Plastic Surgery & Aesthetic Medicine.
  • Renne, A., Heaton, J., & Boahene, K. D. O. (2026). Associations of AI-based facial metrics with patient-reported outcomes in idiopathic facial paralysis. Laryngoscope. Advance online publication.