I could (technically) tell your heartbeat from your Snapchat video

João Ribas
3 min readFeb 4, 2018

How we used a cell phone camera, and a few tricks, to track my heart rate during the 2015 MIT GrandHack.

Right before I joined the team at MIT Hacking Medicine, I participated in the 2015 MIT Grand Hack — the biggest, baddest, most fun healthcare hackathon around (I’m biased, but still true).

Working with our team at the GrandHack. High-tech prototyping with paper cups within our team. Photo credits: Lina A. Colucci.

We started with a clinical need. Congestive heart failure accounts for around 200,000 new cases per year in the US, and these patients are at increased risk of acute decompensation. Early diagnosis and continuous monitoring are therefore of vital importance.

“Body weight standardised to time of day and clothing, preferably recorded regularly by the patient at home as well as in the clinic, is useful in detecting sodium and water retention (increased weight) and over‐diuresis or cardiac cachexia (weight loss). The potential of non‐invasive home telemonitoring (for daily weight, blood pressure, heart rate and rhythm) is exciting.” ¹

Now, for the tech stuff

We saw an opportunity for tracking these patients through telemedicine, namely (a) measuring rapid changes in weight and (b) measuring jugular venous pressure (or pulse). Our first mistake was splitting our effort between two different solutions instead of focusing our attention on the most relevant approach.

Jugular vein pressure explained.

Elevated jugular venous pressure is a warning sign of heart failure. In the emergency room, physicians observe the patient and try to estimate the height of the distention. This assessment is subjective and varies from physician to physician. Our thought was then to use any cell phone camera and allow patients to track their jugular venous pulse at home, as an early sign of imminent heart failure.

We divided the workflow into two parts: (1) motion augmentation and (2) heart rate monitoring. For the motion augmentation, we used Michael Rubinstein’s awesome Eulerian Video Magnification code (MIT CSAIL, check Michael’s work here!), and for the heart rate monitoring, we used a pixel-shift method I had worked on before (published here).
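The core idea behind the magnification step can be sketched in a few lines. To be clear, this is not Michael’s actual code (his implementation uses spatial pyramids and is far more sophisticated), just a minimal Python illustration of the Eulerian principle: bandpass-filter each pixel’s intensity over time, then add an amplified copy of that filtered signal back onto the video.

```python
# Minimal sketch of Eulerian-style video magnification (illustration only,
# NOT the MIT CSAIL implementation): temporally bandpass-filter every pixel
# and add an amplified copy of the filtered signal back to the frames.
import numpy as np
from scipy.signal import butter, filtfilt

def magnify(frames, fps, lo_hz=0.8, hi_hz=3.0, alpha=20.0):
    """frames: (T, H, W) float array of grayscale frames.
    Returns frames with variations in [lo_hz, hi_hz] amplified by alpha."""
    b, a = butter(2, [lo_hz, hi_hz], btype="band", fs=fps)
    band = filtfilt(b, a, frames, axis=0)  # filter each pixel along time
    return frames + alpha * band

# Synthetic demo: a tiny 1.5 Hz "pulsation" (90 bpm), invisible to the eye.
fps = 30
t = np.arange(0, 10, 1.0 / fps)
pulse = 0.01 * np.sin(2 * np.pi * 1.5 * t)          # sub-percent intensity change
frames = 0.5 + pulse[:, None, None] * np.ones((1, 4, 4))
out = magnify(frames, fps)
# The oscillation in the output is ~alpha times larger than in the input.
```

Running a neck video through the real version of this idea is what makes the barely-visible pulsation pop out in the clips below.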

Can you barely notice the pulsation? Raw video.

Since we didn’t have access to any congestive heart failure patients, we found a workaround: I ran like crazy around the MIT Media Lab to get my heart rate up so that we could barely see something on video. It’s not exactly the same thing, but it was enough for a proof of concept.

After running the raw video through Michael’s Eulerian Video Magnification code (SIGGRAPH 2012), the results looked amazing! We could see something!!

Pulsation made visible by Eulerian Video Magnification (SIGGRAPH 2012).
Graph of the pulse obtained from the motion-augmented video. The time base changed during processing, so the x-axis is incorrect.

Using (again) Matlab, we plotted the pulse by selecting a window of the video where it is visible.
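That plotting step can be illustrated with a short sketch. We used Matlab and a pixel-shift method; the Python version below only shows the simpler principle: average the pixel intensity inside the chosen window for every frame, then take the dominant frequency in the physiological band as the heart rate.

```python
# Hedged illustration (not the original Matlab pixel-shift code): estimate
# heart rate from the mean intensity of a region of interest over time.
import numpy as np

def heart_rate_bpm(frames, fps, lo_hz=0.7, hi_hz=3.5):
    """frames: (T, H, W) array already cropped to the selected window.
    Returns the estimated heart rate in beats per minute."""
    signal = frames.mean(axis=(1, 2))       # one intensity sample per frame
    signal = signal - signal.mean()         # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)  # plausible heart rates only
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

# Synthetic check: a 2.0 Hz pulsation should come out as ~120 bpm.
fps = 30
t = np.arange(0, 20, 1.0 / fps)
sig = 0.02 * np.sin(2 * np.pi * 2.0 * t)
roi = 0.5 + sig[:, None, None] * np.ones((1, 3, 3))
bpm = heart_rate_bpm(roi, fps)
```

On the real footage, the signal is noisier, which is exactly why amplifying the motion first made the pulse so much easier to extract.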

It was a fun weekend working on understanding the medical need, coming up with this solution, and prototyping a proof of concept. It’s far from perfect, but I believe that in the future any camera will be able to track minute differences, including your heartbeat. It could very well be a security camera in the emergency room picking up a patient with elevated jugular venous pressure, or it could be tracking your excitement in a store. Who knows.

Thank you, Michael for making your code available online!

Now, how cool would it be to check Leonardo DiCaprio’s heart rate after his inspirational speech in “The Wolf of Wall Street”?

Leonardo DiCaprio in “The Wolf of Wall Street”

Notes:
¹https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1861485/


João Ribas

🥼 🚀 Biotech VC and company builder @ Novo Holdings; 🎙️ Founder/Host @ The Future Labs podcast; views are my own.