2014 Essays: The Role of the Smartphone in Digital Health

When discussing digital health, it would be not only unwise but also difficult to overlook the role of the smartphone. As the central hub for many digital health devices – activity trackers, implantable sensors and apps – the smartphone will continue to play an increasingly important role in driving the digital health revolution forward.

What was once used primarily for talking and texting has turned into a powerful portable computer with more processing power than a desktop PC had only a few short years ago. The transition from mobile phone to ‘smartphone’ is owed largely to Apple and the introduction of the iPhone in 2007, and to the wave of similar devices and operating systems that followed.

Connected via Bluetooth or Wi-Fi, or downloaded to the phone itself, digital health tools rely on the smartphone’s processing power to track and analyze data, and on its touch screen to display and interrogate that data.

Quite simply, the smartphone is the mother ship of digital health.

Smartphone biosensors

As well as acting as the centralized repository for connected third-party devices, the smartphone itself is packed with a range of intelligent sensors that can be – and are – used to help people improve their health.

The latest generation of iPhone, the 5s, has seven inbuilt sensors with the most recent two – the fingerprint sensor and the M7 motion coprocessor – being introduced with the iPhone 5s launch. With nine inbuilt sensors the Samsung Galaxy S4 smartphone has two more than the iPhone.

While not all are specifically designed to be health related, these sensors are being used for health purposes to good effect.

Cardiio is an innovative app that measures heart rate without physical contact. Using the iPhone’s front camera, Cardiio analyzes the light that reflects off a user’s face and estimates the resting heart rate with a high degree of accuracy.

Developed at MIT’s Media Lab, Cardiio works by analyzing the light that reflects off the user’s face. Every time the heart beats, more blood is pumped to the face. This slight increase in blood volume causes more light to be absorbed, and hence less light is reflected from the face.
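The principle behind this is known as remote photoplethysmography. As an illustrative sketch only – not Cardiio’s actual algorithm, and with an invented function name and synthetic data – the heart rate can be recovered from a time series of average facial brightness by finding the dominant frequency in the plausible heart-rate band:

```python
import numpy as np

def estimate_heart_rate(brightness, fps):
    """Estimate heart rate (BPM) from a time series of average facial
    brightness by finding the dominant frequency in the plausible
    heart-rate band (0.75-4 Hz, i.e. 45-240 BPM)."""
    signal = brightness - np.mean(brightness)       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)         # restrict to heart-rate band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                         # Hz -> beats per minute

# Synthetic example: a 72 BPM pulse sampled at 30 frames per second.
# Each heartbeat slightly dims the reflected light, as described above.
fps, bpm = 30.0, 72.0
t = np.arange(0, 20, 1.0 / fps)
brightness = 100 - 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t)
print(round(estimate_heart_rate(brightness, fps)))  # 72
```

A real implementation would also need face detection and filtering to cope with motion and lighting changes, but the core signal-processing idea is this simple.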

Unlike the raft of wrist-worn activity trackers on the market today, Moves is an app-based activity tracker that uses the iPhone 5s’ M7 motion coprocessor to track walking, cycling and running. Data is presented in the app as a storyline of when and where you move, with metrics for time, steps, distance covered and calories burnt.
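The M7’s step detection itself is proprietary, but the basic idea of counting steps from accelerometer data can be sketched as follows – a deliberately simplified, hypothetical example, with an invented threshold and synthetic samples:

```python
import math

def count_steps(accel, threshold=1.2):
    """Count steps in a stream of 3-axis accelerometer samples (in g)
    by counting upward crossings of a magnitude threshold -- a
    much-simplified version of what a motion coprocessor does."""
    steps, above = 0, False
    for (x, y, z) in accel:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:   # rising edge = one step
            steps += 1
            above = True
        elif mag < threshold:
            above = False
    return steps

# Synthetic walk: each step shows up as a spike above the 1 g of gravity
samples = [(0.0, 0.0, 1.0)] * 5
for _ in range(4):                          # four simulated steps
    samples += [(0.0, 0.0, 1.5)] * 3 + [(0.0, 0.0, 1.0)] * 7
print(count_steps(samples))  # 4
```

Doing this on a dedicated low-power coprocessor rather than the main CPU is what lets the phone track motion all day without draining the battery.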

The Moves app relies solely on the iPhone for activity tracking

SkinVision is an app developed for one reason – keeping track of the size and shape of the moles on a user’s body. The app allows users to analyze their skin lesions by taking a picture with their smartphone.

“SkinVision technology uses a proprietary mathematical algorithm to calculate the fractal dimension of skin lesions and surrounding skin tissue and builds a structural map that reveals the different growth patterns of the tissues involved,”
says the app’s website. “By processing this map, SkinVision is able to see if the mole has abnormal development and to alert the user if a medical visit is required.”
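SkinVision’s algorithm is proprietary, but the fractal dimension it mentions can be illustrated with the standard box-counting method on a binary image mask. The function below is a hypothetical textbook sketch, not the app’s code:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal (box-counting) dimension of a binary image
    mask: count how many s-by-s boxes contain any lesion pixel, then
    fit log(count) against log(1/s); the slope is the dimension."""
    counts = []
    h, w = mask.shape
    for s in sizes:
        boxes = 0
        for i in range(0, h, s):            # tile the image with s x s boxes
            for j in range(0, w, s):
                if mask[i:i + s, j:j + s].any():
                    boxes += 1
        counts.append(boxes)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a solid filled square should have dimension close to 2,
# while a ragged, irregular lesion border would fall between 1 and 2.
mask = np.ones((32, 32), dtype=bool)
print(round(box_counting_dimension(mask), 1))  # 2.0
```

The intuition is that irregular, fractal-like growth patterns fill space differently from smooth ones, which is why the measure is useful for flagging abnormal lesion development.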

The SkinVision app

Sensors everywhere


With the number of sensors added to smartphones increasing from generation to generation, what will the smartphone of the future look like? Will it have dedicated health-related sensors? Will consumers be able to pick and choose the sensors on their smartphone based on their own health requirements?

The number of iPhone sensors in the iPhone 5s as of 2014. What will the next five years look like?

Finally, will the ‘smart’ in smartphone be a justifiable description of the phone of the future? Perhaps not.

From smartphone to supercomputer


The processing power of mobile technology is increasing at an exponential rate. The computing power of today’s smartphone once had to be housed in a large room full of coolers to prevent it from overheating; in the future, the same processing power will be small enough to fit into a red blood cell.

Future gazers predict that computing power will surpass that of the human brain by 2023, in what has been named the Technological Singularity – the theoretical emergence of greater-than-human superintelligence through technological means.

Tomorrow’s smartphone will be like a third brain with superior data-driven intelligence, providing more personalized and real-time information than previously thought possible.

If you think this is merely speculation by some over-enthusiastic Star Wars fans, think again.

Technology and consulting company IBM built a supercomputer that can not only understand questions posed in natural language but also answer them, using its databank of information, with a high degree of accuracy.

When the supercomputer, Watson (named after IBM’s first president), answers a question incorrectly it learns from its mistake, ensuring that if the same question is asked again it answers it correctly.

To prove its superior intelligence and as a test of its abilities Watson competed on the US TV quiz show Jeopardy! in 2011 against the show’s two most successful players.

The biggest all-time money winner and the record holder for the most consecutive games won both competed unsuccessfully against Watson in the special one-off show, vindicating the academic field of artificial intelligence and perhaps opening the minds of the general public to such technological capabilities.

Watson, our new computer overlord

From a medical standpoint, Watson has already been sent to medical school, with particular training in oncology. In a year the computer studied over 600,000 diagnostic reports, two million pages of medical journal articles and one and a half million patient records, and received 14,700 hours of hands-on training.

Wired magazine reported that Watson is now better at diagnosing cancer than human doctors, “Only around 20 percent of the knowledge human doctors use when diagnosing patients and deciding on treatments relies on trial-based evidence. It would take at least 160 hours of reading a week just to keep up with new medical knowledge as it’s published, let alone consider its relevance or apply it practically.

“Watson’s ability to absorb this information faster than any human should, in theory, fix a flaw in the current healthcare model. Wellpoint’s Samuel Nussbaum has claimed that, in tests, Watson’s successful diagnosis rate for lung cancer is 90 percent, compared with 50 percent for human doctors.”

IBM intends to find additional uses for Watson, with health applications on smartphones being touted as a future feature. In November 2013, IBM announced that Watson’s API (Application Programming Interface) would be made available for developers to create third-party apps.

Welltok, a consumer social health management platform, is one of the first to take advantage of Watson’s capabilities and has integrated it into an app called CafeWell Concierge. Premium Welltok customers can ask Watson for personal health recommendations and, according to Mobile Health News, the app “can answer users based not only on their question but also on specific information like their location, health status, health benefits, health improvement programs and incentives available from their insurer, physician or local pharmacy.”

Sample Watson conversation from WellTok (via Mobile Health News)

The future of the smartphone and digital health


Predicting that smartphones will become more powerful is neither eye-opening nor particularly informative, as Moore’s Law has shown this is to be expected. Where it gets interesting is what these devices will be able to do with the additional CPU power. The smartphone of the future will have the power and capacity to process large data sets, much like the scientific laboratory supercomputers of today.
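As a back-of-the-envelope illustration of what Moore’s Law implies – assuming the classic roughly two-year doubling period, which is a simplification:

```python
def moores_law_factor(years, doubling_period=2.0):
    """Rough Moore's-law projection: transistor counts (and, loosely,
    processing power) double roughly every two years."""
    return 2 ** (years / doubling_period)

# Over 20 years that compounds to roughly a thousandfold increase
print(round(moores_law_factor(20)))  # 1024
```

A thousandfold jump in two decades is why comparisons between today’s phones and yesterday’s laboratory supercomputers are not as far-fetched as they sound.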

What will fit in to our pocket in the coming 20 years?

Single data points such as weight will be crunched alongside variables with data points in the hundreds (blood biomarkers), millions (DNA) and even trillions (microbiome) to give real-time information and accurate predictions about an individual’s health. Connected to a multitude of unobtrusive and passive sensors on and inside the human body, the smartphone of the future will gather and interpret these large data sets, providing meaningful health information.

While the mobile capability isn’t here yet, the processing power is, albeit in a research laboratory. Renowned astrophysicist Larry Smarr crunched his own data down to the microbial level using the supercomputer at his research laboratory, where he discovered his own Crohn’s disease – a condition that had gone unnoticed by his doctors.

Smarr believes that computers will come to know a lot more about an individual’s body than the individual ever could, and will spot a disease long before they feel sick. The smartphone of the future may be the computer that helps do that.

Featured image credit: mobilespleaseblog

One Comment

  1. Don’t get too wrapped up in the smartphone form factor, because that too will change.

    I often compare today’s smartphones with one of the mainframe computers I worked on in the mid-1970s. The IBM System/370 Model 158-3 became a common performance benchmark, because it could execute 1 million instructions per second (MIPS). This $3.5 million computer was so large and expensive it was shared by hundreds or thousands of people. Just ten years later they all had personal computers, and each one cost 1,000 times less.

    Now back to the smartphone. Where that IBM mainframe was fast (1 MIPS), even my older iPhone 5 is some 10,000 times faster. And rather than share, I carry it in my pocket everywhere I go, with the ability to sync with medical sensor devices and do Anywhere/Anytime video calls with my doctor. But with real-time monitoring of sensor data, we’ll soon have medical care Everywhere/AllTheTime.

    From one of my articles (http://www.mhealthtalk.com/2013/07/moores-law-and-the-future-of-healthcare/).
