by Maisha Razzaque
“Noom isn’t just your average dieting app. It’s goal-oriented psychotherapy that helps you think critically about the food you’re eating.” I heard this pitch a few weeks ago during a radio ad, and I thought: wow, I like that way of looking at food tracking. I’d totally use this app.
Except I already have.
Here is a comprehensive list of apps I have used, am using, or have downloaded with the idea of future usage in order to regulate and maybe even improve my productivity, diet, and overall physical and emotional health: Noom (food and exercise tracking), Clue (period and ovulation tracking), doc.ai (personalized health data), Calm (mood tracking), Sayana (mood tracking), 30 Day Fitness (exercise tracking), Lifesum (macro tracking), Flora (habit tracking), Reflectly (anxiety tracking), Focus Keeper (time management) — I’ll stop here. You get the point; there are a lot.
Reading a book, going for a run, eating a meal, and relaxing are all supposed to be pretty uncomplicated activities for most people. Yet I — and I suspect an alarming number of others — have overcomplicated them to the point of chaos. The question remains: why did I convince myself that I need a billion apps to regulate my life?
Six years of being in school has taught me to look to the research. According to the Health and Wellness Foundation, 60 million people in the United States use some sort of mobile health app, and studies indicate that a majority of those users are female millennials. The promise of these apps can basically be boiled down to something called the Health Belief Model. Developed in the 1950s, the Health Belief Model is a framework for predicting health behavior; the main gist is that an individual’s intention to “engage in health behavior” (positive or negative) is directly related to how vulnerable to health threats they believe themselves to be. Users act according to their own perceived strengths and weaknesses. Health apps enable self-monitoring, which, in theory, should lead to some positive effects. A mobile health app is essentially a user-friendly tracking journal on your phone, and for the most part you don’t have to analyze your own data because the app does it for you. It can even be helpful to bring this sort of data to your healthcare provider if you’re trying to manage a chronic condition (e.g., blood sugar, blood pressure, heart rate).
When it’s put that way, it doesn’t sound so bad, so what could go wrong? It turns out, a lot.
What starts out as helpful information from well-intentioned apps can turn into data overload, which can exacerbate health anxiety — a phenomenon in which a person has an irrational preoccupation with possible health threats. For people with a proclivity to obsess over calorie or exercise tracking, these apps can actually enable unhealthy behaviors. There is also the question of validity. Many mobile health products rely on self-reporting. What if you don’t know the exact calorie count of a home-made meal, or the “intensity level” — something a version of MyFitnessPal has asked users to report — of your cardio exercise? Without valid data, the analysis these apps provide isn’t useful for managing health at all.
Last of all, we have to look at the money.
In 2019, the mobile health app industry made $3 billion in sales. A lot of that money comes from advertising, but what the average user might not know is how much personal data is being harvested and sold to third-party companies. In 2018, several period-tracking apps, including Glow and Flo, got into hot water for selling data about people’s menstrual cycles to companies that, in turn, used it to create targeted ads. Suddenly, aggressive online ads for baby clothes and cribs would coincide with a missed period. But is hot water the right way to describe the backlash? There were no legal consequences; in fact, most of these apps have tiny, tiny print stating that you’re allowing them to do whatever they want with your health data the moment you tap “install.”
But surely, you’re thinking, there are some legal standards to protect people from this kind of predatory data mining.
That’s just the thing. Mobile health apps have taken the market by storm, seemingly transforming how people manage their health overnight, and legislation simply hasn’t had time to catch up. The sheer vastness of the mobile health market makes it hard for the average user to judge quality, and the FDA’s oversight of mobile health products has been met with a lot of handwringing. Industry pushback hangs on the argument that overregulation could hamper growth and innovation. Nathan Cortez, a law professor at Southern Methodist University, has suggested broadening the FDA’s jurisdiction. In a 2014 article about FDA regulation of mobile health apps, he argues that the existing legislation limiting the FDA’s involvement is bad for doctors and dangerous for health app users. He proposes that Congress consider allowing a professional third party to evaluate the algorithms and quality safeguards outlined in FDA regulatory guidance. Since then, a 2017 redraft of the FDA’s regulatory guidelines has tightened regulation of diagnostic apps — the ones physicians use to aid in making clinical diagnoses. The wheels of government regulation turn slowly — so, so slowly — but surely.
What I’m piecing together from this crash course on mobile health products is that these too-good-to-be-true apps might not work, may be selling my personal data, and aren’t being closely regulated. But why did I convince myself I needed so many of them in the first place? The answer, as you may expect, isn’t quite so simple. Maybe I wasn’t the one doing the convincing. If everyone is touting the newest and best app that’s transforming their lives, it’s natural that I should want in. When my favorite disembodied nutrition-podcast voice tells me to take control of my life by downloading Noom, I just may do it. Perhaps I — and the aforementioned millions of users — have fallen prey to the phenomenon of “too much data”; it’s easy to rationalize that having “more” apps is the same as having “better” apps. Then, before you know it, you’ve used up all your phone storage on six different AI mindfulness apps. Despite all this, I haven’t reached the conclusion that the apps are bad. After all, people just want to take an active role in their health. Understanding mobile health apps can help us think critically about which ones can meet our needs and which are just unnecessary noise.