Sleep better. Boost your mood. Reduce stress. Health and wellness products make a lot of promises, but there is little keeping them honest.
And there's a lot of money to be made. In 2020, "wellness" (including everything from beauty products to alternative medicine) grew into a $4.5 trillion industry, while more than 84 million people used health or fitness apps.
That's where Mashable comes in. In our "Does It Work?" column, we're going to test health and wellness products, and speak to experts about their claims.
"These products are developing so fast, and there's so much enthusiasm," said Serife Tekin, a University of Texas at San Antonio professor who has studied the regulation of mental health apps, but policy "is not moving as fast."
Here's what lets health and wellness companies make the claims they do — and whether you should believe them.
You might be under the impression that bureaucratic government employees make sure every health and wellness product has evidence to back up its claims. That would be nice. But they don't.
There are two federal government agencies responsible for regulating health and wellness products: the Food and Drug Administration (FDA), and the Federal Trade Commission (FTC).
For the most part, it's the FDA's job to ensure products that can affect a person's health are safe — and, to some extent, effective — before companies can sell them. It's the FTC's job to make sure advertising and marketing claims are true after those products hit the market.
Only a narrow subset of devices have to back up their promises before companies can sell and advertise them. Those are products that could physically harm a person if they don't work properly (like pacemakers).
For most health and wellness products, FDA approval isn't necessary. The agency doesn't view "mobile medical apps" as medical devices (with some narrow exceptions). And the wellness industry as a whole? From sleep aids to stress management tools, the FDA's position is, essentially, not our problem.
Experts say the FTC is very good at building cases against advertisers that lie. However, it picks its targets carefully. Like the FDA, it focuses on the potential harm false advertising might cause. For example: Will a product just waste a few of someone's bucks? Or will it make someone sick? Does it make vague claims that a reasonable person might know to take with a grain of salt? Or is it claiming something big and specific — like it prevents cancer — without the receipts?
But the FTC has to referee a lot of health and wellness products. And so shady companies continue to thrive. The reactive nature of the FTC, along with the fact that some kinds of harm are hard to quantify, lets bad products slip through the cracks.
The photos and text in an Instagram ad or Amazon product description could have a lot to do with your decision to buy a product.
Google, Apple, and Amazon have policies prohibiting companies from making misleading claims in ads. Facebook (which owns Instagram) even has specific guidelines for health claims. And Apple says it carefully scrutinizes health apps with the potential for causing physical harm.
Still, ads and listings for miracle cures and shady diet products make their way past those safeguards. As with the FTC, tech companies' reactive measures aren't enough to stop every scammy wellness product.
Instagram ads don't always show the companies behind the brands. That makes it difficult to hold businesses accountable.
"Most companies self-police because they are long-term players and they want to get repeat purchases," said Anita Rao, an associate professor of marketing at the University of Chicago's Booth School of Business who has studied FTC regulation. "Versus, there are some companies that just want to exist for a month. Just make as much money as you can, say you cure coronavirus, and then just exit the market. Those are short term players."
Because companies can go viral and make big bucks with a single product, they don't need to focus on silly things like "trust" or "reputation." And it's easy to write fake reviews to drown out negative feedback.
The idea of "preventing harm" informs regulatory guidelines and advertising policies. But often that just means physical harm.
If a user is trying to ease their anxiety or drink less, using an app that doesn’t work could be very discouraging. And some apps can push unhealthy ideas about weight loss. Those are certainly "harmful" consequences, but they’re often not regulated.
"A lot of app downloads happen in crises," Tekin said. "They over-expect the power of one app."
So what is Mashable looking at first? Massage guns, which are supposed to help pro athletes and regular gym-goers alike recover from workouts.