It’s the customer’s data, stupid

Paul McKeon

Shortly after the US presidential debate finished, the chief executive of Oura, makers of the eponymous health and fitness tracking ring, commented on LinkedIn that the event had literally made people’s hearts beat faster.

Apparently, you weren’t alone in holding your breath in suspense as Joe Biden appeared to lose his train of thought in front of more than 50 million American voters, likely marking the beginning of the end of his re-election campaign.

Oura’s Tom Hale wrote, “typically we expect to see heart rate levels drop sharply in the evening, but we saw them stay at an almost daytime-like level for the duration of the debate… we saw dropped heart rate variability rates (by nearly 5 per cent) by the end of the debate, possibly indicating higher levels of physiological stress and lower recovery”.

On the surface this was a nice publicity coup for Oura, demonstrating the potential of a device still largely in the shadow of the Apple Watch. But, longer term, using user data like this is a minefield and other leaders would be well-advised to tread carefully.

The important thing to remember is: it’s not your data, it’s your customers’ — and that is the practical reality regardless of what is written in your End User Licence Agreement, privacy policy or other terms and conditions.

Oura is not the first to miss the memo.

Around 15 years ago, fuelled by the rise of smartphones, information from internet-connected devices appeared to be the ‘new oil’. Software developers and device manufacturers seemed to be sitting on a potential data goldmine.

It looked like a rare win-win. A smart toothbrush, for example, could mean better dental health for customers and also a new income stream for the manufacturer; either via subscriptions for premium services or the sale of the aggregated user data it produced.

It might even improve our knowledge of human health. Alphabet formed Google Health around the same time with the aim of combining real-world data from many sources to help improve clinical outcomes, increase efficiency and reduce the cost of healthcare.

Over subsequent years I advised on a number of transactions where the perceived value of an organisation was based, at least in part, on its potential to monetise data, or on an exit strategy that anticipated a sale to Apple, Google, Samsung or GE.

But times have changed, as smart device pioneer iRobot learned the hard way.

In 2017, the US company, whose Roomba product line created the market for modern robot vacuums, found itself in hot water after Reuters reported it had plans to sell the data gleaned from mapping customers’ homes.

The company issued a clarification, walking back selling to sharing, but the damage was done. Five years later, iRobot was still regularly being asked to deny the original story when Amazon launched a (since abandoned) bid for the struggling business.

Today, iRobot’s privacy policy says that while it will not sell users’ data, it may share it under certain conditions, including when anonymised. The company apparently missed a 2019 Nature paper showing that such anonymised data can be reverse-engineered.

If people baulk at one company profiting from its knowledge of the layout of their homes, imagine how they might feel about another making use of their most personal of personal information: health data.

These days Google Health is a much scaled-down initiative; no longer a separate division or “alpha bet”. Big Tech has learned that data is only valuable if you can sell it — which means finding someone willing to pay — and that’s often easier said than done.

Regulatory changes have also had an impact.

The US Food and Drug Administration has raised the bar for organisations aspiring to market ‘general wellness’ solutions in that country (and, in effect, globally), pushing many previously under the radar into expensive and time-consuming 510(k) submissions.

And, since the introduction of GDPR (General Data Protection Regulation) in the EU in 2018, many organisations have updated their privacy policies, and associated training, to ensure any information they gather is used only for the purpose it was intended.

But not everyone.

In 2020, the Australian Competition and Consumer Commission successfully pursued medical booking service HealthEngine for misleading conduct, including selling non-clinical information from 135,000 patients to health insurers.

The company, which blamed the action on “rapid growth … which outpaced our systems and processes”, was ordered to pay $2.9 million in penalties. It subsequently built the Morrison Government’s problematic COVID-19 vaccine booking system.

In Oura’s case, clearly no individual’s privacy was violated by the post-debate anecdote.

The company’s privacy policy also explicitly states that it shares users’ personal data (including heart rate, movement, temperature, and respiration) with ‘certain trusted service providers and partners’, and that it does not sell or rent such information.

It’s also reasonable to argue that collectively, Oura users, as comparatively early adopters and people likely to be interested in the ‘quantified self’ aspects of health and fitness tracking, might see little harm in their data being used for marketing purposes.

But just because Oura can do it, doesn’t mean everyone can.

In the event of a reputation crisis like the one iRobot experienced, a generous privacy policy isn’t a get-out-of-jail-free card. It should represent the minimum standard to which an organisation holds itself regarding the user data entrusted to it, not the highest.

That’s important because the most precious asset of any business is not its data or intellectual property, it is customers’ trust. And the rules by which people place their trust in an organisation — and can also take it away — aren’t written in black and white.
