Facebook, Health Data Privacy and the Need for a New Business Model

By Kurt Long, CEO and founder, FairWarning, Inc.
Twitter: @FairWarningLLC
Twitter: @KurtJamesLong

According to CNBC, Facebook sent a doctor on a secret mission to ask hospitals to share patient data. Facebook was also in talks with top hospitals and other medical groups as recently as last month regarding a proposal to match hospital patient data with Facebook user data.

The idea was to build profiles based on these matches that would include medical conditions and prescriptions issued, as well as social and economic factors pulled from Facebook. The stated intention was to “help the hospitals figure out which patients might need special care or treatment.” Treading carefully in the wake of the Cambridge Analytica data leak scandal, Facebook stated that the project is on hiatus so it can focus on “other important work, including doing a better job of protecting people’s data.”

But given that protecting people’s data clearly hasn’t been a priority for the data giant, the statement rings hollow. For Facebook’s 2 billion users, many of whom are outraged by this latest revelation, the real issue isn’t just how Facebook now plans to protect its users but how data is used in the United States – and who that data actually belongs to.

In this climate of heightened awareness, healthcare organizations need to put customers first by making the privacy and security of data actionable priorities. You’ve probably heard it said that, in the digital age, data is the new currency. But that’s wrong; the new currency is trust. Trust is hard to come by these days, so any organization that can distinguish itself as a source of it will have no trouble attracting business.

Who’s Doing What with Your Data?
The question of what companies are doing with their customers’ data is not new, but it’s one that has been gaining steam recently. Though it’s impossible to reverse engineer the tipping point of any movement, several factors have played a role in raising awareness of the issue.

The first is the European Union’s General Data Protection Regulation (GDPR), set to take effect in May. The EU’s sensitivity to privacy dates to at least World War II, an acknowledgment of how information on individuals, if misused, can erode rights in ways that jeopardize a nation’s citizens or even a continent’s population.

The encryption debate is another factor. The government’s desire to keep its citizens safe has been pitted against the privacy of those citizens. Big tech firms like Apple have taken a stand, believing that creating “back doors” for federal authorities is a slippery slope that erodes user privacy.

Another factor is Russia’s attempts to compromise U.S. elections, which got people thinking about how they may have been targeted by a foreign power with persuasion tactics meant to sway their opinions. This ties into Facebook’s Cambridge Analytica data leak scandal: people are furious that their personal data was used without their knowledge, for purposes they may not have approved of, also with the intent of influencing the election.

Again, this activity isn’t new. Anyone who’s heard of Nate Silver knows that data analysis has been used for years by both political parties to plan and test campaign strategies. The difference here, though, is that private individuals’ personal data was being used without their knowledge or consent.

Taking a Stand
The Cambridge Analytica scandal hurt Facebook, and with this healthcare data scheme arriving hot on its heels, only time will tell if the social media platform will survive. In the same vein, sub-par privacy standards will hurt healthcare organizations going forward. Americans have reached the “enough is enough” stage. They are fed up with being treated merely as a crop to be harvested for someone else’s gain and are demanding change. Some of that change may come from legislation, but the law-making process is usually slow and consensus on difficult topics like data privacy can’t always be achieved.

So, individuals are rising to the challenge of creating change with actions such as the #DeleteFacebook campaign. When the likes of Elon Musk delete their accounts, it’s time for hospitals and healthcare companies to step up to the plate and actually take action on privacy and security instead of just issuing apologies and vague promises.

How do organizations maintain or rebuild consumer trust? First of all, executives need to talk about data privacy in a public and authentic way. Mark Zuckerberg talks about it, but he’s not authentic. He’s been talking about this topic since 2007, yet things have clearly gotten worse. During a recent press tour, Facebook COO Sheryl Sandberg admitted that the company was informed about the Cambridge Analytica data leak two-and-a-half years ago but did nothing.

On the other hand, Tim Cook at Apple has talked about data privacy and backs it up with his actions. He’s resisted pressure to monetize data mining for short-term gain, protecting the brand image over time and playing the long game of consumer trust.

Essentially, Apple’s strategy is “Our products are our products” instead of “Our products are free, and you are the product.” When this happens, the consumer wins – and so does the company. As consumers place a higher premium on their privacy, companies will be able to differentiate themselves by putting privacy front and center. As Apple continues to succeed with this strategy, others will follow.

Are You Building Trust?
This speaks to a new business strategy in which healthcare organizations weave privacy and security into their business model. They will implement privacy solutions as a matter of course; they won’t wait for a data breach but will take action proactively.

The scarcest thing in the world today is trust, and by building trust with patients, healthcare providers can offer a greatly differentiated service. That’s what it’s going to boil down to: trusted relationships.