Can Facebook be Trusted with User Health Data?
Recently, Facebook was in the spotlight for trying to share patient data with health organizations.
Pharmacists are probably familiar with the drama surrounding Facebook in the past month related to a possible data breach affecting more than 80 million users. The breach, tied to Cambridge Analytica, goes back to 2014, when the firm began collecting data, and it has raised concern over the possible use of that data to influence voter opinion during political elections. Taking that into consideration, trust in Facebook has not been particularly high as of late. It's for that reason that a tidbit of news that recently came under media scrutiny caught my attention.
As CNBC's Christina Farr reported, Facebook had been sending a doctor to hospitals, looking to strike patient data-sharing agreements. In the wake of the Cambridge Analytica debacle, Facebook announced that this project would be put on hiatus so the company could focus on better protecting user data.
But why was Facebook pushing for hospital access to patient data? Well, on the one hand, it could be useful, depending on how it is used. The CNBC story quoted Cathleen Gates, interim CEO of the American College of Cardiology (ACC), which was approached by Facebook. "For the first time in history, people are sharing information about themselves online in ways that may help determine how to improve their health. As part of its mission to transform cardiovascular care and improve heart health, the American College of Cardiology has been engaged in discussions with Facebook around the use of anonymized Facebook data, coupled with anonymized ACC data, to further scientific research on the ways social media can aid in the prevention and treatment of heart disease—the No. 1 cause of death in the world," she said. "This partnership is in the very early phases as we work on both sides to ensure privacy, transparency and scientific rigor. No data has been shared between any parties."
What we are seeing is that medical organizations recognize that the social data patients have made available online could be used for health-focused studies and, possibly, clinical care. In this case, the example mentioned in the story was identifying patients who could be at risk at discharge from the hospital. A purported pitch from Facebook put it this way: the medical record tells you a patient is on so many medications, has these diseases, and needs these services. But Facebook knows whether the patient has difficulty with the English language, as well as their social and marital histories and community support. Patients who may have trouble transitioning home could be identified and intervened upon earlier, leading to potentially better health outcomes. A rather unique approach, I feel, and I could see other uses. Using social information could definitely be beneficial. From a health and pharmacy standpoint, though, I could see it being applied in different ways, for good or ill.
Take, for instance, parents who share or like antivaccination posts. Would these parents be targeted for health services to increase vaccination rates for their children through better education? Or consider patients who share and like alternative medicine posts. Would these patients be approached differently when being prescribed medications, say, flagged in a system so a provider recognizes that they may not be adherent or receptive to medications? I am not sure, but that seems to be part of what this could turn into.
But here's the rub: people who use Facebook didn't know that their information was being shopped around to healthcare organizations. And in light of Facebook's earlier history, such as when it manipulated users' newsfeeds to see whether it could affect their emotions (through positive and negative posts), and this debacle with Cambridge Analytica, Facebook isn't exactly setting itself up for a positive outcome.
I am not saying Facebook is terrible for this. I'd like to think it had some genuinely altruistic intentions here, and many researchers have been pushing for patients' online data to be involved in their care, such as in the treatment of mental health conditions and chronic diseases. Being transparent, I think, would simply have been the better course in this case. Ultimately, had this not all come to light, it could be argued that Facebook really wants to get into healthcare, in one form or another, for a profit. After all, Apple and Google aren't the only tech companies that see healthcare as the next frontier, so Facebook has to turn to what it can produce and sell, which is its users and their data.
Ultimately, Facebook has a lot of positives as a social network. Many patients are on Facebook in community groups that support one another. It's not as though people who use Facebook for this purpose are going to run away from the platform; they've integrated it into their lives. That being the case, I do see whatever goodwill has been built up over the years decreasing because of what Facebook is doing. Being transparent with its users would be in Facebook's favor, though, because as it stands, most people using the platform don't regard it as an organization they would associate with helping their health, intentionally or not.
Farr C. Facebook sent a doctor on a secret mission to ask hospitals to share patient data. CNBC. https://www.cnbc.com/2018/04/05/facebook-building-8-explored-data-sharing-agreement-with-hospitals.html. Published April 5, 2018. Accessed April 11, 2018.