“One of our biggest responsibilities is to protect data.”
That’s what Mark Zuckerberg, Facebook’s chief executive and co-founder, said just over a week ago after revelations that the data research firm Cambridge Analytica had gained access to the profiles of 50 million Facebook users.
Over the past year, the downside of social media companies — Facebook foremost among them — has become glaringly apparent. They have been platforms for information campaigns that influenced elections and endangered lives. They have failed to keep users’ personal information private. And the carefully targeted ads that appear on them can be, well, creepy.
Now the drumbeat for regulation of social media on both sides of the aisle — and from many in Silicon Valley — is getting louder by the day. “I think that this certain situation is so dire and has become so large that probably some well-crafted regulation is necessary,” Tim Cook, Apple’s chief executive, said recently.
Mr. Zuckerberg may be getting tired of other executives sniping at his company, but he needs to come up with a solution — fast. For several weeks, I’ve been canvassing various technology executives, privacy advocates, academics and others to come up with some ideas about exactly what Facebook could do to fix itself.
It is not clear that Facebook will be able to stave off regulators, but Mr. Zuckerberg and his colleagues might want to consider this: a “Why me?” button. (Google, Amazon and others might want to take note, too.)
Facebook suffers from a lack of trust because of the asymmetrical nature of the relationship users have with it. We provide it all sorts of information. But we have no idea how that information is being used: how our data is being harvested, how it is being commingled and cross-referenced with other data sets, and how it is ultimately sold to advertisers. (Facebook already has a button, relatively buried in a list of other items, that it calls “Why Am I Seeing This Ad?” It is, to some degree, a crippled version of what I’m suggesting should be introduced on all of Facebook’s properties.)
To his credit, Mr. Zuckerberg does appear to recognize the problem that his company faces over privacy concerns. After acknowledging that protecting users’ data is one of the company’s biggest responsibilities, he went on to say, “If you think about what our services are, at their most basic level, you put some content into a service, whether it’s a photo or a video or a text message — whether it’s Facebook or WhatsApp or Instagram — and you’re trusting that that content is going to be shared with the people you want to share it with.”
But the solution Facebook offered last week — putting all of your privacy settings on one screen and blocking some third parties from access to data — is just a Band-Aid. It’s a start, but it doesn’t provide a road map for how your information is being used.
That’s where the “Why me?” button could help. This button would sit next to every advertisement and piece of content that appears before you on all of Facebook’s properties, including Instagram and WhatsApp.
If you saw an ad or an article pop up on your screen, you could click the “Why me?” button. Then you would see a full explanation of why that item was pushed to you.
The “Why me?” explanation would include not just the name of the advertiser but also the keywords, demographics or other information the advertiser specifically targeted. It would also offer a full rundown of how your information fit into the parameters of the advertiser’s request and where that information came from.
Maybe Facebook saw that you had clicked on a pair of sneakers on a different service that shares information with Facebook. The “Why me?” section would show you when and where that information was collected. Did Facebook notice you searched for a new cellphone plan? Facebook would tell you how it knew that, too.
Did Facebook use artificial intelligence to find your face in an image without you or anyone else labeling it? The “Why me?” section would show you the images that the artificial intelligence system cross-referenced to identify you. Did Facebook scrape another site or buy information about you that it commingled with the data that you had provided? It would tell you that, too.
Most important, at every point in the “explanation tree,” the user would have the option to turn off or disable that specific piece of data. Facebook, of course, could — and probably should — explain the trade-offs of such decisions.
The “Why me?” button could also be extended to content. When a friend or company shares a post, Facebook doesn’t just display it to everyone listed as the poster’s friend or follower. Facebook’s algorithms choose which friends and followers see it based on all sorts of parameters, usually related to the kind of content or how you’ve interacted with the friend in the past.
The “Why me?” button should explain all the data points that are used in the calculation and how they are weighted.
The current “Why Am I Seeing This Ad?” often says something like: “Company A wants to reach people ages 18 and older who live or were recently in the United States. This is information based on your Facebook profile and where you’ve connected to the internet.”
But it hardly tells you the whole story. Facebook doesn’t say so explicitly, but it allows companies to upload their own databases, cross-reference them with Facebook’s data and use that information to serve ads to users. It also doesn’t say how that other site originally got that information about who you are.
In fairness, the “Why me?” button might create all sorts of problems for Facebook, and its advertisers, too. It would allow users — and rivals — to reverse engineer much of the way the system works. And advertisers would probably object to the idea of making their targeting plans public. But that would be the cost of using such large public platforms with such exact targeting.
It has become something of a cliché, but in 1913, before he became a Supreme Court justice, Louis D. Brandeis said, “Sunlight is said to be the best of disinfectants.”
That advice could well be applied today at Facebook and its peer companies, if they want to avoid a governmental crackdown.