It’s hard to remember what online life was like before the advent of Wikipedia nearly two decades ago. Every day, millions of people turn to the nonprofit site in moments of curiosity or confusion, taking for granted that it will give them answers — and quite possibly good ones.
Last week, Wikipedia received an unusual prompt. For one thing, it dealt with sensitive subject matter. And it didn’t come from a person — it came from YouTube, the largest video site on the internet.
During an interview at South by Southwest in Austin, Tex., YouTube’s chief executive, Susan Wojcicki, announced that the company would enlist Wikipedia’s help to deal with the proliferation of conspiracy theories and misinformation on its platform.
To fight the problem, she said, YouTube would soon begin experimenting with what it called “information cues” sourced from the online encyclopedia. The cues would appear as captions and article links beneath videos that dealt with topics related to popular conspiracy theories — she used the moon landing and “chemtrails” as examples.
Her interviewer, Nicholas Thompson of Wired magazine, gently teased, “So, YouTube will be sending people to text?”
Away from the conference stage, the responses were anything but gentle.
“I’m genuinely curious whether YouTube management is a bunch of Pollyannas who have never watched the videos on their platforms or a bunch of people who just don’t care,” Renee DiResta, a researcher and expert in online disinformation, tweeted.
Justin Brookman, the director for consumer privacy and technology policy at the Consumers Union, called the plan “a disingenuous cop-out that will make Wikipedia’s job harder.”
Others noted what seemed like an obvious point: Can’t anyone edit Wikipedia, including the conspiracy theorists themselves?
From the Wikimedia Foundation, which oversees Wikipedia, the first response was confusion.
YouTube is doing what?
Katherine Maher, the executive director of the Wikimedia Foundation, sounded as if she wouldn’t have minded a heads-up. “When the announcement came out, we were surprised that we hadn’t been contacted,” Ms. Maher said in an interview.
She had learned about YouTube’s plans at the same time as everyone else — including Wikipedia’s army of volunteer contributors, some of whom were not pleased with the idea that an internet colossus had casually declared that it would outsource one of its knottiest problems to a relatively small nonprofit organization.
“Wikipedia is not something that just exists,” Ms. Maher said. “It takes work and it requires labor.”
A Wikipedia editor who goes by the handle SEMMENDINGER shared a concern on a contributor discussion page, writing, “My worry is that the people viewing these sorts of videos in the first place are the same people who wholeheartedly are in agreement with the content.”
Another editor, Doc James, was more sanguine. “I do not imagine many problems,” he said. “We have lots of policies to help support those who come with good references and rebuff those without.”
YouTube’s announcement might be chalked up to an unintentional miscommunication or public relations misstep. But for Wikipedia, it was part of a familiar pattern.
A cross between a piece of infrastructure, a public commons and an online community, Wikipedia has been a boon to some of the largest companies in the world. Whatever benefit Google, the owner of YouTube, hopes to wring from the crowdsourced encyclopedia is likely to pale in comparison with the value it receives from including Wikipedia entries in its search results.
Google’s “Knowledge Graph,” which displays certain answers to search queries in separate, authoritative-looking boxes on the right side of a computer screen, draws heavily from Wikipedia. (In 2007, Google even started an ad-supported Wikipedia competitor called Knol. It was discontinued in 2012.) Furthermore, virtually every voice assistant, including Amazon’s Echo and Apple’s Siri, uses the site to give customers a wide range of answers to their questions.
Wikipedia has also proved invaluable to tech companies as they develop artificial intelligence and translation services — as a multi-language corpus for researchers, it is unparalleled (and, of course, free).
“The degree to which the Wikipedia data has informed computer science is pretty astonishing,” said Brent Hecht, an assistant professor at Northwestern University who has studied Wikipedia’s relationships with tech companies and online communities. “Wikipedia definitely creates massively more value for these companies than it puts in.”
With the YouTube announcement, Wikipedia finds itself in a predicament similar to that of a small town after it has been chosen as the site of a new Walmart: Some municipalities have said that the retail giant creates undue stress on local police forces, straining them to the breaking point with endless calls to deal with petty crimes at the super stores.
Wikipedia’s lopsided relationships can also amplify pre-existing internal problems. Its contributors skew white and male, for example, leading to predictable outcomes — just 17 percent of the subjects of biographies on the site are women. Wikipedia is taking steps to address the issue, but the next generation of consumer tech is being built on top of it — and its gaps and biases — in the meantime.
Having come of age in the era of desktop and laptop computers, Wikipedia is also struggling to bring itself into the smartphone era (not to mention whatever comes next).
“If we have interfaces like Alexa, for example, they’re using infrastructure and content generated by editors, but they’re not really encouraging readers to contribute in any way,” said Dariusz Jemielniak, who published a book-length ethnographic study of Wikipedia and now sits on the Wikimedia Foundation’s board. “They might not even realize it’s coming from Wikipedia,” he added. “It would be nice if people who take also give a little bit.”
The main problem with YouTube’s presumptuous announcement, Mr. Jemielniak suggested, is that Wikipedia is not necessarily geared toward breaking news — and conspiracy theories tend to move at lightning speed during times of crisis.
Recently, for instance, YouTube was rife with conspiracy videos about the school shooting in Parkland, Fla., and it is unclear what even a thorough and well-sourced Wikipedia article could have done to dissuade an audience intent on believing that students at the school were paid “crisis actors” carrying out orders.
More to the point, Mr. Jemielniak said, if Google wanted help from Wikipedia, it could have asked.
Ms. Maher agreed. At the very least, she said, it would be nice to have “a primary point of contact” at Google and companies like it — “someone who is invested in thinking of us as an entity rather than just a resource.”
As it stands, no such person has that role — though YouTube has reached out to the foundation since the announcement, and the two organizations are regularly in touch in less formal ways. (Google declined to comment for this story. Ms. Maher said of their interaction this week, “It was good for a start.”)
Then there’s the issue of money. As important as Wikipedia may be to some of the richest companies in the world, it is, in financial terms, comparatively minuscule, with a yearly budget of less than $100 million — a rounding error for big tech. (It should be noted that Google has made one-off contributions to Wikipedia in the past and includes the Wikimedia Foundation in a program through which it matches employee donations, which netted the foundation around $1 million last year.)
For now, Wikipedians — among the most central and least visible participants in this conversation — seem not overly perturbed.
“Looks like we have more work to do,” wrote one, after the YouTube announcement. “I doubt that much will change anytime soon.”