About to turn 25, Wikipedia faces the challenge of remaining relevant and financially viable on an Internet dominated by generative artificial intelligence. João Peschanski, executive director of Wikimedia Brasil and one of the founders of Wikimedia in the country, says that a growing number of people are accessing information “retrieved” (systematically collected) from Wikipedia through ChatGPT and other chatbots.
“These AIs are sucking us dry. Wikipedia is the biggest source for ChatGPT and Gemini, and they don’t attribute it to us, they don’t list Wikipedia as a source,” he says. “We depend on small donations. If people access our projects less, they will donate less.”
But he cautions that the survival of resources like Wikipedia is essential to the existence of generative AI models. “If we cease to exist, the content consolidation and collaborative curation that hold it all together will also cease.”
What is the profile of Wikipedia editors, all volunteers?
The data show that the largest share of editors is over 35 years old (44%), and that profile is evolving: in 2021, 31.5% were over 35. The majority are still men; 17.13% of editors are women, up from 11.10% in 2021.
What motivates someone to become a volunteer Wikipedia editor?
The most common editor is the circumstantial one: someone who went to Wikipedia, saw that something was out of date, and made an edit. For example, somebody died, Cristiano Ronaldo scored a goal, Corinthians reached the Copa do Brasil final. This person adds the information and only edits again after a very long interval.
They are the majority of editors, but not the majority of edits: 10% of editors make 80% of the edits. Among them are those with an almanac profile, who make long lists and catalog everything. They are generally obsessive people. And there are people who work within niches.
In the United States, there is a movement of right-wing politicians and influencers who accuse Wikipedia of having a left-wing bias. Elon Musk, for example, has dubbed the site “Wokepedia”. Does this phenomenon also exist in Brazil?
A Harvard study a few years ago showed that Wikipedia’s political content is very centrist. The article on Bolsonaro, for example, is written by editors of all stripes, Bolsonaristas and non-Bolsonaristas.
In the case of women, there are more and more women editors, but that is not enough to say that Wikipedia is feminist (…) Today, women still represent only 17% of editors. In Brazil, the one who said Wikipedia was left-wing was Levy Fidelix (founder of the PRTB and former presidential candidate, 1951-2021). He made programs against Wikipedia and sent extrajudicial notifications requesting changes to entries.
Are you receiving many notifications?
About ten per month. In an election year, it reaches 50 per month.
When someone questions information, what is the procedure for this complaint to be investigated?
The person goes to the page and leaves a comment; other editors who follow that article then evaluate it. There is a “discussion” among the editors, Wikipedia’s talk page. You leave the comment there and people resolve it.
And when someone makes an edit that is a lie, adds disinformation, or turns out to be a press officer, how do you deal with that?
We monitor very significant changes to sensitive articles. Before the edit is even published, the person will be blocked if they delete content or vandalize it, unless they are a regular editor.
We also have a group of editors, whom we call statutory editors (there are 7,000 on the Portuguese Wikipedia), who monitor it. These are people with the power to delete, block, or protect articles and to block editors.
We also have automated mechanisms that flag changes that need to be reviewed. Edits appear in different colors when their authors are novice editors, so that more experienced editors can check them.
What do you think of the Supreme Federal Court’s decision that increases platforms’ liability for third-party content?
It is not clear whether or not we fall under it. These rules should be aimed at large platforms, which do in fact have responsibilities, and not at those that offer open educational resources, as is the case for us and many others.
PL 2630 (the fake news bill) included an exception for collaborative encyclopedias and other open educational resources. This time there is none. And we cannot apply the same rule adopted for Google to educational projects sustained by volunteers, with no advertising and no economic interest.
Is there any type of regulation that you believe is necessary to keep the information ecosystem habitable?
We need public spaces on the Internet, digital public goods, like the BBC. There should be regulation that not only imposes limits but also creates or encourages these public spaces, which belong neither to the private sector nor to the state. Otherwise, we will have only a privatized public square (controlled by big tech), which does not in fact offer real freedom for information to circulate, or we will have government spaces, a Gov portal.
What is the impact of AI on Wikipedia?
These AIs are sucking us dry. Wikipedia is the largest source for ChatGPT and Gemini. They don’t attribute it to us, they don’t list Wikipedia as a source. And Wikipedia is based on a free license that requires attribution. It is also a problem for the motivation of our editors. They work for free, but they want their work to be recognized.
I was in a meeting at Google two weeks ago and I said: look, we know you’re using it, use it, that’s great, but attribute it and become a donor. Today, Wikipedia depends on small donors. We need (the big AI companies) to contribute more.
Have you noticed a decline in the number of editors?
Not in Brazil, but we don’t really know why. On Spanish Wikipedia, yes.
Has there been a decline in Wikipedia’s audience?
In Brazil, the audience is stable: in 2024, we had 2.3 billion visits. In the United States and Europe there was a decline.
If Wikipedia visits decline because people can get the same information through chatbots, could this reduce donations and threaten Wikipedia’s sustainability?
We depend on small donations. If people access our projects less, they will donate less. But there is an important problem: ChatGPT doesn’t produce content, it only replicates it. If we cease to exist, the content consolidation and collaborative curation that hold it all together will also cease to exist. In the end, AI will be committing suicide, shooting itself in the foot, because if you kill the source of verified and collaboratively curated information, you will not be able to train your model effectively.
Is AI a threat to Wikipedia?
Look, there are challenges, but I think it’s also an opportunity for us to find other ways to remain viable. There is the European model, which consists of supporting Wikipedia and other educational projects as spaces for public communication. Today it would be difficult to have that debate; the discourse is more regulatory, against big tech, because of everything that has happened over the last ten years. But we need Internet governance that is constructive, not just restrictive.
PROFILE | JOÃO ALEXANDRE PESCHANSKI, 45
He is the executive director of Wikimedia Brasil, an affiliate of the Wikimedia Foundation, which manages Wikipedia. A graduate in Social Sciences from USP (University of São Paulo) and in Social Communication from PUC-SP, he holds a master’s degree in political science from USP and a doctorate in sociology from the University of Wisconsin-Madison (United States). He has worked on Wikimedia projects as a volunteer since 2011.