It is possible to improve digital debates without censoring them; that is what we are working on

Question: Professor Wessler, let's first take a step back: How did your interest in communication come about?

Hartmut Wessler: To answer this question, I actually have to take a big step back, to Namibia, where I was born. I grew up there in a colonial environment shaped by apartheid. I realized early on how important it is to communicate across divides, even if I didn't consciously reflect on this as a child. Perhaps that was the decisive influence. I later studied journalism, political science and sociology in Berlin and spent some time in the USA. Today, at the University of Mannheim, I research what good discussions mean for democratic discourse.

Question: What is a "good" discussion?

Hartmut Wessler: For me, a good discussion is characterized by the fact that you can express yourself and learn from each other at the same time. Democracy thrives on this exchange, on listening at least occasionally. I find it exciting and important to find out how we can improve this interaction in digital debates.

Question: If you look at the development of the communication system over the last 20 years, what do you think has changed the most?

Hartmut Wessler: The transformation brought about by digitalization is enormous. Whereas public discourse used to be curated by editorial teams, today practically anyone can voice their opinion freely. That sounds positive and democratic at first. But as a result, we are also seeing more forms of communication that used to remain private, such as the slogans once reserved for the regulars' table at the pub. This dynamic has created space for strategic actors who deliberately disrupt and polarize public discussions. Such "polarization entrepreneurs", as the sociologist Steffen Mau calls them, pursue the goal of dividing society into opposing camps in order to achieve certain political or social ends, often with extreme positions. These actors deliberately fuel conflicts because they benefit when camps harden and dialogue no longer takes place.

Question: You spoke of polarization entrepreneurs and networks that thrive on dissent. Do you believe that we are developing into a society in which opposing camps no longer listen to each other?

Hartmut Wessler: It is a legitimate concern, but also an open empirical question. Germany, for example, is less polarized than the USA, and the idea of echo chambers is viewed with more nuance in academia today. On the internet we are often exposed to even more opposing opinions than in our private lives. The real problem is that these other positions are often perceived only negatively. The metaphor of "trench warfare" therefore fits better than that of the "echo chamber". Democratic communication thrives on dissent; what matters is how that dissent is carried out. The aim should be to create more willingness to at least consider other perspectives.

Question: One keyword you have already mentioned is listening. What role does it play in the digital debate culture?

Hartmut Wessler: A very big one. In English, we talk about voice and listening - in other words, expression and listening. Only when both elements come together do people really feel heard. This also applies to democratic communication: if you have the feeling that you are allowed to speak but are not heard, dissatisfaction arises. Today, many people withdraw from public discussions because they are exposed to hate speech and disinformation. Initiatives such as "Ich bin hier" (I'm here), which take targeted action against hate comments on social networks, are doing important work here. They show that individuals can act as trustees of the public debate, even if this can only be part of the solution. Ultimately, we need more regulation and technical solutions to improve the quality of discussion.

Question: What could reverse the trend of increasing polarization?

Hartmut Wessler: First of all, we need to understand the causes of polarization. One central cause is the economic logic of social media, which is based on emotional interactions. Platforms are not programmed for democracy, but to maximize interactions, often through negative stimuli. The platforms are interested in generating as many data traces as possible from users, for example through intensive, emotional interactions, because these can be monetized best. What would be important for democracy, however, is pretty much the opposite: aspects such as listening, discussing differences of opinion or the ability to reach a joint decision. Unfortunately, such quality-oriented elements are not provided for on these platforms. Our research approach therefore aims to steer discussions in this direction through moderation. In doing so, we go beyond pure content moderation and focus on improving debates without censoring them in any way. In a comprehensive experimental setup, we tested the effectiveness of AI in moderation by training it to recognize when a debate shows too little consideration or attention to others and then to make respectful, listening-oriented interventions. Surprisingly, AI moderation turned out to work just as well as human moderation in promoting listening.

Question: What did this experiment actually look like?

Hartmut Wessler: The participants discussed a fictitious scenario involving the planning of a barbecue with vegetarians and meat eaters. We wanted to see whether AI could intervene in heated debates. To do this, we programmed the AI to intervene in a respectful way and to encourage participants to engage with each other or ask questions. In addition, we provided special buttons such as "Thank you" and "Respect", which offered a way to signal appreciation for one another. We then conducted half-hour online chats with a total of almost 800 participants and tested the various elements against each other. Our own human moderation and the AI moderation actually contributed almost equally to improving mutual listening in the discussions.
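The intervention logic described above (detect when recent messages show too little mutual attention, then post a respectful, listening-oriented prompt) can be sketched roughly as follows. This is a minimal illustration, not the team's actual system: the `Moderator` class, the keyword heuristics standing in for the trained AI classifier, and the window and threshold values are all invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Moderator:
    window: int = 3          # number of recent messages to evaluate together
    threshold: float = 0.34  # below this average score, the moderator steps in
    history: list = field(default_factory=list)

    def listening_score(self, text):
        """Crude proxy for 'attention to others': asking a question or
        addressing another participant directly counts as a listening signal."""
        signals = 0
        if "?" in text:
            signals += 1
        if "you" in text.lower():
            signals += 1
        return min(signals / 2, 1.0)

    def observe(self, text):
        """Record a chat message; return an intervention prompt when the
        recent conversation shows too little mutual attention, else None."""
        self.history.append(self.listening_score(text))
        recent = self.history[-self.window:]
        if len(recent) == self.window and sum(recent) / self.window < self.threshold:
            self.history.clear()  # avoid repeating the prompt immediately
            return ("Could you pick up a point someone else made, "
                    "or ask them a question about their view?")
        return None
```

In the experiment this role was filled by a trained AI model and by human moderators; the sketch only mirrors the observable behavior of intervening once listening signals drop across several messages in a row.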

Question: How do you assess the potential of using such AI-supported tools on large news platforms?

Hartmut Wessler: The potential is great, but each platform would have to adapt such tools to its own context. Every editorial team has its own culture and history when it comes to moderation. It would therefore make sense to see such tools as a long-term project that is regularly evaluated and adapted. For us as academics, it is exciting to set such processes in motion, because it is not just about reach, but also about the quality of the debate and achieving a constructive exchange.

Question: So AI-based moderation could become a counterweight to the prevailing algorithms and build new bridges?

Hartmut Wessler: I believe it is important to give people more technical and communicative opportunities to express their willingness to cooperate. Most people want to discuss things constructively and find solutions together. Unfortunately, the current structures of social media hardly encourage this willingness. If we create ways to strengthen this cooperation and encourage people to listen, this can help to make us less divided as a society in the long term.

Question: So would it be possible in future for democracy-strengthening tools like the ones you have developed to become just as visible as the polarizing forces?

Hartmut Wessler: That is precisely the hope. There are many people who value democracy, but we are currently experiencing a crisis phase. Science can play a bridging role here by offering solutions and showing how such tools could bring society closer together again. This is not about an all-encompassing solution, but about taking small steps in the right direction and observing what works and what does not. I am convinced that in the long term it is possible to defuse the trend towards polarization and strengthen democratic dialogue.

Photo: Hannah Aders