Unmasking Social Bots

Research project at Bielefeld University funded by the Volkswagen Stiftung (Volkswagen Foundation)

Social Bots are believed to have influenced public opinion in the 2016 U.S. presidential election, and before the European Parliamentary election, an EU Commissioner warned of disinformation campaigns being waged with Social Bots. Social Bots are specialized computer programs designed to communicate over social media. What impact are Social Bots having on societal discourse? And how might technical systems be used to combat these Bots? Researchers from Bielefeld University, the Bielefeld University of Applied Sciences, and the Australian National University are working on these questions in a research project funded by the Volkswagen Stiftung (Volkswagen Foundation).

How are Social Bots influencing political debates, and how can they be detected? Together with project partners, Bielefeld University’s Prof. Dr. Philipp Cimiano, Dr. Ole Pütz, and Privatdozent Dr. Florian Muhle are working on these issues in the “Unbiased Bots that Build Bridges” research project. Photo: CITEC/Bielefeld University

“Today, the formation of public opinion is increasingly taking place online. As a result of this trend, Twitter, Facebook, and other social media platforms are growing in importance as channels of political communication,” says Prof. Dr. Philipp Cimiano, a professor at Bielefeld University’s Cluster of Excellence Cognitive Interaction Technology (CITEC). The computer scientist heads the “Unbiased Bots that Build Bridges” (U3B) project together with Dr. Ole Pütz and Privatdozent Dr. Florian Muhle. In this project, the researchers are investigating how automated systems and Social Bots influence opinion-making on the Internet. Their approach uses machine learning methods, among other techniques, to recognize virtual robots that purport to be human beings on social media platforms. “We believe that there is an unknown number of profiles out there that have not yet been recognized as Bots. In addition, current systems for identifying Bots still have quite high error rates,” says Cimiano.
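To make the idea of machine-learning bot detection concrete, the sketch below shows one common style of approach: scoring an account's bot-likeness from simple behavioral features. This is purely illustrative and is not the U3B project's method; the feature choices, weights, and field names are invented for demonstration.

```python
import math

def extract_features(account):
    """Turn raw account metadata into numeric features.
    Field names here are hypothetical, not a real platform API."""
    tweets_per_day = account["tweet_count"] / max(account["age_days"], 1)
    follower_ratio = account["followers"] / max(account["friends"], 1)
    has_default_image = 1.0 if account["default_profile_image"] else 0.0
    return [tweets_per_day, follower_ratio, has_default_image]

def bot_score(features, weights=(0.02, -0.5, 1.5), bias=-1.0):
    """Logistic score in (0, 1); higher means more bot-like.
    In a real system these weights would be learned from labeled data."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# A hyperactive account posting ~300 times a day with few followers
hyperactive = {"tweet_count": 90000, "age_days": 300, "followers": 10,
               "friends": 2000, "default_profile_image": True}
# A typical account posting about once a day
typical = {"tweet_count": 1200, "age_days": 1500, "followers": 400,
           "friends": 350, "default_profile_image": False}

print(bot_score(extract_features(hyperactive)))  # close to 1
print(bot_score(extract_features(typical)))      # much lower
```

A learned classifier would replace the hand-set weights with ones fit to labeled accounts, but the pipeline shape, features in, probability out, is the same.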

“Social media can indeed facilitate political dialogue in the public sphere,” says Florian Muhle, member of the Faculty of Sociology. “But the danger with social media is that people exchange ideas with other like-minded individuals, thus reinforcing the opinions they already hold. Automated systems in social media can thus intensify the development of such ‘echo chambers’ that supply users with ideas which affirm their conception of the world.”

“The challenge is to distinguish Social Bots from other accounts. To do this reliably, we work in an interdisciplinary team with members from the social sciences as well as technology and engineering,” says Dr. Ole Pütz, a sociologist in Philipp Cimiano’s research group who also serves as coordinator of the U3B project. “This enables us to combine qualitative methods from the social sciences with technical approaches from machine learning. In addition, we also use psychological experiments to study the impact of Social Bots,” explains Pütz. In these experiments, participants are shown examples of posts from Twitter and asked to judge whether each was written by a human or a Bot. They also rate how convincing or how emotional they find each post.

“People react to Social Bots in different ways: some take them seriously, while others see right through them. These tests are meant to help us distinguish different types of users,” explains Dr. Florian Muhle. “This can help to create customized information and assistance for dealing with Social Bots based on the needs of different types of users.”

By better understanding Social Bots and how they operate, the team seeks to develop technical systems that detect Social Bot activity and help to build bridges between fragmented communities on the Internet. Such systems could be used during election cycles, for instance, to warn users before they re-post news stories published by Bots.

The project is called “Unbiased Bots that Build Bridges (U3B): Technical Systems That Support Deliberation and Diversity as a Chance for Political Discourse.” The Volkswagen Stiftung (Volkswagen Foundation) is funding the project through March 2020 as part of its initiative “Artificial Intelligence and the Society of the Future.”

More information:

Prof. Dr. Philipp Cimiano, Bielefeld University
Cluster of Excellence CITEC
Telephone: +49 521-106 12249
Email: cimiano@techfak.uni-bielefeld.de