Research Finds Extreme Elitism, Social Hierarchy among Gab Users

A new study finds that a small number of influential users share homogeneous content, some of it associated with state-sponsored propaganda.

September 3, 2019


Despite its portrayal as a network that champions free speech, the social media platform Gab displays a more extreme social hierarchy and greater elitism among its users than Twitter does, according to a new study published in the online journal First Monday.

Researchers found that a small number of Gab users hold a significant amount of influence and share more homogeneous content than Twitter users, some of it associated with state-sponsored propaganda.

The ongoing debate surrounding content regulation and the censorship of social media platforms has grown increasingly contentious in recent years. Despite this, little research has been done to understand the impact regulation might have on online behaviors.

Researchers at the George Washington University School of Engineering and Applied Science, Northeastern Illinois University and Johns Hopkins University set out to examine the impact of content regulation by comparing two social media networks with similar technical structures: Twitter, a popular platform that occasionally removes content in violation of its terms of service; and Gab, a completely unregulated platform that advertises itself as championing “free speech, individual liberty and the free flow of information online.”

The results suggested that the complete absence of regulation did not lead to more fair and inclusive discourse, nor to a free marketplace of ideas.

“Governments and social media platforms around the world are trying to figure out how they can promote a free exchange of ideas without censoring content,” said David Broniatowski, a SEAS associate professor. “When compared to Gab, our results suggest that adherence to even relatively mild terms of service, such as Twitter’s hateful conduct policy, may make a big difference in promoting the free exchange of ideas.”

The team looked at millions of messages sent on each platform between August 2016 and February 2018 to determine who shared content and the types of content shared.

The researchers discovered that, despite Gab's promotion of open and free discussion for all, a smaller number of users with substantial followings controlled a larger share of the platform's content than was the case on Twitter.

For example, most Gab users had fewer than five followers and did not post messages regularly: 57 percent of the platform's approximately 150,000 users posted fewer than four messages during the 18-month study period. In contrast, just 3,000 users each posted more than 1,000 messages, creating a more extreme social hierarchy.
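These figures describe a heavily skewed posting distribution. As a rough illustration of how such concentration can be measured, the sketch below computes the share of messages contributed by the most active accounts and the fraction of near-silent accounts from a list of per-user message counts; the function, thresholds and example data are hypothetical and are not the study's own code.

```python
def posting_concentration(messages_per_user, top_n=3000, min_active=4):
    """Summarize how concentrated posting activity is.

    messages_per_user: iterable of message counts, one entry per account.
    top_n / min_active: illustrative thresholds echoing the figures above.
    """
    counts = sorted(messages_per_user, reverse=True)
    total = sum(counts)
    # Share of all messages contributed by the top_n most active accounts.
    top_share = sum(counts[:top_n]) / total if total else 0.0
    # Fraction of accounts posting fewer than min_active messages overall.
    low_activity = sum(1 for c in counts if c < min_active) / len(counts)
    return {
        "share_of_messages_from_top_users": top_share,
        "fraction_of_low_activity_users": low_activity,
    }

# Example with made-up data: a few prolific accounts, many near-silent ones.
example = [1500] * 10 + [2] * 990
print(posting_concentration(example, top_n=10, min_active=4))
```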

Turning to the types of content users posted and shared, the research team identified several popular topic areas on each platform. The team found Gab users' content to be more homogeneous, with political topics making up 56 percent of all messages sent and race and religion also being popular topics of discussion. On Twitter, political topics were popular but made up a smaller percentage of posts relative to other topics, such as pop culture, business and sports. The researchers also found Gab users to be more linguistically homogeneous, with 89 percent of all messages written in English.

As on Twitter, Gab users also shared information through external links, with 74 percent of the most commonly shared URL domains on Gab pointing to news websites. While the researchers found these links mostly led to far-right news websites, they also observed links to websites associated with conspiracy theories and with state-sponsored messaging from non-U.S. governments.
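One common way to carry out this kind of analysis is to extract the domain from each shared URL and count which domains appear most often. The sketch below illustrates that general approach; the domain list, function name and example URLs are hypothetical and only echo the analysis described above, not the researchers' actual pipeline.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical set of news domains; the study's actual categorization is not reproduced here.
NEWS_DOMAINS = {"example-news.com", "another-outlet.org"}

def top_shared_domains(urls, top_k=50):
    """Count the most commonly shared URL domains and the fraction that are news sites."""
    domains = Counter(
        urlparse(u).netloc.lower().removeprefix("www.") for u in urls
    )
    top = domains.most_common(top_k)
    news_fraction = (
        sum(1 for domain, _ in top if domain in NEWS_DOMAINS) / len(top) if top else 0.0
    )
    return top, news_fraction

# Example usage with made-up URLs.
urls = [
    "https://example-news.com/story1",
    "https://www.example-news.com/story2",
    "https://another-outlet.org/opinion",
    "https://forum.example.net/thread/42",
]
print(top_shared_domains(urls, top_k=3))
```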

“Unfortunately, Gab’s stated mission of totally free speech creates an environment in which significant hateful and divisive rhetoric flourishes and paradoxically impedes debate. As a platform, it ultimately promotes the kind of hate which culminated in the shooting at a Pittsburgh synagogue in October 2018 shortly after the shooter posted on Gab,” William Adler, an associate professor of political science at Northeastern Illinois University, said.

Dr. Broniatowski and his colleagues noted the results from the study have implications for attempts by governments to regulate social media platforms and for platforms to regulate their own content.

The researchers suggest possible options for regulation, including redesigning algorithms to ensure that a small number of voices do not dominate the debate, and reducing the ability of automated content to overwhelm genuine content by requiring users to disclose whether their accounts are operated by humans or bots.

A Fuzzy Trace Model

Dr. Broniatowski also recently published a paper examining the factors that make a person likely to share online content, in particular tweets about vaccination. The work was driven by Fuzzy Trace Theory, a theory in experimental psychology that predicts people are most likely to remember things that communicate clear meaning. Dr. Broniatowski and his co-author, Valerie F. Reyna of Cornell University's Human Neuroscience Institute, explored whether they could use this theory to predict which content would be most likely to be shared.

Their research found that, beyond the emotional and motivational “clickbait” components that garner attention, a tweet must communicate some bottom-line meaning or suggestion for action to go viral.

For example, false stories linking vaccines and autism provide a causal connection that helps people feel like they have some insight or understanding into mysterious events in the world around them, Dr. Broniatowski said. This research is important because it explains why there may be a link between conspiracy theorizing and online sharing behaviors.

“For the average person who may not be informed and may be looking for an answer to something they can’t explain, these kinds of conspiracy theories give them the answer that they are looking for,” he said.

Tweets that contain motivational content are more likely to be noticed, but may be quickly forgotten. In contrast, tweets containing meaningful content about vaccinations are more easily remembered, and so are retweeted more frequently than tweets on the same subject that lack such meaning, Dr. Broniatowski said.