Misinformation in Conversations about Tulsi Gabbard


On the weekend of 19 October 2019, I pulled just over 1.2 million tweets from Twitter covering conversation about Tulsi Gabbard. These posts span the period from 10 October 2019 to 19 October 2019. To understate things, there is a lot of noise in this data, along with multiple conflicting stories. The dataset contains accounts that average hundreds of posts a day and that almost certainly belong to real people. It also contains highly active accounts that look like rented or borrowed accounts from real people, and still other accounts that appear to be trolls or sockpuppets engaged in artificial manipulation of the conversation. Telling these different accounts from one another can be complicated -- the behavior of a true believer can look a lot like the behavior of a bot or a sockpuppet, and a useful idiot can look a lot like a professional troll.

The narratives in the dataset also cover a lot of ground. The time period covers the Democratic debate, as well as the conversation around Hillary Clinton's remarks about Jill Stein, Russian assets, and Tulsi Gabbard's comments calling Hillary Clinton a "warmonger." Multiple conspiracy narratives get rehashed in this data, including the claim that the 2016 campaign was "rigged" by the DNC so Hillary Clinton could win. (As a side note, Hillary Clinton is truly conspiracy catnip; so many conspiracy theories -- from Seth Rich, to Pizzagate, to Epstein, to Benghazi, to the DNC server -- have been spread around her that the mere mention of her name allows a host of conspiracy theorists to draw "connections" and to document those "connections" with links to multiple web sites and YouTube videos.) The conversation also included Yang supporters, Sanders supporters, Marianne Williamson supporters, Trump supporters, and adherents of the various Q conspiracy narratives, to name a few.

In this post, for reasons of brevity and coherence, I will limit myself to inorganic accounts and the narratives (and potential networks) they represent. This constraint leaves a lot out; for example, a scan of the popular YouTube shares shows (as usual) a skew towards outright misinformation or heavily slanted sources.

Accounts highlighted in this post meet multiple criteria to merit inclusion, and they represent narratives that are prevalent in the larger dataset. The other reason this writeup focuses on accounts is that the dataset contains multiple examples of what appears to be inorganic activity intended to manipulate the conversation online.

Key Highlights:

  • The conversations about Tulsi Gabbard on Twitter contain thousands of accounts that exhibit traits of inorganic behavior.
  • Multiple different actors are represented in the dataset. Gabbard -- as a subject of conversation -- has been used in different ways to promote and amplify different narratives. This post contains several examples of representative accounts.
  • The divisiveness of Gabbard within the Democratic party makes her an attractive subject for actors looking to pollute conversations.


At least 1,238,124 posts were shared by 342,894 accounts. 1858 accounts, or 0.54% of all accounts active in this spike, created 10% of the content. This imbalance of activity -- where under 1% of participating accounts create more than 10% of all content -- is comparable to, though somewhat more pronounced than, other spikes. This potentially indicates that the most active accounts in this dataset are more active than normal. While high posting rates can be an indicator of trolls or sockpuppets dominating the conversation, posting rates alone are not an adequate indicator of inorganic activity or artificial amplification of narratives.
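The concentration figure above -- the smallest share of accounts responsible for 10% of all posts -- falls out of sorting accounts by post count and accumulating until the target share is reached. A minimal sketch of that computation, using invented toy data rather than the actual dataset:

```python
from collections import Counter

def accounts_for_content_share(post_authors, share=0.10):
    """Return the smallest group of accounts (most active first)
    whose combined posts make up `share` of all posts, plus that
    group's fraction of all participating accounts."""
    counts = Counter(post_authors)           # posts per account
    total = sum(counts.values())
    target = total * share
    running, top_accounts = 0, []
    for account, n in counts.most_common():  # most active first
        top_accounts.append(account)
        running += n
        if running >= target:
            break
    return top_accounts, len(top_accounts) / len(counts)

# Toy example: one hyperactive account among 450 one-post accounts.
authors = ["loud"] * 50 + [f"user{i}" for i in range(450)]
top, fraction = accounts_for_content_share(authors, share=0.10)
```

Here the single hyperactive account alone accounts for 10% of the 500 posts, so `top` is just that one account and `fraction` is roughly 0.2% of the 451 participants -- the same kind of skew reported for the Gabbard data.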

Out of all of the accounts that posted about Tulsi Gabbard between 10 October and 19 October, 33,354 (9.7% of all accounts) were created since January 1, 2019. Out of these fresh accounts, 2397 averaged 100 or more posts a day, and 643 averaged 200 or more posts a day. To put this in perspective, to share 100 posts a day, a person would need to average just over 8 posts an hour over 12 hours, every day. To average 200 posts a day, a person would need to average just under 17 posts an hour for 12 hours, every day.
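The per-day and per-hour figures above follow from straightforward division over the account's lifetime. A sketch of the rate calculation, assuming a 12-hour daily posting window (the account values below are hypothetical):

```python
from datetime import date

def posting_rates(total_posts, created, collected, active_hours=12):
    """Average posts per day since account creation, and the hourly
    rate needed to sustain that within a daily posting window."""
    days = (collected - created).days or 1   # avoid division by zero
    per_day = total_posts / days
    per_hour = per_day / active_hours
    return per_day, per_hour

# Hypothetical account: 10,000 posts in the 100 days before collection.
per_day, per_hour = posting_rates(10_000, date(2019, 7, 11), date(2019, 10, 19))
# 100 posts a day works out to just over 8 posts an hour for 12 hours.
```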

Within the new accounts posting about Tulsi Gabbard, profile descriptions contained hashtags from across the political spectrum. While the hashtags used in profiles leaned toward hashtags associated with President Trump, right wing politics, and the Qanon conspiracy theory, several Democratic campaigns were also represented within the dataset.

Looking at the top 15 most popular hashtags in profiles of new accounts, we get a rough breakdown across the ideological spectrum:

  • 8 Trump/Right/Q related hashtags - 3779 total hashtags
  • 3 Yang related hashtags - 1192 total hashtags
  • 1 Sanders related hashtag - 267 total hashtags
  • 3 Left/General Progressive related hashtags - 1158 total hashtags

The counts for the top 15 hashtags are listed below:

  • maga 1322 Right
  • kag 772 Right
  • resist 738 Progressive
  • yanggang 695 Yang
  • trump2020 544 Right
  • wwg1wga 401 Right
  • yang2020 278 Yang
  • 2a 272 Right
  • bernie2020 267 Sanders
  • resistance 225 Progressive
  • humanityfirst 219 Yang
  • fbr 195 Progressive
  • kag2020 174 Right
  • qanon 167 Right
  • trump 127 Right
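A breakdown like the one above can be reproduced by extracting hashtags from profile descriptions and rolling the tag counts up into categories. A minimal sketch -- the category labels mirror the table, but the mapping and sample profiles here are illustrative, not the full tagging used for this analysis:

```python
import re
from collections import Counter

# Partial, illustrative mapping of hashtags to ideological categories.
CATEGORY = {
    "maga": "Right", "kag": "Right", "trump2020": "Right",
    "wwg1wga": "Right", "yanggang": "Yang", "yang2020": "Yang",
    "bernie2020": "Sanders", "resist": "Progressive",
}

def tally_profile_hashtags(profiles):
    """Count hashtags (case-insensitive) across profile descriptions,
    then roll the counts up into categories via CATEGORY."""
    tags = Counter()
    for text in profiles:
        for tag in re.findall(r"#(\w+)", text.lower()):
            tags[tag] += 1
    by_category = Counter()
    for tag, n in tags.items():
        if tag in CATEGORY:
            by_category[CATEGORY[tag]] += n
    return tags, by_category

# Invented sample profiles.
profiles = ["#MAGA #KAG patriot", "#YangGang #Yang2020", "#Resist always"]
tags, by_cat = tally_profile_hashtags(profiles)
```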

Representative Accounts

In this writeup, I am highlighting multiple representative accounts. These accounts are examples of types of accounts that are present in large numbers in the data set.

For an account to merit inclusion in this writeup, the account needs to meet several or all of the following criteria:

  • new account, and/or
  • highly active in general, and/or
  • active on a developing news story, and/or
  • collecting large numbers of followers quickly, and/or
  • has a follower/following percentage of close to 1, and/or
  • no personally identifying info available, and/or
  • profile picture or background is sourced from stock photos or is clearly not the account holder, and/or
  • the account posts in multiple languages and across multiple highly disparate topics, and/or
  • shares links to sources that are known to be unreliable, and/or
  • other images used as background or in timeline are misrepresentations, and/or have been used in past misinfo efforts.

While these traits are used specifically to evaluate accounts on Twitter, these criteria can be adapted for use on other platforms.
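The checklist lends itself to a simple flagging function: score an account by how many criteria it trips. A sketch of one possible heuristic -- the thresholds, field names, and sample account here are illustrative assumptions, not values or rules from this analysis, and no single flag proves inauthenticity:

```python
from datetime import date

def suspicion_flags(account, collected=date(2019, 10, 19)):
    """Return the subset of checklist criteria an account trips.
    Thresholds are illustrative; treat results as leads, not proof."""
    flags = []
    days = max((collected - account["created"]).days, 1)
    if days < 365:
        flags.append("new account")
    if account["total_posts"] / days >= 100:
        flags.append("highly active")
    following = max(account["following"], 1)
    if 0.9 <= account["followers"] / following <= 1.1:
        flags.append("follower/following ratio near 1")
    if account.get("stock_profile_image"):
        flags.append("stock or misattributed profile image")
    return flags

# Hypothetical account matching the patterns described in this post.
acct = {"created": date(2019, 9, 1), "total_posts": 14_000,
        "followers": 9_442, "following": 9_310, "stock_profile_image": True}
flags = suspicion_flags(acct)
```

An account tripping several flags at once -- as this hypothetical one does -- is the kind of account a manual review would then examine for stolen images, disparate topics, and unreliable sources, which are harder to score automatically.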

Account Type # 1: Yang Supporter

The representative account of a Yang supporter calls for Sanders to support Tulsi to back up Andrew Yang and Marianne Williamson. Multiple accounts attempted to create a connection between Sanders, Yang, Gabbard, and Williamson as people united in their fight against a "rigged" system. This narrative works in several ways: it uses the larger visibility of Bernie Sanders to gain visibility for less popular candidates like Yang, Gabbard, and Williamson; and it plays on the subtext of a "rigged system," a subtext that still resonates with a subset of Sanders supporters upset about the 2016 Democratic primary. These narratives are divisive, provide visibility for fringe candidates who could conceivably siphon votes away from a Democratic candidate in a general election, and use past history as a foundation for current narratives -- and the divisiveness of the narrative makes it a great tool for misinformation.


This account uses the name of James Chen, a "US Army Soldier." It was created in September 2019, and has been fairly inactive except during this spike in conversation.

Profile screengrab

However, this account uses the picture of Danny Chen, a soldier who died in Afghanistan in 2011.

News story

Account Type # 2: Right Wing/Q

This Q account opportunistically uses the conflict between Gabbard and Clinton to push multiple narratives and conspiracy theories. First, the account touches on a favored talking point held in common between Q and the far right, which claims that the Russian intervention in the 2016 election never happened, and that claims of Russian intervention are somehow part of a conspiracy designed to hurt President Trump. Second, this account shares links to ZeroHedge, a far right conspiracy site. Third, this account injects Q-related talking points into the conversation about Gabbard. All of these actions work to introduce the conspiracy theories and language of Q into the conversation about Gabbard.

Multiple Q posts

On its own, this might not seem especially significant - after all, the vast majority of people will be unaffected by these messages. However, misinformation or disinformation does not need to be universally successful to be effective. Success for some forms of misinformation involves normalizing fringe theories to make them more palatable to the mainstream, and eventually expanding the reach of a fringe message to people who might be receptive to it. This account has over 83,000 followers, which ensures that posts shared by this account (and similar accounts) are broadly distributed.

Account profile

This account uses a profile picture of Seth Rich, who has been at the center of conspiracy theories pushed by Sean Hannity on Fox, among others. The background picture uses images of Pepe, an image that was co-opted by right wing trolls. These images should be understood both as dog whistles to other conspiracy theorists, and potentially as propaganda for people not familiar with the conspiracy theories.

Also, this point needs to be emphasized: accounts highlighted here are representative. They are included not because they are singular examples; they are included because there are multiple accounts that fit this profile. For example, if we only look at new accounts (created after January 2019), over 100 have Q-related hashtags in their profile and average over 100 posts a day. The screencap below is from an account that was created in September 2019, and averages over 290 posts a day.

Second Q profile

As is suggested by the profile picture, the content of this feed is consistently racist, and amplifies conspiracy theories. These two Q-related accounts are two of many accounts active in the conversation about Tulsi Gabbard that fit this profile.

Account Type # 3: International Focus

The dataset also includes multiple accounts with a more international focus. These accounts use profile pictures drawn from non-US sources, post about topics related to international politics, post in multiple languages, and amplify stories from state-run or state-controlled media.

International Account 1:

In the screenshot below, we see an account sharing content in three languages (English, Turkish, and Urdu), and sharing content from the Government of Punjab and from RT, a site controlled by the Russian government. The RT article is in English, and discusses Brexit.

Multiple posts

The account posted 17 times about Tulsi Gabbard, including the article shown below, which asks a rhetorical question about whether Indian nationalists are collaborating with Russia to interfere in US elections.

Gabbard retweet

The account sharing these posts was created on May 30. Since going live, the account has averaged 288 posts a day. To put this in context, a person would need to average 24 posts an hour for 12 hours a day, every day, to create 288 posts daily.

International Account 2:

The next account we will look at was created on October 8, 2019, and has averaged 321 posts a day since it was created. The account shared multiple posts about Gabbard, including these two that float the idea of Gabbard as a third party candidate, and that highlight the conflict between Clinton and Gabbard.

Gabbard-related posts

From scanning the timeline of this account, the account also shares content more broadly related to US politics.

US politics

However, the account has broad interests outside of US politics. For example, it shares content from the Pakistani government, and BBC Urdu.

multiple languages

The account also shares content, in Russian, from the Russian government.

Russian language content

As an added bonus, the account also promotes pornography.

Porn spam

It is unclear who controls this account; however, despite the volume of English language content it shares, the account does not appear to be run by a person fluent in English.

Posts in English

International Account 3:

The next representative account was created on September 28, 2019. Since creation, it has averaged 817 posts a day. To put this in human terms, a person would need to share 68 times an hour for 12 hours a day, every day, to keep up this posting volume. In this dataset, this account posted 24 times about Tulsi Gabbard, generally highlighting posts from known progressive accounts.

Sockpuppet retweets

A review of the images used for the account profile highlights some red flags.

First, the profile image is stolen from a photo of a model.

Twitter Profile:

Sockpuppet background on Twitter

Original image source:

The background image is a modified version of an image stolen from a video game.

Original background

From a manual review of the timeline, the account shows interaction patterns that could suggest future attempts at source hacking. In the screenshot below, the account replies to Lindsey Graham.

sockpuppet interactions

Account Type # 4: Progressive Sockpuppet

The dataset of accounts posting about Tulsi Gabbard includes progressive accounts that show several traits of being sockpuppets, and/or being involved in inorganic amplification of content.

The account pictured below posted at least 13 times about Tulsi Gabbard in the dataset. The account was created on February 15, 2019, and since that time has averaged 158 posts a day. While this is far from the most active account in this dataset, in human terms this would require 13 posts an hour for 12 hours a day, every day.

Progressive sockpuppet profile

In recent conversations about Tulsi Gabbard, posts from this account use language that shows consistent support for Hillary Clinton. In general, posts from this account highlight standard centrist Democratic talking points.

posts from progressive sockpuppet

A look at the user profile shows some red flags.

Progressive sockpuppet profile

Despite the relative newness of the account, it has managed to amass over 9000 followers, and the follower to following ratio is pretty close to 1:1 (9442 followers, 9310 following when the data were collected). Additionally, the user profile picture of the account is pulled from a stock photo titled, "Portrait of confident blonde man with arms crossed."

Progressive stock photo


As noted in the beginning of this post, over 33,000 new accounts (created since 1 January 2019) participated in the conversation about Tulsi Gabbard between 10 October 2019 and 19 October 2019. Many of these accounts are legitimate new users, but a subset of these accounts -- marked by high posting rates, stolen profile images, and/or other traits of a suspect account -- are not. In addition, numerous accounts created before January 2019 were also highly active in the conversation, and a subset of these accounts show multiple characteristics of inorganic amplification of various narratives.

Several details stood out about the Gabbard dataset: the number of inorganic accounts, the methods used to create these account personas, and the various narratives spun by these accounts. Within the dataset, we had sockpuppets pushing libertarian ideals, mainstream Democratic positions, hardcore right wing positions, and conspiracy theories. We also had accounts with an international focus posting in multiple languages about political issues in Pakistan, India, Scotland, and Britain, using state media and web sites associated with past misinformation campaigns.

The methods used to create these accounts ranged from stolen identities, to profile and background images that serve as dog whistles (such as Pepe the frog or images of Seth Rich), to tactics used by romance scammers. The range of narratives, the recognized tactics used to create fake accounts in bulk, and the volume of conversation coming from inorganic accounts together suggest that some level of coordination between subsets of these accounts is likely. However, this analysis stops short of identifying specific actors in specific networks.

With all that said, Tulsi Gabbard clearly remains a useful prop for misinformation and disinformation. Her divisiveness within the Democratic party makes her very useful for right wing propaganda efforts. Her positions on Syria and her lack of criticism of Bashar al Assad ensure that she can be used by fringe sites and the accounts that promote them to generate ongoing attention. Her past appearances on Russian state media and her willingness to appear on Tucker Carlson's show ensure that she will draw clicks from the right looking to criticize Democrats, from the left looking to criticize Gabbard, and from the anti-Hillary Clinton wing of the Democrats/Progressives who still cling to the belief that the DNC "stole" the nomination from Bernie Sanders. None of this implies or indicates that Gabbard or her campaign has any role in the inorganic amplification of the narratives about her. However, Gabbard does benefit from the visibility these campaigns bring.

The point of misinformation isn't to "win" an argument. Misinformation works by shaving off the margins; sometimes visibility is the end goal, because visibility is all that's required to pollute the conversation. The chatter about Gabbard, and the inorganic activity from fake accounts in the middle of this chatter, indicate that Gabbard's presence in the Democratic primary provides a useful tool that supports multiple narratives with different goals.