Twitter Files Part Two: Secret Blacklists and “Visibility Filtering” Exposed

The second set of Twitter files has been released after several days of delay.

This set of files was posted on Twitter by journalist Bari Weiss.

The first thing revealed on Thursday is that Twitter employees built and maintained active blacklists of content they wanted to suppress, all without users’ knowledge.

“A new #TwitterFiles investigation reveals that teams of Twitter employees build blacklists, prevent disfavored tweets from trending, and actively limit the visibility of entire accounts or even trending topics—all in secret, without informing users.”

“Twitter once had a mission ‘to give everyone the power to create and share ideas and information instantly, without barriers.’ Along the way, barriers nevertheless were erected.”

Twitter blacklisted tweets, topics and even entire accounts from trending and also blocked certain users and tweets from appearing in search results.

Dr. Jay Bhattacharya, a professor at Stanford, opposed Covid lockdowns and was an early critic who spoke out about their negative effects on children, a demographic at the lowest risk from the Covid virus.

Bhattacharya’s account was added to Twitter’s “Trends Blacklist,” which prevented his tweets from trending on the platform.

Weiss shared an image of Bhattacharya’s account as seen from Twitter’s internal point of view, with a yellow tag indicating the restriction.

“Take, for example, Stanford’s Dr. Jay Bhattacharya who argued that Covid lockdowns would harm children. Twitter secretly placed him on a ‘Trends Blacklist,’ which prevented his tweets from trending,” Weiss wrote.

Weiss also reported that Dan Bongino, a right-wing talk show host, was slapped with a yellow tag marking a “Search Blacklist” restriction.

Weiss shared another image, of Charlie Kirk’s account, showing a “Do Not Amplify” message.

“Twitter set the account of conservative activist Charlie Kirk to ‘Do Not Amplify,’” the independent journalist tweeted.

“We now know Twitter specifically blacklisted Charlie Kirk,” Jack Posobiec wrote, “and put him on a Do Not Amplify list. The regime names those they fear.”

Twitter has fiercely denied that it “shadow bans” users.

In 2018, Twitter’s Vijaya Gadde (then Head of Legal Policy and Trust) and Kayvon Beykpour (Head of Product) said: “We do not shadow ban.” They added: “And we certainly don’t shadow ban based on political viewpoints or ideology.”

What many people call “shadow banning,” Twitter executives and employees call “Visibility Filtering” or “VF.” Multiple high-level sources confirmed its meaning.

“‘Think about visibility filtering as being a way for us to suppress what people see to different levels. It’s a very powerful tool,’ one senior Twitter employee told us,” Weiss wrote.

This type of censorship is also found in Justin Trudeau’s “Online Streaming Act,” Bill C-11, where it goes by the name of manipulating the algorithm. “Instead of saying – and the Act precludes this – we will make changes to your algorithms as many European countries are contemplating doing – instead, we will say this is the outcome we want. We want Canadians to find Canadian music. How best to do it? How will you do it? I don’t want to manipulate your algorithm. I want you to manipulate it to produce a particular outcome,” admitted CRTC chair Ian Scott.

According to Weiss, Twitter routinely manipulated its algorithm, search results, and trends to control which content it wanted to suppress and which it wanted to amplify.
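To make the mechanism concrete, here is a minimal sketch of how per-account visibility flags like the labels Weiss reports (“Trends Blacklist,” “Search Blacklist,” “Do Not Amplify”) could gate where a tweet is allowed to surface. This is a hypothetical illustration only; the data structures, field names, and surfaces below are assumptions, not Twitter’s actual implementation.

```python
# Hypothetical sketch only -- not Twitter's actual code.
# Shows how per-account visibility flags could silently restrict which
# surfaces (trends, search, recommendations) a tweet may appear on.

from dataclasses import dataclass


@dataclass
class AccountFlags:
    trends_blacklist: bool = False   # exclude tweets from trending topics
    search_blacklist: bool = False   # exclude tweets from search results
    do_not_amplify: bool = False     # exclude tweets from recommendations


@dataclass
class Tweet:
    author: str
    text: str


def visible_surfaces(tweet: Tweet, flags: dict[str, AccountFlags]) -> set[str]:
    """Return the surfaces where this tweet may appear.

    The author can still post normally and followers still see the tweet,
    which is why this kind of filtering is invisible to the affected user.
    """
    f = flags.get(tweet.author, AccountFlags())
    surfaces = {"profile", "followers_timeline"}  # never filtered in this sketch
    if not f.trends_blacklist:
        surfaces.add("trends")
    if not f.search_blacklist:
        surfaces.add("search")
    if not f.do_not_amplify:
        surfaces.add("recommendations")
    return surfaces


# Example: an account flagged "Do Not Amplify" posts without any error,
# but its tweets never enter the recommendation surface.
flags = {"example_user": AccountFlags(do_not_amplify=True)}
print(visible_surfaces(Tweet("example_user", "hello"), flags))
# -> {'profile', 'followers_timeline', 'trends', 'search'} (set order may vary)
```

The point of the sketch is that nothing in the posting path fails: suppression happens downstream, in the ranking and discovery surfaces, which is why users would not notice it without access to the internal flags.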

Decisions on limiting certain user accounts were handled by the Strategic Response Team – Global Escalation Team, or SRT-GET, which often handled up to 200 “cases” a day.

There was another level of moderation for high-profile or controversial accounts that was so secret it dodged transparency and accountability entirely: those cases were not handled through any ticketing system at all.

This secret Twitter moderation group, called “Site Integrity Policy, Policy Escalation Support” and known as “SIP-PES,” included the Head of Legal, Policy, and Trust (Vijaya Gadde), the Global Head of Trust & Safety (Yoel Roth), subsequent CEOs Jack Dorsey and Parag Agrawal, and others.

This is where the biggest, most politically sensitive decisions got made. “Think high follower account, controversial,” another Twitter employee told us. For these “there would be no ticket or anything.”

The Twitter account Libs of TikTok, created in November 2020, exposes leftists with their own words and videos. It has been suspended from Twitter several times, six of them in 2022 before Musk bought the platform. The account, created by Chaya Raichik and now boasting 1.4 million followers, has been suspended eight times in total, with Raichik locked out of her account for a week each time.

“Twitter repeatedly informed Raichik that she had been suspended for violating Twitter’s policy against ‘hateful conduct,’” Weiss wrote. “Hateful conduct” was the catch-all for things like “misgendering,” or calling biological males men when they would prefer to be seen as women.

Libs of TikTok simply reposts publicly available content from other platforms, sharing it with an audience that otherwise would not have seen it.

Weiss posted a “Site Policy Recommendation” for the account, which recommended placing Libs of TikTok “in a 7-day timeout at the account level [meaning, not for a specific Tweet] based on the account’s continued pattern of indirectly violating Twitter’s Hateful Conduct Policy by tweeting content that either leads to or intends to incite harassment against individuals and institutions that support LGBTQ communities. At this time, Site Policy has not found explicitly violative Tweets, which would result in a permanent suspension of the account.”

“The committee justified her suspensions internally by claiming her posts encouraged online harassment of ‘hospitals and medical providers’ by insinuating ‘that gender-affirming healthcare is equivalent to child abuse or grooming,'” Weiss wrote.

“Compare this to what happened when Raichik herself was doxxed on November 21, 2022. A photo of her home with her address was posted in a tweet that has garnered more than 10,000 likes,” Weiss wrote.

“When Raichik told Twitter that her address had been disseminated she says Twitter Support responded with this message: ‘We reviewed the reported content, and didn’t find it to be in violation of the Twitter rules.’ No action was taken. The doxxing tweet is still up,” Weiss wrote.

Weiss learned through an internal Slack message from Twitter’s former global head of trust and safety that, when a direct violation can’t be found, the company uses technicalities to suppress the visibility of tweets and subjects.

“In internal Slack messages, Twitter employees spoke of using technicalities to restrict the visibility of tweets and subjects,” Weiss wrote. “Here’s Yoel Roth, Twitter’s then Global Head of Trust & Safety, in a direct message to a colleague in early 2021:”

“A lot of times, SI has used technicality spam enforcements as a way to solve a problem created by Safety under-enforcing their policies. Which, again, isn’t a problem per se – but it keeps us from addressing the root cause of the issue, which is that our Safety policies need some attention,” Roth wrote in Slack, as shared by Weiss.

“The hypothesis underlying much of what we’ve implemented is that if exposure to, e.g., misinformation directly causes harm, we should use remediations that reduce exposure, and limiting the spread/virality of content is a good way to do that,” Roth wrote.

“We got Jack on board with implementing this for civic integrity in the near term, but we’re going to need to make a more robust case to get this into our repertoire of policy remediations – especially for other policy domains,” added Roth.

Weiss’ tweets came after the bombshell report by independent journalist Matt Taibbi on Friday. Elon Musk, the new owner of the company, referred to Taibbi’s tweets as “what really happened with the Hunter Biden suppression story by Twitter.”

We’re asking readers, like you, to make a contribution in support of BC Rise's fact-based, independent reporting.

Unlike the mainstream media, BC Rise isn’t getting a government bailout and is fully independent. Instead, we depend on the friendly support of Canadians like you.

A media outlet cannot remain neutral and fair if it depends on special benefactors or government handouts.

This is why independent media in Canada is more important than ever. If you’re able, please make a donation to BC Rise today. Thank you so much.

Jordan
Jordan is a casual reporter for BC Rise