Big Tech Blames Remote Working For More Censorship

(PresidentialWire.com)- Big tech companies have found a curious justification for removing content from their platforms that doesn't, in fact, violate their rules: they are blaming remote work and the COVID-19 pandemic.

Since the start of the pandemic in March 2020, social media giants Facebook and Twitter have scaled back the number of content reviews conducted by human employees. Both companies use contractors to comb through the massive amount of information posted on their sites and check whether any of it violates their rules and regulations.

Now, both companies say the contractors can't do that work from home. As a result, they are blaming the pandemic, and the remote work environment it created, for the erroneous removal of some content from their sites.

The ironic part is that even as the companies claim they lack the staffing and capacity to monitor content on their sites, they are simultaneously tightening the restrictions they place on that content.

During the pandemic, both social media giants added new rules prohibiting any speech suggesting that the coronavirus escaped from a lab in Wuhan, China.

On a regular basis, both social media platforms suspend users, including politicians, who question vaccine mandates in any way.

When Twitter shifted to a remote work environment in March of 2020, the company said it would rely on “machine learning and automation” to “take a wide range of actions on potentially abusive and manipulative content.” The company also acknowledged that the change in approach would lead to content-removal mistakes and a growing number of penalties incorrectly doled out to users.

At the time, Twitter said the change would be temporary, but the company now says it will continue to “utilize machine learning” to enforce the rules of the platform.

Facebook, meanwhile, has blamed its issues on staffing problems, which have severely limited users' ability to appeal content that has been removed.

The social media company said it doesn't have the bandwidth to conduct these reviews, even in instances where the content was removed erroneously by its automated system. The company has said it “promises” that user complaints will serve as training data to help the automated systems do better “in the future.”

Still, in response to an inquiry from The Washington Free Beacon, a spokesman for Facebook directly contradicted this. He said the company still allows appeals of content takedowns in “the vast majority of cases.”

He also said the company was “actively monitoring feedback to improve the accuracy of our enforcement efforts.”

But the spokesman's words contradict a report released in November of 2020, which found that, by Facebook's own account, many of its content-removal decisions that are later reversed are made by an automated system.

Yet again, social media platforms are trying to get away with unfair policies by blaming someone, or something, else for their problems.