Facebook apologizes for incorrectly removing content “on the other end of the political spectrum”


On July 17, Facebook apologized to the U.S. House Judiciary Committee for censoring content “on the other end of the political spectrum”. The social media giant admitted to driving traffic away from conservative pro-Trump sisters Lynette “Diamond” Hardaway and Rochelle “Silk” Richardson since September of last year. It later justified the action by saying the sisters were ‘unsafe to the community’.

On Tuesday, Facebook admitted that it had “incorrectly removed content on the other end of the political spectrum”, and that they knew “these incidents often garner significant public attention.”

Facebook claimed to have learned from these experiences and said it was committed to correcting its errors, while conceding that it will never be able to reduce the number of errors to zero.

Facebook has since hired an outside adviser, former Republican Senator Jon Kyl, to prevent potential bias against conservative opinions.

The Committee acknowledged that since its last hearing in April, social media giants have made numerous efforts to improve transparency, but said it still continues to see content being unfairly restricted.

The Committee cited the example of how Facebook had recently automatically blocked a post from a Texas newspaper for containing text from the American Declaration of Independence.


The Committee said: “Think about that for a moment. If Thomas Jefferson had written the Declaration of Independence on Facebook, that document would have never seen the light of day. No one would be able to see his words because an algorithm automatically flagged it—or at least some portion of it—as hate speech. It was only after public outcry that Facebook noticed this issue and unblocked the post.”

It added: “Facebook may be embarrassed about this example—this Committee has the opportunity today to ask—but Facebook also may be inclined to mitigate its responsibility in part because it was likely software, not a human being, that raised an objection to our founding document. Indeed, given the scale of Facebook and other social media platforms, a large portion of their content filtering is performed by algorithms without the need of human assistance.”