Following the recent leak of Facebook’s moderation practices, online community expert Venessa Paech covers the myriad challenges for the platform, and where to next for Facebook and its users, in this guest post for B&T.
Facebook Inc will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and speed up the removal of videos showing murder, suicide and other violent acts, Chief Executive Mark Zuckerberg said on Wednesday.
The hiring spree is an acknowledgment by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts.
Facebook has announced a number of measures in the last few weeks to tackle the spread of harmful content and posts that support terrorism on its platform. While the efforts are encouraging, industry insiders are questioning the amount of resources Facebook is investing in these initiatives and would like to see more collaboration between the platform and verification partners.
Social media companies are under pressure to block terrorist activity on their sites, and Facebook recently detailed new measures, including using artificial intelligence, to tackle the problem.
Video of a murder uploaded to Facebook this week upset many users, especially since it took Facebook two hours to take it down. The incident illustrates a dilemma for the company as it becomes an open platform for both recorded and livestreamed video.
After being contacted by ProPublica, Facebook removed several anti-Semitic ad categories and promised to improve monitoring.
Facebook faces an uphill battle in cleaning up political advertising on its platform.
Facebook is making adjustments to how political advertisers buy ads on its platform. Even so, it will still be hard for the social media giant to identify all of the political ads running through its system.