
Why has Facebook deleted 30 million user posts in three months?
Facebook is used by millions of people worldwide, and almost every kind of debate takes place on the social media platform. On Tuesday, the social media giant said that in the first three months of 2018 it removed nearly 30 million posts from its platform because of sexual or violent imagery, intimidation, or hate speech.
In a transparency report released after the Cambridge Analytica data privacy scandal, Facebook detailed the action taken against such content under its ‘Community Standards’. Facebook said that, with the help of artificial intelligence, it took action on about 3.5 million posts containing graphic violence. This is almost three times the figure from the last quarter of 2017.
Facebook’s report said that in nearly 85.6 percent of cases, it identified such photographs even before users reported them. Note that, prior to this report, Facebook had removed nearly 200 apps from its platform. These apps were removed for misusing users’ personal data.
According to the report, the content removed by Facebook also includes material that some users find objectionable even though it does not violate Facebook’s standards. With the help of improved technology, Facebook was able to take action against 1.9 million posts promoting terrorist propaganda. According to the company, all of these were deleted without any user reports. The company credited improved photo-detection technology.
During this period, Facebook removed about 2.5 million pieces of content that spread hatred among people. Facebook itself identified 38 percent of this content, while the rest was reported by Facebook users.
Most of the posts about which Facebook users expressed concern involved adult nudity or sexual activity. Child pornography, however, is not covered in this report. As in October-December 2017, the number of such posts in the first three months of 2018 was about 21 million.