Meta, formerly Facebook, took down over 11.6 million pieces of content across 13 policies for Facebook and over 3.2 million pieces of content across 12 policies for Instagram for the month of January, the company said.
Meta actioned 1.4 million pieces of content related to adult nudity and sexual activity, 233,600 pieces related to bullying and harassment, and 1.8 million pieces of violent and graphic content, among other categories.
“In accordance with the IT Rules, we’ve published our monthly compliance report for the period for 31 days – January 1 to January 31. This report will contain details of the content that we have removed proactively using our automated tools and details of user complaints received and action taken,” said a Meta spokesperson.
For Facebook, Meta received a total of 911 reports through the Indian grievance mechanism between January 1 and 31, and 100 per cent of these reports were responded to, the company said.
In the case of Facebook, the highest number of complaints related to accounts being hacked (270), followed by fake profiles (107) and bullying and harassment (106), among others.
For Instagram, the company received 1,037 reports and responded to 100 per cent of the complaints.
Here too, hacked accounts drew the most complaints (677), followed by fake profiles (252), among others.
“Over the years, we have consistently invested in technology, people and processes to further our agenda of keeping our users safe and secure online and enable them to express themselves freely on our platform. We use a combination of Artificial Intelligence, reports from our community and review by our teams to identify and review content against our policies,” said the company in a statement.