WhatsApp Banned 20 Lakh Indian Accounts In 30 Days: Here’s Why
WhatsApp shared in its latest compliance report that it banned 2,079,000 Indian accounts in December.
IT Rules-Compliant Report
This is the company’s seventh monthly report in compliance with IT Rules 2021.
The new rules, which came into effect in May 2021, require large digital platforms (those with over 5 million users) to publish compliance reports every month.
The reports must contain details of complaints received and action taken.
Types Of Grievances
WhatsApp received 528 grievance reports in December, of which:
- 149 were account support requests
- 303 were ban appeals
- 29 were other support requests
- 34 were product support requests
- 13 were safety requests
Reasons For Banning Accounts
Of the 528 grievance reports it received in India in December, the company took action on 24.
It banned over 2 million accounts using its abuse detection approach, which also includes action taken on user reports.
In its last report, it had said that over 95% of bans are due to the illegal use of automated or bulk messaging (spam).
Spamming involves forwarding messages containing misinformation or repeatedly forwarding the same message to a large number of contacts.
Improving User Safety
The report also lays out WhatsApp’s own preventive actions to combat abuse on the platform.
It has invested in artificial intelligence and other state-of-the-art technology, in data scientists and experts, and in processes to strengthen user safety.
How It Detects Abuse
Aside from behavioural signals from accounts, the platform relies on available unencrypted information such as user reports, profile photos, group photos and descriptions to detect and prevent abuse. To that end, it also makes use of advanced AI tools and resources.
Abuse Detection System
Its abuse detection process works in three stages of an account’s lifecycle: At registration, during messaging, and in response to negative feedback such as user reports and blocks.
Users can report accounts for spam, abuse or other issues by long-pressing a single message.
Meta’s Actions on Facebook Complaints
WhatsApp’s parent company Meta took down 19.3 million pieces of violating content across 13 categories on Facebook in December. Similarly, it removed 2.4 million pieces of such content across 12 categories on Instagram, in compliance with the new IT Rules 2021.
Subject Of Complaints
Meta received 534 reports on Facebook through the Indian grievance mechanism between December 1 and December 31.
It responded to all of these reports, which related to fake profiles, harassment or abusive content, and hacked accounts.
Meta took action on 28 of the 95 reports that required specialised review.