Reports emerged that Meta had restricted access to a news page in India focused on Muslim communities, citing violations of its Community Standards on hate speech and misinformation. While the page, described only as a “Muslim News” outlet, was not named in public statements, the incident aligns with ongoing concerns about Meta’s content moderation practices in India, where 200 million Muslims form the country’s largest religious minority. The restriction, which limited the page’s visibility and reach, sparked debate over freedom of expression and algorithmic bias in a nation with 900 million internet users.
The restriction follows a pattern of scrutiny over Meta’s handling of religious and political content in India, where 80% of the population is Hindu. In 2024, the company removed 17 million pieces of content in India for policy violations, 30% of which were flagged by automated systems. Critics argue that Meta’s algorithms disproportionately target Muslim and minority voices, often misclassifying legitimate news as inflammatory. The restricted page reportedly covered issues such as communal harmony and minority rights, but Meta claimed certain posts risked inciting violence, a charge that carries particular weight after the 2023 riots in Haryana that killed seven people.
India’s government, which regulates digital platforms under the 2021 IT Rules, has pressured Meta to curb misinformation, imposing $1.2 million in fines in 2024 for non-compliance. Muslim advocacy groups have called for transparency, citing a 2021 report that found Meta’s policies failed to address anti-Muslim hate speech by Hindu nationalist groups.
The company has invested $500 million in AI moderation tools but admits it struggles to detect nuanced content across India’s 22 constitutionally recognized languages. The incident underscores tensions between global tech platforms and national regulators, with Meta pledging to review the restriction amid calls for fairer moderation practices.