Facebook’s recommendation algorithm has continued to amplify military propaganda and other material that breaches the company’s own policies in Myanmar following the military takeover in February, a new report by the rights group Global Witness says. A month after the military seized power and imprisoned elected leaders, Facebook’s algorithms were still prompting users to view and “like” pro-military pages whose posts incited and threatened violence, pushed misinformation that could lead to physical harm, praised the military and glorified its abuses, Global Witness said in the report, published late Tuesday.
That’s despite the social media giant’s vow to take down such content after the coup, when it announced it would remove Myanmar military and military-controlled pages from its platform and from Instagram, which it also owns. It has since enacted other measures intended to reduce offline harm in the country.
Facebook said Tuesday its teams “continue to closely monitor the situation in Myanmar in real-time and take action on any posts, Pages or Groups that break our rules.” Days after the Feb. 1 coup, the military temporarily blocked access to Facebook because it was being used to share anti-coup comments and organize protests.