Traders of child pornography have been using WhatsApp to distribute their disturbing content with few hurdles, according to a media report.
In the absence of adequate human moderators, such material is slipping past the automated systems of the popular instant messaging platform, which offers end-to-end encryption for its users, TechCrunch reported on Thursday.
Third-party apps for finding WhatsApp groups offer invite links to groups where users trade child pornography, according to a report from two Israeli NGOs — Screen Savers and Netivei Reshe.
Many of these groups are currently active, TechCrunch found in its investigation. Some of these groups do not even hide what they are into, according to anti-exploitation start-up AntiToxin.
Cases of child pornography being shared on WhatsApp have been reported in India as well. In April this year, an international WhatsApp group suspected of sharing child pornography was busted by the cyber cell of Madhya Pradesh. People from more than 25 countries, including India, were reportedly part of that group.
Facebook was found wanting in its efforts to prevent the spread of such material on WhatsApp, according to the TechCrunch investigation.
Even without technical solutions that would require a weakening of encryption, WhatsApp’s moderators should have been able to find these groups and put a stop to them, the report said.
Child pornography, however, is not the only problem the messaging platform is currently battling. In countries like India, WhatsApp has also been linked to rumours that led to the lynching of dozens of people.