Facebook accused of downplaying staff's flagging of fake content
New Delhi: Facebook India served as a medium for a constant stream of polarising nationalistic content, fake messaging, misinformation and denigration of minorities, and its staff flagged this internally multiple times between 2018 and 2020 in three memos. Yet an internal review led by a company Vice President (VP) in 2019 concluded that there was only a low prevalence of inappropriate content, The Indian Express (TIE) reports.
While the first two reports flagging hate speech were sent in January and February 2019, months before India's Lok Sabha elections, the third was sent in August 2020. But the minutes of the meetings with the VP concluded that people were "feeling safe" and that experts had told the platform the country was "relatively stable".
The first report, "Adversarial Harmful Networks: India Case Study", found that as much as 40 per cent of sampled top VPV (viewport views) posts from West Bengal were fake or inauthentic. VPV is Facebook's metric for how often content is actually viewed.
An employee authored the second report in February 2019 based on the findings of a test account; TIE reported the test account's findings in October. Within three weeks of the account's creation, the test user's feed had filled with polarising nationalistic content, misinformation and violence, even though the user had followed only content recommended by Facebook's algorithm.
Facebook's review meeting on the first two reports was held a month before the Election Commission announced the 2019 Lok Sabha elections. In the sessions, Chris Cox, then the VP, noted that "big problems in sub-regions may be lost at the country level".
The third report, an internal memo from August 2020, admitted that Facebook's AI (artificial intelligence) tools could not identify content in regional languages and so failed to flag problematic posts. Following that, employees questioned Facebook's investment plans for India to curb hate speech and problematic content, asking why the company lacked even the basic systems needed to detect such content.
Facebook did not respond to TIE's queries about Cox's meetings or the internal memos. In October, Facebook had told TIE that it had invested significantly in technology to find hate speech in many Indian languages, including Hindi and Bengali.
The three reports and the minutes of the internal meetings were among the documents leaked by whistleblower and former Facebook employee Frances Haugen. The redacted versions of the documents received by the United States Congress have been reviewed by a group of global news organisations, including TIE.

