How Meta is Prepared to Protect the Upcoming State Elections in India 

  • Meta has been preparing for these elections in India; we have a comprehensive strategy in place to keep people safe and encourage civic engagement.
  • We’ll be activating our Elections Operations Center to monitor potential abuses related to these elections across our platforms, so we can respond to them in real time.
  • We’ve made significant investments in teams and technologies to keep hate speech, misinformation and other forms of harmful content off our platforms. In addition to ramping up our regional language support, we are funding our fact-checking partners to deliver training programs for the public and journalists on tools and techniques to verify election information and stories.

With the upcoming elections in Uttar Pradesh, Punjab, Uttarakhand, Goa and Manipur starting February 10, we are sharing an update on how Meta is prepared to protect people and our platform during this period. We have a comprehensive strategy in place for these elections, which includes detecting and removing hate speech and content that incites violence, reducing the spread of misinformation, making political advertising more transparent, partnering with election authorities to remove content that violates local law and helping people make their voices heard through voting.

Activating Our Elections Operations Center 

We’ll be activating our Elections Operations Center so we can monitor potential abuses emerging around these elections and respond to them in real time.

Since 2018, we’ve used this model for major elections around the world. It brings together subject matter experts from across the company – including our threat intelligence, data science, engineering, research, operations, policy and legal teams – to give us more visibility into emerging threats, so we can respond quickly before they grow.

Tackling Hate Speech and Other Harmful Content 

We’re very aware of how hate speech on our platforms can lead to offline harm. Against the backdrop of elections, it is even more important for us to detect potential hate speech and prevent it from spreading. This is an area we’ve prioritized and will continue to address comprehensively for these elections to help keep people safe.

We’ve invested more than $13 billion in teams and technology. This has allowed us to triple the size of the global team working on safety and security to over 40,000 people, including more than 15,000 dedicated content reviewers across 70 languages. For India, Meta has reviewers in 20 Indian languages.

If a piece of content violates our policies against hate speech, we remove it using proactive detection technology or with the help of content reviewers. If it doesn’t violate these policies, but can still lead to offline harm if it becomes widespread, we demote it so fewer people see it.

Furthermore, under our existing Community Standards, we remove certain slurs that we determine to be hate speech. We also update our policies regularly to cover additional risk areas. To complement that effort, we may deploy technology to identify new words and phrases associated with hate speech, and either remove posts containing that language or reduce their distribution. We also take down the accounts of repeat offenders, or temporarily reduce the distribution of content from accounts that have repeatedly violated our policies.
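
To make the remove-versus-demote distinction above concrete, here is a minimal, hypothetical sketch of that decision logic in Python. The term list, thresholds and helper functions are assumptions for illustration only; they are not Meta’s actual classifiers, policy lists or systems.

    # Hypothetical sketch of the remove-vs-demote decision described above.
    # Names, terms and thresholds are illustrative, not Meta's real systems.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        author_id: str

    # Illustrative placeholder list; real lists are maintained per language
    # and updated as new terms associated with hate speech emerge.
    KNOWN_HATE_TERMS = {"<slur-1>", "<slur-2>"}

    def looks_borderline(text: str) -> bool:
        # Stand-in for a classifier score; a real system would use trained
        # models across many languages rather than a keyword check.
        return "rumour" in text

    def moderate(post: Post, author_strike_count: int) -> str:
        """Return 'remove', 'demote' or 'allow' for a post."""
        text = post.text.lower()

        # Content that violates the hate speech policy is removed outright.
        if any(term in text for term in KNOWN_HATE_TERMS):
            return "remove"

        # Borderline content that could contribute to offline harm is demoted
        # so fewer people see it (reduced distribution, not removal).
        if looks_borderline(text):
            return "demote"

        # Accounts that have repeatedly violated policies can have all of
        # their content demoted for a period.
        if author_strike_count >= 3:
            return "demote"

        return "allow"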

We’ve made significant progress. The prevalence of hate speech on the platform is now down to just 0.03%. But we know there is always more work to be done.

Combating Misinformation, Voter Suppression and Fake News 

We know it’s important for people to see accurate information across all our apps, which is why we continue to fight the spread of misinformation on our services in India. We remove the most serious kinds of misinformation, such as content that is intended to suppress voting or could lead to imminent violence or physical harm.

For content that doesn’t violate these particular rules, we partner with 10 independent third-party fact-checkers in India to review and rate its accuracy. All our fact-checking partners are certified by the nonpartisan International Fact-Checking Network and review content in 11 Indian languages. Each time they rate a piece of content as false, we significantly reduce its distribution, notify people who share the content – or who have previously shared it – that the information is false, and apply a warning label that links to the fact-checker’s article disproving the claim. Using the WhatsApp API, our fact-checking partners also run tiplines that let the public get the latest fact-checks and verify content. We have also provided funding support to our fact-checking partners to deliver training programs for the public and journalists on tools and techniques to verify election information and stories.
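
As a rough illustration of the workflow just described, the sketch below models what happens once a fact-checking partner rates a piece of content false: its distribution is significantly reduced, a warning label linking to the fact-check is attached, and people who shared it are notified. All names, values and data structures here are hypothetical, not Meta’s internal systems.

    # Hypothetical sketch of the fact-checking workflow described above.
    from dataclasses import dataclass

    @dataclass
    class Post:
        id: str
        distribution_multiplier: float = 1.0
        warning_label: dict | None = None

    RATINGS_THAT_TRIGGER_ACTION = {"false", "altered", "partly false"}

    def notify(user_id: str, message: str) -> None:
        print(f"notify {user_id}: {message}")   # stand-in for a real notification

    def apply_fact_check(post: Post, rating: str, fact_check_url: str, sharers: list[str]) -> None:
        if rating.lower() not in RATINGS_THAT_TRIGGER_ACTION:
            return

        # 1. Significantly reduce the post's distribution in feeds (illustrative value).
        post.distribution_multiplier = 0.1

        # 2. Apply a warning label that links to the fact-checker's article.
        post.warning_label = {
            "text": "Independent fact-checkers say this information is false.",
            "link": fact_check_url,
        }

        # 3. Notify people who shared, or previously shared, the content.
        for user_id in sharers:
            notify(user_id, "Content you shared was rated false by a fact-checker.")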

Improving the Transparency of Political and Social Advertising 

We believe that every voter deserves transparency as they participate in political discussion and debate. That’s why we have introduced a number of tools that provide more information about political ads on Facebook and Instagram.

Last December, we announced the expansion of our ads enforcement, which requires “Paid for by” disclaimers on ads about elections or politics, to also cover social issues. This enforcement applies to ads that discuss, debate or advocate for or against important social topics. We also require anyone running ads about social issues, elections or politics on Facebook and Instagram to complete an authorization process, which lets people see the name of the person or organization running these ads. Ads are also entered into our Ad Library, where anyone can review them for seven years.
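
On the Ad Library mentioned above: Meta also exposes a public Ad Library API (the ads_archive edge of the Graph API) that lets anyone with a developer account search stored ads about social issues, elections or politics. The sketch below is a minimal example of such a query; the API version, parameter names and returned fields should be checked against the current Ad Library API documentation, and the access token is a placeholder.

    # Minimal sketch of a query against the public Ad Library API.
    # Verify parameter and field names against the current documentation.
    import requests

    API_VERSION = "v18.0"                    # assumption: use the current version
    ACCESS_TOKEN = "<your-access-token>"     # placeholder

    def search_political_ads(term: str, country: str = "IN") -> list[dict]:
        resp = requests.get(
            f"https://graph.facebook.com/{API_VERSION}/ads_archive",
            params={
                "search_terms": term,
                "ad_reached_countries": f"['{country}']",
                "ad_type": "POLITICAL_AND_ISSUE_ADS",
                "fields": "page_name,bylines,ad_delivery_start_time",
                "access_token": ACCESS_TOKEN,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("data", [])

    if __name__ == "__main__":
        for ad in search_political_ads("election"):
            # The bylines field carries the "Paid for by" disclaimer shown on the ad.
            print(ad.get("page_name"), "-", ad.get("bylines"))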

Enhancing Civic Engagement 

We believe Meta has an important part to play in creating an informed community and helping people access the information they need to take part in the democratic process. Ahead of the elections, we are launching the security megaphone to remind users to protect their accounts against online threats by turning on two-factor authentication. It will be available in five Indian languages, including Hindi.

We will also be offering Election Day reminders to give voters accurate information and encourage them to share this information with friends on Facebook, Instagram and WhatsApp.

Voluntary Code of Ethics with the Election Commission of India 

In 2019, led by the industry body IAMAI, we set up a high-priority channel with the Election Commission of India for Facebook, Instagram and WhatsApp, through which we receive content-related escalations and remove content that violates local law after receiving valid legal orders. The Voluntary Code of Ethics applies to this election as well.

Staying Safe on WhatsApp 

WhatsApp remains an industry leader among end-to-end encrypted private messaging services, and the safety of our users is at the core of everything we do. We have devoted effort, both through product innovation and through education, to empower users with resources that help them verify information.

WhatsApp actively constrains virality on the platform. The limits we have imposed on forwards have reduced the spread of highly forwarded messages on WhatsApp by over 70%. Users can also block accounts and report them to WhatsApp if they encounter problematic messages.
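
The forwarding limits mentioned above are enforced inside the WhatsApp client: ordinary forwarded messages can be sent to at most five chats at a time, and messages that have already travelled through a long forwarding chain are labelled as highly forwarded and can only be forwarded to one chat at a time. The sketch below is a simplified, hypothetical model of that rule; the constants reflect the publicly stated limits, everything else is illustrative.

    # Simplified, hypothetical model of WhatsApp-style forwarding limits.
    FORWARD_LIMIT = 5               # ordinary forwards: up to 5 chats at once
    HIGHLY_FORWARDED_LIMIT = 1      # highly forwarded messages: 1 chat at a time
    HIGHLY_FORWARDED_THRESHOLD = 5  # chain length at which a message counts as highly forwarded

    def max_forward_targets(forward_chain_length: int) -> int:
        """How many chats a message may be forwarded to in a single action."""
        if forward_chain_length >= HIGHLY_FORWARDED_THRESHOLD:
            return HIGHLY_FORWARDED_LIMIT
        return FORWARD_LIMIT

    # A freshly composed message can go to five chats; a message already
    # forwarded through many hops can only go to one.
    assert max_forward_targets(0) == 5
    assert max_forward_targets(6) == 1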

Our advanced spam detection technology works around the clock to spot accounts engaged in automated and bulk messaging, and we ban such accounts for violating WhatsApp’s Terms of Service. In December 2021 alone, we banned over 2 million accounts.
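
As a purely illustrative example of how rate-based detection of bulk messaging can work, the sketch below flags an account that messages an unusually large number of non-contacts within a short window. The thresholds and signals are invented for this example; WhatsApp’s actual spam detection relies on much richer behavioural signals and, because chats are end-to-end encrypted, does not rely on message content.

    # Illustrative rate-based heuristic for spotting bulk or automated messaging.
    from collections import defaultdict, deque
    import time

    WINDOW_SECONDS = 60
    MAX_NEW_RECIPIENTS_PER_WINDOW = 30   # invented threshold for illustration

    recent_sends: dict[str, deque] = defaultdict(deque)

    def record_send(sender_id: str, recipient_id: str, is_contact: bool) -> bool:
        """Record a message send; return True if the sender should be flagged for review."""
        if is_contact:
            return False                 # messages to saved contacts are not counted here

        now = time.time()
        window = recent_sends[sender_id]
        window.append((now, recipient_id))

        # Drop events that fall outside the sliding window.
        while window and window[0][0] < now - WINDOW_SECONDS:
            window.popleft()

        distinct_recipients = {recipient for _, recipient in window}
        return len(distinct_recipients) > MAX_NEW_RECIPIENTS_PER_WINDOW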

Ahead of all elections, we train political parties on the responsible use of WhatsApp, and party workers are cautioned that their accounts may be banned if they send WhatsApp messages to people without prior consent.

Additionally, from time to time we run awareness campaigns such as ‘Share Joy, Not Rumours’ and ‘Check It Before You Share It’ to remind people to double-check facts before forwarding messages.

We will also continue to raise awareness and provide training on the correct ways to use WhatsApp to help keep people safe.

We know that election periods are contentious and they can often be unpredictable. So while we head into these elections in India prepared and ready to meet the challenges we know will be present, we’re also ready to adapt to changing circumstances and unforeseen events.  We won’t hesitate to take additional steps if necessary to protect this important exercise of democracy in India and keep our platform and the Indian people safe before, during, and after the voting ends.
