Facebook will expand its third-party fact-checking program in India to include The Healthy Indian Project (THIP), the company’s first health-specialist partner. The move marks a step forward in Facebook’s efforts to combat misinformation about Covid-19 and other health-related issues on the platform. THIP fact-checks news and claims about health, medicines, diet, and treatment in partnership with verified medical professionals, covering content in English, Hindi, Bengali, Punjabi, and Gujarati.
Facebook works with 80 fact-checking partners globally, who help the company review content in more than 60 languages. All of its fact-checking partners are certified by the independent, non-partisan International Fact-Checking Network (IFCN).
In India, Facebook has 10 fact-checking partners, its largest such network after the US. These include AFP, India Today Group, The Quint, Factly, NewsMobile, Fact Crescendo, BOOM Live, Vishvas News (Dainik Jagran), and NewsChecker. Together they fact-check content in English and 11 Indian languages: Malayalam, Gujarati, Marathi, Hindi, Bengali, Telugu, Kannada, Tamil, Punjabi, Urdu, and Assamese.
Facebook has also launched a fellowship with 10 fact-checking organizations, under which third-party experts will train fact-checkers virtually. Two India-based organizations, Factly and The Quint, are part of this fellowship.
During the pandemic, third-party fact-checkers have helped the social media giant remove more than 18 million pieces of harmful misinformation across Facebook and Instagram and label over 167 million false Covid-19 posts.
According to Facebook, the partnership with THIP will strengthen its ability to understand and curb health-related misinformation on the platform.
For the unaware, third-party fact-checkers evaluate stories, verify their factual reliability, and rate their accuracy. When a fact-checker rates a story as false, Facebook shows it lower in News Feed, significantly reducing its distribution and reach.
Pages and domains that repeatedly share false news also see their distribution reduced and temporarily lose the ability to monetize and advertise their content. Users who try to share a post that has already been fact-checked are shown a pop-up notice, and people who shared a story that is later debunked are notified so they can see the additional reporting on it.
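To make the flow above concrete, here is a minimal sketch of how such a review-and-demotion pipeline could be modeled. The ratings, the strike threshold, and all class and function names are illustrative assumptions for this article, not Facebook's actual systems, values, or APIs.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the third-party fact-checking flow described above.
# The "false" rating, the strike limit, and the penalties are assumptions.

@dataclass
class Post:
    post_id: str
    page_id: str
    rating: Optional[str] = None      # e.g. "false" once a fact-checker reviews it
    demoted: bool = False

@dataclass
class Page:
    page_id: str
    false_strikes: int = 0
    distribution_reduced: bool = False
    can_monetize: bool = True

STRIKE_LIMIT = 3  # assumed threshold for "repeatedly shares false news"

def apply_fact_check(post: Post, page: Page, rating: str) -> None:
    """Record a fact-checker's rating and apply the responses the article
    describes: demote a false post, and penalize repeat-offender pages."""
    post.rating = rating
    if rating == "false":
        post.demoted = True               # shown lower in News Feed
        page.false_strikes += 1
        if page.false_strikes >= STRIKE_LIMIT:
            page.distribution_reduced = True
            page.can_monetize = False     # temporary loss of monetization and ads

def on_share_attempt(post: Post) -> Optional[str]:
    """Return the pop-up notice shown when someone tries to share a post
    that fact-checkers have already rated false."""
    if post.rating == "false":
        return "Independent fact-checkers have reviewed this post."
    return None

# Example: a page accumulates three false ratings and loses monetization.
page = Page(page_id="page-1")
for i in range(3):
    post = Post(post_id=f"post-{i}", page_id=page.page_id)
    apply_fact_check(post, page, "false")
print(page.can_monetize)          # False
print(on_share_attempt(post))     # pop-up warning text
```

The key design point the article describes is that penalties apply at two levels: individual posts are demoted as soon as they are rated false, while pages and domains face broader distribution and monetization penalties only after repeated violations.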