Facebook said it will work with global and local health authorities to remove false information about the coronavirus from the company's social networks.
In a post on the company's blog, Facebook head of health Kang-Xing Jin outlined its steps to combat misinformation about the deadly virus.
"We are doing this as an extension of our existing policies to remove content that could cause physical harm," Jin wrote. "We’re focusing on claims that are designed to discourage treatment or taking appropriate precautions."
Misinformation about the coronavirus has been rife on social media. For example, posts and videos are spreading on Facebook falsely claiming Vitamin C will cure or prevent the virus.
Facebook will allow health organisations to flag harmful content on Facebook and Instagram, such as conspiracy theories or false cures, for removal, Jin wrote.
The company will also block or restrict hashtags associated with misinformation on Instagram.
The plan supplements existing efforts by the social media giant to stop misinformation by allowing independent fact-checkers to review content for false information.
If one of the company's fact-checking partners finds that a post contains false information, its reach will be limited on the platforms and users will see an alert notifying them of the fact-checker's finding.
Jin also said Facebook will start showing an information module about the crisis when users search for the virus or view related hashtags, following a similar policy announced by Twitter earlier this week.