TikTok has introduced a new feature to fight the spread of misinformation on its platform.
The app will begin warning users before they share videos containing unverified information. The update addresses a gray area in the fact-checking process: claims that fact checkers are unable to verify.
Product manager Gina Hernandez said: "People come to TikTok to be creative, find community, and have fun. Being authentic is valued by our community, and we take the responsibility of helping counter inauthentic, misleading, or false content to heart.
We remove misinformation as we identify it and partner with fact checkers at PolitiFact, Lead Stories, and SciVerify to help us assess the accuracy of content. If fact checks confirm content to be false, we'll remove the video from our platform."
Here's how it works:
- First, a viewer will see a banner on a video if the content has been reviewed but cannot be conclusively validated.
- The video's creator will also be notified that their video was flagged as unverified content.
- If a viewer attempts to share the flagged video, they’ll see a prompt reminding them that the video has been flagged as unverified content.