If you are about to post an inappropriate comment on a YouTube video, you will be warned
YouTube is going to ask people to weed out offensive comments on videos by prompting them, “Is this something you want to share?”, before they post something that could be offensive. With this, YouTube is following in the footsteps of other social media companies that have attempted to clamp down on bullying and abuse on their platforms.
YouTube is launching a new feature that will warn people if they are about to post a comment that could be offensive to others, the video streaming platform explained in a blog post, giving them a chance to reflect before posting.
This latest tool, though, will not necessarily prohibit you from posting any remarks. The prompt to ‘rethink’ will appear only for comments that YouTube considers potentially offensive, based on content that has been repeatedly reported on the platform. When the prompt comes up, users can either go ahead and publish the comment exactly as they intended, or take some time to edit it.
The company is also rolling out better content filtering for creators in YouTube Studio, the place where they manage their channels. A new filter will look for inappropriate and offensive comments, which will be flagged and held for review automatically and removed from the queue so that creators do not have to read them.
This functionality will first roll out on Android and will initially be available in English, with other languages to follow.
YouTube has been attempting to purge derogatory and violent comments from its platform for a while now. The company says that, since early 2019, its automated filters have removed 46 times more daily hate speech comments than before.
YouTube estimates that 1.8 million channels were terminated in the last quarter for inappropriate content, more than 54,000 of them for hate speech.