Monday, February 11, 2019

YouTube announces it will no longer recommend conspiracy videos

"YouTube has announced that it will no longer recommend videos that "come close to" violating its community guidelines, such as conspiracy or medically inaccurate videos."

https://www.nbcnews.com/tech/tech-ne...videos-n969856

"Guillaume Chaslot, a former Google engineer, said that he helped to build the artificial intelligence used to curate recommended videos. In a thread of tweets posted on Saturday, he praised the change."

Guillaume Chaslot's Twitter thread:

https://twitter.com/gchaslot/status/...559044610?s=21


From YouTube's announcement:

"To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11"

https://youtube.googleblog.com/2019/...o-improve.html

"While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community. To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results. We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.
This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video.
This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States. Over time, as our systems become more accurate, we'll roll this change out to more countries. It's just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."




It's good that the AI won't recommend conspiracy-theory (CT) crap to someone watching a random CT video.
But it appears it will still recommend that crap to people already subscribed to CT channels.


via International Skeptics Forum http://bit.ly/2RSnqpT
