Instagram continues to fight the serious problems of hate and harassment that plague its platform: the company has improved its algorithms to hide potentially harmful content in the app. In addition, it will anticipate user reports by hiding posts that could harm users before anyone flags them.
The content uploaded to social networks is not always to everyone's liking, and it can often hurt people's sensibilities, or hurt people directly. Since Instagram is one of the most popular networks, and given the type of content that can be uploaded to the feed and to Stories, the platform is unfortunately a common vehicle for cyberbullying. Instagram is aware of this, and it has just announced new measures to counter it.
Less visibility for content that may harm you
Instagram has been adding measures to make the app safer for teenagers. Given that they are a vulnerable group, and given the enormous importance Instagram has for them, protecting them as much as possible seems to be the company's current strategy, especially after the recent controversy over Facebook's internal studies showing that Instagram was toxic for minors.
After announcing that it had halted the development of Instagram Kids, Facebook is directing its efforts at improving the teenage experience in the main application. New parental control tools were recently added; now Instagram is improving its algorithms to get ahead of user reports.
The platform has tightened control over content published in Stories and on the app's main screen, the so-called feed. Instagram will more strictly monitor photos and videos that may hurt users, whether directly (harassment) or indirectly (insults, nudity…). In addition, reports of inappropriate content now carry much more weight in determining how that content appears in the app.
As users report a type of content as inappropriate, Instagram will hide similar content in their feed and Stories. In this way, the algorithms anticipate the report, directly hiding what users would likely have flagged as inappropriate: a kind of "Minority Report" for images and videos.
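Instagram has not published how this anticipatory hiding works. As a purely illustrative sketch, the behavior described above (hide a post once enough users report it, then pre-emptively hide similar new posts before anyone reports them) could look like this; every class name, method, and threshold here is invented for the example, not Instagram's actual system:

```python
# Hypothetical sketch of report-driven content demotion.
# Names and thresholds are invented; Instagram's real system is not public.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    report_count: int = 0
    hidden: bool = False


class FeedModerator:
    """Hides a post once user reports cross a threshold, and pre-emptively
    hides new posts of an already-flagged kind (modeled here as a topic tag),
    anticipating the reports users would likely file."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.flagged_topics: set = set()
        self.posts_by_topic: dict = {}

    def add_post(self, post: Post, topic: str) -> None:
        self.posts_by_topic.setdefault(topic, []).append(post)
        # Anticipate reports: hide new posts in already-flagged topics.
        if topic in self.flagged_topics:
            post.hidden = True

    def report(self, post: Post, topic: str) -> None:
        post.report_count += 1
        if post.report_count >= self.threshold:
            post.hidden = True
            self.flagged_topics.add(topic)


# Usage: two reports hide a post, and later posts on the same
# topic are hidden on arrival, before anyone reports them.
moderator = FeedModerator(threshold=2)
first = Post("a")
moderator.add_post(first, "harassment")
moderator.report(first, "harassment")
moderator.report(first, "harassment")
second = Post("b")
moderator.add_post(second, "harassment")
```

A real system would of course classify "similar" content with machine-learned models rather than an explicit tag, but the flow (reports feed a signal that then suppresses matching content proactively) is the same idea the article describes.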
With these changes to how content is analyzed, Instagram hopes to curb harassment and hate on its platform. The problem is that detection relies on automatic algorithms and user reports: a legitimate account could lose visibility because of an orchestrated mass-reporting campaign against it. Still, it is always positive that Instagram takes the problem seriously.
More information | Instagram
Via | gadget