Will what we see in IG change?

Several artists I have been following on social media have been talking about how the Instagram algorithm has been updated once again.

Now, Instagram is going to demote content it deems “inappropriate” even if the post does not violate the platform’s community guidelines. This means that a post Instagram sees as “inappropriate” will have less reach and will be filtered out of Explore and hashtag pages. This will affect many influencers who post revealing content and cause many meme pages to get less exposure.

One problem is that the guidelines are very vague. Some argue that because AI handles this screening, the change can be seen as an attempt to make the algorithm more efficient while shrinking Instagram’s own responsibility for moderation decisions.

With Tumblr recently banning sexual content, and now Instagram lowering the engagement of vaguely “inappropriate” content, it raises the question of whether this is a trend we can expect from other social media platforms.

Reference: https://techcrunch.com/2019/04/10/instagram-borderline/

Image: TechRadar

 

3 thoughts on “Will what we see in IG change?”

  1. I can see the bright side of these AI moderation tools on social media. According to Zak Doffman of Forbes, kids are online almost constantly, and before these moderators, taking down inappropriate content was a very slow process. Given the direction the internet is heading, it is concerning to have so little control over what your children are exposed to. These algorithms may be a hassle for influencers, yet they are essential for protecting impressionable kids.
    The hours spent on the internet and on social media are already dangerous to our mental health, and the more we can drown out the hate, the better off we are as a society. Cyberbullying is a huge problem thanks to social media, and these instant blockers can spare victims the heartache and pain they would otherwise be shown. All in all, I am excited about these advancements, as they can do so much good for us.
    Reference: https://www.forbes.com/sites/zakdoffman/2019/02/01/googles-mission-impossible-use-ai-to-detoxify-the-internet/#70732bf44e5a

  2. With Tumblr’s questionable methods of blocking inappropriate content, this is a rather concerning trend. If social media sites continue to use faulty algorithms, they will impact many people who may not even post anything inappropriate, and those users will be hidden for no reason at all. As you mentioned, the details are vague, but Hanna Kozlowska at Quartz also fears that the change will prevent people from expressing themselves the way they want. People seem most concerned about individuals who make their living as sex workers or in related careers, but I don’t think full-on nudity was allowed in the first place. This just takes that rule a step further; we’re just not sure how much further. It could also impact artists and photographers who focus on darker, but not quite inappropriate, content. It’s also hard for any kind of algorithm to regulate such content, since it’s mostly a matter of personal judgment, and not everyone has the same idea of what’s “appropriate” to post publicly. This could impact Instagram negatively if it’s not executed well.

    Reference:
    Kozlowska, Hanna. “Instagram Will Demote ‘Inappropriate Content’-and Self-Expression along the Way.” Quartz, Quartz, 13 Apr. 2019, qz.com/1594392/instagram-will-demote-inappropriate-content-and-self-expression-along-the-way/.

  3. The article linked below, titled “The Trauma Floor,” highlights the disturbing world of social media content moderation. Despite what most people think, content moderation is not done by AI; instead, it is done by independent third-party firms that Facebook pays. This arrangement shields Facebook from many of the negative effects of employing such people, because after extended periods of moderating content, these workers end up with serious psychological issues. One quote from the article: “someone is stabbing him, dozens of times, while he screams and begs for his life.” That is a content moderator describing a video she witnessed and, of course, took down. “We were doing something that was darkening our soul,” another content moderator explained. The world of content moderation needs to be examined and changed as soon as possible; it is leaving people with serious psychological issues from which they may never fully recover.

    https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
