Instagram blasted over dangerous ‘eating disorder hashtags’ – and blames artificial intelligence for blunder
INSTAGRAM has blamed an artificial intelligence gaffe for shocking hashtags promoting eating disorders.

Almost a dozen hashtags being used to share "unhealthy and dangerous attitudes towards food and body image" were found circulating on the site – without any warnings. Instagram has been accused of failing to protect its most vulnerable users.

When you search for terms like "anorexia", "proanorexia" or "bulimia" on Instagram, you'll be blocked by a warning message asking if you need help. This policy has been in place since 2016, to protect users looking up sensitive topics on the site.
But an investigation by Sky News uncovered similar hashtags being used to share rogue content, which had slipped through Instagram's filter systems.

"The hashtag search terms were slight variations or different spellings on others that have been flagged up," the report explains. Hashtags used to promote dangerous content around eating disorders were freely accessible in the Facebook-owned app.

Speaking to Sky News, Daniel Magson, a former bulimia sufferer and vice chair of the Anorexia & Bulimia Care charity, said: "It is incredibly dangerous and a real health risk.

"It's not a safe space at all and these communities are promoting things like 'these are the best places to dine with private toilets for afterwards'.

"They promote the best ways to injure or self-harm and that should not be allowed."

Normally, searching for terms around eating disorders launches a pop-up.
It reads: "Can we help? Posts with words or tags you're searching for often encourage behaviour that can cause harm and even lead to death.

"If you're going through something difficult, we'd like to help."

Users can then choose to "See posts anyway", or select "Get Support". The latter takes you through to a page that recommends talking to a friend, chatting with a helpline volunteer, or getting tips and support on how to "support yourself".

Instagram has been using artificial intelligence and machine-learning systems for the last six months to root out rogue hashtags relating to sensitive topics. The systems find content that's likely to be deemed sensitive by comparing post information to existing sensitive information. These posts and hashtags are then flagged up to Instagram.
According to Sky News, Instagram described these systems as a "work in progress", and has now added warnings to the offending hashtags.
In a statement, an Instagram spokesperson said: "We care deeply about
making Instagram a place where people feel empowered, inspired and
comfortable to express themselves. "Every day, millions of people use Instagram to strengthen
relationships with friends and build communities of support,
particularly around body image.

"Instagram was created to foster a safe, kind and supportive community and we're committed to keeping it so."

Instagram's eating disorder gaffe mirrors a similar hashtag blunder uncovered by The Sun earlier this year. We exposed secret Instagram sex hashtags that were being used to share hardcore porn videos around the Facebook-owned app.

The Sun found shocking videos depicting full sex with genitals in clear view, while others showed oral sex or masturbation. One clip even showed a bestiality scene involving an adult woman and a horse – which is illegal to distribute in the UK.

Others didn't necessarily depict nudity, but included male ejaculation or close-up crops of hardcore sex scenes – leaving genitals just out of shot. Importantly, Instagram is aimed at users aged 13 and over, so this content was highly inappropriate.
Speaking to The Sun at the time, Andy Burrows, the NSPCC's Associate Head of Child Safety Online, said Instagram wasn't doing enough to protect kids.

"Instagram's rules ban pornography but clearly its moderation systems aren't removing content that they should. Instagram should proactively filter out content which breaks its own rules.

"Young people on Instagram should never be exposed to this kind of adult content, some of which includes bestial themes.

"Following the NSPCC's Wild West Web campaign, the Government announced it will bring in new safety laws for social networks. The new Digital Secretary Jeremy Wright must make sure these laws are fit for purpose, and are backed by an independent regulator with teeth."

We've asked Instagram for comment and will update this story with any response.
Do you think Instagram needs to do more to clean up its act? Let us know in the comments!
Supporting someone with an eating disorder
Here's the official advice from the NHS
- If your friend or relative has an eating disorder, such as anorexia, bulimia or binge eating disorder, you will probably want to do everything you can to help them recover.
- You're already doing a great job by finding out more about eating disorders and how to try to support them – it shows you care and helps you understand how they might be feeling.
- Getting professional help from a doctor, practice nurse, or a school or college nurse will give your friend or relative the best chance of getting better. But this can be one of the most difficult steps for someone suffering from an eating disorder, so try to encourage them to seek help or offer to go along with them.
- You can support them in other ways, too:
- Keep trying to include them – they may not want to go out or join in with activities, but keep trying to talk to them and ask them along, just like before. Even if they don't join in, they will still like to be asked. It will make them feel valued as a person.
- Try to build up their self-esteem – perhaps by telling them what a great person they are and how much you appreciate having them in your life.
- Give your time, listen to them and try not to give advice or criticise – this can be tough when you don't agree with what they say about themselves and what they eat. Remember, you don't have to know all the answers. Just making sure they know you're there for them is what's important. This is especially true when it feels like your friend or relative is rejecting your friendship, help and support.