BuzzFeed News Home Reporting To You


Instagram Lets You Limit Comments Just To People You Follow

New tools let you limit who can comment on your posts, and viewers can now report self-harm danger during livestreams in real time.

Posted on September 26, 2017, at 8:31 a.m. ET

Today, Instagram is rolling out new features aimed at making it harder to troll and harass users, and also at looking out for some of its most vulnerable users.

1. A big new change that’ll make it easier to eliminate creepers: You’ll be able to limit who can comment on your posts. There will be four options:

  • Everyone

  • People you follow

  • People who follow you

  • People who follow you and people you follow

This will only be available to public accounts. Previously, you could only block commenters one by one; there was no way to restrict comments from large groups of people at once.


2. The “hide offensive comments” feature will now be available in French, German, Portuguese, and Arabic. This feature launched in English in June as a way of blocking certain words, but it has since expanded into a more robust, AI-powered system that detects the nuances of harassing comments.

To turn it on, go into the “Comments” section of the Instagram app and toggle on the automatic filter.

3. The last feature is an enhancement to the mental health and safety tool aimed at helping people who are posting about self-injury or suicide. Currently, when you report a post, Story, or live feed for self-harm or suicide content, the person will be shown a notification that says “we’re reaching out to offer help.”

Starting today, that menu will show up during a livestream if someone is worried about you and reports your livestream for self-harm. Before, the menu would only appear after your livestream had ended, which might be too late. You can browse the options in the menu anonymously while livestreaming, then seamlessly return to your stream.


You can view resources that will suggest you should contact a friend, call a helpline, or get tips and support (the specific support links and helplines will vary by country).

This also means that, to be effective, the reports need to be reviewed and processed by human moderators very quickly — while someone is still streaming. Instagram says of its moderation efforts, “we have teams working 24 hours a day, seven days a week, around the world to be there when people need us most. This is an important step in ensuring that people get help wherever they are — on Instagram or off.”
