How Facebook and Instagram choose which posts to recommend and which to reject

Sometimes I feel that Facebook looks like a Frankenstein's monster, with a sea of content, and it is natural that we get a little lost. To help, the social network makes many recommendations, hoping to reveal new communities and content it believes is of my, your, our interest – but how does it choose which ones?

Some examples of recommendations on Facebook are Pages you might like, Posts for you in the News Feed, People you may know, and Groups you should join. On Instagram, they appear in the Explore section, Accounts you might like, and Discover IGTV. All of it is guided by a few factors.

Two points are important when guiding recommendations:

  • your browsing history on the social network;
  • your actions (reactions, groups you participate in, friends, etc.).

“We suggest pages, groups, events and more based on the content you’ve shown interest in in the past and the actions you take on our apps. We personalize the recommendations to ensure they are relevant and valuable.”

“Our goal is to personalize the recommendations so that each person receives unique recommendations. If you and someone else have Facebook friends in common, we can suggest that person as a potential new friend.”

“Or, if you interact with restaurants and bookstores on Instagram, we can recommend content about food, recipes, books or reading,” says the network’s documentation.

How recommended posts are chosen 🍬

The social network acknowledges that recommendations can help users discover great things. But since recommended content does not come from accounts you chose to follow, “it is important to have standards for what we recommend”.

What do these standards do?

Guy Rosen, Facebook’s vice president of integrity, says they help ensure that potentially sensitive content is not recommended to anyone who has not explicitly indicated that they want to see it. Even when content is not recommended, it is still allowed on the platform.

“We just won’t show it in places where we recommend content,” he explains.

To determine what content is eligible to appear in recommendations, there are “Recommendation Guidelines”. Facebook made these guidelines public in the Help Center in August 2020, providing context on why some types of content are not promoted (which can also affect their creators).

Who developed these guidelines?

Fifty experts and leaders in recommender systems, expression, safety, and digital rights were consulted to tune the feature toward a safe and positive experience.

Posts not recommended by Facebook and Instagram 🙅🏽‍♀️

Enough beating around the bush…

There are five categories of content that are allowed on both platforms but are not always eligible to appear among recommended posts.

Content that gets in the way of fostering a safe community:

  1. Content that discusses self-harm, suicide or eating disorders;
  2. Content that depicts violence, like people fighting;
  3. Content that may be sexually explicit or suggestive, such as photos of people in see-through clothing;
  4. Content that promotes the use of certain products, even regulated ones, such as tobacco or vaping products, adult products and services, or pharmaceuticals;
  5. Content shared by any non-recommendable account, Group, or Page.

Sensitive or low-quality health or finance content:

  1. Content that promotes or describes cosmetic procedures;
  2. Content with exaggerated health claims, such as “miracle cures”;
  3. Content that tries to sell products or services based on health claims, such as promoting a supplement to help a person lose weight;
  4. Content that promotes deceptive or misleading business models, such as payday loans or “risk-free” investments.

Content that users are known to dislike:

  1. Clickbait;
  2. Engagement bait (asking for likes and shares);
  3. Content that promotes a contest or sweepstakes;
  4. Content that includes links to low-quality or deceptive landing pages or domains, such as pages filled with malicious ads or clickbait.

Content associated with low quality ads:

  1. Non-original content that is widely reused from another source without adding material value;
  2. Content from sites that get a disproportionate number of clicks from Instagram compared to other places on the web;
  3. News content that does not include explicit information about the authorship or editorial staff of the news outlet.

False or misleading content:

  1. Content that includes claims found to be false by independent fact-checkers;
  2. Vaccine-related misinformation that has been widely refuted by leading global health organizations;
  3. Content that promotes the use of fraudulent documents, such as a post about using a fake identity document.

Note that these same posts, when flagged by users or automated systems, can also be removed outright for violating the networks’ terms of use.

About account and profile recommendations 🗣️

There are also recommendations on accounts (and not just the content published by them).

Accounts are not recommended if they:

  1. Recently violated the Instagram Community Guidelines or the Facebook Community Standards;
  2. Recurrently and/or recently shared content that is not eligible for recommendation;
  3. Repeatedly published vaccine-related misinformation that has been widely refuted by leading global health organizations;
  4. Repeatedly used deceptive practices to inflate their follower count, such as buying “likes”;
  5. Were banned from running ads on the platforms for any reason;
  6. Recently and repeatedly posted false information, as determined by independent fact-checkers or certain specialized organizations;
  7. Are associated with offline movements or organizations tied to violence.

All of these measures ultimately reduce the reach of such posts. Keep in mind that not all allowed content is eligible for recommendation.

The guidelines on which posts are (or are not) recommended across various sections of the platforms are part of an even broader social media strategy for managing problematic content in Facebook’s family of apps, dating from 2016 and called “remove, reduce, inform”, which includes other content moderation initiatives as well.

With information: Facebook, Instagram Help and Facebook Help
