Facebook Provides a (Small) Glimpse Into Its Recommendation Engine

(Image: Getty)

Facebook today provided a peek inside its recommendation engine—sort of.

As TechCrunch reports, Facebook has updated its Help Center with documentation about content that gets fed into its recommendation engines on Facebook and Instagram.

These suggestions show up as you scroll through the News Feed as things like Pages You May Like, “Suggested For You” posts, People You May Know, or Groups You Should Join.

Since 2016, Facebook has used a system it calls remove, reduce, and inform: remove content that violates its Community Standards; reduce the spread of problematic content that does not violate those standards; and inform people with additional context.

"We work to avoid making recommendations that could be low-quality, objectionable, or particularly sensitive, and we also avoid making recommendations that may be inappropriate for younger viewers," Facebook says.

"Our Recommendations Guidelines are designed to maintain a higher standard than our Community Standards, because recommended content and connections are from accounts or entities you haven't chosen to follow.

Therefore, not all content allowed on our platform will be eligible for recommendation."

Content that isn’t eligible for recommendation includes content that:

  • Discusses self-harm, suicide, or eating disorders.

  • Depicts violence.

  • Is sexually explicit or suggestive.

  • Promotes regulated products, like cigarettes.

  • Stems from any non-recommendable account.

As TechCrunch notes, however, "the documentation offers no deep insight into how Facebook actually chooses what to recommend to a given user.

That’s a key piece to understanding recommendation technology, and one Facebook intentionally left out."

It comes as social media sites have been criticized for recommending content that sends users down conspiracy theory rabbit holes or thrives on conflict.

As The Wall Street Journal reported in May, Facebook execs largely avoided doing anything to crack down on divisiveness.

Joel Kaplan, VP for US Public Policy at Facebook, reportedly wanted to avoid a "paternalistic" approach that might irk conservative users and publishers.
