Facebook News Feed fails to provide transparency on its algorithm
What was conceived as an innocuous platform for friends and loved ones to connect with each other by sharing information and snapshots has grown into a gargantuan enterprise and spawned problems affecting individuals and nations.
Facebook has been at the center of criticism over its role in spreading fake news and hate speech, and in invading the privacy of ordinary citizens.
Over the weekend Facebook founder and chairman Mark Zuckerberg published an essay urging governments around the world to play a more active role in establishing the rules that govern the internet in order to guard against harmful content and safeguard elections and other institutions.
At the core of all the controversies surrounding the social network is its News Feed function, which distributes content to users based on their browsing habits and their engagement with other online content.
Facebook predicts what users want to read based on its own algorithm. But some individuals and organizations have abused this function to distribute fake news to specific groups of users in a bid to pursue their own political or economic agenda.
On Monday, Facebook unveiled a feature called “Why am I seeing this post?” which explains why a specific post appears on a user’s News Feed. It is an expansion of an existing feature Facebook already provides for advertisements, and it allows users to control what they see on their News Feed.
Once the service is rolled out, users will be able to tap on posts on News Feed to understand why they are appearing, and take action to further personalize what they can see.
The feature tells users whether a post comes from a friend, a group they have joined or a page they have followed.
Simply put, the more frequently users interact with their friends, groups or pages, the more likely they are to come across content about them or from them on their News Feed.
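The interaction-based ranking described above can be sketched as a toy model. This is an illustrative simplification only, not Facebook’s actual (and unpublished) algorithm; the interaction counts and source names are invented for the example:

```python
def rank_feed(posts, interaction_counts):
    """Order candidate posts by how often the user has interacted
    with each post's source (a friend, a group or a page).
    A toy engagement-weighted ranking, not Facebook's real one."""
    return sorted(
        posts,
        key=lambda post: interaction_counts.get(post["source"], 0),
        reverse=True,  # most-interacted-with sources rank first
    )

# Hypothetical interaction history: the user engages most with "alice".
history = {"alice": 12, "cooking_group": 5, "news_page": 1}

feed = rank_feed(
    [
        {"id": 1, "source": "news_page"},
        {"id": 2, "source": "alice"},
        {"id": 3, "source": "cooking_group"},
    ],
    history,
)
# The post from "alice" surfaces first because she has the
# highest interaction count in this user's history.
```

Under this simplified model, muting a source is equivalent to zeroing out its interaction count, which is roughly the kind of control the new feature exposes.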
It is quite clear that Facebook is trying to rebuild the trust of users by gradually raising the curtain on its algorithm.
However, the function only explains why users are seeing what appears on their News Feed; it does not let them decide what they want to see. True, Facebook gives them the ability to stop seeing what they don’t want to see, but such an arrangement is still a far cry from allowing them to build their own News Feed, or something similar to a subscription model in which users only read news from sources they have subscribed to, not those chosen by the Facebook algorithm.
One basic problem with the Facebook algorithm is that it is a tool to drive the company’s advertising sales. Facebook uses the algorithm to determine what products and services a user most likely will be interested in and will probably buy.
A user must respond to and engage with the news or advertisements chosen by the algorithm; otherwise Facebook may eventually stop showing them, on the premise that the user is not interested.
In short, Facebook has decided to give more information about how its algorithm works, but the control is still in the company’s hands, not with the users.
In an interview with The Daily Telegraph, Facebook’s global head of News Feed John Hegeman said the company would not make its News Feed algorithms public as that would make it too easy for “bad actors” such as scammers and foreign intelligence agencies to “game the system or abuse the platform to cause harm”.
However, the problem is that its algorithm could be giving priority to content that provokes strong emotional reactions from users, and this has helped spread sensational or divisive information, including fake news.
Aside from explaining the workings of its algorithm, Facebook is considering building a dedicated news section to provide quality news from established and reputable news sources. The new product has been in development for quite some time and should be ready to launch by the end of this year.
Sure, Facebook is trying its best to protect users from harmful content. But consumers are left wondering why the company can’t simply return to being a social network without the intervention of artificial intelligence and algorithms.