Social media and misinformation go hand in hand, and both are unavoidable. Facebook plays a major role in this and has come under scrutiny for it. Two years ago, Facebook CEO Mark Zuckerberg made a formal public statement about renewing the company's focus on Facebook's communities and groups feature, which he believed would build a more meaningful social infrastructure. The project appears to have succeeded: Zuckerberg told investors last month that the number of users belonging to "meaningful groups" had risen from one million in 2017 to "hundreds of millions." He added that connecting and communicating with people who fascinate you and share your interests will become as natural and comfortable as connecting with friends and family.
While this shift has worked out well for individual users and has done a great deal for the company, researchers note that much Facebook activity has moved out of public view. Their insight into what is happening on the platform, and into how the company's algorithm-driven recommendations channel people toward unconventional communities and misinformation, has diminished considerably. Jonathan Albright, director of research at Columbia University's Tow Center for Digital Journalism, says that researchers can access only the content itself, which does not let them explore Facebook broadly or observe how the groups operate. They need more information about the networks and the group members who target individuals and spread rumors. Under the current setup, researchers are putting in double the usual effort, and for the work to progress they would need a partnership with Facebook deeper than the company is willing to offer.
The main drawback of these groups and communities is their tendency to mislead individuals via recommendations. One incident that was uncovered by the media and gained public attention involved communities spreading false news about vaccines, with content that discourages new parents from vaccinating their children. This may have been partly responsible for the measles outbreak in Washington State. Larry Cook, a self-described social media activist with no medical training and no children, runs the largest anti-vaccine private group on Facebook, "Stop Mandatory Vaccination," with 126,000 members. Cook and the group's members promote the debunked theory that vaccines harm children, particularly by causing autism, and claim that outbreaks of preventable diseases are hoaxes staged by the government.
Facebook is considering additional measures to combat this vaccine misinformation. US Representative Adam Schiff also wrote a letter to Facebook and Google requesting that they treat the issue as a priority, noting that repeated exposure to false information often leads people to accept it as fact, and that the algorithms neither assess the quality of the information nor verify whether it is true. In response, Facebook said that various measures are under consideration: first, reducing or removing this type of content from recommendations, including the "Groups You Should Join" feature; then demoting it in search results; and finally, ensuring that the information surfaced is authentic and reliable.
Following this, the recommendation engine, the algorithm Facebook depends on to draw more users into groups and a vital driver of the company's growth, has come under scrutiny. According to a Facebook spokesperson, the algorithm that controls the right-rail "Suggested Groups" feature is not public, and its suggestions are personalized for each user. That account is disputed by Renee DiResta, who studies online disinformation as director of research at the cybersecurity company New Knowledge and has been observing Facebook for four years. She argues that Facebook effectively pushes its users down a rabbit hole of misinformed, conspiratorial, and radical communities. Vaccine misinformation, for example, is thrust upon new parents: if you already belong to a moms' group, she says, you will receive invitations from anti-vaccination groups. Zuckerberg himself has acknowledged the importance of the recommendation engine, noting that most people do not seek out groups on their own; they join because Facebook suggests a group or a friend sends an invite.
YouTube's recommendation engine has faced similar criticism for algorithms that mislead users. After years of protest from researchers and former employees, YouTube announced changes to its recommendation engine to stop steering users toward conspiracy videos. Anti-vaccine videos that misinformed users in dangerous ways had been the top results for vaccine searches; YouTube has since amended its results page so that such searches now surface videos dispelling vaccination myths. While YouTube made these changes with immediate effect, Facebook does not plan to follow suit. A Facebook spokesperson pointed out that users already have the option to hide or remove suggested groups.
Moreover, even when a page or a person is disciplined or banned on the platform, the associated groups can remain active. The conspiracy news site Infowars illustrates how groups can skirt Facebook's cleanup efforts: Facebook's 2018 ban did not constrain the site's presence in the groups feature.
Despite the ban, Infowars owner Alex Jones continued to maintain a presence on Facebook pages, prompting Facebook to recently renew its enforcement policy and close the loopholes, a move that led to the shutdown of 22 individual Infowars pages. Nonetheless, Jones and his followers have kept Infowars alive in a private Facebook group of more than 117,000 members. The group, which presents itself as "a media source for those who are awake!", primarily discusses controversial material such as anti-Muslim topics and conspiracy content. Jones and his employees keep members up to date on alternate pages and groups, share material from the Infowars website, and advertise supplements sold in the Infowars store.
According to a Facebook spokesperson, the company has begun applying the renewed policy fully across Facebook's features and products, including pages, events, groups, and Instagram. Facebook is using two criteria to remove groups with banned content: groups that share admins with pages that have been shut down, and groups whose names resemble those of shut-down pages. The Infowars group appears to meet both criteria. The spokesperson said the removal of those pages is only the beginning and that Facebook plans to expand enforcement in the months ahead. For now, though, the Infowars community page declares, "This group is doing great."