Auditing Algorithms: Investigations of the Facebook News Feed Recommender System
In the past decade, digital intermediaries such as Facebook, Google, and Twitter have assumed a pivotal role in the information space as central distribution channels for all forms of content, most notably news. The distribution and (in)visibility of information on these platforms are dictated by algorithmic recommender systems, often described as black boxes in reference to the opacity of their decision-making. The information-delivery outcomes of such systems are understood to have profound implications for public discourse, yet oversight and transparency have been severely limited.
Efforts to understand the effects of such algorithmic systems have produced an emerging area of research: algorithmic auditing. Sandvig et al. (2014) set out the initial understanding of algorithmic audits as a family of methods for uncovering issues within the decision-making structure of an algorithm. As a nascent field, its methods have yet to be standardised and encompass a wide variety of techniques for investigating algorithmic systems by indirect means. In recent years, such audits by academics, journalists, and activists have uncovered evidence of harmful algorithmic mechanics on various platforms, including racial bias, discrimination, misjudgement, and misattribution.
The proposed research takes an empirical approach to investigating the effects of algorithmic governance of information on the Facebook platform by auditing the Facebook News Feed recommender system. The research design uses a parametrised timeline of known strategic changes to the News Feed recommender system in conjunction with media content, including a corpus from The Guardian newspaper (2011-2020), and uses CrowdTangle to access Facebook engagement metrics for that content. The proposed method is to build a model of the documented changes to Facebook's algorithms and to analyse the engagement time series against it with cross-correlation temporal analysis, Augmented Dickey–Fuller and Granger-causality tests, and finally the Seasonal Hybrid ESD (S-H-ESD) anomaly-detection algorithm. This study presents a proof-of-concept audit of the Facebook News Feed and forms the basis of an extended set of further investigations aimed at contributing to the understanding of algorithmic governance on digital intermediaries.