Social Media Counters and Metrics: Measurement at all Costs?


‘The one question I keep returning to is whether we can dispense
with social media counters entirely’.

 Kane X. Faucher’s new book Social Capital Online (available open access in the CDSMS series from UWP) considers the dominant role of quantification in social media environments and how we end up competing for dubious forms of digital ‘social capital’. He explains:

An obsession with metrics pervades much of the private and public sector, and is paralleled on popular social media. It is the promise of metrics that sees so many place inviolable faith in their ability to increase efficiency and effectiveness, and to offer ready tools for benchmarking and box-ticking. Worse still is the promise that metrics will facilitate better prediction and can be used as a tool for directional planning. There is no doubt that measurement is indispensable in the sciences and engineering. The problem arises when metrics are applied wholesale to domains such as social media. When it comes to metrics, what we measure, how we measure, and why we measure are equally essential questions. With social media, we have an abundance of metrics – some visible, others requiring some digging, and still others entirely invisible to the public.

A quick rundown makes this clear:

  1. Visible Metrics: On Facebook or other networks, it may seem easy to assign a value to any user by the number of friends or likes accumulated. It can be an easy way of determining popularity or relevance in a socially competitive field – a process not dissimilar to casting ballots regularly. Motives for assigning a ‘like’ vary as widely as the reasons people support a politician in an election. And yet, because of the presence of these visible metrics, there are notable behavioural changes in the way some people operate on social media: being conspicuous in their online production, managing their reputations, and effectively campaigning for the most ‘votes’ on their content. But, unlike an election, there is no end date to the campaign; any sense of victory is fleeting. Users may adopt riskier behaviour in order to garner more attention, a higher ‘score’. Businesses try to increase their social media score believing that this will convert to customers, then sales. As a metric, this may be flawed, or reflect mere correlation.
  2. Less Visible Metrics: Services (some free, others paid) provide reams of metrics on the number of impressions, clickthroughs, and so on. Google Analytics provides a welter of data on the demographics of visitors to a site, the operating systems they use, the flow-through of the pages users visit, and for how long. This quasi-cybernetic affordance can give a website operator guidance for reconfiguring parts of the site to optimize visits, encourage longer stays, and improve the ‘experience’. YouTube provides similar metrics, notably CPM (cost per mille, the cost per thousand impressions). Other metrics can also be calculated, such as a dollar value on a social media account. Klout and other companies may tell us how much a person’s tweet is ‘worth’ and the overall value of the account itself. These are potential values, but it is unclear what they mean. Assigning a dollar value to a collectible item is usually a reflection of the market and what others are willing to pay; on social media, there is no sense of true exchange value whereby a user can sell their account or tweet. True, there are plenty of celebrities like Kim Kardashian who will charge a set fee for promoting a product or service on their social media accounts, and perhaps that lip-service endorsement can result in sales. But this is little different from traditional forms of celebrity endorsement in other media venues, apart from it being digital and potentially reaching a wider audience.
  3. ‘Invisible’ Metrics: Facebook is able to automate the counting of interactions and drill down into data that compares what you mention with your demographic information. The result is a set of ‘buckets’ that businesses can pay to access in order to refine their target marketing. Algorithms simplify this process, but it is not an exceptionally sophisticated one, despite the conspiratorial chatter about how we are being ‘controlled’ by social media. Obviously there are behaviour-shaping elements on social media that strongly resemble conditioning. There is also a strong availability heuristic at play in how these sites decide on our behalf what content appears in the newsfeed, which may keep us sequestered in our filter bubbles. It was not long ago that Facebook conducted its own behavioural experiment, selecting a number of users (without their knowledge or explicit consent) and showing them positive or negative posts while observing their behaviour.
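The arithmetic behind a ‘less visible’ metric like CPM is deliberately simple – a single number standing in for a complex audience. A minimal sketch in Python, with the revenue and impression figures invented purely for illustration:

```python
def cpm(revenue: float, impressions: int) -> float:
    """Revenue (or cost) per thousand impressions: the 'cost per mille'."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return revenue / impressions * 1000

# A hypothetical video that earned $42.50 across 17,000 impressions:
print(round(cpm(42.50, 17_000), 2))  # → 2.5
```

The point the book makes survives the calculation: the number is exact, but what it measures about the people behind those impressions is anything but.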

Figures – Donald Trump, to name one – may have tens of millions of followers on Twitter, but it would be a mistake to believe every one of them endorses his views or supports him. A good number may follow his tweets out of public interest, for comedy, to troll him, or because their job (such as working in the media) requires it. Sentiment analysis of engagement may help to establish whether those followers are supporters or not. Despite all of this, assigning a value on the basis of a raw score is flawed because there is no consensus on what we mean by value. It is as rough and ready as valuing another human being on the basis of how much money they have in their bank account.
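Sentiment analysis of the kind mentioned above can be sketched, in its crudest lexicon-based form, in a few lines of Python. The word lists below are invented placeholders; real systems use trained models rather than hand-picked vocabularies, but the sketch shows why a raw follower count and an engagement-based signal can tell very different stories:

```python
# Toy sentiment lexicons -- illustrative only, not a real vocabulary.
POSITIVE = {"great", "support", "love", "agree"}
NEGATIVE = {"terrible", "oppose", "hate", "disagree"}

def sentiment(text: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Two hypothetical replies from 'followers' of the same account:
replies = ["I love this great point", "terrible take I disagree"]
print([sentiment(r) for r in replies])  # → [2, -2]
```

Both repliers count identically toward the follower total; only the engagement signal hints that one is a supporter and the other a critic.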

Algorithms: Mystery but no magic

As a predictive tool, social media counters are far from perfect. What is popular now will not necessarily remain so. At one point #Kony2012 was the top trending hashtag on Twitter, but the fortunes of that organization changed quickly. And yet metrics are considered an essential ingredient in recommender systems, nudging us to purchase similar products based on the purchasing habits of those placed in a similar data bucket. When the term ‘social media algorithm’ is mentioned, a kind of magical understanding takes over: that it all occurs in a black box heavily guarded by complex streams of code.

Worse, it isn’t even scientific, but a kind of pseudoscience. The sorcery really covers for the fuzziness of the operation. It also completely disregards the old GIGO principle (garbage in/garbage out), since it does not measure or produce anything all that meaningful. There is absolutely nothing mysterious or magical about algorithms. Running your finances through a spreadsheet would qualify as an algorithm. A simple Turing Test is an algorithm. A good algorithm is a feedback loop that does not require human intervention. It would be an exercise in futility to task a human being with calculating on the fly the trajectory of a missile in order to shoot it down. GPS operates as a feedback system, whereas the ABS on your car is a feed-forward system using actuators.
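The feedback/feed-forward distinction can be made concrete with a toy control example. This is an illustrative sketch (a made-up heater, not any real system): a feedback controller re-measures its output each step and corrects toward the target, while a feed-forward controller applies a correction derived from an assumed model and never checks the result.

```python
def feedback_heater(target: float, temp: float, steps: int) -> float:
    """Feedback: each step measures the output and corrects toward the target."""
    for _ in range(steps):
        temp += 0.5 * (target - temp)  # correction proportional to measured error
    return temp

def feedforward_heater(target: float, temp: float,
                       assumed_gain: float, steps: int) -> float:
    """Feed-forward: applies a pre-computed input from a model, never re-measures."""
    for _ in range(steps):
        temp += assumed_gain           # fixed correction from an assumed model
    return temp

print(round(feedback_heater(20.0, 10.0, 20), 3))          # converges on the target
print(round(feedforward_heater(20.0, 10.0, 0.4, 20), 3))  # falls short, never notices
```

The feedback version homes in on 20 degrees regardless of where it starts; the feed-forward version lands wherever its assumed model puts it, which is the book’s point about platforms that model behaviour without a genuine corrective loop.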

The algorithms in use by platforms like Facebook are not feedback loops but feed-forward. They assume certain models of human behaviour, but they cannot fully calculate the variance between groups. What they sell in terms of data is limited and not a feedback tool for making useful predictions. As such, it is unstable and its results are hit and miss. The dream of predicting the behaviour of crowds is an old one, and it continues to thrive in excitable statements such as Google’s that human beings are programmable. Our behaviour can be shaped through persuasive techniques, but the outcomes are not foolproof.

At best, these algorithms aim to recognize patterns, and then take action on the basis of those patterns. This is little different from using actuarial tables to determine insurance premiums on the basis of past data, where someone of a certain age, gender, location, and so on is matched against comparative mortality statistics. Such tables require frequent adjustment, but they assume in advance a set of conditions in order to calculate the premium. In the case of social media, assumptions are applied to groups who share some characteristics, but the process is akin to throwing something at the wall to see if it will stick. If, say, the algorithm detects a pattern whereby 20-year-old women are more likely to purchase a Mac as opposed to a Windows computer, the ads in the sidebar will aim to reflect that pattern in order to produce that result by increasing its probability. It is a little like adjusting the controls of an experiment to arrive at the result one desires.
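The bucket-based targeting described above can be sketched in a few lines. Everything here is invented for illustration – the demographic buckets, the purchase history and the ad names are hypothetical, and real ad systems are vastly larger – but the underlying logic (serve the ad matching the bucket’s most frequent past purchase) is the same shape:

```python
from collections import defaultdict

# Hypothetical past purchases: (age_band, gender) bucket -> product bought.
purchases = [
    (("18-24", "F"), "mac"), (("18-24", "F"), "mac"), (("18-24", "F"), "pc"),
    (("35-44", "M"), "pc"), (("35-44", "M"), "pc"),
]

# Tally purchases per bucket.
counts: dict = defaultdict(lambda: defaultdict(int))
for bucket, product in purchases:
    counts[bucket][product] += 1

def pick_ad(bucket) -> str:
    """Serve the ad for whichever product this bucket bought most often."""
    history = counts.get(bucket)
    if not history:
        return "generic-ad"            # no pattern yet: fall back to a default
    return max(history, key=history.get) + "-ad"

print(pick_ad(("18-24", "F")))  # → mac-ad
```

Note that the code does exactly what the text criticizes: it projects a group’s past frequency onto every individual placed in the bucket, reinforcing the pattern it claims merely to detect.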

One analogy that may serve to illustrate this operation is an assembly line where, for example, every 10th widget is inspected for quality control. There is a ‘model widget’ against which each sample is compared, and if a defective one is found the assembly line is shut down and the cause of the defect investigated. Applied to social media: if the ad is not resonating with the targeted group, the algorithm is reconfigured. This process can be further refined by soliciting user input, as with Google Ad Choices, where we have the opportunity to say whether an ad was relevant to us. The algorithms at play on social media assume we conform to the model widget, depending on which bucket we’ve been placed in. There is nothing sinister or spooky about this kind of machine learning. What is objectionable is how all our interactions are logged, tabulated, and then syndicated across our networks behind a gamified environment where our labour is disguised as leisure activity in a high-trust milieu. Rather than a McLuhan ‘global village’, the glowingly optimistic pronouncements about social media in its shining ubiquity are more aptly viewed as a Potemkin village, where so much social activity and connectedness obscures the very real power dynamics of capitalism, data capture, and cutthroat competition for attention, with value determined by sheer numbers alone.
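The assembly-line analogy maps neatly onto a sampling routine. The following is a toy illustration of every-Nth-item inspection against a ‘model widget’, not production code:

```python
def inspect_line(widgets, model, every: int = 10):
    """Inspect every Nth widget against the model; halt at the first sampled defect.

    Returns the 1-based position where the line was shut down, or None if
    no sampled widget deviated from the model.
    """
    for i, widget in enumerate(widgets, start=1):
        if i % every == 0 and widget != model:
            return i                    # line shut down here for investigation
    return None

# A defect at position 20 is caught (it falls on a sampling point)...
line = ["ok"] * 19 + ["defective"] + ["ok"] * 10
print(inspect_line(line, model="ok"))  # → 20
```

The limitation is also the analogy’s point: a defect at position 11 would sail through unsampled, just as an ad campaign’s mismatch with real users only registers when it happens to hit whatever the algorithm is measuring.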

Only a Numbers Game

The one question I keep returning to is whether we can dispense with social media counters entirely. As much as it may provide a temporary ego-boost, jockeying for more ‘points’ seems to undercut the true value of generating online social capital: the ability to organize, mobilize, share, and connect with others in a social venue.  To run up our scores is really to do the work of social media sites, with these scores as the token payment for our labour. Can we not appreciate the intrinsic value of sharing our content without judging it by the number of people who clicked or tapped their approval? Can we make use of social media without so quickly rushing to commodify and brand ourselves? The answer to those questions is certainly yes, but it is something we would have to elect to do while putting pressure on popular social media platforms to simply remove these counting features.

Whenever we engage in the games of online social capital on a purely numerical basis, we may be feeding egos with token scores, but we are also feeding the machine, helping it refine its pattern recognition so as to restrict our choices and persuade us to support particular viewpoints or purchase a product or service. It becomes clear that the incentive for including these counters is to increase the time we spend on social media, while masking the labour we perform behind a kind of competitive game.

Dr Kane X. Faucher teaches at the Faculty of Information and Media Studies, Western University, Ontario, Canada


‘[T]he continued relevance of the Propaganda Model is abundantly clear.’

A fresh interview with the editors of The Propaganda Model Today has been published on the CAMRI blog, exploring such questions as what the PM has to say about Russo-phobia and digital media, and whether and how it is being taken up by a new generation internationally.

See the full interview only here. 

To view open access or download the book (description in image) see The Propaganda Model Today: Filtering Perception and Awareness.

‘Russo-phobic environment’ fits nicely into Propaganda Model framework – says Herman.

What does the PM have to say about the media coverage of Trump’s election campaign and first months as President?

The MSM [Mainstream media] clearly favoured Hillary Clinton, but many of the elite were pleased with Trump’s anti-regulatory and tax ‘reform’ plans. They also gave Trump a great deal of free media space because his demagoguery resonated with large numbers and playing him up raised media audience sizes. Since the election the MSM have been much more hostile to him and have teamed with the Democrats in creating a Russo-phobic environment, in good part to squelch any attempt on his part to soften policy on confronting Russia and keeping the war party happy and profitable. This all fits nicely into the PM framework.

Rethinking ‘Freedom’ – a book launch event


The Westminster Law and Theory Lab invites interested parties (all welcome) to a book launch and drinks reception for Riccardo Baldissone’s new book, ‘Farewell to Freedom: A Western Genealogy of Liberty’. Dr Elena Loizidou (Birkbeck Law School), Professor Saul Newman (Politics and International Relations, Goldsmiths, University of London) and Professor Nathan Widder (Politics and International Relations, Royal Holloway, University of London) will consider the vocabularies and history of the idea of freedom at the event: Friday 21 November, 18.00–20.00.

VENUE: University of Westminster, The Pavilion, 115 New Cavendish St, London W1W 6UW. Register here to attend. For a free open access download of the book, see here.


Communications and Geography: An Ever-Closer Union?


Here, in an extract from his editorial in the latest issue* of Westminster Papers in Communication and Culture, Doug Specht reflects on how space has not gone away or been ‘annihilated’. He considers how communication theories may help us understand a world in which maps of all kinds are being reconfigured with the aid of the users and suppliers of Big Data, as space in all dimensions is mediated and reshaped.

Late twentieth-century communication and information technologies have produced such a blurring of what is real and what is representation that the two can no longer be distinguished (Corner, 1999), leading to persistent questions over how human behaviour is constituted through space and time, and within specific social contexts (Dear, 1988). Our mappings of the world, be they cartographic representations and data visualizations (space-in-media) or mediated senses of place (place-in-media and media-in-place), sit in between the virtual and the physical. This distinction is not to be confused with one between real and fake, ‘as we would not claim that our bodies are real while our minds are fake’ (Smith, 2017: 30). ‘Did you find the world or did you make it up?’ asked Winnicott (cited in Corner, 1999) – a salient question indeed. The information super-highway agenda of the 1990s was designed to change the very fabric of society (Robins, 1997), to create a homogenized flow of communications transcending geography (Greig, 2001). This post-modern condition of ‘space-time compression’ (Harvey, 1989) would annihilate space. Yet space has not disappeared; it has re-established itself in new spheres, created out of ever-larger data and increasingly mediated, and must then be understood through the use of semiotic and communication theories, such as the Marxist spatial frameworks of Castells and Lefebvre, or the Ideologiekritik of the Frankfurt School (Lagopoulos, 1993). The postmodern creates tensions between all theories in an attempt to best understand the conditions of existence; at its core, perhaps, lies the dialectic between space and society – a geographical puzzle in which structures, institutions and human agents operate on different scales to define spatial patterns in any given locale (Dear, 1988). The individual does not disappear in the midst of the social effects caused by the pressures of the masses, but is instead affirmed (Lefebvre, 1991).
It is seeing that establishes our place in the surrounding world; we explain that world with words, but words can never undo the fact that we are surrounded by it. As Fuchs (2018) states: ‘means of communication are (just like social space) means of production through which humans produce social relations and therefore also social space’ (p. 19). The relation between what we see and what we know is never settled. Each evening we see the sun set. We know that the earth is turning away from it. Yet the knowledge, the explanation, never quite fits the sight (Berger, 1972/2008). While human geography has always been a maze of diverse interests (Dear, 1988), the use of geographic information has changed dramatically in the past decade, and continues to do so; increasingly it is used in mediated practices – to shape stories, to transcend boundaries, to develop new ethereal networks – as well as to produce maps. And even in those maps, users themselves are being encouraged to crowdsource data, be that to add to the ‘usefulness of the map’ or to create counter-maps. Data has become the standard way in which the world is ordered (Thatcher and Dalton, 2017), with data that link location and temporal information seen as fixes for capitalism’s tendencies towards over-accumulation (Greene and Joseph, 2015). As the scholars in this issue demonstrate, there is much to be gained from combining communications theories with those of the geographic disciplines. Bringing the two together allows for an alternative, nuanced and spatially grounded approach to envisioning the myriad ways in which the digital age mediates social, economic and political experiences, in particular in the increasingly technologically informed media and communications sector.

[*’GEOGRAPHY AND COMMUNICATIONS‘  the full open access issue can be viewed or downloaded at the WPCC website ]


Berger, J. (1972/2008). Ways of Seeing. London: Penguin.

Corner, J. (1999). The Agency of Mapping: Speculation, Critique and Invention. In: Dodge, M., Kitchin, R., & Perkins, C. (eds.), The Map Reader: Theories of Mapping Practice and Cartographic Representation, 213–252. Chichester: Wiley-Blackwell.

Dear, M. (1988). The postmodern challenge: Reconstructing human geography. Transactions of the Institute of British Geographers, 262–274.

Fuchs, C. (2018). Henri Lefebvre’s theory of the production of space and the critical theory of communication. Communication Theory, 1–22.

Greene, D. M., & Joseph, D. (2015). The digital spatial fix. tripleC: Communication, Capitalism & Critique, 13(2): 223–247.

Harvey, D. (1989). The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Malden, MA: Blackwell.

Lagopoulos, A. P. (1993). Postmodernism, geography, and the social semiotics of space. Environment and Planning D: Society and Space, 11(3): 255–278.

Lefebvre, H. (1991). The Production of Space. Oxford: Blackwell.

Robins, K. (1997). The new communications geography and the politics of optimism. Soundings, 5, 191–202.

Smith, T. G. (2017). Politicizing Digital Space. London: University of Westminster Press.

Thatcher, J., & Dalton, C. M. (2017). Data Derives: Confronting Digital Geographic Information as Spectacle. In: Briziarelli, M., & Armano, E. (eds.), The Spectacle 2.0: Reading Debord in the Context of Digital Capitalism. London: University of Westminster Press.

Propaganda Model of Herman and Chomsky reassessed in new title


Still relevant, useful and controversial after 30 years, Edward S. Herman and Noam Chomsky’s Propaganda Model is considered afresh in the age of Trump, digital media and social media manipulation. Published in UWP’s Critical Digital and Social Media Studies series, edited by Professor Christian Fuchs, the book is a wide-ranging examination of the topic.

Including a new interview with Edward S. Herman conducted before his passing in 2017, the book reassesses the model’s strengths and relative limitations, offers applications to the internet and the world of digital media, to sport and to screen entertainment, and presents specific case studies on topics as diverse as the 2008 financial crisis, austerity in Britain, Cuba, and the use of nuclear weapons. It suggests that there may be a case for considering new filters and outlines reasons for the model’s continuing explanatory power.

In 2009 Westminster Papers in Communication and Culture analysed the PM after 20 years.

Much has changed but has much also stayed the same?

Night Time Economy & Entertainment Licensing Law – abstracts requested by 10 November

UWP journal ESLJ welcomes abstracts for a special issue on the topic and the ‘cultural and commercial impact of entertainment and alcohol licensing schemes’. The deadline for abstracts has been extended to 10 November 2018.


Authors should engage with the role of legal stipulations and procedures, though interdisciplinary research and perspectives from other disciplines are certainly appropriate. Article types include the following:
  • Research Articles (up to 8,000 words)
  • Interventions (up to 4,000 words)
  • Commentaries (up to 4,000 words)
  • Reviews (approximately 2,000 words)

Full details are available in the previous blog announcement.