Social Media Counters and Metrics: Measurement at all Costs?


‘The one question I keep returning to is whether we can dispense with social media counters entirely’.

 Kane X. Faucher’s new book Social Capital Online (available open access in the CDSMS series from UWP) considers the dominant role of quantification in social media environments and how we end up competing for dubious forms of digital ‘social capital’. He explains:

An obsession with metrics pervades much of the private and public sector, and is paralleled on popular social media. It is the promise of metrics that sees so many place an inviolable faith in their ability to increase efficiency and effectiveness, and to offer ready tools for benchmarking and box-ticking. Worse still is the promise that metrics will facilitate better prediction and can be used as a directional planning tool. There is no doubt that measurement is indispensable in the sciences and engineering. The problem arises when metrics are applied widely to domains such as social media. When it comes to metrics, what we measure, how we measure and why we measure it are equally essential questions. With social media, we have an abundance of metrics – some visible, others requiring some digging, and still others entirely invisible to the public.

A quick rundown makes this clear:

  1. Visible Metrics: On Facebook and other networks, it may seem easy to assign a value to any user by the number of friends or likes accumulated. It can be an easy way of determining popularity or relevance in a socially competitive field – a process not dissimilar to casting ballots regularly. Motives for assigning a ‘like’ vary as widely as the reasons people support a politician at election time. And yet, because of the presence of these visible metrics, there are notable behavioural changes in the way some people operate on social media: being conspicuous in their online production, managing their reputations, and effectively campaigning for the most ‘votes’ on their content. But, unlike an election, there is no end date to the campaign; any sense of victory is fleeting. Users may adopt riskier behaviour in order to garner more attention, a higher ‘score’. Businesses try to increase their social media score believing that this will convert to customers, then sales. As a metric, this may be flawed, or the relationship merely correlation.
  2. Less Visible Metrics: Services (some free, others paid) will provide a wealth of metrics on numbers of impressions, clickthroughs and so on. Google Analytics provides a welter of data on the demographics of visitors to a site, what operating systems they use, the flow-through of the pages users visit, and for how long. This quasi-cybernetic affordance can give a website operator guidance by which to reconfigure parts of the website to optimize visits, encourage longer stays, and improve the ‘experience’. YouTube provides similar metrics, notably CPM (cost per thousand impressions). Other metrics can also be calculated, such as a dollar value placed on a social media account. Klout and other companies may tell us how much a person’s tweet is ‘worth’ and the overall value of the account itself. These are potential values, but it is unclear what they mean. Assigning a dollar value to a collectible item is usually a reflection of the market and what others are willing to pay; on social media, there is no sense of true exchange value whereby a user can sell their account or tweet. Sure, there are plenty of celebrities like Kim Kardashian who will charge a set fee for promoting a product or service on their social media accounts, and perhaps that lip-service endorsement can result in sales. But this is little different from traditional forms of celebrity endorsement in other media venues, apart from it being digital and potentially reaching a wider audience.
  3. ‘Invisible’ Metrics: Facebook is able to automate the process of counting interactions and to drill down into data that compares what you mention with your demographic information. The result is the creation of ‘buckets’ that businesses can pay to access in order to better refine their target marketing (a minimal sketch of this kind of bucketing, together with a simple visible engagement metric, follows this list). Algorithms simplify this process, but it is not an exceptionally sophisticated one, despite the conspiratorial chatter about how we are being ‘controlled’ by social media. Obviously there are behaviour-shaping elements on social media that strongly resemble conditioning. There is also a strong availability heuristic at play in how these sites decide on our behalf what content we will see in the newsfeed, which may keep us sequestered in our filter bubbles. It was not long ago that Facebook conducted its own behavioural experiment, selecting a number of users (without their knowledge or explicit consent) and showing them predominantly positive or negative posts while observing their behaviour.
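
To make the counting concrete, here is a minimal sketch in Python of the kind of demographic ‘bucketing’ and simple engagement scoring described above. It is not Facebook’s actual method; the profile fields, segment names and figures are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    gender: str
    interests: set          # hypothetical profile fields, invented for this sketch

def bucket(user: User) -> str:
    """Assign a user to a coarse demographic 'bucket' that advertisers
    could pay to target. Segment names here are invented for illustration."""
    band = f"{(user.age // 10) * 10}s"                    # e.g. '20s', '30s'
    topic = "tech" if "computers" in user.interests else "general"
    return f"{user.gender}-{band}-{topic}"

def engagement_rate(likes: int, comments: int, shares: int, followers: int) -> float:
    """A typical 'visible' metric: interactions per follower."""
    return (likes + comments + shares) / max(followers, 1)

u = User(age=23, gender="f", interests={"computers", "music"})
print(bucket(u))                                          # f-20s-tech
print(engagement_rate(120, 14, 6, followers=2500))        # 0.056
```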

Figures – Donald Trump, to name one – may have tens of millions of followers on Twitter, but it would be a mistake to believe every one of them endorses his views or supports him. A good number may follow his tweets out of public interest, for comedy, to troll him, or because their job (such as working in the media) requires it. Sentiment analysis of engagement may help us understand whether those followers are supporters or not. Despite all of this, assigning a value on the basis of a raw score is flawed because there is no consensus on what we mean by value. It is as rough and ready as saying another human being can be given a value on the basis of how much money they have in their bank account.
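
A toy illustration of what such sentiment analysis might look like, using a hand-made keyword lexicon rather than any real classifier (the word lists and replies below are invented):

```python
import re

# Toy sentiment scoring of follower replies: a positive score suggests a
# supporter, a negative score a critic. Real sentiment analysis uses trained
# classifiers; these word lists are invented purely for illustration.
POSITIVE = {"great", "agree", "support", "love"}
NEGATIVE = {"disgrace", "lies", "resign", "shame"}

def sentiment(text: str) -> int:
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

for reply in ["Totally agree, great speech", "What a disgrace, full of lies"]:
    label = "supporter?" if sentiment(reply) > 0 else "critic?"
    print(f"{label:11} {reply}")
```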

Algorithms: Mystery but no magic

As a predictive tool, social media counters are far from perfect. What is popular now will not necessarily remain so. At one point #Kony2012 was the top trending hashtag on Twitter, but the fortunes of that organization changed quickly. And yet metrics are considered an essential ingredient in recommender systems designed to get us to purchase similar products based on the purchasing habits of those who have been placed in a similar data bucket. When the term social media algorithm is mentioned, there is a kind of magical understanding that it all occurs in a black box heavily guarded by complex streams of code.
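
A minimal sketch of the bucket-based recommendation logic described here, with invented purchase data, might look like the following; real recommender systems are far more elaborate, but the principle of matching a user against the habits of their bucket is the same.

```python
from collections import Counter

# Purchase histories keyed by the demographic bucket each buyer was placed in.
# All data here is invented for illustration.
purchases_by_bucket = {
    "f-20s-tech": [["laptop", "headphones"], ["laptop", "phone case"]],
    "m-30s-general": [["kettle", "novel"], ["novel", "slippers"]],
}

def recommend(bucket: str, already_owned: set, top_n: int = 2) -> list:
    """Suggest whatever people in the same bucket bought most often,
    minus items the user already has - the core of a naive recommender."""
    counts = Counter(item
                     for basket in purchases_by_bucket.get(bucket, [])
                     for item in basket
                     if item not in already_owned)
    return [item for item, _ in counts.most_common(top_n)]

print(recommend("f-20s-tech", already_owned={"laptop"}))  # ['headphones', 'phone case']
```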

Worse, it isn’t even scientific, but a kind of pseudoscience. The sorcery involved really covers over the fuzziness of the operation. It also completely disregards the old GIGO principle (garbage in, garbage out), as it does not measure or produce anything all that meaningful. There is absolutely nothing mysterious or magical about algorithms. Running your finances through a spreadsheet would qualify as an algorithm. A simple Turing test is an algorithm. A good algorithm is a feedback loop that does not require human intervention. It would be an exercise in futility to task a human being with calculating on the fly the trajectory of a missile in order to shoot it down. GPS operates as a feedback system, whereas the ABS on your car is a feed-forward system using actuators.
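
To make the feedback idea concrete, here is a minimal, self-contained sketch of a closed feedback loop in the thermostat sense: the output is measured and fed back to adjust the next input, with no human in the loop (the figures are arbitrary).

```python
def thermostat(target: float, current: float, steps: int = 5) -> float:
    """A closed feedback loop: measure the output, compare it with the target,
    and correct the next input proportionally - no human intervention needed."""
    for _ in range(steps):
        error = target - current      # the measurement is fed back...
        current += 0.5 * error        # ...and used to adjust the next state
        print(f"temperature now {current:.2f}")
    return current

thermostat(target=21.0, current=16.0)  # converges towards 21.0
```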

The algorithms in use by the likes of Facebook are not feedback loops, but feed-forward. They assume certain models of human behaviour, but they cannot fully calculate the variance between groups. What they sell in terms of data is limited and not a feedback tool for making useful predictions. As such, it is unstable and its results are hit and miss. The dream of predicting the behaviour of crowds is an old one, and it continues to thrive in excitable statements such as Google’s that human beings are programmable. Our behaviour can be shaped through persuasive techniques, but the outcomes are not foolproof.

At best, these algorithms aim to recognize patterns, and then take action on the basis of those patterns. This is little different from using actuarial tables to determine insurance premiums on the basis of past data, where someone of a certain age, gender, location and so on is matched against comparative mortality statistics. Such tables require frequent adjustment, but they assume in advance a set of conditions in order to calculate the premium. In the case of social media, assumptions are applied to groups who share some characteristics, but the process is akin to throwing something at the wall to see if it will stick. If, say, the algorithm detects a pattern whereby 20-year-old females are more likely to purchase a Mac as opposed to a Microsoft computer, the ads in the sidebar will aim to reflect that pattern in order to produce that result by increasing its probability. It is a little like adjusting the controls of an experiment to arrive at the result one desires.
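
A minimal sketch of that kind of pattern-to-ad rule, with invented segments and click-through estimates standing in for whatever the platform has actually learned, might look like this:

```python
# Invented click-through estimates per (segment, ad) pair - standing in for
# the 'pattern' the platform is assumed to have detected from past behaviour.
learned_ctr = {
    ("f-20s", "mac_ad"): 0.031,
    ("f-20s", "windows_ad"): 0.012,
    ("m-40s", "mac_ad"): 0.009,
    ("m-40s", "windows_ad"): 0.020,
}

def pick_ad(segment: str) -> str:
    """Serve the ad with the highest estimated click-through rate for the
    segment, thereby reinforcing the very pattern it was built on."""
    candidates = {ad: ctr for (seg, ad), ctr in learned_ctr.items() if seg == segment}
    return max(candidates, key=candidates.get)

print(pick_ad("f-20s"))   # mac_ad
```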

One analogy that may serve to illustrate this operation is an assembly line where, for example, every 10th widget is inspected for quality control. There is a ‘model widget’ against which each sample is compared, and if a defective one is found the assembly line is shut down and the cause of the defect is investigated. Applied to social media, if the ad is not resonating with the targeted group, the algorithm is reconfigured. This process can be further refined by getting user input, such as with Google’s AdChoices, where we have the opportunity to say whether an ad was relevant to us. The algorithms at play on social media assume we conform to the model widget, depending on which bucket we’ve been placed in. There is nothing sinister or spooky about this kind of machine learning. What is objectionable is how all our interactions are logged, tabulated, and then syndicated across our networks behind a gamified environment where our labour is obfuscated as leisure activity in a high-trust milieu. Rather than a McLuhan ‘global village’, the glowingly optimistic pronouncements about social media in its shining ubiquity are more aptly viewed as a Potemkin village, where so much social activity and connectedness obscures the very real power dynamics of capitalism, data capture, and cutthroat competition for attention and value determined by sheer numbers alone.
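
Translated into code, the ‘every 10th widget’ idea might look like the following toy monitor, which samples ad impressions and flags the campaign for reconfiguration when the sampled click-through rate falls below a threshold; all figures are invented for the sketch.

```python
import random

def campaign_needs_rework(impressions: int = 1000, sample_every: int = 10,
                          min_ctr: float = 0.01) -> bool:
    """Inspect every Nth impression - the 'every 10th widget' check - and flag
    the campaign for reconfiguration if the sampled click-through rate is too
    low. The 0.8% click probability below is invented for this sketch."""
    random.seed(42)                                # reproducible toy data
    sampled = clicks = 0
    for i in range(impressions):
        if i % sample_every == 0:                  # inspect this 'widget'
            sampled += 1
            clicks += random.random() < 0.008      # simulated click (True == 1)
    ctr = clicks / sampled
    print(f"sampled CTR = {ctr:.3f}")
    return ctr < min_ctr                           # True: reconfigure the ad

print("needs reconfiguring:", campaign_needs_rework())
```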

Only a Numbers Game

The one question I keep returning to is whether we can dispense with social media counters entirely. As much as it may provide a temporary ego-boost, jockeying for more ‘points’ seems to undercut the true value of generating online social capital: the ability to organize, mobilize, share, and connect with others in a social venue.  To run up our scores is really to do the work of social media sites, with these scores as the token payment for our labour. Can we not appreciate the intrinsic value of sharing our content without judging it by the number of people who clicked or tapped their approval? Can we make use of social media without so quickly rushing to commodify and brand ourselves? The answer to those questions is certainly yes, but it is something we would have to elect to do while putting pressure on popular social media platforms to simply remove these counting features.

Whenever we engage in the games of online social capital on a purely numerical basis, we may be feeding egos with token scores, but we are also feeding the machine, helping it to refine its pattern recognition so as to restrict our choices and persuade us to support particular viewpoints or purchase a product or service. It becomes clear that the incentive for including these counters is to increase the time we spend engaged in social media, while masking the labour we perform behind a kind of competitive game.

Dr Kane X. Faucher teaches at the Faculty of Information and Media Studies, Western University, Ontario, Canada

 

Communications and Geography: An Ever-Closer Union?


Here, in an extract from his editorial in the latest issue* of Westminster Papers in Communication and Culture, Doug Specht reflects on how space has not gone away or been ‘annihilated’. He considers how communication theories may help us understand a world in which maps of all kinds are being reconfigured with the aid of the users and suppliers of Big Data, as space in all dimensions is mediated and reshaped.

Late twentieth-century communication and information technologies have produced such a blurring of what is real and what is representation that the two can no longer be distinguished (Corner, 1999), leading to persistent questions over how human behaviour is constituted through space and time, and within specific social contexts (Dear, 1988). Our mappings of the world, be they cartographic representations and data visualizations (Space-in-Media) or mediated senses of place (Place-in-Media and Media-in-Place), sit in between the virtual and the physical – a distinction not to be confused with one between real and fake, ‘as we would not claim that our bodies are real while our minds are fake’ (Smith, 2017: 30). ‘Did you find the world or did you make it up?’ asked Winnicott (cited in Corner, 1999) – a salient question indeed.

The information super-highway agenda of the 1990s was designed to change the very fabric of society (Robins, 1997), to create a homogenized flow of communications transcending geography (Greig, 2001). This postmodern condition of ‘space-time compression’ (Harvey, 1989) would annihilate space. Yet space has not disappeared; it has re-established itself in new spheres, created of ever larger data and increasingly mediated, and must therefore be understood through the use of semiotic and communication theories, such as the Marxist spatial frameworks of Castells and Lefebvre, or the Ideologiekritik of the Frankfurt School (Lagopoulos, 1993). The postmodern creates tensions between all theories in the attempt to best understand the conditions of existence; at its core, perhaps, lies the dialectic between space and society – a geographical puzzle in which structures, institutions and human agents operate on different scales to define spatial patterns in any given locale (Dear, 1988). The individual does not disappear in the midst of the social effects caused by the pressures of the masses, but is instead affirmed (Lefebvre, 1991).

It is seeing that establishes our place in the surrounding world; we explain that world with words, but words can never undo the fact that we are surrounded by it. As Fuchs (2018) states: ‘means of communication are (just like social space) means of production through which humans produce social relations and therefore also social space’ (p. 19). The relation between what we see and what we know is never settled. Each evening we see the sun set. We know that the earth is turning away from it. Yet the knowledge, the explanation, never quite fits the sight (Berger, 1972/2008).

While human geography has always been a maze of diverse interests (Dear, 1988), the use of Geographic Information has changed dramatically in the past decade, and continues to do so; increasingly it is used in mediated practices, to shape stories, to transcend boundaries and to develop new ethereal networks, as well as to produce maps. But even in those maps, users themselves are being encouraged to crowdsource data, be that to add to the ‘usefulness of the map’ or to create counter-maps. Data has become the standard way in which the world is ordered (Thatcher and Dalton, 2017), with data that link location and temporal information seen as fixes for capitalism’s tendencies towards over-accumulation (Greene and Joseph, 2015). As the scholars in this issue demonstrate, there is much to be gained from combining communications theories with those from the geographic disciplines.
Bringing the two together allows for an alternative, nuanced and spatially grounded approach to envisioning the myriad ways in which the digital age mediates social, economic and political experiences, in particular in the increasingly technologically informed media and communications sector.

[*‘Geography and Communications’ – the full open access issue can be viewed or downloaded at the WPCC website.]

REFERENCES

Berger, J. (1972/2008). Ways of Seeing. London: Penguin UK.

Corner, J. (1999). The Agency of Mapping: Speculation, Critique and Invention. In: Dodge, M., Kitchin, R., & Perkins, C. (eds.), The Map Reader: Theories of Mapping Practice and Cartographic Representation, 213–252. Chichester: Wiley-Blackwell.

Dear, M. (1988). The postmodern challenge: Reconstructing human geography. Transactions of the Institute of British Geographers, 262–274. DOI: https://doi.org/10.2307/622990  

Fuchs, C. (2018). Henri Lefebvre’s theory of the production of space and the critical theory of communication. Communication Theory, 1–22. DOI: https://doi.org/10.1093/ct/qty025  

Greene, D. M., & Joseph, D. (2015). The digital spatial fix. tripleC: Communication, Capitalism & Critique, 13(2): 223–247. DOI: https://doi.org/10.31269/triplec.v13i2.659 

Harvey, D. (1989). The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Malden, MA: Blackwell.

Lagopoulos, A. P. (1993). Postmodernism, geography, and the social semiotics of space. Environment and Planning D: Society and Space, 11(3): 255–278. DOI: https://doi.org/10.1068/d110255

Lefebvre, H. (1991). The Production of Space. Oxford: Blackwell.

Robins, K. (1997). The new communications geography and the politics of optimism. Soundings 5, 191–202.

Smith, T. G. (2017). Politicizing Digital Space. London: University of Westminster Press. DOI: https://doi.org/10.16997/book5

Thatcher, J., & Dalton, C. M. (2017). Data Derives: Confronting Digital Geographic Information as Spectacle. In: Briziarelli, M., & Armano, E. (eds.), The Spectacle 2.0: Reading Debord in the Context of Digital Capitalism. London: University of Westminster Press. DOI: https://doi.org/10.16997/book11.h

Propaganda Model of Herman and Chomsky reassessed in new title


Still relevant, useful and controversial after 30 years, Edward S. Herman and Noam Chomsky’s Propaganda Model is considered afresh in the age of Trump, digital media and social media manipulation. Published within UWP‘s Critical Digital and Social Media Studies series, edited by Professor Christian Fuchs, the book is a wide-ranging examination of the topic.

Including a new interview conducted with Edward S. Herman before his passing in 2017, the book reassesses the model’s strengths and relative limitations, and offers applications to the internet and the world of digital media, as well as to sport and screen entertainment. In addition, it presents specific case studies on topics as diverse as the 2008 financial crisis, austerity in Britain, Cuba and the use of nuclear weapons. It suggests that there may be a case for considering new filters and outlines reasons for the model’s continuing explanatory power.

In 2009 Westminster Papers in Communication and Culture analysed the Propaganda Model after 20 years.

Much has changed but has much also stayed the same?

Night Time Economy & Entertainment Licensing Law – abstracts requested by 10 November

UWP journal ESLJ is welcoming abstracts for a special issue on the topic and the ‘cultural and commercial impact of entertainment and alcohol licensing schemes’. The deadline for abstracts has now been extended to 10 November 2018.


Authors should engage with the role of legal stipulations and procedures, though interdisciplinary research and perspectives from other disciplines are certainly appropriate. Article types include the following:
  • Research Articles (up to 8,000 words)
  • Interventions (up to 4,000 words)
  • Commentaries (up to 4,000 words)
  • Reviews (approximately 2,000 words)

Full details at https://www.entsportslawjournal.com/announcement/ and in the previous blog announcement.

 

 

 

Amilcar Herrera prize won by Knowledge in the Age of Digital Capitalism by Mariano Zukerfeld


The association ESOCITE (Asociación Latinoamericana de Estudios Sociales de la Ciencia y la Tecnología) has honoured UWP author Mariano Zukerfeld in its best book category. The Amilcar Herrera Prize, awarded to the best book by an established author in the association’s field of social studies of science and technology, was presented at its annual conference, held this year in Santiago, Chile.

Also, next week, under the auspices of the Cambridge-based Centre for Research in the Arts, Social Sciences and Humanities, the Culture, Politics and Global Justice research cluster welcomes all to join a reading group which will look at the first two chapters of the book: Chapter 1, ‘Capitalism, Physical Property and Intellectual Property’ (1–30), and Chapter 2, ‘How to Know Knowledge? Introducing Cognitive Materialism’ (31–52). The book is available to download from UWP’s website as a PDF, ePub or Kindle edition.

16 October 2018, 16:00 – 18:00 Mary Allan Building, Homerton College

 

 

Silk Road journal launched with call for papers


UWP’s third journal title Silk Road: A Journal of Eurasian Development was launched last week with a call for papers.  The journal will ‘promote evidence-based scholarly research in social sciences and public policy studies that make the affairs of the Great Silk Road countries an area of significant interest, scholarship and impact.’

The journal’s editorial team is headed by Joint Editors-in-Chief Professor Peter Catterall of the University of Westminster and Charles Becker of the Department of Economics, Duke University. The journal is based at Westminster International University in Tashkent (pictured), where Bakhrom Mirkasimov, Dean of Research, will act as Silk Road‘s Managing Editor. Submissions for the first issue are due by 1 December 2018.

 

 

What to do about the Gig Economy and Mental Health


The latest CAMRI Policy Brief considers policy perspectives on precarity in the light of the findings of the largest nationwide survey of its kind into the impact of working conditions in the UK music industry.

Authors Sally-Anne Gross and George Musgrave recommend more education regarding mental health challenges in precarious careers, access to mental health support for gig economy workers and, in the long term, a Universal Basic Income to address the challenge.

Read or download.

The CAMRI Policy Briefs series from the CAMRI Policy Observatory.