
Is the Gig Economy healthy?


That is the question posed in the fourth title in the Media Policy Brief series from the CAMRI Policy Observatory. In summary form it presents the results of a wide-ranging survey of musicians' mental health and patterns of work. It suggests that their experience, and that of other creative industries workers, may signal growing psychological problems for those operating under flexible working regimes as automation continues to rise. Well-Being and Mental Health in the Gig Economy: Policy Perspectives on Precarity makes the case for considering the mental health outcomes for gig economy workers of policies affecting labour markets in the UK's media and creative sectors. Authors Sally-Anne Gross, George Musgrave and Laima Janciute ask whether a more serious look at a universal basic income, as suggested by the likes of Guy Standing, is also called for.

Using Open Access and a concise, easy-to-read format, this peer-reviewed series aims to make new research from CAMRI media researchers at the University of Westminster available to the public, and to policymakers, practitioners, journalists, activists and scholars both nationally and internationally.


Recipes for understanding taste – new title in ‘Law and the Senses’ series


The second title in an ambitious new interdisciplinary series from the University of Westminster's Law and Theory Lab has been published. Called TASTE, it is one of five volumes that will explore the terrain of law and each of the five senses; SEE is already published, and both titles are available open access. In an extract from the book's introduction, a wide-ranging analysis of what taste means in terms of theory and law, Andrea Pavoni considers the medium of the 'recipe'. Uniquely, TASTE includes seven considerations of the ingredients of the recipe via specific examples, touching upon themes from Veblen's conspicuous consumption and 'foodstagramming' to abjection and disgust and the ambiguous rituals of food hospitality. The featured image illustrates the range of this unique feature of the title. To sample the recipes, download the full title, which is also available in paperback.

What is a recipe if not the gastro-normative artefact par excellence? A set of how-to instructions meant to adapt the contingency of cooking to the standard of a normative knowledge. Recipe, in Latin, is the imperative form 'take', and was the introductory formula of medical prescriptions. As Flandrin explains, it was only in the seventeenth century that gastronomy proper supplanted dietetics, cooking began to be assumed as an art rather than a medical science, and the hedonism of the 'gourmet' was liberated.† Yet the early, normative power of recipes remained in place. This had ossifying effects, Haden†† argues, vis-à-vis the parameters of taste, and often resulted in communicating a rationalised and standardised gastro-normativity, exemplified by the ideology of measurability and repeatability expressed in recipe cookbooks and, we may add, repeated and magnified in today's TV cooking shows. Camporese††† has emphasised the crucial role played by an 1891 recipe book by Pellegrino Artusi, The Science in the Kitchen and the Art of Eating Well, in producing a national consciousness in an Italy that, 30 years after unification, was still culturally and linguistically split among regional enclaves. It was a unification, however, that occurred in heavily asymmetric form. Artusi, a bourgeois from central Italy, crafted a series of recipes in which 'the politico-economical system, the social structure of his society, and the myth of bourgeois order' were carefully and paternalistically translated, along marked geographical, socio-economic and gender cleavages. The seven speculative recipes gathered in the second part of this volume aim in the opposite direction. They seek to disentangle taste, first, from its parochial entrapment in bourgeois enjoyment and, second, from its normatively atrophying ideology.
No longer a mechanism that preventively defuses contingency, the recipe is thus reconfigured as a tool aimed at detecting and unfolding the contingent frictions between the experience (of taste) and the culinary continuum of bodies and structures that shape it.

† Jean-Louis Flandrin, 'From Dietetics to Gastronomy: The Liberation of the Gourmet', in Food: A Culinary History, ed. Jean-Louis Flandrin and Massimo Montanari (New York: Columbia University Press, 1999).

†† Haden, 'Lionizing Taste'.

††† Piero Camporese, Alimentazione, Folclore, Società (Parma: Pratiche Editrice, 1980), 117 [my translation].

Welcome (and Farewell) to Freedom


Just published, Riccardo Baldissone's tour-de-force new book considers the meanings of liberty, freedom and related concepts, ranging from classical texts to the present. In the extract from the introduction below, the author explains some of the transformations associated with the word and why a new vocabulary might be helpful, even liberating. Farewell to Freedom is available in open-access digital editions and to order in print.

“Actually, the notion of freedom is not even a Platonic invention, as the Greek word ἐλευθερία [eleutheria] is previously attested in Pindar: Plato improves and systematizes an already active process of production of abstractions. Havelock associates this process with the construction of the first Greek written alphabetical language, in which the Socratic-Platonic semantic enquiries culminate. The book argues that before this process there is no literal freedom, but just free things, and then, free humans. When the word ἐλεύθερον [eleutheron], free, appears in the Homeric text, it does not grammatically refer to human subjects, but it metaphorically hints at their state: for example, we now translate the Homeric expression ἐλεύθερον ἧμαρ [eleutheron hēmar], literally free day, as the day of liberty, that is, the condition of freedom. Only in the fifth century BCE does the appearance of the word eleutheria in two Pindaric odes herald a series of neologisms, such as, for example, Thucydides’ αὐτονομία [autonomia], which we now render in English as ‘autonomy.’ These terms become part of a wide constellation of locutions that construct a plurality of freedoms: a similar constellation also revolves around the Latin words liber, free, and libertas, liberty. Later on, Christian authors such as Augustine identify a proper freedom and relocate it in the afterlife, whilst associating its mundane limited exercise with will. As compared with the Graeco-Roman and Germanic variously grounded notions of liberty and freedom, the Christian emphasis on individual salvation takes further the Stoic and Neoplatonist retreat towards interiority, and it produces a radical decontextualization of personal choice. After the turn of the first Christian millennium, medieval theological debates focus on freedom both as a divine faculty and as a secular practice.
The latter aspect is also developed by lay legal scholars and political thinkers, following the recovery of Roman law codes and Greek philosophical texts. Paradoxically, Luther and Calvin’s stress on predestination then allows the redirection of individual agency towards worldly tasks, and its unlimited expansion. As early modern constructions of freedom emerge from a clash of religious fundamentalisms, despite their claim of absolute novelty they often recast medieval theological notions. However, seventeenth-century English parliamentary debates also revive the Roman phraseology of slavery, in order to articulate the concept of freedom as absence of dependence. This concept is formulated by Hobbes on the model of the new physics. In the eighteenth century, Rousseau follows Hobbes in reshaping medieval mystical bodies in the form of the general will. Moreover, he redefines freedom as obedience to a self-prescribed rule. Similarly, Kant claims absolute autonomy through a voluntary subsumption of the individual under the universal.

German idealist thinkers’ inflation of the concept of freedom reveals it as a mere hyperbole, which can be realised either as absolute compulsion or in the absence of others. Hegel endeavours instead to capture freedom within a framework of evolving historical necessity. The reaction to the Hegelian dynamic totalization opens the way to a variety of theoretical challenges to the very notions of subject and will, which are the foundations of the medieval and modern constructions of freedom. From Stirner on, a veritable fault-line opens up in Western thought between the pursuit of a conceptual definition of liberty and the attempt to rethink freedom as the human production of novelty. Whilst Marx anchors this production to material processes, Nietzsche takes further Stirner’s questioning of ideas by challenging the unity of the Western subject. Nietzsche’s effort to reconstruct conceptual entities as processes allows us to revise the discourses of freedom in terms of human practices. In particular, a radical shift of the very locus of freedom and autonomy results from a double change of theoretical focus: Simondon rethinks individuals as processes of individuation, and Foucault constructs subjects as processes of subjectivation. These processual approaches undermine the raison d’être of the notions of freedom and autonomy: regulative properties such as freedom and autonomy only apply to an enclosed and self-consistent entity – the individual, or the collective – as distinct from others, and they cannot fit subjectivation processes that are based on the constitutive participation with others. Hence, a new theoretical lexicon is needed to strike a dia-nomous middle path between autonomous and heteronomous alternatives: such a relational third way requires likewise relational notions.
Of course, it may seem impossible to transcend the horizon of freedom: the very plurality of the discourses of liberty may rather appear to justify the hope in some understanding of freedom that transcends its pervasive neoliberal version. Nevertheless, even more articulate discourses of liberty can hardly face our current challenges, in both the public and the private sphere. For example, these discourses still claim the freedom to exercise an absolute power over oneself – a mastery that in fact is their paradoxical cornerstone. If the discourses of freedom appear exhausted and even counterproductive, couldn’t we treasure instead the neoliberals’ unwitting demonstration of the performative power of words, and thus realise that other words may help catalyse other (and participative) practices? In this case, we could take advantage of our knowledge of the past to construct a different vocabulary, which may empower us to claim the life that we all deserve”.

(Reproduced without footnotes. The full text of the opening chapter ‘Antiquities before Christianities’ is available from the publisher’s site to view and download). DOI:

Social Media Counters and Metrics: Measurement at all Costs?


‘The one question I keep returning to is whether we can dispense
with social media counters entirely’.

 Kane X. Faucher’s new book Social Capital Online (available open access in the CDSMS series from UWP) considers the dominant role of quantification in social media environments and how we end up competing for dubious forms of digital ‘social capital’. He explains:

An obsession with metrics pervades much of the private and public sector, and is paralleled on popular social media. It is the promise of metrics that sees so many place an inviolable faith in their ability to increase efficiency and effectiveness and to offer ready tools for benchmarking and box-ticking. Worse still is the promise that metrics will facilitate better prediction and can be used as a directional planning tool. There is no doubt that measurement is indispensable in the sciences and engineering. The problem arises when metrics are applied widely to domains such as social media. When it comes to metrics, what we measure, how we measure, and why we measure it are equally essential questions. With social media, we have an abundance of metrics – some visible, others requiring some digging, and still others entirely invisible to the public.

A quick rundown makes this clear:

  1. Visible Metrics: On Facebook or other networks, it may seem easy to assign a value to any user by the number of friends or likes accumulated. It can be an easy way of determining popularity or relevance in a socially competitive field – a process not dissimilar to casting ballots regularly. Motives for assigning a ‘like’ vary as widely as the reasons why people support a politician in elections. And yet, because of the presence of these visible metrics, there are notable behavioural changes in the way some people operate on social media: being conspicuous in their online production, managing their reputation, and effectively campaigning for the most ‘votes’ on their content. But, unlike an election, there is no end date to the campaign; any sense of victory is fleeting. Users may adopt riskier behaviour in order to garner more attention and a higher ‘score.’ Businesses try to increase their social media scores believing that this will convert to customers, then sales. As a metric, this may be flawed or mere correlation.
  2. Less Visible Metrics: Services (some free, others paid) will provide loads of metrics on numbers of impressions, clickthroughs, etc. Google Analytics provides a welter of data on the demographics of visitors to a site, what operating systems they use, the flow-through of the pages users visit, and for how long. This quasi-cybernetic affordance can give a website operator guidance by which to reconfigure parts of the website to optimize visits, encourage longer stays, and improve the ‘experience.’ YouTube provides similar metrics, notably CPM (cost per thousand impressions). Other metrics can also be calculated, such as a dollar value for a social media account. Klout and other companies may tell us how much a person’s tweet is ‘worth’ and the overall value of the account itself. These are potential values, but it is unclear what they mean. Assigning a dollar value to a collectible item is usually a reflection of the market and what others are willing to pay; on social media, there is no sense of true exchange value whereby a user can sell their account or tweet. Sure, there are plenty of celebrities like Kim Kardashian who will charge a set fee for promoting a product or service on their social media accounts, and so perhaps that lip-service endorsement can result in sales. But this is little different from traditional forms of celebrity endorsement in other media venues, apart from it being digital and potentially reaching a wider audience.
  3. ‘Invisible’ Metrics: Facebook is able to automate the process of counting interactions and drill down into data that compares what you mention to your demographic information. This results in the creation of ‘buckets’ that businesses can pay to access in order to better refine their target marketing. Algorithms simplify this process, but it is not an exceptionally sophisticated one, despite the conspiratorial chatter about how we are being ‘controlled’ by social media. Obviously there are behaviour-shaping elements on social media that strongly resemble conditioning. There is also a strong availability heuristic at play in how these social media sites decide on our behalf what content we will see in the newsfeed, which may keep us sequestered in our filter bubbles. It was not long ago that Facebook conducted its own behavioural experiment, selecting a number of users (without their knowledge or explicit consent) and showing them positive or negative posts while observing their behaviour.

Figures – Donald Trump to name one – may have tens of millions of followers on Twitter, but it would be a mistake to believe every one of them endorses his views or supports him. A good number may follow his tweets out of public interest, for comedy, to troll him, or because their job (such as being in the media) requires it. Sentiment analysis of engagement may help us understand whether those followers are supporters or not. Despite all of this, assigning a value on the basis of a raw score is flawed because there is no consensus on what we mean by value. It is as rough and ready as valuing another human being on the basis of how much money they have in the bank.

Algorithms: Mystery but no magic

As a predictive tool, social media counters are far from perfect. What is popular now will not necessarily remain so. At one point #Kony2012 was the top trending hashtag on Twitter, but the fortunes of that organization changed quickly. And yet metrics are considered an essential ingredient in recommender systems designed to get us to purchase similar products based on the purchasing habits of those who have been placed in a similar data bucket. When the term social media algorithm is mentioned there is a kind of magical understanding: that it all occurs in a black box heavily guarded by complex streams of code.

Worse, it isn’t even scientific, but a kind of pseudoscience. The sorcery involved really covers the fuzziness of the operation. It also completely disregards the old GIGO principle (garbage in/garbage out), as it does not measure or produce anything all that meaningful. There is absolutely nothing mysterious or magical about algorithms. Running your finances through a spreadsheet would qualify as an algorithm. A simple Turing test is an algorithm. A good algorithm is a feedback loop that does not require human intervention. It would be an exercise in futility to task a human being with calculating on the fly the trajectory of a missile in order to shoot it down. GPS operates as a feedback system, whereas the ABS on your car is a feed-forward system using actuators.
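
To make the feedback/feed-forward distinction concrete, here is a minimal toy sketch (not from the book; the thermostat scenario and all names are illustrative assumptions). A feedback loop measures its own output and corrects the next action; a feed-forward rule acts from a prior model and never checks the result.

```python
def feedback_thermostat(temp, target=20.0, gain=0.5, steps=20):
    """Feedback: repeatedly measure the error and correct for it."""
    for _ in range(steps):
        error = target - temp   # measure the actual outcome
        temp += gain * error    # use it to correct the next action
    return temp

def feedforward_heater(outside_temp, model_offset=15.0):
    """Feed-forward: apply a pre-computed model of 'heat needed';
    never checks the resulting room temperature."""
    return outside_temp + model_offset

room = feedback_thermostat(temp=5.0)          # converges toward the target
guess = feedforward_heater(outside_temp=5.0)  # right only if the model is right
```

The feedback version homes in on the target regardless of where it starts; the feed-forward version is only as good as its built-in model, which is the fragility the passage attributes to social media algorithms.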

The algorithms in use by the likes of Facebook are not feedback loops, but feed-forward. They assume some models of human behaviour, but they cannot fully calculate the variance between groups. What they sell in terms of data is limited and not a feedback tool for making useful predictions. As such, it is unstable and its results are hit and miss. The dream of predicting the behaviour of crowds is an old one, and it continues to thrive in excitable statements such as Google’s claim that human beings are programmable. Our behaviour can be shaped through persuasive techniques, but the outcomes are not foolproof.

At best, these algorithms aim to recognize patterns, and then take action on the basis of those patterns. This is little different from the actuarial tables used to determine insurance premiums from past data, where someone of a certain age, gender, location, etc., is matched against comparative mortality statistics. Such tables require frequent adjustment, but they assume in advance a set of conditions in order to calculate the premium. In the case of social media, assumptions are applied to groups who share some characteristics, but the process is akin to throwing something at the wall to see if it sticks. If, say, the algorithm detects a pattern where 20-year-old women are more likely to purchase a Mac than a Windows computer, the ads in the sidebar will aim to reflect that pattern in order to produce that result by increasing its probability. It is a little like adjusting the controls of an experiment to arrive at the result one desires.
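
The bucket-matching described above can be caricatured in a few lines. This is a hypothetical sketch, not Facebook's actual system: the buckets, age ranges and ads are all invented for illustration.

```python
# Each 'bucket' is a demographic pattern plus the ad assumed to fit it,
# read off much like a premium from an actuarial table.
BUCKETS = [
    {"age_range": (18, 24), "gender": "female", "ad": "Mac laptop"},
    {"age_range": (18, 24), "gender": "male",   "ad": "Windows laptop"},
    {"age_range": (25, 99), "gender": None,     "ad": "office software"},  # None = any
]

def pick_ad(age, gender):
    """Return the ad of the first bucket whose pattern matches the user."""
    for bucket in BUCKETS:
        lo, hi = bucket["age_range"]
        if lo <= age <= hi and bucket["gender"] in (None, gender):
            return bucket["ad"]
    return "generic ad"  # fallback when no pattern fits

pick_ad(20, "female")  # falls into the first bucket, so the Mac ad is shown
```

Note that the pattern only sets a prior expectation; as the passage says, the system then serves the ad hoping to raise the probability of the behaviour it detected, adjusting the buckets when the ad fails to resonate.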

One analogy that may serve to illustrate this operation is an assembly line where, for example, every 10th widget is inspected for quality control. There is a ‘model widget’ against which each is compared, and if a defective one is found the assembly line is shut down and the cause of the defect investigated. Applied to social media: if the ad is not resonating with the targeted group, the algorithm is reconfigured. This process can be refined further by getting user input, such as with Google Ad choices, where we have the opportunity to say whether an ad was relevant to us. The algorithms at play on social media assume we conform to the model widget, depending on which bucket we’ve been placed in. There is nothing sinister or spooky about this kind of machine learning. What is objectionable is how all our interactions are logged, tabulated, and then syndicated across our networks behind a gamified environment where our labour is obfuscated as leisure activity in a high-trust milieu. Rather than a McLuhan ‘global village,’ the glowingly optimistic pronouncements about social media in its shining ubiquity are more aptly viewed as a Potemkin village, where so much social activity and connectedness obscures the very real power dynamics of capitalism, data capture, and cutthroat competition for attention and value determined by sheer numbers alone.

Only a Numbers Game

The one question I keep returning to is whether we can dispense with social media counters entirely. As much as it may provide a temporary ego-boost, jockeying for more ‘points’ seems to undercut the true value of generating online social capital: the ability to organize, mobilize, share, and connect with others in a social venue.  To run up our scores is really to do the work of social media sites, with these scores as the token payment for our labour. Can we not appreciate the intrinsic value of sharing our content without judging it by the number of people who clicked or tapped their approval? Can we make use of social media without so quickly rushing to commodify and brand ourselves? The answer to those questions is certainly yes, but it is something we would have to elect to do while putting pressure on popular social media platforms to simply remove these counting features.

Whenever we engage in the games of online social capital on a purely numerical basis, we may be feeding egos with token scores, but we are also feeding the machine, helping it refine its pattern recognition so as to restrict our choices and persuade us to support particular viewpoints or purchase a product or service. It becomes clear that the incentive for including these counters is to increase the time we spend on social media, while masking the labour we perform behind a kind of competitive game.

Dr Kane X. Faucher teaches at the Faculty of Information and Media Studies, Western University, Ontario, Canada


Is Social Media too Competitive by Design? New title Social Capital Online released.


This is just one question Kane X. Faucher addresses in his new book Social Capital Online, released today. The book is the seventh title in the Critical Digital and Social Media Studies series, following the recent appearance of The Big Data Agenda.

A work of critical media studies, it examines the idea of social capital within the new ‘network spectacle’ of digital capitalism, drawing on the ideas of Marx, Veblen, Debord (see also Spectacle 2.0), Baudrillard, Deleuze and others. The book concludes by considering what could be done to address the pathologies of the online obsession with accumulation and status, and the alienation that follows.

The series (all open access) now consists of the following titles, in reverse order of publication:

Series Editor: Christian Fuchs

Social Capital Online: Alienation and Accumulation

Kane X. Faucher

The Big Data Agenda: Data Ethics and Critical Data Studies
Annika Richterich

Spectacle 2.0: Reading Debord in the Context of Digital Capitalism
edited by Marco Briziarelli and Emiliana Armano

Capital, State, Empire: The New American Way of Digital Warfare
Scott Timcke

Politicizing Digital Space: Theory, the Internet, and Renewing Democracy
Trevor Garrison Smith

Knowledge in the Age of Digital Capitalism: An Introduction to Cognitive Materialism
Mariano Zukerfeld

Critical Theory of Communication: New Readings of Lukács, Adorno, Marcuse and Habermas in the Age of the Internet

Christian Fuchs