Some recent-ish publications

Experimental Publishing Compendium

Combinatorial Books: Gathering Flowers (book series)

'How To Be A Pirate: An Interview with Alexandra Elbakyan and Gary Hall by Holger Briel'.

'Experimenting With Copyright Licences' (blogpost for the COPIM project - part of the documentation for the first book coming out of the Combinatorial Books pilot)

Review of 'Bitstreams: The Future of Digital Literary Heritage' by Matthew Kirschenbaum

Contribution to 'Archipiélago Crítico. ¡Formado está! ¡Naveguémoslo!' ['Critical Archipelago. It Is Formed! Let's Navigate It!'] (invited talk: in Spanish translation with English subtitles)

'Defund Culture' (journal article)

How to Practise the Culture-led Re-Commoning of Cities (printable poster), Partisan Social Club, adjusted by Gary Hall

'Pluriversal Socialism - The Very Idea' (journal article)

'Writing Against Elitism with A Stubborn Fury' (podcast)

'The Uberfication of the University - with Gary Hall' (podcast)

'"La modernidad fue un "blip" en el sistema": sobre teorías y disrupciones con Gary Hall' ['"Modernity was a "blip" in the system": on theories and disruptions with Gary Hall']' (press interview in Colombia)

'Combinatorial Books - Gathering Flowers', with Janneke Adema and Gabriela Méndez Cota - Part 1; Part 2; Part 3 (blog post)

Open Access

Most of Gary's work is freely available to read and download here in Media Gifts, in Coventry University's online repository PURE, or in Humanities Commons.

Radical Open Access

Radical Open Access Virtual Book Stand

'"Communists of Knowledge"? A case for the implementation of "radical open access" in the humanities and social sciences' (an MA dissertation about the ROAC by Ellie Masterman). 

Thursday
Dec 07, 2023

New Issue of Culture Machine: Anthropocene Infrapolitics

Culture Machine is proud to announce the publication of its latest volume (vol. 22) on Anthropocene Infrapolitics, guest-edited by Pedro Aguilera-Mellado (Notre Dame), Peter Baker (Stirling) & Gabriela Méndez Cota (IBERO, Mexico City). 

https://culturemachine.net/vol-22-anthropocene-infrapolitics/

Below are the table of contents and the editorial introduction. We hope you enjoy it!

---

On Anthropocene Infrapolitics, edited by Pedro Aguilera-Mellado, Peter Baker & Gabriela Méndez Cota

https://culturemachine.net/vol-22-anthropocene-infrapolitics/

Contents

Editorial

Época sin época y segundo comienzo
Alberto Moreiras

Un planeta transformado
Jan Zalasiewicz et al.

Nula o el lugar
Luz María Bedoya

Molten Praxis: Infrapolitics and the Inner-Outer Earth Juncture
Nigel Clark

La piel de la tierra
Teresa Vilarós

Between Futurology and Extinction: A Transautographic Experiment in Two Turns
Maddalena Cerrato & Peter Baker

Antropoceno y filosofía
Jorge Álvarez Yagüez

Infrapolitical Epimetheia: A Wondrous Machine
Gabriela Méndez Cota

Responsibility Toward the Planetary Nothing: For Infrapolitical Preparation
Rafael Fernández

Interrogación infrapolítica del Antropoceno: a propósito de Amaiur de Aixa de la Cruz
Pedro Aguilera-Mellado

Anthropocene, Infrapolitics, and Epochal Anxiety: Upon Reading Samanta Schweblin’s Kentukis and Distancia de Rescate
Gareth Williams

From Correlation to Corroboration: When the Weather Makes Sense of Death
AJ Baginski

Anthropocene Afterlives, or: Burial Rites for the Twenty-First Century
Adam R. Rosenthal

Plant-Thinking as Infrapolitical Ethics
Daniel Runnels

La justicia y el cambio climático abrupto
Nigel Clark (2011), trans. Irina López Rodríguez and Gabriela Méndez Cota

---

Editorial Introduction
https://culturemachine.net/submissions/vol-22-cfp-anthropocene-infrapolitics/editorial/

Since Paul Crutzen suggested the term in 2000, ‘the Anthropocene’ has become established as a narrative frame for the convergence of numerous discourses and collections of data exploring the reach, as well as the limits, of human agency within inherently dynamic Earth processes. This volume of Culture Machine arrives in the wake of a decade-long acceleration of Humanities discourse on the Anthropocene, the radical implications of which remain, in our view, unthought.

Already in 2016, Cohen, Colebrook and Hillis Miller thought of the Anthropocene as a twilight concept: ‘a form of half-recognition that can only occur in the moment of waning’. They noted that even if the idea of the Anthropocene had fully exposed the fictions of Cartesian Man, its paradoxical effect had been to stir, almost immediately, a production of counter-narratives, most of which failed to question narrative as such. In other words, the boom of the post-human and the non-human, alongside so many political challenges to the universalizing claims of the Anthropocene, most often provided a way of sustaining the human as a problem. By contrast, Cohen, Colebrook and Hillis Miller called on us to ask about the ways in which technical modes of inscription produced ‘the Anthropocene’ as a masculinist delusion of self-erasure and anthropo-political narrativizing.

Almost a decade later, the unrelenting chaos associated with the Anthropocene still calls for intellectual responsibility, but structural difficulty persists in (and beyond) university discourse. If the latter is characterized, in our time, by a political saturation, the structural difficulty concerns finitude as such, the experience of which increasingly converges with technological acceleration and the threat of human extinction. The question insists: is the Anthropocene above all a political question, a question of narrative? Broadly conceived as the absolute difference between life and politics, between being and subjectivity, between writing and narrativizing, infrapolitics gives way to the task of thinking existence in the ‘epoch without epoch’ that is now framed as the Anthropocene. 

More specifically conceived as a second turn of deconstruction, infrapolitical reflection recuperates the Heideggerian problematic of the ontico-ontological difference at the time of the consummation of metaphysics, of the reduction of life – including culture and politics – to calculability, or the principle of general equivalence under the guise of late post-industrial capitalism. Reframed today as an archive of planetary devastation, the Heideggerian concept of Gestell continues to pose a question about the limits of storytelling and the need for, as Weinstein and Colebrook (2017) put it, no less than a decision on the value of existence. As formulated by Alberto Moreiras, infrapolitics is always in every case a commitment to think that decision in terms of an exception to the principle of general equivalence.

Anthropocene Infrapolitics gathers contributions that strive to think the exception, the incalculable, in the Anthropocene. Most of them are based on presentations given at the II International Seminar of Contemporary Thought, which took place on 29-30 June 2023 at the Universidade de Vigo in Galicia, Spain, and was organised by Alberto Moreiras (Texas A&M University), Helena Cortés Gabaudan (Universidade de Vigo), Jorge Álvarez Yagüez (Universidad Complutense de Madrid), Carmela García González (IES Vigo), Arturo Leyte (Universidade de Vigo), Cristina Moreiras (University of Michigan), Teresa Vilarós (Texas A&M University) and Gareth Williams (University of Michigan). We want to express our sincere gratitude to all of them and to the participants of the Vigo meeting for having accepted our invitation to edit and disseminate their work in Culture Machine with a spirit of radical open access.

Even if the meeting was made possible and nurtured by the institutional frameworks of academic scholarship, Anthropocene Infrapolitics does not seek, above all, to make ‘progress’ on ‘knowledge production’ by telling more stories about planetary catastrophe. More fundamentally it seeks to ask, once again, what thinking means, with an openness to the proliferation of singular experiences and working against all attempts to construct a new hegemonic framework for academic work via scientific, economic or cultural knowledge about the human. 

As such, infrapolitics is irreducible to technics, ethics or politics, and we may, at best, regard it as a call for an attunement to somewhere strange and unthematizable. Working at the limits of language, writing, and thought, one of the main questions for infrapolitical reflection therefore concerns the form or style that the announcement of the infrapolitical should take, where writing is always understood as the writing of life itself, or perhaps more accurately, as what sub-cedes and sub-sists of life beyond or below its metaphysical capture. In this regard, we give special thanks to Luz María Bedoya for her contribution to Anthropocene Infrapolitics: namely, the artwork included in this issue.

Anthropocene Infrapolitics seeks to make space, within the most rigorous scholarship in the theoretical Humanities, for untimely textual inscriptions, or writings that attempt to consciously bear the mark of their own historical or existential circumstances. We would like to acknowledge Sergio Villalobos-Ruminott, Jessica Bekerman, Tatjana Gajic, Cristina Moreiras, Benjamín Mayer-Foulkes, Janneke Adema, Fiona Noble, José Luis Villacañas Berlanga, Claire Colebrook, and Ángel Octavio Álvarez Solís, for carefully and enthusiastically taking part in the open peer review alongside the guest-editors and the contributors to this volume. The non-anonymity of peer reviewing was, in this case, a wager and a test for our infrapolitical desire to affirm that another scholarly writing is possible, and that open writing collaborations matter, beyond scientific standards or political convictions, for the task of thinking existence in the Anthropocene.

Selected exchanges from the open peer review process will be edited and published throughout Winter 2023-2024 in Culture Machine’s Interzone, as part of an extended conversation on Anthropocene Infrapolitics.

 

Friday
Dec 01, 2023

Launch of the Experimental Publishing Compendium 

The COPIM community and Open Book Futures are pleased to announce the launch of the Experimental Publishing Compendium: https://compendium.copim.ac.uk/.

The compendium is a guide and reference for scholars, publishers, developers, librarians, and designers who want to challenge, push, and redefine the shape, form, and rationale of scholarly books. It gathers and links tools, examples of experimental books, and experimental publishing practices, with a focus on free and open-source software, platforms, and digital publishing tools that presses and authors can either use freely or adapt to their research and publishing workflows. With the compendium we want to promote experimental publishing and inspire authors and publishers to publish experimental monographs.

We are celebrating the official launch of the Experimental Publishing Compendium with a festive calendar. Follow us on Twitter and Mastodon (#ExperimentalPublishingCompendium) to discover 24 experimental publishing tools, practices & books from the compendium.  

The compendium includes experiments with the form and format of the scholarly book; with the various (multi)media we can publish long-form research in; and with how people produce, disseminate, consume, review, reuse, interact with, and form communities around books. Far from being merely a formal exercise, experimental publishing as we conceive it here also reimagines the relationalities that constitute scholarly writing, research, and publishing. Books, after all, validate what counts as research and materialise how scholarly knowledge production is organised.

We hope the linked entries in this compendium inspire speculations on the future of the book and the humanities more generally, and encourage publishers and authors to explore publications beyond the standard printed codex format.

______________

The Experimental Publishing Compendium has been curated by Janneke Adema, Julien McHardy, and Simon Bowie and has been compiled by Janneke Adema, Simon Bowie, Gary Hall, Rebekka Kiesewetter, Julien McHardy and Tobias Steiner. Future versions will be overseen, curated and maintained by an Editorial Board. Back-end coding by Simon Bowie, front-end coding by Joel Galvez, design by Joel Galvez & Martina Vanini. 

The Experimental Publishing Compendium is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). All source code is available on GitHub at https://github.com/COPIM/expub_compendium under an MIT License.

The compendium grew out of the following two reports: 

  • Adema, J., Bowie, S., Mars, M., and T. Steiner (2022) Books Contain Multitudes: Exploring Experimental Publishing (2022 update). Community-Led Open Publication Infrastructures for Monographs (COPIM). doi: 10.21428/785a6451.1792b84f and 10.5281/zenodo.6545475.
  • Adema, J., Moore, S., and T. Steiner (2021) Promoting and Nurturing Interactions with Open Access Books: Strategies for Publishers and Authors. Community-Led Open Publication Infrastructures for Monographs (COPIM). doi: 10.21428/785a6451.2d6f4263 and 10.5281/zenodo.5572413.

COPIM, Open Book Futures, and the Experimental Publishing Compendium are supported by the Research England Development (RED) Fund and by Arcadia.

 

Monday
Nov 20, 2023

Is Big Publishing Killing the Academic Author?

'It’s my belief that you don’t make money until you make someone else money' - Marc Maron

The last few weeks have seen the appearance of two articles examining the impact of corporatisation and conglomeration on literary fiction: not just the way novels are published but also the content of those novels, as well as how we understand the global literary canon of everyone from Saul Bellow, Philip Roth, John Updike, Albert Camus, William Burroughs, VS Naipaul, WG Sebald and Martin Amis, to Roberto Bolaño, Orhan Pamuk, Salman Rushdie, Ben Okri, Chimamanda Ngozi Adichie and Hanif Kureishi (a canon that is actually highly US-centric in its production and organisation). They are Scott W. Stern, ‘Big Publishing Killed the Author' (an essay reviewing Dan Sinykin's Big Fiction: How Conglomeration Changed the Publishing Industry and American Literature), and Alex Blasdel, ‘Days of The Jackal: How Andrew Wylie Turned Serious Literature Into Big Business'. Both are pieces of journalism, with all the associated drawbacks and limitations. Nevertheless, they have the effect of inadvertently emphasising how academia, and open access publishing within it, provide a space for something very different to the commercial ecosystem of super-agents, management consultants, marketers and publicists that Stern and Blasdel depict. This is largely because, while some academic authors do have agents, there’s little prospect of individual scholars being a ‘moneymaker’ for the likes of Andrew Wylie in the way Stephen King and Cormac McCarthy are, let alone the literary estates turned ’commercial assets’ of Vladimir Nabokov and Jorge Luis Borges.

Open access (OA) academic book publishing can still be a relatively small, non-profit, editorially independent affair, in which presses are able to take risks in order to make work publicly available that is intellectually challenging, at times even unusual and eccentric. (As I've written elsewhere, an unknown '[s]omeone can write a book on a subject so marginal and obscure it may be of interest to only one other person in the world ... yet it might still be worth' bringing out. Economic reality is therefore not such an 'important factor when it comes to deciding what an open-access' press should include in its list.) What’s especially exciting about OA publishing is that this risk-taking applies not just to the content of books but their form, too. (‘Form’ here refers to the organisation of the various elements that make up a book.) Hence some of the more radical open access presses have experimented with publishing titles on a fluid, living, processual basis using a copyright licence – Creative Commons CC-BY, but also Collective Conditions for Re-Use (CC4r) and even no copyright – that enables texts to be collectively and collaboratively written, edited and remixed.

Under pressure from the demands of capital for ‘growth and returns on investment’, literary publishing, by contrast, is increasingly focused on works (and authors) that can be marketed, publicised and sold as neoliberal commodities in an economy that positions the reader as customer. It’s a pressure that, according to Sinykin and Stern, has led to a change in the content of serious novels, most notably as a result of literary authors embracing the techniques of popular genre fiction: ‘from A.S. Byatt in Possession (romance) to Denis Johnson in The Stars at Noon (spy thriller) to Colson Whitehead in Zone One (zombie apocalypse)’. The comparison with open access, however, only serves to emphasise that the problem concerns more than just the content of such books.

Despite its association with creativity, self-reflection and the innovative use of language, literary publishing operates by producing variations within a quite limited set of formal conventions – constraints that are so naturalised and unmarked as to be effectively invisible. Novels are for the most part single-authored, usually with a central human protagonist in possession of a fixed and unified self who is cast into a developing storyline. (This explains why so many literary authors have found it a relatively simple matter to turn their hand to the even more profitable writing of memoirs and autobiography – or works of fiction that have their roots in them.) With few exceptions, their contents are arranged in a linear order so as to constitute an extended narrative designed to be read in a progressive temporal sequence in the physical form of a consecutively-paged, printed codex book. These volumes are published as fixed and finished autograph texts in uniform, multiple-copy editions. They are then made available to readers on a mass industrial basis as ‘closed access’ consumer products using an all rights reserved copyright licence. (And that’s without even mentioning the uniformity of these novels as material objects in their very differentiated numerousness: ‘their metadata, paratextual mediation, paper, binding, design, layout, use of white spaces, locations where the pages are cut and so on’.)

It should be emphasised that a willingness to adhere to such preformatted practices and structures is not a feature confined solely to those literary novels that are produced by conglomerate publishers. It is also a notable characteristic of the relatively few works of multicultural, intersectional and experimental literature they are prepared to include in their lists, as well as of the auto-fiction that authors such as Sally Rooney, Rachel Cusk and Karl Ove Knausgaard write in order to focus precisely on the ‘challenges of navigating the publishing industry’. It even applies to those novels that are published by prize-winning independents like Fitzcarraldo Editions, ‘the avant-garde houses like New Directions, the small presses like Coffee House, and the university presses’. What's more, this is the case regardless of the fact that these non-corporates – and in some cases non-profits – expressly define themselves against, and thus in relation to and in terms of, the conglomerate publishers, 'putting out consciously literary fiction to counter the oversaturated commercialism of the big houses’, and ‘transgressive works by numerous authors of color to counter the overwhelming whiteness of trade imprints’. As the author and editor Rachael Allen observes:

A common note in acquisition meetings at publishing houses is that a book is too ‘academic’ [read ‘bad’,  difficult, complex, intellectual], that the audience for a work ‘doesn’t exist’. That a book does not know ‘how to be marketed’. I have found that these books, usually the more formally innovative and boundary pushing works, to be authored by historically marginalised writers, who, now and in the recent past, are simultaneously cynically and vampirically exploited for their identities. There’s now a vulture-like focus on ‘discovering’ these authors, who fit the ‘accessible’ (read ‘good’) bill – working class writers, black and brown writers, writers in precarious circumstances, trans and queer writers.

If there’s something curiously old-fashioned about the version of the modern publishing industry that Stern and Blasdel engage with, much the same can be said of their versions of capitalism. We can leave for another time the question as to whether capitalism is in fact dead, and we’re now living under a form of technofeudalism that is even worse. What’s noticeable about Stern and Blasdel’s portrayal of big publishing is that it hasn’t even undergone the transition to surveillance capitalism yet. Although the impact of the rise of Amazon in reshaping contemporary fiction and accelerating the ‘neoliberalization of publishing’ is acknowledged by Stern, neither article contains any mention of data analytics, for instance. 

The limitedness of the literary publishing worldview in this respect is perhaps one reason the training of AI engines on data that have been scraped from large corpora of text, including copyrighted works by published authors found in ‘shadow’ or ‘pirate’ libraries, has come as such a shock to many writers of fiction, John Grisham, Jonathan Franzen, George R.R. Martin, George Saunders and Jodi Picoult among them. These ‘avatars of conglomerate publishing’ are now suing OpenAI and Meta for mass scale copyright infringement in an effort to maintain a neoliberal system which is working quite nicely for a small number of highly profitable brand-name authors, their publishers and agents, but less so for everyone else:

‘Between 1986 and 1996 … 63 of the 100 bestselling books in the [US] were written by just six people: Tom Clancy, Michael Crichton, John Grisham, Stephen King, Dean Koontz, and Danielle Steel. Astonishingly, five of those six continue to dominate bestseller lists to this day. (No doubt Clancy would too, had he not died in 2013.)’

Contrast this with the findings of the 2022 Authors’ Licensing and Collecting Society (ALCS) survey. It reveals that the average earnings of a self-employed author in the UK are £7,000. That represents a 38.2% decline in real terms since 2018. Moreover, 47% of total earnings are claimed by a mere 10% of authors (although it should be noted that ‘[w]omen, black and mixed-race authors, the very young, and very old, all earn less than their respective counterparts’).

At the same time, academic publishing and open access are not free from concerns with economics and the bottom line. Consider just how much of the mainstream discussion around OA is taken up with never-ending debates over business models – green, gold, hybrid, APC/BPCs, diamond – rather than exploring how we can produce intellectually ambitious forms of research and publishing, some of which might even be avant-garde, experimental or singular in nature.

The plan for a future ‘scholar-led’ OA system of ‘responsible publishing’ put forward by the cOAlition S consortium to replace the ‘prevailing business models’ is just the latest example. (As I’ve pointed out on the Radical Open Access mailing list, these plans are not really 'scholar-led' in the sense a lot of us have been using the term. Nor, contrary to how they are presented in a recent Nature article, are they a 'radical open access initiative'. And they certainly don't represent a 'revolution' in academic publishing.)

In fact, open access publishing is undergoing its own processes of conglomeration. Evidence Ubiquity's joining with De Gruyter in 2022. De Gruyter has in turn recently agreed to acquire Brill for €51.5m to form De Gruyter Brill. With combined revenues of around €134 million and 750 employees, De Gruyter Brill is aiming to publish more than 3,500 books and 800 journals per year. Its self-declared aim is to create the leading academic publisher in the humanities.

As Leslie Chan points out, however, companies such as Elsevier, Wiley-Blackwell and Springer are increasingly looking to monetize not just academic content but the ‘entire knowledge production workflow, from article submissions, to metrics to reputation management and global rankings’, together with the related data extraction and analysis. Hence the partnership struck by Taylor & Francis with ResearchGate, the commercial social networking site for academics, in October 2023.

Some of the ScholarLed community have even gone so far as to argue that these companies are endeavouring to capture the whole ‘system of exchange between themselves and research-learning communities’, free from the mediation of public university libraries, which these publishing conglomerates regard as ‘knots’ that hamper their ability to extract public funds.

The recent report by SPARC examining the data privacy practices of Elsevier’s ScienceDirect business, ‘the leading academic discovery platform of the world’s largest publisher and many libraries’ largest vendor for collections', supports this view. It demonstrates how the tracking of users and the harvesting of their personal data – something that would have been considered beyond the pale in a traditional physical library setting – is now a regular practice on the platforms of these companies.

Moreover, the SPARC report notes that Elsevier, an academic-publisher-cum-data-analytics-business whose ‘products span discovery; research management, funding, and collaboration; publishing and dissemination; and research analytics’, is a subsidiary of RELX. The latter describes itself as ‘a global provider of information-based analytics and decision tools’. By amalgamating ten thousand data points relating to hundreds of millions of individuals, RELX is able to supply extensive databases of personal information to law enforcement agencies, governmental bodies and corporations. ‘RELX risk products have been documented as being used in ways that raise serious concerns, including to help monitor protestors’ social media feeds, surveil immigrants, blackmail women, and help in attempting to manufacture false terrorism charges against some of those who participated in anti-racism protests in the summer of 2020’.

On this account, academic publishing seems to be an even more joined-up system of profit-driven conglomeration and control than that of serious literary fiction. Which is why I was enquiring a few weeks ago, following the publication by Times Higher Education of its latest World University Rankings, about research that connects the dots between the THE World University Rankings, university league tables, the REF, Digital Object Identifiers (DOIs), Crossref (the world’s largest registry of DOIs and metadata), ORCID (co-developed by Crossref), Elsevier (a founding sponsor of ORCID) and such surveillance capitalism.

Are academic authors increasingly becoming mere ‘cogs’ in this ‘corporate machine’? Like their literary counterparts, will many of them too end up producing work that is less and less interesting as a result? Or is open access publishing that is actually radical and scholar-led capable of producing a very different future for the academic author?

 

Monday
Oct 23, 2023

Creative AI - Thinking Outside The Box 

I'm currently experimenting – more or less playfully/piratically – with the concept of artificial creative intelligence collaboratively generated by Mark Amerika and GPT-2. In My Life as an Artificial Creative Intelligence this is defined by Amerika/GPT-2 as ‘a human being who can think outside of the box’.

For me, such artificial creative intelligence (ACI) needs to include thinking outside of the masked black box that ontologically separates the human, its thought-processes and philosophies, from the nonhuman: be it plants, animals, the planet, the ... or indeed technologies such as generative AI.

The approach to AI of ACI is thus very different from that promoted by the various institutes for human-centered, -compatible or -inspired AI that have been established around the world; as well as that put forward in recent work looking to ‘unmask’ the algorithmic biases of AI in order to safeguard the human (but which also functions to keep the human separate from its co-constitutive relation with the nonhuman whilst simultaneously maintaining the human's position at the heart of the world). A snapshot illustration of such creative thinking can be provided with the help of two recent accounts of AI art. The first comes from a paper on ‘AI Art and Its Impact on Artists’ written by members of the Distributed AI Research Institute in collaboration with a number of artists. In this paper the human is set up by Harry H. Jiang, Lauren Brown, Jessica Cheng, Mehtab Khan, Abhishek Gupta, Deja Workman, Alex Hanna, Johnathan Flowers and Timnit Gebru in a traditional hierarchical dichotomy with the nonhuman machine that is artificial intelligence through the robotic insistence that multimodal generative AI systems do not have agency and ‘are not artists’. Art is portrayed as a ‘uniquely human activity’. It is connected ‘specifically to human culture and experience’: those continually evolving ‘customs, beliefs, meanings, and habits, including those habits of aesthetic production, supplied by the larger culture’.

Declarations of human exceptionalism of this kind should perhaps come as no surprise. Certainly not when ‘AI Art and Its Impact on Artists’ derives its understanding of art and aesthetics in the age of AI in part from liberal, humanist figures who were writing in the first few decades of the 20th century: namely, the American philosopher and reformer of liberal arts education John Dewey; and the Englishman Clive Bell who was a representative of Bloomsbury Group liberalism.

To be fair, Jiang et al. also refer in this context to several publications by contemporary scholars of Chinese, Japanese and Africana Philosophy (although it's noticeable that the majority of these scholars are themselves located in Western nations). Still, liberal humanism holds its values to be universal (rather than, say, pluriversal), so nothing changes as a result: most philosophers on art and aesthetics argue that nonhuman entities are unable to be truly creative, according to Jiang et al. On this view, artists use ‘external materials or the body’ to make their lived experience present to an audience in an ‘intensified form’ through the development of a personal style that is authentic to them. It is an experience that is 'unique to each human by virtue of the different cultural environments that furnish the broader set of habits, dispositions towards action, that enabled the development of anything called a personal style through how an individual took up those habits and deployed them intelligently’. Consequently, art cannot be performed by artifacts. Generative AI technologies ‘lack the kinds of experiences and cultural inheritances that structure every creative act’.

The second account of artificially intelligent art can be found in Joanna Zylinska’s book for Open Humanities Press, AI Art. It shows how human artists can be conceived more imaginatively – and politically – as themselves ‘having always been technical, and thus also, to some extent, artificially intelligent’. This is because technology, far from being external, is at the ‘heart of the heart’ of the human, its ‘“body and soul”’, in a relation of what Derrida and Stiegler term originary technicity or originary prostheticity. Or as Zylinska has it: ‘humans are quintessentially technical beings, in the sense that we have emerged with technology and through our relationship to it, from flint stones used as tools and weapons to genetic and cultural algorithms’. She even goes as far as to argue that the ethical choices we think we make as a result of human deliberation on our part consist primarily of physical responses as performed by ‘an "algorithm" of DNA, hormones and other chemicals’ that drives us to behave in particular ways.

How can this second ‘human-as-machine’ approach to artificially intelligent art be positioned as the more political of the two? (Doing so seems rather counter-intuitive given the critically engaged nature of the work of DAIR, Gebru et al.) Quite simply because, in its destabilising of the belief that art and culture stem from the creativity of self-identical, non-technological human individuals, and its opening up to an expanded notion of agency and intelligence that is not delimited by anthropocentrism (and so is not decided in advance: say, as that which is recognised by humans as agency and 'intelligence’), such ACI presents an opportunity even more radical – in a non-liberal, non-neoliberal, non-moralistic sense – than that which Jiang et al. point to in ‘AI Art and Its Impact on Artists’.

Rooted as the latter is in the ‘argument that art is a uniquely human endeavor’, Jiang et al. advocate for new ‘sector and industry specific’ auditing, reporting, and transparency proposals to be introduced for the effective regulation and governance of large-scale generative AI tools based on the appropriation of free labour without consent. (One idea often proposed is to devise either a legal or a technological means whereby artists can opt out of having their work used for training commercial machine learning systems like this. Alternatives involve incorporating a watermark or tags into AI-generated output for the purpose of distinguishing it from human-generated content. Some intellectual property experts have even suggested the introduction of a new legal framework, termed 'learnright’, complete with laws designed to oversee the manner in which AI utilises content for self-training.) The aim is to orient these tools, together with the people and organisations that build them, toward the goal of enhancing human creativity rather than trying to ‘supplant it’. When it comes to the impact of AI on small-scale artists especially, the dangers of the latter approach include loss of market share, income, credit and compensation, along with labour displacement and reputational damage, not to mention plagiarism and copyright infringement, at least as these are conventionally conceived by late-stage capitalism’s consumer culture. It is a list of earnings-related harms in keeping with their presentation of independent artists today – especially those who are neither financially self-sufficient nor able to support their practice by taking on other kinds of day jobs – as highly competitive microentrepreneurs. Evidence the interest attributed to them by Jiang et al. in trading ‘tutorials, tools, and resources’, and in gaining sufficient visibility on social media platforms to be able to ‘build an audience and sell their work’.

According to Demis Hassabis, chief executive of Google’s AI unit, we ought to approach the dangers posed by artificial intelligence with the same level of seriousness as we do the climate crisis, instituting a regulatory framework overseen initially by a body akin to the Intergovernmental Panel on Climate Change (IPCC), and subsequently by an organisation resembling the International Atomic Energy Agency (IAEA) for the long term.

It is typical of those behind Big Tech to call for the regulation of the anticipated or hypothetical dangers that will be posed by large-scale foundational AI models at some point in the future, such as their ability to circumvent our control or render humanity extinct, rather than for actions that address the risks they represent to society right now. Obviously, the position of Google, Amazon, Microsoft et al. as the dominant businesses in the AI market – the latter both in its own right and as an investor in the capped-profit OpenAI – would be impacted far more if governments were to seriously adopt the latter approach rather than leaving it to voluntary self-regulation on their part. They would also be subject to greater competition and challenge if it wasn’t just Big Tech that was presented as having the computing power, money and technical expertise to deal with such existential concerns: if AI engines and their datasets had to be made available on an open source basis that makes it easier for a diverse range of smaller (and even non-profit) entities to be part of the AI ecosystem, for instance, and thus provide alternative visions of the future for both AI and the human.

Nevertheless, to convey a sense of the radical political potential of artificially creative intelligence, let us return to the example of the environmental crisis I provided previously in relation to Naomi Klein’s critique of the architects of generative AI. As we saw there, our romantic and extractive attitude toward the environment, which presents it – much as Jiang et al. do the work of artists in the face of AI – as either passive background to be protected or freely accessible Lockean resource to be exploited for wealth and profit, is underpinned by a modernist ontology based on the separation of human from nonhuman, culture from nature, alive from non-alive. It is this very ontology and the associated liberal, humanist values – which in their neoliberal form frequently include an emphasis on auditing, transparency and reporting, as we have seen – that artificial creative intelligence can help us to move beyond with its ability to think outside of the box.

 

Friday
Oct 13, 2023

Proud To Be Anti-Growth

One of the university presses I’ve published with in the past has just announced growth of 17% on the previous financial year.

As a not-for-profit publisher I understand why they’re celebrating this. Still, I wonder when such presses will realise that being pro-economic growth today is not necessarily a good thing. I mean, come on, it’s an agenda that’s being championed by Liz Truss. Nuff said.

Isn't the direction of travel more toward degrowth, postdevelopment and postextractivism, in an effort to repair the destruction of the planet brought about by the mass production and consumption of commodities?

I know many of the projects I’m involved with are proud to be anti-growth.

---

Continuing the anti-growth theme, I came across these two recently:

1) Eve Tuck and K. Wayne Yang, 'R-Words: Refusing Research', in Humanizing Research: Decolonizing Qualitative Inquiry with Youth and Communities, eds Django Paris & Maisha T. Winn (London: Sage, 2014).

(It's the text proposed by Eva Weinmayr and Femke Snelting for the next Limits to Openness reading group on 18 December 2023.)

2) A call for submissions to a panel on 'Degrowing: Valuing and Practicing Intentional Data Loss' at the 4S/EASST Conference, Amsterdam, 16-19 July 2024.