Furtherfield Stuff

Over at Furtherfield, I have an article online about Claire Bishop’s essay on mainstream art and “thematising” the digital. While you’re there, I’d also urge you to read Marc Garrett’s interview with Charlie Gere on his new book – there’s a bit of Meillassoux in there too.

MG: Community without Community in Digital Culture is a curious title. It proposes contradictory meanings, and these contradictions are clearly explained in the introduction. The last sentence, though, says: “In this such technologies are part of the history of the death of God, the loss of an overarching metaphysical framework which would bind us together in some form of relation or communion. This can be understood in terms of contingency, which has the same root as contact.” Could you unpack this last sentence for us? I’m especially interested in what contingency means to you.

CG: I owe my understanding of contingency to the work of the philosopher Quentin Meillassoux, whose book After Finitude is causing a stir. Meillassoux is one of a small number of young philosophers sometimes grouped together under the name ‘speculative realism’, mostly because of their shared hostility to what they call ‘Kantian correlationism’, the idea that there can be no subject-independent knowledge of objects. Meillassoux follows the work of David Hume, who questioned the whole notion of causation: how can one demonstrate that, all things being equal, one thing will always cause another? For Hume causation is a question of inductive reasoning, in that we can posit causation on the grounds of previous experience. Meillassoux pushes the implications of Hume’s critique of causation to a point beyond Hume’s own solution, to propose that the only necessity is that of contingency – that everything could be otherwise – or what Meillassoux calls ‘hyperchaos’.

I use his ideas to think through the implications of the ‘digital’. According to the Oxford English Dictionary, ‘digital’ has a number of meanings, including ‘[O]f, pertaining to, using or being a digit’, meaning one of the ‘ten Arabic numerals from 0 to 9, especially when part of a number’, and also ‘designating a computer which operates on data in the form of digits or similar discrete data… Designating or pertaining to a recording in which the original signal is represented by the spacing between pulses rather than by a wave, to make it less susceptible to degradation’ (the word for data in the form of a wave being ‘analog’).

As well as referring to discrete data, the dictionary also defines ‘digital’ as ‘[O]f or pertaining to a finger or fingers’ and ‘[R]esembling a finger or the hollow impression made by one’, thus by extension the hand, grasping, touching and so on. Much of the book concerns deconstructing the ‘haptocentric’ implications of contact and communication, especially in relation to the claims made for social networks, and engaging with what I understand as the relation between ‘contact’ and ‘contingency’. ‘Contingency’ is derived from the Latin con + tangere, to touch. ‘Contingency’ enables us to think through the implications of the term digital, by acknowledging both its relation to the hand and touch and also to the openness and blindness to the future that is a concomitant part of our digital culture after the death of God.

unCloud

Contained within the package I found a sticker for a new project called “unCloud”, by Rui Guerra and David Jonas, co-commissioned by STUK and Arnolfini (which means Geoff Cox has something to do with it).

It looks very, very interesting. Basically it’s an art project whereby you install the software on Mac OS X and create your own cloud network straight from your AirPort card.

“unCloud is an application that enables anyone with a laptop to create an open wireless network and distribute their own information. Once it is launched, a passerby using a mobile internet device can connect to this open wireless network. The person running the application can decide what information is shown in any web address. Users can access information wirelessly while at the same time remain disconnected from the internet. unCloud does not depend on a remote datacenter, instead it can be run from a laptop, making it an ideal application to run in a train or at a café.”

So in effect one could potentially bypass the proprietary cloud networks usually required to distribute and disseminate information. A ‘local’ cloud service. Ideal.
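
unCloud itself runs on Mac OS X, but the core gesture – answering requests on a local, offline network with content of your own choosing – is easy to sketch. Below is a minimal, hypothetical approximation using only Python’s standard library; it assumes you have already shared an open wireless network from your laptop, and none of it is unCloud’s actual code (the DNS redirection that lets unCloud answer arbitrary web addresses is omitted here):

# A rough sketch of a 'local cloud': serve your own page to anyone who joins
# the (offline) wireless network you are hosting. Illustrative only.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body><h1>You are connected to a local cloud.</h1></body></html>"

class LocalCloudHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer *any* requested path with our own content, echoing unCloud's
        # conceit that the host decides what information is shown.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Bind to all interfaces so passers-by on the ad-hoc network can connect.
    HTTPServer(("0.0.0.0", 8080), LocalCloudHandler).serve_forever()

A passer-by joining the network and browsing to the laptop’s address would see the page above, with no internet connection anywhere in the chain.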

Some (very quick) thoughts on Critical Engineering

I’ve been following an interesting thread over on the Empyre forums between Cesar Baio, Simon Biggs, Davin Heckman and Gabriel Menotti. It’s basically a conversation on the ambiguous nature of accessible artworks created with technology and, in particular, computer programming languages. I was struck by a comment made by Julian Oliver (who recently showed some fantastic work at transmediale, as part of the Labor Berlin show).

Simon Biggs: “Much contemporary computer based art work has a cargo-cult like quality due to such illiteracy. This can be interesting but usually in spite of itself.”

Julian Oliver: “Indeed, also one of the fruits of Bricolage. However, with a language like Engineering having such influence over the lives and minds of people – how we eat, travel, communicate – I really think you need to speak the language to truly act critically within its scope.

This is what we sought to underscore in the manifesto:

http://criticalengineering.org

I’ve talked to several artists who have expressed disempowerment in this age of database automation, Google Maps, wireless networking, the Cloud etc. – technologies that shape how they live, and even their practice, yet they find no entry point for disassembling and thus critically engaging them. It’s not enough to talk about how we are influenced by all this engineering – technology that becomes social, political and cultural infrastructure – as this leaves us little better positioned. It must be engaged directly to understand the mechanics of influence. This is the difference between treating it as a topic (technology) and as a material (engineering).

Most who receive this email will have little or no idea how it arrived in their inbox, unable to accurately describe it to another – not even close. At the same time, most would be able to describe how a postcard arrived at their friend’s mailbox. Just 15 years…

Ignorance as to how these engineered infrastructures actually function – what they do and what is done with them behind their own presentation – is actively being abused, both inside and outside of democracies.”

Clearly there’s much truth here. Understanding how technologies work (especially the proprietary ones), and how they construct and mediate technological experience, adds heft to one’s artistic ability to undermine and tinker with those structures if need be. At present this is badged as the ‘new materialism’ trend blossoming around media studies: an interest in the material connections and contingent points of technological construction. This is a materialism I can get on board with – finding entry points and understanding the connections. Kudos to Julian for making the stakes clear.

My response to this, however, is to reinforce some ontological questioning on behalf of those technologies that we wish to critically engineer. Humans are incredibly good at being sanguine about their technological reliance, as Julian suggests. Hardly any of us are aware of the near-infinite number of functions, strings, algorithms and bits which construct our mediated existence, proprietary or open source – but we need not beat ourselves up over the lack of freedom that comes with it.

If Heidegger ever taught us anything about technology (the menacing Bremen lectures notwithstanding), it’s that daily existence itself constantly misses most of the elements that construct our limited experience. Both proprietary and open source programming conceal their functions from us, because they are coded using the paradigmatic logic of encapsulation (which is part of a bigger relationship between ontology as philosophy and ontology as programming). The difference, the crucial difference, is that open source is built with an additional precondition of open modification.
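
To make ‘encapsulation’ concrete: in object-oriented programming, an object exposes a small public interface and hides its state and workings behind it. A toy sketch follows – the names are mine, purely illustrative:

class Account:
    # A toy example of encapsulation: internals hidden behind a public interface.

    def __init__(self):
        self._balance = 0  # 'private' by convention; callers are not meant to touch it

    def deposit(self, amount):
        # The one sanctioned point of contact. How the balance is stored
        # and validated stays concealed from the caller.
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount
        return self._balance

Whether the source is proprietary or open, a caller of deposit() works against the interface, not the internals – open source merely adds the precondition that the concealment can be opened up and modified.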

It’s worth remembering that human experience neither executes strings nor compiles a source language into a target language; the things themselves do this. We never have had, nor will we have, absolute knowledge of computing functions.

This is why I’m always concerned about (yet generally supportive of) software studies. One gets the general sense (and Wendy Chun has said something similar, I believe) that one has cracked some meaningful challenge if one discovers the ‘source’ of proprietary software – that finding the source is the only goal for meaningful political activity (for example, the recent trend of printing the fork bomb script on T-shirts, as if simply knowing the code conceptually were knowledge enough).
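
For reference, the script in question is usually the one-line bash fork bomb; the same idea in Python is sketched below, guarded so that it is safe to run as printed. The point stands either way: being able to recite these few lines tells you almost nothing about the process tables, schedulers and resource limits they would collide with.

import os

def fork_bomb(dry_run=True):
    # A fork bomb: each process endlessly forks copies of itself until the
    # system's process table is exhausted. Guarded with dry_run=True so that
    # running this sketch as-is does nothing harmful.
    while True:
        if dry_run:
            print("would call os.fork() here")
            return
        os.fork()  # do NOT set dry_run=False: this will cripple the machine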

I’d suggest that being ‘critically engaged’ with our reliance on media need not involve seeking the source or accurately describing every last detail of how a piece of code works. Being critically engaged must begin with surrendering conceptual knowledge of the thing itself. You are not the thing. This need not be a step back, because it can be said that political activity itself is built from such a lack of knowledge. Do you think private companies know absolutely every last detail about the platforms and software they create? Of course they don’t. Computing, source code or otherwise, does not favour your knowledge and expectation – its history has been shown to thoroughly surprise any capacity to know and to critique. Artistic practice is in fact ideally suited to articulating this capacity to surprise.

Re:Wire news

Clearly I haven’t posted anything in a while. Sorry about that; my normal full-time work has been crazy this month. Lots of end-of-year tax craziness.

So keeping up with academic news has been hard – but now that Easter is here, I’ve got a bit of time to recap what’s been going on. Let’s start with this.

It “looks” like I’ll be presenting a paper at Re:Wire 2011, the Fourth International Conference on the Histories of Media Art, Science and Technology, in Liverpool, hosted by FACT (Foundation for Art and Creative Technology), 28th September – 1st October 2011 – two days before my birthday.

I stress the “looks like” bit, as I’m only judging this from a short ‘Accept’ update from the online OpenConf Conference Management System they are using. I won’t outright confirm it until I get the official nod.

If all goes well, the title of the paper will be: Speculative Media – The Beholding of Objects: Object Oriented Ontology, Algorithmic Artworks and the Beholding of Media.

Paul Caplan on OOO and jpeg protocols

HERE is a great post from Paul Caplan on jpegs, software protocols and OOO objects. I’ve had a quick glance through it (as I’m about to go out for the night), and I’m not sure I can disagree with any of his sound words. We’re on the same panel for the Platform Politics event in May, and I’m sure we’ll have plenty to talk about. (If anyone is around or can get to Cambridge on the 12th/13th May, I’d STRONGLY recommend going to Platform Politics – it is going to be such a fantastic, relevant conference.)

Ironically, the co-authored paper I’m writing with Geoff has no OOO in it whatsoever – and for me that’s been a great thing to do. Diving in head first and making sense of Geoff’s line of thought has been a genuinely healthy challenge, and I think it’ll produce a fantastic paper.

I like this bit:

“Firstly this perspective escapes correlationism, Quentin Meillassoux’s term for the tendency to focus on the subject-object relation, to see everything in terms of the human-world connection{Meillassoux 2009}. Here there is no world without the human nor human without the world. It is this separation (yet partnering) of subject and object that drags us away from focusing on objects, their connections and their working.  In terms of jpeg, it demands we address jpeg and the Facebook database in terms of the humans using it. At the very least this means it becomes difficult to explore machine vision systems where computers ‘see’, ‘file’ and ‘analyse’ with no human intervention, a situation an object-oriented approach could happily discuss in terms of photo object connecting within face-recognition object within a surveillance-image-evidence object.”

It sounds very similar to what I was getting at with my paper at Aarhus; for most digital aesthetic scholars, studying digital art is never as simple as ‘object x interacts with object y and forms object z’.

For them, there must always be something that resonates more deeply with us and our changing habits than anything as simple as objects. It must always connect to our culture, myth-laden history or whatever… “This person in 1860 had been thinking this digital subject all along – thus – this will contextualise our contemporary digital crisis!”

It changes ‘everything’, and paradoxically changes nothing as the system’s own crucial relationships are ignored.

This requires a healthy strategy of thinking, and luckily software is such a pragmatic tool that it stands some chance of enriching the interrogation. Is there a form of aesthetics that takes place within the interfaces of protocols and objects themselves? The trick, I think, is not to get bogged down trying to distinguish between protocols and objects.

Come and see me in Cambridge

Here’s one for Alex Andrews. I’ve been accepted to deliver a collaborative paper at the Platform Politics conference, Anglia Ruskin University, on the 12th–13th May. It will be a straight merge of my paper and Geoff Cox’s paper for January’s Public Interfaces conference in Aarhus. We’ve already agreed that Geoff can’t make it, due to other commitments, so it’ll just be me delivering it.

It was Geoff’s idea. I’ve not jointly written or delivered a collaborative paper before, but there was a clear link between my thoughts on interfaces and encapsulation and Geoff’s paper on the political cleansing of code, endemic to ‘apps’ and proprietary platforms. Laporte’s History of Shit is essential here, and that’s mainly Geoff’s intervention. (Laporte’s book is a great read: had he not died so young, he could well have taken Zizek’s place in the history of psychoanalytic-cultural theory icons.)

Abstract blurb below:

Platform Politics, 12 & 13 May 2011

“Antagonistic Interfa(e)ces: The purification of General Intellect, encapsulation and impure code.”

 

Geoff Cox & Robert Jackson (Aarhus University & University of Plymouth, DK/UK).

“The most significant challenge the open web will need to overcome is not technical, it is political.”

– Dmytri Kleiner, from Thimbl

If platform applications have become the primary mode of accessing online information and communication, it should come as no surprise that the political motivations for doing so emerge from an antagonism inherent in the Object Oriented paradigm of encapsulation. Following ‘Platform Studies’ (Bogost and Montfort), the paper will assume that ‘platforms’ are the foundational hardware or software system environments where programs are executed.

However, what constitutes the increasingly proprietary level of privatised service platforms is the forced removal of complexity implemented through the logic of encapsulation. As a rule, encapsulation reduces system complexity between the user and the platform’s influence, mediated by the formal behaviour of code and the representative interface. In turn, this has led to the constraint of expression within the public realm by suppressing the political practice of platform modification, and to the privileging of private property.

We claim that the ‘App’ paradigm promotes private, proprietary software through ‘clean’ interfaces. In turn, this can be theorised as a symptom of a wider political stance towards the purification of code. In a strictly ideological manner, a ‘clean’ interface can make sense of what does not make sense. It aims to mask or supplement the ‘shitty’ complexity of code and the underlying platform structure, concealed through layers of operational function and benign interface. Such statements evoke the intervention of Dominique Laporte, in the History of Shit (first published in 1978), who showed how the suppressed agency of citizen-subjects under modern power is coded through the conditions of managing human waste. Parallel to the cleansing of the streets, Laporte argues that the French language had also been similarly purged of its “lingering stink” so that it became purer and closer to authority.

The paper will further link the purification of code to the structure of ideology inherent in Zizek’s psychoanalytic model of Symbolic reality, the antagonistic Real and ideological concealment. The purification is technical in so far as it is political, and indeed we argue that code is, in itself, subject to the same excremental foundations as the subject. We claim that platform politics is not just the study of the technical limitations of platforms, software, code, hardware and representation, but also of the heightened political strategy of masking, cleansing and purifying those antagonistic limitations.

 

Dr. Geoff Cox is a researcher at the University of Aarhus (DK), an occasional artist, writer, and Associate Curator of Online Projects at Arnolfini, Bristol (UK). He is also an adjunct professor at Transart Institute (Donau University, Austria). He has a research interest in software (art) studies, expressed in various projects such as the co-curated touring exhibition ‘Generator’ (2002/03), his PhD thesis ‘Antithesis: The Dialectics of Software Art’ (2006) and the co-curated public art project ‘Social Hacking’ (2007); he is currently working on a book project. He is a founding editor of the DATA Browser book series (published by Autonomedia, New York), and co-edited ‘Economising Culture’ (2004), ‘Engineering Culture’ (2005) and ‘Creating Insecurity’ (2009).

Robert Jackson is an artist, writer, software developer and PhD candidate in the Kurator / Arts and Social Technologies research group at the University of Plymouth. His arts practice focuses on the expropriation of searched web material and the creative contingencies of reconfigured web algorithms. Under the working title ‘Algorithm and Contingency: Towards non-human aesthetics’, his PhD research fuses early computer algorithmic / generative art and mainstream American art criticism from 1960 to the mid-1970s with recent research into contemporary speculative philosophy.

On Agency: Response to Roden’s Reply on my Remarks

I’ve only just picked up on one of Roden’s comments over at Enemy Industry (his own blog) about my remarks on algorithmic artworks such as Every Icon.

“I feel I need to respond to Robert’s remark on algorithmic art at greater length. Discreteness – unless I’ve read him wrong – doesn’t seem to entail complete ontological separation between things. It only entails that things have discrete and independent dispositions by virtue of their causal powers – including their computational powers. Thus the fact that a Java program being run in a particular System S is disposed to iterate a loop once started without stopping conditions is a fact about S which obtains as long as nothing outside of S tinkers with its computational structure. It doesn’t entail S’s inaccessibility to relations as such.”

That is a fair shout, actually. I’m not even sure myself how far discreteness goes in the ontological separation of things (hence a work in progress). One can easily find agency and discreteness in Latour and Whitehead, for example, but their entities are nothing but relations, since they are only real insofar as they perturb or translate.

The issue, of course, is how you account for that agency. For an object to act, and furthermore to act on other objects, something needs to be expressed that was not expressed before. Endorsing a total relational system risks deferring agency to the pre-individual, or to a whole-before-the-parts strategy.

The private, withdrawn object manoeuvre deals with this by arguing that a real object is always more, ontologically speaking – a black hole of sheer formal agency, untouched and undisturbed.

This is the crux of the matter. To put it another way, objects are clearly contingent on many factors in order to exist ontologically, both externally and internally. But if you reduce an object down to ‘just’ those contingent factors, the crux does not seem to be resolved.

Here’s the issue then – the object is one of three things: wholly contingent on its relations (systems), somewhat or partly contingent on them (my position, I think) or not contingent on them whatsoever (Graham’s position, I think). There could be a case for a ‘slight’ undermining of objects – perhaps being ‘partly’ contingent on relations is Graham and Levi’s position, though I’m not sure on this point.

Agency is important here, of course. Agency denotes having the choice to act and understanding when you are unable to act. And it’s in this sense that OOO has a huge role to play in discourses around social constructivism. Levi, more than anyone, has challenged the idea that human agency, never mind object agency, is entirely governed by its relations. Badiou, Agamben, Zizek and Ranciere, of course, are pathologically obliged to find agency literally anywhere – outside of mathematical ontology, the Real of absolute negation, acting by ‘pure means’, the spectre-like subject ‘without’ an object. If the capitalised ideological subject is always-already reciprocally governed by relational networks of power, then agency becomes a real problem, discrete or not.

But here’s where I believe algorithmic artworks, and algorithms in general, play a crucial role. An algorithm is a strange thing indeed. You take one or more elements – CPU, graphics accelerator, hard drive, RAM, L2 caches, bus speeds – and an execution can be performed that hopefully produces a calculable end result. Note that the end result is not always achievable. The cake may fail to rise in the oven or even burn, the revolution may be halted by a lack of people and the military, the Earth will disintegrate before Every Icon’s algorithm has finished counting.
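
Every Icon is a good test case precisely because its procedure is trivial to state and impossible to complete. A hypothetical Python rendering of its counting logic (the original is a Java applet by John F. Simon Jr.; the names here are mine):

def every_icon(size=32):
    # Enumerate every black-and-white image on a size x size grid, in order.
    # There are 2 ** (size * size) of them: for a 32 x 32 grid roughly 1.8e308,
    # so the loop is perfectly well defined yet will never finish in practice.
    total = 2 ** (size * size)
    n = 0
    while n < total:
        # The binary digits of n pick out one grid: bit i turns cell i on or off.
        yield [(n >> i) & 1 for i in range(size * size)]
        n += 1

On a 2 x 2 grid the enumeration completes in sixteen steps; on the 32 x 32 grid the very same execution, vigorous as it is, outlasts the hardware, the gallery and the planet.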

All of these factors are contingent on the algorithmic procedure, because the algorithmic procedure is contingent on certain factors. The issue then is: can it be said that an algorithm has agency? Does it possess something inherent in its form that is inaccessible to its contingent relations?

My answer is yes, but it is a limited yes: a paradoxical tension between its contingent vicissitudes and its vigorous execution. The next step, then, is to discover what this vigorous execution actually is without sounding disconcertingly vague.

One example to think about: if an algorithmic calculation has a bug that causes it to malfunction, and it is modified so as to achieve the end result and execute properly, is it a different algorithm?
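
To make the question concrete, here is a hypothetical pair – the same summation procedure before and after an off-by-one fix. Whether buggy_sum and fixed_sum are two algorithms, or one algorithm in two states, is exactly what is being asked:

def buggy_sum(n):
    # Intended to sum 1..n, but the off-by-one excludes n itself.
    return sum(range(1, n))      # bug: range stops at n - 1

def fixed_sum(n):
    # Modified so that the execution achieves the intended end result.
    return sum(range(1, n + 1))  # sums 1..n inclusive

assert buggy_sum(4) == 6 and fixed_sum(4) == 10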