The Alchemy of Algorithmic Determinism

An interesting discussion has popped up between Graham, Levi, Chris Vitale and Steven Shaviro on accounting for the genesis and perishing of objects in OOO, and how this ‘accounting’ splits into substance/actual/static entities in comparison to virtual/process/becoming entities.

Graham has made his intentions clear on the over-privileged argument for the virtuality of objects and their becomings. Shaviro makes some very interesting points regarding Whitehead’s view on emergent objects:

“Rather, no list of an actual entity’s qualities can give us the entity, because such a list excludes a crucial dimension: the entity as process, or the way in which it selects, and then organizes or “harmonizes”, those qualities. This added dimension is a process or an action, rather than anything substantial (this is where I diverge somewhat from Graham Harman’s admirable notion of “allure,” as the dimension of an object that is withdrawn from, and in excess of, all its qualities).”

However, Chris Vitale has weighed in on ‘what-it-is’ exactly that proposes the constitution, construction and demise of an object:

“My question is – who are what creates this closure, generates these objects? And perhaps more importantly, from what perspective does this entity or force or whatever view these things, and decide when they happen? For example, when Levi says “closure is acheived”, well, who determines what is closed, and what is open? A semi-permeable cell membrane, closed from some points of view, open from others, semi-open from yet others. Now if you say from the object´s perspective, well, that´s at least saying something, but then one needs to do the work of saying what that means or entails – does the object know it has a perspective, or is this perspective described by a more intelligent entity´s perspective? A whole can or worms unfolds with this approach.”

Just to be clear here: when Levi publishes The Democracy of Objects, I’m sure we’ll all acknowledge his position on the emergent nature of objects. It is interesting, however, that in a later post Levi utilises Bogost’s Unit Operations as a possible future methodology for the generation of units from systems. Systems then emerge from aggregated units, and not just purely the other way round.

Homing in on Bogost’s term “Operation”, he states quite explicitly:

“In short, unit operations produce, they generate a new entity, whereas system operations re-produce, they iterate an already existing pattern or object.”

Although I’m confident that Ian’s position has changed somewhat since he’s busily finishing off Alien Phenomenology, I read the ‘Operation’ part as the subjective approach taken to create new modes of meaning between different systems by counting units out of them, so to speak (he terms this ‘Unit Analysis’). For instance, Bogost spends a number of pages dissecting the Steven Spielberg film ‘The Terminal’ as a set of unit operations and not an insular system. The point being, Bogost is advocating a new form of criticism which privileges the operation of picking units (such as the specific modes of “uncorroborated waiting”) out of the film and combining them with the viewer’s own ontological experience. This is why videogames are unit-operational par excellence.

There isn’t anything here I remotely disagree with; however, what I would propose is the wholly relevant issue of unit or object continuity. Continuity is an apt issue here precisely because it highlights this, frankly, weird transcendental element which keeps units unitary, or keeps an object, well, an object. What is it that unifies parts?

For example, when I play my electric guitar, something weird is happening. A collection of objects which are configured as me somehow relates or fuses to another collection of objects resembling a guitar (amongst others, such as FX pedals, stompboxes, amps and leads), and this creates another unit: “me-playing-guitar”, a unit which relates to another host of units. The point being that certain objects are needed in order for the me-playing-guitar object not only to function, but to provide a whole set of relations that each object alone is not capable of. Consciousness is not a prerequisite for having a self-perspective on being an object; it’s almost like aesthetic attraction (this is what for Harman constitutes allure). Think about what kind of an object you would be if you lost vital objects (think about what you would be if you fused yourself with others).

But the issue within the realm of sealed-off objects is one of continuity (or Shaviro’s “harmonising”). An object can quite clearly lose an enormous number of objects and still continue as a vigorous unit. The added OOO intervention is to suggest that it is not humans who give birth to this continuity, nor is it the result of some underlying monism, nor is it just the chaotic chance of an object’s dependency on relations. For Latour, this constitutes a completely different actor in its own right, but Harman, for example, distances himself here precisely because a thing cannot be the total sum of its relations. Something else must give continuity to the non-relational real object that generates its relations. But here’s the deal-breaker: OOO must allow for object-continuity without pledging allegiance to determinism.

I’ve recently mentioned my fondness for algorithms as an untapped area for OOO; it also links up with a number of points that Bogost makes in Unit Operations. One can find a reply to this issue of continuity in objects in research into computation. Very briefly, I’ll mention the most prominent suggestion circling my thoughts.

You see, continuity and determinism have been a problem for scientists who study complexity as well, particularly when it comes to the Second Law of Thermodynamics. As Wolfram observed in A New Kind of Science, one apparent violation is that, despite the mathematical certainty of entropy, complex systems can still generate order and increase disorder at the same time. Not everything leads to total disorder, as some may believe.

Wolfram has become notorious for proposing a new approach to constructing alternative models that resemble reality. Equations may capture the regularities of stable systems, but as we know, ontology isn’t simple, it’s complex, and mathematics often struggles when it comes to unimaginable complexity.

Wolfram believes that the computational universe has something more to offer theoretical science than just processing mathematical equations. Wolfram has studied sets of systems called cellular automata, and other similar systems. What these systems have in common is that they are composed of a set of discrete steps and a few simple rules. What Wolfram proposes is that every system, no matter how complex, can be represented by simple rules which, through the execution of the program, can create complex and emergent phenomena, such as patterns, shapes or even the phenomenon of continuity. The only simple element of complexity is the endlessly replicated algorithm.
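To make this concrete, here is a minimal sketch of an elementary cellular automaton in Python (my own illustration; Wolfram works in Mathematica, and the function names here are mine). Rule 30 is one of his standard examples: each cell’s next state depends only on itself and its two neighbours, with the whole rule encoded in a single 8-bit number, yet the evolution from a single live cell looks effectively random.

```python
# A minimal elementary cellular automaton, in the spirit of Wolfram's
# A New Kind of Science. Each cell looks at (left, self, right); that
# 3-bit neighbourhood indexes into the 8-bit rule number.

def step(cells, rule=30):
    """One synchronous update; the row wraps around at the edges."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=64, steps=32, rule=30):
    cells = [0] * width
    cells[width // 2] = 1            # a single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    run()
```

A dozen lines of deterministic repetition, and the printout already shows nested triangles, stable edges and a chaotic interior: pattern, shape and a kind of continuity, all from one replicated rule.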

The proposal then, as Wolfram sees it, is that computation violates the Second Law of Thermodynamics. The same set of simple rules can produce not just complexity, but can also, for no reason whatsoever, produce order and continuity. It can also descend back into disorder again, for no reason other than the determination of repeated rules.
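As a hedged illustration of that last point (again my own sketch, not Wolfram’s code or his measure): run the same deterministic update, here Rule 110, another of his standard examples, from a random initial row, and track the zlib-compressed size of each row as a crude proxy for its disorder. The rule never changes, yet the number need not climb steadily, since a periodic background and localized structures can emerge out of the noise.

```python
# Same step function as above, applied to Rule 110 from a random start.
# The compressed size of each row is a rough stand-in for entropy:
# smaller output means a more compressible, more ordered row.

import random
import zlib

def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

random.seed(0)  # any seed will do; fixing it keeps the run repeatable
cells = [random.randint(0, 1) for _ in range(256)]
for t in range(201):
    if t % 20 == 0:
        print(t, len(zlib.compress(bytes(cells))))
    cells = step(cells)
```

Whether the figure drifts up or down at any given step is beside the point; what matters is that a fixed rule does not drive it monotonically toward disorder, which is exactly the tension with the Second Law that Wolfram exploits.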

Now, my problems with this descend directly from OOO, namely:

1.) I’m instantly suspicious of the suggestion that everything can be reduced to one simple rule. It smacks of undermining even more than the materialist attitude.

2.) There is the added complex question of what the algorithm runs on. If, as Wolfram hopes, science can find the one golden rule that explains everything, this cannot but suggest infinite regress. How can an instruction run if it has nothing to run on? Configurability is the key issue here.

3.) I agree in principle that simple instructions can generate unimaginable complexity, but I would add the suggestion that simple rules are not wholly determined. This is connected to the “one rule for all” algorithm problem which haunts the last two problems. There has to be an additional factor which accounts for the generation of objects, and my suspicion harks back to the idea that algorithms are multiple and come in determined pockets. They resist and are contingent to each other. They are more akin to actual situations than actual objects. After all, how can anything new emerge (consistent or complex) if it is determined by systematic rules? There must be an equal level of undetermined contingency.

Like I say, all work in progress, but I still think OOO has a lot to find within the computational universe regardless of my musings.
