Futures Past: Twenty Years of Arts Computing
Matthias Weiss, Leipzig, Germany
Microanalysis as a Means to Mediate Digital Arts1
Keywords: computer art, description, microanalysis
From a German perspective, academically oriented art history has generally ignored the fact that the computer is, and has been, both a tool and a component of art for nearly as long as the machine itself has existed. A reappraisal of this history that attempts to place it in an art historical context is therefore still needed. The perspective shifts when the international art scene, and the many varieties of computer art that closely follow developments in the technological domain, are considered. In this paper, two stimuli are examined as a means to look at computer art in an art historical context. The first attempts to define the topic and to clarify the historicity of the phenomenon in stages; the second examines the role that description (i.e. especially microanalysis) plays, in order to show that close examination facilitates differentiation and makes comparisons between older and newer works possible. Thus the potential for a more profound understanding of computer art is created. In the first section, I examine connections that have been neglected to date; in the second, I discuss two works from two different periods that effectively illustrate the history of computer art. The term ‘computer art’ is used in the accepted sense, referring to the use of digital methods in the production of art works.
In order to be able to understand the phenomenon of computer art, a context is required. The term ‘computer art’ rather than ‘software art’ is used because the former implies a historically integrated factor that permits a comparative investigation of computer art. Using this approach, connections will be established between the latest phenomena, which have achieved great popularity in the field of new media art, and art works from the 1960s and 1970s. By looking at the artistic use of computers from an historical perspective, these connections make it possible to assess its importance and role within art as a discipline. It is also important not to draw categorical distinctions between immersive artificial worlds (or augmented reality projects) generated by computers and ‘software art’ programs. In recent art history, this division has led to a less than fruitful connection between interactive environments and video art which, as a supposedly logical consequence of the history of film, has prescribed the digital as the medium of the future. It is also necessary to distinguish between the different developments and impacts of video art and computer art, although intersections certainly exist.
I refer to computer art as a kind of artistic activity that would not be possible or have any meaning without computers. For example: a specific script that can run on any ordinary computer (and that actually requires it for the desired performance), or a remotely connected installation which generates artificial life forms in a projection room using distant computers over the Internet with local user input data. Both are determined by the use of computer systems and a communication structure, and both would be unthinkable without these components. All of these technologies contribute to meaning.
The history of computer art can be classified into three phases, defined by and dependent on what was technically feasible at the time. In the first phase, mainstream computer art fed back into aesthetics, which in turn developed into two models of non-representational art – abstraction and concretion. This phase ended in the mid 1970s. The output consisted of graphics, along with works such as Videoplace by Myron Krueger.
But this type of work could be considered part of the second phase. As computing capabilities increased and industry, from mechanical engineering to film, discovered simulation, immersive artificial worlds also began to be used for artistic experimentation. This trend was influential in prominent institutions such as the Center for Art and Media Technology in Karlsruhe and Ars Electronica in Linz, as well as in the media arts scene. With the recognition that contemporary physics had called into question the role of philosophy as the primary field for the development of world views, models like Chaos Theory began to be seen as inspirational for the art scene. This motivated artists interested in mathematics and cybernetics, like Karl Gerstner, to create the ‘new image of the world’ with fractals. At that time, the technical arts became institutionalised at institutes of technology, along the lines established at the Center for Advanced Visual Studies (CAVS) at the Massachusetts Institute of Technology (MIT). But the continuing concentration on images – the constant insistence on the production of two-dimensional visualisations on the one hand, and the extremely expensive development of three-dimensional image machines such as CAVEs on the other – caused a schism in computer art and led to the third phase.
While the computer artists of the 1960s were mainly working under the paradigm of an art of precision and traditional imaging, they were rather insensitive to other current forms of art which were dedicated to communicative or politicising actions. Performance work could not be taken up by computer artists who were ‘object-oriented’. For this reason, no direct influence from the artists and pioneers of the technically-oriented arts of the earlier period is detectable on the contemporary computer art scene. It is not surprising that static computer graphics, which largely comprised the computer art of phases one and two, only became part of the system of art in an irregular manner, as for some time art had been trying to change social practices through artistic action. In addition, given that the technology did not make possible what could already be achieved in the analogue world - with its mail, fax and copy machine networks - it is understandable that computer art made no lasting impression on the aesthetics of information. The following section discusses whether there are missing or hidden links and examines examples of the works.
Schotter by Georg Nees
This work by Nees is a portrait-format graphic assembled from twelve columns of twenty-two squares, all with sides of the same length. Read from left to right, as one would read a European language, it shows disorder increasing from top to bottom as one views the graphic. The visible is defined against an ideal order – not the order actually seen in the picture, but an optimal state in which the squares lie along a horizontal line, forming a row in which each one is set precisely beside the next, so that straight lines are formed by the upper and lower edges. This state is not seen in the picture as illustrated here. Row by row, the state of disorder successively increases down to the lower border of the picture. The program creates disorder through the rotation of each square about the point of intersection of its diagonals, and through increasing displacement within the graphic space.
This graphic introduces a number of questions concerning the relationship between an image that is constructed and one that is computed. For this reason, the limited possibility of gaining insight through viewing has to be assessed by comparison with a work of ‘classical’ constructive art. In this way, by means of observation, the intent of the picture, or its inherent logic, can be recreated. This raises the question: what is the actual content of the picture? It could be argued that the picture illustrates the relationship between order and disorder. The rather orthogonal section of ordered squares in rows next to each other (up to row six) can be evaluated as a state of higher order compared to the lower section of the image. We cannot tell by mere viewing, however, exactly which processes are responsible for the increase in disorder. The coordinates and inclinations of the squares could of course be measured, but this would only define a boundary that could not be crossed by inspection. It must be assumed, therefore, that the observer either sees past the sense of the picture or extracts each of its senses visually without having them supported contextually. With additional viewings, a spatial effect is seen: an optical illusion of a gentle turning from the inside out in the centre left and lower right of the image area. A graphical realisation of a mathematical model, coded by means of a formal language, constitutes its context. The question arises as to whether the image is anything above and beyond a specific visualisation randomly generated by a machine.
This raises another important question: is the depiction a picture, a diagram, a technical drawing, or something in between? Successive examination of the image from top to bottom gives the impression of increasing deviation from the system of order (as described above). Upon further examination, however, structures appear even as disorder increases; structures that cross over from the formal context of individual pieces, through their respective positions relative to each other on the surface, to new non-rigid geometrical figures. An interpretation allows us to claim that the condition of increasing disorder lets ordered structures appear within the image, without clearly fixing them into definite geometrical forms. In terms of information theory, this effect can be described by stating that super-symbols are being formed in the region of disorder. It could be described as having dynamic and contingent qualities. The upper portion of the image, however, is static. This leads to the realisation that, by interpreting the results of observation on a higher plane, dependent elements of order become visible within the realm of disorder. This does not occur in the region of the image with higher order: there, it is evident that an additive lining up of squares can in turn lead only to the formation of other squares or rectangles.
Examining the program that actually gave rise to the image reveals that the optical evidence – simultaneous states of order (without the generation of formally divergent super-symbols) with a relatively gradual transition into a disorder that evokes contingent and formally divergent super-symbols – is a feature of the programming. As a result, the role of the random generators that introduce chance into the parameters has to be taken into consideration. Nees writes:
Image 38, Schotter, is produced by invoking the SERIE procedure [...]. The non-parametric procedure QUAD serves to generate the elementary figure which is reproduced multiple times in the composition process controlled by SERIE. QUAD is located in lines 4 through 15 of the generator. This procedure draws squares with sides of constant length but at random locations and different angles. From lines 9 and 10, it can be seen that the position of a single square is influenced by random generator J1, and the angle placement by J2. The successively increasing variation between the relative coordinates P and Q, and the angle position PSI of a given square, is controlled by the counter index I, which is invoked by each call from QUAD (see line 14).
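The procedure Nees describes – an elementary square repeated across the composition, with a counter scaling the random displacement and angle – can be sketched in modern terms. The following Python sketch is an illustrative analogue, not Nees's original generator; the parameter names and the specific scaling are assumptions of this text.

```python
import math
import random

def schotter(columns=12, rows=22, size=10.0, seed=1):
    """Return the corner coordinates of a Schotter-like grid of squares.

    An elementary square is repeated, and a counter (the row index) scales
    the random displacement and rotation, so that disorder grows from the
    top row to the bottom one, as in the graphic described above.
    """
    rng = random.Random(seed)
    squares = []
    for row in range(rows):
        wobble = row / rows  # plays the role of the successively increasing counter
        for col in range(columns):
            # random offsets and angle, scaled by the row counter
            cx = col * size + size / 2 + rng.uniform(-wobble, wobble) * size
            cy = row * size + size / 2 + rng.uniform(-wobble, wobble) * size
            angle = rng.uniform(-wobble, wobble) * math.pi / 2
            half = size / 2
            corners = []
            for dx, dy in ((-half, -half), (half, -half), (half, half), (-half, half)):
                # rotate each corner about the centre of the square
                x = cx + dx * math.cos(angle) - dy * math.sin(angle)
                y = cy + dx * math.sin(angle) + dy * math.cos(angle)
                corners.append((x, y))
            squares.append(corners)
    return squares
```

Plotting the returned polygons reproduces the overall effect: an exactly aligned top row decaying, row by row, into rotated and displaced squares at the bottom.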
It can be concluded that the meaning of an image – the value it adds beyond what the work has as a diagram of a formula – can only be deduced if an integrated investigative model is applied, comprising both observation and investigation of the computational foundations.
In a logically deterministic computer program, this imposes a relationship between the experience of observation and knowledge about the abstraction of a problem. It follows that understanding can be determined neither exclusively from a unilateral investigation of the source code, nor from the examination of the image alone. This is because, in contrast to a composition created in the traditional way – by the visual calculations of an artist in a series of trials, or simply through the creation of an image – an examination of the source code shows that Schotter exists as one of the n possible graphical states of the program. In terms of computer art, this is the key element of this work. At no point is there an integration of the code and the visual, and this is typical of early computer art because other arts – for example concept art and performance art – had yet to be appreciated.
Forkbomb by Alex McLean
Forkbomb (http://runme.org/project/+forkbomb/ 2 November 2005), which Alex McLean wrote in 2001 in the Perl scripting language, is essentially a thirteen-line program that found its way into the art community through transmediale.02, where it won a prize. In describing the function and action of the script, a certain radical quality is apparent, coupled with the claim that this piece of software is art. It is in fact nothing other than an uncontrolled system halt. Through mechanisms that will be described here, the code gradually paralyses the system on which the interpreter runs the script. This occurs through a so-called ‘process’ that branches and launches an avalanche of identical processes. This continues until – depending upon the capabilities of the computer – the system resources are exhausted and a system halt results. Along the way, output is produced as a bit pattern of zeros and ones. On the homepage of transmediale, the following message appears: ‘The pattern in which these data are presented can be said, in one sense, to represent the algorithm of the code, and, in another way, to represent the operating system in which the code is running. The result is an artistic expression of a system under stress.’
In the first line, the characters ‘#!’, known as ‘hash-bang’ or ‘shebang’, instruct the shell through which UNIX systems are accessed to use Perl to execute the code. This line is not executed by the interpreter (in this case Perl) but only tells the operating system that the text in which the code resides is intended for the Perl interpreter. The characters that follow define the standard path to Perl. The ‘-w’ at the end instructs the interpreter to produce warning messages; as a rule, this is used when newly written programs are being tested. The ‘use strict’ pragma tells the interpreter to generate error messages whenever the code is improperly written. The script itself is not something that can be started with a double click on a symbol, as is generally the case under the icon paradigm of a graphical user interface. It is started via a command line, where an integer value is input; this is then passed to the program as the ‘strength’ of the ‘bomb’. This value has to be entered, otherwise the following message appears: ‘Please do not run this script without reading the documentation’ and the program breaks off.
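The start-up behaviour just described can be mirrored in a few lines of Python. This is a hedged analogue for illustration only: the message text is quoted from the script, while the function name is an assumption of this text.

```python
def parse_strength(argv):
    """Refuse to run without an explicit command-line value, as the script does."""
    if len(argv) != 1:
        # mirrors the script's refusal to run without a value
        raise SystemExit(
            "Please do not run this script without reading the documentation")
    return int(argv[0])  # the integer passed as the 'strength' of the 'bomb'
```

Called with a single value such as ['3'], the function returns the integer 3; called with no argument, it stops with the quoted message, just as the script breaks off.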
The ‘die’ function causes the system to carry this out. If the value of the argument, which is given on the command line and passed through the so-called special array ‘@ARGV’, is already zero, the program ends without generating messages. If a value in the form of a positive or negative integer is specified on the command line, the error message is passed over and the code which follows is executed. In the fourth line, the variable ‘strength’ is declared and defined. It has to be prefixed with ‘my’ because the ‘strict’ pragma forces the programmer to observe a rigid syntax; in this way, the system can immediately identify improper declarations. The value of ‘strength’ is taken from the value input on the command line (the special variable ‘$ARGV[0]’). After this, the value that the user has chosen is increased by 1 (line 4). Up to this point, the program has either broken off, after inviting the user to read the documentation, or has increased the strength of the bomb by 1. What follows is the part where the fork process is initiated. The ‘while’ statement initiates a loop if the condition in the adjoining parenthesis (‘not fork’) is true. The fork instruction itself triggers the first fork process. If the system determines that ‘not fork’ is true, the program enters the first loop. The program stops here (‘exit’, line 6) if the variable ‘strength’ has already become 0 (lines 9, 10) through decrementing (indicated by ‘--’). Otherwise, a 0 is sent to the standard output (line 7) and the execution of the program continues from line 8.
Back in line 5, if the system determines that ‘not fork’ is false, the program jumps to line 13 (‘goto [...]’). At this point, the program is instructed to jump back to line 8, where another ‘fork’ is started. If the system comes back with ‘false’, the program again jumps to line 13. If, during the process of decrementing (indicated by ‘--’), the variable ‘strength’ has reached 0 (lines 9, 10) and the statement is ‘true’, the program stops again. Otherwise, the program produces the number 1 as an output and continues in line 13. The script continues to be executed until every process, by means of decrementing, goes to zero. This only occurs, however, if a positive value was input on the command line. If a negative value was entered, the program does not end. This is also the case if the first loop has ended and ‘strength’ (line 6) is still not exactly 0.
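The control flow traced in the last two paragraphs can be made tangible without endangering any machine. The sketch below simulates it cooperatively in Python: a ‘process’ is a plain tuple in a queue, a ‘fork’ merely duplicates a tuple, and the state names (‘outer’ for the first while loop, ‘twist’ for the labelled inner one) are this text's own labels for the structure described above, not McLean's code.

```python
from collections import deque

def simulate(value, max_steps=100000):
    """Safely simulate the fork/decrement logic with data, not real processes."""
    out = []
    # each simulated process is (position, strength); line 4 adds 1 to the input
    procs = deque([("outer", value + 1)])
    steps = 0
    while procs and steps < max_steps:
        steps += 1
        pos, s = procs.popleft()
        if pos == "outer":
            # while (not fork): the child (fork returns 0) enters the loop,
            # the parent falls through to the final goto
            procs.append(("outer-body", s))
            procs.append(("goto", s))
        elif pos == "outer-body":
            s -= 1                      # exit unless --strength
            if s != 0:
                out.append(0)           # print 0
                procs.append(("twist", s))
        elif pos == "twist":
            # twist: while (fork): the parent enters the loop body,
            # the child drops out and re-tests the outer while
            procs.append(("twist-body", s))
            procs.append(("outer", s))
        elif pos == "twist-body":
            s -= 1                      # exit unless --strength
            if s != 0:
                out.append(1)           # print 1
                procs.append(("twist", s))
        elif pos == "goto":
            s -= 1                      # goto 'twist' if --strength
            if s != 0:
                procs.append(("twist", s))
    return out
```

With a positive input such as simulate(2), the queue eventually drains and a finite mix of zeros and ones is returned; with a negative input the simulation, like the script as described, never terminates on its own and is cut off only by the max_steps guard.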
These relationships between the conditions described in the program, together with the increasingly random dynamics produced by the output of 0 or 1, are carried out identically in each offspring process. In general, this means that the script initialises a cascade of loops which, although they follow a programmed logic, use the inherent logic of the system itself in a way that it was not intended to be used. When the program is started, a succession of zeros and/or ones can be seen on the standard output device, which nowadays is usually a monitor screen. From this, the part that the ‘while’ statement has already executed can be recognised. The computer gradually becomes paralysed, and as this happens, the output changes.
The software can also be interpreted as a random generator, although randomness does not fulfil the function here that it had in the work of Nees, for example. In any case, the program can also be understood as displaying the finite nature of the computer – in contrast to the attributes ascribed to it by industry, which has elevated the machine to mythic levels of capability and possibility in its advertisements.
The program is efficiently written and so fulfils the requirements of a ‘normal’ computer program. In the way it works, however, it overturns the paradigm of functioning. If an attempt were made to use the program in a productive context, there would be no more productivity because, most likely, the system would have to be restarted again and again. In this respect, it is something other than that which, by means of norms and other controls, is brought under control, classified as art and, at least in theory, remains controllable. In the digital day-to-day world, it is tempting to compare it to a virus. By placing this piece of code in an artistic context, another arrangement for both the code and its developer becomes appropriate. As a rule, the legal system steps in quickly to safeguard normality: lawsuits are brought against programmers who do not follow the dominant paradigms of the respective programming languages but instead use these languages intentionally for destructive purposes. If the functionality is described metaphorically as a virus, and is interpreted as such, then there is room for discussion. For this, only a limited analysis of the code, such as is undertaken above, is needed. But here the work falls apart into code and effects on the one hand and the context noted above on the other, without any conclusions being drawn concerning possibly significant formalisms or conventional subjects. Any such conclusions would, however, contradict the definition of computer code which, being clear and unambiguous, excludes any semantic relationship in its elements. And yet every higher-level language offers the possibility of semantically charging symbols through arbitrarily named variables.
McLean calls the core variable of the program ‘strength’. As described earlier, a ‘my’ has to stand in front of this variable, since all variables must be declared explicitly. This produces the phrase ‘my strength’. If a lyrical ‘I’ were read into the text, some interpretation would be needed to provide meaning. The label ‘twist’ could be viewed similarly: a word that, in the context of programming, could be chosen arbitrarily, and which forms the anchor point for the ‘goto’ instruction. It does seem, however, that the relationship of these three semantically charged symbols to the formal arrangement of the code is rather arbitrary. The probability that there is a subtext behind what is explicitly stated is therefore small.
Although it is widely understood that computer art has a position in the art world, its role in relation to contemporary art is not quite clear. From a modernist perspective, the two works described above lack a relationship between the code and its effect. It is easy simply to use zeros and ones to represent the raw material of a calculating machine, as Forkbomb does, but this raises questions about the role and necessity of such a device. In the author’s opinion, this is a weak example of representing computing technique metaphorically. The two examples discussed also appear to disregard what could be considered intrinsic elements of the visual arts, such as an awareness of the references of all parts of an artwork and their logical relationship with each other, as well as the historical context.
This conclusion could not be drawn without understanding the code of both pieces, and therefore the author considers it advisable that art historians working with computer arts develop some understanding of the technical elements of such works.
1. This text appears under the Creative Commons Licence ‘Attribution NonCommercial-NoDerivs 2.0 Germany’ http://creativecommons.org/licenses/by-nc-nd/2.0/de/ (2 November 2005).