Archive

Tag Archives: software

An old advisor and friend, Warren Sack, recently gave a lecture at Goldsmiths about a chapter of his current software studies book project.

The “digital convergence” of the last few decades has coerced a number of industries into the business of computers and networks. The institutions of film, television, video, photography, printing, and publishing have succumbed to a “rewriting” in digital format. This rewriting is only possible because of the new, uncanny form that language has taken: the language of computer programming, the language of software. This uncanny form of language makes images, numbers, and languages “equivalents.” Consequently, to write today is a hybrid affair of code and commentary, programs and prose, in which one must tangle with this entanglement of images, numbers, and languages.

Warren Sack is a software designer, media theorist and artist whose work explores theories and designs for online public space and public discussion. His projects include work in Open Source software development, locative media, computer-supported translation, and systems for visualizing and facilitating online discussions. Sack’s work has been exhibited at the ZKM, Karlsruhe; the New Museum, New York; and the Walker Art Center, Minneapolis. He has a Ph.D. from the MIT Media Laboratory and a BA in Computer Science and Psychology from Yale College. He is a Professor of Film & Digital Media at the University of California, Santa Cruz and, for the 2012-2013 academic year, an American Council of Learned Societies Digital Innovation Fellow and a Visiting Professor at the École nationale supérieure des télécommunications (Paris).

This lecture is co-hosted by the Centre for Innovation and Social Process and the Digital Culture Unit, Centre for Cultural Studies, Goldsmiths, University of London.


The computer program known as “Eliza” was written in the mid-1960s (1964–66). The ‘doctor’ script of the Eliza program emulated a therapist, rephrasing statements into questions and creating an illusion of conversation. While the effect was striking at first, a computer that seemed to understand humans, its failings would eventually expose themselves, a process of user experience that would become known as ‘the Eliza effect’. In Noah Wardrip-Fruin’s Expressive Processing (Chapter 2: The Eliza Effect) he argues for a design strategy that doesn’t try to hide its internal processes, but instead makes the reverse engineering of the system into a critical part of ‘game play’. Rather than trying to hide its structure, such an approach puts the internal processes of a system into explicit creative negotiation with its users. I think this idea could work well, particularly for ‘educational’ systems, exploring particular content while understanding its procedural context. But this leads to a wider conversation surrounding the ‘black box’ type relationship most users have with media.
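To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of keyword-and-reassembly substitution the ‘doctor’ script performs: match a keyword pattern, reflect the pronouns, and pour the fragment back into a question template. The patterns and templates are my own illustrative inventions, not Weizenbaum’s original script.

import re

# Illustrative rules only; Weizenbaum's script was far larger and ranked keywords.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

RULES = [
    (re.compile(r"\bi am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bi feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\bmy (.*)", re.I), "Why does your {0} concern you?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the fragment reads back naturally."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    """Rephrase a statement into a question, or fall back to a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1).rstrip(".!?")))
    return "Please go on."

print(respond("I am unhappy with my job"))  # -> Why do you say you are unhappy with your job?

Even these few lines show both sides of the Eliza effect: the output can feel uncannily attentive, and the trick collapses the moment a sentence falls outside the rule set.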

Some versions of Eliza online:
http://www-ai.ijs.si/eliza/eliza.html
http://www.manifestation.com/neurotoys/eliza.php3

Readings:
• Weizenbaum, Joseph. “ELIZA: A computer program for the study of natural language communication between man and machine.” Communications of the ACM, Volume 9, Issue 1 (January 1966). http://portal.acm.org.oca.ucsc.edu/citation.cfm?id=365153.365168&coll=ACM&dl=ACM&CFID=72032383&CFTOKEN=36058992 [If this URL is difficult to access, log onto the library website, either on campus or via the oca.library.ucsc.edu link, then search for “ACM Digital Library” and, once in the library, search for “Weizenbaum.”]
• Suchman, Lucy. Plans and Situated Actions: The Problem of Human-Machine Communication. New York: Cambridge University Press, 1987, pages 64-67. [Course Reader]
• Winograd, Terry. “Abstract: The ethics of machines which mimic people.” ACM Annual Conference/Annual Meeting, Proceedings of the 1984 annual conference of the ACM on The fifth generation challenge. http://portal.acm.org.oca.ucsc.edu/citation.cfm?id=800171.809648&coll=ACM&dl=ACM&CFID=72032383&CFTOKEN=36058992 [If you are having problems accessing this, please follow the same instructions listed above for access to the Weizenbaum article.]
• Wardrip-Fruin, Noah. Expressive Processing: Digital Fictions, Computer Games, and Software Studies. Cambridge, Mass.: MIT Press, 2009. (Chapter 2: The Eliza Effect)
• Dumit, Joseph. “Artificial Participation: An Interview with Warren Sack.” Zeroing in on the Year 2000: The Final Edition (Late Editions, 8), George E. Marcus, editor. Chicago: University of Chicago Press, 2000. http://danm.ucsc.edu/~wsack/Writings/wsack-jdumit-interview.pdf

In the excerpts that we read this week, the notion of reality as a computer has taken a much larger position than I would have imagined. Most bluntly, this view of the world takes a kind of genotype/phenotype approach to information, knowledge, and reality: the universe as a kind of processor, processing information at an atomic level. This matrix-simulation notion of reality certainly isn’t new, but the previous uses I’ve come across seem to be brought up as a kind of metaphor for human understanding, whereas Wolfram seems to be pushing for an objective scientific claim. The whole venture harks back to Plato’s theory of forms or ideas, the belief that there are abstract forms beyond perceivable experience that dictate the order of reality. Building off these claims, Sack uses his prose to unpack the development of arithmetic from the Platonic era onward, positioning arithmetic as central to the negotiation of ideas and information.
In this stride for certainty, the “Computational Regime” emerges, reducing the “ontological requirements to a bare minimum. Rather than an initial premise (such as God, an originary Logos, or the axioms of Euclidean geometry) out of which multiple entailments spin, computation requires only an elementary distinction between something and nothing (one and zero) and a small set of logical operations. … Consequently, the Regime of Computation provides no foundations for truth and requires none, other than the minimalist ontological assumptions necessary to set the system running.” Overlooking how this relates to the tradition of Western metaphysics (something to possibly come back to later), I find striking parallels between the computational regime’s ontology of something and nothing and the traditions of thought that have emerged from Buddhism. While Buddhism as a whole is not short on meta-narratives that could be attributed to what the computational regime calls universal “logical operations,” Zen Buddhism takes a much more deconstructed view, one that is grounded in experiential knowledge yet anchored to the belief in something and nothing. In its (anti)intellectual approach of foregrounding the knowledge that reality is happening right now, it prioritizes process, a strategy that could be helpful in rethinking the emergence of the computational regime. I find it helpful because in much of the texts and arguments we’ve been exploring there is an almost messianic sense of time, an undertone of salvation in the form of technological determinism. While from a rhetorical standpoint it seems like some kind of changing of the guard, its ultimate direction is arguably the same… I guess from this position the computational regime operates in a space of historical continuity.

What does all this mean from a more pragmatic perspective? I’m wary of claims, such as Manovich might make, that there is “only one medium”. While the ubiquity of computers has saturated cultural production, it seems far from being hegemonic in its totality. I would argue for a computational materialism that describes systems of computation as overlapping and in coordination with each other, but still culturally hackable. My interest and concern is that media, in both a narrow and an expanded view, now not only symbolically expresses the policies we uphold but embodies them in its very materiality, and I wonder what the civic and cultural implications of such a development might be. As a seemingly banal example, we might consider a traffic light: in the previous order of media as a source of information, the light informs us that we should stop, and if we want to adhere to the law, we do. In the emerging reality, the traffic light would simply stop us, regardless of will. My interest is not in defending a romanticized notion of individualism and choice, but in preserving the humanizing agency of a system whose totality includes the culture hackers, artists, and dropouts.
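For a sense of the kind of system Wolfram’s claims rest on, here is a small sketch, in Python, of an elementary cellular automaton: a row of ones and zeros (“something and nothing”) evolving under a single local rule. Rule 30 is one of the rules Wolfram highlights; everything else here is just my illustration.

def step(cells, rule=30):
    """Apply an elementary cellular automaton rule to one row of 0/1 cells (edges wrap)."""
    n = len(cells)
    new = []
    for i in range(n):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right   # a value from 0 to 7
        new.append((rule >> neighbourhood) & 1)               # look up that bit of the rule number
    return new

# Start from a single "something" in a field of "nothing" and print a few generations.
row = [0] * 31
row[15] = 1
for _ in range(16):
    print("".join("#" if c else " " for c in row))
    row = step(row)

The point of the example is how little it assumes: one distinction (0/1) and one lookup table, yet the pattern it prints is already irregular enough to motivate Wolfram’s “crucial experiment.”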

Readings:
• Sack, Warren. The Software Arts. Unpublished manuscript. (Chapter 1: Arithmetic) [Course Reader]
• Hayles, Katherine. My Mother Was a Computer: Digital Subjects and Literary Texts.  Chicago: University of Chicago Press, 2005. (Chapter 1: Intermediation: Textuality and the Regime of Computation) [Course Reader]
• Wolfram, Stephen. A New Kind of Science. Champaign, IL: Wolfram Media, 2002. (Chapter 1: The Foundations for a New Kind of Science &  Chapter 2: The Crucial Experiment) http://www.wolframscience.com/nksonline/toc.html

Seymour Papert’s book Mindstorms examines the notion of an object of learning, a tool to be used in the active creation/learning process of children. Pulling from his own story of falling in love with automobiles and transmission gears as a child, Papert explores the idea of thinking with an object and, in the process, discovering complex mathematical ideas. Arguably most universal in its application is the revelation that one’s “relationship” to the material at hand is the driving force in actualizing the event of knowledge. Invested in the notion that (educational) context is the guiding principle that establishes a student’s relationship to a given material, and therefore his or her learning, Papert reiterates the findings and strategies of what is now a long tradition of “progressive” educators and educational philosophers, such as John Dewey, Rudolf Steiner, and many others. What distinguishes Papert from these other thinkers is his critical vision of the computer’s role in contemporary education: rather than “the computer… program[ming] the child. … the child programs the computer.” In this way, the computer and the LOGO programming environment, specially designed for children, act as an object to be thought through, a space for creative and critical exploration and development. Papert makes a strong case for the general educational benefits of grasping complex mathematical ideas in the context/process of play and creation. He situates this kind of work within a pedagogy that is directional rather than instructional; that is to say, when uncertainty and confusion arise, students are shown how to think through the issues and debug the problems, rather than simply shown how to do it correctly.
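As a small taste of what “the child programs the computer” looks like in practice, here is a restaging of the classic turtle-geometry exercise, written with Python’s standard-library turtle module (a descendant of the LOGO turtle) rather than in LOGO itself; the exercise is the familiar one, but the code is my own sketch.

import turtle

def polygon(t, sides, length):
    """Draw a regular polygon: the exterior turns must add up to 360 degrees,
    a fact a child can discover by walking the shape out bodily."""
    for _ in range(sides):
        t.forward(length)
        t.right(360 / sides)

t = turtle.Turtle()
polygon(t, 4, 100)    # a square
polygon(t, 36, 20)    # many small turns approximate a circle
turtle.done()

The mathematics (exterior angles, limits, approximation) is not delivered as instruction; it surfaces as a side effect of debugging why the shape did or did not close.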

Some 30 years after Mindstorms’ 1980 publication, the ubiquitous presence of computers in middle-class households is quite clear, yet it seems the ambitions of Papert and his LOGO colleagues in making the inherent flexibility of the computer transparent and accessible did not flourish the way they would have hoped. Certainly such flexibility is slowly, little by little, entering the popular understanding of computers and changing the educational terrain. But what Papert hints at is nothing short of a radical shift in how we think of human literacy. Not a literacy that is solely tethered to computers as a technology, so much as a form of literacy that is intimately conscious of what computers embody so well: process.

[Image: cash register]

Software Studies takes multiple approaches to understanding and enacting the notion that knowledge is currently being rewritten, from prose to software. The idea that knowledge is being rewritten is simultaneously fairly obvious (considering how we’re reading this right now) and yet very contentious in its implications. Crowning such contention is the term “digital ideology”, shorthand for saying that computers are affecting the way we think, our unspoken assumptions, our beliefs, the very edge of conceptual possibility. (Google isn’t a company, it’s the air we breathe!) While “digital ideology” seems to hint at something radically profound, it runs the risk of either being a tautology or, worse yet, an esoteric claim on reality. But hell, you have to start somewhere… and it seems Software Studies is off to a decent start. (Watch the videos from the software studies conference.)

So what does Software Studies actually look like? On one hand it seems to be about humanizing software, both in its perception/reception and literacy, and in the knowledge it embodies. To the latter point, Software Studies seems invested in the belief that disciplines that previously did not explicitly use software should start to. An example of this might be Lev Manovich’s “cultural analytics”, sketched below. Imagine the operations room for NASA or some military venture, but instead of monitoring a trip to the moon, or impending doom, you’re examining the evolving color palette of fashion magazines, or the rhetoric of late-night talk shows. Anything that doesn’t have a digital trail needs to start producing one. It evokes what Kevin Kelly calls the “network of things” that makes up urban space as it becomes increasingly saturated with more and more computation. It’s a bit of a leap, but I see a parallel between this vision of the “network of things” and the cultural analytics that follow, and Clay Shirky’s description of media as having shifted from a source of information to a site of action. Media becomes the folding and organizing of place.

This leads me to my final point: I think for the digital ideology of software studies to ground itself, it needs to look outside the explicit domain of software to find its effects. How do seemingly disparate cultural forces produce a constellation of thought that orients us today?
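To make “cultural analytics” a little more concrete, here is a toy sketch, in Python with the Pillow imaging library, that reduces each image in a hypothetical folder of scanned magazine covers to its average colour, the sort of reduction from which an evolving palette could be plotted over time. It is my illustration of the general idea, not Manovich’s actual tooling, and the covers/ folder is an assumed stand-in for whatever digitized archive is at hand.

from pathlib import Path
from PIL import Image  # Pillow

def average_colour(path):
    """Mean R, G, B of an image, after shrinking it to keep things fast."""
    img = Image.open(path).convert("RGB").resize((64, 64))
    pixels = list(img.getdata())
    n = len(pixels)
    return tuple(sum(channel) / n for channel in zip(*pixels))

covers = sorted(Path("covers/").glob("*.jpg"))   # hypothetical folder of scanned covers
for cover in covers:
    r, g, b = average_colour(cover)
    print(f"{cover.name}: ({r:.0f}, {g:.0f}, {b:.0f})")

The interesting questions start after a script like this runs: what does it mean to treat a decade of covers as a curve of colour, and what gets lost in producing that digital trail?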