eScholarship
Open Access Publications from the University of California

Software is a layer of control and communication that permeates contemporary culture: the engine that drives cyberculture, new media, games, and digital art, as well as the work of businesses and militaries, typesetters and DJs. Platforms are the layers of hardware and software relationships that enable and constrain software expressions. Software studies and platform studies are related interdisciplinary fields of research that approach these topics both as technical artifacts and from the perspectives of the social sciences, humanities, and arts. What are the histories, cultures, and aesthetics of software and platforms? More specifically, for the papers collected here, to investigate the logics of visualization, simulation, and representation in contemporary digital arts and culture is to engage in software and platform studies.

Critical engagements with software and platforms are always interrelated, although the emphasis can change significantly. Platform studies sometimes situates itself within a "levels" model of new media, in which the lowest level, platform, underpins code, form/function, interface, and reception/operation. The platform exerts creative constraint on the levels it supports through design specification. Consequently, platform studies investigates the relationships between the hardware and software design of computing systems and the creative works produced on those systems. These investigations can be bottom up or top down, running from the history of a specification to the consequences for enabled productions, or from a particular digital work back down to the platform that provides context and constraint for its unique implementation.

Software studies investigates the interrelated questions of how software is implicated in culture at the macro-level and how software signifies at the micro-level. Different investigations have emphasized the data, execution, or source code aspects of software. The Cultural Analytics research agenda asks how software society (big data, network culture, etc.) functions as a whole, and seeks answers to large-scale cultural questions through the application of data-driven methodologies such as exploration, mining, and visualization. Expressive Processing investigates what specific software processes express, both in the concepts embodied in their designs and in the intellectual histories of their algorithms. Critical Code Studies emphasizes how those articulations are made and circulate in their pre-compiled state: the aesthetics of the programmer and the significance of the keyword or the comment.

Current approaches to software studies and platform studies vary richly in their initial emphases, but all tend to lead into a shared interdisciplinary commitment to holistic study that brings cultural and historical vantage points to hardware and software, data and execution, production and reception.

Theme Leaders:
Jeremy Douglass, PhD. Researcher in Software Studies at UCSD.
jeremydouglass@gmail.com
Noah Wardrip-Fruin, Assistant Professor, Computer Science Department, University of California, Santa Cruz.
nwf@ucsc.edu

Programming and Fold

(2009)

Programming offers arguably the greatest opportunity for creative investment in the computer. But, given the mechanistic relationship between source code and executable and the highly constrained formalisms of programming, it is hard to see where creativity would find a place within the rigor and determinism of code. This paper places this question of creativity in the context of a broader problem of creativity in the digital generally, then identifies an ontological structure, called a fold or edge, that marks the creative moment of digital interaction. In programming, the edge appears in the object, recognizable in object-oriented programming but common to every creative innovation in coding technique.

System Intentionality and the Artificial Intelligence Hermeneutic Network: the Role of Intentional Vocabulary

(2009)

Computer systems that are designed explicitly to exhibit intentionality embody a phenomenon of increasing cultural importance. In typical discourse about artificial intelligence (AI) systems, system intentionality is often seen as a technical and ontological property of a program, resulting from its underlying algorithms and knowledge engineering. Influenced by hermeneutic approaches to text analysis and drawing from the areas of actor-network theory and philosophy of mind, this paper proposes a humanistic framework for analysis of AI systems stating that system intentionality is narrated and interpreted by its human creators and users. We pay special attention to the discursive strategies embedded in source code and technical literature of software systems that include such narration and interpretation. Finally, we demonstrate the utility of our theory with a close reading of an AI system, Hofstadter and Mitchell's Copycat.

Scholarly Civilization: Utilizing 4X Gaming as a Framework for Humanities Digital Media

(2009)

While much academic attention has been given to first-person shooters and puzzle games, large-scale Civilization-style games (known colloquially as 4X games) have received comparatively scant attention. The map-based nature of these games, with their emphasis on socio-political, socio-environmental, cultural, and military activity, makes them particularly well suited as a medium for expressing historical knowledge. However, adapting a medium designed to entertain players into a scholarly medium for the analysis of historical processes requires a thorough understanding of the structure of 4X games and of the manner in which historical processes are represented in a map-based space. This paper analyzes the spatial and processual systems in FreeCiv and the Civilization series of games. Specifically, an examination of container-oriented, tile-based maps, contrasted with modern historical GIS based on point and polygon data, reveals best practices from the entertainment gaming community that may prove highly suitable for adoption in the digital humanities. The creation of tiled maps using defined environmental and social terrain and unit types may also make commons-based, peer-collaborative creation of new humanities digital media accessible to non-coding scholars. The defined interactions between game objects, such as cities, irrigated farmland, and military units, provide a second entry point for scholars, who through critique of existing game dynamics can define a more historically accurate system subject to peer review. As a digital humanities medium, such a system would also prove suitable for the integration of multi-paradigm modeling techniques.

Disrupting Heteronormative Codes: When Cylons in Slash Goggles Ogle AnnaKournikova

(2009)

In this paper, I outline the heteronormative characteristics of computer code using a Critical Code Studies approach. First, I introduce Zach Blas’ transCoder: Queer Programming Antilanguage. With this scripting bible, I interpret Julie Levin Russo’s Slash Goggles algorithm, fictional software for exploring variant romantic pair possibilities and sexual subtexts (or slashtexts) on the remake of the television program “Battlestar Galactica.” Out of these tools, I develop a framework for viewing the heteronormative code in other functioning algorithms. Applying the tools to the 2000-2001 AnnaKournikova Visual Basic Script worm, I interrogate the viral qualities of heterosocial norms. This paper also includes discussions of encryption, fan culture, and Cylons.

Translation (is) Not Localization: Language in Gaming

(2009)

In this paper, I elaborate on the difference between the concepts of localization and translation and on how they relate to the movement, distribution, and understanding of different versions of the Square-Enix game Kingdom Hearts.

Fake Bit: Imitation and Limitation

(2009)

A small but growing trend in video game development uses the “obsolete” graphics and sound of 1980s-era, 8-bit microcomputers to create “fake 8-bit” games on today’s hardware platforms. This paper explores the trend by looking at a specific case study, the platform-adventure game La-Mulana, which was inspired by the Japanese MSX computer platform. Discussion includes the specific aesthetic traits the game adopts (as well as ignores), and the 8-bit technological structures that caused them in their original 1980s MSX incarnation. The role of technology in shaping aesthetics, and the persistence of such effects beyond the lifetime of the originating technologies, is considered as a more general “retro media” phenomenon.

The Other Software

(2009)

This paper considers the absence of the human actor, specifically the programmer, from Friedrich Kittler’s analysis of software in his essay “There Is No Software.” By focusing too intently on the machine and its specific, material existence, Kittler removes the human user / operator / writer from his analysis of software. Thus, he has no choice but to interpret the layered chains of language, assembler, and opcode, and of WordPerfect, DOS, and BIOS, both ending in an essentializing reduction to voltages, as an attempt to obfuscate the material operations of the machine in the name of intellectual property.

By both reasserting the presence of the programmer within Kittler’s structure, and attacking the conception of code-as-text, this essay offers an alternate description of the being of software, one which emphasizes not just the execution of code on the machine, but also the programmer’s role as reader and writer of code.

Software Studies in action: Open Source and Free Software in Brazil

(2009)

This article tells the singular story of the growth of Free Software and Open Source in Brazil, encouraged by the government and opposed by the world's largest software enterprise, through the experiments of a country in search of its democratic and independent identity.

Rules for Role Play in Virtual Game Worlds Case Study: The Pataphysic Institute

(2009)

The Pataphysic Institute (PI) is a prototype MMORPG developed in order to experiment with game mechanics that enhance the playing experience. In this paper, aspects of the prototype's design that support players' expression of consistent, interesting characters are reported. The design of these features builds upon results of user tests of a previous iteration of the prototype. The gameplay in PI is based on the semi-autonomous agent architecture the Mind Module.

Shaping Stories and Building Worlds on Interactive Fiction Platforms

(2009)

Adventure game development systems are platforms from the developer’s perspective. This paper investigates several subtle differences between these platforms, focusing on two systems for interactive fiction development. We consider how these platform differences may have influenced authors as they developed systems for simulation and storytelling. Through close readings of Dan Shiovitz’s Bad Machine (1998), written in TADS 2, and Emily Short’s Savoir-Faire (2002), written in Inform 6, we discuss how these two interactive fiction authoring systems may have influenced the structure of simulated story worlds that were built in them. We extend this comparative approach to larger sets of games, looking at interactive wordplay and the presentation of information within the story. In concluding, we describe how critics, scholars, and developers may be able to more usefully consider the platform level in discussions of games, electronic literature, and digital art.