Code Drift: Essays in Critical Digital Studies
Aesthetics structure experiences in formal perceptual ways and provide interpretive tools, at times constructing meaning. Given that sensory expression — most often visual, sometimes sonic or tactile — is the only means to perceive many contemporary data sets, aesthetics are fundamental, not additive, to the emerging field of Data Visualization. Data comprise a set of organized measurements created by instruments that calibrate quantifiable qualities of an original source (natural, artificial or recombinant). “Data” are both an abstraction and a mediation of actual phenomena. Whitelaw describes data as “a set of measurements extracted from the flux of the real [that] are abstract, blank, meaningless,” which become information only when they are placed into an interpretive context. This requires building algorithms that allow for selection, extraction, organization, analysis and presentation. Visualizations allow the comparison of a set of values, the illustration of relationships between data points, the indication of the parts of a system and the relationship and interaction of these parts, the creation and interpretation of maps, the tracking of change over time and the analysis of text. Designers (including programmers and animators) and artists create the interfaces that allow interaction with data. The resulting images create a bridge between the empirical world and the viewer, revealing patterns in the source data that evoke interpretation.
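The selection–extraction–organization–presentation sequence described above can be made concrete in a toy sketch. None of this corresponds to any particular visualization system; all function names and values are illustrative, showing only how raw measurements become "information" through successive algorithmic framings.

```python
# A toy data-to-information pipeline: raw measurements become
# "information" only once selected, organized and summarized
# for presentation. All names and values here are illustrative.

def select(readings, lo, hi):
    """Keep only measurements inside the instrument's calibrated range."""
    return [r for r in readings if lo <= r <= hi]

def organize(readings, bin_width):
    """Group measurements into bins -- the structure imposed on raw data."""
    bins = {}
    for r in readings:
        key = int(r // bin_width) * bin_width
        bins[key] = bins.get(key, 0) + 1
    return dict(sorted(bins.items()))

def present(bins):
    """Render the bins as a bare-bones textual bar chart."""
    return [f"{k:>4}: {'#' * n}" for k, n in bins.items()]

raw = [3.2, 14.7, 5.1, 99.0, 7.8, 6.3, 12.1, -2.0, 8.9]
info = present(organize(select(raw, 0, 20), 5))
```

Even in this minimal form, every stage embeds a design decision — the calibrated range, the bin width, the choice of a bar chart — which is precisely the mediation the essay describes.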
The growth of cloud computing and visual search engines, together with the penetration of fast broadband and wireless networks, has created an ideal environment for an explosion of capacity in Data Visualization. Data Visualization is growing exponentially in scientific, social science and even humanities research, as well as in commercial applications such as social media. Our expectations of the intelligibility and accessibility of data have shifted with the growth of databases and search technologies. This situation creates an expanding demand for tools and expressions that facilitate finding information and analysis. In the last decade, strict separations between scientific visualization and information visualization have eroded. Firstly, entirely new practices that cross the boundaries of information and science, such as genomics and bioinformatics, have developed. These fields rely on Data Visualization to excavate structures in large-scale data sets. They have no photo-realist technologies to fall back on. Secondly, as Lev Manovich has remarked, Data Visualization allows representations to be mapped onto each other, to compare and overlay vastly different data sets, permitting the representation of infinite permutations and complexity.
Data Visualizations make visible “features that exist across multiple dimensions…. [W]e discover unimagined effects, and we challenge imagined ones.” A successful visualization may cross multiple boundaries and provide different perspectives on the same data set. Data Visualization offers the possibility of fundamentally new insights, a moment of understanding that reveals hidden processes or complex relationships, breaks through existing barriers and sharpens the focus on knowledge while providing visual pleasure. The field of Data Visualization contains aesthetic practices that draw from art, design, computer and information science and the sciences. This may mean that the intentions of the maker, whether artist, designer, computer scientist or team, do not always align with the uses, interpretations or applications of Data Visualization. Humans use intentional tools in unanticipated ways.
2. What is at Stake?
A set of related questions is at play in considering the aesthetics of Data Visualization:
- Should Data Visualization represent the materiality behind the source data in as realistic a form as possible? How are visualizations tied back to Nature or vast systems like the Internet? Given the presence of the artist or designer in realizing the visualization, can such tools be objective? Should they be?
- How does cognitive science knowledge shape Data Visualization aesthetics? Do efficiency and simplicity trump beauty as the keys to insight, or can usability and beauty combine? Are Data Visualizations primarily utilities or can Data Visualization be both utilities and forms of expression?
- How do the aesthetics of interaction and immersion impact Data Visualization?
- How may aesthetic stances, embedded in visual culture and art, add value to Data Visualization? How do the design methodologies that artists and designers choose affect aesthetics?
- To what extent have common tropes emerged within the field of Data Visualization? What can we learn from the similarity of metaphors and aesthetic parallels found in the work of designers and artists, and in designer- or computer-scientist-built visualizations?
3. Return to Realism?
In an article titled “The Petabyte Age” (2008), Wired Magazine declared “The End of Science.” Editor Chris Anderson argued that scientists must abandon hypothesis and experimentation; instead, science must move entirely to data analysis derived from “big data” sets that lie beyond the natural limits of human comprehension and require “dimensionally agnostic statistics.” The more scientists learn about physics and biology, the harder it becomes to create testable models. Instead, Anderson argues, researchers should search for patterns and relate these to the data’s source in order to build a fresh analysis, working with pattern recognition and theorization from the abstract back to the material world. Anderson’s view proposes big data as the means to rescue science from subjectivity and speculation, offering a twenty-first-century version of realism and objectivity in which data stand in for the real.
Anderson’s imperative requires an assessment of the ways that even the most literal visualizations are tied to formal decisions of representation, as well as to other mediations. The indexical, at times illustrative, qualities of scientific visualizations are most apparent in representations of natural structures or phenomena that draw from or map onto photographic imagery. Figure 1 below illustrates a 3D model of a virus structure in which six different proteins interact in complex ways. The data were captured using electron microscopy. The visualization is built in Chimera, a molecular-graphics package written in C++ and Python. Scientists had already discovered the symmetrical structure of the virus and had faint images of its form — the visualization is built, through 3D modeling, on top of the image that the microscope captures. The visualization extends scientific knowledge by allowing the user to manipulate the virus and thus understand how its multiple layers might interact and penetrate cell walls. Layering, colour and the interaction experience were key aspects of the aesthetics that designers brought into play.
The images in Figure 2 represent 3D vector-field texture-based volumetric flow visualizations of tornados.  While the application is specific to storms, the algorithm, data structure and metaphor are of value to multiple disciplines that study flow, such as mechanics, physics, meteorology, medicine and geology.  Advances in texture-mapping graphics capabilities make these images possible and combine with depth sorting, illumination, haloing and colour attenuation to enhance perception and depth.  The images are aesthetically compelling — drawn by the computer from the data points and able to capture the dynamics of a storm.
Figure 3 is a 3D visualization of a solar storm that occurred on Halloween 2003. NASA’s visualization laboratory created the image by combining a model of the earth with “daily-averaged particle flux data from the SAMPEX satellite by propagating the particle flux values along field lines of a simple magnetic dipole.”  By making flux and field lines visible, it is meant to illustrate the ways that energy particles from the solar storm transformed the structure of the Earth’s radiation belts.  Design decisions are apparent in this quote, “The color-scale on the cross section is violet for low flux and white for high flux. The translucent gray arcs represent the field lines of the Earth’s dipole field.” 
The visualizations above bear a close resemblance to what science has previously discovered or represented through photographic media, yet each seeks to extend that knowledge in speculative ways by adding animated visualizations built from data. Each image requires design decisions that move the image into a field of visual analysis. These images operate within a tradition of scientific description as tools for deductive reasoning. Still, data are not the same as their source, even when data represent the empirical world. It is clear that the interpretation of data introduces another level of mediation. This condition creates limits and opportunities. There are degrees of possible aesthetic relationships between the source of the data, their structure and their visual expression. Other examples in this essay will indicate the ways in which the naturalism of scientific visualization becomes an aesthetic source of metaphors that artists and designers mobilize within information visualization. This appropriation may represent neo-realist practices when nature and the Internet are conflated.
What point of view are we seeing in these examples? Is it that of the phenomena studied, the scientist, the algorithm or the designer? The tenets of scientific realism propose that there is a universal shared world of perception that makes up common sense, and that discovery manifests this world through shared understanding. The aim of science is to describe reality accurately. The empirical world, including its invisible dimensions and its description through analysis, thus becomes of paramount importance. The rationalist roots of scientific realism suggest that perception leads directly to action, and presuppose the alignment of reality and image. Scientists such as Pierre Boulanger et al. argue that it is necessary to keep the metaphor close to the look (whether observed or photographic) of the data’s source. Visualization then becomes the means to make the invisible visible. Yet the aesthetics of scientific realism may create limits to imagination, tying visualization too tightly to “analytic reasoning,” which could fail to deploy the transformative power of visual experience. A further challenge to realism occurs when the source cannot be seen, only measured and then imagined.
Debates regarding scientific realism have included some recognition that the observer — whether an instrument or a human — has an impact on the means of expressing the data for an experiment. Barad, a physicist and a philosopher of science, observes in Meeting the Universe Halfway that there is not a one-to-one relationship between the ontology of the world and its discovery, as is claimed by “the traditional realist.”  The “common-sense” view of Nature is continually entangled in the theoretical and experimental practices that mark its description, as is human society.  Yet science still makes meaning of the sometimes-invisible material world, and we must pay equal attention to empirical research, as it produces ontological knowledge. These observations are equally true when considering large-scale information systems that are hybrid forms of physics, engineering, human and machine interaction.
The field of Data Visualization is compelling because it carries the traces of the empirical world and its instruments of measurement and representation. Case studies of visualizations — some with the same data set — underscore how data sets are shaped by prior decisions, such as the instruments chosen to collect the data, the structure of the database, source and sampling methods and software choices.  These are elements that implicate data and put a mediating frame around notions of objectivity.  The use of literal metaphors in Data Visualization may suggest a level of accuracy that is impossible to achieve. After all, a visualization of an Internet packet is many degrees of separation away from the conditions of production of that packet and of its producers.
Data Visualizations carry with them the aesthetics and assumptions of their contributing technologies. They are discoveries in their own right, creating new kinds of experiences. Data Visualization technologies absorb the aesthetics of 2D and 3D graphics and animation systems, with their formal styles and malleability. In the past decade a new set of graphics tools — some viable for online visualization, others only available through supercomputer networks or in the laboratory — have become available, as either open source (such as Processing) or proprietary software. The more finished the tool, the more styles and capacities are embedded in it. Artists, designers and computer scientists continue to build and adapt tools to their specific needs. Mash-up techniques and technologies originating in the DJ, alternative music and VJ (video disc jockey) worlds transcode data from more than one source within single integrated tools, and search engines allow the mixing of what were once discrete structural approaches to data types. Each new source of data adds its aesthetic properties and limits. GPS, Bluetooth, geo-tagging, localization and personalization capacities in smart mobile devices permit a rapid growth of data from business, social and advertising applications, with layering techniques emerging, particularly with the growth of augmented reality. Whitelaw dubs these practices “data-bending,” as they layer contexts and can allow for the emergence of new imagery or meanings. This observation is fundamental to the ways that Data Visualizations can be used as instruments to permit new insights.
In this world of multivariate data sets, metaphors need not be literally tied to data structures to be meaningful, as the variable interpretations of Internet traffic below illustrate. Each speaks differently to the nature of Internet flow, with its packets and protocols at work. Lisa Jevbratt built generative algorithms in which chance intervened in the gathering of Internet data sets to produce unexpected, abstract and beautiful forms of expression, seeking the disclosure of hidden patterns in the Internet. The early project 1:1 is meant to collect and display “the addresses of every Web site in the world and interfaces through which to view and use the database.” The title suggests the elusiveness of 1:1 within the vastness of the Internet. Jevbratt paints with pixels, a common sampling mechanism. In this case the samples return as images of network topologies that, once flattened, appear as abstract paintings. Five interfaces (Migration, Hierarchical, Every, Random, Excursion) visualize the databases and provide a means to navigate the Web, aiming to instill a sense of the Web as an entity. Using the pictorial frame, she places sequences of frames (queries) around data points, as illustrated in the interface shown in Figure 4. The piece is simultaneously deconstructive and expressive.
The resulting Infome software deploys web crawlers that automatically trawl web sites, collecting data and building visualizations. Infome was meant to illustrate the ways that the Internet is integrated into human activity, or, “the DNA of contemporary behaviour.”  Artists in exhibitions such as LifeLike were commissioned to create settings for the crawlers’ behaviours, define their sources and choose the forms the visualization would take — pixels, lines, or other mappings their crawlers would use. Jevbratt states, “The data set resulting from many revisits will have repetitions talking about the structure of the sites, revealing its topology.”  Data Beautiful, in Figure 5, makes use of Infome to build such a map of Internet traffic.
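Infome's actual crawler and mappings are not documented here, but the logic Jevbratt describes — revisits accumulating into repetitions that reveal a site's topology — can be sketched in miniature. Everything below is a hypothetical illustration: simulated visit logs stand in for live crawling, and revisit counts are mapped to greyscale pixel values.

```python
# A toy sketch of an Infome-style mapping: repeated "visits" to pages
# are tallied, and each page's revisit count is translated into a
# pixel brightness. All names and data here are illustrative.

from collections import Counter

def crawl(visit_log):
    """Tally how often each URL appears across simulated crawler revisits."""
    return Counter(visit_log)

def to_pixels(counts, max_count):
    """Map each page's revisit count to a greyscale value (0-255)."""
    return {url: min(255, round(255 * n / max_count))
            for url, n in counts.items()}

log = ["/home", "/about", "/home", "/home", "/news"]
pixels = to_pixels(crawl(log), max_count=3)
# "/home" (3 visits) renders brightest; "/about" and "/news" dimmest.
```

The aesthetic decision lives in `to_pixels`: choosing brightness, hue or line weight as the carrier of repetition is exactly the kind of mapping the commissioned artists defined for their crawlers.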
In his artist’s statement, Lev Manovich enthuses: “Information tools and information interfaces are the future of aesthetics,” expressing the underlying infrastructure of thousands of “gaussian curves,” “packets,” and “matrices.” Infome combines the underlying structure of the Internet and its technologies with human patterns of use to provide data sets that result in radically different visual forms. Arijana Kajfes’ Search? q=fool and Search?=Moon, also created using Infome, are made up of twenty-two cards that contain colour values taken from “each visited link of 1000 requested links for each of the major arcana of the Tarot” as an “endless beckoning to a possibility.” Kajfes ties human online activity back to an ancient metaphor, as shown in Figure 6.
Calling on the tropes of scientific visualization, other designers and artists have relied upon cosmological or biological imagery to depict the Internet. Opte is an interactive site that bridges an Internet analysis tool, art and popular culture. It was created to,
…make a visual representation of a space that is very much one-dimensional, a metaphysical universe. The data represented and collected here serves a multitude of purposes: Modeling the Internet, analyzing wasted IP space, IP space distribution, detecting the result of natural disasters, weather, war, and aesthetics/art. 
It allowed users to create maps of Internet usage. By measuring the geography of deployment, Opte was capable of producing maps of the Internet with as many as five million edges on a daily basis, representing “Class A allocation of IP space to different registrars in the world.”  Barrett Lyon’s Opte provided a socio-economic mapping of the Internet with infinite layers. It is depicted in Figure 7.
Julian Oliver, the Select Parks games designer, created Packet Garden, illustrated in Figure 8, as a commission for Arnolfini Gallery. The work has a luscious, organic feel. It monitors users’ Internet usage, the servers they visit, their upload and download practices and the protocols that their usage requires. Functioning as a “network diary,” it grows an automatic garden that the user unconsciously cultivates.
Formal strategies and metaphors differ in these five examples — from painterly abstraction, to pixelation, to the light rays of a constellation, to gardening with seed packets — each providing the viewer with discrete readings while representing the underlying structures of the Internet. Each offers a different model of interactivity, whether passive viewing, building one’s own visualizations, adding one’s own data or flowing data through the metaphor. Gordon Kindlmann proposes that the very power of Data Visualization is that objective and subjective views cohere, inspiring new insights. These works prove that very point.
Science itself contains variant views of reality and its analysis is contradictory and chaotic, with different worlds — episteme and ontology — side by side. New trends in science acknowledge phenomenology, complexity theory and emergence. There is recognition that complex systems are difficult to predict and represent, as scientists such as Kaye Mason, Jorg Denzinger and Sheelagh Carpendale argue.  The strength, not the weakness, of Data Visualization is its ability to use algorithms to present emergent properties and different points of view. Realist, static notions of common sense fail to comprehend the value that even disciplinary difference brings, driving rather towards homogenization and “group-think,” an end-game defined by John N. Bray et al., with consequent reduction in problem-solving capabilities.  These contradictory views within science allow elasticity in aesthetics and provide fertile ground for artists and designers who choose to collaborate with scientists.
4. Utility and Beauty in Data Visualization
Given that Data Visualization can assist fundamental discovery, or influence social policy and economics, it is no surprise that the application of Data Visualization has in the past been motivated primarily by teleological analyses (willful thinking and predictable outcomes), with regard to both the goals of human activity and the ways that machines or tools can serve them. Visualizations are understood as utilities, translating data into meaningful communication that can represent reality. Edward Tufte proposes that Data Visualizations are “complex ideas communicated with clarity, precision and efficiency.” Perceptions about realism, common sense and the ways discovery and insight occur have a direct impact on notions of beauty in Data Visualization. Despite examples of aesthetically demanding yet instrumental work, a belief in realism and objectivity leads some scientists to equate attractiveness with subjectivity or illegibility. Ben Mathews assigns aesthetics to functionality and ease of interface use; rapid comprehension is the goal of this design aesthetic. Simplicity is closely aligned with Occam’s razor, or lex parsimoniae — the mathematical and scientific view that the simplest solution is the best and most common-sense one. The resulting aesthetics are biased towards the symmetrical and highly legible, with a spare Modernist look.
The acceptance of beauty in scientific and informatics imagery differs between generations and types of science and designers. As visualization of the truly imperceptible nano-technology world proceeds, image-making becomes more generous. The publication SEED: the Future of Science represents theoretical and applied science and often features material drawn from art and science (ArtSci) collaboration. SEED foregrounds vivid illustrations and commissions breathtaking visualizations. Andrew Vande Moere’s argument should put to rest calls to segregate beauty and utility. He maintains the blog infosthetics.com, one of the key sites for debates about the practice of Data Visualization. He argues for lush images: “The best works are those where the aesthetics help people understand the data, where they’re almost telling a story.” Beautiful visualizations compel not only experts, but also the public. Vande Moere is convinced that consumers shown an effective visualization of energy-wastage data will adopt energy-efficient practices. Legibility, instrumentality and beauty need not be discordant.
Some currents of thought in art and design argue that the Data Visualization practices of the two fields should be separated. Caroline Ziemkiewicz and Robert Kosara propose a differentiation between “pragmatic” Data Visualization, which allows efficient reading of data, and “artistic” Data Visualization, which uses data in abstract or metaphorical ways. Kosara feels that creative interpretations can “hurt perception” when fast analysis is needed, yet can result in “sublime” or “contemplative” experiences at other times. Mitchell Whitelaw argues that artists should not allow their Data Visualizations to become designs, that is, an “aestheticized (and perhaps functionally impaired) form of scientific Data Visualization.” These positions are unfortunate because they legislate a separation between teleological use-value and intrinsic aesthetic value. Extracting meaning and insight from these representations of data requires powerful aesthetics that balance emotion (such as awe), contemplation and deep analysis.
Ben Fry creates Data Visualizations that double as utilities enabling scientific investigation and as art works. Genome Valence, in Figure 9, visualizes biological data and builds structures and relationships. The software makes use of the BLAST algorithm, used most frequently for genome searches. It draws the genome sequence with a ribbon of text that moves through the sequence, selecting its correct letters. It was exhibited in the Whitney Biennial.
In the 2007 project Aligning Humans and Animals, a Data Visualization created from the Mammalian Genome Project at the Broad Institute, sequences of human and other mammalian DNA are aligned across a browser that indicates the evolutionary distance from each animal to the human. It is illustrated in Figure 10 below and was printed in SEED Magazine.
These examples suggest that the context impacts the ways that Data Visualizations are used and read as much as the image itself. The material functions of context and situatedness are core understandings in the field of visual and new media art.
Studies such as those of HCI (human-computer interaction) scholars Noam Tractinsky, A. S. Katz and D. Ikar demonstrate that users pay greater attention to beautiful images and that usability and beauty are viable companions. For example, Jane Prophet’s 2002 work Cell was created with mathematician Mark d’Inverno, adult stem-cell geneticist Neil Theise, computer scientist Rob Saunders and curator Peter Ride. A still is illustrated in Figure 11.
Cell is visually arresting software built to facilitate Theise’s breakthrough research, demonstrate relationships between previously invisible phenomena, test a series of mathematical and programming challenges and result in an art work.  Prophet notes the ways that collaboration was premised on the notion, “that artists can ‘imagine’ scientific and mathematical theories and thereby influence the development of scientific, mathematical, and computer science research and their associated aesthetics.”  Theise’s results “revised understandings of human liver microanatomy which, in turn, led directly to identification of possible liver stem cell niches and the marrow-to-liver regeneration pathway.” His collaboration with Prophet, and her revealing visualizations, led Theise to a new interest in theoretical biology and complexity theory.  At the same time, Prophet, who is a visual artist, continually reminded the team and her audiences of the value that beauty brought to this discovery process.
Prophet’s work segues into aesthetic discourses about the sublime and the uncanny. Art and literature define the sublime — whether nature or immense artificial systems — as the threatening unknown that cannot be fully grasped by human understanding. Sublime imagery seeks transcendence, elevating the everyday to godliness. “Raw” data stand in for nature (red in tooth and claw) and nature is extended to the vast information web that constitutes the Internet and digital information. Data can be perceived as primary material — not produced — concrete and objective, rather than contingent and relational.
Some artists, such as Lisa Jevbratt or Barrett Lyon, shown earlier, describe emergent properties and systems as an evolutionary living force. Jevbratt argues that genetic code melded with computer code signals a new sublime or unknowable, uncanny beauty. She intertwines the materiality of data with programming (coding) as a material and conjuring practice. She says,
To write code is to create reality. It could be likened with the production of artificial DNA, of oligonucleotides — a process where life is written. Or it could be seen as a more obviously physical act of generating and moving around material, an act that has dimensionality, which is nonlinear. 
Coding does act as a means of bringing a virtual world into being through the manipulation of mathematics (and its aesthetics) as manifested through data points and computation. The study of data as a material with distinct properties (mathematical and indexical) must not throw away the constructivist wisdom that has allowed an analysis of the intertwined relationship between knowledge and its mode of production. While human intervention is required to produce meaning from the originating data (e.g. weather patterns, plant growth or mobile phone use), the transformation process should not return to romantic notions of alchemy, effected only by a cognoscenti of programmers, (artists) and designers. The notion of an unconscious and shared “natural” aesthetic is a problematic construction, as any survey of contemporary international art practice quickly suggests, for art is bound by differentiation. In this view perception is relational and contextual, constructed through the complex intertwining of object, maker and viewer. These arguments require a located maker and viewer and militate against totalizing notions of beauty. Historical references to “nature,” its relationship to culture and its various past expressions, whether domestic chic or formalism, can serve as a double entendre, reminding us of the tension between the ontological and the epistemic.
For some artists, an attraction to Data Visualization stems from the challenge of excavating hidden patterns, structures and emergent beauty from the obliqueness of a data set, at times reconnecting these with the social or political conditions of their production. In his 2002 “Data Visualization as New Abstraction and Anti-Sublime,” Lev Manovich argues that visualizations of data by artists may create synthetic meaning rather than support mystification. Manovich indicates links between early Modernist abstraction and contemporary artists’ Data Visualization. However, the complexity and form of the structures that artists disclose have changed since the Modernist era, as have the conditions of belief — skepticism characterizes art today, not early twentieth-century optimism and essentialism. The formal properties of the database are lateral and associative. The database privileges the paradigm (perception of the structure, or theorization) over a narrative hierarchy.
Understandings of how to treat data as a material play out in the making of visualizations. Two distinct approaches to design arise, representing bottom-up and top-down processes. Edward Tufte argues that Data Visualization requires choosing data sets that are of value to the researcher, mining the data, creating a structure for the data, analyzing that data set to find meaningful ways to represent it, analyzing patterns, translating the analysis through aesthetic representation, refining the representation to better communicate, and creating means of manipulating the data. In Tufte’s view, data enunciate their own structures. There is no base case with data; it is inductive reasoning that pulls out knowledge. Through this process data find form, and sometimes also find metaphor or narrative. This may be viewed as data naturalism or structuralism, bearing a truth-to-materials approach, or, in working with large-scale data sets representing phenomena that cannot be viewed, data-driven design.
Ben Fry proposes a procedure that begins with narrative or story form.  He argues that the designer must start not with the data set but with the empirical question asked by the researcher. Fry then works his way back to data. He considers the nature of the data to be obtained, finds data to fit the question and parses them to provide a structural fit for their meaning, then orders them into categories and filters out all but the data of interest. This approach maintains the role of the scientist in producing theory (a base case), illustrating, testing and deducing. It also offers an opportunity for metaphor, design variation and the recognition of multiple interpretations of the same data set by different disciplines. Both approaches need comparative testing to see how each impacts discovery in the fields where they are applied. In both instances a challenge for artists and designers is to sustain a constructivist understanding of imagery while openly exploring the indexical properties of data.
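The question-first procedure attributed to Fry above can be sketched schematically: begin with the researcher's question, then acquire, parse, filter and order the data before representing them. This is a hypothetical miniature, not Fry's own code; the function names, the hard-coded data and the sample question are all illustrative.

```python
# A schematic of a question-first pipeline: the empirical question
# comes first, and the data are worked backwards from it.
# All names, data and the question itself are illustrative.

def acquire(question):
    """Stand-in for finding data that fit the question (here, hard-coded)."""
    return "cat,4\ndog,4\nsparrow,2\nspider,8"

def parse(raw):
    """Give the raw text a structure that matches its meaning."""
    rows = [line.split(",") for line in raw.splitlines()]
    return [(name, int(legs)) for name, legs in rows]

def filter_of_interest(records, predicate):
    """Discard everything except the data the question asks about."""
    return [r for r in records if predicate(r)]

def order(records):
    """Arrange the remaining records into categories for representation."""
    return sorted(records, key=lambda r: r[1])

question = "Which animals walk on four legs?"
result = order(filter_of_interest(parse(acquire(question)),
                                  lambda r: r[1] == 4))
# result == [("cat", 4), ("dog", 4)]
```

Note the contrast with a data-first (Tufte-style) process: here the predicate inside `filter_of_interest` is derived from the question before any data are examined, preserving the researcher's theory as the pipeline's base case.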
5. Context and Cognitive Science
Cognitive Science and Data Visualization have a closely linked history (now emerging into a new field of visual analytics). There are a set of challenges and contributions in the application of cognitive science to Data Visualization. Cognitive science brings tools to the understanding of cognition and the function of the brain, and further to the mediation between brain and machine that evokes visualizations. Cognitive science analyzes the differences and similarities in visual and textual cognition. It divides over the bearing that cultural difference may bring to universal tools or to specific visualizations. Cognitive science may need to grapple with the move to collaborative subjectivities and away from individual consciousness.
Data Visualization requires both an awareness of cognitive aspects of human visual apprehension, such as colour theory, and the need to make the visualization meaningful to a user’s context. In a much-quoted statement, Edward Tufte describes graphical excellence as “that which gives the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space.”  Ware proposes that Data Visualization is the scientific study of “distributed cognition”  between the pattern mechanisms of the human brain and the algorithms that map data to the computer, connecting human cognition, computer memory and its related algorithms and the physical actions of the user.  Indeed, successful design requires attention to the physiology of brain, hand and eye. However, these formulae describe a mechanism at work in the perception of visualizations while saying little about the ways that human experience differs from the machine’s, encompassing non-linear as well as inductive processes. Unfortunately, because of the focus on treating Data Visualizations primarily as utilities, much cognitive science research in the field studies techniques of performance enhancement, that is, legibility and speed, rather than breakthrough discovery or the play of poetics and insight. 
Ideas about nature, reality, culture and common sense play out in the field of Cognitive Science. Like scientific realism, Cognitive Science has gravitated towards a Kantian notion of “common sense,” which encompasses logic, morality and aesthetics.  Immanuel Kant promotes logic, equating it with purposefulness, and demotes aesthetic judgment as mere taste. Given how important the facilitation of insight through Data Visualization is, the simplification of imagination to efficient perception is a problem. Of more value may be Kant’s proposal that aesthetics are a transaction between the artist, the object and the audience, suggesting that the viewer completes the image. This is a process wherein an embodied subject is in constant formation, in a state of “momentariness” akin to Gilles Deleuze’s notion of becoming, allowing insight and awe. 
Valuable lessons from Cognitive Science can help designers and artists understand the differences between reading and viewing, and the ways that visuals can allow pattern recognition while text can act to lock down meaning and context in visualizations. These lie in parallel to current theories about the image that reside within visual culture studies, locating aspects of cognition outside of conscious grasp. Ware provides useful signposts by distinguishing linguistic expression, conditioned by the “‘ifs,’ ‘ands’ and ‘buts’ of natural language,” from visual language, which is relational and conditioned by pattern analysis: “‘connected,’ ‘inside,’ ‘outside,’ or ‘part of.’”  A cautionary note is required here as well, for poetics makes use of language to create patterns, and display fonts such as Forte, Bauhaus 93 or Matura indicate that stylistic signifiers can overwhelm the content they carry. These boundaries further dissolve in the popular field of text visualization, where semantic and social networking relationships are discovered through visual and textual patterns. An example is Chat Circles.  Users choose a colour when logging on and their name is placed next to their circle. Individual activity is shown by changes in the size, intensity and colour of users’ circles on the screen, with text appearing in a bubble and conversation trees showing the topical and subject-line history of a chat. Participants’ circles are bright when they post, with intensity fading for non-active users or lurkers. Using the metaphor of sound, where proximity brings focus to hearing, the participant approaches a topic in order to enter it. A series of screens from Chat Circles are shown in Figure 12 below.
Chat Circles provides a model for a highly interactive and visually appealing space.
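The behaviour described above, circles brightening when a participant posts and fading toward lurking while idle, can be modelled in a few lines. This is a hypothetical sketch rather than the actual Chat Circles implementation; the decay rate and the size formula are invented for illustration:

```python
# An illustrative model (not the actual Chat Circles code) of how a
# participant's circle might brighten on posting and fade while idle.

class Circle:
    def __init__(self, name, colour):
        self.name = name
        self.colour = colour
        self.intensity = 0.0   # 0 = lurker, 1 = just posted
        self.size = 10.0       # grows with the length of the last post

    def post(self, text):
        self.intensity = 1.0
        self.size = 10.0 + min(len(text), 200) / 10.0

    def tick(self):
        # Fade each time step, so non-active users dim toward lurking.
        self.intensity *= 0.8

alice = Circle("alice", "#e63946")
alice.post("hello everyone")
for _ in range(5):
    alice.tick()
print(round(alice.intensity, 3))
```

Even this toy version shows the design choice at work: activity is mapped onto perceptual variables (brightness, size) rather than listed as text.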
Equally problematic is the tendency of many twentieth-century Cognitive Scientists to universalize perception and cognition. Contrary research from other strains of cognitive science suggests that context and culture affect perception, and that viewers have different experiences of what makes the same Data Visualization effective. Rather than adopting a normative notion of cognition, Francisco J. Varela, Evan Thompson and Eleanor Rosch draw on evolutionary biology to reject notions of fitness and optimal adaptation. They adopt a “proscriptive model” in which diversity is “woven into the basic constraint of maintaining a continuous lineage” and “the evolutionary process both shapes and is shaped by the coupling with the environment.” 
Learning and difference play key roles. Varela, Thompson and Rosch show that because understandings are culturally learned, categories, such as colour perception, are not assumed to be objective; hence, “lexical classifications of colour can affect subjective judgments of similarity.”  This formulation links perception and aesthetic categories together. Such an approach to cognitive science requires a mix of intrinsic and extrinsic factors in understanding the mind and allows a better understanding of cultural diversity. Sensory cognition remains of critical importance in forming judgments, aligning with the need for aesthetics in the field of Data Visualization that take these processes into account. Providing different users with varied metaphors, even shifting colour templates in the interface, can allow perception and analysis of the visualization.
Even when taking diversity into account, cognitive science primarily focuses on individual perception, rather than on the emergence of hybrid group experiences and collective identities as a result of the new sociality produced by Internet communication. Warren Sack states that “aesthetics for the Internet needs to concentrate on producing the means for visualizing and understanding how social and semantic relationships intertwine and communities and common sense emerge.”  He observes that new identities overcome cultural difference, although difference is the starting point. Perhaps it is more accurate to state that, rather than a new universality, new particular and contingent identities form.
Visualization systems that represent collaborative efforts or discourses require an aesthetic that allows the emergence of common and collectively constructed experiences and identities. It is logical that designs with high degrees of interactivity would facilitate the creation of new identities or “intersubjectivities,” a term coined by Vilém Flusser for conjunctures in which identities conjoin productively. Figure 13 provides two prototypes from CodeZebraOS, an interactive conversation visualization tool. A neural network extracts and averages behaviours within and between texts.  Graphics show relationships between topical chat postings, assigning an emotional tone to each thread and topic.
6. Interactivity and Immersion
Earlier examples have demonstrated degrees of interactivity in Data Visualizations. Interactivity appears to be an important part of cognitive processes, of learning by doing, of engaging the body through navigation. The third space that Bruno Latour describes between subject, object and technology is the site of “interactivity, intelligence and creativity.”  Ron Burnett offers the explanation that part of the power of the “third space” of technology-mediated experience for the participant is the opportunity to gain agency by learning the system and aggregating knowledge through play. The same may be true of gaining visual understanding while navigating data sets. This leads to an aesthetics that allows users to exert agency through learning a system and even to adapt and change outcomes.
There are different levels of interactivity within digital media and within Data Visualizations. Some Data Visualizations simply provide navigation capacity, such as the ability to click on or mouse over material that the user chooses. For example, They Rule allows the viewer to choose the specific sub-set of data that will be visualized.  It indicates connections between individuals on corporate boards in America and responds to additional data added by participants. Other designs offer the ability to interact with others while viewing or navigating. Yet other forms of interactivity privilege the impact of the information flowing through the site; in this instance data act as an agent — interactivity is the flow of data issuing from a stock market feed, a geological phenomenon or a conversation.
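The node-and-link structure underlying a project like They Rule can be suggested with a toy data set. In this hypothetical sketch (all company and director names are invented, and this is not the project’s code), companies become nodes and an edge is drawn wherever two boards share a director:

```python
# A hypothetical sketch of the node-and-link data behind a They Rule-style
# view: companies linked when they share a board member (names invented).

boards = {
    "Acme Corp":  {"J. Smith", "A. Chen", "R. Patel"},
    "Globex Inc": {"A. Chen", "M. Okafor"},
    "Initech":    {"R. Patel", "M. Okafor", "L. Garcia"},
}

# Build links: one edge for every pair of boards with a director in common.
links = []
names = sorted(boards)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        shared = boards[a] & boards[b]
        if shared:
            links.append((a, b, sorted(shared)))

for a, b, who in links:
    print(f"{a} -- {b} via {', '.join(who)}")
```

The visualization layer then lets a viewer pick a node and expand only the sub-set of edges they want to see, which is the interactive choice the essay describes.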
Interactivity and related cognitive processes imply a time-based experience. Navigating 2D and 3D visualizations often requires rapt attention. Building on Deleuze’s writings on cinema, Hansen argues for an aesthetics that is appropriate to the temporal experiences of digital media.  Digital media create opportunities for humans to experience time and space in ways that stretch and extend their existing physical apparatus.  Data Visualizations of large and multi-dimensional data files occur on 3D screens and at times in 3D CAVE environments. These are full body experiences in which the user navigates data in real time, performing discovery simultaneously or with retrospective thought. Aesthetics is mediated between the body and its object in a continual flow or “becoming.” 
Data Visualization can also occur as an illustrative sidebar to highly interactive social media activity. Social media companies commission visualizations to allow users to catalogue their resources and better understand and organize their relationships with others. Social media by definition favours user generated content, sometimes integrating it with various media assets. Fidg’t is shown in Figure 14 below. 
The Fidg’t interface grows Alife-style floral clusters of metadata tags into a bubble map. Last Forward, shown in Figure 15 below, lets users create tag-maps of Last.fm and their networks of relationships and choices. Its structure is similar to Fidg’t’s, but it has a literal and indexical aesthetic and hence is less visually pleasing. Both of these tools help users to share their assets and preferences with their social network.
Fernanda B. Viégas and Martin Wattenberg created Many Eyes  in 2006 in order to popularize the use of Data Visualization and provide a tool-kit for building visualizations. They hoped for at least three uses of Data Visualization: to interpret textual data, to analyze complex objects and to use visualizations to initiate “social data exploration.”  Many Eyes is a highly interactive site where participants can add their own data and they or other participants create visualizations from that data using a set of given templates. Users then export their visualizations to their social media sites.
Mitchell Whitelaw observes that Data Visualization “turns towards immersion and sensation; it emphasizes openness and intuition, rather than the extraction of value or meaning.”  This suggests the segregation of discovery from experiences of the body. As Data Visualization tools develop and as big data sets are produced and accessible, an aesthetic that favors highly interactive and even immersive applications that engage body and mind is emerging — one that allows rational analysis, predicated on affective experience.
7. Artists’ Data Visualization
There are a number of analyses and practices that artists have brought to Data Visualization, such as structuralism (including in relation to language and technology), context creation and context specificity, data deconstruction, artists’ software and tool-making and narrative and abstraction. The notion of context as an aesthetic frame is relevant here. In Context Providers, Margot Lovejoy, Christiane Paul and Victoria Vesna designate the ways that artists define their practice as creating the context or structure (such as tools) for others to add content to.  Data Visualization tools can be understood as contexts that allow users to input their own data. Digital design strategies often derive from the context of use rather than from a data set, with data chosen to fit the researcher’s needs and the interface designed for their culture of use. Such context variability reinforces the subjective and mediated nature of Data Visualization practices.
Some artists, seeking the “essence” of nature, materials and social practices, have chosen to create structuralist art and technology works that disclose and analyze their source materials. There is a through line in traditions of minimalist sculpture and structuralist filmmaking, video art and graphic art. In the digital age, when software is often the technology underlying works of art, the truth-to-materials strategy can also lead to the creation of tools or combinations of tools, software, platforms and systems by artists.  These practices link to an aesthetics of abstraction, as illustrated earlier in Lisa Jevbratt’s work and in recent visualizations by Martin Wattenberg and Fernanda B. Viégas, shown in Figure 16, that use an algorithm to extract “peak” colour patterns, in this instance from luxury-brand magazine fashion and design spreads, illustrating principles of colour, placement and focus in graphic design.
Bruno Latour  and John Law and John Hassard  describe technologies as invisible non-human actors, affecting the performance of a social network or process. Early new-media art works favoured deconstructive interventions such as cracking or breaking technologies, or building broken interfaces, deploying Bertolt Brecht’s notion of “alienation effects.”  Visualization is an alternative, compelling strategy for some artists, a means to excavate technological structures that hold hidden hierarchies of power.  In Data Visualization formalism and politicized deconstruction merge by creating visualizations that reveal socio-political relationships within the data. Other artists become tool makers in order to get close to the materiality of data. Others offer critiques of the aesthetic norms of scientific and information visualization.
Simon Pope and Matthew Fuller (1995) created Web Stalker, one of the first tools to crawl the Web and build a visual diagram of hidden relationships between domains and their hierarchical ordering, discrediting any notion of search engine neutrality.  Its form is now common to many Data Visualization tools in social media. The Web Stalker is illustrated in Figure 17.
In The Secret Lives of Numbers (2002, 2008), shown in Figure 18, Golan Levin and his collaborators Jonathan Feinberg, Shelly Wynecoop and Martin Wattenberg seek an understanding of which numbers recur more often than others “in order to determine the relative popularity of every integer between 0 and one million”, and to consider why this takes place — finding links to the functioning of human memory, social rituals and the structure of commerce. In the face of our society’s belief in the objectivity and power of mathematics, Levin instead argues for the subjectivity of numbers — and by implication — data, stating,
Humanity’s fascination with numbers is ancient and complex. Our present relationship with numbers reveals both a highly developed tool and a highly developed user, working together to measure, create, and predict both ourselves and the world around us. But like every symbiotic couple, the tool we would like to believe is separate from us (and thus objective) is actually an intricate reflection of our thoughts, interests, and capabilities. 
Levin and his collaborators draw on Edward Tufte (2001) and Colin Ware’s (2004) rules of simplicity of display to comment on the aesthetics and practices of scientific visualization, and at the same time develop a malleable, beautiful and interactive visualization made up of data sets pulled from a wide range of search engines over a five-year period.
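The project’s core counting move, tallying how often each integer occurs in a body of text, can be suggested with a toy corpus. This is not Levin’s method, which drew on search-engine results gathered over five years; the sketch below simply counts integer tokens in an invented sample string:

```python
# Not Levin's actual method: a toy tally of integer "popularity" in a text
# corpus, echoing the project's count of occurrences per integer.

import re
from collections import Counter

corpus = """In 1984 the top 10 lists grew; by 2001 the number 7
appeared in 7 of 10 surveyed titles, and 100 more followed."""

counts = Counter(int(tok) for tok in re.findall(r"\b\d+\b", corpus))
for number, n in counts.most_common(3):
    print(number, n)
```

Even at this scale the tally is shaped by human habits (round numbers, famous years), which is the subjectivity of numbers that Levin points to.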
Else/Where Mapping New Cartographies of Networks and Territories, a compilation by Janet Abrams and Peter Hall, includes an anthology of artists’ geographic mapping projects.  There are many artists’ projects that map corporate and military power relationships. They Rule provides a node-and-link graphical overview of corporate links, members of boards of directors and their social networks.  A Thousand Points of Light, a visualization by Naeem Mohaiemen in The Disappeared in America Project by the Visible Collective/Dan-Bergman, is an animated map of mass detentions that occurred in the United States after September 11, 2001, providing information about the detainees and their countries of origin.  Viewers can update the map with their own data. Such attempts to enforce transparency onto techno-culture and offer an overt critique of power relationships may be described as data deconstruction.
Stock market feeds have been a fecund source for projects such as Joshua Portway and Lise Autogena’s Stock Market Planetarium depicted in Figure 19.  This elegant and ironic installation plays off the scientific trope and information metaphor of cosmology visualizations, suggesting a new astrological universe of corporations and their stocks, as artificial life creatures that mutate, propagate and die in the market, feeding off of its movements and making graphical transitions, clumping and influencing the weight of the depicted universe.
One of the most dynamic growth areas of Data Visualization is text visualization, whether the massive quantities of scientific texts, social media output, chat, or descriptive metadata. As they did with other media, visual artists began to treat language and text as material in the twentieth century, continuing nineteenth-century artists’ fascination with literature.  In the 1980s, artists applied structural semiotic tools to the visual image, a set of practices that are mirrored in data-mining text analysis tools. Artists with an interest in linguistics and conceptualism now turn to Data Visualization as a digital trajectory to linguistic intervention, semiotics and conceptualism.
Temporal structures define how text-based relationships emerge in the Internet, with synchronous and asynchronous experiences providing very different feelings, intimacies and forms of consciousness. These pile on top of each other in layers, allowing social relationships and expressions to feel like a thick texture of condensed time. Some participatory works by artists pin down and focus this endless movement. Mary Flanagan (2004) created applets, such as Phage, that users download onto their computer.  Phage psychoanalyzes the user’s hard drive over the course of a week, revealing the user’s obsessions and encouraging awareness of how subjects are constructed through their data. Victoria Vesna’s NoTime screensaver builds an aesthetic around the search for information on the Internet.  It is a contributory work in which multiple participants create and compare profiles with human and non-human agents. Vesna makes use of agent-based technologies to create emergent behaviours that feed on the user’s identity. This bot redraws patterns for each participant as transactions occur in their Internet accounts. The relative autonomy of each agent reveals the routes and relationships of transactions, normally invisible to users.
We Feel Fine, by Jonathan Harris and Sep Kamvar, bears an interest in affective expression and uses the measurement of text data to find it.  We Feel Fine builds emotional portraits of specific online populations by extracting expressions of feelings from Weblogs. The project provides six movements (like a symphony), driven by statistical analysis and data aggregation, and then reshaped by users’ paths through the data. Feelings accumulate in mounds on the screen, quivering when the mouse-cursor passes over. The site is poignant and amusing. It is shown in Figure 20.
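The core extraction behind We Feel Fine, harvesting the word that follows “feel” or “feeling” in blog posts, can be approximated with a regular expression. The sample posts below are invented, and the real system’s blog crawler and statistical analysis are far richer; this is only a sketch of the extraction step:

```python
# A hedged sketch of We Feel Fine's central move: harvesting the word that
# follows "feel"/"feeling" in blog text (the real system crawls weblogs).

import re

posts = [
    "Long day at work. I feel exhausted but oddly hopeful.",
    "New album out today!",
    "I am feeling lucky about tomorrow's interview.",
]

FEELING = re.compile(r"\b(?:feel|feeling)\b\s+(\w+)", re.IGNORECASE)

feelings = [m.group(1).lower() for post in posts for m in FEELING.finditer(post)]
print(feelings)
```

Aggregated over thousands of posts, such extracted terms become the quivering mounds of feeling that the project displays.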
Other artists are drawn beyond structural analysis to poetics.
Data Visualization becomes a means to write concrete poetry. Brad Paley (2003) created Textarc, a tool that processes a text.  Key words are quantified and brought to the foreground. It has been applied to literature, bodies of conference data, calendars and other corpora. Stephanie Posavec explores differences “in writing style between authors of modern classics” through her project Writing without Words.  An example is shown in Figure 21.
Posavec parses text in an expressive and poetic manner to create works such as Sentence Drawings and Sentence Length as part of her series.
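The quantification step that Textarc performs, counting key words so that the most frequent can be drawn in the foreground, can be sketched simply. This is not Paley’s code; the text sample, stop-word list and size mapping are all invented for illustration:

```python
# A simple word-frequency pass in the spirit of Textarc (not Paley's code):
# quantify key words so the most frequent can be drawn in the foreground.

import re
from collections import Counter

text = ("The whale surfaced. The sea swallowed the whale, "
        "and the sea kept its secret.")
STOP = {"the", "and", "its", "a", "of"}

words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOP]
freq = Counter(words)

# Map frequency to a font size, as a foregrounding cue might.
sizes = {w: 10 + 6 * n for w, n in freq.items()}
print(freq.most_common(2), sizes["whale"])
```

Mapping a count onto a visual variable such as type size is the minimal gesture that turns a concordance into a visualization.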
As Fernanda B. Viégas and Martin Wattenberg have done for social media, artists have created tools that permit individuals and groups to input their own data to visualize with highly interactive tools, building a sense of agency by discovering patterns within data sets and aligning them to other data subjects. Alexander Galloway developed Data Visualization tools  in keeping with his belief that “software art” and open-source activities provide examples of “counter protocol practices.”  With the Radical Software Group, he created CarnivorePE, inspired by the FBI’s surveillance of the Internet with Carnivore, or DCS1000.  Galloway reverse-engineered the FBI software and created a Data Visualization capability for “artistic” users, with “new functionality including: artist-made diagnostic clients, remote access, full subject targeting, full data targeting, volume buffering, and transport-protocol filtering.”  The surveillance tool called CarnivorePE is “a software application that listens to all Internet traffic (email, Web surfing, etc.) on a specific local network.” 
For artists, a critique from the sidelines is not adequate. The perspectives that artists bring to Data Visualization may permit new realizations about the original phenomena being represented. Despite some artists’ intentions to the contrary, there appears to be a potential for instrumentalist interpretations even in Data Visualizations that are expressive abstractions of data. Artists have challenged formal strategies and tropes (graphs, charts, simple tree maps) as they emerge, bringing composite imagery, natural references, references to art-historical practices and strong aesthetic cues into play. Rather than stepping away from cognitive science and studies, artists bring training in aesthetics. They may push back against reductionist formal assumptions while successfully using visual skills with roots in aesthetic debates and practices. Artists work with data as a material — looking for underlying patterns and finding an expression, or looking at context and imposing narrative or form on their structure. Given artists’ facility with language and context analysis, their cross-disciplinary collaborations should result in enhanced visual literacy within the field of Data Visualization.
8. The Emergence of Conventions
As the Data Visualization field developed, a set of formal techniques and conventions emerged linking data structures and metaphors.  Even to arrive at the first stage of Data Visualization requires decisions about data extraction structures and metaphor. Correlations between art, design and computer science (programmer-created) metaphors and styles can be seen in the examples in this article. Many Data Visualizations effectively mobilized visual language and form or used gamesmanship or play, hence appealing to pleasure or a sense of beauty. Many artists and designers created utilities, whether to make a point about the politics of a data source or to assist in understanding a scientific phenomenon. Not all are effective in combining pleasure and utility: some are perhaps driven by an aversion to beauty found in certain branches of science, some underscore an artist’s decision to focus on the deconstructive goals of a project and forsake beauty, and some simply lack visual knowledge.
Most examples of Data Visualization in this article provide a clear context for the source of the data, either directly within the Data Visualization, through labelling, or by facilitating a user’s upload of their own data. Nearly half were literal, that is, used imagery and navigation strategies that were evidently tied to the data’s structure or used metaphors or images that clearly showed a lineage. There is a clear link between data structures and the metaphors or representational strategies deployed. Node-and-link structures lend themselves to manipulation into abstract form — flowers or other plant-like images, or cosmologies. Tag clouds and bubble charts may become plant-like or cell-like images, solar systems or word montages. Map structures are shown as maps. Stack maps become blocks. Swarms may become explosions, cells or creatures. Block histograms may become bull’s eyes, other block forms, or mazes. As with metaphors and characteristics, there were multiple kinds of Data Visualizations overlaid or adjacent in some projects. What is perhaps surprising is the frequency, or perhaps redundancy, of five metaphors: networks, cosmologies, plants, explosions and maps.
Nature was invoked frequently. The use of natural metaphors (tree-like, floral, etc.) occurred in artist, designer and programmer-created works, regardless of whether the source of the data was natural or artificial. The prominence of natural metaphors may indicate the merging of scientific and information visualization; it may represent mystification — the correlation of sublime nature and sublime data — or an ironic stance towards mystification; it may suggest a growing sense of concern about the biological world, its extraction into data and the need for an ethos of responsibility towards the empirical world. It is in the interaction of computer code and genetic code that new forms, virtual and physical, come into being. Rather than eliciting a fear of the unknown, in which data are sublime or become a simple deconstruction, the summoning of a new hybrid world could be placed within a sense of responsibility to both human and non-human life. Issues of aesthetics and ethics are present, if not visible, in the tools we build and use.
Data Visualizations are indexical, ultimately tied to the source of the data, whether material or artificial, yet mediated; they are not the source. Data Visualizations have a genuine role in disclosing patterns, relationships and processes that are impossible to reveal without extraction, analysis and visualization. Data Visualizations are designed or authored by humans, hence capturing both a subjective view and objective scientific analysis. They bridge a materialist and constructivist practice. Data must not be treated as a new unknowable and threatening-yet-beautiful sublime essence, replacing the Nature that it represents. This is particularly true because visualizations and the data that they make meaningful also function within a world of science and social politics, helping to make convincing arguments through the metaphors that they enjoin.
Data Visualization aesthetics are always contextual, depending on the data source and, equally, are read in context, whether by scientist, social media user or art audience member. Data do not exist in themselves, and data risk mystification. Many of the sources of data are already structured. The assumptions of these structures, the material impacts of underlying technologies and particular software and the pervasive presence of tropes and metaphors need continual unraveling.
Cognitive science has a key role to play in developing visualization aesthetics that privilege pattern recognition. Viewing complex 3D images and navigating through these requires eye-hand coordination and focused perception. Designing to facilitate or at times disrupt cognition requires that artists or designers draw from this body of knowledge. At the same time, cognitive science needs to recognize that visual expression carries with it the aesthetics and aesthetic traditions of its source technologies and the subjectivity of visual images at the most fundamental level.
Data Visualization approaches derived from the art world are of value in their own right, producing compelling works of art, and valuable as a means to raise new questions and approaches to data. For example, the formalism of abstraction can result in breath-taking beauty or be applied to discovery. Structuralism can tell us about the nature of the source data. Despite the emergence of common structures and metaphors, this is an emerging practice. Art’s deconstructive tendencies are helpful in unfolding assumptions that are built into data collection and structure. There is a need for tools that provoke new insights in the fields where Data Visualizations are applied. Experimental, abstract, multi-dimensional, highly interactive works can be immersive and provocative, perhaps more so than simplified visualizations that illustrate pre-figured assumptions. Aesthetics that can evoke and provoke other disciplines yet draw from the formal and critical values of art are the most promising, and the most difficult to attain. This is a field in which art and design practices can engage in multiple layers of discovery — of new forms of expression and of new realizations in the fields that are aligned with the source data, be these genomics, physics, economics, or information theory.
 M. Whitelaw, “Art Against Information: Case Studies in Data Practice,” Proceedings, Fibreculture, ed. A. Murphy, (Perth: Digital Arts and Culture Conference, 2006): 2, http://journal.fibreculture.org/issue11/issue11whitelaw.html (accessed March, 2010).
 See K. Glance, Raw Data (Needham, MA: SearchDataManagement.com, 2008), http://searchdatamanagement.techtarget.com/sDefinition/0,sid91_gci878172,00.html (accessed March, 2010). For a minority of writers, “information” is a specific type of data that derives from communications systems using information technology, such as the Internet.
 This list builds on categories created by F. B. Viégas and M. Wattenberg (2007), http://manyeyes.alphaworks.ibm.com/manyeyes/page/Visualization_Options.html (accessed March, 2010).
 Social media encompass all forms of media that allow participatory communication within the Internet, the vast array of “places, tools, services allowing individuals to express themselves…in order to meet, to publish, share and socialize.” (F. Cavazza, Social Media Landscape (2008): 3, http://fredcavazza.net/2008/06/09/social-media-landscape (accessed December, 2009)).
 See L. Manovich, “The Database as Symbolic Form,” Database Aesthetics: Art in the Age of Information Overload, ed. V. Vesna (Minneapolis: University of Minnesota Press, 2007): 39-60. The quality and quantity of data influences the nature of a visualization and its ability to be a valuable tool for interpretation.
 In evaluating their fifteen years of existence, Wired editors noted their own anxious tendency to declare endings rather than evolutions. For example, Anderson (1998) dramatically announced the End of Television, rather than its disintermediation and reconstitution in YouTube by Hurley and Chen (C. Hurley and S. Chen (2007), youtube.com, http://www.youtube.com/ (accessed March, 2010)) and the like. See C. Anderson, “The Petabyte Age: The Power of Big Data,” Wired (July, 2008): 141-3.
 C. Anderson, “The Petabyte Age: The Power of Big Data,” Wired (July, 2008): 143.
 E. Pettersen, T. Goddard, C.C. Huang, G.S. Couch, D.M. Greenblatt, S. Mange and T.E. Ferrin, “UCSF Chimera — A Visualization System for Exploratory Research and Analysis,” Wiley InterScience (6 May 2004), www.interscience.wiley.com (accessed March, 2010).
 T. Ferrin (2010), http://www.apple.com/science/insidetheimage/ferrin/ (accessed March, 2010).
 S. W. Park, B. Budge, L. Linsen, B. Hamann and K. I. Joy, “Dense Geometric Flow Visualization,” Eurographics — IEEE VGTC Symposium on Visualization, eds. K.W. Brodie, D.J. Duke and K.I. Joy (2005): 1-8.
 T. Brigman, J.W. Williams, and G. Shirah (animators), D. Baker, and S.G. Kanekal (scientists), Earth’s Radiation Belts Tremble Under the Impact of an Electrical Storm: Halloween 2003 Solar Storm, (2004) http://svs.gsfc.nasa.gov/ (accessed March, 2010).
 These can have dramatic impacts on space vehicles.
Bridgman et al.
See P. Godfrey-Smith, Theory and Reality: An Introduction to the Philosophy of Science (Chicago: University of Chicago Press, 2003) for a thorough overview of the history of the philosophy of science.
 P. Boulanger, S. Diamond, and T. Erickson, “A Boom with a View” (keynote lecture at Powering Innovation Conference, Toronto: ORION/CANARIE, 2008).
F.B. Viegas and M. Wattenberg, “Artistic Data Visualization: Beyond Visual Analytics” (unpublished paper, Visual Communication Lab, IBM Research, Cambridge, Massachusetts, 2005): 2.
K. Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Durham and London: Duke University Press, 2007): 41.
 See S. Kember, Cyberfeminism and Artificial Life, (London: Routledge, 2003) for a closely related discussion of artificial life, gender constructs and concepts of nature.
The case studies were presented at SIGGRAPH and IEEE panels as well as several Banff New Media Institute summits. See S. Diamond et al., “Visualization, Semantics and Aesthetics” (Conference Proceedings SIGGRAPH, 2001) http://www.siggraph.org/s2001/conference/panels/panels6.html (accessed January, 2007); S. Diamond and S. Kennard, Banff New Media Institute Web Archives (Banff: Banff New Media Institute, 1993-2009).
 Evaluations of Data Visualizations should raise concerns about the quality of “source” or “raw” data, and challenge the assumption that once the data have been “cooked,” that is, digitized and standardized, they guarantee accuracy.
 C. Reas and B. Fry, Processing: A Programming Handbook for Visual Designers and Artists (Cambridge, Massachusetts: MIT Press, 2007).
A visualization laboratory might have the following systems and languages: the artist-friendly open source Processing; Side Effects; IBM Cognos; Virtools; Quest 3D; Unity 3D; the Autodesk Entertainment Bundle, including Maya; Virtual Director with VMaya (extended by NCSA); the Adobe tool kit, including ActionScript, Flex, Flex Builder and Flash; Max/MSP/Jitter; Java; C++; and NetVR, a 3D network visualization tool.
See S. Diamond, ed., “Remix,” Horizonzero.ca Issue 8 (April/May 2003) www.horizonzero.ca (accessed December, 2008) for an overview of the history of remix culture, and P.D. Miller aka DJ Spooky That Subliminal Kid, Rhythm Science (Cambridge, MA: MIT Press, 2004); P.D. Miller aka DJ Spooky That Subliminal Kid, Sound Unbound: Sampling Digital Music and Culture (Cambridge, MA: MIT Press, 2008) for history, technology and cultural studies on the impacts of remix.
 For current research and commercial applications for such location-based tools see M. Longford and S. Diamond, Mobile Digital Commons Network (2004-2007) http://www.mobilenation.ca/mdcn/ (accessed March 2010); Diamond et al. in M. Ladly and P. Beesley eds., Mobile Nation, (Waterloo: Riverside Architectural Press, 2010); The Mobile Experience Innovation Centre (2010) http://www.meic.ocad.ca/ (accessed March, 2010); Gardner and Shea, Portage (2006) http://www.mobilelab.ca/portage/ (accessed December, 2009).
See M. Whitelaw, “Art Against Information: Case Studies in Data Practice,” Proceedings, Fibreculture, ed. A. Murphy (Perth: Digital Arts and Culture Conference, 2006): 12, http://journal.fibreculture.org/issue11/issue11whitelaw.html (accessed March 2010).
 See The Infome, (1999-2002): 4, http://220.127.116.11/~jevbratt/1_to_1/index_ng.html (accessed March, 2010).
 Whitelaw (Ibid. endnote xxviii) points out that these works continue twentieth-century art practice that is engaged with rethinking the frame.
 See L. Jevbratt, “The Infome — The ontology and expression of code and protocols,” (presentation, Crash, London, 2005): 1, http://journal.fibreculture.org/issue11/issue11_whitelaw.html (accessed December, 2009).
 Ibid, 4.
 L. Manovich, “Data Beautiful,” The Infome (2005): 5, http://18.104.22.168/~jevbratt/1_to_1/index_ng.html (accessed December, 2009).
See S. Kajfes, “Search? q=fool and Search?=Moon,” The Infome (2005) http://22.214.171.124/~jevbratt/1_to_1/index_ng.html (accessed December, 2009).
 Somewhat ironically, the last communication from Opte was a note from Lyon stating that he had received a request from his service provider to create an image of a “distributed denial of service attack” (Ibid).
 Ibid, 3.
See G. Kindlmann, “Is There Science in Visualization?” IEEE Visualization Compendium, in T.J. Jankun-Kelly, R. Kosara, G. Kindlmann, C. North, C. Ware, and E.W. Bethel (transcript, London, IEEE, October, 2006): 4, http://www.cse.msstate.edu/~tjk/ (accessed December, 2009).
Debates regarding realism and the subjective role of the designer arise in discussions of how to represent multidimensional data sets in the humanities or social sciences. See K. Mason, J. Denzinger and M.S.T. Carpendale, “Negotiating Gestalt: Artistic expression and coalition formation in multi-agent systems,” Proceedings of AAMAS, July 2004 (New York: ACM, 2004): 1350-1351 for valuable insights. E. Tufte, Beautiful Evidence (Cheshire, CT: Graphics Press, 2006): 138 acknowledges that analyses of human behaviour are “often so distant from any kind of law like understanding” that there are “multivariate uncertainties about causality.” However, he argues that representations of causality, comparison and multivariate complexity prevail in social science as well as science; hence Data Visualization must provide uniform methods for all disciplines. B. Fry, Visualizing Data (Sebastopol, CA: O’Reilly, 2008) believes that Tufte simplifies the challenge of manipulating complex data. He posits that designers constantly make choices about which dimensions to show while also facing technical difficulties with multivariate data.
J.N. Bray, J. Lee, L. Smith, and L. Yorks, Collaborative Inquiry in Practice (Thousand Oaks: Sage, 2000): 105.
E. Tufte, The Visual Display of Quantitative Information, 2nd ed. (Cheshire, CT: Graphics Press, 2001): 51.
Expert cosmology and climate change visualization designers such as Gloria Simmons-Brown, Jayanne English and Donna Cox are Data Visualization professionals. They report that their collaborators from the sciences have rejected visually engaging imagery for fear that the results would be refused by committees reviewing for publication. Tensions between information visualization designers who call on aesthetics and metaphor and scientific visualization designers who are pressed towards realism are evident at conferences such as Information Visualization. An excellent discussion of this occurs in J. English, “Cosmos versus Canvas: Tensions between art and science in astronomy images. The art and science of Data Visualization,” Horizonzero.ca, no. 6 (2002) www.horizonzero.ca (accessed March, 2010).
B. Matthews, “When Stakeholders Represent Others in Design Conversations” (paper presented at the Nordcode Seminar, Lyngby, Denmark, April 28-30, 2004).
 Ibid, 52.
See C. Ziemkiewicz and R. Kosara, “The Shaping of Information by Visual Metaphors,” IEEE Transactions on Visualization and Computer Graphics 14, no. 6 (November/December, 2008): 1269-1276.
 R. Kosara, “Visualization Criticism — The Missing Link Between Information Visualization and Art,” Proceedings of the 11th International Conference on Information Visualization (North Carolina: IV, 2007): 631-636, 634.
 M. Whitelaw, “Art Against Information: Case Studies in Data Practice,” Proceedings, Fibreculture, ed. A. Murphy, (Perth: Digital Arts and Culture Conference, 2006): 13, http://journal.fibreculture.org/issue11/issue11whitelaw.html (accessed December, 2009).
For example, Tractinsky and Zmiri show through their research on users’ engagement with skinning technologies that aesthetic judgment, including visual and emotional pleasure, is as critical as functionality in visual understanding. See N. Tractinsky and D. Zmiri, “Exploring Attributes of Skins as Potential Antecedents of Emotion in HCI,” Aesthetic Computing, ed. P. Fishwick (Cambridge, MA: MIT Press, 2005): 1-28.
See N. Tractinsky, A.S. Katz, and D. Ikar, “What is beautiful is usable,” Interacting with Computers 13, no. 2 (2000): 127-145. Also see later work that analyses discomfort with beauty in computer science and tracks changing attitudes and evidence-based studies: N. Tractinsky, “A Few Short Notes on the Study of Beauty in HCI,” Human-Computer Interaction 19, no. 4 (2004): 351-357.
 J. Prophet, Cell (2002) http://www.janeprophet.com/CellApplet02/CellApplet100.html (accessed March, 2010).
P. Fishwick, S. Diehl, J. Prophet and J. Lowgren, “Perspectives on Aesthetic Computing,” Leonardo 38 (2005): 133-141, 136.
In addition to previously cited works, see L. Jevbratt, Interspecies Collaboration (2009) http://jevbratt.com/ (accessed February, 2009); and L. Jevbratt, 22: Search and Thou Shall Find (2008) http://126.96.36.199/~jevbratt/lifelike/artists/kajfes/index.html (accessed March, 2010).
 The uncanny adds elements of discomfort or grotesqueness to images to startle viewers. Most recently Jevbratt has moved on to build emulators for the sensory apparatus of animal species.
 L. Jevbratt, “The Infome — The ontology and expressions of code and protocols,” (presentation at Crash, London, 2005): 1, http://journal.fibreculture.org/issue11/issue11_whitelaw.html (accessed March, 2010).
 The alignment of code and graphics with mathematics, rather than text, is a point made by both R. Burnett, How Images Think (Cambridge, MA: MIT Press, 2005); and L. Manovich, “The Database as Symbolic Form,” Database Aesthetics: Art in the Age of Information Overload, ed. V. Vesna (Minneapolis, MN: University of Minnesota Press, 2007): 39-60.
Indeed Hegel postulated the need for a qualitative understanding of beauty, rather than a quantitative one, challenging some aspects of scientific realism or essentialism. See G.W.F. Hegel, Aesthetics: Lectures on Fine Art, trans. T.M. Knox, 2 vols. (Oxford: Clarendon Press, 1975).
E. Tufte, Beautiful Evidence (Cheshire, CT: Graphics Press, 2006); and E. Tufte, The Visual Display of Quantitative Information, 2nd ed. (Cheshire, CT: Graphics Press, 2001).
 E. Tufte, The Visual Display of Quantitative Information, 2nd ed. (Cheshire, CT: Graphics Press, 2001): 51.
 See C. Ware, Information Visualization: Perception for Design, 2nd ed. (San Francisco, CA: Morgan Kaufmann Series in Interactive Technologies, 2004).
Under Chen’s (2008) leadership, the journal Information Visualization is currently redefining the field towards visual analytics, combining perception research and visualization. See C. Chen, Information Visualization (London: Palgrave Macmillan Ltd., 2006-2009).
See C. Ware and R. Bobrow, “Supporting Visual Queries on Medium-size Node-link Diagrams,” Information Visualization 4 (London: Palgrave, 2005), 49-58.
Ware et al. tested pattern recognition to see how viewers might speedily find the shortest paths in “spring layout diagrams.” See C. Ware, H. Purchase, L. Colpoys, and M. McGill, “Cognitive Measures of Graph Aesthetics,” Information Visualization 1 (London: Palgrave, 2002), 103-110; see also C. Ware, D. Hui, and G. Franck, Visualizing Object Oriented Software in Three Dimensions (North York: IBM Canada Centre for Advanced Studies, 1993), 612-620, which compares 2D and 3D comprehension and formal strategies for complex network layouts on p. 612.
See I. Kant and A. Wood, eds., Basic Writings of Kant: 1724-1804 (NY, Toronto: Random House Modern Library Classics, 2001); and I. Kant, Critique of Pure Reason, Critique of Judgment, trans. N.K. Smith (London: Palgrave Macmillan, 2000).
 F.J. Varela, E. Thompson, and E. Rosch, The Embodied Mind: Cognitive Science and Human Experience (Cambridge, MA: MIT Press, 1993), 71.
C. Ware, “Patterns and Words, Logic and Narrative: What can we expect of a visual language?” Proceedings 2007 IEEE Symposium on Visual Languages and Human Centric Computing (2007): 11.
Documentation is found at J. Donath and F.B. Viegas, “The Chat Circle Series: Explorations in designing abstract graphical communications interfaces,” Proceedings of DIS (London: DIS, SIGGRAPH, ACM, 2002): 1-10.
See F.J. Varela, E. Thompson, and E. Rosch, The Embodied Mind: Cognitive Science and Human Experience (Cambridge, MA: MIT Press, 1993), 51.
 Ibid, 171.
See W. Sack, “Network Aesthetics,” Database Aesthetics: Art in the Age of Information Overload, ed. V. Vesna (Minneapolis, MN: University of Minnesota Press, 2007): 205.
See V. Flusser, “Memories,” Ars Electronica, ed. T. Druckrey (Cambridge, MA: MIT Press, 1999): 203.
 See S. Diamond, “CodeZebra: Promising Interplay,” aRt &D: Research and development in art, ed. J. Brouwer et al. (Rotterdam: V2_NAi Publishers, 2005): 140-149.
 R. Burnett, How Images Think (Cambridge, MA: MIT Press, 2005), 176.
 See M.B.N. Hansen, New Philosophy for New Media (Cambridge, MA: MIT Press, 2004).
In Cinema Two, Deleuze suggests to scientists that the human brain has changed because of its twentieth-century immersion in filmic and other temporal media. See G. Deleuze, Cinema Two: The Time-Image, trans. H. Tomlinson and R. Galeta (London: Continuum, 1989).
 Ibid, 198.
C.M. Danis, F.B. Viegas, M. Wattenberg, and J. Kriss, “Your Place or Mine? Visualization as a Community Component,” CHI 2008 Proceedings, April 5-10, 2008 (Florence, Italy: ACM, 2008): 275-284.
 M. Whitelaw, “Art Against Information: Case Studies in Data Practice,” Proceedings, Fibreculture, ed. A. Murphy, (Perth: Digital Arts and Culture Conference, 2006): 10, http://journal.fibreculture.org/issue11/issue11whitelaw.html (accessed March, 2010).
M. Lovejoy, C. Paul and V. Vesna, eds., Context Providers: Conditions of Meaning in Media Art (New York: Intellect Ltd., 2010).
 See S. Diamond and M. Ladly, “Creating Methodologies for Mobile Platforms,” Mobile Nation, eds. M. Ladly and P. Beesley (Waterloo: Riverside Architectural Press, 2008).
See B. Latour, “Visualization and Cognition” [originally published as “Les ‘vues’ de l’esprit”], Culture Technique, no. 17 (June, 1985) www.bruno-latour.fr (accessed December, 2008); and B. Latour, “Sciences Po Project” (keynote address, The Future of Objectivity Conference, Toronto, 2008) http://www.sciences-po.fr/portail/ and http://research.ischool.utoronto.ca/objectivity/abs.html (accessed February, 2010).
See J. Law and J. Hassard, eds., Actor Network Theory and After (London: Blackwell Publishing, 1999).
 See B. Brecht, “Epic Theatre”, Marxism and Art: Essays Classic and Contemporary, ed. M. Solomon (New York: Vintage Books, 1974): 360-369; and for a fuller explication see B. Brecht, Brecht on Theatre (New York: Hill and Wang, 1964).
See J. Law and J. Hassard, eds., Actor Network Theory and After (London: Blackwell Publishing, 1999).
 Ibid, 3.
J. Abrams and P. Hall, eds., Else/Where: Mapping New Cartographies of Networks and Territories (Minneapolis: University of Minnesota Design Institute, 2006).
 J. On, They Rule (2004) http://www.theyrule.net/2004/tr2.php (accessed December, 2009). Andrew Cook has created Data Visualizations mapping current military spending and patterns of power. See A. Cook, Military Spending (2008) http://www.acooke.org/andrew/writing/arms.html (accessed December, 2009).
 By the 1970s the constructivist and linguistic turn suggested that there were no essential qualities to the world, only expressions of experience through unconscious language, gesture and presence. It was language that made or disrupted meaning and identity.
S. Posavec, Writing Without Words (2006) http://www.itsbeenreal.co.uk/index.php?/writing-without-words/about-this-project/ (accessed March, 2010).
See E. Thacker, “Foreword: Protocol Is as Protocol Does,” in A.R. Galloway, Protocol: How Control Exists after Decentralization (Cambridge, MA: MIT Press, 2004): xxvii.
 Ibid, 3.
For categories and tropes covering the fields of visualization studies, see M. Eppler, Domain Maps (Lugano: The University of Lugano, Università della Svizzera italiana-USI, 2007) http://www.elab.usilu.net/usi10anni/knowledge_domain_maps/visualization_scholars/ (accessed December, 2009); and M. Eppler, Mapping Tools in Overview (2007) http://www.visual-literacy.org/pages/maps/mapping_tools_radar/radar.html (accessed December, 2009).