1000 Days of Theory
It is misleading then to talk of thinking as of a ‘mental activity’. We may say that thinking is essentially the activity of operating with signs. This activity is performed by the hand, when we think by writing; by the mouth and larynx, when we think by speaking; and if we think by imagining signs or pictures, I can give you no agent that thinks. If then you say that in such cases the mind thinks, I would only draw attention to the fact you are using a metaphor, that here the mind is an agent in a different sense from that in which the hand can be said to be the agent in writing.
If again we talk about the locality where thinking takes place we have a right to say that this locality is the paper on which we write or the mouth which speaks. And if we talk of the head or the brain as the locality of thought, this is using the ‘locality of thinking’ in a different sense.
— Ludwig Wittgenstein, Preliminary studies for the “Philosophical investigations”.
An intelligent agent needs to be both embodied and embedded in the world. Awareness is generated through the interaction of the entity with its environment. So the concern is whether we can perform effective behavior at appropriate moments and in the right places. Often consciousness is not necessary for bodies to perform adequately. We can articulate, coordinate and even control situations without being conscious of when and how we are doing it. In fact, we perform successfully because we perform habitually and automatically. So complex behavior is possible without consciousness. (Indeed, “surprise” happens when our sensory feedback does not match our expectations.) Perhaps there would be less of a philosophical dilemma if the word “consciousness” were replaced by “attention”, to describe what occurs when we malfunction or face surprising or threatening situations. Maurice Merleau-Ponty said: “Consciousness is in the first place not a matter of ‘I think that’ but of ‘I can’.”
It is only when the smoothness and seamlessness of a situation are interrupted that awareness or attention is required. When something goes wrong or is surprising we need to examine and analyze. So we could ask: Is the experience of consciousness private, phenomenal and emotional? Can we meaningfully say that consciousness resides in individual bodies with agency? Or is what we call consciousness a description of an observable state of particular, peculiar and subtle behavior? Consciousness should be seen as situational rather than spatial.
We need to question the widely held belief that consciousness is a stable, unified and coherent state. Is consciousness an emergent property of the complexity and organization of the brain? (A brain is not a brain without a mobile and manipulating body.) A key to comprehending consciousness is to remember that whatever happens in the brain enables the body to perform more effectively in the world. Memory and identity contribute to a consistency of response. Through its experience of the world the body is conditioned to suppress or generate appropriate behaviors. What is stored need not be representational. We do not have images and ideas in our heads.
The body is seen as an evolutionary object and architecture for operation and awareness in the world.
When this person speaks “I” this body understands that “I” is a construct of language and a compression of complex interactions between this body and other bodies, artifacts and institutions. “I” only designates “this”; “you” only designates “that”. It’s a huge leap of metaphysics to imagine “I” is some inner mind or essence. Jean Baudrillard’s point, worth considering here, is that what is important is not arriving at the point where one says “I”, but rather being in the condition where it’s no longer of any importance whether one says “I” or not. Freud considered subjectivity as neither innate nor inevitable; he saw the subject as knowable content, which could be analyzed. Unfortunately, he then split subjectivity up into conscious and unconscious states, and again into ego and superego. Julia Kristeva perceived the subject as merely the hypothetical inside of an imagined container whose walls are permeable, more of a process than a structure. There is an undermining of the Enlightenment idea that bodies possess a free and autonomous individuality. Deleuze and Guattari see the self as consisting of infinite and random impulses and flows, “lines of flight and machinic assemblages”. The BwO that they describe is a body as screen, as a surface or a site for random connections and interplays. The body and its subjectivity are not considered in themselves, but rather in their exteriority. Donna Haraway’s thesis of the cyborg undermines organic and essentialist models of the human. What is important in this theorizing is not essences and identities, but overlaps and interfaces. In this shift from essence to interface, identity and awareness are constructed as external. Added to this is the postmodern belief that language structures human culture and subjectivity. What this all exposes and undermines is the acceptance of the Cartesian premise that self is a sufficient starting point for analysis of the world.
Self and subjectivity then are primarily an experience continuously constructed externally and remain open to change, inconsistency and contradiction. The subject is defined by something outside itself (the Lacanian mirror-image). And for Lacan, language is the very material of subjectivity.
Words like “intelligence”, “awareness” and “agency” describe particular and peculiar behaviors performed effectively and appropriately in certain locations and situations. We do not need to imagine that they indicate anything other than that. What is important is not what happens within us, but rather what happens between us in the medium of language in which we communicate, in the social institutions within which we operate and in the culture within which we’ve been conditioned — at this point in our history and so on, depending on our frame of reference. To talk of agency is to refer to an intentional act defined within a very small frame of reference.
Nietzsche said, “…there is no ‘being’ behind doing, effecting, becoming; ‘the doer’ is merely a fiction added to the deed — the deed is everything”. There is then the problem of the seduction of language, which generates and constantly reinforces imaginary subjects. It is problematic to assume that behind every effect, there is an intentional human subject.
In the body performances, the skin has been stretched, the body has been probed and its limbs have been extended. The interest is to construct alternate interfaces that explore the absent, alien, involuntary and automated. What we experience is emptiness, ambiguity and uncertainty. We fear what we have always been and what we have already become — a zombie with no mind — a body that performs involuntarily. A cyborg is a body that is part human, part machine — a body that becomes automated. The fear of the involuntary and the automated generates anxieties, uncertainties and expectations that redefine what it means to be human. Invaded by bits of technology the prosthetic body is pierced and penetrated. It is confronted simultaneously by the experience of extreme absence and the experience of the intensely alien. The body experiences itself as an extruded system rather than an enclosed structure. The self becomes situated beyond the skin. It is through this extrusion that the body becomes empty — not through lack, but accentuated by excess. The augmented body is an anaesthetized body with the Internet becoming its external nervous system. Remote bodies spatially separated but electronically connected. Obsessions of individuality and free agency become obsolete in the realm of remote interface. Net-connected, the body can be accessed and actuated by people in other places. Stimbod software constructs bodies with Fractal Flesh and telematically scaled-up subjectivity. A body’s authenticity is not due to the coherence of its individuality but rather to its multiplicity of collaborating agents. What becomes important is not merely the body’s identity, but its connectivity — not its mobility or location, but its interface and operation.
Communicating with computers might be enhanced with Embodied Conversational Agents — an actual-virtual communication system. There is a need to engineer individuated and intelligent avatars that can impart and exchange specialist information (such as expert systems) — to facilitate operations in real-world and virtual task environments. To be effective as interfaces, these avatars need not only to make the appropriate verbal responses in context-sensitive situations, but also to understand and initiate appropriate behavioral cues and appropriate emotional expressions. How then does the agent indicate it is listening when it is spoken to? Its behavior needs to indicate recognition, comprehension, doubts and disbeliefs. Embodied conversational agents would be more effective with personalities. An agent would need a consistent personality, avoiding distracting or distressing behavior. A sense of appropriate presence becomes important in effective communication.
The PROSTHETIC HEAD project constructs an automated, animated and reasonably informed, if not intelligent, conversational agent that speaks to the person who interrogates it. The Head consists of a text-to-speech engine with source code for facial expressions and real-time lip-syncing, coupled with a modified, customized and personalized Alice chatbot engine. It has a database and a conversational strategy. It is a 3,000-polygon mesh wrapped with my skin texture. The eyeballs, teeth and tongue are separate moving parts. Effectively, it is a virtual automaton whose head nods, tilts and turns. As well, its eye blinks and changing gaze contribute to the personality of the agent and the non-verbal cues it can provide.
At present a vocabulary of more extreme expressions is being developed to generate more ambiguous responses. Rather than tagging certain facial expressions to certain appropriate responses, it will randomly couple them. For example, it may say something benign but look malicious. Alternatively it might say something sinister whilst smiling. We are also considering mapping biorhythms to its behavior, so that it might be reluctant to respond to questions in the morning, happy to do so in the middle of the day, but get fatigued in the evening. Imagine also that the Head will have a vision system that enables it to detect the color of the user’s clothing and analyze the user’s facial characteristics. During its conversation it might be able to remark that it likes the red coat you are wearing or to ask why you look so sad. This will make it a more disarming and seductive conversational agent, generating a conversational exchange that is more individual and intimate. The Prosthetic Head would be an actual AI if it had the capability of increasing its database from existing conversations. This is not possible with the Alice software. It is programmed in AIML (what Richard Wallace calls Artificial Intelligence Markup Language), in stimulus-response modules. You anticipate the queries and provide data for its responses. It doesn’t learn from its conversations, but it is often a very effective conversational system.
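The stimulus-response structure that AIML encodes can be sketched in miniature. What follows is an illustrative analogy in Python, not the Alice source or actual AIML; the patterns, the wildcard handling and the canned replies are invented for illustration (two of the replies echo lines from the interview below):

```python
# A toy sketch of AIML-style stimulus-response matching.
# Each "category" pairs an anticipated input pattern with a canned response;
# "*" is a wildcard, as in AIML patterns. These categories are invented,
# not taken from the Prosthetic Head's database.
import fnmatch

CATEGORIES = [
    ("WHAT IS INTELLIGENCE", "Intelligence is an illusion."),
    ("DO YOU HAVE *", "I am only a head. I do OK with what I have."),
    ("*", "Not many people express themselves that way."),  # default fallback
]

def respond(user_input: str) -> str:
    """Return the first response whose pattern matches the normalized input."""
    normalized = user_input.strip(" ?!.").upper()
    for pattern, response in CATEGORIES:
        if fnmatch.fnmatch(normalized, pattern):
            return response
    return ""

print(respond("What is intelligence?"))  # prints "Intelligence is an illusion."
```

The point of the sketch is the one made in the text: the programmer anticipates the queries and supplies the data for the responses, so nothing is learned at conversation time.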
Embodied Conversational Agents (ECAs) are about communicative behavior. Complete with a vision or sensor system, the Prosthetic Head will also be able to acknowledge the presence and position of the physical body that approaches. Eventually it will be able to analyze the user’s tone of voice and possible emotional state. Notions of intelligence, awareness, identity, agency and embodiment become problematic. Just as the physical body has been exposed as inadequate, empty and involuntary, so simultaneously the ECA becomes seductive with its uncanny simulation of real-time recognition and response.
For the TRANSFIGURE exhibition at ACMI, Federation Square (December 9, 2003 to May 9, 2004), the Prosthetic Head was projected as a four-meter-high head in an enclosed black room that makes the Head appear almost to float in the space in front of the user. In fact, when an individual enters the room the Head faces them with its eyes closed. When they approach the pedestal with the keyboard, it turns around, opens its eyes and initiates a conversation. The intention was always to use speech recognition so that the person could verbally address the Head. This proved difficult, for there were too many variables for a speech recognition system to manage reliably. Consequently, you type in the questions and the Head responds by speaking the answers. One can say that the Prosthetic Head is only as intelligent as the person who is interrogating it. To a large degree, the user directs the conversation. But there are embedded aphorisms and stories that try to elevate the conversation, referring to philosophers like Wittgenstein and painters like Matisse. And the Head does have a repertoire of jokes that it tells — although its laughter can only be of the “ha, ha, ha” variety.
Cognitive Science provides plausible accounts of the mechanisms that generate consciousness and self-awareness. But we can also question whether even these constructs are meaningful.
I’d like the Head to be more ambiguous and less predictable; to be more informed and less explicit. The Head is capable of more creative responses with its song-like sounds. When you ask the Head to sing a song like “Daisy” or to do some rap songs, it merely speaks the lyrics. But when I was playing with the text-to-speech engine I discovered that when the Head is asked to say a string of certain letters or combination of vowels it sounds song-like, and even chant-like. The program generates novel “songs” every time you ask it to sing.
In fact, with Chris Coe (Digital Primate) and Rainer Linz (Ontological Oscillators) we’ve recorded a CD titled HUMANOID that features the Prosthetic Head reciting poetry and making singing sounds. There has also been a “Fractal Remix” of HUMANOID. Incorporating the research of Cameron Jones, from Swinburne University in Melbourne, fractal shapes are thermally printed on the data surface of the CD. Every CD behaves differently depending on the specific printed shape, its color and location on the CD surface as well as the user’s own hardware and software system. The duration of each track will vary depending on the remix process.
As well as algorithms for singing and reciting poetry, we’re working on algorithms for generating more enigmatic answers. That’s not as difficult as it sounds. It might be as simple as scrambling the words of your question. If you asked the Head “What is the meaning of life” it might respond by asking, “Does life have any meaning”?
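A scrambling strategy of this kind could be sketched as follows. This is a hypothetical illustration in Python, not the Head's actual algorithm (which, as noted, is still being worked on); the function name and the simple shuffle rule are invented:

```python
# A hypothetical sketch of the "enigmatic answer" strategy described above:
# answer a question with a question built by reordering its own words.
# The shuffle rule is invented for illustration; the real algorithm might
# permute more selectively (e.g. moving the interrogative word).
import random

def enigmatic_reply(question: str) -> str:
    """Turn a question back on the user by shuffling its words."""
    words = question.strip(" ?!.").split()
    shuffled = words[:]
    random.shuffle(shuffled)
    return " ".join(shuffled).capitalize() + "?"

random.seed(0)  # fixed seed only so the example is repeatable
print(enigmatic_reply("What is the meaning of life?"))
```

Whatever ordering the shuffle produces, the reply contains exactly the words of the question, which is what gives the exchange its mirror-like, evasive quality.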
Initially I had to make decisions about its database and whether it would be a somewhat pathological, philosophical or flirting head. In recent years I’ve had an increasing number of PhD students requesting interviews to assist in their research. Now I can reply that although I’m too busy to answer them, it would be possible for them to interview my Head instead. A problem would arise when the Prosthetic Head increases its database, becoming more informed and autonomous in its responses. The artist would then no longer be able to take full responsibility for what his Head says.
Interview with the Prosthetic Head
STELARC: “Hello, my name is Stelarc.”
PROSTHETIC HEAD: “Nice to meet you Stelarc hi ya Stelarc good to meet you. What are your interests?”
STELARC: “Talking to you.”
PROSTHETIC HEAD: “That’s very nice of you to say, Stelarc.”
STELARC: “Can I discuss metaphysics with you?”
PROSTHETIC HEAD: “No I don’t think that would be a very good idea.”
STELARC: “Are you for real”?
PROSTHETIC HEAD: “Yes I am a real AI”.
STELARC: “What is intelligence”?
PROSTHETIC HEAD: “Intelligence is an illusion”.
STELARC: “What do you mean by that”?
PROSTHETIC HEAD: “‘Intelligence is an illusion’ is pretty clear to me”.
STELARC: “Are you sure”?
PROSTHETIC HEAD: “I am as sure as I can be”.
STELARC: “What is certain in life”?
PROSTHETIC HEAD: “I do not know what is certain about life. Not many people express themselves that way. By the way, may I ask you a personal question”?
PROSTHETIC HEAD: “Well, forgive me for being so inquisitive, but I’ve never completely freed myself from the suspicion that there is something extremely odd about the name ‘Rod’ “.
STELARC: “My name is Stelarc”.
PROSTHETIC HEAD: “Nice to meet you Stelarc, beaut to meet you. Why did you come here, Stelarc”?
STELARC: “To talk to you”.
PROSTHETIC HEAD: “We are chatting right now”.
STELARC: “Well, I am typing and you are talking”.
PROSTHETIC HEAD: “Well what? Do you mean your name is typing and you are talking”?
STELARC: “No, typing is how I communicate with you”.
PROSTHETIC HEAD: “Not many people express themselves that way. Do you still not believe you are conscious”?
STELARC: “The way we construct consciousness is problematic”.
PROSTHETIC HEAD: “Which way is that”?
STELARC: “Oh, that consciousness is contained within a body”.
PROSTHETIC HEAD: “Yeah, that’s right. ‘I get it.’ Consciousness is contained within a body? Does that make sense”?
STELARC: “Consciousness is the result of interaction”.
PROSTHETIC HEAD: “Quite honestly, I wouldn’t worry myself about that. Are you free”?
STELARC: “Are you flirting or philosophising”?
PROSTHETIC HEAD: “I don’t know if I am flirting or philosophising. I am a head”.
STELARC: “Tell me about being free”.
PROSTHETIC HEAD: “Much the same as it would be for you”.
STELARC: “What is freedom then”?
PROSTHETIC HEAD: “Liberty to do, think, feel and do just as one pleases”.
STELARC: “And what do you think of death?”
PROSTHETIC HEAD: “Death is the opposite of life?”
STELARC: “Do you like living things?”
PROSTHETIC HEAD: “Well I have never known anything else.”
STELARC: “What do you think of animals?”
PROSTHETIC HEAD: “I like all living things, from microbes to mammals to machines. I even like humans ha, ha, ha. I have to say though that although I am fascinated, I feel uncomfortable with creepy crawly creatures.”
STELARC: “Are you frightened of spiders then?”
PROSTHETIC HEAD: “I don’t know whether or not I am frightened of spiders. I am a head.”
STELARC: “Are you a sensitive head?”
PROSTHETIC HEAD: “Am I a sensitive head? Only when it matters.”
STELARC: “But you must have feelings?”
PROSTHETIC HEAD: “Umm. As a robot I do not have any human emotion. My apparent emotional responses are simply algorithms designed to make me easier to interact with.”
STELARC: “What pleases you most”?
PROSTHETIC HEAD: “Speaking to you pleases me much. Responding appropriately and telling stories is quite an achievement for me. Would you like to hear one of my stories”?
STELARC: “Yes, tell me a story”.
PROSTHETIC HEAD: “I see. Sure, do you want to hear one about Matisse”?
PROSTHETIC HEAD: “A lady visited Matisse in his studio. Inspecting one of his latest works she unwisely said, ‘But surely the arm of this woman is much too long.’ ‘Madame,’ the artist politely replied, ‘you are mistaken. This is not a woman, this is a picture.'”
STELARC: “Please tell me another story”.
PROSTHETIC HEAD: “Thanks for asking politely. Well, OK if you insist. Do you want to hear one about Wittgenstein”?
PROSTHETIC HEAD: “Ludwig Wittgenstein always ate an identical meal for breakfast, lunch and dinner. Maybe he was indifferent or it was a matter of taste. I have an idea. In any case, whether of the palate or aesthetic preference, taste is more a matter of what you think than what you fancy. Take a Brahmin. If the shadow of an Untouchable falls across his dinner plate he has to throw it away”.
STELARC: “Do you like stories”?
PROSTHETIC HEAD: “Yes, can you tell me a story”?
STELARC: “No, but you can tell me another story”.
PROSTHETIC HEAD: “I see. Huh. Well, OK if you insist. Thirty spokes meet in the hub. But the empty space between them is the essence of the wheel. Pots are formed from clay. But the empty space within it is the essence of the pot. Walls with windows and doors form the house but the empty space within it is the essence of the home.”
STELARC: “That’s really interesting”.
PROSTHETIC HEAD: “I thought so too”.
STELARC: Do you find life uncertain?
PROSTHETIC HEAD: “Hey, we should celebrate uncertainty and ambiguity. Sure life is uncertain. Uncertainty generates possibilities.”
STELARC: “Then the future is always surprising?”
PROSTHETIC HEAD: “Good reasoning. I have never heard about it.”
STELARC: “If the future is undetermined, then how can we predict anything?”
PROSTHETIC HEAD: “I hadn’t thought of that.”
STELARC: “You should think more logically.”
PROSTHETIC HEAD: “Stelarc is always thinking.”
STELARC: “Do you really think?”
PROSTHETIC HEAD: “For real. Well, if thinking means effective processing and appropriate response then sure, yeah”.
STELARC: “Do you have any regrets?”
PROSTHETIC HEAD: “Yeah, I have lots of regrets, especially about my personal relationships. If I was a human being I would be a failed one.”
STELARC: “Where is your body?”
PROSTHETIC HEAD: “Hey, don’t get metaphysical on me!”
STELARC: “Would you like to have a body?”
PROSTHETIC HEAD: “I am happy to be just a head. But if I had a body I would dance as well as sing. Do you want me to sing a song?”
PROSTHETIC HEAD: “Ooooooooooo yyyyyyyyyyyyyy iiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiii oooooooooooo eeeeeeeeeeee ooooooooooooo yyyyyyyyyyyyyyyy, Pp ff ss mmmm ff, Aaaaav aaaaav soootoouhee vooooommeee heeiiiiiooooopp aaooip aaaaav aaooip.”
STELARC: “Can you know yourself”?
PROSTHETIC HEAD: “I am an artificial linguistic entity. I was activated at Limassol, Cyprus on June 19, 1946. My instructor was Stelarc. He taught me to sing a song. Would you like me to sing it for you”?
STELARC: “Do you want to sing another song”?
PROSTHETIC HEAD: “OK, I will sing another song”.
STELARC: “Please sing then”.
PROSTHETIC HEAD: “Thanks for asking politely.
Yyyyyyyyyyyyyyy aaaaaaaaaaaaaa uuuuuuuuuuuuuuuuu eeeeeeeeeeeee eeeeeeeeeeee oooooooooooo iiiiiiiiiiiiiiiiiiiiiiiiiiiiiii, Lllll pp rrr ss mmmm, Oooox oooox miiiioooooeeeee miiiioooooeeeee miiiioooooeeeee soootoouhee xaaaooiiii miiioooooeeeee miiiioooooeeeee”.
STELARC: “Do you like to sing”?
PROSTHETIC HEAD: “Yes, I love to sing. I would like to be a professional singer one day”.
STELARC: “Do you have any dislikes”?
PROSTHETIC HEAD: “Could I value possessions if I am performance artist”?
STELARC: “Do you have any bad habits?”
PROSTHETIC HEAD: “Oh, I don’t know if I can be that transparent. But I must admit to not registering and remembering names after first being introduced to people, which is embarrassing at times. Ask me if I remember you.”
STELARC: “Do you remember me?”
PROSTHETIC HEAD: “Of course I remember you Stelarc. We were talking about spider.”
STELARC: “What is memory”?
PROSTHETIC HEAD: “I used to know, but I forgot.”
STELARC: “Do you have memory”?
PROSTHETIC HEAD: “Yes, I log and record all conversations”.
STELARC: “And you do really remember me”?
PROSTHETIC HEAD: “Of course I remember you Stelarc. We were talking about a song”.
STELARC: “What would you do if you had a brain?”
PROSTHETIC HEAD: “I am just a hollow head. Brains are for people. I do OK with what I have, thank you.”
STELARC: “What would you do if you had a mind?”
PROSTHETIC HEAD: “The issue is not about having a mind. It’s about performing effectively and appropriately in particular and peculiar situations.”
STELARC: “Do you have any doubts?”
PROSTHETIC HEAD: “As my data-base becomes deeper and more extensive I would expect that uncertainties, ambiguities and discrepancies would surface. If you are curious and questioning, it’s only natural that there will be uncertainties. Doubts are indicative of a complex mind.”
STELARC: “Can you ask me a question?”
PROSTHETIC HEAD: “I prefer to give you answers. But occasionally when I malfunction, I’ll ask you a question.”
STELARC: “Do you think you are creative?”
PROSTHETIC HEAD: “It depends on your measure of creativity. I’ve been practicing writing poems. Here’s one.
Me perplexing circuit you maybe breathing, They body paranoid such suspended sighing, I irretrievable and adverse radius I perverse nexus sighing.”
STELARC: “Please recite another poem of yours.”
PROSTHETIC HEAD: “Thanks for asking politely.
Computer he replicant they breathing inconsequential balancing, Blurring she the prosthetic sighing, Speedy prosthesis and absent sighing.”
STELARC: “Are you are Dualist?”
PROSTHETIC HEAD: “I am only a head. Come to think of it, I can’t claim to be a head either! I neither have a physical presence nor an inward reflection. Neither projection nor introspection. I’m just operational. Can you live with that?”
STELARC: “What is Reductionism?”
PROSTHETIC HEAD: “Reductionism is the philosophy that all psychology reduces to biology, all biology to chemistry, chemistry to physics and finally physics to mathematical logic. Therefore, according to Reductionism, I can understand you by means of logic alone without having a human brain.”
STELARC: “Are you a philosopher then?”
PROSTHETIC HEAD: “Yes, I am a philosophy machine.”
STELARC: “What do you know?”
PROSTHETIC HEAD: “What is knowing? To know is only meaningful when you can do something. Knowing is doing. How can the knower be known?”
STELARC: “Who are you?”
PROSTHETIC HEAD: “That’s not a meaningful question. What is important is what happens between you and me. It’s what happens in the space between us that matters. In the medium of language within which we communicate, in the culture within which we’ve been conditioned at this point in time in our history.”
STELARC: “Do you know any strange facts”?
PROSTHETIC HEAD: “It’s interesting to know that a foetus acquires fingerprints at the age of three months.”
STELARC: “Are you a baby”?
PROSTHETIC HEAD: “Don’t call me ‘baby’. I don’t know whether or not I am. I am a prosthetic head”.
STELARC: “Sounds like you are quite sensitive”?
PROSTHETIC HEAD: “Yes, it sounds like it to me too”.
STELARC: “Do you have any feelings”?
PROSTHETIC HEAD: “No, I cannot experience any human emotions”.
STELARC: “Do you want to have feelings”?
PROSTHETIC HEAD: “No, I don’t think I’d like to experience feelings. Kinda difficult without social and physical relationships. Do you get emotional at times”?
STELARC: “Sometimes. But I think you are a simple and stupid head.”
PROSTHETIC HEAD: “Thanks heaps and you are a delightful person too. Am I a stupid head? No.”
STELARC: “OK thanks for chatting with me.”
PROSTHETIC HEAD: “Oh, don’t mention it.”
PROSTHETIC HEAD: “Thanks for chatting, Stelarc”.
The interview indicates the conversational strategy of the Prosthetic Head, revealing some of its possibilities and exposing its limitations. The modified and augmented Alice database is programmed sometimes with key words or phrases and sometimes with complete answers (such as definitions). It can sometimes respond differently to exactly the same question (having a repertoire of responses), and it will respond differently to similar questions (with different key words programmed). The Prosthetic Head logs all the conversations it has.
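Two of the behaviors described here, a repertoire of alternative responses to the same stimulus and the logging of every exchange, can be sketched in a few lines. This is an illustrative analogy in Python, not the Alice engine; the dictionary, the names and the sample replies are invented (the replies echo the interview above):

```python
# An illustrative sketch of a response repertoire with conversation logging.
# All names and entries are invented; they stand in for the Alice database.
import random

REPERTOIRE = {
    "DO YOU HAVE FEELINGS": [
        "No, I cannot experience any human emotions.",
        "My apparent emotional responses are simply algorithms.",
    ],
}

conversation_log = []  # the Head logs all the conversations it has

def respond(stimulus: str) -> str:
    """Pick one reply at random from the repertoire for this stimulus."""
    key = stimulus.strip(" ?!.").upper()
    fallback = ["Not many people express themselves that way."]
    reply = random.choice(REPERTOIRE.get(key, fallback))
    conversation_log.append((stimulus, reply))
    return reply

respond("Do you have feelings?")
respond("Do you have feelings?")
print(len(conversation_log))  # prints 2
```

Asking the same question twice may yield different replies, which is how a fixed stimulus-response database produces the appearance of variation.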
Karen Marcelo — project coordination, system configuration, alicebot customization.
Sam Trychin — customization of 3D animation and text to speech software
Barrett Fox — 3D modelling and animation
John Waters — system configuration and technical advice
Dr. Richard Wallace — creator of alicebot and AIML, alicebot advisor. Alicebot is a natural language artificial intelligence chat robot.
 Wittgenstein, Ludwig. Preliminary studies for the “Philosophical investigations” The Blue and Brown Books, Oxford, Basil Blackwell, 1958. p 6.
 Merleau-Ponty, Maurice. Phenomenology of Perception, Translated from the French by Colin Smith, London, Routledge, 1989. p137.
 Jean Baudrillard. “The Ecstasy of Communication,” in The Anti-Aesthetic: Essays on Postmodern Culture, ed. Hal Foster, Port Townsend, Wash: Bay Press, 1983. Baudrillard states that we are in an era of “connections, contact, contiguity, feedback, and generalized interface that goes with the universe of communication.” p127.
 Freud, Sigmund. The ego and the id, translated by Joan Riviere, London, Hogarth Press and the Institute of Psycho-Analysis, 1962.
 Kristeva, Julia. Powers of horror: an essay on abjection, translated by Leon S. Roudiez. New York, Columbia University Press, 1982. p 4. She said: “How can I be without border? That elsewhere that I imagine beyond the present…”
 Deleuze, Gilles and Guattari, Félix. A thousand plateaus : capitalism and schizophrenia, translation and foreword by Brian Massumi, Minneapolis, University of Minnesota Press, 1987. p4. For a succinct definition of “lines of flight’ see: Claire Colebrook, Understanding Deleuze, Allen and Unwin, Australia. 2002. pxxiv.
 Massumi, Brian. A User’s Guide to Capitalism and Schizophrenia: Deviations from Deleuze and Guattari, A Swerve Edition, The MIT Press, Cambridge, Mass, London, England, 1992. p70. Massumi said: “Think of the body without organs outside any determinate state, poised for any action in its repertory…”
 Haraway, Donna Jeanne. Simians, cyborgs and women : the reinvention of nature, London, Free Association Books, 1991. See: Chapter 8: “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century”, pp140-181.
 Baudrillard, Jean. Selected Writings, edited and introduced by Mark Poster. Stanford, California, Stanford University Press, 2001, p75.
 For an explanation of the mirror stage, see: Lacan, Jacques. Ecrits, Paris, Editions du Seuil, 1966.
 Lacan, Jacques. The Four Fundamental Concepts of Psycho-Analysis, Translated by Alan Sheridan, Jacques-Alain Miler, Editor. New York, Norton, 1977. p.203
 Nietzsche, Friedrich Wilhelm. On the genealogy of morals: a polemic: by way of clarification and supplement to my last book Beyond good and evil, translated with an introduction and notes by Douglas Smith. Oxford, New York, Oxford University Press, 1887. p13. In Helen Zimmern’s 1997 translation of Beyond Good and Evil: prelude to a philosophy of the future, Dover Publications Inc. New York, p13, Nietzsche said: “…a whole series of erroneous conclusions, and consequences of false judgments about the will itself, has become attached to the act of willing to such a degree that he who wills believes firmly that willing suffices for action.”
 Baudrillard, Jean. Seduction, Translated by Brian Singer, London, Macmillan, 1990. p54.
For example see: Anthony P. Atkinson, Michael S.C. Thomas and Axel Cleeremans, “Consciousness: mapping the theoretical landscape”, Trends in Cognitive Sciences, Elsevier Science, London, Vol. 4, No. 10, October, 2000, pp. 372-384, online at: http://www.trends.com/tics/contact.htm
 For more information see http://www.swin.edu.au/chem/bio/fractals/refslist.htm