Theory Beyond the Codes
The need to understand security through the interrelationships of surveillance, labor, and capital has become pressing since Edward Snowden’s 2013 revelations about the scope of the National Security Agency’s (NSA) pervasive monitoring programs. Security is a network of social modes autonomously enacting authority (the security apparatus). Central to this framework is the aspiration to the ‘state of information,’ most apparent in the surveillance Snowden documented, where vast amounts of information are collected and stored precisely so that they can be instrumentally deployed both to predict future behaviors and to police past actions. The authorization for increasing the scope and breadth of collected information originates with this aspiration — simply one dimension of the political economy of digital capitalism — and so cannot be considered in isolation. Addressing the challenges posed by pervasive monitoring requires recognizing that it is not an isolated phenomenon; rather, it reflects a broader collection of mutually reinforcing tendencies in digital capitalism itself. Surveillance, however broad and omnipresent, is nevertheless simply an epiphenomenon resulting from other, more fundamental demands.
Contemporary surveillance has its origins in earlier forms of surveillance: the issue was an ongoing concern throughout the twentieth century, immediately apparent not only in fictional works (George Orwell’s Nineteen Eighty-Four, published in 1949) but in the political realm as well (the scandal over wiretapping in the 1970s known as “Watergate”). Yet there were only occasional moments when the extent of the surveillance undertaken ever became apparent. Its clandestine nature has limited analysis and consideration of its role in digital capitalism. By nature, surveillance is surreptitious, secretive, suspected but only rarely demonstrated; at the same time, it also demands deception about its existence, a fact that Orwell noted in his novel. The uncertainty prior to Snowden’s revelations reflects these ambiguities: the memoranda and other documentation Snowden leaked to the press, unlike similar leaks and claims made in the decade prior to his highly visible release of NSA documents, provided direct evidence not only of the (formerly) conspiracy theorists’ claim that surveillance is omnipresent, but of the extent of its technical capacities to record, integrate, and process the vast amount of data generated by automating this surveillance so that it no longer requires human oversight. Asserting the materiality of the digital against disavowals of the physical dimensions of these technologies, in opposition to the ‘aura of the digital,’ is essential to this analysis. Digital automation increasingly performs tasks that were formerly the exclusive domain of human intelligence, in the process enabling a broader and more complete surveillance than ever before.
The ability to recognize faces automatically and the ability to listen to and transcribe speech, both tasks requiring a different kind of intelligence than is found in a clockwork mechanism, have enabled the pervasive monitoring of everyone’s every activity rather than a small portion of the activities of selected individuals; this expansive surveillance system is what Snowden revealed.
The broader significance of this confirmation is not technical, nor even an issue of privacy: surveillance has become a tool not only of governments, but of business, and of crime. The databases produced through this pervasive monitoring have become productive domains in themselves, creating value through the autonomous digital rearrangement of the information they contain. This new variety of unintelligent production impacts the organization and structure of society as a whole, creating a systemic crisis for capitalist value production that is unlike the periodic financial crises precipitated by a decline in the rate of value production over the past two hundred years: the deployment of surveillance, independent of any particular purpose, is linked to the inherent instability of digital capitalism; the forms of digital automation that enable pervasive monitoring are the root cause of this instability.
However, surveillance is only one half of a complementary pair. The systematic production of uncertainty (‘ignorance’), termed ‘agnotology,’ provides what surveillance itself cannot: the control and limitation of interpretations (the use value of information). Intimate connections between surveillance and agnotology emerged following Snowden’s announcement to the press — it was not the contents of the documents Snowden released that confirmed the surveillance program’s factuality; it was the United States government’s response to their release that demonstrated not only that the documents were true, but that their contents were of great importance. The tendency to dismiss this information with a “we know this already” response misapprehends its meaning: the confirmation of Snowden’s claims is a momentary breach in the agnotology that has historically surrounded information about these broad programs of observation-recording-analysis. Confirmation brought this agnotological dimension into focus, allowing a consideration of how one reinforces the other. These linkages are not apparent when each is considered in isolation; resistance only becomes possible after their mutually reinforcing relationship becomes visible. This examination of the political economy of the security apparatus through the agnotology::surveillance dynamic is diagnostic in nature. It seeks to make it possible to understand what responses, if any, are available.
Linkages between agnotology, hyperreality, and surveillance converge in the security assemblage: a paradigm of observation and control whose function is both immaterially productive (it enables the autonomous semiotic generation of value) and restrictive (it enables the mobilization of physical/immaterial force to defend this immaterial production). These productive-restrictive activities are distinct, yet mutually reinforcing — they form a dynamic cycle masked by the aura of the digital’s stripping of physicality from conscious consideration. Without this distanciation of the physical, the productive-restrictive cycles would become apparent through their necessarily disenfranchising actions as human agency is usurped by automated processes and autonomous oversight. The security assemblage appears as an impartial, disinterested alternative to the variable contingency of human agency: its uniformly applied mechanical responses create an illusion of objectivity. This autonomous response is a crystallized ideology, an inflexible restriction iterated by the all-or-nothing logic of digital protocols that are incapable of ambiguity, plurality, or contingency, as is apparent in, for example, the “right to read” implemented as Digital Rights Management (DRM) — either you have authorization or you do not. Authorization implicitly demands a continuous monitoring and maintenance where its authoritarian dimensions — what has been termed “pervasive monitoring” by the Internet Engineering Task Force (IETF) in “BCP 188: Pervasive Monitoring Is an Attack” — become readily apparent not only in the immaterial “space” of digital technology, but in the physical world as well. However, the IETF’s description of pervasive monitoring should not be limited to immaterial forms of surveillance:
Pervasive Monitoring (PM) is widespread (and often covert) surveillance through intrusive gathering of protocol artifacts, including application content, or protocol metadata such as headers. Active or passive wiretaps and traffic analysis, (e.g., correlation, timing or measuring packet sizes), or subverting the cryptographic keys used to secure protocols can also be used as part of pervasive monitoring. PM is distinguished by being indiscriminate and very large scale, rather than by introducing new types of technical compromise. 
The indiscriminate nature of this surveillance — that it captures all communications, not only those specifically targeted for examination — is its necessary and sufficient criterion; the analysis this surveillance offers depends on a vast data collection and collation. Yet this surveillance is not limited to actions online; it is applied to everyone, both in the physical world and in the immaterial realm of the Internet: pervasive monitoring also includes how these digital technologies have been applied to surveillance in the physical world, for example, with facial recognition on streets and in stores, passive traffic cameras that capture and log vehicle movements, and the monitoring of geolocation through cellphone tracking. All the contemporary machineries of surveillance depend on digital technology — whether employed as immaterial production or socio-political control — serving to reify the security assemblage in this implementation of pervasive monitoring itself. And these technologies function for both commercial interests and governments in much the same way: DRM is the most visible manifestation of this implicit, ubiquitous system that directly impacts the human readable form of digital objects, but this most apparent example is precisely an isolated surfacing of larger, dominant systems for control and observation that lie within the ‘database’ that enables immaterial production.
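The techniques the IETF definition names need not read any content at all. A minimal sketch, using entirely hypothetical data, of one such technique — correlating flows by their sequence of packet sizes, a form of the traffic analysis BCP 188 mentions — shows how encrypted traffic observed at two network points can still be linked through metadata alone:

```python
# Illustrative sketch (hypothetical data): traffic analysis by packet-size
# correlation. An observer at two network points never inspects content,
# only metadata, yet can still link the flows seen at each point.

def size_signature(packets):
    """Reduce a flow to its sequence of packet sizes (metadata only)."""
    return tuple(size for _, size in packets)

def correlate(flows_a, flows_b):
    """Match flows seen at point A to flows seen at point B by signature."""
    index = {size_signature(p): name for name, p in flows_b.items()}
    return {name: index.get(size_signature(p)) for name, p in flows_a.items()}

# Hypothetical captures: (timestamp, size) pairs; payloads are never read.
at_entry = {"flow-1": [(0.0, 1500), (0.1, 64), (0.2, 980)],
            "flow-2": [(0.0, 512), (0.3, 512)]}
at_exit  = {"x": [(5.0, 512), (5.3, 512)],
            "y": [(4.9, 1500), (5.0, 64), (5.1, 980)]}

print(correlate(at_entry, at_exit))  # {'flow-1': 'y', 'flow-2': 'x'}
```

The point of the sketch is the indiscriminacy the IETF emphasizes: the same few lines link every flow in the capture, not only ones selected in advance.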
These multivalent dimensions of ‘security’ in digital capitalism reify the convergent aspects of agnotology and surveillance — each is a reciprocal justification for the other: agnotology renders established knowledge uncertain, requiring greater detail and contextual understanding; surveillance provides this understanding, but at the same time produces so much data that its interpretation becomes uncertain because of the destabilizing effects of the equivalences posed by agnotology. Their linkage is a “virtuous circle” where each begets the other, making their expansion inevitable: the logic providing these justifications is inherently circular; this circularity is not a flaw of the system, but its precise focus — a circularity necessary for digital capitalism to become dominant.
The prophylactic disenfranchisement of human agency enables the generation of new domains for commercial expansion by transforming non-productive use values into new forms of value via immaterial production through the surveillance capacities of digital technology: the transference of this implicitly policing action is apparent in the shift to digital capitalism itself. Once a physically productive economy becomes one based on semiotic manipulation, the foundation of production undergoes a fundamental transformation from facture to reconfiguration — the database as a model. This transformation simultaneously enlists pervasive monitoring via surveillance (data collection) as the technical means for both the expansion of productive capacities and their defense against any socio-political challenges that might emerge. This change invokes surveillance at its most basic level: the immaterial securities that are so central to the circulation of values within digital capitalism depend on the database for their recombinative processes. It is precisely this protocol of recombination and permutation that characterizes digital semiosis as distinct from the meaning-construction of a human-oriented semiotic process. The resulting values are unintelligent; their meaning for the database is dependent on the full set, rather than on individual configurations. It is through this unintelligibility that digital production aspires to completeness, revealing its link to the ‘goal’ of all securitizing processes. This semiosis is the digital aspiration to the state of information coupled with the innate need of digital capitalism for a continuous growth of values. The semiotic expansion of immaterially generated values is apparent in how capital-productive ‘domains’ expand within society; all these activities are a reflection of attempts to reify the aura of information in immanent form. 
This instrumentality demonstrates how aspirations to the state of information become a literal tool of control and prediction (the security assemblage).
Thus, agnotology is not a cause, but a symptom of the expansive nature of the semiotic processes embedded within and enabled by digital capitalism. Disjunctions between physical assets and their role as immaterial tokens in semiotic production (via the database) reflect the structural demand in capitalism for continuous expansion (growth). Semiotic production is unintelligent, generating values through logical operations rather than directed, coherent action; it is autonomous, but unconscious. Agnotology is uniquely suited to the demands of digital capitalist surveillance because it interrupts the evaluative process assumed to lie at the base of all market decisions (the “rationality of markets”), a process conventionally closed only by the utility (use value) of interpretations — in other words, agnotological uncertainty makes any choice appear equally “good” (valid): it emerges when the contingent relationships between production and representation are recognized as being arbitrary, their meaning unstable, with dependencies relative to their particular application at any given moment, a process inherent to the unintelligence of semiotic facture. This continuous expansion of immaterial production is simultaneously an expansion of value accumulation without restraint. Eliding differences allows the semiotic manipulation of values; the “openness” of interpretation expands without constraint. This shift is performed by the complex relationship of physical and immaterial commodities in the valorization process mediated by agnotology and surveillance.
Connections between agnotology and hyperreality provide the foundation for the ‘security assemblage.’ Their similarities are readily apparent: agnotology is a particular failure of knowledge and interpretation (focused epistemologically, on the methods and procedures by which we arrive at conclusions, that is, on how we think), while the hyperreal is a specific effect on the perception/conception of the physical world itself (focused ontologically, transforming the underlying interpretation of the physical). Both have a semiotic character, but with divergent foci. Their impacts on interpretation originate with the same semiotic function in digital capitalism: the substitution of the semiotically produced for immanent physicality, enabling/contributing to the capitalist demand for the expansion of markets into new, previously unvalorized domains. These processes act together as enablers for semiotic recombination, each reinforcing the other in the denial of physicality inherent to digital capitalism.
The semiotic dissolution of ‘reality’ into interpretations contingent on a collection of a priori assumptions — the hyperreal — is a precondition for the dominance of agnotology. The processes inaugurated by the rupture between the hyperreal and the conception of the (historical) reality it supplanted are logically circular, self-reinforcing methodologies: where ‘the real’ was considered uniform and inviolate, the hyperreal is contingent and fabricated. This shift is not a “crisis of meaning,” but a transformation of meaning qua meaning, from a singular construction (linear) to a multiplicity of contingent potentials (network). The aura of information and the aspiration towards the state of information dominate this process of semiotic production: they are immanent in the network of different (competing) contingencies as anything categorized as ‘the real’ enables the reification of information as/within a database upon which various ‘operations’ produce momentary, unintelligent interpretations — semiotic (re)configurations — that can and will be challenged by later emergent alternatives.
Hyperreality is a shift to a “contingent epistemology” that reveals/depends on a fundamental uncertainty about ontology: in replacing ‘the real’ with its (semiotic) double, those empirical foundations commonly employed as a check on interpretations become subject to flux, variability, and instability. The ability to distinguish causes from effects, epistemological from ontological concerns, knowledge from uncertainty depends on a stable system of signification that is no longer available with the emergence of hyperreality. The elision of a priori distinctions, their conflation, is the operative demonstration that the hyperreal is dominant: a particular breach in epistemological understanding undermines knowledge in other domains in a mutually disruptive fashion. The meta-stable (contingent) nature of the hyperreal was noted by Jean Baudrillard in his prominent theorization “The Precession of Simulacra,” which developed the semiotic dimensions of the hyperreal explicitly as an ontological instability that creates epistemic doubt:
All the hypotheses of manipulation are reversible in an endless whirligig. For manipulation is a floating causality where positivity and negativity engender and overlap with one another, where there is no longer any active or passive. . . . Is any given bombing in Italy the work of leftist extremists, or of extreme right-wing provocation, or staged by centrists to bring every terrorist extreme into disrepute and to shore up its own failing power, or again, is it a police-inspired scenario in order to appeal to public security? 
The nature of “any given bombing” (its ontology) becomes a demonstration of the hyperreal: because all ‘terrorist’ actions are performed to evoke a specific political result, and so are not neutral, naturally occurring events, their underlying, intentional purpose is also always a specific political goal that depends on the ontological origins of the event itself, quite apart from the ‘terrorist’ action. In assessing these events, what renders them comprehensible as political actions is the creation of an interpretation revealing, demonstrating, and/or inferring this innate purpose (identifying the ontological nature of the particular event). The ‘ignorance’ that hyperreality describes is of a different character than traditional ignorance: it is an ‘ignorance’ that is superposed between true/false, certain/uncertain, known/unknown. In place of opposition, these positions lose their distinctness and become equivalent. Baudrillard’s argument anticipates those developments now recognized as agnotology: any interpretation is more accurately understood as a direct product of the particular model — simulation — employed to generate a specific interpretation (understood as the organization of ‘facts’ and their meaning). The range of mutually exclusive interpretations he poses for any such terrorist act are superposed; all these potentials cannot be true at the same time, yet distinguishing one from another is problematic, if not impossible. Digital semiosis reiterates the protocols of hyperreality: the resulting values are unintelligent, their meaning dependent on the full set, rather than on individual significations. Which particular interpretation is selected as ‘true’ reflects the innate biases of the human interpreter, rather than a logic of ‘facts.’ (The hyperreal is mute to binary oppositions such as true/false, factual/counterfactual, real/unreal that rationalize superposition into singularity.)
Interpretation depends on a priori models; the hyperreal/agnotological disrupt human agency not through a disavowal of meaning, but through a surplusage of superposed potentials.
Employing models to create interpretations demonstrates that ‘the real’ is specifically semiotic precisely because ‘the real’ is contained by its model (simulation) in a web of interdependent interpretations, where any given ‘fact’ is at one and the same time another model and a series of relationships whose stability (‘factness’) is contingent on their arrangement within the particular interpretation. The semiotic nature of interpretation is not limited to significance (meaning), but includes other forms of interpretation that are also subject to the same contingency under hyperreality: affect, causation, sensory experience . . . the instability (or contingent nature) of all interpretations in hyperreality is agnotology. Each particular interpretation depends on a network of potentials focused around how the event-being-interpreted is conceived in advance of its interpretation by the model used to produce that particular understanding: its nature as a political action in service of a particular political end is therefore dependent on the specific model employed to evaluate it, shaping conclusions in advance of their creation. Agnotology is the particular ‘ignorance’ of hyperreality: the inability to select a ‘fact,’ any fact.
The meta-stability Baudrillard identifies as symptomatic of the hyperreal is the affective result of agnotology — that no interpretation can be definitively chosen is precisely his point. This model itself is symptomatic of how contingencies collapse ‘certainty’ into its opposite: an infinite regression of signifiers around a ‘fact,’ any fact — thus, agnotological. Our inability to separate one potential interpretation from others reflects the shift from a realm of facts (reality) to one based in a logic of semiosis (hyperreality) — this transformation of knowledge and its foundations in argument is the effect of agnotological processes in action. The network of these potential interpretations and relationships (even those that are mutually exclusive) define the event-being-interpreted through a web of potential relationships and exclusions (the state of information). The editorial selection and fabrication of a singular interpretation reflects one set of ‘facts’ that may be countered by an alternative set that is equally potential, plausible, yet mutually exclusive and contradictory; this shifting of connections is the aura of information in action. The uncertainty of this situation reflects how the initial rupture (the hyperreal) challenges established knowledge; it describes an epistemological failure of empirical and logical relationships that disrupts not only epistemological but ontological knowledge as well.
Agnotology engages in semiosis precisely by ungrounding the conventional limitations imposed via utilitarian concerns (the demand that use value be homologous with a particular use). The apparent opposition of these two features of digital capitalism is ironic, since both function in the same fundamental way to expand/generate new domains for capitalist valorization. Semiotic production creates value limited only by the scarcity of capital. What appears in this new unbounded mode is a surfeit production of use values without function (use); it separates agency from efficacy, resulting in all evaluations of significance coming to depend on irrational (affective) procedures quite independent of any potential result or application.
Hyperreality and agnotology are mutual reflections enabling the expansion of digital capitalism through the surveillance demanded by this aspiration to the state of information: the final potential Baudrillard suggests — “a police-inspired scenario in order to appeal to public security” — is not an accident or coincidence; ‘terrorism’ always serves as an opportunity for capitalist expansion into unvalorized domains. The distinction between the intrusion of the police and the intrusion of commercial interests in digital capitalism disappears with the decomposition of all activities into translation/recording as data, and the concurrent emergence of pervasive monitoring as autonomous productive action. In a 2010 interview with The Atlantic, Google CEO Eric Schmidt made this innate linkage between surveillance and production both threatening and obvious:
“We know where you are. We know where you’ve been. We can more or less know what you’re thinking about. [. . .] Your digital identity will live forever . . . because there’s no delete button.” 
The potential immortality of digital files means that, unless destroyed, their information remains available and accessible in perpetuity — the collection and reconfiguration of data necessitates the maintenance and expansion of the database. Without this ever-expanding archive, the valorization produced by semiotic production cannot proceed: it depends on the ability to generate novel relationships. Demands for increasingly intrusive data collection are an integral part of how semiosis interfaces with surveillance — data collection becomes the primary activity of this production: surveillance is its own end product. This assertion of dominance, enabling and enabled by continuous, omnipresent surveillance, makes authoritarianism the logical form of this political economy: whatever the organization of society, it will inevitably tend towards concentrations of power and authority, as these are reflections of the structural demands imposed by the dynamics of the security assemblage.
The annexation of new domains for digital capitalism requires the replacement of ‘the real’ with the hyperreal, since this shift is the fundamental condition for semiotic production’s dominance; the ‘security’ network (most apparent as surveillance itself) is a fundamental dimension of semiotic production. The commercial and authoritarian dimensions of surveillance merge and overlap in this process: the NSA surveillance programs disclosed by Edward Snowden are simultaneously both a government collection protocol that employs commercial digital technology and resources (Snowden himself was not a government employee, but rather an “outside contractor” working for a commercial enterprise) and a database of valuable commercial data. Semiosis removes ‘intent’ from the crystallized form that purpose assumes in the database and expands into previously non-semiotic realms, mirroring the expansion of capitalism into unvalorized domains, in the process demonstrating how digital capitalism has broken value generation free from any constraint imposed by use value.
Surveillance is the logical antithesis of agnotology: it acts to produce certainty rather than uncertainty. ‘Security’ provides a far-reaching, nebulous justification for a range of actions, from expansions of surveillance (immaterial production) to war and imperialism (primitive accumulation). Baudrillard’s “police-inspired scenario in order to appeal to public security” serves as an underlying excuse for violations and suspensions of human rights, due process, and habeas corpus. All the various interpretations enabled by the instrumental database are united by an implicit threat — whether physical (violence) or immaterial (default) — that justifies imposed/intrusive authority (‘police action’) as a protective measure. ‘Reality’ has become an effect of what data has been collected and stored, as David Cole noted in his discussion of how the NSA uses their surveillance-generated database to make assessments of threat:
NSA General Counsel Stewart Baker has said, “metadata absolutely tells you everything about somebody’s life. If you have enough metadata, you don’t really need content.” When I quoted Baker at a recent debate at Johns Hopkins University [The Price of Privacy: Re-Evaluating the NSA, A Debate, April 7, 2014], my opponent, General Michael Hayden, former director of the NSA and the CIA, called Baker’s comment “absolutely correct,” and raised him one, asserting, “We kill people based on metadata.” 
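Hayden’s point can be made concrete with a minimal sketch using entirely hypothetical records: even a bare log of who called whom, with no content, audio, or transcripts at all, yields a map of association and a ranking of who matters in it.

```python
# Hypothetical call-metadata records: (caller, callee) pairs only, with no
# content whatsoever. From these alone, an analyst can rank individuals by
# how central they are to a network of contacts.
from collections import Counter

calls = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
         ("dave", "alice"), ("eve", "alice"), ("eve", "bob")]

# Count each appearance, as caller or callee, to measure connectedness.
degree = Counter()
for caller, callee in calls:
    degree[caller] += 1
    degree[callee] += 1

# "alice" emerges as the hub of the network from metadata alone.
print(degree.most_common(1))  # [('alice', 4)]
```

The names and records here are invented for illustration; the point is structural: the interpretation (“alice is the hub”) is produced entirely by the arrangement of data in the database, independent of anything the calls actually said.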
The expansion/assimilation of data becomes the totalizing nature of the database via this recording/recoding: semiosis reifies continuous surveillance as instrumentality; the database is ‘reality.’ Everyone is potentially a ‘terror’ threat. The database identifies those allowed to live and those to be killed, shifting responsibility from the human agents who give the orders to kill to the digital system’s encoding and arrangement of data. The question the database requires (as with any ‘terrorist action’) is one of purpose: what is it for? The NSA program — known in the mid-2000s as “Total Information Awareness” — makes the answer literally apparent, and its digital aspiration explicit: to convert the state of information into immanent instrumentality. Security researcher Wolfgang Sutzl identified this fundamental purpose with the ability to contain and anticipate outcomes, linking it to the instrumental function of a productive apparatus:
The actions through which security is “performed” concern the construction of physical and informational architectures of seriousness and essential “sameness.” Here, everything happens the way it happens because other possibilities have been rendered impossible. 
The “sameness” Sutzl describes is a consequence of the digital recording itself — the transformation of all actions and events into data (following the aura of information). Ideology is reified as technology; its demands become the only potentials possible, in an attempt to contain and limit those alternative potentials always already present in the state of information. Producing value (economic or political) by reconfiguring and rearranging data has as its goal this total containment of future outcomes; its predictive capacity is an aspiration to the state of information — a system attempting to create an instrumentality of “completeness” — and is the reason that the agnotology arising from this state simultaneously requires/enables its antithesis, surveillance.
All selections, choices become contingent, an effect of attempts to render the state of information as instrumentality: incompatible interpretations are equivalent within this database, ironically reflecting the demands posed by agnotology. The irrationality and arbitrariness of this agnotological marketplace reflects the priorities of the database (the aura of information): all positions are equivalent as data points; the conception as information (data) disregards its meaning (use value) — collapsing historical dualities of true/false, real/unreal, life/death. Agnotology produces a ‘capitalist market’ where no rational decision is possible in a dynamic where the demands of the security assemblage create the conditions for agnotology through surveillance, reinforcing the demand for greater certainty posed by pervasive monitoring itself. In the absence of epistemological checks against ‘reality,’ any decision becomes inherently a reflection of irrational factors extrinsic to evaluation. The disenfranchisement of human agency the database produces is the reiteration of this shift, a disavowal of responsibility for actions onto the matrix of surveillance data, separating effect from choice — a result uniting both the political and commercial dimensions of the agnotology::surveillance dynamic.
Meaning is independent of the database itself; the collection of data and its relations follow semiotic rules of combination, but without the lexical concern for their significance. Data reflects only the uniformity of the digital protocol — reifying and aspiring to the state of information — all positions, even when contradictory and mutually exclusive, coexist as discrete datapoints awaiting semiosis. This process is not capable of concern with the meaning (value) of what has been collected, indexed, referenced, and compiled. The nature of these values depends on how the database is employed — whether for commercial or political reasons, the results are irrelevant to the form and collection of data; values only become apparent through semiotic (re)configuration. Their significance (the use value of a particular semiotic configuration) is beyond the scope of what has been catalogued; it is in the nature of the semiotic processes within the database that any significance generated remains unrecognized. This valorization of (previously private) non-productive action enables the transformation of all formerly non-productive activities into varieties of labor from which value can be extracted, creating a broad new arena for economic development without corresponding compensation: the ‘digital author’ — the subject of the surveillance created and reified in/through the database itself. The ambivalent nature of this production reflects a semiotic reassembly with both political and commercial functions, where any values created are shifts in categorization and internal relationship specific to the database (the valorized liquidation of use value via surveillance). Such a process is not dependent on human agency — it is instead automated through algorithmic analysis, a semiotic production without human intervention or direct oversight.
The ‘security assemblage’ originates with the everyday understanding of “security”: a cluster of ideas focused on protection, freedom, and vigilance — as well as a specific meaning in financial terms: the linkage of legal obligations to specific debts. This assemblage’s formulation is coincident with capitalist investment practices generally; this fact emerges when we remember that investments are called “securities.” The underlying displacement or postponement of the ‘desired’ result (the definitional ‘goal’) — actually producing “security” — is required for the security assemblage to function, a symptom-effect of the underlying capitalist dynamic embodied in it: the investment in an expected but nevertheless hypothetical (i.e., the ‘risk’ of investing in a stock) future ‘payoff.’ This offset of results from means creates a linkage of expanding surveillance following the logic of surveillance itself: the limitations discovered by surveillance necessitate further examination, inaugurating a fractal-like infinite recursion, recalling Michel Foucault’s observation in Birth of the Clinic that “knowledge invents the secret.” It is precisely through the observational demands of empiricism that those domains not subject to observation become apparent. Once set in motion, the demands of “security” require perpetually intensifying effort, apparent in surveillance itself. This continuous, expansive demand is not a failure of surveillance, but a demonstration of its efficacy.
Because the security assemblage’s main purpose is the impossible task of eliminating all “risk,” it provides the ideal capitalist product: one that all citizens must purchase, but which can never actually be delivered — all profit, no risk; it is the imaginary “free lunch” reified in the aura of the digital. Continuously expanding investments in security, coupled with the increasing expenditures on surveillance they necessitate, are fundamental features of the security state, as is the continuous expansion of pervasive monitoring into all aspects of life. These technologies and protocols of observation-recording-analysis were recognized by the IETF as being uniformly deployed by criminals, corporations, and governments in their assessment of the impact that pervasive monitoring has on privacy. The transformative effects they note demonstrate the agnotology::surveillance dialectic at work:
[Pervasive monitoring is an attack that] may change the content of the communication, record the content or external characteristics of the communication, or through correlation with other communication events, reveal information the parties did not intend to be revealed. It may also have other effects that similarly subvert the intent of a communicator. [. . .] The motivation for PM can range from non-targeted nation-state surveillance, to legal but privacy-unfriendly purposes by commercial enterprises, to illegal actions by criminals. The same techniques to achieve PM can be used regardless of motivation. Thus, we cannot defend against the most nefarious actors while allowing monitoring by other actors no matter how benevolent some might consider them to be, since the actions required of the attacker are indistinguishable from other attacks. 
As the introduction to this report notes, there is no a priori means to distinguish between actors (criminal/corporate/government) in terms of how they use the tools of surveillance. The methods and technologies employed in pervasive monitoring are neutral to the human intent behind their deployment, reflecting the underlying nature of the database and its reification of the state of information. This neutrality is the reason the IETF terms this surveillance an ‘attack.’ It is the technique itself that produces the problem it seeks to resolve: it is continuously countered by the agnotology that provides its proximate justification — thus, there is a continual expansion (the “pervasive” in ‘pervasive monitoring’) because achieving the “completeness” that is the goal of this security-through-surveillance is impossible.
However, this duality — agnotology::surveillance — is an adaptive network that impedes resistance and ensnares all activities that attempt to escape or evade its logic (to the extent that ‘terrorism’ can be seen as a variety of ‘anti-globalization protest,’ it is a failure because it generates and substantiates the security assemblage). This amorphous, absorptive complex reflects the requirements created by security’s structural aspiration (shared by the digital) to achieve the state of information as immanent instrumentality of prediction and control: based in semiotic networks of relationships, this process is infinite, uncompletable, and thus continuously demanding of more data gathered through ever greater surveillance. Schmidt’s comment — “We can more or less know what you’re thinking about.”  — documents this instrumental goal through the presumption that what the surveilled “think about” and what those thoughts are coincide with the materials interceptable via surveillance. His presumption that what is collected in the database is capable of completely describing those individuals being examined demonstrates the authoritarian dimensions of this instrumentality: those who are so fully described that their thoughts are predictable cannot be considered “free” in any sense of the term.
Attempts to unmask this construct inaugurate the infinite regression of hyperreality where what one finds ‘beneath’ one semiotic mask is simply a second, a third, each identifiable by the progressive ease of its rupture: the underlying nature of the hyperreal is its construction as constellated signs, themselves moveable into new arrangements. The infinity of interpretations develops the arbitrariness of semiotic disassembly into a regression of successive layers, producing a vertigo of interpretation recognizable in the aura of information itself — an unbounded process that is a reification of the state of information. The agnotology::surveillance dynamic cannot be challenged along traditional lines of rational interrogation, logic, or evidence: these are always already captured by this process since they posit a retrograde return to use value (immanent ‘reality’); concern for a metaphysical value created through ethical (moral) considerations of social reproduction is cast aside by a technological determinism that replaces the human with the autonomous digital — social functions supplanted by digital efficacy. That this shift is also simultaneously a function of digital capitalism presents a direct demonstration of how it deploys information collection (pervasive monitoring) in its aspiration to achieve the state of information as an immanent instrumentality without regard for social or legal constraint.
The periodic crashes of capitalism are a symptom of the overextension inherent in capitalism itself, apparent in cycles of excessive production that Karl Marx described in the nineteenth century:
The stupendous productivity developing under the capitalist mode of production relative to population, and the increase, if not in the same proportion, of capital-values (not just of their material substance), which grow much more rapidly than the population, contradict the basis, which constantly narrows in relation to the expanding wealth, and for which all this immense productiveness works. They also contradict the conditions under which this swelling capital augments its value. Hence the crises. 
The “crises” Marx identifies are specifically financial, and instead of offering expansions and potentials for capitalist growth, they are destructive of value: it is the contradictions between “expanding wealth,” the conditions of production, and the purpose of capitalist production generally that create the crisis. The difference between physical facture and semiotic facture becomes apparent in the role that crisis has in Marx’s account, and its role in semiotic production. The crisis of the nineteenth century is one of overproduction outstripping demand and the capacity to generate profit. In digital capitalism, “crisis” arises from an inability to meet the demands posed by the scarcity of capital’s constraints on semiotic facture: it is only through the addition of an external source of value that the system can continue. The shift from capital as repository of value to capital as title to future production forces an expansion into previously unvalorized domains; surveillance mirrors the capitalist colonization of these same domains: they are different aspects of the same process of expansion where any crisis, natural or man-made, can provide an opportunity for exploitation as a revenue source for capitalist expansion. The perversity of this system arises because there are a finite number of external sources, and when those are depleted, the system necessarily enters a crisis.
However, moments of ‘systemic failure’ are not indicators that capitalism will implode; instead, what occurs is a retrenching that results in an expansion of capitalist processes into new domains in what journalist Naomi Klein called “disaster capitalism.” Her eponymously titled book explains the exploitation of disruptive social events as a method of economic expansion:
On August 5, 2004, the White House created the Office of the Coordinator for Reconstruction and Stabilization. [. . .] The office’s mandate is not to rebuild any old states, you see, but to create “democratic and market-oriented ones.” . . . The work is far too slow, if it is happening at all. Foreign consultants live high on costs-plus expense accounts and thousand-dollar-a-day salaries, while locals are shut out of much-needed jobs, training and decision-making. Expert “democracy builders” lecture governments on the importance of transparency and “good governance,” yet most contractors and NGOs refuse to open their books to those same governments, let alone give them control over how their aid money is spent. [. . .] But if the reconstruction industry is stunningly inept at rebuilding, that may be because rebuilding is not its primary purpose. According to [Shalmali Guttal, a Bangalore-based researcher with Focus on the Global South], “It’s not reconstruction at all — it’s about reshaping everything.” If anything, the stories of corruption and incompetence serve to mask this deeper scandal: the rise of a predatory form of disaster capitalism that uses the desperation and fear created by catastrophe to engage in radical social and economic engineering. And on this front, the reconstruction industry works so quickly and efficiently that the privatizations and land grabs are usually locked in before the local population knows what hit them. 
‘Systemic failure’ offers digital capitalism an opportunity to expand through the liquidation/elimination of competition — both economic and political (consider the repressive effects that spread through the United States of America as a result of ‘terrorist’ actions in 2001). These ‘systemic failures’ make the security assemblage explicit — what is secured is the ability of capitalist expansion to continue: reconstruction after a natural disaster, rebuilding after war, or “recovery” are primarily opportunities for the profit-generating process of stabilization: expanding demands for new production. The ideal situation for this unconstrained capitalist expansion is an open-ended conflict without apparent criteria for victory or readily attainable grounds for an end to the conflict itself: the “Cold War” between the United States and the Soviet Union during the twentieth century provided a framework for expansion and revenue generation echoed by that of the “War on Terror.” Actions that lead to a reduction or resolution of the conditions creating ‘terror’ are less significant than the exploitation of those actions as a means to further establish and expand these processes; this is precisely how surveillance seeks to justify its violations of both legal and historical limits on its expansion throughout the social.
Security necessarily employs ever-increasing monitoring tactics: surveillance, data mining, and “coercive interrogation” (torture) are all part of the same cycle of observation-recording-analysis that defines the ‘security assemblage.’ It does not matter what the proximate cause (source) is — or whether the attack succeeds — any particular challenge to the authority of this system is thus irrelevant a priori, since each event only serves to strengthen the systemic demand for pervasive monitoring. In the security assemblage, a failure in the instrumentality of information demonstrates the need for greater surveillance, not its futility.
Where law acts to ameliorate, security seeks to dominate through a totalitarian control not just of actions but of all potential actions. Security becomes an impossible goal because it postulates and requires the complete ability to monitor and predict all future behaviors; the conjunction of an ascendant ‘security assemblage’ and the emergence of the digital is not coincidental — the attempt to actualize the assemblage depends on the digital processing and immediate recall encased within the database itself. Semiotic production (via the database), the hyperreal, and agnotology all reflect the same structural demand in capitalism for continuous expansion (growth). Their development is mutually reinforcing as they are complementary dimensions of the same implicit processes in action: the shift to a semiotic model of production that itself has a modular, recombinative character.
Automation is ideally suited to the inherently permutational character of semiotic production: the logical rearrangement of a limited quantity of variables into all possible configurations. This is the logic of the database deployed as productive methodology — one that can proceed without human agency since it is mechanical, rather than one that requires the intervention of human judgment. This kind of production is most immediately apparent in the use of High Frequency Trading (HFT) systems: computers and software employ algorithms that automate decisions about stock purchase orders including price, timing, and size without human intervention; HFT generates financialized profits from the exchange of stocks, commodities, and other derivatives in the financial markets. For these computer programs, speed and proximity to the financial markets (via high speed data connections) are essential to their ability to generate multiple, sequential trades in microseconds. It is this factor that necessitates their being automated systems using digital technology.
HFT reveals one of the clearest examples of the semiotic procedures of digital capitalism in action. The Nanex analysis of the first “flash crash” in 2010 demonstrates how agnotology can be translated into forms that impact automated systems as well, simply by using the sequential nature of data processing (i.e., the linearity of computers) to create immanent uncertainty.
The first HFT “flash crash” in the financial markets happened on May 6, 2010. This event presents a model for how agnotology can emerge within autonomous digital systems; machines lack the comprehension of meaning characteristic of agnotology in human interpretations — agnotology can only be created through exploiting the structure of the machinic instrumentality itself: it arises through an asymmetry of information in an otherwise “open system” where all participants have equal access — a reflection of the ‘transparency’ digital systems demand for pervasive monitoring to be fully efficacious. It seems reasonable to assume that this exploit will become the norm unless rules are implemented to prevent it. Nanex’s analysis noted the agnotological effect and considered its implications within the financial markets:
What benefit could there be to whomever is generating these extremely high quote rates? After thoughtful analysis, we can only think of one. Competition between HFT systems today has reached the point where microseconds matter. Any edge one has to process information faster than a competitor makes all the difference in this game. If you could generate a large number of quotes that your competitors have to process, but you can ignore since you generated them, you gain valuable processing time. This is an extremely disturbing development, because as more HFT systems start doing this, it is only a matter of time before quote-stuffing shuts down the entire market from congestion. We think it played an active role in the final drop on 5/6/2010, and urge everyone involved to take a look at what is going on. Our recommendation for a simple 50ms quote expiration rule would eliminate quote-stuffing and level the playing field without impacting legitimate trading. 
The emphasized passage identifies the agnotological function in operation here: the “congestion” caused by the large number of quotes forces other HFT systems (those not already aware of the quotes because they didn’t generate them) to process these requests — creating an information gap between one system and all the others. The discrepancy in information possessed by the one system generating quotes versus all the others that must process those same quotes enables the generating system to gain a competitive edge because it does not need to process the sequence as a whole to assess its impact. What we see in this action is the automation of agnotism and its application to information-processing systems — the HFT computers. The time required to process the series of quotes affects what the other HFT systems will do, but first they must address the entire sequence; the system generating those quotes already has this information.
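The asymmetry Nanex describes can be reduced to a small toy model: a system that can skip quotes it generated itself reaches the meaningful market event far sooner than rivals that must parse the entire burst. The sketch below is purely illustrative — the names, per-quote cost, and quote counts are hypothetical assumptions, not drawn from Nanex’s data or any actual trading system.

```python
# Toy model of the quote-stuffing information gap (hypothetical values).
# Every system sees the same public stream, but the stuffer may skip
# quotes it generated itself, since it already knows their content.

PROCESS_COST_US = 5  # assumed microseconds to parse a single quote


def processing_time(quotes, own_ids):
    """Microseconds a system spends parsing the stream before it can act.

    Quotes whose origin is in own_ids are skipped: the generating system
    does not need to process its own output to know what it contains.
    """
    return sum(PROCESS_COST_US for q in quotes if q["origin"] not in own_ids)


# A burst of 5,000 junk quotes from the stuffer, then one meaningful quote.
stream = [{"origin": "stuffer", "symbol": "XYZ"} for _ in range(5000)]
stream.append({"origin": "exchange", "symbol": "XYZ"})

stuffer_delay = processing_time(stream, own_ids={"stuffer"})  # skips its own burst
rival_delay = processing_time(stream, own_ids=set())          # must parse everything

print(stuffer_delay, rival_delay)  # 5 vs 25005 microseconds
```

Even in this crude sketch the generating system acts roughly 25 milliseconds ahead of every rival on an identical public stream — the automated “information gap” the analysis identifies, produced not by secrecy but by congestion.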
Because successful interpretation depends specifically on both access to relevant information and the more specific ability to apply and employ it, the organization as a whole has an inbuilt bias towards the maximal accumulation and concentration of information. The baseline condition for success within such structures has historically been determined by an information differential: those lying at the greater end of the gradient tend towards success and dominance, while those falling at the lesser end tend to fail, excluding such mediating factors as already established positions and authorities that tend to replicate themselves. Information differentials scale from the actions of individuals to aggregate behavior within this construct. It is the means by which we evaluate claims and establish basic facts that agnotology challenges: dissent over foundational information and basic questions of what the ‘facts’ are produces agnotology, with the concomitant, necessary result that agnotology blocks our ability to create ordered, logical interpretations. In this regard agnotology behaves in a schizoid fashion, splitting and conflating the relationship of cause and effect, their identity as logical sequence.
Since the automation in HFT is typical of all forms of digital facture — the digital procedures all having the same foundation as instrumental code — the question to ask is whose desires does it serve? Automated facture has a different character than the autonomy of the (human) agency it replaces: the particular form of a digital work, when rendered for human viewing/encounter, makes the purpose of both the code and the particular data stream it engages apparent. Whether it “is” a movie, a piece of music, a text, or anything else, the data itself is encoded for human purposes. The illusion that our devices function without our input, without responding to our desires and demands, is a reflection of their (human-originating) design and the functions these machines are constructed to achieve; the superficially mysterious, perfect nature of the digitally manufactured — its magical aura — works to obscure the underlying physical reality of the digital and its subservience to human choices and agency. These foundations are all hidden by the aura of the digital. The creation of agnotological effects depends on individual human agencies whose cumulative impacts emerge with variable coherence at different levels of social organization, reflecting the dependent relationship between the functioning of digital technology and the demands made by the desires of human society (providing its formal basis in capitalism).
Without a social function — given and directed by the human agency that puts these devices into action — the digital, however active the device itself may be, is unintelligent. This dependence is obscured by immaterial production’s generation of value without function (use value): that digital technology is designed for and functions in service to particular human demands is lost when the digital aura dominates. Technology is a crystallization of human agency externalized in/as the machine; not only do we forget their physical basis as devices, we also forget their dependence on human desires and demands, enabling the transfer of agency to the autonomous system. Like all machines, digital software and hardware are constructed to meet specific human-originating goals, and these goals are the ‘reality’ of function (the use value of the machine), not the instrumentality it creates (its use).
The social realm of human desires and needs is of an entirely different order than their instrumentalities. The connections are implicit, rather than explicit, and so require a jump of interpretation to move from one level of this construction to another: the structure as a whole is necessarily interrelated to the political economy and social organization of the human society that produced it. And it is here that the duality of the digital becomes apparent from its earliest moments as a (military) technology being developed in American universities during the 1950s and 1960s.
Digital technology intersects with the political economy and the problem posed by human agency (labor) in this convergence of agnotology, hyperreality, and surveillance. Agnotology hijacks the traditional need to accumulate information (literalized in/by digital technology’s storage and databasing of data without reference to its nature, factuality, meaning, or interconnectedness) through its relationship to the state of information: it necessarily introduces equally valid, yet contradictory, information and interpretations. The issue becomes not simply a matter of economic or class structure, but of relations of greater and lesser control produced, maintained, and reified by how digital technology and the ideology of the digital reinforce each other in this accumulation of alternatives that generates ambiguity around issues of basic factuality and fundamental knowledge. The database and its semiotic processes proceed without the possibility of recognition or comprehension of any meaning thus produced.
The underlying implication of capitalist valuations (the concept of exchange value) is that value resides in productive action where currency represents a promissory note secured by future production. In digital capitalism, use value becomes a productive (immaterial) source of ‘new’ value via the semiotic process itself — this is the immaterial facture specific to digital systems. Automation entails a shift entirely unlike anything in human labor, as Marx noted regarding the inherent connections among labor, society, and capital:
The capitalist process of production is simultaneously a process of accumulation. [. . .] But with the development of social productivity of labor the mass of produced use-values, of which the means of production form a part, grows still more. And the additional labor, through whose appropriation this additional wealth can be reconverted into capital, does not depend on the value, but on the mass of these means of production (including means of subsistence), because in the production process the laborers have nothing to do with the value, but with the use-value, of the means of production. Accumulation itself, however, and the concentration of capital that goes with it, is a material means of increasing productivity. Now, this growth of the means of production includes the growth of the working population, the creation of a working population, which corresponds to the surplus-capital, or even exceeds its general requirements, thus leading to an over-population of workers. [. . .] By applying methods which yield relative surplus-value (introduction and improvement of machinery) it would produce a far more rapid, artificial, relative over-population, which in its turn, would be a breeding-ground for a really swift propagation of the population, since under capitalist production misery produces population. 
While Marx is describing literal population growth, such an interpretation would be incomplete: although the quantity of human labor does increase, its growth is a given, no matter what happens. The replacement of human labor by machines, however, has an immediate and dramatic impact: a ‘working population’ that exceeds the productive requirements of capital for labor. Price inflation — the increased price of commodities and the consequent devaluation of currency — is recognizable as a superficial increase in value counterposed by the equalizing force of currency devaluation: there are no “profits” being produced, only a reshuffling of promissory notes against future production, foreshadowing digital capitalism. The costs of this labor, what he calls “variable capital,” are resolved into “constant capital” — the costs of machinery and raw materials, without the variable expenditures posed by human labor — as automation supplants human agency. The result is an apparent production of value without an expenditure of any values produced by human labor. The generation of commodities through autonomous labor suggests a fundamental rupture in the production of values within capitalism, which implies the irrelevance of human labor and social foundations for value in digital capitalism. This implication is the aura of the digital splitting physical from immaterial concerns, even as it elides the physical entirely.
Marx’s conception of uniform labor power (untrained productive ability) inherently requires a basic ‘lack of skill.’ Its negation of social reproduction (dissolution of human agency) is inherent in this paradigm shift as it transforms compartmentalized human labor to automation and then to immaterial labor — from human activity to autonomous, semiotically generated commodity. Automated production and the earlier fragmentation of the assembly line are a challenge to human agency through their displacement of the skilled craftsman’s expertise and productive capacity. This fact finds ironic implementation through the rejection of ‘decoration’ (the most apparent marker of highly skilled hand labor) common to the art and design movements at the end of the nineteenth century that focused on the critique of industrial production. The linkage of commercial and moral concerns in Adolf Loos’s discussion of production, titled “Ornament and Crime” (1910), is typical of the paternalistic view of labor intrinsic to capitalism — what in the United States was called the “Protestant Work Ethic” — a linkage that enables and validates what are primarily commercial determinants and excuses for the social stratification of society and the forms of value generated through automation. Loos’s rejection of human agency ensures the displacement of skilled labor by unskilled labor, a factor in the industrialization of the “Arts and Crafts Movement” in the United States, a shift enabled by the assembly line and later reaffirmed in the automation of human productive capacity:
The advancement of culture is synonymous with the removal of ornament from objects of daily use. [. . .] It represents a crime against the national economy, and as a result of it, human labor, money, and material are ruined. Time cannot compensate for this kind of damage. 
Loos’s claim that the rejection of ornamentation is necessary for cultural development masks the underlying difficulty posed by the production of ornamentation: it requires skill and was associated with careful craftsmanship. The ‘deskilling’ this rejection of decoration entails is implicit in Loos’s argument: “wasted capital” is the primary focus of his theory-manifesto; it is concerned with justifying and validating what would appear to be unfinished commodities (lacking the surface finish provided by decoration). The highly skilled work needed to create these decorations also required more manufacturing time than the production of simple, unornamented objects that simultaneously required less skill and so could be more easily automated:
Ornament is wasted manpower and therefore wasted health. It has always been like this. But today it means wasted material, and both mean wasted capital. 
His actual justification for the rejection of decoration (human, skilled agency applied to production) is financial — eliminating the additional capital expended in producing decoration — but it has been disguised as a moral crusade against degeneracy. The argument against ornament is commercial, a supposition that hides an underlying concern with productivity: it takes longer to produce an ornamented object than one without ornament, and these “savings” result in higher productivity, i.e., more objects produced. This concern with rate of production necessarily implies a process of surveillance over those engaged in the labor, a monitoring of their work process and rate of facture — a dimension that literally becomes a new form of production through pervasive monitoring.
The transformations produced by urbanization, industrialization, universal literacy, and the democratic access to information fundamentally shifted this earlier condition, but without altering the baseline assumption that more information is equivalent to success — it is this ideology that is reified in the ‘security assemblage’ as attempts to create an instrumentality of information. Yet, there is a crucial difference between the values generated by an information-rich environment via databases linked to digital technology, and those created by the information-poor one: pre-industrial societies’ social structures self-replicate, not because information is less available, or necessarily less easily stored, but because it is less transmissible — accessible — to those who might otherwise use it; agnotology reproduces this condition within highly automated, inforich digital capitalism through the hyperreal by undermining the interpretative process and creating decoherence about social, political, and environmental conditions.
The problem posed by the inforich society is not access to information — accessing information becomes a commonplace through the always-on computer network — but rather the issue of coherence. Agnotology acts to generate decoherence: it undermines the ability to determine what information is factual and valid for constructing interpretations. At the same time, the concept of “factuality” becomes something that has been termed “truthiness” — information that appears to be valid. Yet agnotology is more than simply ‘ignorance,’ or a result of an information gradient or differential. The agnotism that is so apparent in digital capitalism generally is one where unusual and seemingly unlikely claims are presented without any acknowledgment that there is conflicting or contradictory evidence. The decoherence generated by agnotism serves established hierarchies within the political economy by rendering ‘human resources’ impotent to effect changes or challenge established social organization. An inability to resolve ‘controversy’ within the socio-political domain is one of the most visible symptoms of this decoherence in action.
Human agency requires a reciprocal relationship with the immanent physical world; it is this capacity to alter and affect the physical environment that is apparent in the emergence of Modernism and industrial capitalism following the Enlightenment’s invention of humanism in the eighteenth century. Capitalism’s definition as a worker’s externalization of their productive capacity — human agency — is an adaptation of this emphasis on individual activity as productive model grounded in the social reproduction of human society. The emergence of agnotology is demonstrative of the shift from historical capitalist production to one without reference or concern for the social: the recording and observation immediately recognizable as pervasive monitoring is one dimension of a general emergence of digital automation and facture that only becomes possible when the social itself is subject to dissolution.
Immaterial production reveals the law of automation in action as intellectual labor first becomes a commodity, then is simplified into autonomous processes — following the historical trajectory of the nineteenth century’s deskilling of labor inherent to assembly-line production. This apparent alienation of human agency is innate to capitalism — the externalization of productive capacity was its first, definitional moment. Digital capitalism remains fundamentally linked to a humanist conception of production through externalized agency; it is equivalent to the Modern period in its elevation of humanist values (agency) above all else, as demonstrated by contemporary wage disparities in the United States between the salary of the CEO and the salaries of those who perform labor: the decider — the CEO — commands a high salary because of their high degree of agency, while those who perform the actual labor are deemed deserving of only a tiny salary because they lack agency; those without agency are those without value. The reassertion of human agency is not a critical response to the alienation posed by digital capitalism, but a dimension of the system producing the alienation itself. It is this location of value in agency that enables capitalism itself.
Until the advent of digital technology, intellectual labor fundamentally required human agency (it could not be automated); only the CEO still retains this inviolability, hence the salary disparity. The issue of human agency, rather than a ‘barbarous relic,’ remains a fundamental constraint on all production and value generation precisely because value is a crystallization of specific social demands that are coincident with, and ultimately dependent on, human agency. It is worth remembering that all currency (money) is a reification of a social relationship — without this human dimension, value ceases to exist. Value is what the security apparatus acts to protect, replacing the social (the reified combination of human agency and the intelligent relationships accompanying that agency) with its own instrumental connections and procedures, autonomous and unintelligent, so that the (historical) foundation in human agency shifts to the autonomous digital system.
The transition to automation necessarily violates the basic foundation of capitalism itself: that workers exchange their labor (externalized productive capacity rendered as a ‘commodity’) for payment that is then recycled as the funds those workers spend to purchase the products of their own labor. The integrity of this foundation is violated as automated production replaces workers without enabling their shift into other forms of production — the permanent replacement of human labor by automated processes: what emerges is no longer the classically defined ‘capitalism’ of Marx.
Agnotology produces an alienated human agency quite apart from the traditional definition of capitalism itself: a reversion to earlier modes of human agency does not escape this problematic; it is these modes that have produced it. Challenges to human agency are at one and the same time the challenges digital capitalism poses for the social, demonstrated by the contradiction of value where agnotology acts to preserve value in the same way that the aura of the digital strips physicality from consciousness. The dehumanization of production that is the ultimate effect of the law of automation does nothing to address issues of value; quite the contrary, it makes questions of value production central to any critical analysis, leading inherently back to the construction of the social realm. The elimination of human agency from production is reproduced by pervasive monitoring; surveillance itself is an alienation from value emergent in the hyperreal’s rupture with the conditions of physicality. It is a fundamental transformation of how the social is constructed, and indicates a fundamental shift in the nature of capitalism itself. Value becomes not a social relationship but a technical assertion backed by authoritarian domination. The security assemblage acts to maintain the established order, preventing the emergence of alternatives; the heterotopias offered by agnotology act to dissipate what cannot otherwise be contained. The aspiration to the state of information, coupled with semiotic production, renders agency moot.
The dynamic of agnotology::surveillance functions simultaneously as affirmation of this hierarchy and as the means for its perpetuation, even as the system it serves grows more precarious. The problem is neither a question of agency, nor automation, nor even value production; instead, it is the paradox that lies at the foundations of capitalism in its development within Modernism. The Modernist concern with self-determination, individuality, and autonomy (agency) finds form in capitalism with workers’ externalization (alienation) of their own internal “productive capacity.” It is no longer a matter of choosing to act or not act, to do or not do; agency is contained: it is rendered powerless by the instrumentality of agnotology::surveillance. Methods of resistance and opposition developed in the nineteenth and twentieth centuries are neutralized in advance of their action. This is the problem that the Critical Art Ensemble directly addressed in their 2001 analysis Digital Resistance, where the concept of ‘tactical media’ — a specifically undefined concept in their proposition — demonstrated the security response to theoretical challenges. It is the undefined that becomes problematic in this system of authoritarian domination — that which retains the ambiguous character of the absent object, invisible except for the displacement it induces around itself, a factor that is an innate feature of how pervasive monitoring is a neutral system, serving all masters equally whatever their purposes, as the Critical Art Ensemble noted in 1994:
The primary concern among the military/corporate cyber police (Computer Emergency Response Team, the Secret Service, and the FBI’s National Computer Crime Squad) is that nomadic strategy and tactics are being employed at this very moment by contestational groups and individuals (in the words of authority, “criminal” groups). The cyberpolice and their elite masters are living under the sign of virtual catastrophe (that is, anticipating the electronic disaster that could happen) in much the same way that the oppressed have lived under the signs of virtual war (the war that we are forever preparing for but never comes) and virtual surveillance (the knowledge that we may be watched by the eye of authority).
The current wave of paranoia began in early 1994 with the discovery of “sniffer” programs. Apparently some adept crackers are collecting passwords for unknown purposes. 
The ‘terror’ that the Critical Art Ensemble identifies at the dawn of the Internet in the 1990s as a mass medium is identical to those elements that pervasive monitoring is ineffective at identifying: the dimensions of meaning that transform semiosis into value. The unknown use value that the information collected might have is precisely what makes it dangerous, makes countering and containing its potential a necessity. The challenge is not one of agency, but inherent to the observation-recording-analysis cycle itself: the transformation of unintelligent semiosis to meaning.
The rise of agnotology as an affect distinct from ‘ignorance,’ disinformation, misinformation, lies, or other propaganda can be traced to its basis in undecidability: unlike its (apparent, historical) parallels, whose foundations are essentially nonfactual and can be recognized as such, the foundations of agnotology merge with and undermine the discursive process itself; the ‘ignorance’ it produces does not reflect a lack of information, but rather is the mirror-like doppelganger of knowledge, dissipating action and challenge through a meta-stable hyperreality — actions without discernible reasons (use value) have limited to no impact on the conditions of reality, and lead inexorably to what psychology terms “learned helplessness” — a situation that innately supports the established hierarchy and order, while at the same time justifying the restriction, elimination, and criminalization of dissent/opposition through the ‘security assemblage.’ Disenfranchisement is the purpose of the security apparatus, shifting the maintenance of value from the social realm to the reified digital. What has been secured in this process is the future.
 Jean Baudrillard, Simulations, trans. Phil Beitchman, Paul Foss, and Paul Patton (New York: Semiotext(e), 1983), 30-32.
 Derek Thompson, “Google’s CEO: ‘The Laws are written by Lobbyists,'” The Atlantic (October 1, 2010), http://www.theatlantic.com/technology/archive/2010/10/googles-ceo-the-laws-are-written-by-lobbyists/63908/#video (accessed on January 12, 2014).
 Yasha Levine, “The Psychological Dark Side of Gmail,” Alternet (December 31, 2013), http://www.alternet.org/media/google-using-gmail-build-psychological-profiles-hundreds-millions-people (accessed on January 12, 2014).
 Luke Harding, “How Edward Snowden went from loyal NSA contractor to whistleblower,” The Guardian (January 31, 2014), http://www.theguardian.com/world/2014/feb/01/edward-snowden-intelligence-leak-nsa-contractor-extract (accessed on May 11, 2014).
 David Cole, “We Kill People Based on Metadata,” New York Review of Books (May 10, 2014), http://www.nybooks.com/blogs/nyrblog/2014/may/10/we-kill-people-based-metadata/ (accessed on May 11, 2014).
 The “Total Information Awareness” program has been extensively discussed and covered in the press. For a summary, the Center for Media and Democracy provides background on such programs through their “Source Watch” website.
 Wolfgang Sutzl, “Tragic Extremes,” CTheory td058 (September 20, 2007), http://www.ctheory.net/articles.aspx?id=582 (accessed on May 11, 2014).
 Michael Betancourt, “Valorization of the Author,” Hz no. 10 (2007), http://www.hz-journal.org/n10/betancourt.html (accessed on July 3, 2014).
 Michel Foucault, The Birth of the Clinic, trans. A. M. Sheridan Smith (London: Routledge, 2003), 200-01.
 S. Farrell and H. Tschofenig, “Pervasive Monitoring Is an Attack,” BCP 188, RFC 7258 (May 2014).
 James Tully, “Communication and Imperialism,” CTheory td035 (February 22, 2006), http://www.ctheory.net/articles.aspx?id=508 (accessed on January 12, 2014).
 Levine, “The Psychological Dark Side of Gmail.”
 Karl Marx, Capital vol. 3, trans. Samuel Moore and Edward Aveling (New York: International Publishers, 1996), 181.
 Naomi Klein, “The Rise of Disaster Capitalism,” The Nation (May 2, 2005), http://www.thenation.com/doc/20050502/klein (accessed on May 11, 2014).
 Eric Scott Hunsader, Jeffrey Donovan, and David O’Neill, “Analysis of the ‘Flash Crash,’ Date of Event: May 6, 2010” (June 18, 2010), quoted in http://www.zerohedge.com/article/how-hft-quote-stuffing-caused-market-crash-may-6-and-threatens-destroy-entire-market-any-mom (accessed on July 10, 2014).
 Karl Marx, Capital, vol. 3, 150-51.
 Adolf Loos, in Crime and Ornament: The Arts and Popular Culture in the Shadow of Adolf Loos, eds. Bernie Miller and Melony Ward (Toronto: YYZ Books, 2002), 30-31.
 Ibid., 33.
 Critical Art Ensemble, Digital Resistance: Explorations in Tactical Media (Brooklyn: Autonomedia, 2001).
 Ibid., 28-29.