Science in the Performance Stratum: Hunting for Higgs and Nature as Performance (2014)

The final version of this article has been published as:

Maaike Bleeker and Iris van der Tuin (2014). ‘Science in the Performance Stratum:

Hunting for Higgs and Nature as Performance’. International Journal of Performance

Arts and Digital Media 10(2): 232-45. Special issue: ‘Hybridity: The Intersections

between Performing Arts and Science’, eds. Eirini Nedelkopoulou and Mary Oliver.

Science in the performance stratum: Hunting for Higgs and nature as performance

Maaike Bleeker and Iris van der Tuin

Despite the difficulties in defining performance, the concept is central to the application of

science. The practice of engineering begins with a hypothesis concerning how a given

technology will perform: that is, engineers will start with a set of predictions. Next, the

hypothesis is tested in the lab during a developmental stage using both models and

prototypes. Then the testing moves to the field, for ultimately: ‘The proof is in the

performance.’ (McKenzie 2001: 107)

Within the humanities, Judith Butler, among others, has demonstrated the importance of

‘performance’ for an understanding of how bodies come to matter: materiality and

performativity are indistinguishable in processes of meaning-making (1993). Gender, for

example, is not a matter of either nature or nurture, but rather pertains to the complex ways in

which biology and culture interact.1 But not only do people perform, technologies do so too.

They have to. Technologies enact the procedures they are designed to execute and this

enactment is described and evaluated in terms of their performance. Technologies have to

‘perform, or else’, as Jon McKenzie puts it in his book of that title (2001). This claim

has far-reaching consequences; technologies are now assumed to participate actively in

processes of meaning-making. Technologies’ execution of procedures comes with a certain

surprise owing to the indistinguishability of materiality and performativity inherent to

meaning-making.

In his book, McKenzie traces the emergence of performance as a key concept in three

different fields: that of culture, of management, and of technology. He shows how in each of

these fields performance as a key term enacts a challenge: the challenge to be efficient (in

organizational management), the challenge of efficacy (in cultural performance), and the

challenge of being effective (in technology). Although the emergence and spread of the notion

of performance in these fields took place to a large extent independently from one another,2

these developments are not independent, but interconnected. At first sight it may seem that the

challenge to perform organizationally (to help the efficiency of companies and other

institutions) and the challenge to perform culturally (to foreground and resist dominant norms

of social control) are diametrically opposed. However, McKenzie’s point is precisely that

such an oppositional understanding obscures the fact that the situation is much more complex.

To avoid an approach built on binary oppositions, he brings in techno-performance as a third

performance paradigm:

1 There is much debate about Butler’s assumed ‘linguisticism’ and her self-asserted inability to allow for the agency of matter or to even discuss materiality. This polemicized debate falls beyond the scope of this article, and we wish to refer to Sari Irni’s recent overview for clarification (2013).

2 Independently in the sense that people using the notion of performance within one field were usually largely unaware of how the same notion was used in the other fields.

[A]though performance functions as a working concept in a number of technical sciences

and an array of manufacturing industries, although its application in the computer sciences

is so vast that it has been institutionalized in High Performance Computing Centres, and

although product information and marketing campaigns have placed this highly

technological performance in our garages, kitchens and living rooms, despite all this,

technological performance has largely escaped the critical attention of historians and

philosophers of science. (McKenzie 2001: 11)

Including techno-performance in our considerations of the emergence of performance as a key

concept is important regarding the ways in which technology is challenged to ‘perform, or

else’. This approach draws attention to performativity as involving a complex intertwining of

normativity and transgression; an intertwining that all too easily remains unnoticed in the

discourses around cultural performance and organizational management. In the discourse

around cultural performance, in which transgression is favoured as a way of exposing,

criticizing, and surpassing the norm, paradoxically the liminal has become the norm. In the

discourse around organizational management, which challenges workers to be creative and ‘think

outside the format’ in order to increase profit, liminality too is imposed as a norm, albeit in a

different way.

The challenge of techno-performance is its effectiveness. Measuring effectiveness

involves norms against which performance is tested (‘the proof is in the performance’). These

norms are part of the interpretative framework through which the performance of technology

is understood, designed and measured while at the same time the challenge put on technology

to perform is often also to challenge the norm and surpass it. Techno-performance thus draws

attention to the relationship between the challenge to ‘perform, or else’ characteristic of the

emergence of performance in a diversity of fields and a restructuring of what Michel Foucault

has termed the power/knowledge formation (1980). While Foucault exposed the disciplining

power of knowledge and how this power is modelled on an interplay between repression and

liberation (potestas and potentia), the rise of performance marks a transformation towards

power/knowledge modelled on what Martin Heidegger describes as a ‘challenging forth’3.

Power here manifests itself both in how norms are imposed and in how transgression becomes

the new standard. The result is a qualitative mutation of what we call knowledge: ‘a

becoming-performative of knowledge itself’ (McKenzie 2001: 14). This is what McKenzie

terms the rise of the ‘performance stratum’: a complex power/knowledge formation with

far-reaching practical and epistemological implications. Knowledge claims no longer refer to

something ‘out there’, but come to matter too, and can therefore only be affirmed as ‘true’

when a complex entanglement of words and practices is being sustained. These ‘onto-

epistemological’ implications are our concern. Feminist science studies scholar and

theoretical particle physicist Karen Barad has argued that in order to address such concerns,

we must proceed to ‘the study of practices of knowing in being’ (2003: 829).

In science studies, Andrew Pickering and other sociologists of scientific practice draw

attention to the performance of scientists in the laboratory and show how this illuminates the

ways in which knowledge is a socio-material product emerging from ‘the mangle of practice’

(1995). Bruno Latour and others propose to conceive of this ‘mangle’ in terms of a network of

interactions between human and non-human actors (1987). Not only do scientists perform, so

do their instruments and even the objects under investigation. The proposal here is

methodological: by taking what emerges as science, and therewith as a scientific fact, more

seriously than ever, we can refrain from the oppositional logic of fact and fiction—or

oxymorons like ‘faction’—and instead study the reality-producing effect of scientific practice.

3 Heidegger’s “The Question Concerning Technology” in Heidegger 1993: 308-341. See also McKenzie 2001: 155-172 on the translation of this term.

Barad, speaking about physics, goes as far as to state that matter itself is as active as our

interpretative frameworks, so that we do not give meaning to matter, but matter and meaning

co-constitute each other (2007). Her argument for ‘the study of practices of knowing in being’

implies that performativity—with its conforming as well as its norm-shifting effects—is

situated at the very heart of both scientific work itself and engagements with it. The Baradian

framework proposes to study how the scientist emerges as a scientist, the instrument as

instrument, and the object of investigation as an object of investigation. This onto-

epistemology differs from a network approach in that—to use Donna Haraway’s words—

‘[b]eings do not pre-exist their relatings’ (2003: 6). By foregrounding the processual relating,

which precedes the relata as well as their relationship as it has come about, the identities and

subjectivities of entities in the scientific laboratory are studied when they are truly ‘in the

making’, and matter’s entanglement with meaning becomes analyzable. The play here is with

causality: causal relations are not denied, but seen as effects; and in order to make them

analyzable as effects, one must start one’s research ‘in the mangle’. Following Butler,

Barad, Vicki Kirby (2011) and others, and reading their work through the work of McKenzie,

we argue for the productivity of this perspective as it was first developed within the

humanities, not only for understanding human bodies and, by extension, what it means to be

human, but also for our understanding of knowledge production in the sciences.

In the following, we will present a first step towards such an approach and we will suggest

how, brought to bear on the hunt for the Higgs particle at the European Organization for

Nuclear Research, or CERN (Geneva), this approach raises the question of whether in

scientific experiments not only scientists and technology are put under pressure, but also

nature itself is required to ‘perform, or else’. At the very frontier of particle physics, this hunt

encountered a complex intertwining of matter and meaning that is also the subject of

performance studies and the philosophy of science. Furthermore, we suggest, a posthuman

understanding of the entanglement of meaning and matter may provide a useful perspective

on the implications of the transformations in research and knowledge production currently

identified as Digital Humanities.

Matter and Meaning

The spectacular development of performance concepts over the past half century, the

movements of generalization in such divergent areas as technology, management, and

culture, the patterns of joint performance-challenges—all these suggest that the world

itself is being challenged to perform—or else. (McKenzie 2001: 158)

McKenzie is referring here to Heidegger’s idea that technologies challenge forth the world, a

point made in The Question Concerning Technology. For Heidegger, technology is a way of

revealing what is intimately connected to knowledge. Heidegger explains how the word

technology stems from the Greek word technikon which means that which belongs to techne.

And with regard to techne he observes two things:

One is that techne is the name not only for the activities and skills of the craftsman but

also for the arts of the mind and the fine arts. Techne belongs to bringing-forth, to

poesis; it is something poetic. The other thing that we should observe with regard to

techne is even more important. From the earliest times until Plato the word techne is

linked with the word episteme. Both words are terms for knowing in the widest sense.

(Heidegger 1993: 318, italics in original)

Technology reveals ‘whatever does not bring itself forth and does not yet lie here before us’

(Heidegger 1993: 319). It shares with knowing that it gathers together in advance the aspect

and the matter of that which is crafted with a view to the finished thing envisaged as

completed, ‘and from this gathering determines the matter of its construction’ (Heidegger

1993: 319). Heidegger also observes something specific about how modern technology

reveals; modern technology’s way of revealing takes the shape of Herausforderen, or

challenging forth. McKenzie refers to translator William Lovitt’s note on this term to explain

that:

Herausforderen means to challenge, to call forth or summon to action, to demand

positively, to provoke. It is composed of the verb fordern (to demand, to summon, to

challenge) and the adverbial prefixes her- (hither) and aus- (out). Although Lovitt does

not draw attention to it, his translation also bears with it the sense of inauthenticity, for

‘challenging’ and ‘challenging forth’ derive from the Latin calumniare (to accuse falsely),

which is also related to calvi (to deceive) and calumnia (trickery). (McKenzie 2001: 156,

italics in original)

Characteristic of modern technology’s ways of revealing is that this does not unfold into a

bringing forth in the sense of poesis but happens through a challenging (Herausforderen) that

places a demand and takes shape through claiming, regulating, and ordering4. It is not

merely a challenging but the challenge to bring forth a particular framing or ordering of

knowledge.

4 Heidegger gives the example of how in mining the earth is made to bring forth uranium or how in a hydroelectric power plant a dam is instrumental in yielding forth electricity (1993: 320-321).

Physics is Heidegger’s example of how modern science ‘entraps nature as a calculable

coherence of forces’ (1993: 326) and how as a result man is challenged forth into revealing.

Modern physics is not experimental physics because it applies apparatus to the

questioning of nature. The reverse is true. Because physics, indeed already as pure

theory, sets nature up to exhibit itself as a coherence of forces calculable in advance, it

orders its experiments precisely for the purpose of asking whether and how nature

reports upon itself when set up in this way. (Heidegger 1993: 326)

Modern physics, Heidegger argues, paved the way for what he considers to be the essence of

technology as a challenging forth into ordering of nature as the object of representational

knowledge of which humans are the subject. Heidegger’s approach to what McKenzie much

later would term techno-performance thus acknowledges the ‘agency’ of technology (without

mentioning the term) and draws attention to the intimate connection between the performance

of technology and knowledge. Nature, on the other hand, remains passive in his approach,

something that is acted upon rather than itself performing. Here, McKenzie’s further

elaboration on techno-performance suggests the possibility of a more active

understanding of matter that points in the direction of a non-anthropocentric approach to the

relationship between humans, technology and knowledge.

Like Heidegger, McKenzie too points to an intimate connection between technology

and knowledge, in particular to the way in which performativity becomes the legitimation, not

only of cultural performance, organizational management and technology, but also of

knowledge. Here McKenzie refers to Jean-François Lyotard’s seminal The Postmodern

Condition: A Report on Knowledge, in which Lyotard famously observes a radical change in

the status of knowledge in post-industrial societies ([1979] 1984), a change that he sees

underway since the 1950s, the same time period in which McKenzie locates the emergence of

the performance stratum.

Modern knowledge fully emerged in industrial societies of the nineteenth and early

twentieth centuries and legitimated itself upon what Lyotard calls ‘grand narratives’

(…). Postmodern knowledge, by contrast, legitimates itself by ‘optimizing the

system’s performance—efficiency’. Its emergence marks the decline of grand

narratives within academic and public discourse and the growing hegemony of

computer technologies. Significantly, Lyotard names this postmodern legitimation

‘performativity’. (McKenzie 2001: 14)

The emergence of performance and performativity as key terms in a variety of fields is

indicative of a more general transformation that affects the validation of various practices,

including those of producing knowledge. This transformation entails a shift towards

productivity as key to evaluation, validation and legitimation. Research and knowledge too

have to perform, or else. McKenzie demonstrates, with a detailed analysis of the development

and disaster of the space shuttle Challenger, how this situation turns the world itself into a test

site where scientists, under the pressure to perform, set out to put technology under the

pressure to perform a successful shuttle mission as the proof of their practice—and failed.

McKenzie’s analysis comes close to Peter Galison’s work on causality in cases of

unhappy accidents such as plane crashes. Although Galison shows that the billiard

ball model of causality is in fact only one of the possible approaches to this relationality, this

straightforward model is extremely popular amongst contemporary politicians, policy makers,

sometimes activists, and scholars of the applied and fundamental kind (2000). Galison shows

convincingly that after an unexpected, but always/already calculated event, such as a plane

crash, a huge mass of people expect to find a rational cause, and expect to find it following a

rational method of investigation. Family members of the deceased, company owners of

planes, producers of the tiniest technical details of planes, insurance companies, the ones

hooked to the news, politicians, world leaders, all expect, or hope, to find an isolatable reason.

Galison shows, firstly, that this cause is not isolatable, nor to be found rationally. What if the

cause of plane crash x, or any other event with consequences that are as huge as a plane crash,

is the tiniest material detail of a ‘thing’ such as a plane? What if this small mechanical piece

that seems to have singlehandedly caused a huge plane to change direction and crash is

already exchanged for a human cause, and back again? Who was responsible for screwing that

piece of metal to the plane’s wing? How are we to politically deal with this ongoing exchange

between pieces and systems of agential matter and instantiations of human agency? And how

are we to epistemologically justify the fact that our research results in accident investigation

shifts between tiny little pieces of metal acting on their own or in a system, and a human or a

group of humans malfunctioning? What political and epistemological tools do we have at

hand for responding to the interplay between agential matter and human agency? And isn’t

matter something inherently mute? Aren’t humans precisely human because they have power

over mute matter? The minute researchers claim that a human subject is responsible for the

malfunctioning of matter, agential matter has already re-presented itself. Galison

demonstrates not only the possibility of a material doer behind the deed. He also shows that

matter, in addition to being agential, and just like humans, does not exist on its own as a

relatum that enters into a causal relationship. There is never a single agent, there are always

multiple agents at work, and these multiple agents are not solely technical or human. The

solely technical cause is nothing but the desire of the physical scientist, and the solely human

one is just the wish of the social scientist. Both care for problem solving and seek resolution.

The general audience simply needs either the one or the other single cause. It is the way in

which we conduct our research into past events such as accidents that produces the ‘cause’ we

end up isolating, and that cause changes with the chosen methodology used to analyse the

past event.

Similarly then, in his analysis of the failure of the Challenger, McKenzie shows that the

materialization of this disaster resulted from how the pressure to perform played out in the

various networks of human and non-human actors that were realized in a great diversity of

practices that together produced the Challenger, and the disaster. One thing these practices

and networks shared was the challenge to make the Challenger, to make its mission happen.

That is, the challenge to make the mission materialize as an event in space and time.

McKenzie’s point is not that the pressure to perform was too high and that this pressure is to

blame for the failure of the mission. Rather, his analysis shows that the way in which the

disaster materialized cannot be understood from a single cause or failure but follows from

how the pressure to perform at work in a multiplicity of networks at different times produced

this disaster in a way beyond the control or imagination of individual agents involved in

making the decisions. Individual agents aimed precisely to prevent a disaster from happening.

Understanding how this disaster could nevertheless materialize requires a careful analysis of

how decisions are being made in a diversity of independent practices that feed back into one

another. It also requires acknowledging the agency of nonhumans (machines, animals, texts,

and hybrids, among others) in these networks of what Barad terms ‘agential intra-actions’ and

how such agency is a matter of enactment (2007).

Beyond Epistemology

Barad argues that what she calls ‘representationalism’ entails the theorizing of knowledge

claims from a starting point of assuming a fundamental gap between scholarly representations

on the one hand, and on the other, a world ‘out there’ to which the representations allegedly

refer (2007: 46). ‘[I]n particular,’ she states, ‘that which is represented is held to be

independent of all practices of representing’ (2007: 46). The knowing subject is assumed to be

present as a third, similarly independent or pre-existing member of the relationship and the

instruments get ascribed a neutral mediating function. Barad argues, then, that ‘[t]his taken-

for-granted ontological gap generates questions of the accuracy of representations’ and puts

the focus on questions of correspondence between descriptions and reality (2007: 47). Science

studies scholars have generated alternatives, and Barad mentions Joseph Rouse in particular,

who has successfully cut across the two famous instantiations of representationalism: on the

one hand ‘realism’, which places its bets on words representing nature without interfering,

and ‘social constructivism’, which argues that nature, as much as knowledge claims, is a

social construction, on the other. Non-representationalist studies are an alternative to both

and dive into the heart of scientific entanglements of words and material practices; in fact,

they ‘focus inquiry on the practices or performances of representing, as well as the productive

effects of those practices and the conditions for their efficacy’ (2007: 49). This requires an

understanding of matter as a doing, and of agency as the dynamism of this process, its

performativity. That is, what is required is an understanding of agency as ‘not something that

someone or something has’ (2003: 826) but as a matter of enactment. She continues that

‘agency is the enactment of iterative changes to particular practices through the dynamics of

intra-activity’ (2003: 827).

Barad refers to Niels Bohr, one of the founders of quantum mechanics, for an

elaboration of agential practices as intra-actions, or performances, in terms of ‘apparatuses’.

The apparatuses used by physicists, Bohr argues, are not merely observational devices or

laboratory instruments. They are not neutral probes of an independently existing natural

world, used by independently existing scientists, yet neither are they themselves

deterministically imposing a specific outcome. Rather they are sets of practices intra-acting

with other sets of practices, and as a set of practices they afford intra-actions: ‘specific causal

material enactments that may or may not involve humans’ (2003: 817). She explains her point

with an example that we will quote at length because it draws attention to how the nature of

phenomena, in this case light, does not exist independently from light intra-acting with the

apparatus used to measure its characteristics but results from it:

When light passes through a two-slit diffraction grating and forms a diffraction pattern

it is said to exhibit a wavelike behavior. But there is also evidence that light exhibits

particle-like characteristics, called photons. If one wanted to test this hypothesis, the

diffraction apparatus could be modified in such a way as to allow a determination of

which slit a given photon passes through (since particles only go through a single slit

at a time). The result of running this experiment is that the diffraction pattern is

destroyed! Classically, these two results together seem contradictory—frustrating

efforts to specify the true ontological nature of light. Bohr resolves this wave particle

duality paradox as follows: the objective referent is not some abstract, independently

existing entity but rather the phenomenon of light intra-acting with the apparatus. The

first apparatus gives determinate meaning to the notion of ‘wave,’ while the second

provides determinate meaning to the notion of ‘particle’. The notions of ‘wave’ and

‘particle’ do not refer to inherent characteristics of an object that precedes its intra-

action. There are no such independently existing objects with inherent characteristics.

The two different apparatuses effect different cuts, that is, draw different distinctions

delineating the ‘measured object’ from the ‘measuring instrument’. In other words,

they differ in their local material resolutions of the inherent ontological indeterminacy.

There is no conflict because the two different results mark different intra-actions.

(Barad 2003: 815-6, n. 21, italics in original)

The phenomenon of light is the result of these intra-acting components and its appearance

therefore cannot be understood independently from the measuring apparatus. Nor is the way

light appears simply the effect of the measuring apparatus. Rather, the different appearances

of the phenomenon of light result from the different intra-actions of light and the apparatus

(including the ones handling, in this case, the diffraction apparatus). These intra-actions are

specific causal material enactments that give shape to the phenomenon of light as it emerges

from its relation with the apparatus. The referent light emerges from how causal material

relationships are effected and this results in light appearing as a wave at one time and as a

particle at another.
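
Barad’s two-slit example can be sketched numerically. In the toy calculation below, all numbers (wavelength, slit separation, angular range) are illustrative assumptions, not a model of any real apparatus: the apparatus without which-path information adds amplitudes and yields fringes, while the which-path apparatus adds intensities and yields none.

```python
import numpy as np

# Toy sketch of Barad's two-slit example. All numbers are assumed,
# illustrative values; this is not a model of any real apparatus.
wavelength = 500e-9                      # 500 nm light
slit_sep = 10e-6                         # 10 micron slit separation
theta = np.linspace(-0.05, 0.05, 1001)   # viewing angles (radians)

# Apparatus 1: both slits open, no which-path detection.
# Amplitudes from the two slits add, so the screen shows fringes.
phase = np.pi * slit_sep * np.sin(theta) / wavelength
intensity_wave = np.cos(phase) ** 2      # normalized fringe pattern

# Apparatus 2: a which-path measurement destroys the phase relation,
# so intensities (not amplitudes) add: a flat pattern, no fringes.
intensity_particle = np.full_like(theta, 0.5)

print(f"fringe contrast without which-path detection: "
      f"{intensity_wave.max() - intensity_wave.min():.2f}")
print(f"fringe contrast with which-path detection:    "
      f"{intensity_particle.max() - intensity_particle.min():.2f}")
```

The point is not the physics as such but the structural fact Barad emphasizes: the same light, intra-acting with two different apparatuses, yields two different determinate phenomena.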

The Proof is in the Performance

Technologies (…) are made to perform through a circular process of hypothesis and

measurement, prediction and evaluation. (…) Even the most common and apparently

simple technological performances, such as those of water skis and flame-resistant

carpets, are the results of intense research that entails the measurement, the evaluation,

and increasingly, the modelling of performance. (McKenzie 2001: 110)

Technologies are designed, challenged to perform, and then evaluated for their effectiveness.

The results are fed back into the designing process and used to further develop the technology

towards performing according to the engineering hypothesis on which the design is based.

The result is a kind of feedback loop between predictions and performance: ‘performance is

evaluated in terms of predictions that are then modified based on performance and

subsequently used as basis for evaluating the next performance, and so on and so on’ (2001:

107). So what is happening is not only that technologies are made to enact the

procedures they are designed to execute (i.e., matter being disciplined into performance) but

also their performance feeds back into the design through which the engineers enact their

understanding of their performance. Drawing on Barad we might describe the designing

process thus as a set of intra-actions. In such intra-actions, agency is not merely a matter of

humans trying to make matter perform according to their predictions, but also of the

enactment of intra-actions between humans and matter that mutually implicate each other and

feed back into each other. The successful performance of technologies like water skis and

flame resistant carpets is the result.
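
The prediction-performance feedback loop that McKenzie describes can be caricatured in a few lines of code. Everything here is a hypothetical stand-in: the hidden behaviour of the artefact, the target norm, and the revision rule are assumptions chosen only to make the loop visible, not an account of real engineering practice.

```python
# Caricature of McKenzie's feedback loop: predict, test, revise, repeat.
# The artefact's 'true' behaviour is hidden from the designer and is a
# pure assumption of this sketch.

def measure_performance(parameter: float) -> float:
    """Stand-in for a field test of the artefact."""
    return 10.0 * parameter - 2.0  # hidden behaviour, unknown to the engineer

def design_loop(target: float, rounds: int = 20) -> float:
    parameter = 1.0  # initial design guess
    for _ in range(rounds):
        performance = measure_performance(parameter)  # the proof is in the performance
        error = performance - target                  # evaluated against the norm
        parameter -= 0.05 * error                     # revise the design for the next test
    return measure_performance(parameter)

print(f"performance after iterative redesign: {design_loop(target=5.0):.3f}")
```

After twenty rounds of testing and revision, the measured performance converges on the target norm: prediction and performance have been fed back into each other until they coincide.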

What if these technologies being tested and challenged are not water skis, flame-

resistant carpets, or spacecraft, but scientific instruments (apparatuses), designed for and

used in experiments? What if they are particle accelerators, like the one used at CERN to

produce the experimental proof of Peter Higgs’ (and others’) 1964 theory about the Higgs

boson? Billions in funding and years of work were invested in immense experimental

apparatuses in which particles are made to collide at close to the speed of light. Huge

detectors were constructed to observe and record the results of experiments intended to give

‘the physicists clues about how the particles interact, and provides insights into the

fundamental laws of nature’ (CERN official website)5. The aim is to study the characteristics

of particles and their behavior, while at the same time the experiments are constructed with a

specific particle (and the characteristics it is assumed to have) in mind, namely to prove the

existence of the Higgs boson.

The Higgs boson is such an interesting ‘object’ of research because in the theoretical

models of particle physics it is referred to as a particle (thus suggesting that it is matter), whereas

making it manifest as a particle was precisely the challenge of the hunt. And making it manifest

5 http://home.web.cern.ch/about Last accessed March 2, 2014.

meant to make it perform in a way that would confirm the interconnectedness of the Higgs

mechanism, the Higgs field and the Higgs particle, and how this is embedded in the Standard

Model. Demonstrating the truth of Higgs’ theory therefore meant to design an experiment in

which the Higgs particle—‘the missing piece in the Standard Model puzzle’, as the press

release of the Royal Swedish Academy of Sciences announcing the Nobel Prize in Physics for

2013 puts it6—is literally challenged forth as matter (even though its ultra-short existence can only be established after the fact, by detecting the traces of its decay). This challenging forth

happens by means of apparatuses constructed to make matter perform and at the same time it

is within this performance that the Higgs particle is supposed to materialize. That is, matter

here is a doing, an enacting of the intra-actions afforded by the apparatus (the accelerator).

In the case of the collider, therefore, the proof of the performance is not only a matter

of the efficiency of the technology to do what it is designed to do (to produce the collisions

that will demonstrate the existence of the Higgs particle). In order for the experiment to be

successful, the Higgs particle has to perform as well. And it has to perform within the

parameters according to which the detectors are designed to detect its traces. In the collider

one might therefore say, nature is made to perform, and to perform in such a way as to

produce the proof of Higgs’ theory (and the collider’s intended efficiency).

Hunting for Higgs

Following Barad we may argue that the Higgs particle appeared the way it did as a result of the specific causal material enactments, or intra-actions, afforded by the apparatus, and that agency here

has to be understood as the dynamism of this process. At CERN, the proof of Higgs’ theory

was brought about by decisions being made at a great diversity of relatively independent

moments, decisions that are not necessarily about the goal that retrospectively makes them

6 http://www.kva.se/en/pressroom/Press-releases-2013/The-Nobel-Prize-in-Physics-2013/, last accessed 8 March 2014.

meaningful as part of how the particle appeared. This is not to say the detection of the Higgs

boson as proof of Higgs’ theory was a coincidence that came about unintentionally. On the

contrary, the hunt for the Higgs boson was an extremely carefully coordinated and long-term

scientific project in which nothing was left to chance. However, precisely the scale of the project, in terms of time and otherwise, means that its outcome is the result of many different decisions

made at many different moments in the process by many different agents. Although ultimately

all these decisions were part of a process that aimed for one overarching goal, the scale of the project meant that the agents involved could make the innumerable decisions required of them only from a point of partial vision with regard to the totality of the project.

Therefore, understanding how the Higgs boson was made to perform will require a careful

analysis of how decisions have been made in a diversity of seemingly independent practices

that fed back into one another, as well as acknowledging the agency of non-humans

(machines, animals, texts, and hybrids, among others) in these networks of ‘agential intra-

actions’. ‘Understanding’, here, does not mean finding out what this process of decision-making tells us about how scientists and others make decisions (a sociology of science).

Rather, understanding means grasping how the Higgs particle materialized the way it did as a

result of the ways in which the pressure to perform plays out through extensive networks of human and non-human actors: networks that are realized in the diverse practices that together make up the hunt.

Furthermore, such understanding will require taking into account that a large number

of decisions were not made by humans. Large-scale highly technological research projects

like the hunt for Higgs at CERN involve complex constellations of scientists and their

instruments in which human researchers are literally nodes in networks that operate on a scale

and in cognitive modes that exceed human understanding. For example, the complexity of the

analysis required to interpret the data produced by the particle accelerator at CERN was

simply impossible until machines were developed that were able to perform the complex

calculations required to do so. It was human intelligence that built the machines. However, being the product of complex collaborations of large groups of experts, neither the capacities of the machines nor the measurements produced by these machines can be understood as being controlled by individual human agency. Furthermore, the machines do not perform on their

own. Rather, groups of humans and technology form complex assemblages of actors (human-

machine-matter-data) interacting with one another in various ways.

Wouter Verkerke, head of statistical analysis at Nikhef (the Dutch partner of CERN),

observes in the Nikhef Annual Report for the year 2012:

Calculation of the production rate of Higgs bosons shows that at the LHC roughly one

in 5 billion collisions will result in a Higgs boson. Thus to collect a sample of a few

hundred collisions with a produced Higgs boson, more than a trillion proton-proton

collisions will need to be produced and examined. To meet this requirement the LHC

collides protons 40 million times per second. It is impossible to record the detector

data for a trillion collision as each event results in about 200MB of data. Thus events

are preselected in real-time with a three-level ‘trigger system’, that rapidly examines

recorded collision data and discards the large majority of events that are deemed

uninterested …. The first-level selection is built on custom hardware that reduces 40

million events per second to approximately 100.000 events per second. The next two

levels are software-based and further reduce the rate to about 600 events per second.

Despite the aggressive data reduction through online trigger pre-selection, a few

petabyte (=1000 Terabyte) worth of information is produced every year the LHC runs.

It requires about 10.000 year of computing time to reconstruct all these events.

(Verkerke 2012: 16-17)
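The scale Verkerke describes can be made concrete with a back-of-the-envelope restatement of his figures. The following sketch simply recomputes the arithmetic of the quoted passage; every number is taken from the quotation itself, and the variable names are ours:

```python
# Back-of-the-envelope restatement of the figures quoted above (Verkerke 2012).
# All numbers come from the quotation; this is an illustration, not a simulation.

higgs_per_collision = 1 / 5e9   # roughly one Higgs per 5 billion collisions
target_sample = 200             # 'a few hundred' produced Higgs bosons

# Collisions needed to collect the sample: 200 * 5e9 = 1e12,
# i.e. 'more than a trillion proton-proton collisions'.
collisions_needed = target_sample / higgs_per_collision

# The three-level trigger reduces 40 million events per second to about 600.
collision_rate = 40e6           # collisions per second at the LHC
after_level1 = 100_000          # events/s after the hardware (first-level) trigger
after_level23 = 600             # events/s after the two software-based levels

overall_reduction = collision_rate / after_level23

print(f"collisions needed: {collisions_needed:.0e}")        # 1e+12
print(f"trigger keeps about 1 in {overall_reduction:.0f}")  # ~66667
```

In other words, the trigger system discards all but roughly one in 67,000 events before any human, or any offline machine analysis, ever sees the data, which is precisely the point made in the surrounding argument about interpretations preceding human engagement.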

Finding evidence for Higgs involves connecting an incredible number of scraps left after the fact and reading them as traces or, in Barad’s terms (adopted from Bohr), as ‘permanent marks

left on bodies’ (Barad 2007: 119, 174, 197). This would be impossible without the modes of

processing data made possible by computer technology. Agency (of observation), in this

situation, cannot be understood separately from the technology used to process the data. A

large number of interpretations needs already to have been made before human agency can engage with the data, and these interpretations take place in processes that exceed the capacities of, and are not fully graspable by, human agency. This raises questions concerning the relationship between

human agency and technology, and of technology as itself endowed with agency.

With the concept of intra-action, Barad attempts to do justice to the fact that the

atomist, or simply individualist, metaphysics of any representationalism imports assumptions

into epistemological study which have distortive effects. In her words:

(…) discursive practices are specific material (re)configurings of the world through

which the determination of boundaries, properties, and meanings is differentially

enacted. That is, discursive practices are ongoing agential intra-actions of the world

through which specific determinacies (along with complementary indeterminacies) are

enacted within the phenomena produced. (2007: 148-9; original emphasis)

The important corollary claim is that this does not do away with causality (as if the study of

science has lost its ability to make worthy observations). Onto-epistemologists deal with non-

linear causality and the non-finite. First, while we cannot assume that scientists work according to a plan, we must acknowledge that observations of seemingly clear-cut objects,

effective use of measuring and recording instruments, and scientific results are constantly

being made by scholars all over the world. An onto-epistemological approach like Barad’s does not aim to debunk scientific research (cf. Latour 2004). Second, the results, instruments and

objects have to be read as excessive according to Haraway, which means that they reach

beyond their limits: ‘(w)hat boundaries provisionally contain remains generative, productive

of meanings and bodies’ (1988: 594).

Barad’s following statement summarizes what we suggest has happened to (the proving of) the Higgs particle at CERN:

Meaning is not a property of individual words or groups of words but an ongoing

performance of the world in its differential dance of intelligibility and unintelligibility.

(2007: 149)

This suggests not only that the word ‘Higgs boson’ is carried by the experimental set up of the

collider (including nature) and the entire machinery around it (financing, politics, publishing

market) but also that the curious object itself plays an active role in the constellation. The

proof of Higgs’ existence not only required particular modes of behaving (performing) from the scientists involved, but also placed similar demands on the Higgs particle itself and the

technology involved. They too have to perform, or else the particle (which had to be assumed

for the making of the collider technology) does not exist. In and through its performance, the

particle ‘matters’. The Higgs particle thus urges us to understand that the object (ontology—

the existence of the Higgs particle) and the subject of knowledge (epistemology—the way in

which the Higgs particle is captured) are co-constitutive. This also raises the question in what sense the hunt for Higgs is fundamentally different from other experiments. Is it the

scale that makes this experiment stand out? Or is the hunt for Higgs a sort of experimentum

crucis with the materiality of meaning, i.e. the possibility of detecting in material form that which gives meaning to the experimental project and its scientific basis?

Posthumanist Performativity

A posthumanist understanding of performativity points to the materiality of meaning making:

to how discursive practices and material phenomena do not stand in a relationship of

externality to one another but are mutually implicated in the dynamics of what Barad calls

intra-activity. ‘The point is not merely that there are important material factors in addition to

discursive ones; rather the issue is the conjoined material-discursive nature of constraints,

conditions and practices’ (Barad 2003: 823). Matter is substance in its intra-active becoming

and this intra-activity Barad proposes to understand as performativity.

Understood as intra-activity, the performativity of knowledge production describes

how knowledge is a matter of enactment of relationships that precede the subject and object of

knowledge as their relata. This does not mean that the subject and object of knowledge become meaningless, either in reality or for analyses of knowledge production, but rather that

they result from how relationships are enacted. This happens not only in big science projects

like the hunt for the Higgs particle but also, for example, in databases where interfaces afford

intra-actions from which the objects of knowledge emerge as relata. How they appear is as

much a matter of the performance of technology as of our entanglement with this technology.

Here, it seems, much is to be gained from a joint science-humanities perspective on the

performativity of practices of knowledge production. A posthumanist understanding of performativity presents a useful perspective not only on big science research projects in which technology is instrumental in setting up experiments that exceed human capacities, but also on the rise of digital culture and the transformations associated with the so-called Digital Humanities. The emphasis on databases, as N. Katherine Hayles observes, ‘shifts the

emphasis from argumentation—a rhetorical form that historically has foregrounded context,

crafted prose, logical relationships and audience response—to data elements embedded in

forms in which the structure and parameters embody significant implications’ (2012: 39).

Different interfaces can make the same database perform differently, according to what users

find useful or scholars want to convey.
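The point that different interfaces make the same database perform differently can be illustrated with a minimal, purely hypothetical sketch (the records and both interface functions below are invented for illustration, not drawn from any actual database): the same underlying records yield a chronology through one interface and an aggregate through another, so different objects of knowledge emerge from the same stored material.

```python
# Hypothetical illustration: one 'database', two interfaces, two different
# objects of knowledge. Records and functions are invented for this sketch.

records = [
    {"year": 2012, "experiment": "ATLAS", "result": "Higgs candidate"},
    {"year": 2012, "experiment": "CMS", "result": "Higgs candidate"},
    {"year": 2013, "experiment": "ATLAS", "result": "spin-0 confirmed"},
]

def timeline(db):
    """Interface 1: present the records as a chronology."""
    return [(r["year"], r["experiment"]) for r in sorted(db, key=lambda r: r["year"])]

def by_experiment(db):
    """Interface 2: aggregate the same records per experiment."""
    counts = {}
    for r in db:
        counts[r["experiment"]] = counts.get(r["experiment"], 0) + 1
    return counts

print(timeline(records))       # [(2012, 'ATLAS'), (2012, 'CMS'), (2013, 'ATLAS')]
print(by_experiment(records))  # {'ATLAS': 2, 'CMS': 1}
```

Neither view is the 'real' one: each interface enacts a different cut through the same stored records, which is what it means here for the database to perform differently.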

Machine reading differs considerably from human reading and affords new ways of

understanding. More than that: human intelligence cannot be understood separately from the

technology through which humans relate to their environments. This idea may seem

controversial to scientists considering their tools to be generators of objective data existing

independently from the ideas of scientists and presenting the material proof (or not) of the

validity of the theory. It may seem controversial as well to humanities scholars who consider

ideas and insights to be the product of human agency, and understand agency as something

existing independently from the technology used in undertaking research. However, the idea

that cognition developed through the interaction of humans with tools and technologies is not

controversial at all in other fields of research, such as paleoanthropology, evolutionary biology, and neurophysiology, which point to the intimate connection between the development of human intelligence and the tools and technologies used by humans. In this interaction, Latour, Callon

and Law argue, all entities achieve significance in relation to others. This is what they term

‘relational materiality’, with which they propose a material extension of semiotics that allows

for an anti-essentialist approach to science in terms of a network of heterogeneous elements

realized within a set of diverse practices (Law and Hassard eds 1999). Taking seriously the

agency of non-humans, this network is conceived as a heterogeneous amalgamation of

textual, conceptual, social, and technical actors. For our part, we propose to proceed in this way in fields as diverse, yet intrinsically connected, as performance studies, digital humanities and the philosophy of science.

The ideas elaborated in this article were developed in dialogue with Jan van den Berg, theatre maker (whose works include Higgs—Stand Up Physics) and co-author (with Hannie van den Bergh) of the award-winning documentary Higgs: Into the Heart of Imagination.

References

Barad, Karen (2003), ‘Posthumanist performativity: Toward an understanding of how matter

comes to matter’, Signs: Journal of Women in Culture and Society 28: 3, pp. 801-31.

Barad, Karen (2007), Meeting the Universe Halfway. Quantum Physics and the Entanglement

of Matter and Meaning, Durham & London: Duke University Press.

Butler, Judith (1993), Bodies that Matter: On the Discursive Limits of ‘Sex’, New York and

London: Routledge.

Foucault, Michel (1980), Power/Knowledge: Selected Interviews and Other Writings, 1972-

1977, edited and translated by Colin Gordon, New York: Pantheon Books.

Galison, Peter (2000), ‘An Accident of history’, in Peter Galison and Alex Roland (eds),

Atmospheric Flight in the Twentieth Century, Dordrecht: Kluwer Academic Press, pp. 6-43.

Haraway, Donna (1988), ‘Situated Knowledges: The Science Question in Feminism and The

Privilege of Partial Perspective’, Feminist Studies 14: 3, pp. 575-99.

Haraway, Donna J. (2003), The Companion Species Manifesto: Dogs, People, and Significant

Otherness, Chicago, IL: Prickly Paradigm Press.

Heidegger, Martin (1993), Basic Writings. Revised & Expanded edition edited by David

Farrell Krell, New York: HarperCollins Publishers.

Hoel, A. Sissel and Iris van der Tuin (2013), ‘The ontological force of technicity: Reading

Cassirer and Simondon diffractively’, Philosophy & Technology 26: 2, pp. 187-202.

Irni, Sari (2013), ‘The politics of materiality: Affective encounters in a transdisciplinary

debate’, European Journal of Women’s Studies 20: 4, pp. 347-360.

Kirby, Vicki (2011), Quantum Anthropologies: life at large, Durham & London: Duke

University Press.

Latour, Bruno (1987), Science in Action: How to Follow Scientists and Engineers Through

Society, Cambridge, MA: Harvard University Press.

Latour, Bruno (2004), ‘Why has critique run out of steam? From matters of fact to matters of

concern’, Critical Inquiry 30: 2, pp. 225-48.

Law, John and John Hassard (eds) (1999), Actor Network Theory and After, Oxford and

Keele: Blackwell and the Sociological Review.

Lyotard, Jean-François ([1979] 1984), The Postmodern Condition: A Report on Knowledge,

Minneapolis: University of Minnesota Press.

McKenzie, Jon (2001), Perform or Else. From Discipline to Performance, New York and

London: Routledge.

Pickering, Andrew (1995), The Mangle of Practice: Time, Agency, and Science, Chicago:

University of Chicago Press.

Verkerke, Wouter (2012), ‘Higgs Bosons Tracked From Collision To Publication’, Nikhef Annual Report 2012, pp. 15-20, https://www.nikhef.nl/fileadmin/Doc/Docs%20&%20pdf/Nikhef_Annual_Report_2012.pdf, last accessed 28 June 2014.