: N 2(38), 2006


by Manuel DeLanda

A philosophy's ontology is the set of entities it is committed to assert actually exist, or the
types of entities that, according to that philosophy, populate reality. Although historically there
have been a great variety of ontological commitments, we may classify them into three main
groups. To begin with, there are philosophers for whom reality has no existence independently
from the human mind that perceives it, so their ontology consists mostly of mental entities,
whether these are thought of as transcendental objects or, on the contrary, as linguistic
representations or social conventions. This ontological stance is usually referred to as "idealism".
Then there are philosophers who may grant the objects of everyday experience a mind-independent
existence, but who remain skeptical that theoretical entities (unobservable relations,
such as physical causes, as well as unobservable entities, such as electrons) possess such
mind-independence. Pragmatists, positivists and instrumentalists of different stripes all subscribe
to one or another version of this ontological stance. Finally, there are philosophers who grant
reality full autonomy from the human mind, disregarding the difference between the observable
and the unobservable as betraying a deep anthropocentrism. In other words, while the previous
stances deal only with phenomena (things as they appear to the human mind) the latter also
includes noumena (things in themselves). Philosophers adopting this stance are said to have a
realist ontology. Deleuze is such a realist philosopher.

On the other hand, realist philosophers need not agree about the contents of this mind-independent
reality. In particular, Deleuze rejects several of the entities taken for granted in
ordinary forms of realism. To take the most obvious example, in some realist approaches the
world is thought to be composed of fully-formed objects whose identity is guaranteed by their
possession of an essence, a core set of properties that defines what these objects are. Deleuze is
not a realist about essences, or any other transcendental entity, so in his philosophy something
else is needed to explain what gives objects their identity and what preserves this identity
through time. Another way of expressing this idea is to say that naive realists believe in the
existence of both general categories and their particular instantiations. The crucial relation here
is one of class membership, a set of particulars belonging to a given class or category if they
share a common core of properties. Deleuze replaces the relation between the general and the
particular with that between the universal and the singular. Or more exactly, general types (such
as animal species) are replaced with larger spatio-temporal individuals, so that a given species is
as singular, as unique, as historically contingent as the organisms that belong to it. Indeed, even
the last expression needs to be corrected since the relation between organisms and species is not
one of tokens belonging to types, but one of wholes and parts: singular individual organisms are
the component working parts of (larger) singular individual species. Or what amounts to the
same thing, larger scale individual entities emerge from the causal interactions of a population of
smaller scale individuals. On the other hand, general laws and the particular events or processes
that obey them, are replaced by universal singularities. Here the best example comes from
classical physics. While in its original formulation the basic ideas of this field were given the
form of general laws (Newton's laws of motion, for example), in the eighteenth and nineteenth
centuries it acquired an alternative form: most classical processes, from optical to gravitational,
were seen to conform to a least principle, that is, they were viewed as governed by a
singularity in the form of a minimum point. This minimum was in a sense more universal than
the laws themselves, since the laws of optics, of motion and of gravitation could all be seen as
regularities in processes governed by one and the same universal singularity.
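The least principle invoked here corresponds, in modern terms, to the variational principles of physics: Fermat's principle of least time in optics and Hamilton's principle of stationary action in mechanics. A standard textbook formulation (not DeLanda's own notation) is:

```latex
% Fermat's principle: light takes the path that makes the optical length stationary
\delta \int_{A}^{B} n(\mathbf{r})\, ds = 0

% Hamilton's principle: the actual motion makes the action integral stationary
\delta S = \delta \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt = 0
```

Each law of optics or of motion can then be recovered as a regularity of processes governed by the same minimum-seeking variational structure.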

In short, in place of the relation between the general and the particular Deleuze puts the
universal-singular and the individual-singular, a much more radical maneuver than the simple
nominalist move of disregarding general classes and sticking to particulars. Similarly, his
proposal is more radical than the conventionalist maneuver of simply declaring general
categories to be social constructions. No doubt, there are many categories which do not pick
out a real larger-scale individual in the world (the category schizophrenia, for example, may
actually group together several different mental conditions) and to that extent these are mere
constructions. But it would be wrong to argue that every category is like this and that not to view
all general categories as mere conventions is to espouse a form of essentialism. In fact, the
opposite is true: to simply replace essences with social conventions quickly degenerates into a
form of social essentialism. Essences and general categories (not to mention general laws) are
very hard to get rid of and simple nominalist or conventionalist maneuvers do not achieve the
desired goal. I will spend the rest of this essay sketching how Deleuze proposes to perform this
difficult feat, but in a nutshell it boils down to this: the identity of each individual entity (as well
as any resemblances among the individuals belonging to a given population) needs to be
accounted for by the details of the individuation process that historically generated the entity in
question; and any regularities in the processes themselves (especially regular or recurrent
features in different processes) must be accounted for in terms of an immanent (non
transcendent) abstract structure. Deleuze uses the term "intensive" when discussing individuation
processes, and the term "virtual" to refer to the ontological status of abstract structures, so I will
begin by defining these two terms.

But before I start let me take care of a couple of possible objections. All the theoretical
resources which I will use to define processes of individuation come from the hard sciences:
physics, chemistry, biology. Similarly, all the resources needed to define immanent
process-structures come from mathematics: topology, group theory, dynamical systems theory.
This immediately raises the following objection: how can one develop a realist ontology which is
supposed to serve as a foundation for scientific knowledge while from the start one presupposes
that there is such a thing as "objective knowledge"? If the point of a realist ontology were
foundational, this would indeed constitute a vicious circle. But one does not have to believe in
rock-solid foundations at all. One may alternatively view the role of the philosopher as allowing
the bootstrapping of an ontology. Much as in the realm of computers the vicious circle between
hardware and software (software must be loaded into hardware, but "loading" is a software
function) is broken by hard wiring a little bit of software, so a realist ontology may be lifted by
its own bootstraps by assuming a little bit of objective knowledge and then accounting for the
rest. Whether this move is legitimate would be checked by the overall coherence of the resulting
ontology and by verifying that it does indeed avoid the postulation of general entities (ideal
types, eternal laws). Clearly, an ontology where general laws are not among the contents of a
mind-independent world would radically break with standard scientific conceptions and in this
sense it would not be dependent on physical science's own ontology. A less problematic
objection is the choice of a starting point: why bootstrapping via the physical or the biological
when one could begin with the social component of science (its institutions, its ideologies)? The
short answer is that for a realist whose goal is to create a mind-independent ontology the starting
point must be those areas of the world which may be thought of as having existed prior to the
emergence of humanity on this planet. Having said this, let me discuss Deleuze's conception of
intensive individuation processes.

The science of thermodynamics distinguishes between extensive and intensive physical
properties. Extensive properties include basic spatial properties such as length, area, and volume,
as well as quantities such as amount of energy or entropy. All of these are defined as quantities
that are intrinsically divisible: if we divide a volume of matter into two equal halves we end up
with two volumes, each half the extent of the original one. Intensive properties, on the other
hand, are properties such as temperature, pressure, speed, or density which cannot be so divided.
If we take a volume of water at 90 degrees of temperature, for instance, and break it up into two
equal parts, we do not end up with two half volumes at 45 degrees each, but with two half
volumes at the original temperature. {1} Deleuze argues, however, that an intensive property is
not one that is indivisible so much as one that cannot be divided without involving a change in
kind.{2} The temperature of a given volume of liquid water, for example, can indeed be
"divided" by heating the container from underneath, creating a temperature difference between its
top and bottom portions. Yet, while prior to the heating the system was at equilibrium, once the
temperature difference is created the system will be away from equilibrium, that is, we can
divide its temperature but in so doing we change the system qualitatively. Indeed, if the
temperature difference is made intense enough the system will undergo a phase transition,
developing the periodic pattern of fluid motion known as "convection". Thus, in a very real
sense, phase transitions do divide the temperature scale but in so doing they mark sudden
changes in the spatial structure of a material.
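The contrast between extensive and intensive division described above can be sketched in a few lines of illustrative code (the quantities and the `divide` function are hypothetical, chosen only to make the contrast visible):

```python
# Illustrative sketch: dividing a volume of water at 90 degrees into two halves.
# Extensive properties (here, volume) halve; intensive ones (temperature) do not.

def divide(volume_liters, temperature_c):
    """Split a body of water into two equal halves."""
    half = volume_liters / 2           # extensive: intrinsically divisible
    return [(half, temperature_c),     # intensive: each half keeps
            (half, temperature_c)]     # the original temperature

halves = divide(volume_liters=2.0, temperature_c=90.0)
print(halves)  # two half volumes, both still at 90 degrees
```

Dividing the temperature itself, as in the heated-container example, would require modeling the system away from equilibrium, which is precisely the qualitative change in kind the text describes.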

There are two crucial philosophical ideas involved in the definition of intensive
properties. One is that intensive differences drive processes. At its simplest, a difference in
intensity will tend to cancel itself out and in the process it will drive a system back to
equilibrium. This tendency explains why temperature or pressure cannot be divided in extension:
whatever differences are created during the division process will be objectively averaged out and
the original equilibrium temperature or pressure will be restored. The second idea is that
intensities are characterized by critical thresholds marking points at which a material
spontaneously and abruptly changes in structure. The sudden change of liquid water into ice, or
the equally abrupt change into steam, are familiar examples of these critical events occurring at
very well defined intensive thresholds. The crucial ontological role played by intensive
differences in Deleuze's philosophy is expressed in the following quote:
Difference is not diversity. Diversity is given, but difference is that by which the given
is given...Difference is not phenomenon but the noumenon closest to the phenomenon...Every
phenomenon refers to an inequality by which it is conditioned...Everything which happens and
everything which appears is correlated with orders of differences: differences of level,
temperature, pressure, tension, potential, difference of intensity. {3}
This quote clearly shows that far from focusing on the appearances which are given in
human experience (phenomena) Deleuze's ontology reaches out to the mind-independent
processes (noumena) which give rise to these appearances in the first place. On the other hand, in
the page following this quote Deleuze argues that, despite this important insight, nineteenth
century thermodynamics cannot provide the insights he needs for his ontology because that
branch of physics became obsessed with the final equilibrium forms at the expense of the
difference-driven process which gives rise to those forms. Given that intensive differences are
supposed to define individuation processes, not individual products, to study systems where
these differences are already canceled defeats the very purpose of the concept. This shortcoming
of classical thermodynamics has today been repaired in the latest version of this branch of
physics, appropriately labeled far-from-equilibrium thermodynamics. Although Deleuze does
not explicitly refer to this new branch of science, it is clear that far-from-equilibrium
thermodynamics meets all the objections he raises against its nineteenth century counterpart. In
particular, the systems studied in this new discipline are not closed but open, continuously
traversed by a flow of energy and matter which does not allow the differences in intensity to be
canceled. If the only endogenous tendency explained by intensities were the tendency towards
simple and unique equilibria, then the importance of the concept would in fact be quite limited.
But in far-from-equilibrium conditions a wider variety of endogenous tendencies appears:
systems may still tend towards a stable state, but now these stable states may come in bunches
and, more importantly, they may not be steady states but cyclic or even turbulent ones.
In addition to this traditional meaning of the term "intensive", a meaning which as I just
said relates to the endogenous tendencies of processes, Deleuze uses the term in a second but
closely related sense, one referring to the capacities of final products to enter into further
processes. In particular, in this second sense the term refers to the capacity of individual entities
to enter as components into heterogeneous assemblages, that is, assemblages in which the
components' differences are not canceled through homogenization. (Reference to "differences
that are not canceled" is what unites the two senses of the term "intensive" in Deleuze's
work.) This idea may be illustrated with examples from biology. One of the intensive processes
which fascinates Deleuze is the process of embryogenesis, the process that converts a fertilized
egg into an individual organism. This process is driven by intensive differences (for example,
different densities or concentrations of certain chemicals) and as such is an example of intensive
individuation in the first sense. The extensive properties of an actual organism (as well as the
qualities which define its identity) are produced by spatio-temporal dynamisms (or what amounts
to the same thing, by self-organizing processes) driven by intensive differences. In other words,
individual organisms are "actualized" via a difference-driven morphogenetic process. As Deleuze
puts it:
How does actualization occur in things themselves?...Beneath the actual qualities and
extensities [of things themselves] there are spatio-temporal dynamisms. They must be surveyed
in every domain, even though they are ordinarily hidden by the constituted qualities and
extensities. Embryology shows that the division of the egg is secondary in relation to more
significant morphogenetic movements: the augmentation of free surfaces, stretching of cellular
layers, invagination by folding, regional displacement of groups. A whole kinematics of the egg
appears which implies a dynamic. {4}
Once the individual organism is produced, however, its extensities and qualities will hide
the original intensive process and its endogenous tendencies. In other words, the process will
become hidden under the product. But this product, in turn, will possess in addition to a well
defined set of properties (extensive and qualitative) an open set of capacities to interact with
other such individuals, organic and non-organic. In particular, biological organisms are capable
of forming the heterogeneous assemblages we call "ecosystems", playing given roles in a
complex food web and its constant flow of matter and energy. Deleuze refers to these
capabilities as "affects", the capacity of an individual to affect and be affected by other
individuals. Given that the potential interactions which an organism may have cannot be given in
advance, its affects (as opposed to its qualities and extensities) do not form a closed set.
The term affect is closely related to the term affordance introduced by James Gibson
within the context of a theory of ecological interactions. {5} Gibson distinguishes between the
intrinsic properties of things and their affordances. A piece of ground does have its own intrinsic
properties determining, for example, how horizontal or slanted, how flat, concave or convex, and
how rigid it is. But to be capable of affording support to a walking animal is not just another
intrinsic property, it is a capacity which may not be exercised if there are no animals around.
Given that capacities are relational in this sense, what an individual affords another may depend
on factors like their relative spatial scales: the surface of a pond or lake may not afford a large
animal a walking medium, but it does to a small insect which can walk on it because it is not
heavy enough to break through the surface tension of the water. Affordances are also symmetric,
that is, they involve both capacities to affect and be affected. For example, a hole in the ground
affords a fleeing animal a place to hide, but such an animal could also dig its own hole, thus
affecting or changing the ground itself. Similarly, an animal may flee because a predator affords
it danger but it itself affords nutrition to the predator. Thus the assemblages "walking animal-solid
ground-gravity" or "predator-prey-hole in the ground" reveal capacities which are
dependent on, but not reducible to, the assemblage components' properties. As Deleuze puts it:
We know nothing about a body until we know what it can do, what its affects are, how
they can or cannot enter into composition with other affects, with the affects of another body,
either to destroy that body or to be destroyed by it, either to exchange actions and passions with
it or to join with it in composing a more powerful body. {6}
Singularities and affects, endogenous tendencies and open-ended capacities, define the
realm of the intensive in Deleuze's ontology. As he argues, intensive thinking implies a
completely different conception of matter as well as constituting a major shift in Western ideas
on the genesis of form. An essentialist ontology assumes not only that form preexists its material
realization, but also that matter is an inert receptacle for eternal forms imposed on it from the
outside. Deleuze refers to this conception of morphogenesis as "the hylomorphic model".
Intensive thinking, on the other hand, breaks with essentialism by endowing matter with
morphogenetic capabilities of its own. Artisans and craftsmen, in his view, understand this other
conception of matter and form, at least implicitly: they tease a form out of an active material,
collaborating with it in the production of a final product rather than commanding it to obey and
passively receive a previously defined form. As Deleuze writes, the hylomorphic model:
.... assumes a fixed form and a matter deemed homogeneous. It is the idea of the law that
assures the model's coherence, since laws are what submits matter to this or that form, and
conversely, realize in matter a given property deduced from the form ....[But the] hylomorphic
model leaves many things, active and affective, by the wayside. On the one hand, to the formed
or formable matter we must add an entire energetic materiality in movement, carrying
singularities....that are already like implicit forms that are topological, rather than geometrical,
and that combine with processes of deformation: for example, the variable undulations and
torsions of the fibers guiding the operations of splitting wood. On the other hand, to the essential
properties of matter deriving from the formal essence we must add variable intensive affects,
now resulting from the operation, now on the contrary, making it possible: for example, wood
that is more or less porous, more or less elastic and resistant. At any rate, it is a question of
surrendering to the wood, then following where it leads by connecting operations to a materiality
instead of imposing a form upon a matter... {7}
Let me move on now and describe that other realm of the Deleuzian world which
complements the intensive: the virtual. Tendencies and capacities are both modal terms, that is,
unlike properties which are always fully realized in an individual entity, tendencies and
capacities are only potential, in that they may not ever be realized. This creates a fundamental
ontological problem for Deleuze because modal terms are typically treated in terms of the
concept of "possibility", and this concept has traditionally been associated with essentialism.
Although I cannot go into a full discussion of modal logic and its notion of "possible worlds", the
link between essences and possibilities can be easily grasped if we think that, like an essence,
which represents an eternal archetype resembling the entities which realize it, a possible state or
relation also resembles that which realizes it. In other words, the process of realization seems to
add very little to a possibility other than giving it "reality", everything else being already given.
Possible individuals, for example, are pictured as already possessing the extensities and qualities
of their real counterparts, if only potentially. It is to deal with this problem that Deleuze creates
the notion of the virtual. In his words:
What difference can there be between the existent and the non-existent if the non-existent
is already possible, already included in the concept and having all the characteristics that
the concept confers upon it as a possibility?...The possible and the virtual are ...distinguished by
the fact that one refers to the form of identity in the concept, whereas the other designates a pure
multiplicity ... which radically excludes the identical as a prior condition.... To the extent that the
possible is open to realization it is understood as an image of the real, while the real is
supposed to resemble the possible. That is why it is difficult to understand what existence adds to
the concept when all it does is double like with like... Actualization breaks with resemblance as a
process no less than it does with identity as a principle. In this sense, actualization or
differenciation is always a genuine creation. Actual terms never resemble the singularities they
incarnate... For a potential or virtual object to be actualized is to create divergent lines which
correspond to without resembling a virtual multiplicity. {8}
Let me give a simple example of how mathematical singularities (as part of what defines
a multiplicity) lead to an entirely different way of viewing the genesis of physical forms. There
are a large number of different physical structures which form spontaneously as their
components try to meet certain energetic requirements. These components may be constrained,
for example, to seek a point of minimal free energy, like a soap bubble, which acquires its
spherical form by minimizing surface tension, or a common salt crystal, which adopts the form
of a cube by minimizing bonding energy. One way of describing the situation would be to say
that a topological form (a singular point) guides a process which results in many different
physical forms, including spheres and cubes, each one with different geometric properties. This
is what Deleuze means when he says that singularities are like "implicit forms that are
topological rather than geometric." {9} This may be contrasted to the essentialist approach in
which the explanation for the spherical form of soap bubbles, for instance, would be framed in
terms of the essence of sphericity, that is, of geometrically characterized essences acting as ideal
forms. Unlike essences (or possibilities) which resemble that which realizes them, a singularity is
always divergently actualized, that is, it guides intensive processes which differentiate it,
resulting in a set of individual entities which is not given in advance and which need not
resemble one another.
The concept of a "singularity" in the sense in which I am using it here is a mathematical
concept, so care should be taken to endow it with ontological significance. In particular,
mathematical singularities act as attractors for trajectories representing possible processes for a
given system within a given dynamical model. They are supposed to explain the long-term
tendencies in the processes represented by those trajectories. How to move from an entity
informing the behavior of a mathematical model (phase space) to a real entity objectively
governing intensive processes is a complex technical problem which Deleuze tackles but which
cannot be described here. (For full details see my forthcoming "Intensive Science and Virtual
Philosophy"). In what follows I will assume that the technical difficulties can be surmounted and
that a realist interpretation of some features of these models can be successfully given. But I
should at least quote Deleuze on his ontological commitment to the real counterparts of these
mathematical entities:
"The virtual is not opposed to the real but to the actual. The virtual is fully real in so far
as it is virtual....Indeed, the virtual must be defined as strictly a part of the real object as though
the object had one part of itself in the virtual into which it plunged as though into an objective
dimension....The reality of the virtual consists of the differential elements and relations along
with the singular points which correspond to them. The reality of the virtual is structure. We
must avoid giving the elements and relations that form a structure an actuality which they do not
have, and withdrawing from them a reality which they have." {10}
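The notion of a singularity acting as an attractor for the trajectories of a dynamical model can be sketched numerically. The system below, a damped oscillator, is a hypothetical illustration of my own, not an example Deleuze uses; its phase space has a point attractor at the origin toward which every trajectory tends:

```python
# Sketch: a point attractor in the phase space of a damped harmonic oscillator.
# The state is (x, v); wherever a trajectory starts, it tends toward (0, 0).

def trajectory(x, v, k=1.0, damping=0.5, dt=0.01, steps=5000):
    """Integrate x'' = -k*x - damping*x' with simple semi-implicit Euler steps."""
    for _ in range(steps):
        a = -k * x - damping * v   # restoring force plus friction
        v += a * dt
        x += v * dt
    return x, v

# Two very different initial conditions converge on the same singularity.
for x0, v0 in [(3.0, 0.0), (-1.0, 4.0)]:
    x, v = trajectory(x0, v0)
    print(round(x, 3), round(v, 3))  # both end near (0, 0)
```

The attractor is not any one trajectory; it is the topological feature of the model that explains the long-term tendency shared by all of them, which is the sense in which the virtual is "real without being actual".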
Two more details must be added before arriving at the definition of a virtual multiplicity.
First, singularities are not always topological points but may also be closed loops of different
kinds, defining not only processes tending towards a steady state, but also processes in which the
final product displays endogenous oscillations (periodic attractors) as well as turbulent behavior
(chaotic attractors). Second, besides attractors we need to include bifurcations. Bifurcations
define recurrent sequences of topological forms. There is a well-studied sequence, for instance,
that begins with a point attractor which, at a critical value of a control parameter, becomes
unstable and bifurcates into a periodic attractor. This cyclic singularity, in turn, may become
unstable at another critical value and undergo a sequence of instabilities (several period-doubling
bifurcations) which transform it into a chaotic attractor.
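The period-doubling cascade just described can be observed in even the simplest nonlinear models. The logistic map below is a standard textbook example (not one DeLanda cites): as its control parameter r crosses critical values, the attractor bifurcates from a fixed point to periodic cycles and eventually to chaos:

```python
# Sketch: the period-doubling route to chaos in the logistic map x -> r*x*(1-x).
# As the control parameter r crosses critical values the attractor bifurcates:
# fixed point -> period 2 -> period 4 -> ... -> chaos.

def attractor_points(r, transient=1000, sample=64):
    """Iterate past the transient, then collect the states the orbit settles into."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

for r in (2.8, 3.2, 3.5):   # below, between, and past the first two bifurcations
    pts = attractor_points(r)
    print(f"r={r}: attractor has {len(pts)} point(s)")
```

The recurrent sequence of topological forms is a property of the abstract cascade itself, independent of whether the parameter being varied is a temperature, a speed, or a chemical concentration.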
This cascade of bifurcations can, in turn, be related to actual recurring sequences in
physical processes. There is, for example, a realization of the above cascade occurring in a
well-studied series of distinct hydrodynamic flow patterns (steady-state, cyclic, and turbulent flow).
Each of these recurrent flow patterns appears one after the other at well defined critical
thresholds of temperature or speed. The sequence of phase transitions may be initiated by heating
a water container from below. At low temperatures the flow of heat from bottom to top, referred
to as thermal conduction, is simple and steady, displaying only a bland, featureless overall
pattern, having what mathematicians call a "high degree of symmetry". {11} At a critical point
of temperature, however, this steady flow suddenly disappears and another one takes its place,
thermal convection, in which coherent rolls of water form, rotating either clockwise or
counterclockwise. The water container now has structure and, for the same reason, has lost some
symmetry. As the temperature continues to intensify another threshold is reached, the flow loses
its orderly periodic form and a new pattern takes over: turbulence. The cascade that yields the
sequence conduction-convection-turbulence is, indeed, more complicated and may be studied in
detail through the use of a special machine called the Couette-Taylor apparatus, which speeds up
(rather than heats up) the liquid material. At least seven different flow patterns are revealed by
this machine, each appearing at a specific critical point in speed, and thanks to the simple
cylindrical shape of the apparatus, each phase transition may be directly related to a broken
symmetry in the group of transformations of the cylinder.{12}
As can be seen from this example a cascade of symmetry-breaking bifurcations may be
faithfully realized in a physical system. And although the physical system will embody specific
causal mechanisms that bring about the transitions, the abstract cascade itself is mechanism-independent.
In other words, the abstract structure may be actualized by quite different causal
mechanisms. As the biologist Brian Goodwin has pointed out, portions of this hydrodynamic
sequence may be observed in a completely different process, the complex morphogenetic
sequence which turns a fertilized egg into a fully developed organism. After describing another
instance of a sequence of flow patterns in hydrodynamics Goodwin says:
"The point of the description is not to suggest that morphogenetic patterns originate from
the hydrodynamic properties of living organisms ....What I want to emphasize is simply that
many pattern-generating processes share with developing organisms the characteristic that spatial
detail unfolds progressively simply as a result of the laws of the process. In the hydrodynamic
example we see how an initially smooth fluid flow past a barrier goes through a symmetry-breaking
event to give a spatially periodic pattern, followed by the elaboration of local nonlinear
detail which develops out of the periodicity. Embryonic development follows a similar
qualitative course: initially smooth primary axes, themselves the result of spatial bifurcation
from a uniform state, bifurcate to spatially-periodic patterns such as segments [in an insect
body], within which finer detail develops....through a progressive expression of nonlinearities
and successive bifurcations....The role of gene products in such an unfolding is to stabilize a
particular morphogenetic pathway by facilitating a sequence of pattern transitions, resulting in a
particular morphology". {13}
This is, in a nutshell, the realist ontology of Gilles Deleuze: a world of actual individual
entities (nested within one another at different spatio-temporal scales), produced by intensive
individuation processes, themselves governed by virtual multiplicities. I left out many details, of
course, including a discussion of the space formed by multiplicities (called "the plane of
immanence" or "plane of consistency"), the form of temporality of this space (a crucial question
if multiplicities are to be different from timeless archetypes), as well as the way in which this
virtual spacetime is constantly formed and unformed (this involves introducing one more entity,
half-virtual half-intensive, called a "line of flight"). I will not discuss these further issues here,
vital as they are, given that the elements already introduced are sufficiently unfamiliar to raise
questions of their own. Even without discussing planes of immanence and lines of flight one may
legitimately ask whether we really need such an inflationary ontology, an ontology so heavily
laden with unfamiliar entities. The answer is that this ontology eliminates a host of other entities
(general types and laws) and that, in the final balance sheet, it turns out to be leaner, not heavier,
than what it replaces.
To illustrate this point let me return to the ideas that opened this essay and give a couple
of examples of how the universal-singular and the individual-singular can replace the old relation
between the general and the particular. First of all, there is no general recipe for this, other than
the fact that traditional static classifications must be replaced by a symmetry-breaking abstract
structure (accounting for the regularities in the classified entities) as well as concrete intensive
individuation processes (accounting for the production of the classified entities). How to perform
this replacement needs to be worked out case by case, a fact that illustrates that the study of the
intensive as well as that of the virtual is ultimately empirical. Well worked out examples of this
replacement are easier to give in physics and chemistry, harder in biology, and even harder in the
social sciences. Not that there is a shortage of static classifications of "ideal types" in sociology
and economics that need replacement. (A realist cannot afford to play this need down by
referring to ideal types as "heuristic constructs". This only postpones the day of reckoning when
one has to explain why a given classification is in fact heuristically useful.) But I have been able
to work out only the actual component of a social ontology, the nested set of individual entities
(persons, institutional organizations, cities, nation states) that are discussed in the companion
essay. I am still in the process of compiling lists of examples of social intensive properties: from
wage differentials driving migration flows, to prestige differentials driving the flows of linguistic
loans from one language to another, to power differentials determining the direction and intensity
of surplus extraction. I have not yet managed to theorize the virtual component of a social
ontology, an unsurprising fact given that, as I will argue in a moment, the virtual component of
biological processes is still a mystery. The examples that follow, however, should illustrate the
direction that such a replacement project would take.
Perhaps the best example of a successful general classification which can already be
replaced by virtual and intensive entities is the famous Periodic Table of the Elements which
categorizes chemical species. The table itself has a colorful history given that many scientists
had already discerned regularities in the properties of the chemical elements (when ordered by
atomic weight) prior to Mendeleev stamping his name on the classification in 1869. Several
decades earlier one scientist had already discerned a simple arithmetical relation between triads
of elements, and later on others noticed that certain properties (like chemical reactivity) recurred
every seventh or eighth element. One of them even gave a musical explanation for these
rhythms, which, as it turns out, are more complicated than a simple 8-fold symmetry. What
constitutes Mendelev's great achievement is that he was the first one to have the courage to leave
open gaps in the classification instead of trying to impose an artificial completeness on it. This
matters because in the 1860s only around sixty elements had been isolated, so the holes in
Mendeleev's table were like daring predictions that as-yet-undiscovered entities must exist. He
himself predicted the existence of germanium on the basis of a gap near silicon. The Curies later
on predicted the existence of radium on the basis of its neighbor barium. {14}
I take the rhythms of the table as being as real (and as in need of further explanation) as
anything that science has ever discovered. I realize that within the ranks of the sociology of
science there are many who doubt this fact, thinking that, for example, had Priestley's phlogiston
triumphed over Lavoisier's oxygen, an entirely different chemistry would have evolved. I
completely deny the truth of this assertion but I won't engage this argument here. As I said
before, this conventionalist maneuver only pretends to get rid of eternal archetypes and succeeds
only in replacing them with social essences: conventional forms imposed upon an amorphous
world very close to the inert matter of classical essentialism. We need a more radical
replacement, one that does not simply replace God the creator with Society the creator.
The virtual multiplicity underlying this famous classification has been recently worked
out. Given that the rhythms of the table emerge when one arranges the chemical species by the
number of protons their atomic nuclei possess (their atomic number) and that the nature of the
outer shell of electrons is what gives these elements their chemical properties, it should come as
no surprise that the multiplicity in question is a symmetry-breaking abstract structure which
relates the shape of electron orbitals to the atomic number. I mentioned before the sequence of
bifurcations in fluid flow dynamics which unfolds as one increases the intensity of speed or
temperature to certain critical thresholds. Similarly, the sequence of broken symmetries behind
the table may be seen to unfold as one injects more and more energy into a basic hydrogen atom.
The single electron of this atom inhabits a shell with the form (and symmetry) of a sphere.
Exciting this atom to the next level yields either a second larger spherical orbital or one of three
possible orbitals with a two-lobed symmetry (two lobes with three different orientations). This
new type of orbital has indeed the right mathematical form to be what a sphere would be if it had
lost some of its symmetry. Injecting even more energy we reach a point at which the two-lobed
orbital bifurcates into a four-lobed one (with five different possibilities) which in turn yields a
six-lobed one as the excitation gets intense enough. In reality, this unfolding sequence does not
occur to a hydrogen atom but to atoms with an increasing number of protons, boron being the
first element to use the first non-spherically symmetric orbital. {15}
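The quantum-mechanical bookkeeping behind this unfolding sequence can be sketched in a few lines of code (the function names are my own, and the snippet is only an illustration of the standard counting, not part of the essay's argument): each energy level n admits subshells of angular momentum l = 0 to n-1 (the spherical, two-lobed, four-lobed and six-lobed orbital types just described), each subshell holds 2l+1 orbitals, and each orbital holds two electrons of opposite spin.

```python
# Count orbitals and electron capacities per energy level.
# For principal quantum number n, the allowed subshells have angular
# momentum l = 0 .. n-1 ("s", "p", "d", "f", ...: spherical, two-lobed,
# four-lobed, six-lobed); each subshell contains 2l+1 orbitals, and each
# orbital holds two electrons of opposite spin (Pauli exclusion).

def subshell_capacities(n):
    """Electron capacity of each subshell in energy level n."""
    return [2 * (2 * l + 1) for l in range(n)]

def shell_capacity(n):
    """Total electron capacity of level n (always equals 2 * n**2)."""
    return sum(subshell_capacities(n))

for n in range(1, 5):
    print(n, subshell_capacities(n), shell_capacity(n))
# level 1: [2]            -> 2
# level 2: [2, 6]         -> 8
# level 3: [2, 6, 10]     -> 18
# level 4: [2, 6, 10, 14] -> 32
```

The resulting capacities (2, 8, 18, 32) are the numerical rhythms that recur in the rows of the table.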
This abstract structure of progressively more broken spherical symmetries is a beautiful
illustration of a Deleuzian multiplicity, and it accounts for the numerical rhythms of the table: the
number of available orbitals at each energy level, multiplied by two given that two
electrons of opposite spin may inhabit the same orbital, perfectly fits the recurrent sequences of
chemical properties. And yet, such an abstract structure is not enough. To this virtual entity one
must then add an intensive process which embodies it without resembling it and which
physically individuates the different chemical species. This intensive process is known as stellar
nucleosynthesis and, as its name indicates, it occurs today mostly within stars (veritable factories
for chemical assembly) although presumably it may also have occurred under the much greater
intensities present right after the Big Bang. The latter were sufficient to individuate hydrogen
and helium, but the rest of the elements had to wait millions of years until intensive differences
in density allowed the individuation of the stars themselves. In this other environment, the next
two elements (lithium and beryllium) are individuated via a synthesis of the first two, and these
new species in turn become the gateway to reach the conditions under which larger and larger
nuclei may be individuated. Just what degree of nucleosynthesis a given star may achieve
depends on specific critical thresholds. To quote P. W. Atkins:
"The first stage in the life of a newly formed star begins when its temperature has risen to
about 10 million degrees Kelvin. This is the hydrogen-burning stage of the star's life cycle, when
hydrogen nuclei fuse to form helium...When about 10 percent of the hydrogen in the star has
been consumed a further contraction takes place and the central region of the star rises to over
100 million degrees....Now helium burning can begin in the dense hot core, and helium nuclei
fuse into beryllium, carbon, and oxygen....In sufficiently massive stars (those with a mass at
least four times that of our sun) the temperature can rise to a billion degrees, and carbon burning
and oxygen burning can begin. These processes result in the formation of elements....including
sodium, magnesium, silicon and sulphur." {16}
I will not discuss the technical details of just exactly how these different syntheses are
carried out. It is enough for my purposes that intensive differences as well as intensive thresholds
are crucially involved. But even in this sketchy form the nature of the different chemical species
already looks quite different than it does when a naive realist tackles the question of their
identity. Such a philosopher would remark that, given that if one changes the atomic number of
an element one thereby changes its identity, the atomic number must be the essence of the
element in question. In a Deleuzian ontology such a statement would be inadmissible. Not only
does atomic number by itself not explain why an element has the properties it has (we need also
the theory of electron orbitals), but the actual process of creating increasingly heavier atomic
nuclei is completely ignored, even though this is what actually creates individual atoms with a
given identity. Moreover, the reification of atomic numbers into essences ignores yet other
individuation processes, those responsible for the creation of larger scale entities, such as a
sample of gold or iron large enough to be held in one's hand. Naive realists treat the properties of
such large samples as being merely the sum of the properties of their atoms, hence reducible to
them. But this reducibility is, in fact, an illusion.
In particular, much as between individual cells and the individual organisms that they
compose there are several intermediate structures bridging the two scales (tissues, organs, organ
systems) so between individual atoms of gold or iron and an individual bulk piece of solid
material there are intermediately scaled structures that bridge the micro and macro scales:
individual atoms form crystals; individual crystals form small grains; individual small grains
form larger grains, and so on. Both crystals and grains of different sizes are individuated
following specific causal processes, and the properties of an individual bulk sample emerge from
the causal interactions between these intermediate structures. There are some properties of gold
or iron, such as having a specific melting point, for example, which by definition do not belong
to individual atoms since single atoms do not melt. Although individual crystals may be said to
melt, in reality it takes a population of crystals with a minimum critical size (a so-called
microcluster) for the melting point of the bulk sample to emerge. Moreover, the properties of a
bulk sample do not emerge all at once at a given critical scale but appear one at a time at
different scales. {17}
There is, in fact, even more to this intensive story given that what I have said so far deals
only with one aspect of intensities, tendencies, and says nothing about capacities: the different
capabilities of the chemical elements to enter into heterogeneous assemblages with each other.
Although the simplest such assemblages (dyadic molecules) may be put into tables displaying
rhythms of their own, the table approach is hopeless when dealing with, say, the seemingly
unlimited number of possible carbon compounds. I will let my case rest here, however, since the
contrast with the picture that the naive realist holds is already clear enough. Instead let me move
on from chemical to biological species. In this case too we have inherited complex static
classifications exhibiting regularities that cry out for further explanation. I must say in advance
that the picture here is much less clear than that of chemistry, given that the classification is
much more complex and that history plays an even greater role. I will tackle the question in the
opposite order from the one I used for Mendeleev's table, starting with the intensive aspects.
A crucial idea here is the definition of a species as an individual entity, differing from
organisms only in spatio-temporal scale. The individuation of a species consists basically of two
separate operations: a sorting operation performed by natural selection, and a consolidation
operation performed by reproductive isolation, that is, by the closing of the gene pool of a
species to external genetic influences. Each of these operations may, in turn, be spelled out in
terms of intensive differences and thresholds. The idea that, for example, a given predator
species exerts selection pressures on a prey species needs to be explained in terms of the
relations between the densities of the populations of predators and prey. In many cases these two
populations form a dynamical system which exhibits endogenous equilibria such as a stable
cycle of boom and bust. The other operation, reproductive isolation, also needs to be defined
intensively, in this case in terms of the rates of flow of external genetic materials into a species'
gene pool. Philosophically, this translation eliminates the temptation to characterize species in
terms of a static classification where their identity is simply assumed and all one does is record
observed similarities in the final products.
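The endogenous boom-and-bust cycle mentioned above can be made concrete with the classical Lotka-Volterra equations, the textbook model of coupled predator and prey densities (the essay names no specific model; both the equations and the parameter values below are a standard illustration, not empirical claims about any real population):

```python
# A minimal sketch of coupled predator-prey dynamics (the classical
# Lotka-Volterra equations), integrated with a simple Euler scheme.
# Parameter values are arbitrary illustrations, not empirical estimates.

def lotka_volterra(prey, pred, steps, dt=0.001,
                   a=1.0, b=0.1, c=1.5, d=0.075):
    """Return the prey-density trajectory of the two-species system."""
    history = []
    for _ in range(steps):
        dprey = (a * prey - b * prey * pred) * dt   # growth minus predation
        dpred = (-c * pred + d * prey * pred) * dt  # starvation plus conversion of prey
        prey, pred = prey + dprey, pred + dpred
        history.append(prey)
    return history

traj = lotka_volterra(10.0, 5.0, 20_000)
# Over the run, the prey density both overshoots and undershoots its
# starting value: an endogenous cycle of boom and bust, driven entirely
# by the intensive differences between the two population densities.
print(min(traj), max(traj))
```

Neither population settles to a fixed point; the stable cycle is a property of the coupled system, not of either species taken alone.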
In a Deleuzian ontology resemblance and identity must be treated not as fundamental but
as derivative concepts. If selection pressures happen to be uniform in space and constant in time
we should expect to find more resemblance among the members of a population than if those
selection forces are weak or changing. Similarly, the degree to which a species possesses a clear-cut
identity will depend on the degree to which a given reproductive community is effectively
isolated. Many plant species, for example, retain their capacity to hybridize throughout their lives
(they can exchange genetic materials with other plant species) and hence possess a less clear-cut
genetic identity than perfectly reproductively isolated animals. In short, the degree of
resemblance and identity exhibited by organisms of a given species depends on contingent
historical details of the process of individuation, and is therefore not to be taken for granted.
What would the virtual component of this treatment of species be like? This is a highly
speculative question given the current state of evolutionary biology. It involves moving from the
lowest level of the classification (the species) to a level much higher up, that of the phylum.
Phyla represent the level of classification just underneath kingdom. The animal kingdom, for
example, divides into several phyla including chordata, the phylum to which we as vertebrates
belong. But phyla may be treated not just as labels for static categories but also as abstract body
plans. We know little about what these body plans are, but some progress has been made in
defining some of their parts, like the "abstract vertebrate limb", or more technically, the tetrapod
limb, a structure which may take many divergent forms, ranging from the bird wing, to the single-digit
limb in the horse, to the human hand and its opposed thumb. It is very hard to define this
structure in terms of the common properties of all the adult forms, that is, by concentrating on
homologies at the level of the final product. But focusing instead on the embryological processes
that produce this structure allows the creation of a more satisfactory classification. As one author
puts it, this new classificatory approach sees limb homology as emerging from a common
process (asymmetric branching and segmenting), rather than as a precisely repeated archetypal
pattern. {18}
The abstract component of this process may indeed be defined in terms of attractors
(stable pathways of development) and bifurcations (divergences or abrupt changes in these
pathways). What actual limb emerges as a final product (wing, hand, hoof) may be explained by
showing what bifurcations the genes of a given species allow to occur. When embryological
development reaches the budding of fingers in a limb's tip, for example, some genes (those of
horses) will inhibit the occurrence of the event (and so prevent fingers from forming) while
others (those of humans) will not. In the case of snakes or dolphins the very branching of the
limb may be inhibited. So what we would need is a topological description of a virtual vertebrate
(not just the limbs) and an account of how different genetic materials select certain pathways
within this body plan and inhibit certain others, thus resulting in different species' morphologies.
While we are far from having such a topological model (not to mention far from being able to
express it as a symmetry-breaking cascade) the work that has already been done on the subject
suggests that we can be optimistic about its prospects. At any rate, within a Deleuzian ontology
one is forced to pursue this empirical research since we cannot be satisfied with a static
classification recording resemblances and identities. For the same reason, one cannot view
selection pressures as sculpting animal and plant forms in detail (a conception that assumes an
inert matter receiving form from the outside) but as teasing a form out of a
morphogenetically pregnant material. And similarly for genes: they cannot be seen as defining a
blueprint of the final product (and hence its essence) but only as a program to guide
self-organizing embryological processes towards a given final state.
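A minimal mathematical sketch of the kind of bifurcation invoked here is the pitchfork, dx/dt = r*x - x^3 (my choice of toy model, not anything drawn from developmental biology): below a critical value of the control parameter r, every trajectory is drawn to a single symmetric attractor, while above the threshold the same tiny perturbation is amplified toward one of two divergent stable pathways.

```python
# A toy symmetry-breaking bifurcation: the pitchfork dx/dt = r*x - x**3.
# Offered only as an illustration of how a control parameter crossing a
# threshold replaces one stable developmental pathway with two divergent
# ones; it is not a model of limb development.

def settle(r, x0=0.01, dt=0.01, steps=10_000):
    """Follow dx/dt = r*x - x**3 from x0 until it settles on an attractor."""
    x = x0
    for _ in range(steps):
        x += (r * x - x ** 3) * dt
    return x

# Below the threshold (r < 0) every starting point decays to the single
# symmetric state x = 0; above it (r > 0) a tiny perturbation is amplified
# toward one of the two asymmetric states +sqrt(r) or -sqrt(r), depending
# only on the sign of the initial fluctuation.
print(settle(-1.0))                         # ~0.0
print(settle(1.0), settle(1.0, x0=-0.01))   # ~+1.0 and ~-1.0
```

Which of the two post-threshold attractors is actually selected depends on the initial fluctuation, which is the formal analogue of genes selecting one pathway within the body plan while inhibiting others.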
Given that so much empirical and theoretical work remains to be done in biology, it
should come as no surprise that taking a Deleuzian approach to sociology or economics is such a
speculative enterprise. As I said there is no shortage of targets (all those "ideal types" that
sociologists since Max Weber are so busy producing, for example) but there is clearly a long
way to go. The very first task is to replace general entities (the market, the state) with concrete
singular individuals of a larger scale than persons (institutional organizations) and to define what
other yet larger-scale individual entities we should be ontologically committed to (cities and
nation states are good candidates). Then the intensive processes giving rise to each individual
entity must be investigated in detail and regularities in the processes recorded. Only then will we
have any inkling as to what virtual structures may be lurking behind it all. We are now only at
the beginning of such a long journey but it is a journey which any non-naive realist philosopher
cannot afford not to undertake.

1} Gordon Van Wylen. Thermodynamics. (John Wiley & Sons, New York, 1963). Page 16.
2} "What is the significance of these indivisible distances that are ceaselessly transformed and
cannot be divided or transformed without their elements changing in nature each time? Is it not
the intensive character of this type of multiplicity's elements and the relations between them?
Exactly like a speed or a temperature, which is not composed of other speeds or temperatures,
but rather is enveloped in or envelops others, each of which marks a change in nature." Gilles
Deleuze and Felix Guattari. A Thousand Plateaus. (University of Minnesota Press, Minneapolis,
1987). Pages 31-33.
3} Gilles Deleuze. Difference and Repetition. (Columbia University Press, New York, 1994)
Page 222.
4} Ibid. Page 214.
5} James J. Gibson. The Ecological Approach To Visual Perception. (Houghton Mifflin
Company, Boston 1979). Pages 15-16.
6} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. Op. Cit. Page 257.
7} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. Op. Cit. Page 408. (Italics in the original).
8} Gilles Deleuze. Difference and Repetition. Op. Cit. Pages 211-212.
9} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. Op. Cit. Page 408.
10} Gilles Deleuze. Difference and Repetition. Op. Cit. Pages 208-209. (Italics in the original).
Deleuze borrows the ontological distinction between the actual and the virtual from Bergson.
See: Gilles Deleuze. Bergsonism. Op. Cit. Pages 96-97.
11} The mathematical concept of "symmetry" is defined in terms of "groups of transformations".
For example, the set consisting of rotations by ninety degrees (that is, a set containing rotations
by 0, 90, 180, 270 degrees) forms a group, since any two consecutive rotations produce a rotation
also in the group, provided 360 degrees is taken as zero. (Besides this "closure", sets must have
several other formal properties before counting as groups). The importance of groups of
transformations is that they can be used to classify geometric figures by their invariants: if we
performed one of this group's rotations on a cube, an observer who did not witness the
transformation would not be able to notice that any change had actually occurred (that is, the
visual appearance of the cube would remain invariant relative to this observer). On the other
hand, the cube would not remain invariant under rotations by, say, 45 degrees, but a sphere
would. Indeed, a sphere remains visually unchanged under rotations by any amount of degrees.
Mathematically this is expressed by saying that the sphere has more symmetry than the cube
relative to the rotation transformation. That is, degree of symmetry is measured by the number of
transformations in a group that leave a property invariant, and relations between figures may be
established if the group of one is included in (or is a subgroup of) the group of the other.
12} Gregoire Nicolis and Ilya Prigogine. Exploring Complexity. (W.H. Freeman, New York
1989). Pages 12-15. See also: Ian Stewart and Martin Golubitsky. Fearful Symmetry.
(Blackwell, Oxford UK, 1992). Chapter 5.
13} Brian C. Goodwin. The Evolution of Generic Forms. In Organizational Constraints on the
Dynamics of Evolution. Edited by J. Maynard Smith and G. Vida. (Manchester University Press,
Manchester, 1990). Pages 113-114.
14} P. W. Atkins. The Periodic Kingdom. (Basic Books, New York, 1995). Chapter 7.
15} Vincent Icke. The Force of Symmetry. (Cambridge University Press, UK, 1995). Pages 150-
16} P. W. Atkins. The Periodic Kingdom. Op. Cit. Pages 72-73.
17} Michael A. Duncan and Dennis H. Rouvray. Microclusters. (Scientific American, December,
1989). Page 113.
