Alienation and the Alien

Marx and Freud diagnosed the modern condition as one of alienation. People are increasingly cut off from the relations that sustain them – economic, social, political, and even physical. The response has been a return to community – localist movements, community building, and practices that emphasize relations rather than transactions. While I think this is a positive move, I question whether it offers the relief from alienation that we desire.

We feel alienated – depressed, disconnected, uncertain, precarious, and vulnerable – and so we seek out connection. We seek it, more often than not, in those spaces and with those others that are comfortable and most like ourselves. At the end of last year, I met a woman who was part of an “intentional community” in DC. There are, in fact, many of these – an entire network! They are essentially group houses where young professionals, interns, college students, and others come together to rent a house and share costs. These “intentional community” homes go a step further. They seek out people who are interested in a particular lifestyle and work together within the community of the house to make that lifestyle a reality. In this woman’s house, they were committed to a completely vegan diet, shared everything, and had weekly meetings to discuss house issues. While I don’t have any qualms with those values in particular, the retreat into this relatively homogeneous community was unsettling to me. One man had applied to join the house but wanted to be able to eat fish that he caught. From the woman’s story, it sounded as if he went to great lengths to compromise – keeping the fish in a cooler in the basement separate from all other food, and preparing and consuming it away from all other food preparation and consumption. However, the compromise was not enough for her or the others in the house – meat of any kind would simply not be allowed. This made me skeptical of the communitarian movements that are so common today, and of whether they are the cure for alienation that we seek.

Alienation is the result of disconnection and isolation – the breakdown or abstraction of constitutive relations.  The retreat into community, it seems to me, may fit very well in an alienated world. If you can slip easily and quietly into your community in the evenings and on weekends, then your alienation the rest of the time might be a little more bearable. But producing a community that is homogeneous and isolated is simply another form of alienation – a shared alienation.

Another way of looking at it is that alienation is a condition of the system of modernity in which we find ourselves. This was the insight of Marx and Freud: the fact of alienation itself was not in doubt in the early industrial era. What they made apparent is that alienation is not a condition of individuals but of the collective world in which we live. If that’s the case, then a retreat into community might provide temporary solace, but never the true cure for alienation that we seek, because it only changes the internal conditions of the community and not the relations that constitute our alienated existence.


So what’s the alternative?  If we can’t cure alienation by building community, then what can we do? If alienation is the condition of the world we live in, then the cure for alienation, it seems, might be to seek connection in the most alien places possible. To ignore entirely the constraints imposed upon us by the world and seek new forms of interconnection – alien, monstrous, perverse, taboo, hybrid, and unnatural. To seek out (inter-class) contact rather than (intra-class) networking.

These kinds of alien connections entail a risk – an acceptance of vulnerability. There is danger in relating to something that is wholly other and alien. Communication may fail. Some relations may be painful or destructive. These are precisely the kinds of relations that are fled from in the retreat to community – the uncomfortable, uneasy, and dangerous relations. But the retreat into community is a retreat into the world of alienation and not an escape from it. Escape takes effort.

That’s not to say we can’t do both – build community and seek out other forms of connection as well. In some cases, the two may not be mutually exclusive, in others they may.  It’s just to say that community building, in itself, is not the cure for alienation. The cure lies in breaking down the barriers that define community in our alienated world and composing new forms of relation that might begin to constitute a new world. This cannot happen as long as we seek out the easy, the comfortable, and the risk-free.

Wealth and Fluidity

A long time ago, back at Eidetic Illuminations, I wrote a post on what I called then (in a moment of pretension and vanity) Jeremy Trombley’s Economic Principle No. 1.  I called it Economic Principle No. 1 because, to me, it was (and still is) the most fundamental principle of economics – something that every economic theory should take for granted (but most don’t) and that should inform our economic choices.  The principle, as I formulated it back then (and I like the vulgarity of it, so I’m not going to dress it up now), was:

Wealth may trickle down, but it fucking FLOWS upward!

What this means is that, in order to function appropriately, every economy needs a redistributive mechanism.  Without some kind of redistribution, wealth will tend to accumulate in the hands of fewer and fewer people, creating an instability in the system – a blockage of flow, you might say. Every economy has some kind of redistribution – taxes, potlatch, gift economies, etc. – even our own. Increasingly, however, what I think we see is a co-opting of redistribution such that the wealth goes back to the wealthy – creating the illusion of redistribution without the actual benefits.  In any case, I’ve covered all that before.  The point of this post is different.

I called this Economic Principle No. 1, and I stand by that.  However, thinking about it recently, I’ve realized that there’s a more fundamental issue – a necessary condition for Economic Principle No. 1 to be valid, which also goes largely unrecognized and has major economic implications.  That is:

Wealth is fluid.

In order for Economic Principle No. 1 to work, wealth must be understood to flow as a fluid rather than to sit as something solid and static.  In fact, this is implicit in many of the ways we talk about wealth and money now – liquid assets, capital flows, etc. – but I don’t think the real implications are taken into consideration.  That wealth is fluid means that it’s about relations and flows rather than individual accumulation.  We have all probably heard parables of wealthy men stranded with their gold on, say, a deserted island.  There the gold does the men no good, because it is not the substance that matters – not the artefact that bears the wealth – but the ability to exchange: to move wealth from one person to another and transform it from one kind of substance to another, like the alchemist’s trick of turning lead into gold. If, on the other hand, we treat wealth as static, solid stuff, then the motivation will be to hold it, to keep it solid as much as possible, and to allow it to accumulate.
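As a rough illustration (this is my own toy sketch, not part of the original post), the two principles can be seen in a simple “yard-sale” exchange simulation – a standard econophysics toy model, with parameters I’ve invented for the example. Even with perfectly fair coin-flip trades, wealth drifts upward and concentrates; a small flat redistribution keeps it flowing:

```python
import random

def gini(wealth):
    """Gini coefficient: 0 = perfect equality, values near 1 = concentration."""
    w = sorted(wealth)
    n = len(w)
    total = sum(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return (2 * cum) / (n * total) - (n + 1) / n

def simulate(rounds=20000, n=100, redistribution=0.0, seed=42):
    """Random pairwise 'yard-sale' exchanges, with an optional flat
    redistribution (tax-and-share) applied after every exchange."""
    rng = random.Random(seed)
    wealth = [100.0] * n
    for _ in range(rounds):
        # Two agents stake 10% of the POORER agent's wealth on a fair coin
        # flip; despite fair odds, wealth tends to concentrate over time.
        a, b = rng.sample(range(n), 2)
        stake = 0.1 * min(wealth[a], wealth[b])
        if rng.random() < 0.5:
            wealth[a] += stake
            wealth[b] -= stake
        else:
            wealth[a] -= stake
            wealth[b] += stake
        if redistribution > 0:
            # Flat tax on everyone, shared back equally: a crude
            # redistributive mechanism that keeps wealth fluid.
            pot = redistribution * sum(wealth)
            wealth = [w * (1 - redistribution) + pot / n for w in wealth]
    return gini(wealth)

# Without redistribution, inequality (Gini) climbs; with even a 1% flat
# redistribution per round, it stays far lower.
print(simulate(redistribution=0.0))
print(simulate(redistribution=0.01))
```

The point of the sketch is only the comparison: the same sequence of “fair” exchanges produces a much higher Gini coefficient without the redistributive mechanism than with it.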

All of this is fairly simplistic, I know.  Someday when I’ve finished my PhD and have time and resources to research whatever I want, I will indulge my interest in Economic Anthropology and explore these ideas in more complex and ethnographic contexts.  I kick myself for not having taken a course back at KU when I had the chance, but my schedule didn’t allow it.  Nevertheless, I believe that these principles are important for a reasonable understanding of economics, though they may not be deterministic and many complex factors may come into play in any given economy.  I wanted to put the idea out there, though, and see what kind of thought it generates.

(Update) In the Flesh: Vulnerability in the Anthropocene and Beyond

Over the last few months, Adam Robbert and I have been working on assembling a volume on vulnerability to be titled In the Flesh: Vulnerability in the Anthropocene and Beyond.  At the end of May, we sent out a request for submissions and asked that those interested respond to let us know by mid-June.  The results are in, and we have 11 confirmed contributors from a variety of backgrounds and with a variety of topics in mind.  Over the next month, we hope to assemble abstracts from these contributors and put together a final prospectus for the book to be sent for final approval to Punctum.  We also plan on trying to recruit a few more contributors with the hope of rounding out the volume.  I will continue to post regular updates as the process unfolds. If you have any questions or suggestions, feel free to post them here or contact either Adam or me.  As it continues to come together, I can’t help but think that this will be a very valuable book and I am thrilled to be playing such a central role in its production.



“The bad news is you’re falling through the air, nothing to hang on to, no parachute. The good news is there’s no ground.”

-Chögyam Trungpa Rinpoche

I’ve used this quote before. It’s a beautiful sentiment and I stand by it, but it is misleading. The truth is, there are no grounds except those we make for ourselves and others.

As Rinpoche’s quote suggests, a ground can be settling – providing a place for stable footing and an end to the perpetual cosmic fall. On the other hand, a ground can be terrifying as one falls uncontrollably towards it.

If we can’t get rid of grounds and simply enjoy the fall, then we should at least try to have more of the former and fewer of the latter.  One can never be sure, though, which type one is approaching when one is simply falling uncontrollably.

Modeling Discipline

Another aspect of modeling that I’ve come across in my reading lately is the extent to which the practice disciplines model users in particular ways.  Modeling is not simply something that modelers do while sitting in front of the computer terminal; it is a practice that shapes the way modelers think and behave.  This is demonstrated effectively in three of the articles I’ve read: one by Anders Kristian Munk titled “Emancipating Nature: What the Flood Apprentice Learned from a Modelling Tutorial”, the second by Myanna Lahsen titled “Seductive Simulations? Uncertainty Distribution Around Climate Models”, and the third by Chunglin Kwa titled “Modelling Technologies of Control.”

In the first article, Munk describes his experience undergoing a tutorial course on flood modeling, and the various stages that one must go through in order to build even a simple flood model.  The course begins with the development of a “perceptual model” of flooding. Not a computer model but a conceptual model sketched out with pen and paper, this practice allows the modeler to think through all of the different factors that might come into play in a flood scenario: evapotranspiration, rainfall, soil, vegetation, human settlement, etc.  The effect of this practice is to isolate those factors that can be accounted for in the model from those that can’t.  It separates out nature as a distinct domain for which the model speaks, set apart from the politics that are ever-present in the causes, consequences, and management of flooding:

“The model, which claims to speak on behalf of nature, can be asked its opinion on different possible outcomes of a political process, which claims to speak on behalf of society.  The answers can of course be fed back across the divide, but it seems important to the professional ethos of the modeller that a divide is maintained in spite of the tendency of successive flood events to transgress any such partition.  If our role vis-á-vis nature will be to anticipate, then our role vis-á-vis society will be to notify.  Anticipating nature; notifying society.”

Once the perceptual model has been produced, the process of computer modeling can begin.  This process involves a number of translations and reductions – making the perceptual model fit the computer’s interface.  This is another disciplining process: through repeated interactions with the computer and software, the modeler comes to understand and react to what the simulation needs.  The modeler is being shaped by the model just as the model is shaped by the modeler (the two are mangled, to use Pickering’s term):

“What feels like a real achievement is that the model is actually running after an unending stream of bugs and error messages.  Finally, we seem to be doing something right; we have learned how to feed it the correct stuff; it is complying.  Or rather, we are complying: at this point in the training exercise the happy amateurs are tuning in to the demands of the software and for the first time we get a sense of what it means to become a modeller.”

In the end, the author reflects on how the process of flood modeling – of separating out nature from society – depends on the production of a “thoroughly post-natural hybrid.”  Through this process the modeler is transformed and remade, acquiring new habits, dispositions, and affects that no doubt carry over into other aspects of her life.


The second article, by Lahsen, is meant as a critique of MacKenzie’s certainty trough.  The certainty trough is a model for how different groups (generally within the sciences) view a particular scientific process.  As the author describes, it uses a “distance trope” to differentiate the groups.  Those who are closest to the production of the process are typically somewhat aware of the uncertainties embodied in the process; those who are not involved in the production but use the methods and results tend to be less conscious of the uncertainties and thus less skeptical about the process; finally, those who are outside of the process entirely – because they are committed, presumably, to an entirely different methodology – tend to have the highest skepticism of the process.  Lahsen identifies, using modeling as her example, several points that complicate the certainty trough: first, that there are generally many sites of production involved, so distance is not an effective metaphor; second, that the distinction between producers and users is not very clear; third, that “outsiders” are often called upon to help validate models; and finally, that there may be psychological and social reasons why modelers would not lack confidence in their models.

It’s this last feature that I want to discuss in terms of discipline.  That there are social reasons why modelers would appear over-confident in their models seems straightforward.  Modelers compete for funding and access to resources, so they are invested in making their models appear as accurate and valuable as possible.  However, the psychological effects that Lahsen describes are not so evident.  From her research, she shows that, as a result of continual contact with the model, modelers come to see the simulation as a reality in itself.  Instead of talking about the simulated ocean in the model, the modelers will talk simply about “the ocean.”  In effect, the modelers are “seduced” by the models into believing that they are the reality that they attempt to simulate.  This leads to an attachment to the model, and, again, an overconfidence in the model’s depiction of reality.

The third article, by Kwa, explores models as tools of control.  He delves into the history of modeling – including environmental/climate modeling, economic modeling, and modeling for military purposes – in order to show a relationship between modeling and different modes of control.  His primary example comes from John von Neumann’s work with weather modeling.  Von Neumann, among others, envisaged a computer model of weather on a global scale, which could be used to control weather and transform the climate to better suit human needs.  Examples include the use of “cloud seeding” to promote rain in drought-ridden areas, or to hinder enemy troops and obscure aircraft during military operations.  Another, more extreme example is the use of nuclear explosions to redirect monsoons to cool hotter climates and improve conditions for agriculture.  This period, from about WWII to 1973 (Kwa is specific in this regard), is one in which technologists imagined that models and other technologies could bring about total, centralized control on a global scale.  In some ways, it is the structure of models themselves that gives the illusion of total control – the way they are laid out offering a god’s-eye view of the world, one that can be easily manipulated and observed.

According to Kwa, around 1973 there was a shift from this kind of large-scale modeling towards more localized modeling efforts with specific problems in mind.  He says that this was the result of the emergence of the counter-culture movements of the 1960s rather than any specific failures of the models themselves.  The advent of the personal computer further promoted localization and specificity in modeling approaches.  Kwa suggests that this does not mean the goal of control was abandoned along with the earlier modeling methods, but that it shifted into this localized register.

In my mind, the question is: what kinds of body-minds are produced out of these modeling practices? How do the habits, dispositions, and affects developed in front of a computer screen carry over into other aspects of life?  None of the authors address this specifically, but it’s an important question that I might try to investigate in my own research.  Furthermore, I would hope to answer the question of what happens when modeling methods change.  What kinds of new subjectivities might be produced?  It’s a lot to think about.

The Politics of Simulation

I’ve been quiet for a few weeks now because I’ve been diligently studying for my second area exam.  This one will be about the research that has been done within the social sciences on environmental modeling.  Most of this research has focused on the General Circulation Models (GCMs) that are used to understand and predict the effects of anthropogenic climate change; however, some of it deals with water quality modeling and flood modeling.  What I’ve found interesting in these accounts is the extent to which – at times unacknowledged by the researchers themselves – politics plays a formative role in the development and distribution of computer simulations.

As an example, I’ll take Matthias Heymann’s essay “Constructing Evidence and Trust” in the volume The Social Life of Climate Change Models (edited by Kirsten Hastrup and Martin Skrydstrup), which looks at the different ways that models gain confidence within the scientific community (the author purposefully brackets off scientific confidence from public confidence in order to avoid confusion).  In the article, Heymann identifies four sources – I would almost say orders – of confidence. The first order of confidence comes from the emergent features that are produced in the simulation runs.  That these emergent features resemble patterns one would expect to see in the actual climate (but were not part of the initial programming – thus “emergent”) provides a degree of confidence that the models are capturing the basic physical laws that govern the climate.  The second order of confidence comes from the quantitative fit between simulation results and observed data.  That these models can successfully back-forecast and produce reasonably similar results offers further confidence in their capabilities.  The third order of confidence results from the similarity of behavior between different models.  That the models tend to agree despite differences in parameters and code suggests that they are effectively modeling the same general phenomena.  Finally, Heymann argues that the fourth source of confidence comes from a rhetorical strategy that makes the uncertainties in simulation results invisible.  There is no quantitative method for measuring the uncertainty of these models – this would take multiple model runs, which consumes a lot of time and resources.  At the same time, model results are depicted with graphs and the weight of statistical certainty.  Although the uncertainty of the models is always mentioned, it cannot compete with the weight of the graphic depictions provided by the simulation runs.


This last order of certainty is the only point in Heymann’s four sources where politics seems to enter.  There are clear political reasons (in excess of the practical reasons) why scientists might downplay the uncertainty of their models. One reason is that modelers are competing for funding, and they have a vested interest in portraying their models in the best possible light.  Another reason is that there is a general sense in the scientific community (and I have seen this firsthand in my own encounters with modelers) that any sign of uncertainty will become fodder for climate skeptics and deniers or the media to attack.  A third reason, not mentioned by Heymann, but discussed at length in another article by Lahsen called “Seductive Simulations” is that modelers themselves may be psychologically attached to and overly confident in the reality of their models.

Clearly, politics comes in at the tail end of the modeling performance. Once the models have been run, and all of the other sources of confidence have been achieved, it is the presentation of models that takes on a political aspect.  However, Heymann points to another point where politics enters the equation from the very beginning.  In fact, although he doesn’t recognize it as such, politics could be said to be the real first-order source of confidence in modeling.  Heymann describes the history of climate modeling in two phases.  The first phase is the development of basic climate models starting in the 1950s and going into the early 1970s.  This era is characterized by a general lack of confidence in modeling, and frequent warnings in scientific papers about the inadequacy of the model results.  The second phase begins in the 1970s (in A Vast Machine, Edwards marks a similar break occurring around the same time, but for slightly different reasons), and this is the beginning of confidence and the use of climate models to influence public opinion and policy.  This phase begins with the ascension of a new generation of climate modelers (Schneider, Kellogg, and Hansen are the most frequently mentioned) who were driven by the urgency of climate change.  They argue in scientific papers that the urgency of climate change demands that we use the best tools we have to understand the causes and effects as best we can, so that we might be able to intervene rapidly – a first-order political validation for the use of climate models. In other words, we must have confidence in the models because they are the best tools we have and the urgency of the problem demands that we use them.  Heymann provides the following quote from Schneider:

“My view is that once we know reasonably well how an individual climatic process works and how it is affected by human activities (e.g. CO2-radiation effect), we are obliged to use our present models to determine whether the changes induced by these human activities could be large enough to be important to society.”

Had this new generation of modelers – coming of age in the era of the environmental movement – not taken such a political stance, the models might still have been used eventually.  However, it is questionable whether they would have developed the required confidence (by way of the four sources mentioned above) for many more years without the initial impetus and push of confidence provided by this political urgency.  What this means is that politics runs through the practice of modeling from the very beginning.  Contrary to the general sense that politics must be kept separate from science, in this case political urgency actually provided a degree of initial confidence, and a basis for the development of further confidence, without which climate modeling might have remained stagnant for a long time.