Fade of the Polaroid: Towards a Political Ontology of the 70s
…that memories are the only possessions which no-one can take from us, belongs in the storehouse of impotently sentimental consolations that the subject, resignedly withdrawing into inwardness, would like to believe as the very fulfilment that he has given up. In setting up his own archives, the subject seizes his own stock of experience as property, so making it something wholly external to himself. Past inner life is turned into furniture just as, conversely, every Biedermeier piece was memory made wood. The interior where the soul accommodates its collection of memoirs and curios is derelict. Memories cannot be conserved in drawers and pigeon-holes; in them the past is indissolubly woven into the present. No-one has them at his disposal in the free and voluntary way that is praised in Jean Paul’s fulsome sentences. Precisely where they become controllable and objectified, where the subject believes himself entirely sure of them, memories fade like delicate wallpapers in bright sunlight. But where, protected by oblivion, they keep their strength, they are endangered like all that is alive.
—Theodor Adorno (2005: 166)
The clocks are never synchronized, the schedules never coordinated, every epoch is a discordant mix of divergent rhythms, unequal durations, and variable speeds.
—Rebecca Comay (2011: 4)
Though we are tempted to imagine time as intrinsically open, free to combine and re-combine with moments past or still to come, eras instead become compulsively entangled with each other, linked in such a way that neither can be understood apart. We know, after Walter Benjamin, that history looks less like a finished building than like that same building in ruins—shards of basement in the attic, holes slashed through floors at strange angles, staircases that end suddenly, mid-air. When times interpenetrate like this they find themselves suddenly linked by a historically necessary energy of filiation or disavowal. In a structure loosely analogous to that of the unconscious of an individual subject, a time enters into an orbit with another period or era. Times fall in love, though the parameters here are defined not by transparency or fullness, but by dependency, fear, aggression, and misrecognition.
It may be that our own time has entered into precisely such a relationship with the 1970s. Unlike the 1990s, which may still be too close to us, the 1960s, 70s, and 80s are decades amenable to representation as discrete units or periods; they strike us as wholes bound by a certain internal aesthetic logic or flow. It may be, of course, that this is little more than an optical illusion produced by representation itself: we intuit the 1960s as such only because we’ve been trained by popular culture to recognize its tell-tale cues and signs. The unrepresentability of the 1990s then may in fact turn out to have been nothing more than the interval required by a culture to transform its past into a concept; alternatively, it may be that there is actually something in the object, in the historical specificity of the 1990s itself, that prevents its translation into a coherent image or idea. Today one watches the TV show Friends fully aware of the way its tone, style, and forms of speech are proper to the period, but the various bits and pieces that comprise the decade’s content seem more like an aggregate of externally related parts—one damn thing after another—than they do the organelles of a functioning temporal whole. Where certain “decades” come to appear wrapped around central organizing events/problems or are saturated from within by dominant styles (anything from fashion to sounds), others seem to wade through a zone of indifferentiation, a tonelessness that leaves its key objects and moments linked by nothing more than sheer contiguity. It may be in fact that the 1970s were among the last eras phraseable in the idiom of the “era” itself—that they were, in some complex way, the last real “decade.”
But why might this be? Is it that when compared with the 1990s (and later the 2000s) the 70s altered culturally and technologically at a comparatively slower pace, that this rate of change was still slow enough to congeal into the recognizability of a style or idea? Is it less a question of pace and more one of quality, of the kind of change that took place in this period, with the internet and mobile phones representing a more substantive redistribution of space and time than the CD? Is it that the 1970s were not yet technologically fragmented in the way that the 1990s and later our own time would be, with the cultural tone of our age now effectively dispersed across a thousand platforms, media, and “content providers”? Is it the belated effect of globalization on our capacity to generalize an era? Or is it that the 1990s, when held up against the 1970s, were experienced even by those living through them as a kind of terminus or endpoint, a time apparently without events (at least in the West), which, depending on one’s perspective, marked the triumph of liberalism and a new era of perpetual peace, or alternatively (and less naïvely), the triumph of unfettered capitalism and a new era of hyper-consumerist banality? It may be too that this failure of the 1990s to achieve its own iconicity is the expression within historiography of the problem that Guy Debord once posed as the moment in which all that was once lived moves into the domain of the image. The capacity of an era to register events as actually happening would in this sense be the minimal ontological condition for the existence of Hegelian Geist, of an era’s inner spiritual coherence and necessity. It is in this sense that we might be said to be living in something like a long 1990s, a decade without discernible texture that persists, despite its contradictions, despite the obvious differences between then and now, for as long as no viable political alternatives to neoliberalism can be imagined. It could be too, finally, that this very conceit, that of a transformation in the capacity of history to generate “eras” proper, tells us less about the logic of the 1970s, the last “real” decade, than it does about the structure of our own time’s desire (a time, perhaps, so desperate to mark its own specificity that it is open to imagining itself as unprecedentedly devoid of sense, feeling, historicity, etc.). In the postmodern drive to frame the present as an end—to history or experience, for example—the present itself is flooded with a vibrating ecstasy of the new, a sense that nothing like this has ever happened before and that we, here, at the end of history, can now know all of the things those who came before us didn’t.
The most obvious symptom of our moment’s entanglement with the 1970s is the intensity with which we continue to attach ourselves to its artifacts. This is expressed first in the ease with which film and music from the decade continue to be consumed under the sign of the “classic,” a concept that clearly imbues productions from the era with greater authenticity or originality than their counterparts in the present. We should not assume that this is something like the natural aura of history, one that organically begins to fringe all things past or old at intervals that can be predicted in advance. Instead, it would appear that the dimensions that accrue around the concept of classic rock and film in the present are soldered to many of the properties of the 1970s itself. What is it about the decade as a whole that allows for this intensified investment, as if it were the time itself, its own grittiness, its own contradictory realness, that vibrates through the signature cultural objects and gestures we tether to the period? Isn’t there a way, after all, in which everything we know about the 1970s happens in the light of a strangely universalized New York, a New York of the movies, one rotten with crime and sex but also gorgeously soaked in neon? The dorm rooms of university students, especially those of men, continue to bizarrely orbit the era: Led Zeppelin, Pink Floyd, and Bob Marley posters in music; The Godfather, Mean Streets, A Clockwork Orange in film. Even a cursory glance into these spaces reveals a strangely paralyzed campus imaginary; the expected transition to a version of the classic grounded in new content, in the “best of the nineties,” takes place, but only partially. If the process by which things become translated into the idiom of the classic is linear, a huge machine that slides through time along predictable inter-generational cycles—say, every 25 years—then the mechanism almost certainly jammed in the 1970s and has since stuttered around the decade with a strange insistence. Perhaps it is not surprising then that among the most oft-encountered (contemporary) posters found in these spaces is that of Quentin Tarantino’s film Pulp Fiction, a 1990s text completely saturated by the motifs and forms of the 1970s (and featuring one of the decade’s most recognizable stars): it is as if we could only cobble together our conception of the classic through the detritus of the 1970s, as if the decade had become necessary to any attempt on the part of a text to convincingly canonize itself. Troublesome is the way this ontologization of the decade’s key figures and motifs dovetails with the logics of contemporary misogyny, in which fears about the declining manliness of men and the confusing vagueness of gender roles feed a nostalgia for a time when “men were still men.” The brute maleness of the mobster—or any one of the decade’s myriad agents of charming violence, ranging from serial killers to rogue cops—comes to be intuited as somehow closer to the savage Real of things themselves. The humiliations of the present—say the banalities of office work—can then be re-calibrated as female, as markers of a mass emasculation of men that has led them into a world of fakeness and passivity. This is precisely the position taken by Fight Club; it comes as no surprise then that Tyler Durden is decked out in the garb of the 1970s (wide-collared disco shirt, aviator glasses, vintage leather jacket, etc.)—he’s functioning in 1999 as an id dressed up as natural masculinity.
To what extent this “natural masculinity”—grounded in a fantasy of the 1970s as a time of unconstrained male gesture and desire—continues to haunt the moustaches of urban hipsters is an open question, one not easily solved by an invocation of irony.
We also consume the period’s visual culture, then, through its reiterations in contemporary content set in the decade. 1990s cinema looped back to the 1970s constantly—Goodfellas (1990), Casino (1995), Boogie Nights (1997), Summer of Sam (1999), 54 (1998)—orbiting objects and motifs (disco, mobsters, the birth of the serial killer) that continue to haunt contemporary films (Zodiac, American Hustle, The Nice Guys). Again, the kind of tone grounding these films reveals an era that is intense, expressive, high, fast, and violent—manically Real in a way that has been lost to a seemingly less volatile, more “mediated” present (with this mediation often fantasized from both the Right and the Centre as the encroachments of “political correctness”). Recent television—Narcos, Mindhunter, etc.—mirrors many of these interests and continues the trend of imagining the life-world of the decade as mostly violent and male. So total is the penetration of the fictional universe of Star Wars into the molecules of contemporary representation that it is easy to forget that it is in many ways, at least for those over the age of 40, a living artifact from the 1970s, one inseparable from an encounter with the period’s core logics and contradictions. We could not have imagined in 1977 that Star Wars was the Trojan horse for a new way of being in the world. That films could become worlds—self-sustaining spaces in which a whole generation might imaginatively live out much of its time, spaces complete with alternative histories, entire cosmologies—would have been surprising to those alive amidst the pressing historicity of the 1970s (and for whom film often critically reflected on the most relevant historical matters of their day, from the war in Vietnam to Watergate). It is perhaps a mark of how badly things have gone politically in the wake of Reagan—America’s first (but not last) Hollywood President—that our culture continues to understand these expansionary fictional universes as no more than good clean fun, a fun, albeit, that has expanded—through bedsheets and toothbrushes, video games, and food packaging—in inverse proportion to the capacity of individuals to understand in even the most minimal of ways their own place in history. Wouldn’t this be the ultimate expression of postmodern thinking taken to its extreme limit? A world in which every subject, having chosen the content it likes best (Star Wars, Harry Potter, X-Men, etc.), embraces the sovereign right not just to escape politics, but the planet itself, distant fictional galaxies rendered in greater detail (and lusher colour) than the basic political outlines of their own neighbourhoods? The sight of a middle-aged man at home with his collection of Star Wars figurines—or pointing proudly at the office to a life-size Boba Fett doll he’d had installed to improve morale—reminds us that, despite all of the delusions of modern adulthood, it at least always held in reserve the ghost of a materialist ontology. For all of its conservatism, for all of the ways every claim to adulthood is a lie, there remains in the latter’s contempt for children’s fictions, and myth itself, a bare historical-materialist gesture—an insistence on the serious Oneness of situations and on the existence of a ground we somehow complexly share.
After 1977 it became possible to be in possession of detailed technical knowledge of the blueprints of the Death Star (and to display this knowledge as edgy intelligence), while at the same time openly (and unashamedly) knowing nothing about the existence of Toussaint Louverture.
“Back to the 1970s”
If popular culture returns to an idea of the 1970s that is at the very least variegated, mainstream political discourse perpetuates a much less flexible image of the decade as a period of undifferentiated ruin. One could argue that neoliberalism in many ways survives, masking its own profound failure, on the basis of a highly codified set of associations—what we might call “stock footage”—that frame the 1970s and social democracy itself as an objectionable form of politics, ludicrous to re-consider as viable. The codes at work here appear most clearly in the near-hysteria that has greeted the rise of Jeremy Corbyn in Britain. In 2015, the centrist former Labour Home Secretary David Blunkett claimed that those voting for Corbyn were mostly hard-Left militants fueled by an irrational politics of hate (of the rich, the successful, etc.); if left unchecked they would drag Britain back to the 1970s, a time in which the nation was torn apart by “strikes, food shortages, and blackouts.” To tilt in the direction of social democracy—higher taxes, for example, or tightened regulatory regimes—would be to unthinkingly follow “a road to nowhere.” A nightmarish montage accompanies even the merest hint of a return to these policies: corpses left unburied by unionized gravediggers; the three-day week (imposed to conserve coal supplies hobbled by striking miners); streets crowded with uncollected garbage (and flush with rats); a series of States of Emergency (five in total) declared by Edward Heath between 1970 and 1974 (placing social democracy on the same dangerous plane as terrorism or war). Instead of being a conjuncture of possibility plied by myriad speculative futures, the 1970s in this view is reduced to the scintillating obviousness of crisis. Thatcherite austerity then comes to appear as necessary medicine, a strict but fundamentally sound treatment designed for a patient that would have died without it. In this sense, neoliberalism is never opposed to a genuine political alternative or different form of political reason, but only ever to networks of dangerous drives, instincts, and emotions—to an irrational expansion of the political into the sovereign necessity of the market. In other words, neoliberalism lacks interlocutors because those who contest it are always no more than force-fields of instincts. Any desire to “return to the 1970s”—that is, to systematically reassess its political legacy—can only be understood as: a) a naïve form of economic illiteracy (there is, after all, no such thing as an economically sound social democracy) or b) a bad Marxist death drive—a desire for the pleasures of stupid negation, of a class war that destroys for the sake of destruction itself. Such a desire, either way, is nothing less than unnatural—a corpse left out in the sun.
One way to track the tropes at work in neoliberalism’s occluded history of the world is to focus on one of its great, spectral bogeys: inflation. Hatred of inflation is perhaps the closest thing our moment has to a (bi-partisan) moral absolute. Left unchecked, allowed to “spiral,” inflation is almost universally decried as wrong or risky; economic policy (so we’re told by the central bankers and institutional lenders) should be tailored to prioritize and control this danger, even if it is at the cost of a rise in unemployment or involves significant cuts to basic services. That such a choice in the 1950s—the era of One Nation conservatives such as Harold Macmillan—would have been unthinkable, not just politically, but on emphatically moral grounds is entirely forgotten. Stranger, though, is the way our moment finds in inflation an idea of catastrophe that is more readily imaginable and a greater spur to action than the risk posed to life by climate crisis. This is not an empty assertion: governments regularly act politically to curb inflation even as they do nothing in the face of potential extinction. It is as if the hyper-inflationary environment, one in which the simple act of exchange spectacularly collapses, presents a more complicated puzzle—and a more terrible prospect—than the collapse of the global eco-system: one can fantasize, for example, dramatic scientific fixes for climate change or imagine a world in which humans eke out an existence on the edges of a changed natural world, but our creativity fades when tasked with the spectre of a $10,000 loaf of bread. One cannot survive or endure inflation; one can only immediately move to extinguish it. In a capitalist environment in which everyone is spontaneously relativist, nothing is more structurally surreal or really more fundamentally evil than a shift in the stability of prices—it is as if the consistency of money were the last of the classical certitudes, one that persists despite the fact that it was precisely the marketization of life—the sovereignty of money—that killed the old truths in the first place. It is perhaps not surprising then that historians regularly locate the Holocaust, a politics of death taken to a point beyond all limits, as emerging out of the terrible fog of the 1923 Weimar inflation. The message is clear: keep one’s monetary house in order or risk a return of the repressed of world-historical proportions.
Hatred of inflation passes for truth, even in an age characterized by the suspicion of absolutes, because it links the experimental skepticism of the natural sciences with a much older customary logic grounded in the association of chaos with excess. In contradistinction to the moralizing Christian or Confucian, the neoliberal economist can point to the political necessity of anti-inflationary measures, not as an injunction to ascesis or moral balance but as an effect of unquestionable natural scientific law (complete with precise numbers and graphs). If we can’t imagine measuring mass unhappiness, we can at least know precisely what’s going on in the stability of our money: we can accord to chaos a precise measure and respond to it with monetary governance. Yet this injunction (to balance) works precisely because it lies so close to the inherited customary norms that structure the West—dreams of order as harmony, balance finally restored, and of excess or chaos as an unnatural deviation from things as they should be. Our moment illustrates or dramatizes this chaos using stock photos of the 1970s. On the other side of inflation is a future drawn directly from the past: scenes of riot, produce rotting out of the backs of trucks, garbage-strewn streets, etc. The tone here is biblical; this is a time of plague and rot. Nothing better signals social failure in the eyes of the middle class than the public display of uncollected trash: it contains the spectre of a “Third World” [this is not a claim about the “Third World” but about the way the latter is imagined in the minds of the white middle class], devolution, the threat of a complete collapse of liberal civility. Inflation, for neoliberals, is a moral fable in which the main villains are profligate (self-serving) welfare states and greedy, wage-distorting unions: at the root of inflationary chaos, one that ends the natural simplicity of buying and selling, are states and unions who have made it all so troublingly political.
Thus, we must reject the idea that inflation is axiomatically bad or that it is simply the symptom of self-evident economic failure. This is because inflation reveals the truth that every claim of an economy to transcendental naturalness is false. The fog of inflation makes clear the falsity of growth, its claim to be an axiom, and its seemingly automatic quality. In an inflationary spiral there is no longer any sense that an economy is something composed of individuals nor that it is simply a natural whole that operates behind the back of its agents; instead, the world splits into virulent stakes and interests, classes and forces, a fog or smoke in which everything is suddenly debatable. Coal doesn’t simply move along smooth tracks from pit to factory, but is slowed down by the Real of truculent labour, the inconvenient fact that nothing happens without the latter’s consent. Money doesn’t flow from hand to hand, between a seller and a natural buyer, but takes the form, finally, of a problem. Money in these contexts becomes the local historical invention it has never ceased to be. Certainly, any possible Left politics has to “keep the lights on,” “keep the trains running,” etc., but whatever Left efficiency stands to be imagined by future praxis will also be distinct from its present counterpart by being oriented from the beginning towards the possibility of a life never wholly sutured to efficiency in the first place. A life, in other words, in which efficiency never becomes a governing ideology (nor a justification for suffering or exploitation).
It bears keeping in mind that the last moment one could realistically imagine the planet’s future as communist—or post-capitalist, socialist, etc.—passed quietly and without anyone really noticing on a day without a date sometime in the 1970s. All over the world—in China, parts of Africa, South America, and even at the system’s very centre (in the United States, Germany, etc.)—it was possible in the 1970s, buoyed by a sense for the continuing relevance of social democracy, for the political power of students and unions, and for the revolutions that continued to emerge in places like Nicaragua or Afghanistan, to conclude that the planet was still tilting slowly to the Left. There were signs of crisis, certainly, and symptoms of accumulating contradictions and limits, but almost nobody envisioned the answer to these problems in the form of a jarring lurch to the Right; apart from a tiny minority, mostly Friedmanite economists or policy wonks such as Keith Joseph, the thought of using unemployment to tame inflation, or of actively disempowering the unions, was unimaginable. Adorno, of course, is correct to point to the ways Auschwitz interrupted the Enlightenment dream of perpetual progress; yet it was precisely the defeat of those who had engineered Auschwitz, combined with the post-war spread of social democracy, that made it easy to see the slow trickling into common sense of once-radical Left ideas—unionism, full employment, etc.—as an extension of Reason into the last remaining bastions of ignorance and privilege. For many in 1975, the idea that post-secondary education should be free (or near-free) was as accepted as the suggestion in 2018 that a cigarette should never be smoked in the hallway of a hospital. Publicly funded libraries were then as axiomatically irreversible as the rights of women to drive or vote. Even those on the Right—such as Edward Heath or Richard Nixon—broadly conceded as necessary many of the things that today, under neoliberalism, we view as excesses or impossibilities (workers’ rights, for example, or pensions).
To live in the 1970s was to inhabit a horizon on which the future was, if not Red, at least reddish or pink. This strange, now almost structurally unrememberable fact is at the heart of the 1974 travel diary of Roland Barthes’ time in China. When he notes with amazement the “absolute uniformity” of the outfits worn by citizens in the People’s Republic, he is to some extent channeling a fairly predictable liberal response to communist alterity: sameness encountered in this most private of domains—that of fashion and the bodily articulation of the personality—can only be registered as repression, as the banal symptom of totalitarianism, rather than as a difference that capacitates as much as it limits. Beneath the many snarky liberal asides that pepper his diary, however, there is at the same time something more—a sense for the sheer exteriority of communism. It is along the thread of this anatomist’s gaze, one that restlessly but amorally documents differences, that the text comes to register communism as a gigantic, world-historical object. Communism, on this account, is not merely the history of a radical dream, nor a subjective process sustained by the activity of militants, but something that has already happened to the world (in the form of MiGs and free health care, but also shorter working days and, yes, even gulags). Barthes’ text registers, in other words, the scalar totality of communism—its hugeness but also its improbability, all of the risk and torpor it had to traverse to exist at all. Barthes, who taught us to read our bodies like books and that outfits too were systems of signs, finds in the command economy a kind of absolute alterity or limit: “the reading of the social dimension is turned upside down. Uniform isn’t uniformity” (57). To move from within the naturalness of a world in which we dress ourselves comfortably in any manner we like to a world in which the heterogeneity of fashion, its empiricism, has been arrested by centralized production is to move anthropologically between two radically different life-worlds or ways of being alive. Millions of people suddenly wear the same piece of clothing, a piece of clothing that is finally nothing more or less than fabric itself, fabric worn on a secularized body for which there are no longer any Gods (save, maybe, for Mao). Certainly, these garments are alienated, still blurred at the edges by Maoist myth, but at the same instant, they are nothing more than cloth, and so become objects on the edge of every personal imaginary, objects of utility and use value, freed on some level from the imaginary itself. To contend with the 1970s then is to contend in part with the remarkable richness and residual ontological signatures of actually existing communism.
A film such as Tinker, Tailor, Soldier, Spy, for example, presents communism not as a spectral ideal, nor as a well-intentioned feeling, nor even as a form of malevolent extremity or failure, but as a boringly existent force, something bluntly present in the world. Communism in such a film is an object among objects, something imbued with conatus, struggling to remain in existence but certainly there, real, a fact among facts. In such films the Eastern Bloc is not demonized, but encountered like natural history, “beyond good and evil.” This quality—that of boring, amoral facticity—still comes through in the kind of photographic travelogues of Moscow or Leningrad put out by National Geographic in the decade. Communism in fact gets directly folded into the magazine’s vision of the world as a system of cultural rather than political differences—communism itself becomes a kind of local colour, slightly exoticized, for sure, but nevertheless included as such within the variegated spectacle of the “human family.” Regardless of one’s position on the relative strengths and weaknesses of the Soviet system, the fact that it happened at all—and that for many decades it fed, educated, and clothed its citizens and produced cutting-edge scientific research—remains a significant political fact. This is because even in the rottenness of the Soviet experiment there is the trace of a miracle, a break, an outside for thought and practice. Jean-Paul Sartre remains correct that without this rotten, beautiful experiment, the world would have remained ontologically bourgeois. Alongside attempts to discredit Left imagination by reference to its blighted history—in which its existence was exhausted by failure—there exists another tendency, one very much at the heart of neoliberalism: it is not only that communism failed, that the facts of its existence were eaten up by failure, it is that its failure was so profound that it comes to be perceived as never having existed in the first place. In this context, the bare gesture of pointing to communism as having existed at all (and in a form not simply isomorphic with failure) becomes political.
Recently, we have begun to hear a lot about the supposed end of postmodernism, the turn, after Slavoj Žižek, Alain Badiou, Quentin Meillassoux, Jodi Dean, and others, towards a post-postmodernity. Whether it be in the form of a return to the radical intensity of Truth, a disruptive materialist psychoanalytic Real, a certain kind of Marxist sociological determinacy, or even, as in Object-Oriented Philosophy, a thought capable of grasping things themselves (rather than simply their appearances), our moment can be said to be characterized by a desire to exit the era in which philosophy came to see itself as a storyteller rather than as a practitioner of strict German Wissenschaft. What, then, of the claim that postmodernism is over? In so far as these are claims for a turn within the restricted cultural sphere of philosophy, they are certainly correct; positions that celebrate the irreconcilable multiplicity of perspectives, the sovereignty of pleasure, or the ecstasy of an identity in constant flux have never looked less interesting, or more philosophically weak, than they do today. Yet as the name for an actual historical era, it is arguable—despite all the talk of a return of the repressed of history, a new cycle of the Real, a return to the divisiveness and intensity of struggles, and so on—that postmodernity as a politico-aesthetic regime has never been more securely founded. Anybody who has spoken at length to a Trump supporter, a fan of the Kardashians, a liberal banker, or an urban “creative” knows all too well that the negative remains as moribund as it felt to Herbert Marcuse writing at the end of the post-war boom: the only way to seriously believe that we are living in an era of sharpened negation is to confine one’s conversation to a tiny coterie of like-minded academics. Badiou’s meta-philosophy is not just true in the weak sense that it compellingly describes the structure of human history; it is true in the stronger sense of offering to humans a picture of themselves as radically capable of change. Yet nothing in the grandeur or even descriptive adequacy of Badiou’s position changes the fact that there was perhaps no time in history in which it was more difficult to actually make (let alone sustain) a truth claim. In many ways, the core texts of Jean Baudrillard on simulation or Debord on the spectacle or Fredric Jameson on the flattening of affect—all written before the advent of the internet, social media, and a 24/7 temporality—now look less like the slightly mad, “abstract” rantings that serious social scientists once denounced them as, and more like sober, empirical accounts of the world as it is. We live in a moment, we should recall, in which mainstream scientists and thinkers as well as some of the world’s most influential “business leaders” (Elon Musk, for example) have sincerely come to believe that reality is a sophisticated simulation. This simulation hypothesis—famously articulated by Nick Bostrom in 2003—points to the possibility of a time in which we can plausibly imagine a human being who, after spending its day in various simulated realities (VR, television, etc.), removes the goggles only to encounter a world it also openly believes to be false (or second-order). This is unparalleled cultural territory, the strange revenge of Platonism (though a Platonism miserably emptied of truth and of the possibility of a world beyond the cave).
It may be that the experience we once called “Being”—that old lofty Heideggerian Dasein—itself died, along with the communist outside, on that obscure day lost somewhere in the 1970s. That Jameson was diagnosing this situation in the 1990s is remarkable given how preliminary the symptoms were at the time. Given this context, there is a way in which 1970s visual culture may end up carrying a heavier ontological signature than much of the cinema which comes before or after it. Like a photograph taken by someone at the instant before their death—the genre of the death selfie is now commonplace among stegophiles, extreme tourists, etc.—1970s visual culture registers the traces of a Dasein intensified in the moment before its own erasure. It should come as no surprise then that the signals left by the collapse of a thematics of Being (and even of an end to the motif of collapse itself) ping louder the closer we get to those visual artifacts that commemorate or register the technologies most implicated in this process. There is something impossibly odd about the sight on film of a 1970s telephone booth, an uncanniness that can’t be understood apart from the operations of a certain diffuse historical-materialist metaphysics. An image of a contemporary cellphone or laptop has no capacity to register the difference between the postmodern and what came before it—they are bluntly contemporaneous with themselves. However, this immediately changes when we are presented with primitive prototypes of these objects or even with wholly other objects on alternative developmental arcs with roughly the same functions or operations (the tape recorder, typewriters, etc.). It also comes as no surprise that details about this ontological shift are refracted through the visual history (and after-effects) of the technologies implicated in this erasure and in the fundamental redistribution of space-time it involves. 1970s films reveal to us a world that is at once uncannily similar and totally different. It is the uncanny proximity to ourselves—offices that are recognizable but computerless, fully contemporary automobiles outfitted with ash trays and dial-switch radios—that allows us to witness material proof of the fact that there was life before the smartphone. Revealed here is the objective superfluousness of all of those modes and habits that make up the fabric of contemporary communication, the presence to desire of a world content despite the absence of wifi. This historical structure of desire—the bliss of the past vis-à-vis all of the pleasures or “necessities” held in store for it by the future—may be less universal than one might think, with the washing machine, for example, “dreamed of” by the historical suffering of women’s bodies in a way that has no analogue in the cellphone.
The 1970s is a time that is close enough to resemble ours but at once separated from us by an unfathomable distance. Though one could point to the great ontologists of 1970s cinema—for example, Andrei Tarkovsky or Béla Tarr, in whose films we are confronted by a gritty being-there of History we encounter almost nowhere today—even films in popular, plot-driven genres seem ontologically haunted vis-à-vis their contemporary analogues. This is evident mostly on the level of pace, in a remarkable slowness that characterizes so much of the film production of the period and in which what is happening on the screen is never quite absorbed into the immediacy of its notional content.
There is no going “back to the 70s”. There are, however, good reasons for thinking that any possible way out of the present—out of neoliberalism, out of postmodernism—will require a detour through the decade’s repressed political and ontological signatures. It is easy to romanticize the 70s, a time which, after all, provided us with some of the last great photos of Revolt, of history captured collectively by a genuinely oppositional Idea. It is not romanticism, though, that leads us back curiously to flit through old shoe-boxes of Polaroids (shots of long-gone suburban streets, of faded birthday parties, of now-rusted playgrounds, of loved ones dead for decades, etc.). Held up against the immateriality of the digital image, the Polaroid today has about it the aura of a cemetery or burial ground. Why is this the case? Though the Polaroid extracts a moment from the flux in which it takes place in a way that is similar to the digital image, it suddenly transforms that moment into an object that is itself instantly claimed by singularity and time, itself immediately unrepeatable and subject to deterioration. Rather than disappearing into the permanence of an orbiting Cloud, the Polaroid object can be lost or shredded; it can fade or burn. Unlike the traditional photograph, however, the moment extracted from the flux is not separated from its transformation into an object by the interval of development: instead, slightly displaced, it appears within that very same here-and-now. We are haunted by the Polaroid—an aesthetic now widely circulated via Instagram filters, for example—not just because it was superseded as a medium by the arc of technological change (that is, not just because it is dead). Rather, the desire of the Instagram filter is the fade of the Polaroid: what it craves, on the border of everything it finds intolerable about the present, is ontology. It isn’t nostalgia then that leads us back to the Polaroid, nor a belief in some kind of unmediated Being or Erfahrung, but a tinkerer’s interest in the possibilities inherent in everything still capable of fading.
Works Cited
Adorno, Theodor. Minima Moralia: Reflections from Damaged Life. Verso, 2005.
Barthes, Roland. Travels in China. Polity Press, 2012.
Comay, Rebecca. Mourning Sickness: Hegel and the French Revolution. Stanford University Press, 2011.
Wooding, David. “Corbyn will take us back to the 70s.” The Sun, 12 September 2015, https://www.thesun.co.uk/archives/politics/89310/corbyn-will-take-us-back-to-the-70s/. Accessed 7 October 2018.