POM Wonderful Presents: The Greatest Movie Ever Sold (Morgan Spurlock, USA, 2011)

American cinema, Blogpost, Documentary, Film education, Film reviews, Neurocinematics

There is not necessarily that much to say about POM Wonderful Presents: The Greatest Movie Ever Sold, in that the premise of the film is pretty simple.

That is, Morgan Spurlock, he of Super Size Me (USA, 2004) fame, has made a film that exposes the degree to which product placement – or what we might call just plain advertising – is a common practice in the film, television and new media industries.

We can hypothesise that not everyone already knows this – though I hope that such people do not exist (because they’d have to be what I might uncharitably term morons). And if not everyone already knows this, then bravo Morgan Spurlock for bringing it to our/their attention.

Beyond that, The Greatest Movie Ever Sold is not necessarily as brilliant as all that. And I’d perhaps even go so far as to say that it is disingenuous.

The Review Bit (in which – enviously? – I reproach Morgan Spurlock for thinking that a wink and a smile mitigate the trick he is playing on me)
The film is smart and ironic, sure – but its disingenuous nature comes through when Spurlock takes (seeming) swipes at bizarre North American corporate practices, such as the weird psychoanalytic branding exercise that he goes through early on in the film.

We see Morgan subjected to countless questions that seem to go on for hours – and after being grilled in this intense manner he is told – entirely anticlimactically – that he/his brand is a combination of intelligent and witty (I can’t remember the exact phrase – but it was cheesy).

My point is that if Morgan expects us (as at least I seem to think that he does) to laugh, somewhat bitterly, at how people can make money selling transparent clothes to the Emperor (psychoanalytic branding that tells anyone with a modicum of self-awareness what they probably know about themselves already), then why does he not expect us already to know precisely the other ‘insights’ that his documentary reveals – namely, that advertising is everywhere?

In this way, The Greatest Movie Ever Sold is not really about advertising, but about Morgan Spurlock – and his access to the beautiful classes (even if he has not in fact ‘made it’ in ‘real life’).

The film claims that he is not selling out but buying in. To be honest, I think that both of these terms pertain to the same logic of capital-as-justification-of-one’s-existence – a logic that Spurlock does not necessarily critique, but the critique of which surely matters to a strong part of his No Logo-reading target audience.

Spurlock might aim for ‘transparency’ – but this in itself is problematic. As pointed out to me in the past by an astute former colleague, if something is transparent, it is invisible. While Spurlock might make apparent something that advertisers themselves have for a long time been wanting to make as apparent as possible – namely their brand – Spurlock also seeks to make transparent – i.e. invisible – his very conformity with the practices that his film might otherwise seek to critique.

There is irony and humour aplenty in the film, as Spurlock seeks to make a doc-buster that is corporate sponsored in its entirety while being about the prevalence of corporate sponsorship. There seems no room in this world for gifts or sacrifices, or any of those things that might otherwise suggest a spirit and sense of community beyond the quest for material profit. And for all of Spurlock’s successes (and his failures) in getting money from the brand dynasties, it does seem to lack, how do I put it?, soul.

The Simpsons Movie (David Silverman, USA, 2007) opens with Homer shouting from an onscreen audience that the Simpsons Movie within the Simpsons Movie that he is watching is no better than the TV show, and a rip-off. Similarly, Wayne’s World (Penelope Spheeris, USA, 1992) has a protracted sketch in which Wayne (Mike Myers) explains how he will not sell out to corporate sponsorship while simultaneously advertising a host of products, from pizzas to trainers.

In other words, Hollywood has been pretty up-front about the fact that it has been peddling advertisements to us/short-changing us in the form of films for a long time. Hell – and here I am shifting slightly into the realm of the online viral – some ‘advertainments’ – such as Zach Galifianakis’ wonderful Vodka Movie – are pretty good.

In this way, Spurlock does not take his film to the level of, say, the Yes Men in their critique of contemporary corporate practices. In their far-too-little-seen The Yes Men Fix The World (Andy Bichlbaum, Mike Bonanno and Kurt Engfehr, France/UK/USA, 2009), there is a scene in which the titular Yes Men try to convince a gathering of corporate bigwigs that they could make a shitload of money by, literally, repackaging shit back to consumers (that’s vaguely how I remember it, anyway – perhaps someone can correct me if I’m wrong). Hollywood also does this – but given that shit stinks and causes disease if not carefully disposed of, sometimes it’s good to rub it back in the noses of those who deposit it, as per the Yes Men. In comparison, Spurlock just seems to enjoy wading through shit to get to the silver screen a little too much.

Anyway, now to…

The Real Blog – not about but inspired by the so-branded Greatest Movie Ever Sold
At one point in Spurlock’s film, he talks to Martin Lindstrom, author of Buyology, a book about marketing and its effects on the brain.

Lindstrom shows Spurlock images of his brain, taken while Spurlock was watching a Coke commercial.

Lindstrom explains that at a certain moment in the commercial (it is not made particularly clear which moment, since Spurlock – like many neuro-whatever evangelists – tries to blind us with ‘science’ rather than a precise explanation), Spurlock’s brain releases dopamine, which suggests an addiction of sorts – inspired by the commercial. That is, or so the film seems to suggest, Spurlock underwent the same effects from merely thinking about Coke – hence his avowed desire for a Coke at the time of watching the advert – as he would from actually drinking one: a ‘Coke high.’

What is not clear from this is whether Spurlock’s ‘addiction’ is to Coke, or at the very least to its effects, or rather to images that can spur desire through their very presence for that which they depict.

My critique of the lack of clarity offered by Spurlock – a critique that I extend to neuro-evangelists generally – is not made because, Raymond Tallis-style, I wish to dismiss ‘neuromania.’ Indeed, I personally think that neuroscience has enormous amounts of insight to offer us.

But I am not sure that the right questions are being asked of neuroscience at present in order for us fully to understand the implication of its results.

I have written a few papers, published and forthcoming, on what neuroscience might mean for film studies, particularly in the realm of images attracting our attentions through fast cutting rates, through the exaggerated use of colour, and through various acting techniques (associated predominantly with Stanislavski’s ‘system’ and Strasberg’s ‘method’ – which are of course different things, but I group them together because the former spawned the latter as its offshoot). And it is this area of studying film that I wish to pursue further – and on a level of seriousness far greater than that more playfully adopted for a previous posting on sleeping in the cinema.

This will sound quite outlandish – particularly to academic readers – because it is a crazy, Burroughs-esque proposal. But I think that a neuroscientific approach to cinema will help bring us closer to answering one question, which I formulate thus: can there be such a thing as image addiction?

Why is this an important question to ask/answer – and what does neuroscience have to do with it?

It is an important question – at least in my eyes – for the following reason: there is a long line in film studies history of people who argue for and against (predominantly against) the possibility that humans can or do mistake cinematic images for reality. This question, however, is the wrong one – even if it is rather more interesting than its easy dismissal suggests.

Far more important is the following: it is not that humans mistake films for reality (or if they do, this is not as significant as what follows), it is that humans commonly mistake reality for cinema.

What do I mean by this? I mean every time we feel disappointed that we are not in a film. I mean humanity’s obsession with watching moving images on screens at every possible opportunity such that life – and even ‘slow’ films – become boring and intolerable to people who must have their fix of bright colour and fast action. I mean the widespread aspiration to be on film, or at the very least to become an image (what I like to refer to as ‘becoming light’) on a screen (the final abandonment of the body and the ability to be – as an image – in all places at once [travelling at ‘light speed’]). I mean our inability to look interlocutors in the eye because we are too transfixed by the TV screen glowing in the corner of the pub. I mean – and I know this sensation intensely – the sense of immersion and loss of self that I feel when I watch films.

This is what I call image addiction.

But why neuroscience?

Because neuroscience might be able to help show to what extent – be it through conspiracy or otherwise – moving images and their accompanying sounds literally wire our brains in a certain fashion, such that we do all (come closer to) thinking in exactly the same way, repeating the same bullshit mantras to each other, dreaming only minor variations of the same things, etc.

Don’t get me wrong. If we adopted a psychoanalytic rather than a neuroscientific discourse, we might realise that the literal wiring of our brains is heavily influenced by – and perhaps even relies, or historically has relied, upon – ‘fantasy’ of kinds other than the cinematic, which we might more broadly label the ‘culture’ in which we live.

But a neuroscientific demonstration of how this is so (if, indeed, it is so – this is only my hunch at the moment) might then open up debate on every philosophical level: ontologically, to what extent is reality determined by fiction? Ethically, how many images, of what kind, and using what styles, can or should we see if we want to retain some sense of a mythological self that – impossibly in my eyes – is ‘untouched’ by the world (be that by cinema in the world or the world itself that contains cinema) and belongs ‘purely’ to us? Indeed, this might open up debate not only about which ontology and which ethics, but regarding the entire issue of both ontology and ethics – and how historically they have been framed…

For those interested in what academic researchers do, I am trying at present to create a network of scholars interested in ‘neurocinematics’ (which is not to deny that various scholars are already working on these issues in their own ways). I am sceptical that I will be successful in attracting funding, not because the idea is not ‘sexy’ but because I am not sure, at present, whether I know enough neuroscientists to work with, and I also figure I might be too much of a no-mark academic to land a plum grant from a funding institution that has never heard of me. But I shall try nonetheless.

In conclusion, then, Lindstrom’s comment to Spurlock is unclear, but it raises the issue that I think is at the heart of where I want my academic research to go (even if I want to retain strong interests in other academic areas, predominantly film studies, and even if I might ditch all of this to write and/or direct films if anyone ever gave me the money to do so beyond my breaking my bank account every time I put light to lens – my filmmaking being my own desire to become light): is Spurlock addicted to what is in the image, or to the image itself? Can we separate them? Does looking at a Coke can yield the same effect as looking at the stylised Coke can in the image – or is it the properties not strictly of the can, but of the can in the image, that trigger the response of which Lindstrom speaks?

I suspect that both the advertising and movie industries are funding this kind of research as we speak. If the corporate giants can go straight to your brain, they will do. Inception (Christopher Nolan, USA/UK, 2010), then, becomes no lie (not that inception/influence of some sort has not always been in existence – as I argue here). In some sense, then, such research is morally indispensable; what I mean by this is that if corporate giants discover and protect methods of accessing the brain directly, then it is up to academics to let humans know how this happens, to make them aware of ‘inception,’ to bring people back to that most unfashionable of approaches to studying film, ideological critique.

In some senses, then, this is simply the rehashing, under new paradigms, of the same old questions that have been banging around since cinema’s, ahem, inception, and even before. Only the stakes are now higher.

Might I say that a full neurocinematic programme might simply prove according to some scientific paradigm what many of us have known all along?

Wait a second – isn’t this also what I accuse Morgan Spurlock of doing with The Greatest Movie Ever Sold: being a self-serving hypocrite for doing something that I myself seem to want to do?

Maybe Spurlock’s film is better than I thought, then. Maybe it is important and ingenious, because of its invisible transparency, not in spite of it.

There probably is no academic study that is devoid of corporate sponsorship somewhere along the line these days. There certainly will be even fewer if the politicians do not open their ears at some point and listen to what people are beginning literally to scream at them with regard to higher education and other issues. That is, that it must be as free of outside interests as possible, even if the quest for true objectivity is impossible to achieve.

Indeed, if we are talking about the possibility of corporate – or even governmental – brain control (which is not the same as mind control, I hasten to add, though one could lead to the other), then we need to know whether it is possible, how it might happen, and what we can do about it. Before our bodies are all snatched away by the light (note: even now I cannot escape movie references) of the screen and before we are all turned into dependent image junkies who need the images just to feel alive – the over-dependent equivalent of the good, small dose that a movie like Perfect Sense (David Mackenzie, UK/Denmark, 2011) seems to offer – as written about here.

Perfect Sense (David Mackenzie, UK/Denmark, 2011)

Blogpost, British cinema, European cinema, Film reviews, Uncategorized

I have an hour to write this blog. Let’s see what can get done in that time. Given the rush, though, apologies for ensuing typos. I guess they are a hazard of trying to type too fast and of restructuring sentences as one goes along.

That said, this blog is inspired by David Mackenzie’s latest film, Perfect Sense. The premise of the film is relatively simple: humanity is slowly losing its senses, one by one, starting with smell, then taste, then hearing, then sight…

While the film has snatches of this global pandemic – pictured as crowds in Asia and Africa eating, stumbling, touching and the like – the main action takes place in contemporary Glasgow, where epidemiologist Susan (Eva Green) endeavours to tackle the problem while starting a relationship with chef Michael (Ewan McGregor).

This will not be the blog to question the ‘global’ images, which perhaps deserve some critique. No film can get everything right, but the short/sharp treatment that the rest of the world gets in comparison to the main bourgeois couple perhaps deserves greater discussion than I’ll offer here.

Nor will this blog really discuss the interesting illustration that the film provides of how memory is associated with the senses – figured here as ‘disappearing’ images, rendered in photographic (as opposed to cinematographic) form.

Furthermore, the blog will not really try to fit this film in with the ongoing cycle of cataclysmic films that have migrated from the mainstream (provide your own examples) into the more ‘art house’ strand (again, your own examples will suffice, although I also do not want to overlook older examples of artsy treatments of the eschaton, from Tarkovsky to Zulawski, etc).

Rather, this blog will start by mentioning how Le fabuleux destin d’Amélie Poulain (Jean-Pierre Jeunet, France, 2001) was considered at one point through the framework of pétisme.

In the above-linked article, the author, Michelle Scatton-Tessier, discusses how Amélie was linked to a discourse of ‘simple,’ often physical, pleasures: dipping a hand in a pile of beans, spotting continuity errors in old films, etc.

The reason that I start with this is to say that if – at least implicitly – Scatton-Tessier suggests a sort of light-weightedness in Jeunet’s film as a result of his foregrounding of such ‘simple’ experiences, then Perfect Sense is in some respects all about such pleasures. In a world without smell, taste, sound or, ultimately, sight (and beyond that, presumably, touch), our everyday experiences are returned to us in all of their glory – an affirmation that fundamentally we are alive and with the world.

Since such pétisme is one of the film’s main focuses, it inspires thoughts about the nature of joy. For your information, this idea of joy is inspired by the Jewish Portuguese-Dutch philosopher, Baruch Spinoza, as well as by the recent neurology-inspired treatment of Spinoza given by neuroscientist Antonio Damasio.

So… what is joy? Well, joy is the feeling evoked when humans feel alive and with the world. Joy is, in this sense, sensual and physical, but also supremely conscious. That is to say, were we in the world as unthinking zombies, we would still feel. But those moments when we become most intensely aware that we feel – these moments are what I would like to refer to as joyful ones.

Joy has, as Spinoza would argue, its own ethics. That is, joy is a feeling associated with enworldedness, and to be enworlded is to recognise others. In recognising others, we lead an ethical life that seeks to maximise one’s own joy and the joy of others, rather than frustrating or destroying the joy of others for the sake of selfish concerns.

In an hour (now 40 minutes), I do not have the time to explain in greater detail any empirical basis for such an argument, were a coherent one easily to be constructed even after 4,000 minutes of writing. So we must push on…

We live in an age in which joy seems a rare commodity. Perhaps it needs to be rare and the pain and sorrow that surround joy give to it its value in our lives. Nonetheless, I feel no embarrassment in connecting the need for joy to the absence of joy found in our current political, economic and ecological situation – and which has found expression in recent and ongoing movements towards social change. In other words, Perfect Sense emerges, in some respects, as a treatise on finding joy, even as we become increasingly disconnected from the world with which fundamentally we are inseparable.

In this sense, the affliction of senselessness that strikes humanity in the film is not uniquely an allegory of a vengeful nature, but perhaps most particularly an allegory of our own practice of alienating ourselves from the world. That is, the affliction is not nature, but humanity itself.

This is arguably a skewed reading of the film. No disservice is intended to David Mackenzie and his collaborators, in particular screenwriter Kim Fupz Aakeson. But these were my thoughts during and since the film, so I pursue them nonetheless – because as is my usual bent, I want to talk not really about this film, but to use this film as a pathway to talking about film itself (and for the pedants that may come across this blog, I do not strictly mean film in the material sense – polyester – but film as an audiovisual text).

For, humanity’s alienation from the world leads de facto to a reduction in joy. This is dangerous water: what myth of antediluvian joy am I evoking to suggest that at some point in time humans were in harmony with nature? None, I hope. My point would be that humans have always only ever been in, or better with, nature; it’s just that we have created precisely a myth that we are not. What I wish to evoke, then, is not so much a time in which humans are ‘once again’ reconnected with nature – but rather an understanding that we are, always have been, and for the duration of our crawling over the face of this – and perhaps other planets – always will be, with nature. Pétisme is not a silly little pastime; it lies at the root of understanding our place in the world…

If this sounds tree-hugging, then hold on to your underwear, because I am about to get more romantic still as I defend idealism, perhaps even naïveté.

Perfect Sense suggests that we most appreciate things when we are about to lose them. The loss of smell in the film is prefaced by tears, the loss of taste by gluttony, the loss of hearing by anger, and the loss of sight by tenderness and love. Although tears, gluttony and anger might all be characterised negatively, they are also all intense feelings.

My point here, though, is that intensity is accompanied almost necessarily by its opposite; something perhaps that we might call neutrality.

In a not-necessarily-uniquely-metaphorical sense, we can say that the same is true of all experiences. Vision can only be appreciated by blinking. Without blinking, our eyes would become dry and/or burnt and we would no longer be able to see. What is more, our eyes move the whole time through movements called saccades. We cannot see, I have read, during the saccade movement itself, but only during the fixation that happens between each saccade.

Aside from the possibility that vision is based upon fixation and stillness – while change, or more precisely time itself, is invisible to us – this suggests, to me at least, that in some senses we can only see by looking away.

How does this relate to idealism and naïveté? Well, perhaps cinema, too, is a process that involves us turning our eyes (and other senses, of course) away from reality as we might normally define it, and placing them before an alternative, supposedly fictional world that is not our own.

While some argue that reality can be fundamentally disappointing after seeing certain films, I felt invigorated after seeing Perfect Sense, keen once again to marvel at the minute details that all too often I overlook.

This is no doubt cheesy, but it is for me also true, so in the name of honesty, I cannot deny my own susceptibility to such romanticism. In other words, Perfect Sense itself – as well as the story that it tells – functions as a metaphor of how looking away (figured within the film as deprivation, figured in my experience as cinema-going) can help us to see – if not more clearly, then at least (and perhaps only temporarily) differently upon looking back once again.

In other words, looking at the world not as it is but as it might be – via the ideal worlds of fiction – can help us better, or at least differently, to see the world as it is. Idealism functions to support our understanding of the world.

I don’t consider this to be a whole-hearted embrace of anything-goes illusionism. But then I also don’t believe that there is (access to) an objective reality against which we can measure how ‘realistic’ or otherwise our vision is (even if we can enhance our understanding of reality by repeating experiments and seeing how consistently our understanding of it is correct). We are not detached observers of reality; we are constituent parts of reality, co-constituting both ourselves and the reality that surrounds us in an ongoing process. In this way, the ideals of idealism are not (to be mistaken for) reality; but they are a part of reality.

To switch vocabulary, we might say that in some senses, then, and as Gilles Deleuze might suggest, the actual is a product of the virtual. That is, the virtual is the realm of ideas – that which has no material being, but whose effects on the real world are undeniable in that the virtual is the ‘darkness’ (or the looking away) that allows us to see the light.

And contemplating ideas, or the virtual, is perhaps also to lead a life that is virtuous. Virtue here is not the turning away from and the rejection of the reality that surrounds us for the sake of living in a dream. But it is understanding that dreams, ideals, virtuality, and perhaps also cinema, have an important role to play in how we frame – and thereby co-constitute – both ourselves and reality.

The more we realise our interdependent nature – interdependent with nature/reality and with each other – the more – Spinoza again – our lives are virtuous. The more joy we let into our lives. The more, perhaps, we love. This is perhaps to live ecologically – and to realise that films are an important part of our (mediated) ecology.

I’ve not yet seen it, though I suspect I will, but Contagion (Steven Soderbergh, USA/United Arab Emirates, 2011) will probably overshadow Perfect Sense as the epidemic film of 2011. However, by turning away from the mainstream and towards the more independent-minded British film that is Perfect Sense, perhaps we can re-think our relationship with mainstream movies, rethink what constitutes a mainstream movie, and realise that independence in filmmaking – idealism of a sort – is what refreshes cinema. No disrespect to Soderbergh, but Mackenzie has produced a remarkable film on what is surely a fraction of the budget…

Notes from LFF: Dragonslayer (Tristan Patterson, USA, 2011)

American cinema, Blogpost, Documentary, Film reviews, London Film Festival 2011, Uncategorized

If I Wish (Kore-eda Hirokazu, Japan, 2011) only obliquely situates itself within the context of the current global economic downturn, Dragonslayer does so in a much more overt fashion.

Or rather: hearing director Tristan Patterson discuss the project, the very first thing that he said was that his subject – bum-like, nearing-middle-age skater dude Joshua ‘Skreech’ Sandoval – was an amazingly intriguing character, given that he just skates through the world while around him everyone’s lives – in the USA at least – are turned upside down thanks to the post-crisis fallout and its effects on the man in the street.

Like the would-be rock star father in I Wish, Sandoval gets by doing what he can: he works as little as possible, he gets – seemingly minimal – sponsorship for his skating, he smokes (a lot) of weed, he drinks, he is friendly to all and sundry, and he travels (in whatever capacity he can).

Sandoval emerges as a figurehead not so much of the angry generation as of the don’t-give-a-fuck generation. Let me be clear about what this ‘generation’ (if it is one) does not give a fuck about. This generation seemingly does give a fuck about the world, as Sandoval’s insistent trips into nature to go camping and fishing suggest.

That about which Sandoval and others do not give a fuck is the current organisation of the human world. That is, questions of economics, perhaps also of politics, interest Sandoval, and presumably many more like him, so little that he is just happy getting by in his own way as best he can.

This position has a romanticism of its own: a middle-class viewer like me truly fears for Sandoval, as perhaps does he himself, when, come the film’s end, it is documented that he is now working (at least part-time) serving beer in a bowling alley. How can he survive, not least since he has a young infant in tow, on the bare minimum that he currently has?

But perhaps how is the wrong question, and a real reason to admire Sandoval. How is not important. The only important thing is that he will survive – no matter how hard the world is made for him as a result of the choices he has made so far (economic imprisonment based upon ‘economic crimes’ that are not illegal at all, but which will keep Sandoval from material wealth for as long as he lives unless either he changes or he breaks – miraculously – into Hollywood or some such).

Beyond Sandoval and some unanswered questions regarding his relationship to the film (his – perhaps unlikely and younger – girlfriend, Leslie Brown, starts going out with him during filming – and whether the presence of the cameras has anything to do with this is not explored; similarly, given his economic situation, one wonders whether the filmmakers had any performance money to give to Sandoval), the film is well shot.

Patterson and his cinematographer, Eric Koretz, said that they did not want the film to look like a skater movie. Only it is hard not to look like a skater movie given the light qualities of southern California that play such an important part in skater culture (and which have been co-opted by Coke, etc).

It is furthermore hard not to look like a skater movie when the film features so many obligatory emptied swimming pools.

And yet, Koretz and Patterson are not using mobile fish-eyes (although they do use the indie rock soundtrack), and they are capturing blades of grass, fluffy toys hanging from rear view mirrors and the like.

In short, then, Dragonslayer cannot but be a skater film – but it also is, as implied by the discussion of Sandoval’s seeming ethos/philosophy above, an activist – or perhaps better, a contemporary beat – film.

Beyond the cinematography, this is evoked most formally in the editing. The film has a schizoid feel thanks to its insistence on (Godard-style) cutting short musical segments before they can become the obligatory music video, and the often chaotic juxtaposition of impressionistic images. This is all caged by an 11-chapter structure that runs from 10 down to 0 over the course of the film. 0.

L’œuf, the egg, or what tennis umpires call love.

Sandoval’s dream is to be the only person moving on a planet stilled in time. He can empty pools and skate, raid fridges, drive other people’s cars. Just go wherever he wants to (there is a strong emphasis in the film on Sandoval jumping over fences – the wilful making-common of an otherwise increasingly privatised terrain).

While Sandoval does say that in this fantasy world he’d make some people wake up to hang out with him (how would he choose?), his vision does sound more like a desire for solipsism, or to be the One in a planet full of otherwise disposable human beings.

But 1 always needs 0 (hence the digital image’s democratic possibility for 1+0). And it is pleasing that the film takes us to 0. Not because it favours annihilation over anything/everything. Not because this is a negative choice in comparison to the possibility of an additive world in which the numbers just get bigger and bigger.

But perhaps because 0 is the only way to get back to the ground and to extend a sophophily of love. Not the love of wisdom, but the wisdom of loving.

Sandoval is perhaps an anti-hero of our times. But in other respects he is just a plain hero. Even with his bowling alley job, the Quixotic elements seem to remain as he takes a hit from the bong on the way to work.

We don’t all need to be dope smoking stoner skater dudes. That surely cannot be the message here. But the message clearly is: we must have the courage, the love of ourselves and of others, to go forward into the world in a fearless fashion. To be ourselves.

The self is a performance, no doubt about it. But it is a performance into which one injects one’s heart and soul, hence its courageous nature. Nothing disingenuous here.

Indeed, one of Sandoval’s friends, a total stoner who seems barely coherent, speaks of reading Spinoza’s Ethics. Spinoza here emerges as a great reference point for the film. For Spinoza argues (in my reading of him at least) that if truly we become ourselves, we cannot but love others and the world that surrounds us.

You can Occupy wherever you like. But the success of this movement can only begin with the occupying of oneself with oneself, filling oneself up with one’s own understanding and vision of the world, such that it can only spill over into the commons with love and respect.

Notes from the LFF: I Wish (Kore-eda Hirokazu, Japan, 2011)

Blogpost, Film reviews, Japanese Cinema, Uncategorized

Kore-eda Hirokazu’s latest film looks set not to get distribution in the UK. Or at least it does not have any at the time of writing. For this reason, I wanted to see it at the 2011 London Film Festival.

Indeed, avoiding premières of films that I shall no doubt see in the fullness of time seems low down on my agenda of trying to see films at this year’s festival. Rather, I have tried to see films I am unlikely to see (on a big screen, at any rate) anywhere but here.

Kiseki/I Wish is a light-hearted tale of separated brothers, Koichi and Ryu, the former of whom lives with his grandparents and mother, the latter of whom lives with his dad, who is lead singer in a band.

Like Kore-eda’s previous Dare mo shiranai/Nobody Knows (Japan, 2004), this film is about family – in particular disrupted families that struggle to stay in touch in spite of the prevalence of mobile telecommunications technologies (which feature prominently in the film).

Furthermore, given I Wish’s interest in inter-generational relationships, the film also shares common ground with Aruitemo aruitemo/Still Walking (Japan, 2008), not least in its lighter tone and in its characters’ forging of friendships in spite of adversity, whereas Nobody Knows is comparatively gruelling in its vision of abandoned Japanese children.

Koichi is studying in Kagoshima under the shadow of a permanently smoking volcano, about which no one seems to be doing anything (like running away in case of eruption).

It becomes hard not to consider the threat of natural disaster as supremely ironic in the light of the recent Tōhoku earthquake. And yet, in spite of these parallels, the possibility of natural disaster seems to function more metaphorically here.

That is, Koichi takes the volcano himself as a symbol for his frustration at being separated from his brother, and at his parents’ separation in general.

He and Ryu – and friends – decide to meet up to wish for a miracle, a miracle that will come to pass if they are at the spot where two Shinkansen/bullet trains cross paths on the line between Kagoshima and Fukuoka, the town where Ryu lives with his dad.

The Shinkansen functions as a symbol of hope: perhaps humans and their astounding ability for technological progress will sidestep the dangers that living in/with nature seems to entail.

However, more fitting – for me – would be a recognised synthesis of man and nature. That is, the volcano and the bullet train both can share the same world, a world that Koichi eventually, if ambiguously, wishes for instead of the reunion of his parents.

Ryu, who is all of eight years of age (Koichi is perhaps ten or eleven), upbraids his father – as his mother has done – for not being responsible enough towards his children as he pursues his dream of being a rock star. The father is not without success, but childhood neglect does seem to be a prevalent issue, both for Ryu and for Koichi, whose mother goes out with old school friends instead – arguably – of looking after her son.

Grandparents, both Ryu and Koichi’s real ones and an old couple with whom they stay on their Shinkansen pilgrimage, step in to help this young generation.

However, the film does not seem to offer an outright critique of the middle generation. Not only is the film relatively sympathetic towards them (there are no villains in this film), but they also seem to be prey to circumstances that are in part beyond their control.

Koichi and Ryu’s mother takes a job at a supermarket, a job about which she is ashamed given that most of her peers have better jobs and pay (or so she says). Their father, meanwhile, pursues his childhood dream, as mentioned.

One gets a sense of how one must choose between family and a career, be that a supposedly ‘infantile’ desire like becoming a rock star, or having a ‘better’ job than supermarket shift work.

To choose family seems to preclude great financial success. Furthermore, to pursue independence as an artist (the father’s music is described as ‘indy,’ which Koichi explains to Ryu as meaning that one ‘has to work harder at it’ than a mainstream success) seems also to get in the way of family.

Since the film does not judge the parents, the question becomes: what kind of a world is it that we live in such that the following choices become our only options: either you make money, or you pursue your dream, or you have a family?

There is no definite answer to this question, except perhaps that this simply is the world we live in, and perhaps it is better to have this world than no world at all – the ‘lesson’ that Koichi seems to learn as the film progresses.

There is a scent of the global financial crisis underwriting I Wish, which does the arguably clichéd trick of having children pronounce the most profound wisdom. But if it is a question of economics that has shaped the kind of world in which the above are the only available options, then perhaps the wilfully naïve thoughts of children can be recognised as a return to insightful simplicity in an era that preens itself like a misunderstood romantic on account of its own forced complexity.

The real complexity here is not forced; it is the natural emergence of a community regardless of what else the world (and humanity) throws in its (own) path.

The film is shot with Kore-eda’s characteristic handheld camera, allowing his performers to live on the screen, together with relatively common long shots that ensure that his characters are contextualised in a world characterised by competing technologies and temporalities, dreams and desires.

Sleeping in the Cinema

Blogpost, Film education, Uncategorized

The below is a rough draft of a paper I was going to present at an academic conference in London this summer, but from which I have withdrawn because I can’t really afford it.

It is relatively ‘whimsical’ and not ‘hard science,’ though it flirts with some science.

But I offer it as ‘notes’ on what for me is a prominent aspect of the film viewing experience: falling asleep.

“You are 8 ½. What an age for a boy to ask about cinema and dream! It occurs to me that that same evening, Dadda was telling me that his falling asleep in the cinema is a particular honour to the film in question. He was telling me this as a compliment, his having snored through three of the four films released last year in which I appeared.”

– Tilda Swinton (2006: 111)

In an age when film studies wishes to map almost every aspect of the film experience – from ideological influence to affective response, from audience feedback to galvanic skin responses – sleeping in the cinema remains an overlooked aspect of spectatorship.

And yet, what does it mean to sleep in the cinema? Is it simply an index of a film’s failure to capture the attention of the viewer, such that they prefer instead to doze off in pursuit of more interesting thoughts? Or might sleeping in the cinema be something more akin to what Tilda Swinton playfully suggests is her father’s experience of the majority of her films – that is, an honour and a compliment to the film in question?

Theories of why humans sleep vary. Given that not all animal species sleep, however, the prevailing logic suggests that sleep serves some function that benefits us, one that outweighs the dangers associated with sleeping – namely, that while asleep one is not particularly aware of the potential threats lurking in one’s vicinity.

What is more, we spend roughly one third of our existence asleep, which reinforces the notion that it must serve some evolutionarily beneficial purpose.

The most common and seemingly plausible theory of sleep is that humans do it for the sake of information storage.

Various studies have shown that sleep enhances synaptic efficacy ‘through oscillatory neural activity providing “dynamic stabilisation” for neural circuits storing inherited information and information acquired through experience… Sleep, therefore, serves the maintenance of inherited and acquired memories as well as the process of storage of new memory traces’ (Krueger et al 1999: 121).

In other words, sleep fulfils some of the same functions that waking life achieves, namely our adaptation to the environment: ‘the major function of sleep is to maintain our ability to adapt to a continually changing environment since that ability is dependent on brain microcircuitry’ (Krueger et al 1999: 126).

By keeping our brains fluid and malleable, sleep enables us better to consolidate memories, which in turn enable us better to navigate our waking world.

It is perhaps useful at this point to explain that there are two separate modes of sleep, which some view as supporting ‘quantitatively different states of consciousness’ (Hobson and Pace-Schott 2002: 685), namely rapid eye movement (REM) and non-rapid eye movement (NREM) sleep.

Dreams only take place in REM sleep, which is deemed to ‘release hallucinosis at the expense of thought,’ perhaps because ‘the activated forebrain is aminergically demodulated compared with waking and NREM sleep’ (Hobson and Pace-Schott 2002: 686).

What this latter phrase means is that the neurons used to regulate (or modulate) the size and intensity of certain brain waves (e.g. ponto-geniculo-occipital, or PGO waves) during NREM and waking life do not fire (the waves are ‘demodulated’).

As a result of this, and the hyperactivation and deactivation of other brain regions, REM sleep, or dream, is characterised by ‘the lack of self-reflective awareness, the inability to control dream action voluntarily, and the impoverishment of analytical thought’ (Hobson and Pace-Schott 2002: 686).

Having differentiated between REM and NREM sleep, though, it is important to remember that both seem to serve a similar function: NREM sleep ‘could allow recent inputs to be reiterated in a manner that promotes plasticity processes that are associated with memory consolidation,’ while during REM sleep ‘the brain is reactivated but the microchemistry and regional activation patterns are markedly different from those of waking and NREM sleep.’

As a result, Hobson and Pace-Schott conclude that ‘[c]ortically consolidated memories, originally stored during NREM by iterative processes such as corticopetal information outflow from the hippocampus, would thus be integrated with other stored memories during REM’ (Hobson and Pace-Schott 2002: 691).

If both REM and NREM sleep help to consolidate memory, albeit in different ways, then the distinction is not necessarily a useful one to draw with regard to sleeping in the cinema, not least because it is hard to determine, even via introspection based upon personal experience, what kind of sleep goes on in the cinema – if there is a constant type of sleep that happens in the cinema at all.

What is more, the case has been made that there is in fact slippage between REM sleep, NREM sleep and waking life. This is not just on account of the fact that we can ‘hallucinate’ during waking life, or have ‘day dreams,’ such that ‘all conscious states – including waking – might have some quantifiable aspects of dream-like mental activity’ (Hobson and Pace-Schott 2002: 684).

Instead, this is based upon the fact that parts of the brain are always ‘asleep’ while other parts are more activated, meaning that even our waking life is characterised in part by areas of our brain sleeping.

In spite of this blurred boundary between waking, REM, and NREM sleep, however, the distinction might be useful for us in thinking about more ‘ecological’ causes of sleep.

The cinema is a darkened room; although light shines from a projector and is reflected from a screen, the room is predominantly dark.

Arguably (see Brown 2011), the light from the screen, particularly in the case of rapid changes of intensity and colour (i.e. lots of onscreen movement in the form of figural motion, camera motion, and cutting) in conjunction with loud noise, is enough to activate our attention in a quasi-involuntary manner.

However, the darkness of the room might also be important, since sunlight inhibits the production of melatonin. Melatonin is a compound that synchronises the biological clock: once darkness falls, it is released by the pineal gland; it also acts as an anti-oxidant and supports the immune system.

The darkness of the cinema, then, may also bring about a release of melatonin, which in turn prepares us for sleep.

I might add that melatonin is produced from serotonin in the human body. Both compounds have been considered as playing a role in human sleep, and both have been associated with recreational drugs on account of their hallucinatory qualities.

Now, cinema has been equated with the dream state since at least the 1950s (for example, Langer, 1953), but rather than the typically psychoanalytic slant given to the relationship between dream and film, I should like here to pursue a different, admittedly (equally?) speculative line.

Both serotonin and melatonin are neurotransmitters; that is, they help to transmit signals across neurons. When, as mentioned earlier, aminergic demodulation takes place, serotonin and melatonin allow the level of hallucinosis to rise.

Serotonin in particular is linked to feelings of euphoria (MDMA, or ecstasy, works by triggering its release).

It is interesting that as a transmitter – a guardian in the neuronal gateways – serotonin is between actual signals, but it modifies which signals travel through our brain – which connections are made.

Or rather, serotonin, from my understanding of it, enables brain plasticity; that is, it enables more, not fewer, connections to be made, and it is comparatively inhibited, if still at work, during waking and NREM sleep as opposed to REM sleep.

As such, serotonin and melatonin (but the latter seemingly to a lesser extent) are a means of regulating not what we envision, but how we envision it; for creating and cementing new connections in the brain.

On a purely speculative level, in an era whereby scans of the human brain are being carried out during film viewing (see Hasson et al 2004; 2008; Kauppi et al 2010), it would be interesting to see if there are any similarities between brain function during REM sleep and film viewing – that is, whether the human brain considers film viewing in general, or certain types of film viewing, to be a form of hallucinosis.

There is something intriguing about serotonin as a neurotransmitter, which sits between signals or brain events. As the interval between brain signals, a neurotransmitter might be considered more temporal than spatial: neurons themselves have extension, while a neurotransmitter is what decides whether to convert an action potential into an actual action.

As such, the neurotransmitter sits at the threshold between the conscious and the unconscious, between potential and action, between perception and hallucination, and between space (extension) and time (intensity).

To maximise serotonin levels, both in REM sleep and in hallucinosis (and maybe even in cinema?), is to foreground the temporal and the intense rather than the spatial elements and extensive/motor processes of the brain.

Approaching the issue of sleeping in the cinema from the introspective point of view – that is, basing thoughts upon personal experience – I can report that during the period from 1 September 2007 to 1 September 2008, I went to the cinema roughly 150 times.

I fell asleep during roughly one third of the films that I saw at the cinema, which is of course a high tally.

I cannot say for certain, but based upon running times and/or conversations with others in attendance, my sleep typically lasted between 2 and 20 minutes.

There are several factors that contribute to my sleeping: typically I do not sleep for particularly long at night (six hours on average), and as a result I often find myself tired during the day.

Post-prandial cinema visits, particularly in the early afternoon, would most often induce sleep, as to a lesser extent might early to mid-afternoon screenings before which I had not eaten.

In addition, early to mid-afternoon screenings tend to involve fewer patrons; the presence of other patrons in the theatre, in terms of atmospheric noise, temperature and perhaps also in terms of ‘emotional contagion,’ can affect the way in which we view a film – and the absence of others might also increase the likelihood of sleep.

What is more, my slouching posture and the comfort of the chairs, in addition to the potential effect that the melatonin-inducing darkness of the cinema hall can have on viewers, may also contribute to my nodding off.

Although these factors are important to take into account, I know from experience, however, that I also fell (and typically fall) asleep far more often during what we might term ‘art house’ films than I did (or do) during what we might term ‘blockbusters,’ if drawing such a crude dichotomy between film types be allowed for the sake of argument.

Rarely did I fall asleep for lack of enjoyment; if I may speak personally, very few are the films I do not like, and, in proportion to the number of each that I see, I enjoy art house films far more consistently than blockbusters.

Now, art house films tend to have smaller audiences than blockbusters, and given my desire to see them during the cheaper early to mid-afternoon screening slots, the small audience size may well have an even greater effect on my likelihood of sleeping than watching a blockbuster during the day.

That is to say, I suspect that each of these factors plays a role in my falling asleep in the cinema.

However, the most common factor seemed to be the art house nature of the films; that is, regardless of my (often high) level of enjoyment, the relatively slow nature of these films, in terms of movement onscreen, camera motion and in terms of cutting rate, helped to bring about sleep.

This stands to reason on a certain level: if we are aroused by fast action and the loud explosions of blockbusters, it is more likely that we will feel drowsy and/or fall asleep when no danger is clear or present.

However, I should like to offer a different reason, not necessarily in contradiction of this prior reason, but certainly alongside it.

The afore-mentioned work on what happens in the human brain during film viewing, or ‘neurocinematics,’ suggests that audiences in fact respond very similarly to mainstream films like Vertigo (Alfred Hitchcock, USA, 1958), while there seems to be a much greater level of independent brain response during less action-packed films, the least amount of what Hasson and his team (2008) call ‘intersubjective correlation’ (ISC) taking place when viewers see a video of a tree in a park in which ‘nothing,’ so to speak, happens.

According to Hasson et al (2004: 452), the brain regions that are most commonly correlated intersubjectively during mainstream film viewing are the parahippocampal gyrus, the superior temporal gyrus, the anterior temporal poles, and the temporal-parietal junction.

The parahippocampal cortex has been identified as playing a role in REM sleep, in that it allows the sense of movement, emotion and affective salience to emerge (Hobson and Pace-Schott 2002: 687).

Furthermore, in both film viewing and REM sleep, the fusiform gyrus, which is useful for face recognition, has been found to be active.

While circumstantial at best, it might be possible to suggest from this evidence that mainstream film viewing does function cerebrally in a manner akin to dream, especially because the movement associated with the parahippocampal gyrus is illusory in both cases.

Krueger et al suggest, contrary to much sleep research, that sleep is not dependent upon prolonged wakefulness, but rather upon synaptic use.

That is, ‘exposure to rich environments’ can increase the amount of REM sleep that we have (Krueger et al 1999: 124).

It is not entirely clear what they mean by a rich environment.

Since I do not typically fall asleep in mainstream films, we might conclude that these are not such ‘rich environments.’ In spite of the rapid movement and motion in mainstream films, we could argue that such films do much of the work for audiences and do not force the brain to work harder to comprehend what is going on. In fact, the ease with which we can understand mainstream films would in this case suggest that they are ‘simplified’ (and not ‘rich’) versions of reality (even if such ‘simple’ scenes also stimulate our attention through continued visual and auditory renewal/stimulation).

However, the greater levels of direct stimulation involved in mainstream cinema would suggest some ‘richness’ – and that it is the relatively ‘unrich’ environment of the cinema during art house fare that encourages us, or at any rate me, to sleep.

My point is not here to resolve this conundrum, even if all types of cinema might be said to constitute some sort of hallucinosis, in that we see objects that are only images, but which we might on a certain level take for real, as we do a dream during our experience thereof.

Rather, my point is to say that while cinema might be akin to a shared dream, in that it can induce similar thought patterns across multiple viewers in regions of the brain similar to those that fire during sleep, it is the cinema that does not involve such synchronisation of viewers’ brain patterns that is more likely – if I am anything to go by – to induce sleep itself.

Since humans are collectively involved in a world that is always affecting us, it is hard to say where its influence ends, if it ends at all. If we were to cling to the notion of a subjective self, however, one that wanted to think for itself, then we might conceivably argue that sleep is the time when, paradoxically – given that we have little motor control over dreams and do not remember NREM sleep – our thoughts are most ‘our own.’

That is, we function (more) ‘offline’ when asleep than during waking, during which time our thoughts constantly are being shaped by the world around us.

If Hasson et al’s research is anything to go by, then mainstream film might well bring humans together (we ‘correlate’), but it increases the probability of humans all thinking in the same way, perhaps by virtue of the simplified version of reality that mainstream cinema has to offer.

This is not to say that mainstream cinema will make automatons of us all (unless it already has).

It is to say, however, that films that do not impose images upon us, but which allow us actively to explore the image in our own way (less intersubjective correlation, more ‘art house’) might naturally induce in us that state of maximised (if never absolute) ‘independent’ thought, which is sleep.

The fact that we must search through these environments, rather than have information delivered to us in an obvious if stimulating manner, might even make them ‘rich environments’ that naturally tire us, because we must process individual (new?) thought patterns and associations that we have created for ourselves rather than had imposed upon us.

In this sense, perhaps sleeping during a film is not only an honour for and a compliment to the film in question, but also a gift from the film to encourage independent thought – an act of love, in certain senses.

Anthropologically speaking, humans do not always sleep just anywhere and with anyone (even though inebriation, among other altered states of consciousness, might lead us to believe that we do).

And yet, humans do feel the need to sleep with someone, even if this so-called ‘need’ is cultural.

Indeed, the act of sleeping with another human, which often is synonymous with the act of love, is perhaps the most intimate relationship that two humans can have.

In Being Singular Plural (2000), Jean-Luc Nancy argues that humans must recognise the fundamental ‘withness’ of their existence.

That is, humans do not lead lives in which they can objectively observe each other, detached in their observations; instead we are always at all points with each other, leading a relative existence, in the sense that we are only ever coexisting – indeed, there is no existence without coexistence and communication.

Nancy writes:

“‘to speak with’ is not so much speaking to oneself or to one another, nor is it ‘saying’ (declaring, naming), nor is it proffering (bringing forth meaning or bringing meaning to light). Rather, ‘to speak with’ is the conversation (and sustaining) and conatus of a being-exposed, which exposes only the secret of its own exposition. Saying ‘to speak with’ is like saying ‘to sleep with,’ ‘to go out with’ (co-ire), or ‘to live with’: it is a (eu)phemism for (not) saying nothing less than what ‘wanting to say’ means [le ‘vouloir-dire’ veut dire] in many different ways; that is to say, it says Being itself as communication and thinking: the co-agitatio of Being.” (Nancy 2000: 92-93)

Picking apart this passage, Nancy offers communication as a means of exposing oneself, of opening oneself up to the other (and elsewhere, Nancy [2008] has written about how exposure is part of the cinematic experience, as we are ex-peau-sed to the skin (pellicule) of the film).

To open oneself up in this way is like sleeping with or going with: co-itus/coitus as part of this communication.

Paradoxically, it takes sleeping with someone else, that experience in which we are most ‘ourselves’ because ‘offline’ (even if never fully so), in order fully to ‘communicate’ or expose oneself to the other.

It is to accept and to be accepted by the other, a level of thought in which we are not the detached, thinking observer that Descartes proposes as the mind split from the body, and which finds expression in his cogito ergo sum.

In an age in which neuroscience has tried to overthrow the sway that Descartes’ most famous phrase has held over us (see, for example, Damasio 1994) – because for neuroscience there is no detached thought, no mind-body dualism, since we are only ever embodied, our ‘higher’ conscious processes stemming from and unable to live without our so-called ‘lower’ viscera and emotions – it would seem that we must abandon the mind-body dualism.

However, this does not necessarily mean that we must abandon the cogito entirely.

Descartes first proposes je pense, donc je suis as one of only three things about which he can have no doubt in Discourse on Method (Descartes 1998 [1637]: 53).

He refines this phrase in Principles of Philosophy (2009 [1644]: 17), where he argues that we might well imagine that there is no God and that we have no body, but that we cannot doubt our minds, because thinking determines that we must have a mind.

Descartes goes on to define thought, or cogitatio:

“By the word thought, I understand all that which so takes place in us that we of ourselves are immediately conscious of it: and, accordingly, not only to understand (INTELLIGERE, ENTENDRE), to will (VELLE), to imagine (IMAGINARI), but even to perceive (SENTIRE, SENTIR), are here the same as to think (COGITARE, PENSER). For if I say I see, or, I walk, therefore I am; and if I understand by vision or walking the act of my eyes or of my limbs, which is the work of the body, the conclusion is not absolutely certain, because, as is often the case in dreams, I may think that I see or walk, although I do not open my eyes or move from my place, and even, perhaps, although I have no body: but if I mean the sensation itself, or consciousness of seeing or walking, the knowledge is manifestly certain, because it is then referred to the mind, which alone perceives or is conscious that it sees or walks.” (Descartes 2009: 18)

If thought and the mind are precisely embodied, Descartes’ definition of cogitatio would seem misguided.

However, if, as Nancy explains to us, we remember that cogitatio is derived from co-agitatio, which etymologically speaking means to act, move, or do with, then even cogitation is always already a phenomenon done with others (and, after Damasio, with one’s body).

With regard to cinema, we might remember that there is a paradox in that to sleep/to do with another the thing that arguably requires the least ‘withness’ is in fact perhaps the most intimate or the greatest exposure that one can make of one’s self.

This paradox is logical, since if we are only with others, then one’s most ‘detached’ self is precisely that which is least ‘with’ others – i.e. self-hood is only defined with others, and so that which is most un-other-like about us, our sleeping self, is paradoxically that which is most unique to us; we are most ourselves when we are least ourselves.

Furthermore, this paradox is mirrored by the fact that to cogitate, which Descartes upholds as the highest indicator of the mind’s separation from the body, is in fact only ever a thinking with, both with our bodies and with others.

The cogito is in fact a co-agito.

Sleeping in the cinema, in which we are ‘most ourselves,’ becomes in this way a communion with the film.

Many humans sleep alone, within spaces that are familiar to them. Perhaps it is as much with the space of the cinema as with any particular film that we feel so intimate and safe that we can allow ourselves sleep.

That I do not sleep during blockbusters leads me to believe that I probably do not trust blockbusters; their fast movement may be arousing in terms of grabbing attention, but it also unnerves me, making me alert and worried that something is about to happen.

The art house film, meanwhile, is a friend, or a lover, with whom I feel safe, and in a space that feels safe to me.

Since it exposes to me things that are more intimate and meaningful than does the blockbuster, I expose to it that which is most private in my life: my sleeping self.

We go together in a strange coitus, a co-agitatio akin to that of cogitation in the real world (and which perhaps we might differentiate from the egocentric survival instincts that the explosions of the action film seem to encourage).

I feel safe in the cinema perhaps because of familiarity, making it not a ‘rich environment’; but while a blockbuster may grab my attention, it does not necessarily entertain me.

Art house films are the richest environment in which, or better with which, I think (I co-agitate) the most; blockbusters are not wholly ‘brainless,’ not least because the mind and the brain are embodied, and we can and often do have very visceral responses to blockbusters, which in turn can induce new, richer thoughts.

But the phrase ‘brainless’ is not unuseful in getting to the root of our relationship with blockbusters, in differentiating these simplified versions of reality from the complexity of art house films.

I love cinema, but if my willingness to sleep with art house film is anything to go by, I feel happiest with it.

I am promiscuous in my cinematic tastes, responding to and interested in many of the different experiences that cinema can offer; but I am happiest with the slow, thoughtful films that sometimes even allow me to think ‘offline’ for a while – to sleep, perchance to dream.

Cinema has long been associated with dream, and yet sleeping in the cinema is typically thought of as a negative experience, a sign of boredom.

Cinephiles, together with cognitive studies of cinema, seem predominantly interested in visual and aesthetic pleasure, and in attention and arousal.

And yet cinema can indeed send audiences to sleep.

Contrary to the ‘boring’ and ‘slow’ film argument, this can in fact be the most intimate relationship one can have with a film, even if paradoxically it means not even ‘seeing’ or ‘hearing’ the film (though we can still sense its presence).

To sleep with a film is a sign of cinephilia.


No dark sarcasm (three)

Blogpost, Film education, Uncategorized

The previous two blogs, then, serve as a prelude to this final blog, which of course must look at the New College of the Humanities and the discourse emerging around it.

If I have argued in the previous two blogs that we should not abolish Oxbridge at all, but simply recognise that Oxbridge-centrism leads to misconceptions about higher education in the UK, then I am naturally dubious about an institution that is being presented in the media as ‘a rival to Oxbridge.’

In these days of free enterprise, anyone should be allowed to make money in the way that they see fit, provided, of course, that it is legal (and even then a lot of people still make money – and lots of it – illegally).

Many, if not all, UK universities are currently trying to come up with ways to make more money in the face of government cuts (although they might well be doing this even if there were not government cuts).

These schemes include the development of short courses, diplomas, and various other educational packages across many academic disciplines, typically aimed at creating a market of local part-time students, who will pay for a diploma and thus help to fund the university.

It is hard to be enthusiastic about creating more teaching for oneself in an age when one might feel that one does quite enough already, thank you.

However, the reason that I mention this is that, in their own way, universities are not necessarily charging their students more to keep afloat (since undergraduate fees have been capped at £9,000 for domestic students), but they are coming up with ways of getting money from other, temporary students in order to create some economic stability.

And of course out of a sense of pedagogical altruism.

If this is what is happening widely already in higher education institutions, then, particularly in these market-driven times, it is perhaps only a small leap for higher education institutions to privatise themselves. That is, to free themselves from the burden of constantly running smaller money-making schemes, and instead to charge students enough money that the institution can forgo its government subsidy entirely and just crack on on its own.

Charging £18,000 a year, this is precisely what the NCHum is doing. Sort of.

The NCHum is an interesting concept. Regardless of the ‘celebrity’ academics who will teach there, presenting obligatory modules on applied ethics, scientific literacy, and logic and critical thinking seems a sound idea. Having one-with-one tutorials, as mentioned in my first blog, also has benefits for students. Preparing students for the world of work, another NCHum promise, is also fine. It is not as if other universities do not already try to do this.

There are some grey areas surrounding how the college will run, though, that do give pause for thought.

Firstly, the college will offer degrees from the University of London and it will provide students with access to University of London facilities, such as teaching spaces, libraries and so on.

Since the University of London has acquiesced to this, presumably they feel that this is sound business – a kind of outsourcing of various courses that otherwise it might have been unwieldy to put in place.

However, what this means with regard to which students can have access to which resources and at which times remains to be seen. Will the students at NCHum, as a result of their higher outlaying of money, get preferential treatment over other University of London students when the time inevitably comes that they are after the same resources?

Logistically speaking, I query the college’s claim that ‘all applicants’ will be interviewed. This would be a very time-consuming process (and one that Jones, in my first blog, feels discriminates against those who have not had preparation for the interview, even if I – and Canfor-Dumas and Glancy – feel that this might not necessarily be the case), and given that, I wonder whether the claim is really true.

Furthermore, although this is a ‘soft’ point to make, it is unlikely that many of the ‘celebrity’ staff members will be doing the nitty-gritty of one-with-one tutorials week in, week out, meaning that access to the ‘great minds’ teaching at NCHum will be limited at best.

Besides, just because NCHum places 14 high-profile academics in a single, small institution does not mean either that those academics are great teachers, or that other institutions do not have as many, if not more, prominent academics across their disciplines, at least in terms of raw figures.

Indeed, by the time British academics have reached the post of professor, they are all pretty eminent and will undoubtedly have vast swathes of knowledge to pass on, even if their work has not led to TV shows and interviews because it is not ‘sexy.’

Terry Eagleton has argued that the education at NCHum will not necessarily be as open-minded as all that:

“The new college, staffed as it is by such notable liberals, will of course be open to all viewpoints. Well, sort of. One takes it there will not be a theology department. It is reasonable to suppose that Tariq Ali will not be appointed professor of politics. The teaching of history, if the work of Dawkins and Grayling is anything to judge by, will be of a distinctly Whiggish kind. Grayling peddles a Just So version of English history, breathtaking in its crudity and complacency, in which freedom has been on the rise for centuries and has only recently run into trouble. Dawkins touts a simple-minded, off-the-peg version of Enlightenment in which people in the west have all been getting nicer and nicer, and would have ended up as civilised as an Oxford high table were it not for a nasty bunch of religious fundamentalists. Who would pay £18,000 a year to listen to this outdated Victorian rationalism when they could buy themselves a second-hand copy of John Stuart Mill?”

However, even though I reproduce this paragraph in full, it is not as if many universities do not have departments that are characterised by a loosely unified outlook on the world. Perhaps in an outmoded fashion, but I am thinking of ‘Marxist’ campuses from the 1970s and 1980s, or even today, when many a university might spit on, say, Milton Friedman sooner than hear what he has to say were he to turn up for (let alone be invited to give) a guest lecture.

In the same way that I have criticised unthinking Oxbridge-centrism in the previous two posts, there seems to be a vague logic of NCHum-centrism underlying Grayling’s talk of the New College (although he does of course have a new college to try to sell).

A Telegraph article says the following:

“The college claims to offer a ‘new model of higher education for the humanities in the UK’ and will prepare undergraduates for degrees in Law, Economics and humanities subjects including History, Philosophy and English literature.”

Law, economics, history, philosophy and English literature. This is not a list that inspires thoughts of a ‘new model’ of education. It is not, for example, as if NCHum is offering an innovative course in the logic of the digital world – a course that might take in elements from sociology, media, geography, economics, politics, philosophy and history all at the same time (and which might make a good course).

In other words, Grayling seems to be bigging up something as innovative that many universities would call ‘business as usual.’ Indeed, in a witty article in the Standard, some former colleagues at UCL feel that Grayling is not only not offering original courses, but that he has even purloined modules from them.

Furthermore, with 14 celebrity staff members and a small handful of others named on the website, it strikes me that there may not be much choice of modules at NCHum. I’ll return to this below, but even if students get excellent one-with-one tuition on a weekly basis, it seems that – at least in these early stages and before the college has had a chance to grow – the model of education here is pretty much a top-down one, as opposed to the bottom-up quest of inspiring students to learn that many other universities try to offer.

This may seem a frivolous, or indeed a poor, point to make: “get some discipline in them, that’s what these youngsters need.” But then again, this also suggests that the NCHum already/so far has something loosely approaching a one-size-fits-all ethos that does not necessarily tally at all with, in Grayling’s own words, what it means to be human today.

I note, by the way, that NCHum is not offering film. This is a pity, because with some film expertise, be it in criticism or production, they might well have been able to produce a video for their site that does not cut Grayling off mid-sentence at the end of his introduction to the college.

In the same Telegraph article, Grayling is quoted as saying: “Our ambition is to prepare gifted young people for high-level careers and rich and satisfying lives.”

The use of the word ‘rich’ is almost certainly intended in the sense of ‘diverse’ and ‘full’ – but it also betrays the central ethos behind the institution, which is my only real beef with it: its commercial-mindedness.

Grayling and his colleagues, being smart people, will no doubt be aware that we are living in times in which ecology plays a major part in our thinking. He will also be aware that while there is plenty of seeming evidence to suggest that we must pay urgent attention to our environment and become better citizens, this discourse is also the product of various processes, including the spread of ideas via the media – ideas that Grayling’s colleague, Richard Dawkins, would call ‘memes.’

It is not that our planet is not in trouble; but Grayling’s appeal to ‘humanity’ bespeaks a sense of opportunism, in terms of how he wields the term, that is timely. It is a canny riding of the meme wave. How can I be so cynical, you might ask?

I am inclined towards this cynical interpretation of his words (that is, I don’t believe him), because this is also a university that is preparing ‘rich’ people, to use Grayling’s other term.

Rich people can be responsibly rich. Just because historically it has been the relentless pursuit of riches that has led to, among other things, colonialism, widespread global poverty, slavery, and war (although this is not how Grayling’s other colleague, Niall Ferguson, reads history), this does not mean that it must always be so. In fact, to give Ferguson his due, it is hard for us to know whether the world would be any more or less civilised without the ascent of money. But at the very least there is a tension between pursuing humanity and pursuing riches.

Maybe Grayling is indeed hoping to prepare a group of super-enlightened students whose ‘high level careers’ in fact help to bring about the redistribution of wealth and opportunities. But the hierarchy implicit in ‘high level careers’ does not bode well.

I criticised Owen Jones in part one of this blog for seeming to love Oxbridge, while at the same time hating it. His logic seemed to be that you have not proven yourself excellent if you have not been to Oxbridge, perhaps even that only Oxbridge people can be excellent.

And yet, this logic of exceptionalism is something to be guarded against, or wary about: maybe enlightened people are paradoxically exceptional by not succumbing to the logic of exceptionalism. They are perfectly, adequately, perhaps even exceptionally intelligent, but they simply do not want to go to Oxbridge; they perhaps do not want to be bankers or lawyers or management consultants; maybe they want to be park wardens, gardeners, electricians – who knows? Or maybe they simply do not want to give up on many of the things that they believe in (‘happiness’) in order to do well in the world (or, perhaps more accurately, to be seen to do well in the world).

Once one does well – or is seen to do well – it must be difficult not to believe that one is doing well. If other people can see that I am doing well – and tell me as much – then I must be doing well, or so the logic goes. It’s not that these people are not doing well, nor that they should necessarily believe that they are not doing well. But I am saying this to propose that I can understand why people end up believing in their own exceptional nature – be you AC Grayling or one of his students.

The twin forces of luck and fate – luck in that you were in the right place at the right time; fate in that you were the kind of person whose chances of being in the right place at the right time were maximised from birth – get quickly forgotten. It was all the exceptional person’s doing – or so it is to be believed. History is full of great men (sic.).

However, exceptionalism does not tally that easily with humanism – which, at least for me, implies a sense of democracy and the kind of concepts that the French put in their constitution. A ‘new Oxbridge’ based on the logic of exceptionalism takes us away from the logic of humanist togetherness…

The pledge to help students to understand humanity, then, seems to work only for certain humans. This much is affirmed by the price tag of the university, which, as mentioned, is £18,000 per year.

Indeed, while part of me would want to teach at the NCHum, and while part of me would also want to study there, it is the price tag that is the real kicker.

Not because I cannot afford it (although I cannot). But because this college privatises education.

Privatisation is the retreat from the public. More particularly, it is a retreat from the common – the common wealth, the common good. The private wealth, the private good – well, we have had logicians and economists who have argued that these things are in fact the path to the common wealth and the common good.

And it is people like Milton Friedman whose privatisation-based policies have led to the increasing disparities in economic wealth that the world has seen accelerate over the last three decades and more.

Privatisation is to embrace solipsism. It is to deny that we are in this world together, and it is to fall for the notion that one is, or must be, exceptional. Exceptions tend not to believe that their exceptional status is illusory; they also tend to forget that it is only thanks to the tacit permission of others that their exceptional status can come into existence in the first place.

Possibly greater levels of privatisation will lead to a common good – because the majority will be able to take no more, and will remind those that have that they only have because of the people who have not (and because the people who have not – sometimes even out of kindness – let them).

With regard to NCHum, then, we might look at a BBC article, which quotes UCU general secretary Sally Hunt as saying the following:

“While many would love the opportunity to be taught by the likes of AC Grayling and Richard Dawkins, at £18,000 a go it seems it won’t be the very brightest but those with the deepest pockets who are afforded the chance. The launch of this college highlights the government’s failure to protect art and humanities and is further proof that its university funding plans will entrench inequality within higher education.”

In other words, the privatisation of higher education, which the creation of the NCHum seems to signal, might well, even in spite of scholarships on offer (i.e. in spite of ‘exceptions’), lead to the acceleration of the creation of a two-tier education system that has as its pre-existing counterparts the private and public sectors of secondary and primary education.

An article from the London Review of Books, already mentioned in one of the previous posts, quotes Jonathan Cole, the former provost and dean of faculties at Columbia, as writing that

“in addition to fee inflation, a major contributor to the increased cost of higher education in America stems from the perverse assumption that students are ‘customers’, that the customer is always right, and what he or she demands must be purchased. Money is well-spent on psychological counselling, but the number of offices that focus on student activities, athletics and athletic facilities, summer job placement and outsourced dining services, to say nothing of the dormitory rooms and suites that only the Four Seasons can match, leads to an expansion of administrators and increased cost of administration.”

This is from an article that I read before the announcement of NCHum, and which was arguing that British higher education should not look to the American Ivy League as its model.

In the face of the creation of the NCHum, the students had better get their dollar’s worth. But more important is the fact that if one institution goes private, then perhaps others will follow, and only those institutions that can afford all of the above ‘services’ for their ‘customers’ will stand a chance of surviving, which puts in peril the hopes of many students who may find themselves unable to go, or at least put off from going, to university for financial reasons.

The NCHum is seemingly a private institution backed by some London investors. Do these investors get much say in the curriculum? While Grayling says that he wants his students to develop critical thinking, might these backers in fact want the institution to develop a certain kind of brain that will be good for [a certain type of] business as the students graduate into jobs at these self-same firms that sponsor the institution?

A friend who used to work at Lehman Brothers once told me that their HR team did not bother to look at job candidates who had a PhD or equivalent. The reason he gave me was that, unless the PhD was in maths or economics, the chances were that the candidate would think too independently. Lehmans, allegedly, preferred to hire younger graduates whose minds they could mould according to the Lehman ethos.

Then again, rather than standing as anecdotal evidence for the fact that banks only want a certain ‘type of brain,’ this might explain why Lehmans went bust…

Finally, Howard Hotson in his LRB piece explains that “the American company that owns BPP University College – which David Willetts granted university status only last year – recently lost its appeal in the US Supreme Court after being found guilty of defrauding its shareholders and is under investigation by the US Higher Learning Commission for deceiving students about the career value of its degrees.”

Earlier I explained how there is a system of peer review, external examination and various other mechanisms that mean that universities in the UK have to work together, even if they are also in competition for limited resources.

The privatisation of education can also lead – as happens in big pharma and the like – to the production of only a certain type of knowledge, one based on a particular agenda and ideology, and one which does not, in spite of pretences, have much ‘objective’ truth status (what is truth?). It also means that institutions can presumably – although I want to be corrected if I am wrong on this score – ignore the edicts of colleagues from other institutions. They can, as BPP did, defraud shareholders and, more pertinently, deceive students about the value of their degree.

In the current climate, it is hard to be sure which degrees are ‘value for money,’ not least because so much of that value must rely on the perceptions of the students themselves.

Evidently, the company that owns BPP has also been caught out – so there are mechanisms in place to stop this from happening, in the USA at least. However, this only points to the possibility that the lack of transparency that comes with privatisation will lead to some form of corruption – with higher education the ultimate loser.

Maybe all I must conclude is that I wish NCHum luck. In the prisoner’s dilemma that is the current state of higher education, you pushed the button to get the bigger reward first, which in some circles is the logical thing to do.

In a world in which we are together, though, and in which the emergence of humanity is tied to the origins of virtue, as much as it is to the (deeply misunderstood?) selfish gene, then humanity is our common wealth – not some of it, but all of it.

If I expressed fear that the real problem with higher education is not Oxbridge but that people who are already rich typically end up even richer, then perhaps Oxbridge, and even the NCHum, offer nothing to the rich kids who can afford to study there that life would not offer them anyway (more riches). In this sense, if NCHum takes in rich kids to churn out adults that will get richer, what has it really taught anyone?

Taking in students more democratically – now so hard to do in the age of top-up fees – and encouraging students from all manner of diverse backgrounds to become better humans, more together both in themselves and in the world, to encourage them to learn not just new things, but new ways of learning, new ways of thinking, the likes of which we have not even begun to conceive – this might well be priceless and real value for money.

No dark sarcasm (two)

Blogpost, Uncategorized

As a follow-up to the last blog, I should now like to address a second article, which was itself a follow-up to Owen Jones’ LabourList.org posting.

Alex Canfor-Dumas provides, with Josh Glancy, a riposte to Jones, also on LabourList.org, and in many ways it is a perfectly reasonable exposé of some of the logical shortcomings that I also found in Jones’ piece – namely that it cannot help but love Oxbridge in spite of the fact that it also wants to abolish it.

However, Canfor-Dumas, like Jones, also succumbs (inevitably, perhaps, because he is studying there) to some Oxbridge-centrism.

He concludes his piece as follows:

“Beneath the privileged veneer of drinking clubs and anachronistic formal clothing, Oxbridge provides an exceptional educational experience that is both available to and embraced by students from a wide range of backgrounds. There is an environment of scholarship and a culture that celebrates original thought and glorifies academic achievement. Britain needs more, not less, of these qualities. Oxbridge is not perfect, but it is here to stay – and rightly so.”

“Exceptional” cannot but find its way into the article come its climax. For the myth of being exceptional is what keeps Oxbridge alive. Woven into this understanding of Oxbridge, then, is an apparently unquestionable value attributed not to what should be general, but to that which is ‘exceptional’ or ‘out of the ordinary.’

In my last post, I explained how there are grounds to argue that Oxbridge is not particularly exceptional – and I’d like to pursue this further here.

For, Canfor-Dumas (and Glancy) end their post by saying that Oxbridge has “an environment of scholarship and a culture that celebrates original thought and glorifies academic achievement” – and that the UK needs more of this!

What this statement implies is that Oxbridge is the only beacon of academic excellence in the UK – and that the other 150-odd higher education institutions can go hang themselves. Apparently (if I can take this interpretation further) they somehow do not encourage (I’m not sure I want to use the word ‘glorify’) academic achievement. This is the preserve of Oxford and Cambridge, whose good example should be followed.

So… what are those other 150+ institutions doing? What are they doing with all of those really smart kids that do not even want to go to Oxbridge? One would fear somehow that those other institutions (and I take it that mine should be included here) are not only not encouraging academic excellence – but that they are even stifling it.

Since I work at a higher education institution that is not Oxbridge, I have to take exception to this. And since neither Canfor-Dumas nor Glancy sheds much light on this matter in their article, I shall endeavour to do so here.

In my last blog, I wrote about how students tend to be proud of their alma mater, no matter who she is. Of course, Oxbridge students are no different – so Oxbridge-centrism is as natural to Oxbridge students as egocentrism, or the ability only to see things from one’s own point of view, is natural to humans. We can read, see, hear, and feel many different things, but since we only ever have our own bodies through which we can filter the world, we will always only ever experience the world for ourselves.

I am partial to a recent advert, or ‘manifesto’ as its producers call it, for Bacardi rum. Not only is this because I like Bacardi, but it is also because, simple (simplistic?) as its message is, it says that we are all together, and that we should act not in a solipsistic manner (I enjoy the idea of cutting up white headphone leads) but together.

Even though this is an advertisement (though perhaps significantly an advertisement for a substance that is what we might call mind-altering when consumed in enough quantity), I want to agree with its secondary ‘message.’ If its first message is ‘buy Bacardi,’ which people can take or leave, its secondary message is, as mentioned, that we are all together.

Thinking in terms of togetherness, or ‘withness,’ then, is what I want to explore here in terms of higher education. What is posited in both Jones and Canfor-Dumas’ postings is the sense that it is to Oxbridge alone that the burden of excellent education must fall.

This is simply not true, as the 150+ other higher education institutions in the UK testify. We can therefore rework the Oxbridge-centrism that Canfor-Dumas and Glancy (cannot help but?) display when they write, as I shall repeat, that: “Oxbridge provides an exceptional educational experience that is both available to and embraced by students from a wide range of backgrounds. There is an environment of scholarship and a culture that celebrates original thought and glorifies academic achievement.”

In its place, we can say that higher education provides an excellent educational experience that is both available to and embraced by students from a wide range of backgrounds. There is throughout higher education an environment of scholarship and a culture that celebrates original thought and glorifies academic achievement.

I have replaced Canfor-Dumas and Glancy’s use of the term exceptional here with the term excellent. I have changed this precisely because Oxbridge is not exceptional in pursuing these tasks. It is perfectly normal for Oxbridge to pursue scholarship and original thought, but that is because all higher education establishments pursue these goals – and Oxbridge is included among these higher education establishments.

I am a lecturer in film. You can, as far as I am aware, study film at Oxbridge – both at undergraduate and graduate levels. However, the opportunities to do so are relatively limited: it forms part of Modern Languages and English degrees. That is, you can do the odd module on film. What is more, Oxford for one has a Master’s in Film Aesthetics, while there are in British film studies academia a number of lecturers who did their doctorates at Oxford or Cambridge.

I don’t want to rehearse in great detail why I think film is a legitimate course of study, since this is not my point here.

Briefly, though, I think it is of pressing importance to study film and audiovisual culture more generally because they are such all-pervasive phenomena in our screen-filled world. These media often do not encourage us to question what they show to us, nor their legitimacy as a whole. University (and school) courses in film and media, then, offer us the opportunity to encourage critical thought with regard to film and media.

Critical thought itself is a valuable commodity – while understanding how and why images and sounds work in terms of creating meaning is also of great value to those who precisely wish to have, for example, a recognised brand in the world.

As such, not only is the study of film of some cultural urgency (because it dominates so much of our attention and, by extension, our thought), but it also endows one with skills that transfer well to all aspects of life – and, for the sake of placating anxious students, to the business world/the world of work in particular.

To take an example: some people, Margaret Hodge included, have argued that to study film and media is to undertake a ‘Mickey Mouse’ course. That a term from the media (‘Mickey Mouse’) is used to define such courses demonstrates quite clearly the extent to which we freely and unthinkingly use media terms in everyday discourse. That is, so pervasive is, in this instance, Mickey Mouse, that we use him to dismiss such courses – and yet to couch the criticism in these terms precisely reasserts the need to think through what Mickey Mouse is and what influence he has had on our society.

I can imagine that a thorough analysis of Mickey Mouse alone would have to take in the following, if not more: the history of art, particularly animated drawing styles; the history of film; sociological aspects of what a mouse means; psychological and cognitive aspects, in terms of how and why the image is pleasing enough to have the popularity that it does; a theoretical understanding of mass media, such that we can work out how the image of Mickey has become globally recognisable; an economic history of the deals that similarly have allowed Mickey to become dominant as a global brand. And so on.

I arrived at film from a background in Modern Languages and then in Philosophy. Even though I want to turn my back on neither of these subjects, and in fact feel that it is truly important not only to maintain an interest in these, but to expand my interests further outwards, I find that film is the culmination of more or less all subjects and disciplines.

The reason that I give my personal journey towards film is to posit that I see it not as a lesser field of enquiry than the more traditional subjects that I studied before it. The range of approaches that one could take to Mickey Mouse alone also suggests that to study film is to have to take on a massive amount of information that in fact by definition transcends, or better incorporates, many academic disciplines. And therefore that while film might be considered a ‘soft’ topic in some corners, it is in fact very hard.

Indeed, while some film scholarship admittedly does not help itself in terms of being accessible and ‘relevant,’ many students find film in particular a disillusioning object of study – precisely because it is not as easy as they thought it was going to be! As is often the case, that which on appearance seems most ‘natural’ and ‘easy’ is in fact the most complex thing.

And the reason that I want to make this point is to bring us back to the discussion of Oxbridge-centrism raised earlier. If one cannot study film in any great detail at Oxbridge, at least not really until postgraduate level, and if you are convinced that we should spend more time studying film and audiovisual media more generally, because they are dominant in terms of shaping how we understand and interpret the world, then Oxbridge is probably not where you want to go to study for your degree.

We could knock Oxbridge for not providing a full undergraduate programme in film – but this would be counter-productive, because it would re-affirm that it is only when Oxbridge does something that it is truly ‘valid.’ As if UCL were not responsible for pioneering the study of English literature; and as if Birmingham were not a true pioneer of Cultural Studies (also not really available at Oxbridge).

Better than knocking Oxbridge, though, would simply be to recognise that all higher education institutions – even if with economic incentives working somewhere under the surface (they have these at Oxbridge, too) – encourage critical and original thought. And going to any or many universities can help provide students with the means to develop their capacity for critical and original thought.

Let me put this another way. Film is, for me at least, of vital importance for us to study. Not only can you not really study this at Oxbridge at undergraduate level, but even if you could, you might as well go where the teaching is the best that you will receive in the UK. I will sound arrogant when I say this, but part of me means it: if you want to receive the best tuition in film in the UK, maybe even in the world, then you might as well come to my university, Roehampton, to learn with me. It is not that my students miss out on Oxbridge, then, but that Oxbridge students miss out on me.

Naturally, my egocentrism here is obvious to the point of easy ridicule. But it serves to illustrate the Oxbridge-centrism that clings to the idea that Oxbridge is the be-all and end-all of higher education.

I have not met everyone working in film studies in the UK, but after a fairly aggressive few years of conference attending and networking, I can say that I know a good representative crowd. Rarely do I find among them academics who would not hold their own in more or less any academic forum, particularly on their area of expertise (of course).

I don’t know who is the ‘smartest’ of them all (which is not at all the same as being the best teacher, anyway), but I find the vast majority of these academics to be smart – certainly smart enough to challenge me and, I guess, a good many if not most 18-year-olds. And this is regardless of the institution at which they teach.

In fact, I know a good number of the Oxbridge staff members who teach film studies. They are all smart, too; but, again, not necessarily more or less smart than a lot of other people at other universities in the UK.

Having a doctorate is basically a necessity nowadays to be an academic; those few academics who do not have a doctorate are typically quite senior and the product of an era in which some students went straight from undergraduate study to higher education teaching. Back in the ‘cowboy’ days of the 1950s and 1960s, this might have been possible. But not nowadays. And even these academics are smart, too, even if they don’t have the requisite qualifications.

The reason that I mention this is that you cannot just get a doctorate like that. You must present your work to a panel of academics both from your institution and from at least one other. The idea is to ensure standards across the board – be you from Oxford, Cambridge, Roehampton (my employer), or Worcester (my hometown).

As such – at least in theory – smartness, while not necessarily being equal, at least meets what is perceived to be the minimum requirement not just at Oxbridge, but at all higher education institutions.

What is more, all universities have external examiners and peer review panels made up of academics from other institutions, including Oxford and Cambridge, who judge standards of teaching, as well as procedural and structural standards. These are also designed to ensure parity across all institutions.

It is not that we should think about which academics are best, then. If you want academics to nominate who is best, then each academic might as well nominate themselves (as I did above). Rather than ranking themselves, though, academics might be better off remembering that they are in this together.

Contra Canfor-Dumas and Glancy, then, we should have good faith that all academics are doing their best; they are all – perhaps with some exceptions (though arguably at Oxbridge as much as anywhere else) – good at what they do; and they all take pride in doing it.

Besides which, none of this has yet included the students, who will respond to different teachers in different ways and at different moments in their life. As such, who can know who is the ‘best’?

Well, we can all apparently know who is the ‘best’ – because we live in a country that obsesses over league tables and the like. What is more, academics take part in a particular process called the Research Assessment Exercise (RAE), now redubbed the Research Excellence Framework (REF). The National Student Survey (NSS) also judges which university offers the best student experience – and produces a league table with the results.

With these and other surveys taking place, there is in fact a cacophony of results, with each university (naturally) bigging up those tables in which they do well, while keeping hushed about those in which they do not.

This article suggests that the RAE and the REF apply different enough criteria that the same institutions during the same period of time can have quite different scores. It is surely difficult to know which is the best way of measuring the ‘best.’ I look at such surveys with some scepticism.

In spite of this, one might still contend that Oxbridge are excellent because they do well in many of these surveys across the board. There must be some reason that they manage always to do well…

History no doubt plays a part here. When you happen to be several hundred years older than many of the other universities in the UK, then the accumulated wealth, procedures, facilities, and, for example, collections of materials (I mean books in libraries, etc), do of course help when comparing this to other institutions.

Oxbridge are excellent universities. I am contending, however, that their excellence is not unique to them. This article suggests, among other things, that the UK in fact has the best academic system worldwide when we consider not just the top universities but the top 300 or so. Our money goes further and gets better results than American universities when we take into account GDP, league tables, and various other factors.

Bizarrely, the rest of the world does not feature much – suggesting some possible (likely?) Anglo-American-centrism in the tables that the article analyses.

However, I mention it to say that if we factor in time/history, then the UK universities do even better. Or rather, Oxbridge of course does well, but perhaps not as well as it should do in terms of the advantage that its age simply has/should give to it.

What might also be interesting is how these results would read when compared with the socio-economic status of the average student at the university. This is a slightly different question to the fees structure, but I’d be interested to see both which universities cater for the richest kids, and whether those universities offer significantly better results for one’s sterling.

However, the focus here is to say that academic excellence is not confined to Oxbridge, excellent though Oxbridge is. Even if academic excellence were confined to Oxbridge, as seems to be Canfor-Dumas and Glancy’s take on things, the ‘Oxbridge ethos’ of encouraging academic excellence everywhere could presumably only be put into practice by recognising, encouraging and nurturing the development of other academic institutions. It would be folly to think that one could provide the entire nation’s higher education in just two provincial cities.

To this end, it is only logical for Canfor-Dumas and Glancy to extend their argument not towards abolishing Oxbridge, but towards spreading the academic love around the UK more generally.

Fortunately for the two authors, this has already happened. What could happen a bit more in recognition of the achievements and aspirations of other universities, though, is a loosening of the perception that only Oxbridge counts.

Instead of such Oxbridge-centrism, which is reflected in the media’s obsession with Oxbridge more generally, we might see the excellence that takes place everywhere: the excellence of smaller universities that, in spite of the material and historical disadvantages that they have, still pull their weight.

All universities are together in pursuit of excellence – both in terms of the staff members and their research and in terms of encouraging students to do the same. It is an act of solipsism – though one to which we are all susceptible – to feel that one is the only person really working hard here. As an act of solipsism, one feels one ought to cut those white headphone leads and tell them that we others are here and we are doing our best, even if in slightly different ways.

As an act of solipsism, believing that other universities might not want to or simply are not encouraging academic excellence is also an act of bad faith.

One can understand why there is bad faith around: resources for universities are scarce and becoming only more so. In this climate, we distrust others, since they might well be scheming to get as much back from as little investment as possible – the academic equivalent of the welfare scrounger.

Furthermore, scarce resources mean that those who have them guard them jealously – part of their reasoning naturally becoming that somehow they deserve those resources for some cosmic reason often attributed to their exceptional nature.

Good faith, however, believes that others can and will do their best – and that they should not be unfairly treated as a result. It believes that we are all in this together. And if we recognise this much, then maybe we will be able to encourage the excellence, or even to realise the potential, not solely of a/the few, but of the many, perhaps even (as an aspiration) of us all.

No dark sarcasm (one)

Blogpost, Film education, Uncategorized

I have been reading a number of things about higher education in the UK recently – which stands to reason as a result of recent changes in fee structures and the like. And, of course, as a result of the creation of NCHum, set up by AC Grayling n’ chums (sorry), and due to open in Bloomsbury in 2012.

I think this blog is about some of the fuzzy logic that seems to exist – for me – when discussing these issues.

Let’s start with Owen Jones’ posting on LabourList.org about how Oxbridge should be abolished.

Oxbridge is well known as a preserve of people whose parents are from wealthy backgrounds and who will more than likely end up wealthy themselves. In spite of the universities’ efforts, the (lack of) diversity in the student body is not particularly changing, says Jones, in part because rich kids can prepare more easily for the entrance interview as a result of the fact that they are rich.

I would be disappointed if any school did not offer help to students in preparing for university interviews – or any interviews – and I don’t see how money makes that much difference. Except, perhaps, that one can buy ‘better’ preparation…?

There are no stats on this, so forgive my speculation, but I suspect that more people ‘waste’ their money on pursuing such expensive preparation by not getting in to Oxbridge than there are state school students with supposedly no preparation who apply and are accepted. To be substantiated, of course – but the point is to query precisely what role money might play in this process.

Jones points out that he met thickos when he was studying at Oxford – and that he’s met smart people outside of Oxford. Aside from the fact that neither of these things should surprise Jones at all – if, that is, he wishes to retain the habit of imposing judgments on people according to what he perceives their abilities to be – this only suggests that Jones suffers from the Oxbridge graduate’s snobbish sense that it is almost impossible for non-Oxbridge people to be smart.

And yet Jones also writes this:

“Many bright young people from comprehensives simply do not want to go to Oxbridge, because they don’t want to spend their university years stuck with those they fear will be arrogant, braying, overprivileged youngsters who may as well have grown up on a different planet. That might be unfair, but that’s certainly how many feel.”

Whether or not the above conceptions of Oxbridge actually are unfair in terms of what Oxford is like, this paragraph further muddies the point of Jones’ article. He seems to be reprimanding Oxbridge for not attracting (exceptional) students (from ‘normal’ backgrounds), while pointing out that such students/would-be students don’t want to go there.

Concentrating on the first half of this equation, then, Jones seems to be saying that it is a pity for people from ‘normal’ backgrounds not to be going to Oxbridge. That is, despite wanting to abolish Oxbridge, there is a deep-seated belief in the article about Oxbridge’s superiority over other universities (in the UK). That is, Oxbridge here does offer something exceptional to which a more diverse body of students should apparently have access, but they don’t – so Oxbridge should be abolished.

And yet, if many bright students actively do not want to go to Oxbridge, but instead go to other universities, then presumably these other institutions get a fair share of brightness. That is, if brightness is spread democratically throughout the British or any population, then it is not necessarily the brightness of the students at Oxbridge that makes it exceptional.

In this respect, one should not care what university one goes to – or even if one goes to university at all. Smart people are smart people and that is all there is to it. Or rather, everyone is smart in their own way. One does not need Oxbridge to validate this, and it is not unfair if Oxbridge does not validate this.

Jones, however, seems to suggest that everyone who does not make it to Oxbridge somehow seems to have missed out. And yet, whenever I step out on to the street and I see people wearing hoodies, shirts, and all sorts of other garments that speak of their alma mater, I see a huge range of institutions walking around the streets of London and elsewhere. That is, students seem proud of their alma mater no matter who she is. Not everyone is walking around thinking that they ought to be wearing Oxford or Cambridge stash. So perhaps Jones might refrain from his Oxbridge-centrism in believing that there are only two universities in the UK.

I do have some sympathy for Jones’ argument, but that sympathy must be based on the perceived fact that Oxbridge does offer something different to other universities, a difference that can also at times be perceived as ‘better.’

Not going to Oxbridge does not prevent people from succeeding. Not going to university, in fact, does not prevent people from succeeding – no matter how we define success, even if the prime measure of success tends to be financial.

However, given Jones’ mention of the history of Oxbridge and its record of producing British and world leaders in all domains of existence, then it is hard to deny that something different is there.

What is this difference?

Oxbridge students work notoriously ‘harder’ than students at many other institutions. For example, they must produce the amount of work in a fortnight that many other students have to produce in a semester. I therefore suspect that Oxbridge encourages not just time for reflection, which can often be offered as a justification for university tout court, but it also enforces hard work, pure and simple. Not an aptitude, but the acquired habit of hard work, then, might be a measure of this supposed difference.

There are issues to explore here about what ‘hard work’ means, though. Producing more work does not necessarily mean producing better work, even if practising the ‘art’ of essay writing and other forms of assessment almost certainly does lead to improvements in quality. Practice, then, is key.

However, spending longer on fewer essays is also a form of practice – a form of making good that which one has time to make good, rather than rushing off essays at a rate of two per week (or whatever it may be). In this sense, I genuinely believe that all universities encourage (if not so much enforce) their students to work hard. This, then, is not for me the ‘difference.’

Oxbridge typically has students attend lectures, seminars and tutorials. The latter in particular can see students work in ‘groups’ of as few as one person with one tutor. This kind of personal attention might also set Oxbridge apart from many other institutions.

However, I am inclined not to believe this. This is not because students do not benefit from one-with-one tutorials. I think that they do – although I also think that students can benefit in different ways from group sessions in which they exchange ideas amongst each other.

I believe that this does not make Oxbridge that different, however, because many universities in fact offer – at least to those who ask for them (and many do) – one-with-one sessions for their students. Staff members feel hard pushed to decline calls for private tutorial sessions because they know that the student is paying. So in some senses this is something that other universities offer.

(Note that I am guarded here: ‘in some senses’ is supposed to suggest that there are at least more similarities between Oxbridge and other institutions than there are differences.)

So, if Oxbridge graduates dominate the halls of power within the UK and further afield, in a manner far more significant than any other university or set of universities in this country, then what leads to this is again something else.

At a wedding (in Oxfordshire) this weekend, I was ushering vehicles to the official car park with a friend when one of the guests drove past. My friend knew this person (I did not), and he told me that the guest had performed poorly in his A Levels (two Ns) – the reason this came up being that it was way back at school that my friend had last seen this person.

The poor A Level results had (resits taken for granted) not prevented this wedding guest from turning up in a relatively new BMW that almost certainly will remain beyond my means for many years to come.

This is anecdotal evidence at best, but obviously being the kind of person that Owen Jones would probably find a bit dim had not prevented this human being from going on to make – or at least give the appearance of making – a decent living. (And being able to appear well off requires a fair amount of money in and of itself.)

Talking later on to this person, however, it became apparent that he was what many people would call posh.

I have recently felt at times that I have made a grave mistake in going into higher education as a career. I have friends – not least the other usher with whom I was directing cars to the wedding car park – who have made swathes of cash that will to a high degree of probability elude me until I breathe no more.

Since I am in a society that measures success so emphatically by wealth, it is hard not to be affected by its logic – and in this sense, I fear that I should have made money as a lawyer or a management consultant – because I simply cannot keep up now with the high-spending lifestyles of many of my friends, who become non-friends because I cannot afford to see them as regularly as I would want to.

The reason that I have introduced this aside about money is that the wedding guest was – according to available reports – not academically that sharp, but he was – from the evidence presented to me – from a relatively wealthy background.

The reason for this bracketing aside about the wedding, then, is to say that what sets Oxbridge apart is not strictly its structure of education, because I have argued that its structure of education is not necessarily that unique, and therefore not that different from many other institutions.

Without going to Oxbridge, this wedding guest has (I am arguing here) made lots of money. What he has in common with many successful Oxbridge graduates, then, is a wealthy background.

Brains help you to make money. But brains are not necessary for making money. What best helps people to make money is having it in the first place. It is not that Oxbridge students are particularly more clever or particularly better prepared for the ‘real’ world, then. Or this is what I am arguing here. What many Oxbridge students have, though, is a wealthy background.

Abolishing Oxbridge will not change a system in which the rich get richer and the poor get poorer. Oxbridge could offer more scholarships to more students from poorer backgrounds. Some would no doubt benefit from this by being ‘better off’ in later life – but mainly because of the connections they will have made or can make with those who are already from wealthy backgrounds.

This kind of ‘exceptionalism’ (by which I mean that those students who follow this route would be ‘exceptional’ people, and therefore already divorced from the ‘unexceptional’ rest) does not help to change a system that is inherently conservative, in that money is what you need to make money and to gain access to the corridors of power.

But this system of wealth breeding wealth would continue unabated whether Oxbridge existed or not. Oxbridge is simply a symbol, then, of one’s pre-existing socio-economic status. If you want to be a part of the wealthy classes, then why not try to get in at the university level and go to Oxbridge?

However, while Oxbridge might help (as might many, if not any, other university degree), being from a ‘lower’ socio-economic background will be the main hindrance to achieving wealth. Perhaps many Oxbridge graduates make more money than they might have without their degree, but the real issue at stake is that those who already have power are the ones who gain/retain power.

Exceptionalism does not redistribute power; it simply confirms its being confined to certain spheres (those who already have it); there might be slight changes in the personnel in possession of power, but power is still the jealously guarded preserve of the few. Given that brains are evenly distributed – as proposed earlier – power and brains do not correlate to each other, nor even link up causally.

I’d like to explore one more aspect of our system of conservatism. The truth of wealth breeding wealth is systematically hidden from our population – not least because of its enormous emphasis on exceptionalism, which gives the illusion that everyone stands an ‘equal’ chance of gaining access to it.

Given the lack of encouragement to make this realisation (that power is not distributed evenly), it is not surprising that many youngsters – or even older people – do not win much power. In fact, we are systematically encouraged to accept everything as it is now.

Unsurprisingly, then, kids of 18 years of age do not necessarily know that there is a game of power, nor even how they will play it. Without knowing that they are being played, then, many continue through life – perhaps until expiry – without ever realising that they were supporting the increasing grip on power by those who had it already. The majority of people are not empowered – and a major part of their disempowerment is not even realising it.

The above is not to say that, after Jones, everyone should apply to Oxbridge; one cannot burden two, or even 150, higher education institutions with the task of redistributing power.

Nor is the above to say that abolishing Oxbridge will redress this (im)balance of power.

Oxbridge, or rather various of its alumni like Jones, should stop wringing their hands with the higher educational equivalent of white man’s burden, whereby only Oxbridge can save higher education and if it cannot then higher education as a whole should be damned.

Students wear their university brand and love their alma mater whoever she may be – and they are correct to do this. It shows that people care about their higher education and what it can offer, and those few people who believe that you are no one if you weren’t at Oxbridge can go forth and multiply.

All universities (as I shall argue in the next part of this blog) are by and large equal. What university you go to, or that you go to university at all, is not going to make or break you – even if all university experiences tend to be immensely formative for those who undertake them.

What makes and breaks you is how much you have already. Oxford University hoodies and t-shirts are sold all over London and further afield – while stash from my current employer does not feature too highly in the tourist shops around town. This stash is bought by tourists not because of the brains they have or do not have, but because of the aspirations to power that such a label implies. It is a show of money, be it real or proclaimed, as much as a BMW is.

All of these ‘fake’ Oxbridge graduates who wear the t-shirt but never actually attended the universities reveal a truth, then, about the universities themselves: they are a brand, a spectacle of power, that helps to convince other people that more power should be given to those who already (seem to) have it.

Don’t get me wrong; this is a complex issue, the full complexity of which I have not got to grips with here. But if Oxbridge is the icon, then the real deity is power itself. Closing Oxbridge would not change anything; power would simply rebrand and perhaps relocate. Convincing the world that this power is there for the taking, that it can be distributed evenly, perhaps, with or without Oxbridge’s existence, is the real task of higher education.

Notes from CPH PIX: Road to Nowhere (Monte Hellman, USA, 2010)

American cinema, Blogpost, cph pix 2011, Uncategorized

Having just waxed lyrical about the joys of seeing unexpected films at any point in time, but perhaps at film festivals in particular, it might sound contrary to now write a blog that in part is about disappointment – although the two feelings go somewhat hand in hand.

Monte Hellman is something of a cult figure, and someone about whom I certainly have read more than I have seen. That is, I have seen Two-Lane Blacktop (USA, 1971), which some feel is perhaps the finest ‘underground’ American film of all time – and I did like it, not least for its pure obsession with cars and engineering and to hell really with plot.

But as far as seeing Hellman’s other output goes, that is it. I have not seen – though I do really want to see – Cockfighter (USA, 1974), for example, if for no other reason than to see the film the (apocryphal) tagline of which is ‘He came into town with his cock in hand, and what he did with it was illegal in 49 states.’

For a career that now spans 50+ years, for which the first feature was Beast from Haunted Cave (USA, 1959), Hellman has not made that much in the way of feature-length films. Alongside the few already mentioned, there are some 1960s westerns, including Ride in the Whirlwind (USA, 1966) and The Shooting (USA, 1966), and then Iguana (Italy/Spain/Switzerland/USA, 1988), the last feature that he made.

A Roger Corman protégé, Monte Hellman has in fact been relatively unproductive, given how intense the output from Corman and his acolytes was, particularly in the late 1960s (when Hellman was, admittedly, at his most active, it seems).

Anyway, given that this is his first feature in 22 years, given that I have only seen one of his films before, given that I liked it (though not as much as some people), and given that he has a magnificent reputation, I was expecting great things from Road to Nowhere.

Disappointment is a sensation one can often have at the movies. In fact, since a lot of my research is on digital technology and cinema, I often find myself in front of special effects rubbish that really I ought to have known better than to watch – especially at this stage in life – and which – as is to be expected – was not the film I hoped it would be. If I get the chance to blog about it, perhaps I can elaborate on this feeling with regard to the recent Sucker Punch (Zack Snyder, USA/Canada, 2011), which not only disappointed me (though to be expected from Zack Snyder), but in fact also appalled me in certain respects (though perhaps to be expected from Zack Snyder).

Don’t get me wrong: there could be a perverse satisfaction in being disappointed – I am fully prepared to admit it. But whether I go looking for disappointment or not, this does not mean that I do not feel it.

Strangely, disappointment is about the most negative feeling I feel towards any film, or at least I don’t feel much worse about a film for very long. Sucker Punch might have appalled me at moments, but I don’t really hate it; I am just… disappointed. Since I cannot pinpoint with more finesse my feelings, perhaps this feeling is unclear to some. But I suppose it is like wishing the best out of one’s team members, only to find that they are cynical players who either do not care or, worse, will cheat to win.

That said, there are grades of disappointment. Sucker Punch can disappoint because Zack Snyder has not suddenly grown up and decided to use his talents for creating striking images in (what I would deem to be) a mature manner. And Road to Nowhere can disappoint because sometimes one expects so much – too much – from a filmmaker like Hellman, an expectation built up in part out of hype and reputation and not necessarily out of personal experience of the filmmaker’s films – that it is perhaps almost inevitable that the film will not live up to it.

To be honest, I am not sure how – or even if – I was disappointed by Road to Nowhere. Sometimes one is overwhelmed by a film (Zulawski’s Possession, for example). And sometimes one is underwhelmed by a film (Sucker Punch). Sometimes, however, one is simply whelmed – neither over nor under, though there is always a sense that a whelming film is ever so slightly an underwhelming film, but one sticks with whelming to convey the neutrality, or better the indifference, of one’s feelings and thoughts.

There is much to commend Road to Nowhere:

– It is slow – but in a challenging fashion that makes one want to think about the reasons why ’empty’ moments are not in mainstream films more often, or even at all, as opposed simply to finding it a ‘slow’ and boring film. It is also a Hollywood ‘insider’ movie in that it is a film about filmmaking (I’ll give the plot – as best as I can explain it – below).

– It is a film featuring much mise-en-abyme, which is to say that the film is a film about a film, and one never quite knows whether one is watching simply ‘the film’ (after a fashion, one is always only watching ‘the film’), or whether one is watching ‘the film within the film.’

– Furthermore, Hellman, according to a review in Cinema Scope, shot the film on Canon 5D Mark II cameras – that is, cameras that are predominantly used for still images – and this has a very interesting effect on the look of the film. For, while much of the film seems to take place in sunny locations, at every moment there seems to be a quasi-visible filter of darkness between us and the ‘image.’ I don’t know if this was achieved with the Canons, but I’d not seen this sort of view quite so insistently before and so attribute it to the unusual cameras used to make the film. Of course, this strange grain of darkness is not between us and the image; it is in the image, even if the effect is that somehow we cannot quite see clearly what is going on in the film. Interesting, and appropriate for a film that has noir-ish elements like this one.

Plot
Okay. So the film starts with a DVD being inserted into a laptop. On the laptop screen a film starts playing and the camera closes in on the laptop screen until it fills the entire cinema screen that we are watching (unless we are watching the film on our own laptops). We never know from this point on whether what we are watching is still the camera recording the screen of a laptop in one single and unbroken take, or whether we are seeing a or the ‘real’ film.

The film that we see on the laptop screen is called Road to Nowhere and it is directed by Mitchell Haven (Tygh Runyan – cinema’s doppelgänger of Matthew Holtmeier). It tells the story of Velma Duran, a seeming seductress of sorts, or perhaps just the patsy of a corrupt politician, who ran away/was framed for running away – or so we are led to believe – with US$100 million of North Carolina state money. Duran is played by Laurel Graham (Shannyn Sossamon), although it transpires that Laurel Graham is in fact a false identity developed by none other than… Velma Duran, in order to cover up the fact that she is not dead. Except that this may not be true – since it may be a pre-arranged identity swap carried out by Laurel Graham with her co-actors.

In short, then, Road to Nowhere is a good old puzzle film in which it is hard – if not impossible – for us to discern ‘truth’ (whatever that is) from ‘fiction.’ There is fourth-wall breaking aplenty in this film, including in its climactic moments, when Haven kills Bruno (Waylon Payne), who has killed Laurel/Velma. Like a crazy film director who can only filter things through the lens of a camera, he starts to film the victims (on his Canon 5D Mark II), before his camera looks directly into ‘our’ camera and we are offered a reverse shot, which shows the entire crew working and watching the scene. Nonetheless, there is no ‘cut’ this time (as there is at other moments in the film) and the cops turn up and arrest him.

We are then perhaps taken back to the beginning of the film and the DVD in the laptop. Haven is in prison being shown the film by Natalie Post (Dominique Swain), a local investigative blogger who had been helpful to Haven in filming his Road to Nowhere movie by giving him insight and facts. The conversation between the two of them ends as a guard takes Post out of the cell – and the film ends.

In other words, and in a manner that for many audiences will be frustrating, the film goes nowhere and, like Two-Lane Blacktop, which never reaches its destination, the film challenges the whole myth of teleology, or of reaching a fixed goal.

This is not the disappointing, or whelming, thing about Road to Nowhere. This, in fact, is perhaps the most pleasing thing about the film. An unresolved conundrum is here very pleasing, and much more so than the ‘ooh, is it still an illusion?’ malarkey that is the end of Inception (Christopher Nolan, USA/UK, 2010).

What is whelming, for me, about Road to Nowhere is that I have been to nowhere so many times now that I feel quite familiar in it. Perhaps this is hubris on my part; but there comes to be something very predictable about the film that has no easy resolution. Perhaps this is in part the point: life is banal and certainly it has no set goal that we can foresee at all – and we, like the film, always end up other than where we expected, in a place that is a strange mix of what we expected (our fantasies) and a contradiction of that (‘reality’). But some films can take you to weird places and still leave you lost.

Hellman’s nowhere just seemed to feel a bit too familiar, then. This I can compare to Andrzej Zulawski’s Szamanka (Poland/France/Switzerland, 1996), which I also saw at CPH PIX, and which is so weird (like Possession about which I blogged yesterday) that I do not know what to make of it at all. For all its ‘faults,’ I am rather just fascinated by its strangeness. And so the familiarity of Nowhere‘s nowhere seemed to let it down.

Road to Nowhere is better than 50, maybe even 100 Sucker Punches. (By how much it is better is a silly thing to quantify. It is just better by virtue of being more interesting, even if Sucker Punch, too, wants to try to get you to think about ‘is it real or not?’) But one wonders whether the illusion/reality question needs to be posed in new ways for something truly startling to come out of it. The question is still a good one – but there are perhaps other, more penetrating questions, lying somewhere in wait.

Notes from CPH PIX: Sumarlandið/Summerland (Grímur Hákonarson, Iceland, 2010)

Blogpost, cph pix 2011, Icelandic cinema, Uncategorized

Summerland is not the only Icelandic picture that I have seen – but I must admit that I have not seen many, and certainly not all of the recent ‘landmark’ Icelandic movies that have come out since 2000.

(Think 101 Reykjavík (Baltasar Kormákur, Iceland/Denmark/France/Norway/Germany, 2000), Nói albínói/Noi the Albino (Dagur Kári, Iceland/Germany/UK/Denmark, 2003), and Beowulf & Grendel (Sturla Gunnarsson, Canada/UK/Iceland/USA/Australia, 2005) and you have more or less my complete knowledge of Icelandic cinema.)

The film is a comedy – perhaps in the vein of Aki Kaurismäki, if to revert to comparisons with Finns is not too condescending or ‘obvious’ a step to take – about a family who live in Kópavogur, about which I know nothing, but who try to run a local tourism business. This they do by stealing visitors from the ‘official’ tour of the vicinity and taking them to their ghost house, where pater familias Óskar (Kjartan Guðjónsson) tries to scare visitors. Mother Lára (Ólafía Hrönn Jónsdóttir) is also in the ghost business – but as a (seemingly) genuine medium, who talks to the dead, or those who live in the titular Summerland, as a result of the energy that is channeled through the local elf stones, in which live Iceland’s long lost but historical elven ancestors.

However, because the family home is threatened with repossession as a result of debts, Óskar sells the elf stone in the family garden to a German art dealer (Wolfgang Müller) – and even though he makes a tidy 50,000 euro from the sale, everything proceeds to go wrong from here: Lára falls into a coma, their daughter starts a relationship with a local anti-spiritual campaigner, their son loses his best friend (because, or so the son thinks, the best friend is or was an elf), and the town decides that it is going to sell off other elf stones in order to save the local economy.

However, Óskar sees the error of his ways and although he does not get back the money for the elf stone that he sold from his garden, he does stop the town’s larger elf stones from being sold by placing himself between the stones and the digger that would extract them. He is hailed as a martyr, a sense of community is restored, and the town itself becomes something of a tourist destination – Óskar’s ghost house in particular – meaning that, in theory, everything is well in the world.

There are two differences between this film and the others mentioned above – or at least there are two differences that I want here to discuss. Firstly, this is the first film that I have seen since Iceland went bust in 2008. And secondly, this is the first Icelandic film that I have seen that is not an international co-production.

The reason for mentioning the first is hopefully self-evident: this is a film that deals with Iceland selling off its traditional assets as a result of being too international-minded in the pursuit of both profit and, perhaps more tellingly, ‘survival.’ In an Iceland that denies its history, signalled here by a belief in the spirit world – the land in the past where it always was summer and Icelanders were happy – and by the fact that both Óskar and the town in general want to sell the elf stones, the message of the film seems strongly to be: hold on to what is truly Icelandic, because it is only in this way that we will be able happily or in a satisfactory manner to ‘compete’ internationally. In fact, it is by embracing its past that Iceland emerges as a viable tourist destination – and not by becoming a bland destination that has the same things as everywhere else (Kópavogur is home to Iceland’s largest shopping mall, not that it features in Summerland).

Secondly, the fact that this is not a co-production suggests more or less a similar thing, but on a filmic level. Rather than trying to make a Europudding featuring (with all due respect) famous stars like Victoria Abril (101 Reykjavík) or Gerard Butler (Beowulf & Grendel), Summerland is a ‘uniquely’ Icelandic film – and perhaps it benefits all the more from being so. For it is potentially a downside of the international co-production that it becomes obsessed with markets: who does it please from where, how can it make money in various territories, et cetera. Instead, Summerland arguably just does what it wants to, and in the course of this it sticks (proverbially speaking if not literally) two fingers up at the rest of Europe, here signified through the presence of the (problematically) gay German art collector (and his lover).

Given that Summerland is a comedy (albeit one that is – and I hate this term when applied to comedy – ‘bittersweet’), and given that – or so the cliché goes – comedy does not travel, then Summerland is a ‘risk.’ But then again, if the packed house at the cool Husets Biograf is anything to go by, comedy does travel (we could mythologise this about some sort of interest in ‘Scandinavian’ cinema), and, indeed, the more ‘Icelandic’ the film is, the better it fares. For what – paradoxically – sells better (than comedy) is a sense of making a film that one cares about as opposed to making a film that is intended to satisfy certain so-called needs in certain markets.

(I hope that Afterimages, my film showing at CPH PIX, is taken in this way – even though it is not ostensibly a comedy.)

Now, the above is more or less all that I have to say superficially about the film – but it is of course more complex than the above words can convey. Óskar and family got into debt for trying to do something ‘authentic,’ or at the very least independent and different in Iceland. Had they played safe, they might not have got into debt at all. Furthermore, Óskar does sell off his elf stone and does ease his financial worries through doing so – regardless of the subsequent romantic consequences of this act.

In other words, interpreting the film ‘economically’/as an allegory of recent economic history (which is my doing and therefore my mistake, if mistake it is) is not necessarily an easy task. The economic crisis is caused by localism, while globalisation can and does bring financial rewards, even if at the expense of ‘culture’ (here, elves).

Sure, following a sacrifice of the pater familias, Iceland can re-emerge as both economically viable and as ‘Icelandic,’ but then it seems that the very terms of economic imprisonment are the same as the terms of escape. In other words, there is no clear or easy history to the Icelandic economic crisis, and certainly no clear or easy solution, even if at first blush Summerland seems to suggest as much.

Furthermore, the film also requires the removal of the patriarch (who never really was that empowered?) for this to happen. That is, the cause of all of the problems – the guy that sold his country out – is also the route towards greater economic well-being. I have nowhere specific to go with this analysis, but I find it interesting nonetheless.

Either way, as has happened already a couple of times – and as should become clear from subsequent blogs – Summerland was not a film that I had intended to see here. In fact, I was hoping to see Meek’s Cutoff (Kelly Reichardt, USA, 2010), but missed it because I stupidly got wrong the time of the film’s start.

But this is also one of the incidental pleasures of festivals as I understand them: having missed or simply not being able to attend the higher-profile stuff means that, if one is determined and can afford to see a/any film anyway, one always ends up seeing something of great interest and warmth. I am sad I missed Meek’s, although I am sure I’ll catch it at some point before too long.

But in hindsight, I am happier for having seen Summerland, not least because of the fantastic atmosphere engendered by the full house at the Husets Biograf (there is so much to write about what being in the cinema with a warm crowd can do to one’s response to a film, as opposed to the solipsistic practice of watching films on DVD on one’s laptop). I am also happier for having seen Summerland because in all likelihood I will be able to see Meek’s Cutoff before too long anyway (it had just started playing in London before I came out to Copenhagen).

In some respects, this sounds like the ‘festival twat,’ who can namedrop films that no one else has seen, nor will they likely get the chance to see, except indeed on DVD at home, where the experience might be all the more disappointing by virtue of the viewing circumstances (being with people is always better, or so say I).

But in another respect, I hold by it: I don’t normally get the chance to see films like Summerland, and I might not normally take up such a chance when I do get it (not least because I wanted to see the Reichardt film ahead of it). But, be it by hook or by crook, I have seen it – and this is what going to the cinema in general, and festivals in particular, is all about, or the experience that for me is the most pleasurable.

That is, the less I know about a film in advance, the more fun I have. I don’t know if others feel the same way, but in certain respects I sometimes wonder whether it would not be great simply to have films showing – and one gets what one receives, without having to ask for a particular thing in advance. Bring on the days where promotion and publicity count for nothing…

Afterthought (which I meant to include in the main blog, but forgot about): Summerland‘s presence at film festivals, including CPH PIX, might make of the film’s story something like a self-fulfilling prophecy. Maybe not many, but some people will see the film, and – be it consciously or otherwise – somewhere in their line of reasoning it will play a part in their decision to go to Iceland, be it for a full-on holiday or for a weekend break. Other scholars study set jetting in more detail than I do, but an independent Icelandic film plays a part in helping the Icelandic community to recover, both economically and culturally, by functioning as a film that plays abroad and as a film that might inspire tourism. In other words, although the ‘recourse’ to an Icelandic as opposed to European co-production might seem to reinvigorate nationalistic sentiments, paradoxically its ‘meaning’ is always already ‘global’ as soon as the film circulates beyond the boundaries of its home nation. Again, I’ve not much to add to this, but it is an interesting and almost contradictory process nonetheless.