Smart people are stupid (in a good way) – and ‘stupid’ people are smart (in a bad way)

Blogpost, Film education

(A note of thanks to staff at the University of Reading, where I presented a paper on a topic loosely related to this post on 1 November 2012. In particular, discussion with Simone Knox, John Gibbs, Lisa Purse, James MacDowell and Ian Banks allowed me to think critically about what I had discussed at Reading, such that this blog post might come into existence.)

I teach a module to first year film students called Reading Visual Aesthetics. The first exercise that we get students to do for that module is to describe what they see in a ten-shot sequence taken from a film.

When I first taught this module, I thought that the exercise might be a bit pointless; perhaps more time should be spent on analysing rather than describing. However, as time has gone on, I have come to appreciate more and more what a fantastic exercise this is to get students to undertake.

For, what is excellent about this exercise is how it reveals and/or exposes how we take for granted – or look unthinkingly at – many of the things and objects that surround us in everyday life, but which perhaps we should not take for granted.

This year, we showed a ten-shot sequence from the opening of The Opposite of Sex (Don Roos, USA, 1998) – and, as happens every year, many of the students wrote their description pretty hastily and then sat in the classroom looking bored as the two-hour time limit ticked down.

This in spite of my exhortations to check through their work, to keep looking at the sequence (which we show 15 times over the course of the two hours – it becomes etched in your memory) and to keep writing down details that they see. I always insist that the exercise is harder than the students think, but nonetheless this does not stop many students from assuming that the exercise will be – in the parlance of our times – a piece of piss. And by and large those who believe this do most poorly on the exercise…

Which leads me to the conclusion that I will draw at the end of this blog and which is outlined in its title: smart people are stupid (in a good way) and ‘stupid’ people – by which I really mean lazy people – are smart (but in a bad way).

This blog is not about the poor spelling that I find in these descriptions, nor about the common errors that film students make with regard to film terminology (for example, barely any students correctly identified the opening sequence’s crane shots as crane shots, but instead called them pans or simply camera movements; this is not to mention how editing remains practically invisible to most students – and filmgoers more generally – with barely any students making mention of the dissolves that feature in this sequence).

The kinds of errors I have just mentioned are common, particularly among students just starting out in film – and they are common enough these days that I almost see no point in commenting on them, especially – sadly enough – the spelling.

(Indeed, a quick comment: I had to read nine essays before finding one that spelt the word cigarette correctly. However, I should make clear that this is not a criticism of the students at my university in particular. I have come across poor spelling at all of the institutions at which I have taught film, which include Oxford, St Andrews and Roehampton. So anyone inclined to make any assumptions about the latter university and/or its students as a result of its not being so well known as the other two… well, desist immediately.)

Instead, what is interesting is the nature of the descriptions made. Or rather, how many things that are right before our eyes are often invisible to us, or do not seem worth commenting upon.

In the opening sequence of The Opposite of Sex, we see the firebrand Dede (Christina Ricci) trash her stepfather’s funeral and run away from home to Indiana with the help of her quasi-boyfriend Randy (William Lee Scott); there she hopes to stay with her half-brother Bill (the wonderful Martin Donovan). She meets Bill’s boyfriend, Matt (Ivan Sergei), whom we see towards the end of the self-same opening ten-shot sequence (he opens Bill’s front door in shot nine).

So, what sort of details from this sequence did barely anyone talk about?

Well, we can start with quite general things. Only one student mentioned that the characters speak in American accents, with no one making any reference to their race (all of the characters are white in this sequence).

Well, isn’t this obvious, you’ll perhaps say to yourself, since this is an American movie about white people? So obvious that it is not worth mentioning.

Well, yes. On a certain level it is obvious that we are dealing with white Americans – if you know anything about the film in advance – but that’s precisely my point. We should look at things precisely as if we knew nothing about the film in advance. For the minute that anything is obvious to us, we start regarding that thing as natural and we no longer question what surrounds us.

If, when asked to describe what you see in a film sequence depicting white Americans, you feel that someone being white is not worth mentioning, nor that someone is American (let alone from which part of America, what class their accent seems to betray, etc), then their whiteness is (after the venerable Richard Dyer) invisible, or naturalised.

Now you might say that you would not pass comment either if the characters were black and/or spoke with Russian accents. You – you will tell me in your best thinking or unthinking David Brent impression – are colour blind.

Well, aside from the fact that so-called colour blindness negates difference (something that we should do at our peril), and aside from the fact that I would probably not believe you (since I don’t think colour blindness exists – or if it does, it only speaks, as it does in the case of David Brent, of a condescending and predominantly white attitude towards racial difference), I think that we must describe what we see in the best and most appropriate language that we have.

And since we see someone’s skin colour, we should during a description exercise describe it.

Failing to do so implies that anyone who reads the description shares a similar white or white-centred outlook on the world, and the implicit assumption that the world is white. By virtue of this being an unthinking assumption, it also is only a few steps away from suggesting that the world should be white and/or white-centred.

Similarly, to feel that an American accent (even perhaps the fact that the characters are speaking English) is not worthy of comment also betrays the belief that all movies are American, that America is somehow the natural home of cinema.

In other words, if it is considered ‘natural’ (well, naturally the film is about white Americans) that a film like The Opposite of Sex is about white Americans, then whiteness and Americanness are naturalised. By which I mean to say that they are normal, not necessary for comment, while all that deviates from this norm is, well, abnormal, deviant and somehow unnatural.

I can imagine some people having a hard time agreeing with this, so I am going to bring forward three other examples that hopefully will make clearer what I mean.

During this sequence there are two night-time shots, one featuring Dede packing a bag in her bedroom, and one featuring her sneaking from her house, across a lawn and to Randy’s car. In the first shot, we see a bedside lamp and in the latter we can see lights from the house’s interior as well as a flash of Randy’s headlights.

What is interesting is that many of the sequence descriptions that I marked suggested that the lighting throughout the sequence is natural lighting.

That students put this in spite of repeated explanations in class that more often than not what looks like natural lighting is the result of very powerful lamps is not the point that I wish to make. Rather, the point that I wish to make is that we know perfectly well that neither an electric bedside lamp nor a set of car headlights is natural lighting.

These are man-made phenomena. And yet they are so commonplace to us that they have become naturalised; we mistake as natural something that is man-made and, to a certain extent, artificial.

So if we end up mistaking man-made inventions like electric lighting for nature – perhaps a typical occurrence for those humans who are surrounded every day by such items – then perhaps we can see how this also becomes the case in terms of whiteness and Americanness. So commonplace are whiteness and Americanness in cinema that we take them as natural – when of course cinema could be very different.

Perhaps another way to think about this is that if electric lighting has become so commonplace as to seem natural, then we should understand that nature is perhaps malleable, not absolute or fixed. In this way, cinema need not predominantly depict whiteness and Americanness – but for some reason it does.

So we need to think about why this is the case – and we can perhaps then begin to construct a different cinema that is not so white-centred and Amerigocentric, but which instead is more ‘democratic’ and egalitarian.

The second of my three examples is the notion of costume. One student did very insightfully note that the costume in the film mimics the fashion of the late 1990s – or words to that effect.

We often unthinkingly assume that films should be about the contemporary age (and perhaps we do not even question the constructed nature of costumes in, precisely, costume dramas and period films).

And yet costumes in films – and costumes in general – are constructed and they tell us information about where and when they come from – even if most of the time we do not bother to analyse such things.

Finally, a couple of students noticed a yellow car in one shot that shows Dede approaching Bill’s front door (though none identified it as a Volkswagen, which surprised me; nor even as a hatchback, which disappointed me).

Only one student, however, said that this is a funny detail since we might typically associate a little yellow VW with a gay character – and Bill is a ‘real life homo’ as Dede tells us in her voice over during this sequence.

My point here is not to deconstruct why a yellow VW hatchback might be deemed gay – though such an argument no doubt deserves to be made elsewhere, since the link between the one thing (yellow VW hatchback) and the other thing (homosexuality) is certainly not natural, but a cultural construct.

Rather, it is to say that I am surprised that so few commented on the car at all.

I have no empirical evidence for this, but I suspect that most students notice Bill’s yellow VW hatchback and that it conforms to Dede’s characterisation of him as gay in the voice over.

That is, while the visual joke that is made might well have been lost on some viewers (particularly those who precisely do not see the link between a yellow VW hatchback and a gay owner as natural), my guess is that most viewers ‘got it’ but did not feel the need to describe the car or the ‘appropriateness’ of the car’s colour – again because the point is supposedly too obvious.

This despite the fact that students have only been asked to make a description!

Now, here is where we send this blog in the direction of its title and conclusion – but we’ll do this by turning to what various scholars say in film studies about the experience of film viewing.

The great David Bordwell – like many cognitivist film scholars before and since – has long argued that the brain is working overtime during film viewing and that it really is a miracle of intelligence that people can work out that a shot of a woman at a desk after a shot of the outside of an office block means that the woman is (most likely) inside that office block and working at her desk.

This is no doubt true – and its truth pertains to the yellow car gag from The Opposite of Sex, too. It is a miracle of intelligence that people ‘get’ that the yellow car is a visual gag that reaffirms that Bill is gay (while at the same time suggesting to us that what we are seeing is Dede’s version of events – as affirmed by her self-conscious voice over – and not necessarily, therefore, a trustworthy account of events. That is, in Dede’s head Bill of course has a yellow VW hatchback because he’s a complete flamer – but this is not necessarily the truth, nor how Bill would see things, nor necessarily how things are or were).

However, while it is a miracle of intelligence that we get the joke so quickly, automatically even – i.e. without having to think about it – it is also problematic precisely because we do not think about it.

Why do I say this?

I say this because the making-automatic or natural of associations and thoughts (manmade lighting = natural; contemporary clothing = natural; predominant whiteness and Americanness = natural; yellow VW hatchback = gay) has what I shall call a profoundly ideological aspect to it.

This is most clear in the “yellow VW hatchback = gay” idea. Yellow VW hatchbacks are of course not gay – but the association between a stereotype of homosexual American men as liking bright colours and small, relatively sporty and European cars has been made so natural that not only did most people see the joke, not only did (I wager) most people get the joke, but, when specifically asked, so natural did the joke seem that only one student even thought to comment on it.

We might say that finding a yellow VW hatchback to be gay is harmless. Ostensibly it is, and I do not think The Opposite of Sex a homophobic film – though it certainly deals with explicit homophobia as a theme. Nonetheless, we make these kinds of unthinking and automatic associations the whole time – and sometimes they really can be of a problematic nature (historical – unthinking? – shorthand would reach for World War Two Nazi propaganda here: it is unhealthy when a society starts to associate Jews with rats).

So you may not think that there is a particularly worrisome ideology about the yellow VW hatchback joke in The Opposite of Sex, but there is an ideology at play nonetheless.

(And it is Dede’s – problematic – homophobic ideology that is on display here, since it is she telling the story and she who would paint Bill as a typical flamer with a yellow VW hatchback – even if at play there is also the film’s own, non-homophobic ideology that creates some distance between us and Dede – we hear her voice over and so know that she might be manipulating events such that we see things her way and at the same time we learn not to trust her, meaning that we are not necessarily sharing her homophobic perspective but rather laughing at it as we see the yellow VW hatchback – making of The Opposite of Sex a very sophisticated film indeed.)

My argument is not that ideology = bad. I am of the view that one cannot escape ideology – but I am also of the view that ideology becomes dangerous when we unthinkingly accept as natural, unchanging and as a given something/anything that is not natural, precisely because nature is malleable and susceptible to change (as opposed to being, precisely, unchanging).

Ideological perception – seeing the yellow VW hatchback as gay – needs to be thought about explicitly. In other words, we need to make un-automatic that which is automatic in our minds; we need to bring into thought precisely that which is otherwise unthinking. Because, as mentioned, otherwise we run the risk of some form of Nazism, or fascism.

Or, put less hysterically, if we just accept the world in an automatic or unthinking fashion, then we are not looking at the world for ourselves, but we are seeing it as others want us to see it. We are willing accomplices in our own subjugation to a version of reality that we could change if we wanted to.

(A sidenote – aimed mainly at film scholars, and something that is slowly beginning to be acknowledged: the kind of automatic thought whereby yellow VW hatchback = gay means that we see films, and perhaps reality itself, as a system not of stand-alone objects but of signs (the yellow VW hatchback is not gay, but that we see it as such means that the yellow VW hatchback has stopped being a yellow VW hatchback and has become instead a sign of homosexuality). In other words, that we see semiotically means that semiotics – and film as a language, language here being defined as a process, as the making-linguistic, the making-semiotic of cinema and of reality itself – might well rear its head back into film studies – even if it was precisely against such a semiotic approach to understanding cinema that David Bordwell and other cognitivists adopted the cognitive framework in the first place!)

If we are seeing the world not for ourselves but as others want us to see it, then perhaps nowhere is this more clear than in, of course, film viewing. That most students did not put into words the yellow VW, or the edits that of course they did see but to which they did not pay attention, makes this most clear: we see the film, but we do not see through the film. We see what the film shows us, but we do not see the film itself. We see the content and the story – but not its form, or how it is being told – even though this form, which exceeds our attention, is incessantly before us, right before our very eyes for us to see – if we had the eyes to do so.

When we have the eyes to see the invisible links, to rethink the associations that are otherwise automatic, then we begin to learn. Learning is the confrontation of the new – it is the rewiring of neurons in the brain, the making of new associations. The minute we stop learning, our brain will begin to atrophy – since only the same old clusters of neurons will fire as we begin to see the world in an automatic and unthinking fashion.

The minute we start thinking, or rewiring neurons, then we are no longer (as much) prey (be that willing or unwilling) to ideology; we move into changing ideology – we become political beings – as well as social, ecologically-embedded beings working on the construction of reality, of what is deemed natural, whether or not everyone agrees with the direction in which we want to change things. We bring into our conscious mind that which previously was unconscious – we become smarter – we develop the possibility to control our own destiny – we develop free will – we develop our capacity for freedom, both of thought and of deed.

So here’s where the title and conclusion of this blog post come into play.

When our automatic perceptions rule our existence, in some ways we are functioning in a very smart fashion; we are efficient and do not need to waste energy consciously thinking about stuff since we can negotiate and navigate our way around reality in a smooth and energy-saving fashion.

But this is, after Daniel Kahneman, also laziness. So ‘stupid’ – by which I really mean lazy – people are smart. They are efficient and don’t have to, or don’t want to, think about things. But I see this laziness as a bad thing. Why? Because it is not to get involved in the world, not to think and to re-think reality and what surrounds us, not to fulfil one’s potential – it is to waste one’s life, in short.

(Note: a footballer gets so good at football that it becomes unthinking to them. My point is not that we should resist automation entirely – because sometimes being able to naturalise or automate skills, such as controlling and passing a football, is a good thing. But we should not rest on our laurels, and we should always work at improving our game, at acquiring new skills. What is true of football is true of thought, even though thinking is frowned upon in British society and even though our government is prepared to take away much of the investment in education – sport for the brain – at the same time as pouring money into sport, even though sport by definition can employ or make employable far fewer people than can education as a whole. Scholars may not be as famous as David Beckham – but they are as good at what they do as Beckham is at what he does. A.J. Ayer is as big a man as Mike Tyson.)

Smart people, on the other hand, are a bit stupid, because they expend energy on analysing, rethinking, and asking questions. However, while intelligence is therefore not necessarily efficient (and therefore runs counter to the capitalist ethos and ideology that drives our society, if not our whole world, making the question of education and thought a deeply ideological one!), intelligence is the means to freedom, to thinking new things, to invention. It is by definition experimental; it is by definition somewhat speculative. But unless we create the conditions – for ourselves and for others – to realise our potential, then that potential is just going to be sat wasting away.

I imagine a film sequence description that one day will become obsessed with trying to take into account the particles of air that are in the frame of the camera, but which are too numerous to mention, and each quivering blade of grass in the wind – in addition to all of the large or human scale objects that we can see.

And while such a description might not get top marks (since it dedicates its energy to elements of the film that most people overlook), I will surely know that there is a keen, inquisitive and free intelligence at work – even if its intelligence is signalled in the very stupidity of its description (the description being stupid because mildly inappropriate). And I will expect future great things of that person.

Indeed, of what importance are grades? Truly original work cannot really be graded at all – since it will at first seem entirely inappropriate and stupid to the person marking it. But university – perhaps education as a whole – is not and should not be about grades (which is to impose upon people a fixed – automatic and unthinking – system of thought that has its final goal, or telos, decided in advance, or a priori). That education and university are about grades is a direct manifestation of the capitalist and unthinking logic that is invading every last aspect of our world. So it is time to rethink such things…

So don’t worry about grades, but instead worry about thinking, about fulfilling potential, about working out what your brain and your body can do, what you can do in, with and for the world, about bootstrapping yourself into conscious thought, about being different, about becoming free.

(But please, dear students, don’t take this as an excuse not to make any effort, to be lazy. On the contrary, stupid intelligence of this sort cannot be lazy – but lazy intelligence is perhaps one of the most stupid things around.)

The Last Projectionist (Tom Lawes, UK, 2011)

Blogpost, British cinema, Documentary, Film education, Film reviews, Uncategorized

Summer 2012 has been quite the summer of the documentary in terms of the number of documentary films given theatrical releases.

One that has quietly been touring the UK and gathering attention is Tom Lawes’ Last Projectionist, a self-financed, quasi-professional film by the owner of Birmingham’s Electric Cinema, which is apparently the oldest working cinema in the UK.

The film does a few things: it tells the story of that cinema, mapping its ups and downs, its rebrandings and reopenings against the backdrop of twentieth century history and a history of twentieth century cinemagoing; it tells of the decline of polyester-based cinema and of the conversion towards digital projection in cinemas, not least through the eyes of various Birmingham-based projectionists who gather to reminisce about old times; it elaborates the importance of the cinema as a specific venue in which to regard and to revere film; and it speaks of a love of cinema in all of its forms that is both touching and inspiring.

The film adopts an anecdotal approach to its various themes, but this does not mean that the film is unstructured (as were, for example, Tom Lawes’ anecdotes during the Q&A with him that I saw after a screening at the Curzon Soho); rather, the film becomes all the more human and warm for it.

For The Last Projectionist reminds us of several things that are very important, and yet which are easy for us to forget. All of the things about which the film reminds us are linked – and, oddly enough, they are linked in some respects by pertaining to the opposite of everything that mainstream cinema promotes.

What do I mean by this?

What I mean by this is that The Last Projectionist celebrates that which is often unseen and/or overlooked by mainstream cinema, because mainstream cinema would not deem such things worthy of its attention.

What are the examples of this?

Well, the examples of this are both in the film and the film itself, particularly if one casts aside the fact that the film does in part function as an advert for the Electric – as well as a celebration of cinemagoing more generally.

Examples in the film.

Well, the film is in part about projectionists. Projectionists are the invisible presence in cinemas – men (typically) whom we never see, but who are hidden away behind us in their booths showing films. In other words, The Last Projectionist reminds us of the important role that projectionists in particular and perhaps technicians in general play in the cinema experience.

Indeed, the assembled projectionists in Lawes’ film have generally mucked in at the cinemas where they have worked: dealing not just with reels of film, but with the maintenance and upkeep of the cinema theatres in general.

Secondly, then, the film also reminds us of the importance of the theatrical venue itself. Lawes himself reminisces fondly about how the venue is as important as the film in terms of the cinema experience – something that Gabriele Pedullà has also written about recently, not least in the context of people watching more and more films on their laptops in anonymous and/or domestic spaces that are not dedicated to the film alone.

Thirdly, The Last Projectionist reminds us that cinema in the UK is not just about Soho and various studios in and around London. From the Brummie accents to the social history that the film offers (Lawes interviews his grandmother-in-law, who remembers the earliest film screenings in Birmingham, as well as various other details of life throughout the years), the film is as much a paean to Birmingham as it is to cinema. This is perhaps an overlooked aspect of the film, but nonetheless it is fantastic to see onscreen a major city that was at one time a chamber in the beating heart of England and which remains one of the most important cities in the contemporary UK.

Fourthly – a kind of combination of the last two points – The Last Projectionist shows normal, working and middle class people talking about normal, working and middle class life: a kind of democratic cinema that is interested in normal people and what they do, and which is all too rare in a mainstream cinema that is interested not in how everyone is remarkable but in demarcating how only certain people and things are remarkable.

(That said, while the film celebrates cinema owners who have created remarkable and comfortable spaces in which to watch films, and while it takes time to denigrate the cinema chains with their fast food approach to film viewing, The Last Projectionist does not take time to question whether the ‘bourgeoisification’ of cinemagoing at art house and repertory venues fundamentally excludes from art house cinema and from a sense of film history the working classes who traditionally supported cinema in a/the most widespread fashion – by going to watch films.)

What is more, The Last Projectionist fifthly and repeatedly reminds us that mainstream cinemas have through the ages often been propped up by the hidden undergrowth of film production, namely soft- and hard-core pornography. The Electric itself – in various of its incarnations – has screened skin flicks, particularly in the 1970s and 1980s. The point is quite simply that these supposedly seedier aspects of the film industry have in fact helped to keep theatrical venues afloat in the face of economic downturns, competition from home film viewing and television, and so on. We should remember that cinema as a whole is a complex ecosystem in which all parts have their role to play – and that to remove one aspect would disrupt the whole in a fundamental and perhaps detrimental way.

And, finally, the style of Lawes’ film itself reminds us that cinema in its most well-known, widely advertised, and economically rich manifestations relies precisely upon grassroots filmmaking at this level. Lawes may not be a twenty-something hipster (he’s an early-40s hipster if his IMDb date of birth is accurate), but it is evident that he makes films not strictly for business purposes/industrial reasons, but because he loves cinema, he loves the venue of cinema, the experience of the theatre, and the many types of film that are on offer. Without this, all of your self-important Hollywood stars who – to generalise enormously and unjustly – believe that the world owes them their wealth because of their supreme talent (a mythology hard not to believe about oneself if one is always surrounded by flashing lights) – well, these Hollywood stars would be nothing. Their stardom is dependent on normal people in Birmingham, England (some 5,335 miles away) – as it is dependent on viewers in Sabang, Salta and Salalah.

Although not strictly amateur, then, the independence of The Last Projectionist makes it truly emblematic of the foundations upon which the most professional cinema relies.

As polyester-based film becomes a thing of the past, it disappears into darkness. In fact, the film strip itself was always invisible – the contents of its images occupying the attention of most viewers who gave – and perhaps still give – no thought to how the images get to the screen.

In a sense, then, The Last Projectionist is a celebration of darkness – of that darkness which upholds and creates the conditions for the beauty of the images on the screen. If the theatrical experience is more intense than watching films ‘in broad daylight’, it is because the room is in darkness – it is invisible. And so of all of the things that make of The Last Projectionist a total delight, what links them is darkness, the fact that they are normally overlooked. And this infuses the film on every level.

In many ways, any film lover should watch The Last Projectionist: it is a lesson in film history, as well as a testimony to the power of cinema. But it is also a democratic (enough) film in that it reminds us that even stars need the surrounding darkness in order for their lustre to seem so bright.

Brief thoughts on Q.I.: XL (broadcast 6 April 2012) – or why educated British people despise thinking, working class people

Blogpost, Film education, Television

For those who do not know, Q.I. is a television celebrity panel quiz show in the UK hosted by Stephen Fry. It involves Fry asking questions about all manner of topics, the answers to which often seem obvious – because mythology offers us falsehoods as truths – but which almost never are.

So Fry will ask something like “What was Gandhi’s first name?” and if one of the panellists (typically a man called Alan Davies) says “Mahatma” (in a specific episode, Alan Davies in fact proposes “Randy” as Gandhi’s first name), then he is docked some points (there may be a formula for how many, but it is not important; Davies was docked 150 points for saying “Randy”). Fry then corrects Davies and the others, and explains that “Mohandas” was Gandhi’s first name (depending on which Gandhi you wish to talk about, of course), and that “Mahatma” is the Sanskrit term for “great soul”.

In this way, a popular error (one that reveals specifically British ignorance, perhaps) is exposed and an educational element is delivered in comic form, because Q.I. typically features several of the group of British comedians who endlessly wander from one celebrity panel show to the next (Bill Bailey, Sean Lock, Dara Ó Briain, Jimmy Carr, Sandi Toksvig, John Sessions, Jo Brand, Phill Jupitus, Rich Hall, Rob Brydon, Clive Anderson, David Mitchell, Danny Baker, etc). It has two permanent cast members: Stephen Fry and Alan Davies, the former as MC, the latter as the slightly buffoonish one who always gets the questions wrong, but in a charming and mildly amusing fashion.

Now, last night, Fry and Davies were joined by Sandi Toksvig, Jimmy Carr and Lee Mack. At one point, Fry posed a question along the lines of “how is it that the Mona Lisa’s eyes follow you around the room when you see it in the Louvre?” This we can oppose to the way in which a human’s eyes do not follow the observer of that human around the room – if the human continues to look at the same spot while the observer moves…

For whatever reason, it fell to Lee Mack to answer this question – and whether right or wrong, answer it he did with some ingenuity. Mack proposed that the reason the Mona Lisa’s eyes always follow us around the room is because we are never looking at the woman in the Mona Lisa painting, but at Leonardo’s vision of her. Since in the painting she is looking directly at the painter, and since we can only see the painter’s perspective of her (she is not a sculpture around which we can walk – although Mack did also sharply say that the Mona Lisa’s eyes do not follow you if you walk behind her), then it stands to reason that from any/all angles, it will seem as though she, the Mona Lisa, is looking at the painting’s spectator.

There is something ingenious about Mack’s explanation – which Fry himself was beginning to praise at one point, before Jimmy Carr and Sandi Toksvig ran Mack’s explanation into the ground as rubbish and irrelevant.

Q.I. is a funny show and so I am ‘killing’ the comedy a bit here, for which apologies. But something important seemed to happen at this moment that I think merits comment.

Lee Mack’s comedy persona is more or less “northern” humour, which translates in the UK into “working class” humour – although of a sophistication significantly greater than the “classic” (code for racist) “northern humour” that is associated with the likes of Bernard Manning.

Alan Davies, meanwhile, plays the show as a charming and well spoken buffoon. He is the “ninny” of the series, but a relatively middle to upper middle class ninny – not quite the same as, but somewhere approximate to the kind of well-meaning toff/quasi-toff that Harry Enfield satirised at length with his Tim Nice-but-Dim character, and which more recently has been revitalised by Matt Lacey with his viral Orlando character (“And then I chundered… everywhaaah…”).

This leaves Fry, Toksvig and Carr. It is perhaps not so well known that Carr is highly educated, because his humour tends to be cruel and scatological – while Fry and Toksvig are, latterly at least, known for their sophistication, not least linguistically. Nonetheless, all three of Fry, Toksvig and Carr received their higher education at the University of Cambridge, attending Queens’, Girton and Gonville and Caius respectively (the show’s creator, John Lloyd, also attended Cambridge, going to Trinity College).

Now, let’s be clear. Lee Mack is very well educated, too. He attended Brunel University, while Alan Davies was at the University of Kent. But Mack’s persona is much more mainstream than this would suggest (as, arguably, is Jimmy Carr’s, but with significantly less warmth than Mack’s).

However, when we see two Cambridge graduates dismissing Mack’s inventive response to Fry’s question, with Fry later in the episode also joining in by calling Mack “stupid” (albeit in a charming and friendly manner), one cannot help but feel that something else is going on beyond simply a display of knowledge.

For, Mack’s explanation is – whether right or wrong – both clear and ingenious. It may well be a piece of typical Q.I. logic. That is, Mack’s explanation for why the Mona Lisa’s eyes follow you around the room might well be spurious if logical-seeming – although he does not stand corrected in the episode (as far as I recall, the show moves on, leaving one with the impression that Mack might have been right – he is not corrected; but that on the whole he is wrong – everyone derides him).

In offering a clear and ingenious answer to the question, Mack has shown not necessarily knowledge, but he has shown intelligence and an ability to think on his feet. What is odd, then, is how intelligence of this kind is here punished by the existing intelligentsia with their Cambridge degrees and their ability to reel off facts as if knowledge were superior to understanding and intelligence.

One cannot help but wonder whether the issue really is about class and/or regionality (the latter being, to generalise, often and erroneously bound up with class by relatively “posh” “southerners” [Toksvig’s Danish provenance complicating this matter somewhat, except for the fact that her English accent is as Queen’s as one could hope for]), a poshness and southernness that passes for normality and that certainly is imbued with symbolic if not actual power when its disapproval of northern intelligence is implied as/taken as being the definitive framework through which to understand Mack.

One cannot help but recall one of Britain’s favourite comedy sketches – the so-called “Class Sketch” from The Frost Report in 1966, featuring (Cambridge-educated) John Cleese “looking down on” Ronnie Barker (no higher education, but from the relatively prosperous city of Oxford), who in turn “looks down on” Ronnie Corbett (no higher education).

That is, comedy and class seem thoroughly intertwined in the UK, with one’s type of humour playing a role in one’s class consciousness. But where the “Class Sketch” pokes fun at the British class system (personally I don’t find the sketch funny so much as plain scathing), yesterday’s episode of Q.I. seems to reveal the persistence of knowledge – and of the power of holding knowledge – as the preserve of the educated, middle and upper classes (read Oxbridge-educated), while intelligence and the ability to question authority through actual thinking (Lee Mack coming up with an ingenious answer all on his own) are branded as subversive, threatening and to be quashed.

In a sense the entire ideology of Q.I. is herein revealed: the show takes “popular wisdom” (people who think that Gandhi’s first name is Mahatma) and shows that it is often “wrong,” while demonstrating that “real” knowledge – and thus power – lies in the hands of the enlightened few.

There is more to say about the show’s audience and the projection of intelligence and wit that makes it for many viewers a pleasurable and aspirational experience – not least because it all looks so easy for these performers.

(Indeed, in this episode, Mack is also the only comic made to look as though he is working on/at his comedy, when he bungles a “ye olde Second World War” joke that, again, he must explain – “brilliantly” according to Fry – as being a failed improvisation; the others would never have to be seen working – because they are not, by implication, working class, while Lee Mack (in fact highly educated) is.)

If you, reading this, know me, you might be thinking “pot-kettle-black,” in that I am also highly educated (three degrees from the University of Oxford). But my point here is not simply to cast stones at the Cambridge “toffs” who dominate Q.I., which is a show I enjoy immensely (otherwise why would I watch it from time to time?).

Rather, it is to demonstrate the way in which even highly educated people can – as this episode of Q.I. seems to reveal – succumb to the logic that having answers and holding power are the most important things to possess in life – and that threats to them (Lee Mack’s display of intelligence, so much more threatening than Alan Davies’ otherwise well-meaning ninniness) must be extinguished. It is Mack, in showing thought/intelligence, and in showing that he works and can sometimes fail at his comedy, who demonstrates something beyond power here – thought, intelligence, doubt, and in some respects true comedy in the face of the scientific imperative towards “true” knowledge. It is a pity that the others – supposedly highly educated – dismiss him so quickly – even if in a comic fashion and on nominally friendly terms.

For a short while, UK residents can catch the show on the BBC’s iPlayer here.

Notes from the LFF: Hello! Shu Xian Sheng/Mr Tree (Han Jie, China, 2011)

Blogpost, Chinese cinema, Film education, Film reviews, Uncategorized

It is opportune that I saw Mr Tree in the same week that I taught about Sixth Generation Chinese filmmaker Jia Zhangke in two separate classes.

For, Jia acts as producer of Mr Tree, and Han Jie’s film, while by no means a Jia Zhangke ‘rip-off,’ definitely contains themes that are also of close concern to Jia, especially the effects of modernisation on rural life.

I shall deal more with Mr Tree below. But I’d like to reflect a little bit on teaching Jia Zhangke, not so that I can write about Jia specifically, but so that I can deal with the reception of Chinese cinema – and art house cinema more generally – in the West, and also to illustrate to those who might be interested what studying cinema at university can involve.

This week I used two different Jia films for two different modules that I am teaching this term. The first film is Shijie/The World (China/Japan/France, 2004) for a module that I am teaching on Digital Cinema. The reason behind this choice was to explore the ways in which digital cameras have reinvigorated the possibility for filmmakers to create ambitious projects on relatively low budgets, and which offer up an alternative view of the world to that which seems increasingly to be replicated not just in mainstream Hollywood cinema, but across all mainstreams worldwide. In other words, The World serves as a means to explore how/whether digital technology enables independent and artistic world cinema.

And the second film is Jia’s first feature, Xiao Wu/Pickpocket (Hong Kong/China, 1998) for a module that I am teaching called Guerrilla Filmmaking. The aim of this module is, in the spirit of De fem benspænd/The Five Obstructions (Jørgen Leth and Lars von Trier, Denmark/Switzerland/Belgium/France, 2003), to set my students regular and short film projects on certain topics and involving certain formal constraints. As well as making the films, the students are invited to reflect critically on their projects – explaining what they have learnt, from the practical to the political to the philosophical. The students are also invited to talk about how they get their films seen once they are in existence.

The reason for showing Xiao Wu was/is not that this is a film made on a micro-budget, as per the other films that I show my students as part of the module, including my colleague’s activist film, Chronicle of Protest (Michael Chanan, UK, 2011), as well as my own two features, En Attendant Godard (UK, 2009) and Afterimages (UK, 2010).

The reason for choosing the film is that Xiao Wu was made without a permit. Jia just went into the streets and filmed – and this is noticeable from the variable sound quality, from the inconsistent lighting, and especially from the way in which ‘extras’ – in fact just people in the street – often turn and look directly at the camera, while the actors – all non-professionals – carry on regardless. In other words, Xiao Wu serves as a means to explore the possibility of simply going out into the street and filming, guerrilla-style.

A phrase that seems to get repeated a lot at the moment is ‘go big or go home.’ In some senses, my Guerrilla Filmmaking module is precisely not about going big – but about working out how to use the means at one’s disposal to say what one wants to say. Not to make a film for the purpose solely of trying to please others. But about using film as an expressive (and supremely malleable) medium to convey one’s own thoughts and ideas. The module is intended to encourage students precisely to think and to have ideas, then, and to endeavour to put these into audiovisual form.

Anyway, with regard to my classes, I introduced Jia, the director of both films, as belonging to the so-called sixth generation of Chinese filmmakers – the previous five generations taking Chinese cinema from its early origins to the 1930s (first), through to China’s 1940s cinematic heyday (second), Chinese cinema under Communism (third), the (lack of) cinema of the Cultural Revolution (fourth), and the rise of the fifth generation in the 1980s and 1990s, the fifth generation including filmmakers such as Zhang Yimou and Chen Kaige.

Obviously, the latter two are still making films, as anyone who has seen Ying xiong/Hero (Zhang Yimou, Hong Kong/China, 2002), Shi mian mai fu/House of Flying Daggers (Zhang Yimou, China/Hong Kong, 2004) and Wu ji/The Promise (Chen Kaige, China/USA/South Korea, 2005) will know.

Now, while Chen and Zhang have both moved into blockbuster filmmaking, as the above examples demonstrate, they still plough the same thematic fields that they explored in their early, career-making films. That is, they make historical films, often featuring strong heroines, exploring China’s past to reflect – often critically – on the present, in particular the myth of nation-building and unification (even if their films can be read in a reactionary way, as Hero perhaps most clearly exemplifies in its decision to have a rebellious assassin not kill a tyrannical leader, because the latter’s work in unifying China, even if achieved by the sword, is finally understood by the assassin to be a ‘good thing’).

By contrast, the sixth generation, with Jia as one of its figureheads, concentrates more upon the contemporary, taking in issues of forced migration within China – particularly for the purposes of modernisation, urbanisation, and the alienating side-effects of globalisation.

Many sixth generation films were made without permits – such as Xiao Wu (The World, by contrast, was Jia’s first film to be made with a permit; more on the film can be read here). As such, they are often defined as ‘underground’ films, although this title can be misleading in that ‘underground’ can function as much as a brand as it does a qualification for unauthorised – and therefore supposedly ‘authentic’ – portraits of the nation’s contemporaneity.

Now, Jia’s films are ‘slow’ – consisting of ponderous long takes in which minimal action takes place; the emphasis often seems to be less on characters and more on the spaces and places in which the ‘action’ (or lack thereof) takes place.

For this reason it perhaps came as no surprise that my students – all bar one – said of The World that it is ‘boring’ – and, more controversially, that the filmmaker has a ‘duty’ (I can’t recall if this was the exact term used) to make ‘interesting’ and ‘entertaining’ films.

This prompted a diatribe from their lecturer (me) about the attention economy in which we live, the foundations of which are built upon computers (i.e. digital technology) in their various guises (including iPhones, iPods, iPads, and the like – cheers, Steve Jobs). That is, that boredom is intolerable in the contemporary age, and that everything must happen at the accelerated pace of the entertainment industries, with what David Bordwell has defined as ‘intensified continuity’ – and what Steven Shaviro has more recently called a ‘post-continuity’ culture – at its core.

In contrast to this, there are – on a general level – filmmakers who feel the need to represent the fact that, for all of the attention(-deficient) economy that bombards the bourgeoisie, and for all of the ease of movement, both actual and virtual, that the global rich enjoy, there are many people who are left behind. Whose lives are slow. Who cannot and/or who do not want, perhaps, to lead their life at the speed of light.

Do these kinds of lives, I put to my students, not merit depiction? Who decides what is ‘cinematic’ and what is not? And would making an ‘exciting’ (i.e. ‘fast’) film about lives that many people might deem ‘unexciting’ (and ‘slow’) not be inappropriate if one were trying to remain faithful to one’s subject matter and/or one’s own ideas thereupon?

Without wishing to overlook the specifically Chinese provenance of Jia’s films, or indeed the very constructed nature of his fictions (we cannot read them as entirely accurate representations of Chinese reality, even if he uses devices that we typically associate with a realist ethos), my argument in class also proposed that there is no consensus on what constitutes ‘entertaining’ with regard to film – and that perhaps there should not be such a consensus, otherwise all films would look and feel the same.

Now, I am not sure how convincing my diatribe was. One of my students – the most vocal critic of The World (Xiao Wu was ‘better’ because it had something of a plot – and, perhaps crucially, is 50 minutes shorter) – has blogged, in spite of my defence of Jia, that he (and I paraphrase) should not make this kind of film, since alienating audiences (there is no specification of what kind of audience is being considered here, the assumption being that all audiences are the same) is one of the worst sins of filmmaking.

I would happily link to the student’s blog – because I do not want to deprive them of their input in the dialogue I am creating. Alas, the blog is on a site closed to all outside of my university (and even then you need to be registered on the software, Mahara, that hosts it). So, apologies to those who wish to but cannot read the blog – perhaps especially to those who would agree with the student’s outlook on filmmaking in general and this film in particular.

Now, I want to try to avoid coming across as high-minded and condescending to my student(s) – for they are entitled to their thoughts, even if I also find it mildly frustrating to make a case for art cinema that is duly and adamantly cast aside for the sake of imposing a pre-existing set of criteria regarding what constitutes ‘good’ cinema (i.e. I probably am both high-minded and condescending at the last).

I also am wary about ‘picking on’ one or any of my students, not least because this one is certainly engaged and a keen participant in my classes. That is, I greatly appreciate what this person contributes to my classes, even if I do not agree with them, and even if I feel the need to encourage in them a more critical perspective.

(Interestingly, when it was established prior to showing it that Xiao Wu is, in the words of another student, ‘what we would call a “festival film,”‘ this also brought about a greater level of (perceived) engagement – as if one cannot watch films ‘properly’ without being given the correct prompt/lens through which to view them.)

This blogospheric excursion into teaching the cinema of Jia Zhangke may have exposed my limitations as a teacher, in that I failed to convince my students about the validity of The World, and to a lesser extent that of Xiao Wu, the ‘boringness’ of which – apparently – outweighed any interest in what Jia was trying to do; i.e. I could not get my students to consider what The World is, since they preferred instead to talk about what the film is not.

Furthermore, this excursion into teaching Jia Zhangke might also have exposed the limitations of top-down teaching as a whole; others involved in education, at any level, may share my sadness when I see value judgements made repeatedly in spite of insistent attempts to foster not simple judgement but critical engagement.

However, I mention all this as a preface to discussing Mr Tree, which, as mentioned, was produced by Jia and which shares with his films a similar set of concerns, because the issue of pace and boredom lies at the heart of what in different ways I have elsewhere defined as the war of/for our cinema screens and the political, perhaps even ethical, dilemmas facing filmmakers when making films about certain subjects.

Han Jie’s film is, like Jia’s films more generally, contemplative. Shu (whose name means ‘tree’ and who is played by Baoqiang Wang) is a drifter-type, who is a little bit crazy, a little bit weird.

He has a job as a car mechanic that he soon loses after inflicting upon himself an accident: he uses a blowtorch without the face mask and temporarily blinds himself, prompting his boss to let him go. He falls in love with a local deaf mute, Xiaomei (Zhuo Tan), and endeavours to woo her without much success – at least initially.

The local kids kind of ridicule Shu, although he seems well connected, hanging out with the local businessmen (who are trying to oust his mother from their family home for the sake of developing the land for business purposes; there is a coal mine in the area). He drinks, has the odd fight, wanders around his town, goes to the big city in Jilin, the northern province where he lives, spends a bit of time cleaning up the school that a friend from his hometown runs, and finally gets married to Xiaomei.

Except that on his wedding day, Shu is miserable. This is mainly because he has begun to see the ghost of his dead brother – a brother who apparently was hanged by his father from the tree in which Shu sometimes hides – and who is thoroughly ‘modern’ in his corduroys, hipster haircut, cool girlfriend and jacket.

Xiaomei makes love to him, but then leaves Shu, because he does not care for her. Shu then predicts accurately that the local mining industry – which has also already claimed the life of one of his other local friends – will cause the water in the area to stop flowing.

Something of a prophet, it would seem, Shu then seemingly becomes rich by advising the mining company how correctly to bring to an end the malpractice that thus far has characterised it.

However, Shu’s ostensible success is revealed latterly as a fantasy, as is his reunion with a pregnant Xiaomei. In other words, Shu becomes mad, not least because his life is marked by the death of his brother and his friend. His descent into incoherence, however, seems to reflect the insistent modernisation that the village/town is undergoing through the mining company and other forces: people are moved out of their homes, and the ‘traditional’ ways seem to be disappearing as people are offered TVs and other mod cons to accept the questionable business plans for the area.

Even though Shu seemingly goes mad, he is still a character who seems to be able to see. As mentioned, he is temporarily blinded at the start of the film, but there is a strong emphasis on vision and visuality in the film, more often than not associated with Shu. He may be a living anachronism, incapable or unwilling to go along with the times while his former friends get increasingly rich, but perhaps that is because he realises more than anyone else the confusion and chaos that is descending upon Jilin with the industrialisation of one of China’s most beautiful provinces (it is one of the ‘four major natural wonders of China’ – along with the Three Gorges Valley, the Rimmed Trees that also are in Jilin, and the Stone Forest of Yunnan).

Indeed, Shu seems to have these changes inscribed physically on his body: he moves in a twisted and awkward fashion (a great performance from Baoqiang), and often bears cuts, bruises and scars.

Furthermore, not only through his name, but also physically do we see Shu connected with nature: as mentioned, he hides up a tree, but he also walks and runs most places – and he certainly does not have easy access to the good cars that his local friends seem to have. Even Shu’s brother drives a taxi, suggesting that he is moving along with the times, rather than being left behind as Shu is.

As such, Mr Tree is an interesting film that implicitly critiques what can be interpreted as the modernisation of China, which in turn leads to the disappearance of traditional ways of life – embodied here by Shu.

The film’s eventual descent into fantasy makes of Mr Tree a film that is only questionably realistic (although this critique – bizarrely – never seems to be made against, say, Miracolo a Milano/Miracle in Milan (Vittorio de Sica, Italy, 1951), a core film of the influential Italian neorealist movement, from which both this and Jia’s films seem to take inspiration, and which itself has an entire fantasy ending featuring, as implied by the title, a miracle made only more bitter by the fact that it is fantastic and impossible).

Nonetheless, as per much sixth generation filmmaking (if the term still applies – how long can a generation last before becoming a new one?), the film is a politicised glimpse into contemporary Chinese life.

It is only fitting that the film adopts the ‘slow’ pace that it does, filming predominantly in long shot so as to ground Shu and the other characters in the space/place that they inhabit rather than having us view the film as simply a character portrait.

Again, this is not to overlook the complex roots of the film in Chinese culture – my reading might seem to ground the film uniquely in a genealogy of films and style of filmmaking – but it is to suggest that aesthetic strategies (how one shapes the look, feel, pace, and intensity of a film) are strongly tied to the political.

Hero this film neither is nor could be, interesting though Zhang’s most accessible work is in and of itself. For my part, then, I can only continue to reiterate, perhaps narcissistically (if I can never convince anyone who thinks otherwise), that judging films according to criteria of ‘good’ and ‘bad’ (with good being fast and exciting, bad being slow and ‘boring’) is pointless. It is better, rather, for us to think about what the film seems to be saying.

Not just to watch the film as entertainment, but to read, or to think about the film – perhaps even to find out about the cinematic, industrial, and cultural contexts – among others – in which the film was made.

This is what I try to do as a lecturer in film. Sometimes I feel very strongly about it; the attention economy has us in its grip, and we will overlook many important – nay, vital – things if we do not pay enough respect to that which surrounds us. Some films try to counter this by being deliberately slow. This is not bad; it is a strategy for trying to induce thought and thoughtfulness.

While I personally think that there are ‘problems’ with this ‘strategy’ (it is too teleological, it perhaps stratifies film into entertainment vs art house modes that rarely meet, and whose audiences rarely meet, and I am certain that one can think critically about Hollywood or any other mainstream style of cinema), it remains an important one.

If my choice of films and my teaching style run the risk of boring my students – a ‘problem’ that I might be called to account for when I have proactively to address the feedback that my students eventually will give me on my teaching methods and choices – then this is just an issue that my students and I will have to face together.

While I like fast films, too, I want to emphasise here how I am in praise of slow films – and why. I hope that this blog might help to convince someone – anyone – that slow films (all films?!) are important and not to be overlooked…

POM Wonderful Presents: The Greatest Movie Ever Sold (Morgan Spurlock, USA, 2011)

American cinema, Blogpost, Documentary, Film education, Film reviews, Neurocinematics

There is not necessarily that much to say about POM Wonderful Presents: The Greatest Movie Ever Sold, in that the premise of the film is pretty simple.

That is, Morgan Spurlock, he of Super Size Me (USA, 2004) fame, has made a film that exposes to what degree product placement – or what we might call just plain advertising – is a common practice in the film, television and new media industries.

We can hypothesise that not everyone already knows this – although I hope that such people do not exist (because they’d have to be what I might uncharitably term morons). And if not everyone already knows this, then bravo Morgan Spurlock for bringing it to our/their attention.

Beyond that, The Greatest Movie Ever Sold is not necessarily as brilliant as all that. And I’d perhaps even go so far as to say that it is disingenuous.

The Review Bit (in which – enviously? – I reproach Morgan Spurlock for thinking that a wink and a smile mitigate the trick he is playing on me)
The film is smart and ironic, sure – but its disingenuous nature comes through when Spurlock takes (seeming) swipes at bizarre North American corporate practices, such as the weird psychoanalytic branding exercise that he goes through early on in the film.

We see Morgan subjected to countless questions that seem to go on for hours – and after being grilled in this intense manner he is told – entirely anticlimactically – that he/his brand is a combination of intelligent and witty (I can’t remember the exact phrase – but it was cheesy).

My point is that if Morgan expects us (as at least I seem to think that he does) to laugh, somewhat bitterly, at how people can make money selling transparent clothes to the Emperor (psychoanalytic branding that tells anyone with a modicum of self-awareness what they probably know about themselves already), then why does he not expect us already to know precisely the other ‘insights’ that his documentary reveals – namely, that advertising is everywhere?

In this way, The Greatest Movie Ever Sold is not really about advertising, but about Morgan Spurlock – and his access to the beautiful classes (even if he has not in fact ‘made it’ in ‘real life’).

The film claims that he is not selling out but buying in. To be honest, I think that both of these terms pertain to the same logic of capital-as-justification-of-one’s-existence – a logic that Spurlock might not necessarily critique, but the critique of which surely appeals to a strong part of his No Logo-reading target audience.

Spurlock might aim for ‘transparency’ – but this in itself is problematic. As pointed out to me in the past by an astute former colleague, if something is transparent, it is invisible. While Spurlock might make apparent something that advertisers themselves have for a long time been wanting to make as apparent as possible – namely their brand – Spurlock also seeks to make transparent – i.e. invisible – his very conformity with the practices that his film might otherwise seek to critique.

Irony and humour abound in the film, as Spurlock seeks to make a doc-buster that is corporate-sponsored in its entirety while being about the prevalence of corporate sponsorship. There seems no room in this world for gifts or sacrifices, or any of those things that might otherwise suggest a spirit and sense of community beyond the quest for material profit. And for all of Spurlock’s success (and his failures) in getting money from the brand dynasties, the film does seem to lack, how do I put it?, soul.

The Simpsons Movie (David Silverman, USA, 2007) opens with Homer shouting from an onscreen audience that the Simpsons Movie within the Simpsons Movie that he is watching is no better than the TV show, and a rip-off. Similarly, Wayne’s World (Penelope Spheeris, USA, 1992) has a protracted sketch in which Wayne (Mike Myers) explains how he will not sell out to corporate sponsorship while simultaneously advertising a host of products, from pizzas to trainers.

In other words, Hollywood has been pretty up-front about the fact that it has been peddling advertisements to us/short-changing us in the form of films for a long time. Hell – and here I am shifting slightly into the realm of the online viral – some ‘advertainments’, such as Zach Galifianakis’ wonderful Vodka Movie, are pretty good.

In this way, Spurlock does not take his film to the level of, say, the Yes Men in their critique of contemporary corporate practices. In their far-too-little-seen The Yes Men Fix The World (Andy Bichlbaum, Mike Bonanno and Kurt Engfehr, France/UK/USA, 2009), there is a scene in which the titular Yes Men try to convince a gathering of corporate bigwigs that they could make a shitload of money by, literally, repackaging shit back to consumers (that’s vaguely how I remember it, anyway – perhaps someone can correct me if I’m wrong). Hollywood also does this – but given that shit stinks and causes disease if not carefully disposed of, sometimes it’s good to rub it back in the noses of those who deposit it, as per the Yes Men. In comparison, Spurlock just seems to enjoy wading through shit to get to the silver screen a little too much.

Anyway, now to…

The Real Blog – not about but inspired by the so-branded Greatest Movie Ever Sold
At one point in Spurlock’s film, he talks to Martin Lindstrom, author of Buyology, a book about marketing and its effects on the brain.

Lindstrom shows Spurlock images of his brain, taken while Spurlock watched a Coke commercial.

Lindstrom explains that at a certain moment in the commercial (it is not made particularly clear which moment, since Spurlock – like many neuro-whatever evangelists – tries to blind us with ‘science’ rather than give a precise explanation), Spurlock’s brain releases dopamine, which suggests an addiction of sorts – inspired by the commercial. That is, or so the film seems to suggest, merely thinking about Coke – which would explain his avowed desire for a Coke at the time of watching the advert – produced in Spurlock the same effects of a ‘Coke high’ as actually drinking a Coke.

What is not clear from this is whether Spurlock’s ‘addiction’ is to Coke, or at the very least to its effects, or rather to images that can spur desire through their very presence for that which they depict.

I critique the lack of clarity offered by Spurlock – a critique that I extend to neuro-evangelists more generally – not because, Raymond Tallis-style, I wish to dismiss ‘neuromania.’ Indeed, I personally think that neuroscience has enormous amounts of insight to offer us.

But I am not sure that the right questions are being asked of neuroscience at present in order for us fully to understand the implications of its results.

I have written a few papers, published and forthcoming, on what neuroscience might mean for film studies, particularly in the realm of images attracting our attention through fast cutting rates, through the exaggerated use of colour, and through various acting techniques (associated predominantly with Stanislavski’s ‘system’ and Strasberg’s ‘method’ – which are of course different things, but I group them together because the latter is an offshoot of the former). And it is this area of studying film that I wish to pursue further – and on a level of seriousness far greater than that more playfully adopted for a previous posting on sleeping in the cinema.

This will sound quite outlandish – particularly to academic readers – because it is a crazy, Burroughs-esque proposal. But I think that a neuroscientific approach to cinema will help bring us closer to answering one question, which I formulate thus: can there be such a thing as image addiction?

Why is this an important question to ask/answer – and what does neuroscience have to do with it?

It is an important question – at least in my eyes – for the following reason: there is a long line in film studies history of people who argue for and against (predominantly against) the possibility that humans can or do mistake cinematic images for reality. This question, however, is all wrong – even if it is rather more interesting than its easy dismissal would suggest.

Far more important is the following: it is not that humans mistake films for reality (or if they do, this is not as significant as what follows), it is that humans commonly mistake reality for cinema.

What do I mean by this? I mean every time we feel disappointed that we are not in a film. I mean humanity’s obsession with watching moving images on screens at every possible opportunity, such that life – and even ‘slow’ films – become boring and intolerable to people who must have their fix of bright colour and fast action. I mean the widespread aspiration to be on film, or at the very least to become an image (what I like to refer to as ‘becoming light’) on a screen (the final abandonment of the body and the ability to be – as an image – in all places at once [travelling at ‘light speed’]). I mean our inability to look interlocutors in the eye because we are too transfixed by the TV screen glowing in the corner of the pub. I mean – and I know this sensation intensely – the sense of immersion and loss of self that I feel when I watch films.

This is what I call image addiction.

But why neuroscience?

Because neuroscience might be able to help show to what extent – be it through conspiracy or otherwise – moving images and their accompanying sounds literally wire our brains in a certain fashion, such that we do all (come closer to) thinking in exactly the same way, repeating the same bullshit mantras to each other, dreaming only minor variations of the same things, etc.

Don’t get me wrong. If we adopted a psychoanalytic – instead of a neuroscientific – discourse, we might realise that the literal wiring of our brains is heavily influenced by – and perhaps even relies, or historically has relied, upon – ‘fantasy’ of kinds other than the cinematic, which we might more broadly label the ‘culture’ in which we live.

But a neuroscientific demonstration of how this is so (if, indeed, it is so – this is only my hunch at the moment) might then open up debate on every philosophical level: ontologically, to what extent is reality determined by fiction? Ethically, how many images, of what kind, and using what styles, can or should we see if we want to retain some sense of a mythological self that – impossibly in my eyes – is ‘untouched’ by the world (be that by cinema in the world or the world itself that contains cinema) and belongs ‘purely’ to us? Indeed, this might open up debate not only about which ontology and which ethics, but regarding the entire issue of both ontology and ethics – and how historically they have been framed…

For those interested in what academic researchers do, I am trying at present to create a network of scholars interested in ‘neurocinematics’ (which is not to deny that various scholars are already working on these issues in their own ways). I am sceptical that I will be successful in attracting funding, not because the idea is not ‘sexy’ but because I am not sure, at present, whether I know enough neuroscientists to work with, and I also figure I might be too much of a no-mark academic to land a plum grant from a funding institution that has never heard of me. But I shall try nonetheless.

In conclusion, then, Lindstrom’s comment to Spurlock is unclear, but it raises the issue that I think is at the heart of where I want my academic research to go (even if I want to retain strong interests in other academic areas, predominantly film studies, and even if I might ditch all of this to write and/or direct films if anyone ever gave me the money to do so beyond my breaking my bank account every time I put light to lens – my filmmaking being my own desire to become light): is Spurlock addicted to what is in the image, or to the image itself? Can we separate them? Does looking at a Coke can yield the same effect as looking at the stylised Coke can in the image – or is it the properties not strictly of the can, but of the can in the image, that trigger the response of which Lindstrom speaks…

I suspect that both the advertising and movie industries are funding this kind of research as we speak. If the corporate giants can go straight to your brain, they will do. Inception (Christopher Nolan, USA/UK, 2010), then, becomes no lie (not that inception/influence of some sort has not always been in existence – as I argue here). In some sense, then, such research is morally indispensable; what I mean by this is that if corporate giants discover and protect methods of accessing the brain directly, then it is up to academics to let humans know how this happens, to make them aware of ‘inception,’ to bring people back to that most unfashionable of approaches to studying film, ideological critique.

In some senses, then, this is simply the rehashing, under new paradigms, of the same old questions that have been banging around since cinema’s, ahem, inception, and even before. Only the stakes are now higher.

Might I say that a full neurocinematic programme might simply prove according to some scientific paradigm what many of us have known all along?

Wait a second: isn’t this also what I accuse Morgan Spurlock of doing with The Greatest Movie Ever Sold – being a self-serving hypocrite for doing something that I myself seem to want to do?

Maybe Spurlock’s film is better than I thought, then. Maybe it is important and ingenious, because of its invisible transparency, not in spite of it.

There probably is no academic study that is devoid of corporate sponsorship somewhere along the line these days. There certainly will be even less if the politicians do not open their ears at some point and listen to what people are beginning literally to scream at them with regard to higher education and other issues. That is, that it must be as free of outside interests as possible, even if the quest for true objectivity is impossible to achieve.

Indeed, if we are talking about the possibility of corporate – or even governmental – brain control (which is not the same as mind control, I hasten to add, though one could lead to the other), then we need to know whether it is possible, how it might happen, and what we can do about it. Before our bodies are all snatched away by the light (note: even now I cannot escape movie references) of the screen, and before we are all turned into dependent image junkies who need the images just to feel alive – the over-dependent equivalent of the good, small dose that a movie like Perfect Sense (David Mackenzie, UK/Denmark, 2011) seems to offer – as written about here.

Sleeping in the Cinema

Blogpost, Film education, Uncategorized

The below is a rough draft of a paper I was going to present at an academic conference in London this summer, but from which I have withdrawn because I can’t really afford it.

It is relatively ‘whimsical’ and not ‘hard science,’ though it flirts with some science in it.

But I offer it as ‘notes’ about what for me is a prominent aspect of the film-viewing experience: falling asleep.

“You are 8 ½. What an age for a boy to ask about cinema and dream! It occurs to me that that same evening, Dadda was telling me that his falling asleep in the cinema is a particular honour to the film in question. He was telling me this as a compliment, his having snored through three of the four films released last year in which I appeared.”

– Tilda Swinton (2006: 111)

In an age when film studies wishes to map almost every aspect of the film experience – from ideological influence to affective response, from audience feedback to galvanic skin responses, sleeping in the cinema remains an overlooked aspect of spectatorship.

And yet, what does it mean to sleep in the cinema? Is it simply an index of a film’s failure to capture the attention of the viewer, such that they prefer instead to doze off in pursuit of more interesting thoughts? Or might sleeping in the cinema be something more akin to what Tilda Swinton playfully suggests is her father’s experience of the majority of her films – that is, an honour and a compliment to the film in question?

Theories of why humans sleep vary, although given that not all animal species do sleep, the prevailing logic would suggest that sleep does serve some function that benefits us, and which outweighs the dangers that are associated with sleeping, namely that one is not particularly aware of the potential dangers that could be lurking in one’s vicinity while in that particular state.

What is more, we spend roughly one third of our existence asleep, which reinforces the notion that it must serve some evolutionarily beneficial purpose.

The most common and seemingly plausible theory of sleep is that humans do it for the sake of information storage.

Various studies have shown that sleep enhances synaptic efficacy ‘through oscillatory neural activity providing “dynamic stabilisation” for neural circuits storing inherited information and information acquired through experience… Sleep, therefore, serves the maintenance of inherited and acquired memories as well as the process of storage of new memory traces’ (Krueger et al 1999: 121).

In other words, sleep fulfils some of the same functions that waking life achieves, namely our adaptation to the environment: ‘the major function of sleep is to maintain our ability to adapt to a continually changing environment since that ability is dependent on brain microcircuitry’ (Krueger et al 1999: 126).

By keeping our brains fluid and malleable, sleep enables us better to consolidate memories, which in turn enable us better to navigate our waking world.

It is perhaps useful at this point to explain that there are two separate modes of sleep, which some view as supporting ‘quantitatively different states of consciousness’ (Hobson and Pace-Schott 2002: 685), namely rapid eye movement (REM) and non-rapid eye movement (NREM) sleep.

Dreams only take place in REM sleep, which is deemed to ‘release hallucinosis at the expense of thought,’ perhaps because ‘the activated forebrain is aminergically demodulated compared with waking and NREM sleep’ (Hobson and Pace-Schott 2002: 686).

What this latter phrase means is that the neurons used to regulate (or modulate) the size and intensity of certain brain waves (e.g. ponto-geniculo-occipital, or PGO waves) during NREM and waking life do not fire (the waves are ‘demodulated’).

As a result of this, and the hyperactivation and deactivation of other brain regions, REM sleep, or dream, is characterised by ‘the lack of self-reflective awareness, the inability to control dream action voluntarily, and the impoverishment of analytical thought’ (Hobson and Pace-Schott 2002: 686).

Having differentiated between REM and NREM sleep, though, it is important to remember that both seem to serve a similar function: NREM sleep ‘could allow recent inputs to be reiterated in a manner that promotes plasticity processes that are associated with memory consolidation,’ while during REM sleep ‘the brain is reactivated but the microchemistry and regional activation patterns are markedly different from those of waking and NREM sleep.’

As a result, Hobson and Pace-Schott conclude that ‘[c]ortically consolidated memories, originally stored during NREM by iterative processes such as corticopetal information outflow from the hippocampus, would thus be integrated with other stored memories during REM’ (Hobson and Pace-Schott 2002: 691).

If both REM and NREM sleep help to consolidate memory, albeit in different ways, then the distinction is not necessarily a useful one to draw with regard to sleeping in the cinema, not least because it is hard to determine, even via introspection based upon personal experience, what kind of sleep goes on in the cinema – if there is a constant type of sleep that happens in the cinema at all.

What is more, the case has been made that there is in fact slippage between REM sleep, NREM sleep and waking life. This is not just on account of the fact that we can ‘hallucinate’ during waking life, or have ‘day dreams,’ such that ‘all conscious states – including waking – might have some quantifiable aspects of dream-like mental activity’ (Hobson and Pace-Schott 2002: 684).

Instead, this is based upon the fact that parts of the brain are always ‘asleep’ while other parts are more activated, meaning that even our waking life is characterised in part by areas of our brain sleeping.

In spite of this blurred boundary between waking, REM, and NREM sleep, however, the distinction might be useful for us in thinking about more ‘ecological’ causes of sleep.

The cinema is a darkened room; although light shines from a projector and is reflected from a screen, the room is predominantly dark.

Arguably (see Brown 2011), the light from the screen, particularly in the case of rapid changes of intensity and colour (i.e. lots of onscreen movement in the form of figural motion, camera motion, and cutting) in conjunction with loud noise, is enough to activate our attention in a quasi-involuntary manner.

However, the darkness of the room might also be important, since sunlight inhibits the production of melatonin. Melatonin is a compound that synchronises the biological clock; that is, once darkness falls, melatonin is released by the pineal gland, and this is useful as an anti-oxidant and for the immune system.

The darkness of the cinema, then, may also bring about a release of melatonin, which in turn prepares us for sleep.

I might add that melatonin is produced from serotonin in the human body. Both melatonin and serotonin have been considered to play a role in human sleep, and both have also been used in the manufacture of recreational drugs for their hallucinatory qualities.

Now, cinema has been equated with the dream state since at least the 1950s (for example, Langer, 1953), but rather than the typically psychoanalytic slant given to the relationship between dream and film, I should like here to pursue a different, admittedly (equally?) speculative line.

Both serotonin and melatonin are neurotransmitters; that is, they help to transmit signals across neurons. When, as mentioned earlier, aminergic demodulation takes place, serotonin and melatonin allow the level of hallucinosis to rise.

Serotonin in particular is linked to feelings of euphoria (MDMA, or ecstasy, works in large part by triggering its release).

It is interesting that, as a transmitter – a guardian at the neuronal gateways – serotonin sits between actual signals, yet it modifies which signals travel through our brain – which connections are made.

Or rather, serotonin, from my understanding of it, enables brain plasticity; that is, it enables more, not fewer, connections to be made, and is comparatively inhibited, if still at work, during waking and NREM sleep as opposed to REM sleep.

As such, serotonin and melatonin (but the latter seemingly to a lesser extent) are a means of regulating not what we envision, but how we envision it; for creating and cementing new connections in the brain.

On a purely speculative level, in an era in which scans of the human brain are being carried out during film viewing (see Hasson et al 2004; 2008; Kauppi et al 2010), it would be interesting to see if there are any similarities between brain function during REM sleep and film viewing – that is, whether the human brain treats film viewing in general, or certain types of film viewing, as a form of hallucinosis.

There is something intriguing about serotonin as a neurotransmitter, sitting as it does between signals or brain events: as the interval between brain signals, a neurotransmitter might be considered more temporal than spatial, since neurons themselves have extension, while a neurotransmitter is what decides whether an action potential becomes an actual action.

As such, the neurotransmitter sits at the threshold between the conscious and the unconscious, between potential and action, between perception and hallucination, and between space (extension) and time (intensity).

To maximise serotonin levels, both in REM sleep and in hallucinosis (and maybe even in cinema?), is to foreground the temporal and the intense rather than the spatial elements and extensive/motor processes of the brain.

Approaching the issue of sleeping in the cinema from the introspective point of view – that is, basing thoughts upon personal experience – I can note that during the period from 1 September 2007 to 1 September 2008, I went to the cinema roughly 150 times.

I fell asleep during roughly one third of the films that I saw at the cinema, which is of course a high tally.

I cannot say for certain, but based upon running times and/or conversations with others in attendance, my sleep typically lasted between 2 and 20 minutes.

There are several factors that contribute to my sleeping: typically I do not sleep for particularly great lengths of time at night (six hours on average), and as a result I often find myself tired during the day.

Post-prandial cinema visits, particularly in the early afternoon, would most often induce sleep, as to a lesser extent might early to mid-afternoon screenings before which I had not eaten.

In addition, early to mid-afternoon screenings tend to involve fewer patrons; the presence of other patrons in the theatre, in terms of atmospheric noise, temperature and perhaps also in terms of ‘emotional contagion,’ can affect the way in which we view a film – and the absence of others might also increase the likelihood of sleep.

What is more, my slouching posture and the comfort of the chairs, in addition to the potential effect that the melatonin-inducing darkness of the cinema hall can have on viewers, may also contribute to my nodding off.

Although these factors are important to take into account, I know from experience that I also fell (and typically fall) asleep far more often during what we might term ‘art house’ films than I did (or do) during what we might term ‘blockbusters’ – if drawing such a crude dichotomy between film types be allowed for the sake of argument.

Rarely did I fall asleep for lack of enjoyment; if I may speak personally, very few are the films I do not like, and I like art house films most of all – at least in proportion to the number that I see, compared to the relatively few blockbusters that I actually enjoy relative to the number of those that I see.

Now, art house films tend to have smaller audiences than blockbusters, and given my desire to see them during the cheaper early to mid-afternoon screening slots, the small audience size may well have an even greater effect on my likelihood of sleeping than watching a blockbuster during the day.

That is to say, I suspect that each of these factors plays a role in my falling asleep in the cinema.

However, the most common factor seemed to be the art house nature of the films; that is, regardless of my (often high) level of enjoyment, the relatively slow nature of these films, in terms of movement onscreen, camera motion and in terms of cutting rate, helped to bring about sleep.

This stands to reason on a certain level: if we are aroused by fast action and the loud explosions of blockbusters, it is more likely that we will feel drowsy and/or fall asleep when no danger is clear or present.

However, I should like to offer a different reason, not necessarily in contradiction of this prior reason, but certainly alongside it.

The aforementioned work on what happens in the human brain during film viewing, or ‘neurocinematics,’ suggests that audiences in fact respond very similarly to mainstream films like Vertigo (Alfred Hitchcock, USA, 1958), while there seems to be a much greater level of independent brain response during less action-packed films, the least amount of what Hasson and his team (2008) call ‘inter-subject correlation’ (ISC) taking place when viewers see a video of a tree in a park in which ‘nothing,’ so to speak, happens.
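
Since ISC figures will come up again below, it may help to gloss how such a number is computed. What follows is a minimal, hypothetical sketch in Python of the leave-one-out logic – an illustration of the general idea only, not Hasson et al’s actual voxel-wise fMRI pipeline, and all of the names and data in it are made up: correlate each viewer’s response time course for a brain region with the average time course of all the other viewers, then average those correlations.

```python
# Minimal, illustrative sketch of leave-one-out intersubject correlation (ISC).
# Not Hasson et al's actual pipeline; all names and data here are hypothetical.
import numpy as np

def isc(time_courses: np.ndarray) -> float:
    """time_courses: shape (n_subjects, n_timepoints) for one brain region."""
    n = time_courses.shape[0]
    rs = []
    for i in range(n):
        # Average the time courses of every subject except subject i...
        others = np.delete(time_courses, i, axis=0).mean(axis=0)
        # ...and correlate subject i's time course against that average.
        rs.append(np.corrcoef(time_courses[i], others)[0, 1])
    return float(np.mean(rs))

rng = np.random.default_rng(0)
stimulus = rng.standard_normal(200)                       # shared 'film-driven' signal
mainstream = stimulus + 0.5 * rng.standard_normal((5, 200))
tree_in_park = rng.standard_normal((5, 200))              # idiosyncratic responses
print(isc(mainstream))    # high: viewers' brains 'tick together'
print(isc(tree_in_park))  # near zero: each viewer wanders independently
```

On this toy model, the more a film drives every viewer’s response with the same shared signal, the higher the ISC – which is all that the claim that mainstream films ‘synchronise’ brains amounts to in quantitative terms.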

According to Hasson et al (2004: 452), the brain regions that are most commonly correlated intersubjectively during mainstream film viewing are the parahippocampal gyrus, the superior temporal gyrus, the anterior temporal poles, and the temporal-parietal junction.

The parahippocampal cortex has been identified as playing a role in REM sleep, in that it allows the sense of movement, emotion and affective salience to emerge (Hobson and Pace-Schott 2002: 687).

Furthermore, in both film viewing and REM sleep, the fusiform gyrus, which is useful for face recognition, has been found to be active.

While circumstantial at best, it might be possible to suggest from this evidence that mainstream film viewing does function cerebrally in a manner akin to dream, especially because the movement associated with the parahippocampal gyrus is illusory in both cases.

Krueger et al suggest, contrary to much sleep research, that sleep is not dependent upon prolonged wakefulness, but rather upon synaptic use.

That is, ‘exposure to rich environments’ can increase the amount of REM sleep that we have (Krueger et al 1999: 124).

It is not entirely clear what they mean by a rich environment.

Since I do not typically fall asleep in mainstream films, we might conclude that these are not such ‘rich environments,’ for example. In spite of the rapid movement and motion in mainstream films, we could argue that such films do much work for audiences and that they do not force the brain to work harder to comprehend what is going on. In fact, the ease with which we can understand mainstream films would in this case suggest that they are ‘simplified’ (and not ‘rich’) versions of reality (even if such ‘simple’ scenes also stimulate our attention through continued visual and auditory renewal/stimulation).

However, the greater levels of direct stimulation involved in mainstream cinema would suggest some ‘richness’ – and that it is the relatively ‘unrich’ environment of the cinema during art house fare that encourages us, or at any rate me, to sleep.

My point is not here to resolve this conundrum, even if all types of cinema might be said to constitute some sort of hallucinosis, in that we see objects that are only images, but which we might on a certain level take for real, as we do a dream during our experience thereof.

Rather, my point is to say that while cinema might be akin to a shared dream, in that it can induce similar thought patterns across multiple viewers, in regions of the brain similar to those that fire during sleep, it is the cinema that does not involve such synchronisation of viewers’ brain patterns that is more likely – if I am anything to go by – to induce sleep itself.

Since humans are collectively involved in a world that is always affecting us, it is hard to say where its influence ends, if it ends at all. If we were to cling to the notion of a subjective self, however, one that wanted to think for itself, then we might conceivably argue that sleep is the time when – paradoxically, given that we have little motor control over dreams and do not remember NREM sleep – our thoughts are most ‘our own.’

That is, we function (more) ‘offline’ when asleep than during waking, during which time our thoughts constantly are being shaped by the world around us.

If Hasson et al’s research is anything to go by, then mainstream film might well bring humans together (we ‘correlate’), but it increases the probability of humans all thinking in the same way, perhaps by virtue of the simplified version of reality that mainstream cinema has to offer.

This is not to say that mainstream cinema will make automatons of us all (unless it already has).

It is to say, however, that films that do not impose images upon us, but which allow us actively to explore the image in our own way (less intersubjective correlation, more ‘art house’) might naturally induce in us that state of maximised (if never absolute) ‘independent’ thought, which is sleep.

The fact that we must search through these environments, rather than have information delivered to us in an obvious if stimulating manner, might even make them ‘rich environments’ that naturally tire us, because we must process individual (new?) thought patterns and associations that we have created for ourselves rather than had imposed upon us.

In this sense, perhaps sleeping during a film is not only an honour for and a compliment to the film in question; it is a gift from the film to encourage independent thought – an act of love, in certain senses.

Anthropologically speaking, humans do not always sleep just anywhere and with anyone (even though inebriation, among other altered states of consciousness, might lead us to believe that we do).

And yet, humans do feel the need to sleep with someone, even if this so-called ‘need’ is cultural.

Indeed, the act of sleeping with another human, which often is synonymous with the act of love, is perhaps the most intimate relationship that two humans can have.

In Being Singular Plural (2000), Jean-Luc Nancy argues that humans must recognise the fundamental ‘withness’ of their existence.

That is, humans do not lead lives in which they can objectively observe each other, detached in their observations; instead we are always at all points with each other, leading a relative existence, in the sense that we are only ever coexisting, and that, indeed, there is no existence without coexistence and communication.

Nancy writes:

“‘to speak with’ is not so much speaking to oneself or to one another, nor is it ‘saying’ (declaring, naming), nor is it proffering (bringing forth meaning or bringing meaning to light). Rather, ‘to speak with’ is the conversation (and sustaining) and conatus of a being-exposed, which exposes only the secret of its own exposition. Saying ‘to speak with’ is like saying ‘to sleep with,’ ‘to go out with’ (co-ire), or ‘to live with’: it is a (eu)phemism for (not) saying nothing less than what ‘wanting to say’ means [le ‘vouloir-dire’ veut dire] in many different ways; that is to say, it says Being itself as communication and thinking: the co-agitatio of Being.” (Nancy 2000: 92-93)

Picking apart this passage, Nancy offers communication as a means of exposing oneself, of opening oneself up to the other (and elsewhere, Nancy [2008] has written about how exposure is part of the cinematic experience, as we are ex-peau-sed to the skin (pellicule) of the film).

To open oneself up in this way is like sleeping with or going with: co-itus/coitus as part of this communication.

Paradoxically, it takes sleeping with someone else, that experience in which we are most ‘ourselves’ because ‘offline’ (even if never fully so), in order fully to ‘communicate’ or expose oneself to the other.

It is to accept and to be accepted by the other, a level of thought in which we are not the detached, thinking observer that Descartes proposes as the mind split from the body, and which finds expression in his cogito ergo sum.

In an age in which neuroscience has tried to overthrow the sway that Descartes’ most famous phrase has held over us (see, for example, Damasio 1994) – because for neuroscience there is no detached thought or mind-body dualism, since we are only ever embodied, in that our ‘higher’ conscious processes stem from and cannot live without our so-called ‘lower’ viscera and emotions – it would seem that we must abandon the mind-body dualism.

However, this does not necessarily mean that we must abandon the cogito entirely.

Descartes first proposes je pense, donc je suis as one of only three things about which he can have no doubt in Discourse on Method (Descartes 1998 [1637]: 53).

He refines this phrase in Principles of Philosophy (2009 [1644]: 17), where he argues that we might well imagine that there is no God and that we have no body, but that we cannot doubt our minds, because thinking determines that we must have a mind.

Descartes goes on to define thought, or cogitatio:

“By the word thought, I understand all that which so takes place in us that we of ourselves are immediately conscious of it: and, accordingly, not only to understand (INTELLIGERE, ENTENDRE), to will (VELLE), to imagine (IMAGINARI), but even to perceive (SENTIRE, SENTIR), are here the same as to think (COGITARE, PENSER). For if I say I see, or, I walk, therefore I am; and if I understand by vision or walking the act of my eyes or of my limbs, which is the work of the body, the conclusion is not absolutely certain, because, as is often the case in dreams, I may think that I see or walk, although I do not open my eyes or move from my place, and even, perhaps, although I have no body: but if I mean the sensation itself, or consciousness of seeing or walking, the knowledge is manifestly certain, because it is then referred to the mind, which alone perceives or is conscious that it sees or walks.” (Descartes 2009: 18)

If thought and the mind are precisely embodied, Descartes’ definition of cogitatio would seem misguided.

However, if, as Nancy explains to us, we remember that cogitatio is derived from co-agitatio, which etymologically speaking means to act, move, or do with, then even cogitation is always already a phenomenon done with others (and, after Damasio, with one’s body).

With regard to cinema, we might remember that there is a paradox here: to sleep – to do with another the thing that arguably requires the least ‘withness’ – is in fact perhaps the most intimate exposure, the greatest exposure, that one can make of one’s self.

This paradox is logical, since if we are only with others, then one’s most ‘detached’ self is precisely that which is least ‘with’ others – i.e. self-hood is only defined with others, and so that which is most un-other-like about us, our sleeping self, is paradoxically that which is most unique to us; we are most ourselves when we are least ourselves.

Furthermore, this paradox is mirrored by the fact that to cogitate, which Descartes upholds as the highest indicator of the mind’s separation from the body, is in fact only ever a thinking with, both with our bodies and with others.

The cogito is in fact a co-agito.

Sleeping in the cinema, in which we are ‘most ourselves,’ becomes in this way a communion with the film.

Many humans sleep alone, within spaces that are familiar to them. Perhaps it is as much with the space of the cinema as with any particular film that we feel so intimate and safe that we can allow ourselves sleep.

That I do not sleep during blockbusters leads me to believe that I probably do not trust blockbusters; their fast movement may be arousing in the attention-grabbing sense, but it also puts me on edge, making me alert and worried that something is about to happen.

The art house film, meanwhile, is a friend, or a lover, with whom I feel safe, and in a space that feels safe to me.

Since it exposes to me those things that are more intimate and meaningful than does the blockbuster, then I expose to it that which is most private in my life, my sleeping self.

We go together in a strange coitus, a co-agitatio akin to that of cogitation in the real world (and which perhaps we might differentiate from the egocentric survival instincts that the explosions of the action film seem to encourage).

I feel safe in the cinema perhaps because of familiarity, making it not a ‘rich environment’; but while a blockbuster may grab my attention, it does not necessarily entertain me.

Art house films are the richest environment in which, or better with which, I think (I co-agitate) the most; blockbusters are not wholly ‘brainless,’ not least because the mind and the brain are embodied, and we can and often do have very visceral responses to blockbusters, which in turn can induce new, richer thoughts.

But the phrase ‘brainless’ is not unuseful in getting to the root of our relationship with blockbusters, in differentiating these simplified versions of reality from the complexity of art house films.

I love cinema, but if my willingness to sleep with art house film is anything to go by, I feel happiest with it.

I am promiscuous in my cinematic tastes, responding to and interested in many of the different experiences that cinema can offer; but I am happiest with the slow, thoughtful films that sometimes even allow me to think ‘offline’ for a while – to sleep, perchance to dream.

Cinema has long been associated with dream, and yet sleeping in the cinema is typically thought of as a negative experience, a sign of boredom.

Cinephiles, together with cognitive studies of cinema, seem predominantly interested in visual and aesthetic pleasure, and in attention and arousal.

And yet cinema can indeed send audiences to sleep.

Contrary to the ‘boring’ and ‘slow’ film argument, this can in fact be the most intimate relationship one can have with a film, even if paradoxically it means not even ‘seeing’ or ‘hearing’ the film (though we can still sense its presence).

To sleep with a film is a sign of cinephilia.

References
Brown, William (2011), ‘Resisting the Psycho-Logic of Intensified Continuity,’ Projections: The Journal for Movies and Mind, 5:1, pp. 69-86.

Damasio, Antonio (1994), Descartes’ Error: Emotion, Reason, and the Human Brain, London: Vintage.

Descartes, René (1998 [1637]), Discourse on Method and The Meditations (trans. F.E. Sutcliffe), London: Penguin.

Descartes, René (2009 [1644]), Principles of Philosophy (trans. John Veitch), Whitefish, Mont.: Wilder Publications.

Hasson, Uri, Yuval Nir, Ifat Levy, Galit Fuhrmann, and Rafael Malach (2004), ‘Intersubject synchronization of cortical activity during natural vision,’ Science, 303: 5664, pp. 1634–1640.

Hasson, Uri, Ohad Landesman, Barbara Knappmayer, Ignacio Vallines, Nava Rubin and David J. Heeger (2008), ‘Neurocinematics: The Neuroscience of Film,’ Projections: The Journal for Movies and Mind, 2:1, pp. 1-26.

Hobson, J. Allan, and Edward F. Pace-Schott (2002), ‘The Cognitive Neuroscience of Sleep: Neuronal Systems, Consciousness and Learning,’ Nature Reviews Neuroscience, 3 (September), pp. 679-693.

Kauppi, Jukka-Pekka, Iiro P. Jääskeläinen, Mikko Sams and Jussi Tohka (2010), ‘Inter-subject correlation of brain hemodynamic responses during watching a movie: localisation in space and frequency,’ Frontiers in Neuroinformatics, 4:5, pp. 1-10.

Krueger, James M., Ferenc Obál Jr, and Jidong Fang (1999), ‘Why we sleep: a theoretical view of sleep function,’ Sleep Medicine Reviews, 3:2, pp. 119-129.

Nancy, Jean-Luc (2000), Being Singular Plural, trans. R.D. Richardson and A.E. O’Byrne, Stanford: Stanford University Press.

Nancy, Jean-Luc (2008), ‘Claire Denis: Icon of Ferocity’ (trans. Peter Enright), in James Phillips (ed.), Cinematic Thinking: Philosophical Approaches to the New Cinema, Stanford: Stanford University Press, pp. 160-170.

Swinton, Tilda (2006), ‘Film: State of Cinema Address, 49th San Francisco International Film Festival, 29 April 2006,’ Critical Quarterly, 48:3 (Autumn), pp. 110-120.

No dark sarcasm (three)

Blogpost, Film education, Uncategorized

The previous two blogs, then, serve as a prelude to this final blog, which of course must look at the New College of the Humanities and the emergent surrounding discourse.

If I have argued in the previous two blogs that we should not abolish Oxbridge at all, but simply recognise that Oxbridge-centrism leads to misconceptions about higher education in the UK, then I am naturally dubious about an institution that is being mediated as creating ‘a rival to Oxbridge.’

In these days of free enterprise, anyone should be allowed to make money in the way that they see fit, provided, of course, that it is legal (and even then a lot of people still make money – and lots of it – illegally).

Many, if not all, UK universities are currently trying to come up with ways to make more money in the face of government cuts (although they might well be doing this even if there were not government cuts).

These schemes include the development of short courses, diplomas, and various other educational packages in various/many academic disciplines, which typically might create a market out of local potential part-time students, who will pay for a diploma and thus help to fund the university.

It is hard to be enthusiastic about creating more teaching for oneself in an age when one might already feel that one does quite enough already, thank you.

However, the reason that I mention this is that, in their own way, universities are not necessarily charging their students more to keep afloat (since undergraduate fees have been capped at £9,000 for domestic students), but they are coming up with ways of getting money from other, temporary students in order to create some economic stability.

And of course out of a sense of pedagogical altruism.

If this is what is happening widely already in higher education institutions, then, particularly in these market-driven times, it is perhaps only a small leap for higher education institutions to privatise themselves. That is, to free themselves from the burden of having to run smaller money-making schemes constantly, by instead charging students enough money that the institution can forgo its government subsidy entirely and just crack on on its own.

Charging £18,000 a year, this is precisely what the NCHum is doing. Sort of.

The NCHum is an interesting concept. Regardless of the ‘celebrity’ academics that will teach there, presenting obligatory modules on applied ethics, scientific literacy and logic and critical thinking seems a sound idea. Having one-with-one tutorials also, as mentioned in my first blog, has benefits for students. Preparing students for the world of work, another NCHum promise, is also fine. It is not as if other universities do not already try to do this.

There are some grey areas surrounding how the college will run, though, that do give pause for thought.

Firstly, the college will offer degrees from the University of London and it will provide students with access to University of London facilities, such as teaching spaces, libraries and so on.

Since the University of London has acquiesced to this, presumably they feel that this is sound business – a kind of outsourcing of various courses that otherwise it might have been unwieldy to put in place.

However, what this means with regard to which students can have access to which resources and at which times remains to be seen. Will the students at NCHum, as a result of their higher outlaying of money, get preferential treatment over other University of London students when the time inevitably comes that they are after the same resources?

Logistically speaking, I query the university’s claim that ‘all applicants’ will be interviewed. This will be a very time-consuming process (and one that Jones, in my first blog, feels discriminates against those who have not had preparation for the interview, even if I – and Canfor-Dumas and Glancy – feel that this might not necessarily be the case) – so time-consuming, in fact, that I wonder whether the claim is really true.

Furthermore, although this is a ‘soft’ point to make, it is unlikely that many of the ‘celebrity’ staff members will be doing the nitty gritty of one-with-one tutorials week-in, week-out, meaning that what access one has to the ‘great minds’ teaching at NCHum will be limited at best.

Besides, just because NCHum places 14 high-profile academics in a single, small institution does not mean either that those academics are great teachers, or that other institutions do not have as many, if not more, prominent academics across their disciplines, at least in terms of raw figures.

Indeed, by the time British academics have reached the post of professor, they are all pretty eminent and will undoubtedly have vast swathes of knowledge to pass on, even if their work has not led to TV shows and interviews because it is not ‘sexy.’

Terry Eagleton has argued that the education at NCHum will not necessarily be as open-minded as all that:

“The new college, staffed as it is by such notable liberals, will of course be open to all viewpoints. Well, sort of. One takes it there will not be a theology department. It is reasonable to suppose that Tariq Ali will not be appointed professor of politics. The teaching of history, if the work of Dawkins and Grayling is anything to judge by, will be of a distinctly Whiggish kind. Grayling peddles a Just So version of English history, breathtaking in its crudity and complacency, in which freedom has been on the rise for centuries and has only recently run into trouble. Dawkins touts a simple-minded, off-the-peg version of Enlightenment in which people in the west have all been getting nicer and nicer, and would have ended up as civilised as an Oxford high table were it not for a nasty bunch of religious fundamentalists. Who would pay £18,000 a year to listen to this outdated Victorian rationalism when they could buy themselves a second-hand copy of John Stuart Mill?”

However, even though I reproduce this paragraph in full, it is not as if many universities do not have departments that are characterised by a loosely unified outlook on the world. Perhaps in an outmoded fashion, I am thinking of ‘Marxist’ campuses from the 1970s and 1980s – or even of today, when many universities might sooner spit on, for example, Milton Friedman than hear what he has to say were he to turn up for (let alone be invited to give) a guest lecture.

In the same way that I have criticised unthinking Oxbridge-centrism in the previous two posts, there seems to be a vague logic of NCHum-centrism underlying Grayling’s talk of the New College (although he does of course have a new college to try to sell).

A Telegraph article says the following:

“The college claims to offer a ‘new model of higher education for the humanities in the UK’ and will prepare undergraduates for degrees in Law, Economics and humanities subjects including History, Philosophy and English literature.”

Law, economics, history, philosophy and English literature. This is not a list that inspires thoughts of a ‘new model’ of education. It is not, for example, as if NCHum is offering an innovative course in the logic of the digital world – a course that might take in elements from sociology, media, geography, economics, politics, philosophy and history all at the same time (and which might make a good course).

In other words, Grayling seems to be bigging up something as innovative that many universities would call ‘business as usual.’ Indeed, in a witty article in the Standard, some former colleagues at UCL feel that Grayling is not only not offering original courses, but that he has even purloined modules from them.

Furthermore, with 14 celebrity staff members and a small handful of others named on the website, it strikes me that there may not be much choice in modules at NCHum. I’ll return to this below, but even if students get excellent one-with-one tuition on a weekly basis, it strikes me that – at least in these early stages and before the university has had a chance to grow – the model of education here is pretty much a top-down one, as opposed to the bottom-up quest of inspiring to learn that many other universities try to offer.

This may seem a frivolous, or indeed a poor, point to make: “get some discipline in them, that’s what these youngsters need.” But then again, this also suggests that the NCHum already/so far has something loosely approaching a one-size-fits-all ethos that does not necessarily tally at all with, in Grayling’s own words, what it means to be human today.

I note, by the way, that NCHum is not offering film. This is a pity, because with some film expertise, be it in criticism or production, they might well have been able to produce a video for their site that does not cut Grayling off mid-sentence at the end of his introduction to the college.

In the same Telegraph article, Grayling is quoted as saying: “Our ambition is to prepare gifted young people for high-level careers and rich and satisfying lives.”

The use of the word ‘rich’ is almost certainly intended in the sense of ‘diverse’ and ‘full’ – but it also betrays the central ethos behind the institution, which is my only real beef with it: its commercial-mindedness.

Grayling and his colleagues, being smart people, will no doubt be aware that we are living in times in which ecology plays a major part in our thinking. He will also be aware that while there is plenty of seeming evidence to suggest that we must pay urgent concern to our environment and become better citizens, this discourse is also the product of various processes, including the spread of ideas via the media. Grayling’s colleague, Richard Dawkins, calls them ‘memes.’

It is not that our planet is not in trouble; but Grayling’s appeal to ‘humanity’ bespeaks a sense of opportunism, in terms of how he wields the term, that is timely. It is a canny riding of the meme wave. How can I be so cynical, you might ask?

I am inclined towards this cynical interpretation of his words (that is, I don’t believe him), because this is also a university that is preparing ‘rich’ people, to use Grayling’s other term.

Rich people can be responsibly rich. Just because historically it has been the relentless pursuit of riches that has led to, among other things, colonialism, widespread global poverty, slavery, and war (although this is not how Grayling’s other colleague, Niall Ferguson, reads history), this does not mean that it must always be so. In fact, to give Ferguson his due, it is hard for us to know whether the world would be any more or less civilised without the ascent of money. But at the very least there is a tension between pursuing humanity and pursuing riches.

Maybe Grayling is indeed hoping to prepare a group of super-enlightened students whose ‘high level careers’ in fact help to bring about the redistribution of wealth and opportunities. But the hierarchy implicit in ‘high level careers’ does not bode well.

I criticised Owen Jones in part one of this blog for seeming to love Oxbridge, while at the same time hating it. His logic seemed to be that you have not proven yourself excellent if you have not been to Oxbridge, perhaps even that only Oxbridge people can be excellent.

And yet, this logic of exceptionalism is something to be guarded against, or wary about: maybe enlightened people are paradoxically exceptional by not succumbing to the logic of exceptionalism. They are perfectly, adequately, perhaps even exceptionally intelligent, but they simply do not want to go to Oxbridge; they perhaps do not want to be bankers or lawyers or management consultants; maybe they want to be park wardens, gardeners, electricians – who knows? Or maybe they simply do not want to give up on many of the things that they believe in (‘happiness’) in order to do well in the world (or, perhaps more accurately, to be seen to do well in the world).

Once one does well – or is seen to do well – it must be difficult not to believe that one is doing well. If other people can see that I am doing well – and tell me as much – then I must be doing well, or so the logic goes. It’s not that these people are not doing well, nor that they should necessarily believe that they are not doing well. But I say this to propose that I can understand why people end up believing in their own exceptional nature – be you AC Grayling or one of his students.

The twin forces of luck and fate – luck in that you were in the right place at the right time; fate in that you were the kind of person whose chances of being in the right place at the right time were maximised from birth – get quickly forgotten. It was all the exceptional person’s doing – or so it is to be believed. History is full of great men (sic.).

However, exceptionalism does not tally that easily with humanism – which, at least for me, implies a sense of democracy and the kind of concepts that the French put in their constitution. A ‘new Oxbridge’ based on the logic of exceptionalism takes us away from the logic of humanist togetherness…

The pledge to help students to understand humanity, then, seems to work only for certain humans. This much is affirmed by the price tag of the university, which, as mentioned, is £18,000 per year.

Indeed, while part of me would want to teach at the NCHum, and while part of me would also want to study there, it is the price tag that is the real kicker.

Not because I cannot afford it (although I cannot). But because this college privatises education.

Privatisation is the retreat from the public. More particularly, it is a retreat from the common – the common wealth, the common good. The private wealth, the private good – well, we have had logicians and economists who have argued that these things are in fact the path to the common wealth and the common good.

And it is people (like Milton Friedman) whose privatisation-based policies have led to the widening disparities in economic wealth that the world has seen accelerate over the last three decades and more.

Privatisation is to embrace solipsism. It is to deny that we are in this world together, and it is to fall for the notion that one is, or must be, exceptional. Exceptions tend not to believe that their exceptional status is illusory; they also tend to forget that it is only thanks to the tacit permission of others that their exceptional status can come into existence in the first place.

Possibly greater levels of privatisation will lead to a common good – because the majority will be able to take no more and will remind those that have that they only do so because of the people who have not (and because the people who have not – sometimes even out of kindness – let them).

With regard to NCHum, then, we might look at a BBC article, which quotes UCU general secretary Sally Hunt as saying the following:

“While many would love the opportunity to be taught by the likes of AC Grayling and Richard Dawkins, at £18,000 a go it seems it won’t be the very brightest but those with the deepest pockets who are afforded the chance. The launch of this college highlights the government’s failure to protect art and humanities and is further proof that its university funding plans will entrench inequality within higher education.”

In other words, the privatisation of higher education, which the creation of the NCHum seems to signal, might well, even in spite of the scholarships on offer (i.e. in spite of ‘exceptions’), accelerate the creation of a two-tier system of higher education whose pre-existing counterparts are the private and state sectors of primary and secondary education.

An article from the London Review of Books, already mentioned in one of the previous posts, quotes Jonathan Cole, the former provost and dean of faculties at Columbia, as writing that

“in addition to fee inflation, a major contributor to the increased cost of higher education in America stems from the perverse assumption that students are ‘customers’, that the customer is always right, and what he or she demands must be purchased. Money is well-spent on psychological counselling, but the number of offices that focus on student activities, athletics and athletic facilities, summer job placement and outsourced dining services, to say nothing of the dormitory rooms and suites that only the Four Seasons can match, leads to an expansion of administrators and increased cost of administration.”

This is from an article that I read before the announcement of NCHum, and which was arguing that British higher education should not look to the American Ivy League as its model.

In the face of the creation of the NCHum, the students had better get their dollar’s worth. But more important is the fact that if one institution goes private, then perhaps others will follow, and only those institutions that can afford all of the above ‘services’ for their ‘customers’ will stand a chance of surviving – which puts in peril the hopes of many students who may find themselves unable to go, or at least put off from going, to university for financial reasons.

The NCHum is seemingly a private institution backed by some London investors. Do these investors get much say in the curriculum? While Grayling says that he wants his students to develop critical thinking, might these backers in fact want the institution to develop a certain kind of brain that will be good for [a certain type of] business as the students graduate into jobs at these self-same firms that sponsor the institution?

A friend who used to work at Lehman Brothers once told me that their HR team did not bother to look at job candidates who had a PhD or equivalent. The reason he gave me was that, unless the PhD was in maths or economics, the chances were that the candidate would think too independently. Lehmans, allegedly, preferred to hire younger graduates whose minds they could mould according to the Lehman ethos.

Then again, rather than standing as anecdotal evidence that banks only want a certain ‘type of brain,’ this might explain why Lehmans went bust…

Finally, Howard Hotson in his LRB piece explains that “the American company that owns BPP University College – which David Willetts granted university status only last year – recently lost its appeal in the US Supreme Court after being found guilty of defrauding its shareholders and is under investigation by the US Higher Learning Commission for deceiving students about the career value of its degrees.”

Earlier I explained how there is a system of peer review, external examination and various other mechanisms that mean that universities in the UK have to work together, even if they are also in competition for limited resources.

The privatisation of education (which can also lead – as happens in big pharma and the like – to the production of only a certain type of knowledge, knowledge based on a particular agenda and ideology, and which does not, in spite of pretences, have any or much ‘objective’ truth status [what is truth?]) means that institutions can (presumably – although I want to be corrected if I am wrong on this score) ignore the edicts of colleagues from other institutions. They can, as BPP’s owner did, defraud shareholders and, more pertinently, deceive students about the value of their degree.

In the current climate, it is hard to be sure which degrees are ‘value for money,’ not least because so much of that value must rely on the perceptions of the students themselves.

Evidently, the company that owns BPP has also been caught out – so there are mechanisms in place to stop this from happening, in the USA at least. However, this only points to the possibility that the lack of transparency that comes with privatisation might all too easily lead to some form of corruption – with higher education being the ultimate loser.

Maybe all I can conclude is that I wish NCHum luck. In the prisoner’s dilemma that is the current state of higher education, you pushed the button first to get the bigger reward, which in some circles is the logical thing to do.

In a world in which we are together, though, and in which the emergence of humanity is tied to the origins of virtue, as much as it is to the (deeply misunderstood?) selfish gene, then humanity is our common wealth – not some of it, but all of it.

If I expressed fear that the real problem with higher education is not Oxbridge but that people who are already rich typically end up even richer, then perhaps Oxbridge, and even the NCHum, offer nothing to the rich kids who can afford to study there that life would not offer them anyway (more riches). In this sense, if NCHum takes in rich kids to churn out adults who will get richer, what has it really taught anyone?

Taking in students more democratically – now so hard to do in the age of top-up fees – and encouraging students from all manner of diverse backgrounds to become better humans, more together both in themselves and in the world, to encourage them to learn not just new things, but new ways of learning, new ways of thinking, the likes of which we have not even begun to conceive – this might well be priceless and real value for money.

No dark sarcasm (one)

Blogpost, Film education, Uncategorized

I have been reading a number of things about higher education in the UK recently – which stands to reason as a result of recent changes in fee structures and the like. And, of course, as a result of the creation of NCHum, set up by AC Grayling n’ chums (sorry), and due to open in Bloomsbury in 2012.

I think this blog is about some of the fuzzy logic that seems to exist – for me – when discussing these issues.

Let’s start with Owen Jones’ posting on LabourList.org about how Oxbridge should be abolished.

Oxbridge is well known as a preserve of people whose parents are from wealthy backgrounds and who will more than likely end up wealthy themselves. In spite of the universities’ efforts, the (lack of) diversity in the student body is not particularly changing, says Jones, in part because rich kids can prepare more easily for the entrance interview precisely because they are rich.

I would be disappointed if any school did not offer help to students in preparing for university (or indeed any) interviews – and I don’t see how money makes that much difference. Except, perhaps, that one can buy ‘better’ preparation…?

There are no stats on this, so forgive my speculation, but I suspect that more people ‘waste’ their money on such expensive preparation and still do not get into Oxbridge than there are state school students with supposedly no preparation who apply and are accepted. To be substantiated, of course – but the point is to query precisely what role money plays in this process.

Jones points out that he met thickos when he was studying at Oxford – and that he’s met smart people outside of Oxford. Aside from the fact that neither of these things should surprise Jones at all – if, that is, he wishes to retain the habit of imposing judgments on people according to what he perceives their abilities to be – this only suggests that Jones suffers from the Oxbridge graduate’s snobbish sense that it is almost impossible for non-Oxbridge people to be smart.

And yet Jones also writes this:

“Many bright young people from comprehensives simply do not want to go to Oxbridge, because they don’t want to spend their university years stuck with those they fear will be arrogant, braying, overprivileged youngsters who may as well have grown up on a different planet. That might be unfair, but that’s certainly how many feel.”

Whether or not the above conceptions of Oxbridge actually are unfair in terms of what Oxford is like, this paragraph does further mystify the point of Jones’ article. He seems to be reprimanding Oxbridge for not attracting (exceptional) students (from ‘normal’ backgrounds), while pointing out that such students/would-be students don’t want to go there.

Concentrating on the first half of this equation, then, Jones seems to be saying that it is a pity for people from ‘normal’ backgrounds not to be going to Oxbridge. That is, despite his wanting to abolish Oxbridge, the article betrays a deep-seated belief in Oxbridge’s superiority over other universities (in the UK). Oxbridge here does offer something exceptional to which a more diverse body of students should apparently have access, but they don’t – so Oxbridge should be abolished.

And yet, if many bright students actively do not want to go to Oxbridge, but instead go to other universities, then presumably these other institutions get a fair share of brightness. That is, if brightness is spread democratically throughout the British or any population, then it is not necessarily the brightness of the students at Oxbridge that makes it exceptional.

In this respect, one should not care what university one goes to – or even if one goes to university at all. Smart people are smart people and that is all there is to it. Or rather, everyone is smart in their own way. One does not need Oxbridge to validate this, and it is not unfair if Oxbridge does not validate this.

Jones, however, seems to suggest that everyone who does not make it to Oxbridge has somehow missed out. And yet, whenever I step out on to the street and I see people wearing hoodies, shirts, and all sorts of other garments that speak of their alma mater, I see a huge range of institutions walking around the streets of London and elsewhere. That is, students seem proud of their alma mater no matter who she is. Not everyone is walking around thinking that they ought to be wearing Oxford or Cambridge stash. So perhaps Jones might refrain from the Oxbridge-centrism of believing that there are only two universities in the UK.

I do have some sympathy for Jones’ argument, but that sympathy must be based on the perceived fact that Oxbridge does offer something different to other universities, a difference that can also at times be perceived as ‘better.’

Not going to Oxbridge does not prevent people from succeeding. Not going to university, in fact, does not prevent people from succeeding – no matter how we define success, even if the prime measure of success tends to be financial.

However, given Jones’ mention of the history of Oxbridge and its record of producing British and world leaders in all domains of existence, then it is hard to deny that something different is there.

What is this difference?

Oxbridge students work notoriously ‘harder’ than students at many other institutions. For example, they must produce in a fortnight the amount of work that many other students have to produce in a semester. I therefore suspect that Oxbridge not only encourages time for reflection, which can often be offered as a justification for university tout court, but also enforces hard work, pure and simple. Not an aptitude, but the acquired habit of hard work, then, might be a measure of this supposed difference.

There are issues to explore here about what ‘hard work’ means, though. Producing more work does not necessarily mean producing better work, even if practising the ‘art’ of essay writing and other forms of assessment almost certainly does lead to improvements in quality. Practice, then, is key.

However, spending longer on fewer essays is also a form of practice – a form of making good that which one has time to make good, rather than rushing off essays at a rate of two per week (or whatever it may be). In this sense, I genuinely believe that all universities encourage (if not so much enforce) their students to work hard. This, then, is not for me the ‘difference.’

Oxbridge typically has students attend lectures, seminars and tutorials. The latter in particular can see students working in ‘groups’ of as few as one person with one tutor. This kind of personal attention might also set Oxbridge apart from many other institutions.

However, I am inclined not to believe this. This is not because students do not benefit from one-with-one tutorials. I think that they do – although I also think that students can benefit in different ways from group sessions in which they exchange ideas amongst each other.

I believe that this does not make Oxbridge that different, however, because many universities in fact offer – at least to those who ask for them (and many do) – one-with-one sessions for their students. Staff members feel hard pushed to decline calls for private tutorial sessions because they know that the student is paying. So in some senses this is something that other universities offer.

(Note that I am guarded here: ‘in some senses’ is supposed to suggest that there are at least more similarities between Oxbridge and other institutions than there are differences.)

So, if Oxbridge graduates dominate the halls of power within the UK and further afield, in a manner far more significant than any other university or set of universities in this country, then what leads to this is again something else.

At a wedding (in Oxfordshire) this weekend, I was ushering vehicles to the official car park with a friend when one of the guests drove past. My friend knew this person (I did not), and he told me that the guest had performed poorly in his A Levels (two Ns) – the reason this came up being that it was way back at school that my friend had last seen this person.

The poor A Level results (resits presumably taken) had not prevented this wedding guest from turning up in a relatively new BMW that will almost certainly remain beyond my means for many years to come.

This is anecdotal evidence at best, but obviously being the kind of person that Owen Jones would probably find a bit dim had not prevented this human being from going on to make – or at least give the appearance of making – a decent living. (And being able to appear well off requires a fair amount of money in and of itself.)

Talking later on to this person, however, it became apparent that he was what many people would call posh.

I have recently felt at times that I have made a grave mistake in going into higher education as a career. I have friends – not least the other usher with whom I was directing cars to the wedding car park – who have made swathes of cash that will to a high degree of probability elude me until I breathe no more.

Since I am in a society that measures success so emphatically by wealth, it is hard not to be affected by its logic – and in this sense, I fear that I should have made money as a lawyer or a management consultant – because I simply cannot keep up now with the high-spending lifestyles of many of my friends, who become non-friends because I cannot afford to see them as regularly as I would want to.

The reason that I have introduced this aside about money is that the wedding guest was – according to available reports – not academically that sharp, but he was – from the evidence presented to me – from a relatively wealthy background.

The reason for this bracketed aside about the wedding, then, is to say that what sets Oxbridge apart is not strictly its structure of education, because I have argued that its structure of education is not necessarily that unique, and therefore not that different from that of many other institutions.

Without going to Oxbridge, this wedding guest has (I am arguing here) made lots of money. What he has in common with many successful Oxbridge graduates, then, is a wealthy background.

Brains help you to make money. But brains are not necessary for making money. What best helps people to make money is having it in the first place. It is not that Oxbridge students are particularly more clever or particularly better prepared for the ‘real’ world, then. Or this is what I am arguing here. What many Oxbridge students have, though, is a wealthy background.

Abolishing Oxbridge will not change a system in which the rich get richer and the poor get poorer. Oxbridge could offer more scholarships to more students from poorer backgrounds. Some would no doubt benefit from this by being ‘better off’ in later life – but mainly because of the connections they will have made or can make with those who are already from wealthy backgrounds.

This kind of ‘exceptionalism’ (by which I mean that those students who follow this route would be ‘exceptional’ people, and therefore already divorced from the ‘unexceptional’ rest) does not help to change a system that is inherently conservative, in that money is what you need to make money and to gain access to the corridors of power.

But this system of wealth breeding wealth would continue unabated whether Oxbridge existed or not. Oxbridge is simply a symbol, then, of one’s pre-existing socio-economic status. If you want to be a part of the wealthy classes, then why not try to get in at the university level and go to Oxbridge?

However, while Oxbridge might help (as might many, if not any, other university degree), being from a ‘lower’ socio-economic background will be the main hindrance to achieving wealth. Perhaps many Oxbridge graduates make more money than they might have without their degree, but the real issue at stake is that those who already have power are the ones who gain/retain power.

Exceptionalism does not redistribute power; it simply confirms its confinement to certain spheres (those who already have it); there might be slight changes in the personnel in possession of power, but power is still the jealously guarded preserve of the few. Given that brains are evenly distributed – as proposed earlier – power and brains do not correlate with each other, nor even link up causally.

I’d like to explore one more aspect of our system of conservatism. The truth of wealth breeding wealth is systematically hidden from our population – not least because of the system’s enormous emphasis on exceptionalism, which gives the illusion that everyone stands an ‘equal’ chance of gaining access to wealth.

Given the lack of encouragement to make this realisation (that power is not distributed evenly), it is not surprising that many youngsters – or even older people – do not win much power. In fact, we are systematically encouraged to accept everything as it is now.

Unsurprisingly, then, kids of 18 years of age do not necessarily know that there is a game of power, nor even how they will play it. Without knowing that they are being played, then, many continue through life – perhaps until expiry – without ever realising that they were supporting the increasing grip on power by those who had it already. The majority of people are not empowered – and a major part of their disempowerment is not even realising it.

The above is not to say that, after Jones, everyone should apply to Oxbridge; one cannot burden two, or even 150, higher education institutions with the task of redistributing power.

Nor is the above to say that abolishing Oxbridge will redress this (im)balance of power.

Oxbridge, or rather various of its alumni like Jones, should stop wringing their hands with the higher educational equivalent of white man’s burden, whereby only Oxbridge can save higher education and if it cannot then higher education as a whole should be damned.

Students wear their university brand and love their alma mater whoever she may be – and they are correct to do this. It shows that people care about their higher education and what it can offer, and those few people who believe that you are no one if you weren’t at Oxbridge can go forth and multiply.

All universities (as I shall argue in the next part of this blog) are by and large equal. What university you go to, or that you go to university at all, is not going to make or break you – even if all university experiences tend to be immensely formative for those who undertake them.

What makes and breaks you is how much you have already. Oxford University hoodies and t-shirts are sold all over London and further afield – while stash from my current employer does not feature too highly in the tourist shops around town. This stash is bought by tourists not because of the brains they have or do not have, but because of the aspirations to power that such a label implies. It is a show of money, be it real or proclaimed, as much as a BMW is.

All of these ‘fake’ Oxbridge graduates who wear the t-shirt but never actually attended the universities reveal a truth, then, about the universities themselves: they are a brand, a spectacle of power, that helps to convince other people that more power should be given to those who already (seem to) have it.

Don’t get me wrong; this is a complex issue, the full complexity of which I have not got to grips with here. But if Oxbridge is the icon, then the real deity is power itself. Closing Oxbridge would not change anything; power would simply rebrand and perhaps relocate. Convincing the world that this power is there for the taking, that it can be distributed evenly, perhaps, with or without Oxbridge’s existence, is the real task of higher education.

Why film?

Blogpost, Film education, Uncategorized

There is no need to fear or hope, but only to look for new weapons.
– Gilles Deleuze

Everywhere capitalism sets in motion schizo-flows that animate “our” arts and “our” sciences, just as they congeal into the production of “our own” sick, the schizophrenics. We have seen that the relationship of schizophrenia to capitalism went far beyond problems of modes of living, environment, ideology, et cetera, and that it should be examined at the deepest level of one and the same economy, one and the same production process. Our society produces schizos the same way it produces Prell [Dop] shampoo or Ford [Renault] cars, the only difference being that the schizos are not salable. How then does one explain the fact that capitalist production is constantly arresting the schizophrenic process into a confined clinical entity, as though it saw in this process the image of its own death coming from within? Why does it make the schizophrenic into a sick person – not only nominally but in reality? Why does it confine its madmen and madwomen instead of seeing in them its own heroes and heroines, its own fulfilment? And where it can no longer recognize the figure of a simple illness, why does it keep its artists and even its scientists under such close surveillance – as though they risked unleashing flows that would be dangerous for capitalist production and charged with a revolutionary potential, so long as these flows are not co-opted or absorbed by the laws of the market? Why does it form in turn a gigantic machine for social repression-psychic repression, aimed at what nevertheless constitutes its own reality – the decoded flows?
The answer – as we have seen – is that capitalism is indeed the limit of all societies….

– Gilles Deleuze and Félix Guattari

The last limit, between resource depletion and technological “progress”, not only remains but has become absolute – the death of the planet. This limit cannot be internalized by capital (although the nuclear arms race of the Cold War period that transformed the “advanced” nations into permanent war economies based on postponed conflagration was a delirious attempt to do just that). It can, however, be crossed. It is capitalism’s destiny to cross it. For although capitalism has turned quantum into its mode of operation, it has done so in the service of quantity: consumption and accumulation are, have been, and will always be its reason for being. Capitalism’s strength, and its fatal weakness, is to have elevated consumption and accumulation to the level of a principle marshalling superhuman forces of invention – and destruction. The abstract machine of consumption-accumulation has risen, [Donald] Trump-like in all its inhuman glory. Its fall will be a great deal harder.
– Brian Massumi

The social body is being laid bare, laid out, laid, excited, metamorphosed when hands clasp in greeting and in understanding and in commitment and also in parting. When the ear put against the cellular receiver is in contact with a voice from any tribe and any continent… Where the car on cruise control races the Los Angeles freeways, the hands free to dial the cellular phone, cut the lines of coke, or cock a handgun. Where the hearts, livers, kidneys of newly executed Chinese prisoners are rushed to clinics in Hong Kong, where ailing financiers and ageing media superstars arrive by limousine. When hands holding a video camera connect with hands on batons beating the black legs of a speeding motorist… Where hands extend into the Alaskan seas for oil-drenched seabirds. Where lips kiss the pain of the AIDS victim, where fingers close the eyes of the one whose agony has at length come to an end.
– Alphonso Lingis

Bombs exploding in Moscow. Landslides in Brazil. Floods in Australia. Haiti devastated. Over 34,000 people murdered in Mexico over the last five years in drug crime.

If the eschaton does draw near, and at times it seems to, then why (the fuck) are we reading and writing about films?

No doubt we are all simple beings who do the best that we can, but who fundamentally are not armed or invited to help with such bigger issues – and so reading and writing about film is our modest input into the world today. An engineer might honestly be more useful, though, in the face of a collapsing planet. Maybe the Arts and Humanities will just have to look after themselves for a bit while we ride through this (perfect?) storm.

Can film make a difference? This is a question that is often asked and which to me seems redundant: film of course makes a difference, as does every creative and critical act that we do, every thought that we have, and every breath that we draw. Each of these things, by involving rearrangements of molecules, fundamentally changes the constitution of the universe, making it different now from what it was before that work of art, that criticism, that thought and that breath came into existence. In a world of chaos and complexity theories, perhaps even these most trivial-seeming differences can have the most far-reaching consequences.

But are these real differences? Who knows? ‘I prayed to God, but he did not listen and so I stopped believing,’ say some converts, apparently unaware of the famous ‘Footprints’ fable: those single-file footprints in the desert could be us hunched on the shoulders of a carrier God, not a sign of our solitude at all.

But we secretly know the score: if a film expert were shoved into an exploding Moscow airport, what would they do anyhow? Perhaps save someone, perhaps cower and cry, perhaps film it on their mobile phone in order to get news of the explosion online. But their being a film expert might not necessarily have shaped that response. We are all too human at the last, film experts especially so.

In the absence of being there, because as viewers of films we are never ‘there’ but somewhere else, in a safe and dark room, we might just wait for the inevitable films that will come out about these earth-ending events and then write about how they glorify these horrendous moments when they do. That’ll be useful, for sure.

Either way, I write this in the context of reading recent reviews of two films in particular that have elicited strong responses, namely The King’s Speech (Tom Hooper, UK/Australia/USA, 2010) and Black Swan (Darren Aronofsky, USA, 2010).

These responses have been both positive and negative. I’m not going to rehearse what most of them are about, although I will take issue briefly with Ryan Gilbey’s review of the latter film here, because it might be able to help me to connect this film thing with that heavy real world shit that sits at the top of this blog.

Gilbey obviously hated Black Swan, his main accusation being that the film is pompous, overblown and without subtlety. To which the inevitable response: “Subtlety? I got subtlety blowin’ out my ass!”

Or rather, how Gilbey knows for certain what a troubled mind is, such that he can say that the film has failed to portray one… well, that beats me – even if he could lay claim to having a troubled mind or to having known a few troubled minds himself.

Indeed, while his negative comparison of Black Swan with Repulsion (Roman Polanski, UK, 1965) is silly in that Polanski’s film is not exactly a masterclass in subtlety (walls coming to life, men hiding in Catherine Deneuve’s apartment, dead animals gathering flies, phallic candlestick beatings, razor blades, blood), it is interesting to wonder against what criteria he is trying to measure this.

Or rather, the criteria are obviously personal (he does not like the film), but in order to legitimise his view, Gilbey lays claim to an understanding of reality (what a troubled mind is, such that this is not an accurate portrayal of one) that simply cannot be quantified with certainty. Aronofsky’s film does not need to conform to what Ryan Gilbey wants it to be. Instead, we should look at the film for what it actually does – regardless of whether it is realistic or not. And perhaps we might even argue that the real world is in fact bigger and weirder than any one person can fathom, and that there probably are some people who have a touch of the Nina Sayers about them (Nina Sayers being the name of Natalie Portman’s ballerina in Black Swan).

That Gilbey compares Aronofsky’s film not to reality but to… another film (by Roman Polanski) illustrates cinematic thinking gone mad. We have mistaken our road maps for the terrain when we believe that films are reality – even if I shall backtrack later on in this blog and say that this is, or at least can be, a good thing.

To justify this not-as-brief-as-I-thought-it-would-be-when-I-started-a-few-paragraphs-ago aside on Gilbey: the point is not really that Gilbey’s review is silly (although I hope that this aside does serve to render somewhat void Gilbey’s other recent comment in Sight and Sound magazine that Henry K Miller’s writing is too ‘review-like’ for what Gilbey thinks that organ should contain – apparently Gilbey knows how everyone is supposed to act, write, and make films), but to point out the pull that everyone feels towards reaching an extreme verdict on Black Swan and The King’s Speech, both of which are fine if not exactly world-changing (except in the fashion that everything is world-changing).

This blog, then, is not about those films, although I could probably muster some thoughts on both (Hooper is a good cadreur, Geoffrey Rush is loveable, Britain needs some Somme spirit, apparently; Aronofsky’s film is more modest to me than people seem to want it to be; the ending is not ‘real’ because Nina’s bleeding and death are too conveniently timed; creative women are, apparently, dangerous). Rather, this blog is about how we are in the grips of what I term cinematic thinking – and if we are to get back to looking at the eschaton and worrying about its rather alarming potential for autopoiesis, then we need to start rethinking that cinematic thinking.

I read ‘cinematic thinking’ everywhere in student essays that are supposed to be critical but instead rehash review speak. ‘X perfectly portrays 1970s suburban life’ and ‘The camera moves perfectly around Y’ are particularly odd phrases to me. How do we know what is ‘perfect’?

I don’t intend this as a critique of the descriptive powers of 19-year-olds. I just mean to say that such phrases reflect the way in which we are gripped by review speak. In the absence of a language that might see a moment in a film for what it is doing (even if only to the individual watching it at that moment in time), we instead have kneejerk recourse to meaningless cliché – cliché that does effectively convey the individual’s enjoyment (it’s ‘perfect’, after all), but which gets no closer to the specifics of a particular moment, and which furthermore needs to convey enjoyment according to some nebulous sense of use-value. For, in rehashing the hyperbolic language of the film review, the student – and even Ryan Gilbey – puts into play the sales talk that gets arses on cinema seats, the sole end goal of which is to line the filmmaker’s pockets.

If anyone is still following this – and I hope it is followable – what I am calling ‘cinematic thinking’ can, then, be refined. Really, it is review thinking that is repeated – and in particular capitalist review thinking, which is not really reviewing at all, but masqueraded sales pitching.

I don’t wish to be boring, but I am going to have to pick apart the above paragraph to make my point clear. So bear with me…

Ryan Gilbey is a reviewer. Reviewing is his job and I would only be a hypocrite if I told him what he should (or should not) write in his reviews. And, in some senses, to knife Black Swan at a moment when everyone is lavishing it with praise is to try to counter the review speak of which I speak. Furthermore, Gilbey, at least hypothetically, has to respond to the pressures of reviewing: time limits, word limits, keeping some movie industry insiders sweet for the sake of future exclusives, supporting the agenda perhaps of his organ, and other political intricacies that no doubt arise, not least when he is (as in one or two cases he must be) pals, or at the very least good-willed acquaintances, with the people whose work he is reviewing.

So Gilbey is the unfortunate straw man erected here for a wider point, one that I have perhaps only been able to reach through him (I am sure he is human enough to take it). And that point concerns the moment when language fails us as we talk about films. When language does fail us, we resort to repeating what other people have said.

Don’t get me wrong. Maybe no one has original thoughts, maybe no combination of language is original (although I refuse to believe this). Furthermore, being a believer in spoof movies, I also believe that the deliberate deployment of clichés can in fact explode them from within, in the same way that repeating a word to ourselves over and over can become amusing because we see ‘through’ the word to its arbitrary sound in relation to its meaning. In the same way that a Buddhist might repeat their mantra to the point of enlightenment.

And while I can never absolutely know how much, when, or indeed whether anyone is ever thinking for themselves even if/when they are talking in clichés, I will still hold that not using clichés is a better way of giving the impression of autonomous thought (a reliable impression?) than using them.

Because film is audiovisual and language is, well, linguistic, ‘translating’ the one into the other is possibly one of the hardest things for us as humans to do in terms of cerebral endeavour. Arguably we ‘understand’ pictures and sounds with no training and quite naturally (these are ‘unlearned’ skills), but to a certain extent we have precisely to unlearn these skills if we want to get closer to understanding this process, and to describing both it and the pictures themselves. In other words, we need to find the language to describe pictures and sounds (or, alternatively, we need to reply with other pictures and sounds – something that humans are doing more and more, but debate of that will have to wait for another time).

To describe in clichés – be they linguistic or audiovisual – is, for me, deeply problematic, not least because, as I have tried to outline above, it is a shorthand form of capitalist thought: it implicitly involves sales speak. Our brains are shaped by the language that we use and by the images that we see (both in real life and on screens). Of this I have no doubt, not least because our brains change at every moment in interaction with the world. But simply to repeat the same phrases is never to evolve thought in new directions – or at least it runs that risk. If the world needs original thoughts to solve the problems that in part might have arisen from humans being in the grips of ‘cinematic thinking,’ then we need to evolve thought in new directions. We need to not think in clichés. We need to test our linguistic abilities to the limit, because language can draw new meaning and potential out of not just cinematic images, but the audiovisual situation that is reality itself. Seeing the world anew because it is described anew/describing the world anew because it is seen anew: this is precisely what will help us to change the world.

And yet cinematic thinking encourages us to fold everything into a neat system of use-value and pleasure. Pleasure, in particular, is a tricky customer here; things that are ‘perfect’ for us are not necessarily perfect for the world and our place in it in the long term, and yet it is comfortable thinking and comfort in general that, potentially, make the mind weak (even if one might trace a long line of intellectuals from deeply privileged backgrounds, which is slightly to miss the point, but this will also have to wait for another time). That discomfort of needing to find new words: this is perhaps the key experience that gives hope to existence, since it demands original thinking – not ‘cinematic thinking.’

More examples of cinematic thinking (although in and of themselves these are perhaps clichés, so beware): people who deal with reality by describing it in terms of films (11 September 2001 being the main case in point). People whose knowledge of the universe is based upon computer-generated images that convert raw data into what their makers wanted it to look like rather than what it is, and yet who, again, take the computer-generated image as reality rather than as a modification/simulation of reality. Everywhere the road map, never quite co-extensive with the terrain, still seems more appealing than the terrain, because simpler and easier to navigate. Indeed, the road map was designed for the purpose of navigation. Real explorers go where no maps have yet been drawn.

So, winding slowly to a conclusion, am I saying that film is evil, since it and the industries that spawn and surround it infect our thinking, which in turn limits our potential for certain kinds of activity? Sort of. But this is not only a battle between the cinematic and the non-cinematic (although I do not really see it as a battle at all, more like a curious dance); it is also a tango that takes place within cinema – and within the world: a tango involving our ability to describe the world in audiovisual and linguistic terms such that we see new sides to it, new potentials that might help us find a peaceful way out of the eschaton.

I am not saying that we should destroy cinema, then. But I am saying, because I stupidly believe it, that when we don’t think for ourselves, we naïvely repeat the clichés that others encourage us to say in order literally if not deliberately or conspiratorially to control us. To keep us buying whatever it is, whatever the consequences (as long as money is made). And the major source of the clichés with which we think? The cinema and the various new media that are its children. So why film? Because here we can tackle head on the limits and limitations of human thought, be it verbal, visual, audible or sensual. By trying – which is all that we mere humans can do – we might arrive at some new thoughts, perhaps even at a new mode of thought. And seeing and thinking the world anew might bring about some genuine change, change that might (God help us) make the world not necessarily a better place, but a different place in which our desire and ability for free thought, for our own thoughts expressed in our own way, are given space and time – rather than the tiny flatshare that the commodified thought of cinematic thinking tries to make of our brains one and all.

Mickey Mouse Studies (for ODC)

Film education, Uncategorized

It is a tale told in a voice of urgency. It is a tale told in a state of extreme fatigue. It is important to enter into states of extreme fatigue, into extreme states. Knut Hamsun knows that it is by entering into liminal/extreme states that one achieves a sense of otherness. By going to places to which the body is not used, the mind must follow – and as a result new thoughts can be found, because new physical conditions are the conditions for new modes of thought, new mental conditions – or at least Spinoza might claim that this is so. What follows may not be new thoughts to you; they are not necessarily new thoughts to me – but they are thoughts that have tonight been illuminated by the spotlight of consciousness such that they merit attention in the form of comment. In the form of a blog.

Watch this. It will help.

It seems apparent that the Browne Report will involve a huge cut, maybe a 100 per cent cut, in funding for humanities at universities. What does this mean?

I am not sure – and never will be – that I grasp fully what is going on in the world. It’s a big world. It’s a world that is larger than I can fathom. How could I grasp fully the rationale behind the desertion of funding for the humanities in UK higher education? This is the response that logically will be given to this post (should anyone read it and want to respond): you don’t understand the bigger picture. Subtending this (as yet hypothetical) response is the supposition that the person who makes this claim does understand the ‘bigger’ picture – whatever that is.

I work in Film Studies: the camera can only take in as much information as it takes in, as can the human eye. If you have eyes to see the ‘bigger’ picture, then what you gain in size (‘bigness’), you lose in detail; what you gain in detail, you lose in size. By which I mean to say: I don’t know what the bigger picture is – because we all take light into our eyes, we all see – even the physically blind, and if I cannot trust my own sense of vision, in combination with my other senses – i.e. if I cannot trust my sense of self (whatever that is), then I must be blind. If I am blind, then my perceptions are pointless: I see nothing. If my own perceptions are pointless, then how can I perceive how pointful are the perceptions of others? And yet, it would be by telling me that I perceive poorly that my hypothetical antagonists would undermine what I am about to write. Well, I perceive as best I can, and I write as honestly as possible in accordance with my perception. And so if I am wrong, I apologise – but not to you who merely thinks me wrong – but to you, the invisible (to me – i.e. God) who knows me wrong.

(A problem with Film Studies regarding shot sizes: it seems to me that in general a close up is considered to be more ‘detailed’ than a long shot. But this is not the case. A close up is as detailed as a long shot. A close up fills the same amount of screen as does a long shot. As such, there is as much information in a close up as in a long shot. Indeed, a long shot is – permit some twisted argumentation – a close up of length, and a close up is a long shot of closeness. Like the camera, we take in as much as we can at any given moment in time. Our subjective existence is only ever as big as our subjective existence is – and if a pinprick destroys my belief in God, when others require an atom bomb for this to be the case, then the object of pin or bomb is irrelevant: the process of the destruction of faith is the thing that is important. And yet objects, which exist in space, are often prioritised over happenings or processes, which exist in time. It does matter what the catalyst is. It matters, because the catalyst, the thing, has a material existence. It is made up of matter; as such, it matters. But in other ways, the thing does not matter at all – and the only thing that ‘matters’ is the process. It is the immaterial, the invisible, that also, perhaps even really, counts: the process. Time is the overlooked element – because we impoverished souls cannot see it. But it is there, a black hole whose effects are visible everywhere while it itself can never be seen, because no light escapes from it. The Higgs boson as a particle of time; time as a physical entity. We are happy to accept that the bed that currently supports me is created from particles that have divergent origins – and if we separated this bed into its constituent particles, we would be happy, I suspect, to accept that each particle was not destined to become a bed, but that it had the potential to be a bed, that it had ‘bedness’ embedded within it. But we do not assume that time might be the same – which is what I wonder is the case: time, like space, consists of particles that, if we were to look at them from their point of ‘origin,’ would not cohere to the fleeting moments in time that are my breathing in and breathing out. Instead, they, like particles of space – like matter, because they are material – in fact come from all over – and what coheres them together is simply organisation, as opposed to continued and persistent identity in the form of memory. Time seen from ‘without’ (Aeon) is not the opportunity for us to see the past and the future in a coherent sense; rather we see the chaos that is time, and it is the process of experiential time (Chronos) that makes ‘sense’ of time, and gives it a ‘chronology.’ Time in and of itself has no order; and what time we experience in life is the ordering of otherwise random particles, particles which, like those spatial particles that comprise a bed, come from all over the Aeon, are from any and everywhere, but which ‘randomly’/spontaneously cohere/self-organise. The point of this: to readjust the common assumptions made about time would be to readjust our common assumptions about identity, which in turn leads to challenges in the field of politics and ethics. In effect, to contemplate Aeon might lead to tangible changes in Chronos – in the way in which we lead our lives and act upon the shared assumption that we love each other.)

Back to the beef – and apologies if you are lost already – but it’s okay to get lost, because without getting lost you cannot find out where you are, you cannot re-think, you cannot learn: what do the proposed cuts in humanities funding suggest? Well, from my (by definition) limited point of view, they suggest a government-backed desire for universities not to encourage freedom of thought. Don’t get me wrong: a failure to back the humanities is not to say that the sciences, commonly if erroneously thought to be the humanities’ beautiful sister, do not encourage freedom of thought. Of course the sciences encourage freedom of thought. The sciences require freedom of thought for progress to happen. But it does mean that free thinking, perhaps the art that lends the sciences their future, is undercut.

I am coming at this from the perspective of someone who works in Film Studies. Film Studies, as a relatively new discipline, often, like (New) Media Studies more generally, gets labelled a Mickey Mouse field of endeavour. What (the fuck) is the point of studying entertainment? Entertainment is simply there to entertain. It is not serious, and therefore is something not to take seriously.

Why do I think that Film, by which I mean the media more generally, is something to take seriously? Not only this, but why do I think that Film is perhaps the single cultural artifact to take more seriously than any other?

Firstly, the answer is in the way in which the question is posed. If one feels tempted to describe something like Film Studies as a Mickey Mouse endeavour, then the very fact that one uses a term from the history of film – Mickey Mouse – to describe the endeavour is highly significant. To describe Film Studies – among other disciplines – as ‘Mickey Mouse’ means that already one is influenced by the media – since Mickey Mouse is a media construct – that one wishes to dismiss. That is to say, to use the term ‘Mickey Mouse’ means that one is prey to not taking seriously precisely the media that helped to form the opinion that disciplines like Film Studies are ‘Mickey Mouse.’ If one is happy to use the term ‘Mickey Mouse’ without giving further thought to why one is using this phrase to describe this discipline, then the power of the media to hide their own operations already has you in its grip. We are encouraged not to take seriously the very thing that disavows how seriously we should take it. And, so I contend, as soon as we feel we should not take seriously something that shapes and helps to form our opinions, then this is the moment when we should begin to take these cultural influences – Mickey Mouse himself – most seriously indeed.

Teaching film is interesting: year after year (in the few, brief years I have been teaching it) I find students who express deep distrust of the idea that a piece of ‘mindless’ entertainment can influence our thinking. Two different but seemingly relevant examples come to mind.

While recently teaching Carl Theodor Dreyer’s Passion of Joan of Arc (France, 1928), I found that conversation in class became obsessed with the idea that the prison guards who taunt and try to shame Joan (Falconetti) would not have had a metal pot – which occurs as a prop at one of these points in the film. It is not that students became concerned with the pot (which was raised by them as an example of an anachronism in the film) per se; or rather, I am not trying to single out a comment from a particular student as one in need of particular attention. That is, I am not trying to lord it over ‘ignorance,’ since I needed to check on Wikipedia myself that metallurgy has been in recorded existence since at least 5,500BC. More, it is the idea that anyone – myself included – would think that 1431 would be an age in which metal would not be fashioned into something so basic as a pot/saucepan. By which I mean to say: we collectively suffer from the perception that everything in the past was ignorant, and we are surprised – perhaps – by how long some ideas/technologies have been around.

I shall return to the above, but before that, a second example: in relation to Fritz Lang’s Metropolis (Germany, 1927), a student put forward the idea that because the film is ‘old,’ the world that Metropolis depicts is old, is irrelevant, and therefore that the film has nothing to say to us today and is simply an (un)interesting artifact of a time gone by in which – again – people were somehow less ‘clever’ than they are today.

Why are these examples worth mentioning? Well, let’s start with the second one. In conflating the age of the film with irrelevance, we/I see the strange effect that film has on our society. That is to say, because the film is ‘old,’ it is alleged that it also has nothing to say about today. That a society of rich people is subtended by a society of impoverished and imprisoned workers apparently has nothing to do with the present age, because such problems have been eradicated – or so the theory would go. I personally believe that while things may from certain points of view have gotten ‘better,’ the world today is not perfect – and the gap between rich and poor continues to grow, in such a way that Metropolis remains a deeply relevant film – not least because it is set in the future, thereby suggesting that the ‘medieval’ not only coexists with the ‘modern,’ but that the modern needs the medieval to remain, precisely, medieval, in order for the modern to remain modern. But for someone to believe that this is so – that problems of class inequity are today non-existent, an ‘artifact’ of the quaint and old-fashioned past – means that they are not given access to information that tells them how such inequality not only persists but is the bedrock of contemporary privilege. And if they do not know/are in denial of the fact that problems of class/wealth do persist today, then this is perhaps because information about such things does not find its way into the media, is not publicised, which means that it does not find its way into the consensual consciousness. If people are not only unaware but also in denial of contemporary inequalities pertaining to wealth and class, and if this is related to the channels of information in the world, then a key question becomes: who has access to the channels of information/communication that do exist? Who controls these channels? And what is the agenda – be it conscious or otherwise – that determines the kind of information that is distributed via these channels? In other words: the media themselves, which here I umbrella under the term ‘film,’ determine our sense of the world – such that stories of class inequality are deemed to be ‘irrelevant,’ mythical even, in that they pertain to an age long since disappeared. Since this is not the case – that age of inequality is our age – the occultation of inequality via the media is something that we need to take very seriously. To cut funding for the humanities – of which something like Film Studies forms a part – is to suggest that all is well with the media and with the kinds and channels of information that predominate. All may be ‘well,’ but to deny any encouragement critically to think about these things (via the removal of funding for the humanities) speaks of a complicity with – as opposed to a resistance to – the very messages and media that like to make themselves (the channels of communication and the agendas behind their content) invisible.

To return to the Joan of Arc example: this relates not so much to Film Studies as to History – but the two are related. That we are all encouraged to think of our predecessors – the very humans upon whom our being here is predicated – as idiots, and to believe that everything ‘technological’ (i.e. pots and pans) is modern and ‘beyond’ the capabilities of those who lived fewer than 20 generations ago, is to fall foul of the idea that history in particular is not worth knowing. The past is another country, in the same way that countries that are ‘other’ to us Westerners are deemed somehow to be living in ‘the past,’ or to be ‘medieval.’ Film – and the media more generally – help to convey this message: if it ain’t fast and flashy, it is old and putrefied/shit. How fuckwitted might it be to assume that people from the past were fuckwits?

And yet if – as I am contending – uncritically to think about our media and the messages that they distribute is also uncritically to assume that the contemporary saw the birth of everything ‘important,’ while the past was full of intellectual, cultural and moral retards, then a failure to take seriously the media that surround us might only lead further to this failure to grasp that at every moment in history, humans and all other creatures have been as brilliant (and probably as idiotic) as they can be. Perhaps we do not need to learn this lesson – in that no lesson is necessary, because we myopic humans cannot see time from without so as to know in advance which lessons will be useful to us before they become so. But that we can learn this lesson at all means that we have developed a system of thought that probabilistically finds it useful to learn lessons – and to deny that, as the cutting of humanities funding seems to do, is not only to deny ourselves an opportunity not to repeat the mistakes of our ancestors; it is also to deny something – learning from the past – that has become second/human nature.

Do the arts need funding to survive? Do the humanities? Nicholas Rombes has provocatively argued that young people today understand film and media far better than those who ‘educate’ them do. In some respects, I have sympathy for this position. I do personally wonder whether we have experienced something of a paradigm shift, starting with cinema and continuing into the digital era, whereby we think less in language and more in audiovision (for want of a better term, we think in ‘cinema’). Or rather: we – the multitude – thought in cinema long before we thought in language, and we will continue to do so. But language – which became the central medium for conveying thought not least because of the media, including the voice and the print press, that could distribute it – is now decentred, replaced by audiovision, by cinema, because cinema is more accurate, not least because it appeals to all of the senses, whereas ‘mere’ language – in many cases – appeals only to that supposedly rarefied – but in fact entirely embodied – phenomenon: the intellect.

If we think and – more importantly, in the age of YouTube – if we express ourselves audiovisually, then the ‘translation’ that needs to take place in order for these audiovisually expressed thoughts and messages to be conveyed in the ‘old money’ of ‘rational’ and ‘academic’ language is always going to weaken the audiovisual message itself. Something – always – is lost in translation.

If this paradigm shift is happening/has happened, then perhaps the humanities do not need funding to survive. In the age of citizen tubing, perhaps the arts do not, either. Or rather: maybe the arts and the humanities do need funding in an absolute sense, but communication itself has changed such that language – spoken or written – no longer forms the core of the process; it is just another element, alongside the tactile, sonorous and intellectual elements of film.

And if young people actually ‘speak’ audiovisual better than they ‘speak’ linguistic, then why waste money on training them to say in ‘old speak’ (i.e. in spoken/written language) that which they already understand through their bodies and which they already speak in audiovisual (here, ‘new’ speak)? In other words: why not cut funding in the humanities?

I am not saying that we should abandon spoken/written language; audiovisual does not replace it, but it sure as hell supplements and expands it. In fact, by this rationale, I think not only that the humanities in general and Film Studies in particular should benefit from continued governmental support, but that it is absolutely vital that they do. Otherwise we seriously risk alienation between generations and peers; we risk failing to take seriously the ‘language’ that emerges when communication moves beyond words and into the realm of the senses. We the older people with the purse strings can moan all we like about how it is the fault of the young for not speaking our language; but it is our fault, similarly, for not speaking theirs. And humanities funding allows us to learn the (audiovisual) language of the young and to help it move into dialogue with the (linguistic) language of the old.

Do the arts need funding to survive? A year ago, I made a film called En Attendant Godard (UK, 2009). It has had some modest ‘success,’ and while I would be delighted to use this blog as a plug – I am happy to send the film to anyone who requests it, provided they give me a postal address – the reason that I mention it is this. I made the film, which is far from being a good film (whatever that is), as a means of proving that one no longer needs funding from anyone in order to make a… film. In other words, in the digital age if not before (but almost certainly before), artists do not need funding to survive.

(But it is not as if even the earliest professional artists did not need some form of payment – in terms of food and shelter – in order to survive. Artists need funding of a sort – but they will find a way to live even if their art is not what supports them in a material sense.)

To deprive people of things is to make them understand what they need, and it is – perhaps – to make them autonomous, in that they work out what it is that they have been deprived of, and creatively they find ways to win it back. Conceivably one might argue that there is a perverse benefit to cutting humanities funding: the humanities will have to find novel and innovative ways to remain relevant. Threaten them with death and at that moment they will feel most alive.

Beyond this, however, it was as a fuck you to funding bodies that I wanted to make the film. Not only that: by having no funding, I could make the film I wanted, even if the film is (wilfully) full of things over which I had no control and in which, in hindsight, I/the film revel. In particular, I had no control over a large group scene that pays homage to Week End (Jean-Luc Godard, France/Italy, 1967) – the scene that has drawn the most criticism, as one that should be cut from the film, as if other people knew better what the film should be (not that I know better what the film should be, but perhaps precisely because no one can know what this, or any, film should be) – and this lack of control meant that the film raised precisely these issues of what a, or any, film should be at all.

If I set out to ‘prove’ that one needs no funding to create art, and if I were successful in this bid (which is debatable), then what (the hell) am I complaining about? Well, what I feel upset about is this: even if the unusual, even if the amateur, even if that most perplexing of phenomena in the capitalist world system – the useless – can and will persist regardless of the lack of institutional support that comes its way, it is still an insult not to support such endeavours. Because I made this – and my next film, Afterimages – for no money, I hope that I am exempt here from accusations of sour grapes. If no one ever gives me money to make my films, this will not stop me from making them. No one can and no one will stop me. I shall not stop.

But, as at one point I have Alex, the main actor and character of En Attendant Godard, say: the world needs the useless, it needs the previously un-illuminated, it needs the (even wilfully) opaque, in order for there to be progress. Not that progress is movement towards a pre-defined goal or telos – who knows where we are headed? But change, the hope for something better, is dependent upon that which is not now understood, in order for us to come to understand it. If, perhaps contra Nicholas Rombes, we do not understand everything already, then we can only understand more, we can only learn, by coming into contact with that which we did not previously understand, with that which we did not previously know.

Even if I say so myself, there is more to my Godard film than this; but by isolating this aspect of the film, I want to reiterate, now in a blog as opposed to in that film, that that which is now apparently useless can indeed come to be useful. And even if it never comes to be useful or liked, it has its place in the ‘grand scheme of things.’ But to decide before the fact that neither the arts nor the humanities will have use or value – which seems to be the message of cutting humanities funding at universities – is precisely to pre-determine the (lack of) use and value of the arts, which historically have had incredible use and value, even if that use and value were not clear, known or recognised at the time of a given artwork’s creation. (And I am not claiming that En Attendant Godard is such a work; I cannot know whether it will prove ‘useful,’ but I put it out there – as feebly/best as I can/could – in order at the very least to give it a try.)

(Trying: trying is a sign of faith. Being prepared to take risks is necessary – not as a thing but as a process, by which I mean that one cannot know in advance – in spite of pressure to know in advance – what will be useful. If we knew, there would be no risk involved. But taking risks as a process is the cornerstone of progress – again, not progress towards a previously identified telos, but progress as a process in and of itself. The apparently ‘useless,’ therefore, is most useful. Outliers are de facto precisely that: people who lie outside of the currently useful. Not that EAG is useful – perhaps it never will be. But I put it out there in good faith that I am taking part in a human process of… good faith. Bad faith is risk aversion, a refusal to try anything new, a decision to forsake the foreign for the familiar – a decision to prejudge the foreign, to exclude, to dismiss, to show no interest, to ignore – be the object of that prejudice, exclusion, dismissal, disinterest and ignorance something from a different place or, in the case of ‘old films,’ from a different time.)

A paradox, which takes us in the direction of tautology – perhaps the profoundest level of insight that we can have about the world (namely, that it is as it is, even if we contribute to and change it, even if it is dynamic, precisely because it is dynamic, and any attempts at essentialisation/reification are doomed to failure): if art and the humanities can and will get by without governmental support in the UK – which they will, because no government has been or will be able to stop a culture from becoming aware of itself – even if we educators are wrong in feeling that we play an active role in this education happening (because the students know it all already), and even if we are as a result already redundant, regardless of whether our employers make this officially so – then surely there is no problem in cutting funding for the humanities and the arts?

In some senses, this is true.

But if you turn your back on the useless in favour of that which is deemed singularly useful, and the world then changes, and you go rooting around for that which you earlier discarded because now you realise it will come in very handy, but you cannot find it because it has been destroyed in a fire, then you, my friend, are fucked.

You burn your humanities, you burn your past. And with no past, you have no future.

Idiots that we people in the humanities are, though, you who despise the humanities and who despise artists won’t be fucked. Because we’ll still be here when you do need us – and we’ll be here even if that day never arrives and you can die smugly saying that you got by without us because we were, indeed, as far as your existence was concerned, useless, and that you were correct to burn us off.

And like idiots, if ever you do need us, we’ll be ‘naïve’ enough, according to your standards, to let you take advantage of us, in the same way that school bullies occasionally condescend to the swots because they need their homework doing. You’ll think us weak for being ‘kind’ enough to help you, because your value system would never allow you to help anyone for free, because your value system is based not upon courage and having a heart/cœur but upon, precisely, attributing ‘values.’ You’ll think us weak, and you’ll never realise that it is only the strongest who can take your persecution – as opposed to feeling that to persecute is to show strength.

The paradox, the tautology: you are right to forsake the humanities. But you are wrong to think that this is because the humanities are the weak point in the world. The humanities, like the poor, are the strongest. You may never learn that we let you fuck us over because we are the only ones who can take it, and because you are fools who need to feel justified in your military-industrial sense of self. Artists, like people from the past, are from your perspective idiots. You feel that you suffer us. But the truth is that we suffer you. We – in fact the multitude that you wish were yourself, hence your need to try to make everyone into a person of use and value – are not the people who need you; to cut humanities funding proves that you need us. You need us precisely in order to sacrifice us, tautologically to make us the others we always already were.

The ‘you’ and ‘we’ just described, though, are actually just a ‘we’: we are all together, and to divide between ‘us’ and ‘them’/‘you’ is potentially counterproductive. We need we in our diversity; we need we artists, even if artists and humanities scholars all we are not. We – stupidly – continue to hope that one day we will treat ourselves equally and with respect. We are all in this together, and while some of us want to discard certain members because the boat might sink if they are not removed from the equation, others of us continue working and fighting until the bitter, salty end, in the hope, perhaps even in the knowledge, that we will triumph – all of us – because faith in others is a thing worth retaining now as much as ever before. This, surely, is the key to the humanities, be they topics in the ‘Humanities’ or in the ‘Sciences’: we believe in humanity, however good, bad, same or different. As such, we want humanity to blossom – not just certain aspects of it.

We don’t have to remember this; perhaps we won’t. But remembering that humanity is the heart of the Humanities can never be a bad thing.