I was intending to write a post about Instagram, but after some reflection have decided not to. Partly because I don’t think the internet really needs another opinion on that service, whether it’s killed or saved photography as an art form, and whether it’s really worth a billion dollars to anyone. And partly because, having never used it – for a variety of reasons, but mostly because until recently I didn’t have a device I could use it with – I’m not really sure I’m qualified to talk about it at all. Instead I thought I’d write about the wider issue around it: authenticity. By which I don’t mean the legally binding, “proof of creation” sense of the word, but the looser “genuine, non-faked object” sense.

This is not going to be a film versus digital post either, since that would also serve little purpose, and frankly I’m not sure I believe the two have to be pitted against each other continually to prove which is best. In my view each has its place in modern photography, and any kind of competition between the two is entirely artificial.

To me they are essentially two different mediums, in the same way that a modern film photograph is different to a daguerreotype. The analogy is a reasonable one in many ways: a daguerreotype is a unique object that is hard to duplicate; an individual film negative is also unique, but it is comparatively easy to produce hundreds of identical optical prints from it – though compared to a digital capture, doing so is still expensive in terms of both time and money.

It would be hard if not impossible to make a film photograph look like a daguerreotype, and I don’t think anyone ever tries. In my view it’s also hard to make a digital photograph look like it was shot on film, and yet people try this all the time. Aside from Instagram and its numerous mobile brethren, many software packages exist purely to help people simulate shooting film. I guess from a commercial standpoint I can understand the reasoning – shooting film would be too slow, too expensive, and carries the risk that something could go wrong during development, so shoot with an expensive high-end digital camera and post-process until it looks how you want. Myself, if I want something to look like film, it’s cheaper to just shoot actual film.

But when I look at the results of making a digital photograph look like film, there’s another problem – it never quite works. You can take a 20-megapixel image and run it through whatever you like, but I have yet to see a convincing simulation of a film image. Matching tone and colour balance is relatively easy; faking grain is a lot harder, and seems to me a little pointless. Pixels in a 20-megapixel DSLR are still larger (~6 microns) than the average grain size (~1 micron) in typical film emulsions, and they are arranged in an extremely regular grid rather than randomly distributed. Trying to simulate a small-scale random distribution with larger, regularly spaced elements means degrading the image quality, whichever way you look at it. Which is not to say that film is bad compared to digital, but that simulating film with digital is bad compared to digital.
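To put rough numbers on that, here’s a quick back-of-the-envelope sketch in Python. The figures are only illustrative assumptions – a full-frame 36 × 24 mm sensor and a nominal 1 micron grain size – but they show why any faked grain has to be coarser than the real thing:

```python
# Back-of-the-envelope sketch (assumed figures: full-frame 36 x 24 mm
# sensor, nominal 1 micron grain) of pixel pitch versus grain size.
import numpy as np

SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0   # assumed full-frame sensor
MEGAPIXELS = 20e6
GRAIN_SIZE_UM = 1.0                      # nominal average grain size

# Pixel pitch: divide the sensor area evenly among the pixels.
pixel_pitch_um = np.sqrt((SENSOR_W_MM * SENSOR_H_MM) / MEGAPIXELS) * 1000
print(f"Pixel pitch: ~{pixel_pitch_um:.1f} microns")   # ~6.6 microns
print(f"Grain size:  ~{GRAIN_SIZE_UM:.1f} micron")

# A typical "film look" filter can only add grain at pixel scale or
# coarser: random noise laid over the regular grid, which throws away
# real detail rather than adding the sub-pixel structure film has.
rng = np.random.default_rng(0)
image = np.linspace(0.0, 1.0, 512).reshape(1, -1).repeat(512, axis=0)  # smooth gradient
fake_grain = rng.normal(scale=0.05, size=image.shape)                  # pixel-sized "grain"
grainy = np.clip(image + fake_grain, 0.0, 1.0)
# The smallest feature the noise can have is one pixel (~6.6 microns
# here), several times larger than real film grain.
```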

My point here is: why not simply remain true to a medium? Pixel sizes will inevitably continue to shrink, and quality will increase until they rival even very fine-grain film, but the transformation required to turn a regular grid into an irregular distribution of grains is still going to lower image quality.

Apps that apply “creative filters” to your camera phone shots are another case. Here we find colour shifts and phony light leaks applied to degrade what are reasonably high-quality images in order to make them look “retro” and therefore “better”. For me the idea of deliberately adding a light leak to a photograph is bizarre. I steer clear of Holgas and Lomo because I personally don’t see that as a positive thing in a photograph. I make pinhole cameras to do things I can’t do otherwise, not because I’m set on a lo-fi result – if I get light leaks I try to figure out where they come from and fix them. Plenty of people see things differently, and that’s fair enough, but in those cases a light leak is essentially a random, non-reproducible element, not the predictable result of passing an array of numbers through an algorithm. If you have a modern camera phone it’s probably many times better than my first digital camera – a 2-megapixel Canon Digital Ixus. It’s a remarkable piece of technology, so why not use it like one?
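Just to make “passing an array of numbers through an algorithm” concrete, here is a minimal sketch of how a faked light leak might work. The gradient shape and strength are invented purely for illustration, not taken from any particular app:

```python
# A sketch of a faked "light leak": a fixed pattern of numbers added to
# the image, identical on every run. (Shape and strength are made up.)
import numpy as np

def fake_light_leak(image: np.ndarray, strength: float = 0.4) -> np.ndarray:
    """Brighten the frame towards the top-right corner with a fixed radial falloff."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalised distance from the top-right corner (0 at the corner, 1 opposite).
    dist = np.hypot(xs - (w - 1), ys) / np.hypot(w - 1, h - 1)
    leak = strength * (1.0 - dist) ** 2      # strongest at the corner, fading out
    if image.ndim == 3:                      # colour image: broadcast over channels
        leak = leak[..., None]
    return np.clip(image + leak, 0.0, 1.0)

# Run it twice on the same frame and the results are identical to the bit;
# nothing like the unrepeatable accident of light reaching real film.
frame = np.full((480, 640), 0.5)             # grey frame, values on a 0..1 scale
assert np.array_equal(fake_light_leak(frame), fake_light_leak(frame))
```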

The aspect I find interesting in all this is really the question of why people want things to look like film at all. Even when not trying to truly mimic the nature of film, people will add sprocket holes and frame numbers to the edge of a photograph. One magazine I saw ran a monthly feature in which selected scenes from an actor’s career were highlighted with photographs carrying Fujifilm markings in the rebate, as if to make them more “real” rather than simply stock agency photos. The illusion fell down somewhat when the markings were identical every time. My theory is that the random structure of a film photograph, with its essentially continuous tone variations, is sought after because this is, after all, essentially how we naturally see the world with our own eyes. This is the same thing my vinyl aficionado friend was talking about when comparing a CD to a record. Our own senses work in an analogue manner, so when we sample things digitally we lose something that may not be quantitatively important, but that may make a qualitative difference.
