Sony has been helping professionals realise their creative vision for decades, supporting a huge range of leading independent cinematographers and directors and giving them the tools to bring stories to life and capture emotion in every frame. Richard Lewis, Chief Engineer at Sony Europe, explores the evolution of digital cinematography, from its earliest experiments to its central role in modern film-making.
Early beginnings
From the 1980s onwards, professional film-makers took their first exploratory steps in recording moving images electronically as an alternative to film. In 1993, Sony introduced DigiBeta, which transformed the economics of programme production compared with costly 16mm film, then the de facto standard for mainstream drama and documentaries. In the UK, the BBC Television Film Services Department was notable as an early adopter of Digital Betacam.
The film industry was far less inclined to throw out its 35mm inventory and replace it with standard-definition video tape overnight. For the major studios celluloid would remain the default choice for years to come, but for an emerging generation of younger movie makers on tight budgets, capturing electronically offered a compelling entry point into the industry.
For the first time, directors were offered a credible alternative to 16mm. Cinema’s traditional on-set production hierarchy – director, director of photography, cameraman, grip – was joined by an additional tier of ‘point-and-shoot’ film-makers armed only with a video camera and a compelling idea. Break-out movies like The Blair Witch Project (1999, shot on Sony Hi-8) grossed $140 million domestically, proving there was an international audience for movies made on low-cost, accessible video technology.
Video became an increasingly popular choice through the following decade (2000-2010), with a steady string of low-budget features and documentaries produced on formats like DigiBeta and DVCAM. Lars von Trier and Thomas Vinterberg’s Dogme 95 movement flourished thanks in no small part to digital, while other respected auteurs like Wim Wenders, Mike Figgis, Spike Lee, Hal Hartley and Peter Greenaway embraced the technology.
The Star Wars Legacy
2002 was a landmark year in digital cinematography, when George Lucas and DP David Tattersall shot Star Wars: Episode II – Attack of the Clones entirely in HD 24p using Sony’s HDW-F900 CineAlta camera. This was one of the first and most significant theatrical productions where digital effectively replaced 35mm analogue film in a conventional cinematic feature film workflow. Teamed with optics by Panavision, the F900 also marked the introduction of CineAlta – Sony’s brand-mark that distinguishes high-end cine products from mainstream broadcast and portable cameras.
At the time of its release, Lucas tried to persuade movie theatres to switch to digital projection – itself a brand-new medium. Unfortunately, this vision was a little ahead of its time: few cinemas were equipped to project digitally, so the vast majority of audiences saw the HDCAM masters transferred back to conventional 35mm celluloid. Nonetheless, Lucas was giving a very clear signal of where cinematographic workflows would be heading a decade later.
Subsequently, Lucas used the more advanced Sony HDC-F950 for Star Wars Episode III: Revenge of the Sith (2005), recorded on the new HDCAM SR format. The next CineAlta iteration, this acclaimed camera offered superior image quality and colour reproduction compared with its predecessor.
Digital cinematography has long aroused strong opinions in the creative community. As a medium, it certainly offers plenty of advantages over celluloid. There’s no need to worry about expensive film stock or short running times per reel, and instant playback on set saves the DP and director the agony of waiting overnight to view rushes.
But these operational attractions have been – and still are – balanced by a healthy debate about picture quality. Many directors resisted the first wave of digital, pointing out that HD video was no real match for 35mm film in terms of detail, colour gamut, exposure latitude and noise levels. And of course film grain – a physical consequence of the medium itself – has provided story-tellers with a powerful evocative tool in its own right. Equally, some film-makers have opted for digital as a conscious aesthetic choice. Shooting Tron: Legacy (2010) on the Sony F35 gave director Joseph Kosinski and DP Claudio Miranda a distinct on-screen look, suiting both the narrative and the computer-generated scenes that dominate much of the movie.
The end of the 2000s saw the release of another landmark movie that would effectively silence any criticism of digital’s commercial viability. Photographed entirely in 3D by DP Vince Pace, James Cameron’s US$2.8 billion-grossing Avatar (2009) employed the Fusion system. This pioneering set-up used two Sony HDC-F950 cameras, capturing stereoscopic images with the aid of a half-silvered mirror.
F65 Arrives
The next step in the evolution of CineAlta came in 2011 with the launch of Sony’s first 8K-capable camera, the F65. With a 20-megapixel sensor the same size as a Super 35mm 3-perforation film frame, the F65 represented a huge leap forward in image quality. Its 4K deliverables, with over four times the resolution of Full HD, finally offered pictures that made a credible match for celluloid. The numbers were impressive, with detail, colour gamut and latitude winning the acknowledgement that digital had earned its place on the movie lot.
Today, CineAlta is increasingly the go-to choice for Hollywood big names and independent film-makers. In May this year, French director Jacques Audiard won the Palme d'Or at the 68th Cannes Film Festival for Dheepan, shot on the Sony F55 by DP Eponine Momenceau. This follows Nuri Bilge Ceylan being awarded the Palme d'Or in 2014 for Winter Sleep, shot using the F65 by Gökhan Tiryaki.
Alongside the flagship F65, lower-cost cameras like the F55 and F5 make true CineAlta quality accessible on limited budgets. And as 4K becomes the norm for digital movie-making, our industry is now turning its attention to other aspects of picture quality. In the quest for ever-more immersive story-telling, directors are looking to higher frame rates, High Dynamic Range (HDR) and a wider colour space closer to that of the human eye as an adjunct to the increased resolution of 4K.
“Sony has a ‘scene-to-screen’ understanding that threads through our DNA. Listening to the needs of the film-making community, we continue to create or refine the tools and technology we offer to help film-makers realise their personal vision.” – Richard Lewis, Chief Engineer, Sony Europe