
Senior Shackster
792 Posts
Discussion Starter · #1 · (Edited)
I thought I'd start this post fresh here since it's an important topic worthy of debate.
The film industry seems to be in a format war over what system a feature film
should be shot in: 35mm or some type of digital photography?

First off, let me say that the end product in the non-theatrical home markets (cable, DVD,
Blu-ray, etc.) will ultimately be digital. But what generates the best results, and what will
survive into the future?

Currently there are a number of digital advocates who suggest shooting new features
in 4K, 2K or HD. The advantage is that the cinematographer can see what the footage
looks like instantly on set; there is no film processing involved and no 24-hour wait to
screen the dailies. It's certainly easier to shoot this way and might be quicker once
the set or location is lit.

Film takes longer to shoot. The stock has to be loaded in a black bag into the magazine, which has a roughly ten-minute capacity (1,000 feet). It also takes a bit longer to light, provided the cinematographer is talented enough to 'paint with light'. The negative has to be processed, which means at least a day's wait, and then transferred to whatever the editor will cut the movie on, be it an Avid or a 35mm workprint.
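For reference, that ten-minute figure can be checked against the standard 35mm geometry (16 frames per foot of 4-perf film, shot at 24 frames per second); this little Python sketch is just that arithmetic, nothing more:

```python
# Run time of a 35mm magazine load, from standard 4-perf geometry:
# 16 frames per foot of film, shot at sound speed (24 frames per second).
FRAMES_PER_FOOT = 16
FPS = 24

def run_time_minutes(feet: float) -> float:
    """Minutes of screen time in `feet` of 4-perf 35mm at 24 fps."""
    return feet * FRAMES_PER_FOOT / FPS / 60

print(round(run_time_minutes(1000), 1))  # a 1,000 ft load runs about 11.1 minutes
```

So a full 1,000-foot load actually runs a bit over eleven minutes; ten minutes is the usual rule of thumb once leader and short ends are accounted for.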

Since most films are edited digitally, there is really no difference in post production.

After editing, a digital movie can simply be output to the desired format for exhibition.
Film needs to be negative matched, and then either transferred from the negative
directly to digital or an answer print has to be made. Then 35mm intermediate duplicate elements (either an IP/IN or a DI) are created for mass printing.

I haven't seen a 4K projection in a cinema as of this date. If the film was shot in 4K there
would be no generation loss, so it would look as good as what was shot on set. I've seen
2K projection and 35mm high-speed prints (third generation), and they are so unimpressive
I rarely go to the megaplex any more. However, in the past, top-quality film prints were made
in 70mm and 35mm dye-transfer Technicolor. In each case they were struck directly from the camera negative, which is the optimum quality for an analog format like film, and they looked spectacular. Better than any digital system I've screened so far. But the reality is, few cinemas outside of Hollywood or film festivals exhibit those prints any more, which is part of what caused the decline of the moviegoing experience in theaters (chronicled in my book, "The Moviegoing Experience 1968-2001", www.mcfarlandpub.com).

Today the generation loss between the 35mm camera negative and the final release print does degrade the image quality on screen. So most likely a 4K image that originated in 4K would look better than a third-generation 35mm release print, but not necessarily better than a 4K image transferred from a 35mm negative. Neither would look as good as the 70mm or dye-transfer 35mm first-generation prints that were available through 2001. The lack of optimum-quality film prints slants the argument in favor of digital projection in this area.

What about the quality difference in home video formats? Here's where it gets dicey. I have a film background and experienced movies at their zenith, so I believe film emulsion is a superior art form. Of course I'm referring to cinematographers like Freddie Young, who shot "Lawrence of Arabia" and called his camerawork "painting with light". I'm also referring to DPs like Ted Moore (Bond films) and Robert Burks (Hitchcock films).

Aside from the incredible resolution of shooting in a large format like 65mm or 35mm VistaVision (horizontal exposure of two frames), film has unique characteristics not incorporated into digital pixels. Film imagery is based on its grain structure. I'm not referring to 'graininess' but to the subtle gradations of grain from foreground to background that are created when light from a person or object hits the emulsion itself. It's what gives film a sense of dimensionality that digital lacks. Digital is technically sharp, but perhaps too sharp, because the pixels lack this grain structure. Film photographs highlights more naturally, similar to how the human eye sees things. Digital tends to burn out the highlights,
giving the image an unnatural look that limits the lighting design on set. Film stock has
such a wide latitude that it has limitless lighting design potential.

What's also complicated and controversial is transferring film to digital formats. Modern T-Grain film emulsion is a 200-speed stock with a four-stop latitude. Aside from the actual dye couplers being very fine grained (much finer than emulsions in the past), they can be under- or over-exposed a full four f-stops and still be brought back to a sharp, fine-grain image when printed or scanned digitally. When transferring a 35mm camera negative to HD (or any digital format in existence or yet to be invented), film's unique grain structure is scanned and replicated in the pixels (regardless of how many pixels are in that format). So the final digital image will look like a print from the camera negative, more lifelike and 'film like' than a movie shot digitally to begin with. Currently, 4K is the standard for transferring a 35mm movie; 6K is being utilized for some large-format 65mm transfers. Judging from the results of a well-shot film by Ted Moore, the Blu-ray disc of "Goldfinger" in 16:9 looks spectacular and comparable to the original camera negative Technicolor prints made back in 1964.
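Since each f-stop is a doubling or halving of exposure, a four-stop latitude in each direction corresponds to a very wide brightness range on the negative. A minimal sketch of that arithmetic (the function name here is my own, not an industry term):

```python
# Each f-stop doubles (or halves) the light reaching the emulsion, so a
# negative that holds detail from 4 stops under to 4 stops over spans
# 2 ** (4 + 4) = 256:1 in recoverable scene brightness.
def exposure_ratio(stops_under: int, stops_over: int) -> int:
    """Ratio of brightest to darkest recoverable exposure."""
    return 2 ** (stops_under + stops_over)

print(exposure_ratio(4, 4))  # 256
```

That 256:1 span is what lets a lab or scanner pull a printable image out of footage exposed well off the nominal setting.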

And this brings up one of the most important attributes of originating on film: 35mm film can be adapted into any format now or in the future. Over the years I've heard many sales pitches from digital advocates claiming 'film is obsolete' and proposing that the current digital format replace it entirely for principal photography. I heard this sales pitch regarding 2K, which has since been replaced by 4K and is no longer considered up to spec, yet a film shot in 35mm can be transferred to either successfully. They are even developing 8K. The problem with shooting anything digitally is that video is a moving target. No format lasts long, and the equipment to access it is astronomically expensive and must be maintained. Once a format is eliminated, it might be difficult or impossible to access the data decades from now. But a film element can be scanned in any system
they come up with years from now because the image is a 'hard copy' on plastic. You can actually see it and touch it. It's not an electronic signal.

Which brings us to the most critical aspect of the debate. Archival implications.

For any audio/visual medium to be considered 'archival', it must have the potential to last 100 years. While some digital advocates note how unstable early film was (nitrate, pre-1983 Eastmancolor), these problems have been corrected. Modern triacetate low-fade color negative stock is estimated to last 75 years in terms of dye stability, and longer in good storage. All other pre-print materials are on Estar-base stock, which is thought to have an expanded life of 100 years or more. Estar stock is used for the fine-grain masters (a fine-grain color positive made directly from the camera negative) and the black-and-white separations (a fine-grain black-and-white positive of each primary color which, when combined, represent the full color negative). Most major studios that shoot their product in 35mm store these three elements in separate vaults for the long run: the low-fade
35mm camera negative, a low-fade 35mm color interpositive and three no-fade 35mm black-and-white separations. That should give these movies a 100-year minimum lifespan, which is certainly archival.

No digital media is considered archival. Period. That is obviously very disturbing and risky for movies shot in these formats, whether stored as a computer file, on a hard drive or on anything based on magnetic oxide. (The oxide corrodes the base it's bonded to and sheds off.) The only solution is to transfer the 2K, 4K or HD data to a 35mm negative for the future. In the case of classic movies restored digitally, studios like Fox are making both new 35mm negatives and 35mm black-and-white separations output from the restored 4K data. I screened the new "Sand Pebbles" print derived from a restored, output negative and it looked great, but the movie originated in 35mm Panavision, which contained the unique
film grain structure and resolution that was replicated digitally, as mentioned above. I don't know what a negative would look like if it was output from a movie shot in 4K. I know negatives output from 2K sources don't look that good, and movies shot in 2K and output
to 35mm years ago will also be of lesser quality than films shot in 35mm and scanned at 4K today.

In terms of cost savings, it's a bit complicated. It depends on whether the producer factored 35mm preservation materials into the budget for a digital shoot. If they did, then it's more expensive to shoot in 2K, 4K or HD and output a 35mm negative from the digital master, probably between $50,000 and $100,000 more. In those cases it's cheaper to shoot in 35mm, have a cut camera negative, and then transfer it to 4K for home video release. Even for low-budget productions without preservation in the budget, it's cheaper to shoot in 35mm because you have the negative as the final product and a future archival element, even without the IP and B&W separations as backup. Now, if a producer shot the movie digitally and didn't make any film preservation elements, it might be cheaper to shoot
digitally, but then there would be the risk of planned obsolescence of the format or complete loss of the data in the long run. To shoot a digital movie and not be able to access the
data a decade later, or to lose it entirely, would be a cultural tragedy as well as a tremendous loss of revenue.

As I've noted before, a producer can purchase leftover 35mm film stock from a recent production inexpensively, since stock depreciates once it leaves the dealer. And many cameramen own their own equipment, so it's a flat charge for the 35mm camera and the DP. Processing isn't that expensive compared to other costs; negative matching is, but it's a one-time charge. I'm not sure what the new video equipment runs, but I suspect it's considerably more expensive to rent or purchase and less reliable than mechanical 35mm cameras. The insurance costs might be greater too. A 35mm motion picture camera is worth about $100,000 new and maybe $10,000 used; a 4K digital camera is $800,000. So I'm not sure there are any cost savings shooting digitally once these other factors are considered.
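Purely as an illustrative back-of-envelope comparison, using only the figures quoted above (the midpoint film-out estimate and the framing of the comparison are my own assumptions, and real budgets vary enormously):

```python
# Illustrative only: plugs in the figures quoted in the post above.
digital_camera_4k = 800_000   # 4K digital camera, purchase price as quoted
film_camera_used = 10_000     # used 35mm camera, as quoted
film_out_negative = 75_000    # midpoint of the $50,000-$100,000 film-out estimate

# Digital shoot that still budgets a 35mm preservation negative:
digital_with_preservation = digital_camera_4k + film_out_negative
# Film shoot: the cut camera negative is itself the archival element.
film_route = film_camera_used

print(digital_with_preservation - film_route)  # cost gap under these assumptions
```

Under these (admittedly crude) assumptions, the digital route with proper preservation comes out far more expensive; stock, processing, insurance and rental-versus-purchase terms would all move the numbers in practice.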

So in summary, the three basic issues are:

1) 35mm vs. Digital imagery for principal photography. Which looks better? Pros and Cons.

2) What looks best in cinemas, regardless of how the movie was shot: a DLP
presentation or a film print.

3) What format will last into the future.

In my opinion, numbers 1 and 2 are debatable. Item number 3 isn't. Digital isn't archival,
at least at this moment in time. It's not beyond the realm of possibility that some
permanent medium for storing digital data will be invented, but right now the only long-term option is outputting the movie onto 35mm negative stock.

In the film industry some directors have taken sides in the debate. Spielberg and
M. Night Shyamalan insist on filming in 35mm even if they incorporate some digital
imagery into the final product. Others, like George Lucas and James Cameron, prefer
digital photography. There is also a third group that shoots on 35mm film but then
scans the image to alter the entire color design digitally after the fact. The Coen
Brothers did this with "O Brother, Where Art Thou?" and Scorsese did it for some of
"The Aviator". In both cases it was done in 2K, but since they have 35mm film elements
it could be redone in the future in 4K and beyond, which is a major attribute.

I prefer 35mm film for as much of the final negative as possible and only utilize digital
imagery as a last resort for certain shots. I do like the look of my 35mm movies transferred to HD,
however, because my lighting design and grain structure are replicated in the pixels. Digital
is a useful tool, but I don't consider it the final product, just one of the many formats a
35mm feature will be released in. In other words, I consider digital useful 'software' but not 'hardware'.

And I admit I have some biases. I really hate digital stunts, for example. They look totally
artificial and cartoonish to me, which undermines the thrills of the action. I also prefer live-action latex monsters and creatures (i.e. "The Thing", "Alien") over CGI creatures. In fact, I found an F/X artist who could still do latex creatures for my latest film because of my preference for that type of monster.

Discussion Starter · #2 · (Edited)
If you're wondering how the resolution of 35mm compares to 4K, they
are probably comparable (ignoring the emulsion vs. digital grain structure issues),
at least for a standard 35mm film intended for 1.85 cropping in cinemas
and a 16:9 ratio in digital formats.

However, there are other film formats with higher resolution than 4K.
Super 35mm exposes the entire silent-aperture frame (including the area
reserved for the optical track), so it has greater resolution. Older
processes still being released on DVD include VistaVision and 65mm,
which are in the range of 8K, and Imax is greater still.
For film stock, the size of the image makes a big difference when digitally
scanning it. Super 35 is still used today, so it represents a format that
still exceeds digital systems for principal photography.
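As a rough pixel-count yardstick for the digital side of this comparison, the standard DCI 2K and 4K containers (with "8K" here simply meaning double the 4K dimensions, since no DCI 8K container existed) work out as follows:

```python
# Pixel counts of digital-cinema frame sizes, for comparison with the film
# formats above. 2K and 4K are the standard DCI container sizes; "8K" here
# is simply double the 4K dimensions in each direction.
formats = {
    "2K": (2048, 1080),
    "4K": (4096, 2160),
    "8K": (8192, 4320),
}
for name, (width, height) in formats.items():
    # Doubling both dimensions quadruples the pixel count at each step.
    print(f"{name}: {width * height / 1e6:.1f} megapixels")
```

So 4K carries roughly 8.8 megapixels per frame, which is the ballpark usually quoted for a standard 35mm frame, while 65mm and VistaVision scans land closer to the 35-megapixel "8K" row.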

And I should note that many if not most independent filmmakers don't seem
concerned with preserving their feature films. They assume, or hope, the
distributor will do that for them. Unless it's specified in the contract, it
won't happen, nor does the distributor have any incentive: most agreements
have a term limit, and if the distributor doesn't own the movie, why should
they spend money to save it for the long run? The owner should do that. If an indie
is lucky enough to get a major studio to buy their movie, the studio will probably
make protection elements.

It's curious, but when I speak to fellow indie filmmakers and bring this issue
up, most don't want to discuss it, and sometimes they even mock me for
being so diligent in preserving my own features. Lines like "What do you
think you have, 'Citizen Kane'?" suggest that only universally acknowledged
great films should be saved. Of course my response is that everything
should be saved, because who knows what will be considered worthy years from
now. I also tell them "Citizen Kane" is a really bad example of 'selective preservation'
because the film was a bomb when it was released and only became a classic
decades later through television syndication and film school presentations in 16mm.
"Carnival of Souls" was also a bomb when it came out, but it was one of my favorite
horror flicks when I saw it on the late late show as a kid. I tracked down the director,
Herk Harvey, restored the film in 35mm and had it deposited at the George
Eastman House archive for long-term preservation under his name. Now it's considered
a classic, but had I not taken action it might not exist today. Is
"Carnival of Souls" as worthy as "Citizen Kane"? No, but it's a good little horror flick
and it should be preserved.

For unknown reasons there's a mental block regarding film preservation among many
directors, both old and new. It's fortunate that outsiders (fanatic film buffs like
myself) have taken the initiative to save so many classics. Robert A. Harris
("Lawrence of Arabia"), Bob Gitt ("Lost Horizon") and Ron Haver ("A Star is Born")
were the men who restored and preserved the classics they loved, not the directors
who made them. Strange, isn't it? Now, I understand that in the pre-television era
movies really were a one-shot deal, like a stage presentation, but after the advent
of syndication all movies had a limitless lifespan. Why didn't the producers and
directors take action to make sure their films were properly preserved?

Some contemporary filmmakers do get involved with the preservation and restoration
of their movies, like Spielberg, Lucas and Scorsese. Others, like Richard Donner,
seem uninterested. It was the fans of "Superman II" who persuaded Warner Brothers
to allow Donner to finish the movie and re-cut it to his original intent, not Donner himself.
I hope that in film classes the professors and historians are telling aspiring filmmakers the
importance of preservation. There are so many famous 'lost' films, and now that we
have the materials to preserve movies there's no excuse not to.