Intermittent Issues:
Digital Projection (Finally) Gets A Standard
Part 1: Overview, and The Image
By David Meek
August 23, 2005
BoxOfficeProphets.com

At last, my evil plans will come to fruition!

Overview

Digital Cinema Initiatives (DCI -- http://www.dcimovies.com/), an organization founded in 2002 by the major Hollywood studios, released its standards document for digital cinema in late July 2005. It outlines the essential elements of a digital projection package, from post-production at the studio to controlling the lights and curtains in the theater. The document can be found at: http://www.dcimovies.com/DCI_Digital_Cinema_System_Spec_v1.pdf. This article will attempt to summarize key items from the DCI document, and try to put them into the larger context of what the standard means for the moviegoing public.

It is important to note that DCI was not the first group to work on a digital cinema standard. The Society of Motion Picture and Television Engineers (SMPTE -- http://www.smpte.org/) has an ongoing Technical Committee (SMPTE DC28: Technology Committee for Digital Cinema), which continues to meet and develop new ideas. The most striking aspect of the DCI work is that it represents the main content providers and their expectations for how this new platform will work. As the National Association of Theater Owners (NATO -- http://www.natoonline.org/) notes in its response to the DCI document (found at: http://www.natoonline.org/DCI%20Final-Summary-7-27-051.pdf), the fact that a group run by the studios put certain key specifications into writing effectively means that any group that follows up will be bound to the DCI standard, to some extent.

The DCI standard therefore represents a turning point in cinema history - we now have an established reference that ensures that if an exhibitor invests in currently available projection technology, the core equipment will not be rendered obsolete by incremental improvements in upcoming years. (For the last decade, all equipment produced and purchased has effectively been a crapshoot, as no one could guarantee that an eventual standard would be backward compatible.)

This is not to say that parts of the system, especially in the area of data delivery and storage, might not require updating over the next three to five years as the studios and theater chains hammer out the final agreements on how to exchange the digital data. But the primary expense (the digital projector) can now be considered a safe future investment. The catch, as it has been all along, continues to be: who pays? I'll take that up in part two of this article.

Why Digital Cinema?

I wrote a full article on this subject several years ago ('The Future Is Digital', May 2002), so I won't retype the whole thing here. However, I will restate certain key points - for more detail, please see the earlier article.

-- Studios spend tens of millions of dollars each year in producing 35mm film prints of their movies. Those prints go out into the world, where they get seriously abused in the thousands of multiplexes across the country and around the world. Damaged prints must be replaced; location-specific prints (with subtitles and/or dubbing) must be made; and all of this stock must be managed and stored somewhere.

-- The process of projecting a 35mm film print is inherently destructive to the print itself. The mechanism of the projector strains the print each time it is projected. This means that archival prints must be handled with care, lest one of the few remaining prints of a given film be seriously damaged. Also, due to the near-universal use of 'platters' in commercial projection (individual reels are spliced together into one continuous run of film), every print is cut and spliced multiple times through the first- and second-run cycle.

-- The projection mechanism, as well as its assorted peripherals, requires frequent hands-on maintenance (focus, framing, lens changes). Most theaters do not have the staffing to keep someone in the film booth throughout each presentation, so patrons end up having to chase someone down to fix any problems. Additionally, few theaters allocate resources for the routine maintenance required for their equipment to perform to high standards.

Each of these items is addressed by digital projection: with one-time, up-front investments in data networks, studios can deliver their films at nearly zero cost; the same data set can contain all of the subtitles and dubbed audio tracks that a given geographic location might need, and can be switched on or off as appropriate for a given audience; digital projectors have very few moving parts, in comparison to film projectors, and should be more reliable with less hands-on effort; since the data is stored in digital form, the image will not degrade regardless of how many times it is projected or moved from location to location.

This is not to say that digital projection is perfect. I still have a soft spot in my heart for traditional film presentation. However, moviemaking is a business enterprise, and in the end money inevitably wins out. (Again, for more discussion on this topic, please refer to my earlier article.)

The Image

We begin with the central part of the DCI standard: the presentation of an image on a screen. This is more than just a bunch of dry numbers - these items will define our world as moviegoers for decades to come, just as a group of engineers over 70 years ago set the standard for film-based presentation, a model that still determines our theatrical experience today. Given the longevity of that standard, what we get from DCI represents the very best that we can expect from digital projection, probably for the entire lifetime of anyone reading this today.

Resolution

The part of the standard that most folks will fixate upon is resolution: specifically, how much data goes into each image as it is projected on the screen. We talk in terms of pixels, basically in the same manner as digital cameras and computer monitors. The DCI standard does specify mandatory resolutions, but we should get some terminology out of the way first.

There are some 'shorthand' values that are used in describing the resolution of a digital projector. When we talk of the HDTV standard in the United States, we talk in terms of 'lines' (horizontal lines per image, or height), like 1080i (1,080 lines per image) and 720p (720 lines per image). Cinema projection is referred to by the width of the image, and uses two key terms: 2K and 4K. The 'K' represents the number 1,000 (like in the Y2K bug, where '2K' meant the year 2000), and the number represents how many pixels across (wide) the image will be. So 2K means 2,000 pixels wide, and 4K means 4,000 pixels wide. Specifically, a '2K' projector will have a maximum resolution of 2048 pixels wide by 1080 pixels high, while a '4K' projector will be 4096 x 2160. (It is not a coincidence that the 1080i HDTV standard has the same number of horizontal lines as the 2K cinema standard.)
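
For those who like to see the arithmetic laid out, here is a quick sketch (in Python, purely for illustration) of how those shorthand names map to pixel dimensions and total pixel counts:

    # Illustrative only: the shorthand resolution names and their pixel dimensions.
    RESOLUTIONS = {
        "720p HDTV":  (1280,  720),   # width x height
        "1080i HDTV": (1920, 1080),
        "2K cinema":  (2048, 1080),
        "4K cinema":  (4096, 2160),
    }

    for name, (width, height) in RESOLUTIONS.items():
        print(f"{name:12s}: {width} x {height} = {width * height:,} pixels per frame")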

The earliest digital projectors (used for showing Episode I of the Star Wars prequels) were 1K models (roughly 1,000 pixels wide), state-of-the-art for 1999. Hollywood has since gravitated to 2K as the in-house standard: post-production houses typically scan film at 2K resolution for digital masters, and the most common commercial digital projectors for movie theaters are 2K models -- based on a Digital Light Processing (DLP -- http://www.dlp.com/) chip from Texas Instruments. (I go into DLP technology in some depth in my earlier article.) Since the US HDTV standard is similar to the cinema 2K standard, producing hi-def versions of films and TV shows for TV, versions for the upcoming high-definition DVD formats (HD-DVD or Blu-ray), and 2K cinema files - all from one original master source - becomes a much simpler process. Therefore, here in 2005, 2K represents the baseline for cinema resolution.

I should digress for a moment and address those letters after the HDTV numbers -- 'i' and 'p'. The 'I' refers to Interlaced, and describes a process by which half of each image is 'drawn' on the screen, skipping a line each pass (the first pass draws lines 1, 3, 5, 7, 9..., and the second pass draws lines 2, 4, 6, 8...). This approach goes back to the earliest days of television, and is used by the 1080i HDTV format. The 'P' stands for Progressive, which means that each line is drawn in order on screen for each image (1, 2, 3, 4, 5...), which is the method that computer monitors have used for display. Right now, only 720p uses this approach for HDTV.
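
A toy sketch of the difference, using a hypothetical eight-line image (Python again, just to show the ordering):

    # Toy example: the order in which the lines of an 8-line image get drawn.
    lines = list(range(1, 9))

    progressive_order = lines                        # 1, 2, 3, 4, 5, 6, 7, 8
    interlaced_order = lines[0::2] + lines[1::2]     # 1, 3, 5, 7, then 2, 4, 6, 8

    print("Progressive:", progressive_order)
    print("Interlaced: ", interlaced_order)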

Why does this matter? Well, the 2K Cinema standard is Progressive, meaning that all 1,080 lines are drawn in order. Given the technology in the 2K Cinema projectors, this means that current 1080i HDTV images are of somewhat lower quality than that of the 2K Cinema image, even though the resolution is almost identical. However, when High-Definition DVD discs are eventually released, they will contain 1080p (Progressive) signals. And TV manufacturers are now releasing the first 1080p-capable HDTV sets for home use (rear-projection and front-projection). With this upgrade, technology available in the home will be on a par with the 2K Cinema standard. So, when 1080p becomes widely available, for the first time in the entire history of motion picture projection it will be possible to equal the visual quality of a well-run movie theater in your home (without hauling a 35mm film projector into your den).

Of course, you don't have to limit projection to 2,000 pixels wide. Sony has been demonstrating a 4K (4,000 pixels wide) projector using their SXRD technology for over a year now, apparently trying to forestall any assumption that 2K is the final destination (and that DLP is a foregone conclusion as the underlying technology for cinema projectors). However, their projector is still a prototype (as of mid-2005), and the supporting hardware does not exist in a marketable format. Clearly, 4K is coming at some point -- but 2K represents the best that is both available and practical at this moment. One commentator made the comparison between 2K and 4K using 35mm and 70mm film: 70mm is a clearly superior format, but carries with it enough extra expense that its superiority is outweighed by 35mm's practicality.

The DCI group recognized these facts, and made an interesting choice: they set both 2K and 4K resolutions into their standard. They also called for backwards- and forwards-compatibility, so that current 2K data files will display on the eventual 4K hardware, and future 4K data files will play without complications on current 2K hardware. This means that current theaters that invest in existing 2K projectors will not have to throw them out ten years from now, even if 4K gains 'traction' in Hollywood. And, if a theater eventually buys a 4K projector, it won't be locked out of the existing catalog of 2K releases.

Having seen a number of films projected using the 2K format (including Episodes II and III of the Star Wars prequels), I believe that it is "good enough" to get digital projection up and going. That is, it does not beat a well-projected 35mm film image on every count - from a near-screen vantage point, edges of sharply defined objects such as credit text do display a certain amount of pixellation. But it improves on the 'average' projection experience in the great American multiplex to such an extent that the man-on-the-street (or the man-in-the-seat) should be able to express a preference for the digital version with little difficulty. If or when 4K comes around, I think we will then be in a position where the technology completely surpasses that of 35mm film.

Frame Rate

With resolution out of the way, we reach frame rate. The standard for motion picture film presentation has been, since the advent of sound-on-film, 24 frames per second (fps). The engineers at the time determined that 24 fps was fast enough to convincingly produce 'persistence of vision' (where your brain sees individual pictures going by so quickly that it no longer sees them as unique pictures, but perceives them as actual motion), but still slow enough to avoid wasting film stock on unnecessary higher frame rates. (Given the amount of footage shot each year, even one extra frame per second would have resulted in using tens of thousands of feet of extra film stock.) Ever since, 24 fps has ruled our lives in the movie theater.
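
Here's a rough back-of-the-envelope illustration of that trade-off (a sketch with assumed numbers: standard 4-perf 35mm film carries 16 frames per foot, and I'm using a hypothetical two-hour feature):

    FRAMES_PER_FOOT = 16      # standard 4-perf 35mm film holds 16 frames per foot
    RUNTIME_MIN = 120         # a hypothetical two-hour feature

    def feet_of_film(fps, minutes=RUNTIME_MIN):
        # Feet of 35mm film needed to hold a movie at a given frame rate.
        return fps * 60 * minutes / FRAMES_PER_FOOT

    print(feet_of_film(24))                      # 10,800 feet per print at 24 fps
    print(feet_of_film(25))                      # 11,250 feet at a mere 25 fps
    print(feet_of_film(25) - feet_of_film(24))   # 450 extra feet -- per print

Multiply that last number by every print struck and every foot of negative shot in a year, and the appeal of stopping at 24 fps becomes obvious.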

In recognition of this, the DCI standard bases digital projection (as well as sound, which we'll get to in part two) on multiples of 24 fps. This allows for the conversion of film-based content directly to digital form, frame-for-frame. However, part of their decision got me scratching my head. The DCI standard specifies that 2K data can come in either 24 or 48 fps (the specification is actually in hertz, but I think it makes more sense to keep talking about frames). But 4K data is limited to only 24 fps - no more.
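
Expressed as a simple rule (my own paraphrase in Python, not official spec language):

    # My paraphrase of the frame-rate rules described above -- not DCI spec text.
    ALLOWED_RATES = {
        "2K": (24, 48),   # 2K content may run at 24 or 48 fps
        "4K": (24,),      # 4K content is limited to 24 fps
    }

    def is_allowed(resolution, fps):
        return fps in ALLOWED_RATES.get(resolution, ())

    print(is_allowed("2K", 48))   # True
    print(is_allowed("4K", 48))   # False -- and that's the head-scratcher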

This may seem like nitpicking, but I can think of two issues that will be caught up in this decision: higher frame rates to improve overall picture quality, and 3D filmmaking. The former comes from a school of thought that 24 fps really isn't fast enough to avoid certain motion issues (like 'smearing' of people or objects moving rapidly through the frame). While the technology is radically different (digital projectors don't have shutters or blades interrupting the light path like film projectors), reactions to some of the higher-frame-rate film systems (like Todd-AO, IMAX and MaxiVision48) do seem to point to higher frame rates producing a smoother image on screen.

And then we have 3D. Apparently left for dead after the short-lived 3D craze in the 1950s, 3D films have occasionally resurfaced, usually to disappear as quickly as they came. However, two things have changed the fate of 3D: IMAX 3D, and high-profile directors taking on ambitious new 3D projects. IMAX 3D took 3D filmmaking away from its ignominious stereotype of red-and-green paper glasses (the 'anaglyph' method) and elevated it to incredible heights. Based on radically improved technology (polarized lenses or synchronized LCD shutters in the glasses, the superior 'rolling loop' IMAX projection mechanism, and higher frame rates), IMAX 3D films produced jaw-dropping results. As more and more 3D films are released to IMAX theaters, the potential audience for high-quality theatrical 3D releases continues to grow.

Some filmmakers have jumped in with both feet. James Cameron took the bathtub of money from his blockbuster hit film Titanic and poured it into pet projects. In 2003, he released Ghosts of the Abyss, an IMAX 3D film that returned to the site of the Titanic. This year, Cameron released Aliens of the Deep, an IMAX 3D film digging into undersea science and its relationship to space exploration. He is currently working on a feature film using the 3D rig developed for Aliens of the Deep. Robert Rodriguez, the maverick filmmaker behind such diverse fare as Desperado, Once Upon A Time In Mexico, the Spy Kids franchise and Sin City, has explored mainstream 3D with Spy Kids 3-D and The Adventures of Sharkboy and Lavagirl in 3-D. These were released using the old-style anaglyph/red-green glasses method, which allowed the films to play in any standard movie theater. And we should also mention Tom Hanks, who will roll out an IMAX 3D project of his own, Magnificent Desolation: Walking on the Moon 3D, later this year.

Getting back to the frame rate issue: why would frame rate matter for 3D? Since all versions of 3D (other than the red-green anaglyph method) require separate images for each eye, the system must be able to at least take the 24 fps minimum for normal movies and double that (resulting in at least 24 unique frames per second for each eye). IMAX already exceeds that minimum, and the DCI standard for digital projection does allow for double-rate display (48 fps) on 2K projectors and data. But since the 4K standard explicitly forbids any rate other than 24 fps, that means that future digital 3D films will be stuck at the 2K level - even if 4K is later adopted as a preferred standard. (Based on a careful reading of the standards document, my guess is that the DCI group realized that the 4K data set will already be so massive at 24 fps that attempting to double that rate would overwhelm existing data networks.)
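
To get a sense of why, here are the raw (uncompressed) numbers, assuming 12 bits per color channel and three channels per pixel - an assumption on my part, roughly in line with the color depth the spec calls for:

    BITS_PER_PIXEL = 36   # assumed: 12 bits per color channel, 3 channels

    def raw_gigabits_per_second(width, height, fps):
        # Uncompressed image data rate, in gigabits per second.
        return width * height * fps * BITS_PER_PIXEL / 1e9

    print(raw_gigabits_per_second(2048, 1080, 48))   # ~3.8 Gbit/s: 2K at 48 fps
    print(raw_gigabits_per_second(4096, 2160, 48))   # ~15.3 Gbit/s: 4K at 48 fps

Even before compression enters the picture, doubling the 4K frame rate means pushing four times the raw data of double-rate 2K.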

Compression

The last major item I'll address on the image side is compression. In the digital world, you have to take the individual images being captured by the camera and store them somehow for later editing and playback. Ideally, you'd just dump the raw data onto a bunch of computer drives, keeping 100% of the original values. The problem is that a 2K image contains 2,211,840 pixels, and a new image is being captured at least 24 times per second. So: one second of 2K data means storing 53,084,160 pixels; one minute of data holds 3,185,049,600 (yes, that's three billion) pixels; and so on. Keep going out to the 120-minute 'average' film, and you end up addressing more than one-third of a trillion pixels of image data. (For a 4K data stream, you easily exceed one trillion pixels.) Keep in mind, too, that each of those pixels requires both color and brightness information, so completely raw data for a 4K movie would run into the tens of trillions of bits - several terabytes. While computation and data storage have advanced dramatically just in the last few years, storing, processing and transferring that much data in raw form is simply not practical.
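
The arithmetic, spelled out step by step (these are the same figures as above, with my usual hypothetical 120-minute feature):

    PIXELS_2K = 2048 * 1080    # 2,211,840 pixels per frame
    PIXELS_4K = 4096 * 2160    # 8,847,360 pixels per frame
    FPS = 24
    RUNTIME_MIN = 120

    per_second = PIXELS_2K * FPS             # 53,084,160 pixels per second
    per_minute = per_second * 60             # 3,185,049,600 pixels per minute
    per_movie = per_minute * RUNTIME_MIN     # 382,205,952,000 -- over a third of a trillion

    per_movie_4k = PIXELS_4K * FPS * 60 * RUNTIME_MIN   # roughly 1.5 trillion pixels

    print(f"{per_movie:,}")
    print(f"{per_movie_4k:,}")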

So the image data has to be compressed to some extent. There are three issues that must be balanced when considering compression: the quality of the output image; the available storage, in terms of size and speed; and the computational power required to compress and de-compress the data. When a real-world limit is placed on one of these items (such as the current DVD disc, with a little more than nine gigabytes of possible data storage per disc), the other items must adjust to face this limit. (People who produce DVDs often refer to a 'bit budget' for the disc, meaning that increasing the storage space used for the film reduces the available space for extra features, and vice-versa.)
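
For a concrete (and entirely hypothetical) example of a bit budget at work:

    # Hypothetical DVD bit budget -- made-up numbers, not any real disc.
    DISC_CAPACITY_GB = 9.0
    audio_gb = 0.8
    extras_gb = 1.5
    video_gb = DISC_CAPACITY_GB - audio_gb - extras_gb   # whatever is left for the movie

    runtime_min = 120
    avg_video_mbps = video_gb * 8 * 1000 / (runtime_min * 60)
    print(round(avg_video_mbps, 1))   # ~7.4 Mbit/s average for the picture

Pile on more extra features, and the average bit rate (and thus the picture quality) of the main feature has to come down.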

The digital projection system for theaters doesn't have that fixed limitation from a physical medium, and places a higher priority on image quality than on storage space. As a result, DCI chose a relatively new compression standard, known as Motion JPEG 2000. This is a significant upgrade to the JPEG standard most of us already know (for example, the digital camera at my house stores its images in JPEG format), with JPEG 2000 incorporating 'wavelet' technology. While the original JPEG format was strictly 'lossy' (some data would be thrown out from locations where the program determined it wouldn't be missed), JPEG 2000 provides both 'lossy' and 'lossless' options (lossless compression means that you can reverse the process and get back exactly what you started with). DCI mandates JPEG 2000 as the sole compression standard.

Getting a little more tech-nerdy for a moment, comparing Motion JPEG 2000 to the standard used in the DVD format (known as MPEG-2) points up some interesting differences for content providers. When a DVD is produced, the video data must eventually be stored in the MPEG-2 format, and once the data reaches this point, editing becomes more complicated. This is because MPEG-2, in order to maximize the amount of data that can fit into a limited space, does not actually store each frame of the movie individually. Instead, the format stores only certain 'key' frames, and then stores just the portion of the image that changes from one frame to the next. This does wonders for the amount of data that can be stored in a fixed amount of space, but makes frame-specific editing difficult: if the frame you want to cut lies between the key frames, a fair amount of work has to be done to reconstruct the intermediate frames. (This is why MPEG-2 editing software is not commonplace, and good software is really expensive.)
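
A cartoon version of the two approaches (this toy Python sketch is just to show the idea, not how either format actually stores data):

    # Each 'frame' here is just a short list of pixel values.
    frames = [[1, 1, 1], [1, 2, 1], [1, 2, 3], [4, 2, 3]]

    # Frame-independent storage (the Motion JPEG 2000 idea): every frame stands alone.
    independent = [list(f) for f in frames]

    # Key-frame-plus-changes storage (the MPEG-2 idea): keep frame 0 whole, then
    # record only the (position, new_value) pairs that differ from the previous frame.
    deltas = [frames[0]]
    for prev, curr in zip(frames, frames[1:]):
        deltas.append([(i, v) for i, (p, v) in enumerate(zip(prev, curr)) if p != v])

    print(independent)   # four complete frames, each editable on its own
    print(deltas)        # one complete frame plus three small change lists

To edit the last frame in the second scheme, you first have to rebuild it from the key frame and every change list in between - which is exactly the headache described above.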

Motion JPEG 2000 does not have this limitation, since it compresses each frame independently - what happens in one frame does not affect any frames that follow. So if a content provider stored their work in this format, they could do frame-specific editing with much less difficulty than with MPEG-2. It should be noted that DCI did not mandate that content providers use Motion JPEG 2000 during production or post-production; only the final version of the overall data package for the theaters must be in this format. However, since it is frame-independent (and offers a lossless mode), it's conceivable that some production houses might eventually switch over on their own for in-house use.

In mandating JPEG 2000 as the compression standard, DCI also set a maximum cap on how much data can be used per second for the image. Setting this cap allows manufacturers to ensure that the equipment sold to theaters today can keep up with the data streaming off the hard drives without choking (and that the equipment won't be thrown away two years from now). The maximum bandwidth that can be sent to the decoding hardware is 250 megabits per second. This roughly translates to 31 megabytes of data per second, or around 1.9 gigabytes of data per minute. So, for a hypothetical 120-minute movie, there would be approximately 225 gigabytes of data stored -- for the image only. (Sound and other data are additional.) Compare that to the storage capacity of HD-DVD (30 GB, and potentially 45 GB) and Blu-ray (54 GB for dual-layer discs), and you can clearly see that digital cinema files will dramatically outpace even next-generation discs. I envision theaters having multi-terabyte RAID disk arrays to store and manage all of the data for the films currently being shown at a multiplex. (The DCI standard does specify that a multiplex theater allocate at least one terabyte of storage per screen in the facility, and requires a minimum level of redundancy in case of data loss. This can be individual storage per screening room, or centralized storage.) This may also lead to the creation of an entirely new job class: cinema network manager. (Hmmm...maybe there is a way for me to go back to work at a theater after all...)
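
The storage math, laid out (decimal megabytes and gigabytes, and once more my hypothetical 120-minute feature):

    MAX_IMAGE_BITRATE_MBIT = 250        # the DCI cap, in megabits per second
    RUNTIME_MIN = 120

    mb_per_second = MAX_IMAGE_BITRATE_MBIT / 8     # ~31 megabytes per second
    gb_per_minute = mb_per_second * 60 / 1000      # ~1.9 gigabytes per minute
    gb_per_movie = gb_per_minute * RUNTIME_MIN     # ~225 gigabytes, image only

    print(round(mb_per_second, 2), round(gb_per_minute, 2), round(gb_per_movie))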

Part two of this article will address audio, additional data, packaging, security...well, pretty much everything else. I hope you come back for it. Thanks for reading.