Can airlines nearly double the volume of IFE content they store on board without increasing the storage capacity of their current systems? Can content quality control be automated? Can post-production costs be reduced by using the cloud? Can content delivery cycles be shortened to effectively increase the duration of early window content?
For many airlines, any one of these possibilities could qualify as some kind of Holy Grail. And now, new and emerging technologies, standards and specifications are transforming these possibilities into clear opportunities for the future of IFE.
High efficiency video coding
High Efficiency Video Coding (HEVC) will have a great impact on IFE, and is the biggest key to opening opportunities to nearly double the volume of content that can be stored using current capacity.
HEVC is also known as H.265, as it is the successor to H.264/MPEG-4, otherwise known as Advanced Video Coding (AVC). MPEG-4 video encoding was adopted for IFE by the APEX Technology Committee (APEX 0403) following specification efforts that began more than 12 years ago, according to APEX technical director Bryan Rusenko. AVC is in turn the successor to the MPEG-1 and MPEG-2 codecs, which are still supported on as many as half of the IFE systems currently flying. MPEG-4 permitted the manipulation of individual ‘objects’ within a video frame, whereas in MPEG-1 and MPEG-2 the entire frame was the ‘object’.
HEVC was developed with the objective of nearly doubling the efficiency of MPEG-4. And while video coding efficiency is somewhat variable depending on the content being encoded, HEVC looks to have met its objective, according to Rusenko.
The terms ‘coding’, ‘encoding’ and ‘codec’ all refer to video compression technology. Video compression is required because master digital files use more bits (i.e. information) and higher bit rates (i.e. the speed of information playback) to archive a movie at the highest quality. Even digital cinema files, for projection on large theatrical screens, produce their high-quality pictures using much higher bit rates than files intended for a television screen, IFE display or consumer tablet.
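The arithmetic behind these bit rates is straightforward, and a short sketch makes the stakes concrete. The bit rates below are illustrative round numbers chosen for this example, not figures from any particular IFE encode:

```python
def file_size_gb(bit_rate_mbps: float, duration_min: float) -> float:
    """Approximate file size in gigabytes for an average bit rate.

    size = bit rate (bits/second) x duration (seconds), converted to bytes.
    """
    bits = bit_rate_mbps * 1_000_000 * duration_min * 60
    return bits / 8 / 1_000_000_000

# A two-hour feature at an illustrative 8 Mbps encode:
avc_size = file_size_gb(8, 120)    # 7.2 GB
# The same feature at roughly half the bit rate, as HEVC aims to allow:
hevc_size = file_size_gb(4, 120)   # 3.6 GB
```

Halving the average bit rate halves the file size outright, which is why codec efficiency translates so directly into onboard storage capacity.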
The science of video compression takes advantage of the reality that there is redundant data not only within a frame, but also across adjacent frames. Encoding these redundancies once and reusing the data in other regions or frames reduces the overall amount of data in the file, according to Rusenko. At a playback rate of 24 frames per second, the frames on either side of any given frame may be very similar in terms of the information they carry. Rather than repeating all of that data, compression technology eliminates duplicate instances of it and simply replays the original data in subsequent frames.
Using a technique known as motion compensated prediction, which encodes blocks of pixels by making reference to another area in the same frame (intra-prediction) or in another frame (inter-prediction), MPEG-4 defines macroblocks of up to 16×16 pixels. HEVC can describe block sizes of up to 64×64 pixels.
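The intuition behind inter-frame coding can be shown with a toy sketch: store a key frame whole, then store only the pixels that change from one frame to the next. This is purely illustrative – real MPEG and HEVC encoders work on blocks with motion vectors and transforms, not individual pixel diffs:

```python
def delta_encode(frames):
    """Toy inter-frame coder: keep the first frame whole ('key'),
    then store only (index, value) pairs for pixels that changed."""
    encoded = [("key", list(frames[0]))]
    prev = frames[0]
    for frame in frames[1:]:
        diffs = [(i, v) for i, (p, v) in enumerate(zip(prev, frame)) if p != v]
        encoded.append(("delta", diffs))
        prev = frame
    return encoded

def delta_decode(encoded):
    """Reverse the toy coder: apply each sparse diff to the prior frame."""
    frames = [list(encoded[0][1])]
    for _, diffs in encoded[1:]:
        frame = list(frames[-1])
        for i, v in diffs:
            frame[i] = v
        frames.append(frame)
    return frames

# Two nearly identical four-pixel 'frames': the second is stored as a
# single changed pixel rather than four values.
frames = [[10, 10, 10, 10], [10, 12, 10, 10]]
encoded = delta_encode(frames)
```

The less the picture changes between frames, the smaller the deltas – which is exactly the redundancy Rusenko describes the codecs exploiting.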
While the initial version of HEVC was ratified in January 2013, Rusenko says that it may be another year before HEVC is ready to be implemented for IFE use. Among the remaining matters to be resolved are the applicable patent royalties. The APEX Technology Committee’s Encoding and Encryption Technologies Working Group (EETWG), chaired by Pierre Schuberth of Thales InFlyt Experience and Rusenko, is developing a new specification for IFE.
HEVC will offer IFE systems providers two ways of taking advantage of its greater efficiency. One, of course, would be to maintain current display quality at approximately half the bit rate, thus nearly doubling the volume of content that can be stored in the same space. The alternative would be to deliver better quality – higher definition (4K) and higher dynamic range – to newer displays.
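The first option can be quantified with the same simple arithmetic, here turned around to ask how many hours of content fit in a fixed storage budget. The 500 GB budget and the bit rates are illustrative assumptions, not figures for any specific IFE system:

```python
def hours_of_content(storage_gb: float, bit_rate_mbps: float) -> float:
    """Hours of video that fit in a storage budget at an average bit rate."""
    total_bits = storage_gb * 1_000_000_000 * 8
    seconds = total_bits / (bit_rate_mbps * 1_000_000)
    return seconds / 3600

# Hypothetical 500 GB content store:
at_8_mbps = hours_of_content(500, 8)   # ~139 hours
at_4_mbps = hours_of_content(500, 4)   # ~278 hours
```

Halving the bit rate exactly doubles the hours of content the same hardware can carry – the ‘nearly double’ headline figure, subject to HEVC actually achieving the 50% saving on real content.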
The first step taken by the EETWG, according to Rusenko, was to revise APEX 0403, the specification written to support the use of MPEG-4 in IFE, relaxing its bit rate requirement so that, while awaiting HEVC, systems providers can use emerging bit rate optimization tools that deliver the same or better display quality at potentially lower bit rates.
Internet media subtitles and captions
Internet Media Subtitles and Captions (IMSC) is an application of Timed Text Markup Language (TTML), which was developed for the distribution of subtitles and captions worldwide by the W3C, according to the author of IMSC 1.0, Pierre-Anthony Lemieux of MovieLabs, who has participated in the activities of APEX’s Closed Caption Working Group (CCWG). IMSC reduces fragmentation by bringing together multiple profiles of TTML and supports both text- and image-based subtitles and captions.
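To make the TTML family concrete, here is a sketch that builds a minimal TTML-style subtitle document – the markup that IMSC profiles – from a list of timed cues. The element and attribute set is deliberately simplified for illustration and is not a conforming IMSC document:

```python
import xml.etree.ElementTree as ET

TTML_NS = "http://www.w3.org/ns/ttml"

def make_subtitle_doc(cues):
    """Build a minimal TTML-style document from (begin, end, text) cues.

    Timing uses clock-time strings; styling, regions and image
    profiles are omitted in this sketch.
    """
    ET.register_namespace("", TTML_NS)
    tt = ET.Element(f"{{{TTML_NS}}}tt", {"xml:lang": "en"})
    body = ET.SubElement(tt, f"{{{TTML_NS}}}body")
    div = ET.SubElement(body, f"{{{TTML_NS}}}div")
    for begin, end, text in cues:
        p = ET.SubElement(div, f"{{{TTML_NS}}}p", {"begin": begin, "end": end})
        p.text = text
    return ET.tostring(tt, encoding="unicode")

doc = make_subtitle_doc([("00:00:01.000", "00:00:03.000", "Welcome aboard.")])
```

Because the captions live as timed text rather than burned-in images, the same cue list can be restyled or re-rendered per display – the flexibility that text-based profiles offer over bitmaps.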
Whether subtitles and captions should be text- or image-based is a question the IFE community has struggled with since its earliest implementation of closed captions nearly 10 years ago, when the US Department of Transportation first announced its intention to develop requirements for the use of closed captions in IFE. At that time, the IFE industry had only begun to codify MPEG-4, which – as an object-based codec – was able to manage text within a frame as an object, something MPEG-1 and MPEG-2 could not do, as those standards treated the entire frame as a single object.
The only way to implement closed captions – which convert the audio in content into its written form for hard-of-hearing viewers – in MPEG-1/2 was to use image-based captions, in which a graphic image of the captions was superimposed on the video frame in playback, explains Rusenko. To support subtitles – which translate the audio into a different language for viewers who do not speak the original language – two versions of every movie or television program had to be stored, one with subtitles ‘burned in’ and the other without.
While there was some support for codifying both text- and image-based subtitles and captions when APEX 0403 was drafted, it was ultimately decided, Rusenko explains, to codify only rendered image captions, using bitmap technology. Today, only a limited amount of the closed caption content delivered to IFE systems provides bitmap captions – even to most of the IFE systems that are MPEG-4 based and have the ability to use text. Certain languages, such as Southeast Asian languages, may be displayed in both vertical and horizontal formats, and vertical display may have to depend on rendered image formats. However, there appears to be little or no current demand for vertical display in IFE.
The most likely deliverable for closed captions in the future will be some form of Timed Text, such as SMPTE-TT (SMPTE ST 2052) – used mostly for theatrical films and web-based content – and in some cases WebVTT, which is used frequently for television. To support MPEG-1/2 systems, MPEG-4 H.264 systems and, in the future, HEVC H.265 systems, some form of conversion will be needed from the deliverable provided by the content provider to the content delivered to the aircraft. While APEX’s CCWG, led by Bryan Rusenko and this author, is still working to adopt a specification suitable to address the DOT’s current closed caption initiative, IMSC 1 – and the newer IMSC 2, in development – appear to offer a friendly format for such conversions.
The CCWG is working to ensure that the W3C initiative to define conversion (mappings) between Timed Text and WebVTT will support IFE needs, says Rusenko.
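A conversion of this kind can be sketched in a few lines: parse the timing line of a WebVTT cue and re-express it as a TTML-style paragraph element. This is a minimal illustration only – real mappings, like those the W3C is defining, must also handle styling, positioning, shorter timestamp forms and cue settings:

```python
import re

def vtt_cue_to_ttml_p(cue: str) -> str:
    """Convert one simple WebVTT cue ('hh:mm:ss.mmm --> hh:mm:ss.mmm'
    followed by text lines) into a TTML-style <p> element.

    Styling, positioning and cue settings are ignored in this sketch.
    """
    lines = cue.strip().splitlines()
    m = re.match(r"(\d\d:\d\d:\d\d\.\d{3}) --> (\d\d:\d\d:\d\d\.\d{3})", lines[0])
    if m is None:
        raise ValueError("unsupported cue timing format")
    begin, end = m.group(1), m.group(2)
    text = "<br/>".join(lines[1:])  # preserve line breaks as TTML <br/>
    return f'<p begin="{begin}" end="{end}">{text}</p>'

converted = vtt_cue_to_ttml_p("00:00:01.000 --> 00:00:03.000\nWelcome aboard.")
```

Because both formats carry the same essential payload – text plus begin/end times – round-tripping between them is largely a matter of mapping syntax, which is what makes IMSC attractive as a conversion hub.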
Interoperable master format
A key to speeding up the IFE content delivery supply chain is to integrate IFE content delivery into the larger ecosystem that emphasizes the interoperability of source materials across markets.
One of these ecosystem initiatives comes from SMPTE, an internationally recognized standards organization responsible for more than 600 standards, recommended practices and engineering guidelines for film-making, digital cinema, television production, audio recording and information technology.
As part of a move toward a standardized file-based workflow, SMPTE’s Interoperable Master Format (IMF) initiative advocates the creation of a single, standardized master file for worldwide distribution across multiple markets, including IFE. The concept builds on that of digital asset management, a process advocating the creation of content elements once at the highest level, e.g. a level above digital cinema, and then repurposing (versus recreating) those elements throughout the supply chain for each successive market.
SMPTE’s IMF Working Group is chaired by Annie Chang, VP of post-production technologies for The Walt Disney Studios, who supports the inclusion of IFE versioning in the IMF ecosystem. Chang includes, as one example of using IMF, the airline-edited version(s) of motion pictures in the presentations that she makes to familiarize the content community with the kind of elements and versioning supported by IMF.
A similar initiative is that of the Digital Entertainment Content Ecosystem (DECE), an alliance of more than 85 companies including motion picture studios, created to support UltraViolet, a cloud-based digital rights locker for movies and television programs, which allows consumers to stream and download content to multiple platforms and devices under a single license.
The content delivery standard for this initiative was originally called the Common File Format and is now known simply as the Common Format. Common Format defines a container for audio-visual content specifying how audio, video and subtitle content intended for synchronous playback is stored within a compliant file.
The DECE specification also defines how one or more coexisting digital rights management (DRM) systems can be used to protect the content. DECE’s Common Encryption Scheme specifies standard encryption and key mapping methods that can be used by one or more digital rights and key management systems to enable decryption of the same file using different DRMs.
Perhaps the biggest challenge for post-production facilities is the investment in encoding hardware and software with scalability in a rapidly changing technological environment. This can be particularly challenging for facilities that focus on limited markets, and for entities that provide post-production services as an adjunct to their principal business.
In recent years, several cloud services have emerged that serve multiple markets on a global basis, offering hosted, on-demand services as an alternative to investment in encoding hardware and software, and training of staff to attain encoding/transcoding expertise.
While a great deal of the focus of the APEX Technology Committee and its various working groups is aimed at content delivery, APEX has created a Payment Technologies Working Group, which was originally chaired by Michael Planey of HM Planey Consultants, and is now chaired by Rich Salter, CTO of Lumexis, and co-chaired by Rusenko. The migration of the credit card industry away from magnetic stripe toward chip-and-PIN technology, along with the emergence of NFC and other payment technologies, challenges airlines with large investments in payment platforms, which may leave them assuming risks previously borne by banks and credit card companies.
Several suppliers of automated quality control solutions are developing systems that may offer opportunities for the migration of IFE content quality control from manual to automated or semi-automated processes.
IN SUMMARY: Working with other formats
SMPTE-TT: Likely no conversion necessary to IMSC1
CEA 608/SCC: SMPTE RP 2052-10
CEA 708: SMPTE RP 2052-11
EBU STL: EBU Tech 3360
EBU-TT-D: No conversion necessary to IMSC1
WebVTT: Deliverable within the W3C
About the author
Michael Childers is a long-time industry content management consultant. He is in his third year as a member of the APEX Board of Directors and chair of its Technology Committee. He has chaired its Digital Content Management Working Group since 2000, and chairs the Closed Caption Working Group. He was appointed by the US Secretary of Transportation to serve on the Department of Transportation’s ACCESS Advisory Committee and co-chairs its IFE Working Group.
He is a member of the Society of Motion Picture and Television Engineers, has collaborated with its IMF Working Group for 10 years, and has worked with DECE for two years. Bringing IFE into the digital content ecosystem is a principal objective of his tenure with the Technology Committee.