
SMPTE 2014 to Explore Latest Developments in Ultra-HD, Image Processing, Theatrical Display and Audio Technology


Jim DeFilippis, SMPTE 2014 Conference Co-Chair and SMPTE Fellow.
The Society of Motion Picture and Television Engineers (SMPTE) announced that the SMPTE 2014 Annual Technical Conference and Exhibition (SMPTE 2014), to be held Oct. 20-23 in Hollywood, will focus on hot topics in the industry, including the latest developments in Ultra-HD standards and systems, as well as image processing, theatrical display and audio technology.

“The rapid development of UHDTV, image processing, cinema projector and immersive audio technologies over the past year will make for an exciting series of sessions at this year’s conference,” said Jim DeFilippis, SMPTE 2014 Conference co-chair and SMPTE Fellow. “Continued innovation in these areas promises not only to enrich storytelling itself, but also to alter and enhance the experience of moving images, both in the cinema and in the home. Experts from preeminent media technology companies around the world will present their latest work and findings in these dynamic areas of our industries.”

The session track “UHDTV: Building the Plane in Flight” will begin with a presentation by NHK’s Kenichiro Ichikawa, Seiji Mitsuhashi, Mayumi Abe, Akira Hanada and Kohji Mitani, along with Mitsutoshi Kanetsuka of Sony, on a system capable of producing simultaneous 8K, 4K and 2K video in real time from a single 4K camera. It will continue with a presentation by Belden’s Stephen Lampen on the challenges of transporting 4K (12 Gb/s) video over single-link coax and how they may be overcome. The track will wrap up with a presentation by Archimedia’s Josef Marc on the implications of viewing 4K and Ultra-HD content in a largely 2K world, and of viewing 2K content on Ultra-HD screens, as the infrastructure for 4K evolves.

In the session entitled “Dammit, Gamut, I Love You!” NHK’s Kenichiro Masaoka, Takayuki Yamashita, Yukiko Iwasaki, Yukihiro Nishida and Masayuki Sugawara will examine color management for wide-color-gamut Ultra-HD production. François Helt and Valerie La Torre of Highlands Technologies Solutions will look at a quality assessment framework for color conversions and perception. Lars Borg of Adobe will discuss improved methods for color matching between HD and Ultra-HD content, and Gary Demos of Image Essence will examine approaches to defining a high dynamic range (HDR) intermediate that can be used to help maintain the creative’s mastered intent. Presenters will discuss how these tools are being used to support development of wide-gamut displays, enable high-quality gamut mapping, and facilitate gamut conversion in which the perception of artistic intent is preserved from the initial working display to the viewer.

A session track entitled “Higher Frame Rates” asks, “Is faster better?” It will further examine the challenges, benefits and options for working at frame rates beyond 60 Hz, covering both video and high-frame-rate (HFR) cinema formats, through a series of presentations. David Richards of Moving Image Technologies will begin by discussing 120-frames-per-second (fps) capture as a universal open production standard. Paola Hobson of InSync Technology will continue with “High Frame Rate Video Conversion,” followed by a presentation by Keith Slavin and Chad Fogg of ISOVIDEO on quality advancements and automation challenges in file-based conversion, with a focus on noise reduction, deinterlacing, HFR and compression efficiency.

A session dedicated to display technologies will start with a presentation by consultant George Joblove, who will compare and contrast today’s numerous display performance measurements and explain what these photometric dimensions and units represent. A subsequent presentation by Peter Putman of Kramer Electronics on next-generation display interfaces will cover the latest versions of the high-definition multimedia interface (HDMI) and DisplayPort, along with the many variations on each standard. The session will wrap up with a presentation by 3M’s Jimmy Thielen, James Hillis, John Van Derlofske, Dave Lamb and Art Lathrop on quantum dots, a new backlighting technology for achieving the wider color gamuts required for Ultra-HD, particularly as defined by International Telecommunication Union (ITU) Recommendation BT.2020.

The SMPTE 2014 session called “Advancements in Theatrical Display” will feature a presentation by Dolby Laboratories’ Suzanne Farrell, Scott Daly and Timo Kunkel on study results summarizing viewer preferences for cinema screen luminance dynamic range, followed by a presentation by Rick Posch of CR Media Technologies and Peter Ludé of RealD on the development of an accurate and repeatable measurement method for speckle in laser-illuminated projectors. The session will conclude with a presentation by Jim Houston of Starwatcher Digital and Bill Beck of Barco, who will discuss design considerations for cinema exhibition using laser illumination.

A three-part session entitled “Developments in Audio Technology” will begin with presentations dedicated to tools for immersive audio. A presentation by Dolby Laboratories’ Charles Robinson and Nicolas Tsingos on cinematic sound scene description and rendering control will be followed by an examination of immersive audio systems and the management of consequent sounds, presented by Technicolor’s William Redmann. Part one of the session will wrap up with a presentation that brings together Robert Bleidt of Fraunhofer USA, Arne Borsum and Harald Fuchs of Fraunhofer IIS, and Merrill Weiss to discuss the opportunities that object-based audio provides for improving the listening experience and increasing listener involvement.

The second part of the audio technology developments session will look at the elements required to offer new audio services. A presentation by Jeffrey Riedmiller, Sripal Mehta, Prinyar Boon and Nicolas Tsingos of Dolby Laboratories will first examine a practical system for enabling interchange, distribution and delivery of next-generation audio experiences in the cinema. A subsequent presentation by Thomas Lund of TC Electronic A/S will shift the focus to broadcast, exploring the technical aspects of loudness normalization versus speech normalization. The session will close with a presentation by Jon Paul of Scientific Conversion, who will provide an overview of test data and recommendations for improved standards and reference designs for digital audio transmission.

The three-part session on audio technology developments will conclude with three presentations. The first by Dolby Laboratories’ Michael Babbitt will examine leading-edge work on audio data management and analysis. Patrick Waddell of Harmonic will look at issues related to the CALM Act in his presentation “Have Things Calmed Down?” The audio session will conclude with a look back at the origins of audio and video compression by Jon Paul.

In the conference’s multiple-part session on image processing, Seiichi Goshi of Kogakuin University will begin with a presentation introducing super resolution technology that uses nonlinear signal processing to create naturally appearing thin edges that do not exist in the original image. Technicolor’s Pierre Routhier will follow, presenting a model for motion control that ensures true 4K detail at capture, and Klaus Weber of Grass Valley will subsequently present on potential solutions for 4K or UHD image acquisition, with a focus on live broadcast production.

The second part of the image processing session will feature Scott Daly, Ning Xu and James Crenshaw of Dolby Laboratories, along with Vickrant Zunjarrao of Microsoft, who will provide an overview of a psychophysical study isolating judder using fundamental signals, as well as what the results say about the appearance and magnitude of motion distortions from the viewer’s perspective. To conclude the session, Sony Electronics’ Gary Mandle will describe the systems used to acquire, develop, transmit and record the massive, detailed images captured during the five U.S. Lunar Orbiter missions from 1966 to 1967, as well as the story of how the tapes and video equipment were saved and refurbished so that these images could be archived and publicly distributed.

For more information, visit www.smpte2014.org.
