Filed in: Camera, Editing, Events, Featured, Postproduction, Visual FX

Createasphere Presents its Digital Process Workflow Lab

November 9, 2012 08:57

ARRI Alexa with Codex Onboard S recorder

At Createasphere’s Entertainment Technology Expo this week in Burbank, show organizers introduced a new exhibit called the “Digital Process Workflow Lab” – an end-to-end workflow demonstration in an integrated cloud environment. The press got a special preview tour on Wednesday. The lab was broken into 12 stations, each emphasizing the role played by one of the 13 partners who participated in the lab. The stations led attendees from image capture, through the various stages of editing, to the distribution, curation and monetization processes.

Chad Andrews, Dell’s strategist for telecommunications, media and entertainment, spoke for the first portion of the tour. He began by describing key objectives of the workflow, including interoperability, security and metadata management, and then explained the differences between the three main types of cloud.

Codex Vault

David Bermbach, ARRI’s technical sales and application development manager, then demonstrated the first stage of the on-set portion of the workflow: image capture. Using the ARRI Alexa Studio as his example, Bermbach described the camera’s specifications, including ARRIRAW and Codex onboard recording, and their benefits. Sarah Priestnall, Codex’s vice president for market development, illustrated how far the current onboard recording equipment has come by recalling a recent conversation with someone who had worked on Quantum of Solace (2008). “They had used one of our big studio recorders, which was, literally, the size of a small refrigerator,” Priestnall said. “It was with a DALSA 4K camera. We were laughing that it’s come to this now. It weighs two pounds, records 512 gigabytes onto one of these. So it just shows how technology’s changed in a short time.” She also invoked the tour’s magic word – “metadata” – by mentioning the Codex Vault’s ability to take on metadata that may be useful further down the pipeline.

Fifth Kind

Fifth Kind CEO David Cronan shared how his company’s asset management system can serve as a hub for the various types of information that come from numerous on-set devices. According to Cronan, Fifth Kind supports 250 image formats, 50 different video codecs and all document formats. It can make frame-specific notes for film or page-specific notes for documents, and a user can email collaborators a link that leads them to an exact frame or a particular annotation. “This creates a completely virtual file system,” said Cronan. “So not only can you navigate this structure from the interface, but we actually emulate that on the back end as well. So I can now navigate scripts or sequences or shots and all this other information.”

Danny Gold spoke on behalf of Levels Beyond to explain how their Reach Engine is used to manage metadata and propagate it through various postproduction systems and on to distribution. It can also search metadata and tag custom metadata relevant to different business processes.

Adobe’s technical liaison, Mitch Wood, described some aspects of Adobe Premiere Pro, which incorporates more than 30 third-party applications directly into its panels. One of Premiere Pro’s strengths, Wood explained, is its ability to work with native file formats, meaning users can begin editing immediately rather than spending time transcoding files first.

FilmLight Blackboard 2 Console

Nearing the end of the exhibit, FilmLight’s U.S. product manager, Peter Postma, demonstrated the company’s Baselight color grading system, which was used to bring together the highest-quality postproduction elements and apply the final polish. Postma explained that an on-set version is also available, which passes on-set color grading metadata directly to the final color-grading suite at the post house. This means the same look can be carried from the very beginning, through editorial, into the finishing stage, where it can be polished.

Gold returned to explain how Reach Engine can, at this stage, be used for distribution. Reach Engine allows the user to take an edited clip and create sub-clips, or take the entire video, and send it back into the cloud or out to various channels. Reach Engine can output videos for Hulu, iTunes, Amazon and Netflix, as well as some traditional cable, satellite and broadcast channels. Gold used YouTube as his working example. After the encoding of the YouTube format is automated, links to the video can be emailed out to potential viewers; these links can be viewed online, accessed by mobile devices and posted directly to social networks. Once the video has accumulated some views, Reach Engine can aggregate the analytics of viewership, comments and so on – all without having to interface directly with YouTube or whichever channel or channels a user chooses. The gathered analytics are pulled into the content library, which has been monitoring the clip’s whole life cycle and is fully searchable, so users can look up any of that information later.

The DPW Lab was one of the key highlights of Createasphere’s Entertainment Technology Expo, which took place at the Burbank Marriott, Nov. 7-8.