Academy Software Foundation Technical Advisory Council (TAC) Meeting - September 3, 2025

Join the meeting at https://zoom-lfx.platform.linuxfoundation.org/meeting/97880950229?password=81d2940e-c055-43b9-9b5a-6cd7d7090feb

Voting Representative Attendees

Premier Member Representatives

  • Andy Jones - Netflix, Inc.
  • Chris Hall - Advanced Micro Devices (AMD)
  • Christopher Moore - Skydance Animation, LLC
  • Eric Enderton - NVIDIA Corporation
  • Erik Niemeyer - Intel Corporation
  • Gordon Bradley - Autodesk
  • Greg Denton - Microsoft Corporation
  • Jean-Michel Dignard - Epic Games, Inc
  • Jonathan Gerber - LAIKA, LLC
  • Kimball Thurston - Wētā FX Limited
  • Larry Gritz - Sony Pictures Imageworks
  • Matthew Low - DreamWorks Animation
  • Michael Min - Adobe Inc.
  • Michael B. Johnson - Apple Inc.
  • Rebecca Bever - Walt Disney Animation Studios
  • Ross Dickson - Amazon Web Services, Inc.
  • Scott Dyer - Academy of Motion Picture Arts and Sciences
  • Youngkwon Lim - Samsung Electronics Co. Ltd.

Project Representatives

  • Carol Payne - Diversity & Inclusion Working Group Representative, OpenColorIO Representative
  • Cary Phillips - OpenEXR Representative
  • Chris Kulla - Open Shading Language Representative
  • Diego Tavares Da Silva - OpenCue Representative
  • Jonathan Stone - MaterialX Representative
  • Ken Museth - OpenVDB Representative
  • Nick Porcino - Universal Scene Description Working Group Representative
  • Rachel Rose - Diversity & Inclusion Working Group Representative

Industry Representatives

  • Jean-Francois Panisset - Visual Effects Society

Non-Voting Attendees

Non-Voting Project and Working Group Representatives

  • Alexander Schwank - Universal Scene Description Working Group Representative
  • Anton Dukhovnikov - rawtoaces Representative
  • Daniel Greenstein - OpenImageIO Representative
  • Daryll Strauss - Zero Trust Working Group Representative
  • David Feltell - OpenAssetIO Representative
  • Eric Reinecke - OpenTimelineIO Representative
  • Erik Strauss - Open Review Initiative Representative
  • Gary Oberbrunner - OpenFX Representative
  • Jean-Christophe Morin - Rez Representative
  • John Mccarten - Rongotai Model Train Club (RMTC) Representative
  • Josh Bainbridge - OpenQMC Representative
  • Stephen Mackenzie - Rez Representative
  • Tommy Burnette - Dailies Notes Assistant Representative

LF Staff

  • David Morin - Academy Software Foundation
  • Emily Olin - Academy Software Foundation
  • John Mertic - The Linux Foundation
  • Yarille Ortiz - The Linux Foundation

Other Attendees

  • Jim Geduldick - OpenTrackIO
  • Jim Helman - MovieLabs / OpenTrackIO
  • James Uren - Mo-Sys / OpenTrackIO
  • Lee Kerley - Apple
  • Doug Walker - Autodesk / OCIO
  • Lorna Dumba - Framestore
  • Olga Avramenko - SPI / D&I WG
  • Ben Schofield - CDSA

Antitrust Policy Notice

Linux Foundation meetings involve participation by industry competitors, and it is the intention of the Linux Foundation to conduct all of its activities in accordance with applicable antitrust and competition laws. It is therefore extremely important that attendees adhere to meeting agendas, and be aware of, and not participate in, any activities that are prohibited under applicable US state, federal or foreign antitrust and competition laws.

Examples of types of actions that are prohibited at Linux Foundation meetings and in connection with Linux Foundation activities are described in the Linux Foundation Antitrust Policy available at linuxfoundation.org/antitrust-policy. If you have questions about these matters, please contact your company counsel, or if you are a member of the Linux Foundation, feel free to contact Andrew Updegrove of the firm of Gesmer Updegrove LLP, which provides legal counsel to the Linux Foundation.

Agenda

  • General Updates
    • Dev Days - September 25th #1134
    • Engineering Contribution Assessment #685
      • Larry: please update the spreadsheet; I noticed that no one had filled it in this year. Project leads and company reps, please update your info. Same advice for the Industry Build Matrix.
      • Kimball: I updated ours last week, did we update the right one? Larry: yes, you were the only one!
    • Quarterly project/WG leads email #1148
  • Annual Review: rawtoaces #475
  • Annual Review: Open Review Initiative #436
  • OpenTrackIO #603

Notes

  • General Updates
    • Dev Days - September 25th #1134
      • John: coming up this month!
    • Engineering Contribution Assessment #685
    • Quarterly project/WG leads email #1148
      • John: a check-in to know when your review is, to update your landscape entry, and for awareness around Dev Days. The first one will go out in the next couple of days; project leads, please look out for it. It isn’t spam!
  • Annual Review: rawtoaces #475
    • Presentation Slides
    • Anton Dukhovnikov
    • Mission
      • Seeks to provide a reliable and extensible software framework for the conversion of digital camera RAW files to an HDR scene-referred format
    • Major Components
      • rawtoaces_core library (solvers)
      • rawtoaces_util library (read->transform->write)
      • rawtoaces CLI tool
      • Camera spectral measurements database
    • 2024-2025 highlights
      • TSC chair role transferred from Alex Forsythe (AMPAS) to Anton Dukhovnikov (Wētā FX)
      • License changed from the AMPAS license to Apache-2.0
      • Separate repo for the database (rawtoaces-data)
      • rawtoaces 1.1.0 release - first release since 1.0.0 in 2017
        • Contains what had accumulated in the repo over 8 years: bug fixes, no new features
      • rawtoaces 2.0.0 release - currently in beta
    • Commits over time
      • 39 commits by 4 people
      • 2 regular / active contributors
    • Roadmap: 2.0 release this fall
      • Removes dependencies on libraw, AcesContainer, and Boost
        • Makes building and maintaining easier
      • public interface cleaned up
      • heavily based on OpenImageIO
        • reading raw images
        • writing exr images
        • all image processing via OpenImageIO::ImageBufAlgo
        • each step mimics an OpenImageIO::ImageBufAlgo operation
        • can leverage all OIIO functionality
      • Would be great to have feedback; none received yet
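      • Illustrative aside (not part of the presentation): a minimal Python sketch of the kind of OpenImageIO-based read → transform → write flow described above, assuming OIIO ≥ 2.1 Python bindings built with raw support; the file names and matrix are placeholders, and rawtoaces’ actual API and solved matrices differ.

        ```python
        # Hypothetical OIIO-based raw -> ACES EXR flow (placeholder values only).
        import OpenImageIO as oiio

        # Read a camera raw file; OIIO dispatches to its raw reader plugin.
        src = oiio.ImageBuf("example.cr2")
        if src.has_error:
            raise RuntimeError(src.geterror())

        # Placeholder camera-RGB -> ACES matrix, given as 4x4 for
        # ImageBufAlgo.colormatrixtransform; rawtoaces solves the real matrix
        # from camera spectral sensitivities.
        M = (0.7, 0.2, 0.1, 0.0,
             0.1, 0.8, 0.1, 0.0,
             0.0, 0.1, 0.9, 0.0,
             0.0, 0.0, 0.0, 1.0)
        dst = oiio.ImageBufAlgo.colormatrixtransform(src, M)

        # Write a scene-referred, half-float EXR.
        dst.write("example_aces.exr", "half")
        ```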
    • Roadmap: short-term goals
      • Transform cache (wip)
      • Lens corrections (wip)
        • vignetting
        • distortion
        • chromatic aberration
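        • Illustrative aside (not part of the presentation): vignetting correction is often modelled as a radial gain; below is a minimal generic sketch in Python/NumPy, assuming a simple cos⁴ falloff model and a known focal length in pixels. rawtoaces’ actual lens-correction model may differ.

          ```python
          # Generic cos^4 vignetting-correction sketch (not rawtoaces' model).
          import numpy as np

          def correct_vignetting(img, focal_px):
              """img: HxWxC float array; focal_px: focal length in pixels."""
              h, w = img.shape[:2]
              y, x = np.mgrid[0:h, 0:w]
              r2 = (x - w / 2.0) ** 2 + (y - h / 2.0) ** 2  # squared radius from centre
              gain = (1.0 + r2 / focal_px ** 2) ** 2        # inverse of cos^4 falloff
              return img * gain[..., None]
          ```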
      • rawtoaces-specific CI images (wip)
      • user and developer documentation
        • Only a (fairly good) README file for now
      • better test suite
        • tests mostly cover the color transform solvers
        • need a lot of images to test end-to-end functionality
        • at least one image for each camera we support, plus a reference ACES container
        • where do we host them? Experimenting with generating synthetic images on the fly (we have spectral sensitivities for the cameras we support), so we won’t have to store “real” raw files; synthetic images can be created with OpenImageIO
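        • Illustrative aside (not part of the presentation): one possible shape for such a synthetic end-to-end test using OpenImageIO; all names, paths, and tolerances are placeholders, and the project’s real tests may be structured differently.

          ```python
          # Hypothetical end-to-end test helpers using OpenImageIO.
          import OpenImageIO as oiio

          def make_synthetic_input(path):
              # A simple gradient stands in for a synthesized "raw-like" exposure
              # derived from camera spectral sensitivities.
              buf = oiio.ImageBuf(oiio.ImageSpec(64, 64, 3, "float"))
              oiio.ImageBufAlgo.fill(buf, (0.05, 0.05, 0.05), (0.9, 0.9, 0.9))
              buf.write(path)

          def matches_reference(result_path, reference_path):
              # Compare the pipeline's output against a stored ACES reference.
              result = oiio.ImageBuf(result_path)
              reference = oiio.ImageBuf(reference_path)
              cmp = oiio.ImageBufAlgo.compare(result, reference, 1e-4, 1e-5)
              return cmp.nfail == 0
          ```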
    • Roadmap: Long term goals
      • Python bindings
      • Exposure stacking
      • Physically accurate scaling
      • GUI application
        • CLI tool is usable but not very user friendly. Main role is to be part of a pipeline, not executed manually.
      • Expand the database and document the camera profiling process
        • No new cameras added in the last 8 years. Wētā can contribute some measurements, but it would be best to have a document on how to profile a camera so others can contribute
      • OpenSSF Best Practices badge
      • OpenFX plugin?
      • We don’t really hear from users, so it’s hard to know what they need
    • Help wanted!
      • Looking for:
        • Contributors
        • Users: currently have two users in communication with us who report issues, give feedback, and ask for features
        • TSC members
          • Currently mostly just myself
          • TSC meetings are lonely! The next TSC meeting is right after this TAC call, if someone wants to come say hi!
    • Question
      • Larry: great presentation for a first (or 10th!) time. Curious about the camera database: do you see camera manufacturers supplying the data? Anton: I don’t know; from my understanding that seems unlikely. Kimball: Right. The likely path is to document how to measure it; it’s not that expensive to put together the equipment. The manufacturers have the data, but they don’t typically publish it and have shown resistance to doing so in the past.
      • Michael B: how is that different from the white paper you have to publish for ACES? Carol: you are going back another step. The IDTs for ACES come from the camera-native space, whereas the camera sensitivities are in the full raw.
      • Carol: pip install support helps make it easier for people to use it.
      • Larry: you don’t need to know the internals of a project to build Python bindings or wheel generation; people who know those things can help without being rawtoaces experts
  • Annual Review: Open Review Initiative #436
    • Rescheduled for October 1st
  • OpenTrackIO #603
    • Presentation Slides
    • The three Jameses
      • James Uren, Mo-Sys Engineering
      • Jim Geduldick, VFX/VP supervisor
      • Jim Helman, MovieLabs
    • Presented to the TAC about a year ago; I was new to open source. I’m a dev by training, but this was my first attempt at building something community-based
    • Working on OpenTrackIO in context of SMPTE RIS (Rapid Industry Solutions) group
    • Agenda
      • Background
        • Development and current state of OpenTrackIO
      • Proposal
        • Transition of the software associated with OpenTrackIO to ASWF
      • SMPTE and RIS
    • Virtual Production
      • On-set VP is an umbrella term for real-time VFX spanning
        • Augmented Reality (AR)
        • Chroma Key (live broadcast and simul-cam for on-set VFX previz)
        • In-Camera VFX (ICVFX), eXtended Reality (XR for LED set extensions), other Mixed Reality (MR) combinations
      • VP requires live, accurate camera tracking and lens modelling metadata to align the real world with the virtual world every frame
        • In traditional VFX this is often done with match moving
    • Problem
      • Wild West VP: before OpenTrackIO, no open standard for VP metadata existed, resulting in many different communication protocols needing to be wrangled on set and in post
      • Metadata Producers: in the absence of any standard, lens, camera, and tracking system manufacturers develop and maintain their own bespoke protocols
      • Metadata Consumers: real-time render engines and post-production tools must support multiple protocols
      • Adoption: software tools and reference implementations are required to encourage wide adoption by producers and consumers
      • The RIS group had a project for camera metadata (CamDKit) to figure out what is important in the metadata; they don’t want to collect large amounts of unnecessary information. Experts must determine the useful data to capture at various points in the production pipeline.
      • RIS had a spreadsheet listing the types of data that are important. Trying to coin “meta-metadata”, aka “metadata about metadata”.
      • Separate metadata producers from metadata consumers
      • Have seen up to 16 different standards on a single set. Can end up with multiple formats, yet all these devices are mostly trying to do the same thing.
      • We need to define a standard (done, first version defined)
      • Encourage adoption by bringing in software components
    • Current Status
      • OpenTrackIO is an open standard for real-time on-set virtual production metadata
      • It was developed by the SMPTE RIS OSVP group and V1 published in Jan 2025
      • Spec at OpenTrackIO web site
      • The spec and associated web page are generated by RIS’s CamDKit
      • OpenTrackIO uses the OpenLensIO mathematical model, also developed by SMPTE RIS members
      • Camera tracking is only half the problem; you need to know lens parameters to render (near) pixel-accurate graphics
      • Lenses are all different; they have tolerances and need calibration procedures
      • People do this in different ways, but with the same purpose
      • Have produced a white paper on what VP requires of a lens
      • OpenTrackIO support is available in Unreal Engine 5.6 (released June 2025). OpenLensIO is also going into 5.7
      • Protocol being implemented by Canon, Sony, Mo-Sys, Pixotope, Stage Precision, Disguise, Chaos Arena, and maybe more!
      • Demonstrated at Unreal Fest last month
      • This is a “real thing”, required by industry
    • Solution
      • Define the Standard
        • Industry experts define what is required and how it should be implemented
      • Documentation
      • Open Software Tools
      • Reference Implementations
    • Proposal
      • Create an open-source space
        • For libraries, tools and reference implementations
      • Initial contributions (confirmed)
        • SMPTE RIS will migrate their Python API, examples, and generator tools to this space. Mo-Sys will migrate their C++ API and Pixotope their OpenLensIO reference implementation for image generation.
      • Expected Contributions
        • Epic’s OpenTrackIO Live Sender tool. Mo-Sys’s OpenTrackIO simulator.
      • Future Contributions
        • Improved OpenLensIO reference implementation and shader. OpenLensIO lens calibration library (would still need tweaking on site, but still a big benefit). OpenTimelineIO integrations. USD camera integrations. Possible crossovers with existing projects.
    • Relationship with SMPTE
      • Maintenance and extensions to the specification require the involvement of a quorum of industry experts. Updates to the standard are infrequent and clearly versioned.
      • By contrast, tools and reference implementations are more agile, with multiple contributors constantly creating and updating tools. Maintenance and code review are managed via issues and pull requests. Automated CI/CD helps maintain code quality and application stability.
      • SMPTE RIS group continues to maintain the OpenTrackIO specs.
      • SMPTE RIS migrates the tools and examples created during development of the spec to ASWF
    • SMPTE
      • Define the standard
      • Documentation
    • ASWF
      • Open Software Tools
      • Reference Implementations
    • Summary
      • OpenTrackIO is a big step forward for interoperability in VP. To support its growth and expand adoption, it needs a home for developers to access tools and resources
    • Documentation is auto-generated from the repo; it is a JSON-based protocol, modern and sensible
      • Also has a link to a math paper on lens characterization, a “practical lens model”
      • Covers aspects not covered by OpenCV, like vignetting
      • CamDKit is under the BSD-3-Clause license
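      • Illustrative aside (not part of the presentation): conceptually, OpenTrackIO carries one JSON sample per frame. The field names in this sketch are hypothetical placeholders, not the actual schema; see the OpenTrackIO web site for the real spec.

        ```python
        # Hypothetical per-frame sample; real field names come from the spec.
        import json

        frame_sample = {
            "timing": {                    # each sample is timecoded / PTP stamped
                "sampleRate": {"num": 24, "denom": 1},
                "timecode": "10:02:33:15",
            },
            "transform": {                 # camera pose from the tracking system
                "translation": {"x": 1.20, "y": 0.85, "z": 3.40},      # metres (assumed)
                "rotation": {"pan": 12.5, "tilt": -3.0, "roll": 0.1},  # degrees
            },
            "lens": {                      # lens parameters needed for rendering
                "focalLengthMm": 35.0,
                "focusDistanceM": 2.5,
            },
        }

        # One JSON blob per frame; it can be streamed (e.g. over RTP) or stored as
        # a file and referenced from a timeline such as OTIO.
        print(json.dumps(frame_sample, indent=2))
        ```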
    • Have had conversations with multiple camera manufacturers about implementation of OpenTrackIO / OpenLensIO
      • Canon is working closely with us and will show an OpenLensIO integration at IBC
      • On the engine side there’s a lot of interest in implementing this; currently it’s a drop-down box with multiple protocols to support
    • Eric R: great work, not an easy problem to solve. You hinted at interest in USD and at defining a JSON schema. What would be the carrier format, if you are recording to a camera card for instance?
      • We are trying to solve the transmission problem first, so it is a transmission protocol, but the blobs of data can be stored as files: one blob of JSON per frame, which could be referenced from OTIO. For containerizing, we want to provide guidelines for RTP and SMPTE 2110 transmission.
      • Michael Min: we discussed whether OTIO would be the carrier for this data? Eric R: I lean more towards referencing another format. James U: think of it like an audio / video stream, each JSON (frame) blob is timecoded / PTP stamped and can be referenced from a timeline (like reference clips).
      • Eric R: for sampling frequency, what rates are you seeing people want? Some metadata can be up to 600 Hz? James U: each packet is connected to a specific frame, but can be timestamped at any time; the normal mode is to match the frame rate. Currently we can’t support multiple streams at different rates in the same package. Kimball: I want 150 fps camera metadata! Michael B: Vicon tracks at 300 Hz.
      • James U: to achieve the best quality you want to sync the shutter of the sampling system with the camera shutter; eventually PTP will be used to make sure you “expose” at the same instant. But a lot of tracking systems don’t do that; less expensive ones in particular don’t have sync and just fire out data, sometimes at a high rate to mitigate the accuracy loss. OpenTrackIO supports all of that.
      • Eric R: Cooke lens metadata standard, how does it map? James U: maps with some simple scaling, also OpenCV. Cooke on RIS committee. Some Cooke data is proprietary. Haven’t yet dabbled in full anamorphic models.
      • Jim G: there’s the Cooke API and then the implementations; some manufacturers reorder data fields. Michael Min: the Alexa Mini and Alexa do it differently.
      • Michael Min: 150Hz Cooke metadata resampled to 24FPS is unreliable?
      • Carol: we all agree this is useful and needed. Shall we consider inclusion as a sandbox project? John: yes, that’s the entry point; they need to put together a project proposal. Carol: we like to give TAC members a couple of weeks to discuss and review. From your previous presentations it looks like you’ve done a ton of work; we will consider what makes sense with ASWF. We’ve had ACES coming over, with its mix of standards and specs vs. code, so we now have more expertise there.
      • James U: thank you for the opportunity to present; looking forward to the opportunity.

Next Meeting Agenda

  • General Updates
  • Annual Review: DPEL #472
  • Annual Review: OpenAssetIO #516