How to integrate ACES 2.0 into a game engine?

Hi.
I’m currently evaluating how to integrate ACES into our game engine.
I’ve read several documents, but I still don’t quite understand what ACES 2.0 is. I’m honestly not sure where to start.

Here’s my rough idea of the flow with current understanding. Can someone tell me if this makes sense?

  1. Read sRGB textures (albedo etc.) and calculate the lighting. (The BRDF result is of course luminance, but here I must treat it as linear sRGB, I assume?)
    // I don’t think the sRGB limitation is mandatory, but I think I need to decide on a working space for the lighting?

  2. After lighting, convert the sRGB linear result to AP0? (Is this considered the Input Transform?)

  3. Apply Look Transform. Can I use AP1 internally here? (Is this ACEScg?)

  4. Apply Output Transform. (Since I can’t know the user’s display color space, should I just let them pick which colorspace they want to use and pray?)

I’m especially confused about how to convert the linear RGB lighting result into AP0 (I’ve put my rough guess at that step below), but I’m unsure about the other parts too.
Any clarification would help a lot!
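
For reference, here is my rough guess at what step 2 would look like in code: a plain 3x3 matrix multiply on the linear lighting result. The matrix values below are copied from references I found for a Bradford-adapted sRGB-to-ACES2065-1 conversion, so please don’t trust them without checking.

```cpp
// Rough guess at step 2: convert linear sRGB (Rec.709 primaries, D65) to
// AP0 / ACES2065-1 (D60) with a 3x3 matrix on the linear lighting result.
// Matrix values copied from references I found (Bradford-adapted); they are
// NOT verified, so please check them before using.
#include <array>

using Vec3 = std::array<float, 3>;

Vec3 linearSRGBToAP0(const Vec3& rgb)
{
    // Each row sums to ~1.0 so that neutral greys stay neutral.
    static const float M[3][3] = {
        { 0.4397010f, 0.3829780f, 0.1773350f },
        { 0.0897923f, 0.8134230f, 0.0967616f },
        { 0.0175440f, 0.1115440f, 0.8707040f },
    };
    return {
        M[0][0] * rgb[0] + M[0][1] * rgb[1] + M[0][2] * rgb[2],
        M[1][0] * rgb[0] + M[1][1] * rgb[1] + M[1][2] * rgb[2],
        M[2][0] * rgb[0] + M[2][1] * rgb[1] + M[2][2] * rgb[2],
    };
}
```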

Well, I’m no expert, but the first step in implementing ACES in a given piece of software is to implement OpenColorIO; luckily, they have documentation for developers.

Funny, I was wondering the same thing yesterday: whether implementing OCIO would be a good solution for this question.

I am not a developer so I could be wrong.

Well, all the internal stuff you need to handle for ACES is what OCIO also needs. I’m not a color scientist, but the way I’d go about it would be through OCIO. I know Godot doesn’t use OCIO but has ACES support; I’d be dubious about its color management support, though. Plus, if you implement OCIO, you can easily slot in custom configs instead of having one single hardcoded solution.
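
To give you an idea, this is roughly what the basic usage looks like going by the OCIO developer docs (again, I’m not a developer, so double-check everything; the config path is just a placeholder):

```cpp
// Rough sketch from the OCIO developer docs, not tested. The point is that
// the engine loads whatever config the studio provides instead of hardcoding
// one set of ACES transforms.
#include <OpenColorIO/OpenColorIO.h>
#include <cstdio>
namespace OCIO = OCIO_NAMESPACE;

void listConfigColorSpaces(const char* configPath)
{
    // Load a studio-provided config (an ACES config or any custom one).
    OCIO::ConstConfigRcPtr config = OCIO::Config::CreateFromFile(configPath);

    // List the colorspaces it defines, e.g. to populate a texture import menu.
    for (int i = 0; i < config->getNumColorSpaces(); ++i)
        std::printf("%s\n", config->getColorSpaceNameByIndex(i));
}
```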

Thank you!

It seems like OCIO solves everything. I had been thinking OCIO was only for authoring tools like PS, Substance and the like.

Since I want to avoid conversions at runtime, I believe the correct integration approach is as follows:
Convert all external assets into our engine’s lighting calculation color space using OCIO, and then save them to disk as project-native assets.
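
In code, the import-time bake I have in mind would look something like this (the config path and colorspace names are just placeholders for whatever we end up choosing):

```cpp
// Sketch of a one-time bake at asset import: convert the texture from its
// authored colorspace into the rendering space, then save it as a native asset.
// Config path and colorspace names are placeholders.
#include <OpenColorIO/OpenColorIO.h>
#include <vector>
namespace OCIO = OCIO_NAMESPACE;

void bakeTextureToRenderingSpace(std::vector<float>& pixels, long width, long height,
                                 const char* authoredSpace)
{
    OCIO::ConstConfigRcPtr config = OCIO::Config::CreateFromFile("project_config.ocio");
    OCIO::ConstProcessorRcPtr processor = config->getProcessor(authoredSpace, "ACEScg");
    OCIO::ConstCPUProcessorRcPtr cpu = processor->getDefaultCPUProcessor();

    // Describe the RGB float buffer and convert it in place.
    OCIO::PackedImageDesc desc(pixels.data(), width, height, 3);
    cpu->apply(desc);

    // ...then serialize `pixels` into the engine's native texture format.
}
```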

Is this correct?

More or less. I must stress that I am by no means an authority on this topic, as I am only an artist with some very minor developer experience, but the way I understand it is that implementing OCIO happens at a very low level, as it handles basically everything to do with color. What you are suggesting, and correct me if I am wrong, is doing a conversion in advance to whatever colorspace the engine is already using. That is definitely possible, but really the engine should support all possible colorspaces the artist will use.

Blender has a very simplified diagram to show what I mean: the OCIO config has colorspaces for the textures, the scene’s colorspace, and the display colorspace.

Luckily, the other member in this thread, Chris Brejon, has an explainer on his website that covers this much better than I can: Chapter 1: Color Management - Chris Brejon
I apologize if I got anything wrong, but I really want more software out there supporting OCIO, so I’d do anything to help :slight_smile:


Thanks, you are helping me a lot!

but really the engine should support all possible colorspaces the artist will use.

Do you mean that I must let artists import color textures in whatever colorspace they happen to be in, and let artists decide what color space the lighting (linear space) uses in the game engine? (I believe a linear space still has 3 primary colors, which means it might be sRGB or BT.2020 or anything.)
Of course the view transform (this is the output transform, I assume?) must be supported for the actual displays artists use, and that players use, too.

And I’ll take a look at that article!

It depends on the game engine’s availability. If it’s an internal game engine for your studio, then it’s probably just necessary to allow the artists to select colorspaces for textures, such as normal or roughness maps having zero tonemapping applied to them, whilst color textures might have different tonemapping based on the format. If it’s publicly available, then the developers might want to choose different linear spaces; in Unreal Engine, for example, developers can select their working colorspace, which will always be a linear colorspace. And of course you need to expose multiple view transforms based on the target display hardware.
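
For the view transforms, my understanding from the OCIO docs is that the config already lists the available displays and views, so the engine can just expose those instead of hardcoding one output (again, just a sketch, not tested):

```cpp
// Sketch: enumerate the (display, view) pairs the config defines so the
// engine can offer them as output options for different target displays.
#include <OpenColorIO/OpenColorIO.h>
#include <cstdio>
namespace OCIO = OCIO_NAMESPACE;

void listDisplaysAndViews(OCIO::ConstConfigRcPtr config)
{
    for (int d = 0; d < config->getNumDisplays(); ++d)
    {
        const char* display = config->getDisplay(d);
        for (int v = 0; v < config->getNumViews(display); ++v)
        {
            // Each (display, view) pair is a candidate output transform the
            // player or artist could pick for their monitor.
            std::printf("%s / %s\n", display, config->getView(display, v));
        }
    }
}
```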


https://dev.epicgames.com/community/learning/courses/qEl/unreal-engine-technical-guide-to-linear-content-creation-pipeline-development/KJZk/unreal-engine-color-pipeline-opencolorio
I apologize that I cannot be of more help, but this seems more like the work of a color scientist or developer than an artist talking to a developer lol.


I would say this is correct, yes.

I would not let the artists decide the rendering space. I think you should have a supervisor or a TD take those decisions (which colorspace do we author the textures in? Which colorspace do we use to render? Which display transform should we use?)

And the artists only worry about: am I loading a “color” map or a “data” map?
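
Just to illustrate that split (made-up names, not code from any real engine), the importer only needs one artist-facing choice:

```cpp
// Illustration only: "color" maps go through the colorspace conversion,
// "data" maps (normals, roughness, masks...) are passed through untouched.
enum class TextureKind { Color, Data };

void importTexture(/* pixel buffer, etc. */ TextureKind kind)
{
    if (kind == TextureKind::Data)
        return; // no color transform at all for data maps

    // otherwise: convert from the authored colorspace to the rendering space
}
```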

Regards,
Chris
