What's so special about the AWG IDT?

I ran some pipeline tests using Nuke and OCIO, comparing things like writing AP0 vs. AP1 and writing 16-bit vs. 32-bit, then converting back and comparing to the source to quantify the loss.
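Roughly, the test looks like this (a minimal sketch rather than the exact Nuke setup; it assumes the OCIO v2 Python bindings, a config reachable via $OCIO, and the aces_1.2 colour space names, and it approximates writing a 16-bit half EXR by casting to float16):

```python
import numpy as np
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()  # reads the config pointed to by $OCIO
SRC = "Input - ARRI - Linear - ALEXA Wide Gamut"  # swap in the RED/Rec.709 spaces to compare

def roundtrip_error(dst, dtype=np.float32, samples=2000, seed=0):
    """SRC -> dst, quantise to `dtype` (the 'write'), dst -> SRC, return max abs error."""
    rng = np.random.default_rng(seed)
    # Synthetic chart: random linear values, including negatives and values above 1.0
    src = rng.uniform(-0.1, 8.0, size=(samples, 3)).astype(np.float32)

    fwd = config.getProcessor(SRC, dst).getDefaultCPUProcessor()
    inv = config.getProcessor(dst, SRC).getDefaultCPUProcessor()

    out = np.empty_like(src)
    for i, px in enumerate(src):
        written = np.array(fwd.applyRGB(px.tolist()), dtype=dtype)   # "write the EXR"
        out[i] = inv.applyRGB(written.astype(np.float32).tolist())   # "read it back"
    return np.abs(out - src).max()

for dst in ("ACES - ACES2065-1", "ACES - ACEScg"):   # AP0 vs. AP1
    for dtype in (np.float16, np.float32):           # 16-bit vs. 32-bit
        print(dst, dtype.__name__, roundtrip_error(dst, dtype))
```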

For some reason I can't quite put my finger on, the linear AWG (ALEXA Wide Gamut) IDT behaves differently from the Linear/RED and AP0 round trips. Even going via linear/Rec.709 yields the same loss as RED and AP0, but going from Alexa to ACEScg or AP0 yields pretty weird numbers.

Can anyone tell me why this is?

Not sure, but maybe it has something to do with the specific IDT you are using. Unlike RED, Arri provides 14 different IDTs for different ISOs. Maybe your synthetic chart goes outside the bounds of what Arri intended.

I am using linear/AWG as my source colorspace, so it should just be a 3x3 matrix to convert the gamut.

Still, Arri uses different matrices depending on color temp and ND filter. Which matrix are you using? You can find them all at https://github.com/ampas/aces-dev/tree/master/transforms/ctl/idt/vendorSupplied/arri/alexa/v3

Ah, interesting, I've never seen different IDTs for white balances; I need to dig into that. I've mostly used the Flame, OCIO, and Resolve implementations, where there is just one IDT for linear/ALEXA Wide Gamut and multiple ones for the different ISO curves when dealing with log. I am using the linear/AWG IDT, which, looking at the OCIO config, is just a simple matrix from ALEXA Wide Gamut to AP0.
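If it helps, this is a quick way to double-check what OCIO actually applies for that conversion (a sketch assuming the OCIO v2 Python bindings and the aces_1.2 colour space names; adjust the names for your config):

```python
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()

# What the config declares for the colour space itself
cs = config.getColorSpace("Input - ARRI - Linear - ALEXA Wide Gamut")
print(cs.getTransform(OCIO.COLORSPACE_DIR_TO_REFERENCE))

# What OCIO will actually run for the full conversion to AP0
proc = config.getProcessor("Input - ARRI - Linear - ALEXA Wide Gamut",
                           "ACES - ACES2065-1")
print(proc.createGroupTransform())  # expect a single 3x3 MatrixTransform if it's only a gamut change
```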

It just kind of baffles me that AP0 and RED give the same result and ARRI is the odd one out.

There is only one matrix for ALEXA Wide Gamut V3. The different matrices in the repo are for raw CameraRGB to ACES. If your image is in AWG, then the Kelvin/tint-based matrix variation is already baked in.
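For what it's worth, you can sanity-check that against the single AWG v3 to AP0 matrix published in ARRI's LogC v3 IDTs (values copied here by hand from the CTL in the repo linked above, so verify them there before relying on them; colour space names again follow the aces_1.2 config):

```python
import numpy as np
import PyOpenColorIO as OCIO

# ALEXA Wide Gamut (v3) -> ACES2065-1 (AP0), as in the ARRI LogC v3 IDTs
AWG3_TO_AP0 = np.array([
    [0.680206,  0.236137,  0.083658],
    [0.085415,  1.017471, -0.102886],
    [0.002057, -0.062563,  1.060506],
])

config = OCIO.Config.CreateFromEnv()
cpu = config.getProcessor("Input - ARRI - Linear - ALEXA Wide Gamut",
                          "ACES - ACES2065-1").getDefaultCPUProcessor()

px = [0.18, 0.18, 0.18]                      # any linear AWG test value
via_ocio = np.array(cpu.applyRGB(px))
via_matrix = AWG3_TO_AP0 @ np.array(px)
print(via_ocio, via_matrix, np.abs(via_ocio - via_matrix).max())
```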