I have confirmed with an HDFury Vertex that selecting 12-bit forces YCC 4:2:0, and I can’t stress enough that 12-bit is the wrong selection for 4K signals on the X1X/S. I’ve spent many months evaluating and troubleshooting various 4K devices trying to obtain the highest-quality video output. I know selecting the lower bit depth may seem counter-intuitive, but this setting only applies to the system menus and SDR content; the output will still automatically switch to 10-bit YCC 4:2:2 or 12-bit YCC 4:2:0 when HDR content is detected. Selecting 8-bit gives you the highest-quality SDR output you can obtain within the HDMI 2.0 spec.
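To make that switching behavior concrete, here is a minimal Python sketch of the decision logic described above. The function name and structure are my own illustration of what this post observes, not the console’s actual firmware; in particular, exactly which HDR format gets picked for a given menu setting is an assumption on my part.

```python
# Minimal sketch of the output behavior described above. Names and
# structure are illustrative, not the console's actual firmware logic.

def output_format(menu_bit_depth: int, hdr_detected: bool) -> str:
    """Signal format the X1X/S sends for a given menu setting and content type."""
    if hdr_detected:
        # HDR overrides the menu setting. Which of the two formats is
        # chosen here is an assumption; the post only says the console
        # switches to one of them automatically.
        return "12-bit YCC 4:2:0" if menu_bit_depth == 12 else "10-bit YCC 4:2:2"
    # SDR (system menus, SDR games) follows the menu setting directly.
    return {
        8: "8-bit RGB (full chroma)",
        10: "10-bit YCC 4:2:2 (chroma halved)",
        12: "12-bit YCC 4:2:0 (chroma quartered)",
    }[menu_bit_depth]

# With 8-bit selected, SDR stays at full chroma and HDR still switches:
print(output_format(8, hdr_detected=False))  # 8-bit RGB (full chroma)
print(output_format(8, hdr_detected=True))   # 10-bit YCC 4:2:2
```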
#8-bit RGB Full#
2160p 60 Hz RGB 8-bit signals occupy the full 18 Gbps / 600 MHz bandwidth offered by HDMI 2.0 and compatible cables. This allows true RGB output, which is how SDR content is intended to be viewed. The proper setting for a 600 MHz-capable signal chain is 8-bit color depth. Selecting 10-bit color depth will force all output to YCC 4:2:2, and selecting 12-bit will force all output to YCC 4:2:0; either way you lose chroma resolution, and with it color accuracy, for SDR content. Nearly everyone’s posted X1X/S TV settings use 12-bit color depth; however, this is the incorrect selection if your TV supports full 4K 600 MHz signals.
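If you want to sanity-check the bandwidth claim, the arithmetic is straightforward: HDMI 2.0 tops out at a 600 MHz TMDS character rate (18 Gbps across three 10-bit channels), and 2160p60 needs a 594 MHz pixel clock. Here is a small Python sketch of that math; the constants come from the HDMI / CTA-861 specs, but the helper function and its name are my own illustration.

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check for 2160p60.
# Spec constants; the helper below is an illustrative sketch.

PIXEL_CLOCK_MHZ = 594.0   # 2160p @ 60 Hz (CTA-861 timing)
TMDS_LIMIT_MHZ = 600.0    # HDMI 2.0 maximum TMDS character rate

def tmds_clock_mhz(bits: int, chroma: str) -> float:
    """TMDS character rate needed for a given bit depth and chroma format."""
    if chroma == "4:4:4":          # RGB / YCC 4:4:4 scales with bit depth
        return PIXEL_CLOCK_MHZ * bits / 8
    if chroma == "4:2:2":          # HDMI packs up to 12-bit 4:2:2
        return PIXEL_CLOCK_MHZ     # at the 8-bit clock rate
    if chroma == "4:2:0":          # half the pixel rate, scales with depth
        return PIXEL_CLOCK_MHZ / 2 * bits / 8
    raise ValueError(chroma)

for bits, chroma in [(8, "4:4:4"), (10, "4:4:4"), (12, "4:4:4"),
                     (10, "4:2:2"), (12, "4:2:2"), (12, "4:2:0")]:
    clk = tmds_clock_mhz(bits, chroma)
    gbps = clk * 3 * 10 / 1000     # 3 TMDS channels, 10 bits per character
    fits = "fits" if clk <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"{bits:>2}-bit {chroma}: {clk:7.1f} MHz ({gbps:5.2f} Gbps) -> {fits}")
```

Running this shows why 8-bit RGB is the only full-chroma option at 2160p60: 8-bit RGB lands at 594 MHz (17.82 Gbps), just under the 600 MHz ceiling, while 10-bit and 12-bit RGB would need 742.5 MHz and 891 MHz, which HDMI 2.0 cannot carry; that is exactly why the console falls back to 4:2:2 or 4:2:0 at the higher depths.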