USB-C seemed a bit weird with its pin diagrams too, so I've been doing some research on this. USB 3.0 actually introduced additional data pins (a dedicated transmit pair and receive pair) and maintained backwards compatibility by designing the new connector with two rows of pins: one that lines up with USB 2.0 plugs, and another that carries the extra pins for 3.0 devices. In a sense, USB 3.0 ports are actually two USB ports fused together, one for each protocol. And most USB 3.0 hubs will actually connect any USB 2.0 devices to the USB 2.0 bus, separately from USB 3.0 devices. You essentially get two buses for every port.
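To make that concrete, here's a rough sketch of the Standard-A USB 3.0 receptacle as I understand it (pin names from memory, so treat this as illustrative rather than authoritative):

    # Standard-A USB 3.0 receptacle, sketched from memory (illustrative only).
    # The first four contacts are the original USB 2.0 row; the extra contacts
    # sit deeper in the connector and only mate with USB 3.0 plugs.
    USB2_ROW = ["VBUS", "D-", "D+", "GND"]
    USB3_ROW = ["SSRX-", "SSRX+", "GND_DRAIN", "SSTX-", "SSTX+"]

    def link_speed(rows_touched):
        """A USB 2.0 plug only reaches the first row; a 3.0 plug reaches both."""
        return "SuperSpeed (5 Gbps)" if rows_touched == 2 else "High Speed (480 Mbps)"

    print(link_speed(1))  # USB 2.0 plug -> High Speed (480 Mbps)
    print(link_speed(2))  # USB 3.0 plug -> SuperSpeed (5 Gbps)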
USB-C continues this trend, and adds a second SuperSpeed lane (another transmit/receive pair set), which is what USB 3.2 Gen 2x2 devices use (if I'm getting that name right). But USB-C oddly still has the same USB 2.0 pins in the port, likely to support passive USB-C to USB 2.0 adapters. Furthermore, because USB-C is flippable, extra pins are required in the port to cope with the plug going in either way (the CC pins handle detecting the orientation). USB 2.0, as a protocol, doesn't expect its pins to ever be reversed, so there has to be a redundant set of data pins for USB-C to support USB 2.0, allowing the data to reach the right contacts regardless of whether the cable is flipped. In other words, where USB 2.0 required only two data pins to support its protocol, USB-C requires four.
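If it helps, here's a rough model of the 24-pin USB-C receptacle and what flipping does, again from memory and simplified (the real spec has more nuance about which wires a given cable actually carries):

    # USB-C receptacle pinout, sketched from memory (illustrative, not authoritative).
    ROW_A = ["GND", "SSTX1+", "SSTX1-", "VBUS", "CC1", "D+", "D-",
             "SBU1", "VBUS", "SSRX2-", "SSRX2+", "GND"]   # A1..A12
    ROW_B = ["GND", "SSTX2+", "SSTX2-", "VBUS", "CC2", "D+", "D-",
             "SBU2", "VBUS", "SSRX1-", "SSRX1+", "GND"]   # B1..B12

    # Only one CC wire and one D+/D- pair run through the cable, so the receptacle
    # duplicates them on both rows; whichever CC pin sees the device's pull-down
    # tells the host how the plug is oriented (and which D+/D- pair is live).
    def orientation(cc1_pulled_down: bool, cc2_pulled_down: bool) -> str:
        if cc1_pulled_down:
            return "normal: CC1 active, D+/D- on A6/A7"
        if cc2_pulled_down:
            return "flipped: CC2 active, D+/D- on B6/B7"
        return "nothing attached"

    print(orientation(True, False))   # normal: CC1 active, D+/D- on A6/A7
    print(orientation(False, True))   # flipped: CC2 active, D+/D- on B6/B7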
Just by supporting USB 2.0 through USB-C, four pins have basically been taken off the table for any other use (and negotiating power delivery and the data protocol doesn't even need them, since that happens over the dedicated CC pins). They pretty much designed USB-C to use USB 2.0 as the lowest common denominator, which is a little weird in a world where USB 2.0 is this old.
Granted, many USB 2.0 devices still exist. USB 2.0 is fine for keyboards, mice, and other such accessories that might be (very slightly) more expensive to manufacture as USB 3.0 devices (sorry, USB 3.2 Gen 1x1), but that's probably a fair tradeoff. USB 3.0 devices aren't exactly expensive to manufacture, and the concern really only lies in supporting existing devices. If USB-C had eliminated the USB 2.0 pins from the connector, USB-C -> USB-A adapters would have needed some active circuitry to translate the signals onto another bus (or to re-use other pins for USB 2.0 data transfer), which might have required hardware and driver changes. Would this have been a bit painful for a year or two? Probably, but it would have given USB-C a lot more headroom (more pins) to design much faster protocols without pushing clock rates so high that fiber optics and active circuitry are required for TB3/4 cables longer than about 1.6 feet (0.5 m).
There is a lot that USB-C has done right: being able to share chargers across all your devices is amazing, along with much faster data transfer speeds and the ability to plug displays into the same ports you charge with. But it still feels like it wasn't the most future-proof way of doing things in light of the developments over the last 10 years. In 2012, USB 2.0 was more relevant. But now, when we're pushing data transfer speeds to 40 Gbps, it seems to be becoming a limiting factor to use it as the lowest common denominator. I imagine it's made Thunderbolt much more challenging to engineer for high speeds than alternatives (such as HDMI), for example, but alas, I'm not an engineer and can't say whether that's truly an accurate assessment.
Edit: It turns out that the USB-C port can in fact repurpose pins for data protocols other than USB through Alternate Mode (albeit defined outside the core USB data standard itself). The two SBU sideband pins next to the USB 2.0 pair are used this way for DisplayPort and certain other standards, but not for the USB 3.x or 4.x standards, and the D+/D- pair itself stays reserved for USB 2.0. The bad news is that many hosts don't support Alternate Mode, but the good news is that this will likely improve in the future.
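For what it's worth, here's roughly how I understand the pin budget in DisplayPort Alternate Mode (again a sketch from memory, not the spec, so the details may be imprecise):

    # Rough picture of DisplayPort Alternate Mode pin usage (from memory).
    # The SuperSpeed pairs get repurposed as DP lanes, the SBU pair becomes the
    # DP AUX channel, and the USB 2.0 D+/D- pair stays plain USB 2.0.
    DP_ALT_MODE = {
        "SSTX1/SSRX1, SSTX2/SSRX2": "up to 4 DisplayPort lanes "
                                    "(or 2 lanes + USB 3.x, depending on pin assignment)",
        "SBU1/SBU2": "DisplayPort AUX channel",
        "D+/D-":     "still plain USB 2.0",
        "CC":        "USB PD messages that negotiate the mode in the first place",
    }

    for pins, role in DP_ALT_MODE.items():
        print(f"{pins:28} -> {role}")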