If most of the games you play stay under 4 GiB of VRAM, it can be acceptable not to go for the 8 GiB model.
The game shown runs at 25 fps on PCIe 3.0 but 40 fps on PCIe 4.0, and 30 fps would be the bare minimum.
Do you see the issue? Intentionally running software that demands more VRAM than the card has is not a proper way to determine whether PCIe 2.0 is good enough for this particular card.
When it runs out of VRAM, the GPU has to fall back on system RAM over the PCIe link. So that particular gaming benchmark is pretty much just measuring the bandwidth difference between PCIe 3.0 and PCIe 4.0; of course it will show that result.
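To make the bandwidth point concrete, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not from any benchmark) of the theoretical one-way bandwidth of an x8 link per PCIe generation, using the published per-lane rates:

```python
# Theoretical one-way bandwidth of an x8 link per PCIe generation.
# The fps figures in the comment below are the ones quoted above,
# used purely for illustration.

LANES = 8  # the RX 5500 XT is wired x8 electrically

# (generation, GT/s per lane, encoding efficiency)
gens = [
    ("PCIe 2.0", 5.0, 8 / 10),     # 8b/10b encoding
    ("PCIe 3.0", 8.0, 128 / 130),  # 128b/130b encoding
    ("PCIe 4.0", 16.0, 128 / 130),
]

for name, gt_s, eff in gens:
    gb_s = gt_s * eff / 8 * LANES  # GT/s -> GB/s per lane, scaled by lanes
    print(f"{name} x8: ~{gb_s:.1f} GB/s")

# Output: ~4.0, ~7.9, ~15.8 GB/s. PCIe 4.0 x8 has roughly 2x the
# bandwidth of 3.0 x8, which is why an out-of-VRAM benchmark
# (25 fps vs 40 fps) mostly tracks the link speed, not the GPU.
```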
If you want more than 30 fps, the correct way to handle that situation is to lower the settings and keep the VRAM demand below 4 GB, not to put the card in a PCIe 4.0 system and let it constantly run out of VRAM.
Also, 30 fps is your own bare minimum. Using your personal standard, plus an out-of-VRAM situation, to conclude that PCIe 2.0 is not good enough for the 5500 XT doesn't sound very appropriate or objective.
If the VRAM demand is under 4 GB, and the 5500 XT can do 100 fps on a PCIe 4.0 x8 system (a GPU-limited situation) but only 50 fps on a PCIe 2.0 x8 system, then yes, PCIe 2.0 x8 isn't good enough for the card.
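In other words, the fair test is an apples-to-apples comparison under a GPU-limited load. A minimal sketch of that decision logic, with hypothetical fps numbers and an arbitrary 5% tolerance of my own choosing:

```python
# Sketch of the test logic described above. All numbers are hypothetical
# placeholders -- first measure fps with settings that keep VRAM usage
# under 4 GB (i.e., a genuinely GPU-limited scenario).

def link_is_adequate(fps_fast_link: float, fps_slow_link: float,
                     tolerance: float = 0.05) -> bool:
    """The slower link is 'good enough' if the GPU-limited fps drop
    stays within the tolerance (5% here, an arbitrary choice)."""
    return (fps_fast_link - fps_slow_link) / fps_fast_link <= tolerance

# Hypothetical GPU-limited results for illustration:
print(link_is_adequate(100.0, 50.0))  # False: that link really is too slow
print(link_is_adequate(100.0, 97.0))  # True: a ~3% drop is negligible
```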
However, you made the situation PCIe-bandwidth-limited rather than GPU-limited, and then declared that PCIe 2.0 is not good enough for the card. That's meaningless.
If this logic held, you could lower the resolution and graphics settings to minimum and then say an X5690 is not good enough for the 5500 XT, because the same card only runs the game at 100 fps with the X5690 but 200 fps with a 9900X. And my monitor is a 144 Hz monitor, so 144 fps is my bare minimum...
The problem is that you intentionally used improper settings to create a situation that is not GPU-limited, and then applied your own standard to decide whether the system is good enough for that GPU. In my example, the situation became CPU-limited, leading to the conclusion "the X5690 is not good enough for the 5500 XT". As you can see, that is not a valid conclusion, because the graphics card wasn't being used properly in the first place.
Anyway, you are entitled to your own opinion, and you don't have to agree with me. But please do not present "the 5500 XT needs PCIe 4.0 to perform" as a fact. By your own description, that is only true in the out-of-VRAM case (and in fact even that isn't true, because the card still can't perform properly once it runs out of VRAM; it's just that, when out of VRAM, it runs better on a PCIe 4.0 x8 link than on a PCIe 3.0 x8 link).
IMO, if you want to help others, it would be better to post the link so they get the full picture of what's happening, and let them decide whether the 5500 XT is a good card for their own situation.