I still don't agree that "cutting edge" means GPU exclusively.
Graphics cards can do plenty of fancy new stuff, but today's software still relies massively on CPU code. I may sound sceptical, but (just one example) I clearly remember 10 years ago when GPU enthusiasts declared CPU render engines dead, because graphics card marketing promised 10x, 100x, 1000x faster renders and so on.
After 10 years:
-CPU rendering is nowhere near dead and remains the tool of choice for serious production
-GPU engines were never that much faster on average (some scenes benefit a lot from a GPU, others not so much, and others render faster on the CPU)
-coding for GPU engines is still more problematic because of APIs, drivers, etc. (as I said, I'm not a coder; this comes from a coder who won an Academy Award for rendering tech, so I tend to believe him; there's a small sketch below of what he means)
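To give an idea of what he means (this is my own rough illustration, not code from any real renderer): even a trivial "multiply every value by 2", which on the CPU is a single loop, needs all of the following boilerplate on the GPU with CUDA, and every one of those calls can fail depending on whether the driver and toolkit versions match:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Trivial kernel: scale each element of an array by a factor.
__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float* host = new float[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // The entire CPU version of this job is one line:
    //   for (int i = 0; i < n; ++i) host[i] *= 2.0f;

    // The GPU version: allocate device memory, copy data over,
    // launch the kernel, check for errors, copy results back.
    float* dev = nullptr;
    if (cudaMalloc(&dev, n * sizeof(float)) != cudaSuccess) return 1;
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaError_t err = cudaGetLastError();   // catch kernel-launch failures
    if (err != cudaSuccess) printf("%s\n", cudaGetErrorString(err));
    cudaDeviceSynchronize();                // wait for the GPU to finish

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    delete[] host;
    return 0;
}
```

Multiply that overhead across a full production renderer, plus the memory limits of the card, and you see why the CPU codepath is the simpler one to maintain.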
Again, this doesn't mean GPUs are not important, but they are only one part of the hardware, just like rendering is only one part of any 3D workflow.
We live in a world where many young people (not talking about you) grow up playing video games, and marketing has convinced them to always buy the biggest and fastest GPU, so I'm not surprised that after years of that messaging they think the GPU is all that matters.
Take this forum for example: some "3D expert" happens to have a gaming rig, downloads Blender (not because it's the best production tool, but because it's free), performs a couple of tasks and a few benchmarks, and ends up thinking you strictly need an Nvidia GPU. Well, the professional 3D world is quite a bit different.