Windows 10's minimum system requirements do allow for 1GB of RAM (for the 32-bit version; 2GB for 64-bit). Windows 11 specifies 4GB.
I wouldn't run any remotely modern version of macOS with less than 4GB, and even that I feel is pushing it.
With Linux you can get by with less depending on how you configure things, but Linux is also only a kernel. My web server is a Linux machine that uses ~350MB of RAM while hosting my website. But it doesn't have a GUI or really much of anything running other than an SSH server, nginx, my own Rust program for handling file uploads, and some minor things like an occasional cron job that checks whether there's anything new on Apple's security blog and sends me an email if there is.
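That sort of periodic check costs essentially nothing in memory terms; it's a one-line crontab entry plus a small script, something along these lines (the script path and schedule here are made up for illustration, not my actual setup):

```shell
# Hypothetical crontab entry: once an hour, run a script that fetches the
# blog's feed, diffs it against the last-seen state, and emails on changes.
0 * * * * /usr/local/bin/check-security-blog.sh
```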
Install a full KDE or GNOME desktop and you probably wouldn't want to go below 4GB there either. A minimalistic i3/Sway or even XFCE based system could run with decent usability on 1GB or 2GB, assuming an otherwise decently powerful machine.
I don't think it makes sense to talk about "bloat" as a generic or blanket term. One man's bloat is another's dream feature and vice versa. And browsers, after all, kinda do need to follow along with the W3C specs. How much memory is needed for a satisfactory experience also very much depends on the website you want to load. If you're just loading static HTML, the memory requirements are tiny. But watching 8K video on YT? A single, fully expanded frame for the frame buffer is well over 100MB by itself, and there are (depending on the video) 30 or more of those every second. Of course the stream is heavily compressed to take up less space; it probably uses chroma subsampling plus inter- and intra-frame compression and whatnot, but the content is big.
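To put a rough number on that, here's the back-of-the-envelope maths, assuming 8-bit RGBA (4 bytes per pixel), which is a simplification of what a real video pipeline uses (typically YUV with chroma subsampling, sometimes 10-bit):

```python
# Back-of-the-envelope size of one uncompressed 8K frame.
# Assumes 8-bit RGBA (4 bytes/pixel); real pipelines differ (YUV, 10-bit, ...).
width, height = 7680, 4320          # 8K UHD resolution
bytes_per_pixel = 4                 # 8-bit RGBA
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 2**20:.0f} MiB per frame")        # 127 MiB
print(f"{frame_bytes * 30 / 2**30:.1f} GiB/s at 30fps")  # 3.7 GiB/s uncompressed
```

So even under these simplified assumptions, raw 8K video would be gigabytes per second; compression is doing enormous work there.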
And that's the last point: things aren't what they were 15 years ago. Websites host experiences and web apps that they couldn't have back then; images are bigger and higher quality, video files are bigger and higher quality. Unfortunately a lot of it also has to do with tracking and advertising.

Then of course there's the financial angle: developers cost money. If you're a business and your website generally runs well enough that 99% of your users don't have a problem with it, would you rather pay your engineers to optimise it further, or to develop new features that draw in more customers, or to find other ways to generate more revenue? As we get more computing power, the *need* to heavily optimise things for a "good enough" experience becomes smaller and smaller. If you were developing video games in the '90s you *had* to optimise heavily or the game wouldn't run at playable rates. If you develop websites today you can generally do so without thinking about performance at all and still get something fairly usable. So sometimes you just have to ask yourself if the investment in either time or money is worth it. Developer time is a resource just like RAM is, and while it's the employer paying the bill rather than the end user, it's often *a lot* cheaper to buy more RAM than to pay engineers.
With that said, a lot of time and effort is still spent on performance optimisations. Relative to the complexity of the tasks performed, I find speed is honestly very good in general. We're also a lot more conscious of various security pitfalls now than we were 15 years ago, and a lot of security failsafes and mitigations come with performance costs. But it also means a compromised website can't escape the JavaScript sandbox and install ransomware on your machine just because you visited it*.
There's also a tradeoff between memory and speed. Sometimes we intentionally use a bit more memory to make something run faster. For instance, at work we used to load view layout files from disk every time we needed them. I wrote a cache for them, so each layout is only loaded from disk once; if it's needed again it's fetched from memory (though I also flush the cache on low-memory warnings from the OS). This is part of why I keep telling people that using a lot of memory is not inherently bad. It doesn't mean your system would run poorly with less, or that all of it is absolutely needed, or that you can't expand your workload. A lot of the RAM may just be used for caching and can easily be flushed without huge impact; and if you have free memory available, why not put it to use?
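A minimal sketch of that kind of read-through cache (hypothetical names, not our actual code; a real implementation would also hook `flush` up to the OS's memory-pressure notifications):

```python
# Hypothetical read-through cache for view layouts: load from disk once,
# serve from memory afterwards, flush everything on a low-memory warning.
class LayoutCache:
    def __init__(self, loader):
        self._loader = loader  # e.g. a function that reads and parses a file
        self._store = {}

    def get(self, path):
        if path not in self._store:          # first request: slow disk load
            self._store[path] = self._loader(path)
        return self._store[path]             # later requests: memory lookup

    def flush(self):
        self._store.clear()                  # call on low-memory warnings

# Example: the loader runs only once per path despite repeated gets.
calls = []
cache = LayoutCache(lambda p: calls.append(p) or f"<layout {p}>")
cache.get("main.xml"); cache.get("main.xml")
print(len(calls))  # 1
```

The whole point of the tradeoff is visible in `get`: the extra memory held in `_store` buys you skipping the disk read on every access after the first, and `flush` gives that memory back the moment the OS says it's needed elsewhere.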
*There may still be security vulnerabilities from time to time, even ones as serious as that are theoretically possible, but a lot of mechanisms are in place to try and prevent that sort of thing and much more.