I think what a lot of people are missing is that sometimes a condition can exist inside the computer that can only be cleared or resolved efficiently by resetting the system's memory, which is where many critical operating system components live while they are running. Because the OS itself occupies that memory, the only way to fully reset it is to unload the OS, and the only non-destructive way to do that is a restart.
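To make that concrete, here's a toy Python sketch (every name in it is hypothetical, not any real OS component) of the kind of in-memory state that only a restart can clear: a long-running process whose internal cache grows on every request and exposes no way to empty itself.

```python
# Hypothetical long-running service: a module-level cache accumulates an
# entry per request, and nothing ever evicts them. Ending the process and
# launching a fresh one -- the process-scale analogue of a reboot -- is
# the only way back to a clean slate.
_session_cache: dict[int, str] = {}  # lives in memory for the process's lifetime


def handle_request(request_id: int) -> str:
    """Serve a request, leaving behind an entry that is never removed."""
    _session_cache[request_id] = f"state for request {request_id}"
    return _session_cache[request_id]


if __name__ == "__main__":
    for i in range(100_000):
        handle_request(i)
    # 100,000 stale entries and no API to release them; memory use only
    # drops when the process exits and a new one starts.
    print(f"cache entries: {len(_session_cache)}")
```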
Think of it this way: shutting down or restarting gives the computer the opportunity to unload things cleanly, avoiding corruption of stored data and the knock-on problems that corruption would cause. Computing devices are not appliances... yet. Would I expect a toaster or a coffeemaker to work reliably every time it's powered? Yes, because in many cases the parts that make it work are purely electromechanical. There's no spectrum of functionality; it's either "working" or "not working".
Computing devices have electronic circuit pathways enmeshed with those electromechanical pieces, which is what gives them the ability to perform the complex tasks we require of them. Unfortunately, that complexity comes at a cost: a spectrum of different levels of functionality. For a computing device, this could mean anything from "working normally" to "running slowly" to "kernel exception" to "not turning on". These intermediate states are what create the troubleshooting step of "have you tried turning it off and back on again?" The two conditions in the middle, the slowdowns and the kernel exceptions, are the ones a restart resolves.
In many cases, even restarting a wayward daemon can leave the system in a partially unstable state that a full restart would not: the daemon comes back fresh, but whatever it disturbed elsewhere in memory persists. In today's world, where our devices have *very high* reliability and *very fast* boot times, the occasional restart is no longer the enormous loss of time (enough to go get a cup of coffee, let's say) that it was in the 90s.
If a user finds they need to restart frequently (say, more than once a day), then that user should be looking at their workflow and the applications they are using on the computer. Remember that working with large files (hundreds of MB to over a GB in size) in high-demand applications on a system with 8GB or less of memory forces the software to chop each file into chunks in order to process it... which creates overhead and widens the window for something in memory to end up in a bad state. For general-purpose computing on today's devices, restarting isn't even a thing, and many users with that workload will not be able to remember the last time they restarted... a testament to how stable computing devices have become since the end of the 20th century.
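For what it's worth, here's a minimal Python sketch of that chunking pattern: hashing a file far larger than available memory by reading one bounded chunk at a time. The chunk size, filename, and checksum task are all made up for illustration; the point is the bounded memory footprint and the extra per-chunk overhead.

```python
import hashlib

CHUNK_SIZE = 64 * 1024 * 1024  # process 64 MB at a time, not the whole file


def checksum_large_file(path: str) -> str:
    """Hash a file of any size while keeping peak memory use bounded."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read one chunk, fold it into the hash, discard it. Peak memory
        # stays near CHUNK_SIZE no matter how large the file is, at the
        # cost of the per-chunk overhead described above.
        while chunk := f.read(CHUNK_SIZE):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    print(checksum_large_file("big_video_project.mov"))  # hypothetical file
```

An application that instead pulls an entire multi-GB file into memory on an 8GB machine is exactly the kind of workload that pushes a system toward the "running slowly" end of the spectrum described above.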