Why isn't multithreading everywhere?
Frame challenge: but it is everywhere.
Let's go through some common platforms:
- Desktops/laptops: one of the most common applications today is the browser, and to get good performance modern browsers take every advantage they can get, including multithreading, GPU acceleration, etc. At the very least every tab gets its own thread (in modern browsers, usually its own process). Many desktop applications are also built with web technologies running in an embedded browser (for example Slack and Discord). Games, at least the bigger ones, embraced multithreading a long time ago.
- Servers: these days most servers deal with HTTP requests; other workloads are comparatively niche. And web servers scale up nicely to all the cores you have. Sure, each request will most likely run on a single thread, but multiple threads mean you can process multiple requests in parallel - it's absolutely standard (see the sketch after this list). The other common component - the database - also scales well, and any serious engine uses multiple threads.
- Cell phones/tablets: browsers, again. But even without them there are plenty of background tasks that repeatedly wake up and do a little work. Having multiple cores means those background tasks interfere less with your foreground app, so it feels snappier. Cell phone CPUs are pretty powerful already, but the low-power requirement means they're still slower than desktops - and yet we use them perhaps even more extensively, including for computation-heavy workloads like games.
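To make the server point concrete, here is a minimal sketch in Go (the language choice is mine, not implied above). Go's standard net/http server runs each incoming request in its own goroutine, and the runtime spreads those goroutines across all available cores, so request-level parallelism comes essentially for free:

```go
// Minimal sketch of a "one handler invocation per request" web server.
package main

import (
	"fmt"
	"log"
	"net/http"
	"runtime"
)

func handler(w http.ResponseWriter, r *http.Request) {
	// Each in-flight request executes here concurrently with all the others.
	fmt.Fprintf(w, "handled on a machine with %d cores\n", runtime.NumCPU())
}

func main() {
	http.HandleFunc("/", handler)
	// ListenAndServe accepts connections and runs the handler for each
	// request in a fresh goroutine; no explicit thread management is needed.
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The same pattern shows up in virtually every mainstream server stack - thread pools in Java servlet containers, worker processes in nginx, and so on - which is exactly why servers benefit so directly from extra cores.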
Long story short, if we went back to single-core CPUs, you'd notice it immediately. Modern operating systems have many processes running in parallel, and spreading them across cores instead of constantly switching one core between tasks gives serious benefits. Even if some programs individually benefit little from multiple cores, the system as a whole almost always profits. That said, I suppose there is a limit on how many cores it makes sense to have for a given system: a cell phone with 64 cores will probably not be noticeably faster than one with 32 cores.