This issue is not new.
Back in the 1960s and 1970s, when we only had mainframes and minicomputers, there
was a distinction between smart terminals (thick clients) and dumb terminals
(thin clients).
In the 1980s, we got workstations (really expensive thick clients, purchased by people who perceived them as cheap compared to the mainframes and minis) and
microcomputers (far less expensive thick clients, purchased by people who
previously didn't have a computer at all).
In the early 1990s, the high cost of workstations gave rise to X terminals,
thin client devices which couldn't do much more than display the graphical user
interface. My manager bought one of those fancy new 19.2k modems and actually
tried doing Motif widget development from home.
In the mid 1990s, web browsers appeared. For a very brief time, this technology
was regarded only as a way to collaborate on hypertext documents. This phase of the
web lasted for most of an afternoon. Meanwhile,
back in Champaign, Illinois, the Unsung Hero and His Eminence were busy building
a web browser which had more "stuff" in it. What kind of stuff? The sort of stuff
that made web browsers into a platform for delivery of apps. And the technologies
of the web have been moving primarily in that direction ever since.
Java applets (developed a fatal disease called Swing)
ActiveX (declared dead seven years after it went missing)
Flash (murdered by Steve Jobs)
Silverlight (murdered by HTML5)
In the late 1990s, people (Oracle, I think?) tried to sell something called
a Network Computer. It was a little PC with a video card, some RAM, an
Ethernet card, a web browser, and no hard disk. Thin.