The article notes the parallel between today's mantra of "any time, any place, anywhere", referring to access on desktops and mobile devices, and the client-server trend of the late 80s / early 90s, which shifted applications from mainframes and green screens to elegant Windows GUIs.
In both cases, the server (or the cloud, if you will) holds the data, and the end device handles the rendering and the user experience.
The article mentions two problems of the client-server era.
Problem 1: Over the years, of course, the realisation dawned that client-server brought with it as many problems as it solved. As client machines multiplied, developers ended up having to develop and test for a whole range of workstation specs and environments, and whenever something changed operations staff had to worry about getting new versions of software out to every desktop.
Problem 2: As support became more complicated and users discovered that an intelligent client with local storage meant they could create their own little offline empire, the overhead, costs and risks began to escalate.

Neither of these problems has gone away. In fact, Problem 1 is significantly worse today: there are more operating systems (desktop and mobile), much shorter release cycles (with the expectation of backward compatibility), and Bring Your Own Device. Oh, and then throw in SaaS and complex interactivity between multiple data stores.
Problem 2 still exists, but I think it has generally been cracked, as replication / synchronisation is so commonplace now. (BTW, I'm delighted to be corrected by those in the know - please comment below.)
The rise of the Web

And do you remember the craze that lampooned client-server? Why, internet + web + HTML, of course - and in its early implementation, the client was dumb (or 'thin' in the parlance).
'So much easier / faster to develop and deploy' was the cry from the development trenches.
History is repeating itself.