Lusting for the dumb terminal: lessons for the virtualization market
In the old days when mainframes roamed the world, there were things called dumb terminals that enabled users to connect to server-based applications. These dumb terminals were — as their name suggests — dumb. They had almost no intelligence other than displaying the output of an application. Fast forward to today. As we moved away from traditional mainframe computing, the personal computer became the interface of choice for most users. Client/server technologies provided a graphical front end that could communicate with backend databases and logic. Now we are faced with an ironic twist of fate. It has become apparent that in many situations, computer users actually don’t need much intelligence on the front end. They need the logic and data that sit on the server (the customer service application, the call center application, the classroom application used to teach skills to students).
Ironically, we can’t go back to the good old days of the dumb terminal. Instead we have moved to the thin client, the locked-down PC, and the virtual display interface. These approaches are part of the hot new area — virtualization. Now don’t get me wrong. I think that virtualization is quite important and will become a key way for customers to utilize existing resources far more pragmatically. It will also provide better protection for data and resources that might be compromised if too many users have free access to too much.
But it is important to keep in mind that this is not a new issue. The computer industry has a way of assuming that the old ways are always wrong and backwards. Yet unintended consequences are a fact of life — even in an industry that loves the future and is skeptical of the past. Now, with virtualization on the rise, we are reinventing what the industry took for granted in the mainframe days.