I admit that I haven’t written a blog in more than three months — but I do have a good reason. I just finished writing my latest book — not a Dummies book this time. It will be my first business book, based on almost three decades in the computer industry. Once I know the publication date I will tell you a lot more about it. But as I was finishing this book I was thinking about my last book, Cloud Computing for Dummies, which was published almost two years ago. As this anniversary approaches I thought it was appropriate to take a look back at what has changed. I could probably go on for quite a while about how little information was available at that point and how few CIOs were willing to talk about, or even consider, cloud computing as a strategy. But that’s old news. I decided that it would be most interesting to focus on eight of the changes that I have seen in this fast-moving market over the past two years.
Change One: IT is now on board with cloud computing. Cloud computing has moved from a reaction to sluggish IT departments to a business strategy involving both business and technology leaders. A few years ago, business leaders were reading about Amazon and Google in business magazines. They knew little about what was behind the hype. They focused on the fact that these early cloud pioneers seemed to be efficient at making cloud capability available on demand. No paperwork and no waiting for the procurement department to process an order. Two years ago IT leaders tried to pretend that cloud computing was a passing fad that would disappear. Now I am finding that IT is treating cloud computing as a centerpiece of its future strategy — even if it is only testing the waters.
Change Two: enterprise computing vendors are all in with both private and public cloud offerings. Two years ago most traditional IT vendors paid little attention to the cloud. Today, most hardware, software, and services vendors have jumped on the bandwagon. They all have cloud computing strategies. Most of these vendors are clearly focused on a private cloud strategy. However, many are beginning to offer specialized public cloud services with a focus on security and manageability. These vendors are melding all types of cloud services — public, private, and hybrid — into interesting and sometimes compelling offerings.
Change Three: Service Orientation will make cloud computing successful. Service Orientation was hot two years ago. The huge hype behind cloud computing led many pundits to proclaim that Service Oriented Architecture was dead and gone. In fact, the cloud vendors that are succeeding are those building true business services without dependencies, services that can migrate between public, private, and hybrid clouds; that portability gives them a competitive advantage.
Change Four: System Vendors are banking on integration. Does a cloud really need hardware? Only two years ago the dialog centered on the contention that clouds meant no hardware would be necessary. What a difference a few years can make. The emphasis, coming primarily from the major systems vendors, is that hardware indeed matters. These vendors are integrating cloud infrastructure services with their hardware.
Change Five: Cloud Security takes center stage. Yes, cloud security was a huge topic two years ago, but the dialog is beginning to change. There are three conversations that I am hearing. First, cloud security is a huge issue that is holding back widespread adoption. Second, there are well-designed software and hardware offerings that can make cloud computing safe. Third, public clouds are just as secure as an internal data center because these vendors have more security experts than any traditional data center. In addition, a large number of venture-backed cloud security companies are entering the market with new and quite compelling value propositions.
Change Six: Cloud Service Level Management is a primary customer concern. Two years ago no one our team interviewed for Cloud Computing for Dummies connected service level management with cloud computing. Now that customers are seriously planning for widespread adoption of cloud computing, they are seriously examining their required level of service for cloud computing. IT managers are reading the service level agreements from public cloud vendors and Software as a Service vendors carefully. They are looking beyond the service level of a single service and beginning to think about the overall service level across their own data centers as well as the other cloud services they intend to use.
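To make that last point concrete, here is a minimal sketch, with made-up availability figures rather than numbers from any real vendor's SLA, of why the service level of a single service tells only part of the story: when a business process depends on several services in series, the composite availability is lower than any one of them.

```python
# Hypothetical availability figures for the services a single business process depends on.
# These are illustrative assumptions, not real vendor SLA numbers.
sla = {
    "internal data center": 0.999,    # assumed "three nines"
    "public IaaS provider": 0.9995,
    "SaaS CRM service": 0.995,
}

# If every service must be up for the process to work, the best-case composite
# availability is the product of the individual availabilities.
overall = 1.0
for availability in sla.values():
    overall *= availability

downtime_hours = (1 - overall) * 365 * 24
print(f"Composite availability: {overall:.4%}")
print(f"Expected downtime per year: {downtime_hours:.1f} hours")
```

Even when each individual SLA looks respectable, the combined number is what the business actually experiences, which is why customers are starting to look across all of the services they consume.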
Change Seven: IT cares most about service automation. No, automation in the data center is not new; it has been an important consideration for years. However, what is new is that IT management is looking at the cloud not just as a way to avoid the costs of purchasing hardware. They see automation of both routine functions and business processes as the primary benefit of cloud computing. In the long run, IT management intends to focus on automation and reduce hardware to interchangeable commodities.
Change Eight: Cloud computing moves to the front office. Two years ago IT and business leaders saw cloud computing as a way to improve back office efficiency. This is beginning to change. With the flexibility of cloud computing, management is now looking at the potential to quickly innovate business processes that touch partners and customers.
I spent the other week at a new conference called Cloud Connect. Being able to spend four days immersed in an industry discussion about cloud computing really allows you to step back and think about where we are with this emerging industry. While it would be possible to write endlessly about all the meetings and conversations I had, you probably wouldn’t have enough time to read all that. So, I’ll spare you and give you the top four things I learned at Cloud Connect. I recommend that you also take a look at Brenda Michelson’s blogs from the event for a lot more detail. I would also refer you to Joe McKendrick’s blog from the event.
1. Customers are still figuring out what Cloud Computing is all about. For those of us who spend way too many hours on the topic of cloud computing, it is easy to make the assumption that everyone knows what it is all about. The reality is that most customers do not understand what cloud computing is. Marcia Kaufman and I conducted a full day workshop called Introduction to Cloud. The more than 60 people who dedicated a full day to a discussion of all aspects of the cloud made it clear to us that they are still figuring out the difference between infrastructure as a service and platform as a service. They are still trying to understand the issues around security and what cloud computing will mean to their jobs.
2. There is a parallel universe out there among people who have been living and breathing cloud computing for the last few years. In their view the questions are very different. The big issues discussed among the well-connected focused on a few key questions: Is there such a thing as a private cloud? Is Software as a Service really cloud computing? Will we ever have a true segmentation of the cloud computing market?
3. From the vantage point of the market, it is becoming clear that we are about to enter one of those transitional times in this important evolution of computing. Cloud Connect reminded me a lot of the early days of the commercial Unix market. When I attended my first Unix conference in the mid-1980s it was a different experience than going to a conference like Comdex. It was small. I could go and have a conversation with every vendor exhibiting. I had great meetings with true innovators. There was a spirit of change and innovation in the halls. I had the same feeling about the Cloud Connect conference. There were a small number of exhibitors. The key innovators driving the future of the market were there to discuss and debate the future. There was electricity in the air.
4. I also anticipate a change in the direction of cloud computing now that it is about to pass that tipping point. I am a student of history, so I look for patterns. When Unix reached the stage where the giants woke up and started seeing huge opportunity, they jumped in with a vengeance. The great but small Unix technology companies were either acquired, got big, or went out of business. I think that we are on the cusp of the same situation with cloud computing. IBM, HP, Microsoft, and a vast array of others have seen the future and it is the cloud. This will mean that emerging companies with great technology will have to be both really lucky and really smart.
The bottom line is that Cloud Connect represented a seminal moment in cloud computing. There is plenty of fear among customers who are trying to figure out what it will mean for their own data centers. What will the organizational structure of the future look like? They don’t know and they are afraid. The innovative companies are looking at the coming armies of large vendors and wondering how to keep their differentiation so that they can become the next Google rather than the next company whose name we can’t remember. There was much debate about two important issues: cloud standards and private clouds. Are these issues related? Of course. Standards always become an issue when there is a power grab in a market. If Google, Microsoft, Amazon, IBM, or Oracle is able to set the terms for cloud computing, market control can shift overnight. Will standard interfaces be able to save the customer? And how about private clouds? Are they real? My observation and contention is that yes, private clouds are real. If you deploy the same automation, provisioning software, and workload management inside a company rather than inside a public cloud, it is still a cloud. Ironically, the debate over the private cloud is also about power and position in the market, not about ideology. If a company like Google or Amazon (or whichever company is your favorite flavor) is able to debunk the private cloud, guess who gets all the money? If you are a large company where IT and the data center are core to how you conduct business, you can and should have a private cloud that you control and manage.
So, after taking a step back I believe that we are witnessing the next generation of computing — the industrialization of computing. It might not be as much fun as the wild west that we are in the midst of right now but it is coming and should be here before we realize that it has happened.
Last year I wrote a post about what I called the Google Sneak attack. If you don’t feel like reading that post, I’ll make it simple for you. Google came to market as a benign, helpful little search engine that threatened no one. Fast forward a decade and Google now pulls in more ad revenue than most of the television networks combined. It has attacked Microsoft’s Office franchise and is playing a key role in the cloud via Platform as a Service (Google AppEngine), not to mention the importance of its entry into the book business and who knows what else. But let’s turn our attention to Twitter. I’ve been using Twitter since 2007. For the first several months I couldn’t quite figure out what it was all about. It was confusing and intriguing at the same time. In fact, my first blog about Twitter suggested that the emperor had no clothes.
So fast forward to the end of 2009 and several very interesting things are happening:
1. Twitter is becoming as much a part of the cultural and technical fabric as Google did just a few years ago
2. A partner ecosystem has grown up around Twitter. A post from February by Matt Ingram of Gigaom echoes this point.
3. A growing number of individuals, large corporations, and small businesses are using Twitter as everything from the neighborhood water cooler to a sales channel.
What does this mean? Despite detractors who wonder what you can possibly accomplish in 140 characters, it is becoming clear that this company without a published business plan does have a plan to dominate. It is, in fact, the same strategy that Google had. Which company would have been threatened by a small search company? And who could be threatened by a strange little company called Twitter that asked people to say it all in 140 characters? Today Twitter claims to have 18 million users, about 4% of adult Internet users. I suspect that we will begin to see a slow but well-orchestrated rollout of services that leverage the Twitter platform. I suspect that we will see a combination of advertising plus commercial software aimed at helping companies reach new customers in new channels.
I am confident that within the next two years this small, profitless, patient company will roll out a plan targeting social networking world dominance. It will be fun to watch.
Companies are trying to get a handle on the costs involved in running data centers. In fact, this is one of the primary reasons that companies are looking to cloud computing to make the headaches go away. Like everything else in the complex world of computing, clouds solve some problems but they also cause the same type of lock-in problems that our industry has experienced for a few decades.
I wanted to add a little perspective before I launch into my thoughts about portability in the cloud. So, I was thinking about traditional data centers and how their performance has long been hampered by their lack of homogeneity. The typical data center is filled with a warehouse of different hardware platforms, operating systems, applications, and networks – to name but a few. You might want to think of them as archeological digs, tracing the history of the computer industry. To protect their turf, vendors each came up with their own platforms, proprietary operating systems, and specialized applications that would only work on a single platform.
In addition to the complexities involved in managing this type of environment, the applications that run in these data centers are also trapped. In fact, one of the main reasons that large IT organizations ended up with so many different hardware platforms running a myriad of different operating systems was because applications were tightly intertwined with the operating system and the underlying hardware.
As we begin to move towards the industrialization of software, there has been an effort to separate the components of computing so that application code is separate from the underlying operating system and the hardware. This has been the allure of both service oriented architectures and virtualization. Service orientation has enabled companies to create clean web services interfaces and to create business services that can be reused in a lot of different situations. SOA has taught us the business benefits that can be gained from encapsulating existing code so that it is isolated from other application code, operating systems, and hardware.
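As a rough illustration of what that encapsulation buys you (a minimal sketch of my own, not any vendor's product, with hypothetical class and method names), here is a business service defined purely by its interface; the implementation behind it can be legacy code today and a cloud service tomorrow, and the callers never change.

```python
from abc import ABC, abstractmethod

class CustomerLookupService(ABC):
    """A hypothetical business service contract: callers depend only on this interface."""
    @abstractmethod
    def find_customer(self, customer_id: str) -> dict:
        ...

class LegacyMainframeLookup(CustomerLookupService):
    """Wraps existing code; in reality this would call the legacy system."""
    def find_customer(self, customer_id: str) -> dict:
        return {"id": customer_id, "source": "legacy system"}

class PublicCloudLookup(CustomerLookupService):
    """The same contract backed by a cloud-hosted service."""
    def find_customer(self, customer_id: str) -> dict:
        return {"id": customer_id, "source": "public cloud"}

def print_customer(service: CustomerLookupService, customer_id: str) -> None:
    # The caller neither knows nor cares which platform does the work.
    print(service.find_customer(customer_id))

print_customer(LegacyMainframeLookup(), "42")
print_customer(PublicCloudLookup(), "42")
```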
Server virtualization takes the existing “clean” interface between the hardware and the software and separates the two. One factor fueling rapid adoption and market growth is that no rewriting of software is required; the separation happens at the x86 instruction boundary between the hardware and the software. As server virtualization moves into the data center, companies can consolidate the massive number of machines that are dramatically underutilized onto a much smaller number of machines that are used far more efficiently. The resultant cost savings from server virtualization include reductions in physical boxes, maintenance, overhead, heating, cooling, power, and so on.
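Here is a back-of-the-envelope sketch of that consolidation math. Every number in it is an assumption I made up for illustration, but it shows why a room full of underutilized machines is such an attractive target.

```python
import math

# Assumed figures, purely for illustration.
physical_servers = 200            # existing, mostly idle machines
avg_utilization = 0.10            # assumed 10% average utilization today
target_utilization = 0.60         # assumed comfortable loading for a virtualized host
cost_per_server_per_year = 4000   # assumed power, cooling, space, and maintenance

# Total work expressed in "fully busy server" equivalents.
work = physical_servers * avg_utilization
hosts_needed = math.ceil(work / target_utilization)

annual_savings = (physical_servers - hosts_needed) * cost_per_server_per_year
print(f"Hosts needed after consolidation: {hosts_needed}")
print(f"Estimated annual savings: ${annual_savings:,}")
```

Obviously, real consolidation planning has to account for peak loads, memory, and I/O, not just average CPU utilization, but the basic economics are that stark.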
Server virtualization has enabled users to create virtual images to recapture some efficiency in the data center. And although it fixes the problem of operating systems bonded to hardware platforms, it does nothing to address the intertwining of applications and operating systems.
Why bring this issue up now? Don’t we have hypervisors that take care of all of our problems of separating operating systems from applications? Can’t companies simply spin up another virtual image and call it a day? I think the answer is no – especially with the projected growth of the cloud environment.
I got thinking about this issue after having a fascinating conversation with Greg O’Connor, CEO of AppZero. AppZero’s value proposition is quite interesting. In essence, AppZero provides an environment that separates the application from the underlying operating system, effectively moving up to the next level of the stack.
The company’s focus is particularly on the Windows operating system, and for good reason. Unlike Linux or z/OS, the Windows operating system does not allow applications to operate in a partition. Partitions effectively isolate applications from one another so that if a bad thing happens to one application it cannot affect another application. Because it is not possible to separate or isolate applications in a Windows-based server environment, when something goes bad with one application it can hurt the rest of the system and other applications in Windows.
In addition, when an application is loaded into Windows, DLLs (Dynamic Link Libraries) are often loaded into the operating system. DLLs are shared across applications, and installing a new application can overwrite the current DLL of another application. As you can imagine, this conflict can have really bad side effects.
Even when applications are installed on different servers – physical or virtual – installing software in Windows is a complicated issue. Applications create registry entries, modify the registry entries of shared DLLs, and copy new DLLs over shared libraries. This arrangement works fine unless you want to move that application to another environment. Movement requires a lot of work for the organization making the transition to another platform. It is especially complicated for independent software vendors (ISVs) that need to be able to move their applications to whichever platform their customers prefer.
The problem gets even more complex when you start looking at issues related to Platform as a Service (PaaS). With a PaaS platform, a customer is using a cloud service that includes everything from the operating system to application development tools and a testing environment. Many PaaS vendors have created their own languages to be used to link components together. While there are benefits to having a well-architected development and deployment cloud platform, there is a huge danger of lock-in. Now, most of the PaaS vendors that I have been talking to promise that they will make it easy for customers to move from one cloud environment to another. Of course, I always believe everything a vendor tells me (that is meant as a joke, to lighten the mood), but I think that customers have to be wary about these claims of interoperability.
That was why I was intrigued with AppZero’s approach. Since the company decouples the operating system from the application code, it provides portability of a pre-installed application from one environment to the next. The company positions its approach as a virtual application appliance. In essence, this software is designed as a layer that sits between the operating system and the application. This layer intercepts file I/O and shared memory I/O, as well as specific DLLs, and keeps them in separate “containers” that are isolated from the underlying operating system and from other applications.
Therefore, the actual application does not change any of the files or registry entries on a Windows server. In this way, a company can run a single instance of the Windows Server operating system. In essence, the layer isolates the application and its specific dependencies and configurations from the operating system, so fewer operating system instances are needed to manage a Microsoft Windows Server based data center.
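To illustrate the general idea of a layer that redirects an application's changes away from shared operating system locations, here is a toy sketch. It is my own simplification of the concept, not a description of AppZero's actual technology, and the paths and function names are hypothetical.

```python
import os

CONTAINER_ROOT = "/tmp/app_containers"   # hypothetical location for per-application containers

def containerized_path(app_name: str, requested_path: str) -> str:
    """Map a path the application asked for into that application's private container."""
    safe = requested_path.replace(":", "").lstrip("/\\")
    return os.path.join(CONTAINER_ROOT, app_name, safe)

def write_file(app_name: str, requested_path: str, data: bytes) -> str:
    """Write where the interception layer says, not where the application asked."""
    target = containerized_path(app_name, requested_path)
    os.makedirs(os.path.dirname(target), exist_ok=True)
    with open(target, "wb") as f:
        f.write(data)
    return target

# Two applications "install" the same shared DLL; neither one overwrites the other,
# and nothing is written into the shared system directory itself.
print(write_file("app_a", "C:/Windows/System32/shared.dll", b"version 1"))
print(write_file("app_b", "C:/Windows/System32/shared.dll", b"version 2"))
```

The real value, as described above, is that everything the application touches stays in its own container, so the same pre-installed image can be dropped onto a different machine or a different cloud without the usual reinstall.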
AppZero enables the user to load an application from the network rather than from the local disk. It therefore should simplify the job of data center operations management by enabling a single application image to be provisioned to multiple environments, and by making it easier to keep track of changes within a Windows environment because the application isn’t tied to a particular OS. AppZero has found a niche selling its offerings to ISVs that want to move their offerings across different platforms without having to have people install the application. By having the application pre-installed in a virtual application appliance, the ISV can remove many of the errors that occur when a customer installs the application into their environment. Delivering the application in a virtual application appliance container greatly reduces the variability of components that might affect the application in a traditional installation process. In addition, the company has been able to establish partnerships with both Amazon and GoGrid.
So, what does this have to do with portability and the cloud? It seems to me that this approach of separating layers of software so that interdependencies do not interfere with portability is one of the key ingredients in software portability in the cloud. Clearly, it isn’t the only issue to be solved. There are issues such as standard interfaces, standards for security, and the like. But I expect that many of these problems will be solved by a combination of lessons learned from existing standards from the Internet, web services, Service Orientation, systems and network management. We’ll be ok, as long as we don’t try to reinvent everything that has already been invented.
There has been a lot of discussion these days about private and public clouds. More discussion has been generated because both Amazon.com and Salesforce.com have added a Virtual Private Network (VPN) option to their public cloud services. What does this mean in the context of how customers will move to cloud computing? It is clear from the research that I have been doing that the private cloud and the hybrid cloud are real and will be part of the computing landscape for a long time. The emergence of the virtual private cloud is an early indication that some customers want a better guarantee of the safety of their data. The combination of a public cloud with the privacy offered by a VPN is only going to grow over the coming year.
So, is a Virtual Private Cloud still a public cloud? I found the blog published by Amazon’s CTO, Werner Vogels, announcing the virtual private cloud particularly fascinating. On one hand, the virtual private cloud announcement is a proclamation that customers want to be able to have secure access to services on the Amazon EC2 cloud. On the other hand, he is quite clear that there is no such thing as a private cloud. Clearly, it is in Amazon’s best interest for customers to focus on public clouds. Vogels states in his blog that “What is called private clouds have little of these benefits (he means characteristics of the cloud) and as such I don’t think of them as true clouds.” The three characteristics of the cloud he points to include:
- eliminating costs – lowering both capital expenses and operating costs
- elasticity – avoiding complex procurement cycles and improving time to market
- removing undifferentiated heavy lifting – off-loading data center operations
While I agree that there are many situations where this is an ideal approach, I don’t think the situation is black and white. There are indeed shades of gray. In my view, a private cloud has to be architected differently than a traditional data center. But like a traditional data center, it is protected by a firewall and sophisticated security. A private cloud will almost always be combined with some public cloud services (capacity on demand, software as a service, or platform as a service). So, I’ll take each of the three characteristics mentioned in Vogels’ blog and explain my view, based on the fact that customers will make both economic and technical choices.
- eliminating costs – In reality there are data centers that work pretty well and are core to the business. The company has made an investment and therefore would not necessarily be able to lower costs. However, I expect that even if a company decides to go with a private cloud, there will be good reasons to use capacity on demand to fill gaps and expand for projects. In addition, a very large company will have the financial means to establish its own cloud that will be much more cost effective. A cost/benefit analysis of using a public cloud versus a private cloud is not straightforward (see the sketch after this list); it requires a deep assessment of lots of different factors.
- elasticity – It is quite clear that many data centers do not have an efficient way to provision resources for users. However, if a data center is rearchitected to enable self-service provisioning, it can be transformed to better support users. Again, I expect that customers will take advantage of additional capacity or platform services even if they have private cloud services. This is especially true for companies whose computing infrastructure is the foundation of their business.
- removing undifferentiated services – This will really depend on whether the data center helps a company differentiate itself. There are definitely services that offer no value to the bottom line, such as electronic mail, that should be placed in a public cloud (with a VPN for security, in some cases). However, services that are at the core of the business probably need to be in a private cloud. Many companies will decide which services are differentiated and which are not, and create a hybrid environment. Companies will have to do their homework both in terms of focus and costs. It might initially cost more to move a service such as email to a public cloud, but it could free up significant resources in the long run. In other situations, paying per hour, etc. may be a lot more costly than you might imagine.
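Here is the sketch I promised above: a deliberately simplified public versus private cost comparison. All of the rates, utilization figures, and amortization assumptions are invented for illustration; change any of them and the answer can flip, which is exactly the point.

```python
HOURS_PER_YEAR = 24 * 365

# Hypothetical public cloud: pay only for the server-hours actually used.
public_rate_per_server_hour = 0.40   # assumed on-demand price
servers_needed = 50
fraction_of_year_needed = 0.35       # assumed bursty workload
public_cost = (public_rate_per_server_hour * servers_needed
               * HOURS_PER_YEAR * fraction_of_year_needed)

# Hypothetical private cloud: capital amortized over four years plus yearly operations.
capital_per_server = 6000
amortization_years = 4
ops_cost_per_server_per_year = 1500
private_cost = servers_needed * (capital_per_server / amortization_years
                                 + ops_cost_per_server_per_year)

print(f"Public cloud estimate:  ${public_cost:,.0f} per year")
print(f"Private cloud estimate: ${private_cost:,.0f} per year")
```

With these particular assumptions the public cloud looks cheaper; keep the workload busy all year, or stretch the amortization period, and the private option wins. That is the deep assessment of factors I am talking about.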
My bottom line is this. The cloud will continue to evolve over the coming decade and there is no one approach that will become the standard. The cloud is primarily an economic proposition that will require careful evaluation. Companies need to understand what their business is, what the value and role of the data center is and what is the best set of services available. The good news is that with the evolution of the cloud companies will have lots of good options.
I haven’t written a blog post in quite a while. Yes, I feel bad about that, but I think I have a good excuse. I have been hard at work (along with my colleagues Marcia Kaufman, Robin Bloor, and Fern Halper) on Cloud Computing for Dummies. I will admit that we underestimated the effort. We thought that since we had already written Service Oriented Architectures for Dummies (twice) and Service Management for Dummies, Cloud Computing for Dummies would be relatively easy. It wasn’t. Over the past six months we have learned a lot about the cloud and where it is headed. I thought that rather than try to rewrite the entire book right here, I would give you a sense of some of the important things that I have learned. I will hold myself to 10 so that I don’t go overboard!
1. The cloud is both old and new at the same time. It is built on the knowledge and experience of timesharing, Internet services, Application Service Providers, hosting, and managed services. So, it is an evolution, not a revolution.
2. There are lots of shades of gray with cloud segmentation. Yes, there are three buckets that we put clouds into: infrastructure as a service, platform as a service, and software as a service. That’s nice and simple. However, it isn’t that simple, because all of these areas are starting to blur into each other. And it is even more complicated because there is also business process as a service. This is not a distinct market unto itself – rather, it is an important component of the cloud in general.
3. Market leadership is in flux. Six months ago the marketplace for cloud was fairly easy to figure out. There were companies like Amazon and Google and an assortment of other pure-play companies. That landscape is shifting as we speak. The big guns like IBM, HP, EMC, VMware, Microsoft, and others are rushing in. They would like to control the cloud. It is indeed a market where big players will have a strategic advantage.
4. The cloud is an economic and business model. Business management wants the data center to be easily scalable and predictable and affordable. As it becomes clear that IT is the business, the industrialization of the data center follows. The economics of the cloud are complicated because so many factors are important: the cost of power; the cost of space; the existing resources — hardware, software, and personnel (and the status of utilization). Determining the most economical approach is harder than it might appear.
5. The private cloud is real. For a while there was a raging debate: is there such a thing as a private cloud? It has become clear to me that there is indeed a private cloud. A private cloud is the transformation of the data center into a modular, service oriented environment that enables users to safely procure infrastructure, platform, and software services in a self-service manner. This may not be a replacement for an entire data center – a private cloud might be a portion of the data center dedicated to certain business units or certain tasks.
6. The hybrid cloud is the future. The future of the cloud is a combination of private, traditional data centers, hosting, and public clouds. Of course, there will be companies that will only use public cloud services for everything but the majority of companies will have a combination of cloud services.
7. Managing the cloud is complicated. This is not just a problem for the vendors providing cloud services. Any company using cloud services needs to be able to monitor service levels across the services they use. This will only get more complicated over time.
8. Security is king in the cloud. Many of the customers we talked to are scared about the security implications of putting their valuable data into a public cloud. Is it safe? Will my data cross country borders? How strong is the vendor? What if it goes out of business? This issue is causing many customers to either consider only a private cloud or to hold back. The vendors who succeed in the cloud will have to have a strong brand that customers will trust. Security will always be a concern, but it will be addressed by smart vendors.
9. Interoperability between clouds is the next frontier. In these early days customers tend to buy one service at a time for a single purpose — Salesforce.com for CRM, some compute services from Amazon, etc. However, over time, customers will want to have more interoperability across these platforms. They will want to be able to move their data and their code from one environment to another. There is some forward movement in this area but it is early. There are few standards for the cloud and little agreement.
10. The cloud in a box. There is a lot of packaging going on out there and it comes in two forms. Some companies are creating appliance-based environments for managing virtual images. Other vendors (especially the big ones like HP and IBM) are packaging their cloud offerings with their hardware for companies that want private clouds.
I have only scratched the surface of this emerging market. What makes it so interesting and so important is that it actually is the coalescing of computing. It incorporates hardware, management software, service orientation, security, software development, information management, the Internet, service management, interoperability, and probably a dozen other components that I haven’t mentioned. It is truly the way we will achieve the industrialization of software.