
Eight things that changed since we wrote Cloud Computing for Dummies

October 8, 2010

I admit that I haven’t written a blog in more than three months — but I do have a good reason. I just finished writing my latest book — not a Dummies book this time. It will be my first business book based on almost three decades in the computer industry. Once I know the publication date I will tell you a lot more about it. But as I was finishing this book I was thinking about my last book, Cloud Computing for Dummies, which was published almost two years ago. As this anniversary approaches I thought it was appropriate to take a look back at what has changed. I could probably go on for quite a while about how little information was available at that point and how few CIOs were willing to talk about or even consider cloud computing as a strategy. But that’s old news. I decided that it would be most interesting to focus on eight of the changes that I have seen in this fast-moving market over the past two years.

Change One: IT is now on board with cloud computing. Cloud computing has moved from a reaction to sluggish IT departments to a business strategy involving both business and technology leaders. A few years ago, business leaders were reading about Amazon and Google in business magazines. They knew little about what was behind the hype. They focused on the fact that these early cloud pioneers seemed to be efficient at making cloud capability available on demand. No paperwork and no waiting for the procurement department to process an order. Two years ago IT leaders tried to pretend that cloud computing was a passing fad that would disappear. Now I am finding that IT is treating cloud computing as a centerpiece of its future strategies — even if it is only testing the waters.

Change Two: enterprise computing vendors are all in with both private and public cloud offerings. Two years ago most traditional IT vendors did not pay much attention to the cloud. Today, most hardware, software, and services vendors have jumped on the bandwagon. They all have cloud computing strategies. Most of these vendors are clearly focused on a private cloud strategy. However, many are beginning to offer specialized public cloud services with a focus on security and manageability. These vendors are melding all types of cloud services — public, private, and hybrid — into interesting and sometimes compelling offerings.

Change Three: Service Orientation will make cloud computing successful. Service Orientation was hot two years ago. The huge hype behind cloud computing led many pundits to proclaim that Service Oriented Architecture was dead and gone. In fact, the cloud vendors that are succeeding are those building true business services without dependencies, services that can migrate between public, private, and hybrid clouds. That portability is a competitive advantage.

Change Four: System Vendors are banking on integration. Does a cloud really need hardware? Only two years ago the dialog centered on the contention that the cloud meant no hardware would be necessary. What a difference a few years can make. The emphasis, coming primarily from the major systems vendors, is that hardware indeed matters. These vendors are integrating cloud infrastructure services with their hardware.

Change Five: Cloud Security takes center stage. Yes, cloud security was a huge topic two years ago, but the dialog is beginning to change. There are three conversations that I am hearing. First, cloud security is a huge issue that is holding back widespread adoption. Second, there are well-designed software and hardware offerings that can make cloud computing safe. Third, public clouds are just as secure as an internal data center because these vendors have more security experts than any traditional data center. In addition, a large number of venture-backed cloud security companies are entering the market with new and quite compelling value propositions.

Change Six: Cloud Service Level Management is a primary customer concern. Two years ago no one our team interviewed for Cloud Computing for Dummies connected service level management with cloud computing. Now that customers are seriously planning for widespread adoption of cloud computing, they are closely examining their required level of service. IT managers are reading the service level agreements from public cloud vendors and Software as a Service vendors carefully. They are looking beyond the service level for a single service and beginning to think about the overall service level across their own data centers as well as the other cloud services they intend to use.

Change Seven: IT cares most about service automation. No, automation in the data center is not new; it has been an important consideration for years. However, what is new is that IT management is looking at the cloud not just as a way to avoid the costs of purchasing hardware. They see automation of both routine functions and business processes as the primary benefit of cloud computing. In the long run, IT management intends to focus on automation and reduce hardware to interchangeable commodities.

Change Eight: Cloud computing moves to the front office. Two years ago IT and business leaders saw cloud computing as a way to improve back office efficiency. This is beginning to change. With the flexibility of cloud computing, management is now looking at the potential to quickly innovate business processes that touch partners and customers.

Why are we about to move from cloud computing to industrial computing?

April 5, 2010

I spent the other week at a new conference called Cloud Connect. Being able to spend four days immersed in an industry discussion about cloud computing really allows you to step back and think about where we are with this emerging industry. While it would be possible to write endlessly about all the meetings and conversations I had, you probably wouldn’t have enough time to read all that. So, I’ll spare you and give you the top four things I learned at Cloud Connect. I recommend that you also take a look at Brenda Michelson’s blogs from the event for a lot more detail. I would also refer you to Joe McKendrick’s blog from the event.

1. Customers are still figuring out what Cloud Computing is all about.  For those of us who spend way too many hours on the topic of cloud computing, it is easy to make the assumption that everyone knows what it is all about.  The reality is that most customers do not understand what cloud computing is.  Marcia Kaufman and I conducted a full day workshop called Introduction to Cloud. The more than 60 people who dedicated a full day to a discussion of all aspects of the cloud made it clear to us that they are still figuring out the difference between infrastructure as a service and platform as a service. They are still trying to understand the issues around security and what cloud computing will mean to their jobs.

2. There is a parallel universe out there among people who have been living and breathing cloud computing for the last few years. In their view the questions are very different. The big issues discussed among the well-connected were focused on a few key questions: Is there such a thing as a private cloud? Is Software as a Service really cloud computing? Will we ever have a true segmentation of the cloud computing market?

3. From the vantage point of the market, it is becoming clear that we are about to enter one of those transitional times in this important evolution of computing. Cloud Connect reminded me a lot of the early days of the commercial Unix market. When I attended my first Unix conference in the mid-1980s it was a different experience than going to a conference like Comdex. It was small. I could go and have a conversation with every vendor exhibiting. I had great meetings with true innovators. There was a spirit of change and innovation in the halls. I had the same feeling about the Cloud Connect conference. There were a small number of exhibitors. The key innovators driving the future of the market were there to discuss and debate the future. There was electricity in the air.

4. I also anticipate a change in the direction of cloud computing now that it is about to pass that tipping point. I am a student of history, so I look for patterns. When Unix reached the stage where the giants woke up and started seeing huge opportunity, they jumped in with a vengeance. The great but small Unix technology companies were either acquired, got big, or went out of business. I think that we are on the cusp of the same situation with cloud computing. IBM, HP, Microsoft, and a vast array of others have seen the future and it is the cloud. This will mean that emerging companies with great technology will have to be both really lucky and really smart.

The bottom line is that Cloud Connect represented a seminal moment in cloud computing. There is plenty of fear among customers who are trying to figure out what it will mean to their own data centers. What will the organizational structure of the future look like? They don’t know and they are afraid. The innovative companies are looking at the coming armies of large vendors and are wondering how to keep their differentiation so that they can become the next Google rather than the next company whose name we can’t remember.

There was much debate about two important issues: cloud standards and private clouds. Are these issues related? Of course. Standards always become an issue when there is a power grab in a market. If a Google, Microsoft, Amazon, IBM, or an Oracle is able to set the terms for cloud computing, market control can shift overnight. Will standard interfaces be able to save the customer?

And how about private clouds? Are they real? My observation and contention is that yes, private clouds are real. If you deploy the same automation, provisioning software, and workload management inside a company rather than inside a public cloud, it is still a cloud. Ironically, the debate over the private cloud is also about power and position in the market, not about ideology. If a company like Google, Amazon, or name whichever company is your favorite flavor… is able to debunk the private cloud — guess who gets all the money? If you are a large company where IT and the data center are core to how you conduct business, you can and should have a private cloud that you control and manage.

So, after taking a step back I believe that we are witnessing the next generation of computing — the industrialization of computing. It might not be as much fun as the wild west that we are in the midst of right now but it is coming and should be here before we realize that it has happened.

The DNA of the Cloud Power Partnerships

January 15, 2010

I have been thinking a lot about the new alliances forming around cloud computing over the past couple of months. The most important of these moves are the EMC, Cisco, and VMware alliance, HP and Microsoft’s announced collaboration, and of course, Oracle’s planned acquisition of Sun. Now, let’s add IBM’s cloud strategy into the mix, which has a very different complexion from its competitors’. And, of course, my discussion of the cloud power struggle wouldn’t be complete without adding in the insurgents — Google and Amazon. While it is tempting to portray this power grab by all of the above as something brand new — it isn’t. It is a replay of well-worn patterns that we have seen in the computer industry for the past several decades. Yes, I am old enough to have been around for all of these power shifts. So, I’d like to point out what the DNA of this power struggle looks like for the cloud and how we might see history repeating itself in the coming year. Here is a sample of how high-profile partnerships have fared over the past few decades. While the past can never accurately predict the future, it does provide some interesting insights.

Partner realignment happens when the stakes change.  There was a time when Cisco was a very, very close partner with HP. In fact, I remember a time when HP got out of the customer service software market to collaborate with Cisco. That was back in 1997.

Here are the first couple of sentences from the press release:

SAN JOSE and PALO ALTO, Calif., Jan. 15, 1997 — Hewlett-Packard Company and Cisco Systems Inc. today announced an alliance to jointly develop Internet-ready networked-computing solutions to maximize the benefits of combining networking and computing. HP and Cisco will expand or begin collaboration in four areas: technology development, product integration, professional services and customer service and support.

If you are interested, here is a link to the full press release. What’s my point? These types of partnerships are in both HP’s and Cisco’s DNA. Both companies have made significant and broad-reaching partnerships. For example, back in 2004, IBM and Cisco created a broad partnership focused on the data center. Here’s an excerpt from a CRN article:

From the April 29, 2004 issue of CRN:

Cisco Systems (NSDQ:CSCO) and IBM (NYSE:IBM) on Thursday expanded their long-standing strategic alliance to take aim at the data center market. Solution providers said the new integrated data center solutions, which include a Cisco Gigabit Ethernet Layer 2 switch module for IBM’s eServer Blade Center, will help speed deployment times and ease management of on-demand technology environments.
“This is a big win for IBM,” said Chris Swahn, president of sales at Amherst Technologies, a solution provider in Merrimack, N.H.
The partnership propels IBM past rival Hewlett-Packard, which has not been as quick to integrate its own ProCurve network equipment into its autonomic computing strategy, Swahn said.
Cisco and IBM said they are bringing together their server, storage, networking and management products to provide an integrated data center automation platform.

Here is a link to the rest of the article.

HP itself has had a long history of very interesting partnerships. A few that are most relevant include HP’s ill-fated partnership with BEA in the 1990s. At the time, HP invested $100 million in BEA to further the development of software to support HP’s software infrastructure and platform strategy.

HP Gives BEA $100m for Joint TP Development
Published:08-April-1999
By Computergram

Hewlett-Packard Co and BEA Systems Inc yesterday said they plan to develop new transaction processing software as well as integrate a raft of HP software with BEA’s WebLogic application server, OLTP and e-commerce software. In giving the nod to WebLogic as its choice of application server, HP stopped far short of an outright acquisition of the recently-troubled middleware company, a piece of Wall Street tittle tattle which has been doing the round for several weeks now. HP has agreed to put BEA products through all of its distribution channels and is committing $100m for integration and joint development.

Here’s a link to an article about the deal.

Oracle  probably has more partnerships and more entanglement with more companies than anyone else.  For example,  HP has a  longstanding partnership with Oracle on the data management front. HP partnered closely with Oracle and optimized its hardware for the Oracle database. Today, Oracle and HP have more than 100,000 joint customers. Likewise, Oracle has a strong partnership with IBM — especially around its solutions business. IBM Global Services operates a huge consulting practice based on implementing and running Oracle’s solutions.  Not to be outdone, EMC and Oracle have about 70,000 joint customers. Oracle supports EMC’s storage solutions for Oracle’s portfolio while EMC supports Oracle’s solutions portfolio.

Microsoft, like Oracle, has entanglements with most of the market leaders. Microsoft has partnered very closely with HP for the last couple of decades both on the PC front and on the software front. Clearly, the partnership between HP and Microsoft has evolved for many years so this latest partnership is a continuation of a long-standing relationship. Microsoft has long-standing relationships with EMC, Sun, and Oracle — to name a few.

And what about Amazon and Google? Because both companies were early innovators in cloud computing, they were able to gain credibility in a market that had not yet emerged as the center of power. Therefore, both companies were well positioned to create partnerships with every established vendor that needed to do something with the cloud. Every company from IBM to Oracle to EMC and Microsoft — to name but a few — established partnerships with these companies. Amazon and Google were small, convenient, and non-threatening. But as the power of both companies continues to grow, so will their ability to partner in the traditional way. I am reminded of the way IBM partnered with two small companies, Intel and Microsoft, when it needed a processor and an operating system to help bring the IBM PC to market in the early 1980s.

The bottom line is that cloud computing is becoming more than a passing fad — it is the future of how computing will change in the coming decades. Because of this reality, partnerships are changing and will continue to change. So, I suspect that the pronouncements of strategic, critical, and sustainable partnerships may or may not be worth the paper or compute cycles that created them. But the reality is that the power struggle for cloud dominance is on. It will not leave anything untouched. It will envelop hardware, software, networking, and services. No one can predict exactly what will happen, but the way these companies have acted in the past and the present gives us clues to a chaotic yet strangely predictable future.

Predictions for 2010: clouds, mergers, social networks and analytics

December 15, 2009

Yes, it is predictions time. Let me start by saying that no market change happens in a single year. Therefore, what is important is to look at the nuance of a market or a technology change in the context of its evolution. So, it is in this spirit that I will make a few predictions. I’ve decided to just list my top six predictions (I don’t like odd numbers). Next week I will add another five or six predictions.

  1. Cloud computing will move out of the fear, uncertainty and doubt phase to the reality phase for many customers. This means that large corporations will begin to move segments of their infrastructure and applications to the cloud. It will be a slow but steady movement. The biggest impact on the market is that customers will begin putting pressure on vendors to guarantee predictability, reliability, and portability.
  2. Service Management will become mainstream. Over the past five years the focus of service management has been on ITIL (Information Technology Infrastructure Library) processes and certification. There is a subtle change happening as corporations start to take a more holistic view of how everything that has a sensor, an actuator, or a computer interface can be effectively managed. Cloud computing will have a major impact on the growing importance of service management.
  3. Cloud service providers will begin to drop their prices dramatically as competition intensifies. This will be one of the primary drivers of growth of the use of cloud services. It will put a lot of pressure on smaller niche cloud providers as the larger companies try to gain control of this emerging market.
  4. It is not a stretch to state that the pace of technology acquisitions will accelerate in 2010.  I expect that HP, IBM, Cisco, Oracle, Microsoft, Google, and CA will be extremely active. While it would be foolhardy to pick a single area, I’ll go out on a limb and suggest that security, data center management, service management, and information management will be the focus of many of the acquisitions.
  5. Social Networking will become much more mainstream than it was in 2009. Marketers will finally realize that blatant sales pitches on Twitter or Facebook just won’t cut it. We will begin to see marketers learn how to integrate social networking into the fabric of marketing programs. As this happens there will be hundreds of new startups focused on analyzing the effectiveness of these marketing efforts.
  6. Information management is at the cusp of a major change. While the individual database remains important, the issue for customers is the need to manage information holistically so that they can anticipate change. As markets grow increasingly complex and competitive, the hottest products in 2010 will be those that help companies anticipate what will happen next. So expect anything with the term predictive analytics to be hot, hot, hot.

Predicting the future of computing by understanding the past

December 1, 2009

Now that Thanksgiving is over I am ready to start preparing my predictions for 2010. This year, I decided to start by looking backwards. I first entered the computer industry in the late 1970s when mainframes roamed the earth and timesharing was king. Clearly, a lot has changed. But what I was thinking about was the assumptions that people had about the future of computing at that time and over the next several decades. So, I thought it would be instructive to mention a few interesting assumptions that I heard over the years. In preparation for my predictions in a couple of weeks, here are a few noteworthy predictions from past eras:

1. Late 1970s – The mainframe will always be the prevalent computing platform. The minicomputer is a toy.

2. Early 1980s – The PC will never be successful. It is for hobbyists. Who would ever want a personal computer in their home? And if they got one, what would they ever do with it — keep track of recipes?

3. Mid-1980s – The minicomputer will prevail. The personal computer and network-based servers are just toys.

4. Mid-1980s – The leaders of the computer industry — IBM, Digital Equipment Corporation, and Wang Laboratories — will prevail.

5. Early 1990s – The Internet has no real future as a computing platform. It is unreliable and too hard to use. How could it possibly scale enough to support millions of customers?

6. Early 1990s – Electronic Commerce is a pipe dream. It is too complicated to work in the real world.

7. Mid-1990s – If you give away software to gain “eyeballs” (the popular term in the era) and market share you will fail.

I could mention hundreds of other assumptions that I have come across that defied the conventional wisdom of the day. The reality is that these types of proof points are not without nuance. For example, the mainframe remains an important platform today because of its ability to process high-volume transactions and for its reliability and predictability. However, it is no longer the primary platform for computing. The minicomputer still exists but has morphed into more flexible server-based appliances. The PC would never have gotten off the ground without the pioneering work done by Dan Bricklin and Bob Frankston, who created the first PC-based spreadsheet. Also, if the mainframe and minicomputer vendors had adopted a flexible computing model, corporations would never have brought millions of unmanageable PCs into their departments. Of the three computing giants of the late 80s, only IBM is still standing. Digital Equipment was swallowed by Compaq, which in turn was acquired by HP, and Wang was bought by Getronics. The lesson? Leaders come and go. Only the humble or paranoid survive. Who could have predicted the emergence of Google or Amazon.com? In the early days of online commerce it was unclear if it would really work. How could a vendor possibly construct a system that could transmit transactions between partners and customers across the globe? It took time and lots of failures before it became the norm.

My final observation is actually the most complicated. In the mid-1990s during the dotcom era I worked with many companies that thought they could give away their software for a few dollars, gain a huge installed base, and make money by monetizing those customers. I admit that I was skeptical. I would ask these companies, how can you make money and sustain your company? If you sell a few million copies of your software, revenue will still be under $20 million — before expenses, which would be huge. The reality is that none of these companies are around today. They simply couldn’t survive because there was no viable revenue model for the future. Fast forward almost 20 years. Google was built on top of the failures of these pioneers; it understood that you could use an installed base to build something significant.
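The arithmetic behind that skepticism is easy to sketch. The price and volume figures below are illustrative assumptions (the post only says “a few dollars” and “a few million copies”), not numbers from any actual company:

```python
# Back-of-the-envelope revenue for the mid-1990s "give it away" model.
# Both figures are illustrative assumptions, not real company data.
price_per_copy = 5          # dollars -- "a few dollars" per copy
copies_sold = 3_000_000     # "a few million copies"

gross_revenue = price_per_copy * copies_sold
print(f"Gross revenue: ${gross_revenue:,}")  # Gross revenue: $15,000,000
```

Even at the optimistic end of “a few,” the top line stays under $20 million before any expenses, which is exactly the problem those companies could never solve.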

So, as I start to plan to predict 2010 I will try to keep in mind the assumptions, conventional wisdom, successes and failures of earlier times.

Tectonic shifts: HP Plus 3Com versus Cisco Plus EMC

November 18, 2009

Just when it looked clear where the markets were lining up around data center automation and cloud computing, things change. I guess that is what makes this industry so very interesting. The proposed acquisition by HP of 3Com is a direct challenge to Cisco’s network management franchise. However, the implications of this move go further than what meets the eye. It also puts HP on a direct collision course with EMC and its Cisco partnership. And to make things even more interesting, it puts these two camps in a competitive three-way race against IBM and its cloud/data center automation strategy. And of course, it doesn’t stop there. A myriad of emerging companies like Google and Amazon want a larger share of the enterprise market for cloud services. Companies like Unisys and CSC that have focused on outsourced secure data centers are getting into the act.

I don’t think that we will see a single winner — no matter what any one of these companies will tell you. The winners in this market shift will be those companies that can build a compelling platform and a compelling value proposition for a partner ecosystem. The truth about the cloud is that it is not simply a network or a data center. It is a new way of providing services of all sorts that can support changing customer workloads in a secure and predictable manner.

In light of this, what does this say about HP’s plans to acquire 3Com? If we assume that the network infrastructure is a key component of an emerging cloud and data center strategy, HP is taking a calculated risk in acquiring more assets in this market. The company has found that its ProCurve networking division has begun gaining traction. HP ProCurve Networking includes network switches, wireless access points, WAN routers, and access control servers and software. ProCurve competes directly with Cisco in the networking switch market. When HP had a tight partnership with Cisco, the company de-emphasized networking. However, once Cisco started to move into the server market, the handcuffs came off. The 3Com acquisition takes the competitive play to a new level. 3Com has a variety of good pieces of technology that HP could leverage within ProCurve. Even more significantly, HP picks up a strong security product called TippingPoint, a 3Com acquisition. TippingPoint fills a critical hole in HP’s security offering with network security products, including intrusion prevention and a product that inspects network packets. The former 3Com subsidiary has also established a database of security threats based on a network of external researchers.

But I think that one of the most important reasons that HP bought 3Com is its strong relationships in the Chinese market. In fiscal year 2008 half of 3Com’s revenue came from H3C, its joint venture with the Chinese vendor Huawei. Therefore, it is not surprising that HP would have paid a premium to gain a foothold in this lucrative market. If HP is smart, it will do a good job leveraging 3Com’s many software assets both to build out its networking portfolio and to beef up its software organization. In reality, HP is much more comfortable in the hardware market. Therefore, adding networking as a core competency makes sense. It will also bolster its position as a player in the high-end data center market and in the private cloud space.

Cisco, on the other hand, is coming from the network and moving aggressively into the cloud and the data center market. The company has purchased a position in VMware and has established a tight partnership with EMC as a go-to-market strategy. For Cisco, the relationship brings credibility and access to customers outside of its traditional markets. For EMC, the Cisco relationship strengthens its networking play. But an even bigger value of the relationship is presenting a bigger footprint to customers as the partners move to take on HP, IBM, and the assortment of other players who all want to win. The Cisco/EMC/VMware play is to focus on the private cloud. In their view a private cloud is very similar to a private, preconfigured data center. It can be a compelling value proposition for a customer that needs a data center fast without having to deal with a lot of moving parts. The real question from a cloud computing perspective: is this really a cloud?

It was inevitable that this quiet market dominated by Google and Amazon would heat up as the cloud becomes a real market force.  But I don’t expect that HP or Cisco/EMC will have a free run. They are being joined by IBM and Microsoft — among others. The impact could be better options for customers and prices that invariably will fall. The key to success for all of these players will be how well they manage what will be an increasingly heterogeneous, federated, and highly distributed hardware and software world. Management comes in many flavors: management of these highly distributed services and management of the workloads.

Is there a Twitter sneak attack in our future?

November 4, 2009

Last year I wrote a post about what I called the Google sneak attack. If you don’t feel like reading that post, I’ll make it simple for you. Google came to market as a benign, helpful little search engine that threatened no one. Fast forward a decade and Google now pulls in more ad revenue than most of the television networks combined. It has attacked Microsoft’s Office franchise, is playing a key role in the cloud via Platform as a Service (Google App Engine), not to mention the importance of its entry into the book business and who knows what else. But let’s turn our attention to Twitter. I’ve been using Twitter since 2007. For the first several months I couldn’t quite figure out what this was all about. It was confusing and intriguing at the same time. In fact, my first blog about Twitter suggested that the emperor had no clothes.

So fast forward to the end of 2009 and several very interesting things are happening:

1. Twitter is becoming as much a part of the cultural and technical fabric as Google became just a few years ago

2. A partner ecosystem has grown up around Twitter. A February post from Matt Ingram of Gigaom echoes this point.

3. Individuals, large corporations, and small businesses are using Twitter as everything from the neighborhood water cooler to a sales channel.

What does this mean? Despite detractors who wonder what you can possibly accomplish in 140 characters, it is becoming clear that this company without a published business plan does have a plan to dominate. It is, in fact, the same strategy that Google had. Which company would have been threatened by a small search company? And who could be threatened by a strange little company called Twitter that asked people to say it all in 140 characters? Today Twitter claims to have 18 million users, about 4% of adult internet users. I suspect that we will begin to see a slow but well-orchestrated rollout of services that leverage the Twitter platform. I suspect we will see a combination of advertising plus commercial software aimed at helping companies reach new customers in new channels.

I am confident that within the next two years this small, profitless, patient company will roll out a plan targeting social networking world dominance. It will be fun to watch.

Ten things I learned while writing Cloud Computing for Dummies

August 14, 2009 14 comments

I haven’t written a blog post in quite a while. Yes, I feel bad about that, but I think I have a good excuse. I have been hard at work (along with my colleagues Marcia Kaufman, Robin Bloor, and Fern Halper) on Cloud Computing for Dummies. I will admit that we underestimated the effort. We thought that since we had already written Service Oriented Architecture for Dummies (twice) and Service Management for Dummies, Cloud Computing would be relatively easy. It wasn't. Over the past six months we have learned a lot about the cloud and where it is headed. I thought that rather than try to rewrite the entire book right here, I would give you a sense of some of the important things that I have learned. I will hold myself to 10 so that I don't go overboard!

1. The cloud is both old and new at the same time. It is built on the knowledge and experience of timesharing, Internet services, Application Service Providers, hosting, and managed services. So, it is an evolution, not a revolution.

2. There are lots of shades of gray in cloud segmentation. Yes, there are three buckets that we put clouds into: infrastructure as a service, platform as a service, and software as a service. That's nice and simple. However, it isn't, because all of these areas are starting to blur into each other. And it is even more complicated because there is also business process as a service. This is not a distinct market unto itself – rather it is an important component of the cloud in general.

3. Market leadership is in flux. Six months ago the marketplace for cloud was fairly easy to figure out. There were companies like Amazon and Google and an assortment of other pure-play companies. That landscape is shifting as we speak. The big guns like IBM, HP, EMC, VMware, Microsoft, and others are rushing in. They would like to control the cloud. It is indeed a market where big players will have a strategic advantage.

4. The cloud is an economic and business model. Business management wants the data center to be easily scalable, predictable, and affordable. As it becomes clear that IT is the business, the industrialization of the data center follows. The economics of the cloud are complicated because so many factors matter: the cost of power, the cost of space, and the existing resources (hardware, software, and personnel) and how well they are utilized. Determining the most economical approach is harder than it might appear.
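To see why the economics are trickier than they look, here is a toy comparison of a fixed data center versus elastic cloud capacity. Every number in it is invented for illustration; real pricing, utilization, and workloads vary widely.

```python
# Toy model: an on-premises data center must be provisioned for the peak
# workload, while an elastic cloud bills only for average use.
# All dollar figures are made up for illustration.

def on_premises_monthly(peak_servers, cost_per_server=300.0):
    """Fixed capacity: you pay for every server, used or not."""
    return peak_servers * cost_per_server

def cloud_monthly(average_servers, hourly_rate=0.50, hours_per_month=730):
    """Elastic capacity: you pay only for what you actually run."""
    return average_servers * hourly_rate * hours_per_month

# A workload that peaks at 50 servers but averages only 10:
print(on_premises_monthly(50))  # provisioned for the peak -> 15000.0
print(cloud_monthly(10))        # billed for average use   -> 3650.0
```

Flip the utilization (a steady workload running near peak all month) and the comparison can easily reverse, which is exactly why the answer is not obvious.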

5. The private cloud is real.  For a while there was a raging debate: is there such a thing as a private cloud? It has become clear to me that there is. A private cloud is the transformation of the data center into a modular, service-oriented environment that lets users safely procure infrastructure, platform, and software services in a self-service manner.  This may not be a replacement for an entire data center – a private cloud might be a portion of the data center dedicated to certain business units or certain tasks.

6. The hybrid cloud is the future. The future of the cloud is a combination of private clouds, traditional data centers, hosting, and public clouds. Of course, some companies will use public cloud services for everything, but the majority will have a combination of cloud services.

7. Managing the cloud is complicated. This is not just a problem for the vendors providing cloud services. Any company using cloud services needs to be able to monitor service levels across the services they use. This will only get more complicated over time.

8. Security is king in the cloud. Many of the customers we talked to are scared about the security implications of putting their valuable data into a public cloud. Is it safe? Will my data cross country borders? How strong is the vendor? What if it goes out of business? This issue is causing many customers to either consider only a private cloud or hold back. The vendors who succeed in the cloud will have to have a strong brand that customers trust. Security will always be a concern, but it will be addressed by smart vendors.

9. Interoperability between clouds is the next frontier. In these early days customers tend to buy one service at a time for a single purpose — Salesforce.com for CRM, some compute services from Amazon, etc. However, over time, customers will want more interoperability across these platforms. They will want to be able to move their data and their code from one environment to another. There is some forward movement in this area, but it is early. There are few standards for the cloud and little agreement.

10. The cloud in a box. There is a lot of packaging going on out there, and it comes in two forms. Some companies are creating appliance-based environments for managing virtual images. Other vendors (especially the big ones like HP and IBM) are packaging their cloud offerings with their hardware for companies that want private clouds.

I have only scratched the surface of this emerging market. What makes it so interesting and so important is that it is actually the coalescing of computing. It incorporates hardware, management software, service orientation, security, software development, information management, the Internet, service management, interoperability, and probably a dozen other components that I haven't mentioned. It is truly the way we will achieve the industrialization of software.

Why Sun Microsystems can’t go it alone

April 6, 2009 7 comments

Like everyone else, I have been looking at what would happen if IBM were to buy Sun Microsystems. I actually thought it sounded pretty good. IBM would get hardware, some database, virtualization, cloud, and operating system software. Oh, and did I mention that they would control Java? But it sounds (at least as I am writing this) as though the negotiations have broken down. Greed is an interesting phenomenon. Prior to the overtures by IBM, Sun's stock price was around $3.00 a share. IBM was offering as much as $9.50 a share.  I actually thought that price was a bit high — but what do I know.

So, what happens now? I suspect this little drama is far from over. It is possible, if rumors are to be believed, that Sun's Chairman Scott McNealy will take over the reins of the company once again to try to restore it to its former glory. It has happened before. Steve Jobs returned to put Apple back on the right path. Michael Dell is trying to turn Dell into the innovator it was a decade ago.  Will it happen this time? I think there are some difficulties with this plan, if it is indeed true. A lot has changed since Sun declared in the 1980s that the network was the computer. Clearly, the company's leadership was right. I was an observer of the pragmatic and brilliant marketing company that Sun became in the 1980s, when I worked for its competitor Apollo Computer, which was later purchased by HP.

Today, the market is quite different from the one Sun and McNealy so successfully finessed. It is consolidating around very strong global leaders such as IBM, HP, Microsoft, Oracle, and Cisco, while a new generation of leaders that got their start in the Internet era, such as Google, Amazon, Facebook, and even Twitter, is emerging. So, is there room for Sun to remake itself in this new world?

I guess that my take is that it will be very hard for Sun to resurface and remake itself. Here are the three main reasons that I have doubts and why I think that shareholders and board members should sell the company to IBM.

1. Sun Microsystems will have trouble regaining hardware leadership.  While it has some reasonable hardware assets, it is not big enough to take on HP or the emergence of Cisco as a hardware player.  Even companies like Google and Amazon play an important role in hardware, in the commodity realm.

2. While it owns some impressive software assets that it has bought over the past decade, Sun has never learned to leverage these assets to propel it into a leadership role.  It has further confused the market by open sourcing its software. While this might be popular in a down market, it is not enough to create a repeatable revenue stream. I was watching a funny video of Steve Gillmor interviewing current CEO Jonathan Schwartz (as a puppet) that I think captures part of Sun's problems.

3. Is there a single area of technology where Sun can innovate and outshine its competitors? I imagine there might be some hidden jewels inside Sun that are transformational and could turn the market upside down, but I doubt it. I think that as cloud computing moves to center stage, Sun could be a player but not a leader. To be successful, Sun will have to find a way to lead in some area.

The bottom line is that I do not see a good future for Sun as an independent company.  I think that the damage has been done. Not only does the company have to regain shaky customer confidence, but it quickly has to start making a profit. It is not an easy climate even for the strongest companies.  It is possible that McNealy will surprise us all and turn Sun from a struggling player in a consolidating market into a leader, but it is probably too late.  Customers who are watching this drama unfold will have to be convinced that Sun has staying power, not just for this year but for future decades. If Sun tries to remain independent, I predict a long and difficult path that will not necessarily end in success.

Is there beef behind SalesForce.Com?

May 29, 2008 3 comments

I have been following Salesforce.com since its founding in 1999. Initially the company created a contact management system, which evolved into the sales force platform it offers today. Last month I attended a small dinner meeting in Boston hosted by Marc Benioff, Chairman and CEO of SalesForce.com, for some partners and customers. I met Steve Pugh, CEO of CODA Financials, a subsidiary of Coda, a UK-based developer of accounting software. I was intrigued that the company had built its new generation financial application on top of Salesforce.com's infrastructure. In my next post, I'll talk about Coda and why they made this decision. But before that I wanted to take a look at the Salesforce platform.

What is most interesting about Salesforce is that it intended to build a platform from day one. In my discussions with Marc in the early days, he focused not specifically on the benefits of CRM but rather on “No Software”. If you think about it, that was a radical concept ten years ago.

It goes without saying, therefore, that Salesforce has been a Software as a Service pioneer. In June 2003, for example, it launched sforce, one of the first web services based SaaS platforms, which offered partners a published SOAP-based API. Rather than viewing Salesforce as an application, the company views it as a “database in the sky” and interprets this database as an integration platform. Likewise, from a customer perspective, Salesforce has designed its environment to “look like a block”. What does that mean? I would probably use a different term, maybe an infrastructure black box.
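To make the "published SOAP-based API" idea concrete, here is a sketch of the kind of SOAP envelope a client of that era might construct to query the "database in the sky". The element names (SessionHeader, query, queryString) are simplifications of my own, not the exact Salesforce schema.

```python
# Build a simplified SOAP request envelope for a query call.
# Element names are illustrative assumptions, not the real Salesforce WSDL.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_query_envelope(session_id, soql):
    """Return a SOAP envelope (as a string) carrying a session id and a query."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP_NS}}}Header")
    ET.SubElement(header, "SessionHeader").text = session_id
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    query = ET.SubElement(body, "query")
    ET.SubElement(query, "queryString").text = soql
    return ET.tostring(env, encoding="unicode")

print(build_query_envelope("abc123", "SELECT Name FROM Account"))
```

The point is less the XML plumbing than the shape of the interaction: every call carries the caller's session, and the "database" is reached through a published, self-describing interface rather than a driver.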

Salesforce’s approach to creating its ecosystem has been incremental. It began, for example, by allowing customers to change tabs and create their own database objects. Next, the company added what it called the AppExchange, which added published APIs so that third-party software providers could integrate their applications into the Salesforce platform. Most of the applications on AppExchange are more like utilities than full-fledged packaged applications. Many of the packages sold through the AppExchange are “tracking applications”: for example, there is an application that tracks information about commercial and residential properties; another is designed to optimize the sales process for media/advertising companies; still another is intended to help analyze sales data.

But this is just the beginning of what Salesforce has planned. The company is bringing in expertise from traditional infrastructure companies like Oracle and BEA, among others. Its head of engineering came from eBay. Bringing in experienced management that understands enterprise scalability will be important — especially because of Salesforce's vast ambitions. I have been reading blogs by various Salesforce.com followers and critics. Josh Greenbaum, whom I have known for more than 20 years, has been quite critical of Salesforce and has predicted its demise (within 18 months). He makes the comparison between Salesforce.com and Siebel. While any company that has risen as fast as Salesforce.com will be a target, I do not believe that Salesforce.com is in trouble. There are two reasons I believe they have a good chance for sustainability: their underlying SOA architecture and the indications that ISVs are beginning to see the company as a viable infrastructure provider.

So, what is the path that Salesforce is following on its quest for infrastructuredom (is that a real word — probably not)? One of the primary reasons for my optimism is that Salesforce.com offers traditional development through a procedural language it calls Apex, which is intended to help developers write stored procedures or SQL statements. While this may disappoint some, it is a pragmatic move. But more important than Apex is the development of a standard XML-based stylesheet interface designed for use with Salesforce applications. This allows a developer to change the way the application looks; it is, in essence, the interface as a service. A third capability that I like is the technique Salesforce has designed for creating common objects. In essence, this is a basic packaging mechanism that allows a third party to create its own version of Salesforce for its customers. For example, this has enabled Accenture to create a version of Salesforce for its customers in the health care industry.

But what is behind the curtain at Salesforce? First, Salesforce uses the Oracle database as a technique for serving up file pages (not as a relational database). But the core intellectual property that sits on top of Oracle is a metadata architecture, designed as a multi-tenancy service. Salesforce considers this metadata stack the core of its differentiation in the market. The metadata layer is complex and includes an application server called Resin, a high-performance XML application server for use with JSPs, servlets, JavaBeans, XML, and a host of other technologies. On top of this metadata layer is an authorization server. The metadata layer is structured so that each organization has unique access to the stack. Therefore, two companies could be physically connected to the same server, but there would be no way for them to access each other's data; the metadata layer will only point to the data that is specific to a user. The environment is designed so that each organization (i.e., customer) has a specific WSDL-based API. In fact, the architecture takes the approach of accessing APIs through the WSDL interface. There are two versions of the WSDL — one general and one for a specific customer implementation. If customers want to share data, for example, they have to go through the general WSDL interface.
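The isolation idea is easier to see in a toy model. The sketch below is my own illustration of metadata-driven multi-tenancy, not Salesforce's actual implementation, and all of the org and account names are made up: two organizations share one physical table, yet every request is resolved through a metadata layer that only ever points at that org's rows.

```python
# Toy model of metadata-driven multi-tenancy: every request carries an org id,
# and the metadata layer decides which rows that org may see. Two orgs share
# the same physical table but can never reach each other's data.
# (Illustrative only; org/account names are invented.)

ROWS = [
    {"org": "acme",    "account": "Coda Financials"},
    {"org": "acme",    "account": "Globex"},
    {"org": "initech", "account": "Hooli"},
]

# The "metadata layer": which tables each org is allowed to address.
METADATA = {"acme": {"accounts"}, "initech": {"accounts"}}

def query(org_id, table):
    """Resolve the request through the metadata layer, then filter by org."""
    if org_id not in METADATA or table not in METADATA[org_id]:
        raise PermissionError("unknown org or table")
    return [r["account"] for r in ROWS if r["org"] == org_id]

print(query("acme", "accounts"))     # ['Coda Financials', 'Globex']
print(query("initech", "accounts"))  # ['Hooli']
```

Notice that there is no code path by which "acme" can name "initech" rows at all; isolation is a property of the lookup, not of a filter a client could forget to apply.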

Salesforce’s approach is to use XML-based interfaces for integration. It has used this to integrate with Google Apps. Salesforce had already begun partnering with Google around AdWords, and this move simply deepened the relationship, since both companies face common competitive threats.

The bottom line is that I think that Salesforce.com is well positioned in the market. It has an underlying architecture that is well conceived, based on a SOA approach. It has created an ecosystem of partners that leverage its APIs and rely on its network to build their businesses. Most importantly, SalesForce.com has created an application that is approachable to mortals (as opposed to software gods). Companies like Siebel, in contrast, created a platform that was complicated for customers to use — and therefore many purchased the software and never used it.

Salesforce.com is not without challenges. It needs to continue to innovate on its platform so that it does not get caught off guard by large players (Microsoft, SAP, and Oracle) who aren't happy with an upstart in a market they feel entitled to own. It is also at risk from newcomers like Zoho and open source CRM players like SugarCRM. If Salesforce.com can attract more packaged software vendors like Coda to build their next generation applications on top of its environment, it may be able to weather the inevitable threats.