Posts Tagged ‘cloud computing’

The lock-in risks of Software as a Service

May 3, 2010

I have been thinking a lot about Software as a Service environments and what they really mean to customers. I was talking to the CIO of a medium-sized company the other day. His company is a customer of a major SaaS vendor (he didn’t want me to name the company). In the beginning things were quite good. The application is relatively easy to navigate, and the salespeople were satisfied with the functionality. However, there was a problem. The use of this SaaS application was getting more complicated than the CIO had anticipated. First, the company discovered that it was locked into a three-year contract to support 450 salespeople. In addition, over the first several years of use, the company had hired a consultant to customize the workflow within the application.

So, what was the problem?  The CIO was increasingly alarmed about three issues:

  • The lack of elasticity. If the company suddenly had a bad quarter and wanted to reduce the number of licenses supported, they would be out of luck. One of the key promises of cloud computing and SaaS just went out the window.
  • High costs of the services model. It occurred to the CIO that the company was paying a lot more to support the SaaS application than it would have cost to buy an on premise CRM application. While there were many benefits to the reduced hardware and support requirements, the CIO was starting to wonder if the costs were justified. Did the company really do the analysis to determine the long-term cost/benefit of cloud? How would he explain to the CFO the long-term budget increases he expects are coming? It is not a conversation that he is looking forward to having.
  • No exit strategy. Given the amount of customization that the company has invested in, it is becoming increasingly clear that there is no easy answer – and no free lunch. One of the reasons that the company had decided to implement SaaS was the assumption that it would be possible to migrate from one SaaS application to another.  However, while it might be possible to migrate basic data from a SaaS application, it is almost impossible to migrate the process information. Shouldn’t there be a different approach to integration in clouds than for on premise?

The bottom line is that Software as a Service has many benefits in terms of more rapid deployment, initial savings in hardware and support services, and ease of access for a highly distributed workforce. However, there are complications that are important to take into account. Many SaaS vendors, like their counterparts in the on-premise world, are looking for long-term agreements and lock-in with customers. These vendors expect and even encourage customers to customize their implementations based on their specific business processes. There is nothing wrong with this – to make applications like CRM and HR productive they need to reflect a company’s own methods of doing business. However, companies need to understand what they are getting into. It is easy to get caught in the hype of the magic land of SaaS. As more and more SaaS companies are funded by venture capitalists, it is clear that they will not all survive. What happens to your customized processes and data if the company goes out of business?

It is becoming increasingly clear to me that we need a different approach to integration in the cloud than for on premise. It needs to leverage looser coupling and configuration rather than programmatic integration. We have the opportunity to rethink integration altogether – even for on premise applications.
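To make that idea concrete, here is a minimal sketch, in Python, of what configuration-driven integration could look like. The field names and the FIELD_MAP structure are hypothetical, not any vendor’s API; the point is simply that the mapping between a SaaS export and an internal schema lives in data rather than in code, so switching providers means editing a configuration instead of rewriting integration logic.

```python
# Minimal sketch of configuration-driven (rather than programmatic) integration.
# Field names and the mapping are hypothetical; the idea is that the mapping
# lives in configuration, so it can change without rewriting the integration.

FIELD_MAP = {
    # source field in the SaaS export : target field in the internal schema
    "AccountName": "customer_name",
    "AnnualRevenue": "annual_revenue_usd",
    "OwnerEmail": "account_owner_email",
}

def transform_record(saas_record: dict) -> dict:
    """Apply the declarative mapping to one exported record."""
    return {target: saas_record.get(source) for source, target in FIELD_MAP.items()}

if __name__ == "__main__":
    exported = {
        "AccountName": "Acme Corp",
        "AnnualRevenue": 1200000,
        "OwnerEmail": "pat@acme.example",
    }
    print(transform_record(exported))
```

Of course, as noted above, the hard part of any real migration is the process logic layered on top of the data, which rarely maps this cleanly.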

There is no simple answer to the quandary.  Companies looking to deploy a SaaS application need to do their homework before barreling in.  Understand the risks and rewards. Can you separate out the business process from the basic SaaS application? Do you really want to lock yourself into a vendor you don’t know well? It may not be so easy to free your company, your processes, or your data.

IBM’s hardware sneak attack

April 13, 2010

Yesterday I read an interesting blog commenting on why Oracle seems so interested in Sun’s hardware.

I quote from a comment by Brian Aker, former head of architecture for MySQL, on the O’Reilly Radar blog site. He comments on why he believes Oracle bought Sun:

Brian Aker: I have my opinions, and they’re based on what I see happening in the market. IBM has been moving their P Series systems into datacenter after datacenter, replacing Sun-based hardware. I believe that Oracle saw this and asked themselves “What is the next thing that IBM is going to do?” That’s easy. IBM is going to start pushing DB2 and the rest of their software stack into those environments. Now whether or not they’ll be successful, I don’t know. I suspect once Oracle reflected on their own need for hardware to scale up on, they saw a need to dive into the hardware business. I’m betting that they looked at Apple’s margins on hardware, and saw potential in doing the same with Sun’s hardware business. I’m sure everything else Sun owned looked nice and scrumptious, but Oracle bought Sun for the hardware.

I think that Brian has a good point. In fact, in a post I wrote a few months ago, I commented on the fact that hardware is back.  It is somewhat ironic. For a long time, the assumption has been that a software platform is the right leverage point to control markets.  Clearly, the tide is shifting.  IBM, for example, has taken full advantage of customer concerns about the future of the Sun platform. But IBM is not stopping there. I predict a hardware sneak attack that encompasses IBM’s platform software strength (i.e., middleware, automation, analytics, and service management) combined with its hardware platforms.

IBM will use its strength in systems and middleware software to expand its footprint into Oracle’s backyard, surrounding its software with an integrated platform designed to work as a system of systems. It is clear that over the past five or six years IBM’s focus has been on software and services. Software has long provided good profitability for IBM. Services have made enormous strides over the past decade as IBM has learned to codify knowledge and best practices into what I have called Service as Software. The other important movement has been IBM’s focused effort over the past decade to revamp the underlying structure of its software into modular services that are used across its software portfolio. Combine this approach with industry-focused business frameworks and you have a pretty good idea of where IBM is headed with its software and services portfolios.

The hardware strategy began to evolve in 2005, when IBM’s software group bought a small XML accelerator appliance company called DataPower. Many market watchers were confused. What would IBM software do with a hardware platform? Over time, IBM expanded the footprint of this platform and began to repurpose it as a means of pre-packaging software components. First there was a SOA-based appliance; then IBM added a virtual machine appliance called the CloudBurst appliance. On the Lotus side of the business, IBM bought another appliance company that evolved into the Lotus Foundations platform. Appliances became a great opportunity to package and preconfigure systems that could be remotely upgraded and managed. This packaging of software with systems demonstrated the potential not only for simplicity for customers but also for a new way of adding value and revenue.

Now, IBM is taking the idea of packaging hardware with software to new levels. It is starting to leverage its software and networking capabilities in hardware-driven systems. For example, within the systems environment, IBM is leveraging its knowledge of optimizing systems software so that application workloads can take advantage of capabilities such as threading, caching, and systems-level networking.

In its recent announcement, IBM has developed its new hardware platforms based on the five most common workloads: transaction processing, analytics, business applications, records management and archiving, and collaboration. What does this mean to customers? If a customer has a transaction-oriented system, the most important capability is to ensure that the environment uses as many threads as possible to optimize throughput. In addition, caching repetitive workloads will also ensure that transactions move through the system as quickly as possible. While this has been doable in the past, the difference is that these capabilities are packaged as an end-to-end system. Thus, implementation could be faster and more precise. The same can be said for analytics workloads. These workloads demand a high level of efficiency to enable customers to look for patterns in the data that help predict outcomes. Analytics workloads require the caching and fast processing of algorithms and data across multiple sources.
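As a rough illustration of the two techniques described above (heavy use of threads and caching of repetitive work), here is a small, generic Python sketch. It is not IBM’s implementation; the workload, function names, and cache size are assumptions made purely for illustration.

```python
# Generic illustration of threading plus caching for a transaction workload.
# Not IBM's implementation; the workload and names are hypothetical.
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache

@lru_cache(maxsize=1024)
def price_lookup(sku: str) -> float:
    # Stand-in for an expensive, frequently repeated data-source call;
    # lru_cache means repeated SKUs are served from memory instead of recomputed.
    return (hash(sku) % 1000) / 10.0

def process_transaction(order: dict) -> float:
    # One "transaction": total the order using the (cached) price lookups.
    return sum(price_lookup(sku) * qty for sku, qty in order["items"].items())

if __name__ == "__main__":
    orders = [{"items": {"A-100": 2, "B-200": 1}} for _ in range(1000)]
    # Many threads push transactions through concurrently to raise throughput.
    with ThreadPoolExecutor(max_workers=8) as pool:
        totals = list(pool.map(process_transaction, orders))
    print(f"processed {len(totals)} transactions")
```

The packaged systems described above aim to deliver this kind of tuning as part of the platform rather than leaving it to each application team.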

The bottom line is that IBM is looking at its hardware as an extension of the types of workloads it is required to support. Rather than considering hardware as a set of separate platforms, IBM is following a system of systems approach that is consistent with cloud computing. With this type of approach, IBM will continue on the path of viewing a system as a combination of the hardware platform, the systems software, and systems-based networking. These elements of computing are therefore configured based on the type of application and the nature of the current workload.

It is, in fact, workload optimization that is at the forefront of what is changing in hardware in the coming decade. This is true both in the data center and in the cloud. Cloud computing — and the hybrid environments that make up the future of computing — are predicated on predictable, scalable, and elastic workload management. It is the way we will start thinking about computing as a continuum of all of the component parts combined — hardware, software, services, networking, storage, collaboration, and applications. This reflects the dramatic changes that are just on the horizon.

Why are we about to move from cloud computing to industrial computing?

April 5, 2010

I spent the other week at a new conference called Cloud Connect. Being able to spend four days immersed in an industry discussion about cloud computing really allows you to step back and think about where we are with this emerging industry. While it would be possible to write endlessly about all the meetings and conversations I had, you probably wouldn’t have enough time to read all that. So, I’ll spare you and give you the top four things I learned at Cloud Connect. I recommend that you also take a look at Brenda Michelson’s blogs from the event for a lot more detail. I would also refer you to Joe McKendrick’s blog from the event.

1. Customers are still figuring out what cloud computing is all about. For those of us who spend way too many hours on the topic of cloud computing, it is easy to assume that everyone knows what it is all about. The reality is that most customers do not understand what cloud computing is. Marcia Kaufman and I conducted a full-day workshop called Introduction to Cloud. The more than 60 people who dedicated a full day to a discussion of all aspects of the cloud made it clear to us that they are still figuring out the difference between infrastructure as a service and platform as a service. They are still trying to understand the issues around security and what cloud computing will mean to their jobs.

2. There is a parallel universe out there among people who have been living and breathing cloud computing for the last few years. In their view the questions are very different. The big issues discussed among the well-connected were focused on a few key questions: Is there such a thing as a private cloud? Is Software as a Service really cloud computing? Will we ever have a true segmentation of the cloud computing market?

3. From the vantage point of the market, it is becoming clear that we are about to enter one of those transitional times in this important evolution of computing. Cloud Connect reminded me a lot of the early days of the commercial Unix market. When I attended my first Unix conference in the mid-1980s it was a different experience than going to a conference like Comdex. It was small. I could go and have a conversation with every vendor exhibiting. I had great meetings with true innovators. There was a spirit of change and innovation in the halls. I had the same feeling about the Cloud Connect conference. There were a small number of exhibitors. The key innovators driving the future of the market were there to discuss and debate the future. There was electricity in the air.

4. I also anticipate a change in the direction of cloud computing now that it is about to pass that tipping point. I am a student of history, so I look for patterns. When Unix reached the stage where the giants woke up and started seeing huge opportunity, they jumped in with a vengeance. The great but small Unix technology companies were either acquired, got big, or went out of business. I think that we are on the cusp of the same situation with cloud computing. IBM, HP, Microsoft, and a vast array of others have seen the future, and it is the cloud. This will mean that emerging companies with great technology will have to be both really lucky and really smart.

The bottom line is that Cloud Connect represented a seminal moment in cloud computing. There is plenty of fear among customers who are trying to figure out what it will mean to their own data centers. What will the organizational structure of the future look like? They don’t know and they are afraid. The innovative companies are looking at the coming armies of large vendors and are wondering how to keep their differentiation so that they can become the next Google rather than the next company whose name we can’t remember.

There was much debate about two important issues: cloud standards and private clouds. Are these issues related? Of course. Standards always become an issue when there is a power grab in a market. If a Google, Microsoft, Amazon, IBM, or an Oracle is able to set the terms for cloud computing, market control can shift overnight. Will standard interfaces be able to save the customer? And how about private clouds? Are they real? My observation and contention is that yes, private clouds are real. If you deploy the same automation, provisioning software, and workload management inside a company rather than inside a public cloud, it is still a cloud. Ironically, the debate over the private cloud is also about power and position in the market, not about ideology. If a company like Google, Amazon, or name whichever company is your favorite flavor… is able to debunk the private cloud — guess who gets all the money? If you are a large company where IT and the data center are core to how you conduct business — you can and should have a private cloud that you control and manage.
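To illustrate the private cloud point above: what makes an environment a cloud is the automation, provisioning, and workload-management layer, not who owns the data center. Here is a minimal, purely hypothetical Python sketch of that idea; the class and pool names are invented for illustration and do not correspond to any real provisioning product.

```python
# Hypothetical sketch: the same provisioning logic runs against an internal
# resource pool or a public provider; only the backing pool changes.

class ResourcePool:
    """A pool of machines, whether in a corporate data center or a public cloud."""
    def __init__(self, name: str, hosts: list[str]):
        self.name = name
        self.free_hosts = list(hosts)

    def allocate(self) -> str:
        # Hand out the next available host in this pool.
        return self.free_hosts.pop()

def provision(pool: ResourcePool, workload: str) -> str:
    # The automation layer is identical regardless of which pool it targets.
    host = pool.allocate()
    print(f"placing workload '{workload}' on {host} in pool '{pool.name}'")
    return host

if __name__ == "__main__":
    private_pool = ResourcePool("corporate-data-center", ["dc-host-01", "dc-host-02"])
    public_pool = ResourcePool("public-provider", ["vm-ab12", "vm-cd34"])
    provision(private_pool, "payroll-batch")   # same code path...
    provision(public_pool, "marketing-site")   # ...different pool
```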

So, after taking a step back I believe that we are witnessing the next generation of computing — the industrialization of computing. It might not be as much fun as the wild west that we are in the midst of right now but it is coming and should be here before we realize that it has happened.

Can Informatica earn a place at the head table?

February 22, 2010

Informatica might be thought of as the last independent data management company standing. In fact, that used to be Informatica’s main positioning in the market. That has begun to change over the last few years as Informatica has continued to make strategic acquisitions. Over the past two years Informatica has purchased five companies — the most recent was Siperian, a significant player in Master Data Management solutions. These acquisitions have paid off. Today Informatica has passed the $500 million revenue mark with about 4,000 customers. It has deepened its strategic partnerships with HP, Accenture, Salesforce.com, and MicroStrategy. In a nutshell, Informatica has made the transition from a focus on ETL (Extract, Transform, Load) tools to support data warehouses to a company focused broadly on managing information. Merv Adrian did a great job of providing context for Informatica’s strategy and acquisitions. To transition itself in the market, Informatica has set its sights on data service management — a combination of data integration, master data management, data transformation, and predictive analytics applied in a holistic manner across departments, divisions, and business partners.

In essence, Informatica is trying to position itself as a leading manager of data across its customers’ ecosystems. This requires a way to have consistent data definitions across silos (Master Data Management), ways to trust the integrity of that data (data cleansing), event processing, predictive analytics, integration tools to move and transform data, and the ability to prove that governance can be verified (data governance). Through its acquisitions, Informatica is working to put these pieces together. However, as a relatively small player living in a tough neighborhood (Oracle, IBM, SAS Institute, etc.), it will have a difficult journey. This is one of the reasons that Informatica is putting so much emphasis on its new partner marketplace. A partner network can really help a smaller player appear and act bigger.

This Marketplace will include all of Informatica’s products. It will enable developers to develop within Informatica’s development cloud and deploy either in the cloud or on premise. Like its new partner marketplace, the cloud offers another important opportunity for Informatica to compete. Informatica was an early partner with Salesforce.com. It has been offering complementary information management products that can be used as options with Salesforce.com. This has provided Informatica access to customers who might not have ever thought about Informatica in the past. In addition, it taught Informatica about the value of cloud computing as a platform for the future. Therefore, I expect that Informatica’s strong cloud-based offerings will help the company maintain its industry position. In addition, I expect that the company’s newly strengthened partnership with HP will be very important to the company’s growth.

What is Informatica’s roadmap? It intends to continue to deliver new releases every six months, including new data services and new data integration services. It will also develop these services with self-service interfaces. In the end, its goal is to be a great data steward to its customers. This is an admirable goal. Informatica has made very good acquisitions that support its strategic goals. It is making the right bets on cloud and on a partner ecosystem. The question that remains is whether Informatica can truly scale to the size where it can sustain the competitive threats. Companies like IBM, Oracle, Microsoft, SAP, and SAS Institute are not standing still. Each of these companies has built and will continue to expand its information management strategy and portfolio of offerings. If Informatica can break the mold on ease of implementation for complex data service management, it will have earned a place at the head table.

Why hardware still matters – at least for a couple of years

February 9, 2010

It is easy to assume that the excitement around cloud computing would put a damper on the hardware market. But I have news for you. I am predicting that over the next few years hardware will be front and center. Why would I make such a wild prediction? Here are my three reasons.

1. Hardware is front and center in almost all aspects of the computer industry. It is no wonder that Oracle wants to become a hardware company. Hardware is tangible. Its revenue hits the bottom line right away. Hardware can envelop software and keep customers pinned down for many, many years. New-generation platforms in the form of hardware appliances are a convenient delivery vehicle that helps the sales cycle. It is no wonder that Oracle wants a hardware platform. It completes the equation and allows Oracle to position itself as a fully integrated computing company. Likewise, IBM and HP are focused on building up their war chests of strong hardware platforms. If you believe that customers want to deal with one large brand, or two, then the winners want to control the entire computing ecosystem.

2. The cloud looms. Companies like Amazon.com and Google do not buy hardware from the big iron providers and never will. For economic reasons, these companies go directly to component providers and purchase custom-designed chips, boards, etc. This approach means that, for a very low price, these cloud providers can reduce their power consumption by making sure that the components are optimized for massively scaled clouds. These cloud vendors are focused on undercutting the opportunity and power of the big systems providers. Therefore, cloud providers care a lot about hardware — it is through optimization of the hardware that they can threaten the power equilibrium in the computer market.

3. The clash between cloud and on premise environments. It is clear that the computer marketplace is at a transition point. The cloud vendors are betting that, by optimizing everything, they can drive costs so low that they win. The large systems vendors are betting that their sophisticated systems combining hardware, software, and services will win because of their ability to better protect the integrity of the customer’s business. These vendors will all provide their own versions of the public and private cloud to ensure that they maintain power.

So, in my view there will be an incredible focus on hardware over the next two years. This will actually be good for customers because the level of sophistication and the cost/performance metrics will be impressive. This hardware renaissance will not last. In the long run, hardware will be commoditized. The end game will be interesting because of the cloud. It will not be a zero-sum game. No, the data center doesn’t go away. But the difference is that purpose-built hardware will be optimized for workloads to support the massively scaled environments that will be the heart of the future of computing. And then, it will be all about the software, the data, and the integration.


The DNA of the Cloud Power Partnerships

January 15, 2010

I have been thinking a lot about the new alliances forming around cloud computing over the past couple of months. The most important of these moves are the EMC, Cisco, and VMware alliance; HP and Microsoft’s announced collaboration; and, of course, Oracle’s planned acquisition of Sun. Now, let’s add IBM’s cloud strategy into the mix, which has a very different complexion from its competitors’. And, of course, my discussion of the cloud power struggle wouldn’t be complete without adding in the insurgents — Google and Amazon. While it is tempting to portray this power grab by all of the above as something brand new — it isn’t. It is a replay of well-worn patterns that we have seen in the computer industry for the past several decades. Yes, I am old enough to have been around for all of these power shifts. So, I’d like to point out what the DNA of this power struggle looks like for the cloud and how we might see history repeating itself in the coming year. Here is a sample of how high-profile partnerships have fared over the past few decades. While the past can never accurately predict the future, it does provide some interesting insights.

Partner realignment happens when the stakes change.  There was a time when Cisco was a very, very close partner with HP. In fact, I remember a time when HP got out of the customer service software market to collaborate with Cisco. That was back in 1997.

Here are the first couple of sentences from the press release:

SAN JOSE and PALO ALTO, Calif., Jan. 15, 1997 — Hewlett-Packard Company and Cisco Systems Inc. today announced an alliance to jointly develop Internet-ready networked-computing solutions to maximize the benefits of combining networking and computing. HP and Cisco will expand or begin collaboration in four areas: technology development, product integration, professional services and customer service and support.

If you are interested, here is a link to the full press release. What’s my point? These types of partnerships are in both HP’s and Cisco’s DNA. Both companies have made significant and broad-reaching partnerships. For example, back in 2004, IBM and Cisco created a broad partnership focused on the data center. Here’s an excerpt from a CRN article:

From the April 29, 2004 issue of CRN:
Cisco Systems (NSDQ:CSCO) and IBM (NYSE:IBM) on Thursday expanded their long-standing strategic alliance to take aim at the data center market. Solution providers said the new integrated data center solutions, which include a Cisco Gigabit Ethernet Layer 2 switch module for IBM’s eServer Blade Center, will help speed deployment times and ease management of on-demand technology environments.
“This is a big win for IBM,” said Chris Swahn, president of sales at Amherst Technologies, a solution provider in Merrimack, N.H.
The partnership propels IBM past rival Hewlett-Packard, which has not been as quick to integrate its own ProCurve network equipment into its autonomic computing strategy, Swahn said.
Cisco and IBM said they are bringing together their server, storage, networking and management products to provide an integrated data center automation platform.

Here is a link to the rest of the article.

HP itself has had a long history of very interesting partnerships. One of the most relevant is HP’s ill-fated partnership with BEA in the 1990s. At the time, HP invested $100 million in BEA to further the development of software to support HP’s software infrastructure and platform strategy.

HP Gives BEA $100m for Joint TP Development
Published: 08-April-1999
By Computergram

Hewlett-Packard Co and BEA Systems Inc yesterday said they plan to develop new transaction processing software as well as integrate a raft of HP software with BEA’s WebLogic application server, OLTP and e-commerce software. In giving the nod to WebLogic as its choice of application server, HP stopped far short of an outright acquisition of the recently-troubled middleware company, a piece of Wall Street tittle tattle which has been doing the round for several weeks now. HP has agreed to put BEA products through all of its distribution channels and is committing $100m for integration and joint development.

Here’s a link to an article about the deal.

Oracle probably has more partnerships and more entanglement with more companies than anyone else. For example, HP has a longstanding partnership with Oracle on the data management front. HP partnered closely with Oracle and optimized its hardware for the Oracle database. Today, Oracle and HP have more than 100,000 joint customers. Likewise, Oracle has a strong partnership with IBM — especially around its solutions business. IBM Global Services operates a huge consulting practice based on implementing and running Oracle’s solutions. Not to be outdone, EMC and Oracle have about 70,000 joint customers. Oracle supports EMC’s storage solutions for Oracle’s portfolio while EMC supports Oracle’s solutions portfolio.

Microsoft, like Oracle, has entanglements with most of the market leaders. Microsoft has partnered very closely with HP for the last couple of decades both on the PC front and on the software front. Clearly, the partnership between HP and Microsoft has evolved for many years so this latest partnership is a continuation of a long-standing relationship. Microsoft has long-standing relationships with EMC, Sun, and Oracle — to name a few.

And what about Amazon and Google? Because both companies were early innovators in cloud computing, they were able to gain credibility in a market that had not yet emerged as a center of power. Therefore, both companies were well positioned to create partnerships with every established vendor that needed to do something with the cloud. Every company from IBM to Oracle to EMC and Microsoft — to name but a few — established partnerships with these companies. Amazon and Google were small, convenient, and non-threatening. But as the power of both companies continues to grow, so will their ability to partner in the traditional way. I am reminded of the way IBM partnered with two small companies — Intel and Microsoft — when it needed a processor and an operating system to help bring the IBM PC to market in the early 1980s.

The bottom line is that cloud computing is becoming more than a passing fad — it is the future of how computing will change in the coming decades. Because of this reality, partnerships are changing and will continue to change. So, I suspect that the pronouncements of strategic, critical, and sustainable partnerships may or may not be worth the paper or compute cycles that created them. But the reality is that the power struggle for cloud dominance is on. It will not leave anything untouched. It will envelop hardware, software, networking, and services. No one can predict exactly what will happen, but the way these companies have acted in the past and the present gives us clues to a chaotic but somewhat predictable future.

Predictions for 2010: clouds, mergers, social networks and analytics

December 15, 2009

Yes, it is predictions time. Let me start by saying that no market change happens in a single year. Therefore, what is important is to look at the nuance of a market or a technology change in the context of its evolution. So, it is in this spirit that I will make a few predictions. I’ve decided to just list my top six predictions (I don’t like odd numbers). Next week I will add another five or six predictions.

  1. Cloud computing will move out of the fear, uncertainty and doubt phase to the reality phase for many customers. This means that large corporations will begin to move segments of their infrastructure and applications to the cloud. It will be a slow but steady movement. The biggest impact on the market is that customers will begin putting pressure on vendors to guarantee predictability, reliability, and portability.
  2. Service Management will become mainstream. Over the past five years the focus of service management has been on ITIL (Information Technology Infrastructure Library) processes and certification. There is a subtle change happening as corporations start to take a more holistic view of how to effectively manage everything that has a sensor, an actuator, or a computer interface. Cloud computing will have a major impact on the growing importance of service management.
  3. Cloud service providers will begin to drop their prices dramatically as competition intensifies. This will be one of the primary drivers of growth of the use of cloud services. It will put a lot of pressure on smaller niche cloud providers as the larger companies try to gain control of this emerging market.
  4. It is not a stretch to state that the pace of technology acquisitions will accelerate in 2010.  I expect that HP, IBM, Cisco, Oracle, Microsoft, Google, and CA will be extremely active. While it would be foolhardy to pick a single area, I’ll go out on a limb and suggest that security, data center management, service management, and information management will be the focus of many of the acquisitions.
  5. Social Networking will become much more mainstream than it was in 2009. Marketers will finally realize that blatant sales pitches on Twitter or Facebook just won’t cut it. We will begin to see marketers learn how to integrate social networking into the fabric of marketing programs. As this happens, there will be hundreds of new startups focused on analyzing the effectiveness of these marketing efforts.
  6. Information management is at the cusp of a major change. While the individual database remains important, the issue for customers is the need to manage information holistically so that they can anticipate change. As markets grow increasingly complex and competitive, the hottest products in 2010 will be those that help companies anticipate what will happen next. So expect anything with the term predictive analytics to be hot, hot, hot.