Archive for the ‘partner ecosystem’ Category

Can IBM Build a Strong Cloud Partner Ecosystem?

May 4, 2011

Despite all of the hand-wringing surrounding Amazon.com’s service outages last week, it is clear to me that cloud computing is dramatically changing the delivery models of computing forever. We simply will not return to a model where organizations assume they will consume primarily their own data center resources. The traditional data center certainly isn’t going away, but its role and its underlying technology will change forever. One ramification of this transition is the role that cloud infrastructure leaders will play in determining the direction of partnership models.

Traditionally, systems vendors have relied on partners to expand the coverage of their platforms. With the cloud, the requirement to have a strong partner ecosystem will not change. If anything, partners will be even more important in the cloud than they have been in traditional computing delivery models. This is because with cloud computing, the barriers to leveraging different cloud-based software offerings – Platform as a Service and Software as a Service – are very low. Any employee with a credit card can try out just about anything. I think the Amazon.com issues will be seen in the future as a tipping point for cloud computing. It will not, in fact, be the end of the cloud, but it will change the way companies select cloud partners. Service management, scalability, and reliability will become the selection standard – not just for the end customer but for partners as well.

So, I was thinking about the cloud partnership model and how it is evolving. I expect that the major systems vendors will be in a perfect position to begin to reassert their power in the era of the cloud. Therefore, I decided to take a look at how IBM is approaching its partnership model in light of cloud computing. Over the past several months, IBM has been revealing a new partnership model for the cloud computing market. It has been difficult for most platform vendors to get noticed above the noise of cloud pioneers like Amazon and Google, but this is starting to change. It is not hard to figure out why: IBM believes that cloud is a $181 billion business opportunity, and it would like to grab a chunk of that opportunity.

Having followed IBM’s partnering initiatives for several decades, I was not surprised to see a revamped cloud partnering program emerge this year. The new program is interesting for several different reasons. First, it is focused on bringing together all of IBM’s cloud offerings across software, developer relations, hardware, and services into a single program. This is important because it can be intimidating for an ISV, a Value Added Reseller, or a systems integrator to navigate the complexity of IBM’s offerings without some assistance. In addition, IBM has to contend with a new breed of partners that are focused on public, private, and hybrid cloud offerings.

The new program, called the Cloud Specialty program, is targeted to cover the entire cloud ecosystem, including cloud builders (hardware and software resellers and systems integrators), Service Solution Providers (software and service resellers), Infrastructure Providers (telecom providers, hosting companies, Managed Service Providers, and distributors), Application Providers (ISVs and systems integrators), and Technology Providers (tools providers and appliance vendors).

The focus of the Cloud Specialty program is no different from that of other partnering programs at IBM: expanding the skills of partners, building revenue for both IBM and its partners, and providing go-to-market programs to support those partners. IBM is the first to admit that the complexity of the company and its offerings can be intimidating for partners. Therefore, one of the objectives of the Cloud Specialty program is to clarify the requirements and benefits for partners. IBM is creating a tiered program based on the different types of cloud partners. The level of partner investment and benefits differs based on the value of each type of partner and the expectations of those partners. But there are some common offerings for all partners: all get early access to IBM’s cloud roadmap, use of the PartnerWorld Cloud Specialty Mark, confidential updates on IBM’s cloud strategy and roadmap, internal use of LotusLive, and networking opportunities. In addition, all these partners are entitled to up to $25,000 in business development funds. There are some differences, however. They include:

  • Cloud builders gain access to business leads and to IBM’s lab resources. In exchange, these partners are expected to have IBM Cloud Reference Architecture skills as well as cloud solution provider and technical certifications. They must also demonstrate the ability to generate revenue; revenue amounts vary based on the mix of hardware, software, and services that they resell. They must also have two verified cloud references from the previous calendar year.
  • Service Solution Providers are provided with a named relationship manager and access to networking opportunities. In exchange, partners are expected to use IBM cloud products or services, demonstrate knowledge and skills in the use of IBM cloud offerings, and be able to generate $300,000 in revenue from the partnership.
  • Infrastructure Providers are given access to a named IBM alliance manager and to business development workshops. In exchange, these partners are expected to use IBM’s cloud infrastructure products or services and demonstrate skills in IBM technology. Like Service Solution Providers, they must generate at least $300,000 a year in revenue and provide two cloud client references.
  • Application Providers are given access to a named IBM alliance manager and to business development workshops. They are expected to use IBM cloud products or services, have skills in these technologies or services, and generate a minimum of $100,000 a year in revenue, plus provide two cloud client references.
  • Technology Providers get access to networking opportunities and to IBM’s cloud and services assessment tools. In exchange, these partners are required to demonstrate knowledge of the IBM Cloud Reference Architecture and have skills related to IBM’s cloud services. Like Application Providers, these partners must generate at least $100,000 in IBM revenue and provide two client references.

What does IBM want? IBM’s goal with the Cloud Specialty program is to make it as attractive as possible for prospective partners to choose its platform. It is hoping that by offering financial and technical incentives it can make inroads with cloud-focused companies. For example, it is opening its labs and providing assistance to help partners define their offerings. IBM is also taking the unusual step of allowing partners to white-label its products. On the business development side, IBM is teaming with business partners on calls with prospective customers. IBM anticipates that the impact on these partners could be significant – potentially generating as much as 30% gross margin growth.

Will the effort work? It is indeed an ambitious program. IBM will have to do a good job of explaining its huge portfolio of offerings to prospective partners. For example, it has a range of services including CastIron for cloud integration, analytics services, collaboration services (based on LotusLive), middleware services, and Tivoli service management offerings. In addition, IBM is encouraging partners to leverage its extensive security services offerings. It is also trying to encourage partners to leverage its hardware systems. One example of how IBM is trying to be more attractive to cloud-based companies, such as Software as a Service vendors, is pricing: it is offering a subscription-based model so that partners can pay based on usage – the common model for most cloud platform vendors.

IBM is on the right track with this cloud-focused partner initiative. It is a sweeping program that provides a broad set of benefits for partners. It is pricing its services so that ISVs can rent a service (including IBM’s test and development cloud) by the month – an important issue in this emerging market. In return, it is expecting partners to make a major investment in learning IBM’s software, hardware, and services offerings, and to expand their knowledge of the markets they focus on.

Predictions for 2011: getting ready to compete in real time

December 1, 2010

2010 was a transition year for the tech sector. It was the year when cloud suddenly began to look realistic to the large companies that had scorned it. It was the year when social media suddenly became serious business. And it was the year when hardware and software were being united as a platform – something like in the old mainframe days, but different because of high-level interfaces and modularity. Important trends also started to emerge, like the importance of managing information across the enterprise and among partners and suppliers. Competition for ownership of the enterprise software ecosystem heated up, as did competition for leadership of the emerging cloud computing ecosystem.

So, what do I predict for this coming year? While at the outset it might look like 2011 will be a continuation of what has been happening this year, I think there will be some important changes that will impact the world of enterprise software for the rest of the decade.

First, I think it is going to be a very big year for acquisitions. Now I have said that before and I will say it again. The software market is consolidating around major players that need to fill out their software infrastructure in order to compete. It will come as no surprise if HP begins to purchase software companies if it intends to compete with IBM and Oracle on the software front.  But IBM, Oracle, SAP, and Microsoft will not sit still either.  All these companies will purchase the incremental technology companies they need to compete and expand their share of wallet with their customers.

This will be a transitional year for up-and-coming players like Google, Amazon, Netflix, Salesforce.com, and others that haven’t hit the radar yet. These companies are plotting their own strategies to gain leadership and will continue to push the boundaries in search of dominance. As they push upmarket and grab market share, they will face the familiar problem of supporting customers who will expect them to act like adults.

Customer support, in fact, will bubble to the top of the issues for emerging as well as established companies in the enterprise space – especially as cloud computing becomes a well-established distribution and delivery platform for computing. All of these companies, whether well established or startups, will have to balance the requirement to provide sophisticated customer support with the need to make a profit. This will impact everything from license and maintenance revenue to how companies charge for consulting and support services.

But what will customers be looking for in 2011? Customers are always looking to reduce their IT expenses – that is a given. However, the major change in 2011 will be the need to innovate based on customer-facing initiatives. Of course, the idea of focusing on customer-facing software isn’t new, but there are some subtle changes. The new initiatives are based on leveraging social networking, from a secure perspective, both to drive business traffic and to anticipate customer needs and issues before they become problems. Companies will spend money innovating on customer relationships.

Cloud computing is the other big issue for 2011. While it was clearly a major differentiator in 2010, the cloud will take an important leap forward in 2011. While companies were testing the water this year, next year companies will be looking at best practices in cloud computing. 2011 will be the year when customers focus on three key issues: data integration across public, private, and traditional data center environments; manageability, in terms of workload optimization; and security and overall performance. The vendors that can demonstrate that they can provide the right level of service across cloud-based services will win significant business. These vendors will increasingly focus on expanding their partner ecosystems as a way to lock customers in to their cloud platforms.

Most importantly, 2011 will be the year of analytics. The technology industry continues to produce data at a pace never seen before. But what can we do with this data? What does it mean for organizations’ ability to make better business decisions and to prepare for an unpredictable future? The traditional data warehouse is simply too slow to be effective. 2011 will be the year when predictive analytics, and information management overall, emerge as among the hottest and most important initiatives.

Now I know that we all like lists, so I will take what I’ve just said and put them into my top ten predictions:

1. Both today’s market leaders and upstarts are going to continue to acquire assets to become more competitive. Many emerging startups will be scooped up before they see the light of day. At the same time, almost as many startups will emerge as we saw in the dot-com era.

2. Hardware will continue to evolve in a new way. The market will move away from hardware as a commodity. The hardware platform in 2011 will be differentiated based on software and packaging. 2011 will be the year of smart hardware packaged with enterprise software, often as appliances.

3. Cloud computing models will put extreme pressure on everything from software license and maintenance pricing to customer support. Integration between different cloud computing models will be front and center. The cloud model is moving out of risk-averse pilots into serious deployments. Best practices will emerge as a major issue for customers that see the cloud as a way to boost innovation and the rate of change.

4. Managing highly distributed services in a compliant and predictable manner will take center stage. Service management and service level agreements across cloud and on-premises environments will become a prerequisite for buyers.

5. Security software will be redefined based on the challenges of customer-facing initiatives and the need to more aggressively open the corporate environment to support a constantly morphing relationship with customers, partners, and suppliers.

6. The fear of lock-in will reach a fever pitch in 2011. SaaS vendors will increasingly add functionality to tighten their grip on customers. Traditional vendors will purchase more of the components needed to support the lifecycle needs of customers. How can everything be integrated from a business process and data integration standpoint and still allow for portability? Today, the answers are not there.

7. The definition of an application is changing. The traditional view that the packaged application is hermetically sealed is going away. More of the new packaged applications will be based on service orientation and best practices. These applications will be parameter-driven so that they can be changed in real time. And yes, Service Oriented Architecture (SOA) didn’t die after all.

8. Social networking grows up and becomes business social networking. These initiatives will be driven by line-of-business executives as a way to engage with customers and employees, gain insight into trends, and fix problems before they become widespread. Companies will leverage social networking to enhance agility and enable new business models.

9. Managing endpoints will be one of the key technology drivers in 2011. Smartphones, sensors, and tablet computers are redefining what computing means. They will drive the requirement for a new approach to role- and process-based security.

10. Data management and predictive analytics will explode based on both the need to understand traditional information and the need to manage data coming from new sales and communications channels.

The bottom line is that 2011 will be the year when the seeds planted over the last few years are ready to become the drivers of a new generation of innovation and business change. Put together everything from the flexibility of service orientation, business process management innovation, the widespread impact of social and collaborative networks, and the new delivery and deployment models of the cloud. Now apply the tools to harness these environments: service management, new security platforms, and analytics. From my view, innovative companies are grabbing the threads of technology and focusing on outcomes. 2011 is going to be an important transition year. The corporations that get this right and transform themselves so that they are ready to change on a dime can win – even if they are smaller than their competitors.

Cashing in on the cloud

June 15, 2010

I have been spending quite a bit of time these days at cloud computing events. Some of these events, like the Cloud Camps, are wonderful opportunities for customers, vendors, consultants, and interested parties to exchange ideas in a very interactive format. If you haven’t been to one, I strongly recommend them. Dave Nielsen, one of the founders of the Cloud Camp concept, has done a great job not just jump-starting these events but participating in most of them around the world. In addition, Marcia Kaufman and I have been conducting a number of half- and full-day Introduction to Cloud Computing seminars in different cities. The most interesting observation, from my view, is that customers are no longer sitting on the sidelines with their arms crossed. Customers are ready and eager to jump into this new computing paradigm. Often they are urged on by business leaders who instinctively see the value in turning computing into a scalable utility. So, for the first time, there is a clear sense that there may well be money to be made.

While a lot of the focus lately has been on software developers, it is interesting to look at the channel as a huge opportunity to bring the cloud to a broader set of business customers. I recently helped to run a couple of workshops with Sandy Carter, vice president of Software Group Channels for IBM. Channel partners and distributors will be an increasingly important part of the cloud ecosystem. These companies typically have the organization and the ability to reach into specialized customer markets with solutions. These workshops were very interesting for a couple of reasons. First, many distributors and channel partners are looking for guidance and direction about what the cloud is and what it means for their businesses. Second, once these partners understand what resources are available to them, they are in an excellent position to become a conduit for change. The two workshops, which IBM aptly named “Cool Cloud Cash,” brought cloud computing into sharp focus for these partners. These are savvy business leaders. Once they understand how they can leverage cloud computing software, hardware, and services, they start to see dollar signs. In a sense, the channel is the most important avenue for bringing cloud computing to the rest of the market – not just the early adopters. IBM has a renewed focus on channel partners and is particularly focused on expanding its cloud partner ecosystem. One important aspect is new certifications in cloud computing. Given that this is an immature market, it is important that distributors and channel partners be able to demonstrate to their customers that they have deep knowledge. It is especially important that platform vendors like IBM work closely with partners, since partners both sell their offerings and represent them in the market.

Can Informatica earn a place at the head table?

February 22, 2010

Informatica might be thought of as the last independent data management company standing. In fact, that used to be Informatica’s main positioning in the market. That has begun to change over the last few years as Informatica has continued to make strategic acquisitions. Over the past two years Informatica has purchased five companies – the most recent was Siperian, a significant player in Master Data Management solutions. These acquisitions have paid off. Today Informatica has passed the $500 million revenue mark and has about 4,000 customers. It has deepened its strategic partnerships with HP, Accenture, Salesforce.com, and MicroStrategy. In a nutshell, Informatica has made the transition from a focus on ETL (Extract, Transform, Load) tools that support data warehouses to a company focused broadly on managing information. Merv Adrian did a great job of providing context for Informatica’s strategy and acquisitions. To transition itself in the market, Informatica has set its sights on data service management – a culmination of data integration, master data management, data transformation, and predictive analytics, delivered in a holistic manner across departments, divisions, and business partners.

In essence, Informatica is trying to position itself as a leading manager of data across its customers’ ecosystems. This requires a way to have consistent data definitions across silos (Master Data Management), ways to trust the integrity of that data (data cleansing), event processing, predictive analytics, integration tools to move and transform data, and the ability to prove that governance can be verified (data governance). Through its acquisitions, Informatica is working to put these pieces together. However, as a relatively small player living in a tough neighborhood (Oracle, IBM, SAS Institute, etc.), it will be a difficult journey. This is one of the reasons that Informatica is putting so much emphasis on its new partner marketplace. A partner network can really help a smaller player appear and act bigger.

This marketplace will include all of Informatica’s products. It will enable developers to build within Informatica’s development cloud and deploy either in the cloud or on premises. Like its new partner marketplace, the cloud offers another important opportunity for Informatica to compete. Informatica was an early partner of Salesforce.com, offering complementary information management products that can be used as options with Salesforce.com. This has given Informatica access to customers who might never have thought about Informatica in the past. In addition, it taught Informatica about the value of cloud computing as a platform for the future. Therefore, I expect that Informatica’s strong cloud-based offerings will help the company maintain its industry position. In addition, I expect that the company’s newly strengthened partnership with HP will be very important to its growth.

What is Informatica’s roadmap? It intends to continue to deliver new releases every six months, including new data services and new data integration services, and to build these services with self-service interfaces. In the end, its goal is to be a great data steward for its customers. This is an admirable goal. Informatica has made very good acquisitions that support its strategic goals. It is making the right bets on the cloud and on a partner ecosystem. The question that remains is whether Informatica can truly scale to the size where it can sustain the competitive threats. Companies like IBM, Oracle, Microsoft, SAP, and SAS Institute are not standing still. Each of these companies has built, and will continue to expand, its information management strategy and portfolio of offerings. If Informatica can break the mold on ease of implementation for complex data service management, it will have earned a place at the head table.

Oracle + Sun: Five questions to ponder

January 27, 2010

I spent a couple of hours today listening to Oracle talk about the long-awaited integration with Sun Microsystems. A real end of an era and the beginning of a new one. What does this mean for Oracle? Whatever you might think about Oracle, you have to give the company credit for successfully integrating the 60 companies it has purchased over the past few years. Having watched hundreds and perhaps thousands of acquisitions over the last few decades, it is clear that integration is hard. There are overlapping technologies, teams, cultures, and egos. Oracle has successfully managed to leverage the IP from its acquisitions to support its business goals. For example, it has kept packaged software customers happy by improving the software. PeopleSoft customers, for example, were able to continue to use the software they had become dependent on in primarily the same way as before the acquisition. In some cases, the quality of the software actually improved dramatically. The path has been more complicated with the various middleware and infrastructure platforms the company has acquired over the years because of overlapping functionality.

The acquisition of Sun Microsystems is the biggest game changer for Oracle since the acquisition of PeopleSoft. There is little doubt that Sun has significant software and hardware IP that will be very important in defining Oracle in the 21st century. But I don’t expect this to be a simple journey. Here are the five key issues that I think will be tricky for Oracle to navigate. Obviously, this is not a complete list but it is a start.

Issue One: Can Oracle recreate the mainframe world? The mainframe is dead — long live the mainframe. Oracle has a new fondness for the mainframe and what that model could represent. So, if you combine Sun’s hardware, networking layer, storage, security, packaged applications, and middleware into a package, do you get to own the total share of a customer’s wallet? That is the idea. Oracle management has determined that IBM had the right ideas in the 1960s — everything was nicely integrated, and the customer never had to worry about the pieces working together.
Issue Two: Can you package everything together and still be an open platform? To its credit, Oracle has built its software on standards such as Unix/Linux, XML, Java, etc. So, can you have it both ways? Can you claim openness when the platform itself is hermetically sealed? I think it may be a stretch. In order to accomplish this goal, Oracle would have to have well-defined and published APIs. It would have to be able to certify that with these APIs the integrated platform won’t be broken. Not an easy task.
Issue Three: Can you manage a complex computing environment? Computing environments get complicated because there are so many moving parts. There are configurations that change; software gets patched; new operating system versions are introduced; emerging technology enters and messes up the well-established environment. Oracle would like to automate the management of this environment for customers. It is an appealing idea, since configuration problems, missing links, and poor testing are often responsible for many of the outages in computing environments today. Will customers be willing to have this type of integrated environment controlled and managed by a single vendor? Some customers will be happy to turn over these headaches. Others may have too much legacy or want to work with a variety of vendors. This is not a new dilemma for customers. Customers have long had to weigh the benefits of a single source of technology against the risks of being locked in.
Issue Four: Can you teach an old dog new tricks? Can Oracle really be a hardware vendor? Clearly, Sun continues to be a leader in hardware despite its diminished fortunes. But as anyone who has ventured into the hardware world knows, hardware is a tough, brutal game. In fact, it is the inverse of software. Software takes many cycles to reach maturity. It needs to be tweaked and finessed. However, once it is in place it has a long, long life. As the old saying goes, old software never dies. The same cannot be said for hardware. Hardware has a much straighter line to maturity. It is developed, designed, and delivered to the market. Sometimes it leapfrogs the competition enough that it has a long and very profitable life. Other times, it hits the market at the end of a cycle, just as a new, more innovative player enters. The payoff from all the work and effort can be short-lived when something new comes along at the right place at the right time. It is often a lot easier to get rid of hardware than software. The computer industry is littered with the corpses of failed hardware platforms that started with great fanfare and then faded away quickly. Will Oracle be successful with hardware? It will depend on how good the company really is at transforming its DNA.
Issue Five: Are customers ready to embrace Oracle’s brave new world? Oracle’s strategy is a good one — if you are Oracle. But what about for customers? And what about for partners? Customers need to understand the long-term implications and tradeoffs of buying into Oracle’s integrated approach to its platform. It will clearly mean fewer moving parts to worry about. It will mean one phone call and no finger pointing. However, customers have to understand the type of leverage that a single company will have in terms of contract terms and conditions. And what about partners? How does an independent software vendor or a channel partner participate within the new Oracle? Is there room? What type of testing and preparation will be required to play?

The DNA of the Cloud Power Partnerships

January 15, 2010

I have been thinking a lot about the new alliances forming around cloud computing over the past couple of months. The most important of these moves are the EMC, Cisco, and VMware alliance; HP and Microsoft’s announced collaboration; and, of course, Oracle’s planned acquisition of Sun. Now, let’s add IBM’s cloud strategy into the mix, which has a very different complexion from its competitors’. And my discussion of the cloud power struggle wouldn’t be complete without adding the insurgents — Google and Amazon. While it is tempting to portray this power grab as something brand new — it isn’t. It is a replay of well-worn patterns that we have seen in the computer industry for the past several decades. Yes, I am old enough to have been around for all of these power shifts. So, I’d like to point out what the DNA of this power struggle looks like for the cloud and how we might see history repeating itself in the coming year. Here is a sample of how high-profile partnerships have fared over the past few decades. While the past can never accurately predict the future, it does provide some interesting insights.

Partner realignment happens when the stakes change. There was a time when Cisco was a very, very close partner with HP. In fact, I remember a time when HP got out of the customer service software market to collaborate with Cisco. That was back in 1997.

Here are the first couple of sentences from the press release:

SAN JOSE and PALO ALTO, Calif., Jan. 15, 1997 — Hewlett-Packard Company and Cisco Systems Inc. today announced an alliance to jointly develop Internet-ready networked-computing solutions to maximize the benefits of combining networking and computing. HP and Cisco will expand or begin collaboration in four areas: technology development, product integration, professional services and customer service and support.

If you are interested, here is a link to the full press release. What’s my point? These types of partnerships are in both HP’s and Cisco’s DNA. Both companies have made significant and broad-reaching partnerships. For example, back in 2004, IBM and Cisco created a broad partnership focused on the data center. Here’s an excerpt from a CRN article:

From the April 29, 2004 issue of CRN Cisco Systems (NSDQ:CSCO) and IBM (NYSE:IBM) on Thursday expanded their long-standing strategic alliance to take aim at the data center market. Solution providers said the new integrated data center solutions, which include a Cisco Gigabit Ethernet Layer 2 switch module for IBM’s eServer Blade Center, will help speed deployment times and ease management of on-demand technology environments.
“This is a big win for IBM,” said Chris Swahn, president of sales at Amherst Technologies, a solution provider in Merrimack, N.H.
The partnership propels IBM past rival Hewlett-Packard, which has not been as quick to integrate its own ProCurve network equipment into its autonomic computing strategy, Swahn said.
Cisco and IBM said they are bringing together their server, storage, networking and management products to provide an integrated data center automation platform.

Here is a link to the rest of the article.

HP itself has had a long history of very interesting partnerships. A few that are most relevant include HP’s ill-fated partnership with BEA in the 1990s. At the time, HP invested $100 million in BEA to further the development of software to support HP’s software infrastructure and platform strategy.

HP Gives BEA $100m for Joint TP Development
Published:08-April-1999
By Computergram

Hewlett-Packard Co and BEA Systems Inc yesterday said they plan to develop new transaction processing software as well as integrate a raft of HP software with BEA’s WebLogic application server, OLTP and e-commerce software. In giving the nod to WebLogic as its choice of application server, HP stopped far short of an outright acquisition of the recently-troubled middleware company, a piece of Wall Street tittle tattle which has been doing the round for several weeks now. HP has agreed to put BEA products through all of its distribution channels and is committing $100m for integration and joint development.

Here’s a link to an article about the deal.

Oracle probably has more partnerships and entanglements with more companies than anyone else. For example, HP has a longstanding partnership with Oracle on the data management front: HP partnered closely with Oracle and optimized its hardware for the Oracle database, and today the two companies have more than 100,000 joint customers. Likewise, Oracle has a strong partnership with IBM — especially around its solutions business. IBM Global Services operates a huge consulting practice based on implementing and running Oracle’s solutions. Not to be outdone, EMC and Oracle have about 70,000 joint customers. Oracle supports EMC’s storage solutions for Oracle’s portfolio, while EMC supports Oracle’s solutions portfolio.

Microsoft, like Oracle, has entanglements with most of the market leaders. Microsoft has partnered very closely with HP for the last couple of decades both on the PC front and on the software front. Clearly, the partnership between HP and Microsoft has evolved for many years so this latest partnership is a continuation of a long-standing relationship. Microsoft has long-standing relationships with EMC, Sun, and Oracle — to name a few.

And what about Amazon and Google? Because both companies were early innovators in cloud computing, they were able to gain credibility in a market that had not yet emerged as a center of power. Therefore, both companies were well positioned to create partnerships with every established vendor that needed to do something with the cloud. Every company from IBM to Oracle to EMC and Microsoft — to name but a few — established partnerships with them. Amazon and Google were small, convenient, and non-threatening. But as the power of both companies continues to grow, so will their ability to partner in the traditional way. I am reminded of the way IBM partnered with two small companies — Intel and Microsoft — when it needed a processor and an operating system to bring the IBM PC to market in the early 1980s.

The bottom line is that cloud computing is becoming more than a passing fad — it is the future of how computing will change in the coming decades. Because of this reality, partnerships are changing and will continue to change. So, I suspect that the pronouncements of strategic, critical, and sustainable partnerships may or may not be worth the paper or compute cycles that created them. But the reality is that the power struggle for cloud dominance is on. It will not leave anything untouched. It will envelop hardware, software, networking, and services. No one can predict exactly what will happen, but the way these companies have acted in the past and the present gives us clues to a chaotic yet oddly predictable future.

Why did IBM buy Lombardi?

December 16, 2009 Leave a comment

Just as I was about to start figuring out my next six predictions for 2010, I had to stop the presses and focus on IBM’s latest acquisition. IBM announced this morning that it has purchased Lombardi, which focuses on business process management software. Lombardi is one of the independent leaders in the market as well as a strong IBM business partner. The obvious question is why IBM would need yet another business process management platform. After all, IBM has a large portfolio of business process management software — some homegrown and some from various acquisitions such as FileNet, ILOG, and Webify. I think the answer is actually quite straightforward. Lombardi’s offerings are used extensively in business units, by business management, to codify the complex processes that are at the heart of how businesses differentiate themselves. Clearly, IBM has recognized the importance of Lombardi to its customers, since it has had a long-standing partnership with the company. I think there are two reasons this acquisition is significant beyond the need to provide direct support for business management: the ability to use Lombardi’s technology to sell more WebSphere offerings, and the connection of business process to IBM’s Smarter Planet initiative.

Selling more WebSphere products. There is no question that the WebSphere brand within IBM’s Software business unit includes a lot of products, such as its registry/repository, application integration, security, and various middleware offerings. IBM likes to sell its products by focusing on entry points — the immediate problem the customer is trying to solve. Lombardi gives IBM direct access to business buyers who start with business process management and may then see the value of adding new capabilities to that platform.

Supporting the Smarter Planet strategy. Business transformation often starts by reconstructing processes. IBM’s Smarter Planet strategy is based on the premise that customers want to transform their businesses using sophisticated technology. Therefore, it is important to look at how business innovation can be supported by IBM’s huge hardware, software, and services portfolio. The fact that Lombardi’s technology is the starting point for business units looking at transformational process changes is an important marker in IBM’s evolution as a company.

Is there a Twitter sneak attack in our future?

November 4, 2009 Leave a comment

Last year I wrote a post about what I called the Google Sneak attack. If you don’t feel like reading that post, I’ll make it simple for you. Google comes to market as a benign helpful little search engine that threatened no one. Fast forward a decade and Google now pulls in more ad revenue than most of the television networks combined. It has attacked Microsoft’s office franchise, is playing a key role in the cloud via Platform as a Service (Google AppEngine), not to mention the importance of its entry into the book business and who knows what else.  But let’s turn our attention to Twitter.  I’ve been using Twitter since 2007. For the first several months I couldn’t quite figure out what this was all about. It was confusing and intriguing at the same time.  In fact, my first blog about Twitter suggested that the Emperor has no clothes.

So fast forward to the end of 2009 and several very interesting things are happening:

1. Twitter is becoming as much a part of the cultural and technical fabric as Google did just a few years ago

2. A partner ecosystem has grown up around Twitter. A post from February by Matt Ingram of Gigaom echoes this point.

3. Individuals, large corporations, and small businesses alike are using Twitter as everything from the neighborhood water cooler to a sales channel.

What does this mean? Despite detractors who wonder what you can possibly accomplish in 140 characters, it is becoming clear that this company without a published business plan does have a plan to dominate. It is, in fact, the same strategy Google had. Which company would have felt threatened by a small search company? And who could feel threatened by a strange little company called Twitter that asked people to say it all in 140 characters? Today Twitter claims to have 18 million users, about 4% of adult internet users. I suspect that we will begin to see a slow but well-orchestrated rollout of services that leverage the Twitter platform: a combination of advertising plus commercial software aimed at helping companies reach new customers in new channels.

I am confident that within the next two years this small, profitless, patient company will roll out a plan targeting social networking world dominance. It will be fun to watch.

Is application portability possible in the cloud?

October 8, 2009 1 comment

Companies are trying to get a handle on the costs involved in running data centers; in fact, this is one of the primary reasons they are looking to cloud computing to make the headaches go away. But like everything else in the complex world of computing, clouds solve some problems while causing the same type of lock-in problems that our industry has experienced for decades.

I wanted to add a little perspective before I launch into my thoughts about portability in the cloud. So, I was thinking about traditional data centers and how their performance has long been hampered by their lack of homogeneity. The typical data center is filled with a warehouse of different hardware platforms, operating systems, applications, and networks – to name but a few. You might want to think of them as archeological digs – tracing the history of the computer industry. To protect their turf, vendors each came up with their own platforms, proprietary operating systems, and specialized applications that would only work on a single platform.

In addition to the complexities involved in managing this type of environment, the applications that run in these data centers are also trapped.   In fact, one of the main reasons that large IT organizations ended up with so many different hardware platforms running a myriad of different operating systems was because applications were tightly intertwined with the operating system and the underlying hardware.

As we begin to move towards the industrialization of software, there has been an effort to separate the components of computing so that application code is separate from the underlying operating system and the hardware. This has been the allure of both service oriented architectures and virtualization.  Service orientation has enabled companies to create clean web services interfaces and to create business services that can be reused for a lot of different situations.  SOA has taught us the business benefits that can be gained from encapsulating existing code so that it is isolated from other application code, operating systems and hardware.

Server virtualization takes the existing “clean” interface between the hardware and the software and separates the two. One benefit that has fueled rapid adoption and market growth is that no rewriting of software is required: applications still see the same x86 instruction set. As server virtualization moves into the data center, companies can dramatically consolidate a massive number of underutilized machines onto far fewer machines that are used much more efficiently. The resulting cost savings include reductions in physical boxes, power, cooling, heating, maintenance, and overhead.
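To make the consolidation math concrete, here is a back-of-the-envelope sketch in Python. Every figure in it (200 servers, 10% average utilization, a 60% post-virtualization target) is an illustrative assumption of mine, not a measurement from any vendor:

```python
# Back-of-the-envelope server-consolidation estimate.
# All numbers are illustrative assumptions.
physical_servers = 200
avg_utilization = 0.10      # typical underutilized standalone box
target_utilization = 0.60   # conservative target for a virtualized host

# How many lightly loaded workloads fit on one virtualized host?
vms_per_host = round(target_utilization / avg_utilization)  # 6:1 ratio

# Ceiling division: hosts needed to absorb all the workloads.
hosts_needed = -(-physical_servers // vms_per_host)

print(f"{physical_servers} servers -> ~{hosts_needed} hosts "
      f"at a {vms_per_host}:1 consolidation ratio")
```

Under these assumptions, 200 boxes collapse to roughly 34 hosts, which is where the savings in power, cooling, and maintenance come from.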

Server virtualization has enabled users to create virtual images to recapture some efficiency in the data center.  And although it fixes the problem of operating systems bonded to hardware platforms, it does nothing to address the intertwining of applications and operating systems.

Why bring this issue up now? Don’t we have hypervisors that take care of separating operating systems from applications? Can’t companies simply spin up another virtual image and be done with it? I think the answer is no – especially given the projected growth of cloud environments.

I got to thinking about this issue after a fascinating conversation with Greg O’Connor, CEO of AppZero. AppZero’s value proposition is quite interesting. In essence, AppZero provides an environment that separates the application from the underlying operating system, effectively moving up to the next level of the stack.

The company’s focus is particularly on the Windows operating system, and for good reason. Unlike Linux or z/OS, Windows does not allow applications to operate in partitions. Partitions effectively isolate applications from one another so that if something bad happens to one application, it cannot affect another. Because it is not possible to separate or isolate applications in a Windows server environment, when something goes wrong with one application, it can hurt the rest of the system and the other applications running on it.

In addition, when an application is installed on Windows, DLLs (Dynamic Link Libraries) are often loaded into the operating system. DLLs are shared across applications, and installing a new application can overwrite the current DLL of another application. As you can imagine, this conflict can have really bad side effects.

Even when applications are installed on different servers – physical or virtual – installing software on Windows is complicated. Applications create registry entries, modify the registry entries of shared DLLs, and copy new DLLs over shared libraries. This arrangement works fine unless you want to move an application to another environment. Movement requires a lot of work for the organization making the transition to another platform. It is especially complicated for independent software vendors (ISVs) that need to be able to move their applications to whichever platform their customers prefer.
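The shared-DLL conflict described above can be sketched in a few lines. This is a toy model, not Windows itself; the application and library names are made up:

```python
# Toy model of "DLL hell": two installers write to one shared library
# table, and the second install silently replaces the DLL version that
# the first application was built against. All names are hypothetical.
shared_dlls = {}

def install(app, dll_name, dll_version):
    # A careless installer overwrites whatever version is already there.
    shared_dlls[dll_name] = dll_version
    print(f"{app} installed {dll_name} v{dll_version}")

install("AppA", "report_engine.dll", "1.0")  # AppA expects v1.0
install("AppB", "report_engine.dll", "2.0")  # AppB clobbers it with v2.0

# AppA now loads v2.0 at runtime -- the conflict described above.
assert shared_dlls["report_engine.dll"] == "2.0"
```

On a real Windows server, the equivalent overwrite happens in a shared system directory and in the registry, which is why moving or reinstalling applications is so error-prone.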

The problem gets even more complex when you start looking at issues related to Platform as a Service (PaaS). With a PaaS platform, a customer is using a cloud service that includes everything from the operating system to application development tools and a testing environment. Many PaaS vendors have created their own languages for linking components together. While there are benefits to a well-architected development and deployment cloud platform, there is a huge danger of lock-in. Now, most of the PaaS vendors I have been talking to promise that they will make it easy for customers to move from one cloud environment to another. Of course, I always believe everything a vendor tells me (that is meant as a joke, to lighten the mood), but I think customers have to be wary about these claims of interoperability.

That is why I was intrigued with AppZero’s approach. Because the company decouples the operating system from the application code, it provides portability of a pre-installed application from one environment to the next. The company positions its approach as a virtual application appliance. In essence, this software is designed as a layer that sits between the operating system and the application. This layer intercepts file I/O and shared memory I/O, as well as specific DLLs, and keeps them in separate “containers” that are isolated from the application code.

Therefore, the application does not change any of the files or registry entries on a Windows server. In this way, a company can run a single instance of the Windows Server operating system. In essence, the approach isolates the applications and their specific dependencies and configurations from the operating system, so it requires fewer operating system instances to manage a Microsoft Windows Server-based data center.
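The layering idea – intercepting an application’s file access and redirecting it into a private, per-application area – can be illustrated with a minimal sketch. This is my own toy illustration of the concept, not AppZero’s implementation; the class name and paths are invented:

```python
# Minimal sketch of the interposition concept: route each application's
# file access into a private per-app directory so two apps never clash
# over a shared file. Purely illustrative; not AppZero's actual design.
import os
import tempfile

class AppContainer:
    def __init__(self, app_name, root=None):
        root = root or os.path.join(tempfile.gettempdir(), "containers")
        self.base = os.path.join(root, app_name)
        os.makedirs(self.base, exist_ok=True)

    def open(self, path, mode="r"):
        # Intercept the path and redirect it into this app's private
        # tree, flattening separators so nothing escapes the container.
        safe = path.lstrip("/\\").replace("/", "_").replace("\\", "_")
        return open(os.path.join(self.base, safe), mode)

app_a, app_b = AppContainer("app_a"), AppContainer("app_b")
with app_a.open("config.ini", "w") as f:
    f.write("color=blue")
with app_b.open("config.ini", "w") as f:
    f.write("color=red")   # same filename, but a different physical file
with app_a.open("config.ini") as f:
    print(f.read())        # app_a still sees its own settings: color=blue
```

The real product works at the level of file I/O, shared memory, and DLL resolution rather than simple path rewriting, but the isolation principle is the same.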

AppZero enables the user to load an application from the network rather than from the local disk. It therefore should simplify the job of data center operations management by enabling a single application image to be provisioned to multiple environments, and by making it easier to keep track of changes within a Windows environment because the application isn’t tied to a particular OS. AppZero has found a niche selling its offerings to ISVs that want to move their software across different platforms without having people install the application. By pre-installing the application in a virtual application appliance, the ISV can eliminate many of the errors that occur when a customer installs the application in their own environment. An application delivered in a virtual application appliance container greatly reduces the variability of components that might affect it compared with a traditional installation process. In addition, the company has established partnerships with both Amazon and GoGrid.

So, what does this have to do with portability and the cloud? It seems to me that this approach of separating layers of software so that interdependencies do not interfere with portability is one of the key ingredients in software portability in the cloud. Clearly, it isn’t the only issue to be solved. There are issues such as standard interfaces, standards for security, and the like. But I expect that many of these problems will be solved by a combination of lessons learned from existing standards from the Internet, web services, Service Orientation, systems and network management. We’ll be ok, as long as we don’t try to reinvent everything that has already been invented.

Does IT see the writing on the cloud wall?

April 15, 2009 5 comments

For the last six months or so I have been researching cloud computing. More recently, our team has started writing our next Dummies Book on Cloud Computing. Typically when we start a book we talk to everyone in the ecosystem — vendors big and small and lots of customers.  For example, when we started working on SOA for Dummies almost three years ago we found a lot of customers who could talk about their early experience. Not all of these companies had done things right. They had made lots of mistakes and started over. Many of them didn’t necessarily want their mistakes put into a book but they were willing to talk and share.  As I have mentioned in earlier writings, when we wrote the second edition of SOA for Dummies we had a huge number of customers that we could talk to. A lot of them have made tremendous progress in transforming not just their IT organization but the business as well.

We had a similar experience with Service Management for Dummies, which comes out in June. Customers were eager to explain what they had learned about managing their increasingly complex computing and business infrastructures. But something interesting is happening with the Cloud book. The experience feels very different, and I think this is significant.

Our team has been talking to a lot of the vendors — big and small — about their products and strategies around the cloud. Some of these vendors are focused on some really important problems. Others are simply tacking the word cloud in front of their offerings, hoping to get swept up in the excitement. But there is something missing. I think there are two things: a lack of clarity about what a cloud really is, and a lack of clarity about what its component parts are. Is it simply Software as a Service? Is it an outsourced infrastructure? Is it storage capacity to supplement existing data centers? Is it a management platform that supports Software as a Service? Does cloud require a massive ecosystem of partners? Is it a data center with APIs? I am not going to answer these questions now (I’ll leave some of them to future writings).

What I wanted to talk about is what I see happening with customers. I see customers being both confused and very wary. In fact, the other day I tried to set up a call with a senior executive from a large financial services company that I have spoken to about other emerging areas. This company always likes to be on the forefront of important technology trends. To my surprise, the executive was not willing to talk about clouds at all. Other customers are putting their toes in the cloud (pun intended) by using some extra compute cycles from Amazon or by using Software as a Service offerings like Salesforce.com. Some customers are looking to implement a cloud-like capability within their own data centers. Could it be that they are afraid that if they don’t offer something like Amazon’s EC2 cloud they will be put out of business? Just as likely, they are worried about the security of their intellectual property and their data.

I predict that the data center is about to go through a radical transformation that will forever change the landscape of corporate computing. Companies have long recognized that data centers are very inefficient. They have tried clustering servers and virtualizing their servers with some level of success. But the reality is that, in time, there will be a systematic approach to scalable computing based on the cloud. It will not be a simple outsourced data center, because of the transition to a new generation of software that is component-based and service oriented. There is a new generation of service management technologies that makes the management of highly distributed environments much more seamless. The combination of service orientation, service management, and cloud will be the future of computing.

The bottom line is that while the vendor community sees dollar signs in this emerging cloud-based world, the customers are afraid. The data center management team does not understand what this will mean for their future. If everything is tucked away in a cloud, what is my job? Will we still have a data center? I suspect that it will not be that simple. At some point down the line we will actually move to utility computing, where computing assets will be based on a consistent set of standards so that customers can mix and match the services they need in real time. We clearly are not there yet. Today, many data center activities either cannot or will not be put into a cloud, and internal politics will keep the trend toward clouds moving slowly.