
Posts Tagged ‘IBM’

Oracle + Sun: Five questions to ponder

January 27, 2010

I spent a couple of hours today listening to Oracle talk about the long-awaited integration with Sun Microsystems. A real end of an era and the beginning of a new one. What does this mean for Oracle? Whatever you might think about the company, you have to give Oracle credit for successfully integrating the 60 companies it has purchased over the past few years. Having watched hundreds and perhaps thousands of acquisitions over the last few decades, I can say that integration is hard. There are overlapping technologies, teams, cultures, and egos. Oracle has successfully managed to leverage the IP from its acquisitions to support its business goals. For example, it has kept packaged software customers happy by improving the software. PeopleSoft customers were able to continue to use the software they had become dependent on in much the same way as before the acquisition. In some cases, the quality of the software actually improved dramatically. The path has been more complicated with the various middleware and infrastructure platforms the company has acquired over the years because of overlapping functionality.

The acquisition of Sun Microsystems is the biggest game changer for Oracle since the acquisition of PeopleSoft. There is little doubt that Sun has significant software and hardware IP that will be very important in defining Oracle in the 21st century. But I don't expect this to be a simple journey. Here are five key issues that I think will be tricky for Oracle to navigate. Obviously, this is not a complete list, but it is a start.

Issue One: Can Oracle recreate the mainframe world? The mainframe is dead — long live the mainframe. Oracle has a new fondness for the mainframe and what that model could represent. So, if you combine Sun's hardware, networking layer, storage, security, packaged applications, and middleware into a single package, do you get to own the total share of a customer's wallet? That is the idea. Oracle management has determined that IBM had the right idea in the 1960s: everything was nicely integrated, and the customer never had to worry about the pieces working together.
Issue Two: Can you package everything together and still be an open platform? To its credit, Oracle has built its software on standards such as Unix/Linux, XML, and Java. So, can you have it both ways? Can you claim openness when the platform itself is hermetically sealed? I think that may be a stretch. To accomplish this goal, Oracle would have to publish well-defined APIs and be able to certify that using those APIs will not break the integrated platform. Not an easy task.
Issue Three: Can you manage a complex computing environment? Computing environments get complicated because there are so many moving parts. Configurations change; software gets patched; new operating system versions are introduced; emerging technology enters and disrupts the well-established environment. Oracle would like to automate the management of all of this for customers (a rough sketch of what such automation might look like follows this list of issues). It is an appealing idea, since configuration problems, missing links, and poor testing are responsible for many of the outages in computing environments today. Will customers be willing to have this type of integrated environment controlled and managed by a single vendor? Some customers will be happy to turn over these headaches. Others may have too much legacy or want to work with a variety of vendors. This is not a new dilemma: customers have long had to weigh the benefits of a single source of technology against the risks of being locked in.
Issue Four: Can you teach an old dog new tricks? Can Oracle really be a hardware vendor? Clearly, Sun continues to be a leader in hardware despite its diminished fortunes. But as anyone who has ventured into the hardware world knows, hardware is a tough, brutal game. In fact, it is the inverse of software. Software takes many cycles to mature; it needs to be tweaked and finessed. However, once it is in place it has a long, long life. As the old saying goes, old software never dies. The same cannot be said for hardware, which has a much straighter line to maturity: it is designed, developed, and delivered to the market. Sometimes it leapfrogs the competition enough that it has a long and very profitable life. Other times, it hits the market at the end of a cycle, just as a new, more innovative player enters. The culmination of all that work and effort can be cut short when something new comes along at the right place and the right time. It is often a lot easier to get rid of hardware than software. The computer industry is littered with the corpses of failed hardware platforms that started with great fanfare and then faded away quickly. Will Oracle be successful with hardware? It will depend on how good the company is at transforming its DNA.
Issue Five: Are customers ready to embrace Oracle's brave new world? Oracle's strategy is a good one — if you are Oracle. But what about for customers? And what about for partners? Customers need to understand the long-term implications and tradeoffs of buying into Oracle's integrated platform approach. It will clearly mean fewer moving parts to worry about. It will mean one phone call and no finger pointing. However, customers also have to understand the leverage a single vendor will have over contract terms and conditions. And what about partners? How does an independent software vendor or a channel partner participate within the new Oracle? Is there room? What type of testing and preparation will be required to play?
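To make Issue Three a bit more concrete, here is a minimal sketch of the kind of configuration-drift detection an automated management layer would have to perform. Everything below is hypothetical: the setting names and desired-state values are invented for illustration and do not represent any vendor's actual management interface.

# Minimal, illustrative sketch of configuration drift detection (Python).
# All setting names and values are hypothetical, not any vendor's API.

DESIRED_STATE = {
    "db.version": "11.2",         # database release the stack was certified on
    "os.patch_level": "2010-01",  # expected operating system patch level
    "listener.port": 1521,        # expected network listener port
}

def detect_drift(observed):
    """Return a list of (setting, expected, actual) mismatches."""
    drift = []
    for setting, expected in DESIRED_STATE.items():
        actual = observed.get(setting)
        if actual != expected:
            drift.append((setting, expected, actual))
    return drift

if __name__ == "__main__":
    # A host whose patch level has fallen behind the certified state.
    observed = {"db.version": "11.2", "os.patch_level": "2009-10", "listener.port": 1521}
    for setting, expected, actual in detect_drift(observed):
        print("DRIFT: %s: expected %r, found %r" % (setting, expected, actual))

The comparison itself is trivial; the hard part, and the heart of Oracle's pitch, is certifying what the desired state should be across thousands of interdependent components.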

The DNA of the Cloud Power Partnerships

January 15, 2010

I have been thinking a lot about the new alliances forming around cloud computing over the past couple of months. The most important of these moves are the EMC, Cisco, and VMware alliance; HP and Microsoft's announced collaboration; and, of course, Oracle's planned acquisition of Sun. Now, let's add IBM's cloud strategy into the mix, which has a very different complexion from its competitors'. And, of course, my discussion of the cloud power struggle wouldn't be complete without adding in the insurgents, Google and Amazon. While it is tempting to portray this power grab as something brand new, it isn't. It is a replay of well-worn patterns that we have seen in the computer industry for the past several decades. Yes, I am old enough to have been around for all of these power shifts. So, I'd like to point out what the DNA of this power struggle looks like for the cloud and how we might see history repeating itself in the coming year. Here is a sample of how high-profile partnerships have fared over the past few decades. While the past can never accurately predict the future, it does provide some interesting insights.

Partner realignment happens when the stakes change.  There was a time when Cisco was a very, very close partner with HP. In fact, I remember a time when HP got out of the customer service software market to collaborate with Cisco. That was back in 1997.

Here are the first couple of sentences from the press release:

SAN JOSE and PALO ALTO, Calif., Jan. 15, 1997 — Hewlett-Packard Company and Cisco Systems Inc. today announced an alliance to jointly develop Internet-ready networked-computing solutions to maximize the benefits of combining networking and computing. HP and Cisco will expand or begin collaboration in four areas: technology development, product integration, professional services and customer service and support.

If you are interested, here is a link to the full press release. What's my point? These types of partnerships are in both HP's and Cisco's DNA. Both companies have made significant and broad-reaching partnerships. For example, back in 2004, IBM and Cisco created a broad partnership focused on the data center. Here's an excerpt from a CRN article:

From the April 29, 2004 issue of CRN: Cisco Systems (NSDQ:CSCO) and IBM (NYSE:IBM) on Thursday expanded their long-standing strategic alliance to take aim at the data center market. Solution providers said the new integrated data center solutions, which include a Cisco Gigabit Ethernet Layer 2 switch module for IBM's eServer Blade Center, will help speed deployment times and ease management of on-demand technology environments.
“This is a big win for IBM,” said Chris Swahn, president of sales at Amherst Technologies, a solution provider in Merrimack, N.H.
The partnership propels IBM past rival Hewlett-Packard, which has not been as quick to integrate its own ProCurve network equipment into its autonomic computing strategy, Swahn said.
Cisco and IBM said they are bringing together their server, storage, networking and management products to provide an integrated data center automation platform.

Here is a link to the rest of the article.

HP itself has had a long history of very interesting partnerships. One of the most relevant is HP's ill-fated partnership with BEA in the late 1990s. At the time, HP invested $100 million in BEA to further the development of software to support HP's software infrastructure and platform strategy.

HP Gives BEA $100m for Joint TP Development
Published: 08-April-1999
By Computergram

Hewlett-Packard Co and BEA Systems Inc yesterday said they plan to develop new transaction processing software as well as integrate a raft of HP software with BEA’s WebLogic application server, OLTP and e-commerce software. In giving the nod to WebLogic as its choice of application server, HP stopped far short of an outright acquisition of the recently-troubled middleware company, a piece of Wall Street tittle tattle which has been doing the round for several weeks now. HP has agreed to put BEA products through all of its distribution channels and is committing $100m for integration and joint development.

Here’s a link to an article about the deal.

Oracle probably has more partnerships and entanglements with more companies than anyone else. For example, HP has a longstanding partnership with Oracle on the data management front: HP partnered closely with Oracle and optimized its hardware for the Oracle database, and today the two companies have more than 100,000 joint customers. Likewise, Oracle has a strong partnership with IBM, especially around its solutions business; IBM Global Services operates a huge consulting practice based on implementing and running Oracle's solutions. Not to be outdone, EMC and Oracle have about 70,000 joint customers. Oracle supports EMC's storage solutions within its portfolio, while EMC supports Oracle's solutions portfolio.

Microsoft, like Oracle, has entanglements with most of the market leaders. Microsoft has partnered very closely with HP for the last couple of decades, both on the PC front and on the software front. The partnership between HP and Microsoft has evolved over many years, so this latest collaboration is a continuation of a long-standing relationship. Microsoft also has long-standing relationships with EMC, Sun, and Oracle, to name a few.

And what about Amazon and Google? Because both companies were early innovators in cloud computing, they were able to gain credibility in a market that had not yet emerged as a center of power. Therefore, both were well positioned to create partnerships with every established vendor that needed to do something with the cloud. Every company from IBM to Oracle to EMC and Microsoft, to name but a few, established partnerships with them. Amazon and Google were small, convenient, and non-threatening. But as the power of both companies continues to grow, so will their ability to partner in the traditional way. I am reminded of the way IBM partnered with two small companies, Intel and Microsoft, when it needed a processor and an operating system to bring the IBM PC to market in the early 1980s.

The bottom line is that cloud computing is becoming more than a passing fad; it is the future of how computing will change in the coming decades. Because of this reality, partnerships are changing and will continue to change. I suspect that the pronouncements of strategic, critical, and sustainable partnerships may or may not be worth the paper or compute cycles that created them. But the reality is that the power struggle for cloud dominance is on. It will not leave anything untouched. It will envelop hardware, software, networking, and services. No one can predict exactly what will happen, but the way these companies have acted in the past and present gives us clues to a chaotic yet strangely predictable future.

Why did IBM buy Lombardi?

December 16, 2009

Just as I was about to start figuring out my next six predictions for 2010, I had to stop the presses and focus on IBM's latest acquisition. IBM announced this morning that it has purchased Lombardi, which focuses on business process management software. Lombardi is one of the independent leaders in the market as well as a strong IBM business partner. The obvious question is why IBM would need yet another business process management platform. After all, IBM has a large portfolio of business process management software, some homegrown and some from various acquisitions such as FileNet, ILOG, and Webify. I think the answer is actually quite straightforward. Lombardi's offerings are used extensively by business units and business management to codify the complex processes at the heart of how businesses streamline and differentiate themselves. Clearly, IBM has recognized the importance of Lombardi to its customers, since it has had a long-standing partnership with the company. I think there are two reasons this acquisition is significant beyond the need to provide direct support for business management: the ability to use Lombardi's technology to sell more WebSphere offerings, and the connection of business process to IBM's Smarter Planet initiative.

Selling more WebSphere products. There is no question that the WebSphere brand within IBM's Software business unit includes a lot of products, such as its registry/repository, application integration, security, and various middleware offerings. IBM likes to sell its products by focusing on entry points: the immediate problem the customer is trying to solve. Lombardi gives IBM direct access to business buyers who start with business process management and may then see the value of adding new capabilities to that platform.

Supporting the Smarter Planet strategy. Business transformation often starts by reconstructing process. IBM's Smarter Planet strategy is based on the premise that customers want to transform their businesses using sophisticated technology. Therefore, it is important to look at how business innovation can be supported by IBM's huge hardware, software, and services portfolio. The fact that Lombardi's technology is the starting point for business units looking at transformational process changes is an important marker in IBM's evolution as a company.

Predictions for 2010: clouds, mergers, social networks and analytics

December 15, 2009

Yes, it is predictions time. Let me start by saying that no market change happens in a single year. Therefore, what is important is to look at the nuance of a market or a technology change in the context of its evolution. So, it is in this spirit that I will make a few predictions. I’ve decided to just list my top six predictions (I don’t like odd numbers). Next week I will add another five or six predictions.

  1. Cloud computing will move out of the fear, uncertainty and doubt phase to the reality phase for many customers. This means that large corporations will begin to move segments of their infrastructure and applications to the cloud. It will be a slow but steady movement. The biggest impact on the market is that customers will begin putting pressure on vendors to guarantee predictability, reliability, and portability.
  2. Service management will become mainstream. Over the past five years the focus of service management has been on ITIL (Information Technology Infrastructure Library) processes and certification. There is a subtle change happening as corporations start to take a more holistic view of managing everything that has a sensor, an actuator, or a computer interface. Cloud computing will have a major impact on the growing importance of service management.
  3. Cloud service providers will begin to drop their prices dramatically as competition intensifies. This will be one of the primary drivers of growth of the use of cloud services. It will put a lot of pressure on smaller niche cloud providers as the larger companies try to gain control of this emerging market.
  4. It is not a stretch to state that the pace of technology acquisitions will accelerate in 2010.  I expect that HP, IBM, Cisco, Oracle, Microsoft, Google, and CA will be extremely active. While it would be foolhardy to pick a single area, I’ll go out on a limb and suggest that security, data center management, service management, and information management will be the focus of many of the acquisitions.
  5. Social networking will become much more mainstream than it was in 2009. Marketers will finally realize that blatant sales pitches on Twitter or Facebook just won't cut it. We will begin to see marketers learn how to integrate social networking into the fabric of their marketing programs. As this happens, there will be hundreds of new start-ups focused on analyzing the effectiveness of these marketing efforts.
  6. Information management is at the cusp of a major change. While the individual database remains important, the focus for customers is on managing information holistically so that they can anticipate change. As markets grow increasingly complex and competitive, the hottest products in 2010 will be those that help companies anticipate what will happen next. So expect anything carrying the term predictive analytics to be hot, hot, hot. (A toy example of the idea follows this list.)
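At its simplest, predictive analytics means fitting a model to historical data and extrapolating forward. Here is a toy sketch of that idea in Python, with invented sales numbers and an ordinary least-squares trend line standing in for the far more sophisticated models real products use:

# Toy sketch of predictive analytics: fit a straight-line trend to
# historical monthly figures and project the next period.
# The sales numbers are invented for illustration.

def fit_trend(values):
    """Ordinary least-squares slope and intercept for y over x = 0..n-1."""
    n = len(values)
    mean_x = (n - 1) / 2.0
    mean_y = sum(values) / float(n)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
             / sum((x - mean_x) ** 2 for x in range(n)))
    return slope, mean_y - slope * mean_x

monthly_sales = [102, 108, 115, 119, 127, 131]  # six months of history
slope, intercept = fit_trend(monthly_sales)
projection = slope * len(monthly_sales) + intercept
print("Projected next month: %.1f" % projection)

Real offerings layer better models, data cleansing, and domain knowledge on top, but the premise of anticipating what happens next is the same.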

Predicting the future of computing by understanding the past

December 1, 2009

Now that Thanksgiving is over, I am ready to prepare my predictions for 2010. This year, I decided to start by looking backwards. I first entered the computer industry in the late 1970s, when mainframes roamed the earth and timesharing was king. Clearly, a lot has changed. But I was thinking about the assumptions people had about the future of computing at that time and over the next several decades. So, in preparation for my predictions in a couple of weeks, here are a few noteworthy assumptions from past eras:

1. Late 1970s – The mainframe will always be the prevalent computing platform. The minicomputer is a toy.

2. Early 1980s – The PC will never be successful. It is for hobbyists. Who would ever want a personal computer in their home? And if they got one, what would they ever do with it — keep track of recipes?

3. Mid-1980s – The minicomputer will prevail. The personal computer and network-based servers are just toys.

4. Mid-1980s – The leaders of the computer industry — IBM, Digital Equipment Corporation, and Wang Laboratories — will prevail.

5. Early 1990s – The Internet has no real future as a computing platform. It is unreliable and too hard to use. How could it possibly scale enough to support millions of customers?

6. Early 1990s – Electronic Commerce is a pipe dream. It is too complicated to work in the real world.

7. Mid-1990s – If you give away software to gain “eyeballs” (the popular term in the era) and market share you will fail.

I could mention hundreds of other assumptions that defied the conventional wisdom of the day. The reality is that these types of proof points are not without nuance. For example, the mainframe remains an important platform today because of its ability to process high-volume transactions and for its reliability and predictability. However, it is no longer the primary platform for computing. The minicomputer still exists but has morphed into more flexible server-based appliances. The PC would never have gotten off the ground without the pioneering work done by Dan Bricklin and Bob Frankston, who created the first PC-based spreadsheet. Also, if the mainframe and minicomputer vendors had adopted a flexible computing model, corporations would never have brought millions of unmanageable PCs into their departments. Of the three computing giants of the late 80s, only IBM is still standing: Digital Equipment was swallowed by Compaq (itself later absorbed by HP), and Wang was bought by Getronics. The lesson? Leaders come and go. Only the humble or paranoid survive. Who could have predicted the emergence of Google or Amazon.com? In the early days of online commerce it was unclear whether it would really work. How could a vendor possibly construct a system that could transmit transactions between partners and customers across the globe? It took time and lots of failures before it became the norm.

My final observation is actually the most complicated. In the mid-1990s, during the dotcom era, I worked with many companies that thought they could give away their software for a few dollars, gain a huge installed base, and make money by monetizing those customers. I admit that I was skeptical. I would ask these companies how they could make money and sustain themselves: if you sell a few million copies of your software, revenue will still be under $20 million, before expenses, which would be huge. The reality is that none of these companies are around today. They simply couldn't survive because there was no viable revenue model. Fast forward nearly 15 years: Google was built on top of the failures of these pioneers, proving that you really could use an installed base to build something significant.

So, as I start to plan to predict 2010 I will try to keep in mind the assumptions, conventional wisdom, successes and failures of earlier times.

Can IBM become a business leader and a software leader?

November 23, 2009

When I first started as an industry analyst in the 1980s, IBM software was in dire straits. It was the era when IBM was making the transition from the mainframe to a new generation of distributed computing, and it didn't go well. Even with thousands of smart developers working their hearts out, the first three forays into a new generation of software were an abysmal failure. IBM's new architectural framework, SAA (Systems Application Architecture), didn't work; neither did the first application built on top of it, OfficeVision. Its first development framework, Application Development Cycle (AD/Cycle), also ended up on the cutting room floor. Now fast forward 20 years and a lot has changed for IBM and its software strategy. While it is easy to sit back and laugh at these failures, they were also a signal to the market that things were changing faster than anyone could have expected. In the 1980s, the world looked very different: programming was procedural, architectures were rigid, and there were no standards except in basic networking.

My perspective on business is that embracing failures and learning from them is the only way to have real success in the future. Plenty of companies that I have worked with over my decades in the industry have made incredible mistakes in trying to lead the world. Most of them make those mistakes and keep making them until they crawl into a hole and die quietly. The companies I admire are the ones that make mistakes, learn from them, and keep pushing. I'd put IBM, Microsoft, and Oracle in that camp.

But I promised that this piece would be about IBM. I won’t bore you with more IBM history. Let’s just say that over the next 20 years IBM did not give up on distributed computing. So, where is IBM Software today? Since it isn’t time to write the book yet, I will tease you with the five most important observations that I have on where IBM is in its software journey:

1. Common components. If you look under the covers of the technology embedded in everything from Tivoli to Information Management and software development, you will see common software components: one database engine, a single development framework, and a single analytics backbone. There are common interfaces between elements across a very big software portfolio, so any management capabilities needed for an analytics engine will use Tivoli components, and so on. (A toy sketch of the pattern follows this list.)

2. Analytics rules. No matter what you are doing, being able to analyze the information inside a management environment or a packaged application can make the difference between success and failure. IBM has pushed information management to the top of the stack across its software portfolio. Since we are seeing increasing levels of automation in everything from cars to factory floors to healthcare equipment, collecting and analyzing this data is becoming the norm. This is where Information Management and Service Management come together.

3. Solutions don't have to be packaged software. More than 10 years ago IBM made the decision that it would not be in the packaged software business. Even as SAP and Oracle continued to build their empires, IBM took a different path. IBM (like HP) is building solution frameworks that over time incorporate more and more best practices and software patterns. These frameworks are intended to work in partnership with packaged software. What's the difference? Treat packages like ERP as the underlying commodity engine and focus on the business value-add.

4. Going cloud. Over the past few years, IBM has been making a major investment in cloud computing and has begun to release public cloud offerings, starting with software testing and development. IBM is investing a lot in security and overall cloud management. Its CloudBurst appliance and packaged offerings are intended to be the opening salvo. In addition, and probably even more important, are the private clouds that IBM is building for its largest customers. Ironically, the growing importance of the cloud may actually be the salvation of the Lotus brand.

5. The appliance lives. Even as we look toward the cloud to wean us off hardware, IBM is placing big bets on hardware appliances. It is actually a good strategy: packaging all the pieces onto an appliance that can be remotely upgraded and managed is a good sales proposition for companies cutting back on staff but still requiring capabilities.
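To illustrate the common-components observation (point 1 above), here is a minimal sketch of the pattern in Python: one shared engine behind a common interface, reused by otherwise separate products. All class names are hypothetical and do not reflect IBM's actual component APIs.

# Illustrative-only sketch of the "common components" pattern.
# All names are hypothetical; none reflect IBM's actual APIs.

from abc import ABC, abstractmethod

class AnalyticsEngine(ABC):
    """The single analytics backbone shared across a portfolio."""
    @abstractmethod
    def score(self, events):
        ...

class SharedAnalyticsEngine(AnalyticsEngine):
    def score(self, events):
        # Trivial stand-in for a real analytics computation.
        return sum(events) / max(len(events), 1)

class ServiceManagementProduct:
    """A Tivoli-like product that reuses the common engine."""
    def __init__(self, engine):
        self.engine = engine

class InformationManagementProduct:
    """A different product built on the same shared component."""
    def __init__(self, engine):
        self.engine = engine

engine = SharedAnalyticsEngine()               # built once...
svc = ServiceManagementProduct(engine)         # ...reused here
info = InformationManagementProduct(engine)    # ...and here
print(svc.engine.score([0.2, 0.4, 0.9]))

The payoff is exactly the one described above: fix or improve the engine once, and every product that consumes it inherits the change.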

There is a lot more that is important about this stage in IBM's evolution as a company. If I had to sum up what I took away from this annual software analyst event, it is that IBM is focused on winning the hearts, minds, and dollars of business leaders looking for ways to innovate. That's what Smarter Planet is about. Will IBM be able to juggle its place as a software leader with its push into business leadership? It is a complicated task that will take years to accomplish and even longer to assess.

Tectonic shifts: HP Plus 3Com versus Cisco Plus EMC

November 18, 2009

Just when it looked clear where the markets were lining up around data center automation and cloud computing, things change. I guess that is what makes this industry so very interesting. The proposed acquisition of 3Com by HP is a direct challenge to Cisco's network management franchise. However, the implications of this move go further than meets the eye. It also puts HP on a direct collision course with EMC and its Cisco partnership. And to make things even more interesting, it puts these companies in a three-way race against IBM and its cloud/data center automation strategy. And it doesn't stop there: companies like Google and Amazon want a larger share of the enterprise market for cloud services, and companies like Unisys and CSC that have focused on outsourced secure data centers are getting into the act.

I don't think that we will see a single winner, no matter what any one of these companies will tell you. The winners in this market shift will be those companies that can build a compelling platform and a compelling value proposition for a partner ecosystem. The truth about the cloud is that it is not simply a network or a data center. It is a new way of providing services of all sorts that can support changing customer workloads in a secure and predictable manner.

In light of this, what does this say about HP's plans to acquire 3Com? If we assume that the network infrastructure is a key component of an emerging cloud and data center strategy, HP is taking a calculated risk in acquiring more assets in this market. HP has found that its ProCurve networking division has begun gaining traction. ProCurve, HP's networking division, includes network switches, wireless access points, WAN routers, and access control servers and software, and it competes directly with Cisco in the networking switch market. When HP had a tight partnership with Cisco, the company de-emphasized networking. However, once Cisco started to move into the server market, the handcuffs came off. The 3Com acquisition takes the competitive play to a new level. 3Com has a variety of good pieces of technology that HP could leverage within ProCurve. Even more significantly, HP picks up a strong security product in TippingPoint, itself a 3Com acquisition. TippingPoint fills a critical hole in HP's security offering with network security products that include intrusion prevention and network packet inspection. The former 3Com subsidiary has also established a database of security threats based on a network of external researchers.

But I think that one of the most important reasons HP bought 3Com is its strong relationships in the Chinese market. In fiscal year 2008, half of 3Com's revenue came from its H3C joint venture with Chinese vendor Huawei Technologies. Therefore, it is not surprising that HP would have paid a premium to gain a foothold in this lucrative market. If HP is smart, it will do a good job leveraging 3Com's many software assets to build out its networking business as well as beefing up its software organization. In reality, HP is much more comfortable in the hardware market, so adding networking as a core competency makes sense. It will also bolster HP's position in the high-end data center market and in the private cloud space.

Cisco, on the other hand, is coming from the network and moving aggressively into the cloud and the data center market. The company has purchased a stake in VMware and has established a tight partnership with EMC as a go-to-market strategy. For Cisco, this gives the company credibility and access to customers outside its traditional markets. For EMC, the Cisco relationship strengthens its networking play. But an even bigger value of the relationship is presenting a bigger footprint to customers as the two take on HP, IBM, and the assortment of other players who all want to win. The Cisco/EMC/VMware play is to focus on the private cloud. In their view, a private cloud is very similar to a private, preconfigured data center. It can be a compelling value proposition for a customer that needs a data center fast without having to deal with a lot of moving parts. From a cloud computing perspective, the key question remains: is this really a cloud?

It was inevitable that this quiet market dominated by Google and Amazon would heat up as the cloud becomes a real market force.  But I don’t expect that HP or Cisco/EMC will have a free run. They are being joined by IBM and Microsoft — among others. The impact could be better options for customers and prices that invariably will fall. The key to success for all of these players will be how well they manage what will be an increasingly heterogeneous, federated, and highly distributed hardware and software world. Management comes in many flavors: management of these highly distributed services and management of the workloads.