Archive for the ‘Oracle’ Category

Predictions for 2011: getting ready to compete in real time

December 1, 2010

2010 was a transition year for the tech sector. It was the year when cloud suddenly began to look realistic to the large companies that had scorned it. It was the year when social media suddenly became serious business. And it was the year when hardware and software were united as a platform – something like in the old mainframe days – but different because of high-level interfaces and modularity. Important trends also began to emerge, such as the importance of managing information both across the enterprise and among partners and suppliers. Competition for ownership of the enterprise software ecosystem heated up, as did competition for leadership of the emerging cloud computing ecosystem.

So, what do I predict for this coming year? While at the outset it might look like 2011 will be a continuation of what has been happening this year, I think there will be some important changes that will impact the world of enterprise software for the rest of the decade.

First, I think it is going to be a very big year for acquisitions. I have said that before and I will say it again: the software market is consolidating around major players that need to fill out their software infrastructure in order to compete. If HP intends to compete with IBM and Oracle on the software front, it will come as no surprise when it begins to purchase software companies. But IBM, Oracle, SAP, and Microsoft will not sit still either. All these companies will purchase the incremental technology companies they need to compete and expand their share of wallet with their customers.

This will be a transitional year for up-and-coming players like Google, Amazon, Netflix, Salesforce.com, and others that haven’t hit the radar yet. These companies are plotting their own strategies to gain leadership and will continue to push the boundaries in search of dominance. As they push upmarket and grab market share, they will face the familiar problem of supporting customers who expect them to act like adults.

Customer support, in fact, will bubble to the top of the issues for emerging as well as established companies in the enterprise space – especially as cloud computing becomes a well-established distribution and delivery platform for computing. All these companies, whether well established or startups, will have to balance the requirement to provide sophisticated customer support with the need to make a profit. This will impact everything from license and maintenance revenue to how companies charge for consulting and support services.

But what will customers be looking for in 2011? Customers are always looking to reduce their IT expenses – that is a given. However, the major change in 2011 will be the need to innovate on customer-facing initiatives. Of course, while the idea of focusing on customer-facing software isn’t new, there are some subtle changes. The new initiatives are based on leveraging social networking from a secure perspective, both to drive business traffic and to anticipate customer needs and issues before they become problems. Companies will spend money innovating on customer relationships.

Cloud computing is the other issue in 2011. While it was clearly a major differentiator in 2010, the cloud will take an important leap forward in 2011. While companies were testing the water this year, next year companies will be looking at best practices in cloud computing. 2011 will be the year when customers focus on three key issues: data integration across public clouds, private clouds, and data centers; manageability in terms of workload optimization and overall performance; and security. The vendors that can demonstrate that they provide the right level of service across cloud-based services will win significant business. These vendors will increasingly focus on expanding their partner ecosystems as a way to lock customers into their cloud platforms.

Most importantly, 2011 will be the year of analytics. The technology industry continues to produce data at a pace never seen before. But what can we do with this data? What does it mean for organizations’ ability to make better business decisions and to prepare for an unpredictable future? The traditional data warehouse is simply too slow to be effective. 2011 will be the year when predictive analytics and information management overall emerge as among the hottest and most important initiatives.

Now I know that we all like lists, so I will take what I’ve just said and put them into my top ten predictions:

1. Both today’s market leaders and upstarts are going to continue to acquire assets to become more competitive. Many emerging startups will be scooped up before they see the light of day. At the same time, we will see almost as many new startups emerge as we did in the dot-com era.

2. Hardware will continue to evolve in a new way. The market will move away from hardware as a commodity. The hardware platform in 2011 will be differentiated based on software and packaging. 2011 will be the year of smart hardware packaged with enterprise software, often as appliances.

3. Cloud computing models will put extreme pressure on everything from software license and maintenance pricing to customer support. Integration between different cloud computing models will be front and center. The cloud model is moving out of risk-averse pilots to serious deployments. Best practices will emerge as a major issue for customers that see the cloud as a way to boost innovation and the rate of change.

4. Managing highly distributed services in a compliant and predictable manner will take center stage. Service management and service level agreements across cloud and on-premises environments will become a prerequisite for buyers.

5. Security software will be redefined based on challenges of customer facing initiatives and the need to more aggressively open the corporate environment to support a constantly morphing relationship with customers, partners, and suppliers.

6. The fear of lock-in will reach a fever pitch in 2011. SaaS vendors will increasingly add functionality to tighten their grip on customers. Traditional vendors will purchase more of the components to support the lifecycle needs of customers. How can everything be integrated from a business process and data integration standpoint and still allow for portability? Today, the answers are not there.

7. The definition of an application is changing. The traditional view that the packaged application is hermetically sealed is going away. More of the new packaged applications will be built on service orientation and best practices. These applications will be parameter-driven so that they can be changed in real time. And yes, Service Oriented Architectures (SOA) didn’t die after all.

8. Social networking grows up and becomes business social networking. These initiatives will be driven by line-of-business executives as a way to engage with customers and employees, gain insight into trends, and fix problems before they become widespread. Companies will leverage social networking to enhance agility and enable new business models.

9. Managing endpoints will be one of the key technology drivers in 2011. Smartphones, sensors, and tablet computers are redefining what computing means and will drive the requirement for a new approach to role- and process-based security.

10. Data management and predictive analytics will explode based on both the need to understand traditional information and the need to manage data coming from new sales and communications channels.
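Prediction 7’s notion of a packaged application that is parameter-driven, so its behavior changes in real time without redeployment, can be sketched in a few lines. This is a minimal illustration with entirely hypothetical names and business rules, not any vendor’s actual design:

```python
# Hypothetical sketch: the application's rules live in a parameter store
# that can be updated at run time, so behavior changes without a new release.

class ParameterStore:
    """Holds tunable business parameters that can change in real time."""
    def __init__(self, **params):
        self._params = dict(params)

    def get(self, name):
        return self._params[name]

    def update(self, name, value):
        # In a real system this might be fed by an admin console or an API.
        self._params[name] = value


class DiscountService:
    """A 'packaged application' whose rules are parameters, not code."""
    def __init__(self, store):
        self.store = store

    def price(self, list_price):
        return round(list_price * (1 - self.store.get("discount_rate")), 2)


store = ParameterStore(discount_rate=0.10)
service = DiscountService(store)
print(service.price(100.0))   # 90.0

# A business change applied in real time, with no redeployment:
store.update("discount_rate", 0.25)
print(service.price(100.0))   # 75.0
```

The point of the pattern is that the service object never changes; only the parameters that drive it do, which is what lets such an application adapt on a dime.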

The bottom line is that 2011 will be the year when the seeds planted over the last few years are ready to become the drivers of a new generation of innovation and business change. Put together everything from the flexibility of service orientation, business process management innovation, and the widespread impact of social and collaborative networks to the new delivery and deployment models of the cloud. Now apply tools that harness these environments, like service management, new security platforms, and analytics. From my view, innovative companies are grabbing the threads of technology and focusing on outcomes. 2011 is going to be an important transition year. The corporations that get this right and transform themselves so that they are ready to change on a dime can win – even if they are smaller than their competitors.

Eight things that changed since we wrote Cloud Computing for Dummies

October 8, 2010

I admit that I haven’t written a blog in more than three months — but I do have a good reason. I just finished writing my latest book — not a Dummies book this time. It will be my first business book based on almost three decades in the computer industry. Once I know the publication date I will tell you a lot more about it. But as I was finishing this book I was thinking about my last book, Cloud Computing for Dummies, which was published almost two years ago. As this anniversary approaches I thought it was appropriate to look back at what has changed. I could probably go on for quite a while about how little information was available at that point and how few CIOs were willing to talk about or even consider cloud computing as a strategy. But that’s old news. I decided that it would be most interesting to focus on eight of the changes that I have seen in this fast-moving market over the past two years.

Change One: IT is now on board with cloud computing. Cloud computing has moved from a reaction to sluggish IT departments to a business strategy involving both business and technology leaders. A few years ago, business leaders were reading about Amazon and Google in business magazines. They knew little about what was behind the hype. They focused on the fact that these early cloud pioneers seemed to be efficient at making cloud capability available on demand. No paperwork and no waiting for the procurement department to process an order. Two years ago, IT leaders tried to pretend that cloud computing was a passing fad that would disappear. Now I am finding that IT is treating cloud computing as a centerpiece of its future strategy — even if it is only testing the waters.

Change Two: enterprise computing vendors are all in with both private and public cloud offerings. Two years ago most traditional IT vendors did not pay much attention to the cloud. Today, most hardware, software, and services vendors have jumped on the bandwagon. They all have cloud computing strategies. Most of these vendors are clearly focused on a private cloud strategy. However, many are beginning to offer specialized public cloud services with a focus on security and manageability. These vendors are melding all types of cloud services — public, private, and hybrid — into interesting and sometimes compelling offerings.

Change Three: Service Orientation will make cloud computing successful. Service Orientation was hot two years ago. The huge hype behind cloud computing led many pundits to proclaim that Service Oriented Architecture was dead and gone. In fact, the cloud vendors that are succeeding are those building true business services without dependencies, services that can migrate between public, private, and hybrid clouds, and that gives them a competitive advantage.

Change Four: System Vendors are banking on integration. Does a cloud really need hardware? The dialog only two years ago surrounded the contention that clouds meant no hardware would be necessary. What a difference a few years can make. The emphasis coming primarily from the major systems vendors is that hardware indeed matters. These vendors are integrating cloud infrastructure services with their hardware.

Change Five: Cloud security takes center stage. Yes, cloud security was a huge topic two years ago, but the dialog is beginning to change. There are three conversations that I am hearing. First, cloud security is a huge issue that is holding back widespread adoption. Second, there are well-designed software and hardware offerings that can make cloud computing safe. Third, public clouds are just as secure as an internal data center because these vendors have more security experts than any traditional data center. In addition, a large number of venture-backed cloud security companies are entering the market with new and quite compelling value propositions.

Change Six: Cloud service level management is a primary customer concern. Two years ago, no one our team interviewed for Cloud Computing for Dummies connected service level management with cloud computing. Now that customers are seriously planning for widespread adoption of cloud computing, they are examining the level of service they will require. IT managers are reading the service level agreements from public cloud vendors and Software as a Service vendors carefully. They are looking beyond the service level for a single service and beginning to think about the overall service level across their own data centers as well as the other cloud services they intend to use.

Change Seven: IT cares most about service automation. No, automation in the data center is not new; it has been an important consideration for years. However, what is new is that IT management is looking at the cloud for more than avoiding the costs of purchasing hardware. They see the automation of both routine functions and business processes as the primary benefit of cloud computing. In the long run, IT management intends to focus on automation and reduce hardware to interchangeable commodities.

Change Eight: Cloud computing moves to the front office. Two years ago, IT and business leaders saw cloud computing as a way to improve back office efficiency. This is beginning to change. With the flexibility of cloud computing, management is now looking at the potential to quickly innovate business processes that touch partners and customers.

IBM’s hardware sneak attack

April 13, 2010

Yesterday I read an interesting blog commenting on why Oracle seems so interested in Sun’s hardware.

I quote from a comment by Brian Aker, former head of architecture for MySQL, on the O’Reilly Radar blog. He shares his view of why Oracle bought Sun:

Brian Aker: I have my opinions, and they’re based on what I see happening in the market. IBM has been moving their P Series systems into datacenter after datacenter, replacing Sun-based hardware. I believe that Oracle saw this and asked themselves “What is the next thing that IBM is going to do?” That’s easy. IBM is going to start pushing DB2 and the rest of their software stack into those environments. Now whether or not they’ll be successful, I don’t know. I suspect once Oracle reflected on their own need for hardware to scale up on, they saw a need to dive into the hardware business. I’m betting that they looked at Apple’s margins on hardware, and saw potential in doing the same with Sun’s hardware business. I’m sure everything else Sun owned looked nice and scrumptious, but Oracle bought Sun for the hardware.

I think that Brian has a good point. In fact, in a post I wrote a few months ago, I commented on the fact that hardware is back.  It is somewhat ironic. For a long time, the assumption has been that a software platform is the right leverage point to control markets.  Clearly, the tide is shifting.  IBM, for example, has taken full advantage of customer concerns about the future of the Sun platform. But IBM is not stopping there. I predict a hardware sneak attack that encompasses IBM’s platform software strength (i.e., middleware, automation, analytics, and service management) combined with its hardware platforms.

IBM will use its strength in systems and middleware software to expand its footprint into Oracle’s backyard, surrounding its software with an integrated platform designed to work as a system of systems. It is clear that over the past five or six years IBM’s focus has been on software and services. Software has long provided good profitability for IBM. Services have made enormous strides over the past decade as IBM has learned to codify knowledge and best practices into what I have called Service as Software. The other important movement has been IBM’s focused effort over the past decade to revamp the underlying structure of its software into modular services that are used across its software portfolio. Combine this approach with industry-focused business frameworks and you have a pretty good idea of where IBM is headed with its software and services portfolios.

The hardware strategy began to evolve in 2005, when IBM’s software group bought a little XML accelerator hardware appliance company called DataPower. Many market watchers were confused: what would IBM’s software group do with a hardware platform? Over time, IBM expanded the footprint of this platform and began to repurpose it as a means of pre-packaging software components. First there was a SOA-based appliance; then IBM added a virtual machine appliance called the CloudBurst appliance. On the Lotus side of the business, IBM bought another appliance company that evolved into the Lotus Foundations platform. Appliances became a great opportunity to package and preconfigure systems that could be remotely upgraded and managed. This packaging of software with systems demonstrated the potential not only for simplicity for customers but for a new way of adding value and revenue.

Now, IBM is taking the idea of packaging hardware with software to new levels. It is starting to leverage software and networking capabilities focused on hardware-driven systems. For example, within the systems environment, IBM is leveraging its knowledge of optimizing systems software so that application-based workloads can take advantage of capabilities such as threading, caching, and systems-level networking.

In its recent announcement, IBM has developed its new hardware platforms around the five most common workloads: transaction processing, analytics, business applications, records management and archiving, and collaboration. What does this mean to customers? If a customer has a transaction-oriented system, the most important capability is ensuring that the environment uses as many threads as possible to maximize throughput. In addition, caching repetitive workloads will ensure that transactions move through the system as quickly as possible. While this has been doable in the past, the difference is that these capabilities are packaged as an end-to-end system. Thus, implementation can be faster and more precise. The same can be said for analytics workloads. These workloads demand a high level of efficiency to enable customers to look for patterns in the data that help predict outcomes. Analytics workloads require the caching and fast processing of algorithms and data across multiple sources.
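The caching point is easy to see in miniature: when a transaction stream repeatedly touches the same hot data, a cache turns thousands of reads into a handful. Here is a small sketch, with a made-up workload and dummy balances, of the effect in Python:

```python
# Minimal sketch of caching a repetitive transaction workload.
# The account data and the "expensive fetch" are hypothetical stand-ins.

from functools import lru_cache

CALLS = {"count": 0}

@lru_cache(maxsize=1024)
def account_balance(account_id):
    # Stand-in for an expensive read from storage; counts real fetches.
    CALLS["count"] += 1
    return account_id * 100  # dummy balance

# A repetitive transaction stream that keeps touching a few hot accounts:
for _ in range(1000):
    for acct in (1, 2, 3):
        account_balance(acct)

print(CALLS["count"])  # 3 (only one real fetch per hot account)
```

Three thousand lookups cost three real fetches; everything else is served from memory, which is the throughput gain that caching repetitive workloads delivers.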

The bottom line is that IBM is looking at its hardware as an extension of the types of workloads it is required to support. Rather than considering hardware as a set of separate platforms, IBM is following a system-of-systems approach that is consistent with cloud computing. With this type of approach, IBM will continue on the path of viewing a system as a combination of the hardware platform, the systems software, and systems-based networking. These elements of computing are therefore configured based on the type of application and the nature of the current workload.

It is, in fact, workload optimization that is at the forefront of what is changing in hardware in the coming decade. This is true both in the data center and in the cloud. Cloud computing — and the hybrid environments that make up the future of computing — is predicated on predictable, scalable, and elastic workload management. It is the way we will start thinking about computing as a continuum of all of the component parts combined — hardware, software, services, networking, storage, collaboration, and applications. This reflects the dramatic changes that are just on the horizon.

Why are we about to move from cloud computing to industrial computing?

April 5, 2010

I spent the other week at a new conference called Cloud Connect. Being able to spend four days immersed in an industry discussion about cloud computing really allows you to step back and think about where we are with this emerging industry. While it would be possible to write endlessly about all the meetings and conversations I had, you probably wouldn’t have enough time to read it all. So, I’ll spare you and give you the top four things I learned at Cloud Connect. I recommend that you also take a look at Brenda Michelson’s blogs from the event for a lot more detail. I would also refer you to Joe McKendrick’s blog from the event.

1. Customers are still figuring out what Cloud Computing is all about.  For those of us who spend way too many hours on the topic of cloud computing, it is easy to make the assumption that everyone knows what it is all about.  The reality is that most customers do not understand what cloud computing is.  Marcia Kaufman and I conducted a full day workshop called Introduction to Cloud. The more than 60 people who dedicated a full day to a discussion of all aspects of the cloud made it clear to us that they are still figuring out the difference between infrastructure as a service and platform as a service. They are still trying to understand the issues around security and what cloud computing will mean to their jobs.

2. There is a parallel universe out there among people who have been living and breathing cloud computing for the last few years. In their view the questions are very different. The big issues discussed among the well-connected focused on a few key questions: Is there such a thing as a private cloud? Is Software as a Service really cloud computing? Will we ever have a true segmentation of the cloud computing market?

3. From the vantage point of the market, it is becoming clear that we are about to enter one of those transitional times in this important evolution of computing. Cloud Connect reminded me a lot of the early days of the commercial Unix market. When I attended my first Unix conference in the mid-1980s it was a different experience than going to a conference like Comdex. It was small. I could go and have a conversation with every vendor exhibiting. I had great meetings with true innovators. There was a spirit of change and innovation in the halls. I had the same feeling about the Cloud Connect conference. There were a small number of exhibitors. The key innovators driving the future of the market were there to discuss and debate the future. There was electricity in the air.

4. I also anticipate a change in the direction of cloud computing now that it is about to pass that tipping point. I am a student of history, so I look for patterns. When Unix reached the stage where the giants woke up and started seeing huge opportunity, they jumped in with a vengeance. The great but small Unix technology companies were either acquired, got big, or went out of business. I think that we are on the cusp of the same situation with cloud computing. IBM, HP, Microsoft, and a vast array of others have seen the future, and it is the cloud. This will mean that emerging companies with great technology will have to be both really lucky and really smart.

The bottom line is that Cloud Connect represented a seminal moment in cloud computing. There is plenty of fear among customers who are trying to figure out what it will mean for their own data centers. What will the organizational structure of the future look like? They don’t know, and they are afraid. The innovative companies are looking at the coming armies of large vendors and wondering how to keep their differentiation so that they can become the next Google rather than the next company whose name we can’t remember. There was much debate about two important issues: cloud standards and private clouds. Are these issues related? Of course. Standards always become an issue when there is a power grab in a market. If a Google, Microsoft, Amazon, IBM, or Oracle is able to set the terms for cloud computing, market control can shift overnight. Will standard interfaces be able to save the customer? And how about private clouds? Are they real? My observation and contention is that yes, private clouds are real. If you deploy the same automation, provisioning software, and workload management inside a company rather than inside a public cloud, it is still a cloud. Ironically, the debate over the private cloud is also about power and position in the market, not about ideology. If a company like Google or Amazon, or whichever company is your favorite flavor, is able to debunk the private cloud — guess who gets all the money? If you are a large company where IT and the data center are core to how you conduct business, you can and should have a private cloud that you control and manage.

So, after taking a step back I believe that we are witnessing the next generation of computing — the industrialization of computing. It might not be as much fun as the wild west that we are in the midst of right now but it is coming and should be here before we realize that it has happened.

Can Informatica earn a place at the head table?

February 22, 2010

Informatica might be thought of as the last independent data management company standing. In fact, that used to be Informatica’s main positioning in the market. That has begun to change over the last few years as Informatica has continued to make strategic acquisitions. Over the past two years Informatica has purchased five companies — the most recent was Siperian, a significant player in Master Data Management solutions. These acquisitions have paid off. Today Informatica has passed the $500 million revenue mark with about 4,000 customers. It has deepened its strategic partnerships with HP, Accenture, Salesforce.com, and MicroStrategy. In a nutshell, Informatica has made the transition from a focus on ETL (Extract, Transform, Load) tools to support data warehouses to a company focused broadly on managing information. Merv Adrian did a great job of providing context for Informatica’s strategy and acquisitions. To transition itself in the market, Informatica has set its sights on data service management — a culmination of data integration, master data management, data transformation, and predictive analytics delivered in a holistic manner across departments, divisions, and business partners.

In essence, Informatica is trying to position itself as a leading manager of data across its customers’ ecosystems. This requires a way to have consistent data definitions across silos (Master Data Management), ways to trust the integrity of that data (data cleansing), event processing, predictive analytics, integration tools to move and transform data, and the ability to prove that governance can be verified (data governance). Through its acquisitions, Informatica is working to put these pieces together. However, as a relatively small player living in a tough neighborhood (Oracle, IBM, SAS Institute, etc.), it will be a difficult journey. This is one of the reasons that Informatica is putting so much emphasis on its new partner marketplace. A partner network can really help a smaller player appear and act bigger.

This Marketplace will include all of Informatica’s products. It will enable developers to build within Informatica’s development cloud and deploy either in the cloud or on premises. Like its new partner marketplace, the cloud offers another important opportunity for Informatica to compete. Informatica was an early partner with Salesforce.com, offering complementary information management products that can be used as options with Salesforce.com. This has given Informatica access to customers who might never have thought about Informatica in the past. In addition, it taught Informatica about the value of cloud computing as a platform for the future. Therefore, I expect that Informatica’s strong cloud-based offerings will help the company maintain its industry position. I also expect that the company’s newly strengthened partnership with HP will be very important to its growth.

What is Informatica’s roadmap? It intends to continue to deliver new releases every six months, including new data services and new data integration services. It will also develop these services with self-service interfaces. In the end, its goal is to be a great data steward to its customers. This is an admirable goal. Informatica has made very good acquisitions that support its strategic goals. It is making the right bets on the cloud and on a partner ecosystem. The question that remains is whether Informatica can scale to the size where it can sustain the competitive threats. Companies like IBM, Oracle, Microsoft, SAP, and SAS Institute are not standing still. Each of these companies has built and will continue to expand its information management strategy and portfolio of offerings. If Informatica can break the mold on ease of implementation for complex data service management, it will have earned a place at the head table.

Why hardware still matters– at least for a couple of years

February 9, 2010

It is easy to assume that the excitement around cloud computing would put a damper on the hardware market. But I have news for you. I am predicting that over the next few years hardware will be front and center. Why would I make such a wild prediction? Here are my three reasons.

1. Hardware is front and center in almost all aspects of the computer industry. It is no wonder that Oracle wants to become a hardware company. Hardware is tangible. Its revenue hits the bottom line right away. Hardware can envelop software and keep customers pinned down for many, many years. New-generation platforms in the form of hardware appliances are a convenient delivery vehicle that helps the sales cycle. It is no wonder that Oracle wants a hardware platform. It completes the equation and allows Oracle to position itself as a fully integrated computing company. Likewise, IBM and HP are focused on building up their war chests of strong hardware platforms. If you believe that customers want to deal with one large brand, or two, then the winners want to control the entire computing ecosystem.

2. The cloud looms. Companies like Amazon.com and Google do not buy hardware from the big iron providers and never will. For economic reasons, these companies go directly to component providers and purchase custom-designed chips, boards, etc. This approach means that, for a very low price, these cloud providers can reduce their power consumption by making sure that the components are optimized for massively scaled clouds. These cloud vendors are focused on undercutting the opportunity and power of the big systems providers. Therefore, cloud providers care a lot about hardware — it is through optimization of the hardware that they can threaten the power equilibrium in the computer market.

3. The clash between cloud and on-premises environments. It is clear that the computer marketplace is at a transition point. The cloud vendors are betting that, by optimizing everything, they can drive costs so low that they win. The large systems vendors are betting that their sophisticated systems combining hardware, software, and services will win because of their ability to better protect the integrity of the customer’s business. These vendors will all provide their own versions of the public and private cloud to ensure that they maintain power.

So, in my view there will be an incredible focus on hardware over the next two years. This will actually be good for customers because the level of sophistication and the cost/performance metrics will be impressive. This hardware renaissance will not last, however. In the long run, hardware will be commoditized. The end game will be interesting because of the cloud. It will not be a zero-sum game. No, the data center doesn't go away. But the difference is that purpose-built hardware will be optimized for the workloads that support the massively scaled environments at the heart of the future of computing. And then, it will be all about the software, the data, and the integration.

Oracle + Sun: Five questions to ponder

January 27, 2010

I spent a couple of hours today listening to Oracle talk about the long-awaited integration with Sun Microsystems. A real end of an era and beginning of a new one. What does this mean for Oracle? Whatever you might think about Oracle, you have to give the company credit for successfully integrating the 60 companies it has purchased over the past few years. Having watched hundreds and perhaps thousands of acquisitions over the last few decades, I can say that integration is hard. There are overlapping technologies, teams, cultures, and egos. Oracle has successfully managed to leverage the IP from its acquisitions to support its business goals. For example, it has kept packaged software customers happy by improving the software. PeopleSoft customers were able to continue to use the software they had become dependent on in much the same way as before the acquisition. In some cases, the quality of the software actually improved dramatically. The path has been more complicated with the various middleware and infrastructure platforms the company has acquired over the years because of overlapping functionality.

The acquisition of Sun Microsystems is the biggest game changer for Oracle since the acquisition of PeopleSoft. There is little doubt that Sun has significant software and hardware IP that will be very important in defining Oracle in the 21st century. But I don’t expect this to be a simple journey. Here are the five key issues that I think will be tricky for Oracle to navigate. Obviously, this is not a complete list but it is a start.

Issue One: Can Oracle recreate the mainframe world? The mainframe is dead — long live the mainframe. Oracle has a new fondness for the mainframe and what that model could represent. So, if you combine Sun's hardware, networking layer, storage, security, packaged applications, and middleware into one package, do you get to own the total share of a customer's wallet? That is the idea. Oracle management has determined that IBM had the right idea in the 1960s — everything was nicely integrated and the customer never had to worry about the pieces working together.
Issue Two: Can you package everything together and still be an open platform? To its credit, Oracle has built its software on standards such as Unix/Linux, XML, and Java. So, can you have it both ways? Can you claim openness when the platform itself is hermetically sealed? I think it may be a stretch. To accomplish this goal, Oracle would have to publish well-defined APIs and be able to certify that using those APIs won't break the integrated platform. Not an easy task.
Issue Three: Can you manage a complex computing environment? Computing environments get complicated because there are so many moving parts. Configurations change; software gets patched; new operating system versions are introduced; emerging technology enters and disrupts the well-established environment. Oracle would like to automate the management of this complexity for customers. It is an appealing idea, since configuration problems, missing links, and poor testing are responsible for many of the outages in computing environments today. Will customers be willing to have this type of integrated environment controlled and managed by a single vendor? Some will be happy to turn over these headaches. Others may have too much legacy or want to work with a variety of vendors. This is not a new dilemma: customers have long had to weigh the benefits of a single source of technology against the risks of being locked in.
Issue Four: Can you teach an old dog new tricks? Can Oracle really be a hardware vendor? Clearly, Sun continues to be a leader in hardware despite its diminished fortunes. But as anyone who has ventured into the hardware world knows, hardware is a tough, brutal game. In fact, it is the inverse of software. Software takes many cycles to reach maturity. It needs to be tweaked and finessed. However, once it is in place it has a long, long life. As the old saying goes, old software never dies. The same cannot be said for hardware, which has a much straighter line to maturity. It is designed, developed, and delivered to the market. Sometimes it leapfrogs the competition enough that it has a long and very profitable life. Other times, it hits the market at the end of a cycle, just as a new, more innovative player enters. The culmination of all that work and effort can be cut short when something new comes along at the right place at the right time. It is often a lot easier to get rid of hardware than software, and the computer industry is littered with the corpses of failed hardware platforms that started with great fanfare and then faded away quickly. Will Oracle be successful with hardware? It will depend on how good the company really is at transforming its DNA.
Issue Five: Are customers ready to embrace Oracle's brave new world? Oracle's strategy is a good one — if you are Oracle. But what about for customers? And what about for partners? Customers need to understand the long-term implications and tradeoffs of buying into Oracle's integrated approach to its platform. It will clearly mean fewer moving parts to worry about. It will mean one phone call and no finger pointing. However, customers also have to understand the leverage that a single company will have over contract terms and conditions. And what about partners? How does an independent software vendor or a channel partner participate in the new Oracle? Is there room? What type of testing and preparation will be required to play?

The DNA of the Cloud Power Partnerships

January 15, 2010

I have been thinking a lot about the new alliances forming around cloud computing over the past couple of months. The most important of these moves are the EMC, Cisco, and VMware alliance; HP and Microsoft's announced collaboration; and, of course, Oracle's planned acquisition of Sun. Now, let's add IBM's cloud strategy into the mix, which has a very different complexion from its competitors'. And, of course, my discussion of the cloud power struggle wouldn't be complete without adding in the insurgents — Google and Amazon. While it is tempting to portray this power grab as something brand new, it isn't. It is a replay of well-worn patterns that we have seen in the computer industry for the past several decades. Yes, I am old enough to have been around for all of these power shifts. So, I'd like to point out what the DNA of this power struggle looks like for the cloud and how we might see history repeating itself in the coming year. Here is a sample of how high-profile partnerships have fared over the past few decades. While the past can never accurately predict the future, it does provide some interesting insights.

Partner realignment happens when the stakes change.  There was a time when Cisco was a very, very close partner with HP. In fact, I remember a time when HP got out of the customer service software market to collaborate with Cisco. That was back in 1997.

Here are the first couple of sentences from the press release:

SAN JOSE and PALO ALTO, Calif., Jan. 15, 1997 — Hewlett-Packard Company and Cisco Systems Inc. today announced an alliance to jointly develop Internet-ready networked-computing solutions to maximize the benefits of combining networking and computing. HP and Cisco will expand or begin collaboration in four areas: technology development, product integration, professional services and customer service and support.

If you are interested, here is a link to the full press release. What's my point? This type of partnership is in both HP's and Cisco's DNA. Both companies have made significant and broad-reaching partnerships. For example, back in 2004, IBM and Cisco created a broad partnership focused on the data center. Here's an excerpt from a CRN article:

From the April 29, 2004 issue of CRN: Cisco Systems (NSDQ:CSCO) and IBM (NYSE:IBM) on Thursday expanded their long-standing strategic alliance to take aim at the data center market. Solution providers said the new integrated data center solutions, which include a Cisco Gigabit Ethernet Layer 2 switch module for IBM’s eServer Blade Center, will help speed deployment times and ease management of on-demand technology environments.
“This is a big win for IBM,” said Chris Swahn, president of sales at Amherst Technologies, a solution provider in Merrimack, N.H.
The partnership propels IBM past rival Hewlett-Packard, which has not been as quick to integrate its own ProCurve network equipment into its autonomic computing strategy, Swahn said.
Cisco and IBM said they are bringing together their server, storage, networking and management products to provide an integrated data center automation platform.

Here is a link to the rest of the article.

HP itself has had a long history of very interesting partnerships. Among the most relevant is HP’s ill-fated partnership with BEA in the 1990s. At the time, HP invested $100 million in BEA to further the development of software to support HP’s software infrastructure and platform strategy.

HP Gives BEA $100m for Joint TP Development
Published: 08-April-1999
By Computergram

Hewlett-Packard Co and BEA Systems Inc yesterday said they plan to develop new transaction processing software as well as integrate a raft of HP software with BEA’s WebLogic application server, OLTP and e-commerce software. In giving the nod to WebLogic as its choice of application server, HP stopped far short of an outright acquisition of the recently-troubled middleware company, a piece of Wall Street tittle tattle which has been doing the round for several weeks now. HP has agreed to put BEA products through all of its distribution channels and is committing $100m for integration and joint development.

Here’s a link to an article about the deal.

Oracle probably has more partnerships and more entanglements with more companies than anyone else. For example, HP has a longstanding partnership with Oracle on the data management front: HP partnered closely with Oracle and optimized its hardware for the Oracle database, and today Oracle and HP have more than 100,000 joint customers. Likewise, Oracle has a strong partnership with IBM, especially around its solutions business; IBM Global Services operates a huge consulting practice based on implementing and running Oracle’s solutions. Not to be outdone, EMC and Oracle have about 70,000 joint customers. EMC supports Oracle’s solutions portfolio while Oracle supports EMC’s storage solutions.

Microsoft, like Oracle, has entanglements with most of the market leaders. Microsoft has partnered very closely with HP for the last couple of decades both on the PC front and on the software front. Clearly, the partnership between HP and Microsoft has evolved for many years so this latest partnership is a continuation of a long-standing relationship. Microsoft has long-standing relationships with EMC, Sun, and Oracle — to name a few.

And what about Amazon and Google? Because both companies were early innovators in cloud computing, they were able to gain credibility in a market that had not yet emerged as the center of power. Therefore, both were well positioned to create partnerships with every established vendor that needed to do something with the cloud. Every company from IBM to Oracle to EMC and Microsoft — to name but a few — established partnerships with them. Amazon and Google were small, convenient, and non-threatening. But as the power of both companies continues to grow, so will their ability to partner in the traditional way. I am reminded of the way IBM partnered with two small companies — Intel and Microsoft — when it needed a processor and an operating system to help bring the IBM PC to market in the early 1980s.

The bottom line is that cloud computing is becoming more than a passing fad — it is the future of how computing will change in the coming decades. Because of this reality, partnerships are changing and will continue to change. So, I suspect that the pronouncements of strategic, critical, and sustainable partnerships may or may not be worth the paper or compute cycles that created them. But the reality is that the power struggle for cloud dominance is on. It will not leave anything untouched; it will envelop hardware, software, networking, and services. No one can predict exactly what will happen, but the way these companies have acted in the past and present gives us clues to a chaotic yet predictable future.

Predictions for 2010: clouds, mergers, social networks and analytics

December 15, 2009

Yes, it is predictions time. Let me start by saying that no market change happens in a single year. Therefore, what is important is to look at the nuance of a market or a technology change in the context of its evolution. So, it is in this spirit that I will make a few predictions. I’ve decided to just list my top six predictions (I don’t like odd numbers). Next week I will add another five or six predictions.

  1. Cloud computing will move out of the fear, uncertainty, and doubt phase into the reality phase for many customers. This means that large corporations will begin to move segments of their infrastructure and applications to the cloud. It will be a slow but steady movement. The biggest impact on the market is that customers will begin putting pressure on vendors to guarantee predictability, reliability, and portability.
  2. Service management will become mainstream. Over the past five years the focus of service management has been on ITIL (Information Technology Infrastructure Library) processes and certification. A subtle change is happening as corporations start to take a more holistic view of how everything that has a sensor, an actuator, or a computer interface is managed. Cloud computing will have a major impact on the growing importance of service management.
  3. Cloud service providers will begin to drop their prices dramatically as competition intensifies. This will be one of the primary drivers of growth of the use of cloud services. It will put a lot of pressure on smaller niche cloud providers as the larger companies try to gain control of this emerging market.
  4. It is not a stretch to state that the pace of technology acquisitions will accelerate in 2010.  I expect that HP, IBM, Cisco, Oracle, Microsoft, Google, and CA will be extremely active. While it would be foolhardy to pick a single area, I’ll go out on a limb and suggest that security, data center management, service management, and information management will be the focus of many of the acquisitions.
  5. Social networking will become much more mainstream than it was in 2009. Marketers will finally realize that blatant sales pitches on Twitter or Facebook just won’t cut it. We will begin to see marketers learn how to integrate social networking into the fabric of marketing programs. As this happens, there will be hundreds of new startups focused on analyzing the effectiveness of these marketing efforts.
  6. Information management is at the cusp of a major change. While the individual database remains important, the issue for customers is the need to manage information holistically so that they can anticipate change. As markets grow increasingly complex and competitive, the hottest products in 2010 will be those that help companies anticipate what will happen next. So expect anything with the term “predictive analytics” attached to be hot, hot, hot.

Can IBM become a business leader and a software leader?

November 23, 2009

When I first started as an industry analyst in the 1980s, IBM software was in dire straits. It was the era when IBM was making the transition from the mainframe to a new generation of distributed computing. It didn’t go well. Even with thousands of smart developers working their hearts out, the first three forays into a new generation of software were abysmal failures. IBM’s new architectural framework, SAA (Systems Application Architecture), didn’t work; neither did the first application built on top of it, OfficeVision. Its first development framework, Application Development Cycle (AD/Cycle), also ended up on the cutting room floor. Now fast forward 20 years and a lot has changed for IBM and its software strategy. While it is easy to sit back and laugh at these failures, they were also a signal to the market that things were changing faster than anyone could have expected. In the 1980s, the world looked very different: programming was procedural, architectures were rigid, and there were no standards except in basic networking.

My perspective on business is that embracing failures and learning from them is the only way to set up future success. Plenty of companies that I have worked with over my decades in the industry have made incredible mistakes in trying to lead the world. Most of them make those mistakes and keep making them until they crawl into a hole and die quietly. The companies I admire are the ones that make mistakes, learn from them, and keep pushing. I’d put IBM, Microsoft, and Oracle in that camp.

But I promised that this piece would be about IBM. I won’t bore you with more IBM history. Let’s just say that over the next 20 years IBM did not give up on distributed computing. So, where is IBM Software today? Since it isn’t time to write the book yet, I will tease you with the five most important observations that I have on where IBM is in its software journey:

1. Common components. If you look under the covers of the technology that is embedded in everything from Tivoli to Information Management and software development, you will see common software components. There is one database engine, a single development framework, and a single analytics backbone. There are common interfaces between elements across a very big software portfolio. So, any management capabilities needed to manage an analytics engine will use Tivoli components, and so on.

2. Analytics rules. No matter what you are doing, being able to analyze the information inside a management environment or a packaged application can make the difference between success and failure. IBM has pushed information management to the top of the stack across its software portfolio. Since we are seeing increasing levels of automation in everything from cars to factory floors to healthcare equipment, collecting and analyzing this data is becoming the norm. This is where Information Management and Service Management come together.

3. Solutions don’t have to be packaged software. More than 10 years ago IBM made the decision that it would not be in the packaged software business. Even as SAP and Oracle continued to build their empires, IBM took a different path. IBM (like HP) is building solution frameworks that over time incorporate more and more best practices and software patterns. These frameworks are intended to work in partnership with packaged software. What’s the difference? Treat the packages like ERP as the underlying commodity engine and focus on the business value add.

4. Going cloud. Over the past few years, IBM has been making a major investment in cloud computing and has begun to release public cloud offerings, starting with software testing and development. IBM is investing a lot in security and overall cloud management. Its CloudBurst appliance and packaged offerings are intended to be the opening salvo. In addition, and probably even more important, are the private clouds that IBM is building for its largest customers. Ironically, the growing importance of the cloud may actually be the salvation of the Lotus brand.

5. The appliance lives. Even as we look towards the cloud to wean us off of hardware, IBM is putting big bets on hardware appliances. It is actually a good strategy. Packaging all the piece parts onto an appliance that can be remotely upgraded and managed is a good sales strategy for companies cutting back on staff but still requiring capabilities.

There is a lot more that is important about this stage in IBM’s evolution as a company. If I had to sum up what I took away from this annual analyst software event, it is that IBM is focused on winning the hearts, minds, and dollars of business leaders looking for ways to innovate. That’s what Smarter Planet is about. Will IBM be able to juggle its place as a software leader with its push into business leadership? It is a complicated task that will take years to accomplish and even longer to assess.