Archive

Posts Tagged ‘security’

HP’s Ambitious Cloud Computing Strategy: Can HP Emerge as a Power?

February 15, 2011 4 comments

To comprehend HP’s cloud computing strategy you have to first understand HP’s Matrix Blade System. HP announced the Matrix system in April 2009 as a prepackaged fabric-based system. Because Matrix was designed as a packaged environment, it has become the linchpin of HP’s cloud strategy.

So, what is Matrix? Within this environment, HP has pre-integrated servers, networking, storage, and software (primarily orchestration to customize workflow). In essence, Matrix is a unified computing system that supports both physical blades and virtual configurations. It includes a graphical command center console to manage resource pools, physical and virtual servers, and network connectivity. On the software side, Matrix provides an abstraction layer that supports workload provisioning and workflow-based policy management that can determine where workloads will run. The environment supports the VMware hypervisor, open source KVM, and Microsoft’s Hyper-V.

HP’s strategy is to combine this Matrix system, which it has positioned as its private cloud, with a public compute cloud. In addition, HP is incorporating its lifecycle management software and its security acquisitions as part of its overall cloud strategy. It is leveraging HP Services (formerly EDS) to offer a hosted private cloud and traditional outsourcing as part of an overall plan. HP is hoping to leverage its services expertise in running large enterprise packaged software.

There are three components to the HP cloud strategy:

  • CloudSystem
  • Cloud Services Automation
  • Cloud Consulting Services

CloudSystem. What HP calls CloudSystem is, in fact, based on the Matrix blade system. The Matrix Blade System uses a common rack enclosure to support all the blades produced by HP. The Matrix packaging includes what HP calls an operating environment: provisioning software, virtualization, a self-service portal, and management tools to manage resource pools. HP considers its public cloud services to be part of the CloudSystem. To provide a hybrid cloud computing environment, HP will offer public compute cloud services similar to what is available from Amazon EC2. When combined with the outsourcing services from HP Services, HP contends that it provides a common architectural framework across public, private, virtualized servers, and outsourcing. It includes what HP is calling cloud maps. Cloud maps are configuration templates based on HP’s acquisition of Stratavia, a database and application automation software company.
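To make the idea of a cloud map concrete, here is a minimal sketch of a configuration template being expanded into provisioning requests. The field names and values are invented for illustration; they are not HP’s actual template format.

```python
# Hypothetical "cloud map" style template: a reusable description of a
# workload that the provisioning layer can stamp out on demand.
TEMPLATE = {
    "name": "db-tier",
    "blades": 2,
    "hypervisor": "kvm",     # Matrix also supports VMware and Hyper-V
    "memory_gb": 32,
    "storage_gb": 500,
}

def expand_template(template, instances=1):
    """Turn one template into a list of concrete provisioning requests."""
    requests = []
    for i in range(instances):
        req = dict(template)                      # copy the shared settings
        req["instance"] = f"{template['name']}-{i}"  # give each a unique name
        requests.append(req)
    return requests

# Request two identical database-tier instances from the same map:
reqs = expand_template(TEMPLATE, instances=2)
```

The point of the pattern is that the template, not the operator, carries the configuration knowledge, which is what makes self-service provisioning repeatable.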

Cloud Service Automation. The CloudSystem is intended to make use of services automation software called Cloud Service Automation (CSA). The components of CSA include a self-service portal that manages a service catalog. The service catalog describes each service that is intended to be used as part of the cloud environment. Within the catalog, the required service level is defined. In addition, the CSA can meter the use of services and can provide visibility into the performance of each service. A second capability is a cloud controller, based on the orchestration technology from HP’s Opsware acquisition. A third component, the resource manager, provides provisioning and monitoring services. The objective of CSA is to provide end-to-end lifecycle management of the CloudSystem.
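As a rough sketch of what a catalog entry with a defined service level and usage metering involves, consider the following. The class and its fields are hypothetical, not CSA’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    """One service in the catalog: what it is, its required
    service level, and a simple usage meter."""
    name: str
    description: str
    availability_target: float  # required service level, e.g. 0.999
    usage_hours: float = 0.0    # metered consumption to date

    def meter(self, hours: float) -> None:
        """Record consumption so usage can be reported per service."""
        self.usage_hours += hours

# A toy catalog with one entry; the self-service portal would render
# this list and the metering data would feed chargeback reports.
catalog = {
    "web-tier": CatalogEntry("web-tier", "Load-balanced web front end", 0.999),
}
catalog["web-tier"].meter(12.5)
```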

Cloud Consulting Services. HP is taking advantage of EDS’s experience in managing computing infrastructure as the foundation for its cloud consulting services offerings. HP also leverages the consulting services that were traditionally part of HP as well as those from EDS. As a result, HP has deep experience in designing and running cloud seminars and strategy engagements for customers.

From HP’s perspective, it is taking a hybrid approach to cloud computing. What does HP mean by Hybrid? Basically, HP’s hybrid strategy includes the combination of the CloudSystem – a hardware-based private cloud, its own public compute services, and traditional outsourcing.

The Bottom Line. Making the transition to becoming a major cloud computing vendor is complicated. The market is young and still in transition. HP has many interesting building blocks that have the potential to make it an important player. Leveraging the Matrix Blade System is a pragmatic move since it is already an integrated and highly abstracted platform. However, HP will have to provide more services that increase the ability of its customers to use the CloudSystem to create an elastic and flexible computing platform. Cloud Service Automation is a good start but still requires more evolution. For example, it needs to add more capabilities to its service catalog. Leveraging its Systinet registry/repository as part of its service catalog would be advisable. I also think that HP needs to package its security offerings to be cloud specific. This includes governance and compliance as well as identity management.

Just how much HP plans to compete in the public cloud space is uncertain. Can HP be effective in both markets? Does it need to combine its offerings or create two different business models?

It is clear that HP wants to make cloud computing the cornerstone of its “Instant-On Enterprise” strategy announced last year. In essence, the Instant-On Enterprise is intended to make it easier for customers to consume data center capabilities including infrastructure, applications, and services. This is a good vision in keeping with what customers need. And plainly cloud computing is an essential ingredient in achieving this ambitious strategy.

Eight things that changed since we wrote Cloud Computing for Dummies

October 8, 2010 3 comments

I admit that I haven’t written a blog in more than three months — but I do have a good reason. I just finished writing my latest book — not a Dummies book this time. It will be my first business book based on almost three decades in the computer industry. Once I know the publication date I will tell you a lot more about it. But as I was finishing this book I was thinking about my last book, Cloud Computing for Dummies, which was published almost two years ago. As this anniversary approaches I thought it was appropriate to take a look back at what has changed. I could probably go on for quite a while talking about how little information was available at that point and how few CIOs were willing to talk about or even consider cloud computing as a strategy. But that’s old news. I decided that it would be most interesting to focus on eight of the changes that I have seen in this fast-moving market over the past two years.

Change One: IT is now on board with cloud computing. Cloud computing has moved from a reaction to sluggish IT departments to a business strategy involving both business and technology leaders. A few years ago, business leaders were reading about Amazon and Google in business magazines. They knew little about what was behind the hype. They focused on the fact that these early cloud pioneers seemed to be efficient at making cloud capability available on demand. No paperwork and no waiting for the procurement department to process an order. Two years ago IT leaders tried to pretend that cloud computing was a passing fad that would disappear. Now I am finding that IT is treating cloud computing as a centerpiece of their future strategies — even if they are only testing the waters.

Change Two: enterprise computing vendors are all in with both private and public cloud offerings. Two years ago most traditional IT vendors did not pay too much attention to the cloud.  Today, most hardware, software, and services vendors have jumped on the bandwagon. They all have cloud computing strategies.  Most of these vendors are clearly focused on a private cloud strategy. However, many are beginning to offer specialized public cloud services with a focus on security and manageability. These vendors are melding all types of cloud services — public, private, and hybrid into interesting and sometimes compelling offerings.

Change Three: Service Orientation will make cloud computing successful. Service Orientation was hot two years ago. The huge hype behind cloud computing led many pundits to proclaim that Service Oriented Architecture was dead and gone. In fact, the cloud vendors that are succeeding are those building true business services, free of dependencies, that can migrate between public, private, and hybrid clouds. That portability is a competitive advantage.

Change Four: System Vendors are banking on integration. Does a cloud really need hardware? The dialog only two years ago surrounded the contention that clouds meant no hardware would be necessary. What a difference a few years can make. The emphasis coming primarily from the major systems vendors is that hardware indeed matters. These vendors are integrating cloud infrastructure services with their hardware.

Change Five: Cloud Security takes center stage. Yes, cloud security was a huge topic two years ago, but the dialog is beginning to change. There are three conversations that I am hearing. First, cloud security is a huge issue that is holding back widespread adoption. Second, there are well-designed software and hardware offerings that can make cloud computing safe. Third, public clouds are just as secure as an internal data center because these vendors have more security experts than any traditional data center. In addition, a large number of venture-backed cloud security companies are entering the market with new and quite compelling value propositions.

Change Six: Cloud Service Level Management is a primary customer concern. Two years ago no one our team interviewed for Cloud Computing for Dummies connected service level management with cloud computing. Now that customers are seriously planning for widespread adoption of cloud computing, they are examining their required level of service for cloud computing. IT managers are reading the service level agreements from public cloud vendors and Software as a Service vendors carefully. They are looking beyond the service level for a single service and beginning to think about the overall service level across their own data centers as well as the other cloud services they intend to use.
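A useful back-of-the-envelope exercise when reading those SLAs: when services depend on one another in series, their availabilities multiply, so the composite service level is lower than any single agreement suggests. A minimal sketch follows; the percentages are illustrative, not drawn from any vendor’s actual SLA.

```python
def composite_availability(availabilities):
    """Serially dependent services are only up when all of them are up,
    so the composite availability is the product of the parts."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

# A 99.9% infrastructure service, a 99.5% SaaS application, and a
# 99.9% network link, chained together:
overall = composite_availability([0.999, 0.995, 0.999])
print(f"{overall:.4%}")  # roughly 99.30% end to end
```

The lesson for the IT manager reading individual SLAs is that the end-to-end number, not any vendor’s headline figure, is what the business will experience.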

Change Seven: IT cares most about service automation. No, automation in the data center is not new; it has been an important consideration for years. However, what is new is that IT management is looking at the cloud not just to avoid the costs of purchasing hardware. They see automation of both routine functions and business processes as the primary benefit of cloud computing. In the long run, IT management intends to focus on automation and reduce hardware to interchangeable commodities.

Change Eight: Cloud computing moves to the front office. Two years ago IT and business leaders saw cloud computing as a way to improve back office efficiency. This is beginning to change. With the flexibility of cloud computing, management is now looking at the potential to quickly innovate business processes that touch partners and customers.

Can IBM become a business leader and a software leader?

November 23, 2009 3 comments

When I first started as an industry analyst in the 1980s, IBM software was in dire straits. It was the era when IBM was making the transition from the mainframe to a new generation of distributed computing. It didn’t go really well. Even with thousands of smart developers working their hearts out, the first three forays into a new generation of software were an abysmal failure. IBM’s new architectural framework, called SAA (Systems Application Architecture), didn’t work; neither did the first application built on top of it, called OfficeVision. Its first development framework, called Application Development Cycle (AD/Cycle), also ended up on the cutting room floor. Now fast forward 20 years and a lot has changed for IBM and its software strategy. While it is easy to sit back and laugh at these failures, they were also a signal to the market that things were changing faster than anyone could have expected. In the 1980s, the world looked very different — programming was procedural, architectures were rigid, and there were no standards except in basic networking.

My perspective on business is that embracing failures and learning from them is the only way to really have success in the future. Plenty of companies that I have worked with over my decades in the industry have made incredible mistakes in trying to lead the world. Most of them make those mistakes and keep making them until they crawl into a hole and die quietly. The companies I admire are the ones that make the mistakes, learn from them, and keep pushing. I’d put IBM, Microsoft, and Oracle in that camp.

But I promised that this piece would be about IBM. I won’t bore you with more IBM history. Let’s just say that over the next 20 years IBM did not give up on distributed computing. So, where is IBM Software today? Since it isn’t time to write the book yet, I will tease you with the five most important observations that I have on where IBM is in its software journey:

1. Common components. If you look under the covers of the technology that is embedded in everything from Tivoli to Information Management and software development you will see common software components. There is one database engine; there is a single development framework, and a single analytics backbone.  There are common interfaces between elements across a very big software portfolio. So, any management capabilities needed to manage an analytics engine will use Tivoli components, etc.

2. Analytics rules. No matter what you are doing, being able to analyze the information inside a management environment or a packaged application can make the difference between success and failure. IBM has pushed information management to the top of the stack across its software portfolio. Since we are seeing increasing levels of automation in everything from cars to factory floors to healthcare equipment, collecting and analyzing this data is becoming the norm. This is where Information Management and Service Management come together.

3. Solutions don’t have to be packaged software. More than 10 years ago IBM made the decision that it would not be in the packaged software business. Even as SAP and Oracle continued to build their empires, IBM took a different path. IBM (like HP) is building solution frameworks that over time incorporate more and more best practices and software patterns. These frameworks are intended to work in partnership with packaged software. What’s the difference? Treat the packages like ERP as the underlying commodity engine and focus on the business value add.

4. Going cloud. Over the past few years, IBM has been making a major investment in cloud computing and has begun to release some public cloud offerings for software testing and development as a starting point. IBM is investing a lot in security and overall cloud management. Its CloudBurst appliance and packaged offerings are intended to be the opening salvo. In addition, and probably even more important, are the private clouds that IBM is building for its largest customers. Ironically, the growing importance of the cloud may actually be the salvation of the Lotus brand.

5. The appliance lives. Even as we look towards the cloud to wean us off of hardware, IBM is putting big bets on hardware appliances. It is actually a good strategy. Packaging all the piece parts onto an appliance that can be remotely upgraded and managed is a good sales strategy for companies cutting back on staff but still requiring capabilities.

There is a lot more that is important about this stage in IBM’s evolution as a company. If I had to sum up what I took away from this annual analyst software event, it is that IBM is focused on winning the hearts, minds, and dollars of the business leader looking for ways to innovate. That’s what Smarter Planet is about. Will IBM be able to juggle its place as a software leader with its push into business leadership? It is a complicated task that will take years to accomplish and even longer to assess.

Is cloud security really different than data center security?

October 30, 2009 7 comments

Almost every conversation I have had over the past year or so comes back to security in the cloud. Is it really secure? Or, we are thinking about implementing the cloud but we are worried about security. There are, of course, good reasons to plan a cloud security strategy. But in a sense, it is no different than planning a security strategy for your company. But it is the big scary cloud! Well, before I list the top issues I would like to say one thing: if you think you need an entirely different security strategy for the cloud, you may not have a comprehensive security strategy to start with. Yes, you have to make sure that your cloud provider has a sophisticated approach to security. However, what about your Internet service provider? What about the level of security within your own IT department? Can you throw stones if you live in a glass house (yes, that is a pun…sorry)? So, before you start fretting about security in the cloud, get your own house in order. Do you have an identity management plan? Do you ensure that no one individual within the data center can control all of the data within a single environment, to minimize risk? If you don’t have a well-executed internal security plan, you aren’t ready for the cloud. But let’s say that you have fixed that problem and you are ready to really plan your cloud security strategy. So, here are five of the issues to consider. If you have others, let’s start a conversation.


1. You need to start at the beginning with understanding the characteristics of your cloud provider. Is the company well funded? Is its data center designed with security at the center? Your level of scrutiny will also depend on how you are using the cloud. If you are using Infrastructure as a Service for a short term project there is less risk than if you are planning to use a cloud to store important customer data.

2. How is your cloud provider implementing security in a multi-tenant environment? How do they ensure that one customer’s data doesn’t impact another customer’s data?

3. Does your cloud provider give you the ability to monitor security of your data in the cloud? This will be important both for compliance and to keep track of your own security policies.

4. Does your cloud provider encrypt your critical data? If not, why not?

5. Does your cloud provider give you the ability to control who is allowed to access your information based on roles and authorization? Does the cloud provider support federated identity management? These are basic security best practices.
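The role-and-authorization question in item 5 can be pictured with a minimal sketch of role-based access control. The roles and permissions below are invented for illustration; a real deployment would pull them from an identity provider, which is where federated identity management comes in.

```python
# Map each role to the set of actions it may perform.
ROLE_PERMISSIONS = {
    "auditor":  {"read"},
    "operator": {"read", "write"},
    "admin":    {"read", "write", "grant"},
}

def is_allowed(role: str, action: str) -> bool:
    """Authorize an action by role; unknown roles get nothing."""
    return action in ROLE_PERMISSIONS.get(role, set())

# An auditor can inspect data but not change it:
assert is_allowed("auditor", "read")
assert not is_allowed("auditor", "write")
```

The design choice worth noticing is the deny-by-default posture: any role not explicitly granted an action is refused, which is also the principle behind the "no single individual controls everything" point above.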

Now you are probably saying to yourself that this isn’t rocket science. These are fundamental security approaches that any data center should follow. I recommend that you take a look at a great document published by the Cloud Security Alliance that details many of the key issues surrounding security in the cloud. So, I guess my principal message is that cloud security is no different than security in any data center. But the market does not seem to understand this because the perception is that a cloud is somehow not a data center that can be secured with regular old security. I think that we will see something interesting happen because of this perception: cloud vendors will begin to charge a premium for really good security. In fact, this is already happening. Vendors like Amazon and Salesforce are offering segregated implementations of their environments to customers who don’t trust their ordinary security approaches. This will work in the short term primarily because during this early phase of the cloud there is not enough focus on security. Long term, as the market matures, cloud vendors will have to demonstrate their ability to provide a secure environment based on basic security best practices. In the meantime, cloud vendors will rake in the cash for premium secure cloud services.

Ten things I learned about CA

May 5, 2008 Leave a comment

I spent part of last week at CA’s (Computer Associates in the old days) industry analyst meeting. My overall impression is very positive. CA is a complicated company with a complicated history. Often when a company has a near-death experience, it either dies or changes. I have seen many companies that wither away — even if they don’t die completely. CA seems to be one of the exceptions. While it is hard to translate two full days of discussions and interactions into a couple of hundred words, I will put it in context with the ten key things I learned.

One. A focused approach. CA has selected three areas of concentration: enterprise management, governance, and security. This is a far cry from past decades where CA focused on hundreds of markets with thousands of product offerings. CA still sells a boatload of mainframe products, but it has moved them under a separate business unit.

Two. Enterprise IT Management remains a linchpin offering. Management software has long been at the core of CA’s product offerings. The company has now divided its management portfolio into six discrete areas: service management, project and portfolio management (Clarity), application performance management (Wily), infrastructure management, security management, and datacenter automation (built on performance management and configuration management). Like IBM, CA is putting forth the idea that a customer can start with any one of these areas and then move to the next. Perhaps the fact that CEO John Swainson started life inside IBM had something to do with that change. CA is building a case that it is architecting these product areas with a common foundation. It is an ambitious goal but a necessary one.

Three. The mainframe is still a money maker. CA remains committed to the mainframe market. It is experiencing strong growth — especially with the introduction of the z10.

Four. Focus, focus, focus. CA is getting very pragmatic under the operational leadership of president and COO Mike Christenson. The company is focusing on its top 4,000 customers.

Five. Focus on systematizing governance. OK, so everyone is selling governance. CA is making good use of its Niku acquisition (reborn as Clarity) to become a player. In addition, the company is using some of its management technologies to support automation of governance. This is clearly an area where CA is investing.

Six. Security as core. The Netegrity acquisition has served CA well. It plays well in everything from SOA and governance to virtualization. Securing highly distributed environments never goes out of style. ID management is one of the key enablers across the portfolio.

Seven. Data Center Management is the most mature area of Enterprise Information Technology Management. I sat through a two-hour deep dive about CA’s datacenter management offering. This is a big area, not just for CA but for everyone in the management space. The “vision” is to provide a unified infrastructure management platform. The offerings range from traditional systems management to network management. The focus of the team seems to be on providing integration points between modules across the product line — an ambitious plan.

Eight. CA likes Virtualization. CA is focused on virtual systems management. They are working to integrate virtualization management into their own offerings as well as offerings from partners. CA’s focus is around packaging management as a service — an obvious requirement if you are going to be a player in virtualization.

Nine. Getting focused on business services. CA is focusing a lot of attention on the area of business service management. I liked the approach of having a formal policy-based automation engine. CA claims its differentiator is its ability to implement dynamic server provisioning.

Ten. CA does SOA. CA has been relatively quiet about SOA in the past. It was interesting that rather than producing a SOA product offering, CA is retooling its technology offerings as a set of SOA services with web services interfaces. Obviously, creating the interfaces is the easy part. But it is a step in the right direction.

The bottom line. CA is clearly a company on the move. It is living in a rough neighborhood with tough competitors. But I am impressed with some of the new thinking and some of the architectural approaches that are the foundation for the company’s product directions. CA has made the right acquisition moves that are paying off. Now, the proof will be in what acquisitions come next and the way CA will execute on the vision and directions.

Top Ten Predictions for 2005

February 23, 2005 Leave a comment

While 2004 started out with a whimper for the technology market, it ended with a sense that the momentum that had been missing from the market was finally beginning to take hold. Hurwitz & Associates predicts that the coming year will offer some interesting opportunities as well as challenges. Here are our top ten predictions:

1. Emerging Technologies

Leveraging emerging technologies in innovative ways to transform business will be the key driver in 2005. While many organizations are able to provide a predictable payback from technology acquisition with relative ease, the bar is being raised. The insightful companies are looking for technology to become a core competitive asset. For many industries technology innovation has the potential to transform business practice. We expect that this focus on technology as the foundation for business transformation will become the norm – not the exception. Traditional ROI methodologies will begin to be viewed as outdated. We expect the buying pattern to move away from cost-saving technology purchases toward technology that offers business opportunity.

2. Open Source

While the Open Source market will continue to expand at a rapid rate some customers will begin to experience problems because of poor, undocumented implementations executed by inexperienced contractors. Customers will begin to learn the hard way that all open source is not the same. Companies that provide verification and certification of open source offerings will gain major momentum in the market. More software companies will continue to try to regain market momentum by putting their crown jewels in the open source arena. We predict that many of these efforts will be viewed skeptically and will not be commercially successful.

3. Data Quality

Quality in general will become a massive issue in 2005. This crisis will extend both to data quality and to software quality in general. Data quality, traditionally viewed as a back-office function, will begin to emerge as a major crisis in organizations. Studies are showing that few managers have confidence in the quality of their organization’s data. With tough regulations (Sarbanes-Oxley, etc.) bearing down on companies, the quality of data becomes a front-office issue. We anticipate that predictable data quality will become a battle cry for many CIOs and their bosses in the coming months. As software becomes the personification of the company, software quality moves out of the isolated QA department.

4. Integration

How organizations are able to manage their information across departments and across organizational boundaries will be one of the hottest markets in 2005. While many software companies are beginning to leap into this emerging market, most will fail to gain critical mass. Customers will want to buy a well integrated package from a highly trusted source. Therefore, we expect to see many more acquisitions in this market. Those weaker players who are not acquired will go out of business.

5. IT Security

Security is moving from an applications play to an infrastructure play. Today there are thousands of small security companies focused on small pieces of a bigger puzzle. We expect that companies like IBM, CA, HP, Symantec, Novell, and BMC will position themselves for leadership by providing a consolidated set of offerings at both the infrastructure and the application level.

6. IT Security Innovation

Innovation in security will be driven by the need to anticipate problems before they materialize rather than having to react to threats – a move to real-time and away from reactive security. We anticipate the real action in startups will be in this area.

7. Linux

The Linux operating system will continue to gain significant market share at the expense of traditional Unix and Microsoft platforms. Innovative emerging software vendors are increasingly selecting Linux as their platform in order to compete with larger, more established players. The net effect will be a renaissance in innovative applications that do not need the same funding to approach the market. This will have dramatic implications for the SMB market. The barriers to entry are indeed being broken.

8. Middleware

The definition of middleware will change in 2005. We anticipate that what had been viewed as industry specific packaged software will begin to be seen as corporate infrastructure and middleware. This has the potential to change the balance of power in the market. Oracle’s acquisition of PeopleSoft will start an avalanche of acquisitions by unexpected players who have not been in the packaged software market.

9. Software As A Service

This is the year that software as a service will become the norm. Increasingly, we are seeing customers accept that software can and should be bought as a service rather than in a perpetual license mode. This model will change the dynamics of software: it will be much easier for companies to walk away from their vendor if they become dissatisfied.

10. Software License Management

Being able to more easily manage software licenses will become a major market factor in 2005. Until now, it has been difficult for large organizations to know what software is installed on desktops and laptops throughout their organization. Activity will be driven by a combination of increasingly tight budgets, regulatory demands for accuracy in software fees, and security concerns. Increasingly, companies are unwilling to pay for software licenses that users do not access and do not need.