Archive

Archive for the ‘Web 2.0’ Category

Five things I learned at IBM’s Rational Conference

June 9, 2009 3 comments

I haven’t been to IBM’s Rational Conference in a couple of years, so I was very interested to see not just what IBM had to say about the changing landscape of software development but how the customers attending the conference had changed. I was not disappointed. While I could write a whole book on the changes happening in software development (but I have enough problems), I thought I would mention some of the aspects of the conference that I found noteworthy.

One. Rational is moving from being a tools company to offering a software development platform. Rational has always been a complex organization to understand since it has evolved and changed so much over the years. The organization now seems to have found its focus.

Two. More management, fewer low-level developers. In the old days, conferences like this would be dominated by programmers. While there were many developers in attendance, I found that there were also a lot of upper-level managers. For example, I sat at lunch with one CIO who was in the process of moving to a sophisticated service oriented architecture. Another person at my table was a manager looking to update his company’s current development platforms. Still another individual was a customer of one of the companies that IBM had purchased, looking to understand how to implement new capabilities added since the acquisition.

Three. Rational has changed dramatically through acquisitions. Rational is a tale of acquisitions. Rational Software, the linchpin of IBM’s software development division, was itself a combination of many acquisitions. Rational, before being bought by IBM in 2002 for $2.1 billion, had acquired an impressive array of companies including Requisite, SQA, Performance Awareness, Pure Atria, and ObjectTime Ltd. After a period of absorption, IBM started acquiring more assets. BuildForge (build and release management) was purchased in 2006; Watchfire (Web application security vulnerability and compliance testing software) was bought in 2007; and Telelogic (requirements management) was purchased in 2008.

It has taken IBM a while both to absorb all of the acquisitions and to create a unified architecture so that these software products could share components and interoperate. While IBM is not done, under the leadership of General Manager Danny Sabbah, Rational has made the transition from being a tools company to becoming a platform for managing software complexity. It is a work in progress.

Four. It’s all about Jazz. Jazz, IBM’s collaboration platform, was a major focus of the conference. Jazz is an architecture intended to integrate data and function. Jazz’s foundation is the REST architectural style, and therefore it is well positioned for use in Web 2.0 applications. What is most important is that IBM is bringing all of its Rational technology under this model. Over the next few years, we can expect to see this framework under all of Rational’s products.
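To make the REST point concrete, here is a tiny Python sketch of the style Jazz is betting on: every artifact gets a stable URI and a plain JSON representation that any HTTP client can consume. The endpoint path and field names here are my own invention, not the actual Jazz API.

```python
import json

# Toy in-memory "resource store" standing in for a Jazz-style REST service.
# (The /workitems path and field names are hypothetical, for illustration only.)
RESOURCES = {
    "/workitems/42": {"id": 42, "summary": "Fix login defect", "state": "open"},
}

def rest_get(path):
    """Handle a REST GET: every artifact is addressable by a stable URI."""
    resource = RESOURCES.get(path)
    if resource is None:
        return 404, None
    # Representations travel as plain JSON, so any Web 2.0 client can consume them.
    return 200, json.dumps(resource)

status, body = rest_get("/workitems/42")
print(status)                     # 200
print(json.loads(body)["state"])  # open
```

The point of the exercise: because the interface is just URIs plus standard representations, a mashup, a dashboard, or another tool can integrate without knowing anything about the product behind the service.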

Five. Rational doesn’t stand alone. It is easy to focus on the Rational portfolio alone (which could take a while). But what I found quite interesting was the emphasis on the intersection between the Rational platform and Tivoli’s management services, as well as WebSphere’s Service Oriented Architecture offerings. Rational also made a point of highlighting the use of collaboration elements provided by the Lotus division. Cloud computing was also a major focus of discussion at the event. While many customers at the event are evaluating the potential of using various Rational products in the cloud, it is still early. The one area where IBM seems to have hit a home run is its CloudBurst appliance, which is intended to create and manage virtual images. Rational is also beginning to deliver its testing offerings as cloud-based services. One of the most interesting elements of its approach is to use tokens as a licensing model. In other words, customers purchase a set number of tokens, or virtual licenses, that can be redeemed for services and are not tied to a specific project or product.
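To show why the token model is interesting, here is a little sketch of the accounting involved: one generic pool, drawn down by whatever service the customer happens to need. The service names and token costs are invented for illustration; IBM’s actual catalog and pricing will differ.

```python
# Hypothetical token costs per cloud service (not IBM's real pricing).
TOKEN_COST = {"performance_test": 5, "security_scan": 8}

class TokenPool:
    """Tracks a customer's balance of virtual licenses."""
    def __init__(self, tokens):
        self.tokens = tokens

    def redeem(self, service):
        # Any service draws from the same pool -- no per-product license.
        cost = TOKEN_COST[service]
        if cost > self.tokens:
            raise ValueError("insufficient tokens")
        self.tokens -= cost
        return self.tokens  # remaining balance

pool = TokenPool(20)
print(pool.redeem("performance_test"))  # 15
print(pool.redeem("security_scan"))     # 7
```

The appeal for the customer is visible even in this toy version: spend commitment is decoupled from any single project, so unused capacity on one effort can be shifted to another.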

Can Microsoft Pull Virtualization, SOA, Management, and SaaS Together?

June 17, 2008 5 comments

For three years in a row I have attended Microsoft’s server and tools analyst briefing. This is the vision of Microsoft that focuses on the server side of the company. A few years ago I predicted that this part of the company would get my vote in terms of growth and potential. I stand by my position. While Microsoft’s desktop division is suffering through a mid-life crisis, the server side is flexing its muscles. The transition toward power on the enterprise side is complicated for Microsoft. The challenge facing Microsoft is how to make the transition from its traditional role as champion and leader of the programmer to a leader in the next generation of distributed computing infrastructure. If Microsoft can make this transition in a coherent way, it could emerge in an extremely powerful position.

So, I will provide what I think are the five most important opportunities that the server and tools division of Microsoft is focused on.


Opportunity One. Virtualization as a foundation. The greatest opportunity, ironically, is also the greatest threat. If customers decide to virtualize rather than buy individual licenses, Microsoft could suffer – especially in the desktop arena. At the same time, Microsoft clearly sees the benefits of becoming a leader in virtualization. Therefore, virtualization is becoming the focus of the next generation of computing infrastructure, both on the server and the desktop. Microsoft is making many investments in virtualization including the desktop, the hypervisor, the applications, the operating system, graphics, and overall management (including identity management). One smart move Microsoft has made is its investment in its own hypervisor, due out soon as Hyper-V. Rather than offering Hyper-V as a standalone product, Microsoft is adding the hypervisor into the fabric of its server platform. This is a pragmatic and forward-thinking approach. If I were an independent hypervisor vendor I would hit the road right about now. Microsoft’s philosophy around enterprise computing is clear: unified and virtualized.

Microsoft’s management believes that within five to ten years all servers will be virtualized. To me this sounds like a logical assumption, both in terms of manageability and power consumption. So, how does Microsoft gain supremacy in this market? Clearly, it understands that it has to take on the market leader: VMware. It hopes to do this in two ways: providing overall management of the virtualization framework (including managing VMware) and through its partnership with Citrix. There was a lot of buzz for a while that Microsoft would buy Citrix. I don’t think so. The relationship is advantageous to both companies, so I expect that Microsoft will enjoy the revenue and Citrix will enjoy the benefits of Microsoft’s market clout.

Microsoft has been on an acquisition binge in the virtualization market. While these deals haven’t created the buzz of the attempted Yahoo acquisition, they are important pieces of the new strategy. Investments include Kidaro for desktop virtualization management, which sits on Virtual PC and is intended to provide application compatibility on the virtual desktop. Another investment, Calista Technologies, provides graphics virtualization that offers the full Vista experience on the remote desktop. Last year Microsoft purchased Softricity, which offers application virtualization and OS streaming. Microsoft has said that it has sold 6.5 million Softricity seats (priced at $3.00 per copy). Now, add in Hyper-V and the identity management offerings and things get very interesting.

One of the smartest things that Microsoft is doing is positioning virtualization within the context of a management framework. In fact, in my view, virtualization is simply not viable without management. Microsoft has positioned its portfolio of virtualization offerings within a management framework (System Center) for managing both the physical and virtual environments for customers.

Opportunity Two. Managing a combined physical and virtual world. Since Microsoft came out with SMS in the 1990s, it has wanted to find a way to gain a leadership role in management software. It has been a complex journey and is still a work in progress. It is indeed a time of transition for Microsoft. The container for its management approach is System Center. Today with System Center, Microsoft has its sights on managing not only Windows systems but also a customer’s heterogeneous environment. Within that environment Microsoft has included identity management (leveraging Active Directory as the management framework, including provisioning and certificate management). This is one area where Microsoft seems to be embracing heterogeneity in a big way. Like many of the infrastructure leaders it competes with, Microsoft’s leaders are talking about the ability to create a management framework that is “state aware” so that the overall environment is more easily self-managed. Microsoft envisions a world where, through virtualization, there is basically a pool of resources that can be managed based on business policies and service levels. They talked a lot about automating the management of resources. Good thinking, but certainly not unique.

Microsoft is making a significant investment in management – especially in areas such as virtualization and virtual machine management. More importantly, through its Xen-based connections (via Citrix), Microsoft will offer connectors to other systems management platforms such as IBM’s Tivoli and HP’s OpenView. That means that Microsoft has ambitions to manage large-scale data centers. Microsoft is also building its own data centers that will be the foundation for its cloud offerings.

Opportunity Three. Creating the next generation dynamic platform. Every company I talk to lately is looking to own the next generation dynamic computing platform. This platform will be the foundation for the evolution of Service Oriented Architectures, social networks, and software as a service. But, obviously, this is complicated, especially if you assume that you want to achieve ubiquitous integration between services that don’t know each other. Microsoft’s approach to this (they call it Oslo) is based on a modeling language. Microsoft understands that achieving this nirvana requires a way to establish context. The world we live in is a web of relationships. Somehow in real life we humans are able to take tiny cues and construct a world view. Unfortunately, computers are not so bright. So, Microsoft is attacking this problem by developing a semantic language that will be the foundation for a model-based view of the world. Microsoft intends to leverage its network of developers to make this language-based approach the focal point of a new way of creating modular services that can dynamically change based on context.

This is indeed an interesting approach. It is also a bottom-up approach to the problem of semantic modeling. While Microsoft does have a lot of developers who will want to leverage this emerging technology, I am concerned that a bottom-up approach alone could be problematic. It must be combined with a top-down approach if it is to be successful.

Opportunity Four. Software as a Service Plus. I always thought that Microsoft envied AOL in the old days, when AOL could get customers to pay per month while Microsoft sold perpetual licenses that might not be upgraded for years. Microsoft is trying to build a case that customers really want a hybrid environment, so they can use an application on premise and then enable their mobile users to use the same capability as a service. Therefore, when Microsoft compares itself to companies like Salesforce.com, NetSuite, and Zoho, it feels it has a strategic advantage because it has full capabilities whether online or offline. But Microsoft is taking this further by taking services such as Exchange and offering them as a service. This will be primarily focused on the SMB market and on remote departments of large companies.

This is only the beginning, from what I am seeing. Live Mesh, announced in April, is a services-based web platform that helps developers with context over the web. Silverlight, also announced this spring, is intended as a Web 2.0 platform. Microsoft is taking these offerings plus others such as Virtual Earth, SQL Server Data Services, cloud-based storage, and BizTalk Services and offering them as components in a service platform – both on its own and with its partners.

Opportunity Five. Microsoft revs up SOA. Microsoft has been slow to get on the SOA bandwagon. But it is starting to make some progress as it readies its registry/repository. This new offering will be built on top of SQL Server and will include a UDDI version 3 service registry. For Master Data Management (MDM) – the single view of the customer – Microsoft will create an offering based on SQL Server. It also views SharePoint as a focal point for MDM. It intends to build an entity data model to support its MDM strategy.

While Microsoft has many of the building blocks it needs to create a Service Oriented Architecture strategy, the company still has a way to go. This is especially true in how the company creates a SOA framework so that customers know how to leverage its technology to move through the life cycle. Microsoft is beginning to talk a lot about business process, including laying a common foundation for service interoperability by supporting key standards such as WS-* and its own Windows Communication Foundation services.

The real problem is not in the component parts but in the integration of those parts into a cohesive architectural foundation that customers can understand and work with. Also, Microsoft still lacks the in-depth business knowledge that customers are looking for; it relies on its integration partners to provide the industry knowledge.

The bottom line
Microsoft has made tremendous progress over the past five years in coming to terms with new models of computing that are not client or server centric but dynamic. I perceive that the thinking is going in the right direction. Bringing together process thinking, virtualization, management, federated infrastructure, and software as a service is all the right stuff. The question will be whether Microsoft can put all the pieces together in a way that doesn’t just rely on its traditional base of developers to move it forward to the next generation. Microsoft has a unique opportunity to take its traditional customer base of programmers and move them to a new level of knowledge so they can participate in its vision of Dynamic IT.

When not to salvage the legacy application

March 12, 2008 5 comments

One of the hardest things for organizations to do is to retire old applications. Unlike hardware, which tends to be replaced on a regular cycle, old software sticks around way too long. It definitely overstays its welcome. I remember working at John Hancock decades ago and watching departments struggle to replace aging systems. While they were ready and willing to make the change, they often didn’t know precisely how these old systems worked. The developers never documented what they wrote, and those people had retired years earlier.

Now you would think that the problem had gone away. In reality, the problem got worse with the advent of client/server computing, where there was less structure applied to the development process. I came across a very old article I wrote back in 1996 that talked about a lot of these issues (please ignore the picture). Just when you thought it couldn’t get any worse, web-based development came along. Instead of a few hundred developers, the web brought thousands of developers, all providing changes and updates to applications. We are now at a unique crossroads.

While we still have many aging applications that cannot be easily updated, we also have the need to move to Web 2.0 to create Rich Internet applications (RIA). Web 2.0 offers a way to dramatically transform the user experience. Organizations are looking to this approach to development to make access to knowledge and information much more immediate and intuitive than ever before. But the transition isn’t easy.

I got to thinking a lot about the transition from client/server applications and old web-based applications when I met with Nexaweb a few weeks ago. The company has been around since 2000 and specializes in the Web 2.0 space. While there has been a lot of hype around Web 2.0, it actually is a very pragmatic technology infrastructure. A lot of customers assume that you can just approach Web 2.0 as though it were a simple web application; the reality is quite different. In fact, good Web 2.0 applications have to be well architected. What I liked about what Nexaweb is doing is its approach to application modernization with a Web 2.0 spin. In essence, Nexaweb focuses on modernizing aging client/server applications by providing tooling that documents the existing code, identifies bad code, and generates a model driven architecture. Like any good consulting organization, Nexaweb has leveraged the best practices it used to help its consulting clients move old applications to Web 2.0, and now sells them as a set of productivity tools that generate a model driven architecture and code as part of the process. The company claims that it can reduce the cost of transforming old code by as much as 70 percent.

The new product, Nexaweb Enterprise Web Suite, includes a UML modeling tool and a reporting tool that identifies repetitive processes and code that is no longer used. Clearly, Nexaweb isn’t the only company taking advantage of modeling tools and an architectural approach. But the fact that the company is focused on helping companies transform their aging client/server applications into a modular, service oriented approach is a step forward. It is one of a set of companies focused on not just updating applications but transforming them into Web 2.0 applications. What stands out is the fact that Nexaweb seems to be combining application transformation with business services (can you say Service Oriented Architectures?). However, I must add that IBM has been on this track for quite a few years. Through its industry models, IBM has been helping companies transform their aging applications into industry-specific business services. In addition, Microsoft’s Silverlight and Adobe’s AIR are adding a new level of sophistication to the momentum. WaveMaker, which I discussed in an earlier entry, is making a contribution as well.
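To give a flavor of what “identifying code that is no longer used” involves, here is a toy Python sketch of the underlying idea: reachability analysis over a call graph. The function names are invented, and a commercial suite like Nexaweb’s is obviously far more sophisticated than this.

```python
# Toy call graph for a hypothetical client/server application:
# each function maps to the functions it calls.
CALL_GRAPH = {
    "main":          ["load_customer", "render_screen"],
    "load_customer": ["query_db"],
    "render_screen": [],
    "query_db":      [],
    "old_report":    ["query_db"],  # nothing calls old_report any more
}

def unreachable(graph, entry_points):
    """Return functions that can never be reached from the entry points."""
    seen, stack = set(), list(entry_points)
    while stack:
        fn = stack.pop()
        if fn not in seen:
            seen.add(fn)
            stack.extend(graph.get(fn, []))  # follow every outgoing call
    return sorted(set(graph) - seen)

print(unreachable(CALL_GRAPH, ["main"]))  # ['old_report']
```

Even this crude version shows why such reports matter for modernization: dead code that nobody dares delete is exactly what makes legacy applications expensive to transform.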

The trend is clear, and it is good for customers. We are finally seeing software companies provide a path for moving code into a new world based on reusable, modular, well-architected services. The next stage in the movement toward a service oriented architecture is applying this approach to the new generation of Web 2.0. Let me add a disclaimer — this isn’t magic. None of these approaches or tools is automatic. There are plenty of promises of easy solutions to hard problems, and there are solutions and tools that take the drudgery out of leaving legacy applications behind. They give customers a head start, but there is worthwhile hard work that really has to be done. The alternative is to hold your breath and hope that things don’t break too quickly.

Top 10 Predictions: Innovation, ROI, Cloud Computing and more…

December 21, 2007 2 comments

I love the end of the year. I get to sneak out of the office for a few days and stay off of airplanes. I also have a chance to look ahead to the new year. I like making predictions. Sometimes, I am years ahead of the market; other times I am able to hit the nail on the head. So, for what it is worth, here are my top ten predictions for 2008 (Hey, how did that happen? What happened to 2007? I thought it just started!)

1. There will be two hot buzzwords this year: innovation and ROI. Companies want to find ways to leverage the technology they have invested in, to do things in totally new ways. At the same time, companies are nervous about investing in technology. They want assurances that there will be a return on their investment — quickly. So, you will see a lot of discussion of both issues. But here is one prediction that I guarantee: most of the proof about innovation and ROI will be fluffy and devoid of any real meat!

2. Here come the clouds! I think that cloud computing, one of the latest versions of virtualization, will become one of the hottest trends of 2008. Any infrastructure company you can name will come up with a cloud computing strategy. No single leader will emerge in 2008 but you won’t be able to move without bumping into the hype.

3. Software as a Service goes mainstream. Sure, Salesforce.com has been the industry darling over the past few years. There can be no doubt that Salesforce CEO Marc Benioff’s imaginative adventure hit the bull’s-eye. But I expect that in 2008 there will be numerous mainstream, innovative approaches to Software as a Service. We already saw SAP announce SAP Business ByDesign as its entry into the SaaS market. Expect a lot more from mainstream players. Now add a social networking twist and things really get interesting.

4. The world gets more virtual. VMware’s spectacular IPO made the rest of the market wake up and smell the roses. Maybe there is money in this virtualization stuff after all. There will be three virtualization market segments: client, server, and application. I can’t decide which one I think is most important. How about all three!

5. More vendors will make more acquisitions (that’s another one you can take to the bank). Yes, Oracle will certainly make more acquisitions, but I don’t think that BEA will be in the mix. Nor will HP buy BEA. However, I do predict that BEA will probably go private. I predict that HP will buy more software companies, especially in the data management area. IBM will continue its buying, especially in software — more companies in what it calls information management, more in systems management, and more in the collaboration space. I expect to see more action from EMC as well, primarily in management and security. The list is too long for this entry, but stay tuned, it is going to be a very, very busy year.

6. So, I didn’t mention Microsoft yet. This is the year when Microsoft’s server/enterprise business will get the respect it deserves. Therefore, I expect to see Microsoft continue to make small but strategic acquisitions that will fit into the forthcoming Oslo strategy. I would expect to see Microsoft look for information management picks (among others). However, I don’t expect that Microsoft will be buying big, traditional software companies. I expect that Microsoft will make interesting acquisitions in web collaboration, social networking, and advertising.

7. Online goes off-line. Companies like Zoho are starting to gain traction because they can provide both online services combined with offline usage. Being able to continue working when you can’t get connectivity is the tipping point for these collaboration offerings to challenge Microsoft in the office and collaboration space.

8. This is the year that Service Oriented Architecture (SOA) moves from IT strategy to business strategy. Therefore, SOA will officially move out of the hype cycle and into the mainstream. CEOs and CIOs have bought into the importance of consistent, business oriented services. Therefore, expect that customers will get down to the serious business of moving out of pilots and into slow, deliberate implementations. This doesn’t make for splashy headlines, but it does make business sense.

9. Google will continue to move into any market that leverages the advertising revenue model — including collaboration software and various cloud computing options. No surprise there. I do not expect that Google will make a bid for traditional enterprise applications. I do expect to see a strengthening partnership with IBM.

10. Partner ecosystems will reach a new level of intensity this year. Enterprise software leaders will be working hard to make sure the most popular emerging players support their platforms. They will be joined in the mix by Software as a Service players who are trying to build up their arsenal of partners. Emerging players will live or die by their ability to sign the best partnerships. At the same time, enterprise software leaders are upping the requirements for participation. The bottom line is: what’s in it for me?

11. I know I promised 10 predictions but I have to add one more. There will be at least a few trends that will come out of the blue. But that is what makes things interesting!

 

Can developers use high level tools without lock in?

December 3, 2007 2 comments

Last week I wrote about WaveMaker and its new Web 2.0 development environment, comparing it to PowerBuilder. One of the big problems someone pointed out to me is that PowerBuilder was proprietary. Because old software never dies, IT organizations are still coping with old PowerBuilder applications that they have to support. Does this mean that we should avoid all development environments that hide the complexity of traditional programming? In my view that would be unrealistic. While there are some very talented developers in the world who can do quick magic with Java, C#, C++, and the like, the majority of developers need abstracted tools to do the projects the business needs on a moment-to-moment basis.

I remember many years ago when I spoke at a conference of developers. I told them (I think this was in the early 1990s) that whatever tools they were using would be obsolete in a few years. These guys were outraged. They did not believe me.

I think the reality is that organizations need to use emerging development environments that offer innovation, ease of use, and sophistication that is out of reach with traditional programming languages. It is best if these emerging Web 2.0 tools can generate standard programming code (although I am skeptical that the generated code will capture the nuances that these tools provide). The reality is that IT organizations should understand that fads and tools will come and go. Change is a reality. Here’s a radical idea — how about retiring old code before it becomes a burden to the business?

Is WaveMaker the Web 2.0 version of PowerBuilder?

November 30, 2007 3 comments

Some meetings are just fun (I can’t always say that... sometimes I just want to run away and hide under my desk). But my meeting today with WaveMaker reminded me of the type of meetings I had in the dot-com days. I admit I was excited about what I heard. Now a disclaimer — I haven’t taken a look at the technology itself, and I haven’t reviewed their architecture. But I can tell you what I heard and what I liked.

This is not a brand new company. It just changed its name from ActiveGrid (a great name if you are running a data grid company, but awful if you are a Web 2.0 development environment for the enterprise).

A lot of the corporations I have worked with and talked to about their SOA strategies are frustrated by shadow development groups that bring in cool tools without any knowledge or approval from IT. This problem is as old as IT itself. I remember when I worked at John Hancock in the 1980s, it was common for the actuaries to sneak PCs into their department so they could get something done, because IT was backlogged.

The problem has gotten a lot worse since those days. Now, when a department brings in its own development tools and technology, it can cause massive security breaches, because these innocent projects happen to touch corporate data and cause decisions to be made out of context. Yet it continues to happen. It got worse in the Internet days and it is getting worse in the Web 2.0 days — we are all developers, and we can use the web to do anything we need for our businesses.

So, what does the newly named WaveMaker do? It provides what it calls the WaveMaker Visual Assembly Studio. It is a service-based approach to development: everything within the environment is a service. It has web services interfaces so that an organization can default to the corporate authentication and authorization model. The studio generates a pure Java application.

The company comes out of the open source world — it offers a free version of its development studio to be used for testing purposes and supports an open source developer community at dev.wavemaker.com. However, it also sells its framework.

The company is really just getting off the ground but has about six customers, including Macy.com, Brunswick Bowling, National Citibank in Cincinnati, Pioneer Energy, and American Express. Not a bad start. CEO Chris Keene told me that the reason the company was able to sell to those companies is that the development environment gives the business user Web 2.0-style interactivity and graphical development while IT keeps control of the computing infrastructure.

What was most interesting to me is the connection that WaveMaker is making to PowerSoft and its PowerBuilder platform — the company that transformed the graphical development process. This was a company that I knew quite well. I tortured the management team when I labeled PowerBuilder the poster child of the Fat Client Syndrome — a term I coined in the early 90s. It is interesting to note that Mitch Kertzman, one of the founders of PowerSoft, is an investor in WaveMaker. If the company follows Kertzman’s lead of creating graphical development for the masses of COBOL developers, it might be on to a good thing.

But, of course, WaveMaker isn’t alone. Companies like Microsoft with Silverlight and Adobe with Flex are in the market, along with a host of new players such as Nexaweb, Jackbe, and Kapow — I wrote about these companies in my January 20th entry.

What may be different about WaveMaker is its focus on the connection between the freewheeling Web 2.0 world and the structured world of enterprise IT.

Why are Web 2.0 vendors dreaming about PowerSoft?

January 20, 2007 1 comment

Recently I have been having déjà vu back to the days of PowerSoft. If you are old enough to remember, PowerSoft was the leader in making graphical development practical for the masses—rather than just the object oriented gurus. Back in the early 1990s, when PowerSoft’s product—called PowerBuilder—was in its heyday, it had been able to achieve dominance over arch rival Gupta Technologies and a myriad of other long-forgotten competitors. Ironically, at the time, Gupta had a much more sophisticated object oriented environment than PowerBuilder. But PowerBuilder was able to achieve leadership because the company found a way to make the traditional COBOL developer (and there were lots of them) very successful as a graphical software designer. The secret was that while PowerBuilder professed to be an object oriented graphical development environment, it was actually a procedural environment that was familiar to the COBOL developer. Therefore, the skills that had made this generation of developers successful in an earlier era provided the platform for a new career path in client/server development. PowerBuilder took the market by storm and set the path for the early success of client/server computing.

Now, fast forward to today and the advent of Web 2.0. I am seeing lots of interesting tools such as Nexaweb, Jackbe, and Kapow. All these companies have a common strategy: they want to become the PowerBuilder of this new generation of application development environments. Creating a rich, collaborative environment requires a level of sophistication that would otherwise prohibit less technical developers from participating. Therefore, just as PowerBuilder provided a way for the masses to create first generation graphical applications, this next generation of development tools will bring Web 2.0 to a broad audience. These web development environments provide the dynamic, stateful approach needed to create Web 2.0 environments.

I think that this movement toward Web 2.0, and the abstracted tools to support it, will complete the picture of a service oriented architecture. The Web 2.0 environments will make the browser a full-fledged participant in enterprise computing. Over time, we’ll see lots of business people creating compelling business services this way: innovative, collaborative software that provides a rich client environment with sophisticated communications, as well as a stateful distributed computing platform. This is not an easy feat, but it is one that some innovative players are going to pull off to become the PowerBuilder of the Web 2.0 set.