Archive

Archive for the ‘Master Data Management’ Category

Predictions for 2011: getting ready to compete in real time

December 1, 2010 3 comments

2010 was a transition year for the tech sector. It was the year when cloud suddenly began to look realistic to the large companies that had scorned it. It was the year when social media suddenly became serious business. And it was the year when hardware and software were united as a platform – something like in the old mainframe days – but different because of high-level interfaces and modularity. Important trends also began to emerge, like the importance of managing information both across the enterprise and among partners and suppliers. Competition for ownership of the enterprise software ecosystem heated up, as did competition for leadership of the emerging cloud computing ecosystem.

So, what do I predict for this coming year? While at the outset it might look like 2011 will be a continuation of what has been happening this year, I think there will be some important changes that will impact the world of enterprise software for the rest of the decade.

First, I think it is going to be a very big year for acquisitions. I have said that before and I will say it again. The software market is consolidating around major players that need to fill out their software infrastructure in order to compete. It will come as no surprise if HP begins to purchase software companies; it will have to if it intends to compete with IBM and Oracle on the software front. But IBM, Oracle, SAP, and Microsoft will not sit still either. All of these companies will purchase the incremental technology companies they need to compete and to expand their share of wallet with their customers.

This will be a transitional year for up-and-coming players like Google, Amazon, Netflix, Salesforce.com, and others that haven’t hit the radar yet. These companies are plotting their own strategies to gain leadership and will continue to push the boundaries in search of dominance. As they push upmarket and grab market share, they will face the familiar problem of supporting customers who will expect them to act like adults.

Customer support, in fact, will bubble to the top of the issues for emerging as well as established companies in the enterprise space – especially as cloud computing becomes a well-established distribution and delivery platform for computing. All of these companies, whether well established or startups, will have to balance the requirement to provide sophisticated customer support with the need to make a profit. This will impact everything from license and maintenance revenue to how companies charge for consulting and support services.

But what will customers be looking for in 2011? Customers are always looking to reduce their IT expenses – that is a given. However, the major change in 2011 will be the need to innovate based on customer-facing initiatives. Of course, the idea of focusing on customer-facing software itself isn’t new, but there are some subtle changes. The new initiatives are based on leveraging social networking from a secure perspective, both to drive business traffic and to anticipate customer needs and issues before they become problems. Companies will spend money innovating on customer relationships.

Cloud computing is the other big issue for 2011. While it was clearly a major differentiator in 2010, the cloud will take an important leap forward in 2011. While companies were testing the water this year, next year companies will be looking at best practices in cloud computing. 2011 will be the year when customers focus on three key issues: data integration across public clouds, private clouds, and data centers; manageability in terms of workload optimization; and security and overall performance. The vendors that can demonstrate that they provide the right level of service across cloud-based services will win significant business. These vendors will increasingly focus on expanding their partner ecosystems as a way to lock customers into their cloud platforms.

Most importantly, 2011 will be the year of analytics. The technology industry continues to produce data at a pace never seen before. But what can we do with this data? What does it mean for organizations’ ability to make better business decisions and to prepare for an unpredictable future? The traditional warehouse is simply too slow to be effective. 2011 will be the year when predictive analytics and information management overall emerge as among the hottest and most important initiatives.

Now I know that we all like lists, so I will take what I’ve just said and put them into my top ten predictions:

1. Both today’s market leaders and upstarts are going to continue to acquire assets to become more competitive. Many emerging startups will be scooped up before they see the light of day. At the same time, almost as many startups will emerge as we saw in the dot-com era.

2. Hardware will continue to evolve in a new way. The market will move away from hardware as a commodity. The hardware platform in 2011 will be differentiated based on software and packaging. 2011 will be the year of smart hardware packaged with enterprise software, often as appliances.

3. Cloud computing models will put extreme pressure on everything from software license and maintenance pricing to customer support. Integration between different cloud computing models will be front and center. The cloud model is moving out of risk-averse pilots into serious deployments. Best practices will emerge as a major issue for customers that see the cloud as a way to boost innovation and the rate of change.

4. Managing highly distributed services in a compliant and predictable manner will take center stage. Service management and service level agreements across cloud and on-premises environments will become a prerequisite for buyers.

5. Security software will be redefined based on the challenges of customer-facing initiatives and the need to more aggressively open the corporate environment to support a constantly morphing relationship with customers, partners, and suppliers.

6. The fear of lock in will reach a fever pitch in 2011. SaaS vendors will increasingly add functionality to tighten their grip on customers.  Traditional vendors will purchase more of the components to support the lifecycle needs of customers.  How can everything be integrated from a business process and data integration standpoint and still allow for portability? Today, the answers are not there.

7. The definition of an application is changing. The traditional view of the packaged application as hermetically sealed is going away. More of the new packaged applications will be built on service orientation and best practices. These applications will be parameter-driven so that they can be changed in real time. And yes, Service Oriented Architecture (SOA) didn’t die after all.

8. Social networking grows up and will become business social networking. These initiatives will be driven by line-of-business executives as a way to engage with customers and employees, gain insight into trends, and fix problems before they become widespread. Companies will leverage social networking to enhance agility and support new business models.

9. Managing endpoints will be one of the key technology drivers in 2011. Smartphones, sensors, and tablet computers are redefining what computing means. This will drive the requirement for a new approach to role- and process-based security.

10. Data management and predictive analytics will explode based on both the need to understand traditional information and the need to manage data coming from new sales and communications channels.

The bottom line is that 2011 will be the year when the seeds planted over the last few years are ready to become the drivers of a new generation of innovation and business change. Put together everything from the flexibility of service orientation and business process management innovation to the widespread impact of social and collaborative networks and the new delivery and deployment models of the cloud. Now apply the tools to harness these environments, like service management, new security platforms, and analytics. From my view, innovative companies are grabbing the threads of technology and focusing on outcomes. 2011 is going to be an important transition year. The corporations that get this right and transform themselves so that they are ready to change on a dime can win – even if they are smaller than their competitors.

Can Informatica earn a place at the head table?

February 22, 2010 Leave a comment

Informatica might be thought of as the last independent data management company standing. In fact, that used to be Informatica’s main positioning in the market. That has begun to change over the last few years as Informatica has continued to make strategic acquisitions. Over the past two years Informatica has purchased five companies — the most recent was Siperian, a significant player in Master Data Management solutions. These acquisitions have paid off. Today Informatica has passed the $500 million revenue mark with about 4,000 customers. It has deepened its strategic partnerships with HP, Accenture, Salesforce.com, and MicroStrategy. In a nutshell, Informatica has made the transition from a focus on ETL (Extract, Transform, Load) tools to support data warehouses to a company focused broadly on managing information. Merv Adrian did a great job of providing context for Informatica’s strategy and acquisitions. To transition itself in the market, Informatica has set its sights on data service management — a culmination of data integration, master data management, data transformation, and predictive analytics, delivered holistically across departments, divisions, and business partners.

In essence, Informatica is trying to position itself as a leading manager of data across its customers’ ecosystems. This requires a way to have consistent data definitions across silos (master data management), ways to trust the integrity of that data (data cleansing), event processing, predictive analytics, integration tools to move and transform data, and the ability to prove that governance can be verified (data governance). Through its acquisitions, Informatica is working to put these pieces together. However, as a relatively small player living in a tough neighborhood (Oracle, IBM, SAS Institute, etc.), it will have a difficult journey. This is one of the reasons that Informatica is putting so much emphasis on its new partner marketplace. A partner network can really help a smaller player appear and act bigger.
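To make the “consistent data definitions across silos” idea concrete, here is a minimal sketch in Python of how a master data tool might merge one customer’s records from two departmental silos into a single “golden record.” The field names, the sample data, and the survivorship rule (earlier sources win) are all my own illustration, not Informatica’s implementation:

```python
# Hypothetical silos: the same customer, described differently by two systems.
CRM = {"cust-001": {"name": "Acme Corp", "email": "ops@acme.example", "phone": None}}
BILLING = {"cust-001": {"name": "ACME Corporation", "email": None, "phone": "555-0100"}}

def golden_record(cust_id, sources):
    """Merge one customer's records, taking the first non-empty value
    per field in source-priority order (earlier sources win)."""
    merged = {}
    for source in sources:
        for field, value in source.get(cust_id, {}).items():
            if value and field not in merged:
                merged[field] = value
    return merged

# CRM supplies name and email; billing fills in the missing phone.
record = golden_record("cust-001", [CRM, BILLING])
```

Real MDM products add probabilistic matching, stewardship workflows, and governance on top, but the survivorship step above is the heart of getting to one trusted view.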

This marketplace will include all of Informatica’s products. It will enable developers to build within Informatica’s development cloud and deploy either in the cloud or on premises. Like its new partner marketplace, the cloud offers another important opportunity for Informatica to compete. Informatica was an early partner of Salesforce.com, offering complementary information management products that can be used as options with Salesforce.com. This has given Informatica access to customers who might never have thought about Informatica in the past. In addition, it taught Informatica about the value of cloud computing as a platform for the future. Therefore, I expect that Informatica’s strong cloud-based offerings will help the company maintain its industry position. I also expect that the company’s newly strengthened partnership with HP will be very important to its growth.

What is Informatica’s roadmap? It intends to continue to deliver new releases every six months, including new data services and new data integration services, and to give these services self-service interfaces. In the end, its goal is to be a great data steward to its customers. This is an admirable goal. Informatica has made very good acquisitions that support its strategic goals. It is making the right bets on the cloud and on a partner ecosystem. The question that remains is whether Informatica can truly scale to the size where it can sustain the competitive threats. Companies like IBM, Oracle, Microsoft, SAP, and SAS Institute are not standing still. Each of these companies has built and will continue to expand its information management strategy and portfolio of offerings. If Informatica can break the mold on ease of implementation for complex data service management, it will have earned a place at the head table.

What’s an information agenda?

September 29, 2008 5 comments

I had an opportunity to chat with Ambuj Goyal, General Manager of IBM’s Information Management division, about the idea of an information agenda — an initiative that IBM recently announced. The company intends to make a major investment in methodologies, best practices, and technologies over the coming years as a way to help its customers implement the Information on Demand strategy.
While it may seem confusing at the outset, I think that the idea of an information agenda makes sense. But first, I want to clear up a confusion that I have seen. I asked Ambuj to define the difference between Information on Demand and the information agenda. While he agreed that both ideas are aspirational goals, he distinguishes between the two. Information on Demand covers the specific techniques and technology that help companies architect their information assets so that they can deliver business value on time and in context. In contrast, he explains, an information agenda is the business strategy for information that becomes the road map for the future. While the distinctions are subtle, it is interesting to think about these two concepts.
Here are my thoughts. This problem is not new; it has been around for many generations of information management. I won’t use this blog to remind you that we have so many disconnected information sources, with differing definitions of even the simplest concepts – what’s a customer, what’s a price – to name just a few. And the problem is getting worse. It isn’t enough anymore to just do joins across relational data sources. There is so much information stored in documents, on websites, in social networks, and on customer service sites. And you can’t just throw everything into one massive warehouse.
I think that the initial instinct of most technically oriented organizations is to react.  They embark on a Master Data Management strategy to quickly get one consistent view of data across relational sources.  Or in many situations, they might go out and buy a tool that makes it easier to query many different sources.  In some situations, customers are apt to invest in a massive data warehouse.  Each one is a valid strategy and will work to solve one specific problem.  But here is the difference that I see — reacting to one problem at a time is what has always gotten us into a mess with enterprise software in the first place.
Our team has been finalizing the second edition of Service Oriented Architectures for Dummies. One of the key lessons we have taken away from this project is that the customers who are successful are those that have moved from reacting to the crisis du jour to creating a business-focused strategy. For example, rather than taking on a project in isolation, these managers will make that project fit into an overall strategy for managing their business services or managing data across lots of business units. So, while they are solving problems on an incremental basis, they are ensuring that these problems are solved in context with the overall business strategy.
What I like about the idea of an information agenda is that it focuses customers on the idea of having a strategy and a plan.  So, here’s my view of the top three things that should be in a customer’s information agenda:
1.    Start with an honest assessment. Companies need to take a step back and determine how they use information as part of their business strategy. Information is used in different ways – both formal and informal. It is used in structured databases, document management systems, warehouses, and informal paper-based workflows. Companies still use spreadsheets as their formal information management strategy. Taking stock is critical.
2.    Imagine success. What would it look like if information could be available on demand and if that information could be trusted? I think this approach could become a strategic differentiator for companies. In fact, many of the companies interviewed for the second edition of SOA for Dummies were in the process of creating a strategy based on this idea. Most of these companies were looking for ways to leverage information as part of a strategy to proactively engage customers.
3.    Fit small steps into a roadmap. I think this is the most important issue for companies. It is so easy to devolve into a reactive state – especially in complex financial times. I suspect that many companies will dump the idea of having a strategy and just try to do only what is necessary to survive. You can’t blame them. But it is dangerous to take this approach. Yes, companies should implement pragmatic projects that match their current pain. However, each project should be a step in a journey toward a strategic approach to managing information.

Is Anticipation Management a Game Changer?

September 3, 2008 2 comments

I am going to introduce you to the idea of anticipation management. I hope that by the end of this post you will agree with me that, at the end of the day, this is what the value of information is all about. Alright, you might ask, what is anticipation management? I define anticipation management as the ability of businesses to leverage their information across departmental silos in order to look into the future. Isn’t that what business intelligence is supposed to be about? Of course – but is it really?

Here’s where I think we are headed with information management. What management really wants is to anticipate what is going to happen next. There are actually two categories of information management. The first is the environment that helps you assess results. In other words, how successful was our last marketing campaign? How many customers paid their bills on time? How much money is in the bank, and how much do customers owe us? Within this category is also the need to look across departmental silos to understand not just what happened with customers in one product line but across product lines — and across partners and suppliers. A lot of what we are seeing in the market these days falls into this area. This is important, and it is hard. Getting to that single view of the customer requires some heavy lifting – like having master data, a single definition of the key elements that define your company.

The second category is game-changing. What would most companies pay to be able to look into the future and figure out what customers will want to buy next year? What would they do if they could anticipate where their biggest weaknesses will be and fix them before they become a crisis? This is where I expect to see investment and innovation in information management in the coming years. Here is what I think the characteristics of this anticipation management software will be (just one person’s opinion):

1. Anticipation management software will combine traditional analytics with analysis of text from everything from blogs to social networks.

2. It will be based on sophisticated pattern analysis.

3. It will evolve so that it will be able to compare this type of data over time so that information is not analyzed in isolation but rather in context with what has happened in the past and what is expected in the future.

4. Context will be the key to anticipation management.  For example, you might find that there is an overwhelming degree of chatter and customer queries about a specific capability. Your first reaction might be to add that new function into the next generation of products.  The reality might be very different in context: your biggest competitor plans to unveil that capability in three months. If you make that capability the foundation for your new product you will miss the boat.
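The context point above can be sketched in a few lines of Python. Everything here is invented for illustration – the chatter counts, the spike threshold, and the competitor signal are hypothetical stand-ins for real pattern analysis:

```python
# Toy anticipation rule: a spike in customer chatter about a feature is
# only a "build it" signal once competitive context is taken into account.
weekly_mentions = [12, 15, 14, 60]   # mentions of "feature X" per week (invented)
competitor_shipping_soon = True      # external context signal (invented)

def anticipate(mentions, competitor_shipping):
    baseline = sum(mentions[:-1]) / (len(mentions) - 1)
    spike = mentions[-1] > 2 * baseline   # naive pattern: latest week doubles baseline
    if not spike:
        return "no signal"
    # Same pattern, different meaning once context is applied:
    return "differentiate instead" if competitor_shipping else "build feature X"

decision = anticipate(weekly_mentions, competitor_shipping_soon)
```

The pattern detector alone would say “build feature X”; the context rule flips the decision, which is exactly the competitor scenario described in point 4.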

Do I think that you can go out and order anticipation management today? No. Rather, it is a way for management to begin thinking about how to use information to make strategic decisions about the future. Anticipation management will not be easy. It will require companies to take a step back and look at their data in a very different way. But it will be worth the trouble.

So, talk to me. Let me know what you think of this crazy idea. Let’s have a conversation about what it will take to make anticipation management real.

Can Microsoft Pull Virtualization, SOA, Management, and SaaS Together?

June 17, 2008 5 comments

For three years in a row I have attended Microsoft’s server and tools analyst briefing. This is the vision of Microsoft that focuses on the server side of the company. A few years ago I predicted that this part of the company would get my vote in terms of growth and potential. I stand by my position. While Microsoft’s desktop division is suffering through a mid-life crisis, the server side is flexing its muscles. The transition toward power on the enterprise side is complicated for Microsoft. The challenge facing Microsoft is how to make the transition from its traditional role as champion and leader of the programmer to a leader in the next generation of distributed computing infrastructure. If Microsoft can make this transition in a coherent way, it could emerge in an extremely powerful position.

So, I will describe what I think are the five biggest opportunities that the server and tools division of Microsoft is focused on.


Opportunity One. Virtualization as a foundation. The greatest opportunity, ironically, is also the greatest threat. If customers decide to virtualize rather than buy individual licenses, Microsoft could suffer – especially in the desktop arena. At the same time, Microsoft clearly sees the benefits of becoming a leader in virtualization. Therefore, virtualization is becoming the focus of the next generation of computing infrastructure, both on the server and on the desktop. Microsoft is making many investments in virtualization, including the desktop, the hypervisor, applications, the operating system, graphics, and overall management (including identity management). One smart move Microsoft has made is to invest in its own hypervisor, due out soon as Hyper-V. Rather than offering Hyper-V as a standalone product, Microsoft is adding the hypervisor to the fabric of its server platform. This is a pragmatic and forward-thinking approach. If I were an independent hypervisor vendor, I would hit the road right about now. Microsoft’s philosophy around enterprise computing is clear: unified and virtualized.

Microsoft’s management believes that within five to ten years all servers will be virtualized. To me this sounds like a logical assumption, both in terms of manageability and in terms of power consumption. So, how does Microsoft gain supremacy in this market? Clearly, it understands that it has to take on the market leader: VMware. It hopes to do this in two ways: providing overall management of the virtual environment (including managing VMware) and through its partnership with Citrix. There was a lot of buzz for a while that Microsoft would buy Citrix. I don’t think so. The relationship is advantageous to both companies, so I expect that Microsoft will enjoy the revenue and Citrix will enjoy the benefits of Microsoft’s market clout.

Microsoft has been on an acquisition binge in the virtualization market. While these deals haven’t created the buzz of the attempted Yahoo acquisition, they are important pieces of the new strategy. Investments include Kidaro for desktop virtualization management (which sits on top of Virtual PC and is intended to provide application compatibility on the virtual desktop). Another investment, Calista Technologies, provides graphics virtualization that offers the full “Vista experience” for the remote desktop. Last year Microsoft purchased Softricity, which offers application virtualization and OS streaming. Microsoft has said that it has sold 6.5 million Softricity seats (priced at $3.00 per copy). Now, add in Hyper-V and the identity management offerings, and things get very interesting.

One of the smartest things Microsoft is doing is positioning virtualization within the context of a management framework. In fact, in my view, virtualization is simply not viable without management. Microsoft positioned this portfolio of virtualization offerings within a management framework (System Center) for managing both the physical and the virtual environment for customers.

Opportunity Two. Managing a combined physical and virtual world. Since Microsoft came out with SMS in the late 1990s, it has wanted to find a way to gain a leadership role in management software. It has been a complex journey and is still a work in progress. It is indeed a time of transition for Microsoft. The container for its management approach is System Center. Today, with System Center, Microsoft has its sights on managing not only Windows systems but also a customer’s heterogeneous environment. Within that environment Microsoft has included identity management (leveraging Active Directory as the management framework, including provisioning and certificate management). This is one area where Microsoft seems to be embracing heterogeneity in a big way. Like many of the infrastructure leaders Microsoft competes with, Microsoft’s leaders talk about the ability to create a management framework that is “state aware” so that the overall environment is more easily self-managed. Microsoft envisions a world where, through virtualization, there is a pool of resources that can be managed based on business policies and service levels. They talked a lot about automating the management of resources. Good thinking, but certainly not unique.

Microsoft is making a significant investment in management – especially in areas such as virtualization and virtual machine management. More importantly, through its Xen-based connections (via Citrix), Microsoft will offer connectors to other systems management platforms such as IBM’s Tivoli and HP’s OpenView. That means that Microsoft has ambitions to manage large-scale data centers. Microsoft is building its own data centers that will be the foundation for its cloud offerings.

Opportunity Three. Creating the next generation dynamic platform. Every company I talk to lately is looking to own the next generation dynamic computing platform. This platform will be the foundation for the evolution of Service Oriented Architectures, social networks, and software as a service. But, obviously, this is complicated – especially if you assume that you want to achieve ubiquitous integration between services that don’t know each other. Microsoft’s approach to this (it calls it Oslo) is based on a modeling language. Microsoft understands that achieving this nirvana requires a way to establish context. The world we live in is a web of relationships. Somehow, in real life, we humans are able to take tiny cues and construct a world view. Unfortunately, computers are not so bright. So Microsoft is attacking this problem by developing a semantic language that will be the foundation for a model-based view of the world. Microsoft intends to leverage its network of developers to make this language-based approach the focal point of a new way of creating modular services that can dynamically change based on context.

This is indeed an interesting approach. It is also a bottom-up approach to the problem of semantic modeling. While Microsoft does have a lot of developers who will want to leverage this emerging technology, I am concerned that a purely bottom-up approach could be problematic. It must be combined with a top-down approach if it is to be successful.

Opportunity Four. Software as a Service Plus. I always thought that Microsoft envied AOL in the old days, when AOL could get customers to pay per month while Microsoft sold perpetual licenses that might not be upgraded for years. Microsoft is trying to build a case that customers really want a hybrid environment, so they can use an application on premises and then enable their mobile users to use the same capability as a service. Therefore, when Microsoft compares itself to companies like Salesforce.com, NetSuite, and Zoho, it feels it has a strategic advantage because it has full capabilities whether online or offline. But Microsoft is taking this further by taking services such as Exchange and offering them as a service. This will be focused primarily on the SMB market and on remote departments of large companies.

This is only the beginning, from what I am seeing. Live Mesh, announced in April, is a services-based web platform that helps developers with context over the web. Silverlight, also announced this spring, is intended as a Web 2.0 platform. Microsoft is taking these offerings, plus others such as Virtual Earth, SQL Server Data Services, cloud-based storage, and BizTalk Services, and offering them as components in a services platform – both on its own and with its partners.

Opportunity Five. Microsoft revs up SOA. Microsoft has been slow to get on the SOA bandwagon. But it is starting to make some progress as it readies its registry/repository. This new offering will be built on top of SQL Server and will include a UDDI version 3 service registry. For Master Data Management (MDM) – the single view of the customer – Microsoft will create an offering based on SQL Server. It also views SharePoint as a focal point for MDM. It intends to build an entity data model to support its MDM strategy.

While Microsoft has many of the building blocks it needs to create a Service Oriented Architecture strategy, the company still has a way to go. This is especially true of how the company creates a SOA framework so that customers know how to leverage its technology to move through the life cycle. Microsoft is beginning to talk a lot about business process, including laying a common foundation for service interoperability by supporting key standards such as WS-* and its own Windows Communication Foundation services.

The real problem is not in the component parts but the integration of those parts into a cohesive architectural foundation that customers can understand and work with. Also, Microsoft still lacks the in-depth business knowledge that customers are looking for. It relies on its integration partners to provide the industry knowledge.

The bottom line
Microsoft has made tremendous progress over the past five years in coming to terms with new models of computing that are not client- or server-centric but dynamic. I perceive that the thinking is going in the right direction. Bringing together process thinking, virtualization, management, federated infrastructure, and software as a service is all the right stuff. The question is whether Microsoft can put all the pieces together in a way that doesn’t just rely on its traditional base of developers to move it forward to the next generation. Microsoft has a unique opportunity to take its traditional customer base of programmers and move them to a new level of knowledge so they can participate in its vision of Dynamic IT.

Five cool things about Master Data Management

October 18, 2007 1 comment

Ok, so you never thought that Master Data Management (MDM) was cool. Well, I just returned from IBM’s Information On Demand conference and found that IBM has gotten MDM fever in a big way. In fact, they were handing out cool MDM buttons. Now, MDM has been around for a long time. Customers are always trying to get to a “single view of the customer.” But because there are so many sources of data that are stove-piped and disconnected across departments, it has been nearly impossible. One of the approaches that was popular a few years ago was Enterprise Information Integration (EII). I hereby declare EII dead! What is taking its place is true MDM.

Now that MDM is starting to mature, it has the potential to become the authoritative way to manage the single trusted view of the customer. I got some valuable insights into MDM from Dan Wolfson, a Distinguished Engineer and MDM expert at IBM. His view is that MDM has to be constructed as a hub. Through this federated view it is possible to understand the context of data. I really like this concept, and I think that if we are indeed going to move to a model where there is a single view of the customer, it will be because companies have a single way to manage their information about customers, products, services, and the like.

Let me digress for a minute and tell you a little about IBM’s new approach to MDM. The company has come up with a strange name for it: multi-form MDM. Despite the name, I actually think they are on to something. The idea is that rather than having a different platform for each type of MDM-focused application, there is a single platform that can be used no matter what type of approach is at issue. For example, there are customer-centric MDM applications that focus on details about customers, locations, relationships, and so on. Another application might focus on products, such as a company’s product portfolio, billing details, sales territories, and the like.
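A toy sketch of the multi-form idea in Python – one hub serving multiple master data domains (customer, product) rather than a separate system per MDM style. The class and method names are my own invention for illustration, not IBM’s API:

```python
class MDMHub:
    """Single hub keeping master records for multiple domains."""

    def __init__(self):
        self._domains = {}  # domain name -> {record id -> master record}

    def register_domain(self, domain):
        self._domains.setdefault(domain, {})

    def upsert(self, domain, record_id, attrs):
        """Create or update a master record (last write wins, for simplicity)."""
        master = self._domains[domain].setdefault(record_id, {})
        master.update(attrs)
        return master

    def get(self, domain, record_id):
        return self._domains[domain].get(record_id)

# Two different MDM styles, one platform:
hub = MDMHub()
hub.register_domain("customer")
hub.register_domain("product")
hub.upsert("customer", "c1", {"name": "Acme Corp", "region": "EMEA"})
hub.upsert("product", "p1", {"sku": "X-100", "territory": "EMEA"})
```

The point of the sketch is only that the customer-centric and product-centric views share one registry and one set of operations, which is what lets the hub reason about context across domains.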

There is a lot more to say about this MDM hub approach, but that will have to come later. Check out our monthly newsletter; I will probably write a more in-depth article there.

But as I promised, here are five cool things about MDM:

1. A Master Data Management platform can help avoid a thousand versions of the “truth”

2. If implemented from a holistic perspective, MDM can actually solve problems

3. MDM can actually become an information integration standard

4. MDM could become the linchpin between metadata, the semantic web, and the registry/repository

5. MDM is cool because it really matters to the business