Building toward BPM

Technological hurdles to enterprisewide performance analytics are many

IN THE SCRAMBLE to bring accountability and profitability back to the corporate world, BPM (business performance management) is gaining momentum in the BI (business intelligence) world. The basic idea is to tie operational data from enterprisewide systems to specific business goals, and to provide managers with integrated visibility into performance against those goals — a tall order.

To deliver on this promise, vendors such as SAS, Cognos, Hyperion, and Business Objects are rolling out BI platforms, frameworks, and architectures that aim to enable more integrated data gathering, analysis, and metrics-based management throughout the enterprise. The promise here is to go beyond querying and reporting to facilitate integration of multiple analytic applications across departments, and to provide a seamless delivery experience for large numbers of users. “Having a BI architecture that lets you go from ETL [Extraction, Transformation, and Loading] and data cleansing to analysis to information delivery is where the value proposition is,” explains Keith Collins, chief technology officer of SAS in Cary, N.C.

But behind these frameworks are a host of thorny technological issues that need to be worked out. And to complicate matters, everybody’s jumping into the analytics game, including the major ERP, CRM, and SCM ISVs, and of course none of the players wants to give up its current piece of the action. Here’s a rundown of some of the key issues involved in turning the analytics ugly duckling into a business performance management swan.

Data acquisition, metadata, and real-time cleansing

Although recent investment in ERP and CRM systems has resulted in better source data for performance management systems, data integration is still high on the list of pain points that the new BI frameworks are attempting to address. Unlike operational connectors, BI vendors’ tools are geared toward extraction of detailed historical data from core systems (front end and back end) through bulk extraction and so-called “drip-feed” techniques. This data must also be tied into both data warehouses and spreadsheet-based data sources.
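
To make the “drip-feed” idea concrete, here is a minimal, hypothetical Java sketch (table and column names are invented for illustration): rather than re-extracting a source table in bulk, the extractor polls for rows changed since a high-water mark.

```java
import java.sql.*;

// Minimal sketch of "drip-feed" extraction: poll an operational table for
// rows modified since the last high-water mark instead of re-extracting
// everything in bulk. Table and column names are hypothetical.
public class DripFeedExtractor {
    private Timestamp highWaterMark = new Timestamp(0); // last change we saw

    public void pollOnce(Connection source) throws SQLException {
        String sql = "SELECT order_id, amount, last_modified " +
                     "FROM orders WHERE last_modified > ? ORDER BY last_modified";
        try (PreparedStatement ps = source.prepareStatement(sql)) {
            ps.setTimestamp(1, highWaterMark);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    stage(rs.getLong("order_id"), rs.getBigDecimal("amount"));
                    highWaterMark = rs.getTimestamp("last_modified"); // advance mark
                }
            }
        }
    }

    private void stage(long orderId, java.math.BigDecimal amount) {
        // In a real pipeline this would write to a staging area for cleansing.
        System.out.println("staged order " + orderId + " amount " + amount);
    }
}
```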

Vendors such as Ascential Software and Informatica have developed platforms for data integration, ETL, and staging for analytic applications, leveraging prepackaged connectors plus XML to perform key functions such as data cleansing, transformation, profiling, and source-system analysis (to cut down on labor costs in the discovery phase). The resulting data can then be tapped by other BI vendors.
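
Source-system profiling, one of those discovery-phase functions, can be as simple as counting nulls and distinct values per column. A minimal illustrative sketch, not any vendor’s actual implementation:

```java
import java.util.*;

// Minimal sketch of discovery-phase profiling: for each column, count
// nulls and distinct values so analysts can spot dirty fields before
// writing transformation rules. Rows are modeled as column->value maps.
public class ColumnProfiler {
    public static Map<String, String> profile(List<Map<String, String>> rows) {
        Map<String, Integer> nulls = new TreeMap<>();
        Map<String, Set<String>> distinct = new TreeMap<>();
        for (Map<String, String> row : rows) {
            for (Map.Entry<String, String> col : row.entrySet()) {
                nulls.putIfAbsent(col.getKey(), 0);
                distinct.computeIfAbsent(col.getKey(), k -> new HashSet<>());
                if (col.getValue() == null || col.getValue().isEmpty()) {
                    nulls.merge(col.getKey(), 1, Integer::sum);
                } else {
                    distinct.get(col.getKey()).add(col.getValue());
                }
            }
        }
        Map<String, String> report = new TreeMap<>();
        for (String col : nulls.keySet()) {
            report.put(col, "nulls=" + nulls.get(col)
                          + ", distinct=" + distinct.get(col).size());
        }
        return report;
    }
}
```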

One key issue is data consistency, or normalizing the data — for example, getting data that relates to specific buckets of time for trending analysis. “You have to do predictive modeling or forecasting in a known state,” says SAS’s Collins. “You’ve got to have a consistent source of data to work from.”
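
As a rough illustration of what “buckets of time” means in practice, the following sketch rolls raw event timestamps up into calendar months so that trend lines compare consistent periods (the data shapes are assumed and simplified):

```java
import java.time.*;
import java.util.*;

// Minimal sketch of normalizing event data into consistent time buckets
// (here, calendar months) so trend analysis compares like with like.
public class TimeBucketer {
    public static Map<YearMonth, Double> bucketByMonth(Map<Instant, Double> events,
                                                       ZoneId zone) {
        Map<YearMonth, Double> buckets = new TreeMap<>();
        for (Map.Entry<Instant, Double> e : events.entrySet()) {
            YearMonth month = YearMonth.from(e.getKey().atZone(zone));
            buckets.merge(month, e.getValue(), Double::sum); // roll up into month
        }
        return buckets;
    }
}
```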

Another key challenge is data timeliness. Corporate planning is increasingly event-driven rather than calendar-driven, and expectations are for near real-time, actionable BI reports and alerts. “You really have to start thinking about the integration of the batch-server world and the real-time Java world,” says Jim Welch, vice president of product development at Westboro, Mass.-based Ascential. Although real-time transactional data is rarely clean enough to analyze, vendors are trying to get as close as possible through XML, Web services, and integration buses. “BI is all about what data you give it,” Welch says. “The challenge is to do analysis across historical and real-time data as it’s happening. You take a shower all the time — the data’s always clean.”
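
One simplified way to picture the batch/real-time integration Welch describes: a metric computed over cleansed warehouse rows plus whatever has arrived on a live feed since the last batch load. A hypothetical sketch, not Ascential’s design:

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch of querying across the batch and real-time worlds: a
// metric is computed over cleansed historical rows from the warehouse
// plus whatever has arrived on an in-memory feed since the last load.
public class HybridMetric {
    private final ConcurrentLinkedQueue<Double> liveFeed = new ConcurrentLinkedQueue<>();

    public void onLiveEvent(double value) { // e.g. delivered by a message bus
        liveFeed.add(value);
    }

    public double totalSales(List<Double> warehouseRows) {
        double total = warehouseRows.stream().mapToDouble(Double::doubleValue).sum();
        for (double v : liveFeed) total += v; // append today's not-yet-loaded events
        return total;
    }
}
```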

Finally, vendors developing performance management BI frameworks are touting metadata management servers that track where key data and reports came from, capture all the physical and logical information and context-specific business rules attached to the data, and enable managers to perform cross-tool impact analysis. The industry is working on developing metadata standards — most notably CWM (Common Warehouse Metamodel), which came out of the Object Management Group, a consortium whose members include Oracle, IBM, and SAS. CWM has now also been adopted by Microsoft.
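
A toy illustration of the impact-analysis side of metadata management: if each derived data set records its upstream sources, finding everything affected by a change reduces to walking the dependency graph (the names and structure below are invented for illustration):

```java
import java.util.*;

// Minimal sketch of metadata-server bookkeeping: each derived data set
// records its upstream sources, so cross-tool impact analysis reduces to
// walking the dependency graph. Names are hypothetical, e.g.:
//   register("revenue_cube", "orders_staging");
//   register("ceo_dashboard", "revenue_cube");
//   impactOf("orders_staging") -> [revenue_cube, ceo_dashboard]
public class MetadataRegistry {
    private final Map<String, Set<String>> sourcesOf = new HashMap<>();

    public void register(String dataSet, String... sources) {
        sourcesOf.put(dataSet, new HashSet<>(Arrays.asList(sources)));
    }

    // Which data sets (directly or indirectly) depend on a changed source?
    public Set<String> impactOf(String changedSource) {
        Set<String> affected = new LinkedHashSet<>();
        boolean grew = true;
        while (grew) { // iterate to a fixed point over transitive dependencies
            grew = false;
            for (Map.Entry<String, Set<String>> e : sourcesOf.entrySet()) {
                if (!affected.contains(e.getKey())
                        && (e.getValue().contains(changedSource)
                            || !Collections.disjoint(e.getValue(), affected))) {
                    affected.add(e.getKey());
                    grew = true;
                }
            }
        }
        return affected;
    }
}
```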

Modeling and analytics

Analytics is the core of any performance management system, but to make it work, you need a good model of the process you’re analyzing so you can put the data in a meaningful context for end users. “It’s all about how do you make the model of those elements that impact the outcome?” explains Dan Jewett, vice president of functional architecture at Santa Clara, Calif.-based Brio.

For higher-level BPM issues, such as predicting whether your company might lose a major account, the models are even harder to build. “It’s one thing to predict churn for phone accounts, but a lot tougher to predict overall corporate performance,” Jewett says. Model-building wizards this sophisticated don’t yet exist. But increasingly, vendors are developing vertically focused analytical solutions that incorporate external domain expertise and give enterprises the tools to add their own proprietary business insights.

A big challenge for BI framework developers is how to integrate multiple BI models across departments to enable score-carding and monitoring of shared commitments or targets. Bringing interoperability to these models is a bear, as they are often highly specific to a particular application or departmental workflow and based on varying departmental standards and requirements.

Currently, there are no widely accepted analytics modeling standards, although some vendors are hoping that PMML (Predictive Model Markup Language), developed by a vendor organization called the Data Mining Group that includes Oracle and IBM among its members, will get traction. Microsoft is pushing its own XML for Analysis, a SOAP-based XML API for analytical data interaction, but according to rumor, the company is ready to commit to the PMML format for integration with its product.
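
The point of an XML model standard such as PMML is that a model exported by one tool can be read by another. The sketch below parses a simplified, PMML-style fragment (not a complete or validated model) to list its data fields:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;
import org.xml.sax.InputSource;

// Minimal sketch of reading a PMML-style document: the DataDictionary
// declares the fields a model uses, so a consuming tool knows what
// inputs to supply. The fragment is simplified and illustrative only.
public class PmmlReader {
    public static void main(String[] args) throws Exception {
        String pmml =
            "<PMML version='2.0'>" +
            "  <DataDictionary>" +
            "    <DataField name='minutesUsed' optype='continuous'/>" +
            "    <DataField name='churned' optype='categorical'/>" +
            "  </DataDictionary>" +
            "</PMML>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(pmml)));
        NodeList fields = doc.getElementsByTagName("DataField");
        for (int i = 0; i < fields.getLength(); i++) {
            Element f = (Element) fields.item(i);
            System.out.println(f.getAttribute("name") + " : " + f.getAttribute("optype"));
        }
    }
}
```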

Delivery, scalability, and security

The final challenge for performance management framework builders is delivering analytical capabilities and data flexibly, securely, and in a scalable way. Traditionally, analytics were performed by a small group of Ph.D.s wearing lab coats who generated reports and fielded requests from business users. Today, as many as tens of thousands of users across a large enterprise can require access to sophisticated analytical capabilities via multiple channels and interfaces.

Scalability means optimizing queries to work on multiterabyte databases and developing analytics engines that leverage both relational and multidimensional/OLAP technologies. These engines must support not only queries but also simultaneous data manipulation and write-backs from thousands of browser-based users.
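
Pre-aggregation is one common technique behind that kind of scalability: rolling the fact table up once per dimension combination so dashboard queries hit a small cache instead of rescanning terabytes. A deliberately simplified sketch with invented dimensions:

```java
import java.util.*;

// Minimal sketch of the pre-aggregation behind OLAP-style engines:
// totals are rolled up once per (region, month) cell, so dashboard
// queries are answered from the cache rather than by rescanning a
// multiterabyte fact table. Dimension names are hypothetical.
public class RollupCache {
    private final Map<String, Double> cells = new HashMap<>();

    public void load(List<String[]> factRows) { // each row: {region, month, amount}
        for (String[] row : factRows) {
            cells.merge(row[0] + "|" + row[1], Double.parseDouble(row[2]), Double::sum);
        }
    }

    public double query(String region, String month) {
        return cells.getOrDefault(region + "|" + month, 0.0);
    }
}
```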

Scalability also means managing stateless n-tier J2EE and Web-based architectures that can include heterogeneous Web servers, application servers, metadata servers, OLAP servers, database servers, and BI integration middleware. “TCO [total cost of ownership] and managing that across the enterprise is where people really need the help,” says SAS’s Collins.

Another delivery challenge is the user interface. BI applications typically use a “dashboard” paradigm: a single, portal-like screen that gives managers an overview of their relevant metrics and data and lets them drill down into detail and set up alerts and notifications. But as with portals, there is currently no plug-and-play compatibility among different vendors’ dashboards or dashboard elements, and no standards activity on the horizon to develop that compatibility.

But BI vendors are closely monitoring the portlet standards process, centered on JSR 168 and the OASIS (Organization for the Advancement of Structured Information Standards) WSRP (Web Services for Remote Portlets) committee. Their key concern is being able to get their data into other vendors’ portals or dashboards, specifically those of ISVs that are starting to build in their own analytics capabilities under the hood.
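
For a sense of what the JSR 168 model looks like, here is a minimal portlet sketch; packaging a BI view this way is what would let one vendor’s dashboard element render inside another vendor’s portal (the metric shown is a made-up placeholder):

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.portlet.*;

// Minimal sketch of a JSR 168 portlet: the portal container calls doView
// to render this fragment alongside other vendors' portlets on one page.
public class RevenuePortlet extends GenericPortlet {
    @Override
    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        response.setContentType("text/html"); // must be set before getWriter
        PrintWriter out = response.getWriter();
        out.println("<h3>Quarterly Revenue vs. Target</h3>");
        out.println("<p>Q3 revenue: 104% of plan</p>"); // hypothetical figure
    }
}
```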

“We don’t want to take over the operational piece of the entry into SAP and Siebel,” explains SAS’s Collins, “but we want to make sure the intelligence we generate can be fed back into [those] operational systems.” Is a war shaping up between BI framework vendors and the ISVs? Collins says no. “Most of the ISVs are focused on just doing the analysis on the data that they contain,” he claims, whereas BI vendors are focused on integrating data from across the enterprise.

Security is the final piece of the puzzle. With so much knowledge at the fingertips of anyone in the company, authentication and authorization are crucial to making performance management systems effective. Most BI vendors developing frameworks are trying to leverage enterprises’ existing security infrastructure capabilities, especially security services provided by app servers, PKI, LDAP directories, or NT Challenge/Response.
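
Leveraging an existing LDAP directory can be as direct as authenticating a dashboard user with a simple bind, rather than keeping a separate BI password store. A minimal sketch with a hypothetical host and DN layout:

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.InitialDirContext;

// Minimal sketch of reusing existing security infrastructure: validate a
// user's credentials with a simple bind against the corporate LDAP
// directory. Host and DN layout are hypothetical.
public class LdapAuthenticator {
    public static boolean authenticate(String userDn, String password) {
        // Note: reject empty passwords first; many directories treat an
        // empty simple bind as an anonymous (and thus "successful") bind.
        if (password == null || password.isEmpty()) return false;
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://directory.example.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, userDn); // e.g. "uid=jsmith,ou=people,dc=example,dc=com"
        env.put(Context.SECURITY_CREDENTIALS, password);
        try {
            new InitialDirContext(env).close(); // bind succeeds -> credentials valid
            return true;
        } catch (NamingException e) {
            return false; // bad credentials or directory unreachable
        }
    }
}
```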

Vendors are finding, however, that such infrastructure largely does not exist; enterprises are having a hard time implementing such capabilities as single sign-on. The reason? “Every organization implements different rules and has a different philosophy about how those rules should be applied,” says Brio’s Jewett. Furthermore, many ISVs have dramatically different data hierarchies, so authentication across multiple data sources is impossible unless that data is restaged elsewhere, as most BI vendors are doing for expediency.

When will all these pieces come together to enable true enterprisewide performance management analytics? Probably not for a long time. But watch for lots of incremental improvements in the meantime, driven by enterprises’ need to manage their bottom lines better.

Source: www.infoworld.com