
Ahmed Lashin

Cognos, Business Intelligence and MORE …


Here is the latest Magic Quadrant report for Business Intelligence, as of February 2012.


Business intelligence (BI) platforms enable all types of users — from IT staff to consultants to business users — to build applications that help organizations learn about and understand their business. Gartner defines a BI platform as a software platform that delivers the 14 capabilities listed below. These capabilities are organized into three categories of functionality: integration, information delivery and analysis. Information delivery is the core focus of most BI projects today, but we are seeing an increased interest in deployments of analysis to discover new insights, and in integration to implement those insights.


Integration
  • BI infrastructure — All tools in the platform use the same security, metadata, administration, portal integration, object model and query engine, and should share the same look and feel.
  • Metadata management — Not only should all tools leverage the same metadata, but the offering should provide a robust way to search, capture, store, reuse and publish metadata objects such as dimensions, hierarchies, measures, performance metrics and report layout objects.
  • Development tools — The BI platform should provide a set of programmatic development tools and a visual development environment, coupled with a software developer’s kit for creating BI applications, integrating them into a business process, and/or embedding them in another application. The BI platform should also enable developers to build BI applications without coding by using wizard-like components for a graphical assembly process. The development environment should also support Web services in performing common tasks such as scheduling, delivering, administering and managing. In addition, the BI application can assign and track events or tasks allotted to specific users, based on predefined business rules. Often, this capability can be delivered by integrating with a separate portal or workflow tool.
  • Collaboration — This capability enables BI users to share and discuss information, BI content and results, and/or manage hierarchies and metrics via discussion threads, chat and annotations, either embedded in the BI platform or through integration with collaboration, social software and analytical master data management (MDM).
Information Delivery
  • Reporting — Reporting provides the ability to create formatted and interactive reports, with or without parameters, with highly scalable distribution and scheduling capabilities. In addition, BI platform vendors should handle a wide array of reporting styles (for example, financial, operational and performance dashboards), and should enable users to access and fully interact with BI content delivered consistently across delivery platforms including the Web, mobile devices and common portal environments.
  • Dashboards — This subset of reporting includes the ability to publish formal, Web-based or mobile reports with intuitive interactive displays of information, including dials, gauges, sliders, check boxes and traffic lights. These displays indicate the state of the performance metric compared with a goal or target value. Increasingly, dashboards are used to disseminate real-time data from operational applications or in conjunction with a complex event processing engine.
  • Ad hoc query — This capability enables users to ask their own questions of the data, without relying on IT to create a report. In particular, the tools must have a robust semantic layer to allow users to navigate available data sources. These tools should include a disconnected analysis capability that enables users to access BI content and analyze data remotely without being connected to a server-based BI application. In addition, these tools should offer query governance and auditing capabilities to ensure that queries perform well.
  • Microsoft Office integration — In some use cases, BI platforms are used as a middle tier to manage, secure and execute BI tasks, but Microsoft Office (particularly Excel) acts as the BI client. In these cases, it is vital that the BI vendor provides integration with Microsoft Office applications, including support for document and presentation formats, formulas, data "refreshes" and pivot tables. Advanced integration includes cell locking and write-back.
  • Search-based BI — This applies a search index to both structured and unstructured data sources and maps them into a classification structure of dimensions and measures (often, but not necessarily leveraging the BI semantic layer) that users can easily navigate and explore using a search (Google-like) interface. This capability extends beyond keyword searching of BI platform content and metadata.
  • Mobile BI — This capability enables organizations to deliver report and dashboard content to mobile devices (such as smartphones and tablets) in a publishing and/or interactive (bidirectional) mode, and takes advantage of the interaction mode of the device (tapping, swiping and so on) and other capabilities not commonly available on desktops and laptops, such as location awareness.
Analysis
  • Online analytical processing (OLAP) — This enables end users to analyze data with extremely fast query and calculation performance, enabling a style of analysis known as "slicing and dicing." Users are (often) able to easily navigate multidimensional drill paths. And they (sometimes) have the ability to write back values to a proprietary database for planning and "what if" modeling purposes. This capability could span a variety of data architectures (such as relational or multidimensional) and storage architectures (such as disk-based or in-memory).
  • Interactive visualization — This gives users the ability to display numerous aspects of the data more efficiently by using interactive pictures and charts, instead of rows and columns. Over time, advanced visualization will go beyond just slicing and dicing data to include more process-driven BI projects, allowing all stakeholders to better understand the workflow through a visual representation.
  • Predictive modeling and data mining — This capability enables organizations to classify categorical variables and to estimate continuous variables using advanced mathematical techniques. BI developers are able to integrate models easily into BI reports, dashboards and analysis, and business processes.
  • Scorecards — These take the metrics displayed in a dashboard a step further by applying them to a strategy map that aligns key performance indicators (KPIs) with a strategic objective. Scorecard metrics should be linked to related reports and information in order to do further analysis. A scorecard implies the use of a performance management methodology such as Six Sigma or a balanced scorecard framework.

Magic Quadrant

Figure 1. Magic Quadrant for Business Intelligence Platforms


Source: Gartner (February 2012)


Here is what the report says about IBM:

  • IBM maintains its leading position on the Completeness of Vision axis for this year’s Magic Quadrant. The company takes a holistic approach to what it calls Business Analytics and Optimization (BAO), combining comprehensive software, hardware and services in a coordinated market offering. IBM’s business analytics software portfolio includes a unified BI, analytics and performance management platform, and is complemented by IBM information management software and appliances (Netezza, for example). Services are made up of a consulting line of nearly 9,000 people, which is a growing part of IBM Global Business Services (GBS). IBM can offer both a tools-based and/or a solution-driven offering, along with significant vertical expertise, to customers and prospects.
  • In 4Q10, IBM introduced its latest business analytics platform, IBM Cognos 10. Throughout 2011, additional capabilities have been released and customer adoption has begun in earnest. Cognos 10 references who responded to this year’s Magic Quadrant survey painted a very interesting snapshot — on average nearly 4,000 users, over 12 TB of data, broad functional use, and very high platform integration scores, all at or near the top of all ratings for all vendors in this report. Overall, Cognos 10 references were significantly more satisfied than Cognos 8 customers, who were the majority of IBM’s survey respondents. While some indicated that upgrading from Cognos 8 to Cognos 10 had some complexity, the majority rated it as straightforward or very straightforward. This bodes well for IBM’s future ability to execute, providing the firm delivers superior service and support and problem-free software.
  • The average tenure of IBM respondents was seven years, second highest of all vendors in this survey. Gartner often hears this long-standing customer commitment in inquiry, and this represents a strong customer loyalty factor. This year, less than 7% of references noted that they are planning to discontinue use of the software in the next three years (or are considering doing so), which is significantly lower than last year’s result.
  • Advanced analytics is a particular IBM strength. The company’s SPSS software continues to advance nicely, readily allowing IBM to bid for predictive analytics and statistical use cases. Customers rated IBM’s predictive capabilities in the top quartile of all vendors. A secret weapon at IBM’s disposal — IBM Research — delivers another level of research and development prowess to the overall IBM value proposition. For example, Watson, the Deep Question and Answer system that interprets natural language and scores possible answers based on probability, is a visible example of IBM Research at work. While not a part of the Cognos 10 platform, it demonstrates the depth and breadth that IBM can bring to clients’ advanced analytic scenarios.
  • The top reasons why customers select IBM are functionality, ease of use for end users, and data access and integration. IBM’s road map and future vision weighed heavily in reference decisions. In 2011, IBM delivered a new Cognos 10 mobile application for the iPad that is included free in existing user roles. In early 2012 the company will introduce Cognos Insight, a personal, desktop BI product that enables independent discovery and "what if" modeling, while also providing full interoperability with the larger workgroup and enterprise solutions.
  • Twenty-three percent of Cognos 8 references indicate that performance continues to be problematic (a persistent problem for the last several years), nearly three times the average response for other vendors evaluated in this Magic Quadrant. In contrast, Cognos 10 references reported below average performance concerns. This is a sure signal that IBM must encourage upgrades to Cognos 10 without technical and/or financial disruption.
  • Again this year, references consider the Cognos products more difficult to implement and use than those of competitors. While Cognos 10 was rated slightly below average, other IBM products (Cognos 8, SPSS software and Cognos TM1) were deemed significantly more difficult. These are cited as two major reasons that limit expanded BI deployments with Cognos 8. As such, improved system administration and end-user usability were major development themes of the Cognos 10 release. References indicate that Cognos software is used largely by a consumer/casual user population. Reporting is the most extensively deployed component, followed by ad hoc query and OLAP analysis.
  • IBM’s customers also continue to have less than optimal customer experiences, with support and sales interactions, along with product quality, rated in the bottom quartile of all vendors reviewed in this report. References also rate product functionality slightly below the average for all vendors. The bright spot is that Cognos 10 references rated product functionality near the top of all vendors, and support, sales and product quality were rated better than for Cognos 8. These issues remain IBM’s Achilles’ heel, and will limit its ability to raise execution scores next year unless action is taken quickly.
  • License cost continues to be another source of customer concern across all products in the IBM business analytics portfolio. Gartner client inquiry also bears out this concern. Higher than expected costs to upgrade from Cognos 8 to Cognos 10 have stalled some projects, but changes in configuration, user roles, and/or support costs appear to drive the increase. As a counterpoint, existing Cognos 10 users did not identify license cost as a concern.
      I have some comments about the report, but those will be in the next post.

      Timo Elliott recently wrote a post about the differences between Business Intelligence & Business Analytics which we found interesting.

      He feels that the differences between the two, even though they do exist, are not of huge importance. Most people have their own idea of what each term means, and although these ideas differ from person to person, the underlying elements are often the same.

      One business intelligence vendor uses the term “business analytics” as an umbrella term, which they use to include data warehousing, business intelligence, enterprise information management, enterprise performance management, analytic applications, and governance, risk, and compliance.

      Another vendor uses the term to indicate some level of vertical or horizontal domain knowledge tied to statistical or predictive analytics.

      Elliott comes to the conclusion that there are two things worth differentiating:

      1. The business aspect of BI - the need to draw value from analyzing information. Over the past 5 decades this has not really changed, and neither have the factors preventing us from doing so.

      2. The IT aspect of BI - this is the tool used to provide the information. This has clearly evolved over time, sometimes drastically.

      “Business intelligence” is often used to describe both of the above – resulting in confusion.

      He also makes the interesting point that analysts and vendors, especially those new to the industry, change terminology for fear that their offerings will appear dated if they keep using an established term. He rounds off by stating, quite correctly, that it doesn’t really matter what you call it; at the end of the day it’s all about working out the best way to leverage the information you have to make better business decisions.

      Interestingly, he changed the name of his blog from “BI questions” to “Business Analytics” because Google Trends showed a decline in the overall search volume of the term “business intelligence” and a sharp rise in searches for “Business Analytics”.

      As Microsoft PowerPivot gains more popularity and exposure, BI professionals are asking more and more questions about PowerPivot’s role in the organization, trying to understand what value Microsoft’s new in-memory BI solution brings, along with its benefits and limitations. Is PowerPivot going to replace SQL Server Analysis Services? If so, how soon? What should be done with existing BI solutions? Or can both coexist and serve different needs?

      In order to answer these questions and understand both the short- and long-term impact of the new product on your BI solution, we need to understand what motivated Microsoft to release it and where they position it. Microsoft is trying to achieve two main goals: introduce a new in-memory engine for data processing, and promote the self-service BI concept by extending the use of BI systems to a wider audience.

      The new in-memory engine is called “VertiPaq”. VertiPaq is claimed to perform much better than the classic SSAS engine, doing the aggregations and calculations, as well as temporary data storage, in the computer’s RAM and eliminating slow disk-lookup overhead. The first version of this engine is released as part of both Microsoft Office Excel 2010 and SQL Server 2008 R2, enabling SSAS to work in either classic or the new in-memory mode. The in-memory mode for SSAS is currently available only for PowerPivot-created cubes, not for classic cubes; eventually, however, the new engine will make it into a major SSAS release and become the default SSAS engine.
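      To make the appeal of an in-memory engine concrete, here is a toy Python sketch of columnar in-memory aggregation. The table and grouping function are invented for illustration; this is not how VertiPaq is actually implemented:

```python
# Toy in-memory column store: a rough illustration of why in-memory
# engines aggregate quickly. This is NOT VertiPaq's actual design.

from collections import defaultdict

# Each column is stored as its own list, so a query that sums one
# measure never has to read the other columns at all.
sales = {
    "region":  ["East", "West", "East", "North", "West"],
    "product": ["A", "A", "B", "B", "A"],
    "amount":  [100.0, 250.0, 75.0, 50.0, 125.0],
}

def sum_by(columns, group_col, measure_col):
    """Group-by aggregation over two in-memory columns."""
    totals = defaultdict(float)
    for key, value in zip(columns[group_col], columns[measure_col]):
        totals[key] += value
    return dict(totals)

print(sum_by(sales, "region", "amount"))
# {'East': 175.0, 'West': 375.0, 'North': 50.0}
```

      Everything here lives in RAM, which is the point the paragraph above makes: no disk lookups happen on the query path.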

      Meanwhile, classic SSAS is more functional than PowerPivot in terms of analytics and administration. SSAS has richer semantics, such as hierarchies, and more administration support, such as robust data security functionality. SSAS is probably the richest multidimensional engine on the market today, scalable to large data volumes and completely enterprise-ready. The downside of these capabilities is that an SSAS project requires design and planning of the BI solution, implementation, deployment, testing and additional phases. A team of BI developers, IT support, a long development cycle and infrequent updates result in a highly customized, less flexible solution that is good for years and relies on enterprise data whose structure does not change often.

      Analysis Services is the cornerstone of any corporate BI infrastructure, and it enables users to analyze data that has already been pre-modeled for them by IT. Users can create standard reports, dashboards and KPIs based on the data there, in a sense answering ‘known’ questions. PowerPivot, on the other hand, enables users to connect to any data and instantly start modeling and analyzing it “on the fly” (without IT defining the cubes and modeling it in advance). PowerPivot essentially enables users to answer the ‘unknown’ questions that often come up.

      How often have you found that data was missing from the cube? Or had a business user ask for a missing metric, whose creation you postponed until the next data warehouse update, which was itself postponed and never actually happened? This is where we need self-service BI, and this is where PowerPivot comes to help both the business user and the IT team. The PowerPivot authoring environment is the same beloved Microsoft Office Excel that everybody has and knows how to use. The simplicity and familiarity of this desktop tool eliminate the need for additional training and increase the adoption rate. Give users a tool they are not afraid of, and they’ll know how to work with the metrics. Business users can go through any data in a flat spreadsheet and produce a cube from it in a pivot table with a single mouse click. There are certain limitations, but the value is still huge: self-service BI with zero training required, and remarkable engine performance providing instant business value.

      That’s why we say SSAS answers your “known” questions and PowerPivot solves the “unknown” ones. Panorama NovaView 6.2 supports both systems and supplies our customers with the same interface and the same tools for both SSAS and PowerPivot. NovaView’s unified security layer secures both data sources at the same time, with the same security definitions, making administrators’ lives easier and making PowerPivot ready for large enterprise deployments. The NovaView BI Server resides at the center of the BI solution and implements the business logic, additional data semantics, and security applied to both SSAS and PowerPivot. It also delivers data insights over both data sources via the entire suite of NovaView front-end tools such as Flash Analytics, Dashboard, Smart Report, Spotlight and more.



      Cognos (Cognos Incorporated) was an Ottawa, Ontario-based company making business intelligence (BI) and performance management software. Founded in 1969, at its peak Cognos employed almost 3,500 people and served more than 23,000 customers in over 135 countries.

      Originally Quasar Systems Limited, it adopted the Cognos name in 1982. On January 31, 2008, Cognos was officially acquired by IBM. The Cognos name continues to be applied to IBM’s line of business intelligence (BI) and performance management products.

      In January 2010, as part of a reorganization of IBM Software Group, Cognos software and software from recently acquired SPSS were brought together to create the Business Analytics division.

      Acquisition of Cognos by IBM

      In 2007, following SAP’s acquisition of Business Objects and Oracle’s acquisition of Hyperion, IBM announced its acquisition of Cognos in November for $4.9 billion. It continued to operate as a wholly owned subsidiary (Cognos, an IBM company) until January 1, 2009, when it was absorbed into IBM’s Information Management brand within the company’s Software Group. The software is now called “Cognos Business Intelligence and Financial Performance Management”, or Cognos BI and FPM.

      BI market

      IBM’s purchase of Cognos and other business intelligence software vendors was a step in establishing IBM as a BI “megavendor” (along with Oracle, Microsoft, and SAP). This consolidation may benefit customers by giving them fewer vendors to deal with, but it raises concerns about software integration as more vendors are bought out by the big four. Another challenge is maintaining the same level of customer service.


      IBM Cognos 8 BI, initially launched in September 2005, combined the features of several previous products, including ReportNet, PowerPlay, Metrics Manager, NoticeCast, and DecisionStream. There are also Express and Extended versions of Cognos 8 BI. The full feature set includes:

      * Report Studio (Professional report authoring tool formatted for the web)
      * Query Studio (Ad hoc report authoring tool with instant data preview)
      * Analysis Studio (Explore multi-dimensional cube data to answer business questions)
      * Metric Studio (Monitor, analyze, and report on KPIs)
      * Metric Designer (Define, load, and maintain metrics to be available in Metric Studio)
      * Event Studio (Action based agents to notify decision makers as events happen)
      * Framework Manager (Semantic metadata layer tool which creates models or packages)
      * PowerPlay Studio (formerly PowerPlay Web)

      IBM Cognos Express, launched in September 2009, is an integrated business intelligence (BI) and planning solution purpose-built to meet the needs of midsize companies. The features of Express are:

      * Cognos Express Reporter (Self-service reporting and ad hoc query)
      * Cognos Express Advisor (Freeform analysis and visualization)
      * Cognos Express Xcelerator (Microsoft Excel-based planning and business analysis)

      IBM Cognos TM1

      IBM Cognos TM1 (formerly Applix TM1) is enterprise planning software used to implement collaborative planning, budgeting and forecasting solutions, as well as analytical and reporting applications. Data in IBM Cognos TM1 is stored and represented as multidimensional OLAP cubes, with data being stored at the “leaf” level. Computations on the leaf data are performed in real-time (for example, to aggregate numbers up a dimensional hierarchy). IBM Cognos TM1 includes a data orchestration environment for accessing external data and systems, as well as capabilities designed for common business planning and budgeting requirements (e.g. workflow, top-down adjustments). The latest version, IBM Cognos TM1 9.5, became publicly available on February 9, 2010.
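      The leaf-storage-plus-rollup model described above can be sketched in a few lines of Python. The cube layout, hierarchy and numbers below are invented examples; TM1's real engine is of course far more sophisticated:

```python
# Toy sketch of TM1-style leaf storage: only leaf cells are stored,
# and consolidated values are computed on demand by rolling up a
# dimension hierarchy. All names and numbers are invented examples.

# Leaf cells keyed by (region, month)
leaf_cells = {
    ("Ottawa", "Jan"): 10.0,
    ("Toronto", "Jan"): 20.0,
    ("Ottawa", "Feb"): 15.0,
    ("Toronto", "Feb"): 25.0,
}

# Region hierarchy: consolidated member -> children
region_children = {"Canada": ["Ottawa", "Toronto"]}

def cell_value(region, month):
    """Return a leaf value directly, or aggregate children in real time."""
    if region in region_children:
        return sum(cell_value(child, month) for child in region_children[region])
    return leaf_cells.get((region, month), 0.0)

print(cell_value("Canada", "Jan"))  # 30.0
```

      Writing a new number into `leaf_cells` changes every consolidated value immediately, which mirrors the real-time aggregation behaviour described above.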

      By Mark Karas, Senior Consultant

      Since Express Authoring mode was introduced as a new capability in Cognos 8.3, I have asked many of my colleagues whether they have used it yet to create a Report Studio report, and I believe every answer was “no.” There seems to be little experience with it in the field, so here is a brief high-level summary of Express Authoring mode and some comparison with the more familiar and popular Professional Authoring mode:

      Some facts about, and prerequisites for, using the Express Authoring mode:

      • Its main purpose is to give non-Cognos report professionals the ability to create financial reports.
      • Express Authoring mode works best with a package that contains a multidimensional data source (DMR, cube) but will still work with a standard relational source.
      • Reports created in either authoring mode can be edited using the other mode.
      • The mode that is set in Report Studio at the time of authoring a report is not tied to the report specification or vice-versa.
      • There is no properties window for setting object properties in Express mode.
      • The Express Authoring mode works only with Crosstabs. Lists and Charts are not available in the tool set, nor are there any prompting capabilities.
      • Like Query and Analysis Studios, the default Express authoring environment works with live data.
      • By default, reports authored in Express mode are not drill-through enabled, but drilling up and down is available.
      • Levels are not part of the toolset, strictly entire hierarchies and the members, a la Analysis Studio.

      There are some nifty little functions added to facilitate easy report authoring using members.

      • To add all the children of an existing member in the report (Data menu, Insert Children, After | Before | Nest)
      • Toggle between creating individual Members, or Member Sets when dragging a member onto the report
      • An option to insert only a single member, only the member’s children, or both, when dragging into the report.

      To get an understanding of the differences between the two modes, I used the GO Finance cube as a data source to create a Balance Sheet report in Report Studio using what is now known as the Professional Authoring mode. Professional mode offers the full feature set of Report Studio, whereas Express mode has only a small subset of those features. I then switched to Express Authoring mode and re-created the same report. Here are my findings, and in some cases opinions, about Express Authoring mode:

      Findings of the Express Authoring mode

      On the positive and/or neutral side:

      • Formatting was pretty much the same from the toolbar aspect. You do not have a properties window like in Professional mode, so the toolbar is the place to change font, justification, add borders, indenting (padding), etc.
      • Creating the Balance Sheet report in Express mode was indeed faster than using Professional mode.
      • From the financial analyst’s viewpoint, he or she will be working with accounts and account roll-ups, things they are already very familiar with, as opposed to levels of a hierarchy, which are often confusing to them.

      On the Negative side:

      • Changes in the underlying hierarchies of the source will probably break these member-only reports.
      • Additions to the Chart of Accounts will not be picked up automatically, as they would if levels were used.
      • Lack of flexibility in object usage and setting.
      • No way to turn off the data preview.
      • I didn’t find a way to change the rows-per-page limit; my report ran to two pages in HTML mode, which was very frustrating.

      In Summary:

      I found Express mode to be fairly easy to use for creating a quick financial report. It is not, however, going to remotely replace a truly professionally authored and formatted report when it comes to the total end-user/consumer experience, given its “ad hoc” feel.

      In many business cases you may need to have more than one Cognos environment on the same machine (like a development and testing environment or multiple development environments).

      The installation is simply the normal Cognos installation. The only note at installation time is to ensure that each instance is installed into a different directory, for example:

      install instance 1 under c:\Program Files\cognos\cognos83dev\
      install instance 2 under c:\Program Files\cognos\cognos83tst\

      The real work is in configuring the second instance to overcome any potential conflicts. So, assuming that you already have a running Cognos installation and need to install another instance of Cognos, here is how to do it:

      1- On the web server ensure that there are separate virtual directories pointing to the corresponding install directory for each instance.

      2- In Cognos Configuration under Environment, ensure that no two instances are both set to use the same port number (9300 for the default Tomcat application server) in the Dispatcher URIs for gateway, External Dispatcher URI, Internal Dispatcher URI, Content Manager URIs and Dispatcher URIs for external applications settings. (I used 9301 for the second installation).

      3- In Cognos Configuration under Environment -> Logging, ensure that no two instances are both set to use the same Local log server port number (9362 was the default port used in the first installation; I used 9363 for the second installation).

      4- In Cognos Configuration under Environment -> Cognos 8 service, ensure that no two instances are both set to use the same Shutdown port number (9399 was the default port used in the first installation; I used 9398 for the second installation).

      5- Ensure that no two instances are pointing to the same content store. In Cognos Configuration under Environment -> Cognos Content Database, ensure that no two instances are both set to use the same port number (1527 was the default port used in the first installation; I used 1528 for the second installation). You will also need to change the same port under Data Access -> Content Manager -> Cognos Content Store.

      Note: Make sure any new ports assigned are not currently in use. You can run “netstat -a” to list all the port numbers currently in use.
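      If you prefer to script the check, a small program can attempt to bind each candidate port; a port that binds successfully is free. This is a generic sketch, not a Cognos-supplied tool, and the port list simply mirrors the examples above; adjust it to your own configuration:

```python
# Quick sketch: verify that candidate ports for a second Cognos
# instance are free before assigning them in Cognos Configuration.
# The port numbers mirror the examples above; adjust as needed.

import socket

candidate_ports = [9301, 9363, 9398, 1528]

def port_is_free(port, host="127.0.0.1"):
    """Return True if we can bind the TCP port, i.e. nothing is using it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

for port in candidate_ports:
    status = "free" if port_is_free(port) else "IN USE - pick another"
    print(f"port {port}: {status}")
```

      This checks only ports that are in use right now, like “netstat -a”; it cannot tell you whether another service will claim the port later.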

      When I am working in Framework Manager, I always use the Context Explorer to check the relationships between database entities, fact tables and their dimensions in a star schema. When I moved to Cognos v8.4, I noticed a very annoying problem: the Context Explorer does not save its settings between sessions, so each time I open it I have to re-select view options such as the Star Schema options. I am sure I didn’t have this problem in Cognos v8.3.

      So after searching the IBM support website, I found that it is a known problem and that there is a workaround. Here is what I found on the IBM support website:

      In Framework Manager, the level of detail settings (query items, relationships, cardinality, notation) are not saved. Changing these in the Context Explorer will not save these settings for the next session.

      Change these settings in the Diagram View, and they will be propagated the next time Context Explorer is used.


      1. Open the Diagram View (View > Views > Diagram View)
      2. Set the level of detail, either by right-clicking on the whitespace and selecting Level Of Detail, or go to Diagram > Level Of Detail
      3. Save the project


      1. Open the Diagram View (View > Views > Diagram View)
      2. Go to Diagram > Diagram Settings…
      3. Specify the Level of Detail
      4. Select whether you want these settings to be the default for all new projects (8.4 only)
      5. Click OK and then save the project
      Gartner, Inc. has revealed its five predictions for business intelligence (BI) for 2009 through 2012. Speaking ahead of the Gartner Business Intelligence Summit 2009 in The Hague, analysts made predictions ranging from the impact of business units exerting greater control over analytic applications, to the effect of the economic crisis and how it will force a renewed focus on information trust and transparency, to innovations such as collaborative decision making and trusted data providers.

      “Organizations will expect IT leaders in charge of BI and performance management initiatives to help transform and significantly improve their business,” said Nigel Rayner, research vice president of Gartner. “This year’s predictions focus on the need for BI and performance management to deliver greater business value.”

      Through 2012, more than 35% of the top 5,000 global companies will regularly fail to make insightful decisions about significant changes in their business and markets

      The economic downturn forces businesses to be aware of changes in their organisations, to rethink their strategies and operating plans, and to face demands from stakeholders and governments for greater transparency about finances, operations, decisions and core performance metrics. However, most organisations do not have the information, processes and tools needed to make informed, responsive decisions, owing to underinvestment in information infrastructure and business tools.

      “IT leaders in companies with a strong culture of information-based management should create a task-force to respond to the changing information and analysis needs of executives,” said Bill Hostmann, research vice president and distinguished analyst at Gartner. “IT leaders in businesses without such a culture should document the costs and challenges of adjusting to new conditions and propose a business case for investing in the information infrastructure, process and tools to support decision making.”

      By 2012, business units will control at least 40 per cent of the total budget for BI
      Although IT organisations excel at building BI infrastructure, business users have lost confidence in IT’s ability to deliver the information they need to make decisions. Business units drive analysis and performance management initiatives, mainly using spreadsheets to create dashboards full of metrics, plus analytic and packaged business applications to automate the process. Business units will increase spending on packaged analytic applications, including corporate performance management (CPM), online marketing analytics and predictive analytics that optimise processes rather than merely report on them.

      “By making purchases independently of the IT organization, business units risk creating silos of applications and information, which will limit cross-function analysis and add complexity and delay to corporate planning and the execution of changes,” said Mr Rayner. “IT organizations can overcome this by encouraging business units to use existing assets and by creating standards for purchasing classes of packaged analytic applications that minimise the impact of isolated functions.”

      By 2010, 20% of organizations will have an industry-specific analytic application delivered via software as a service (SaaS) as a standard component of their BI portfolio

      Information aggregators will increasingly rely on SaaS to deliver domain-specific analytic applications built from the industry data they collect, shifting the balance of power in the BI platform market in their favour. Companies will only share their data with aggregators that can guarantee security and confidentiality, so, while hundreds of information aggregators offering SaaS analytic applications will emerge, a virtual monopoly will persist within each vertical niche because of the high barrier to entry for others.

      “IT leaders should work with business users to identify the information aggregators in their industry and plan to incorporate a manageable number into their BI and performance management portfolio,” said Kurt Schlegel, research vice president at Gartner. “They should work with the information provider to ensure the information tapped by the SaaS analytic application can be integrated into their internal data warehouses.”

      In 2009, collaborative decision making will emerge as a new product category that combines social software with BI platform capabilities

      The emergence of social software presents an opportunity for savvy IT leaders to exploit the groundswell of interest in informal collaboration. Instead of promoting a formal, top-down decision-making initiative, these IT leaders will tap people’s natural inclination to use social software to collaborate and make decisions.

      “Social software allows users to tag assumptions made in the decision making process to the BI framework,” said Mr Schlegel. “For example, in deciding how much to invest in marketing a new product, users can tag the assumptions they made about the future sales of that product to a key performance indicator (KPI) that measures product sales. The BI platform could then send alerts to the user when the KPI surpassed a threshold so that the decision makers know when an assumption made in the decision-making process no longer holds true. This approach dramatically improves the business value of BI because it ties all the good stuff BI delivers (e.g. analytical insights, KPIs) directly to decisions made in the business.”
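      The alerting loop Mr Schlegel describes — tying a decision’s assumptions to KPIs and flagging the decision when a KPI crosses its threshold — can be sketched in a few lines. This is a hypothetical illustration only, not any vendor’s API; all names (Assumption, Decision, check_kpis, product_x_sales) are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Assumption:
    description: str  # e.g. "product will sell 10,000 units per quarter"
    kpi: str          # name of the KPI that measures this assumption
    threshold: float  # value below which the assumption is in doubt

@dataclass
class Decision:
    title: str
    assumptions: list = field(default_factory=list)

def check_kpis(decision, kpi_values):
    """Return alert messages for assumptions whose KPI fell below threshold."""
    alerts = []
    for a in decision.assumptions:
        value = kpi_values.get(a.kpi)
        if value is not None and value < a.threshold:
            alerts.append(f"'{decision.title}': assumption '{a.description}' "
                          f"no longer holds ({a.kpi}={value} < {a.threshold})")
    return alerts

# A decision to invest in marketing, tagged with its sales assumption.
decision = Decision("Increase marketing spend for Product X")
decision.assumptions.append(
    Assumption("Product X sells at least 10,000 units per quarter",
               kpi="product_x_sales", threshold=10_000))

# Latest KPI reading is below the assumed level, so one alert is raised.
print(check_kpis(decision, {"product_x_sales": 8_500}))
```

      In a real BI platform the KPI values would come from the platform’s metric store and the alerts from its scheduling and delivery services; the sketch only shows the tagging-and-threshold logic itself.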

      By 2012, one-third of analytic applications applied to business processes will be delivered through coarse-grained application mashups

      Businesses should not trust their megavendor to solve all their integration problems. Vendors move slowly to integrate the disparate code bases they have acquired. Reliance on one vendor also limits the ability to use best-of-breed capabilities and weakens the buyer’s negotiating position. At the same time, business units do not care about grand visions for service-oriented architecture (SOA), such as assembling composite applications by weaving together fine-grained services.

      “IT leaders in Type A organisations who want to link analytics with business processes should use coarse-grained mashups of existing operational and analytical applications,” said Mr Schlegel. “Today, most use portals to integrate operational and analytical applications, but portals simply put the operational and analytical views side by side. Coarse-grained mashups overlay analytical insights, such as queries, scores, calculations, metrics and graphs, onto the graphical user interface of the operational application.”

      “The current economic crisis shows the importance of trust and transparency in the information that organisations use to run their business. Integrate the analytical insights derived from this information into the decision-making processes throughout the company,” concluded Mr Rayner.

      More information can be found in the report “Predicts 2009: Business Intelligence and Performance Management Will Deliver Greater Business Value”, available on Gartner’s website at