The Performance Index analysis we performed as part of our next-generation predictive analytics benchmark research shows that only one in four organizations, those functioning at the highest Innovative level of performance, are able to use predictive analytics to compete effectively against others that use this technology less well. We analyze performance in detail in four dimensions (People, Process, Information and Technology), and for predictive analytics we find that organizations perform best in the Technology dimension, with 38 percent reaching the top Innovative level. This is often the case in our analyses, as organizations initially perform better in the details of selecting and managing new tools than in the other dimensions. Predictive analytics is not a new technology per se, but it is becoming more common in business units, as I have written.

In contrast to organizations’ performance in the Technology dimension, only 10 percent reach the Innovative level in People and only 11 percent in Process. This disparity uncovered by the research suggests there is value in focusing on the skills used to design and deploy predictive analytics. In particular, we found that one of the two most often cited reasons participants are not fully satisfied with their organization’s use of predictive analytics is a lack of skilled resources (cited by 62%). In addition, 29 percent said that the need for too much training or customized skills is a barrier to changing their predictive analytics.

The challenge for many organizations is to find the combination of domain knowledge, statistical and mathematical knowledge, and technical knowledge they need to integrate predictive analytics into other technology systems and into operations in the lines of business, which I also have discussed. The need for technical knowledge is evident in the research findings on the jobs held by individual participants: Three out of four require technical sophistication. More than one-third (35%) are data scientists who have a deep understanding of predictive analytics and its use as well as of data-related technology; one-fourth are data analysts who understand the organization’s data and systems but have limited knowledge of predictive analytics; and 16 percent described themselves as predictive analytics experts who have a deep understanding of this topic but not of technology in general. The research also finds that those most often primarily responsible for designing and deploying predictive analytics are data scientists (in 31% of organizations) or members of the business intelligence and data warehouse team (27%). This focus on business intelligence and data warehousing represents a shift toward integrating predictive analytics with other technologies and indicates a need to scale predictive analytics across the organization.

In only about half (52%) of organizations are the people who design and deploy predictive analytics the same people who utilize the output of these processes. The most common reasons cited by research participants that users of predictive analytics don’t produce their own analyses are that they don’t have enough skills training (79%) and don’t understand the mathematics involved (66%). The research also finds evidence that skills training pays off: Fully half of those who said they received adequate training in applying predictive analytics to business problems also said they are very satisfied with their predictive analytics; percentages dropped precipitously for those who said the training was somewhat adequate (8%) and inadequate (6%). It is clear that professionals trained in both business and technology are necessary for an organization to successfully understand, deploy and use predictive analytics.

To determine the technical skills and training necessary for predictive analytics, it is important to understand which languages and libraries are used. The research shows that the most common are SQL (used by 67% of organizations) and Microsoft Excel (64%), with which many people are familiar and which are relatively easy to use. The three next-most commonly used are much more sophisticated: the open source language R (by 58%), Java (42%) and Python (36%). Overall, many languages are in use: Three out of five organizations use four or more of them. This array reflects the diversity of approaches to predictive analytics. Organizations must assess what languages make sense for their uses, and vendors must support many languages for predictive analytics to meet the demands of all customers.
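To make the language discussion concrete, here is a minimal sketch, assuming the scikit-learn library and entirely synthetic data, of the kind of predictive model an analyst might build in Python, one of the languages cited in the research; the column meanings and churn scenario are illustrative, not findings from the research.

```python
# Minimal illustration: a predictive model in Python (one of the languages cited).
# Assumes scikit-learn; the data and predictors are synthetic examples.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Two illustrative predictors: recent purchases and support calls.
X = np.column_stack([rng.poisson(3, n), rng.poisson(1, n)])
# Synthetic target: churn is more likely with many support calls and few purchases.
y = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 1] - 0.5 * X[:, 0])))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```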

The research thus makes clear that organizations must pay attention to a variety of skills and how to combine them with technology to ensure success in using predictive analytics. Not all the skills necessary in an analytics-driven organization can be combined in one person, as I discussed in my analysis of analytic personas. We recommend that as organizations focus on the skills discussed above, they consider creating cross-functional teams from both business and technology groups.

Regards,

Tony Cosentino

VP and Research Director

To impact business success, Ventana Research recommends viewing predictive analytics as a business investment rather than an IT investment. Our recent benchmark research into next-generation predictive analytics reveals that since our previous research on the topic in 2012, funding has shifted from general business budgets (previously 44%) to line-of-business IT budgets (previously 19%). Now more than half of organizations fund such projects from business budgets: 29 percent from general business budgets and 27 percent from a line-of-business IT budget. This shift in buying reflects the mainstreaming of predictive analytics in organizations, which I recently wrote about.

This shift in funding of initiatives coincides with a change in the preferred format for predictive analytics. The research reveals that the share of organizations preferring to purchase predictive analytics as stand-alone technology has dropped 15 percentage points since the previous research (29% now vs. 44% then). Instead we find growing demand for predictive analytics tools that can be integrated with operational environments such as business intelligence or transaction applications. More than two in five (43%) organizations now prefer predictive analytics embedded in other technologies. This integration can help businesses respond faster to market opportunities and competitive threats without having to switch applications.

The features most often sought in predictive analytics products further confirm business interest. Usability (very important to 67%) and capability (59%) are the top buying criteria, followed by reliability (52%) and manageability (49%). This is consistent with the priorities of organizations three years ago with one important exception: Manageability was one of the two least important criteria then (33%) but today is nearly tied with reliability for third place. This change makes sense in light of a broader use of predictive analytics and the need to manage an increasing variety of models and input variables.

Further, as a business investment predictive analytics is most often used in front-office functions, but the research shows that IT and operations are closely associated with these functions. The top four areas of predictive analytics use are marketing (48%), operations (44%), IT (40%) and sales (38%). In the previous research operations ranked much lower on the list.

To select the most useful product, organizations must understand where IT and business buyers agree and disagree on what matters. The research shows that they agree closely on how to deploy the tools: Both prefer on-premises deployment (business 53%, IT 55%), and similar proportions prefer on-demand deployment through cloud computing (business 22%, IT 23%). More than 90 percent on both sides said the organization plans to deploy more predictive analytics, and they also were in close agreement (business 32%, IT 33%) that doing so would have a transformational impact, enabling the organization to do things it couldn’t do before.

However, some distinctions are important to consider, especially when looking at the business case for predictive analytics. Business users more often focus on the benefit of achieving competitive advantage (60% vs. 50% of IT) and creating new revenue opportunities (55% vs. 41%), which are the two benefits most often cited overall. On the other hand, IT professionals more often focus on the benefits of increased upselling and cross-selling (53% vs. 32%), reduced risk (26% vs. 21%) and better compliance (26% vs. 19%); the last two reflect key responsibilities of the IT group.

Despite strong business involvement, when it comes to products, IT, technical and data experts are indispensable for the evaluation and use of predictive analytics. Data scientists or the head of data management are most often involved in recommending (52%) and evaluating (56%) predictive analytics technologies. Reflecting the need to deploy predictive analytics to business units, analysts and IT staff are the next-most influential roles for evaluating and recommending. This involvement of technically sophisticated individuals combined with the movement away from organizations buying stand-alone tools indicates an increasingly team-oriented approach.

Purchase of predictive analytics often requires approval from high up in the organization, which underscores the degree of enterprise-wide interest in this technology. The CEO or president is most likely to be involved in the final decision in small (87%) and midsize (76%) companies. In contrast, large companies rely most on IT management (40%), and very large companies rely most on the CIO or head of IT (60%). We again note the importance of IT in the predictive analytics decision-making process in larger organizations. In the previous research, IT management was involved in approval in only 9 percent of large companies, and the CIO in only 40 percent.

As predictive analytics becomes more widely used, buyers should take a broad view of the design and deployment requirements of the organization and specific lines of business. They should consider which functional areas will use the tools and consider issues involving people, processes and information as well as technology when evaluating such systems. We urge business and IT buyers to work together during the buying process with the common goal of using predictive analytics to deliver value to the enterprise.

Regards,

Tony Cosentino

VP and Research Director

Our recently released benchmark research into next-generation predictive analytics shows that in this increasingly important area many organizations are moving forward in the dimensions of information and technology, but most are challenged to find people with the right skills and to align organizational processes to derive business value from predictive analytics.

For those that have done so, the rewards can be significant. One-third of organizations participating in the research said that using predictive analytics leads to transformational change – that is, it enables them to do things they couldn’t do before – and at least half said that it provides competitive advantage or creates new revenue opportunities. Reflecting the momentum behind predictive analytics today, virtually all participants (98%) that have engaged in predictive analytics said that they will be rolling out more of it.

Our research shows that predictive analytics is being used most often in the front offices of organizations, specifically in marketing (48%), operations (44%) and IT (40%). While operations and IT are not often considered front-office functions, we find that they are using predictive analytics in service to customers. For instance, the ability to manage and impact the customer experience by applying analytics to big data is an increasingly important approach that I recently wrote about. As conventional channels of communication give way to digital channels, the use of predictive analytics in operations and IT becomes more valuable for marketing and customer service.

However, the most widespread barrier to making changes in predictive analytics is lack of resources (cited by 52% of organizations), which includes finding the necessary skills to design and deploy programs. The research shows that currently consultants and data scientists are those most often needed. Half the time those designing the system are also the end users of it, which indicates that using predictive analytics still requires advanced skills. Lack of awareness (cited by 48%) is the second-most common barrier; many organizations fail to understand the value of predictive analytics in their business. Some of the reluctance to implement predictive analytics may be because doing so can require significant change. Predictive analytics often represents a new way of thinking and can necessitate revamping of key organizational processes.

From a technical perspective, the most common deployment challenge is difficulty in integrating predictive analytics into the information architecture, an issue cited by half of participants. This is not surprising given the diversity of tools and databases involved in big data. Problems with accessing source data (30%), inappropriate algorithms (26%) and inaccurate results (21%) also impede use. Accessing and normalizing data sources is a significant issue as many different types of data must be incorporated to use predictive analytics optimally. Blending this data and turning it into a clean analytic data set often takes significant effort. Confirming this is the finding that data preparation is the most challenging part of the analytic process for half of the organizations in the research.
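The data-blending effort described above can be illustrated with a short sketch. Assuming the pandas library and two hypothetical source extracts whose file names and columns are invented for the example, preparing a clean analytic data set typically means normalizing keys, de-duplicating and joining before any model is run.

```python
# Illustrative data preparation: blend two hypothetical sources into one analytic data set.
# Assumes pandas; the file names and columns are invented for the example.
import pandas as pd

crm = pd.read_csv("crm_extract.csv")       # e.g., customer_id, segment, region
txns = pd.read_csv("transactions.csv")     # e.g., customer_id, amount, txn_date

# Normalize the join key and remove duplicate customer records.
for df in (crm, txns):
    df["customer_id"] = df["customer_id"].astype(str).str.strip()
crm = crm.drop_duplicates(subset="customer_id")

# Aggregate transactions to one row per customer.
txns["txn_date"] = pd.to_datetime(txns["txn_date"])
summary = txns.groupby("customer_id", as_index=False).agg(
    total_spend=("amount", "sum"), last_purchase=("txn_date", "max"))

# Blend into a single analytic data set; fill gaps for customers with no transactions.
analytic = crm.merge(summary, on="customer_id", how="left")
analytic["total_spend"] = analytic["total_spend"].fillna(0)
```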

Regarding interaction with other established systems, business intelligence is most often the integration point (for 56% of companies). However, predictive analytics also is increasingly embedded in databases and middleware. The ability to perform modeling in databases is important since it enables analysts to work with large data sets and do more timely model updates and scoring. Embedding into middleware has grown fourfold since our previous research on predictive analytics in 2012; this has implications for the emerging Internet of Things (IoT), through which people will interact with an increasing array of devices.

Another sign of the broader adoption of predictive analytics is how and where buying decisions are made. Budgets for predictive analytics are shifting. Since the previous research, funding sourced from general business budgets has declined 9 percent, while funding from line-of-business IT budgets has increased 8 percent. This comports with a shift in the form in which organizations prefer to buy predictive analytics, which now is less often as a stand-alone product and more often embedded in other systems. Usability and functionality are still the top buying criteria, reflecting needs to simplify predictive analytics tools and address the skills gap while still providing access to a range of capabilities.

Overall the research shows that the application of predictive analytics to business processes sets high-performing organizations apart from others. Companies more often achieve competitive advantage with predictive analytics when they support the deployment of predictive analytics in business processes (66% vs. 57% overall), use business intelligence and data warehouse teams to design and deploy predictive analytics (71% vs. 58%) and fund predictive analytics as a shared service (73% vs. 58%). Similarly, those that train employees in the application of predictive analytics to business problems achieve more satisfaction and better outcomes.

Organizations looking to improve their business through predictive analytics should examine what others are doing. Since the time of our previous research, innovation has expanded and there are more peer organizations across industries and business functions that can be emulated. And the search for such innovation need not be limited to within one’s industry; cross-industry examples also can be enlightening. More concretely, the research finds that people and processes are where organizations can improve most in predictive analytics. We advise them to concentrate on streamlining processes, acquiring necessary skills and supporting both with technology available in the market. To begin, develop a practical predictive analytics strategy and enlist all stakeholders in the organization to support initiatives.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research into big data analytics shows that marketing in the form of cross-selling and upselling (38%) and customer understanding (32%) are the top use cases for big data analytics. Related to these uses, organizations today spend billions of dollars on programs seeking customer loyalty and satisfaction. A powerful metric that impacts this spending is net promoter score (NPS), which attempts to connect brand promotion with revenue. NPS has proven to be a popular metric among major brands and Fortune 500 companies. Today, however, the advent of big data systems brings the value and the accuracy of NPS into question. It and similar loyalty metrics face displacement by big data analytics capabilities that can replace stated behavior and survey-based attitudinal data with actual behavioral data (sometimes called revealed behavior) combined with unstructured data sources such as social media. Revealed behavior shows what people have actually done and thus is a better predictor of what they will do in the future than what they say they have done or intend to do. With interaction through various customer touch points (the omnichannel approach) it is possible to measure both attitudes and revealed behavior in a digital format and to analyze such data in an integrated fashion. Using innovative technology such as big data analytics can overcome three inherent drawbacks of NPS and similar customer loyalty and satisfaction metrics.

Such metrics have been part of the vernacular in boardrooms, organizational cultures and MBA programs since the 1980s, based on frameworks such as the Balanced Scorecard introduced by Kaplan and Norton. Net promoter score, a metric to inform the customer quadrant of such scorecards, is based on surveys in which participants are asked how likely they are to promote a brand on an 11-point scale. The percentage of detractors (scores 0-6) is subtracted from the percentage of promoters (scores 9-10) to produce the net promoter score. This score helps companies assess satisfaction with a brand and allows executives and managers to allocate resources. The underlying assumption is that attitude toward a brand is a leading indicator of intent and behavior. As such, NPS ostensibly can predict outcomes such as churn (the rate at which customers leave). By understanding attitudes and behavioral intent, marketers can intervene with actions such as timely offers intended to change behavior and keep customers from leaving.
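As a worked illustration of the calculation just described, this sketch computes NPS from a list of 0-to-10 survey responses; the scores are made up.

```python
# Net promoter score: % promoters (9-10) minus % detractors (0-6) on an 11-point scale.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical survey responses for illustration.
responses = [10, 9, 9, 8, 7, 6, 6, 5, 10, 3]
print(net_promoter_score(responses))  # (4 promoters - 4 detractors) / 10 -> 0.0
```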

Until recently, NPS and similar loyalty approaches have been among the most widely adopted methods to track attitudes and behaviors in customer interactions and to provide a logical way to impact and improve the customer experience. The prominence of such loyalty programs and metrics reflects an increasing focus on the customer. An indication of this increased focus is found in our next-generation customer analytics benchmark research, in which improving the customer experience (63%), improving customer service strategy (57%) and improving outcomes of interactions (51%) are the top drivers for adopting customer analytics. Nevertheless, while satisfaction and loyalty metrics such as NPS are entrenched in many organizations, there are three fundamental problems with them that can be overcome using big data analytics. Let’s look at each of these challenges and how big data analytics can overcome them.

It is prone to error. Current methods and metrics are vulnerable to errors, most deriving from one of three sources.

Coverage error results from measuring only a segment of a population and projecting the results onto the entire population. The problem here is clear if we imagine using data about California to draw conclusions about the entire United States. While researchers try to overcome such coverage error with stratified sampling methods, doing so necessitates significant investment usually not associated with business research. Additionally, nonresponse error, a subset of coverage error, results from people opting out of being measured.

Sample error is the statistical error that arises from drawing conclusions about a population based on only a subset of it. Researchers can overcome it by increasing sample sizes, but this, too, requires significant investment usually not associated with business research.
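To see why reducing sample error requires real investment, here is a small sketch of the standard margin-of-error formula for a proportion at roughly 95 percent confidence, using illustrative sample sizes; the key point is that halving the error requires roughly quadrupling the sample.

```python
# Margin of error for a proportion at ~95% confidence: z * sqrt(p(1-p)/n).
import math

def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 400, 1600, 6400):
    print(n, round(100 * margin_of_error(0.5, n), 1), "percentage points")
# Halving the margin of error requires roughly quadrupling the sample size.
```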

Measurement error is a complex topic that deserves an extended discussion beyond the scope of this post; minimizing it presumes that analysts start with a hypothesis and try to disprove rather than prove it, iterating to come as close to the truth as possible. In the case of NPS, measurement error can simply be the result of people not telling the truth or being unduly influenced by a recent experience that skews evaluations such as brand impression or likelihood to promote a brand. Another instance occurs when a proper response option is not represented and people are forced to give an incorrect response.

Big data can address these error vulnerabilities because it uses a census approach to data collection. Today companies can capture data about nearly every customer interaction with the brand, including customer service calls, website experiences, social media posts and transactions. Because the data is collected across the entire population and includes more revealed behavior than attitudinal and stated behavior, the error problems associated with NPS can be largely overcome.

It lacks causal linkage with financial metrics. The common claim that a higher NPS leads to increased revenue, like the presumed relationship between customer satisfaction and business outcomes, is impossible to prove in all circumstances and all industries. For instance, a pharmaceutical company trying to tie NPS to revenue might ask a doctor how likely he is to write a prescription for a certain drug. The doctor might see this as a compromising question and not be willing to answer honestly. Regarding satisfaction metrics, Microsoft in the 1990s had very low user satisfaction but high loyalty because it had a virtual monopoly. The airline industry today sees similar dynamics.

Big data analytics can show causal linkage between measurement of the customer experience and the organization’s financial metrics. It can link systems of record such as enterprise resource planning and enterprise performance management with systems of engagement such as content management, social media, marketing and sales. Collecting large data sets of customer interactions over time enables systems to relate customer experiences with purchase behaviors such as recency, frequency and size of purchase. This can be done on an ongoing basis and can be tested with randomized experiments. With big data platforms that can reduce data to the lowest common denominator in the form of key-value pairs, the main remaining obstacles are having the right skill sets, big data analytics software and enough data to isolate variables and repeat the experiments over time. When there is enough data, causal patterns emerge that can link customer attitudes and experiences directly with transactional outcomes. As long as there is enough data, such linkage can be revealed in any type of market, whether share-of-wallet markets such as consumer packaged goods or “winner take all” markets such as automobiles.
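The randomized-experiment linkage described above can be sketched very simply. The following example, with invented spend figures, compares average subsequent spend between customers who received a hypothetical service improvement and a control group; a production analysis would add significance testing and control for customer value.

```python
# Illustrative randomized-experiment readout: compare average subsequent spend
# between a treatment group (received a service improvement) and a control group.
# The spend figures are invented.
from statistics import mean

treatment_spend = [120, 150, 90, 200, 175, 130]
control_spend   = [100, 110, 95, 160, 120, 105]

lift = mean(treatment_spend) - mean(control_spend)
print(f"average lift per customer: ${lift:.2f}")
# A real analysis would add a significance test and control for customer value.
```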

It lacks actionable data. Often loyalty metrics such as NPS are tied to employee compensation. Those employees have a motivation to understand the metric and what action is needed to improve the score, but that is not easy, for several reasons. Unlike quantitative metrics such as revenue or profitability, NPS and similar loyalty metrics are softer metrics whose impacts are not easily understood. Furthermore, the measurement may happen just once or twice a year, and the composition of the sample can change over time. Often a customer satisfaction team and consultants responsible for the research prepare the trend and driver analysis and share it with various teams along with suggested areas of improvement and actions to be taken. Such information is disseminated based on aggregated data broken out by important product and service segments and perhaps customer journey timelines. The problem is that even if employees understand the metric and how to impact it, by the time action is taken within the organization, it is neither timely nor customized to the individual customer.

Big data analytics inherently has a streamlined capability to act upon data. Instead of the traditional process of reporting results and waiting months for action to be taken on those results and new results to show up in an NPS program, data can be acted upon immediately by all employees. A big reason for this is that data is now collected at a granular level for individual customers. For instance, if a customer with a high customer lifetime value (CLV) score shows signs that are precursors of switching companies, a report can be issued to show all interactions in that individual’s customer journey and highlight the most impactful events. Then an alert can be sent and a personal interaction such as a phone call or a face-to-face meeting can be set up with the objective of preventing the customer’s defection. Incentives such as a bank automatically waiving certain fees, an airline giving an upgrade to first class or a grocery store giving a gift certificate can be recommended by the system as a next best action. This can also be done on a more automated but still personalized basis, discreetly addressing the individual customer to see how he or she can be made happy. Each of these actions can be measured against the value of the customer and contextualized for that customer. In this way, big data analytics platforms can bring together what used to be separate analytic models and action plans related to loyalty, churn, micromarketing campaigns and next best action. It is not surprising in this context that applying predictive analytics is the most important capability of big data analytics for nearly two-thirds (64%) of organizations participating in our research.

I wrote about these ideas a few years ago, but only recently have I seen information systems capable of disrupting this entire category. It will not happen overnight, since many NPS and satisfaction programs are tied to a component of employee compensation and to internal processes that are not easily changed. Furthermore, NPS can still have value as a metric to understand word of mouth around a brand and in areas that lack data and better metrics. However, as attitudinal and behavioral big data continue to be collected and big data analytics technology continues to mature, revealed behavior will always outperform attitudinal and stated behavior data. Organizations that can challenge their conventional NPS wisdom and overcome internal political obstacles are likely to see superior return from their customer experience management investments.

Regards,

Tony Cosentino

VP and Research Director

Ventana Research recently completed the most comprehensive evaluation of analytics and business intelligence products and vendors available anywhere. As I discussed recently, such research is necessary and timely as analytics and business intelligence is now a fast-changing market. Our Value Index for Analytics and Business Intelligence in 2015 scrutinizes 15 top vendors and their product offerings in seven key categories: Usability, Manageability, Reliability, Capability, Adaptability, Vendor Validation and TCO/ROI. The analysis shows that the top supplier is Information Builders, which qualifies as a Hot vendor and is followed by other Hot vendors including SAP, IBM, MicroStrategy, Oracle, SAS, Qlik, Actuate (now part of OpenText) and Pentaho.

The evaluations drew on our research and analysis of vendors and products, along with their responses to our detailed RFI or questionnaire, our own hands-on experience and the buyer-related findings from our benchmark research on next-generation business intelligence, information optimization and big data analytics. The benchmark research examines analytics and business intelligence from various perspectives to determine organizations’ current and planned use of these technologies and the capabilities they require for successful deployments.

We find that the processes that comprise business intelligence today have expanded beyond standard query, reporting, analysis and publishing capabilities. They now include sourcing and integration of data and, at later stages, the use of analytics for planning and forecasting as well as capabilities that apply analytics and metrics to collaborative interaction and performance management. Our research on big data analytics finds that new technologies collectively known as big data are influencing the evolution of business intelligence as well; here in-memory systems (used by 50% of participating organizations), Hadoop (42%) and data warehouse appliances (33%) are the most important innovations. In-memory computing in particular has changed BI because it enables rapid processing of even complex models with very large data sets. In-memory computing also can change how users access data through data visualization and incorporate data mining, simulation and predictive analytics into business intelligence systems. Thus the ability of products to work with big data tools figured in our assessments.

In addition, the 2015 Value Index includes assessments of vendors’ self-service tools and cloud deployment options. New self-service approaches can enable business users to reduce their reliance on IT to access and use data and analysis. However, our information optimization research shows that this change is slow to proliferate. In four out of five organizations, IT currently is involved in making information available to end users and remains entrenched in the operations of business intelligence systems.

Similarly, our research, as well as the lack of maturity of the cloud-based products evaluated, shows that organizations are still in the early stages of cloud adoption for analytics and business intelligence; deployments are mostly departmental in scope. We are exploring these issues further in our benchmark research into data and analytics in the cloud, which will be released in the second quarter of 2015.

The products offered by the five top-rated companies in the Value Index provide exceptional functionality and a superior user experience. However, Information Builders stands out, providing an exceptional user experience and a completely integrated portfolio of data management, predictive analytics, visual discovery and operational intelligence capabilities in a single platform. SAP, in second place, is not far behind, having made significant progress by integrating its Lumira platform into its BusinessObjects Suite; it added predictive analytics capabilities, which led to higher Usability and Capability scores. IBM, MicroStrategy and Oracle, the next three, each provide a robust integrated platform of capabilities. The key differentiator between them and the top two is that they do not have superior scores in all seven categories.

In evaluating products for this Value Index we found some noteworthy innovations in business intelligence. One is Qlik Sense, which has a modern architecture that is cloud-ready and supports responsive design on mobile devices. Another is SAS Visual Analytics, which combines predictive analytics with visual discovery in ways that are a step ahead of others currently in the market. Pentaho’s Automated Data Refinery concept adds its unique Pentaho Data Integration platform to business intelligence for a flexible, well-managed user experience. IBM Watson Analytics uses advanced analytics and natural language processing for an interactive experience beyond the traditional paradigm of business intelligence. Tableau, which led the field in the category of Usability, continues to innovate in the area of user experience and aligning technology with people and process. MicroStrategy’s innovative Usher technology addresses the need for identity management and security, especially in an evolving era in which individuals utilize multiple devices to access information.

The Value Index analysis uncovered notable differences in how well products satisfy the business intelligence needs of employees working in a range of IT and business roles. Our analysis also found substantial variation in how products provide development, security and collaboration capabilities and role-based support for users. Thus, we caution that similar vendor scores should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every organization or for a specific process.

To learn more about this research and to download a free executive summary, please visit our website.

Regards,

Tony Cosentino

VP and Research Director

Just a few years ago, the prevailing view in the software industry was that the category of business intelligence (BI) was mature and without room for innovation. Vendors competed in terms of feature parity and incremental advancements of their platforms. But since then business intelligence has grown to include analytics, data discovery tools and big data capabilities to process huge volumes and new types of data much faster. As is often the case with change, though, this one has created uncertainty. For example, only one in 11 participants in our benchmark research on big data analytics said that their organization fully agrees on the meaning of the term “big data analytics.”

There is little question that clear definitions of analytics and business intelligence as they are used in business today would be of value. But some IT analyst firms have tried to oversimplify the process of updating these definitions by merely combining a market basket of discovery capabilities under the label of analytics. In our estimation, this attempt is neither accurate nor useful. Discovery tools are only components of business intelligence, and their capabilities cannot accomplish all the tasks comprehensive BI systems can do. Some firms seem to want to reduce the field further by overemphasizing the visualization aspect of discovery. While visual discovery can help users solve basic business problems, other BI and analytic tools are available that can attack more sophisticated and technically challenging problems. In our view, visual discovery is one of four types of analytic discovery that can help organizations identify and understand the masses of data they accumulate today. But for many organizations visualization alone cannot provide them with the insights necessary to help make critical decisions, as interpreting the analysis requires expertise that mainstream business professionals lack.

In Ventana Research’s view, business intelligence is a technology managed by IT that is designed to produce information and reports from business data to inform business about the performance of activities, people and processes. It has provided and will continue to provide great value to business, but in itself basic BI will not meet the new generation of requirements that businesses face; they need not just information but guidance on how to take advantage of opportunities, address issues and mitigate the risks of subpar performance. Analytics is a component of BI that is applied to data to generate information, including metrics. It is a technology-based set of methodologies used by analysts as well as the information gained through the use of tools designed to help those professionals.

These thoughtfully crafted definitions inform the evaluation criteria we apply in our new and comprehensive 2015 Analytics and Business Intelligence Value Index, which we will publish soon. As with all business tools, applications and systems we assess in this series of indexes, we evaluate the value of analytics and business intelligence tools in terms of five functional categories – usability, manageability, reliability, capability and adaptability – and two customer assurance categories – validation of the vendor and total cost of ownership and return on investment (TCO/ROI). We feature our findings in these seven areas of assessment in our Value Index research and reports. In the Analytics and Business Intelligence Value Index for 2015 we assess in depth the products of 15 of the leading vendors in today’s BI market.

The Capabilities category examines the breadth of functionality that products offer and assesses their ability to deliver the insights today’s enterprises need. For our analysis we divide this category into three subcategories for business intelligence: data, analytics and optimization. We explain each of them below.

The data subcategory of Capabilities examines data access and preparation along with supporting integration and modeling. New data sources are coming into being continually; for example, data now is generated in sensors in watches, smartphones, cars, airplanes, homes, utilities and an assortment of business, network, medical and military equipment. In addition, organizations increasingly are interested in behavioral and attitudinal data collected through various communication platforms. Examples include Web browser behavior, data mined from the Internet, social media and various survey and community polling data. The data access and integration process identifies each type of data, integrates it with all other relevant types, checks it all for quality issues, maps it back to the organization’s systems of record and master data, and manages its lineage. Master data management in particular, including newer approaches such as probabilistic matching, is a key component for creating a system that can combine data types across the organization and in the cloud to create a common organizational vernacular for the use of data.
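As a simple illustration of the probabilistic matching mentioned above, the following sketch scores the similarity of two hypothetical customer records using only Python’s standard library and an invented matching threshold; production master data management tools use far more sophisticated models.

```python
# Toy probabilistic record matching: score name/address similarity and apply a threshold.
# Uses only the standard library; the records and the 0.85 threshold are illustrative.
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec1, rec2):
    # Weight name similarity more heavily than address similarity.
    return 0.6 * similarity(rec1["name"], rec2["name"]) + \
           0.4 * similarity(rec1["address"], rec2["address"])

crm_record = {"name": "Jonathan Smith", "address": "12 Main Street, Springfield"}
web_record = {"name": "Jon Smith",      "address": "12 Main St, Springfield"}

score = match_score(crm_record, web_record)
print(round(score, 2), "-> match" if score > 0.85 else "-> review manually")
```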

Ascertaining which systems must be accessed and how is a primary challenge for today’s business intelligence platforms. A key part of data access is the user interface. Whether it appears in an Internet browser, a laptop, a smartphone, a tablet or a wearable device, data must be presented in a manner optimized for the interface. Examining the user interface for business intelligence systems was a primary interest of our 2014 Mobile Business Intelligence Value Index. In that research, we learned that vendors are following divergent paths and that it may be hard for some to change course as they continue. Therefore, how a vendor manages mobile access and other new modes of access affects its products’ value for particular organizations.

Once data is accessed, it must be modeled in a useful way. Data models in the form of OLAP cubes and predefined relationships of data sometimes grow overly complex, but there is value in premodeling data in ways that make sense to business people, most of whom are not up to modeling it for themselves. Defining data relationships and transforming data through complex manipulations is often needed, for instance, to define performance indicators that align with an organization’s business initiatives. These manipulations can include business rules or what-if analysis within the context of a model or external to it. Finally, models must be flexible so they do not hinder the work of organizational users. The value of premodeling data is that it provides a common view for business users so they need not redefine data relationships that have already been thoroughly considered.

The analytics subcategory includes analytic discovery, prediction and integration. Discovery and prediction roughly map to the ideas of exploratory and confirmatory analytics, which I have discussed. Analytic discovery includes calculation and visualization processes that enable users to move quickly and easily through data to create the types of information they need for business purposes. Complementing it is prediction, which typically follows discovery. Discovery facilitates root-cause and historical analysis, but to look ahead and make decisions that produce desired business outcomes, organizations need to track various metrics and make informed predictions. Analytic integration encompasses customization of both discovery and predictive analytics and embedding them in other systems such as applications and portals.

The optimization subcategory includes collaboration, organizational management, information optimization, action and automation. Collaboration is a key consideration for today’s analytic platforms. It includes the ability to publish, share and coordinate various analytic and business intelligence functions. Notably, some recently developed collaboration platforms incorporate many of the characteristics of social platforms such as Facebook or LinkedIn. Organizational management attempts to manage to particular outcomes and sometimes provides performance indicators and scorecard frameworks. Action assesses how technology directly assists decision-making in an operational context. This includes gathering inputs and outputs for collaboration before and after a decision, predictive scoring that prescribes action and delivery of the information in the correct form to the decision-maker. Finally, automation triggers alerts based on statistical thresholds or rules and should be managed as part of a workflow. Agent technology takes automation to a level that is more proactive and autonomous.
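As a minimal sketch of the automation described above, the following example, with an invented metric history and a simple three-standard-deviation rule, shows the kind of statistical trigger that would hand an alert to a managed workflow.

```python
# Illustrative automation trigger: alert when the latest value falls outside
# a three-standard-deviation band around the historical mean.
from statistics import mean, stdev

history = [102, 98, 101, 99, 100, 103, 97, 101]   # invented daily metric values
latest = 89

mu, sigma = mean(history), stdev(history)
if abs(latest - mu) > 3 * sigma:
    print(f"Alert: {latest} is outside the expected range "
          f"({mu - 3*sigma:.1f} to {mu + 3*sigma:.1f}); route to workflow")
```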

This broad framework of data, analytics and optimization fits with a process orientation to business analytics that I have discussed. Our benchmark research on information optimization indicates that the people and process dimensions of performance are less well developed than the information and technology aspects, and thus a focus on these aspects of business intelligence and analytics will be beneficial.

In our view, it’s important to consider business intelligence software in a broad business context rather than in artificially separate categories that are designed for IT only. We advise organizations seeking to gain a competitive edge to adopt a multifaceted strategy that is business-driven, incorporates a complete view of BI and analytics, and uses the comprehensive evaluation criteria we apply.

Regards,

Tony Cosentino

VP and Research Director

In many organizations, advanced analytics groups and IT are separate, and there often is a chasm of understanding between them, as I have noted. A key finding in our benchmark research on big data analytics is that communication and knowledge sharing is a top benefit of big data analytics initiatives, but often it is a latent benefit. That is, prior to deployment, communication and knowledge sharing is deemed a marginal benefit, but once the program is deployed it is deemed a top benefit. From a tactical viewpoint, organizations may not spend enough time defining a common vocabulary for big data analytics prior to starting the program; our research shows that fewer than half of organizations have agreement on the definition of big data analytics. It makes sense therefore that, along with a technical infrastructure and management processes, explicit communication processes at the beginning of a big data analytics program can increase the chance of success. We found these qualities in the Chorus platform of Alpine Data Labs, which received the Ventana Research Technology Innovation Award for Predictive Analytics in September 2014.

Alpine Chorus 5.0, the company’s flagship product, addresses the big data analytics communication challenge by providing a user-friendly platform for multiple roles in an organization to build and collaborate on analytic projects. Chorus helps organizations manage the analytic life cycle from discovery and data preparation through model development and model deployment. It brings together analytics professionals via activity streams for rapid collaboration and workspaces that encourage projects to be managed in a uniform manner. While activity streams enable group communication via short messages and file sharing, workspaces allow each analytic project to be managed separately with capabilities for project summary, tracking and data source mapping. These functions are particularly valuable as organizations embark on multiple analytic initiatives and need to track and share information about models as well as the multitude of data sources feeding the models.

The Alpine platform addresses the challenge of processing big data by parallelizing algorithms to run across big data platforms such as Hadoop and making them accessible to a wide audience of users. The platform supports most analytic databases and all major Hadoop distributions. Alpine was an early adopter of Apache Spark, an open source in-memory data processing framework that one day may replace the original map-reduce processing paradigm of Hadoop. Alpine Data Labs has been certified by Databricks, the primary contributor to the Spark project, which is responsible for 75 percent of the code added in the past year. With Spark, Alpine’s analytic models such as logistic regression run in a fraction of the time previously possible, and Spark enables new approaches such as one the company calls Sequoia Forest, a machine learning technique that is a more robust version of random forest analysis. Our big data analytics research shows that predictive analytics is a top priority for about two-thirds (64%) of organizations, but they often lack the skills to deploy a fully customized approach. This is likely a reason that companies now are looking for more packaged approaches to implementing big data analytics (44%) than custom approaches (36%), according to our research. Alpine taps into this trend by delivering advanced analytics directly in Hadoop and the HDFS file system with its in-cluster analytic capabilities that address the complex parallel processing tasks needed to run in distributed environments such as Hadoop.
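To make the Spark point concrete, here is a minimal sketch of a logistic regression run through the open source PySpark API; it is illustrative only, does not represent Alpine’s product, and uses a tiny in-memory data set in place of data that would normally reside in Hadoop.

```python
# Illustrative logistic regression on Spark (PySpark); not Alpine-specific.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("lr-sketch").getOrCreate()

# Tiny in-memory example; in practice the DataFrame would come from HDFS or Hive.
df = spark.createDataFrame(
    [(0.0, 1.0, 0.0), (1.0, 0.0, 1.0), (2.0, 1.0, 1.0), (0.5, 2.0, 0.0)],
    ["visits", "complaints", "label"])

# Assemble the predictor columns into a feature vector, then fit and score.
assembled = VectorAssembler(inputCols=["visits", "complaints"],
                            outputCol="features").transform(df)
model = LogisticRegression(labelCol="label").fit(assembled)
model.transform(assembled).select("label", "prediction").show()
spark.stop()
```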

A key differentiator for Alpine is usability. Its graphical user interface provides a visual analytic workflow experience built on popular algorithms to deliver transformation capabilities and predictive analytics on big data. The platform supports scripts in the R language, which can be cut and pasted into the workflow development studio; custom operators for more advanced users; and Predictive Model Markup Language (PMML), which enables extensible model sharing and scoring across different systems. The complexities of the underlying data stores and databases as well as the orchestration of the analytic workflow are abstracted from the user. Using it, an analyst or statistician does not need to know programming languages or the intricacies of the database technology to build analytic models and workflows.

It will be interesting to see what direction Alpine will take as the big data industry continues to evolve; currently there are many point tools, each strong in a specific area of the analytic process. For many of the analytic tools currently available in the market, co-opetition among vendors prevails, in which partner ecosystems compete with stack-oriented approaches. The decisions vendors make in terms of partnering as well as research and development are often a function of these market dynamics, and buyers should be keenly aware of who aligns with whom. For example, Alpine currently partners with Qlik and Tableau for data visualization but also offers its own data visualization tool. Similarly, it offers data transformation capabilities, but its toolbox could be complemented by data preparation and master data solutions. This emerging area of self-service data preparation is important to line-of-business analysts, as my colleague Mark Smith recently discussed.

Alpine Labs is one of many companies that have been gaining traction in the booming analytics market. With a cadre of large clients and venture capital backing of US$23 million in series A and B rounds, Alpine competes in an increasingly crowded and diverse big data analytics market. The management team includes industry veterans Joe Otto and Steve Hillion. Alpine seems particularly well suited for customers that have a clear understanding of the challenges of advanced analytics and are committed to using it with big data to gain a competitive advantage; that benefit is the one organizations cite most often, by more than two-thirds (68%), in our predictive analytics benchmark research. A key differentiator for Alpine Labs is the collaboration platform, which helps companies clear the communication hurdle discussed above and address the advanced analytics skills gap at the same time. The collaboration assets embedded into the application and the usability of the visual workflow process enable the product to meet a host of needs in predictive analytics. This platform approach to analytics is often missing in organizations grounded in individual processes and spreadsheet approaches. Companies seeking to use big data with advanced analytics tools should include Alpine Labs in their consideration.

Regards,

Tony Cosentino

VP and Research Director

The idea of not focusing on innovation is heretical in today’s business culture and media. Yet a recent article in The New Yorker suggests that society focuses too much on innovation and technology; the same may be true of technology in business organizations. Our research provides evidence for this claim.

My analysis of our benchmark research into information optimization shows that organizations perform better in the technology and information dimensions than in people and process. They face a flood of information that continues to increase in volume and frequency and must use technology to manage and analyze it in the hope of improving their decision-making and competitiveness. It is understandable that many see this as foremost an IT issue. But proficiency in the use of technology and even statistical knowledge are not the only capabilities needed to optimize an organization’s use of information and analytics. Organizations also need a framework that complements the usual analytical modeling to ensure that analytics are used correctly and deliver the desired results. Without a process for getting to the right question, users can go off in the wrong direction, producing results that cannot solve the problem.

In terms of business analytics strategy, getting to the right question is a matter of defining goals and terms; when this is done properly, the “noise” of differing meanings is reduced and people can work together efficiently. As we all know, many terms, especially new ones, mean different things to different people, and this can be an impediment to teamwork and to achieving business goals. Our research into big data analytics shows a significant gap in understanding here: Fewer than half of organizations have internal agreement on what big data analytics is. This lack of agreement is a barrier to building a strong analytic process. The best practice is to take time to discover what people really want to know; describing something in detail ensures that everyone is on the same page. Strategic listening is a critical skill, and done right it enables analysts to identify, craft and focus the questions that the organization needs answered through the analytic process.

To develop an effective process and create an adaptive mindset, organizations should instill a Bayesian sensibility. Bayesian analysis, historically known as inverse probability, starts with a prior probability, an initial belief, and updates it with observed evidence to produce a posterior probability. In a practical sense, it’s about updating a hypothesis when given new information; it’s about taking all available information and finding where it converges. This is a flexible approach in which beliefs are revised as new information is presented; it values both data and intuition. This mindset also instills strategic listening in the team and in the organization.
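
As a minimal illustration of this kind of belief revision, the sketch below performs a Bayesian update in Python with a beta-binomial model; the prior and the observed data are invented for the example.

```python
# Sketch: Bayesian updating with a beta-binomial model.
# Numbers are invented; the point is that the prior belief and new
# evidence combine into a revised (posterior) belief.
from scipy import stats

# Prior belief about a conversion rate: roughly 30%, held loosely.
prior_alpha, prior_beta = 3, 7          # Beta(3, 7) has mean 0.30

# New evidence arrives: 120 trials, 48 successes.
successes, trials = 48, 120

# Conjugate update: add successes and failures to the prior's parameters.
post_alpha = prior_alpha + successes
post_beta = prior_beta + (trials - successes)
posterior = stats.beta(post_alpha, post_beta)

print(f"prior mean:     {prior_alpha / (prior_alpha + prior_beta):.3f}")
print(f"posterior mean: {posterior.mean():.3f}")
print("95% credible interval:", posterior.interval(0.95))
```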

For business analytics, the more you know about the category you’re dealing with, the easier it is to separate valuable information and hypotheses from what is not valuable. Category knowledge allows you to look at the data from a different perspective and bring existing knowledge to bear. This in and of itself is a Bayesian approach, and it allows the analyst to iteratively take the investigation in the right direction. This is not to say that intuition should be the analytic starting point. Data is the starting point, but a hypothesis is needed to make sense of the data. Physicist Enrico Fermi pointed out that measurement is the reduction of uncertainty. Analysts should start with a hypothesis and try to disprove it rather than prove it. From there, iteration is needed to come as close to the truth as possible. Starting with a gut feel and trying to prove it is the wrong approach: The results are rarely surprising, and the analysis is likely to add nothing new. Let the data guide the analysis rather than allowing predetermined beliefs to guide it. Technological innovations in exploratory analytics and machine learning support this idea and encourage a data-driven approach.

In recent years Bayesian analysis has had a great impact not only on statistics and market insights but also on how we view important historical events. It is consistent with modern thinking in the fields of technology and machine learning as well as behavioral economics. For those interested in how the Bayesian philosophy is taking hold in many different disciplines, I recommend the book The Theory That Would Not Die by Sharon Bertsch McGrayne.

A good analytic process, however, needs more than a sensibility for how to derive and think about questions; it needs a tangible method to address the questions and derive business value from the answers. The method I propose can be framed in four steps: what, so what, now what and then what. Moving beyond the “what” (i.e., measurement and data) to the “so what” (i.e., insights) should be a goal of any analysis, yet many organizations are still turning out analysis that does nothing more than state the facts. Maybe 54 percent of people in a study prefer white houses, but why does anyone care? Analysis must move beyond mere findings to answer critical business questions and provide informed insights, implications and ideally full recommendations. That said, if organizations cannot get the instrumentation and the data right, findings and recommendations are subject to scrutiny.

The analytics professional should make sure that the findings, implications and recommendations of the analysis are heard by strategic and operational decision-makers. This is the “now what” step and includes business planning and implementation decisions that are driven by the analytic insights. If those insights do not lead to decision-making or action, the analytic effort has no value. There are a number of things that the analyst can do to make the information heard. A compelling story line that incorporates storytelling techniques, animation and dynamic presentation is a good start. Depending on the size of the initiative, professional videography, implementation of learning systems and change management tools also may be used.

The “then what” represents a closed-loop process in which insights and new data are fed back into the organization’s operational systems. This can take the form of institutional knowledge and learning in the usual human sense, which is imperative in organizations. Our benchmark research into big data and business analytics shows a need for this: Skills and training are substantial obstacles to using big data (for 79%) and analytics (77%) in organizations. The process is also similar to machine learning: As new information is brought into the organization, the organization as a whole learns and adapts to current business conditions. This is the goal of the closed-loop analytic process.
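
The machine learning analogy can be made concrete with incremental (online) learning, in which a model is updated as each new batch of operational data arrives. The sketch below uses scikit-learn’s partial_fit as a stand-in for the closed loop described above; the feature and label data are invented.

```python
# Sketch: a closed analytic loop as incremental learning.
# Each "feedback batch" of new operational data updates the model in place,
# analogous to the organization learning as new information arrives.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()

def new_feedback_batch(n=200):
    """Invented stand-in for data flowing back from operational systems."""
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

# The first batch establishes the classes; later batches refine the model.
X, y = new_feedback_batch()
model.partial_fit(X, y, classes=np.array([0, 1]))

for _ in range(5):                      # the loop keeps running in production
    X, y = new_feedback_batch()
    model.partial_fit(X, y)             # beliefs updated with new evidence

print("accuracy on the latest batch:", model.score(X, y))
```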

Our business technology innovation research finds analytics among the top three priorities in three out of four (74%) organizations; collaboration is a top-three priority in 59 percent. Both analytics and collaboration have a process orientation that uses technology as an enabler. The sooner organizations implement a process framework, the sooner they can achieve success in their analytic efforts. To implement a successful framework such as the one described above, organizations must realize that innovation is not the top priority; rather, they need the ability to use innovation to support an adaptable analytic process. The benefits will be wide-ranging, including better understanding of objectives, more targeted analysis, greater analytical depth and analytical initiatives that have a real impact on decision-making.

Regards,

Tony Cosentino

VP and Research Director

Oracle is one of the world’s largest business intelligence and analytics software companies. Its products range from middleware, back-end databases and ETL tools to business intelligence applications and cloud platforms, and it is well established in many corporate and government accounts. A key to Oracle’s ongoing success is in transitioning its business intelligence and analytics portfolio to self-service, big data and cloud deployments. To that end, three areas in which the company has innovated are fast, scalable access for transaction data; exploratory data access for less structured data; and cloud-based business intelligence.

Providing users with access to structured data in an expedient and governed fashion continues to be a necessity for companies. Our benchmark research into information optimization finds drilling into information within applications (37%) and search (36%) to be the capabilities business end users need most.

To provide them, Oracle enhanced its database in Oracle 12c, released in 2013. The key innovation is enabling both transaction processing and analytic processing workloads on the same system. Using in-memory instruction sets on the processor, the system can run calculations quickly without changing the application data. The result is that end users can explore large amounts of information in the context of all data and applications running on the 12c platform, including Oracle’s growing portfolio of cloud-based applications. The value of this is evident in our big data analytics benchmark research, which finds that the number-one source of big data is transactional data from applications, mentioned by 60 percent of participants.
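
As a rough illustration of how an administrator might place a hot transactional table into the 12c in-memory column store and then run an analytic query against it, the sketch below uses Python with the cx_Oracle driver. The connection details, schema and columns are hypothetical, and the exact syntax and options should be verified against Oracle’s documentation.

```python
# Sketch: enable the Oracle Database 12c in-memory column store on a table
# and run an analytic query against live transactional data.
# Connection string, table and column names are hypothetical.
import cx_Oracle

conn = cx_Oracle.connect("analyst", "secret", "dbhost/ORCLPDB1")
cur = conn.cursor()

# Populate the in-memory column store for this table (12.1.0.2+ syntax).
cur.execute("ALTER TABLE sales INMEMORY")

# The analytic query scans the columnar copy while OLTP continues to
# update the row-store representation of the same table.
cur.execute("""
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
""")
for region, revenue in cur:
    print(region, revenue)

cur.close()
conn.close()
```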

Search and interactive analysis of structured data are addressed by Oracle Business Intelligence Enterprise Edition (OBIEE) through a new visualization interface that applies assets Oracle acquired from Endeca in 2011. (Currently, this approach is available in Business Intelligence Cloud Service, which I discuss below.) To run fast queries of large data sets, columnar compression can be implemented with small code changes in the Oracle SQL Developer interface. These changes use the 12c innovation discussed above and would be implemented by users familiar with SQL. Previously, IT professionals had to spend significant time constructing aggregates and tuning the database so users could access data quickly; without that work, transactional databases take a long time to query because they are row-oriented and a query literally must pass through every row of data to return analytic results. With columnar compression, end users can explore and interact with data in a much faster, less limited fashion. Users no longer need to walk down each hierarchy but can drag and drop or right-click to see the hierarchy definition. Drag-and-drop and brushing features enable exploration and uniform updates across all visualizations on the screen. Under the covers, the database is doing some heavy lifting, often joining five to 10 tables to compute the query in near real time. The ability to do correlations on large data sets in near real time is a critical enabler of data exploration because it allows questions to be asked and answered one after another rather than requiring users to predefine what those questions might be. This type of analytic discovery enables much faster time to value, especially when providing root-cause analysis for decision-making.

Oracle also provides Big Data SQL, a query approach that enables analysis of unstructured data on systems such as Hadoop. The model uses what Oracle calls query franchising rather than query federation, in which processing is done in each system’s native SQL dialect and the various dialects must be translated and combined into one. With franchising, Oracle SQL runs natively inside each of the systems. This approach applies Oracle SQL to big data systems and offloads queries to the compute nodes or storage servers of the big data system. It also maintains the security and speed needed to explore less structured data sources such as JSON, which the 12c database supports natively. In this way Oracle provides security and manageability within the big data environment. Looking beyond structured data is key for organizations today: Our research shows that analyzing data from all sources is how three-fourths (76%) of organizations define big data analytics.

To visualize and explore big data, Oracle offers Big Data Discovery, which browses Hadoop and NoSQL stores and automatically samples and profiles data to create catalogs. Users can explore important attributes through visualization as well as through common search techniques. The system currently supports capabilities such as string transformations, variable grouping, geotagging and text enrichment that assist in data preparation. This is a good start on exploration of big data sources, but to compete better in this space, Oracle should offer more usable interfaces and more capabilities for both data preparation and visualization. For example, visualizations such as decision trees and correlation matrices are important in helping end users make sense of big data and do not appear to be included in the tool.

The third analytic focus, and the catalyst of the innovations discussed above, is Oracle’s move to the cloud. In September 2014, Oracle released BI Cloud Service (BICS), which helps business users access Oracle BI systems in a self-service manner with limited help from IT. Cloud computing has been a major priority for Oracle in the past few years, not just for its applications but for its entire technology stack. With BICS, Oracle offers a stand-alone product with which a departmental workgroup can insert analytics directly into its cloud applications. When BICS is coupled with the Data-as-a-Service (DaaS) offering, which accesses internal data as well as third-party data sources in the cloud, Oracle is able to deliver cross-channel analysis and identity-as-data. Cross-channel analysis and identity management are important in cloud analytics from both business and privacy and security perspectives.

In particular, such tools can help tie together and thus simplify the complex task of managing multichannel marketing. Availability and simplicity in analytics tools are priorities for marketing organizations. Our research into next-generation customer analytics shows that for most organizations data not being readily available (63%) and difficulty in maintaining customer analytics systems (56%) are the top challenges.

Oracle is not the first vendor to offer self-service discovery and flexible data preparation, but BICS begins Oracle’s move from the previous generation of BI technology to the next. BICS puts Oracle Transactional Business Intelligence (OTBI) in the cloud as a first step toward integration with vertical applications in the lines of business. It lays the groundwork for cross-functional analysis in the cloud.

We don’t expect BICS to compete immediately with more user-friendly analytic tools designed for business users or with well-established cloud BI players. Designers still must be trained in Oracle tools, and for this reason it appears that the tool, at least in its first iteration, is targeted only at Oracle’s OBIEE customers seeking a departmental solution that limits IT involvement. Oracle should continue to address usability for both end users and designers. BICS also should connect to more data sources, including Oracle Essbase. It currently comes bundled with Oracle Database Schema Service, which acts as the sole data source and does not directly connect with any other database. Furthermore, data movement is not streamlined in this first iteration, and replication of data is often necessary.

Overall, Oracle’s moves in business intelligence and analytics make sense because they use the same semantic models in the cloud as the analytic applications that many very large companies use today and won’t abandon soon. Furthermore, given Oracle’s growing portfolio of cloud applications and the integration of analytics into these transactional applications through OTBI, Oracle can use cloud application differentiation to appeal even to companies not currently running Oracle. If Oracle can align its self-service discovery and big data tools with its current portfolio in reasonably timely fashion, current customers will not turn away from their Oracle investments; in particular, those with an Oracle-centric cloud roadmap will have no reason to switch. We note that cloud-based business intelligence and analytics is still a developing market. Our previous research showed that business intelligence had been a laggard in the cloud in comparison to genres such as human capital management, marketing, sales and customer service. We are examining trends in our forthcoming data and analytics in the cloud benchmark research, which will evaluate both the current state of such software and where the industry likely is heading in 2015 and beyond. For organizations shifting to cloud platforms, Oracle has a progressive cloud computing portfolio, which my colleague has assessed, and it has created a path forward by investing in its Platform-as-a-Service (PaaS) and DaaS offerings. Its goal is to provide uniform capabilities across mobility, collaboration, big data and analytics so that all Oracle applications are consistent for users and can be extended easily by developers. However, Oracle competes against cloud computing heavyweights such as Amazon Web Services, IBM and Microsoft, so achieving significant growth will be a challenge. Oracle customers generally and OBIEE customers especially should investigate these innovations in the context of their own roadmaps for big data analytics, cloud computing and self-service access to analytics.

 Regards,

 Tony Cosentino

Vice President and Research Director
