It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. More generally, however, analytics and business intelligence (BI) have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40%), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can use a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be treated as operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can add data-related concerns, since cloud-based approaches often require data integration, a range of data preparation and transmission of sensitive data across an external network. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. To re-architect such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, for nontechnical business people discussions of business intelligence in the cloud can be confusing, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities, since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh them against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications. For example, it will examine real-time processing of vast amounts of data from sensors and other semistructured data (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving platforms to the cloud. I look forward to receiving feedback as we move forward with this important benchmark research. Please get in touch if you have an interest in this area of our research.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and that acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analytics with an eye to determining how attitudes toward it have changed, along with its current and planned use and its importance in business. There are significant changes in this area, including where, how, why and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
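
Model management is the piece most often neglected, so it is worth a concrete illustration. Below is a minimal sketch of staleness detection in Python; it assumes a deployed model whose baseline accuracy is known and a stream of recently labeled records, and the window size and tolerance are hypothetical choices, not figures from our research.

```python
from collections import deque

class StalenessMonitor:
    """Flag a deployed model as stale when its accuracy on recent
    labeled records drops well below its validation baseline."""

    def __init__(self, baseline_accuracy, window=500, tolerance=0.05):
        self.baseline = baseline_accuracy
        self.tolerance = tolerance
        self.outcomes = deque(maxlen=window)  # 1 = correct, 0 = incorrect

    def record(self, predicted, actual):
        self.outcomes.append(1 if predicted == actual else 0)

    def is_stale(self):
        # Wait for a full window of outcomes before judging the model.
        if len(self.outcomes) < self.outcomes.maxlen:
            return False
        recent = sum(self.outcomes) / len(self.outcomes)
        return recent < self.baseline - self.tolerance

# Usage: monitor = StalenessMonitor(baseline_accuracy=0.82)
# As each scored record's true outcome arrives:
#   monitor.record(predicted_label, actual_label)
#   if monitor.is_stale(): retrain or replace the model.
```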

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize the advantages of predictive analytics, companies must transform how they work. In terms of people and processes, a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available that makes predictive analytics easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, as there are not enough data scientists and specially trained predictive analytics professionals available, or affordable, for every organization to hire.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Mobility continues to be a hot adoption area in business intelligence, according to our research across analytics and line-of-business departments. Nearly three-quarters (71%) of organizations said their mobile workforce would be able to access BI capabilities in the next 12 months, according to our next-generation mobile business intelligence research. Roambi, a provider of mobile business intelligence applications, has made important strides this year after moving to deploy its products in the cloud, an event that I covered previously. Roambi is rated one of the top providers of mobile business intelligence, or what we refer to as a ‘Hot Vendor,’ in our Value Index.

Earlier this year, Roambi announced a partnership with Box, which offers cloud-based data storage and file sharing. More recently it began to catch up with the market by announcing support for the Android mobile operating system. Roambi has focused on the mobile BI market from its inception, first by building on the Apple iPhone’s small screen and then progressing to support the majority of mobile devices in corporations today.

The Box partnership, announced in March, enables joint Box and Roambi customers to visualize and interact with data stored on the Box file sharing system. Specifically, users are able to sync Roambi Analytics, the company’s visualization engine, and Roambi Flow, its storyboarding capability, with Box. The integration allows fast importation of Box source files and the ability to create and share interactive reports through Roambi Analytics and to create digital magazines and content through Roambi Flow. Data can be automatically refreshed in real time via Box Sync.

This partnership serves both companies since the coupled services provide users with expanded capabilities and little overlap. Box’s rapid growth is being driven by its ease of use, open platform and cloud approach to file sharing. This is a natural platform for Roambi to build on and expand its customer base. For Box, Roambi’s dynamic visualization and collaboration capabilities address its business customers’ needs and increase its market opportunities. In our benchmark research on information optimization, 83 percent of organizations said it is important to have components that provide interactive capabilities to support presentation of information.

Roambi also works with other application vendors in the CRM and ERP market to integrate their applications. The same research shows that CRM (for 45%) and ERP (38%) are among the most important application types to integrate, especially in areas such as sales and customer service. Apple’s recent announcement of a new SDK should facilitate process integration between software systems so that, for example, users can access and navigate applications such as those from Box and Roambi and transactional applications such as CRM and ERP in a single workflow. This capability can provide further usability advantages for Roambi, which scored the highest rating in this area in our Mobile Business Intelligence Value Index.

Roambi announced its mobile BI support for the Google Android mobile operating system, which runs on a wide range of smartphones and tablets. It had delayed development of its software for the Android platform, which required significant development resources and investment; the delay was part of its strategy to maximize its business potential and relationship with Apple. The release is available in the Google Play store. The initial version will include four of Roambi’s most popular views: Card, Catalist, Layers and Superlist. Similar to its application for the Apple platform, security features for Android include Device Lockout, Remote Wipe and Application Passcode. Integration with Roambi’s partner Okta provides identity management services and gives access to any applications supported by Okta. Integration includes Active Directory (AD) and lightweight directory access protocol (LDAP). While Roambi Flow will not be available on Android out of the gate, the company says it will become available by the end of 2014.

Current approaches to mobile business intelligence applications on the market include native, Web-based and hybrid (a combination of both). We compare these approaches in detail in the executive summary of our Mobile Business Intelligence Value Index report. With the new Android support, Roambi has a unique approach to the hybrid architecture that bypasses the browser completely. No data is cached in a browser; instead, data is loaded directly to the device and rendered natively by Roambi. From the user’s perspective, the benefit of this approach is performance, since interactivity does not rely on data traveling over a network. A second benefit is offline access to data, which is not available via non-native approaches. From the developer’s or information assembler’s perspective, testing across browsers is not needed since there is no data caching and the experience is the same regardless of the browser in use.

Our next-generation business intelligence research shows that executives strongly support mobile BI: Nearly half of them said that mobility is very important to their BI processes. Executives also favor Apple over Android devices, which is likely one of the key reasons for Apple’s dominance in the current business intelligence landscape. However, our research shows latent demand for Android devices in the business intelligence market, and given the dominance of Android in the consumer market as well as in regions such as Asia and other emerging markets, any company that seeks to take significant global market share will need to support both platforms. Our information optimization research found Google Android ranked second among smartphone platforms (24%), behind Apple’s iPhone (59%) in first position. Android support is therefore an important evolution for Roambi as it addresses increasing demand in the market for Android devices.

Usability has become the most important evaluation criterion; in the information optimization benchmark research it was selected as most important by over half (58%) of organizations. Roambi ranked highest in usability in the Ventana Research Mobile Business Intelligence Value Index, though its overall score was hampered somewhat by the lack of support for Android. With Android support, the company now addresses the need to support multiple devices and the bring-your-own-device (BYOD) practices that organizations increasingly allow. By addressing this, as well as taking advantage of broader market developments such as the new Apple SDKs, Roambi continues to address what organizations find important today. Usability of business intelligence systems is a top priority for 63% of companies. Even our big data analytics research finds a growing level of overall importance for mobile access in almost half of organizations (46%). Any company that wants to get user-friendly business intelligence into the hands of its mobile workers quickly and effectively should have Roambi in the consideration set.

Regards,

Tony Cosentino

VP and Research Director

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used extensively in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelization of R allows algorithms to run on much larger data sets, since processing is not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide the complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
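
The “split, apply and combine” pattern itself is easy to see outside any vendor product. Here is a minimal sketch in Python (standing in for R) using only the standard library; the grouping key and per-group aggregate are hypothetical, and a real implementation such as Teradata’s distributes the groups across database workers rather than local processes.

```python
from multiprocessing import Pool
from collections import defaultdict

def split(rows, key):
    """Partition rows into groups that can be processed independently."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    return list(groups.values())

def apply_fn(group):
    """Per-group computation; each group fits in one worker's memory."""
    total = sum(row["amount"] for row in group)
    return group[0]["region"], total

if __name__ == "__main__":
    rows = [
        {"region": "east", "amount": 10.0},
        {"region": "west", "amount": 7.5},
        {"region": "east", "amount": 4.0},
    ]
    with Pool() as pool:                   # "apply" runs in parallel
        partials = pool.map(apply_fn, split(rows, "region"))
    print(dict(partials))                  # combine: {'east': 14.0, 'west': 7.5}
```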

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload-specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, JSON integration, recently announced, delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, like the recently announced MongoDB integration supporting JSON. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for organizations from the largest to midsized ones. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. Being a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base, and its biggest change is how it renders graphics by putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics in multiple device environments, including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktops to the server for others to view or build on further. Visualization is another area of increasing importance for organizations. Our research on big data analytics found that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).
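
Under the hood, the extract-blend-prepare flow that InfoDiscovery wraps in a visual interface is the same one an analyst could script. The sketch below is a generic illustration using pandas, not Information Builders’ API; the sources and column names are hypothetical stand-ins for a spreadsheet and a company database.

```python
import sqlite3
import pandas as pd

# Stand-in for a spreadsheet extract; in practice this would be
# pd.read_excel("regional_budgets.xlsx") or similar (file name hypothetical).
budgets = pd.DataFrame({"region": ["east", "west", "north"],
                        "budget": [100.0, 60.0, 80.0]})

# Stand-in for a company database; only the connect() call differs in practice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 70.0), ("east", 45.0), ("west", 30.0)])
sales = pd.read_sql(
    "SELECT region, SUM(amount) AS actual FROM orders GROUP BY region", conn)

# Blend the sources on their shared key and prepare a derived measure.
blended = budgets.merge(sales, on="region", how="left")
blended["variance"] = blended["actual"].fillna(0) - blended["budget"]
print(blended.sort_values("variance"))
```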

InfoDiscovery is Information Builders’ contender in the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platforms of their well-known BI products, with the aim of taking advantage of their maturity. Currently this second wave of tools is still behind the first in terms of ease of use and visual analysis, but it is advancing rapidly, and these tools can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is the most important use (for 62% of research participants), followed by alerting and notification.

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically speaking, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. IoT is used in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate as the IT industry and business demand for analytics and data change, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has built that capability into its products for years. In addition, the company’s core software is hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are in limited release currently and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product. IT staff can uphold their governance and system management responsibilities through visibility and flexible control of the platform. For its part, iWay Sentinel should interest companies that have to manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what we call information optimization, finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to exploit big data sources and mobile technology but will need to invest further to ensure it can address the full spectrum of new business needs. I continue to recommend that any company that must serve a large workforce and needs to blend data and analytics for business intelligence or information applications consider Information Builders.

Regards,

Tony Cosentino

VP and Research Director

Tibco’s recent acquisition of Jaspersoft helps the company fill out its portfolio of business intelligence (BI) and reporting software in an increasingly competitive marketplace. Tibco already offered a range of products in BI and analytics including Tibco Spotfire, an established product for visual data discovery. Jaspersoft and its open source Java reporting tool JasperReports have been around since 2001, and the company says it has 16 million product downloads worldwide, 140,000 production deployments and 2,000 commercial customers in 100 countries. Jaspersoft received attention recently for its partnership with Amazon Marketplace and the ability to embed its system into applications using a credit card and a few simple configuration steps. This example of embedding the technology is an area that Tibco knows well from its history of integrating its technology into enterprise architecture across the planet.

The acquisition is significant given today’s advancements in the business intelligence market and the need for tools to serve a variety of users. In some ways their technologies serve the same users – analysts and business users trying to make decisions with data – but how they approach this and support a broad set of roles and responsibilities differs. Tibco Spotfire, Tibco’s approach to business analytics, serves analytics and visualization needs, specializing in visual discovery and data exploration, while Jaspersoft addresses query and analysis, reporting, dashboards and other aspects of BI. According to our benchmark research on information optimization, the capabilities business users most often need are to drill into information within applications (37%), search for data (36%) and collaborate (27%). For analysts, the most necessary capabilities are extracting data (39%), designing and integrating metrics (37%) and developing policies and rules for access (34%). With Jaspersoft, Tibco can address both groups and also can embed intelligence and reporting capabilities into operationally oriented environments across a range of business applications.

The acquisition makes sense in that more capabilities are needed to address the expanding scope of business intelligence and analytics. In practice, it will be interesting to see how the open source community and culture of Jaspersoft mesh with the culture of Tibco’s Spotfire division. For now, Jaspersoft will continue as a separate division, so business likely will continue as usual until management decides on specific areas of integration. With respect to development efforts, it will be critical to blend the discovery capabilities of Tibco Spotfire with Jaspersoft’s reporting, which will be a formidable challenge. Another key to success will be how Tibco integrates both with the capabilities from Extended Results, a mobile business intelligence provider Tibco bought in 2013. Mobility is an area where Ventana Research found Jaspersoft significantly lacking, so the Extended Results capabilities should prove useful. Finally, Tibco’s event-enabled infrastructure will likely play a key role as the company continues to invest in operational intelligence for event-focused information gathering and delivery. Our operational intelligence research found that a lack of integration between business intelligence tools such as Jaspersoft’s and event streams such as Tibco’s is a major challenge in over half (51%) of organizations. This is a potential opportunity for Tibco as it looks at future integration of the technologies.

The Jaspersoft acquisition is not surprising given recent changes in the BI market. The category, which just a few years ago was considered mature and well-defined, is expanding to include areas such as analytic discovery tools, advanced analytics and big data. The union of Tibco Spotfire, which primarily targets line-of-business professionals from analysts to knowledge workers, and Jaspersoft, a more IT-centered company, reflects the need for the industry to bridge a divide that exists in many organizations where IT publishes dashboards and reports to the business. The challenge of using information across business and IT was revealed in our information optimization benchmark research, which shows that information management these days is most often (in 42% of organizations) a joint responsibility of IT and the lines of business, although IT is involved in some capacity in four-fifths of them. It remains to be seen whether the joint company can take on major competitors that have far more cash resources and take a similar approach.

Preliminary indicators show a good fit between these two organizations. Customers from each will be introduced to important new tools and capabilities from the other. One of the first likely moves for Tibco will be to introduce the 2,000 commercial customers and global presence of Jaspersoft to the broader portfolio. We advise those customers to evaluate what Tibco offers, especially Tibco Spotfire, which continues to be a leader in the visual data discovery market. Before investing, however, customers and prospects should demand clarity on the company’s plans for technical integration of analytics and how these will fit with their organizations’ long-term business intelligence and analytics roadmaps. Tibco customers migrating to the cloud should investigate the work Jaspersoft is doing with companies like Amazon and consider whether the embedded approach to interactive reporting can fit with their analytics, cloud and application strategies.

This acquisition gives Tibco a significant opportunity to advance business analytics, but the company has historically not been as progressive in marketing and selling analytics as others in the market. Demand for visual discovery and big data analytics has grown dramatically; our research shows that over three-quarters of organizations rate them as important overall. Big data analytics and visualization are areas in which Spotfire innovated before the Tibco acquisition, but it has not seen its fair share of growth from these buying trends. Tibco now has the opportunity to provide analytics and BI that further leverage its entire portfolio of integration, event processing, cloud and social collaboration software products; let’s see how it does. It will need to supercharge its analytics efforts significantly by leveraging its new products from Jaspersoft.

Regards,

Tony Cosentino

VP and Research Director

Alteryx has released version 9.0 of Alteryx Analytics, which provides a range of capabilities from data management to predictive analytics, in advance of its annual user conference, Inspire 2014. I have covered the company for several years as it has emerged as a key player in providing a range of business analytics from predictive to big data analytics. The importance of this category of analytics is revealed by our latest benchmark research on big data analytics, which finds that predictive analytics is the most important type of big data analytics, ranked first by nearly half (47%) of research participants. The new version 9.0 includes new capabilities and integration with a range of new information sources, including read and write capability for IBM SPSS and SAS files to serve a range of analytic needs.

After attending Inspire 2013 last year, I wrote about capabilities that are enabling an emerging business role, which Alteryx calls the data artisan. The label refers to analysts who combine both art and science in using analytics to help direct business outcomes. Alteryx uses an innovative and intuitive approach to analytic tasks, using workflow and linking various data sources through in-memory computation and processing. It takes a “no code” drag-and-drop approach to integrate data from files and databases, prepare data for analysis, and build and score predictive models to yield relevant results. Other vendors in the advanced analytics market are also applying this approach, but few mature tools are currently available. The output of the Alteryx analytic processes can be shared automatically in numerous data formats including direct export into visualization tools such as those from Qlik (new support) and Tableau. This can help users improve their predictive analytics capabilities and take action on the outcomes of analytics, which are the two capabilities most often cited in our research as needed to improve big data analytics.

Alteryx now works with Revolution Analytics to increase the scalability of its system to work with large data sets. The open source language R continues to gain popularity and is being embedded in many business intelligence tools, but it runs only on data that can be loaded into memory. Running only in memory does not address analytics on data sets that run into terabytes and hundreds of millions of values, and it potentially requires a sub-sampling approach to advanced analytics. With its RevoScaleR product, Revolution Analytics rewrites parts of R algorithms so that the processing tasks can be parallelized and run in big data architectures such as Hadoop. Such capability is important for analytic problems including recommendation engines, unsupervised anomaly detection, some classification and regression problems, and some clustering problems. These analytic techniques are appropriate for some of the top business uses of big data analytics, which according to our research are cross-selling and up-selling (important to 38%), better understanding of individual customers (32%), analyzing all data rather than a sample (30%) and price optimization (28%). Alteryx Analytics automatically detects whether to use RevoScaleR or open source R algorithms. This approach simplifies the technical complexities of scaling R by providing a layer of abstraction for the analytic professional.

Scoring – the ability to input a data record and receive the probability of a particular outcome – is an important if not well-understood aspect of predictive analytics. Our research shows that companies that score models on a timely basis according to their needs get better organizational results than those that score all models the same way. Working with Revolution Analytics, Alteryx has enhanced scoring scalability for R algorithms with new capabilities that chunk data in a parallelized fashion. This approach bypasses the memory-only approach to enable a theoretically unlimited number of scores to be processed. For large-scale implementations and consumer applications in industries such as retail, an important target market for Alteryx, these capabilities are becoming essential.
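
Conceptually, chunked scoring looks like the minimal Python sketch below; this illustrates the idea only and is not Alteryx’s or Revolution Analytics’ implementation. Records are read in fixed-size chunks so memory use stays bounded regardless of input size, and a worker pool scores chunks in parallel; the file contents and the stand-in scoring function are hypothetical.

```python
from itertools import islice
from multiprocessing import Pool

def read_chunks(path, chunk_size=4):
    """Yield fixed-size lists of records so only one chunk is in memory."""
    with open(path) as f:
        while True:
            chunk = list(islice(f, chunk_size))
            if not chunk:
                break
            yield chunk

def score_chunk(chunk):
    """Stand-in for applying a fitted model to every record in a chunk."""
    return [len(line.strip()) / 100.0 for line in chunk]

if __name__ == "__main__":
    # Hypothetical input; a real deployment would stream millions of records.
    with open("records.csv", "w") as f:
        f.writelines(f"customer_{i},{i * 7 % 50}\n" for i in range(10))

    with Pool() as pool:
        # imap streams chunk results as they finish, never holding all scores.
        for scores in pool.imap(score_chunk, read_chunks("records.csv")):
            print(scores)
```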

Alteryx 9.0 also improves on open source R’s default approach to scoring, which is “all or nothing.” That is, if data is missing (a null value) or a new level for a categorical variable is not included in the original model, R will not score the model until the issue is addressed. This process is a particular problem for analysts who want to score data in small batches or individually. In contrast, Alteryx’s new “best effort” approach scores the records that can be run without incident, and those that cannot be run are returned with an error message. This adjustment is particularly important as companies start to deploy predictive analytics into areas such as call centers or within Web applications such as automatic quotes for insurance.
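
The difference between the two scoring policies can be sketched in a few lines. This is my illustration of the “best effort” idea, not Alteryx’s code; the model class, its score_one method and the field lists are all hypothetical.

```python
def score_best_effort(records, model, required_fields, known_levels):
    """Score each record that can be scored; report errors for the rest.

    An all-or-nothing policy (open source R's default) would instead abort
    the whole batch on the first missing value or unseen categorical level.
    """
    results = []
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if rec.get(f) is None]
        unseen = [f for f, levels in known_levels.items()
                  if rec.get(f) is not None and rec[f] not in levels]
        if missing or unseen:
            results.append((i, None, f"missing={missing} unseen={unseen}"))
        else:
            results.append((i, model.score_one(rec), None))
    return results

class DemoModel:
    """Hypothetical stand-in for a fitted predictive model."""
    def score_one(self, rec):
        return min(1.0, rec["visits"] / 10.0)

records = [{"visits": 4, "plan": "basic"},
           {"visits": None, "plan": "basic"},    # missing value
           {"visits": 7, "plan": "platinum"}]    # unseen categorical level
print(score_best_effort(records, DemoModel(), ["visits"],
                        {"plan": {"basic", "premium"}}))
```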

Alteryx 9.0 also has new predictive modeling tools and functionality. A spline model helps address regression and classification problems such as data reduction and nonlinear relationships and their interactions. It takes a “clear box” approach to serve users with differing objectives and skill levels: it exposes the underpinnings of the model so that advanced users can modify it, while less sophisticated users can use the model without necessarily understanding all of its intricacies. Other capabilities include a Gamma regression tool that models the Gamma family of distributions using the generalized linear modeling (GLM) framework. Heat plot tools for visualizing joint probability distributions, such as between customer income level and customer advocacy, and more robust A/B testing tools, which are particularly important in digital marketing analytics, are also part of the release.
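
For readers who want to see what Gamma regression in a GLM framework looks like, here is a brief example using the open source Python statsmodels package on synthetic data; it illustrates the statistical technique only and is not the Alteryx tool.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
income = rng.uniform(20, 150, n)          # hypothetical predictor
X = sm.add_constant(income)

# A positive, right-skewed response, as the Gamma family assumes;
# the true relationship is log-linear in the predictor.
mu = np.exp(0.5 + 0.01 * income)
y = rng.gamma(shape=2.0, scale=mu / 2.0)  # gamma(k, theta) mean is k*theta = mu

model = sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log()))
print(model.fit().summary())              # coefficients recover ~0.5 and ~0.01
```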

At the same time, Alteryx has expanded its base of information sources. According to our research, working with all sources of data, not just one, is the most common definition of big data analytics, as stated by three-quarters (76%) of organizations. While structured data from transaction systems and so-called systems of record is still the most important, new data sources, including those coming from external sources, are becoming important. Our research shows that the most widely used external data sources are cloud applications (54%) and social media data (46%); five additional data sources, including Internet, consumer, market and government sources, are virtually tied in third position (with 39% to 42% each). Alteryx will need to be mindful of the best practices in big data analytics I have outlined to ensure it can stay on top of a growing set of requirements to blend big data and also apply a range of advanced analytics.

New connectors to the social media data provider Gnip give access to social media websites through a single API, and a DataSift (http://www.datasift.com) connector helps make social media more accessible and easier to analyze for any business need. Other new connectors in 9.0 include those for Foursquare, Google Analytics, Marketo, salesforce.com and Twitter. New data warehouse connectors include those for Amazon Redshift, HP Vertica, Microsoft SQL Server and Pivotal Greenplum. Access to SPSS and SAS data files also is introduced in this version; Alteryx hopes to break down the barriers to entry in accounts dominated by these advanced analytic stalwarts. With already existing connectors to major cloud and on-premises data sources, the company provides a robust integration platform for analytics.

Alteryx is on a solid growth curve, as evidenced by the increasing number of inquiries and my conversations with company executives. That’s not surprising given the disruptive potential of the technology itself and its unique analytic workflow technology for data blending and advanced analytics. This data blending and workflow technology is not highlighted enough; it is one of the largest differentiators of Alteryx’s software, reducing data-related tasks such as preparing (47%) and reviewing (43%) data that our customer analytics research finds get in the way of analysts performing analytics. Additionally, Alteryx’s ability to apply location analytics within its product is a key differentiator; our research found that location analytics delivers far more value than merely viewing traditional visualizations and tables of data, and it helps rapidly identify areas where customer experience and satisfaction can be improved, the top benefit found in our research. The flexible platform resonates particularly well with lines of business, especially in fast-moving, lightly regulated industries such as travel, retail and consumer goods, where speed of analytics is critical. The work the company is doing with Revolution Analytics and the resulting ability to scale are important for advanced analytics that operate on big data. The ability to seamlessly connect and blend information sources is a critical capability for Alteryx, and it is a wise move to invest further in this area, but Alteryx will need to examine where collaborative technology could help businesspeople work together on analytics within the software. Alteryx will need to continue to adapt to market demand for analytics and keep focused on varying line-of-business areas so it can continue its growth. Just about any company involved in analytics today should evaluate Alteryx and see how its unique approach can streamline analytics.

Regards,

Tony Cosentino

VP and Research Director

Organizations should consider multiple aspects of deploying big data analytics. These include the type of analytics to be deployed, how the analytics will be deployed technologically and who must be involved both internally and externally to enable success. Our recent big data analytics benchmark research assesses each of these areas. How an organization views these deployment considerations may depend on the expected benefits of the big data analytics program and the particular business case to be made, which I discussed recently.

According to the research, the most important capability of big data analytics is predictive analytics (64%), but among companies that have deployed big data analytics, descriptive analytic approaches of query and reporting (74%) and data discovery (64%) are more readily available than predictive capabilities (57%). Such statistics may be a function of big data technologies such as Hadoop and their associated distributions having prioritized the ability to run descriptive statistics through standard SQL, which is the most common method for implementing analysis on Hadoop. Cloudera’s Impala, Hortonworks’ Stinger (an extension of Apache Hive), MapR’s Drill, IBM’s Big SQL, Pivotal’s HAWQ and Facebook’s open source contribution of Presto SQL all focus on accessing data through an SQL paradigm. It is not surprising, then, that the technology research participants use most for big data analytics is business intelligence (75%) and that the most used analytic methods – pivot tables (46%), classification (39%) and clustering (37%) – are descriptive and exploratory in nature. Similarly, participants said that visualization of big data allows analysts to perform faster analysis (49%), understand context better (48%), perform root-cause analysis (40%) and display multiple result sets (40%), but visualization does not provide more advanced analytic capabilities. While various vendors now offer approaches to run advanced analytics on big data, the research shows that in terms of big data, organizational capabilities still revolve around more basic analytic access.
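
In practice, “descriptive statistics through standard SQL” means that familiar aggregate queries do the work. The sketch below uses Python’s built-in sqlite3 purely as a stand-in engine; with Impala, Hive, Presto or Drill the SQL is essentially the same and only the driver and connection change, since these engines typically expose DB-API-style clients. The table and columns are hypothetical.

```python
import sqlite3

# sqlite3 stands in for a SQL-on-Hadoop engine to keep the sketch runnable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("east", 80.0), ("west", 45.0)])

# Descriptive statistics as a plain SQL aggregate query.
for row in conn.execute("""
        SELECT region,
               COUNT(*)    AS orders,
               AVG(amount) AS avg_amount,
               MIN(amount) AS min_amount,
               MAX(amount) AS max_amount
        FROM sales
        GROUP BY region
        ORDER BY orders DESC"""):
    print(row)
```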

For companies that are implementing advanced analytic capabilities on big data, there are further analytic process considerations, and many have not yet tackled those. Model building and model deployment should be manageable and timely, involve specialized personnel, and integrate into the broader enterprise architecture. While our research provides an in-depth look at adoption of the different types of in-database analytics, deployment of advanced analytic sandboxes, data mining, model management, integration with business processes and overall model deployment, that is beyond the topic here.

Beyond analytic considerations, a host of technological decisions must be made around big data analytics initiatives. One of these is the degree of customization necessary. As technology advances, customization is giving way to more packaged approaches to big data analytics. According to our research, the majority (54%) of companies that have already implemented big data analytics did custom builds using big data-specific languages and interfaces. Most of those that have not yet deployed are likely to purchase a dedicated or packaged application (44%), followed by a custom build (36%). We think that this pre- and post-deployment comparison reflects a maturing market.

The move from custom approaches to standardized ones has important implications for the skill sets needed for a big data analytics initiative. In comparing the skills that organizations said they currently have to the skills they need to be successful with big data analytics, it is clear that companies should spend more time building employees’ statistical, mathematical and visualization skills. On the flip side, organizations should make sure their tools can support skill sets that they already have, such as use of spreadsheets and SQL. This is convergent with other findings about training needs, which include applying analytics to business problems (54%), training on big data analytics tools (53%), analytic concepts and techniques (46%) and visualizing big data (41%). The data shows that as approaches become more standardized and the market focus shifts toward them from customized implementations, skill needs are shifting as well. This is not to say that demand is moving away from the data scientist completely. According to our research, organizations that involve cross-functional teams or data scientists in the deployment process are realizing the most significant impact. It is clear that multiple approaches for personnel, departments and current vendors play a role in deployments and that some approaches will be more effective than others.

Cloud computing is another key consideration with respect to deploying analytics systems as well as sandbox modeling and testing environments. For deployment of big data analytics, 27 percent of companies currently use a cloud-based method, while 58 percent said they do not and 16 percent do not know what is used. Not surprisingly, far fewer IT professionals (19%) than business users (40%) said they use cloud-based deployments for big data analytics. The flexibility and capability that cloud resources provide are particularly attractive for sandbox environments and for organizations that lack big data analytic expertise. However, for big data model building, the largest percentage of organizations (42%) still utilize a dedicated internal sandbox environment, while fewer (19%) use a non-dedicated internal sandbox (that is, a container in a data warehouse used to build models) and others use a cloud-based sandbox either as a completely separate physical environment (9%) or as a hybrid approach (9%). From this last data we infer that business users are sometimes using cloud-based systems to do big data analytics without the knowledge of IT staff. Among organizations that are not using cloud-based systems for big data analytics, security (45%) is the primary reason that they do not.

Perhaps the most important consideration for big data analytics is choosing vendors to partner with to achieve organizational objectives. Given the move from custom technological approaches to more packaged ones and the types of analytics currently being implemented for big data, it is not surprising that a majority of research participants (52%) are looking to their business intelligence systems providers to supply their big data analytics solution. However, significant numbers of companies said they will turn to a specialist analytics provider (35%) or their database provider (34%). When evaluating big data analytics vendors, usability is the most important consideration, but not by as wide a margin as in categories such as business intelligence. A look at criteria rated important or very important by research participants reveals that usability ranks highest (94%), but functionality (92%) and reliability (90%) follow closely. Among innovative new technologies, collaboration is important (78%) while mobile access (46%) is much less so. Coupled with the finding that communication and knowledge sharing is an important benefit of big data analytics, it is clear that organizations are cognizant of the collaborative imperative when choosing a big data analytics product.

Deployment of big data analytics starts with forethought and a well-defined business case that includes the expected benefits I discussed in my previous analysis. Once the outcome-driven framework is established, organizations should consider the types of analytics needed, the enabling technologies and the people and processes necessary for implementation. To learn more about our big data analytics research, download a copy of the executive summary here.

Regards,

Tony Cosentino

VP & Research Director

SAP recently presented its analytics and business intelligence roadmap and new innovations to about 1,700 customers and partners using SAP BusinessObjects at its SAP Insider event (#BI2014). SAP has one of the largest presences in business intelligence due to its installed base of SAP BusinessObjects customers. The company intends to defend its current position in the established business intelligence (BI) market while expanding in the areas of databases, discovery analytics and advanced analytics. As I discussed a year ago, SAP faces an innovator’s dilemma in parts of its portfolio, but it is working aggressively to get ahead of competitors.

One of the pressures that SAP faces is from a new class of software that is designed for business analytics and enables users to visualize and interact with data in new ways without relationships in the data being predefined. Our business technology innovation research shows that analytics is the top-ranked technology innovation in business today, rated first by 39 percent of organizations. In conventional BI systems, data is modeled in so-called cubes or other predefined structures that allow users to slice and dice data quickly and easily. The cube structure abstracts away the complexity of the database’s structured query language (SQL) and slashes the amount of time it takes to read data from a row-oriented database. However, as the cost of memory decreases significantly, enabling the use of new column-oriented databases, these methods of BI are being challenged. For SAP and other established business intelligence providers, this situation represents both an opportunity and a challenge. In response, almost all of these BI companies have introduced some sort of visual discovery capability. SAP introduced SAP Lumira, formerly known as Visual Intelligence, 18 months ago to compete in this emerging segment, and it has gained traction in terms of downloads, which the company estimated at 365,000 in the fourth quarter of 2013.
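To make the row- versus column-orientation point concrete, here is a minimal Python sketch (my own illustration with made-up data, not SAP code) of why an analytic aggregate touches far less data in a columnar layout.

```python
# Illustrative only: contrasts row- and column-oriented layouts for one
# analytic query (total revenue). Hypothetical data, not SAP code.

# Row-oriented: each record is stored, and must be read, as a whole.
rows = [
    ("East", "A", 120.0),
    ("West", "B", 95.0),
    ("East", "B", 210.0),
]
total_row_store = sum(r[2] for r in rows)  # scans every full record

# Column-oriented: each column is stored contiguously, so an aggregate
# reads only the one column it needs -- the property that in-memory
# columnar databases such as HANA exploit.
columns = {
    "region":  ["East", "West", "East"],
    "product": ["A", "B", "B"],
    "revenue": [120.0, 95.0, 210.0],
}
total_column_store = sum(columns["revenue"])  # touches only one column

assert total_row_store == total_column_store
```

Because the aggregate reads one contiguous column rather than every full record, a column store can answer such queries quickly without the precomputed cube structures on which conventional BI relies.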

SAP and other large players in analytics are trying not just to catch up with visual discovery players such as Tableau but to make it a game of leapfrog. Toward that end, the Lumira capabilities demonstrated at the Insider conference included information security and governance, advanced analytics, integrated data preparation, storyboarding and infographics; the aim is to create a differentiated position for the tool. For me, the storyboarding and infographics capabilities are about catching up, but the ability to govern and secure today’s analytic platforms is a critical concern for organizations, and SAP means to capitalize on it. A major analytics announcement at the conference focused on the integration of Lumira with the BusinessObjects platform. Lumira users can now create content and save it to the BusinessObjects server, mash up data and deliver the results through a secure common interface.

Beyond the integration of security and governance with discovery analytics, the leapfrog approach centers on advanced analytics. SAP’s acquisition last year of KXEN and its initial integration with Lumira provide an advanced analytics tool that does not require a data scientist to use it. My coverage of KXEN prior to the acquisition found the tool user-friendly and broadly applicable, especially in the area of marketing analytics. Used with Lumira, KXEN will ultimately provide front-end integration for in-database analytic approaches and for more advanced techniques. Currently, for data scientists to run advanced analytics on large data sets, SAP provides its own predictive analysis library (PAL), which runs natively on SAP HANA and offers commonly used algorithms for clustering, classification and time-series analysis. Integration with the R language is available through a wrapper approach, but the system overhead is greater than with PAL running natively on HANA.
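As a hedged illustration of the kind of computation such a library performs, the sketch below runs k-means clustering, one of the algorithm families PAL offers, in Python with scikit-learn on synthetic data. It is an analogy only: PAL’s actual interface is procedures executing inside HANA, which also avoids the data movement that makes the wrapper approach more costly.

```python
# Analogy only: k-means clustering of the kind SAP's PAL exposes in-database.
# Uses scikit-learn on synthetic data; PAL itself runs inside HANA, so the
# data never has to leave the database as it does here.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic customer features (made up): e.g., recency and spend,
# drawn from two distinct populations.
X = np.vstack([
    rng.normal(loc=(2, 50), scale=5, size=(100, 2)),
    rng.normal(loc=(30, 400), scale=20, size=(100, 2)),
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(model.cluster_centers_)  # the segment centroids
print(model.labels_[:10])      # cluster assignment per customer
```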

SAP said the broader vision for Lumira and the BusinessObjects analytics platform is “collective intelligence,” which it described as “a Wikipedia for business” that provides a bidirectional analytic and communication platform. To achieve this lofty goal, SAP will need to continue to put resources into HANA and facilitate the integration of underlying data sources. Our recently released research on big data analytics shows that being able to analyze data from all data sources (selected by 75% of participants) is the most prevalent definition of big data analytics. To this end, SAP announced the idea of an “in-memory fabric” that allows virtual data access to multiple underlying data sources, including big data platforms such as Hadoop. The key feature of this data federation approach is what the company calls smart data access (SDA). Instead of loading all data into memory, the virtualized system sets a proxy that points to where specific data is held. Using machine-learning algorithms, it determines how important information is based on the query patterns of users and loads the most important data into memory. This approach will enable the company to analyze data on a massive scale by utilizing both HANA and the Sybase IQ columnar database, which the company says was just certified as the world’s largest data warehouse, at more than 12 petabytes. Others such as eBay and Teradata may beg to differ based on another implementation, but it is nevertheless an impressive achievement.
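The following is a conceptual sketch, my own simplification and not SAP’s implementation, of what query-pattern-driven federation means: tables are read through proxies at their sources by default, and simple frequency counting (standing in for the machine learning SAP describes) decides which tables get promoted into memory.

```python
# Conceptual sketch of query-pattern-driven data federation, in the spirit
# of (but not implementing) SAP's smart data access: data stays at the
# source by default, and only the most frequently queried tables are
# promoted into memory.
from collections import Counter

class FederatedCatalog:
    def __init__(self, sources, cache_slots=2):
        self.sources = sources          # table name -> remote location (proxy)
        self.query_counts = Counter()   # observed query pattern per table
        self.cache_slots = cache_slots
        self.in_memory = set()          # tables currently pinned in memory

    def query(self, table):
        self.query_counts[table] += 1
        self._rebalance()
        where = "memory" if table in self.in_memory else self.sources[table]
        return f"scan {table} from {where}"

    def _rebalance(self):
        # Promote the hottest tables; everything else stays behind its proxy.
        hottest = {t for t, _ in self.query_counts.most_common(self.cache_slots)}
        self.in_memory = hottest

catalog = FederatedCatalog({"orders": "hana", "clicks": "hadoop", "logs": "iq"})
for t in ["orders", "clicks", "orders", "logs", "orders", "clicks"]:
    print(catalog.query(t))
```

In the sketch, repeated queries against the orders table keep it pinned in memory while colder tables continue to be read from their sources, which is the essence of the proxy-plus-promotion idea.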

Another key announcement was SAP Business Warehouse (BW) 7.4, which now runs on top of HANA. This combination is likely to be popular because it enables migration of the underlying database without impacting business users. Such users store many of their KPIs and complex calculations in BW, and uprooting this system is untenable for many organizations; SAP’s ability to continue supporting these users is therefore something of an imperative. The upgrade to 7.4 also provides advances in capability and usability. The ability to do complex calculations at the database level without impacting the application layer enables much faster time-to-value for SAP analytic applications. Relative to the in-memory fabric and SDA discussed above, BW users no longer need intimate knowledge of HANA SDA: the complete data model is now exposed to HANA as an information cube object, and HANA data can be reflected back into BW. To back this up, the company offered testimony from users. Representatives of Molson Coors said their new system took only a weekend to move into production (after six weeks of sandbox experiments and six weeks of development) and enables users to perform right-time financial reporting, rapid prototyping and customer sentiment analysis.

SAP’s advancements and portfolio expansion are necessary for it to continue in a leadership position, but the inherent risk is confusion among its customer and prospect base. SAP published its last statement of direction for analytic dashboards about this time last year, and according to company executives it will be updated fairly soon, though they would not specify when. The many tools in the portfolio include Web Intelligence, Crystal Reports, Explorer, Xcelsius and now Lumira. SAP and its partners position the portfolio as a toolbox in which each tool is meant to solve a different organizational need. There is overlap among them, however, and the inherent complexity of the toolbox approach may not resonate well with business users who desire simplicity and timeliness.

SAP customers and others considering SAP should carefully examine how well these tools match the skills in their organizations. We encourage companies to look at the different organizational roles as analytic personas and try to understand which constituencies are served by which parts of the SAP portfolio. For instance, one of the most critical personas going forward is the Designer, since usability is the top priority for organizational software according to our next-generation business intelligence research. Yet this role may become more difficult to fill over time as trends such as mobility continue to add to the job requirements. SAP’s recent upgrade of Design Studio to address emerging needs such as mobility and mobile device management (MDM) may force some organizations to rebuild dashboards and upscale their designer skill sets to include JavaScript and Cascading Style Sheets, but the ability to deliver multifunctional analytics across devices in a secure manner is becoming paramount. I note that SAP’s capabilities in this regard helped it score third overall in our 2014 Mobile Business Intelligence Value Index. Other key personas are the knowledge worker and the analyst. Our data analytics research shows that while SQL and Excel skills are abundant in organizations, statistical and mathematical skills are less common. SAP’s integration of KXEN into Lumira can help organizations develop these personas.

SAP is pursuing an expansive analytic strategy that includes not just traditional business intelligence but databases, discovery analytics and advanced analytics. Any company that has SAP installed, especially those with BusinessObjects or an SAP ERP system, should consider the broader analytic portfolio and how it can meet business goals. Even for new prospects, the portfolio can be compelling, and as the roadmap centered on Lumira develops, SAP may be able to take that big leap in the analytics market.

Regards,

Tony Cosentino

VP and Research Director
