Mobility continues to be a hot adoption area in business intelligence, according to our research across analytics and line-of-business departments. Nearly three-quarters (71%) of organizations in our next-generation mobile business intelligence research said their mobile workforce would be able to access BI capabilities in the next 12 months. Roambi, a provider of mobile business intelligence applications, has made important strides this year after moving to deploy its products in the cloud, an event that I covered previously. Roambi is rated a ‘Hot Vendor’ in our Value Index, placing it among the top providers of mobile business intelligence.

Earlier this year, Roambi announced a partnership with Box, which offers cloud-based data storage and file sharing. More recently it began to catch up with the market by announcing support for the Android mobile operating system. Roambi has focused on the mobile BI market from its inception, first by building on the Apple iPhone’s small screen and then progressing to support the majority of mobile devices in corporations today.

The Box partnership, announced in March, enables joint Box and Roambi customers to visualize and interact with data stored on the Box file-sharing system. Specifically, users are able to sync Roambi Analytics, the company’s visualization engine, and Roambi Flow, its storyboarding capability, with Box. The integration allows fast import of Box source files, creation and sharing of interactive reports through Roambi Analytics, and creation of digital magazines and content through Roambi Flow. Data can be refreshed automatically in real time via Box Sync.

This partnership serves both companies, since the coupled services provide users with expanded capabilities and little overlap. Box’s rapid growth is being driven by its ease of use, open platform and cloud approach to file sharing. This is a natural platform for Roambi to build on to expand its customer base. For Box, Roambi’s dynamic visualization and collaboration capabilities address its business customers’ needs and increase its market opportunities. In our benchmark research on information optimization, 83 percent of organizations said it is important to have components that provide interactive capabilities to support presentation of information.

Roambi also works with other application vendors in the CRM and ERP markets to integrate with their applications. The same research shows that CRM (for 45%) and ERP (38%) are the application types most important to integrate with others, especially in areas such as sales and customer service. Apple’s recent announcement of a new SDK should facilitate process integration between software systems so that, for example, users can access and navigate applications such as those from Box and Roambi and transactional applications such as CRM and ERP in a single workflow. This capability can provide further usability advantages for Roambi, which scored the highest rating in this area in our Mobile Business Intelligence Value Index.

Roambi announced mobile BI support for the Google Android operating system, which runs on a wide range of smartphones and tablets. It had delayed development on the Android platform, which required significant development resources and investment, as part of its strategy to maximize its business potential and relationship with Apple. The release is available in the Google Play store. The initial version includes four of Roambi’s most popular views: Card, Catalist, Layers and Superlist. As in its application for the Apple platform, security features for Android include Device Lockout, Remote Wipe and Application Passcode. Integration with Roambi’s partner Okta provides identity management services and gives access to any applications supported by Okta; integration includes Active Directory (AD) and Lightweight Directory Access Protocol (LDAP). While Roambi Flow will not be available on Android out of the gate, the company says it will become available by the end of 2014.

Current approaches to mobile business intelligence applications on the market include native, Web-based and hybrid (a combination of both). We compare these approaches in detail in the executive summary of our Mobile Business Intelligence Value Index report. With the new Android support, Roambi takes a unique approach to the hybrid architecture that bypasses the browser completely: no data is cached in a browser; instead, data is loaded directly to the device and rendered natively by Roambi. From the user’s perspective, the benefit of this approach is performance, since interactivity does not rely on data traveling over a network. A second benefit is offline access to data, which is not available via non-native approaches. From the developer’s or information assembler’s perspective, testing across browsers is not needed, since there is no browser data caching and the experience is the same regardless of the browser in use.

Our next-generation business intelligence research shows that executives strongly support mobile BI: Nearly half of them said that mobility is very important to their BI processes. Executives also favor Apple over Android devices, which is likely one of the key reasons for Apple’s dominance in the current business intelligence landscape. However, our research shows latent demand for Android devices in the business intelligence market, and given Android’s dominance in the consumer market as well as in Asia and other emerging markets, any company that seeks significant global market share will need to support both platforms. Our information optimization research found Google Android ranked second among smartphone platforms (24%), behind the first-ranked Apple iPhone (59%). Android support is therefore an important evolution for Roambi to address increasing demand in the market for Android devices.

Usability has become the most important evaluation criterion; in the information optimization benchmark research it was selected as most important by over half (58%) of organizations. Roambi ranked highest in usability in the Ventana Research Mobile Business Intelligence Value Index, though its overall score was hampered somewhat by the lack of support for Android. With Android support, the company now addresses the need to support multiple devices and the bring-your-own-device (BYOD) practices now becoming more prevalent and allowed by organizations. By addressing this as well as taking advantage of broader market developments such as the new Apple SDKs, Roambi continues to address what organizations find important today. Usability of business intelligence systems is a top priority for 63% of companies. Even our big data analytics research finds a growing level of overall importance for mobile access in almost half of organizations (46%). Any company that wants to get user-friendly business intelligence into the hands of its mobile workers quickly and effectively should have Roambi in its consideration set.

Regards,

Tony Cosentino

VP and Research Director

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language widely used in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelization allows R algorithms to run on much larger data sets, since they are not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first includes a Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide the complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster, such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
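Teradata has not published the constructor’s API details here, but the split-apply-combine paradigm it follows is easy to illustrate. Below is a minimal sketch in Python, using the standard multiprocessing library as a stand-in for the database’s parallel workers; the data, grouping key and per-group computation are all hypothetical.

```python
from multiprocessing import Pool
from collections import defaultdict

def split(rows, key):
    """'Split' step: partition rows into groups by a key column."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    return list(groups.values())

def summarize(group):
    """'Apply' step: an arbitrary per-group computation, run in parallel.
    A simple mean of 'spend' stands in for fitting an R model per group."""
    values = [row["spend"] for row in group]
    return {"customer": group[0]["customer"], "mean_spend": sum(values) / len(values)}

if __name__ == "__main__":
    rows = [
        {"customer": "a", "spend": 10.0},
        {"customer": "a", "spend": 14.0},
        {"customer": "b", "spend": 7.0},
    ]
    with Pool() as pool:  # worker processes stand in for database nodes
        results = pool.map(summarize, split(rows, "customer"))
    print(results)        # 'combine' step: partial results gathered into one answer
```

In Teradata’s implementation the split and combine would happen across database nodes rather than local processes, which is what frees the computation from a single machine’s memory.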

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload-specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, the recently announced JSON integration delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, such as the partnership announced with MongoDB supporting JSON integration. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for organizations from the largest down to the midsized. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. As a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base; its biggest change is how it renders graphics, putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics across device environments, including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktops to the server for others to view or build upon. Visualization is another area of increasing importance for organizations. Our research on big data analytics found that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).

InfoDiscovery is Information Builders’ contender among the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platforms of their well-known BI products, with the aim of taking advantage of their maturity. This second wave of tools is still behind the first in ease of use and visual analysis but is advancing rapidly, and it can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is its most important use (for 62% of research participants), followed by alerting and notification.

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. The IoT is used in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate amid changes in the IT industry and business demand for analytics and data, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has built that capability into its products for years. In addition, the company’s core software is hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are in limited release currently and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product. IT staff can uphold their governance and system management responsibilities through visibility and flexible control of the platform. For its part iWay Sentinel should interest companies that have to manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what is called information optimization, where it is finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to exploit big data sources and mobile technology but will need to invest further to ensure it can serve the full spectrum of new business needs. I continue to recommend that any company that must serve a large set of employees and needs to blend data and analytics for business intelligence or information applications consider Information Builders.

Regards,

Tony Cosentino

VP and Research Director

Tibco’s recent acquisition of Jaspersoft helps the company fill out its portfolio of business intelligence (BI) and reporting software in an increasingly competitive marketplace. Tibco already offered a range of products in BI and analytics including Tibco Spotfire, an established product for visual data discovery. Jaspersoft and its open source Java reporting tool JasperReports have been around since 2001, and the company says it has 16 million product downloads worldwide, 140,000 production deployments and 2,000 commercial customers in 100 countries. Jaspersoft received attention recently for its partnership with Amazon Marketplace, which lets customers embed its system into applications using a credit card and a few simple configuration steps. Embedding technology is an area that Tibco knows well from its history of integrating its products into enterprise architectures around the world.

The acquisition is significant given today’s advancements in the business intelligence market and the need for tools to serve a variety of users. In some ways the two companies’ technologies serve the same users – analysts and business users trying to make decisions with data – but how they approach that task and support a broad set of roles and responsibilities differs. Tibco Spotfire, Tibco’s approach to business analytics, serves analytics and visualization, specializing in visual discovery and data exploration, while Jaspersoft addresses query and analysis, reporting, dashboards and other aspects of BI. According to our benchmark research on information optimization, the capabilities business users most often need are to drill into information within applications (37%), search for data (36%) and collaborate (27%). For analysts, the most necessary capabilities are extracting data (39%), designing and integrating metrics (37%) and developing policies and rules for access (34%). With Jaspersoft, Tibco can address both groups and also can embed intelligence and reporting capabilities into operationally oriented environments across a range of business applications.

The acquisition makes sense in that more capabilities are needed to address the expanding scope of business intelligence and analytics. In practice, it will be interesting to see how the open source community and culture of Jaspersoft mesh with the culture of Tibco’s Spotfire division. For now, Jaspersoft will continue as a separate division, so business likely will continue as usual until management decides on specific areas of integration. With respect to development efforts, it will be critical to blend the discovery capabilities of Tibco Spotfire with Jaspersoft’s reporting, which will be a formidable challenge. Another key to success will be how Tibco integrates both with the capabilities from Extended Results, a mobile business intelligence provider Tibco bought in 2013. Mobility is an area where Ventana Research found Jaspersoft significantly lacking, so the Extended Results capabilities should prove useful. Finally, Tibco’s event-enabled infrastructure will likely play a key role as the company continues to invest in operational intelligence for event-focused information gathering and delivery. Our operational intelligence research found that lack of integration between business intelligence tools such as Jaspersoft’s and event streams such as Tibco’s is a major challenge for over half (51%) of organizations. This is a potential opportunity for Tibco as it looks at future integration of the technologies.

The Jaspersoft acquisition is not surprising given recent changes in the BI market. The category, which just a few years ago was considered mature and well-defined, is expanding to include areas such as analytic discovery tools, advanced analytics and big data. The union of Tibco Spotfire, which primarily targets line-of-business professionals from analysts to knowledge workers, and Jaspersoft, a more IT-centered company, reflects the need for the industry to bridge a divide that exists in many organizations where IT publishes dashboards and reports to the business. Our information optimization benchmark research found this challenge of using information across business and IT: information management these days is most often (in 42% of organizations) a joint responsibility of IT and the lines of business, although IT is involved in some capacity in four-fifths of organizations. It remains to be seen whether the joint company can take on major competitors that have far more cash resources and take a similar approach.

Preliminary indicators show a good fit between these two organizations. Customers of each will be introduced to important new tools and capabilities from the other. One of the first likely moves for Tibco will be to introduce the 2,000 commercial customers and global presence of Jaspersoft to the broader portfolio. We advise those customers to evaluate what Tibco offers, especially Tibco Spotfire, which continues to be a leader in the visual data discovery market. Before investing, however, customers and prospects should demand clarity on the company’s plans for technical integration of analytics and how these will fit with their organizations’ long-term business intelligence and analytics roadmaps. Tibco customers migrating to the cloud should investigate the work Jaspersoft is doing with companies like Amazon and consider whether the embedded approach to interactive reporting can fit with their analytics, cloud and application strategies.

This acquisition gives Tibco a significant opportunity to advance business analytics, but the company historically has not been as aggressive in marketing and selling analytics as others in the market. Demand for visual discovery and big data analytics has grown dramatically; our research shows more than three-quarters of organizations rate them important overall. Big data analytics and visualization are areas where Spotfire innovated before the Tibco acquisition, yet it has not captured its fair share of growth in buying trends. Tibco now has the chance to provide analytics and BI that leverage its entire portfolio of integration, event processing, cloud and social collaboration software; to seize it, the company will need to supercharge its analytics efforts with the new products from Jaspersoft. Let’s see how it does.

Regards,

Tony Cosentino

VP and Research Director

Alteryx has released version 9.0 of Alteryx Analytics, which provides capabilities ranging from data management to predictive analytics, in advance of its annual user conference, Inspire 2014. I have covered the company for several years as it has emerged as a key player in providing a range of business analytics from predictive to big data analytics. The importance of this category of analytics is revealed by our latest benchmark research on big data analytics, which finds that predictive analytics is the most important type of big data analytics, ranked first by nearly half (47%) of research participants. Version 9.0 includes new capabilities and integration with a range of new information sources, including read and write capability for IBM SPSS and SAS files, for a range of analytic needs.

After attending Inspire 2013 last year, I wrote about capabilities that are enabling an emerging business role, which Alteryx calls the data artisan. The label refers to an analyst who combines both art and science in using analytics to help direct business outcomes. Alteryx takes an innovative and intuitive approach to analytic tasks, using workflow and linking various data sources through in-memory computation and processing. It takes a “no code” drag-and-drop approach to integrate data from files and databases, prepare data for analysis, and build and score predictive models to yield relevant results. Other vendors in the advanced analytics market are also applying this approach, but few mature tools are currently available. The output of Alteryx analytic processes can be shared automatically in numerous data formats, including direct export into visualization tools such as those from Qlik (new support) and Tableau. This can help users improve their predictive analytics capabilities and take action on the outcomes of analytics, which are the two capabilities most often cited in our research as needed to improve big data analytics.

Alteryx now works with Revolution Analytics to increase the scalability of its system to work with large data sets. The open source language R continues to gain popularity and is being embedded in many business intelligence tools, but it runs only on data that can be loaded into memory. Running only in memory does not address analytics on data sets that run into terabytes and hundreds of millions of values, and it potentially requires a sub-sampling approach to advanced analytics. With RevoScaleR, Revolution Analytics rewrites parts of R’s algorithms so that processing tasks can be parallelized and run in big data architectures such as Hadoop. Such capability is important for analytic problems including recommendation engines, unsupervised anomaly detection, some classification and regression problems, and some clustering problems. These analytic techniques are appropriate for some of the top business uses of big data analytics, which according to our research are cross-selling and up-selling (important to 38%), better understanding of individual customers (32%), analyzing all data rather than a sample (30%) and price optimization (28%). Alteryx Analytics automatically detects whether to use RevoScaleR or open source R algorithms. This approach simplifies the technical complexities of scaling R by providing a layer of abstraction for the analytic professional.

Scoring – the ability to input a data record and receive the probability of a particular outcome – is an important if not well-understood aspect of predictive analytics. Our research shows that companies that score models on a timely basis according to their needs get better organizational results than those that score all models the same way. Working with Revolution Analytics, Alteryx has enhanced scoring scalability for R algorithms with new capabilities that chunk data in a parallelized fashion. This approach bypasses the memory-only approach to enable a theoretically unlimited number of scores to be processed. For large-scale implementations and consumer applications in industries such as retail, an important target market for Alteryx, these capabilities are becoming important.
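Neither company has published the mechanics of the new scoring path, but the chunking idea itself is straightforward: stream records in fixed-size blocks, score each block independently (blocks could just as well be farmed out to parallel workers) and write results as you go, so memory use stays constant regardless of how many records arrive. Here is a minimal sketch in Python, with a hypothetical one-variable model and an in-memory stand-in for a large file:

```python
import csv
import io

CSV_DATA = "income\n50000\n42000\n61000\n"   # tiny stand-in for a huge file on disk

def score(row, coef=0.002, intercept=0.1):
    """Hypothetical one-variable model: returns a score for one record."""
    return intercept + coef * float(row["income"])

def score_in_chunks(infile, outfile, chunk_size=2):
    """Stream records in fixed-size chunks so memory stays bounded.
    Each chunk is scored independently and could be sent to a parallel worker."""
    reader = csv.DictReader(infile)
    writer = csv.writer(outfile)
    writer.writerow(["score"])
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) == chunk_size:          # chunk full: score it and flush
            writer.writerows([[score(r)] for r in chunk])
            chunk = []                        # memory reclaimed before the next chunk
    if chunk:                                 # score the final partial chunk
        writer.writerows([[score(r)] for r in chunk])

out = io.StringIO()
score_in_chunks(io.StringIO(CSV_DATA), out)   # in production these would be files
print(out.getvalue())
```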

Alteryx 9.0 also improves on open source R’s default approach to scoring, which is “all or nothing”: if data is missing (a null value) or a new level for a categorical variable was not included in the original model, R will not score the model until the issue is addressed. This is a particular problem for analysts who want to score data in small batches or individually. In contrast, Alteryx’s new “best effort” approach scores the records that can be run without incident, and those that cannot be run are returned with an error message. This adjustment is particularly important as companies start to deploy predictive analytics into areas such as call centers or within Web applications such as automatic quotes for insurance.
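Alteryx’s implementation is proprietary, but the contrast between the two scoring policies can be sketched in a few lines of Python. The model, field names and error conditions below are hypothetical; the point is that a best-effort scorer isolates failures to individual records instead of aborting the batch:

```python
def score(record, model, known_levels):
    """Score one record; raise on a null value or an unseen categorical level."""
    if record.get("income") is None:
        raise ValueError("null value for 'income'")
    if record["segment"] not in known_levels:
        raise ValueError(f"unseen level {record['segment']!r} for 'segment'")
    return model["intercept"] + model["coef"] * record["income"]

def score_best_effort(records, model, known_levels):
    """Score what can be scored; report per-record errors instead of failing the batch."""
    results = []
    for i, rec in enumerate(records):
        try:
            results.append({"row": i, "score": score(rec, model, known_levels)})
        except ValueError as err:
            results.append({"row": i, "error": str(err)})
    return results

model = {"intercept": 0.1, "coef": 0.002}          # hypothetical fitted model
records = [
    {"income": 50_000, "segment": "retail"},
    {"income": None, "segment": "retail"},         # null value: fails alone
    {"income": 42_000, "segment": "wholesale"},    # level unseen at training time
]
print(score_best_effort(records, model, {"retail"}))
```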

Alteryx 9.0 also has new predictive modeling tools and functionality. A spline model helps address regression and classification problems such as data reduction and nonlinear relationships and their interactions. It takes a “clear box” approach to serve users with differing objectives and skill levels: the underpinnings of the model are exposed so that advanced users can modify it, while less sophisticated users can use the model without necessarily understanding all of its intricacies. Other capabilities include a Gamma regression tool that allows analysts to model the Gamma family of distributions using the generalized linear modeling (GLM) framework. Heat plot tools for visualizing joint probability distributions, such as between customer income level and customer advocacy, and more robust A/B testing tools, which are particularly important in digital marketing analytics, are also part of the release.
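Alteryx’s Gamma regression tool itself is proprietary, but the underlying technique, a GLM with a Gamma family, can be shown with Python’s statsmodels as a stand-in. The synthetic data below is hypothetical; a Gamma GLM with a log link suits positive, right-skewed responses such as claim costs or order values:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic positive, right-skewed response driven by one predictor.
x = rng.uniform(1, 10, size=500)
y = rng.gamma(shape=2.0, scale=np.exp(0.3 * x) / 2.0)

X = sm.add_constant(x)  # design matrix with an intercept column
gamma_glm = sm.GLM(y, X, family=sm.families.Gamma(link=sm.families.links.Log()))
result = gamma_glm.fit()
print(result.summary())  # coefficients are on the log scale
```

The fitted coefficients are on the log scale, so exponentiating them gives multiplicative effects on the expected response.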

At the same time, Alteryx has expanded its base of information sources. According to our research, working with all sources of data, not just one, is the most common definition of big data analytics, as stated by three-quarters (76%) of organizations. While structured data from transaction systems and so-called systems of record is still the most important, new data sources, including external ones, are becoming important. Our research shows that the most widely used external data sources are cloud applications (54%) and social media data (46%); five additional data sources, including Internet, consumer, market and government sources, are virtually tied in third position (with 39% to 42% each). Alteryx will need to be mindful of the best practices in big data analytics I have outlined to stay on top of a growing set of requirements to blend big data and apply a range of advanced analytics.

New connectors to the social media data provider Gnip give access to social media websites through a single API, and a DataSift (http://www.datasift.com) connector helps make social media more accessible and easier to analyze for any business need. Other new connectors in 9.0 include those for Foursquare, Google Analytics, Marketo, salesforce.com and Twitter. New data warehouse connectors include those for Amazon Redshift, HP Vertica, Microsoft SQL Server and Pivotal Greenplum. Access to SPSS and SAS data files also is introduced in this version; Alteryx hopes to break down the barriers to entry in accounts dominated by these advanced analytic stalwarts. With already existing connectors to major cloud and on-premises data sources, the company provides a robust integration platform for analytics.

Alteryx is on a solid growth curve, as evidenced by the increasing number of inquiries and my conversations with company executives. That is not surprising given the disruptive potential of the technology and its unique analytic workflow for data blending and advanced analytics. This data blending and workflow technology is not highlighted enough; it is one of the largest differentiators of the software, reducing data-related tasks such as preparing (47%) and reviewing (43%) data, which our customer analytics research finds get in the way of analysts performing analytics. Additionally, Alteryx’s ability to apply location analytics within its product is a key differentiator; our research found that location analytics delivers far more value than merely viewing traditional visualizations and tables of data, and it helps rapidly identify areas where customer experience and satisfaction can be improved, the top benefit found in our research. The flexible platform resonates particularly well with lines of business, especially in fast-moving, lightly regulated industries such as travel, retail and consumer goods where speed of analytics is critical. The work the company is doing with Revolution Analytics and the resulting ability to scale are important for advanced analytics that operate on big data. The ability to seamlessly connect and blend information sources is a critical capability for Alteryx, and further investment there is wise, but Alteryx should also examine where collaborative technology could help businesspeople work together on analytics within the software. Alteryx will need to continue to adapt to market demand for analytics and keep its focus on various line-of-business areas to sustain its growth. Just about any company involved in analytics today should evaluate Alteryx and see how its unique approach can streamline analytics.

Regards,

Tony Cosentino

VP and Research Director

Organizations should consider multiple aspects of deploying big data analytics. These include the type of analytics to be deployed, how the analytics will be deployed technologically and who must be involved both internally and externally to enable success. Our recent big data analytics benchmark research assesses each of these areas. How an organization views these deployment considerations may depend on the expected benefits of the big data analytics program and the particular business case to be made, which I discussed recently.

According to the research, the most important capability of big data analytics is predictive analytics (64%), but among companies that have deployed big data analytics, descriptive analytic approaches of query and reporting (74%) and data discovery (64%) are more readily available than predictive capabilities (57%). Such statistics may be a function of big data technologies such as Hadoop and their associated distributions having prioritized the ability to run descriptive statistics through standard SQL, which is the most common method for implementing analysis on Hadoop. Cloudera’s Impala, Hortonworks’ Stinger (an extension of Apache Hive), MapR’s Drill, IBM’s Big SQL, Pivotal’s HAWQ and Facebook’s open source contribution of Presto SQL all focus on accessing data through an SQL paradigm. It is not surprising, then, that the technology research participants use most for big data analytics is business intelligence (75%) and that the most-used analytic methods (pivot tables, 46%; classification, 39%; and clustering, 37%) are descriptive and exploratory in nature. Similarly, participants said that visualization of big data allows analysts to perform faster analysis (49%), understand context better (48%), perform root-cause analysis (40%) and display multiple result sets (40%), but visualization does not provide more advanced analytic capabilities. While various vendors now offer approaches to run advanced analytics on big data, the research shows that in terms of big data, organizational capabilities still revolve around more basic analytic access.
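The descriptive methods the research cites map naturally onto the SQL GROUP BY paradigm these engines expose. As a stand-in for a BI tool sitting on one of them, here is a minimal pandas sketch of the most-used method, a pivot table, over a hypothetical transaction extract:

```python
import pandas as pd

# Hypothetical transaction-level extract, e.g. pulled from a SQL-on-Hadoop engine.
df = pd.DataFrame({
    "region":  ["east", "east", "west", "west", "west"],
    "product": ["a", "b", "a", "a", "b"],
    "revenue": [100.0, 250.0, 80.0, 120.0, 300.0],
})

# A pivot table is the GROUP BY of the BI world: descriptive, not predictive.
summary = df.pivot_table(values="revenue", index="region",
                         columns="product", aggfunc="sum", fill_value=0)
print(summary)
```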

For companies that are implementing advanced analytic capabilities on big data, there are further analytic process considerations, and many have not yet tackled them. Model building and model deployment should be manageable and timely, involve specialized personnel, and integrate into the broader enterprise architecture. Our research provides an in-depth look at adoption of the different types of in-database analytics, deployment of advanced analytic sandboxes, data mining, model management, integration with business processes and overall model deployment, but those are beyond the topic here.

Beyond analytic considerations, a host of technological decisions must be made around big data analytics initiatives. One of these is the degree of customization necessary. As technology advances, customization is giving way to more packaged approaches to big data analytics. According to our research, the majority (54%) of companies that have already implemented big data analytics did custom builds using big data-specific languages and interfaces. Most of those that have not yet deployed are likely to purchase a dedicated or packaged application (44%), followed by a custom build (36%). We think that this pre- and post-deployment comparison reflects a maturing market.

The move from custom approaches to standardized ones has important implications for the skill sets needed for a big data analytics initiative. In comparing the skills that organizations said they currently have to the skills they need to be successful with big data analytics, it is clear that companies should spend more time building employees’ statistical, mathematical and visualization skills. On the flip side, organizations should make sure their tools can support skill sets that they already have, such as use of spreadsheets and SQL. This is convergent with other findings about training needs, which include applying analytics to business problems (54%), training on big data analytics tools (53%), analytic concepts and techniques (46%) and visualizing big data (41%). The data shows that as approaches become more standardized and the market focus shifts toward them from customized implementations, skill needs are shifting as well. This is not to say that demand is moving away from the data scientist completely. According to our research, organizations that involve cross-functional teams or data scientists in the deployment process are realizing the most significant impact. It is clear that multiple approaches for personnel, departments and current vendors play a role in deployments and that some approaches will be more effective than others.

Cloud computing is another key consideration with respect to deploying analytics systems as well as sandbox modeling and testing environments. For deployment of big data analytics, 27 percent of companies currently use a cloud-based method, while 58 percent said they do not and 16 percent do not know what is used. Not surprisingly, far fewer IT professionals (19%) than business users (40%) said they use cloud-based deployments for big data analytics. The flexibility and capability that cloud resources provide are particularly attractive for sandbox environments and for organizations that lack big data analytic expertise. However, for big data model building, most organizations (42%) still utilize a dedicated internal sandbox environment, while fewer (19%) use a non-dedicated internal sandbox (that is, a container in a data warehouse used to build models) and others use a cloud-based sandbox either as a completely separate physical environment (9%) or as a hybrid approach (9%). From this last data we infer that business users are sometimes using cloud-based systems to do big data analytics without the knowledge of IT staff. Among organizations that are not using cloud-based systems for big data analytics, security (45%) is the primary reason they do not.

Perhaps the most important consideration for big data analytics is choosing vendors to partner with to achieve organizational objectives. When we understand the move from custom technological approaches to more packaged ones and the types of analytics currently being implemented for big data, it is not surprising that a majority of research participants (52%) are looking to their business intelligence systems providers to supply their big data analytics solution. However, a significant number of companies (35%) said they will turn to a specialist analytics provider or their database provider (34%). When evaluating big data analytics, usability is the most important vendor consideration but not by as wide a margin as in categories such as business intelligence. A look at criteria rated important and very important by research participants reveals usability is the highest ranked (94%), but functionality (92%) and reliability (90%) follow closely. Among innovative new technologies, collaboration is important (78%) while mobile access (46%) is much less so. Coupled with the finding that communication and knowledge sharing combined is an important benefit of big data analytics, it is clear that organizations are cognizant of the collaborative imperative when choosing a big data analytics product.

Deployment of big data analytics starts with forethought and a well-defined business case that includes the expected benefits I discussed in my previous analysis. Once the outcome-driven framework is established, organizations should consider the types of analytics needed, the enabling technologies and the people and processes necessary for implementation. To learn more about our big data analytics research, download a copy of the executive summary here.

Regards,

Tony Cosentino

VP & Research Director

SAP recently presented its analytics and business intelligence roadmap and new innovations to about 1,700 customers and partners using SAP BusinessObjects at its SAP Insider event (#BI2014). SAP has one of the largest presences in business intelligence due to its installed base of SAP BusinessObjects customers. The company intends to defend its current position in the established business intelligence (BI) market while expanding in the areas of databases, discovery analytics and advanced analytics. As I discussed a year ago, SAP faces an innovator’s dilemma in parts of its portfolio, but it is working aggressively to get ahead of competitors.

One of the pressures that SAP faces is from a new class of software that is designed for business analytics and enables users to visualize and interact with data in new ways without relationships in the data being predefined. Our business technology innovation research shows that analytics is the top-ranked technology innovation in business today, rated first by 39 percent of organizations. In conventional BI systems, data is modeled in so-called cubes or other defined structures that allow users to slice and dice data quickly and easily. The cube structure solves the problem of abstracting the complexity of the structured query language (SQL) of the database and slashes the amount of time it takes to read data from a row-oriented database. However, as the cost of memory decreases significantly, enabling the use of new column-oriented databases, these methods of BI are being challenged. For SAP and other established business intelligence providers, this situation represents both an opportunity and a challenge. In responding, almost all of these BI companies have introduced some sort of visual discovery capability. SAP introduced SAP Lumira, formerly known as Visual Intelligence, 18 months ago to compete in this emerging segment, and it has gained traction in terms of downloads, which the company estimated at 365,000 in the fourth quarter of 2013.

SAP and other large players in analytics are trying not just to catch up with visual discovery players such as Tableau but rather to make it a game of leapfrog. Toward that end, the capabilities of Lumira demonstrated at the Insider conference included information security and governance, advanced analytics, integrated data preparation, storyboarding and infographics; the aim is to create a differentiated position for the tool. For me, the storyboarding and infographics capabilities are about catching up, but being able to govern and secure today’s analytic platforms is a critical concern for organizations, and SAP means to capitalize on that need. A major analytic announcement at the conference focused on the integration of Lumira with the BusinessObjects platform. Lumira users now can create content and save it to the BusinessObjects server, mash up data and deliver the results through a secure common interface.

Beyond the integration of security and governance with discovery analytics, the leapfrog approach centers on advanced analytics. SAP’s acquisition last year of KXEN and its initial integration with Lumira provide an advanced analytics tool that does not require a data scientist to use it. My coverage of KXEN prior to the acquisition found the tool user-friendly and broadly applicable, especially in the area of marketing analytics. Used with Lumira, KXEN will ultimately provide front-end integration for in-database analytic approaches and for more advanced techniques. Currently, for data scientists to run advanced analytics on large data sets, SAP provides its own predictive analytic library (PAL), which runs natively on SAP HANA and offers commonly used algorithms such as clustering, classification and time-series analysis. Integration with the R language is available through a wrapper approach, but the system overhead is greater than with the PAL approach on HANA.

SAP said the broader vision for Lumira and the BusinessObjects analytics platform is “collective intelligence,” which it described as “a Wikipedia for business” that provides a bidirectional analytic and communication platform. To achieve this lofty goal, SAP will need to continue to put resources into HANA and facilitate the integration of underlying data sources. Our recently released research on big data analytics shows that being able to analyze data from all data sources (selected by 75% of participants) is the most prevalent definition of big data analytics. To this end, SAP announced the idea of an “in-memory fabric” that allows virtual data access to multiple underlying data sources, including big data platforms such as Hadoop. The key feature of this data federation approach is what the company calls smart data access (SDA). Instead of loading all data into memory, the virtualized system sets a proxy that points to where specific data is held. Using machine learning algorithms, it can gauge how important information is based on the query patterns of users and load the most important data into memory. The approach will enable the company to analyze data on a massive scale, since it utilizes both HANA and the Sybase IQ columnar database, which the company says was just certified as the world’s largest data warehouse, at more than 12 petabytes. Others such as eBay and Teradata may beg to differ with the result based on another implementation, but nevertheless it is an impressive achievement.
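SAP has not detailed how SDA decides what to hold in memory, but the general pattern, a federation proxy that tracks query frequency and promotes hot sources into an in-memory cache, can be sketched conceptually. Everything below (class, threshold, sources) is a hypothetical illustration, not SAP’s implementation:

```python
from collections import Counter

class SmartDataProxy:
    """Conceptual sketch of federation with usage-based caching: data stays in
    remote sources until query patterns justify loading it into memory."""

    def __init__(self, remote_sources, cache_capacity=2):
        self.remote = remote_sources          # name -> callable returning data
        self.cache = {}                       # name -> data materialized in memory
        self.hits = Counter()                 # query-pattern statistics per source
        self.capacity = cache_capacity

    def query(self, name):
        self.hits[name] += 1
        if name in self.cache:
            return self.cache[name]           # served from memory
        data = self.remote[name]()            # fetched through the proxy on demand
        if self.hits[name] >= 2:              # queried often enough: promote it
            if len(self.cache) >= self.capacity:
                coldest = min(self.cache, key=self.hits.__getitem__)
                del self.cache[coldest]       # evict the least-queried entry
            self.cache[name] = data
        return data

proxy = SmartDataProxy({"sales": lambda: [1, 2, 3], "weblogs": lambda: ["GET /"]})
for _ in range(3):
    proxy.query("sales")                      # repeated use promotes 'sales' to memory
print("cached:", list(proxy.cache))
```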

Another key announcement was SAP Business Warehouse (BW) 7.4, which now runs on top of HANA. This combination is likely to be popular because it enables migration of the underlying database without impacting business users. Such users store many of their KPIs and complex calculations in BW, and to uproot this system is untenable for many organizations. SAP’s ability to continue support for these users is therefore something of an imperative. The upgrade to 7.4 also provides advances in capability and usability. The ability to do complex calculations at the database level without impacting the application layer enables much faster time-to-value for SAP analytic applications. Relative to the in-memory fabric and SDA discussed above, BW users no longer need intimate knowledge of HANA SDA. The complete data model is now exposed to HANA as an information cube object, and HANA data can be reflected back into BW. To back it up, the company offered testimony from users. Representatives of Molson Coors said their new system took only a weekend to move into production (after six weeks of sandbox experiments and six weeks of development) and enables users to perform right-time financial reporting, rapid prototyping and customer sentiment analysis.

SAP’s advancements and portfolio expansion are necessary for it to continue in a leadership position, but the inherent risk is confusion among its customer and prospect base. SAP published its last statement of direction for analytic dashboards about this time last year, and according to company executives, it will be updated fairly soon, though they would not specify when. The many tools in the portfolio include Web Intelligence, Crystal Reports, Explorer, Xcelsius and now Lumira. SAP and its partners position the portfolio as a toolbox in which each tool is meant to solve a different organizational need. There is overlap among them, however, and the inherent complexity of the toolbox approach may not resonate well with business users who desire simplicity and timeliness.

SAP customers and others considering SAP should carefully examine how well these tools match the skills in their organizations. We encourage companies to look at the different organizational roles as analytic personas and try to understand which constituencies are served by which parts of the SAP portfolio. For instance, one of the most critical personas going forward is the designer, since usability is the top priority for organizational software according to our next-generation business intelligence research. Yet this role may become more difficult to fill over time, since trends such as mobility continue to add to the job requirements. SAP’s recent upgrade of Design Studio to address emerging needs such as mobility and mobile device management (MDM) may force some organizations to rebuild dashboards and upgrade their designer skill sets to include JavaScript and Cascading Style Sheets, but the ability to deliver multifunctional analytics across devices in a secure manner is becoming paramount. I note that SAP’s capabilities in this regard helped it score third overall in our 2014 Mobile Business Intelligence Value Index. Other key personas are the knowledge worker and the analyst. Our data analytics research shows that while SQL and Excel skills are abundant in organizations, statistical and mathematical skills are less common. SAP’s integration of KXEN into Lumira can help organizations develop these personas.

SAP is pursuing an expansive analytic strategy that includes not just traditional business intelligence but databases, discovery analytics and advanced analytics. Any company that has SAP installed, especially those with BusinessObjects or an SAP ERP system, should consider the broader analytic portfolio and how it can meet business goals. Even for new prospects, the portfolio can be compelling, and as the roadmap centered on Lumira develops, SAP may be able to take that big leap in the analytics market.

Regards,

Tony Cosentino

VP and Research Director

SAS Institute, a long-established provider of analytics software, showed off its latest technology innovations and product road maps at its recent analyst conference. In a very competitive market, SAS is not standing still, and executives showed progress on the goals introduced at last year’s conference, which I covered. SAS’s Visual Analytics software, integrated with an in-memory analytics engine called LASR, remains the company’s flagship product in its modernized portfolio. CEO Jim Goodnight demonstrated Visual Analytics’ sophisticated integration with statistical capabilities, which is something the company sees as a differentiator going forward. The product already provides automated charting capabilities, forecasting and scenario analysis, and SAS probably has been doing user-experience testing, since the visual interactivity is better than what I saw last year. SAS has put Visual Analytics on a six-month release cadence, which is a fast pace but necessary to keep up with the industry.

Visual discovery alone is becoming an ante in the analytics market, since just about every vendor has some sort of discovery product in its portfolio. For SAS to gain on its competitors, it must make advanced analytic capabilities part of the product. In this regard, Dr. Goodnight demonstrated the software’s visual statistics capabilities, which can switch quickly from visual discovery into regression analysis, running multiple models simultaneously and then selecting the best one. The statistical product is scheduled for availability in the second half of this year. With the ability to automatically create multiple models and output summary statistics and model parameters, users can create and optimize models in a more timely fashion, so the information can become actionable sooner. In our research on predictive analytics, most participants (68%) cited competitive advantage as a benefit of predictive analytics, and our research also shows that companies able to update their models daily or more often are very satisfied with their predictive analytics tools more often than others are. The ability to create models in an agile and timely manner is valuable for various uses in a range of industries.
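SAS has not described its selection machinery, but the pattern it names, fitting several candidate models on the same data and keeping the one with the best summary statistic, is easy to demonstrate with scikit-learn as a stand-in. The candidates and synthetic data below are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, 0.0, -2.0, 0.0, 0.5]) + rng.normal(scale=0.5, size=200)

candidates = {
    "ols":   LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.1),
}

# Fit and evaluate all candidates, then keep the best by cross-validated R^2.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("selected model:", best)
candidates[best].fit(X, y)  # refit the winning model on all of the data
```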

There are three ways that SAS enables high-performance computing. The first is the traditional grid approach, which distributes processing across multiple nodes. The second is the in-database approach, which allows SAS to run as a process inside the database. The third is extracting data and running it in memory. The system has the flexibility to run on different large-scale database types such as MPP as well as Hadoop infrastructure through Pig and Hive. This is important because for 64 percent of organizations the ability to run predictive analytics on big data is a priority, according to our recently released research on big data analytics. SAS can run via MapReduce or directly access the underlying Hadoop Distributed File System and pull the data into LASR, the SAS in-memory system. SAS works with almost all commercial Hadoop implementations, including Cloudera, Hortonworks, EMC’s Pivotal and IBM’s InfoSphere BigInsights. The ability to put analytical processes into the MapReduce paradigm is compelling, as it enables predictive analytics on big data sets in Hadoop, though the immaturity of initiatives such as YARN may relegate the jobs to batch processing for the time being. The flexibility of LASR and the associated portfolio can help organizations overcome the challenge of architectural integration, which is the most widespread technological barrier to predictive analytics (for 55% of participants in that research). Of note, the SAS approach provides a purely analytical engine: since no SQL is involved in the algorithms, there is no SQL-related overhead, and the engine runs directly on the supporting system’s resources.

As well as innovating with Visual Analytics and Hadoop, SAS has a clear direction in its road map, intending to integrate the data integration and data quality aspects of the portfolio in a single workflow with the Visual Analytics product. Indeed, data preparation is still a key sticking point for organizations. According to our benchmark research on information optimization, analytic time is still consumed most by data preparation (for 47%) and data quality and consistency (45%). The most valuable task, interpretation of the data, ranks only fourth, at 33 percent of analytics time. This is a big area of opportunity in the market, as reflected by the flurry of funding for data preparation software companies in the fourth quarter of 2013. For further analysis of SAS’s data management and big data efforts, please read my colleague Mark Smith’s analysis.

Established relationships with companies like Teradata and a reinvigorated relationship with SAP position SAS to remain at the heart of enterprise analytic architectures. In particular, the co-development effort that allows the SAS predictive analytics workbench to run on top of SAP HANA is promising, though it raises the question of how aggressively SAP will advance its own advanced analytic capabilities on HANA. One area where SAS could learn from SAP is its developer ecosystem. While SAP has thousands of developers building applications for HANA, SAS could do a better job of providing the tools developers need to extend the SAS platform. SAS has been able to prosper with a walled-garden approach, but the breadth and depth of innovation across the technology and analytics industry puts this type of strategy under pressure.

Overall, SAS impressed me with what it has accomplished in the past year and the direction it is heading in. The broad-based development efforts raise a final question of where the company should focus its resources. Based on its progress in the past year, it seems that a lot has gone into visual analytics, visual statistics, LASR and alignment with the Hadoop ecosystem. In 2014, the company will continue horizontal development, but there is a renewed focus on specific analytic solutions as well. At a minimum, the company has good momentum in retail, fraud and risk management, and manufacturing. I’m encouraged by this industry-centric direction because I think that the industry needs to move away from the technology-oriented three Vs toward the business-oriented three Ws.

For customers already using SAS, the company’s road map is designed to capture market advantage with minimal disruption to existing environments. In particular, focusing on solutions as well as technological depth and breadth is a viable strategy. While it still may make sense for customers to look around at the innovation occurring in analytics, moving to a new system will often incur high switching costs in productivity as well as money. For companies just starting out with visual discovery or predictive analytics, SAS Visual Analytics provides a good point of entry, and SAS has a vision for more advanced analytics down the road.

Regards,

Tony Cosentino

VP and Research Director

We recently released our benchmark research on big data analytics, and it sheds light on many of the most important discussions occurring in business technology today. The study’s structure was based on the big data analytics framework that I laid out last year as well as the framework that my colleague Mark Smith put forth on the four types of discovery technology available. These frameworks view big data and analytics as part of a major change that includes a movement from designed data to organic data, the bringing together of analytics and data in a single system, and a corresponding move away from the technology-oriented three Vs of big data to the business-oriented three Ws of data. Our big data analytics research confirms these trends but also reveals some important subtleties and new findings with respect to this important emerging market. I want to share three of the most interesting and even surprising results and their implications for the big data analytics market.

First, we note that communication and knowledge sharing is a primary benefit of big data analytics initiatives, but it is a latent one. Among organizations planning to deploy big data analytics, the benefits most often anticipated are faster response to opportunities and threats (57%), improving efficiency (57%), improving the customer experience (48%) and gaining competitive advantage (43%). However, once a big data analytics system has moved into production, the benefits most often mentioned as achieved are better communication and knowledge sharing (51%), gaining competitive advantage (51%), improved efficiency in business processes (49%) and improved customer experience and satisfaction (46%). (The chart shows rankings of first choices as most important.) Although the last three of these benefits are predictable, it’s noteworthy that the benefit of communication and knowledge sharing, while not a priority before deployment, becomes one of the two most often cited later.

As for the implications, in our view, one reason why communication and knowledge sharing are more often seen as a key benefit after deployment rather than before is that agreement on big data analytics terminology is often lacking within organizations. Participants from fewer than half (44%) of organizations said that the people making business technology decisions mostly agree or completely agree on the meaning of big data analytics, while the same number said there are many different opinions about its meaning. To address this particular challenge, companies should pay more attention to setting up internal communication structures prior to the launch of a big data analytics project, and we expect collaborative technologies to play a larger role in these initiatives going forward.

A second finding of our research is that integration of distributed data is the most important enabler of big data analytics. Asked the meaning of big data analytics in terms of capabilities, the largest percentage (76%) of participants said it involves analyzing data from all sources rather than just one, while for 55 percent it means analyzing all of the data rather than just a sample of it. (We allowed multiple responses.) More than half (56%) told us they view big data as finding patterns in large and diverse data sets in Hadoop, which indicates the continuing influence of this original big data technology. A second tier of percentages emphasizes timeliness as an aspect of big data: doing real-time processing on streams of data (44%), visualizing large structured data sets in seconds (40%) and doing real-time scoring against a database record (36%).

The implications here are that the primary characteristic of big data analytics technology is the ability to analyze data from many data sources. This shows that companies today are focused on bringing together multiple information sources and secondarily being able to process all data rather than just a sample, as well as being able to do machine learning on especially large data sets. Fast processing and the ability to analyze streams of data are relegated to third position in these priorities. That suggests that the so-called three Vs of big data are confusing the discussion by prioritizing volume, velocity and variety all at once. For companies engaged in big data analytics today, sourcing and integration of various data sources in an expedient manner is the top priority, followed by the ideas of size and then speed of arrival of data.

Third, we found that usage is not relegated to particular industries, certain types of companies or certain functional areas. Of the 25 uses for big data analytics that participants said they are personally involved with, three of the four most often mentioned involve customers and sales: enabling cross-selling and up-selling (38%), understanding the customer better (32%) and optimizing pricing (28%). Meanwhile, optimizing IT operations ranked fifth (24%), though it was most often chosen by those in IT roles (76%). What is particularly fascinating, however, is that 17 of the 25 use cases were named by more than 10 percent of participants, which indicates many uses for big data analytics.

The primary implication of this finding is that big data analytics is not following the famous technology adoption curves outlined in books such as Geoffrey Moore’s seminal work, “Crossing the Chasm.” That is, companies are not following a narrowly defined path that solves only one particular problem. Instead, they are creatively deploying technological innovations en route to a diverse set of outcomes. And this is occurring across organizational functions and industries, including conservative ones, which conflicts with conventional wisdom. For this reason, companies are more often looking across industries and functional disciplines as part of their due diligence on big data analytics to come up with unique applications that may yield competitive advantage or organizational efficiencies.

In summary, it has been difficult for companies to define what big data analytics actually means and how to prioritize their investments accordingly. Research such as ours can help organizations address this issue. While the above discussion outlines a few of the interesting findings of this research, it also yields many more insights, related to aspects as diverse as big data in the cloud, sandbox environments, embedded predictive analytics, the most important data sources in use, and the challenges of choosing an architecture and deploying big data analytic products. For a copy of the executive summary, download it directly from the Ventana Research community.

Regards,

Tony Cosentino

VP and Research Director

Ventana Research recently completed the most comprehensive evaluation of mobile business intelligence products and vendors available anywhere today. The evaluation assessed 16 technology vendors’ offerings on smartphones and tablets across Apple, Google Android, Microsoft Surface and RIM BlackBerry devices in seven key categories: usability, manageability, reliability, capability, adaptability, vendor validation, and TCO and ROI. The result is our Value Index for Mobile Business Intelligence in 2014. The analysis shows that the top supplier is MicroStrategy, which qualifies as a Hot Vendor and is followed by 10 other Hot Vendors: IBM, SAP, QlikTech, Information Builders, Yellowfin, Tableau Software, Roambi, SAS, Oracle and arcplan.

Our expertise, hands-on experience and the buyer research from our benchmark research on next-generation business intelligence and on information optimization informed our product evaluations in this new Value Index. The research examined business intelligence on mobile technology to determine organizations’ current and planned use and the capabilities required for successful deployment.

What we found was wide interest in mobile business intelligence and a desire to improve the use of information in 40 percent of organizations, though adoption is less pervasive than interest. Fewer than half of organizations currently access BI capabilities on mobile devices, but nearly three-quarters (71%) expect their mobile workforce to be able to access BI capabilities in the next 12 months. The research also shows strong executive support: Nearly half of executives said that mobility is very important to their BI processes.

Ease of access and use are important criteria in this Value Index because the largest percentage of organizations identified usability as an important factor in evaluations of mobile business intelligence applications. This is an emphasis that we find in most of our research, and in this case it also may reflect users’ experience with first-generation business intelligence on mobile devices; not all of those applications were optimized for touch-screen interfaces or designed to support gestures. It is clear that today’s mobile workforce requires the ability to access and analyze data simply and straightforwardly, using an intuitive interface.

The top five companies’ products in our 2014 Mobile Business Intelligence Value Index all provide strong user experiences and functionality. MicroStrategy stood out across the board, finishing first in five categories, most notably user experience, mobile application development and presentation of information. IBM, the second-place finisher, has made significant progress in mobile BI with six releases in the past year, adding support for Android, advanced security features and an extensible visualization library. SAP’s steady support for mobile access to the SAP BusinessObjects platform and to SAP Lumira, along with its integrated mobile device management software, helped produce high scores in various categories and put it in third place. QlikTech’s flexible offline deployment capabilities for the iPad and its high ranking in the assurance-related category of TCO and ROI secured it the fourth spot. Information Builders, whose latest release of WebFOCUS renders content directly in HTML5 and includes its Active Technologies and Mobile Faves, delivers strong mobile capabilities and rounds out the top five. Other noteworthy innovations in mobile BI include Yellowfin’s collaboration technology and Roambi’s use of storyboarding in its Flow application.

Although there is some commonality in how vendors provide mobile access to data, there are many differences among their offerings that can make one a better fit than another for an organization’s particular needs. For example, companies that want their mobile workforce to be able to engage in root-cause discovery analysis may prefer tools from Tableau and QlikTech. For large companies looking for a custom application approach, MicroStrategy or Roambi may be good choices, while others looking for streamlined collaboration on mobile devices may prefer Yellowfin. Many companies may base the decision on mobile business intelligence on which vendor they currently have installed. Customers with large implementations from IBM, SAP or Information Builders will be reassured to find that these companies have made mobility a critical focus.

To learn more about this research and to download a free executive summary, please visit http://www.ventanaresearch.com/bivalueindex/.

Regards,

Tony Cosentino

Vice President and Research Director
