PentahoWorld, the first user conference for the 10-year-old data integration and business intelligence supplier, attracted more than 400 customers in roles ranging from IT and database professionals to business analysts and end users. The diversity of the crowd reflects Pentaho's broad product portfolio, which covers the integration aspects of big data analytics with the Pentaho Data Integration tools and the front-end and visualization aspects with Pentaho Business Analytics. In essence the portfolio provides end-to-end data-to-analytics capabilities through what the company introduced as Big Data Orchestration, which brings governed data delivery and a streamlined data refinery together on one platform.

Pentaho has made progress in business over the past year, picking up Fortune 1000 clients and moving from providing analytics to midsize companies to serving major enterprises such as Halliburton, Lufthansa and NASDAQ. One reason for this success is Pentaho's ability to integrate large-scale data from multiple sources including enterprise data warehouses, Hadoop and other NoSQL approaches. Our research into big data integration shows that Hadoop is a key technology that 44 percent of organizations are likely to use, but it is just one option in the enterprise data environment. A second key for Pentaho has been the embeddable nature of its approach, which enables companies, especially those selling cloud-based software as a service (SaaS), to use analytics to gain competitive advantage by placing its tools within their applications. For more detail on Pentaho's analytics and business intelligence tools please see my previous analytic perspective.

A key advance for the company over the past year has been the development and refinement of what the company calls big data blueprints. These are general use cases in such areas as ETL offloading and customer analytics. Each approach includes design patterns for ETL and analytics that work with high-performance analytic databases including NoSQL variants such as Mongo and Cassandra.

The blueprint concept is important for several reasons. First, it helps Pentaho focus on specific market needs. Second, it shows customers and partners processes that enable them to get immediate return on the technology investment. The same research referenced above shows that organizations manage their information and technology better than their people and processes; to realize full value from spending on new technology, they need to pay more attention to how the technology fits with these cultural aspects.

At the user conference, the company announced release 5.2 of its core business analytics products and featured its Governed Data Delivery concept and Streamlined Data Refinery. The Streamlined Data Refinery provides a process for business analysts to access the already integrated data provided through PDI and create data models on the fly. The advantage is that this is not a technical task, and the business analyst does not have to understand the underlying metadata or the data structures. The user chooses the dimensions of the analysis using menus that offer multiple combinations to be chosen in an ad hoc manner. Then the Streamlined Data Refinery automatically generates a data cube that is available for fast querying in an analytic database. Currently, Pentaho supports only the HP Vertica database, but its roadmap promises to add high-performance databases from other suppliers. The entire process can take only a few minutes and is much more flexible and dynamic than asking IT to rebuild a data model every time a new question is asked.
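The cube-generation step can be pictured as a group-by rollup over already-blended data. This is a minimal illustrative sketch, not Pentaho's actual implementation; the field names and the `build_cube` helper are invented for the example:

```python
from collections import defaultdict

def build_cube(rows, dimensions, measure):
    """Aggregate a measure across user-chosen dimensions,
    roughly analogous to generating an ad hoc data cube."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dimensions)  # one cell per dimension combination
        cube[key] += row[measure]
    return dict(cube)

# Hypothetical blended records produced by an upstream integration step
rows = [
    {"region": "EMEA", "product": "A", "revenue": 100.0},
    {"region": "EMEA", "product": "B", "revenue": 50.0},
    {"region": "AMER", "product": "A", "revenue": 75.0},
]

# The analyst picks dimensions from a menu; the cube is built on the fly
cube = build_cube(rows, dimensions=["region"], measure="revenue")
# -> {("EMEA",): 150.0, ("AMER",): 75.0}
```

Choosing a different dimension set simply rebuilds the rollup, which is the flexibility the Streamlined Data Refinery aims for.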

While Pentaho Data Integration enables users to bring together all available data and integrate it to find new insights, Streamlined Data Refinery gives business users direct access to the blended data. In this way they can explore data dynamically without involving IT. The other important aspect is that it easily provides the lineage of the data. Internal or external auditors often need to understand the nature of the data and the integration, which data lineage supports. Such a feature should benefit all types of businesses but especially those in regulated industries. This approach addresses the two top needs of business end users, which according to our benchmark research into information optimization are to drill into data (37%) and search for specific information (36%).

Another advance is Pentaho 5.2's support for Kerberos security on Cloudera, Hortonworks and MapR. Cloudera, currently the largest Hadoop distribution, and Hortonworks, which is planning to raise capital via a public offering, hold the lion's share of the commercial Hadoop market. Kerberos puts a layer of authentication security between the Pentaho Data Integration tool and the Hadoop data. This helps address security concerns, which have increased dramatically over the past year after major breaches at retailers, banks and government institutions.

These announcements show results of Pentaho’s enterprise-centric customer strategy as well as the company’s investment in senior leadership. Christopher Dziekan, the new chief product officer, presented a three-year roadmap that focuses on data access, governance and data integration. It is good to see the company put its stake in the ground with a well-formed vision of the big data market. Given the speed at which the market is changing and the necessity for Pentaho to consider the needs of its open source community, it will be interesting to see how the company adjusts the roadmap going forward.

For enterprises grappling with big data integration and trying to give business users access to new information sources, Pentaho's Streamlined Data Refinery deserves a look. For both enterprises and ISVs that want to apply integration and analytics in the context of another application, Pentaho's REST-based APIs allow embedding of end-to-end analytic capabilities. Together with the big data blueprints discussed above, these capabilities enable Pentaho to deliver a targeted yet flexible approach to big data.

Regards,

Tony Cosentino

VP and Research Director

At a conference of more than 3,500 users, Splunk executives showed off their company's latest tools. Splunk makes software for discovering, monitoring and analyzing machine data, which is often considered data exhaust since it is a by-product of computing processes and applications. But machine data is essential to a smoothly running technology infrastructure that supports business processes. One advantage is that because machine data is not recorded by end users, it is less subject to input error. Splunk has grown rapidly by solving fundamental problems associated with the complexities of information technology and by challenging assumptions in IT systems and network management, a space increasingly referred to as big data analytics. The two main and related assumptions it challenges are that different types of IT systems should be managed separately and that data should be modeled prior to recording it. Clint Sharp, Splunk's director of product marketing, pointed out that network and system data can come from several sources and argued that point-solution tools and a "model first" approach do not work for big data and a question-and-answer paradigm. Our research into operational intelligence finds that IT systems are the most important information source in almost two-thirds (62%) of organizations. Splunk used the conference to show how it has brought the business trends of mobility, cloud deployment and security to these data management innovations.
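The "index first, model at search time" idea Sharp described can be sketched in a few lines: raw lines are stored as-is, and fields are extracted only when a question is asked. The log format, field names and `search` helper below are hypothetical illustrations, not Splunk's actual query language:

```python
import re

# Raw machine data is stored unmodeled, exactly as emitted
LOG_LINES = [
    "2014-10-08 12:00:01 host=web01 status=200 latency_ms=35",
    "2014-10-08 12:00:02 host=web02 status=500 latency_ms=120",
]

def search(lines, **filters):
    """Extract key=value fields at query time ("schema on read")
    instead of defining a data model before indexing."""
    results = []
    for line in lines:
        fields = dict(re.findall(r"(\w+)=(\S+)", line))  # fields discovered per query
        if all(fields.get(k) == v for k, v in filters.items()):
            results.append(fields)
    return results

errors = search(LOG_LINES, status="500")
# -> one matching event, with host == "web02"
```

Because no schema is fixed up front, a new question only requires a new extraction pattern, not a remodeling of the stored data.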

Presenters from major customer companies demonstrated how they work with Splunk Enterprise. For example, according to Michael Connor, senior platform architect for Coca-Cola, bringing all the company's data into Splunk allowed the IT department to reduce trouble tickets by 80 percent and operating costs by 40 percent. Beyond asserting the core value of streamlining IT operations and the ability to quickly provision system resources, Connor discussed other uses for data derived from the Splunk product. Coca-Cola IT used a free community add-on to deliver easy-to-use dashboards for the security team. He showed how channel managers compare different vending environments in ways they had never done before. They also can conduct online ethnographic studies to better understand behavior patterns and serve different groups. For Coca-Cola, the key to success for the application was to bring data from various platforms in the organization into one data platform. This challenge, he said, has more to do with people and processes than technology, since many parts of an organization are protective of their data, in effect forming what he called "data cartels." This situation is not uncommon. Our research into information optimization shows that organizations need these so-called softer disciplines to catch up with their capabilities in technology and information to realize full value from data and analytics initiatives.

In keeping up with trends, Splunk is making advances in mobility. One is MINT for monitoring mobile devices. With the company's acquisition of BugSense as a foundation, Splunk has built an extension of its core platform that consumes and indexes application and other machine data from mobile devices. The company is offering the MINT Express version to developers so they can build the operational service into their applications. Similar to the core product, MINT has the ability to track transactions, network latency and crashes throughout the IT stack. It can help application developers quickly solve user experience issues by understanding root causes and determining responsibility. For instance, MINT Express can answer questions such as these: Is it an application issue or a carrier issue? Is it a bad feature or a system problem? After it is applied, end-user customers get a better digital experience, which results in more time spent with the application and increased customer loyalty in a mobile environment where the cost of switching is low. Splunk also offers MINT Enterprise, which allows users to link and cross-reference data in Splunk Enterprise. The ability to instrument data in a mobile environment, draw a relationship with the enterprise data and display key operational variables is critical to serving and satisfying consumers. By extending this capability into the mobile sphere, Splunk MINT delivers value for corporate IT operations as well as the new breed of cloud software providers. However, Splunk risks stepping on its partners' toes as it takes advantage of opportunities such as mobility. In my estimation, the risk is worth taking given that mobility is a systemic change that represents enormous opportunity. Our research into business technology innovation shows mobility in a virtual tie with collaboration for the second-most important innovation priority for companies today.

Cloud computing is another major shift that the company is prioritizing. Praveen Rangnath, director of cloud product marketing, said that Splunk Cloud enables the company to deliver 100 percent on service level agreements through fail-over capabilities across AWS availability zones, redundant operations across indexers and search heads, and by using Splunk on Splunk itself. Perhaps the most important capability of the cloud product is its integration of enterprise and on-demand systems. This capability allows a single view and queries across multiple data sources no matter where they physically reside. Coupled with Splunk's abilities to ingest data from various NoSQL systems – such as Mongo, Cassandra, Accumulo, Amazon's Elastic MapReduce, Amazon S3 and even mainframes with the Ironstream crawler – its hybrid search capability is unique. The company's significant push into the cloud is reflected by both a 33 percent reduction in price and its continued investment in the platform. According to our research into information optimization, one of the biggest challenges with big data is simplification of data access; as data sources increase, easy access becomes more important. More than 92 percent of organizations that have 16 to 20 data sources rated information simplification very important. As data proliferates both on-premises and in the cloud, Splunk's software abstracts users from the technical complexities of integrating and accessing the hybrid environment. (Exploring this and related issues, our upcoming benchmark research into data and analytics in the cloud will examine trends in business intelligence and analytics related to cloud computing.)

Usability is another key consideration: In our research on next-generation business intelligence, nearly two-thirds (63%) of organizations said it is an important evaluation criterion, more than any other one. At the user conference Divanny Lamas, senior manager of product management, discussed new features aimed at the less sophisticated Splunk user. Advanced Feature Extractor enables users to extract fields in a streamlined fashion that does not require them to write an expression. Instant Pivot enables easy access to a library of panels and dashboards that allows end users to pivot and visually explore data. Event Pattern Detection clusters patterns in the data to make different product usage metrics and issues impacting downtime easier to resolve. Each of these advances represents progress in broader usability and organizational appeal. While Splunk continues to make its data accessible to business users, gaining broader adoption is still an uphill battle because much of the Splunk data is technical in nature. The current capabilities address the technologically sophisticated knowledge worker or the data analyst, while a library of plug-ins allows more line-of-business end users to perform visualization. (For more on the analytic user personas that matter in the organization and what they need to be successful, please see my analysis.)

Splunk is building an impressive platform for collecting and analyzing data across the organization. The question from the business analytics perspective is whether the data can be modeled in ways that easily represent each organization's unique business challenges. Splunk provides search capabilities for IT data by default, but when other data sources need to be brought in for more advanced reporting and correlation, it requires the data to be normalized, categorized and parsed. Currently, business users apply various data models and frameworks from major IT vendors as well as various agencies and data brokers. This dispersion could provide an opportunity for Splunk to provide a unified platform; the more data businesses ingest, the more likely they will rely on such a platform. Splunk's Common Information Model provides a metadata framework using key-value pair representation similar to what other providers of cloud analytic applications are doing. When we consider the programmable nature of the platform including RESTful APIs and various SDKs, Hunk's streamlined access to Hadoop and other NoSQL sources, Splunk DB Connect for relational sources, the Splunk Cloud hybrid access model and the instrumentation of mobile data in MINT, the expansive platform idea seems plausible.
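A key-value metadata framework of this kind can be sketched as a field-name mapping that normalizes heterogeneous sources onto shared keys. The sources, field names and `normalize` helper below are hypothetical illustrations, not the actual Common Information Model:

```python
# Hypothetical mappings from two source formats onto common model keys
FIELD_MAP = {
    "firewall": {"src": "src_ip", "dst": "dest_ip", "act": "action"},
    "proxy":    {"client": "src_ip", "target": "dest_ip", "verdict": "action"},
}

def normalize(source, event):
    """Rename source-specific keys to the shared schema so that
    searches and reports work across sources."""
    mapping = FIELD_MAP[source]
    return {mapping.get(key, key): value for key, value in event.items()}

a = normalize("firewall", {"src": "10.0.0.1", "act": "allow"})
b = normalize("proxy", {"client": "10.0.0.2", "verdict": "deny"})
# Both events now expose "src_ip" and "action" regardless of origin
```

A single report over `src_ip` and `action` then works on events from either source, which is the point of a common model.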

A complicating factor as to whether Splunk will become such a platform for operational intelligence and big data analytics is the Internet of Things (IoT), which collects data from various devices. Massive amounts of sensor data already are moving through the Internet, but IoT approaches and service architectures are works in progress. Currently, many of these architectures do not communicate with others. Given Splunk's focus on machine data, which is a key type of input for big data analytics in 42 percent of organizations according to our research, IoT appears to be a natural fit, as it generates event-centered data, a type of input for big data analytics in 48 percent of organizations. There is some debate about whether Splunk is a true event processing engine, but that depends on how the category is defined. Log messages, its specialty, are not events per se but rather are data related to something that has happened in an IT infrastructure. Once correlated, this data points directly to something of significance, including events that can be acted upon. If such a correlation triggers a system action, and that action is taken in time to solve the problem, then the data provides value and it should not matter if the system is acting in real time or near real time. In this way, the data itself is Splunk's advantage. To be successful in becoming a broader data platform, the company will need to advance its Common Information Model, continue to emphasize the unique value of machine data, build its developer and partner ecosystem, and encourage customers to push the envelope and develop new use cases.

For organizations considering Splunk for the first time, IT operations, developer operations, security, fraud management and compliance management are obvious areas to evaluate. Splunk’s core value is that it simplifies administration, reduces IT costs and can reduce risk through pattern recognition and anomaly detection. Each of these areas can deliver value immediately. For those with a current Splunk implementation, we suggest examining use cases related to business analytics. Specifically, comparative analysis and analysis of root causes, online ethnography and feature optimization in the context of the user experience can all deliver value. As ever more data comes into their systems, companies also may find it reasonable to consider Splunk in other new ways like big data analytics and operational intelligence.

Regards,

Tony Cosentino

VP and Research Director

Qlik was an early pioneer in developing a substantial market for a visual discovery tool that enables end users to easily access and manipulate analytics and data. Its QlikView application uses an associative experience that takes an in-memory, correlation-based approach to present a simpler design and user experience for analytics than previous tools. Driven by sales of QlikView, the company's revenue has grown to more than $500 million, and, founded in Sweden, it now has a global presence.

At its annual analyst event in New York the business intelligence and analytics vendor discussed recent product developments, in particular the release of Qlik Sense. It is a drag-and-drop visual analytics tool targeted at business users but scalable enough for enterprise use. Its aim is to give business users a simplified visual analytic experience that takes advantage of modern cloud technologies. Such a user experience is important; our benchmark research into next-generation business intelligence shows that usability is an important buying criterion for nearly two out of three (63%) companies. A couple of months ago, Qlik introduced Qlik Sense for desktop systems, and at the analyst event it announced general availability of the cloud and server editions.

According to our research into business technology innovation, analytics is the top initiative for new technology: 39 percent of organizations ranked it their number-one priority. Analytics includes exploratory and confirmatory approaches to analysis. Ventana Research refers to exploratory analytics as analytic discovery and segments it into four categories that my colleague Mark Smith has articulated. Qlik's products belong in the analytic discovery category. Users can investigate data sets with the tool in an intuitive and visual manner, often for root cause analysis and decision support. This software market is relatively young, and competing companies are evolving and redesigning their products to suit changing tastes. Tableau, one of Qlik's primary competitors, which I wrote about recently, is adapting its current platform to developments in hardware and in-memory processing, focusing on usability and opening up its APIs. Others have recently made their first moves into the market for visual discovery applications, including Information Builders and MicroStrategy. Companies such as Actuate, IBM, SAP, SAS and Tibco are focused on incorporating more advanced analytics in their discovery tools. For buyers, this competitive and fragmented market creates a challenge when comparing analytic discovery offerings.

A key differentiator is Qlik Sense's new modern architecture, which is designed for cloud-based deployment and embedding in other applications for specialized use. Its analytic engine plugs into a range of Web services. For instance, the Qlik Sense API enables the analytic engine to call to a data set on the fly and allow the application to manipulate data in the context of a business process. An entire table can be delivered to node.js, which extends the JavaScript API to offer server-side features and enables the Qlik Sense engine to take on an almost unlimited number of real-time connections by not blocking input and output. Previously developers could write PHP script and pipe SQL to get the data, and the resulting application was viable but complex to build and maintain. Now all they need is JavaScript and HTML. The Qlik Sense architecture abstracts the complexity and allows JavaScript developers to make use of complex constructs without intricate knowledge of the database. The new architecture can decouple the Qlik engine from the visualizations themselves, so Web developers can define expressions and dimensions without going into the complexities of the server-side architecture. Furthermore, by decoupling the services, developers gain access to open source visualization technologies such as d3.js. Cloud-based business intelligence and extensible analytics are becoming a hot topic; I have written about this, including a glimpse of our newly announced benchmark research on the next generation of data and analytics in the cloud. From a business user perspective, these types of architectural changes may not mean much, but for developers, OEMs and UX design teams, they allow much faster time to value through a simpler component-based approach to utilizing the Qlik analytic engine and building visualizations.

The modern architecture of Qlik Sense, together with the company's ecosystem of more than 1,000 partners and a professional services organization that has completed more than 2,700 consulting engagements, gives Qlik a competitive position. The service partner relationships, including those with major systems integrators, are key to the company's future since analytics is as much about change management as technology. Our research in analytics consistently shows that people and processes lag technology and information in performance with analytics. Furthermore, in our benchmark research into big data analytics, the benefits most often mentioned as achieved are better communication and knowledge sharing (24%), better management and alignment of business goals (18%), and gaining competitive advantage (17%).

As tested on my desktop, Qlik Sense shows an intuitive interface with drag-and-drop capabilities for building analysis. Formulas are easy to incorporate as new measures, and the palette offers a variety of visualization options which automatically fit to the screen. The integration with QlikView is straightforward in that a data model from QlikView can be saved seamlessly and opened intact in Qlik Sense. The storyboard function allows multiple visualizations to be built into narratives and annotations to be added, including linkages with data. For instance, annotations can be added to specific inflection points in a trend line or outliers that may need explanation. Since the approach is all HTML5-based, the visualizations are ready for deployment to mobile devices and responsive to various screen sizes including newer smartphones, tablets and the new class of so-called phablets. In the evaluation of vendors in our Mobile Business Intelligence Value Index Qlik ranked fourth overall.

In the software business, of course, technology advances alone don't guarantee success. Qlik has struggled to clarify the positioning of its next-generation product: Qlik Sense is not a replacement for QlikView. QlikView users are passionate about keeping their existing tool because they have already designed dashboards and calculations with it. Vendors should not underestimate user loyalty and adoption. Therefore Qlik now promises to support both products for as long as the market continues to demand them. The majority of R&D investment will go into Qlik Sense as developers focus on surpassing the capabilities of QlikView. For now, the company will follow a bifurcated strategy in which the tools work together to meet the needs of various organizational personas. To me, this is the right strategy. There is no issue in being a two-product company, and the revised positioning of Qlik Sense complements QlikView on both the self-service side and the developer side. Qlik Sense is not yet as mature a product as QlikView, but from a business user's perspective it is a simple and effective analysis tool for exploring data and building different data views. It is simpler because users do not need to script the data in order to create the specific views they deem necessary. As the product matures, I expect it to become more than an end user's visual analysis tool since the capabilities of Qlik Sense lend themselves to web-scale approaches. Over time, it will be interesting to see how the company harmonizes the two products and how quickly customers adopt Qlik Sense as a stand-alone tool.

For companies already using QlikView, Qlik Sense is an important addition to the portfolio. It will allow business users to become more engaged in exploring data and sharing ideas. Even for those not using QlikView, with its modern architecture and open approach to analytics, Qlik Sense can help future-proof an organization's current business intelligence architecture. For those considering Qlik for the first time, the choice may be whether to bring in one or both products. Given the proven approach of QlikView, in the near term a combination approach may be the better solution in some organizations. Partners, content providers and ISVs should consider Qlik Branch, which provides resources for embedding Qlik Sense directly into applications. The site provides developer tools, community efforts such as d3.js integrations and synchronization with GitHub for sharing and branching of designs. For every class of user, Qlik Sense can be downloaded for free and tested directly on the desktop. Qlik has made significant strides with Qlik Sense, and it is worth a look for anybody interested in the cutting edge of analytics and business intelligence.

Regards,

Tony Cosentino

VP and Research Director

Tableau Software introduced its latest advancements in analytics and business intelligence software, along with its future plans, to more than 5,000 attendees at its annual user conference in its home town of Seattle. The enthusiasm of the primarily millennial-age crowd reflected not only the success of the young company but also its aspirations. Both Tableau and the market for what Ventana Research calls visual and data discovery have experienced rapid growth that is likely to continue.

The company focuses on the mission of usability, which our benchmark research into next-generation business intelligence shows to be a top software buying criterion for more organizations (63%) than any other. Tableau introduced advances in this area including analytic ease of use, APIs, data preparation, storyboarding and mobile technology support as part of its user-centric product strategy. Without revealing specific timelines, executives said that the next major release, Tableau 9.0, likely will be available in the first half of 2015 as outlined by the CEO in his keynote.

Chief Development Officer and co-founder Chris Stolte showed upcoming ease-of-use features such as the addition of Excel-like functionality within workbooks. Users can type a formula directly into a field and use auto-completion or drag and drop to bring in other fields that are components of the calculation. The new calculation can be saved as a metric and easily added to the Tableau data model. Other announcements included table calculations, geographic search capabilities and radial and lasso selection on maps. The live demonstration between users onstage was seamless, created flows the audience could follow and showed impressive navigation capabilities.

Stolte also demonstrated upcoming statistical capabilities. Box plots have been available since Tableau 8.1, but now the capabilities have been extended for comparative analysis across groups and to create basic forecast models. The comparative descriptive analytics has been improved with drill-down features and calculations within tables. This is important since analysis between and within groups is necessary to use descriptive statistics to reveal business insights. Our research into big data analytics shows that some of the most important analytic approaches are descriptive in nature: Pivot tables (48%), classification or decision trees (39%) and clustering (37%) are the methods most widely used for big data analytics.

When it comes to predictive analytics, however, Tableau is still somewhat limited. Companies such as IBM, Information Builders, MicroStrategy, SAS and SAP have focused more resources on incorporating advanced analytics in their discovery tools; Tableau has to catch up in this area. Forecasting of basic trend lines is a first step, but if the tool is meant for model builders, then I’d like to see more families of curves and algorithms to fit different data sets such as seasonal variations. Business users, Tableau’s core audience, need automated modeling approaches that can evaluate the characteristics of the data and produce adequate models. How different stakeholders communicate around the statistical parameters and models is also unclear to me. Our research shows that summary statistics and model comparisons are important capabilities for administering and sharing predictive analytics. Overall, Tableau is making strides in both descriptive and predictive statistics and making this intuitive for users.
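Basic trend-line forecasting of the kind described amounts to fitting a least-squares line to the history and extrapolating it. The sketch below shows that idea in plain Python; it is an illustration of the technique, not Tableau's implementation, and the sales figures are invented:

```python
def fit_trend(y):
    """Ordinary least-squares fit of y = a + b*t for t = 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    # Slope: covariance of (t, y) divided by variance of t
    b = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y)) / \
        sum((t - t_mean) ** 2 for t in range(n))
    a = y_mean - b * t_mean
    return a, b

def forecast(y, steps):
    """Extend the fitted line `steps` periods past the end of the history."""
    a, b = fit_trend(y)
    n = len(y)
    return [a + b * (n + i) for i in range(steps)]

sales = [10.0, 12.0, 14.0, 16.0]  # perfectly linear history
print(forecast(sales, 2))          # -> [18.0, 20.0]
```

A straight-line fit like this cannot capture seasonal variation, which is exactly why the text argues for more families of curves and automated model selection.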

Presenters also introduced new data preparation capabilities for Excel imports, including the abilities to view delimiters, split columns and even un-pivot data. The software also has some ability to clean up spreadsheets, such as getting rid of headers and footers. Truly dirty data, such as survey data captured in spreadsheets or created with custom calculations and nesting, is not the target here. The data preparation capabilities can't compare with those provided by companies such as Alteryx, Informatica, Paxata, Pentaho, Tamr or Trifacta. However, it is useful to quickly upload and clean a basic Excel document and then visualize it in a single application. According to our benchmark research on information optimization, data preparation (47%) and checking data for quality and consistency (45%) are the primary tasks on which analysts spend their time.
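Un-pivoting, one of the preparation steps mentioned, turns a wide cross-tab into long rows that visualization tools can group and filter on. A minimal sketch of the operation, with invented column names:

```python
def unpivot(rows, id_col, value_cols):
    """Melt a wide cross-tab into long (id, variable, value) rows."""
    long_rows = []
    for row in rows:
        for col in value_cols:
            long_rows.append({id_col: row[id_col],
                              "variable": col,
                              "value": row[col]})
    return long_rows

# A spreadsheet row with one column per quarter
wide = [{"region": "East", "Q1": 100, "Q2": 120}]

long_form = unpivot(wide, "region", ["Q1", "Q2"])
# -> [{"region": "East", "variable": "Q1", "value": 100},
#     {"region": "East", "variable": "Q2", "value": 120}]
```

Once in long form, "quarter" becomes an ordinary dimension that can be dragged onto an axis, which is why un-pivoting matters for visual analysis.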

Storytelling, which Tableau calls Storypoints, is an exciting area of development for the company. Introduced last year, it enables users to build a sequential narrative that includes graphics and text. New developments enable the user to view thumbnails of different visualizations and easily pull them into the story. Better control over calculations, fonts, colors and text positioning was also introduced. While these changes may seem minor, they are important to this kind of application. A major reason that most analysts take their data out of an analytic tool and put it into PowerPoint is to have this type of control and ease of use. While PowerPoint remains dominant when it comes to communicating analytic results in business, a new category of tools is challenging Microsoft's preeminence in this area. Tableau Storypoints is one of the easiest to use in the market.

API advancements were discussed by Francois Ajenstat, senior director of product management, who suggested that in the future anything done on Tableau Server can be done through APIs. In this way different capabilities will be exposed so that other software can use them (Tableau visualizations, for example) within the context of their workflows. As well, Tableau has added REST APIs including JavaScript capabilities, which allow users to embed Tableau in applications to do such things as customize dashboards. The ability to visualize JSON data in Tableau is also a step forward since exploiting new data sources is the fastest way to gain business advantage from analytics. This capability was demonstrated using government data, which is commonly packaged in JSON format. As Tableau continues its API initiatives, we hope to see more advances in exposing APIs so Tableau can be integrated into governance workflows, which can be critical to enterprise implementations. APIs also can enable analytic workflow tools to more easily access the product so statistical modelers can understand the data prior to building models. While Tableau integrates on the back end with analytic tools such as Alteryx, the visual and data discovery aspects that must precede statistical model building are still a challenge. Having more open APIs will open up opportunities for Tableau’s customers and could broaden its partner ecosystem.
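Nested JSON of the sort government APIs return usually has to be flattened into a table before any visualization tool can chart it. The sketch below shows that step generically in Python with pandas; the field names are invented, and this says nothing about how Tableau does it internally.

```python
import pandas as pd

# A small nested JSON payload of the kind an open-data API might
# return (field names are hypothetical).
records = [
    {"city": "Austin", "metrics": {"population": 961855, "permits": 1200}},
    {"city": "Boston", "metrics": {"population": 675647, "permits": 950}},
]

# Flatten the nested "metrics" object into ordinary columns so the
# result is a flat table a visualization tool can consume.
flat = pd.json_normalize(records)
# Drop the "metrics." prefix that json_normalize adds to nested keys.
flat.columns = [c.split(".")[-1] for c in flat.columns]
```

After flattening, each nested attribute becomes its own column, so the table can be handed directly to a charting layer.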

The company made other enterprise announcements such as Kerberos security, easier content management, an ability to seamlessly integrate with Salesforce.com, and a parallelized approach to accessing very large data sets. These are all significant developments. As Tableau advances its enterprise vision and continues to expand on the edges of the organization, I expect it to compete in more enterprise deals. The challenge the company faces is still one of the enterprise data model. Unless it can displace the enterprise data models that companies have invested in over the years, Tableau faces an uphill battle against large incumbent BI vendors. Our research into information optimization shows that integration with security and user-access frameworks is the biggest technical challenge for optimizing information. For a deeper look at Tableau’s battle for the enterprise, please read my previous analysis.

Perhaps the most excitement from the audience came from the introduction of Project Elastic, a new mobile application with which users can automatically turn an email attachment in Excel into a visualization. The capability is native, so it works in offline mode and provides a fast and responsive experience. The new direction bifurcates Tableau’s mobile strategy, which heretofore was purely an HTML5 strategy introduced with Tableau 6.1. Tableau ranked seventh in our 2014 Mobile Business Intelligence Value Index.

Tableau has a keen vision of how it can carve out a place in the analytics market. Usability has a major role in building a following among users that should help it continue to grow. The Tableau cool factor won’t go unchallenged, however. Major players are introducing visual discovery products amid discussion about the need for more governance of data in the enterprise and cloud computing; Tableau will likely have to blend into the fabric of analytics and BI in organizations. To do so, the company will need to forge significant partnerships and open its platform for easy access.

Organizations considering visual discovery software in the context of business and IT should include Tableau. For enterprise implementations, buyers should ensure that Tableau can support the broader manageability and reliability requirements of larger-scale deployments. Visualization of data continues to be a critical method to understand the challenges of global business but should not be the only analytic approach taken for all types of users. Tableau is on the leading edge of visual discovery and should not be overlooked.

Regards,

Tony Cosentino

VP and Research Director

It’s widely agreed that cloud computing is a major technology innovation. Many companies use cloud-based systems for specific business functions such as customer service, sales, marketing, finance and human resources. More generally, however, analytics and business intelligence (BI) have not migrated to the cloud as quickly. But now cloud-based data and analytics products are becoming more common. This trend is most popular among technology companies, small and midsize businesses, and departments in larger ones, but there are examples of large companies moving their entire BI environments to the cloud. Our research into big data analytics shows that more than one-fourth of analytics initiatives for companies of all sizes are cloud-based.

Like other cloud-based applications, cloud analytics offers enhanced scalability and flexibility, affordability and IT staff optimization. Our research shows that in general the top benefits are lowered costs (for 40%), improved efficiency (39%) and better communication and knowledge sharing (34%). Using the cloud, organizations can use a sophisticated IT infrastructure without having to dedicate staff to install and support it. There is no need for comprehensive development and testing because the provider is responsible for maintaining and upgrading the application and the infrastructure. The cloud can also provide flexible infrastructure resources to support “sandbox” testing environments for advanced analytics deployments. Multitenant cloud deployments are more affordable because costs are shared across many companies. When used departmentally, application costs need not be capitalized but instead can be made operational expenditures. Capabilities can be put to use quickly, as vendors develop them, and updates need not disrupt use. Finally, some cloud-based interfaces are more intuitive for end users since they have been designed with the user experience in mind. Regarding cloud technology, our business technology innovation research finds that usability is the most important technology evaluation criterion (for 64% of participants), followed by reliability (54%) and capability.

For analytics and BI specifically, there are still issues holding back adoption. Our research finds that the primary reasons companies do not deploy cloud-based applications of any sort are security and compliance issues. For analytics and business intelligence, we can also include data-related activities as another reason, since cloud-based approaches often require data integration and transmission of sensitive data across an external network, along with a range of data preparation. Such issues are especially prevalent for companies that have legacy BI tools using data models that have been distributed across their divisions. Often these organizations have defined their business logic and metrics calculations within the context of these tools. Furthermore, these tools may be integrated with other core applications such as forecasting and planning. To re-architect such data models and metrics calculations is a challenge some companies are reluctant to undertake.

In addition, despite widespread use of some types of cloud-based systems, for nontechnical business people discussions of business intelligence in the cloud can be confusing, especially when they involve information integration, the types of analytics to be performed and where the analytic processes will run. The first generation of cloud applications focused on end-user processes related to the various lines of business and largely ignored the complexities inherent in information integration and analytics. Organizations can no longer ignore these complexities, since doing so exacerbates the challenge of fragmented systems and distributed data. Buyers and architects should understand the benefits of analytics in the cloud and weigh these benefits against the challenges described above.

Our upcoming benchmark research into data and analytics in the cloud will examine the current maturity of this market as well as opportunities and barriers to organizational adoption across lines of business and IT. It will evaluate cloud-based analytics in the context of trends such as big data, mobile technology and social collaboration as well as location intelligence and predictive analytics. It will consider how cloud computing enables these and other applications and identify leading indicators for adoption of cloud-based analytics. It also will examine how cloud deployment enables large-scale and streaming applications. For example, it will examine real-time processing of vast amounts of data from sensors and other semistructured data (often referred to as the Internet of Things).

It is an exciting time to be studying this particular market as companies consider moving platforms to the cloud. I look forward to receiving any qualified feedback as we move forward to start this important benchmark research. Please get in touch if you have an interest in this area of our research.

Regards,

Tony Cosentino

VP and Research Director

Our benchmark research consistently shows that business analytics is the most significant technology trend in business today and acquiring effective predictive analytics is organizations’ top priority for analytics. It enables them to look forward rather than backward and, participating organizations reported, leads to competitive advantage and operational efficiencies.

In our benchmark research on big data analytics, for example, 64 percent of organizations ranked predictive analytics as the most important analytics category for working with big data. Yet a majority indicated that they do not have enough experience in applying predictive analytics to business problems and lack training on the tools themselves.

Predictive analytics improves an organization’s ability to understand potential future outcomes of variables that matter. Its results enable an organization to decide correct courses of action in key areas of the business. Predictive analytics can enhance the people, process, information and technology components of an organization’s future performance.

In our most recent research on this topic, more than half (58%) of participants indicated that predictive analytics is very important to their organization, but only one in five said they are very satisfied with their use of those analytics. Furthermore, our research found that implementing predictive analytics would have a transformational impact in one-third of organizations and a significant positive impact in more than half of the rest.

In our new research project, The Next Generation of Predictive Analytics, we will revisit predictive analytics with an eye to determining how attitudes toward it have changed, along with its current and planned use and its importance in business. There are significant changes in this area, including where, how, why and when predictive analytics are applied. We expect to find changes not only in forecasting and analyzing customer churn but also in operational use at the front lines of the organization and in improving the analytic process itself. The research will also look at the progress of emerging statistical languages such as R and Python, which I have written about.

As does big data analytics, predictive analytics involves sourcing data, creating models, deploying them and managing them to understand when an analytic model has become stale and ought to be revised or replaced. It should be obvious that only the most technically advanced users will be familiar with all this, so to achieve broad adoption, predictive analytics products must mask the complexity and be easy to use. Our research will determine the extent to which usability and manageability are being built into product offerings.
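One simple way to operationalize the "stale model" judgment described above is to compare a model's recent accuracy against the accuracy measured at deployment time. The sketch below is a minimal, hypothetical illustration of that pattern; the threshold and function names are invented, not any vendor's API.

```python
def is_stale(baseline_accuracy: float,
             recent_accuracy: float,
             tolerance: float = 0.05) -> bool:
    """Flag a model for review when its recent accuracy has dropped
    more than `tolerance` below the deployment-time baseline."""
    return (baseline_accuracy - recent_accuracy) > tolerance

# A model deployed at 82% accuracy whose scored predictions last
# month hit only 74% exceeds the 5-point tolerance and gets flagged.
flag = is_stale(0.82, 0.74)
```

Real model management adds more signals, such as drift in the input data's distribution, but even this single check turns "manage the model" from a vague mandate into a measurable rule.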

The promise of predictive analytics, including competitive advantage (68%), new revenue opportunities (55%) and increased profitability (52%), is significant. But to realize the advantages of predictive analytics, companies must transform how they work. In terms of people and processes a more collaborative strategy may be necessary. Analysts need tools and skills in order to use predictive analytics effectively. A new generation of technology is also becoming available in which predictive analytics is easier to apply, use and deploy into line-of-business processes. This will help organizations significantly, as there are not enough data scientists and professionals specially trained in predictive analytics for organizations to find or afford to hire.

This benchmark research will look closely at the evolving use of predictive analytics to establish how it equips business to make decisions based on likely futures, not just the past.

Regards,

Tony Cosentino

VP & Research Director

Mobility continues to be a hot adoption area in business intelligence, according to our research across analytics and line-of-business departments. Nearly three-quarters (71%) of organizations said their mobile workforce would be able to access BI capabilities in the next 12 months, according to our next-generation mobile business intelligence research. Roambi, a provider of mobile business intelligence applications, has made important strides this year after moving to deploying its products in the cloud, an event that I covered previously. Roambi is rated one of the top providers of mobile business intelligence, or what we refer to as a “Hot Vendor,” according to our Value Index.

Earlier this year, Roambi announced a partnership with Box, which offers cloud-based data storage and file sharing. More recently it began to catch up with the market by announcing support for the Android mobile operating system. Roambi has focused on the mobile BI market from its inception, first by building on the Apple iPhone’s small screen and then progressing to support the majority of mobile devices in corporations today.

The Box partnership, announced in March, enables joint Box and Roambi customers to visualize and interact with data stored on the Box file sharing system. Specifically, users are able to sync Roambi Analytics, the company’s visualization engine, and Roambi Flow, its storyboarding capability, with Box. The integration allows fast importation of Box source files and the ability to create and share interactive reports through Roambi Analytics and to create digital magazines and content through Roambi Flow. Data can be automatically refreshed in real time via Box Sync.

This partnership serves both companies, since the coupled services provide users with expanded capabilities and little overlap. Box’s rapid growth is being driven by its ease of use, open platform and cloud approach to file sharing. This is a natural platform for Roambi to build on to expand its customer base. For Box, Roambi’s dynamic visualization and collaboration capabilities address its business customers’ needs and increase its market opportunities. In our benchmark research on information optimization, 83 percent of organizations said it is important to have components that provide interactive capabilities to support presentation of information.

Roambi also works with other application vendors in the CRM and ERP market to integrate their applications. The same research shows that CRM (for 45%) and ERP (38%) are important types of applications to integrate with others, especially in areas such as sales and customer service. Apple’s recent announcement of a new SDK should facilitate process integration between software systems so that, for example, users can access and navigate applications such as those from Box and Roambi and transactional applications such as CRM and ERP in a single workflow. This capability can provide further usability advantages for Roambi, which scored the highest rating in this area in our Mobile Business Intelligence Value Index.

Roambi announced its mobile BI support for the Google Android operating system, which runs across a wide range of smartphones and tablets. It had delayed development of its software on the Android platform, which required significant development resources and investment, as part of its strategy to maximize its business potential and relationship with Apple. The release is available in the Google Play store. The initial version will include four of Roambi’s most popular views: Card, Catalist, Layers and Superlist. As in its application for the Apple platform, security features for Android include Device Lockout, Remote Wipe and Application Passcode. Integration with Roambi’s partner Okta provides identity management services and gives access to any applications supported by Okta; integration includes Active Directory (AD) and Lightweight Directory Access Protocol (LDAP). While Roambi Flow will not be available on Android out of the gate, the company says it will become available by the end of 2014.

Current approaches to mobile business intelligence applications on the market include native, Web-based and hybrid (a combination of both). We compare these approaches in detail in the executive summary of our Mobile Business Intelligence Value Index report. With the new Android support, Roambi takes a unique approach to the hybrid architecture that bypasses the browser completely. No data is cached in a browser; instead, the data is loaded directly to the device and rendered natively through Roambi. From the user’s perspective, the benefit of this approach is performance, since interactivity does not rely on data traveling over a network. A second benefit is offline access to data, which is not available via non-native approaches. From the developer’s or information assembler’s perspective, testing across browsers is not needed, since there is no data caching and the experience is the same regardless of the browser in use.

Our next-generation business intelligence research shows that executives strongly support mobile BI: Nearly half of them said that mobility is very important to their BI processes. Executives also favor Apple over Android devices, which is likely one key reason for Apple’s dominance in the current business intelligence landscape. However, our research shows latent demand for Android devices in the business intelligence market, and given Android’s dominance in the consumer market as well as in Asia and other emerging markets, any company that seeks significant global market share will need to support both platforms. Our information optimization research found Google Android ranked second among smartphone platforms (24%), behind Apple iPhone (59%). Android support is therefore an important evolution for Roambi, addressing increasing demand in the market for Android devices.

Usability has become the most important evaluation criterion; in our information optimization benchmark research it was selected as most important by over half (58%) of organizations. Roambi ranked highest in usability in the Ventana Research Mobile Business Intelligence Value Index, though its overall score was hampered somewhat by the lack of support for Android. With Android support, the company now addresses the need for multiple devices and the so-called bring-your-own-device (BYOD) practices that organizations increasingly allow. By addressing this, as well as taking advantage of broader market developments such as the new Apple SDK, Roambi continues to address what organizations find important today. Usability of business intelligence systems is a top priority for 63% of companies, and even our big data analytics research finds a growing importance of mobile access in almost half of organizations (46%). Any company that wants to get user-friendly business intelligence into the hands of its mobile workers quickly and effectively should have Roambi in its consideration set.

Regards,

Tony Cosentino

VP and Research Director

At its annual industry analyst summit last month and in a more recent announcement of enterprise support for parallelizing the R language on its Aster Discovery Platform, Teradata showed that it is adapting to changes in database and analytics technologies. The presentations at the conference revealed a unified approach to data architectures and value propositions in a variety of uses including the Internet of Things, digital marketing and ETL offloading. In particular, the company provided updates on the state of its business as well as how the latest version of its database platform, Teradata 15.0, is addressing customers’ needs for big data. My colleague Mark Smith covered these announcements in depth. The introduction of scalable R support was discussed at the conference but not announced publicly until late last month.

Teradata now has a beta release of parallelized support for R, an open source programming language used significantly in universities and growing rapidly in enterprise use. One challenge is that R relies on a single-threaded, in-memory approach to analytics. Parallelization of R allows an algorithm to run on much larger data sets, since it is not limited to data stored in memory. For a broader discussion of the pros and cons of R and its evolution, see my analysis. Our benchmark research shows that organizations are counting on companies such as Teradata to provide a layer of abstraction that can simplify analytics on big data architectures. More than half (54%) of advanced analytics implementations are custom built, but in the future this percentage will go down to about one in three (36%).

Teradata’s R project has three parts. The first is the Teradata Aster R library, which supplies more than 100 prebuilt R functions that hide the complexity of the in-database implementation. The algorithms cover the most common big data analytic approaches in use today, which according to our big data analytics benchmark research are classification (used by 39% of organizations), clustering (37%), regression (35%), time series (32%) and affinity analysis (29%). Some use innovative approaches available in Aster such as Teradata’s patented nPath algorithm, which is useful in areas such as digital marketing. All of these functions will receive enterprise support from Teradata, likely through its professional services team.

The second part of the project involves the R parallel constructor. This component gives analysts and data scientists tools to build their own parallel algorithms based on the entire library of open source R algorithms. The framework follows the “split, apply and combine” paradigm, which is popular among the R community. While Teradata won’t support the algorithms themselves, this tool set is a key innovation that I have not yet seen from others in the market.
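The "split, apply and combine" paradigm itself is straightforward to sketch. The example below is a Python analogue, not Teradata's R constructor: it splits a data set into partitions, applies a per-partition computation in parallel workers, and combines the partial results into a global mean. The function names are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_stats(chunk):
    """Apply step: per-partition sum and count, which are enough to
    reconstruct a global mean in the combine step."""
    return sum(chunk), len(chunk)

def split_apply_combine(data, n_parts=4):
    # Split: cut the data into roughly equal partitions.
    size = (len(data) + n_parts - 1) // n_parts
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Apply: process the partitions in parallel workers.
    with ThreadPoolExecutor(max_workers=n_parts) as pool:
        partials = list(pool.map(partial_stats, chunks))
    # Combine: merge the partial sums and counts into one answer.
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

mean = split_apply_combine(list(range(1, 101)))
```

The key design point is that the apply step emits *combinable* partial results (sums and counts rather than means), which is what lets the same pattern scale out across database nodes rather than threads.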

Finally, the R engine has been integrated with Teradata’s SNAP integration framework. The framework provides unified access to multiple workload specific engines such as relational (SQL), graph (SQL-GR), MapReduce (SQL-MR) and statistics. This is critical since the ultimate value of analytics rests in the information itself. By tying together multiple systems, Teradata enables a variety of analytic approaches. More importantly, the data sources that can be merged into the analysis can deliver competitive advantages. For example, JSON integration, recently announced, delivers information from a plethora of connected devices and detailed Web data.

Teradata is participating in industry discussions about both data management and analytics. As Mark Smith discussed, its unified approach to data architecture addresses challenges brought on by competing big data platforms such as Hadoop and other NoSQL approaches, like the recently announced partnership with MongoDB supporting JSON integration. These platforms access new information sources and help companies use analytics to indirectly increase revenues, reduce costs and improve operational efficiency. Analytics applied to big data serve a variety of uses, most often cross-selling and up-selling (for 38% of organizations), better understanding of individual customers (32%) and optimizing price (30%) and IT operations (24%). Teradata is active in these areas and is working in multiple industries such as financial services, retail, healthcare, communications, government, energy and utilities.

Current Teradata customers should evaluate the company’s broader analytic and platform portfolio, not just the database appliances. In the fragmented and diverse big data market, Teradata is sorting through the chaos to provide a roadmap for organizations from the largest down to midsize ones. The Aster Discovery Platform can put power into the hands of analysts and statisticians who need not be data scientists. Business users from various departments, but especially high-level marketing groups that need to integrate multiple data sources for operational use, should take a close look at the Teradata Aster approach.

Regards,

Tony Cosentino

VP & Research Director

Information Builders announced two major new products at its recent annual user summit. The first was InfoDiscovery, a tool for ad hoc data analysis and visual discovery. The second was iWay Sentinel, which allows administrators to manage applications in a proactive and dynamic manner. As a privately held company, Information Builders is not a household name, but it is a major provider of highly scalable business intelligence (BI) and information management software to companies around the world.

This year’s announcements come one year after the release of WebFOCUS 8.0, which I wrote about at the time. Version 8.0 of this flagship BI product includes a significant overhaul of the underlying code base, and its biggest change is how it renders graphics, by putting the parameters of the HTML5 graph code directly inside the browser. This approach allows consistent representation of business intelligence graphics in multiple device environments, including mobile ones. Our research into information optimization shows that mobile technology improves business performance significantly in one out of three organizations. The graphics capability helped Information Builders earn the rating of Hot Vendor in our latest Value Index on Mobile Business Intelligence. Combining analytics with transactional systems in a mobile environment is an increasingly important trend. Our research shows that mobile business intelligence is advancing quickly: Nearly three-quarters (71%) of participants said they expect their mobile workforce to have BI capabilities in the next 12 months.

WebFOCUS InfoDiscovery represents the company’s new offering in the self-service analytics market. For visual discovery it enables users to extract, blend and prepare data from various data sources such as spreadsheets, company databases and third-party sources. Once the analytic data set is created, users can drill down into the information in an underlying columnar database. They can define queries as they go and examine trends, correlations and anomalies in the data set. Users given permission can publish visualizations from their desktop to the server for others to view or build upon. Visualization is another area of increasing importance for organizations. Our research on big data analytics found that data visualization has a number of benefits; the most often cited are faster analytics (by 49%), understanding content (48%), root-cause analysis (40%) and displaying multiple result sets at the same time (40%).

InfoDiscovery is Information Builders’ contender in the new breed of visual discovery products. The first generation of visual discovery products drew attention for their visual capabilities, ease of use and agility. More recently, established business intelligence vendors, of which Information Builders is one, have focused on developing visual discovery tools on the platforms of their well-known BI products, with the aim of taking advantage of their maturity. This second wave of tools is still behind the first in ease of use and visual analysis but is advancing rapidly, and it can provide better data governance, version control, auditing and user security. For instance, InfoDiscovery uses the same metadata as the enterprise platform WebFOCUS 8, so objects from both InfoDiscovery and other WebFOCUS applications can be configured in the same user portal. When a business user selects a filter, the data updates across all the components in the dashboard. The HTML5 rendering engine, new in WebFOCUS 8.0, makes the dashboard available to various devices including tablets and smartphones.

The other major announcement at the conference, iWay Sentinel, is a real-time application monitoring tool that helps administrators manage resources across distributed systems. It works with iWay Service Manager, which is used to manage application workflows. iWay Sentinel allows multiple instances of Service Manager to be viewed and managed from a single Web interface, and administrators can address bottlenecks in system resources both manually and automatically. The tool belongs in the category we call operational intelligence, and as our research finds, activity and event monitoring is the most important use (for 62% of research participants), followed by alerting and notification.

Sentinel is an important product in the Information Builders portfolio for a couple of reasons. Tactically speaking, it enables large organizations that are running multiple implementations of iWay Service Manager to manage infrastructure resources in a flexible and streamlined manner. From a strategic perspective, it ties the company to the emerging Internet of Things (IoT), which connects devices and real-time application workflows across a distributed environment. In such an environment, rules and process flows must be monitored and coordinated in real time. Information is passed along an enterprise service bus that enables synchronous interaction of various application components. The IoT has uses in multiple areas such as remote management of devices, telematics and fleet management, predictive maintenance, supply chain optimization, and utilities monitoring. The challenge is that application software is often complex and its processes are interdependent. For this reason, most approaches to the IoT have been proprietary in nature. Even so, Information Builders has a large number of clients in various industries, especially retail, that may be interested in its approach.

Information Builders continues to innovate amid a changing IT industry and growing business demand for analytics and data, building on its integration capabilities and its core business intelligence assets. The breadth and depth of its software portfolio enable the company to capitalize on these assets as demand shifts. For instance, temporal analysis is becoming more important; Information Builders has had that capability built into its products for years. In addition, the company’s core software has been hardened by years of meeting high-concurrency needs. Companies that have thousands of users need this type of scalable, battle-tested system.

Both iWay Sentinel and InfoDiscovery are currently in limited release and will be generally available later this year. Users of other Information Builders software should examine InfoDiscovery and assess its fit in their organizations. For business users it offers a self-service approach on the same platform as the WebFOCUS enterprise product, while IT staff can uphold their governance and system management responsibilities through visibility into and flexible control of the platform. For its part, iWay Sentinel should interest companies that manage multiple instances of information applications and use iWay Service Manager. In particular, retailers, transportation companies and healthcare companies exploring IoT uses should consider how it can help.

Information Builders is exploiting the value of data through what is called information optimization, finding continued growth in providing information applications that meet specific business and process needs. The company is also beginning to further exploit big data sources and mobile technology but will need to invest further to ensure it can address the full spectrum of new business needs. I continue to recommend that any company that must serve a large workforce and needs to blend data and analytics for business intelligence or information purposes consider Information Builders.

Regards,

Tony Cosentino

VP and Research Director
