The story of the first business computer could not be more British if it tried.
British tea shop operator and caterer J. Lyons & Co. put the Lyons Electronic Office (LEO) to work in 1951 running calculations on the cost of ingredients used in its bakery products. LEO soon assumed responsibility for running payroll for the company’s workforce (it also moonlighted for the United Kingdom Ministry of Defence, calculating missile trajectories). LEO is widely credited with running the world’s first routine office computing job, and in this analyst’s opinion it not only kicked off the modern era of business computing, but also marked the first step toward enterprises’ interest in analyzing data for value-adding, actionable insights.
The rationale for analyzing that data was, and still is, extremely simple: understanding the operations of a business through data leads to better decisions about that business. This was as true of handwritten accounting ledgers as it is of the massive ERP systems that run global corporations; it is just easier with a computer.
Business intelligence emerges, but fails to serve many
Fast forward to the late '70s and early '80s, when decision support systems emerged. This software aimed to help executives make better choices about their businesses using data, and arguably represented the first packaged analytical capability. More recognizable was the emergence of business intelligence (BI) in the 1990s, with major names such as Business Objects, Cognos, Hyperion (acquired by SAP, IBM, and Oracle, respectively), and SAS (still privately owned) delivering more standardized enterprise data analysis capabilities.
A problem soon emerged. Every decision-maker at every level can benefit from data-driven insights, yet BI software was only in the hands of high-level managers and executives. As a result, vast numbers of potential beneficiaries, including business analysts and sales operations professionals, increased the demand for data analysis software; this led to the rise of self-service BI in the late '90s and early 2000s. Vendors such as Tableau and Qlik best represent the self-service market. They put data analysis tools into the hands of new users, displaying results in a visual way, an approach widely credited with making data analysis more accessible to non-expert users.
The downside of self-service rapidly surfaced in what became known as shadow IT: the rise of technology solutions unsanctioned by IT, often purchased by line-of-business stakeholders looking to solve an unmet need without reference to standard IT buying practices. In the context of self-service BI, shadow IT’s primary impact has been damage to corporate data governance programs, usually by allowing users (in many cases unwittingly) to circumvent controls on data access.
Consolidating decades of data analysis capabilities with analytics platforms
From the big corporate BI platforms of the '90s to today, we have seen the emergence of analytics platforms. An analytics platform consolidates the functionality of each era of data analysis, serving the many use cases and personas that benefit from data-driven insights. These modern platforms provide critical capabilities that properly realize the original promise of data analysis: more insights for a greater number of people. They also fit neatly into the current world of IT, supporting cloud deployment options and emerging technologies such as AI.
Here are some of the defining features of an analytics platform:
- Data is everywhere, including in the cloud
For many, the cloud is a natural home for data storage and processing, given data’s ever-growing scale and the opportunities to analyze it for more insights. Companies take advantage of cloud economics, replacing capital expenses with operational expenses, a generally favored approach. The cloud also offers highly scalable storage and compute to address big data challenges, making it a logical place to run analytics. However, enterprises’ journeys to the cloud are still underway, progressing at different speeds depending on business size, industry, and IT estate, all in the context of varied compliance requirements. As a result, IT workloads and their underlying data run and reside in many different places: on-premises systems, public and private clouds, and sometimes hybrid cloud scenarios. An analytics platform must be able to work with data across all of these environments to reflect the reality of use cases. And, as most are keenly aware, solutions must accommodate a varied (and growing) range of compliance requirements such as the European Union’s GDPR and the California Consumer Privacy Act (CCPA).
- AI-powered features broaden accessibility
Practical applications of AI are felt in analytics platforms through features that tackle the accessibility and usability issues associated with traditional BI. These include the ability to automatically analyze data sets for relationships, surface insights to users without manual work, and auto-suggest best-fit visualizations for results (a toy illustration of that last idea follows this list). Each of these means the end user no longer has to undergo substantial technical training to access insights from data, opening up analytics to a mass of potential new beneficiaries.
- Have a conversation about analytics, with the machine
Analytics platforms are also changing the way users interact with analytics. This shift is driven by conversational user interfaces (UIs) that enable question-and-answer functionality, powered by natural language capabilities that let the user ask a question in plain language or have data and insights explained in layman’s terms. Intelligent bots that integrate with common office suites and internal communications software such as Microsoft Teams and Slack are a growing feature, providing ever-present access to insights in the apps where people work and collaborate.
- More accessible analytics requires more accessible data
Alongside the problem of technical skills blocking access to analytics is the issue of data availability. Traditional approaches to data management, such as heavyweight extract, transform, and load (ETL) processes paired with the knowledge needed to effectively cleanse and combine data sources into an analytics-friendly format, do not scale well for the current era of mass insight consumption (a brief sketch of this kind of preparation work follows the list). While many standalone data preparation solutions exist, I expect an acceleration of the trend to closely tie, or even fully integrate, these capabilities into analytics platforms.
- Embedding data-driven insights into other apps
I have long said the best technology is the one you didn’t realize you were using. This is as true of analytics as of any other technology, and embedding analytics into other applications puts the idea into practice. As with AI-powered features, embedded analytics enhance accessibility for non-expert users, who are served data-driven insights in the apps they use every day, without having to open the analytics solution or switch between apps. Customer relationship management software, for example, is most often one of the starting points for application vendors.
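To make the auto-suggest idea above concrete, here is a deliberately minimal sketch of how a platform might map the shape of a data set to a chart type. Everything here (the function name, the rules, the sample data) is hypothetical and far simpler than the statistical and ML-driven approaches commercial platforms actually use.

```python
# Toy heuristic for suggesting a chart type from a data set's column types.
# All names are hypothetical; real platforms go far beyond these rules.
import pandas as pd

def suggest_chart(df: pd.DataFrame) -> str:
    """Suggest a chart type from the mix of numeric and categorical columns."""
    numeric = df.select_dtypes(include="number").columns
    categorical = df.select_dtypes(exclude="number").columns

    if len(numeric) == 1 and len(categorical) == 1:
        return "bar chart"     # one measure split by one dimension
    if len(numeric) >= 2 and len(categorical) == 0:
        return "scatter plot"  # relationship between two measures
    if len(numeric) == 1 and len(categorical) == 0:
        return "histogram"     # distribution of a single measure
    return "table"             # fall back to a plain table

# Example: revenue by region suggests a bar chart.
sales = pd.DataFrame({"region": ["North", "South", "East"],
                      "revenue": [120_000, 95_000, 143_000]})
print(suggest_chart(sales))  # -> "bar chart"
```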
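Similarly, the data preparation work described above can be sketched in a few lines. This is a minimal, assumption-laden example of the extract-cleanse-combine-load cycle; the in-memory "sources" and field names are invented stand-ins for the real systems a data preparation tool would connect to.

```python
# A minimal extract-transform-load (ETL) sketch using pandas.
# The in-memory sources stand in for real systems (a CRM export and a
# billing database, say); every name here is hypothetical.
import pandas as pd

# Extract: pull records from two hypothetical source systems.
crm = pd.DataFrame({"customer_id": [1, 2, 3],
                    "name": [" Acme Ltd ", "Globex", "Initech "]})
billing = pd.DataFrame({"customer_id": [1, 2, 2, 3],
                        "invoice_total": [500.0, 250.0, 125.0, 900.0]})

# Transform: cleanse text fields and aggregate invoices per customer.
crm["name"] = crm["name"].str.strip()
revenue = billing.groupby("customer_id", as_index=False)["invoice_total"].sum()

# Combine the sources into a single analytics-friendly table.
customers = crm.merge(revenue, on="customer_id", how="left")

# Load: write the prepared table to wherever the analytics layer reads from.
customers.to_csv("customer_revenue.csv", index=False)
print(customers)
```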
Data science is the new frontier of analytics in the enterprise
Data science and data scientists are some of the most talked-about subjects in the technology business today. Though in short supply, data scientists represent the latest iteration of the power users who once dominated the use of analytical and BI solutions. These experts sit in centers of excellence and serve the wider enterprise with their rare capabilities. Some of those capabilities, such as predictive analytics, have been available for a long time. However, as discussed in a previous column, only recently have the volume of data and available computing power been sufficient to make use cases at scale a reality.
It is G2’s opinion that as analytics platforms continue to build out, data science capabilities will be the next big feature set to be incorporated. At first, these will be “light” capabilities, largely prebuilt with limited opportunity for customization. Ultimately, expect data science to become part of the enterprise analytics toolbox, likely enhanced with many of the defining features of analytics platforms discussed above. These platforms will deliver advanced analytical functions with the underlying complexity of the technology hidden away from the end user, as the sketch below illustrates.
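To illustrate how simple those advanced functions can be made to feel, here is a hedged sketch of a trivial predictive model. The data is synthetic and the forecast_next wrapper is hypothetical, but it shows how a platform could hide the machinery of model fitting behind a single "forecast" button or function call.

```python
# A deliberately simple predictive-analytics sketch: forecast next month's
# demand from past months with scikit-learn. Synthetic data, hypothetical API.
import numpy as np
from sklearn.linear_model import LinearRegression

def forecast_next(history: list[float]) -> float:
    """Fit a trend line to a monthly series and predict the next value."""
    months = np.arange(len(history)).reshape(-1, 1)  # 0, 1, 2, ... as features
    model = LinearRegression().fit(months, np.array(history))
    return float(model.predict([[len(history)]])[0])

# Twelve months of made-up unit sales with a gentle upward trend.
sales = [100, 104, 107, 112, 115, 121, 124, 130, 133, 139, 142, 148]
print(f"Forecast for month 13: {forecast_next(sales):.0f} units")
```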
As part of our regular monthly columns, G2 will be exploring some of these functional areas as they evolve within the context of analytics platforms. Make sure you sign up here to receive updates from our analyst team on this and other critical trends and developments in enterprise IT.

Tom Pringle
Tom is Vice President of Market Research at G2, and leads our analyst team. Tom's entire professional experience has been in information technology, where he has worked in both consulting and research roles. His personal research has focused on data and analytics technologies; more recently, this has led to a practical and philosophical interest in artificial intelligence and automation. Prior to G2, Tom held research, consulting, and management roles at Datamonitor, Deloitte, BCG, and Ovum. Tom received a BSc from the London School of Economics.