
The State of BI and Data Visualisation

Andy Durkan
Lead Consultant - Data Engineer

Choosing a BI tool

In our Data Engineering competency, we work with customers from all sorts of industries and business areas to help them get value from their data – from defining their data strategy to designing and implementing production-ready data platforms. As engineers, we love getting stuck into the detail of data platform architectures, processing frameworks and storage solutions. However, it is just as essential to ensure a business has the right tools and processes in place to visualise its data effectively.

Anyone who works in the data analytics space will be familiar with business intelligence (BI) and data visualisation tools such as Power BI and Tableau. These tools have proved successful because of their ability to create powerful, appealing visualisations over a wide range of data sources while remaining very user friendly – accessible to business users and analysts, not just IT specialists.

Before these tools came onto the market just over a decade ago, BI dashboards were built with the previous generation of BI tools (for example IBM Cognos, Microsoft SSRS and SAP BusinessObjects). These had a much steeper learning curve and were typically managed by central IT departments. This led to inefficient processes – the business users and analysts who needed actionable insights from the data on a daily basis would not interact with it dynamically in the BI tools, but would export it and use tools such as Excel to perform analyses and create further visualisations* – and we all know the legacy of this lives on today!

*A note on Excel and spreadsheets – these can be fantastically useful tools and still do have their place – the challenge is ensuring users understand what they should and shouldn’t be used for.

Today, there is a wide range of BI tools on the market, all of which allow users to connect to a variety of data sources, perform modelling and data preparation, and create and share appealing visuals and dashboards. There are the market leaders – comprehensive and powerful tools that perform all of these tasks well – but these can come with heavy licensing costs. Some tools have focussed on a particular niche, for example generating visuals from natural language searches (e.g. ThoughtSpot) or data storytelling. Open-source tools such as Apache Superset also offer an option for businesses not wanting to be tied into licensing agreements and costs. As with all things in data, there isn't a one-size-fits-all answer – however, there are several considerations that should help you decide which path to take when choosing a BI and visualisation tool.

Connectivity

A modern BI tool should offer native integrations with a range of data sources. As part of a wider data platform solution, you'd typically look to combine and conform disparate data sources into a single location (e.g. a data lakehouse or warehouse) prior to loading into the BI tool. Therefore, most critical is the ability to connect with these platforms – typically through database-like interfaces or processing frameworks. However, the ability to natively connect to and import data from a wider range of data sources can be beneficial in many cases – for example to speed up prototyping and exploration of new data sources.

Another major consideration is whether data connections used in visualisations should be live (i.e. queries performed directly on the data source) or based on a cached / extracted copy of the data. Both approaches have their place, depending on data latency requirements and the size and performance capabilities of the data source, and support for each varies between BI tools.
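As a rough sketch of the live-versus-extract trade-off, the Python snippet below uses SQLite and pandas as stand-ins for a real warehouse and a BI tool's import engine – the table name and data are invented for illustration:

```python
# Sketch of live vs extracted connections. SQLite stands in for the
# warehouse; real BI tools issue the same kinds of queries against
# Snowflake, BigQuery, a lakehouse SQL endpoint, etc.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("AMER", 200.0), ("EMEA", 80.0)])

# Live connection: every dashboard refresh runs the query on the source,
# so results are always current but each view costs a round trip.
live = pd.read_sql_query(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region", conn)

# Extract / import: snapshot the data once and serve visuals from the
# cache. Fast and cheap to query, but only as fresh as the last refresh.
extract = pd.read_sql_query("SELECT * FROM sales", conn)
extract.to_parquet("sales_extract.parquet")  # refreshed on a schedule (needs pyarrow)
cached = pd.read_parquet("sales_extract.parquet")

print(live)
print(cached.groupby("region")["amount"].sum())
```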

Data preparation and semantic layer

As mentioned above, in a modern data platform we'd always recommend that data is cleansed and modelled before being brought into a BI tool. Nonetheless, modern BI tools offer a range of data preparation capabilities. These can enable less technical users to bring additional data entities or transformations into their analyses when they are not yet available in the core data platform. They can also allow consistent models to be used throughout BI reporting, joining and labelling data entities (for example fact and dimension tables).

Power BI offers visual data transformation through Power Query, alongside the DAX language for calculated measures, while Looker offers the LookML semantic layer, giving a SQL-based approach to a standardised data model. Tools like Qlik Sense have their own proprietary scripting approach to data preparation. These capabilities should be used with caution, however – extensive data transformation and modelling within a BI tool can lead to divergence from consistent data models, cause performance issues, and make it difficult to migrate to a different tool in future. Metrics stores can also help address this, providing a consistent semantic layer and metric definitions that are not tied to a particular BI tool.
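To make the semantic-layer idea concrete, here is a minimal, tool-agnostic sketch in Python – not any vendor's actual API – showing how central metric definitions can keep every report computing "revenue" the same way. The table and column names are hypothetical:

```python
# Minimal sketch of the semantic-layer / metrics-store idea: metric
# definitions live in one place, and every dashboard builds its queries
# from them instead of re-implementing the logic per report.
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    name: str
    sql: str          # aggregation expression over the modelled table
    description: str

# Hypothetical definitions over a fact_orders table in the warehouse.
METRICS = {
    "total_revenue": Metric("total_revenue", "SUM(net_amount)",
                            "Revenue after discounts and refunds"),
    "order_count":   Metric("order_count", "COUNT(DISTINCT order_id)",
                            "Unique orders placed"),
}

def build_query(metric_key: str, group_by: str,
                table: str = "analytics.fact_orders") -> str:
    """Render a consistent SQL query for any BI tool that accepts SQL."""
    m = METRICS[metric_key]
    return (f"SELECT {group_by}, {m.sql} AS {m.name} "
            f"FROM {table} GROUP BY {group_by}")

print(build_query("total_revenue", "order_date"))
```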

Development experience

BI tools have certainly moved on in ease of use compared to the previous generation. However, the development experience still differs between tools, as does the kind of dashboard developer each caters to. Analysts from a less IT-centric background may favour visual, menu-based authoring, while other developers may be most efficient with more code- or SQL-based approaches to building dashboards and datasets. All users and use cases for the tool should be considered.

Deployment and administration

SaaS offerings have become commonplace across all types of business application, and BI tools are no exception. These reduce the overhead on central IT departments in deploying and administering BI tools, such as managing servers, patching and upgrades. Many tools offer separate experiences for dashboard developers and consumers – for example a desktop application versus a web-based interface. Integration with an enterprise's identity and access management (IdAM) solution is fundamental for managing access, and administrators will need to consider permission levels and access to reports, dashboards and datasets. The more complex these requirements, the more a tool offering comprehensive administration options will be preferable.

Visualisation capabilities

Most BI tools now offer a huge range of visualisation options as standard. Common charts such as bar and line charts are ubiquitous for a reason – they are easy to understand and do the job well! However, if your business cases call for something more bespoke – for example advanced geospatial visuals – these capabilities should be a major factor in your choice of tool. Some tools also allow full extensibility through custom visuals, giving advanced developers near-limitless control, and user communities often share and collaborate on these. They are worth considering, bearing in mind the ongoing support and maintenance required for non-native visuals.

Licensing costs

A major factor for businesses is of course the licensing cost associated with a BI tool. Unfortunately, comparing the overall costs of tools can be complex due to the different licensing models used by vendors. The key things to consider include:

  • Expected number of BI developers / analysts / administrators
  • Expected number of report consumers
  • Maximum expected volume of queries / API calls
  • Maximum expected data model size / refresh rate
  • Any add-on / premium functionality needed from the tool

The information above is normally sufficient to estimate and compare the ongoing costs of each tool being considered. Open-source tools such as Superset remove licensing costs entirely – this may mean trading off some capabilities compared with proprietary tools, but in some scenarios it is a great choice.
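As a back-of-envelope illustration, a simple per-seat cost model like the Python sketch below can be enough to compare scenarios – all prices here are invented placeholders, to be replaced with each vendor's actual list prices and tiers:

```python
# Rough licence cost comparison; every figure is a hypothetical
# placeholder, not a real vendor price.
def annual_cost(developers: int, consumers: int,
                dev_price_pm: float, consumer_price_pm: float,
                platform_fee_pa: float = 0.0) -> float:
    """Estimate annual cost for a simple per-seat licensing model."""
    monthly_seats = developers * dev_price_pm + consumers * consumer_price_pm
    return monthly_seats * 12 + platform_fee_pa

scenarios = {
    "Tool A (per-seat)":        annual_cost(10, 200, 25.0, 8.0),
    "Tool B (platform fee)":    annual_cost(10, 200, 40.0, 0.0,
                                            platform_fee_pa=30_000),
    "Open source (no licence)": 0.0,  # hosting and support costs still apply
}
for name, cost in scenarios.items():
    print(f"{name}: ${cost:,.0f}/year")
```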

DataOps capabilities

Key to a successful modern data platform is the implementation of DataOps – applying best practices from software engineering and DevOps such as CI/CD (continuous integration / continuous deployment) and automated testing. Consideration needs to be given to how the BI tool will fit into these processes: it should support multiple environments and version control, and allow frictionless development, testing and deployment of new content. This must ensure the reliability of business-critical reports while still giving analysts environments for data exploration and sandboxing.
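As one illustration of what this can look like in practice, the sketch below shows a CI-style smoke test against a hypothetical BI tool's REST API – the endpoint names are made up for the example, so check your tool's actual API. The idea: after deploying content to a test environment, verify every dashboard still renders before promoting to production.

```python
# Illustrative CI smoke test for BI content. The API endpoints below are
# hypothetical -- substitute your BI tool's real REST API.
import requests

BASE_URL = "https://bi.test.example.com/api"  # hypothetical test environment
TOKEN = "..."                                 # injected by the CI pipeline

def dashboards_healthy() -> bool:
    """Return True if every dashboard in the test environment renders."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    resp = requests.get(f"{BASE_URL}/dashboards", headers=headers, timeout=30)
    resp.raise_for_status()
    failures = []
    for dash in resp.json():
        r = requests.get(f"{BASE_URL}/dashboards/{dash['id']}/render",
                         headers=headers, timeout=60)
        if r.status_code != 200:
            failures.append(dash["name"])
    if failures:
        print("Broken dashboards:", ", ".join(failures))
    return not failures

if __name__ == "__main__":
    # Non-zero exit fails the pipeline and blocks promotion to production.
    raise SystemExit(0 if dashboards_healthy() else 1)
```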

It’s how you use it

Choosing the right tool is important, but certainly not the most important aspect of helping a business get value from its data.

Data storytelling

Modern BI tools offer an incredibly wide range of visualisation types and customisation. It’s very easy to create something that at first glance looks impressive. But is it actually telling a story about the data and drawing out insights in the most effective way?

When developing BI dashboards in Cognos over a decade ago, I religiously followed the teachings of Stephen Few – a guru of data visualisation best practices. While BI tools and the typical data platform have evolved massively since then, the principles still apply. All too frequently we still encounter an ugly pie chart (as Mr Few famously said – save the pies for dessert!) or a stacked bar resembling my dad's tie from the 1980s. Along with the principles that ensure visuals are easy to interpret, I've found asking the following questions helpful in deciding how to visualise data:

Who is the audience? How engaged and familiar are they with the data? Are they only interested in high-level figures / trends, or do they need to understand the detail?

How will it be consumed? How frequently do you expect users to access the dashboard, and how will it factor into their day-to-day processes and decision making? What kind of device will they use?

What story are you trying to tell? Is the purpose of the visualisation for exploration, or is there a particular message you want to convey?

Answering the questions above will set you on the right track to ensuring your dashboards hit the mark. Anyone who creates data visualisations should spend some time learning best practices, and teams should look to implement their own standards and style guides, work closely with end users and monitor dashboard usage.

Business value is key

Overarching all of this is the most important element – what value will any of this deliver to the business? Those of us who work in the data analytics space love building data solutions using modern tools and platforms that are powerful, efficient and deliver beautiful, dynamic visuals. However, you need to be mindful of how much this will cost – staff, licensing, training and cloud consumption costs all considered. The first thing we always look to address when starting any data project is what value it will deliver to the business versus the cost of getting there. Being vendor and technology agnostic, and having worked with all types of organisations from start-ups to large enterprises, we can identify the most appropriate solutions and approaches to ensure that a data initiative delivers real value.
