Tableau goes Minority Report in TC23 – takes direction towards augmented reality, generative AI and headless BI

Tableau Conference (TC23) was held last week in Las Vegas, and once again it shed light on Tableau’s long-term roadmap while also providing some concrete examples of features coming in the next releases. Tableau jumped on the generative AI bandwagon with Tableau GPT. Tableau Pulse redefines metrics and creates a new landing page for data consumption. VizQL Data Service is the first step towards headless BI for Tableau. The introduction of Tableau Gestures in an augmented reality context was impressive; it reminded me a bit of Tom Cruise exploring data in the film Minority Report.

The TC23 keynote was opened by Chief Product Officer Francois Ajenstat with a celebration of Tableau’s 20-year journey. Francois emphasised the role of Tableau and the Tableau community as a key innovator in easy-to-use self-service analytics. “A new day for data” was the title used for the upcoming announcements, suggesting something big and impressive was on the way.

The new CEO of Tableau, Ryan Aytay, also thanked the community, customers, partners and employees for their support. Ryan revealed the Tableau success plan, coming to all customers later this year, to listen to and support customers more closely. One of the conference highlights was once again the Iron Viz visualisation competition; this year’s winner was Paul Ross with his magnificent renewable energy dashboard.

Tableau Conference venue during Iron Viz competition with the winner Paul Ross
Tableau Iron Viz vibes in TC23 (photo credit Sharad Adhikari).

But what about the features? Tableau GPT is a very interesting new feature, though in a way it isn’t very unique, considering almost every organisation is talking about language models and generative AI. On the other hand, that doesn’t mean the feature won’t be very useful; it might be quite the opposite. Tableau Pulse might be a bigger thing than you first think. It has a very appealing UI that combines metrics, visualisations, descriptive information and Tableau GPT-based additional insights & interactions. The redesigned metrics layer seems to be much more flexible than before. Metrics are easier to create, more powerful, and they can be used across Tableau: in Pulse, dashboards, emails, Slack and mobile.

A possibly more surprising feature is the upcoming VizQL Data Service, which takes Tableau towards composable analytics, or headless BI. This means you can connect directly to the Tableau backend data model (the Hyper engine) to query the data without needing to build frontend visualisations with Tableau. This would provide a lot more flexibility when creating data-related products and solutions where you need to use data & analytics. The feature might be somewhat related to the fact that Salesforce is using Tableau Hyper data models within its Data Cloud offering to boost analytics possibilities. In the future, Salesforce could use data accelerated by the Tableau data engine in its Salesforce Clouds via the VizQL Data Service.

From an analytics developer’s point of view, the most interesting single feature showcased at TC23 (originally introduced at TC22) was support for shared dimensions (or multi-fact models). Shared dimensions enable more flexible multi-fact data models where multiple fact tables can relate to shared dimension tables. This feature makes the logical data layer introduced a couple of years ago more comprehensive and very powerful. Tableau would finally fully support the creation of enterprise-level data models that can be leveraged in very flexible ways and managed in a centralised manner. The user interface icon for defining the relationships looked a bit like a meatball, and because relationships in the logical data model have been referred to as noodles, it was said that Tableau is bringing meatballs to the noodles. Very clever 🙂.

Perhaps the coolest little thing was the augmented reality demo where Matthew Miller used a gesture-based user interface to interact with data and visualise it in a meeting context. The demonstration had a bit of a Minority Report vibe to it; perhaps the technology wasn’t yet as smooth as in the film, but Miller was just as convincing as Tom. The Tableau Gestures feature was created by the Tableau research team and appears to be in its early stages. Most likely it won’t be released any time soon, but it might be a hint of where data interaction is going in the future.

Matthew Miller using hand gestures to analyse data
Matthew Miller demonstrates gesture-based data analytics in TC23.

But what wasn’t mentioned at TC23? There are a couple of features and big announcements that were highlights of TC21 and TC22 but haven’t yet been released and weren’t mentioned again at TC23. One year ago, at TC22, one of the big buzzwords was business science. It was described as business-driven data science using autoML features, scenario planning and so on. But in the TC23 keynote, business science wasn’t mentioned at all, nor were the Model Builder or Scenario Planner features.

Next, I’ll go through the key features introduced in TC23 and also list functionalities presented in TC22 and TC21 to understand the big picture. These feature lists don’t contain all the features included in previous releases but the ones mentioned in earlier Tableau Conferences. More info about TC22 and TC21 introduced features can be found in our previous blog posts:

Note: All the product/feature-related images are created using screenshots from the TC23 Opening Keynote / Devs on Stage session. You can watch the sessions at any time on the Tableau site.

Workbook authoring & data visualisation

Let’s start with workbook authoring and actual data visualisation features. The only new features were the Sankey and Radial charts (or mark types) that are already in pilot use in Tableau Public. It was suggested that other new chart types will also be released in the near future. Even though I’m a bit sceptical towards overly complex or hyped visualisations, it’s good to have the option to easily create something a bit different. Because of Tableau’s flexibility, creating something totally crazy has always been possible, but it has often required a lot of data wrangling and custom calculations.

Sankey visualisation in Tableau Desktop
Out-of-the-box Sankey chart type presented in TC23.

Creating custom visualisations with Visualisation Extensions was introduced at TC21 (more info here), but we haven’t heard anything about this feature since. It might be that visualisation extensions development has been stopped or paused, but these new Sankey and Radial chart types might still have something to do with the visualisation extension development done in the past. Who knows?

  • New in TC23
    • TC23 New mark types (pilot period currently in Tableau Public): Create Sankey & radial charts using specific mark types. Possibly new native mark/chart types in the future.
    • TC23 Improved Image role functionality: new file types (gif) & expansion to the size limit.
    • TC23 Edit alt text (for screen readers) directly in Data Guide
  • Previously introduced and already released features
    • TC22 Image role (2022.4): Dynamically render images in the viz based on a link field in the data.
    • TC21 Dynamic zone visibility (2022.3): Use parameters & field values to show/hide layout containers and visualisations.
    • TC21 Redesigned View Data (2022.1): View/hide columns, reorder columns, sort data, etc.
    • TC21 Workbook Optimizer (2022.1): Suggest performance improvements when publishing a workbook.
    • TC21 Multi Data Source Spatial Layers (2021.4): Use data from different data sources in different layers of a single map visualisation.
  • Previously introduced but not released nor mentioned in TC23
    • TC21 Visualisation Extensions (~2022 H2): Custom mark types, mark designer to fine-tune the visualisation details, share custom viz types.

Consume analytics & understand data

The hype (and the actual new features) around generative AI has been the number one topic for most tech companies this year, and it sure was for Tableau too. Tableau introduced Tableau GPT, a generative language model integrated into Tableau and its data with security and governance included. Tableau GPT can be useful for both consumers and analysts. It can be used to search data and find insights just by writing questions, and it’ll provide answers both as written text and as a visualisation (like Ask Data on steroids). Ask any question and Tableau GPT will help to 1) find relevant data sources, 2) analyse data, 3) present results in text and chart form with the possibility to explore more, and 4) suggest related additional questions. It was suggested that Tableau GPT will also be integrated into Data Guide and, for developers/analysts, into the calculation editor to help build calculations.

Tableau Pulse was another big announcement. It’s a completely new interface to consume analytics and insights with the ability to ask questions via Tableau GPT. It seems to be mostly intended for consumers to follow and understand key metrics and related trends, outliers and other interesting aspects. Tableau Pulse includes a redesigned metrics layer with the possibility to create embeddable metrics manually or suggested by Tableau GPT. It contains personalised metrics & contents (changes, outliers, trends, drivers) and descriptive information created by Tableau GPT.

Tableau Pulse user interface with metrics information
Tableau Pulse with metrics and Tableau GPT-generated textual contents presented in TC23.

Unfortunately, we still need to wait to get our hands on Tableau GPT and Tableau Pulse. It might be the latter half of this year, or even early next year, before Tableau actually gets these new features released.

  • New in TC23
    • TC23 Tableau GPT (~pilot 2023 H2): Generative AI to assist in searching, consuming and developing data & analytics in many Tableau user interfaces.
    • TC23 Tableau Pulse with redesigned metrics (~pilot 2023 H2): New user interface to consume analytics and create, embed & follow metrics.
    • TC23 Tableau Gestures & augmented analytics: Use gestures to interact with data and infuse analytics into meetings. 
  • Previously introduced and already released features
    • TC22 Data Guide (2022.3): Contains information about the dashboard and fields, applied filters, data outliers and data summary, and links to external resources.
    • TC22 Data Stories (2022.2 & 2022.3):  Dynamic and automated data story component in Tableau Dashboard. Automatically describes data contents.
    • TC21 Data Change Radar (2022.3): Alert and show details about meaningful data changes, detect new outliers or anomalies, alert and explain these.
    • TC21 Explain the Viz (2022.3): Show outliers and anomalies in the data, explain changes, explain marks etc.
    • TC21 Multiple Smaller Improvements in Ask Data (2022.2 & 2022.3): Contact Lens author, Personal pinning, Lens lineage in Catalog, Embed Ask Data.
    • TC21 Ask Data improvements (2022.1): Phrase builder already available, phrase recommendations available later this year.
  • Previously introduced but not released nor mentioned in TC23
    • TC21 Model Builder: Use autoML to build and deploy predictive models within Tableau. Based on Salesforce’s Einstein platform.
    • TC21 Scenario Planner: Easy what-if-analysis. View how changes in certain variables affect target variables and how certain targets could be achieved.

Collaborate, embed and act

New features in this area related heavily to embedding and to using Tableau data for building external data products and services. The VizQL Data Service in particular is Tableau’s first step towards composable analytics, where the backend data layer and frontend user interface don’t need to be created with the same tool or technology. Composable analytics, or headless BI, is seen as a future trend in analytics. VizQL Data Service provides access to data modelling capabilities and data within Tableau to streamline building different kinds of data products with Tableau data. This means data from Tableau could easily be used outside Tableau without actually embedding visuals, instead using the data itself in different ways.
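To make the headless idea concrete, here is a minimal, hedged sketch of what querying a published data source through the VizQL Data Service might look like: no workbook or dashboard involved, just a data source and a list of wanted fields. The endpoint path and payload field names below are assumptions based on the TC23 demo, not a documented API.

```python
# Hypothetical sketch of a "headless" query against a Tableau data source.
# The endpoint URL and payload shape are assumptions, not a published spec.
import json

VDS_ENDPOINT = "https://your-tableau-site/api/vizql-data-service/query"  # hypothetical


def build_query(datasource_luid, fields):
    """Assemble a query payload: just the data source identifier and the
    wanted fields, with no frontend visualisation in the picture."""
    return {
        "datasource": {"datasourceLuid": datasource_luid},
        "query": {"fields": [{"fieldCaption": f} for f in fields]},
    }


# In a real client this payload would be POSTed to VDS_ENDPOINT with normal
# Tableau authentication; here we only show the shape of the request.
payload = build_query("abc-123-datasource", ["Region", "Sales"])
print(json.dumps(payload, indent=2))
```

The point of the sketch is the decoupling: any application that can send an HTTP request could consume governed Tableau data this way, without embedding a single viz.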

Another introduced feature was the Embedding Playground, which will ease the creation of code to embed Tableau visuals and different kinds of interactions. In the playground, you can select options from dropdowns to alter embedding settings, create interactions (e.g. context menus, export, filtering, marks etc.) and get ready-to-embed JavaScript & HTML code. Ephemeral users will centralise user identity and access management, and in the future usage-based licensing will be provided to make the pricing more flexible.
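As a rough illustration, the kind of snippet the playground generates might look like the markup fragment below, using the Tableau Embedding API v3 web component; the workbook URL is a placeholder, not a real viz.

```html
<!-- Hedged example of generated embed code; the viz URL is a placeholder. -->
<script type="module"
        src="https://public.tableau.com/javascripts/api/tableau.embedding.3.latest.min.js"></script>

<tableau-viz id="my-viz"
             src="https://public.tableau.com/views/ExampleWorkbook/ExampleSheet"
             toolbar="bottom"
             hide-tabs>
</tableau-viz>
```

The playground’s value is that the attributes (toolbar position, tabs, filters, interactions) are picked from dropdowns instead of being looked up in the API reference.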

  • New in TC23
    • TC23 Tableau Embedding Playground (dev preview now): Configure embedding options without coding.
    • TC23 Ephemeral users (~2023 H2): Centralises user identity and access management to one place. Usage-based licensing options in the future.
    • TC23 VizQL Data Service (~dev preview 2023 H2): Tableau’s first step towards decoupling the data and presentation layers.
    • TC23 Grant access to a workbook when sharing
  • Previously introduced and already released features
    • TC22 Tableau External Actions (2022.4): Trigger actions outside Tableau, for example, Salesforce Flow actions. Support for other workflow engines will be added later.
    • TC22 Publicly share dashboards: Share content via external public facing site to give access to unauthenticated non-licenced users, only Tableau Cloud. Available via Tableau Embedded analytics usage-based licensing.
    • TC21 Embeddable Ask Data (2023.1)
    • TC21 Embeddable Web Authoring (2022.2): No need for a desktop when creating & editing embedded contents, full embedded visual analytics.
    • TC21 3rd party Identity & Access Providers (2022.2): Better capabilities to manage users externally outside Tableau.
    • TC21 Connected Apps (2021.4): More easily embed to external apps, creating a secure handshake between Tableau and other apps.
    • TC21 Tableau search, Explain Data and Ask Data in Slack (2021.4)
    • TC21 Tableau Prep notifications in Slack (2022.1)

Data preparation, modeling and management

My personal favourite, the Shared dimensions feature, which was already introduced a year ago, was demoed once again. It enables multi-fact data models where multiple fact tables relate to shared dimension tables, making data models more flexible and comprehensive. At least the modelling UI seemed rather ready, but unfortunately we didn’t get a target schedule for when this might be released.

Modeling interface with shared dimension in Tableau Desktop
Shared dimensions enable multi-fact data sources. Example presented in TC23.

One very welcome little feature is Address Geocoding, which allows you to visualise addresses on a map without doing the geocoding beforehand. Related to data models, Tableau also emphasised how Tableau data models are used and available within Salesforce Data Cloud (Tableau Hyper-accelerated queries), and that in the future Data Cloud contents can be analysed in Tableau with a single click (Tableau Instant Analytics in SF Data Cloud).

  • New in TC23
    • TC23 Tableau Hyper-accelerated queries in SF Data Cloud (available now): Salesforce Data Cloud is at least partially based on Tableau Hyper data models, which can be used to easily analyse the data within Salesforce Data Cloud without additional modeling steps.
    • TC23 Tableau Instant Analytics in SF Data Cloud (~2023 H2): Analyse SF Data Cloud data with Tableau with one click.
    • TC23 Address Geocoding: geocode address data in Tableau to visualise addresses on a map.
    • TC23 Use Tableau GPT in prep & modeling: ask Tableau GPT to create advanced calculations, e.g. extract an email address from JSON.
    • TC23 Tableau Prep enhancements: spatial joins, smart suggestions to remove duplicates & easily set the header and starting row.
  • Previously introduced and revisited in TC23
    • TC22 Shared dimensions / multi-fact models: Build multi-fact data models where different facts relate to multiple shared dimensions.
    • TC22 New AWS data sources: Amazon S3 connector. Previously mentioned also Amazon DocumentDB, Amazon OpenSearch, Amazon Neptune.
    • TC22 Multi-row calculations in Prep: Calculate for example running total or moving average in Tableau Prep.
  • Previously introduced and already released features
    • TC22 Insert row number and clean null values in Prep (2023.1): Easily insert row number column and clean & fill null values.
    • TC22 Table extensions (2022.3): Leverage python and R scripts in the data model layer.
    • TC22 Web data connector 3.0 (2022.3): Easily connect to web data and APIs, for example to AWS S3, Twitter etc.
    • TC21 Data Catalog Integration: Sync external metadata to Tableau.
    • TC21 Virtual Connections (2021.4): Centrally managed and reusable access points to source data with a single point to define security policy and data standards.
    • TC21 Centralised row-level security (2021.4): Centralised RLS and data management for virtual connections.
    • TC21 Parameters in Tableau Prep (2021.4): Leverage parameters in Tableau Prep workflows.
  • Previously introduced but not released nor mentioned in TC23
    • TC21 Tableau Prep Extensions: Leverage and build an extension for Tableau Prep (sentiment analysis, OCR, geocoding, feature engineering etc.).

Tableau Cloud management

For Tableau Cloud management, Tableau emphasised HIPAA compliance and improved activity logs to analyse, for example, login activities and attempts. Customer-managed IP filtering for Tableau Cloud will streamline cloud security management. New features related to access token management in the Tableau Cloud environment were also introduced.

  • New in TC23
    • TC23 Improved activity logs: More data in admin templates about login activities & attempts.
    • TC23 Customer-managed IP filtering: Set IP address filtering to limit access to Tableau Cloud Site.
    • TC23 Enhanced access token management: Access token management via API, Control personal access token creation via user group and set expiration periods.
  • Previously introduced and revisited in TC23
    • TC22 Multi-site management for Tableau Cloud: Manage centrally all Tableau Cloud sites.
  • Previously introduced and already released features
    • TC22 Customer-managed encryption keys (2022.1): BYOK (Bring Your Own Keys). 
    • TC22 Activity Log (2022.1): More insights on how people are using Tableau, permission auditing etc.
    • TC22 Admin Insights (2022.1): Maximise performance, boost adoption, and manage content.
Admin Templates login activity dashboard
Tableau Admin Insights login activity example presented in TC23.

Tableau Server management

Again this year, there weren’t many new specific features related to Tableau Server management. On the other hand, it was emphasised that an on-premise Tableau Server will remain an option in the future.

  • Previously introduced and already released features
    • TC22 Auto-scaling for Tableau Server (2022.3): Starting with backgrounder auto-scaling for container deployments.
    • TC21 Resource Monitoring Improvements (2022.1): Show view load requests, establish new baseline etc.
    • TC21 Backgrounder resource limits (2022.1): Set limits for backgrounder resource consumption.
    • TC21 Time Stamped log Zips (2021.4)

Tableau Ecosystem & Tableau Public

Tableau Public had a few new features introduced, like improved search. Accelerators weren’t mentioned much at TC23, but lately their usability has improved with the ability to easily map fields when taking dashboard accelerators into use. There were some Tableau Public-related features introduced a few years ago at TC21 that haven’t been released yet. Especially getting more connectors to Tableau Public would be very nice, and the possibility to publish Prep workflows to Tableau Public would also be great. Let’s see if we get these previously introduced features in the future.

  • New in TC23
    • TC23 Tableau Public: enhanced search with sorting & filtering, network activity feed with notifications & extra info, profile pronouns
  • Previously introduced and already released features
    • TC21 Tableau Public Custom Channels:  Custom channels around certain topics.
    • TC21 Tableau Exchange: Search and leverage shared extensions, connectors, more than 100 accelerators. The possibility to share the dataset may be added later on.
    • TC21 Accelerators: Dashboard starters for certain use cases and source data (e.g. call center analysis, Marketo data, Salesforce data etc.). Can soon be used directly from Tableau.
  • Previously introduced but not released nor mentioned in TC23
    • TC21 Tableau Public Slack Integration (~2022 H1)
    • TC21 More connectors to Tableau Public (~2022 H1): Box, Dropbox, OneDrive.
    • TC21 Publish Prep flows to Tableau Public: Will there be a Public version for Tableau Prep?

Want to know more?

If you are looking for more info about Tableau, please read our previous blog posts, check out our visualisation and Tableau offering, and send a message to discuss more (via our website):

More info about the upcoming features on the Tableau coming soon page.

Data Consultant

Unfolding the work of an Analytics Consultant

Meet Johanna, Tuomas and Tero! Our consultants all work with data analysis and visualizations. Let’s map out their journeys at Solita and demystify the work of Analytics Consultants!

All three have had different journeys to becoming an Analytics Consultant. Tuomas has a business degree, and Tero started his career working with telecommunications technology. Johanna, however, found her way to visualizations quite young: “I created my first IBM Cognos reports as a summer trainee when I was 18 and somehow, I ended up studying Information Systems Science.” It has, however, been love at first sight for all of them. Now they work in Solita’s Data Science and Analytics Cell.

What is a typical Analytics Consultant’s workday like?

An interest in versatile work tasks unites our Analytics Consultants. Tuomas describes himself as “a Power BI Expert”. His days go by fast designing Power BI phases, modelling data, and doing classical pipeline work. “Sometimes I’d say my role has been something between a project and a service manager.”

Tero, on the other hand, focuses on report development and visualizations. He defines backlogs, develops metadata models, and holds client workshops.

Johanna sees herself as a Data Visualization Specialist, who develops reports for her customers. She creates datasets, and defines report designs and themes. “My work also includes data governance and the occasional maintenance work,” Johanna adds.

All three agree that development work is one of their main tasks. “I could say that a third of my time goes to development,” Tuomas estimates. “In my case I would say even half of my time goes to development,” Tero states.

Power BI is the main tool they are using; Microsoft Azure and Snowflake are also in daily use. Tools vary between projects, so Tuomas highlights that “it is important to understand the nature of different tools even though one would not work directly with them”.

What is the best part of an Analytics Consultant’s work?

The possibility to work with real-life problems and create concrete solutions brings the most joy to our consultants. “It is really satisfying to provide user experiences which deliver the necessary information and functionality that the end users need to solve their business-related questions,” Johanna clarifies her thoughts.

And of course, collaborating with people keeps our consultants going! Tuomas estimates that 35% of his time is dedicated to stakeholder communication: he mentions customer meetings, but also writing documentation and creating project definitions, “specs”, with his customers.

Our consultants agree that communication is one of the key soft skills to master for anyone who wants to become an Analytics Consultant! Tuomas says that working and communicating with end users has always felt natural to him.

Tero is intrigued by the possibility of working with different industries: “I will learn how different industries and companies work, what kind of processes they have and how legislation affects them. This work is all about understanding the industry and being customer-oriented.”

“Each workday is different and interesting! I am dealing with many different kinds of customers and business domains every day.”

When asked what keeps them working with visualizations, they all ponder for a few seconds. “A report which I create will provide direct benefit for the users. That is important to me,” Tuomas sums up his thoughts. “Each workday is unique and interesting! I am dealing with many different customers and business domains every day,” Johanna answers. Tero smiles and concludes: “When my customers get excited about my visualization, that is the best feeling!”

How are our Analytics Consultants developing their careers?

After working over 10 years with reporting and visualizations, Tero feels that he has found his home: “This role feels good to me, and it suits my personality well. Of course, I am interested in getting involved with new industries and learning new tools, but now I am really contented!”

Tuomas, who is a newcomer compared to Tero, has a strong urge to learn more: “Next target is to get a deeper and more technical understanding of data engineering tools. I would say there are good opportunities at Solita to find the most suitable path for you.”

Johanna has had different roles in her Solita journey, but she keeps returning to work with visualizations: “I will develop my skills in design, and I would love to learn a new tool too! This role is all about continuous learning and that is an important capability of an Analytics Consultant!”

“I would say there are good opportunities at Solita to find the most suitable path for you.”

How to become an excellent Analytics Consultant? Here are our experts’ tips:

Johanna: “Work together with different stakeholders to produce the best solutions. Do not be afraid to challenge the customer, ask questions or make mistakes.”

Tuomas: “Be curious to try and learn new things. Don’t be afraid to fail. Ask colleagues and remember to challenge customer’s point of view when needed.”

Tero: “Be proactive! From the point of view of technical solutions and data. Customers expect us to bring them innovative ideas!”

Would you like to join our Analytics Consultant team? Check our open positions.

Read our Power BI Experts’ blog post: Power BI Deep Dive


Overview of the Tableau product roadmap based on TC22 and TC21

Tableau Conference (TC22) was held last week in person in Las Vegas (with the possibility of virtual participation). The majority of the introduced new features and functionalities were related to data preparation & modeling, easy and automated data science (business science, as Tableau calls it), and Tableau Cloud management & governance capabilities. Tableau is on its journey from a visual analytics platform to a full-scale end-to-end analytics platform.

In the keynote, Tableau CEO Mark Nelson emphasised the role of both the Tableau and Salesforce user communities in driving change with data: there are over 1M Tableau Datafam members and over 16M Salesforce Trailblazers. Once again, the importance of data for businesses and organisations was highlighted, but the viewpoint was data skills, or the lack of them, and data cultures more than technologies. Mark Nelson underlined the importance of the cloud, saying 70% of new customers start their analytics journey in the cloud. One of the big announcements was the rebranding of Tableau Online to Tableau Cloud and the introduction of plenty of new features to it.

Taking into account the new features introduced at TC22, the Tableau platform includes good data preparation and modelling capabilities with many connectors to a variety of data sources, services and APIs. Tableau’s visual analytics and dashboarding capabilities are already among the best in the market. At TC21 last year, Tableau talked a lot about Slack integration and embedding to boost collaboration and the sharing of insights. At the moment, effort is put especially into democratising data analytics for everyone despite gaps in data skills. This is done using autoML types of functionality to automatically describe and explain data, show outliers, create predictions and help build and act on scenarios. The cloud offering, with better governance, security and manageability, was also a high priority.

Next, I’ll go through the key features introduced at TC22 and also list functionalities presented at TC21 to understand the big picture. More info about TC21 released features can be found in a previous blog post: A complete list of new features introduced at the Tableau Conference 2021. These feature lists don’t contain all the features included in previous releases, but the ones mentioned at TC21.

Note: All the images are created using screenshots from the TC22 Opening Keynote / Devs on Stage session and the Tableau new product innovations blog post. You can watch the sessions at any time on the Tableau site.

Update: Read latest TC23 blog Tableau goes Minority Report in TC23 – takes direction towards augmented reality, generative AI and headless BI.

Workbook authoring & data visualization

At TC22 there weren’t too many features related to workbook authoring. The only bigger announcement was the new Image Role, enabling dynamic images in visualizations. These could be, for example, product images or any other images that can be reached via a URL link in the source data. From TC21 there are still a couple of very interesting features waiting to be released; I’m especially waiting for dynamic dashboard layouts.

  • Introduced in TC22
    • Image role: Dynamically render images in the viz based on a link field in the data.
  • Introduced in TC21 (but not yet released)
    • Dynamic Dashboard Layouts (~2022 H1): Use parameters & field values to show/hide layout containers and visualizations.
    • Visualization Extensions (~2022 H2): Custom mark types, mark designer to fine tune the visualization details, share custom viz types.
  • Introduced in TC21 (and already released)
    • Multi Data Source Spatial Layers (2021.4): Use data from different data sources in different layers of a single map visualization.
    • Redesigned View Data (2022.1): View/hide columns, reorder columns, sort data, etc.
    • Workbook Optimizer (2022.1): Suggest performance improvements when publishing a workbook.
Tableau Image Role Example
Image role example to dynamically render images presented in TC22. Side note: have to appreciate the “Loves Tableau: True” filter.

Augmented analytics & understand data

For this area there were a couple of brand new announcements and more info about a few major functionalities already unveiled at TC21. Data Stories is an automated feature that creates descriptive stories about data insights in a single visualization. It explains what data and insights are presented in the visualization, and the explanation changes dynamically when data is filtered or selected in the viz. With the data orientation pane, the author can partly automate the documentation of dashboards and visualizations. It shows information about data fields, applied filters, data outliers and data summary, and possible links to external documentation.

Tableau Data Stories example
Example of automatically created descriptive data story within a dashboard presented in TC22.


A few features originally introduced at TC21 were also mentioned at TC22. Model Builder is a big step towards guided data science. It will help build ML-model-driven predictions fully integrated within Tableau. It’s based on the same technology as Salesforce’s Einstein Analytics. Scenario Planner is a functionality for building what-if analyses to understand the options and outcomes of different decisions.

  • Introduced in TC22
    • Data Stories (beta in Tableau Cloud):  Dynamic and automated data story component in Tableau Dashboard. Automatically describes data contents.
    • Data orientation pane: Contain information about dashboard and fields, applied filters, data outliers and data summary, and links to external resources.
    • Model Builder: Use autoML to build and deploy predictive models within Tableau. Based on Salesforce’s Einstein platform.
    • Scenario Planner: Easy what-if-analysis. View how changes in certain variables affect target variables and how certain targets could be achieved.
  • Introduced in TC21 (but not yet released)
    • Data Change Radar (~2022 H1): Alert and show details about meaningful data changes, detect new outliers or anomalies, alert and explain these.
    • Multiple Smaller Improvements in Ask Data (~2022 H1): Contact Lens author, Personal pinning, Lens lineage in Catalog, Embed Ask Data.
    • Explain the Viz (~2022 H2): Show outliers and anomalies in the data, explain changes, explain mark etc.
  • Introduced in TC21 (and already released)
    • Ask Data improvements (2022.1): Phrase builder already available, phrase recommendations available later this year.

Collaborate, embed and act

In TC21 collaboration and Slack integration were among the big development areas. In TC22 there wasn’t much new about this topic, but Tableau Actions were again demonstrated as a way to build actionable dashboards. Also the possibility to share dashboards publicly with unauthenticated non-licensed users was shown again in TC22. This functionality is coming to Tableau Cloud later this year.

  • Introduced in TC22
    • Tableau Actions: Trigger actions outside Tableau, for example Salesforce Flow actions. Support for other workflow engines will be added later.
    • Publicly share dashboards (~2022 H2): Share content via an external public-facing site to give access to unauthenticated non-licensed users, only Tableau Cloud.
  • Introduced in TC21 (but not yet released)
    • 3rd party Identity & Access Providers: Better capabilities to manage users externally outside Tableau.
    • Embeddable Web Authoring: No need for desktop when creating & editing embedded contents, full embedded visual analytics.
    • Embeddable Ask Data 
  • Introduced in TC21 (and already released)
    • Connected Apps (2021.4): More easily embed to external apps, create secure handshake between Tableau and other apps.
    • Tableau search, Explain Data and Ask Data in Slack (2021.4)
    • Tableau Prep notifications in Slack (2022.1)

Data preparation, modeling and management

My personal favourite of the new features can be found here. Shared dimensions enable more flexible multi-fact data models where multiple fact tables can relate to shared dimension tables. This feature makes the logical data model layer introduced a couple of years ago more comprehensive and very powerful. Tableau finally supports the creation of enterprise-level data models that can be leveraged in very flexible ways and managed in a centralized manner. Another data-model-related new feature is Table Extensions, which enables the use of Python and R scripts directly in the data model layer.

Tableau Shared Dimensions Example
Shared dimensions enabled multi-fact data source example presented in TC22.

 

There are also features to boost data source connectivity. Web Data Connector 3.0 makes it easier to connect to different web data sources, services and APIs. One important new data source is AWS S3, which will enable connecting directly to the data lake layer. Tableau Prep is also getting a few new functionalities. Row number column and null value cleaning are rather small features. Multi-row calculations are a bigger thing, although the examples Tableau mentioned (running totals and moving averages) might not be very relevant in data prep, because these usually must take filters and row-level security into account and therefore must often be calculated at runtime.
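To illustrate the runtime concern, here is a minimal plain-Python sketch of the two multi-row calculations Tableau mentioned (illustrative only, not how Tableau or Prep actually implements them):

```python
from collections import deque
from itertools import accumulate

def running_total(values):
    """Cumulative sum across rows, like a multi-row calculation in a prep flow."""
    return list(accumulate(values))

def moving_average(values, window=3):
    """Trailing moving average over the previous `window` rows."""
    buf = deque(maxlen=window)
    out = []
    for v in values:
        buf.append(v)
        out.append(sum(buf) / len(buf))
    return out

sales = [10, 20, 30, 40]
print(running_total(sales))         # [10, 30, 60, 100]
print(moving_average(sales, 2))     # [10.0, 15.0, 25.0, 35.0]

# The caveat from the text: if a user later filters out the 20,
# a total precomputed at prep time is stale; computed at runtime it adapts.
print(running_total([10, 30, 40]))  # [10, 40, 80]
```

Precomputing the running total in a prep flow bakes in the full, unfiltered row set; the last call shows how the same calculation yields different results once a row is filtered out, which is why such measures are often better computed at query time.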

  • Introduced in TC22
    • Shared dimensions: Build multi-fact data models where facts relate to many shared dimensions.
    • Web data connector 3.0: Easily connect to web data and APIs, for example to AWS S3, Twitter etc.
    • Table extensions: Leverage Python and R scripts in the data model layer.
    • Insert row number and clean null values in Prep: Easily insert row number column and clean & fill null values.
    • Multi-row calculations in Prep: Calculate for example running total or moving average in Tableau Prep.
    • New AWS data sources: Amazon S3, Amazon DocumentDB, Amazon OpenSearch, Amazon Neptune.
  • Introduced in TC21 (but not yet released)
    • Data Catalog Integration: Sync external metadata to Tableau (from Collibra, Alation, & Informatica).
    • Tableau Prep Extensions: Leverage and build extensions for Tableau Prep (sentiment analysis, OCR, geocoding, feature engineering etc.).
  • Introduced in TC21 (and already released)
    • Virtual Connections (2021.4): Centrally managed and reusable access points to source data with single point to define security policy and data standards.
    • Centralized row level security (2021.4): Centralized RLS and data management for virtual connections.
    • Parameters in Tableau Prep (2021.4): Leverage parameters in Tableau Prep workflows.

Tableau Cloud management

The rebranding of Tableau Online to Tableau Cloud, along with a bunch of new management and governance features, was one important area of TC22. Tableau Cloud can now be managed as a whole with multi-site management. Security has long been a key concern when moving to the cloud, and Tableau finally supports customer-managed encryption keys (BYOK). From a monitoring point of view, both the activity log and admin insights provide information on how Tableau Cloud and the contents in it are used.

  • Introduced in TC22
    • Multi-site management for Tableau Cloud: Manage centrally all Tableau Cloud sites.
    • Customer managed encryption keys (later 2022): BYOK (Bring Your Own Keys). 
    • Activity Log: More insights on how people are using Tableau, permission auditing etc.
    • Admin Insights: Maximise performance, boost adoption, and manage contents.
Tableau Admin Insights Example
Tableau Cloud Admin Insights example presented in TC22.

Tableau Server management

There weren’t too many new features in Tableau Server management, I guess partly because of the effort put into Tableau Cloud management instead. However, Tableau Server auto-scaling was mentioned again, and it will be coming soon, starting with backgrounder auto-scaling.

  • Introduced in TC22
    • Auto-scaling for Tableau Server (2022 H1): Starting with backgrounder auto-scaling for container deployments.
  • Introduced in TC21 (but not yet released)
    • Resource Monitoring Improvements (~2022 H1): Show view load requests, establish new baseline etc.
    • Backgrounder resource limits (~2022 H1): Set limits for backgrounder resource consumption.
  • Introduced in TC21 (and already released)
    • Time Stamped log Zips (2021.4)

Tableau ecosystem & Tableau Public

Last year in TC21 the Tableau ecosystem and upcoming Tableau Public features had a big role. This year there wasn’t much new in this area, but the Tableau exchange and accelerators were still mentioned and shown in the demos a couple of times.

  • Introduced in TC21 (but not yet released)
    • Tableau Public Slack Integration (~2022 H1)
    • More connectors to Tableau Public (~2022 H1): Box, Dropbox, OneDrive.
    • Publish Prep flows to Tableau Public: Will there be a Public version for Tableau Prep?
    • Tableau Public custom Channels (~2022 H1):  Custom channels around certain topics.
  • Introduced in TC21 (and already released)
    • Tableau exchange: Search and leverage shared extensions, connectors, more than 100 accelerators. Possibility to share dataset may be added later on.
    • Accelerators: Dashboard starters for certain use cases and source data (e.g. call center analysis, Marketo data, Salesforce data etc.). Can soon be used directly from Tableau.

Want to know more?

If you are looking for more info about Tableau read our previous blog posts:

More info about the upcoming features on the Tableau coming soon page.

Check out our offering about visual analytics & Tableau, and book a demo to find out more:

 

A complete list of new features introduced at the Tableau Conference 2021

The Tableau Conference 2021 is over and yet again it was a lot of fun with all the not-so-serious music performances, great informative sessions, excellent Iron Viz competition, and of course demonstrations of many new features coming in the future releases. In general my first thoughts about the new capabilities revealed in TC21 are very positive. Obviously some of the details are still a bit blurry but the overall topics seem to be in a good balance: There are very interesting improvements coming for visual analytics, data management and content consumption in different channels, but in my opinion the most interesting area was augmented analytics and capabilities for citizen data scientists.

It’s been 2 years since Salesforce announced the acquisition of Tableau. After acquisitions and mergers, it’s always interesting to see how they affect the product roadmap and development. Now I really feel the pace for Tableau is getting faster and the scope is getting more extensive. Tableau is not only fine-tuning the current offering, but creating a more comprehensive analytics platform with autoML, easier collaboration & embedding, and action triggers that extend beyond Tableau.

Note: All the pictures are created using screenshots from the TC21 Devs on Stage and TC21 Opening Keynote sessions. You can watch the sessions at any time on Tableau site.

Update: Read our latest overview of the Tableau product roadmap based on TC22 and TC21 and Tableau goes Minority Report in TC23 – takes direction towards augmented reality, generative AI and headless BI blog posts.

The Basics – Workbook Authoring

Let’s dive into workbook authoring first. It is still the core of Tableau and I’m very pleased to see there is still room for improvement. For workbook authoring the biggest announcement was visualization extensions. This means you can more easily develop and use new custom visualization types (for example sunburst and flower). The feature makes it possible to adjust visualization details with the mark designer and to share these custom visualizations with others. Another very nice feature was dynamic dashboard layouts: you can use parameters and field values to dynamically toggle the visibility of dashboard components (visualizations and containers). This gives so much more power to flexibly show and hide visualizations on the dashboard.

There is also a redesigned UI to view underlying data with options to select the desired columns, reorder columns and sort data, export data etc. For map analysis the possibility to use data from multiple data sources in spatial layers is a very nice feature. Using workbook optimizer you can view tips to improve performance when publishing the workbook. In general it also seems the full web authoring for both data source and visualization authoring isn’t very far away anymore.

  • Visualization Extensions (2022 H2): Custom mark types, mark designer to fine tune the visualization details, share custom viz types.
  • Dynamic Dashboard Layouts (2022 H1): Use parameters & field values to show/hide layout containers and visualizations.
  • Multi Data Source Spatial Layers (2021.4): Use data from different data sources in different layers of a single map visualization.
  • Redesigned View Data (2022 H1): View/hide columns, reorder columns, sort data, etc.
  • Workbook Optimizer (2021.4): Suggest performance improvements when publishing a workbook.
Visualization Extensions. Create more complex visualizations (like sunburst) with ease.

Augmented Analytics & Citizen Data Science

This topic has been in Gartner’s hype cycle for some time. In Tableau we have already seen the first capabilities related to augmented analytics and autoML, but this area is really getting a lot more power in the future. Data Change Radar will automatically detect new outliers or anomalies in the data, and alert and visualize those to the user. Then users can apply the Explain Data feature to automatically get insights and explanations about the data: what has happened and why. The Explain the Viz feature will explain not only one data point but the whole visualization or dashboard and show descriptive information about the data. All this happens automatically behind the scenes and it can really speed up the analysis to get these insights out-of-the-box. There were also a bunch of smaller improvements in the Ask Data feature, for example to adjust the behavior and to embed the Ask Data functionality.

One of the biggest new upcoming features was the possibility to create and deploy predictive models within Tableau with Tableau Model Builder. This means citizen data scientists can create autoML-type predictive models and deploy them inside Tableau to get new insights about the data. The user interface for this seemed to be a lot like Tableau Prep. Another very interesting feature was Scenario Planning, which is currently under development in Tableau Labs. This feature gives the possibility to view how changes in certain variables would affect defined target variables and to compare different scenarios with each other. Another use case for scenarios would be finding different ways to achieve a certain target. To me scenario planning seemed a bit disconnected from the core capabilities of Tableau, but it is under development and for sure there could be some very nice use cases for this type of functionality.

  • Data Change Radar (2022 H1): Alert and show details about meaningful data changes, detect new outliers or anomalies, alert and explain these.
  • Explain the Viz (2022 H2): Show outliers and anomalies in the data, explain changes, explain mark etc.
  • Multiple Smaller Improvements in Ask Data (2022 H1): Contact Lens author, Personal pinning, Phrase builder, Lens lineage in Catalog, Embed Ask Data.
  • Tableau Model Builder: Use autoML to build and deploy predictive models within Tableau.
  • Scenario Planning: View how changes in certain variables affect target variables and how certain targets could be achieved.
Explain Data side pane with data changes and explain change drill down path.

Collaborate, embed and act

The Tableau Slack integration is getting better and more versatile. With the 2021.4 version you can use the Tableau search, Explain Data and Ask Data features directly in Slack. As it was said in the event: “it’s like having data as your Slack member”. In the future, Tableau Prep notifications can also be viewed via Slack. It was also suggested that later on a similar integration will be possible, for example, with MS Teams.

There were many new capabilities related to embedding content in external services. With the Connected Apps feature admins can define trusted applications (secure handshake) to make embedding easier. Tableau Broadcast can be used in Tableau Online to share content via external public-facing sites with everyone (unauthenticated users). There was also a mention of 3rd party identity and access provider support; it was not very precise, but in my opinion it suggests the possibility to more easily leverage identities and access management from outside Tableau. Embeddable web authoring makes it possible to create and edit content directly within the service where it is embedded using web edit, so there is no need for Tableau Desktop.

One big announcement was the Tableau Actions. Tableau dashboards already have great actions to create interactions between the user and the data, but this is something more. With Tableau Actions you can trigger actions outside Tableau directly from a dashboard. You could for example trigger Salesforce Flow tasks by clicking a button in the dashboard. And in the future also other workflow engines will be supported. This will provide much more powerful interactivity options for the user.

  • Tableau search, Explain Data and Ask Data in Slack (2021.4)
  • Tableau Prep notifications in Slack (2022 H1)
  • Connected Apps (2021.4): More easily embed to external apps, create secure handshake between Tableau and other apps.
  • Tableau Broadcast (2022 H2): Share content via external public-facing site to give access to unauthenticated users, only Tableau Online.
  • 3rd party Identity & Access Providers: Better capabilities to manage users externally outside Tableau.
  • Embeddable Web Authoring: No need for desktop when creating & editing embedded contents, full embedded visual analytics.
  • Embeddable Ask Data 
  • Tableau Actions: Trigger actions outside Tableau, for example Salesforce Flow actions, later on support for other workflow engines.
Creating new Tableau Action to trigger Salesforce Flow to escalate case.

Data management & data preparation

Virtual Connections have already been introduced earlier and they seem to be a very powerful functionality to centrally manage data connections and create centralized row-level security rules. These functionalities, and possible new features built around them, can really boost end-to-end self-service analytics in the future. The only downside is that this is part of the Data Management add-on. Data Catalog Integration will bring the possibility to sync metadata from external data catalog services, like Collibra and Alation.

Related to data preparation, there will be new Tableau Prep Extensions, so you can add more power to prep workflows as custom steps. These new steps can be for example sentiment analysis, geocoding, feature engineering etc. Another new functionality in Tableau Prep is the possibility to use parameters in Prep workflows. It was also said that in the future you can use Tableau Public to publish and share Tableau Prep flows. This might mean there is also a Public version coming for Tableau Prep. It wasn’t mentioned in the event, but it would be great.

  • Virtual Connections (2021.4): Centrally managed and reusable access points to source data with single point to define security policy and data standards.
  • Centralized row level security (2021.4): Centralized RLS and data management for virtual connections.
  • Data Catalog Integration: Sync external metadata to Tableau (from Collibra, Alation, & Informatica).
  • Tableau Prep Extensions: Leverage and build extensions for Tableau Prep (sentiment analysis, OCR, geocoding, feature engineering etc.).
  • Parameters in Tableau Prep (2021.4): Leverage parameters in Tableau Prep workflows.
Content of a virtual connection and related security policies.

Server Management

Even though SaaS options like Tableau Online are getting more popular all the time, there was still a bunch of new Tableau Server-specific features. New improved resource monitoring capabilities as well as time-stamped log file zip generation were mentioned. Backgrounder resource limits can cap the amount of resources consumed by backgrounder processes, and auto-scaling of backgrounders for containerized deployments can help the environment adjust to different workloads during different times of the day.

  • Resource Monitoring Improvements (2022 H1): Show view load requests, establish new baseline etc.
  • Time Stamped log Zips (2021.4)
  • Backgrounder resource limits (2022 H1): Set limits for backgrounder resource consumption.
  • Auto-scaling for backgrounder (2022 H1): Set backgrounder auto-scaling for container deployments.

Tableau Ecosystem & Tableau Public

Tableau is building Tableau Public to better serve the data family in different ways. There is already the possibility to create visualizations in Tableau Public using web edit. There is also a redesigned search and a better general user interface to structure and view contents as channels. Tableau Public will also get Slack integration and more data connectors, for example to Dropbox and OneDrive. As already mentioned, Tableau Prep flows can be published to Tableau Public in the future, and that might also mean a release of Tableau Prep Public, who knows.

In the keynote there was also a mention that Tableau exchange would contain all the different kinds of extensions, connectors, datasets and accelerators in the future. The other contents are already there, but the datasets will be a very interesting addition. This would mean companies could publish, use and possibly sell and buy analysis-ready data contents. The accelerators are dashboard starters for certain use cases or source data.

  • Tableau Public Slack Integration (2022 H1)
  • More connectors to Tableau Public (2022 H1): Box, Dropbox, OneDrive.
  • Publish Prep flows to Tableau Public: Will there be a Public version for Tableau Prep?
  • Tableau Public custom Channels (2022 H1):  Custom channels around certain topics.
  • Tableau exchange: Search and leverage shared extensions, connectors, datasets and accelerators.
  • Accelerators: Dashboard starters for certain use cases and source data (e.g. call center analysis, Marketo data, Salesforce data etc.).

Want to read or hear more?

If you are looking for more info about Tableau read our blog post: Tableau – a pioneer of modern self-service business intelligence.

More info about the upcoming features on the Tableau coming soon page.

You can also read about our visual analytics services and contact us to hear more or to see a comprehensive end-to-end Tableau demo.

Thanks for reading!

Tero Honko, Senior Data Consultant
tero.honko@solita.fi
Phone +358 40 5878359

Tableau – a pioneer of modern self-service business intelligence

Tableau can rightly be called a pioneer of modern data visualisation and self-service BI. Founded in 2003, the company launched the first version of its visual analytics product back in 2004. The basic principles of the tool, the way it’s used to analyze data and create visualisations, have remained similar ever since. Tableau still stands out from other tools especially in the flexibility of building visualisations and interactions, as well as the versatility of out-of-the-box map visualisations and geospatial capabilities. In addition to visualisations, Tableau is a fully-fledged analytic solution – to understand and act on data.

This is the second post in the blog series about BI tools. The first post was about the evolution of business intelligence in the 21st century. This time we delve into one of the leading tools in the market. We will describe what differentiates Tableau from key competitors, what the platform consists of, what the licensing options are and much more. We will try to be as comprehensive as possible, but all the features can’t be considered or even mentioned. Describing a BI tool thoroughly in a blog post is extremely challenging. Contact us if you need a more detailed evaluation or want to see Tableau in action with real-life data contents.

Update: Read our blog posts about the new features introduced at the Tableau Conference 2021, and overview of the Tableau product roadmap based on TC22 and TC21 and Tableau goes Minority Report in TC23 – takes direction towards augmented reality, generative AI and headless BI.

To help people see and understand their data

This is what Tableau mentions as their mission: to help people see and understand their data. Tableau aims to be easy to use so everybody can utilize it and derive usable insights out of their data. Tableau was originally built based on data visualisation research done at Stanford University; how to optimally support people’s natural ability to think visually and to intuitively understand certain graphical presentations.

Tableau Desktop did a very good job in the era of Enterprise BI dinosaurs to make data analytics easier and even fun (read the previous blog post for reference about dinosaurs). The success and market penetration with the Tableau Desktop meant the platform needed to be expanded. Tableau Server, Online, Public, Mobile and Prep have been released since then. Nowadays the Tableau offering is a comprehensive analytical platform with a certain twist compared to competitors.

The Tableau twist

Quickly and easily to insights
In general it is very fast to get from source data to valuable insights with Tableau. Analysing data and creating visuals and dashboards is mostly very easy and smooth. There are out-of-the-box time hierarchies available, drag-and-drop analytical templates to use and a good amount of easy-to-create calculations (running totals, moving averages, share of total, rank etc.). The ease of use also extends to data preparation and modeling. Both of those can be done without deep technical knowledge and coding skills. Perhaps what I’m most grateful for in this area is how new features are published and old ones deprecated: in a way it just works. For example, when the new in-memory extract storage replaced the old technology in 2018, it was done with minimal effect and maintenance work for the users. The same thing happened in 2020 when a new semantic data model layer was introduced, and again, no laborious migrations from old to new; everything just worked.

Extraordinary creativity
Tableau was originally a tool for data visualisation and visual analytics, and for that it remains extremely strong. Tableau uniquely enables user creativity and ingenuity when analyzing data and developing content. What does this mean? In other tools you usually first select the desired outcome you are looking for (the visualisation type e.g. line, area, bar, pie, etc.) and then assign the fields to the roles the visualisation type supports (e.g. values, legend, axis, tooltip, etc.). If the visualisation doesn’t support something you would need (e.g. size or small-multiples) then there isn’t much you can do.

Tableau works very differently: you can drag and drop fields to the canvas and Tableau will visualize the data in a suitable way. Certain properties of a field can be changed on the fly: dimensions can be changed to measures, discrete fields converted to continuous, and vice versa. Almost any field can be assigned into any role, and different types of visualisations can be combined. This approach is more flexible than in any other tool I have used. However, this can seem complicated at first. Fortunately, Tableau has a Show Me menu to help you to create different visualisations and to understand how the tool works. Once you get the hang of it, you can do powerful visual analytics like never before.

A bunch of different Tableau dashboards and visualisations. All of these are available in Tableau Public.

Maps and spatial capabilities
As mentioned earlier, the different types of visualisations are very diverse and flexible in Tableau, but especially maps and spatial analytics are top notch. Here’s a short list of what makes Tableau’s spatial capabilities so great:

  • Tableau is able to read spatial data from many different data sources. Point, line and polygon geometries can be used directly from Snowflake, SQL Server, PostgreSQL and Oracle databases. Spatial data can also be ingested from different files, like GeoJSON, KML, TopoJSON, Esri Shapefile etc.
  • An unlimited number of layers can be defined to the same Tableau map. Different layers can display various kinds of data and geometries. And users can toggle layer visibility on/off.
  • Data on a single layer can be visualised in various ways: as points (symbols), lines, polygons (filled areas), heatmaps, pies, paths etc.
  • Tableau supports geocoding (transforming location related attributes to a location on a map). Attributes that can be geocoded are for example: country, state, city and postal code.
  • Tableau supports spatial joins and functions. These enable location based data joins and calculations for example to make lines between points, calculate the distance between points, recognize if lines intersect or if a point is inside a polygon etc.
  • WMS (Web Map Service) maps and Mapbox are supported as background maps.
  • There is no limit to the number of data points on the maps in Tableau. Many tools can have a limit of 3500 or 10000 points, but Tableau can visualize hundreds of thousands of points with good performance.
  • With map tools, the user can interact with the map in many ways, e.g., zoom in/out, measure distance, calculate areas, select points, toggle layer visibility, search locations, and more.
  • All of this mentioned above is available out-of-the-box, no additional components required.
Detailed city centre map with street map as a background, building layer containing dark grey polygons on the bottom and point layer on the top showing floor area (size) & heating fuel (color).
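As a rough idea of what a spatial distance function does under the hood, here is a plain-Python haversine sketch (a simplified illustration; Tableau exposes this kind of calculation through its own spatial functions, and the function name below is hypothetical):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Helsinki to Las Vegas, roughly
print(round(haversine_km(60.17, 24.94, 36.17, -115.14)), "km")
```

In Tableau itself the same question would be answered declaratively with the built-in spatial calculations rather than by writing the formula by hand; the sketch is only meant to show what kind of work the tool does for you.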

Interactions between user and visualisations
The third strength of Tableau is the ability for the user to interact with visualisations, and the ability for the developer to precisely define where and how these interactions take place. Interactions can be used, for example, to filter data, highlight data, show and hide layout objects, show tooltips, define values for parameters and set objects, drill up and down, and drill through to another dashboard or to an external url. Interactions especially enable non-technical business users who consume pre-made content to get more information and insights from a single dashboard without the need to create multiple dashboards or go full self-service mode.

Flexibility of infrastructure and governance
Tableau is exactly the same tool regardless of how and where you choose to deploy it (on-premise, public cloud or SaaS). You can use Windows or Linux servers (or containers) and Windows and Mac computers for the desktop. You can use different authentication options, user directories and data sources without any mandatory dependencies to any cloud vendor whatsoever.

The same flexibility is there when creating content. Data models can be created in exactly the same way and with the same functionalities whether in extract or live mode. You can also combine extract and live mode contents on the same dashboard. The same scripting language is used when preparing the data and building the visualisations, and it is quite a powerful, yet easy and straightforward language to use. The flexibility carries on when publishing the content to Server/Online. You can structure the contents into folders exactly as you like and apply security policies at the level of detail you need.

Active and passionate user community
The Tableau user community is more active and passionate compared to other corporate tool user communities. For example, Tableau Public has more than 3.7 million published visualisations from more than 1.5 million users. Anyone can browse and use these visualisations to learn about the data and how to use Tableau. The community supports and helps with issues and problems related to the tool, but I personally appreciate the work they do to spread data understanding and share best visualisation practices and examples.

Main functionalities & workflow

Tableau contains everything that a modern analytics platform can be expected to contain. There are no major deficiencies, but obviously there are some areas for improvements especially related to the newer features. Tableau can be used to master the whole visual analytics pipeline, from the data preparation to various ways of consumption, across multiple channels. This is how Tableau workflow usually goes.

Tableau platform core functionalities, components and related user roles.

Data Preparation
If you need data preparation capabilities, Tableau offers them within Tableau Prep. This tool can be used as a desktop client or directly within Tableau Server or Online. Tableau Prep is built around the same easy-to-use mentality as the other components in the platform. Creating data manipulation steps and the whole workflow is very visual, the process is easy to understand, and it’s easy to see what’s happening to the data along the way. Tableau Prep offers standard data wrangling capabilities to join, union, pivot, clean and aggregate data. You can also add new rows to the data and use custom R or Python scripts to calculate new insights. The resulting dataset can be pushed to a file, to a database or to a Tableau data extract. Ready-made data preparation workflows can be shared and reused, and the scheduling and execution can be monitored via the Prep Conductor add-on.

Data modeling
Most commonly, data modeling is done using the Tableau Desktop client; the exceptions are when you use Tableau Prep or an external tool with the Tableau API to create and refresh the data extracts. With Tableau Desktop you connect to the data sources, select the objects you want and define joins and relationships between the objects. Nowadays Tableau data models include two layers: a physical layer and a logical (semantic) layer. The separation of the two makes it possible to reuse the same Tableau data model for different purposes. The logical layer was introduced in version 2020.2 and is a crucial update to the data model.

While modeling the data you select whether to use a live connection or to extract the data into Tableau’s columnar in-memory data storage. Whichever you choose, you have exactly the same functionalities and capabilities in use, and you can change the connection type later on. You can also use incremental refresh so that only new rows are inserted into the data extract. The best practice is to verify and define each field’s data type, default formatting & aggregation, geographical role etc. directly when modeling the data, even though these can be altered later while doing visual analytics. Row-level security filters can also be added to the data model to define different data visibility for different groups. While building the data model you usually create the first visualisations in parallel, to better understand the data and to make sure it is what you are expecting. When the data model is ready you can publish it to Tableau Server/Online to enable reuse.
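Conceptually, a row-level security filter maps the logged-in user (or their group) to the subset of rows they may see; in Tableau this is typically a calculated filter built on user functions such as USERNAME() or ISMEMBEROF(). A minimal plain-Python sketch of the idea, with a hypothetical entitlements mapping:

```python
# Conceptual sketch (not Tableau calculation syntax): each user group
# is entitled to a set of countries, and the filter keeps only the
# rows that group may see. The mapping below is hypothetical data.
ENTITLEMENTS = {
    "emea_sales": {"Germany", "France", "Finland"},
    "us_sales": {"USA"},
}

def visible_rows(rows, user_group):
    """Return only the rows whose country the group is entitled to see."""
    allowed = ENTITLEMENTS.get(user_group, set())
    return [r for r in rows if r["country"] in allowed]

rows = [
    {"country": "Finland", "sales": 100},
    {"country": "USA", "sales": 250},
]
print(visible_rows(rows, "emea_sales"))  # only the Finland row
```

Because the filter lives in the data model, every dashboard built on top of it inherits the same visibility rules.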

Visual analytics
Then we get to the fun part: doing visual analytics. This and the following steps can be done either with Tableau Desktop or via Tableau Server/Online in the browser. There are many ways to work. You can drag and drop fields onto the canvas and let Tableau pick a proper visualisation type, or drop fields into exact roles and define the exact settings, filters and parameters you want.

When you get insights from the data and new questions arise, you just modify the visualisation to answer them as well. Perhaps create quick table calculations, or various other types of calculations, to get new insights. Sometimes it’s a good idea to try the Show Me menu to get new perspectives, or to use the Ask Data functionality to write out your questions and let Tableau build the vizzes. As previously mentioned, this is where Tableau truly shines. When the individual visualisations are ready you can start building a dashboard.
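A quick table calculation such as a running total is conceptually just a window computation over the query result, applied after aggregation. A minimal plain-Python equivalent of what the running-sum table calculation computes:

```python
def running_total(values):
    """Running-sum table calculation across a series of values."""
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

# Hypothetical monthly sales figures:
monthly_sales = [120, 90, 150]
print(running_total(monthly_sales))  # [120, 210, 360]
```

In Tableau you never write this loop yourself; you pick the calculation from a menu and choose the direction (“table across”, “table down”, etc.) it runs in.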

Dashboards
If you want, you can create a dashboard very quickly: just drag and drop the visualisations onto the canvas, enable visual filtering, show filter selections, legends and some descriptive headers, and you are ready. On the other hand, you can also plan and fine-tune the layout and interactions in great detail: create objects with conditional visibility controlled via show/hide buttons or selections in other visualisations, add multiple tabs and drill-throughs to other contents, etc.

Nowadays you can even add fully customisable objects via Tableau Extensions, for example new types of visualisations, predictive analytics, interactions, write-back, etc. If the dashboard will be consumed on different devices, you can define distinct layouts and contents tailored to, for example, tablets and phones. In addition to dashboards, users can also create stories with multiple steps/slides containing different visualisations and comments, a bit like PowerPoint presentations with interactive visuals.

Example screenshot of Tableau Dashboard Extensions offering.

Metrics (KPIs)
You can create many kinds of KPIs and metrics within a dashboard, but there is also a distinct Metrics feature in Tableau. Metrics objects can be created in Tableau Server/Online folders to show the most important figures while navigating the contents. Metrics are a nice, very easy way to gather key figures from different dashboards into a single place. And if there’s a date field available in the data, the metric can also contain a small trend graph.

Other ways to consume contents
There are still many ways to consume content in Tableau that I haven’t yet written about. Dashboard users can subscribe to the content, set alerts to get notifications when thresholds are exceeded, save filter & parameter combinations as bookmarks, export data, comment on and discuss the dashboards, etc. In addition to Tableau Server/Online, content can be consumed with the mobile apps (with an offline option), integrated into Slack or embedded into external services.

With the Ask Data functionality, Tableau data models can be queried using written questions. Someone might ask for “top 20 customers in Europe by sales in 2021”, and Tableau would show the answer as a graph. A few years ago I was very sceptical about this kind of feature, thinking it wouldn’t work. But after using it a couple of times during this year I think it is actually quite neat, although I still have my doubts for more complex use cases. Another nice automated-insights feature is Explain Data, which can show fairly basic information about the selected data point from a statistical perspective.

Administration and Governance
One crucial part of the workflow is governance and monitoring. Most of the governance definitions are created before the development work even starts. The administrator sets up the authentication and creates appropriate user groups, either manually or from the user directory. Administrators can mandate domain owners to control their own contents while still retaining visibility into all contents in the platform. Administrators have a variety of tools to monitor and govern the environment, down to a very detailed level if needed.

There are also a few add-on components available to enhance the use of Tableau Server/Online. The Tableau Data Management add-on contains Tableau Prep Conductor, to orchestrate and monitor Tableau Prep workflows, and Tableau Catalog, to view more details about the contents, data lineage and impact analysis. The Tableau Server Management add-on gives more power to manage the Tableau Server environment: performance, scalability, content migration, resource usage etc.

Several APIs are also available to control and use Tableau programmatically. These include ways to manage Tableau Server environments via code, connect to data, create and use Tableau data sources, use external analytical capabilities like R and Python, create and use dashboard extensions, and embed Tableau content into external services and mobile apps.
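For example, managing a Server/Online environment via code starts with a REST API sign-in call. A minimal sketch that builds the XML request body using only the Python standard library; the exact endpoint path and API version should be checked against the REST API documentation for your Tableau version:

```python
import xml.etree.ElementTree as ET

def signin_request(username, password, site_content_url=""):
    """Build the XML body for a Tableau REST API sign-in request."""
    root = ET.Element("tsRequest")
    creds = ET.SubElement(root, "credentials",
                          name=username, password=password)
    # An empty contentUrl targets the default site.
    ET.SubElement(creds, "site", contentUrl=site_content_url)
    return ET.tostring(root, encoding="unicode")

body = signin_request("analyst", "secret", "marketing")
# POST this body to https://<server>/api/<version>/auth/signin;
# the response contains a session token used in subsequent calls.
print(body)
```

In practice the official Tableau Server Client library for Python wraps these calls, so you rarely build the XML by hand.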

Room for improvements

Even though Tableau data models nowadays contain a semantic layer and are far more versatile than before, there is still something to improve. Better multi-fact support, the possibility for secondary relationships and refined incremental refresh would be nice, though those might sometimes complicate the models quite a lot. The good thing about the current state is that models are still easy to understand and use. A bigger data model related improvement would be the ability to reuse existing data models when creating new ones, a bit like what you can already do with data flows in Tableau Prep. This would really improve the ability to do end-to-end bimodal BI on the data model layer: the most important data models could be built centrally, and decentralised content development could then add its own data to its own models without duplicating the model and the data of the centralised one.

Some augmented analytics and autoML features have been released during this year, but they still feel very basic and a bit disconnected from the core platform. This capability relies partly on Salesforce Einstein Analytics and is not (at least yet) fully built into the Tableau platform. The current Explain Data feature can show basic details about a selected data point, but I would like it to automatically emphasise the most interesting data points and related insights (anomalies, trends etc.).

The history of being originally a desktop tool is still quite visible. Contents are somewhat workbook and visualisation oriented. This is not necessarily bad, because it can help to structure the contents in a logical way, but there are a few things to improve. I would really love to be able to create dashboard navigation and drill-throughs between contents in distinct workbooks more easily. Within the same workbook it’s very easy, but across different workbooks it gets a bit clunky.

Desktop tooling can create pressure for IT, or whoever needs to maintain, deliver and update the client software on a regular basis. Keeping up with major updates (four times a year) and possible minor updates can be a hassle. Tableau is moving towards a browser-based approach, but for now some of the functionalities are still only available via the Tableau Desktop client installed on users’ laptops.

Building visualisations and doing visual analytics is a somewhat manual process in Tableau. After all, it wouldn’t be visual analytics if the outcome just appeared, without the journey of seeing different viewpoints and learning insights along the way. The Ask Data and Explain Data features are one way of producing visualisations faster and more automatically, but I would also like to see more code-driven options to build and manage contents. This would make it possible to use the visual power of Tableau in a more DataOps-oriented way: to build visualisations and dashboards on the fly already in the data pipelines, and to deploy the contents automatically to different environments.

Then I have to mention the pricing, even though the importance of the licence price is commonly exaggerated over the other components affecting the total cost of ownership (TCO). What I like about Tableau pricing is that there are no hidden costs to be discovered later. With the default price you get the capabilities, and there rarely is a need to buy something more expensive later on; you just buy more licenses if you want to increase the number of users. And here lies my criticism. Per-user licensing normally makes sense when the number of users is rather small (something like 10–300 users). With Tableau Server you can switch to core-based licensing when the number of users grows or you want to enable guest access etc. But with Tableau Online there is no core- or node-based option; you have to stick with the user-based license model. Of course Tableau might offer discounts if you have a lot of users in Tableau Online, but that’s something I don’t know and can’t promise.

Greetings from Gartner and Forrester

Gartner has placed Tableau as a leader for 9 consecutive years in the Magic Quadrant for Analytics and Business Intelligence. In the latest report Gartner recognises the analytics user experience, the very strong community and customers’ fan-like attitude towards the product as core strengths of Tableau. Gartner also mentions the potential of the Salesforce product family to integrate Tableau more tightly into different solutions and to easily embed Tableau visualisations with the Tableau Viz Lightning web component. As cautions, Gartner mentions Tableau’s non-cloud-native history and install base, premium pricing, and possible integration challenges with Salesforce products.

Tableau 2021 position and path in the Gartner MQ for Analytics and Business intelligence. Check out the visualisation in Tableau Public.

In the Critical Capabilities for Analytics and Business Intelligence Platforms 2021 report, Gartner focuses more on the actual capabilities and functionalities. Gartner rates Tableau as excellent in data preparation, which is simple and visual to use and easy to publish, schedule and monitor; more complex tasks can be executed via R & Python scripts. Gartner also praises Tableau’s governance capabilities: promoting and certifying contents, controlling workflows and viewing data lineage to better understand data assets. Gartner says Tableau is the clear leader in data visualisation, but there are things to improve in augmented analytics, partly because of the lack of integration with Einstein Analytics. This has, however, improved since the publication of the report with the Einstein Discovery extension and other functionalities.

The Forrester Wave for Augmented BI Platforms Q3 2021 names Tableau (Salesforce in the report) as a leader. Forrester recognizes visual and geospatial analytics as core strengths. The Forrester report, being published later than the two Gartner reports, rates Tableau much better in augmented analytics. Forrester mentions the Einstein Discovery functionality and out-of-the-box ML models that significantly boost Tableau capabilities beyond descriptive and diagnostic analytics towards guided ML. Forrester sees room for improvement among business application connectors.

Infrastructure options

Tableau offers a wide variety of deployment options and doesn’t favour any cloud or infrastructure provider. Tableau Desktop is available for both Mac and Windows. It is used to connect to data in databases, services or files and to visualise that data in charts and dashboards. Tableau also offers a web authoring mode where no software installation is required.

In order to share visualisations with a wider audience, Tableau Server is used. Tableau Server is available as a server application and as a cloud service (Tableau Online). If you want to host your own server, you can run it on premises, in a private cloud, or in a public cloud such as AWS, Azure or GCP. Tableau Server can be installed on Windows or Linux, and on Linux it can also run inside a Docker container.

In its basic form, Tableau Server is installed on a single node. For more complex needs, the installation can be scaled out for specific scenarios such as high availability or high performance. Hosting your own server allows total control over its settings and customisations, but of course you then have the extra effort of maintaining and monitoring the environment and carrying the infrastructure costs.

Tableau Online is the software-as-a-service offering for those not hosting their own servers. The service is divided into pods located all over the world, and customers can select which pod should house their Tableau site. Tableau Online obviously doesn’t provide as much control over the environment, but it is much more straightforward to use and deploy. The portal in Tableau Server or Online can be accessed using all major browsers, and there are also mobile viewer apps for iOS and Android.

Licensing and publicly available pricing

The default way of licensing Tableau is a per-user subscription model. Additionally, there is a core-based licensing option for Tableau Server (but not for Tableau Online) and the possibility to license a specific embedding use case at a discounted price. Tableau licenses can be purchased from Tableau partners; Solita can help you find the optimal license combination, get the licenses, and everything else you might need.

Tableau licensing is divided by usage role into Creator, Explorer and Viewer. Capabilities depend on the role, and Creators are the most capable of the lot: they can connect to data sources, prepare and model data, create visualisations and publish both visualisations and data models to Server/Online. Explorers can do visual analytics and use existing data models and reports to build and extend visualisations and dashboards. Viewers can browse and interact with content. All roles can set up favourites, subscriptions and alerts to personalise their experience in the service.

All three roles are available for both Tableau Server and Tableau Online. Subscriptions are priced in USD per user per month, and license fees are billed yearly. You can use the license price calculator in Tableau Public to calculate the total price for certain role combinations (note: the calculator contains only publicly available pricing information): Data Viz tool license pricing

  • Tableau Online (Oct/2021, per user per month)
    • Creator: $70
    • Explorer: $42
    • Viewer: $15
  • Tableau Server (Oct/2021, per user per month)
    • Creator: $70
    • Explorer: $35
    • Viewer: $12
  • Add-on modules (custom pricing from Tableau)
    • Data Management
    • Server Management
    • Einstein Discovery

Server licenses are also offered as the Tableau Embedded Analytics license type, with a 25% reduction, for organisations that want to offer Tableau content as an analysis service to external parties.
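Using the list prices above, the total cost for a role combination is simple to compute. A small sketch; it assumes the 25% embedded-analytics reduction applies uniformly to the Server per-user prices, which is a simplification of the actual license terms:

```python
# Publicly listed per-user monthly prices (Oct/2021, USD); billed yearly.
PRICES = {
    "online": {"creator": 70, "explorer": 42, "viewer": 15},
    "server": {"creator": 70, "explorer": 35, "viewer": 12},
}

def yearly_cost(platform, creators=0, explorers=0, viewers=0,
                embedded_discount=False):
    """Yearly license fee for a given role combination."""
    p = PRICES[platform]
    monthly = (creators * p["creator"] + explorers * p["explorer"]
               + viewers * p["viewer"])
    if embedded_discount:  # simplified 25% embedded-analytics reduction
        monthly *= 0.75
    return monthly * 12

# Example: 2 Creators, 5 Explorers and 20 Viewers on Tableau Online.
print(yearly_cost("online", 2, 5, 20))  # 7800
```

For anything beyond a quick estimate, actual quotes from Tableau or a partner are the authoritative source.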

For students and academic institutions there is a possibility to get a free 1-year license and access to eLearning contents.

There’s also a free version called Tableau Public. Tableau Public offers Tableau’s visual analytics power with the possibility to save and share the results only via the Tableau Public service. It is used by visualisation enthusiasts all over the world and is an excellent source of creative ways to use Tableau. But be sure not to publish any non-public data to the Tableau Public service, since the contents can be found via URL even when not searchable or listed within your profile.

Sometimes you might also hear about a tool called Tableau CRM. Tableau CRM is actually rebranded Salesforce Einstein Analytics. It is not originally part of the Tableau platform, but Salesforce plans to tighten the integration between the two in the future.

How to test and start with Tableau

  • Tableau Desktop trial: 14-day trial to try the capabilities of Tableau Desktop.
    • Download and install the product from the Tableau site
    • Fill in your email when launching the tool for the first time
  • Tableau Online trial: Test the Tableau Online capabilities to share and analyse information.
    • Request the Tableau Online trial in the Tableau Online site
    • Activate the trial account with the link in your email
  • Tableau Public: To analyse and visualise primarily open and public data for free.
    • Create an account and download the app from Tableau Public site
    • Notice that you can also create visualisations directly in the Tableau Public service using the browser
  • Other relevant contents
  • Solita Tableau and visual analytics related offering
    • Tool evaluations and recommendations
    • License consulting and sales
    • Extensive training options
    • Analytics solution kickstart
    • Solution implementation and rollouts
    • Maintenance and support

Until next time

Thanks for reading and scrolling all the way down here. In the next post of the series we will take a look at what Microsoft and Power BI have to offer. If you have questions or any kind of consulting needs about Tableau, you can contact us:

Tero Honko, Senior Data Consultant, Finland
tero.honko@solita.fi
Phone +358 40 5878359

Aron Saläng, Visual Analytics Tech Lead, Sweden
aron.salang@solita.se
Phone +46 70 144 67 87

Business intelligence in the 21st century

It's been interesting to follow and live through the evolution of business intelligence and data visualisation tools over the last 20 years. Leading vendors have changed, a lot of acquisitions have taken place, cloud became the de facto deployment model, big data hype came and went, self-service became possible, and the data culture & processes are evolving – little by little.

We are starting a blog series to go through the BI and data visualisation market. We will uncover each leading vendor in detail, take a look at the key challengers and anticipate where the market is going in the future. In this first post, we are going to delve into the world of business intelligence tools in the 21st century, and review the market and product changes over time.

Occasionally, this blog series touches on our personal experiences and views of the tools. Still, the actual assessments have been made objectively and technology-agnostically – just like tool assessments are supposed to be. If you wish to explore the interactive visualisation based on the “Gartner Magic Quadrant for Analytics & BI” data, from which the attached figures have been taken, you can do so at Tableau Public: Gartner MQ for Analytics & BI visualisation

Current kings of the hill

For a long time now, the leaders in the data visualisation tool market have been Tableau, Microsoft, and Qlik. These vendors entered the Leader quadrant of Gartner’s Magic Quadrant in 2008 (Microsoft), 2011 (Qlik) and 2013 (Tableau), and they have held their positions ever since. Tableau and Qlik have remained quite stable within a small area, whereas Microsoft has bounced around the quadrant (possibly due to their transfer from the old SSRS/SSAS stack to Power BI).

Visualization about the Gartner MQ for Analytics and BI and the history paths of current market leaders.
“Gartner Magic Quadrant for Analytics & BI” 2021 and the paths of current market leaders.

 

These tools have gained a stable market position, and each of them has their own strengths and users. Various rivals are regularly knocking on the door in the hope of attending the party, but, for the moment, they have always come away disappointed and been forced to gain new momentum in other quadrants. Before going into more detail about these kings of the hill, let’s review how the current situation has come about in terms of vendors and tool evolution.

Acquisitions and Bitcoins

Previous kings of the hill, i.e., vendors in the Leaders quadrant, were IBM/Cognos, SAP/BusinessObjects, Oracle/Hyperion, SAS and MicroStrategy. During the first decade of the 21st century, especially in 2007, the BI reporting market was consolidating fast. The IT giants of that time acquired the long-term market leaders: Oracle announced its acquisition of Hyperion in March 2007, SAP announced its acquisition of BusinessObjects in October 2007, and IBM announced its acquisition of Cognos in November 2007. The acquired market leaders had previously been purchasing industry rivals and minor companies themselves (such as Crystal Decisions, Applix and Acta Technologies).

Based on Gartner’s Magic Quadrant, the leaders were still going strong about four years after these acquisitions. But then they started to slide – well, to be precise, SAP/BusinessObjects started its decline a bit earlier. Maybe the strong identification with the SAP family did not promote success. I cannot say whether the decline of the leaders was due more to the uncertainty caused by these acquisitions – the difficulty of integrating the organisations and the products – or to the fact that renewal is always hard for market leaders. Development stalls because companies don’t want to cannibalise their own market, and when customers abandon ship and start rooting for more innovative rivals, companies complicate their licensing models and push up prices. And this really gets the rest of the customers going!

Visualization of the downhill of prior market leaders in the Gartner MQ for Analytics and BI.
Prior market leaders positions in Gartner MQ over the years, based on Gartner Magic Quadrant for Analytics & BI data from 2006–2021.

 

MicroStrategy and SAS didn’t immerse themselves as much in business acquisitions, but still they shared the same fate with their rivals ruling the market at the turn of the 2010s. The offering stalled, at least in the area of data visualisation, and MicroStrategy is probably more famous today for its Bitcoins than its product offering.

OLAP-cubes

Let’s forget the vendors for a moment and look at product evolution. The first BI tools emerged at the end of the 1980s, but they started to flourish in the 1990s. Data warehouses were rare in those days, and most BI tools included features that allowed users to obtain data directly from operative systems and download it into the tool’s own data model. One popular data store was the OLAP cube, which was easy to use and view from different perspectives by filtering down to the most interesting slice of information.

The most popular presentations were crosstabs and various pixel-perfect listings, so the content was still not that visual. The users were mostly from finance departments, so for those end users this numeric presentation was surely just perfect. Some example products from the 1990s worth mentioning include Cognos PowerPlay Transformer, Crystal Reports and Oracle Discoverer. QlikView also has its roots in the ’90s, but let’s not go there yet.

OLAP-cube- and report-centred solutions built directly on top of operative systems were often quite fragmented. Different departments might have built their own solutions, in which each separate cube or report could have its own data model and its own data refresh tasks straining the source database. This made the solutions complex to maintain and caused unnecessary load on the data sources. Partially for these reasons, data warehouses increased in popularity and there was demand for more centralised reporting solutions.

From a novelty to a dinosaur in 10 years

In the early 21st century, comprehensive Enterprise BI suites started to emerge on the market. They enabled the creation of extensive solutions covering various departments and functions. The development work often required very specific competence and was mostly concentrated in a BI competence centre under the IT or finance department. In the competence centre, or as subcontractors, BI developers tried their best to understand the needs of the end users, created metamodels, built OLAP cubes and produced reports. More graphs and KPI indicators started to appear in the solutions; some even created dashboards containing the most essential data. In those times, graphic elements included speed-gauge charts, 3D effects, gradient colours, pie charts and other “fantastic” visual presentations. It’s not really surprising that users often wanted numeric data and these early graphs were not a hit.

New functionalities were added to these Enterprise BI tools as vendors acquired other companies and their products were integrated into existing systems. Existing components or functionalities were rarely discontinued and these newly integrated functionalities often seemed to be flimsy stick-and-bubble-gum contraptions. Over the years, Enterprise BI solutions became so fragmented and complicated that even experienced specialists struggled to make out what each component or “studio” was for (or maybe it was just me who didn’t always understand this).

Visual self-service

The clumsiness and difficulty of a centralised BI organisation and Enterprise tools accelerated the agile and easy-to-use self-service BI and data visualisation. At the turn of the 2010s, Tableau – established almost ten years earlier – started to gain a reputation as a new kind of visual analytics tool that could be used for data analysis even by people without much technical knowledge. Tableau wasn’t marketed to IT departments but directly to business operations. It didn’t try to replace existing Enterprise BI tools in companies but positioned itself alongside them directly in the business units, which now had the chance to create their own reporting content either without or partially with a data warehouse.

Gradually, other similar tools started to appear on the market: Microsoft Power BI, Qlik Sense, SAP Lumira, Oracle Data Visualisation Desktop etc. Also enterprise BI vendors started to include more features directed at business users in their solutions. In an evaluation of self-service BI tools I did a few years ago, already 13 different tools were included, so there were plenty of tools available at the time. However, when the tools were examined in detail, it was clear that some of them had resorted to shortcuts or had taken the easy way out. Most of these tools haven’t become hugely popular, and some might even be discontinued by now.

Dashboards from a self-service data visualization tool evaluation.
A glimpse to the Self-service BI tools evaluation a few years back.

New rivals

In the early 2010s, brand new start-ups were aiming to enter the data visualisation market with slightly different approaches. The big data hype brought along a bunch of Hadoop-based platforms, such as Platfora, Datameer and Zoomdata. Another trend was SaaS (Software as a Service) type reporting and visualisation services offered only in the cloud. These services included Clearstory Data, GoodData, Chartio, Domo, and Bime. The third trend was AI- and search-based solutions in which the user could analyse and retrieve data in a very automated manner, a bit like using a Google search. Some examples include Beyondcore and ThoughtSpot. Some new tools were very heavily relying on the performance of cloud databases, and they didn’t offer the possibility to extract and store data within the tool. A lighter version of this approach is Periscope Data, while a more versatile version is Looker.

Guess what has happened to most of these new rivals? Around 70% of the tools mentioned above have already been acquired by another company. So again, consolidation lives strong in the market. The biggest business acquisitions in the industry in recent years have been Salesforce’s acquisition of Tableau ($15.7B) and Google’s acquisition of Looker ($2.6B). Both of these acquisitions were announced in June 2019.

A union between decentralised and centralised

Perhaps the biggest problem of self-service tools has been the limited possibilities to control and monitor the environment and the published content in a centralised manner. On several occasions, I’ve seen how a self-service environment has been filled with hundreds of data sets and thousands of reports and no one has had a clear visibility of which content is relevant and which is not. As governance is not enforced in the tools, they have to be created and implemented separately for each organisation. Luckily, the self-service BI tools of today are already offering better features to centrally control and monitor the environment and contents.

Another important aspect to consider when self-service tools and centrally controlled solutions are approaching each other is bimodal BI. This means that both centrally controlled content (often predefined and stable) and more agile self-service content (often more exploratory) can be flexibly developed and utilised in parallel. Current BI tools mostly support both of these modes but there are still gaps in how different types of contents can be infused together. A bigger challenge, however, is how to change the data culture, processes and governance practicalities to make the bimodal way of working easier and more flexible.

The death of data warehouses and dashboards 

In the past ten years, it has been repeatedly predicted that data warehouses are dying. A ton of QlikView solutions, based on the tool’s strong internal data storage, have been implemented without a data warehouse, and on a smaller scale this may well be justified. Virtualisation, Hadoop, data lakes and the like have been killing the data warehouse in turns, yet it is still going strong; that talk is more marketing hype than reality. It is true that building data warehouses has changed irrevocably: the ETL tools that led the market 10 to 15 years ago, and the manual, slow way of building data warehouses, are gone. But there have never been as many ways to implement and use a data warehouse as today. So data warehouses are alive and kicking. Don’t get me wrong, though – they are not and never will be the solution for everything.

Some people are predicting a similar fate for dashboards. The most provocative example might be the ad by ThoughtSpot proclaiming “Dashboards are dead”. Machine learning and AI based visualisation and data search solutions predict hard times for dashboards and traditional BI, and data science platforms have been implying the same. Most of this is purely a marketing gimmick. Of course the tools themselves, and our ways of using them, are constantly changing and developing. One direction for development is certainly machine learning and NLP (natural language processing), and the convergence of different kinds of tools.

It will be interesting to see how the current market leaders will act as new functionalities are developed and spread across tools. Will companies discontinue existing functionalities or parts of their tools when replacements are launched? Or will existing tools again turn into dinosaurs, left to be trampled on by new rivals? Or will the giant vendors integrate their other offerings so tightly with their BI tools that they won't be viable options in environments already running a competitor's tech stack?

Thanks and stay tuned

In the following posts of this series, each of the key market-leading tools will be covered one by one. A bit later we'll also review some smaller rivals in detail. Leave us a comment or send an email if you want to read about a certain tool or aspect. We'll also examine where the Business Intelligence and data analysis tool market is heading and what we can expect in the future. A preliminary schedule for the blog series is as follows:

If you are interested in data visualisation solutions or tools, please feel free to contact tero.honko@solita.fi. And finally a big thank you for reading the post!

Tableau has removed minimum purchase requirement from their license policies

Tableau has removed the minimum purchase requirement from its licensing. The change enables Tableau to be deployed at very low cost and with exactly the number of users each organisation needs.

In February 2021, Tableau announced that it would remove minimum user count restrictions from its licensing. Earlier, for example, the Viewer license had a minimum purchase volume of 100 users. Since the change, organisations can deploy Tableau at very low cost, with exactly the number of users they need.

The Tableau Creator license is intended for individuals who prepare data for their own or others' use and publish content. With this license, the user can take advantage of all of Tableau's capabilities, from data preparation and analysis to visualisation and publishing.

With the Tableau Explorer license, the user can create visualisations in a browser based on ready-made data models.

With the Tableau Viewer license, you can view and use published visualisations and dashboards interactively in a variety of ways based on given permissions.

At Solita, we see Tableau as a visualisation platform that gives our customers the best visibility into their data. We are a Gold-level Tableau partner, and through Solita you get licenses, commissioning, training, design and implementation work at a scale that suits your needs!

We will be happy to tell you more about Tableau and together we can build a solution that is the most suitable size for your organisation!

Contact:

Suvi Korhonen, Tableau Partnership Manager in Solita Finland /
Data Consultant
suvi.korhonen@solita.fi
+358503096268

Tero Honko, Data Consultant
tero.honko@solita.fi
Phone +358405878359

Jenni Linna, Data Consultant / People Lead
jenni.linna@solita.fi
Phone +358440601244