Tableau Software has rolled out a new data engine that can handle much larger datasets, embedded in its data visualization and analytics software.
Eighteen months after acquiring technology originally developed at the Technical University of Munich, Tableau Software has embedded a new data engine capable of handling much larger data sets in its data visualization and analytics software.
Dan Jewett, vice president of product management for Tableau, says the Hyper data engine now being used in version 10.5 of the company’s namesake visualization software significantly improves overall performance. Because that data engine runs in memory, it’s now possible to extract data from an external source three times faster while improving overall query speed by a factor of five, says Jewett.
See also: Tableau acquires NLP startup
Hyper achieves those gains by adding a columnar data store running in memory, which enables Tableau to take advantage of multicore processors to execute more tasks in parallel, says Jewett.
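To illustrate the general idea (this is a minimal Python sketch of columnar storage with parallel aggregation, not Tableau's or Hyper's actual implementation; all names and data here are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical row-oriented data: each tuple is (region, sales).
rows = [("east", 10), ("west", 20), ("east", 30), ("west", 40)]

# Columnar layout: one array per field. A query that touches only
# "sales" scans contiguous values instead of skipping across rows.
regions = [r[0] for r in rows]
sales = [r[1] for r in rows]

def partial_sum(chunk):
    # Each worker scans only its slice of the single column it needs.
    return sum(chunk)

# Split the "sales" column into chunks and aggregate them in parallel,
# mimicking how independent column segments can be handed to separate cores.
chunks = [sales[i:i + 2] for i in range(0, len(sales), 2)]
with ThreadPoolExecutor() as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # 100
```

The point of the sketch is the layout change: because each column lives in its own contiguous array, per-column scans are cache-friendly, and independent segments of a column can be aggregated concurrently and then combined.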
Jewett says Tableau is not positioning the data engine it has added as an alternative to traditional databases. Rather, Hyper is intended to provide a mechanism that enables end users to query massive data sets in near real-time, says Jewett.
“We’re not going to market Hyper as a generic database,” says Jewett.
In addition to adding Hyper, Tableau has announced that its software can now be deployed on Linux servers for the first time. That should especially appeal to IT organizations looking to deploy data visualization and analytics on a public cloud, says Jewett.
Finally, Tableau 10.5 also adds support for nested projects within the application to simplify workflows, as well as a Viz in Tooltip capability that surfaces contextual visualizations to reveal additional details about a specific dataset. That capability can be invoked without writing any additional code.
Tableau Knows Expectations are High
Jewett notes that digital business transformation is creating an expectation among end users that data can be interrogated in near real time to make better business decisions faster. End users are no longer willing to wait until the end of the quarter for all the relevant data to be rolled up, says Jewett. Analytics software that runs in memory enables end users to gain insights that empower them to effect meaningful change in business outcomes before the quarter ends, adds Jewett.
It’s not clear to what degree digital business transformation initiatives will drive adoption of data visualization and analytics software that runs in memory. But the fact remains that organizations that can’t react to changing business conditions in real time are going to be at a distinct disadvantage.
That requirement is also changing the role IT plays. Instead of receiving a regular cadence of canned reports, business users want to be able to interrogate data directly, without requiring any intervention from the IT department. That capability not only results in more informed business decisions, it also frees up IT resources for other projects.
The rate at which advanced analytics applications are being put into the hands of business users varies widely. But at this juncture, it’s more a question of when most business users will rely on advanced analytics applications to serve their own requirements than if.