Datometry’s offering is essentially a database emulator. The company is focused primarily on column-oriented relational databases, or OLAP systems, such as Teradata’s well-known offering. There’s no reason its database emulator couldn’t be used for OLTP systems too, but data warehouse migrations tend to be the most expensive and painful, so the company is starting there.
Here’s how Waas describes the product:
“We intercept the communication from the application, unpack the requests, take out the SQL, and then do what effectively is sort of like what the upper half of any database does, meaning build a complete algebraic representation for the incoming request, optimize that, and then synthesize what the optimized SQL means for that destination,” he said.
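Datometry hasn’t published its internals, but the four stages Waas names (intercept, algebraize, optimize, synthesize) can be sketched in miniature. The toy Python below handles only one trivial SELECT shape, and every name in it is hypothetical; it is meant to make the stages concrete, not to reflect Hyper-Q’s implementation.
```python
from dataclasses import dataclass

@dataclass
class Scan:               # algebraic node: base-table access
    table: str

@dataclass
class Project:            # algebraic node: column projection
    columns: list
    child: Scan

def algebraize(sql: str) -> Project:
    """Build an algebraic representation of 'SELECT <cols> FROM <table>'."""
    head, table = sql.rstrip(";").split(" FROM ")
    columns = [c.strip() for c in head.removeprefix("SELECT ").split(",")]
    return Project(columns, Scan(table.strip()))

def optimize(plan: Project) -> Project:
    """Stand-in for real rewrites such as predicate pushdown or pruning."""
    plan.columns = list(dict.fromkeys(plan.columns))  # drop duplicate columns
    return plan

def synthesize(plan: Project, dialect: str) -> str:
    """Emit SQL for the destination dialect (quoting rules differ, etc.)."""
    quote = "`" if dialect == "bigquery" else '"'
    return f"SELECT {', '.join(plan.columns)} FROM {quote}{plan.child.table}{quote}"

incoming = "SELECT id, name, id FROM sales;"
print(synthesize(optimize(algebraize(incoming)), "bigquery"))
# -> SELECT id, name FROM `sales`
```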

Once the Datometry software has discovered the defining characteristics of the source database, a replacement solution, consisting of the real-time workload translation layer, can be deployed in the field to support the customer’s new database system. Running in the client’s virtual private cloud (VPC), the Datometry solution sits between the requesting tool, such as a Tableau or Looker BI client, and the new data warehouse that the customer chose, which might be Amazon Web Services‘ Amazon Redshift, Google Cloud BigQuery, or Microsoft Azure Synapse Analytics.
The key advantage of this approach, Waas says, is that none of the analytics applications realize that they’re not talking to a Teradata data warehouse anymore. As the Tableau or Looker BI client fires off SQL queries, or as the Informatica or Talend ETL tool loads source data into the warehouse, the Datometry emulator translates the requests and tweaks them as necessary to account for the differences between the old Teradata and the new Redshift/Synapse/BigQuery system.
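To make those dialect differences concrete, consider two well-documented Teradata constructs: the SEL shorthand for SELECT, and the ADD_MONTHS function, which BigQuery expresses as DATE_ADD with a MONTH interval. The rule table below is a hypothetical toy; a production translator like Hyper-Q works on a parsed algebraic tree and a far larger rule set, not regular expressions.
```python
import re

# Illustrative Teradata-to-BigQuery rewrites (not Datometry's actual rules).
REWRITES = [
    # Teradata allows 'SEL' as shorthand for 'SELECT'.
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    # Teradata ADD_MONTHS(d, n) -> BigQuery DATE_ADD(d, INTERVAL n MONTH).
    (re.compile(r"ADD_MONTHS\(\s*([^,]+),\s*(\d+)\s*\)", re.IGNORECASE),
     r"DATE_ADD(\1, INTERVAL \2 MONTH)"),
]

def to_bigquery(teradata_sql: str) -> str:
    for pattern, replacement in REWRITES:
        teradata_sql = pattern.sub(replacement, teradata_sql)
    return teradata_sql

print(to_bigquery("SEL order_id, ADD_MONTHS(order_date, 3) FROM orders"))
# -> SELECT order_id, DATE_ADD(order_date, INTERVAL 3 MONTH) FROM orders
```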
Waas says Datometry has done its homework in developing its approach to account for the unique design and peculiarities of Teradata and Oracle systems, which are extremely complex analytical machines with many moving parts. He said that, out of the box, Hyper-Q can replicate 99.6% of Teradata’s capabilities. The one caveat is that Datometry to date hasn’t developed support for XML. (That sound you hear is MarkLogic executives breathing a sigh of relief.)
“Teradata has great things: macros, stored procedures. You name it, we do all of that,” Waas says. “Even if your new destination database doesn’t have stored procedures, we give you stored procedures, because a stored procedure is effectively a set of, or string of, SQL statements connected with control flow. And so we actually interpret the control flow and execute the SQL statements, let’s say against BigQuery or Synapse.
“So you get full fidelity of the stored procedure, with all the goodness of error handling and go-to statements and you name it,” he continued. “But it’s not executing within the database. The control flow is executed in Hyper-Q, our product, but all of the heavy lifting is done inside the database.”
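A rough sketch of that division of labor, assuming a made-up execute_on_target driver call and a toy step format (neither reflects Hyper-Q’s actual interfaces): the emulator walks the procedure’s control flow locally, including go-to and error-handling branches, while only the SQL statements travel to the destination warehouse.
```python
def execute_on_target(sql: str):
    # Stand-in for a real BigQuery/Synapse driver call; the heavy
    # lifting (the query itself) runs in the destination database.
    print(f"-- sent to destination warehouse: {sql}")

# A stored procedure reduced to labeled steps: SQL plus control flow.
procedure = [
    ("run",  "DELETE FROM staging WHERE load_date < CURRENT_DATE"),
    ("run",  "INSERT INTO staging SELECT * FROM landing"),
    ("if_error_goto", "cleanup"),   # emulated error handling
    ("goto", "done"),
    ("label", "cleanup"),
    ("run",  "INSERT INTO audit_log VALUES ('load failed')"),
    ("label", "done"),
]

def run_procedure(steps):
    # Control flow is interpreted here, outside the database.
    labels = {arg: i for i, (op, arg) in enumerate(steps) if op == "label"}
    i, failed = 0, False
    while i < len(steps):
        op, arg = steps[i]
        if op == "run":
            try:
                execute_on_target(arg)
            except Exception:
                failed = True
        elif op == "goto":
            i = labels[arg]; continue
        elif op == "if_error_goto" and failed:
            i = labels[arg]; continue
        i += 1

run_procedure(procedure)
```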