Frequently Asked Questions

Datometry Hyper-Q is a SaaS platform that lets applications originally written for a specific database run natively on a cloud data warehouse. Hyper-Q enables enterprises to replatform to the public cloud without a highly time-consuming, costly, and risk-laden database migration.

Datometry Hyper-Q translates and emulates SQL statements and translates data in real time between the application and the cloud data warehouse. Specifically, Hyper-Q rewrites each SQL statement to take advantage of the intricacies of the new destination system. The application, and with it the long-standing investment in its development, is preserved, and disruption to the business is averted.
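To make the translation step concrete, here is a minimal sketch in Python of the kind of dialect rewriting involved. The rules shown are hypothetical illustrations; Hyper-Q's actual translation performs full semantic analysis rather than pattern substitution.

```python
# Minimal sketch, not Datometry's implementation: Hyper-Q performs full
# semantic translation in real time; this toy rewriter only shows the kind
# of dialect gaps that get bridged. The rules below are hypothetical
# examples of legacy-dialect idioms.
import re

def rewrite_for_destination(sql: str) -> str:
    """Rewrite a few legacy-dialect idioms into ANSI-style SQL."""
    # Expand the abbreviated 'SEL' keyword to 'SELECT'.
    sql = re.sub(r"^\s*SEL\b", "SELECT", sql, flags=re.IGNORECASE)
    # Replace an infix 'a MOD b' operator with the ANSI '%' operator.
    sql = re.sub(r"\bMOD\b", "%", sql, flags=re.IGNORECASE)
    return sql

print(rewrite_for_destination("SEL order_id, amount MOD 100 FROM orders"))
# -> SELECT order_id, amount % 100 FROM orders
```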

In industry benchmarks such as TPC-H and TPC-DS, Hyper-Q was found to add less than 2% overhead. Hyper-Q does not process any data itself but delegates all compute- and data-intensive tasks to the destination data warehouse. This results in minimal overhead while optimally leveraging the highly scalable architecture of the destination data warehouse.

Datometry Hyper-Q supports all application types. Because of its position in the IT stack, Hyper-Q “sees” only SQL and management statements but does not need to know about the semantics of the application.

Datometry Hyper-Q supports all SQL statements commonly used in workloads on all systems Datometry currently supports as source systems.

Even for large-scale installations, customers can expect the replatforming to be complete within 6 to 9 months, including functional, performance, and user-acceptance test phases. Actual project durations may vary based on factors determined during the assessment phase of the project.

Within just a few weeks, Datometry will determine the scope and effort of a pilot based on the workloads currently processed on the source system. In general, Datometry pilots deliver conclusive results for an entire business function, including performance and scalability analysis, within 8 to 10 weeks.

Workload latency and performance are primarily determined by the destination data warehouse. This gives customers the flexibility to achieve performance and scale using the controls of the destination data warehouse.

To use Hyper-Q, all existing applications need to be repointed to it. This can be done either directly or through changes to the DNS configuration. Hyper-Q provides out-of-the-box coverage of 99.5% or more on average; that is, workloads are almost completely functional from the outset.
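As a hypothetical illustration of repointing, the sketch below changes only the host an application connects to; driver names, hostnames, and ports are made up for the example.

```python
# Hypothetical illustration of repointing an application: the application
# keeps its driver, credentials, and SQL; only the endpoint changes (either
# in the connection string or via a DNS alias). All names below are made up.
LEGACY_HOST = "dw.legacy.example.com"       # original data warehouse endpoint
HYPERQ_HOST = "hyperq.gateway.example.com"  # Hyper-Q endpoint (a DNS CNAME can
                                            # also map the legacy name here)

def connection_string(host: str, database: str = "analytics") -> str:
    # Everything except the host stays exactly as it was.
    return f"DRIVER={{Legacy DW Driver}};HOST={host};DATABASE={database};PORT=1025"

print(connection_string(LEGACY_HOST))   # before repointing
print(connection_string(HYPERQ_HOST))   # after repointing
```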

Datometry provides a comprehensive error logging and handling mechanism. In the rare case that an unsupported command or statement is encountered, the logs can be used directly to initiate a support ticket.

Hyper-Q integrates with standard IT security infrastructure including Active Directory and OAuth.

Datometry uses the RBAC mechanism of the target database to control which objects are accessible to users. This avoids having to maintain an external layer of permissions and configuration.
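As a hedged illustration, the snippet below shows the kind of standard GRANT statements that would be issued on the destination warehouse itself; role, user, and object names are invented, and exact syntax varies by destination system.

```python
# Hypothetical example: access control stays in the destination warehouse's
# own RBAC (GRANT/REVOKE), so no separate permission layer is maintained.
# Names are invented and exact syntax varies by destination system.
grants = [
    "CREATE ROLE analysts;",
    "GRANT SELECT ON SCHEMA sales TO ROLE analysts;",
    "GRANT ROLE analysts TO USER jane_doe;",
]
for stmt in grants:
    print(stmt)  # executed on the destination data warehouse, not in Hyper-Q
```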

Yes, Hyper-Q uses standard encryption mechanisms.

All query traffic passing through Datometry is automatically logged and traced to allow for auditing and managing workloads. The Hyper-Q logs can be exported to an external log aggregation system. This is a powerful out-of-the-box value-add for customers.
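As a conceptual sketch only, and assuming a hypothetical log location, record format, and ingest endpoint (none of which are the documented export interface), forwarding log entries to an aggregation system could look like this:

```python
# Conceptual sketch only: forward query-log entries to an external log
# aggregation system. The log path, record format, and HTTP endpoint are
# assumptions made for illustration, not the documented export mechanism.
import json
import urllib.request

LOG_PATH = "/var/log/hyperq/queries.log"        # assumed log location
AGGREGATOR = "https://logs.example.com/ingest"  # assumed ingest endpoint

def ship_logs(path: str = LOG_PATH, endpoint: str = AGGREGATOR) -> None:
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = {"source": "hyper-q", "entry": line.rstrip("\n")}
            request = urllib.request.Request(
                endpoint,
                data=json.dumps(record).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(request)  # one POST per entry, kept simple

if __name__ == "__main__":
    ship_logs()
```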

Some features, such as stored procedures, macros, updatable views, and recursive queries, may not yet be natively supported on the destination cloud data warehouse. Hyper-Q provides full emulation of these features using the constructs available on the cloud data warehouse. Hyper-Q achieves a completeness of 99.5% or higher. This is in stark contrast to static code conversion tools, which are typically limited to 60-70% completeness.

In addition to translating individual statements, Hyper-Q controls the translation of complex features that require more than a single-statement rewrite. To support complex workloads, emulation is indispensable.
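For illustration only, and not Datometry's actual rewrite rules, the sketch below shows how an UPDATE against a view might be emulated on a destination without updatable views; all object and column names are hypothetical.

```python
# Conceptual sketch (not the actual rewrite rules): emulating an UPDATE
# against a view on a destination that lacks updatable views. Object and
# column names are hypothetical.
statement_as_written = """
UPDATE v_active_customers          -- a view over the customers base table
SET credit_limit = 5000
WHERE cust_id = 42;
"""

# An emulation layer can target the base table instead and fold the view's
# filter predicate into the WHERE clause so the semantics are preserved.
statement_as_executed = """
UPDATE customers
SET credit_limit = 5000
WHERE cust_id = 42
  AND status = 'ACTIVE';           -- predicate inherited from the view
"""

print(statement_as_executed)
```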

Datometry strives to ensure its customers are successful by continuously expanding the already extensive coverage of the supported source systems. Because of the volume and complexity of the supported statements, Datometry provides documentation of the features and variants that are not currently supported.

Datometry qInsight provides a unique and detailed breakdown of all interactions of analytical applications with the data warehouse. During the analysis, qInsight parses queries to extract and categorize SQL and database features. The insights provided in the report paint a complete picture of the effort and risk associated with replatforming.

For its analysis, qInsight needs offline access to schema and metadata information to perform semantically correct translations. The extraction of the workload and schema information is designed with ease of use in mind. The analysis is performed offline by the qInsight SaaS, which ingests the query and schema logs.
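As a rough conceptual sketch, not qInsight's actual parser, the snippet below categorizes a handful of queries by the SQL features they use; the log format and feature list are assumptions made for illustration.

```python
# Conceptual sketch only: count SQL-feature usage across a query log, the
# kind of categorization a workload analysis produces. The feature list and
# sample log are assumptions; the real parser is far more complete.
import re
from collections import Counter

FEATURES = {
    "recursive_query": r"\bWITH\s+RECURSIVE\b",
    "stored_procedure_call": r"\bCALL\b",
    "window_function": r"\bOVER\s*\(",
    "merge_statement": r"\bMERGE\s+INTO\b",
}

def categorize(queries):
    counts = Counter()
    for q in queries:
        for name, pattern in FEATURES.items():
            if re.search(pattern, q, flags=re.IGNORECASE):
                counts[name] += 1
    return counts

sample_log = [
    "WITH RECURSIVE org(id, parent) AS (...) SELECT * FROM org",
    "SELECT cust_id, RANK() OVER (ORDER BY spend DESC) FROM sales",
    "CALL refresh_daily_aggregates()",
]
print(categorize(sample_log))
```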

Datometry qInsight analyzes application workloads as recorded and logged in the source database query logs. All workloads are included regardless of the application that submitted the requests: ad-hoc and analytical queries, query scripts executed by custom applications, and requests from third-party BI and ETL solutions.

Datometry qInsight provides unprecedented visibility into all aspects of existing application workloads, with detailed information on the features used and the contexts in which they are used. This analysis includes information on out-of-the-box coverage of applications, query insights, recommendations for performance tuning and optimization, and a list of database objects referenced by the workload. Using the intelligence from the qInsight analysis, customers can create database migration and implementation plans in days instead of months.

Datometry qShift transforms the original source schema into a SmartSchema™ on the destination system. SmartSchema™ can represent additional data and object types that do not natively exist on the destination system and optimize for performance and completeness.
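To make this concrete, here is a hypothetical sketch of the kind of type mapping involved when projecting a source schema onto a destination; the type names, mappings, and DDL are illustrative assumptions, not qShift's actual output.

```python
# Hypothetical sketch (not qShift's actual output): map source column types
# to destination-native types when generating destination DDL. The mappings,
# table, and column names are illustrative assumptions.
TYPE_MAP = {
    "BYTEINT": "SMALLINT",           # narrowest matching destination integer
    "PERIOD(DATE)": "VARCHAR(32)",   # no native period type: emulate via encoding
    "NUMBER(18,4)": "NUMERIC(18,4)",
}

source_columns = [
    ("order_id", "INTEGER"),
    ("amount", "NUMBER(18,4)"),
    ("priority_flag", "BYTEINT"),
    ("validity", "PERIOD(DATE)"),
]

destination_ddl = "CREATE TABLE orders (\n" + ",\n".join(
    f"  {name} {TYPE_MAP.get(col_type, col_type)}" for name, col_type in source_columns
) + "\n);"
print(destination_ddl)
```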

Currently, Datometry qShift supports Azure Synapse, Amazon Redshift, and Google BigQuery.

Datometry qShift takes best practices for the destination data warehouse into account and combines them with Datometry's unique knowledge and experience in analyzing production workloads. Datometry constantly optimizes and improves the output of qShift in close collaboration with our cloud data warehouse partners.

Yes, new applications can be developed directly on the destination data warehouse to take full advantage of the data having become cloud native. Datometry offers enterprises the best of both worlds: move the existing business and accelerate new revenue streams immediately.

Don’t rewrite. Virtualize.