Data Analytics Platform Migration

GUEST BLOG POST – Claudia Imhoff, Ph.D.
A thought leader, visionary, and practitioner, Claudia Imhoff, Ph.D., is an internationally recognized expert on analytics, business intelligence, and the architectures that support these initiatives. Dr. Imhoff has co-authored five books on these subjects and has written more than 150 articles for technical and business magazines.
Migrating to a New Analytics Platform? Here are Some Things to Think About
Many enterprises are considering a move to a new analytics platform, particularly a cloud-based one. Why? There are many reasons: reduced IT costs, reduced data storage costs, improved performance from newer technologies, and more. But migrating to a new platform is more than just forklifting your legacy data warehouse or data lake into the new environment.
Doing that is a big mistake and a missed opportunity. Aging analytics environments come with problems such as workarounds created to mask defects, inefficient code, and projects that valued expediency over good design. Migration is a great time to blow the dust off the old designs, fix nagging problems, improve the overall efficiency of data management processes, remove unused or forgotten data and analytical processes, rationalize the tools and technologies used for analysis, and tighten governance procedures.
ETL
One area with great potential for improvement is data transformation, or ETL. It is this area on which the remainder of this blog post will focus.
With this in mind, let’s discuss the technology behind data transformation. ETL (or ELT) has been around for decades now, and yes, you still need mature transformation technology. But for a migration initiative, you need more than just good technology: you need technology created explicitly to ease the migration effort. Ask yourself the following questions before embarking on your migration:
- Is your data transformation technology configured to work specifically with whatever cloud platform or platforms you have chosen? Often, enterprises do not settle on a single cloud vendor or a single instance. Make sure your choice of data transformation technology works with and across all the major cloud platforms.
- Does it use the latest cloud platform functionalities and capabilities? Cloud platforms have their own ways of loading and unloading data, as well as other features that must work with the data transformation technology.
- Does it have the proper connectivity for all major sources? Aging analytics environments often have multiple “satellite” sets of analyses occurring outside of the main data warehouse. Migration activities could be used to consolidate some of these disparate environments back into the “mother ship”. These connectors are also used to profile and detect quality problems in the aging environment, as well as browse and discover redundant, unused, or undocumented sets of data.
- What about modern standards for modeling and data transformation procedures? Now would be a good time to enforce standards across the new environment through your data transformation (and other) technologies.
- Does it integrate easily with the other technologies (design, quality, catalog, analysis, etc.) used in an analytics environment? Just enforcing standards from the data model stage through data transformation on to the creation of analytical assets would be a great improvement in many enterprises.
- Does the data management technology have pre-built templates and configurations for different data model types (3NF, Star Schemas, or Data Vault) for all major platforms? It is always faster and better to start from a template than to have to create everything from scratch. Check to make sure your data transformation technology contains patterns as well for specific data modeling styles. Not only are templates and patterns good for standardization but they are terrific productivity tools.
- Can repetitive ETL processes (e.g., for dates, times, and codes) be reused to relieve the IT staff of some of the tedious programming? These are also great productivity and standardization functions.
- Can you automate the data transformation processes to ensure the accuracy, timeliness, and currency of the metadata behind them? Automation is the key to guaranteeing a successful migration. It is your certification of “goodness” in the final analytical environment.
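To make the “repetitive ETL processes” point above concrete: a date dimension is a classic example of a table that should be generated rather than hand-coded. Here is a minimal Python sketch of the idea; the function and column names are illustrative, not any vendor’s API.

```python
from datetime import date, timedelta

def build_date_dimension(start: date, end: date) -> list[dict]:
    """Generate one row per calendar day -- the kind of repetitive
    ETL task that templates or automation can handle instead of
    hand-written, per-project code."""
    rows = []
    current = start
    while current <= end:
        rows.append({
            "date_key": int(current.strftime("%Y%m%d")),  # surrogate key, e.g. 20240101
            "full_date": current.isoformat(),
            "year": current.year,
            "quarter": (current.month - 1) // 3 + 1,
            "month": current.month,
            "day_of_week": current.isoweekday(),          # 1 = Monday
            "is_weekend": current.isoweekday() >= 6,
        })
        current += timedelta(days=1)
    return rows

# One year of calendar rows, ready to load into the warehouse
dim = build_date_dimension(date(2024, 1, 1), date(2024, 12, 31))
```

The same pattern applies to time-of-day, currency, and code-lookup tables: once the generation logic is captured in a template, every project gets an identical, standards-conforming dimension for free.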
Cloud Data Migration
Finally, ensure that your data management technology vendor has the data engineers to assist your data architects and data designers in this migration. These resources must be fully capable of delivering and provisioning your new environment on any data platform, for any major cloud configuration. Use these engineers to not only help migrate to the new environment but to fully train your own staff to ultimately replace them.
Once the newly remodeled, redesigned, retransformed analytical environment is up and running, enjoy the benefits of lowered costs, reduced redundancy in data and analytical assets, increased efficiency in data transformations, and improved access to ALL data by the ultimate consumers.
Data Automation
Do you want to know how Data Automation can support your migration to a new analytics platform? Contact WhereScape to find out more.