"Why Data Modeling is More Essential In The New...
Why Automation is No Longer a Choice for Your Data Architecture

The world of data has changed dramatically over the past several years. The pandemic accelerated some of those changes, like the migration to cloud-based data platforms.
When everyone needed to work remotely, it just made sense to move to the cloud and use a managed service for your data platform.
Along with that came more data, more data types, and a real business need to move faster. Companies had to adapt very quickly during the pandemic if they wanted to survive. Many did and thrived while others, well, not so much.
As the demand for data continues to grow at unprecedented rates, and as data becomes a non-negotiable asset for organizational success, rapidly delivering value from that data (i.e., turning it into information for data-driven decision making) has become an imperative.
So how do we deliver value faster with our data warehouses, data meshes, and enterprise data hubs? Automate, automate, automate.
Automation of Architecture
Anyone who has been following me for more than a few years knows I have long been a huge fan of agile thinking and code automation in the data space. The easiest code to test is code you never write!
How do you deliver faster? Write and test less code (there are no syntax errors in generated code).
How do you do that? Generate the code from standards and templates. Use a low-code or even a no-code tool to do it. This helps with both agility and quality. In our space, this kind of tool has generally been referred to as a data warehouse automation tool.
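To make the template idea concrete, here is a minimal Python sketch of standards-driven code generation. It is not any particular vendor's tool; the naming standard, schema, table, and columns are all hypothetical examples.

```python
from string import Template

# Hypothetical naming standard: every warehouse table gets a surrogate key,
# a load timestamp, and a record source column. The standard lives in the
# template, not in hand-written SQL.
DDL_TEMPLATE = Template("""\
CREATE TABLE ${schema}.${prefix}_${name} (
    ${name}_key      INTEGER       NOT NULL,
${business_columns}    load_dts         TIMESTAMP     NOT NULL,
    record_source    VARCHAR(100)  NOT NULL
);""")

def generate_ddl(schema: str, prefix: str, name: str, columns: dict) -> str:
    """Render a CREATE TABLE statement from a simple, declarative column spec."""
    body = "".join(f"    {col:<16} {dtype:<13} NOT NULL,\n" for col, dtype in columns.items())
    return DDL_TEMPLATE.substitute(schema=schema, prefix=prefix, name=name,
                                   business_columns=body)

# One short spec per table; the generated SQL always follows the standard.
print(generate_ddl("dw", "hub", "customer", {"customer_id": "VARCHAR(20)"}))
```

If the standard changes, say a new audit column is required, you change the template and regenerate every table rather than editing hand-written scripts one by one.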
IT Automation Benefits
One of the key benefits of an automation tool is that your team of data engineers, architects, and analysts becomes more productive. They no longer need to be expert coders, nor do they need to be experts in all the nuances of data warehousing theory or a particular design methodology, like knowing what a type 2 slowly changing dimension is. Sure, it helps to know what these concepts are, but not having to code it all by hand is a big win (and definitely less error-prone).
With a template-based approach, you also get the benefit of standards enforcement without having to do tedious code reviews. Plus, it means you can onboard new team members very quickly. They need to learn to use the tool properly, but they don't have to remember what all the standards are. And if the standards need to change, you change the templates and regenerate the code. Done!
Leverage Automation Tools
Additionally, if you decide to change platforms, a good automation tool will make those transitions much easier by letting you choose a new target platform and regenerating all the logic into the new platform's native syntax. I personally have seen several large migrations benefit from this approach in recent years, saving months and hundreds of thousands of dollars in the process.
Likewise, as your current platform evolves, your automation tool should incorporate the platform's new features so that, again, you don't have to be an expert to take advantage of them quickly. A good automation tool lets you describe the “what” and automatically implements the “how.”
In the end, that means your investment in the design, logic, and transformation rules of your data platform is protected regardless of the changes that may come your way in the future. Automation is a great way to future-proof your platform architecture.
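As a rough illustration of that retargeting idea, here is a hypothetical Python sketch in which one logical table definition is regenerated for different target platforms simply by swapping the templates. The dialect details shown are deliberately simplified examples, not vendor documentation.

```python
# One logical definition, multiple target dialects: choose a new target,
# swap the templates, and regenerate the code. The platform names and type
# mappings below are simplified, hypothetical examples.
DDL_TEMPLATES = {
    "snowflake": "CREATE OR REPLACE TABLE {schema}.{table} ({columns});",
    "sqlserver": "CREATE TABLE [{schema}].[{table}] ({columns});",
}

COLUMN_TYPES = {
    "snowflake": {"string": "VARCHAR", "timestamp": "TIMESTAMP_NTZ"},
    "sqlserver": {"string": "NVARCHAR(255)", "timestamp": "DATETIME2"},
}

def generate_for_platform(platform: str, schema: str, table: str, columns: dict) -> str:
    """Render the same logical table for whichever platform is currently the target."""
    types = COLUMN_TYPES[platform]
    cols = ", ".join(f"{name} {types[logical]}" for name, logical in columns.items())
    return DDL_TEMPLATES[platform].format(schema=schema, table=table, columns=cols)

# The logical model never changes; only the generated syntax does.
logical_model = {"customer_name": "string", "load_dts": "timestamp"}
for target in ("snowflake", "sqlserver"):
    print(generate_for_platform(target, "dw", "customer", logical_model))
```

The point is not these particular templates but where the knowledge lives: in the model and the templates, so a platform change becomes a regeneration exercise rather than a rewrite.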
Documentation
To top it all off, if you build your architecture and generate your code from a good end-to-end automation tool, with a solid repository under it, you get the one benefit everyone needs but rarely builds: comprehensive documentation. And that documentation will not be static. As you make changes and iterate through your design, expand, build, and deliver, the documentation stays current; you only need to push a button to see the current state of your system. You can be agile and documented!
Benefits of Automation in the Workplace
As you go about justifying automation to your management and staff, focus on these key benefits:
- Automated documentation
- Target platform flexibility
- Ability to customize templates and apply standards
- Agile modeling and data engineering – easily adapt to rapidly changing business needs
- Sustainability (“future-proofed” platform – change is easier when you have automated)
So, the question you need to ask yourself is "Why haven't we automated yet?" Better yet, ask "When can we start?" Because now you know that automation is no longer a choice; it is mandatory.

Kent Graziano (AKA The Data Warrior) was the Chief Technical Evangelist for Snowflake and is an award-winning author, speaker, and thought leader. He is an Oracle ACE Director (Alumni), a Knight of the OakTable Network, a certified Data Vault Master and Data Vault 2.0 Practitioner (CDVP2), and an expert solution architect with over 35 years of experience, including more than 25 years designing advanced data and analytics architectures (in multiple industries).
An internationally recognized expert in cloud and agile data design and a prolific author, Mr. Graziano has penned numerous articles and three Kindle books, and has co-authored four other books (including the 1st Edition of The Data Model Resource Book and the first book on Data Vault). He is also the technical editor for Super Charge Your Data Warehouse.