WhereScape at TDWI Munich: Automate Data Vault on Databricks

By Kortney Phillips
| June 19, 2025
WhereScape at TDWI Munich 2025: Automate a Full Data Vault on Databricks in Just 45 Minutes

June 24–26, 2025 | MOC Munich, Germany

As data complexity grows and business demands accelerate, scalable and governed data architectures are no longer optional—they’re essential. That’s why WhereScape is returning to TDWI Munich, Germany’s premier conference for data, analytics, and AI, to demonstrate how automation is redefining enterprise data infrastructure.

Join us in Munich to explore how WhereScape enables data teams to eliminate manual coding, accelerate cloud modernization, and deliver analytics-ready platforms faster. Our experts will be available throughout the event at the WhereScape booth, offering live demonstrations, technical walkthroughs, and practical guidance on automating your data environment from end to end.

Featured Technical Workshop

Create a Data Vault on Databricks in 45 Minutes
Presented by Endika Pascual
June 24 | SDdi3.3 at 14:30–15:30 | SDdi4 at 16:15–17:00

In this immersive, hands-on workshop, you’ll see how WhereScape automates the entire Data Vault 2.0 lifecycle on Databricks, using a metadata-driven approach to deliver governed, scalable architectures at speed.

Workshop Highlights:

Rapid Model Creation with WhereScape 3D: Learn how to ingest, profile, and visually model source system data into a scalable Data Vault blueprint. WhereScape 3D automatically generates hubs, satellites, and links based on DV2.0 standards, eliminating hand coding and ensuring consistency from the start.
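
For readers new to the terminology, here is a rough, hand-written sketch of what a hub and a satellite can look like as Delta tables on Databricks. The schema and column names (raw_vault, hub_customer, sat_customer_details) are hypothetical, and this is not the DDL WhereScape 3D generates; it is only meant to make the DV 2.0 objects concrete.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hub: one row per distinct business key, plus standard DV 2.0 metadata columns.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS raw_vault.hub_customer (
            hub_customer_hk STRING NOT NULL,    -- hash of the business key
            customer_id     STRING NOT NULL,    -- business key from the source
            load_date       TIMESTAMP NOT NULL,
            record_source   STRING NOT NULL
        ) USING DELTA
    """)

    # Satellite: descriptive attributes for the hub, tracked over time via hash_diff.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS raw_vault.sat_customer_details (
            hub_customer_hk STRING NOT NULL,
            load_date       TIMESTAMP NOT NULL,
            hash_diff       STRING NOT NULL,    -- hash of all attributes, used to detect change
            record_source   STRING NOT NULL,
            customer_name   STRING,
            customer_email  STRING
        ) USING DELTA
    """)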

Automated Code Generation and Deployment via WhereScape RED: Generate native SQL and Python optimized for Databricks and Delta Lake, including logic for key generation, load dates, audit columns, and record sourcing. Instantly deploy Raw Vault and Business Vault structures using built-in automation.
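
As a rough illustration of that pattern (not the code RED emits), the following PySpark sketch loads new business keys into the hypothetical hub_customer table from the sketch above: it computes a SHA-256 hash key, a load date, and a record source, and appends only keys the hub has not seen before. The stage.customer table is an assumed staging object.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Staged source rows (hypothetical table).
    stage = spark.table("stage.customer")

    hub_rows = (
        stage.select(
            # Deterministic hash key from the normalized business key.
            F.sha2(F.upper(F.trim(F.col("customer_id"))), 256).alias("hub_customer_hk"),
            F.col("customer_id"),
            F.current_timestamp().alias("load_date"),
            F.lit("CRM").alias("record_source"),
        )
        .dropDuplicates(["hub_customer_hk"])
    )

    # Insert-only pattern: append only business keys not already present in the hub.
    existing = spark.table("raw_vault.hub_customer").select("hub_customer_hk")
    new_keys = hub_rows.join(existing, on="hub_customer_hk", how="left_anti")
    new_keys.write.format("delta").mode("append").saveAsTable("raw_vault.hub_customer")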

Orchestrated ELT Pipelines: Design and schedule ELT workflows with full dependency management, ensuring reliable execution across Raw and Business Vault layers. Compute workloads are pushed directly to Databricks using native orchestration.
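
To show what dependency management means in practice, here is a minimal, tool-agnostic Python sketch (not WhereScape's scheduler) that runs hypothetical load steps in topological order, so satellites and links wait for their hubs, and Business Vault builds wait for the Raw Vault loads they read from.

    from graphlib import TopologicalSorter

    # Each step maps to the upstream steps it depends on (all step names hypothetical).
    dependencies = {
        "load_hub_customer": set(),
        "load_hub_order": set(),
        "load_sat_customer_details": {"load_hub_customer"},
        "load_link_customer_order": {"load_hub_customer", "load_hub_order"},
        "build_bv_customer_point_in_time": {"load_sat_customer_details"},
    }

    def run(step: str) -> None:
        # Placeholder: a real pipeline would submit a SQL or notebook task to Databricks here.
        print(f"running {step}")

    # static_order() yields each step only after all of its dependencies have been yielded.
    for step in TopologicalSorter(dependencies).static_order():
        run(step)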

Built-In Governance and Lineage: Every transformation is automatically documented. Built-in data lineage, impact analysis, and governance features provide transparency and traceability across the entire pipeline.

Data Quality and Analytics Readiness: Delta Lake integration ensures ACID compliance, versioning, and schema enforcement, resulting in a vault structure that is reliable, auditable, and ready for downstream analytics and machine learning workloads.
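
As a small, standalone illustration of the Delta Lake capabilities mentioned above (using the hypothetical table names from the earlier sketches), time travel and the transaction history can be queried directly; this is generic Delta Lake behavior rather than anything specific to the workshop.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Time travel: query the satellite as it looked at an earlier table version.
    previous = spark.sql(
        "SELECT * FROM raw_vault.sat_customer_details VERSION AS OF 0"
    )

    # The Delta transaction log doubles as an audit trail of every write.
    spark.sql("DESCRIBE HISTORY raw_vault.sat_customer_details").show(truncate=False)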

By the end of this session, attendees will have walked through the full technical process of building a Data Vault 2.0 architecture on Databricks using WhereScape—and gained the knowledge to replicate it within their own environments.

No registration is required to attend this session. Simply show up and take a seat.

If you’d like to register in advance anyway, here’s how.

How to Register

Attending the workshop or visiting the booth is simple. Here’s how to register:

  1. Visit the TDWI Munich registration page and click on tickets
  2. Select the Partner Program ticket (cost: €0)
  3. Add “Special Workshop WhereScape – Tuesday” to your basket
  4. Complete the registration process by entering your details

What to Expect at the WhereScape Booth

Visit the WhereScape booth to talk with our solution architects and see live demos of:

  • Metadata-driven data modeling with WhereScape 3D
  • Automated SQL/Python generation and deployment with RED
  • Support for cloud platforms like Databricks, Snowflake, Redshift, and Microsoft Fabric
  • Fully auditable and governed data pipelines with end-to-end automation

Connect with our data automation experts and learn how to reduce development time by up to 80%, cut manual coding by 95%, and build cloud-ready data infrastructure without compromise.

We’re also giving away an OURA ring; stop by the booth for a chance to win.

About WhereScape

WhereScape helps data teams automate the design, development, deployment, and operation of data infrastructure. Our platform includes:

  • WhereScape 3D: Automates data discovery, profiling, and modeling. Instantly generate visual blueprints for Data Vaults, star schemas, and more.
  • WhereScape RED: Deploys data infrastructure with native, platform-specific code generation across Databricks, Snowflake, Microsoft Fabric, and more, with no hand-coding required.
  • Data Vault Express: Simplifies the implementation of Data Vault 2.0 and 2.1, reducing risk while increasing speed, consistency, and auditability.

With metadata-driven automation and built-in governance, WhereScape enables teams to deliver data solutions in days, not months, at scale and with confidence.

About TDWI Munich

TDWI Munich is Germany’s largest and most established conference for data, analytics, and AI professionals. With more than 100 technical sessions, expert-led workshops, and a dynamic expo hall, the conference serves as a key hub for innovation, learning, and professional growth in the data space. Attendees from across industries, including data scientists, engineers, architects, and decision-makers, come together to exchange knowledge, explore real-world use cases, and gain the skills needed to navigate today’s fast-evolving data landscape.

Whether you’re focused on building modern data architectures, implementing AI-driven analytics, or managing data governance at scale, TDWI Munich provides the expertise and community to move your projects forward.

We’ll see you there!
