digna 2026.01 Expands Enterprise Data Validation Inside the Database

24 Mar 2026 | 5 min read


Release 2026.01 is the most significant expansion to digna's data validation architecture since the module launched. This release extends what validation rules can cover, how they are enforced, and how the platform connects to complex, multi-source enterprise environments. The headline additions are multi-column uniqueness checks and referential integrity validation. But the release also introduces changes to how datasources are modeled, how database connections are managed across projects, and how anomaly detection behaves in specific business contexts. Each of these matters in practice. 

This post walks through what is new and who it is built for. 


Why Traditional Single-Column Validation Falls Short 

Most data quality platforms enforce validation at the single-column level. Check that a value is not null. Check that it falls within a range. Check that it matches a format. These checks matter. They are also insufficient for the kinds of data quality failures that actually undermine enterprise reporting and regulatory compliance. 

The failure modes that cause the most damage in production are relational. A transaction record referencing a customer identifier that no longer exists in the master. An order line whose product code is unique in isolation but duplicated within a specific order, violating the compound key. A financial exposure whose counterparty reference matches nothing in the approved list. None of these trip a single-column check. All of them produce downstream corruptions that are expensive to trace and harder to explain to an auditor. 

Release 2026.01 adds the validation capabilities that address these failure modes directly. 


Multi-Column Uniqueness Checks: Validating Compound Business Keys 

Many real-world business entities are defined by combinations of attributes, not single identifiers. An order line is unique within an order when the combination of order ID and line number is distinct. A financial position is unique when the combination of account, instrument, and date is distinct. 

digna 2026.01 introduces uniqueness checks across configurable column sets. The check evaluates whether the combination of selected columns contains duplicates, identifying cases where compound business keys are violated. This runs entirely within the source database through SQL-based inspection, without exporting data or creating additional processing layers. 
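Conceptually, a compound-key uniqueness check reduces to grouping by the configured column set and flagging combinations that occur more than once. The sketch below illustrates that idea with Python's built-in sqlite3; the table, columns, and helper function are illustrative, not digna's actual implementation or API:

```python
import sqlite3

# In-memory demo table: order lines keyed by the compound key (order_id, line_no).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE order_lines (order_id INT, line_no INT, product TEXT)")
con.executemany(
    "INSERT INTO order_lines VALUES (?, ?, ?)",
    [(1, 1, "A"), (1, 2, "B"), (2, 1, "A"), (2, 1, "C")],  # (2, 1) is duplicated
)

def find_compound_key_duplicates(con, table, key_columns):
    """Return key combinations that occur more than once in `table`."""
    cols = ", ".join(key_columns)
    sql = (
        f"SELECT {cols}, COUNT(*) AS n FROM {table} "
        f"GROUP BY {cols} HAVING COUNT(*) > 1"
    )
    return con.execute(sql).fetchall()

print(find_compound_key_duplicates(con, "order_lines", ["order_id", "line_no"]))
# → [(2, 1, 2)]
```

Note that each individual column passes a single-column uniqueness-style inspection here; only the grouped query surfaces the violated compound key.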

The practical impact for data quality and governance teams is the ability to detect duplicate business entities that would otherwise pass all individual-column checks cleanly. For data warehouses where slowly changing dimensions or incremental loads are common, compound key violations are a frequent source of silent data corruption that single-column validation never surfaces. 


Referential Integrity Checks: Enforcing Relationships Across Datasources 

The second major validation addition is referential integrity checking across datasources. This check validates that foreign key values in a source datasource exist within a referenced target datasource, detecting orphaned records and broken relationships before they propagate through downstream analytics and reporting pipelines. 

What makes this practically useful at enterprise scale is the scope it supports. Referential integrity checks in digna 2026.01 work across different tables and views, different schemas within the same database, and different database connections within the same project. The check is not limited to relationships within a single schema or a single warehouse. An enterprise that holds customer master data in one system and transaction records in another can validate the referential relationship between them without replicating data into a common environment. 
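The underlying shape of a referential integrity check is an anti-join: rows in the source whose foreign key matches nothing in the target. This single-database sqlite3 sketch shows that pattern (it does not demonstrate digna's cross-connection capability, and all names are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (customer_id INT PRIMARY KEY)")
con.execute("CREATE TABLE transactions (tx_id INT, customer_id INT)")
con.executemany("INSERT INTO customers VALUES (?)", [(1,), (2,)])
con.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(10, 1), (11, 2), (12, 99)],  # tx 12 references a customer that does not exist
)

def find_orphans(con, source, fk, target, pk):
    """Rows in `source` whose `fk` value has no matching `pk` in `target`."""
    sql = (
        f"SELECT s.* FROM {source} s "
        f"LEFT JOIN {target} t ON s.{fk} = t.{pk} "
        f"WHERE t.{pk} IS NULL"
    )
    return con.execute(sql).fetchall()

print(find_orphans(con, "transactions", "customer_id", "customers", "customer_id"))
# → [(12, 99)]
```

Catching the orphaned transaction at the source is what prevents it from propagating into downstream reporting.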

This directly supports the data quality requirements that matter most in regulated industries: maintaining data warehouse integrity, validating master data relationships, supporting regulatory reporting, and ensuring that BI and analytics systems are consuming data that is structurally sound. 

Full technical documentation for both new validation capabilities is available in the digna 2026.01 release notes. 


Logical Datasources and Global Connections: Simplifying Complex Environments 

Beyond the validation additions, Release 2026.01 introduces changes to how digna models datasources and manages database connections, both of which have direct implications for teams operating in heterogeneous enterprise data environments. 

Datasources in 2026.01 now represent a logical layer within a project rather than a direct mapping to a physical table. Each logical datasource can be backed by a database table, a database view, or a custom SQL statement. This decouples the inspection and validation logic from the physical storage structure, which matters when schemas evolve, when validation rules need to apply to derived datasets, or when the same business entity is represented across multiple physical locations. 
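To make the decoupling concrete, here is a minimal sketch of the logical-datasource idea: validation logic targets a stable logical name, which resolves to a table, a view, or a custom SQL statement. The class and field names are assumptions for illustration, not digna's configuration model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LogicalDatasource:
    """One logical name, backed by exactly one of: table, view, or custom SQL."""
    name: str
    table: Optional[str] = None   # backed by a physical table
    view: Optional[str] = None    # ... or a database view
    sql: Optional[str] = None     # ... or an arbitrary SELECT statement

    def as_subquery(self) -> str:
        """Render whatever backs this datasource as a FROM-clause target."""
        if self.sql:
            return f"({self.sql})"
        return self.table or self.view

orders = LogicalDatasource(name="orders", table="dwh.fact_orders")
active = LogicalDatasource(
    name="active_orders",
    sql="SELECT * FROM dwh.fact_orders WHERE status = 'ACTIVE'",
)
print(orders.as_subquery())   # → dwh.fact_orders
print(active.as_subquery())   # → (SELECT * FROM dwh.fact_orders WHERE status = 'ACTIVE')
```

If the physical table is later replaced by a view or a derived query, only the backing changes; every rule bound to the logical name keeps working.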

Database connections are now configured at a global level and can be reused across all projects. Previously, teams managing multiple projects across multiple database environments needed to configure connections separately per project. Global connections eliminate that duplication, reduce configuration overhead, and ensure that connectivity settings are consistent across the platform. Projects can also now reference multiple source connections simultaneously, supporting realistic enterprise architectures where data resides across several warehouses or operational databases. 


Anomaly Relevance Conditions, Per-Module Notifications, and CSV Export 

Three additional features in this release address specific operational pain points that users have flagged consistently. 

The Anomaly Relevance Condition allows teams to define a condition that controls whether digna Data Anomalies evaluates anomaly status for a given dataset. Statistics are always calculated. But if the defined condition is not met, for example if the record count is below a threshold that makes anomaly detection statistically meaningful, the platform does not surface a green, yellow, or red anomaly status for that dataset. This prevents alert noise from low-volume or transient datasets and ensures that anomaly evaluation is only applied in contexts where it is operationally meaningful. 
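The gating behavior described above can be sketched in a few lines: statistics are always computed, but a status is only assigned when the relevance condition holds. The threshold, toy anomaly rule, and function names below are illustrative assumptions, not digna's API:

```python
import statistics

def evaluate_dataset(values, min_records=100, threshold=3.0):
    """Always compute statistics; assign an anomaly status only when the
    relevance condition (enough records) is met."""
    stats = {"count": len(values)}
    if stats["count"] >= 2:
        stats["mean"] = statistics.mean(values)
        stats["stdev"] = statistics.stdev(values)
    if stats["count"] < min_records:      # relevance condition not met:
        return stats, None                # keep the statistics, surface no status
    # Toy anomaly rule: flag when the latest value deviates strongly from the mean.
    z = abs(values[-1] - stats["mean"]) / (stats["stdev"] or 1.0)
    status = "red" if z > threshold else "green"
    return stats, status

stats, status = evaluate_dataset(list(range(10)))
print(status)  # → None (too few records for a meaningful status)
```

The key design point is that the condition suppresses only the status, not the measurement: history keeps accumulating, so the dataset is fully evaluated as soon as it crosses the relevance threshold.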

Per-module notification configuration allows teams to set independent alerting behavior for each digna module. A data engineering team responsible for pipeline timeliness can receive notifications independently from a governance team focused on validation failures. Alerts can be tuned to the criticality of each module's output without applying a single notification policy across the entire platform. 

Inspection results can now be exported as CSV files, with direct practical value for teams that need to incorporate digna's validation output into audit workflows, external reporting, or downstream analysis outside the platform. 
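For teams wiring exports into downstream tooling, the shape of such a CSV is straightforward; this stdlib sketch uses hypothetical field names, and digna's actual export schema may differ:

```python
import csv
import io

# Illustrative inspection results (field names are assumptions for this sketch).
results = [
    {"datasource": "orders", "check": "compound_unique", "status": "failed", "violations": 3},
    {"datasource": "orders", "check": "referential", "status": "passed", "violations": 0},
]

buf = io.StringIO()  # in a real workflow, open a file instead
writer = csv.DictWriter(buf, fieldnames=["datasource", "check", "status", "violations"])
writer.writeheader()
writer.writerows(results)
print(buf.getvalue())
```

A file in this shape drops directly into spreadsheet-based audit reviews or any downstream analysis tool that reads CSV.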


Who This Release Is Built For 

  • Data Engineers benefit from logical datasource modeling that decouples inspection logic from physical schema, and from global connections that eliminate redundant configuration across projects. 


  • Data Quality and Governance Teams gain the relational validation coverage (compound-key uniqueness and referential integrity) needed to enforce the structural rules that matter for regulatory reporting and master data management. 


  • Analytics and BI Teams receive cleaner, more structurally sound inputs from upstream data systems, with exportable inspection results that feed directly into audit and reporting workflows. 


  • Platform and Infrastructure Owners benefit from reduced configuration complexity through global connections and logical datasources, and from per-module notification control that scales alerting to team structure rather than applying a single policy across all contexts. 

Release 2026.01 is available now. The full changelog, including technical detail on all new features and the updated CLI reference, is at docs.digna.ai/changelog/Release_202601. 

  

Ready to see 2026.01 in your environment? 

Multi-column uniqueness checks, referential integrity validation, and an in-database architecture that keeps your data where it belongs. Book a personalised walkthrough with the digna team to see the new validation capabilities on your own data environment. 

Book a Demo
Read the Full Release Notes
