Data Governance Is Not a Project. It's a Practice.

Most data governance initiatives fail not because of bad technology, but because they're treated as a one-time implementation rather than an ongoing discipline. Here's what actually works in a regulated financial services environment.

Every organisation I've encountered in Malaysian financial services has a data governance initiative. Most of them are a SharePoint folder with a policy document, a data dictionary that was last updated in 2022, and a steering committee that meets quarterly and produces minutes nobody reads.

This is not data governance. This is the appearance of data governance.

The distinction matters enormously in a regulated environment where BNM's Risk Management in Technology (RMiT) framework explicitly requires that data be managed as a strategic asset, with demonstrable controls over data quality, lineage, and access.

Why Governance Initiatives Fail

The most common failure mode is treating data governance as a project with a start date, an end date, and a sign-off. A consulting firm is engaged. A framework is adopted (DAMA-DMBOK, usually, or a derivative of it). Data owners are nominated. A glossary is populated. The project is closed.

Six months later, the glossary is stale, the nominated data owners have moved roles, and nobody is enforcing the policies because there's no mechanism to do so.

The second failure mode is technology-led governance. An organisation buys Microsoft Purview or Collibra or Alation, configures it, and calls it done. The tool is only as good as the processes feeding it. A data catalogue with no stewardship model is just an expensive yellow pages.

What Governance Actually Requires

Data governance is a set of ongoing decisions about who has authority over data, how data quality is maintained, and how data is used, made consistently, over time, at the operational level.

Three things have to be true for it to work:

Accountability must be real, not nominal. Data owners need to have actual authority over their domains: the ability to approve or reject data definitions, enforce quality standards, and escalate issues. A data owner who is listed in a RACI but has no time or mandate to act is not a data owner.

Governance must be embedded in the pipeline, not bolted on. Data quality rules belong in your silver layer validation, not in a policy document. Data lineage is captured by your pipeline instrumentation, not documented manually after the fact. If governance exists only in documents, it will always lag behind the actual state of the data.
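To make "embedded, not bolted on" concrete, here is a minimal sketch of a quality rule living in silver-layer validation code rather than a policy document. It assumes a batch represented as a list of dicts; the names and the 5% tolerance are illustrative, not a prescription.

```python
# Hypothetical quality gate in a silver-layer load. In a real pipeline this
# would run against the table itself; here a list of dicts stands in.

def check_null_rate(rows, column, tolerance):
    """Gate a load: report whether the NULL rate for `column` is within tolerance."""
    total = len(rows)
    nulls = sum(1 for r in rows if r.get(column) is None)
    rate = nulls / total if total else 0.0
    return {"column": column, "null_rate": rate, "passed": rate <= tolerance}

batch = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": "A2", "balance": None},
    {"account_id": "A3", "balance": 250.0},
]
result = check_null_rate(batch, "balance", tolerance=0.05)
# The load can now refuse the batch when result["passed"] is False,
# instead of the gap surfacing in a quarterly review.
```

Because the rule executes on every load, it cannot lag behind the actual state of the data the way a policy document can.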

There must be a feedback loop. When a data quality issue is found (a NULL rate that exceeds tolerance, a reconciliation break, schema drift), there needs to be a defined path: who is notified, who investigates, who signs off on the resolution. Without the loop, issues are fixed locally and silently, and the same problems recur.
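The defined path can be made explicit in code. This is a hedged sketch with hypothetical issue types and role names; the point is that "who is notified, who investigates, who signs off" is looked up from a single agreed table rather than decided ad hoc.

```python
# Hypothetical escalation map: issue types and roles are illustrative.
ESCALATION = {
    "null_rate_breach":     {"notify": "data_steward",    "investigate": "pipeline_engineer",   "sign_off": "data_owner"},
    "reconciliation_break": {"notify": "finance_steward", "investigate": "source_system_owner", "sign_off": "data_owner"},
    "schema_drift":         {"notify": "data_steward",    "investigate": "platform_engineer",   "sign_off": "data_owner"},
}

def route(issue_type):
    """Return the defined path for an issue; unknown types default to the steward."""
    default = {"notify": "data_steward", "investigate": "data_steward", "sign_off": "data_owner"}
    return ESCALATION.get(issue_type, default)

path = route("reconciliation_break")
# path["investigate"] is "source_system_owner"; nothing closes without the
# data owner's sign-off, so fixes cannot happen locally and silently.
```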

The RMiT Lens

BNM's RMiT framework (updated 2020, with ongoing guidance circulars) sets specific expectations for licensed entities:

Data classification: all data assets must be classified by sensitivity and criticality. In practice this means a taxonomy (public, internal, confidential, restricted) applied consistently across systems, with access controls that reflect the classification.
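One way to make "access controls that reflect the classification" mechanical is to order the taxonomy and compare levels. A minimal sketch, assuming a simple clearance-versus-classification model; real RBAC in Purview or Entra ID is richer than this.

```python
from enum import IntEnum

class Classification(IntEnum):
    """The four-level taxonomy from the text, ordered by sensitivity."""
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

def can_access(user_clearance: Classification, data_class: Classification) -> bool:
    """Access reflects classification: clearance must meet or exceed the label."""
    return user_clearance >= data_class

# An internally-cleared user cannot read restricted data:
allowed = can_access(Classification.INTERNAL, Classification.RESTRICTED)  # False
```

Encoding the taxonomy once, in one place, is what makes "applied consistently across systems" auditable.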

Data quality management: RMiT expects documented data quality dimensions (completeness, accuracy, timeliness, consistency) with measurable thresholds and evidence of monitoring. "We check the data" is not sufficient. You need dashboards, exception reports, and audit trails.

Data lineage: for material data used in regulatory reporting, you must be able to trace a figure from the report back to its source system. This is not optional for submissions to BNM; it is an audit expectation.
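A toy illustration of what "trace a figure back to its source system" means as a data structure. The asset names are hypothetical; in practice the graph is captured by pipeline instrumentation or a catalogue tool, not hand-maintained like this.

```python
# Hypothetical lineage graph: each asset points at the asset it was derived from.
LINEAGE = {
    "report.capital_ratio":   "gold.capital_summary",
    "gold.capital_summary":   "silver.positions_clean",
    "silver.positions_clean": "bronze.positions_raw",
    "bronze.positions_raw":   "core_banking.POSITIONS",  # the source system
}

def trace_to_source(asset):
    """Walk from a reported figure back to its source system."""
    path = [asset]
    while asset in LINEAGE:
        asset = LINEAGE[asset]
        path.append(asset)
    return path

path = trace_to_source("report.capital_ratio")
# path ends at "core_banking.POSITIONS": the audit question "where did this
# number come from?" has a mechanical answer.
```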

Third-party data risk: if your data platform involves cloud services or vendors processing material data, those relationships fall under RMiT's third-party risk requirements. This includes your cloud provider (Azure, in many cases), your analytics vendors, and any SaaS tools that touch regulated data.

A Practical Starting Point

If you're building a governance practice from scratch in a financial services data team, here's the sequence that works:

Start with the data that matters most. Identify the five to ten datasets that feed your regulatory reports, your financial close, or your board-level metrics. Govern those first, completely, before expanding scope. Boiling the ocean produces nothing.

Instrument your pipelines for quality and lineage. Every dataset that lands in your silver layer should have quality checks (row counts, NULL rates, referential integrity, business rule validation) with results logged to a monitoring table. Azure Data Factory and Fabric both support this natively. Microsoft Purview can ingest lineage from ADF automatically.
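The instrumentation step can be sketched as a small profiler that computes the metrics named above and appends them to a monitoring table. This is a minimal stdlib-only sketch; column names are illustrative, and in production the "table" would be a Delta or SQL table, not a Python list.

```python
import datetime

def profile_batch(rows, key_column, reference_keys):
    """Compute per-load quality metrics: row count, NULL rates, orphan keys."""
    total = len(rows)
    columns = set().union(*(r.keys() for r in rows)) if rows else set()
    return {
        "run_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "row_count": total,
        "null_rates": {c: sum(1 for r in rows if r.get(c) is None) / total
                       for c in columns},
        # Referential integrity: keys with no match in the reference set.
        "orphan_keys": sum(1 for r in rows if r.get(key_column) not in reference_keys),
    }

monitoring_table = []  # stands in for a real monitoring table
batch = [{"cust_id": "C1", "amount": 10.0},
         {"cust_id": "C9", "amount": None}]
monitoring_table.append(profile_batch(batch, "cust_id", reference_keys={"C1", "C2"}))
```

Because every load writes a row, the monitoring table itself becomes the evidence trail the RMiT quality-management expectation asks for.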

Appoint stewards, not just owners. Data owners set policy. Data stewards execute it daily: they review quality dashboards, triage exceptions, maintain the glossary, and liaise with source system owners. Stewardship is an operational role, not a governance committee seat.

Make the glossary a living document. A business glossary is only valuable if it reflects current definitions and is referenced in actual work: in report annotations, in dataset documentation, in onboarding for new analysts. Integrate it into your catalogue tool and require new datasets to be registered before they go to production.

Report on governance, not just data. A monthly data quality scorecard (coverage, open issues, resolution time) makes governance visible to leadership and creates accountability. What gets measured gets managed.
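The scorecard itself is a simple aggregation over the issue log. A hedged sketch with fabricated example figures purely to show the shape; the three metrics are the ones named above.

```python
from datetime import date

def scorecard(issues, governed_datasets, total_datasets):
    """Roll up the monthly governance metrics: coverage, open issues, resolution time."""
    closed = [i for i in issues if i["closed"] is not None]
    avg_days = (sum((i["closed"] - i["opened"]).days for i in closed) / len(closed)
                if closed else None)
    return {
        "coverage_pct": round(100 * governed_datasets / total_datasets, 1),
        "open_issues": sum(1 for i in issues if i["closed"] is None),
        "avg_resolution_days": avg_days,
    }

# Illustrative issue log; in practice this is read from the monitoring table.
issues = [
    {"dataset": "positions", "opened": date(2024, 3, 1),  "closed": date(2024, 3, 4)},
    {"dataset": "customers", "opened": date(2024, 3, 10), "closed": None},
]
monthly = scorecard(issues, governed_datasets=8, total_datasets=12)
```

A number leadership can read ("coverage is 66.7%, one issue open") is what turns governance from an activity into something that can be managed.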

The Maturity Progression

Governance maturity is not binary. Most organisations sit somewhere on a spectrum:

Reactive: data issues found by end users, fixed ad hoc, no systemic tracking.

Defined: policies exist, some monitoring; quality issues logged but inconsistently resolved.

Managed: automated quality monitoring, stewards actively triaging, glossary maintained.

Optimising: governance embedded in pipeline design, proactive issue prevention, continuous improvement cycle.

Most organisations in Malaysian financial services I've encountered are at Defined, aspiring to Managed. The gap between the two is almost always about operational discipline, not technology.

The Uncomfortable Truth

Data governance requires sustained attention from people who are already busy. That's why it fails. The technology is not the hard part. The culture is.

The organisations that do it well treat data quality as a first-class engineering concern, not a compliance checkbox. They have engineers who care when a NULL rate spikes, analysts who know where their numbers come from, and leaders who ask hard questions when figures don't reconcile.

That disposition is harder to build than a data catalogue. But it's the only thing that actually works.