Role Description
- The Techno-Functional Business Analyst will support a banking Data Quality / Data Under Governance program aligned to the proven UK DQP approach (PRA), now being replicated to meet ECB and India regulatory requirements.
- Support delivery across the five workstreams:
  - CDE Identification (CDE inventory, definitions, ownership, scope by legal entity)
  - SOR Allocation (authoritative source mapping, SoR/AR alignment, data contracts)
  - Controls Mapping (control design, thresholds, risk appetite alignment, evidence requirements)
  - Data Lineage (traceability across systems, transformation chains, endpoints/reporting)
  - Operating Model (DCRM workflow, exception management, governance routines, reporting)
- Work with Markets, Risk, Finance, Operations, Technology, Data Platform, and Governance teams to drive outcomes and ensure regulatory alignment.
- Translate regulatory expectations into clear requirements and executable delivery artefacts (user stories, decision tables, STTMs, test packs).
- Ensure traceability from policy/regulatory expectation → CDE → SOR/AR → controls → lineage → exception management → audit evidence.
- Support validation readiness by producing clear, audit-ready documentation and evidence packs.
Location
- The role supports one of our top-tier banking clients in London (Canary Wharf) and requires a minimum of three days' on-site presence.
- This is a permanent position based in the UK. We will only consider applicants who are eligible to work in the UK. This role does not offer visa sponsorship.
Experience Requirements & Qualifications
Core Experience
- Minimum 5 years of relevant experience in data governance, data quality, reporting controls, or data transformation programs (preferably in financial services / Capital Markets).
- Proven experience delivering governance-led programs involving CDEs, SOR/authoritative sources, controls, and lineage.
- Experience working in regulated remediation / regulatory delivery environments with exposure to validation, audit evidence, and structured governance.
- Strong stakeholder management across business, operations, technology, and governance functions.
Domain Knowledge
- Strong understanding of Capital Markets and Finance data domains (front-to-back awareness is a plus).
- Familiarity with risk appetite concepts as applied to data quality thresholds and control exceptions.
Technical / Analytical Skills
- Proficiency in SQL (advanced querying, reconciliation logic, data validation).
- Strong proficiency in Python for data analysis and automation (pandas, validation frameworks, scripting).
- Experience supporting or validating ETL/ELT pipelines and data quality frameworks (rules, thresholds, exception handling).
- Exposure to lineage and metadata approaches; ability to validate transformations and trace data across platforms.
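The kind of Python-based data-quality validation referred to above can be illustrated with a minimal pandas sketch. All column names, rule names, and the reference currency list here are hypothetical examples, not client specifics:

```python
import pandas as pd

# Illustrative trade dataset -- column names are hypothetical
trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T3", "T4"],
    "notional": [1_000_000, None, 250_000, -50_000],
    "currency": ["GBP", "USD", "EUR", "XXX"],
})

VALID_CCY = {"GBP", "USD", "EUR"}  # assumed reference list

def run_dq_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple completeness/validity rules; return per-rule results."""
    checks = {
        "notional_not_null": df["notional"].notna(),
        "notional_positive": df["notional"].fillna(0) > 0,
        "currency_valid": df["currency"].isin(VALID_CCY),
    }
    rows = []
    for rule, passed in checks.items():
        rows.append({
            "rule": rule,
            "pass_rate": passed.mean(),            # compared against threshold
            "failing_ids": list(df.loc[~passed, "trade_id"]),
        })
    return pd.DataFrame(rows)

exceptions = run_dq_rules(trades)
print(exceptions)
```

In a real program the pass rates would be compared against risk-appetite-aligned thresholds, and failing records routed into the exception-management workflow.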
Tooling / Delivery Methods
- Working knowledge of scheduling/orchestration tools such as Autosys and/or Apache Airflow (monitoring schedules, reruns, failure triage).
- Experience with CI/CD and release controls (Git, Harness, UrbanCode Deploy (UCD), Red Hat OpenShift or equivalent).
- Familiarity with large-scale storage patterns (e.g., AWS S3) for dataset movement and controls.
- Experience supporting BI/reporting outputs such as Tableau dashboards (data validation, extract refresh checks, reconciliation to source).
Nice-to-Have
- Experience with big-data and database technologies such as PySpark, Spark SQL, Hive, Impala, HDFS, Parquet, and Oracle.
- Hands-on exposure to DCRM tooling and operational exception management processes.
- Experience with governance/catalog tools and lineage documentation methods (Collibra/Alation/Informatica EDC/Purview or similar).
- Experience running delivery routines across workstreams (intake, triage, prioritization, wave planning, reporting).
- Experience working in Agile/Scrum delivery models.
- Familiarity with Visio (or equivalent) for lineage, control mapping, and operating model workflows.
Main Tasks and Responsibilities
- Run discovery workshops to confirm scope by legal entity, regulatory asks, priority datasets, and key stakeholders.
- Build and maintain CDE inventory: definitions, ownership, criticality, and mapping to reports/processes.
- Support SOR / Authoritative Source allocation: document authoritative sources, data contracts, and key dependencies.
- Define and maintain controls mapping: control points on SOR, AR, and endpoints; thresholds aligned to risk appetite; evidence requirements.
- Support data lineage creation/validation: source-to-endpoint traceability, transformation logic validation, and coverage reporting.
- Define and embed the operating model: exception workflows, DCRM lifecycle, triage routines, governance reporting, and closure evidence.
- Perform data profiling and reconciliation checks to support control design and validation readiness.
- Lead/support UAT and validation activities; coordinate defect triage and ensure sign-off evidence is complete.
- Produce an audit-ready documentation pack: lineage evidence, controls evidence, test packs, decision logs, and explainable outcomes.
- Track and communicate risks, dependencies, and changes impacting regulatory delivery outcomes through governance forums.
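The reconciliation checks mentioned above (SOR-to-endpoint) can be sketched with pandas. The dataset names, join key, and break categories below are illustrative assumptions:

```python
import pandas as pd

# Hypothetical extracts -- in practice these would be pulled from the SOR
# and from the reporting endpoint being reconciled
sor = pd.DataFrame({"cde_key": ["A", "B", "C"], "balance": [100.0, 200.0, 300.0]})
report = pd.DataFrame({"cde_key": ["A", "B", "D"], "balance": [100.0, 210.0, 50.0]})

# Full outer join with an indicator column to classify each break
recon = sor.merge(report, on="cde_key", how="outer",
                  suffixes=("_sor", "_report"), indicator=True)
recon["break_type"] = "matched"
recon.loc[recon["_merge"] == "left_only", "break_type"] = "missing_in_report"
recon.loc[recon["_merge"] == "right_only", "break_type"] = "missing_in_sor"
both = recon["_merge"] == "both"
recon.loc[both & (recon["balance_sor"] != recon["balance_report"]),
          "break_type"] = "value_break"

breaks = recon[recon["break_type"] != "matched"]
print(breaks[["cde_key", "break_type"]])
```

Each break row would then feed triage routines and, where persistent, the exception-management workflow with supporting evidence.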


