Our client is a technology-led financial services business focused on delivering modern credit products to underserved and growing customer segments. The company operates in multiple international markets and is building data-driven lending capabilities to improve accessibility, speed, and decision accuracy.
With continued investment in its data and engineering functions, the business is expanding its team to strengthen real-time credit decisioning, portfolio risk visibility, and operational efficiency.
We are hiring a Credit Data Engineer to design, build, and scale the data infrastructure that powers the company's credit platform. This role will focus on delivering clean, reliable datasets and production-ready data pipelines that support credit analytics, risk modelling, affordability assessments, and compliance reporting.
You will work closely with stakeholders across Data, Risk, Engineering, and Product, playing a key role in shaping data maturity within the lending function.
Key Responsibilities
Build and maintain data pipelines that support credit decisioning, portfolio monitoring, and operational reporting
Integrate and manage data ingestion from internal transactional systems and third-party providers
Develop automated data validation and quality monitoring frameworks
Create structured datasets and data models used by risk, analytics, and product teams
Collaborate with stakeholders to define data requirements for new lending features and products
Improve the performance, reliability, and scalability of data pipelines across batch and real-time environments
Maintain documentation of data flows, business logic, and system dependencies
Apply best practices in data governance, access control, and security standards
Skills & Experience
Required
1–3 years' experience in data engineering or a similar role
Strong SQL and Python skills
Experience building ETL/ELT pipelines with modern tooling
Hands-on experience with cloud data warehouses (Snowflake, BigQuery, Redshift, etc.)
Familiarity with dbt and/or workflow orchestration tools (Airflow, Dagster, Prefect, etc.)
Understanding of data modelling, warehouse architectures, and schema design
Experience working with APIs or event-based data
Preferred
Background in financial services, credit, lending, or risk data
Knowledge of data quality frameworks and monitoring
Experience supporting data science or machine learning workloads
Understanding of regulatory requirements related to data (e.g. GDPR)
Who You Are
Analytical problem-solver with strong attention to detail
Comfortable working in a fast-paced, evolving environment
Clear communicator who enjoys cross-functional collaboration
Proactive and ownership-driven mindset