
Senior Data Engineer
On site
London, United Kingdom
Full Time
24-03-2025
Job Specifications
Job Title: Senior Data Engineer
Location: London, UK (3 days in the office)
SC Cleared: Required
Job Type: Full-Time
Experience: 8+ years
Job Summary:
We are seeking a highly skilled and experienced Senior Data Engineer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Data Engineer will be responsible for building and optimising data pipelines, implementing data transformations, and ensuring data quality and reliability. This role requires a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets.
Key Responsibilities:
Data Pipeline Development & Optimisation:
Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data providers) into the Azure Databricks platform.
Optimise data pipelines for performance, efficiency, and cost-effectiveness.
Implement data quality checks and validation rules within data pipelines.
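The validation rules described above can be sketched in plain Python. On the actual platform these checks would typically run as PySpark column expressions or Delta Lake constraints; the field names here (`series_id`, `observation_date`) are illustrative, not taken from the job spec.

```python
# A minimal sketch of row-level data quality checks in an ingestion pipeline.
# Field names are hypothetical examples for an economic-data feed.
from datetime import datetime

REQUIRED_FIELDS = {"series_id", "observation_date", "value"}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation failures for one ingested record."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
        return errors
    if not isinstance(row["value"], (int, float)):
        errors.append("value is not numeric")
    try:
        datetime.strptime(row["observation_date"], "%Y-%m-%d")
    except ValueError:
        errors.append("observation_date is not ISO-formatted")
    return errors

def split_valid_invalid(rows):
    """Route clean rows onward and quarantine failures, as a pipeline would."""
    valid, invalid = [], []
    for row in rows:
        (invalid if validate_row(row) else valid).append(row)
    return valid, invalid
```

Quarantining failed rows rather than dropping them silently is the usual pattern, so data stewards can inspect and replay them.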
Data Transformation & Processing:
Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies.
Develop and maintain data processing logic for cleaning, enriching, and aggregating data.
Ensure data consistency and accuracy throughout the data lifecycle.
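A clean → enrich → aggregate pass of the kind described above can be illustrated in plain Python. The field names and the FX reference table are invented for the example; in production this logic would live in PySpark transformations over much larger datasets.

```python
# Illustrative clean/enrich/aggregate stages for hypothetical currency data.
from collections import defaultdict

def clean(rows):
    """Drop records with null values; normalise whitespace in the join key."""
    return [
        {**r, "country": r["country"].strip()}
        for r in rows
        if r.get("value") is not None
    ]

def enrich(rows, fx_rates):
    """Add a GBP-converted value using a (hypothetical) FX reference table."""
    return [{**r, "value_gbp": r["value"] * fx_rates[r["currency"]]} for r in rows]

def aggregate(rows):
    """Sum GBP values per country."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["country"]] += r["value_gbp"]
    return dict(totals)
```

Keeping each stage a pure function over records makes the pipeline easy to test in isolation before it is ported to Spark.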
Azure Databricks Implementation:
Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services.
Implement best practices for Databricks development and deployment.
Optimise Databricks workloads for performance and cost.
Program in languages such as SQL, Python, R, YAML, and JavaScript.
Data Integration:
Integrate data from various sources, including relational databases, APIs, and streaming data sources.
Implement data integration patterns and best practices.
Work with API developers to ensure seamless data exchange.
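One common integration pattern behind the bullets above is draining a paginated API and upserting the results against existing records. This sketch uses a stand-in `fetch_page` callable instead of a real HTTP client; the record shape and key name are assumptions.

```python
# Sketch of paginated ingestion plus a key-based upsert merge.
def fetch_all(fetch_page, page_size=2):
    """Drain a paginated source until it returns an empty page."""
    records, page = [], 0
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            return records
        records.extend(batch)
        page += 1

def upsert(existing, incoming, key="id"):
    """Merge incoming records into existing ones; incoming wins on conflict."""
    merged = {r[key]: r for r in existing}
    merged.update({r[key]: r for r in incoming})
    return list(merged.values())
```

On Databricks the upsert step would usually be a Delta Lake `MERGE INTO`; the logic is the same keyed merge shown here.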
Data Quality & Governance:
Hands-on experience using Azure Purview for data quality and data governance.
Hands-on experience using Azure Purview for data quality and data governance.
Implement data quality monitoring and alerting processes.
Work with data governance teams to ensure compliance with data governance policies and standards.
Implement data lineage tracking and metadata management processes.
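Lineage tracking, as mentioned above, boils down to recording which datasets each job read and wrote so any output can be traced back to its sources. In practice Azure Purview captures this automatically; the toy structure below is purely illustrative.

```python
# A toy lineage graph: datasets are nodes, "derived from" edges point upstream.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        self.parents = defaultdict(set)  # dataset -> datasets it was derived from
        self.producer = {}               # dataset -> job that produced it

    def record(self, inputs, output, job):
        """Register that `job` produced `output` from `inputs`."""
        self.producer[output] = job
        for src in inputs:
            self.parents[output].add(src)

    def upstream(self, dataset):
        """Return every dataset `dataset` transitively depends on."""
        seen, stack = set(), [dataset]
        while stack:
            for parent in self.parents[stack.pop()]:
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen
```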
Collaboration & Communication:
Collaborate closely with data scientists, economists, and other technical teams to understand data requirements and translate them into technical solutions.
Communicate technical concepts effectively to both technical and non-technical audiences.
Participate in code reviews and knowledge sharing sessions.
Automation & DevOps:
Implement automation for data pipeline deployments and other data engineering tasks.
Work with DevOps teams to build and implement CI/CD pipelines for environment deployments.
Promote and implement DevOps best practices.
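A CI/CD pipeline for environment deployments of the kind described above is typically an Azure DevOps YAML definition. The stage names, variable usage, and deploy script below are assumptions for illustration, not taken from the job spec.

```yaml
# Illustrative Azure DevOps pipeline: test, then deploy Databricks jobs to dev.
trigger:
  branches:
    include: [main]

stages:
  - stage: Test
    jobs:
      - job: UnitTests
        pool: { vmImage: ubuntu-latest }
        steps:
          - script: pip install -r requirements.txt && pytest tests/
            displayName: Run pipeline unit tests

  - stage: DeployDev
    dependsOn: Test
    jobs:
      - deployment: DeployNotebooks
        environment: dev
        strategy:
          runOnce:
            deploy:
              steps:
                - script: ./scripts/deploy_databricks.sh dev
                  displayName: Deploy jobs to dev workspace
```

Promotion to test and production environments would normally repeat the deploy stage with approvals gated on the target environment.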
Essential Skills & Experience:
10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks.
Strong proficiency in Python and Spark (PySpark) or Scala.
Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns.
Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and Azure SQL Database.
Experience working with large datasets and complex data pipelines.
Experience with data architecture design and data pipeline optimization.
Proven expertise with Databricks, including hands-on implementation experience and certifications.
Experience with SQL and NoSQL databases.
Experience with data quality and data governance processes.
Experience with version control systems (e.g., Git).
Experience with Agile development methodologies.
Excellent communication, interpersonal, and problem-solving skills.
Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs).
Experience with data visualisation tools (e.g., Tableau, Power BI).
Experience with DevOps tools and practices (e.g., Azure DevOps, Jenkins, Docker, Kubernetes).
Experience working in a financial services or economic data environment.
Azure certifications related to data engineering (e.g., Azure Data Engineer Associate).
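The streaming experience listed above usually means event-time windowed aggregation (for example, Spark Structured Streaming over Azure Event Hubs). This plain-Python sketch shows only the core idea, bucketing events into fixed tumbling windows; field names are illustrative.

```python
# Tumbling-window counts keyed on event time, the core of windowed streaming aggregation.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (key, window-start) bucket using the event timestamp."""
    counts = defaultdict(int)
    for event in events:
        window_start = (event["ts"] // window_seconds) * window_seconds
        counts[(event["key"], window_start)] += 1
    return dict(counts)
```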
About the Company
Mastek is an enterprise digital and cloud transformation partner that engineers excellence for customers in industries such as healthcare and life sciences, retail & consumer, manufacturing, financial services, and public sector across 40 countries, including the UK, US, Europe, the Middle East, and Asia Pacific. Mastek helps enterprises decomplex digital and delivers business outcomes with trust, value, and velocity across the spectrum of services including digital experience & engineering, cloud implementations, data, auto...
Related Jobs


- Company Name
- Sainsbury's
- Job Title
- Analytics Engineer
- Job Description
- We'd all like amazing work to do, and real work-life balance. That's waiting for you at Sainsbury's. Think about the scale it takes for us to feed the nation. The level of data, transactions and variety it involves. Then you'll realise that ours is a modern software engineering environment because it has to be. We've made serious investment into a Tech Academy and into setting standards and principles. We iterate, learn, experiment and push ways of working such as Agile, Scrum and XP. So you can look forward to awesome opportunities in everything from AI to reusable tech.

Sainsbury's Tech - Analytics Engineer

Why join us
At Sainsbury's, we may be a 150-year-old retail chain, but we're on an exciting journey of transformation. As a Tech professional, you should consider joining us because we're changing the way we operate, embracing nimble thinking, and empowering our teams to push boundaries and create amazing systems and technologies. We're unlocking the immense potential of petabytes of data, leveraging it to make business decisions that are unparalleled. With thousands of shops, hundreds of thousands of colleagues, and millions of customers, we offer an unparalleled opportunity to work on groundbreaking projects. As an Analytics Engineer, you'll have the chance to write great SQL, design complex data models, collaborate with a talented team, and deliver robust data products alongside Data Architects, Data Scientists, Product Managers, and Data Engineers. Join us to be part of our data revolution and shape incredible experiences for our colleagues and customers.

What You'll Do
- Create designs for complex projects (data products), iterating existing components or designing new components as needed.
- Influence teams within your area of responsibility to design and build components aligned with the overall roadmap and engineering principles.
- Collaborate within teams to contribute to the execution of the organisation's technical strategy, focusing on the development and deployment of data solutions.
- Lead outcomes, develop stakeholder relationships, and deliver high-quality insights through data storytelling.
- Write code following coding standards and best practices, adhering to a test-driven and behaviour-driven development approach.
- Assist in architecting systems, designing efficient data solutions, and facilitating technical decision-making.
- Apply insightful domain knowledge to business problems, recommending and implementing data-led approaches.
- Ensure high-quality, accurate, and professional outputs that drive real business decisions.

What I Need To Know
- Experience with relevant coding languages and unit testing.
- Proficiency in SQL for data transformation, analysis and problem-solving.
- Understanding of version control systems, continuous integration pipelines, and service-oriented architecture.
- Highly numerate background with the ability to drive business change through data.
- Excellent communication skills, capable of explaining complex information in simple terms using available documentation tools.
- Strong problem-solving skills and attention to detail, with a curiosity to explore opportunities and solve problems logically.
- Proactive attitude towards continuous learning and career development.
- Delivery of solutions with longevity and maintainability following the latest Agile practices.
- Positive impact on the wider engineering and analytics community through contributions and support.
- Understanding of relational (and non-relational) databases and when to use them.
- Working knowledge of modern data architecture frameworks, understanding of Architecture & Engineering Standards/Principles, and knowing when and how to implement frameworks and when to make suggestions for new Standards.
- Understanding of the need for different and appropriate design techniques, such as data vault and data warehousing.

How Will I Succeed
- Proactively communicating risks and challenges about your technical product to both a technical and non-technical audience.
- Applying technical judgement to deliver solutions with longevity; solutions which can be maintained and serviced.
- Driven to deliver for your product family, including setting and contributing to specific team goals.
- Share your knowledge and ideas with the team.
- Contribute within the Agile team and community of practice spaces.

This role offers a unique opportunity to combine expertise in architecture, data engineering, and data insights to drive innovation and value creation within our organisation. We are committed to being a truly inclusive retailer, so you'll be welcomed whoever you are and wherever you work. Around here, there's always the chance to try something new - whether that's as part of an evolving team or somewhere else across the business - and we take development seriously and promise to support you. We also recognise and celebrate colleagues when they go the extra mile and, where possible, offer flexible working. When you join our team, we'll also offer you an amazing range of benefits. Here are some of them:

Starting off with colleague discount, you'll be able to get 10% off at Sainsbury's, Argos, TU and Habitat after 4 weeks. This increases to 15% off at Sainsbury's every Friday and Saturday and 15% off at Argos every pay day. We've also got you covered for your future with our pensions scheme and life cover. You'll also be able to share in our success as you may be eligible for a performance-related bonus of up to 10% of salary, depending on how we perform. Your wellbeing is important to us too. You'll receive an annual holiday allowance, and you can buy additional holiday. We also offer other benefits that will help your money go further such as season ticket loans, cycle to work scheme, health cash plans, pay advance (where you can access some of your pay before pay day) as well as access to a great range of discounts from hundreds of other retailers. And if you ever need it there is also an employee assistance programme. Moments that matter are as important to us as they are to you, which is why we give up to 26 weeks' pay for maternity or adoption leave and up to 4 weeks' pay for paternity leave. Please see www.sainsburys.jobs for a range of our benefits (note, length of service and eligibility criteria may apply).


- Company Name
- Oracle
- Job Title
- Principal Product Manager - Oracle Cloud Infrastructure & Multicloud Database Solutions
- Job Description
- About the Role: We are seeking an experienced Principal Product Manager with strong technical expertise in Oracle database technologies, OCI, and Multicloud environments. This senior role requires deep technical knowledge of Oracle database ecosystems combined with strategic product vision to design, validate, and deliver innovative database solutions across cloud platforms.

Qualifications

Required:
- Bachelor's degree in computer science, Information Technology, or a related field (or equivalent work experience).
- 5-10 years of professional experience with Oracle database technologies, best practices, and solution development.
- Expert-level Oracle database administration skills in Linux/UNIX environments.
- Hands-on experience with Oracle Database Cloud (18c, 19c, 21c), Exadata Cloud Service (ExaCS and ExaCC), and Autonomous Database.
- Comprehensive understanding of enterprise database infrastructure, including: backup and recovery using RMAN; high availability with Data Guard and Oracle RAC; updating the RDBMS and Grid Infrastructure; updating Exadata System Software (Database and Storage Servers); Automatic Storage Management (ASM); and network configuration and optimization.
- Advanced knowledge of SQL performance tuning, database scalability, and capacity planning.
- Demonstrated proficiency with SQL/PLSQL, UNIX shell scripting, and Python or similar languages.
- Proven experience designing, deploying, and maintaining large-scale database systems.
- Practical experience with multiple cloud platforms, including AWS, Azure and/or GCP services.

Preferred:
- Strong hands-on experience with Oracle GoldenGate replication technology and Exadata Database Machine.
- Demonstrated success managing database environments across multiple clouds, especially OCI, AWS, and Azure.
- Deep expertise in database backup strategies, recovery procedures, and disaster recovery planning.
- Experience implementing and managing database security in Exadata and cloud environments.
- Practical knowledge of containerization technologies, including Docker and Kubernetes, for database workloads.
- Cloud Solution Architect experience with proven ability to design and administer complex Oracle OCI environments.
- 2+ years of hands-on experience with AWS cloud-native database solutions.
- 1+ years of experience implementing infrastructure automation using DevOps tools such as Ansible and Terraform.

Key Competencies
- Expert-level technical knowledge of Oracle database technologies and cloud infrastructure.
- Strategic product vision with the ability to translate customer requirements into viable technical solutions.
- Exceptional problem-solving abilities with a methodical approach to complex technical challenges.
- Strong communication skills with the ability to explain complex database concepts to technical and non-technical audiences.
- Self-motivated with demonstrated ability to work independently in a remote environment.
- Passion for continuous learning and staying current with rapidly evolving database and cloud technologies.

Responsibilities
- Design and execute comprehensive database architecture strategies and product roadmaps as part of the technical leadership team.
- Partner with customers and sales teams to gather requirements, understand pain points, and identify opportunities for new database solution offerings.
- Architect and build Oracle database and Multicloud solutions that deliver significant value across our customer base.
- Lead testing and validation efforts for database solutions in OCI and OracleDB@hyperscaler (Oracle databases running in AWS, Azure, and GCP).
- Develop and document end-to-end processes for integrating applications running in hyperscaler clouds with Oracle database services (OracleDB@hyperscaler).
- Manage and optimize our internal OCI and OracleDB@hyperscaler lab environments, including database instances, compute resources, storage, networking, and security configurations.
- Orchestrate collaboration across product, engineering, and customer success teams to deliver reliable, high-performance database solutions.
- Engineer fully automated database deployment pipelines using Terraform and Ansible that customers can leverage as templates for their own CI/CD integrations.
- Design and implement end-to-end logical and physical cloud migration strategies leveraging Oracle technologies, including Zero Downtime Migration (ZDM), RMAN, Data Guard, OCI Migrations, Data Pump, and GoldenGate.
- Create comprehensive product specifications incorporating automation scripts and infrastructure-as-code approaches for consistent deployments.
- Consult with customer application and infrastructure teams to optimize their cloud-based database architectures.
- Produce detailed technical documentation covering product capabilities, implementation guidelines, and operational best practices.
- Implement and govern complex database solutions using DevOps methodologies and tools.
- Research emerging cloud technologies and provide strategic recommendations aligned with industry best practices and customer needs.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.


- Company Name
- Tardis Tech
- Job Title
- Head of AI Engineering
- Job Description
- Head of AI Engineering – Proprietary Trading

Location: Chicago, New York, London

Company Overview
Our proprietary trading client is seeking a Head of AI Engineering to lead the development and deployment of AI-driven solutions that optimize trading efficiency and unlock new strategic opportunities. This role is ideal for a hands-on leader with deep expertise in AI/ML infrastructure, real-time data processing, and scalable model deployment in high-performance computing environments.

Role Overview
As the Head of AI Engineering, you will drive the AI strategy, architecture, and execution, leading a team of engineers to build state-of-the-art AI infrastructure and applications. You will collaborate closely with technologists, traders, and quantitative researchers to integrate AI into trading systems, ensuring scalable, low-latency, and production-grade deployments.

Key Responsibilities
- Lead and scale a high-performance AI engineering team, setting technical direction and best practices.
- Develop and optimize AI/ML models and infrastructure for trading and risk management.
- Drive end-to-end AI application development, from concept to deployment and continuous monitoring.
- Architect and enhance MLOps pipelines, feature stores, and model training infrastructure.
- Ensure low-latency, high-reliability AI solutions by optimizing GPU/CPU performance.
- Evaluate and integrate cutting-edge AI frameworks and tools, including TensorFlow, PyTorch, TensorRT, and ONNX.
- Collaborate with quantitative researchers and traders to implement AI-driven strategies.

Qualifications
- Five or more years leading AI/ML engineering teams in high-performance computing or trading environments.
- Seven or more years of hands-on AI/ML development, with expertise in Python, C++, or Java.
- Deep experience in MLOps, AIOps, and AI model deployment at scale.
- Proven track record in designing AI/ML architectures for real-time, mission-critical systems.
- Strong expertise in large language models, retrieval-augmented generation techniques, and fine-tuning AI models.
- Familiarity with compute infrastructure required for high-frequency AI/ML applications.
- Advanced degree in computer science, AI, machine learning, or a related field preferred.
- Exceptional leadership, problem-solving, and communication skills.

This is a senior leadership role for an AI engineering expert passionate about driving innovation in proprietary trading. Only suitable applicants will be contacted.


- Company Name
- Zendr
- Job Title
- Director of Data Engineering
- Job Description
- Our client is a Series A funded SaaS startup specializing in Threat Intelligence. They leverage advanced machine learning for narrative intelligence, helping enterprises and government agencies combat social media manipulation and emerging narrative threats. Their platform processes vast amounts of unstructured, cross-channel media data, converting it into actionable insights.

They are looking for a Director of Data Engineering to spearhead the development, implementation, and advancement of their data infrastructure. In this role, you will work closely with the Data, Product, and Engineering teams. You will initially manage two people, scaling into a consolidated Data team, and will report to the VP of Engineering.

Key Responsibilities:
- Develop and implement a long-term vision for data engineering and DevOps strategies.
- Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
- Build, mentor, and lead a diverse team of Data Engineers.
- Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
- Lead adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
- Partner with cross-functional teams, including product, analytics, and engineering, to align technical solutions with business needs.
- Present project updates, performance metrics, and strategic initiatives to leadership.

Required Qualifications:
- 10+ years of engineering experience, with at least 5+ years in data engineering.
- Proven experience in designing and implementing data architectures, ETL processes, and DevOps pipelines.
- Expertise in cloud platforms (AWS, Azure, or GCP), preferably AWS.
- Experience with modern DevOps tools such as Kubernetes, Docker, Terraform, Jenkins, or similar.
- Track record of successfully managing and scaling high-performing technical teams.
- Experience with data orchestration platforms such as Dagster or Airflow.
- Strong database architecture design skills for both structured and unstructured data.
- Advanced knowledge of Elasticsearch or OpenSearch, including configuration and search functionalities.
- Ability to define and communicate data architecture requirements while staying up to date with best practices.