
Big Data Lead
Hybrid
London, United Kingdom
Freelance
19-02-2025
Job Specifications
Job Description:
For this role, senior Data Engineering experience is expected: building automated data pipelines on IBM DataStage & DB2, AWS, and Databricks, from source systems through operational databases to the curation layer, using modern cloud technologies. Experience of delivering complex pipelines will be significantly valuable to how D&G maintain and deliver world-class data pipelines.
Knowledge in the following areas is essential:
Data Engineering Experience:
Databricks: Expertise in managing and scaling Databricks environments for ETL, data science, and analytics use cases.
AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM.
IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode.
Programming Languages: Proficiency in Python, SQL.
Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques.
DevOps & CI/CD: Familiarity with DevOps practices for data engineering, including infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog).
Familiarity with big data technologies like Apache Spark, Hadoop, or similar.
Test automation skills
ETL/ELT tools and creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores (a minimal pipeline sketch follows this list).
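As a rough sketch of the kind of pipeline step this role describes, the PySpark snippet below reads raw source data from S3, applies basic cleansing, and writes a curated Delta table of the sort a Databricks curation layer might hold. The bucket, prefix, column, and table names are hypothetical placeholders, not details taken from this posting.

```python
# Minimal sketch of a Databricks-style curation step; all names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, a `spark` session already exists

# Read raw source data landed in S3 (placeholder bucket and prefix).
raw = spark.read.parquet("s3://example-landing-bucket/orders/2025/02/")

# Basic cleansing: de-duplicate, standardise a column name, stamp the load time.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumnRenamed("cust_id", "customer_id")
       .withColumn("load_ts", F.current_timestamp())
)

# Publish to the curation layer as a Delta table (placeholder schema and table name).
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```

In practice a job like this would be parameterised and scheduled (for example via Databricks Jobs or an external orchestrator) rather than run with hard-coded paths.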
Leadership & Strategy: Lead Data Engineering team(s) in designing, developing, and maintaining highly scalable and performant data infrastructures.
Customer Data Platform Development: Architect and manage our data platforms using IBM (legacy platform) & Databricks on AWS technologies (e.g., S3, Lambda, Glacier, Glue, EventBridge, RDS) to support real-time and batch data processing needs.
Data Governance & Best Practices: Implement best practices for data governance, security, and data quality across our data platform. Ensure data is well-documented, accessible, and meets compliance standards.
Pipeline Automation & Optimization: Drive the automation of data pipelines and workflows to improve efficiency and reliability.
Team Management: Mentor and grow a team of data engineers, ensuring alignment with business goals, delivery timelines, and technical standards.
Cross-Company Collaboration: Work closely with business stakeholders at all levels, including data scientists, finance analysts, MI, and cross-functional teams, to ensure seamless data access and integration with various tools and systems.
Cloud Management: Lead efforts to integrate and scale cloud data services on AWS, optimising costs and ensuring the resilience of the platform.
Performance Monitoring: Establish monitoring and alerting solutions that ensure the high performance and availability of data pipelines and systems, with no impact to downstream consumers (a minimal alerting sketch follows this list).
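As a hedged illustration of the monitoring and alerting responsibility above, the sketch below uses boto3 to create a CloudWatch alarm that notifies an SNS topic whenever a custom pipeline-failure metric rises above zero. The namespace, metric, alarm name, and topic ARN are illustrative assumptions, not details from the role.

```python
# Hypothetical CloudWatch alarm on a custom pipeline-failure metric (illustrative names throughout).
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

cloudwatch.put_metric_alarm(
    AlarmName="curation-pipeline-failures",        # placeholder alarm name
    Namespace="DataPlatform/Pipelines",            # placeholder custom namespace
    MetricName="FailedRuns",                       # placeholder custom metric
    Statistic="Sum",
    Period=300,                                    # evaluate over 5-minute windows
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",     # alert on any failed run
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:eu-west-2:123456789012:data-platform-alerts"],  # placeholder SNS topic
)
```

Alarming on a "failed runs" metric is only one option; alarms on job duration, data freshness, or queue depth would follow the same pattern.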
About the Company
SoftNice is an award-winning global technology company specializing in the deployment and delivery of IT solutions that enable enterprises in their digital transformation journey. Our innovative software solutions ensure continuous improvement, increase productivity, and empower employees with digital transformation by unleashing the power of modern digital technologies: Enterprise Mobility, SQL/Business Intelligence, Azure, SharePoint Portals & Collaboration, Enterprise Social Collaboration, Cloud, Security, Salesforce, Of...
Related Jobs


- Company Name: Project Brains
- Job Title: Data Engineering Specialist
- Job Description:
  Company Description: Project Brains is a Future of Work platform that matches business needs with fractional specialists who have the expertise to deliver successful outcomes. By leveraging our vetted community of specialists, businesses can focus on their core priorities and grow ambitiously. Today we are working with a business that urgently needs senior and expert Data Engineers to migrate an existing Data Lake from SAS to Google Cloud.
  Qualifications:
  - Expertise in Data Engineering: Strong skills in SAS, SQL, dbt, git, Airflow (Python), ODE Generic Export Framework.
  - Experience with Key Tools: Familiarity with KNIME, VS Code, DIL-Pipelines, InnovatorX (MDD Software), Iceberg/BigLake Tables.
  - Industry Knowledge (preferred but not mandatory): Understanding of fiber optic marketing.
  - Availability for an urgent start.
  Goal: This is a time-sensitive project, and the right engineers will ensure a seamless migration, preventing disruptions. We are open to a streamlined hiring process for this contract need.


- Company Name: Creo Recruitment
- Job Title: Data QA Tester - £550 - £600 Inside IR35
- Job Description:
  Key Responsibilities:
  - Develop Test Plans and Strategies: Create comprehensive test plans and strategies for the Salesforce project, ensuring all aspects of the program are covered.
  - Design and Execute Test Cases: Develop and execute detailed test cases to identify and report bugs, glitches, and other issues, ensuring thorough testing of all functionalities.
  - System and Functional Testing: Perform system and functional testing to ensure the solution meets business requirements and acceptance criteria, validating the effectiveness of the integration.
  - Issue Resolution: Collaborate with developers to resolve issues and improve the user experience, ensuring a seamless integration process.
  - Maintain Test Documentation: Keep detailed test documentation and record defect details. Track defects through their lifecycle, from identification to resolution and retesting, ensuring closure.
  - Quality Assurance: Ensure that the software meets the quality criteria established by the program team and end users, maintaining high standards of performance and reliability.
  - Regression Testing: Conduct regression testing to ensure new solutions or updates do not negatively impact existing core processes, maintaining system integrity.
  - Comprehensive Testing: Where applicable, appropriate, and feasible, ensure that testing includes performance, regression, functionality, non-functional, and integration tests, providing a holistic approach to quality assurance.
  - Manual and Automated Testing: Perform both manual and automated testing to ensure thorough coverage and efficiency in the testing process.
  - Reconciliation Testing: Design tests and validations to ensure that migrated data is consistent across each phase of the migration pathway, from source staging to Salesforce.
  Key Skills:
  - Proven experience as a Data QA Tester or similar role, with a focus on data integration and migration testing, specifically Salesforce reports and dashboards.
  - Strong, proven proficiency in Behaviour-Driven Development (BDD) techniques, tools (e.g., Cucumber, SpecFlow), and frameworks.
  - Extensive hands-on experience with Salesforce testing, including report validation, data validation, and dashboard testing.
  - Proficiency in dataset generation techniques, SQL querying, and data manipulation.
  - Experience working in Agile development methodologies, such as Scrum or Kanban, with a strong understanding of the testing lifecycle in Agile projects.
  - Ability to effectively test, validate, and verify Salesforce reports and dashboards against business requirements.
  - Excellent analytical skills and attention to detail, with the ability to identify data inconsistencies, anomalies, and discrepancies.
  - Strong communication and collaboration skills, with the ability to effectively interact with cross-functional teams and stakeholders.
  - Proactive mindset, with a passion for continuous learning and staying abreast of the latest trends and advancements in data QA testing.
  - Experience of reconciliation testing between source and target data sources.


- Company Name: Walker Lovell
- Job Title: Market Data Analyst
- Job Description:
  We are seeking a Market Data Analyst to support a global market data team. This role is key to ensuring accurate reporting, compliance, and client support within a fast-paced financial environment. This is a 12-month fixed-term contract with the opportunity to work in a globally recognized financial environment.
  Key Responsibilities:
  - Data & Compliance: Manage market data reporting, ensuring accuracy and timely submission.
  - Client Support: Handle customer queries and support market data licensing processes.
  - Financial Oversight: Work with finance teams on billing, reporting corrections, and debt follow-ups.
  - Market Insights: Analyze transaction data to identify trends and business insights.
  - Process Improvement: Contribute to audit programs and market data projects.
  What We’re Looking For:
  - Experience in market data, compliance, or transaction reporting.
  - Knowledge of exchange-traded derivatives, futures, or commodities trading is a plus.
  - Strong analytical and communication skills.
  - Advanced proficiency in Excel, PowerPoint, and Word.
  - Ability to manage multiple tasks and collaborate across teams.


- Company Name: Meritus
- Job Title: Lead Data Engineer
- Job Description:
  Lead Data Engineer | Defence Consultancy | DV Clearance Required | 12 Month Contract (Inside IR35) | Up to £800 per day | London
  MERITUS are working with a tech-focused Defence Consultancy in London looking for a Lead Data Engineer to join on a 12-month contract working with MOD customers in the London area. If successful, you will be responsible for designing, building and maintaining scalable data pipelines and integration solutions. You will work closely with business & customer stakeholders in the wider data team to ensure the efficient flow of data across systems. The role is based in London and requires active DV security clearance.
  Main Responsibilities:
  - Design, implement, and maintain scalable data architectures, pipelines, and ETL processes.
  - Lead and mentor data engineers, ensuring best practices in coding, data modelling, and pipeline optimisation.
  - Establish and enforce data quality, compliance, and security standards across data platforms.
  - Work closely with data scientists, analysts, and business teams to understand requirements and optimise data solutions.
  Skills Required:
  - Expertise in tools like Hadoop, Spark, Kafka, and cloud data services (AWS, GCP, or Azure).
  - Strong knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
  - Proficiency in Python, Java, or Scala for data processing and automation.
  - Experience with Apache Airflow, dbt, or other orchestration tools for efficient ETL/ELT processes.
  Got your attention? If you believe that you have the skills and experience for the role, then please get in touch. We also offer a referral scheme for any candidates whose details are passed to us that we successfully place. If you have any further questions, please contact Ryan Harris at MERITUS Talent.