eTeam

Principal Data Engineer

On site

United Kingdom

Freelance

14-02-2025


Job Specifications

Job Title: Principal Data Engineer
Duration: 6 months, with the likelihood of extension to 12
Location: UK (Remote)


Principal Data Engineer
Who you'll work with
You’ll be a member of a global team working on a GenAI initiative. The client’s Tech Ecosystem function is responsible for developing and delivering all technology solutions for the firm’s internal use. We are taking a cloud-first approach to transforming our data platforms and analytical applications across the firm.

We are seeking an experienced data engineer to shape and accelerate the delivery of the target-state data platform enabling GenAI use cases. We want a passionate specialist who loves to build data solutions as part of a multi-disciplinary team, working closely with digital product professionals, data scientists, cloud engineers and others.
Your impact within our firm

We are looking for a Principal Data Engineer to join the Lilli team in enabling GenAI applications with data. You’ll learn what it’s like to build and work on a GenAI product that has firm-wide impact.
As part of the team, you will sit on the technical side and work closely with product professionals to deliver data processing capabilities that enable top use cases, transforming how we work using GenAI. You will collaborate with cross-functional teams, including product managers, data scientists, engineers, data engineers, designers and firm stakeholders, to define product requirements, prioritize features and drive successful delivery.
You will lead multiple workstreams covering data infrastructure, the data engineering framework and its adoption, developer productivity, data processing capabilities, data enablement for GenAI agents and more.

What you'll do

As a Principal Data Engineer, you will be responsible for making critical technical design decisions that shape our data infrastructure. You will mentor and coach data engineers, fostering their growth and ensuring they adhere to best practices. Your role includes influencing the data engineering team structure to optimize delivery and maximize impact. You will also be accountable for the quality of deliverables, ensuring that all code meets high standards of reliability, scalability and performance. Your leadership and expertise will drive the team to achieve excellence in all aspects of data engineering.

Skills

8+ years of professional experience as a data engineer, with a strong focus on cloud-based data engineering using AWS services
Experience leading teams of engineering professionals
Creation of enterprise-grade data solutions at large scale
Expertise with Python development
Fluency with data architecture & data engineering patterns and ability to apply them appropriately
Experience with relational databases and vector stores
Strong experience with containerization technologies and AWS ECS
Practicing high coding standards with clean code, modularity, error handling, testing automation and more
Highly driven, with strong execution and output orientation, a get-the-job-done attitude, and the ability to figure things out independently; able to work in a complex, very fast-paced environment
Interest in Generative AI and other ML topics
Kedro framework experience is a plus
Holds their ground, opinionated, not afraid to speak up at any level
Familiarity with agile principles and product development
Excellent problem-solving skills and the ability to analyze and resolve complex data engineering challenges
Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment

Data Engineer
Duties


Your impact within our firm

We are looking for a Data Engineer with expertise in Python development who is passionate about cloud-based data engineering using AWS services and loves to build data solutions as part of a multi-disciplinary team.

You would be working closely with digital product professionals, data scientists, cloud engineers and others.

You’ll be a member of a global team working on a GenAI initiative, based in one of our European offices. The client’s Tech Ecosystem function is responsible for developing and delivering all technology solutions for the firm’s internal use.

You will work in a team of data engineers to develop data ingestion pipelines and to create and mature the data processing capabilities that ingest data into a data system used by GenAI applications.

Your work will include, but won’t be limited to, writing Python code and tests, creating and modifying GitHub Actions CI/CD pipelines, and working with AWS-based infrastructure and Docker containers.

What you'll do
You will work in a team of data engineers to develop data ingestion pipelines and to create and mature the data processing capabilities that ingest data into a data system used by GenAI applications. Work includes, but is not limited to, writing Python code and tests, creating and modifying GitHub Actions CI/CD pipelines, and working with AWS-based infrastructure and Docker containers.
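To give a flavour of the kind of ingestion work described above, here is a minimal, illustrative sketch in plain Python of a parse-validate-transform step of the sort such pipelines contain. It is not the client's actual codebase; the record fields, function names and `Document` type are hypothetical, and a real pipeline would add dead-letter handling, logging and a loader targeting the downstream store.

```python
import json
from dataclasses import dataclass


@dataclass
class Document:
    """A cleaned record, ready for loading into a downstream data store."""
    doc_id: str
    text: str


def parse_records(raw_lines):
    """Parse newline-delimited JSON, skipping malformed lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            # In production, bad lines would be routed to a dead-letter queue.
            continue
    return records


def transform(records):
    """Keep records that have both required fields; normalise whitespace."""
    docs = []
    for rec in records:
        if rec.get("id") and rec.get("text"):
            docs.append(Document(doc_id=str(rec["id"]),
                                 text=" ".join(rec["text"].split())))
    return docs


if __name__ == "__main__":
    raw = ['{"id": 1, "text": "  hello   world "}', "not json", '{"id": 2}']
    for doc in transform(parse_records(raw)):
        print(doc.doc_id, doc.text)
```

Each stage is a small pure function, which keeps the pipeline easy to unit-test and to wire into a framework such as Kedro (mentioned under Skills) or into a CI job.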

Skills

3+ years of professional experience as a data engineer, with a strong focus on cloud-based data engineering using AWS services
Expertise with Python development
Practicing high coding standards with clean code, modularity, error handling, testing automation and more
Strong experience with relational databases
Highly driven, with strong execution and output orientation, a get-the-job-done attitude, and the ability to figure things out independently; able to work in a complex, very fast-paced environment
Hands-on experience with Docker
Solid and demonstrable background in data pipeline performance and diagnostics
Interest in Generative AI and other ML topics
Kedro framework experience is a plus
Holds their ground, opinionated, not afraid to speak up at any level
Familiarity with agile principles and product development
Excellent problem-solving skills and the ability to analyze and resolve complex data engineering challenges
Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment

About the Company

eTeam was formed in 1999 with the goal of becoming the supplier of choice for clients, employees and contingent workers. Today, we're one of the fastest-growing companies in New Jersey, ranked as one of the best companies to work for by Staffing Industry Analysts and New Jersey Business, and an honored member of Deloitte's Technology Fast 50. eTeam provides high-volume staffing, SOW and payrolling services to structured contingent workforce programs and projects across the U.S., Canada and India. We also offer ...

Related Jobs

Company Name
Square One Resources
Job Title
SC Cleared - Service Layer Data Architect - Fully Remote UK
Job Description
Job Title: SC Cleared - Service Layer Data Architect - Fully Remote
Location: UK Based - Fully Remote
Salary/Rate: Up to £605 a day, Inside IR35
Start Date: 01/05/25
Job Type: Contract

Company Introduction
We are looking for an SC Cleared Service Layer Data Architect to join our client, a global professional services company, to work on a high-profile healthcare-sector project. Candidates applying must hold active Security Clearance. The successful candidate for this role must be able to start on 1st May 2025.

Required Skills/Experience
SC CLEARANCE REQUIRED. You will work closely with the Application Architect but have a particular focus on the data flow and data architecture between the healthcare clients. Experience of UK health protocols is very beneficial, and HL7 or integration experience would be extremely beneficial. You will be responsible for taking the current high-level logical architecture and developing it into a deliverable, validated design within 2 months. You should have a deep understanding of core Azure services such as Logic Apps, Azure Functions, Service Bus, API Management, Event Hubs and Azure Storage to design and build integration flows; knowledge of common integration patterns such as pub/sub, request-response, batch processing and event-driven architectures to create efficient data exchange mechanisms; and knowledge of data flow and database design to understand and design logging, data retention and similar concerns. You must be able to make decisions and own designs, with excellent communication skills.

Job Responsibilities/Objectives
You will be responsible for designing and overseeing the data integration of the service layer and translating this into user stories with the help of the senior members of the team. You will work with the Application Architect in communicating with key resources within the client screening platform around the design and implementation, to ensure the company delivers to plan.
You will also be responsible for ensuring that the organisation shows value through the delivery life cycle. The service layer project is the first agile project, and you will play a key role in showing what the company can do. You will develop and document the application architecture, including system components, data models and integration points. In the Azure technology stack selection, you will evaluate and choose appropriate technologies, frameworks and tools based on performance, scalability and maintainability requirements. You will ensure that all development technologies, approaches and standards adhere to the client's standards; guide the development team on architectural decisions, best practices and coding standards; and incorporate security measures into the architecture to protect sensitive data and comply with industry standards. If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given on the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies. Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.
England, United Kingdom
On site
Freelance
11-03-2025
Company Name
Sanderson
Job Title
Business Analyst - AI/Automation - London Markets
Job Description
Business Analyst - AI/Automation - London Markets
£650 - £700 Inside via Umbrella - can be some flexibility here
6 months initially - will run for at least another 2 years
Hybrid: 2 days per week in London, some travel to Paris

I am looking for a Business Analyst to join a speciality insurance client of ours as they look to implement an AI/Automation tool into their underwriting function for Europe.

Skills/Experience
London Markets experience. Experience of international markets. Experience of implementing AI/Automation/Robotics or Robotic Process Automation (RPA) tools. Experience of working in a small to medium-sized project team. Front-footed. Comfortable working in ambiguity. Strong knowledge of the underwriting process.
London, United Kingdom
On site
Freelance
11-03-2025
Company Name
ERSOU (Eastern Region Special Operations Unit)
Job Title
Data Exploitation Team Practitioner
Job Description
Data is now at the heart of all our investigations; the volume and complexity of this data is rapidly changing, and we need to keep pace with the advancements in technology in the world around us. This is an exciting opportunity to join a new team at the forefront of digital data exploitation. We are seeking proactive, inquisitive and committed individuals to join our new Data Exploitation Team. The Data Exploitation Practitioner will work alongside Intercept Case Officers to analyse sensitive intelligence data to identify and extract information of intelligence value. You will provide direct support to operations by utilising your experience, skills and knowledge to gather, develop and analyse intelligence. In addition to supporting operations using cutting-edge technology, you will also be responsible for more strategic work, such as inputting into future capability requirements and developing tradecraft and knowledge to share with the team, the wider community and our partners.

Key responsibilities:
Work alongside Intercept Case Officers to analyse sensitive intelligence data to identify and extract information of intelligence value.
Identify opportunities to utilise and gather sensitive intelligence against the Operations Lead's priorities, identifying gaps in collections and opportunities for filling those gaps.
Build relationships and work closely with the Intercept Case Officers and the Sensitive Intelligence Units to better understand gaps and requirements.
Identify opportunities that might be available within data collected or that could be collected.
Create intelligence notes/logs for dissemination to further investigations.
Develop capability requirements, techniques, methods and technologies for analysis of complex data, identifying new ways of working and sharing knowledge with the wider team, community and partners.
Input into the strategic requirements for tooling and data to enhance future capabilities.
Ensure that the handling and use of sensitive intelligence is in accordance with legislation, policy and procedures. Ensure, on a day-to-day basis, the security of information and sensitive intelligence.

Applicants would ideally come from a Data Analyst, Intelligence Development Officer (IDO) or Intelligence Researcher background. Any experience of carrying out digital investigations, open-source intelligence (OSINT) or digital forensics investigations would be beneficial in the role. We are seeking individuals who are proactive, inquisitive and committed to their work, with a strong desire to expand their knowledge and experience. Candidates will possess strong interpersonal skills and be competent in both verbal and written communication, e.g. able to write reports which are clear and succinct for the end user. You should be able to convey complex intelligence information and concepts in a clear and understandable manner, tailored to audiences at all levels. You should be self-motivated and have the ability to work under pressure, prioritising workloads and working to tight timescales within an ever-changing environment. You should be able to demonstrate your experience of problem solving in an intelligence environment. As a minimum, you should have a good knowledge of Excel or other similar platforms.

About ERSOU
The Eastern Region Special Operations Unit (ERSOU) is dedicated to tackling serious organised crime and the terrorism threat across the Eastern Region. Created in 2010, ERSOU is funded by the seven police forces that make up the Eastern Region, with Bedfordshire Police being the lead force. The unit is made up of both police officers and staff from across the region. ERSOU has two main functions: the Regional Organised Crime Unit (ROCU) and Counter Terrorism Policing (CTP). The CTP unit aims to manage the threat of terrorism across the Eastern Region and is part of the CTP national network.
The unit has officers covering all four parts of the government's CONTEST strategy: Protect, Prepare, Prevent and Pursue. CTP also has a substantial presence at the various ports across the region, including London Stansted and London Luton Airports. Working for ERSOU presents a fantastic opportunity for you to learn and develop within a unique law enforcement community whose staff and officers are dedicated to protecting the public from the threat of terrorism and serious and organised crime. As an employer, ERSOU can offer:
Blue Light Card discounts
A generous pension scheme
24/7 access to an employee assistance programme
Opportunities to get involved with networking groups such as the network of women, LGBTQ+, parents, carers, and health and wellbeing

VETTING
Applicants must hold or be prepared to undergo Management Vetting (5 years' residency criteria) and Security Clearance (5 years' residency criteria), and you will be required to undertake developed vetting before taking up the post. Please be aware the Police Corporate Vetting Unit will undertake security vetting on successful candidates, their family and others that live at their home address, which will include financial checks. Due to the sensitive nature of these checks, we are not able to supply feedback should clearance be declined.
London, United Kingdom
On site
Freelance
11-03-2025
Company Name
Ubique Systems
Job Title
Data Warehouse Specialist
Job Description
Greetings from Ubique Systems! We are looking for a Data Warehouse Specialist for one of our customers, with expertise in the following:
Proficiency in Snowflake data warehouse architecture; design, build and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
Experience with DBT (Data Build Tool) for data transformation and modelling; implement data transformation workflows using DBT (Core/Cloud).
Strong Python programming skills for automation and data processing; leverage Python to create automation scripts and optimize data processing tasks.
Proficiency in SQL performance tuning and query optimization techniques using Snowflake.
Troubleshoot and optimize DBT models and Snowflake performance.
Knowledge of CI/CD and version control (Git) tools; experience with orchestration tools such as Airflow.
Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.
Ensure data quality, reliability and consistency across different environments.
Collaborate with other data engineers, data analysts and business stakeholders to understand data needs and translate them into engineering solutions.
Certification in AWS, Snowflake or DBT is a plus.
Employment type: Contract. The role would be hybrid, working in the Luton area.
Note: Only apply if you have a valid visa for the EU region; sponsorship will not be given.
Luton, United Kingdom
Hybrid
Freelance
11-03-2025