Tenth Revolution Group

Senior Data Engineer - Remote - £70k

Remote

Manchester, United Kingdom

Full Time

05-03-2025

Job Specifications

My client is a leading Data Partner and consultancy looking for an experienced Senior Data Engineer with skills in Microsoft Fabric, the Azure Data Platform and Python to join their expanding team, in a role which encompasses technical know-how and a client-facing skillset.

Salary And Benefits

Competitive salary of up to £70k
Annual performance-related bonus of 10%
Remote/hybrid working (once every 2 weeks in office) in Edinburgh, Manchester or London hubs
25 days annual leave (plus bank and public holidays)
Career progression programme - guaranteed learning and development investment
Life insurance
Private medical health insurance
Contributory pension scheme

Role And Responsibilities

Apply a wide range of data engineering skills, with a focus on having delivered in Microsoft Azure.
Develop good working relationships with clients on each project, drawing on interpersonal skills with both business- and technically-focused colleagues.
Work as a data engineer to develop performant end-to-end solutions in a collaborative team environment.
Deliver high-quality pieces of work, with a proven ability to escalate problems to the client or senior team members where necessary and propose possible solutions.
Support building the Consulting practice through contribution to ongoing initiatives. This can include contributing to knowledge-sharing activities and data services.
Demonstrate success in delivering commercial projects leveraging the above technologies.
Oversee junior staff, including mentoring, reviewing work, and ensuring project alignment with organisational goals and standards.

What Do I Need To Apply For The Role

Strong skills in Azure Data Factory, Azure Synapse and Microsoft Fabric.
Expertise in SQL and Python.
Experience working with relational SQL databases either on premises or in the cloud.
Experience delivering multiple solutions using key techniques such as Governance, Architecture, Data Modelling, ETL / ELT, Data Lakes, Data Warehousing, Master Data, and BI.
A solid understanding of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
Experience working with one or more of Spark, Kafka, or Snowflake.

My client has very limited interview slots and is looking to fill this vacancy within the next 2 weeks. I have limited slots for first-stage interviews next week, so if you're interested, get in touch ASAP with a copy of your most up-to-date CV and email me at or call me on .

Please Note: This is a permanent role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check.

Nigel Frank is the go-to recruiter for Power BI and Azure Data Platform roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at

About the Company

As the global leaders in solving the technology skills gap, we find, train, and deploy experienced professionals, reskill your existing staff and deliver world-class consulting services, all through our unique range of talent solutions.

What we do

Find talent - Permanent and contract recruitment for one role or an entire team
Train talent - Upskill existing employees via our training programs
Deploy talent - Access experienced professionals through our hire-train-deploy model
Consulting services - A new team or project support ...

Related Jobs

Company Name
Deliveroo
Job Title
Analytics Engineer
Job Description
Why Deliveroo? We're building the definitive online food company, transforming the way the world eats by making hyper-local food more convenient and accessible. We obsess about building the future of food, whilst using our network as a force for good. We're at the forefront of an industry, powered by our market-leading technology and unrivalled network to bring incredible convenience and selection to our customers. Working at Deliveroo is the perfect environment to build a definitive career, motivated by impact. Firstly, the impact that working here will have on your development, allowing you to grow faster than you might elsewhere; secondly, the impact that you can have on Deliveroo, leaving your mark as we scale; and finally, being part of something bigger, through the impact that we make together in our marketplace and communities.

The Role

Working as part of our Analytics Engineering team and reporting to one of our Analytics Engineering Managers, your role will be to provide clean, tested, well-documented and well-modelled data sets that will enable and empower data scientists and business users alike, via tools like Snowflake and/or Looker. You'll work with product engineering teams to ensure modelling of source data meets downstream requirements. You will maintain and develop SQL data transformation scripts, and advise and review data scientists on data modelling to achieve denormalised and aggregated output datasets. You'll work with data scientists and other analytics engineers to surface clean, intuitive datasets in our BI tool, Looker. You will be responsible for optimisation and further adoption of Looker as a data product in the business, catering to ~1,500 current active users who need to discover and interact with data.

Skillset Required

3+ years Analytics Engineering / Data Engineering / BI Engineering experience
Excellent SQL skills
Understanding of data warehousing, data modelling concepts and structuring new data tables
Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift)

Nice To Have

Experience developing in a BI tool (Looker or similar)
Good practical understanding of version control
SQL ETL/ELT knowledge, experience with DAGs to manage script dependencies
Python coding skills, particularly in the areas of automation & integrations
Good knowledge of the Looker API

Workplace & Diversity

At Deliveroo we know that people are the heart of the business and we prioritise their welfare. We offer multiple great benefits in areas including health, family, finance, community, convenience, growth and relocation. We believe a great workplace is one that represents the world we live in and how beautifully diverse it can be. That means we have no judgement when it comes to any one of the things that make you who you are - your gender, race, sexuality, religion or a secret aversion to coriander. All you need is a passion for (most) food and a desire to be part of one of the fastest growing start-ups around. Please click here to view our candidate privacy policy.
London, United Kingdom
On site
Full Time
06-03-2025
Company Name
Roku
Job Title
Data Scientist
Job Description
Teamwork makes the stream work.

Roku Is Changing How The World Watches TV

Roku is the #1 TV streaming platform in the US and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About The Role

Roku is looking for a Data Scientist to join the Core Analytics team supporting Roku Merchandising. In this role, you will leverage data analytics to help the merchandising team understand the business impact of placing promotional content in the Roku UI. This individual will be responsible for A/B testing, trend analysis, and dashboards and reporting on how promotional content is surfaced. The successful candidate is quantitatively driven, detail-focused, and possesses an elevated level of problem-solving expertise.

What You'll Be Doing

Develop and maintain dashboards, reports, and data visualizations to monitor key metrics for operational and systems health
Analyze structured and unstructured data and communicate insights to help stakeholders solve business problems, identify trends and make data-driven decisions
Develop necessary data pipelines to power automation, validation and reporting
Collaborate with stakeholders to align data science initiatives with organizational goals and strategy
Design and execute A/B tests
Perform exploratory data analysis on emerging trends and execute advanced analysis across the Roku Platform
Collaborate with the Program Management and Engineering teams to proactively seek and incorporate feedback

We're Excited If You Have

3+ years of work experience with a bachelor's or master's degree in a quantitative field (e.g., Statistics, Business Analytics, Data Science, Mathematics, Economics, Engineering or Computer Science)
3+ years of experience in consumer product, digital media or entertainment industries
Expertise in SQL, SAS, R, Python or another programming language to query data and perform analysis
Hands-on experience with visualization tools like Tableau or Looker
A bias towards action in resolving issues and the ability to operate in a high-energy, fast-paced environment
Hands-on experience in A/B testing and statistical modeling/forecasting

Benefits

Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture

Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a smaller number of very talented people can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
Cambridge, United Kingdom
On site
Full Time
06-03-2025
Company Name
Tenth Revolution Group
Job Title
Data Engineer - Databricks - Remote - £60k
Job Description
My client is a leading Data Partner and consultancy looking for an experienced Data Engineer with skills in Databricks, the Azure Data Platform and Python to join their expanding team, in a role which encompasses technical know-how and a client-facing skillset.

Salary And Benefits

Competitive salary of up to £60k
Annual performance-related bonus of 10%
Remote/hybrid working (once every 2 weeks in office) in Edinburgh, Manchester or London hubs
25 days annual leave (plus bank and public holidays)
Career progression programme - guaranteed learning and development investment
Life insurance
Private medical health insurance
Contributory pension scheme

Role And Responsibilities

Utilise a wide range of data engineering skills, with a focus on Databricks, Python, and the Azure Data Platform; experience with other cloud platforms is also desirable.
Develop strong working relationships with clients on projects, drawing on interpersonal skills with both business- and technically-focused colleagues.
Work as a data engineer to develop end-to-end solutions in a collaborative team environment.
Deliver high-quality pieces of work, with a proven ability to escalate problems to the client or senior team members where necessary and propose possible solutions.
Support building the Consulting practice through contribution to ongoing initiatives. This can include contributing to knowledge-sharing activities and data services.

What Do I Need To Apply For The Role

Strong experience with Databricks, either on Azure or AWS.
Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc.
Expertise in SQL, Python and Spark (Scala or Python).
Experience working with relational SQL databases, either on premises or in the cloud.
Experience delivering multiple solutions using key techniques such as Governance, Architecture, Data Modelling, ETL / ELT, Data Lakes, Data Warehousing, Master Data, and BI.
A solid understanding of key processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
Experience working with one or more of Kafka, Snowflake, Azure Data Factory, Azure Synapse or Microsoft Fabric is highly desirable.
Knowledge of data modelling and data architectures: Inmon, Kimball, Data Vault.
Strong client-facing and consultancy skills and experience.

My client has very limited interview slots and is looking to fill this vacancy within the next 2 weeks. I have limited slots for first-stage interviews next week, so if you're interested, get in touch ASAP with a copy of your most up-to-date CV and email me at or call me on .

Please Note: This is a permanent role for UK residents only. This role does not offer sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks, including a DBS and credit check.

Nigel Frank is the go-to recruiter for Power BI and Azure Data Platform roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at
Manchester, United Kingdom
Remote
Full Time
05-03-2025
Company Name
Ziggy | Revenue-First Demand Gen Agency
Job Title
Lead Data Engineer
Job Description
We're looking for an experienced, organised data engineer to lead our data services across the agency. As a member of the data team, you'll help shape the way we analyse, report and measure performance across Ziggy's clients. We're looking for a passionate, experienced, and outcome-driven candidate to help our key customers scale revenue through paid media, data science and marketing automation. The ideal candidate is highly collaborative and a self-starter, with strong problem-solving and analytical ability. You're a strategic thinker who can build customer insights from data to help achieve marketing and sales goals. You take a creative approach to challenges and initiatives and never settle for good enough. You're strategic and collaborative with a strong bias to action, and you care about doing what's best for our customers, for Ziggy, and for your team.

Ziggy's mission is to become one of the leading B2B performance marketing agencies in the world. Our suite of services helps tech companies - from startups to enterprise - scale revenue from demand gen. You'll work alongside an experienced team of account directors, performance directors and a growth marketing team to deliver best-in-class B2B marketing.

What You'll Be Doing:

Design, build, maintain and monitor data pipelines on Google Cloud Platform.
Create dashboards for external and internal stakeholders using Looker Studio.
Optimise data workflows and architecture for efficiency and cost savings.
Implement dimensional modelling for marketing, sales and commercial data in BigQuery.
Develop DAGs and orchestrate data pipelines using Airflow.
Collaborate with performance teams, clients and data analysts to enhance reporting and insights.
Streamline and automate data processes within the company.
Advocate for data as a product to ensure data is documented, reviewed, and accurate.
Explain complex technical information in a simple way.
Lead strategic and technical conversations internally and externally for the data function.
Lead our AI innovation projects: integrating LLMs into our data pipelines and finding new ways to leverage AI to automate insights and improve our agency operations.

Who You Are:

Previous work experience as a data engineer or in a similar role
Skilled in building scalable, well-tested Python data pipelines with good coding practices
Proficient in extracting data from REST APIs using third-party tools like Adverity, Fivetran, Airbyte
Experience with data orchestration tools like Dataform, Apache Airflow
Strong Python and SQL skills for data wrangling and modelling
Degree in Computer Science, IT, Business or a similar field; a Master's is a plus
Data engineering certifications are a plus
Autonomous problem solver with strong communication skills
Able to educate non-technical stakeholders on data solutions
Forward-thinking, with a focus on innovative data opportunities as we gain maturity

Bonus (nice to have)

B2B SaaS marketing domain knowledge
Knowledge of cloud engineering systems (GCP)

Benefits

Salary range - £40-50k
10% annual bonus, paid as 2.5% per quarter if targets are hit
5% pension contribution
4pm finish on Fridays
Quarterly 1-1 coaching sessions
25 days + bank holidays
4 wellness days per year (1 per quarter)
Remote first with monthly socials
MacBook Pro or other laptop

Glassdoor reviews for this role

"Working at Ziggy gives you the opportunity to do some new and exciting work if you choose to. Senior management are ready to support you in any new skills, career development or qualifications that you wish to do and are super friendly, approachable and engaged with their team. If you're looking for somewhere to be challenged, but not overwhelmed, this will be a very good opportunity for you. There is also a strong focus on working culture and work-life balance with regular out-of-work socialising, lots of holiday, early finishes and other fun perks. Overall, a really great time working here, can't find any way to fault it!" - Glassdoor, Data Scientist in London, England
United Kingdom
Remote
Full Time
06-03-2025