WRK digital

Analytics Engineer

On site

Leeds, United Kingdom

£80,000 / year

Full Time

24-02-2025

Job Specifications

WRK digital is delighted to team up with Vintage Cash Cow, a fast-growing online platform that brings preloved vintage treasures to a global audience. Since launching in 2016, they've impressively grown from a small team of five to over 100 experts, and now we are shortlisting for an Analytics Engineer to join their Technology & Data team!

Vintage Cash Cow is the UK’s leading service for selling vintage and antique items in one simple, hassle-free process. With a mission to promote sustainability and support a circular economy, they empower customers to unlock the value of their cherished items while reducing waste. As they expand into new markets, they are looking for a highly skilled and motivated Analytics Engineer to join a dynamic team and help elevate their data infrastructure and analytics capabilities. If you have experience with Snowflake and digital marketing analytics, and are familiar with Hubspot’s data, we want to hear from you!

Role Overview:
As an Analytics Engineer, you will play a crucial role in developing and maintaining the data architecture, pipelines, and reporting frameworks that enable teams across Vintage Cash Cow (VCC) to make data-informed decisions. You will work closely with both technical and marketing teams, ensuring data accuracy, integrity, and scalability. Your experience with Snowflake, digital marketing analytics, and Hubspot will be key to delivering insights that drive growth and optimisation across their digital channels.

This will be the company’s first data hire as it looks to build out its data capabilities. You should possess a good working knowledge of data warehousing and of best practice in data engineering.

Key Responsibilities:
Data Architecture & Pipeline Development:
Design, implement, and maintain robust and scalable data pipelines, ensuring seamless integration and flow of data across multiple systems.
Data Integration:
Manage and integrate marketing data sources into the data warehouse, ensuring consistency and quality across the data ecosystem. These sources include Adalyser, Meta Ads, Google Ads, Hubspot and Aircall.
Analytics & Reporting:
Collaborate with cross-functional teams (marketing, product, etc.) to build actionable dashboards and reports using your choice of analytics tool. Focus on performance analysis of digital marketing campaigns, ROI, customer journeys, and more.
Data Quality & Validation:
Implement processes to monitor and ensure the accuracy, completeness, and consistency of data. Conduct regular data audits and resolve data issues as needed; a brief illustrative check of this kind is sketched after this list.
Optimisation & Automation:
Identify opportunities for automation and optimisation of data workflows and reporting processes, driving efficiency across the business.
Collaboration:
Work closely with marketing teams to understand key KPIs, campaign metrics, and insights needs, translating them into actionable data solutions.
Documentation & Best Practices:
Document data architecture, processes, and workflows, ensuring a clear understanding of the data ecosystem and promoting best practices within the team.
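
By way of illustration only (this is not part of the role specification): the completeness checks described under Data Quality & Validation above can start as simply as measuring the null rate of each column before an extract is loaded into the warehouse. The minimal Python sketch below assumes hypothetical column names, sample rows and a 5% threshold.

```python
# Purely illustrative sketch of a simple completeness check, of the kind the
# "Data Quality & Validation" responsibility describes. Column names, sample
# rows and the 5% threshold are hypothetical, not taken from this posting.

from typing import Iterable


def null_rate(rows: Iterable[dict], column: str) -> float:
    """Fraction of rows in which `column` is missing or None."""
    rows = list(rows)
    if not rows:
        return 0.0
    missing = sum(1 for row in rows if row.get(column) is None)
    return missing / len(rows)


if __name__ == "__main__":
    # Hypothetical ad-spend extract awaiting load into the warehouse.
    extract = [
        {"campaign_id": "meta-001", "spend": 120.50, "date": "2025-02-01"},
        {"campaign_id": "gads-017", "spend": None, "date": "2025-02-01"},
        {"campaign_id": "gads-018", "spend": 75.00, "date": None},
    ]
    for column in ("campaign_id", "spend", "date"):
        rate = null_rate(extract, column)
        # Flag any column whose completeness falls below the agreed threshold.
        status = "OK" if rate <= 0.05 else "REVIEW"
        print(f"{column}: {rate:.0%} missing -> {status}")
```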

Required Skills and Experience:
Snowflake Experience:
Solid experience in managing and optimising Snowflake environments, including data loading, querying, and creating views and stored procedures.
Digital Marketing Analytics Knowledge:
Proven experience working with digital marketing data, including understanding of key metrics (e.g., conversion rates, CAC, LTV) and their analysis across platforms like Google Analytics, social media, SEM, etc.
Hubspot Data Expertise (Preferred):
Experience in working with Hubspot data, including extracting, transforming, and loading (ETL) Hubspot data into a central data warehouse. Familiarity with Hubspot’s reporting tools is a plus.
SQL Proficiency:
Strong SQL skills, with experience in writing complex queries to manipulate and extract insights from large datasets.
ETL & Data Integration Tools:
Experience with ETL tools and frameworks such as Airflow, dbt, or similar.
Data Visualization:
Experience in building interactive and insightful dashboards using BI tools.
Attention to Detail & Problem-Solving Skills:
Strong analytical mindset with a focus on data accuracy, troubleshooting, and resolving complex data issues.
Communication Skills:
Ability to communicate complex technical concepts to both technical and non-technical stakeholders.
Experience in Python or R for data analysis and automation is a plus.
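
For candidates less familiar with the terminology, and since Python is listed as a plus, the sketch below shows how the digital marketing metrics mentioned above (conversion rate, CAC, and ROAS as a spend-efficiency measure) reduce to simple arithmetic. It is purely illustrative: the channels, field names and figures are hypothetical and not taken from this posting.

```python
# Purely illustrative: the marketing metrics named above reduce to simple
# arithmetic. Channel names, field names and figures are hypothetical and
# are not taken from this posting.

from dataclasses import dataclass


@dataclass
class CampaignStats:
    channel: str        # e.g. "Meta Ads", "Google Ads"
    spend: float        # total spend attributed to the campaign, in GBP
    clicks: int         # landing-page clicks
    new_customers: int  # conversions attributed to the campaign
    revenue: float      # revenue attributed to the campaign, in GBP

    @property
    def conversion_rate(self) -> float:
        """Share of clicks that became customers."""
        return self.new_customers / self.clicks if self.clicks else 0.0

    @property
    def cac(self) -> float:
        """Customer acquisition cost: spend per new customer."""
        return self.spend / self.new_customers if self.new_customers else float("inf")

    @property
    def roas(self) -> float:
        """Return on ad spend: attributed revenue per pound spent."""
        return self.revenue / self.spend if self.spend else 0.0


if __name__ == "__main__":
    campaigns = [
        CampaignStats("Meta Ads", spend=4200.0, clicks=18500, new_customers=310, revenue=15900.0),
        CampaignStats("Google Ads", spend=6800.0, clicks=22100, new_customers=455, revenue=27300.0),
    ]
    for c in campaigns:
        print(f"{c.channel}: conversion {c.conversion_rate:.2%}, CAC £{c.cac:.2f}, ROAS {c.roas:.2f}x")
```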

Role Benefits:
This is a hybrid position based at their Leeds office, offering a flexible work schedule and remote working options. There is a very competitive salary and performance bonus alongside comprehensive health and wellness benefits. There are brilliant professional development opportunities in a dynamic and creative work environment. You would be working as part of a highly skilled team with over 20 years' experience in the sector.

Next Steps:
If this role sounds of interest, please click apply or reach out to James Westwood at james@wrkdigital.co.uk to discuss further.

About the Company

WRK digital was founded to be a Trusted, Passionate and Authentic business, partnering with some of the most exciting companies to scale their technology teams and disrupt our industry. WRK digital excels in four core disciplines: Technology Leadership; Data, BI and Analytics; Software Engineering; and Transformation and Programme Delivery. The challenge of recruiting experienced and exceptional technology talent has never been more complex; however, WRK digital acts as an extension of your business and represents your brand ...

Related Jobs

Company Name
Datatech Analytics
Job Title
Senior Azure Data Engineer
Job Description
Senior Azure Data Engineer
Remote Working - UK Home-based with occasional travel into the office
£39,784 - £49,477 (National Framework); £45,456 - £55,149 (London Framework - if you are London or home-based and live within the boundary of the M25). Homeworking allowance of £581 per annum. Additional allowance for exceptional candidates - max salary £62,000 outside London and £67,000 for London (within the M25).
Job Ref: J12917

The role of the Senior Azure Data Engineer is to design, build, test and maintain data on the enterprise data platform, allowing accessibility of the data that meets business and end user needs. The role holder will be responsible for maximising the automation, scalability, reliability and security of data services, focusing on opportunities for re-use, adaptation and efficient engineering.

Accountabilities: Design, build and test data pipelines and services, based on feeds from multiple systems using a range of different storage technologies and/or access methods provided by the Enterprise Data Platform, with a focus on creating repeatable and reusable components and products. Design, write and iterate code from prototype to production ready. Understand security, accessibility and version control. Use a range of coding tools and languages as required. Work closely with colleagues across the Data & Insight Unit to effectively translate requirements into solutions, and accurately communicate across technical and non-technical stakeholders as well as facilitating discussions within a multidisciplinary team. Deliver robust, supportable and sustainable data solutions in accordance with agreed organisational standards that ensure services are resilient, scalable and future proof. Understand the concepts and principles of data modelling and produce, maintain and update relevant physical data models for specific business needs, aligning to the enterprise data architecture standards. Design and implement data solutions for the ingest, storage and use of sensitive data within the organisation, including designing and implementing row and field-level controls as needed to appropriately control, protect and audit such data. Clearly, accurately and informatively document and annotate code, routines and other components to enable support, maintenance and future development. Work with QA Engineers to execute testing where required, automating processes as much as possible. Keep abreast of opportunities for innovation with new tools and uses of data. Learn from what has worked as well as what has not, being open to change and improvement and working in ‘smarter’, more effective ways. Work collaboratively, sharing information appropriately and building supportive, trusting and professional relationships with colleagues and a wide range of people within and outside of the organisation. Provide oversight and assurance of suppliers and team members, coaching and mentoring colleagues to create a highly performant and effective team. Design and undertake appropriate quality control and assurance for delivery of output. Provide direction and guidance to peers and junior colleagues, including line management and development of teams, where required.

Essential Skills and Experience: Educated to degree level or equivalent professional experience. Experience translating business requirements into solution design and implementation. Experience of MS Azure Databricks. Experience working with database technologies such as SQL Server, and data warehouse architecture, with knowledge of big data, data lakes and NoSQL. Experience following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe and DevOps, and use of associated tooling (e.g., version control, task tracking). Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies. Understanding of the principles of data modelling and data flows, with the ability to apply this to the design of data solutions. Experience of supporting and enabling AI technologies. Experience implementing data flows to connect operational systems, data for analytics and BI systems. Experience documenting source-to-target mappings. Experience in assessing and analysing technical issues or problems in order to identify and implement the appropriate solution. Knowledge and experience of data security and data protection (e.g., GDPR) practices and application of these through technology. Strong decision-making, leadership and mentoring skills. Ability to identify problems and lead the delivery of solutions and preventative measures, escalating where appropriate. Proven ability to understand stakeholder needs, manage their expectations and influence at all levels on the use of data and insight.

Desirable: Knowledge and understanding of the health and care sectors. Experience of using Power BI and creating reports/analysing data.

Additional Requirements: Candidates must have an existing and future right to live and work in the UK. Sponsorship at any point is not available.

If this sounds like the role for you then please apply today! Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes! If you have a friend or colleague who would be interested in this role, please refer them to us. For each relevant candidate that you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme. Datatech is one of the UK’s leading recruitment agencies in the field of analytics and host of the critically acclaimed event, Women in Data. For more information, visit our website: www.datatech.org.uk
London, United Kingdom
On site
Full Time
24-02-2025
Company Name
Onyx Data
Job Title
Azure Data Engineer
Job Description
About the Role: We are seeking a highly skilled Azure Data Engineer with substantial experience in T-SQL and Spark to join our dynamic team. This is a remote position for UK-based applicants. The ideal candidate will have strong expertise in designing, developing, and optimising data solutions on the Microsoft Azure ecosystem and will hold relevant Microsoft certifications. As an Azure Data Engineer, you will play a crucial role in designing and implementing scalable data architectures, developing robust ETL pipelines, and optimising data processing frameworks to support business intelligence, analytics, and AI initiatives.

Key Responsibilities:
Data Engineering & Development: Design, develop, and maintain Azure-based data solutions. Develop complex T-SQL queries and optimise database performance. Build and manage big data processing pipelines using Apache Spark (Databricks, Synapse Spark). Implement ETL/ELT processes using Microsoft Fabric, Azure Data Factory and Synapse Pipelines.
Azure Cloud & Architecture: Work with Microsoft Fabric, Azure Data Lake, Azure Synapse Analytics, and Azure SQL to manage data storage and processing. Develop and maintain data models, schemas, and integrations. Ensure solutions are scalable, resilient, and cost-effective.
Performance Optimisation & Best Practices: Tune queries and optimise Spark jobs for performance and cost efficiency. Implement data governance, security, and compliance standards. Ensure high availability and reliability of data solutions.
Collaboration & Stakeholder Engagement: Work closely with data analysts, BI developers, and business stakeholders to understand data needs. Collaborate with cross-functional teams to integrate data solutions into business applications. Document data processes and provide knowledge sharing within the team.

Key Skills & Experience: Strong experience with T-SQL (query writing, stored procedures, indexing, performance tuning). Hands-on experience with Apache Spark (PySpark, Scala, Databricks, Synapse Spark). Proficiency in Azure Data Platform Services, including Microsoft Fabric, Azure Data Factory (ETL/ELT pipelines), Azure Synapse Analytics (formerly SQL DW), Azure Data Lake Storage (ADLS Gen2) and Azure SQL Database. Experience working with structured, semi-structured, and unstructured data. Knowledge of data architecture best practices, data governance, and security compliance. Strong analytical and problem-solving skills. Excellent communication skills and ability to work independently in a remote setting.

Preferred Qualifications & Certifications: Microsoft certifications (preferred but not essential): DP-700: Fabric Data Engineer Associate; DP-600: Fabric Analytics Engineer Associate; DP-203: Azure Data Engineer Associate; DP-900: Microsoft Azure Data Fundamentals; AZ-900: Microsoft Azure Fundamentals. Experience with CI/CD for data pipelines, DevOps practices, and Infrastructure as Code (Terraform, Bicep) is a plus.

What We Offer: Competitive salary of £40,000 - £55,000 depending on experience. Fully remote role within the UK. Career growth opportunities within a leading Azure data team. Exposure to cutting-edge Microsoft Azure data technologies and projects. Support for professional certifications and continuous learning.

If you're a highly skilled Azure Data Engineer with a passion for T-SQL, Spark, and Microsoft Azure, we want to hear from you! Apply now and be part of an innovative team driving data excellence!
United Kingdom
Remote
Full Time
24-02-2025
Company Name
Mr Fothergill's Seeds Ltd
Job Title
Data Analyst
Job Description
At Mr Fothergill’s Seeds we are in the business of bringing gardens to life. Known for our exceptional range of seeds, garden products and horticultural expertise, with well-known brands such as Mr Fothergill’s, Johnsons, DT Brown and Darlac, our products can be found in garden centres, big retailers and direct to consumers through our websites and catalogues. We have a legacy of quality, a loyal customer base and are committed to inspiring and empowering gardeners of all levels. We are a passionate team with a strong can-do attitude. We show respect to one another and are keen to innovate in our respective areas of expertise. We value agility, understand the journey our business is on and strive to add our unique mark onto that journey. We work hard but don’t forget to have fun. Our company purpose is “Helping Everyone Grow” and we extend that purpose to everything we do.

The Data Analyst is responsible for the science behind our direct-to-consumer marketing initiatives and will be at the heart of our data-driven decision-making process, managing and analysing customer data across multiple sources to maximize marketing impact. This role will report to our Head of IT and work closely with the Chief Marketing Officer and wider marketing team.

Core Responsibilities:
Customer Data Management: Oversee our customer data platform to accurately identify customers, build segmentation models, and deliver insights into customer behaviours.
Centralize Operational Data: Maintain and centralize data across all direct-to-consumer channels within a dashboard, reporting against budget targets.
Optimization and Cost-Saving Initiatives: Identify cost-saving opportunities and actions to improve profit margins and provide measurable recommendations to the marketing team.
Sales & Revenue Reporting: Produce daily and weekly reports on sales revenue and margin by channel, as well as regular customer segmentation reports for the marketing team.
Collaboration with IT: Partner closely with the IT team to enhance data capture and reporting processes, ensuring the accuracy and accessibility of data.
Performance Insights: Present data insights to the e-commerce team to guide decisions on marketing activities, focusing on ROI and ROAS metrics.

We are looking for someone who has a minimum of 2 years’ experience in a data marketing role, ideally within a direct-to-consumer organisation. You will possess strong analytical abilities and be practiced in effectively communicating complex insights appropriate to the audience. You will also demonstrate:
Technical Knowledge: Familiarity with ERP systems, PowerBI, and Shopify Plus is desirable.
Analytical Skills: Proven ability to interpret complex data, generate actionable insights, and present findings clearly.
Reporting Proficiency: Experienced in transforming raw data into user-friendly, insight-driven reports for various audiences.
Team-Oriented Approach: Collaborative approach with excellent communication skills, able to work effectively with cross-functional teams.
Problem-Solving: Energetic and solution-oriented, with a “can-do” attitude that drives results.

The world of gardening should be accessible to all, and it’s important that our teams reflect the diversity of our customers, ensuring that our products meet everyone’s needs. Therefore, we very much welcome applications from a diverse range of candidates regardless of background, gender, race, religious beliefs, disability, sexual orientation, or age. We’re ‘rooting’ for you!
Kentford, United Kingdom
On site
Full Time
24-02-2025
Company Name
Thurn Partners
Job Title
Data Engineer | London
Job Description
Our client is a global investment management firm that utilises a diversified portfolio of systematic and quantitative strategies across financial markets, seeking high quality returns. A technology and data-driven firm, they design and build their own cutting-edge systems, from high performance trading platforms to large scale data analysis and compute farms. Owing to recent success over the past year, our client is looking to expand its London office alpha data team. They are seeking someone with a high level of academic pedigree and commerciality, and an ability to work in a fast-paced environment. The firm's commitment to innovation and intellectual growth provides an ideal environment for ambitious professionals seeking to make a significant impact in the financial technology sector.

Your Role: Design, implement, test, optimize and troubleshoot Python data pipelines, frameworks and services. Responsible for delivering a vast quantity of valuable data to the rest of the company at speed. Collaborate with and influence technologists and investment researchers to ensure our data pipelines and platform meet constantly evolving requirements. Work closely with data operations and data platform developers to improve our data platform and reduce our technical debt.

Experience Required / About You: Professional experience of coding to a high standard in Python. Bachelor's degree in Computer Science or a related subject. Excellent communication skills. Experience and knowledge of SQL. Preferred: experience with big data frameworks, databases, distributed systems, or cloud development.

Further Context: Hybrid working structure - 4 days in the office, 1 day working from home a week. London based office. Competitive salary. A successful candidate will be able to collaborate closely with team members and end users, bring a fresh perspective and contribute to the dynamic, forward-thinking culture of the company.

Pre-Application: Please do not apply if you are looking for a contract role. You must be eligible to live and work in the UK; visa sponsorship can be discussed. Please ensure you meet the required experience section prior to applying. Your application is subject to our privacy policy, found here: https://www.thurnpartners.com/privacy-policy
London, United Kingdom
Hybrid
Full Time
24-02-2025