
Head of Data Science
On site
London, United Kingdom
Full Time
23-04-2025
Job Specifications
Current responsibilities of Head of Data Science
Lead the data science strategy and team to deliver data science solutions (e.g. retention, acquisition and customer management) using Python and Spark
Lead the hiring to build a great pool of Data Scientists and Engineers for the team and support the recruitment activities of other data functions
Motivate, inspire, coach and mentor colleagues within the Data Science team to help them develop technical excellence
Define clear objectives for each individual managed, ensuring each has a personal development plan and proactively works towards it
Support and mentor Data Scientists and Engineers who have direct reports in their role as line managers
Motivate, inspire, coach and mentor business partners and stakeholders to help them identify new transformational possibilities that Data Science enables
Engage with senior stakeholders to identify and implement machine learning solutions
Work actively in the innovation team to catalogue, enable and propose innovation ideas
A Head of Data Science is a responsible authority with the requisite knowledge to work across portfolios in the domain and provide strategic technical direction that can optimise enterprise outcomes. This particular role focuses on the portfolios within the Legal Technology Solutions area, including Lawyer Productivity, Legal Digital Products, Knowledge Systems and Data Science. It is a key role in driving digital transformation and helping to ensure that the vision is delivered in a rapid, iterative way while focusing on the overall experience for users.
Collaborate and work in tandem with different business and technical teams
Identifying key data sources required to solve the business problem and undertaking data collection, pre-processing and analysis
Big picture thinking - correctly diagnosing problems and productionising research.
In charge of demonstrations, conducting demo trials, helping clients evaluate success criteria, and training users
Compile, integrate, and analyse data from multiple sources to answer business questions
Stay up to date with the latest technological advances and evaluate their potential through hands-on work
Quality assurance of team deliverables
Partner management (Microsoft, start-up discussions)
Manage the scrum-of-scrums
About the Company
HCLTech is a global technology company, home to more than 220,000 people across 60 countries, delivering industry-leading capabilities centered around digital, engineering, cloud and AI, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending Dece...
Related Jobs


- Company Name
- PDI Technologies
- Job Title
- Data Scientist II (TBH5177)
- Job Description
- At PDI Technologies, we empower some of the world's leading convenience retail and petroleum brands with cutting-edge technology solutions that drive growth and operational efficiency. By “Connecting Convenience” across the globe, we empower businesses to increase productivity, make more informed decisions, and engage faster with customers through loyalty programs, shopper insights, and unmatched real-time market intelligence via mobile applications, such as GasBuddy. We’re a global team committed to excellence, collaboration, and driving real impact. Explore our opportunities and become part of a company that values diversity, integrity, and growth.

Role Overview: We are seeking a skilled and motivated Data Scientist II to join our team. In this role, you will leverage your advanced analytical skills and programming expertise to extract insights from complex datasets, develop predictive models, and support decision-making for our diverse range of customers. As a mid-level contributor, you will work on a variety of data-driven projects, collaborate with cross-functional teams, and help implement scalable solutions.
Key Responsibilities
Data Analysis & Modelling: Analyse large, complex datasets to identify trends, patterns, and actionable insights. Develop, implement, and optimize machine learning models to solve business problems. Conduct A/B testing and experimental analysis to validate hypotheses.
Data Management & Engineering: Collaborate with data engineering teams to ensure data quality, accessibility, and efficiency. Design and develop ETL pipelines and workflows for data pre-processing. Develop automated tests to validate the processes and models you create.
Collaboration & Communication: Collaborate with stakeholders to define project goals, requirements, and deliverables. Actively participate in design meetings to help shape the solutions that the team delivers. Present findings and recommendations to technical and non-technical audiences. Acquire domain knowledge to inform modelling opportunities and model feature creation.
Technical Leadership: Mentor junior data scientists and provide peer reviews for modelling projects. Stay current with industry trends, tools, and best practices to continuously improve the team's capabilities.
Qualifications
Education: Bachelor’s degree in Data Science, Statistics, Mathematics, or a related field.
Experience: 2 or more years of experience in a data science or analytics role. Proven experience in building machine learning models, statistical analysis, and predictive analytics. Experience designing experiments or modelling approaches to solve a specified business problem.
Preferred Qualifications: Proficiency in programming languages such as Python; knowledge of R is an advantage. Experience with SQL and working knowledge of relational databases. Proficiency with data visualisation tools and techniques. Experience with AWS is a plus. Strong problem-solving and critical-thinking abilities. Excellent communication and presentation skills. Ability to manage multiple projects and prioritize tasks effectively.
PDI is committed to offering a well-rounded benefits program, designed to support and care for you and your family throughout your life and career. This includes a competitive salary, market-competitive benefits, and a quarterly perks program. We encourage a good work-life balance with ample time off and, where appropriate, hybrid working arrangements. Employees have access to continuous learning, professional certifications, and leadership development opportunities. Our global culture fosters diversity and inclusion, and values authenticity, trust, curiosity, and diversity of thought, ensuring a supportive environment for all.


- Company Name
- LGC
- Job Title
- BI Developer
- Job Description
- Company Description: LGC is a global life science tools company with a broad portfolio. We work closely with our diverse customers to drive science forward and find solutions that diagnose, heal, and help feed our growing population. Our 180 years of scientific heritage, combined with a track record of innovation and value-enhancing acquisitions, has enabled us to build our product portfolio and expertise, and develop deep relationships with customers, industry partners and the global scientific community. Together we solve complex challenges such as managing global pandemics, pioneering precision medicine, improving agriculture outputs, and ensuring the safety of food and medicines. LGC recognises the importance of a work-life balance and will always endeavour to facilitate hybrid working arrangements to support each employee in balancing their working life with personal interests, life-long learning, charity work, leisure activities and other interests. We are a global leader in the life sciences sector, serving customers in healthcare, applied markets (including food, agbio and the environment), academia and government. Underpinned by our five core values – integrity, brilliance, passion, curiosity, respect – our core purpose is to deliver science for a safer world. We are actively looking for individuals who are passionate about making a difference, and have an opportunity for a BI Developer to join our team.

Job Description: Join Our Group Analytics CoE Team at LGC! Are you ready to take your data analytics career to the next level? At LGC, we’re on the lookout for passionate and skilled individuals to help drive our data platform transformation with innovative projects, like our move to cloud-based Snowflake. This is your chance to work with groundbreaking tools such as dbt, Snowflake, and Tableau, while being part of a collaborative team dedicated to enhancing our analytics solutions across the business.
What We Offer: An exceptional opportunity to contribute to impactful projects within a dynamic team environment. A focus on teamwork, inclusion, and rapid growth in analytics capability. The chance to shape and elevate our data landscape in a forward-thinking organization.
Your Responsibilities Will Include: Developing, maintaining, and supporting the transition of our data platform from SQL Server to Snowflake. Building and optimizing data products using Data Vault and core ODS data platforms. Building and maintaining seamless reporting models using SQL. Collaborating with fellow analytics experts to implement and enhance analytics solutions. Ensuring data is used effectively to solve complex business challenges. Presenting and sharing insightful analysis findings with your colleagues. Staying abreast of the latest trends in BI tools and data warehousing practices. Implementing data models and analytics solutions that meet our evolving business needs.
Qualifications We're Looking For: Demonstrable experience in data modeling using SQL. Skills in developing reporting models using SQL. A solid understanding of relational databases and data warehousing principles. Familiarity with Git version control to enhance collaboration. Good communication skills for effective collaboration and analysis presentation. Experience with Snowflake and/or dbt is advantageous but not essential. Familiarity with BI tools such as Tableau and/or SAP Business Objects is a plus. A logical and analytical approach to solving business problems.
Join Us! If you're eager to make a significant impact in the analytics space and work with a team that values innovation and collaboration, we want to hear from you!
Apply now to be part of our exciting journey at LGC.
Additional Information: LGC Group Analytics CoE is a centrally placed team, responsible for providing each business unit within the organisation with analytics support, driving analytics best practices and engagement with our analytics tools (Tableau and SAP Business Objects). As part of the Group Analytics team, this role has the exciting opportunity to learn and develop alongside other analytics colleagues, and help influence how LGC performs analytics.


- Company Name
- Propel
- Job Title
- Senior Data Scientist
- Job Description
- Job Title: Senior Data Scientist (AI & ML)
Location: London (hybrid)
Salary: £80-90K + Equity
About the Company: A super interesting start-up on a mission to take on some of the most complex challenges in predictive modelling, fraud detection, and AI-powered insights.
What You’ll Do: As a Senior Data Scientist, you will own the product and lead the development of AI features. You’ll be a hands-on “do-er” who’s comfortable rolling up your sleeves and executing. This role is ideal for an individual who excels at building, optimising, and deploying machine learning (ML) models, and enjoys working with large datasets and cutting-edge technologies.
You’ll be responsible for: Building and optimising machine learning models using NLP and graph vector DBs. Leveraging big data efficiently to create actionable insights. Working with large documents, data infrastructure, and containerization. Investigating fraud and ethics reports, transforming raw data into meaningful insights. Developing predictive models that go beyond the basics, focusing on real-world applications. Using Python to build robust, scalable solutions.
About You: You have experience as a Data Scientist, preferably at a start-up or fast-growing company. You have a deep understanding of ML, NLP, graph databases, and predictive modelling. You can work independently and enjoy taking ownership of your projects. You’re not just a theorist: you can build solutions and make things happen. You’re a strong problem solver and know how to dig into raw data and find actionable insights. Experience with containerization and data infrastructure is a big plus.
The Team: You’ll be joining a small but dynamic team, led by a Chief Science Officer. Expect a high level of collaboration and the opportunity to wear many hats. We aim to expedite the process for candidates, with decisions made within two weeks. Interested?
If you’re ready to take on a hands-on role with a dynamic team and be part of something exciting, we’d love to hear from you!


- Company Name
- Octopus Energy
- Job Title
- Data Engineer (Mid-Level)
- Job Description
- London, UK Octopus Energy UK – Procurement / Full-time / Hybrid The Energy Markets team at Octopus Energy is responsible for making sure that we always have the electricity and gas we need to support our customer demand whilst also supporting the grid to enable the Net Zero transition. To achieve this mission across all Octopus international regions, we have sub-teams focused on forecasting energy demand and generation, hedging and shaping our trade position, tracking and reporting the ongoing risk to Octopus, and driving the proportion of our supply directly from generators via PPA agreements. The Engineering sub-team owns our global technical platform that supports these different processes and drives forward long-term solutions to enhance Group capabilities. We are looking for a Data Engineer to help achieve this goal - ideally someone who is comfortable diving into different tasks to support each team using a variety of coding languages across our platform setup, who enjoys developing relationships across the company while explaining technical processes in the most appropriate way, and who keeps an eye on scalable solutions to support data growth. This is therefore an exciting opportunity to take on a role that combines complex data engineering, visual analytics and business-critical needs. What You'll Do...
Supporting different Energy Markets teams to design and build key operational and reporting pipelines across all Octopus Energy regions;
Taking responsibility for the maintenance of these critical data pipelines supporting core trading, forecasting, risk and PPA processes;
Developing automations and alerts to quickly debug where these pipelines are failing or showing unprecedented trends;
Setting up and maintaining processes for capturing, preparing and loading valuable new data into the data lake;
Designing and building dashboards that cover operational processes and reporting requirements;
Working with international teams across the Octopus Energy Group to ensure everyone shares the best possible practices and code is standardised where possible;
Taking ownership of data platform improvements that enhance the capabilities for all Energy Markets teams and drive trust in the stability of the setup;
Sharing, enhancing and upskilling team members on available tools and best practices.
What You'll Need...
Strong aptitude with SQL, Python and Airflow;
Experience with Kubernetes, Docker, Django, Spark and related DevOps monitoring tools (e.g. Grafana, Prometheus) a big plus;
Experience with dbt for pipeline modelling also beneficial;
Skilled at shaping needs into a solid set of requirements and designing scalable solutions to meet them;
Able to quickly understand new domain areas and visualise data effectively;
Team player excited at the idea of ownership across lots of different projects and tools;
Passion for driving towards Net Zero;
Drives knowledge sharing and documentation for a more effective platform;
Open to travelling to Octopus offices across Europe and the US.
Our Data Stack:
SQL-based pipelines built with dbt on Databricks
Analysis via Python Jupyter notebooks
PySpark in Databricks workflows for heavy lifting
Streamlit and Python for dashboarding
Airflow DAGs with Python for ETL, running on Kubernetes and Docker
Django for custom app/database development
Kubernetes for container management, with Grafana/Prometheus for monitoring
Hugo/Markdown for data documentation