
Manager Data Scientist
On site
Zaventem, Belgium
Full Time
25-03-2025
Job Specifications
Function
Advisory - Manager
We offer
A corporate culture in which personal growth, mutual trust and lifelong learning are being fostered.
An inclusive workspace that encourages diversity and pursues mutual respect for each other’s beliefs and background.
Professional experiences in an international and dynamic working environment with inspiring colleagues.
Flexible, hybrid work arrangements to enable working on different locations: home office, on-site or on the go.
A wide range of teambuilding activities and social events that enable new joiners to meet colleagues within the wider KPMG community and have some fun along the way.
#technology
Location
Zaventem HQ
Skills & Qualifications
The tasks of a data scientist manager are diverse. We are not looking for a unicorn data scientist, but a mindset focused on learning on the job is vital!
Elements we hope to see in our data scientist managers:
Concrete experience delivering AI products.
A master's degree with a strong grounding in modern data science techniques: machine learning algorithms, coding, statistics, …
Other types of education are also valued.
A sound knowledge of algorithms, and the ability to critically apply this knowledge to practical problems.
Knowledge of common data science software. We have a Python-first setup, but will tackle problems in R, SAS, SQL or Spark where needed, and work with tools such as Jupyter Notebooks, IDEs, Git and Microsoft Azure.
Bonus points for a PhD and for more specific knowledge of tools such as Keras or D3.js.
The desire to learn and grow.
A positive, collaborative, and flexible mindset.
Knowledge of Dutch is a big plus.
Roles & Responsibilities
In a world where stakeholders are consistently expected to provide more diverse digital experiences, data scientists are continuously challenged: to work is to grow. The KPMG Lighthouse is currently looking for talent in the field of data science: people who can exploit data to create smart digital solutions, but who can also inspire others about the art of the possible.
What is the job about?
Contributing to the growth of Advanced Analytics and Machine Learning as a topic and as a competence at KPMG
Contributing to the growth of our project portfolio
Mentoring colleagues
Stepping into conversations with clients, and translating their business needs into Artificial Intelligence solutions
Gathering and organizing data
Deciding on an analytical approach, and executing the analysis with a good eye for detail, all the while efficiently working towards the best possible solution
Communicating with clients: absorbing their thoughts, convincing them of the strengths of your models
Bringing your model into production, and communicating with specialist IT profiles
Creating and maintaining good relationships with stakeholders – clients, vendors, KPMG colleagues,…
About the Company
Uncovering a world of opportunity together. Whether your objective is to speed up digitalization, manage risk, drive growth, optimize cost, or increase your competitive advantage to secure your organization’s future, we can help you uncover fresh insights and develop new opportunities for your organization. Learn how to fully leverage data so that you can put it to work for your organization and discover how we can help your business stay relevant through Digital & Technology, Innovation, and Compliance.
Related Jobs


- Company Name: dataroots
- Job Title: Data & Cloud Engineer
- Job Description:
The Job
At Dataroots, we lead the charge in crafting cutting-edge AI solutions and platforms that drive innovation across industries. As a Senior Data & Cloud Engineer, you’ll play a pivotal role in designing and implementing advanced data infrastructure and digital solutions that form the foundation of our AI-powered strategies. With a deep focus on DataOps and MLOps, you'll collaborate with some of the brightest minds in the field to create robust, scalable data pipelines that empower machine learning teams and elevate our clients’ operations. Your expertise will ensure that our AI solutions are backed by high-quality data and state-of-the-art infrastructure, allowing companies to make smarter, data-driven decisions. You’ll design and deploy complex data architectures that impact businesses across a variety of sectors. You’ll oversee the entire lifecycle from data ingestion to production-ready systems, creating real-world solutions that drive results. Work closely with a multidisciplinary team of machine learning engineers, cloud architects, and data scientists. Together, you’ll build solutions that are not only innovative but also scalable and reliable.
What you can expect at Dataroots
Founded in 2016, Dataroots has been dedicated to helping customers achieve excellence in their data-driven operations. Our sustained and healthy growth over the years stems from a focus on team well-being, diversity, knowledge sharing and customer experience. Join a team where you can make a significant impact on our growth story. Learn more about our DNA at https://dataroots.io/our-dna.
Job requirements
The Skills
Proven Leadership in Data Engineering: You are a master at designing high-performance data platforms and architectures.
End-to-End Pipeline Expertise: You can create complex ELT data pipelines, both batch and streaming, with tools like Airflow, Prefect, or Dagster.
Strong Software Engineering Foundations: Proficient with Git, Python, SQL, and Docker, you write clean, maintainable code.
Cloud Engineering Mastery: You have hands-on experience with Terraform and are skilled in building scalable, cloud-based infrastructures.
Hands-on Experience with Data Platforms: You are proficient with platforms like Databricks, Snowflake, Azure Data Services, or AWS Data Services.
Nice-to-Have
Experience with Spark for data processing.
Expertise in data quality tooling and metadata management.
Familiarity with dbt and advanced data modeling.
Proficiency in container orchestration using Kubernetes and Helm.
Knowledge of secure, private networking in cloud environments.
The Offer
An attractive salary with extralegal benefits, including:
Mobility budget or a company car with fuel/charging card
Hospitalization and group insurance
High-end laptop
Smartphone with subscription
Substantial amount of holidays
Meal vouchers
...
A diverse and welcoming work environment, where you’ll collaborate & unwind with colleagues from different cultures and disciplines. The organization of both fun & professional events and initiatives is actively encouraged and supported.
A training budget for individual and team learning opportunities.
Tons of team-building events and sports initiatives to stay connected and unwind.
Where you’ll work
Our main offices are in Leuven and Ghent, with co-working spaces in Charleroi and Antwerp. These spaces are perfect for both focusing and unwinding with colleagues, as well as brainstorming sessions and team events. While much of your work can be performed remotely, you will also work on-site with clients.


- Company Name: CAPGEMINI ENGINEERING
- Job Title: Robotics AI Model Engineer
- Job Description:
We are looking for a highly skilled and motivated candidate to join our AI Research team. The goal is to innovate and accelerate the deployment of cutting-edge machine-learning techniques for driving and car monitoring technologies. In this role, you will be responsible for developing a machine-learning model that monitors and analyzes time series data in real time. You will play a crucial role in collaborating with a multidisciplinary team of researchers and engineers to ensure the integration and usability of the model. If you are a team player who thrives on responsibility, possesses strong analytical skills, and has a deep understanding of time series and deep learning methods, then you are the right person for this challenge. Experience the excitement of teamwork and be inspired by the passion for motorsport and automotive projects.
Profile
Background in computer science, physics, engineering or mathematics.
Research experience and mindset; minimum Master's level, PhD is a plus.
Hands-on mindset, practical and pragmatic for real-world problem solving.
Self-motivated, able to work under pressure, able to manage multiple tasks effectively, and eager to learn.
Experience
3-5 years.
Expertise
Machine and deep learning, including knowledge of classical algorithms.
Foundation models (knowledge of large vision models and multi-modal models).
Time series (knowledge of large real-time data would be a plus).
Robotics / mechatronics.
Technical
Python; C++ and MATLAB/Simulink are a plus.
Proficient in PyTorch (knowledge of PyTorch Forecasting and TensorRT would be a plus).
Python libraries: Pandas, Scikit-learn (knowledge of SKTime and Darts would be a plus).
Proficient in Linux, Docker and Git, with HPC experience (preferably AWS and Azure).


- Company Name: dataroots
- Job Title: Cloud (& Data) Engineer
- Job Description:
The Job
As a Senior Cloud (& Data) Engineer, you thrive in architecting, building, and automating cloud-native infrastructure that supports scalable, production-grade data and AI solutions. You bring a wealth of experience in designing and implementing CI/CD pipelines, comprehensive test suites, and Infrastructure-as-Code using best-in-class cloud-native approaches. Your sharp attention to detail is matched by a deep understanding of how to leverage DevOps principles, with a focus on balancing technology with business value. You are a natural leader, passionate about mentoring colleagues and advocating for DevOps and best engineering practices with clients to ensure the delivery of top-tier solutions.
At Dataroots, our mission is to innovate, design, and deliver robust data solutions and platforms for a variety of sectors. Our growing team operates out of Leuven and Ghent, where our dedicated in-house data & AI specialists apply state-of-the-art methodologies to optimize and transform our clients' business processes. Our strong emphasis on DataOps and MLOps means we build solutions that are not only intelligent but also production-ready, offering clients a solid foundation to make data-driven decisions with confidence.
What you can expect at Dataroots
Dataroots was founded in 2016 and, since its beginnings, has focused on helping its customers achieve excellence in their data-driven operations. The only way to do this sustainably is by investing in team well-being, diversity, knowledge sharing and customer experience. This quality-first focus has allowed sustained and healthy growth over the years. Join a team where you can have an impact on our growth story. Read up more on our DNA over at https://dataroots.io/our-dna.
The Skills
You bring advanced software engineering expertise, with deep experience in Git, Python, SQL, and Docker.
You demonstrate extensive cloud engineering proficiency, utilizing Terraform to create robust, scalable, and maintainable cloud infrastructure.
You have significant experience orchestrating containerized applications using Kubernetes and Helm, ensuring high availability and efficient scaling.
You have implemented secure, private networking solutions in complex cloud environments, ensuring adherence to best security practices.
You possess a solid understanding of at least one modern data platform, such as Databricks, Snowflake, Azure Data Services, or AWS Data Services.
You lead the design and architecture of sophisticated data platform components, driving end-to-end solutions that align with client needs.
Nice-to-Have
You have extensive experience applying Spark to deliver effective data engineering solutions.
You skillfully design and build both batch and streaming ELT data pipelines, leveraging orchestration tools such as Airflow, Prefect, or Dagster.
You have set up comprehensive data quality frameworks and managed metadata implementation to ensure reliability and traceability of data.
The Offer
An attractive salary with extralegal benefits, including:
Mobility budget or a company car with fuel/charging card
Hospitalization and group insurance
High-end laptop
Smartphone with subscription
Substantial amount of holidays
Meal vouchers
...
A diverse and welcoming work environment, where you’ll collaborate & unwind with colleagues from different cultures and disciplines. The organization of both fun & professional events and initiatives is actively encouraged and supported.
A training budget for individual and team learning opportunities.
Tons of team-building events and sports initiatives to stay connected and unwind.
Where you’ll work
Our main offices are in Leuven and Ghent, with co-working spaces in Charleroi and Antwerp. These spaces are perfect for both focusing and unwinding with colleagues, as well as brainstorming sessions and team events. While much of your work can be performed remotely, you will also work on-site with clients.
Ready to join Dataroots?
Excited to shape the future of Cloud as a Cloud (& Data) Engineer with us? We’re looking forward to hearing from you! Apply now and start a new chapter with Dataroots!


- Company Name: RED Global
- Job Title: Data Scientist - ML
- Job Description:
RED Global is currently recruiting a Data Scientist - ML for our client, a global organisation that is scaling up its data capability. They are looking to bring on board an experienced Data Scientist with experience of machine learning, GenAI, solution architecture and software engineering.
Key Responsibilities
Support and advise various clients to identify and define problems and resolve these complex issues for the business.
Interact with graph databases through standardized APIs and query languages, and work with ETL processes.
Use AI techniques such as GenAI and supervised and unsupervised machine learning.
Work across the latest data science platforms (Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, PyTorch, MXNet, scikit-learn).
Support building and deploying solutions to the cloud (AWS, Azure, Google Cloud), including cloud provisioning tools (e.g., Terraform).
Advise business stakeholders on data-driven marketing, working closely with third parties and, internally, with digital teams, IT, and others.
Excellent verbal and written communication skills in English and French/Dutch, with good senior-level stakeholder management soft skills and competencies.
Required skills
3-5 years of experience as a Data Scientist working within global organisations.
Extensive work experience with various data science platforms (such as Databricks, Dataiku, AzureML, SageMaker) and frameworks (such as TensorFlow, PyTorch, MXNet, scikit-learn).
Experience building and deploying solutions to the cloud (AWS, Azure, Google Cloud).
Must have technical knowledge of Machine Learning (ML) or GenAI.
Industry skills: manufacturing or life sciences would be a plus.
Strong analytical skills and appreciation of coding languages, data sources, data matching and data acquisition.