
Ntrinsic Consulting
About the Company
Ntrinsic Consulting is a specialist talent business serving the technology and business support sectors across the UK and the Benelux region. Placing the client at the centre of our strategy, we develop tailored solutions that address the unique challenges and realities of each business. We are dedicated to building long-lasting partnerships through bespoke services that demonstrate an understanding of our clients’ businesses and the impact new hires can have on their projects.
Ntrinsic Consulting is now part of the Cpl group of brands and businesses. Cpl is a transformational talent solutions organisation that provides services across the full talent spectrum.
Listed Jobs


- Company Name
- Ntrinsic Consulting
- Job Title
- Data Engineer
- Job Description
-
Data Engineer, Contract, Hybrid, Manchester, Cheltenham, London, Leicester
Position: Data Engineer
Number of positions: 2
Location: Manchester, Cheltenham, London or Leicester (up to 80% onsite / 4 days a week, home at weekends, based on project requirements)
Start date: ASAP
Project scope + Skills required:
Seeking new possibilities and always staying curious, we are a team dedicated to creating the world’s leading AI-powered, cloud-native software solutions for our customers. Our renowned legacy creates endless global opportunities for our employees, so the door is always open for those who want to grow their career.
The Data Engineer role requires a highly analytical individual proficient in Python programming, database management, and data methodologies. You’ll focus on extracting insights from data, developing and implementing machine learning models, managing big data infrastructure, and supporting AI-driven product development.
Key Responsibilities:
Data Collection and Cleansing: Collect and cleanse data from diverse sources to ensure high-quality datasets for decision-making.
Data Exploration and Visualization: Explore and visualize data using advanced techniques to uncover insights and trends.
Statistical Analysis: Apply statistical and mathematical techniques to provide robust analytical foundations for predictive modelling.
Machine Learning and Deep Learning: Develop and implement machine learning and deep learning models to address business challenges.
ML-Ops / AI-Ops: Demonstrate expertise in ML-Ops / AI-Ops practices to ensure efficient model deployment and management.
Big Data Management: Manage big data infrastructure and execute data engineering tasks for efficient data processing.
Version Control and Collaboration: Utilize version control systems like Git for maintaining codebase integrity and fostering collaboration.
AI-Driven Product Development: Design, create, and support AI-driven products to deliver impactful solutions aligned with user needs and business objectives.
#dataengineer #contract #UK #AI #work


- Company Name
- Ntrinsic Consulting
- Job Title
- ETL Developer
- Job Description
-
Job Title: ETL Developer (Python / Big Data Engineer)
Location: Hybrid (2-3 days per week onsite at the customer office)
Type: Contract (Initial duration 6 months)
Job Description:
We are looking for an experienced ETL Developer with a strong background in Python development and Big Data technologies to join our team. As an ETL Developer, you will be responsible for the design, development, and implementation of data processing pipelines using Python, Spark, and related technologies to handle large-scale data efficiently. You will also integrate data into cloud environments such as Azure, carry out basic DevOps tasks, and apply RDBMS fundamentals.
Responsibilities:
Develop and maintain ETL pipelines using Python for data extraction, transformation, and loading.
Utilize Apache Spark for big data processing to handle large datasets and optimize performance.
Work with cloud technologies, particularly Azure, to deploy and integrate data solutions.
Implement key Python concepts and leverage libraries/packages like Pandas, NumPy, and others for data manipulation.
Perform data integration tasks involving various data sources and structures.
Collaborate with cross-functional teams to design and implement robust, scalable data solutions.
Apply basic DevOps practices to manage and automate workflows within the ETL process.
Ensure best practices in database management and integration with relational databases (RDBMS).
Participate in troubleshooting, optimization, and performance tuning of data processing systems.
Required Skills and Experience:
Proficient in Python with hands-on experience in key libraries (Pandas, NumPy, etc.) and a deep understanding of Python programming concepts.
Solid experience in Big Data Processing using Apache Spark for large-scale data handling.
Basic DevOps knowledge and familiarity with CI/CD pipelines for automating workflows.
Understanding of Azure Fundamentals and cloud data solutions.
Strong understanding of RDBMS fundamentals (SQL, relational data modelling, etc.).
Previous experience in ETL development and data integration.
Senior/Lead level experience with hands-on development in relevant technologies.
Excellent problem-solving skills and ability to optimize data workflows.
Additional Desirable Skills:
Familiarity with cloud-based data storage and processing technologies in Azure.
Experience working in Agile or other collaborative development environments.
#ETL #contract #work #London #developer #hybrid


- Company Name
- Ntrinsic Consulting
- Job Title
- Software Engineer
- Job Description
-
Software Engineer - Managing Investments Technology Team
Location: London
Work model: Hybrid, 3 days onsite per week
Rate: £200 per day, inside IR35
Sponsorship: Cannot be provided
Your Role
Are you passionate about designing and building innovative digital products and services? Do you want to play a key role in transforming how we work and deliver value to our clients and employees? Join our dynamic and forward-thinking team, where agility, adaptability, and continuous innovation are at the core of everything we do.
As a Software Engineer, you will:
Design, develop, and improve digital products and technology services to enhance both client and employee experiences.
Apply a wide range of software engineering practices, from analysing user needs and developing new features to automated testing and deployment.
Ensure the quality, security, reliability, and compliance of solutions by applying our digital principles and implementing functional and non-functional requirements.
Integrate observability into solutions, monitor production health, and help resolve incidents while addressing the root cause of risks and issues.
Understand, represent, and advocate for client needs, ensuring their perspectives are considered in the development process.
Share knowledge and expertise with colleagues, contribute to team growth, help with hiring, and actively contribute to our engineering culture and internal communities.
Your Team
You will be part of an agile team working towards creating a world-class, scalable, digital, and integrated Multi-Asset Portfolio Management and Trading platform. Specifically, you will work on initiatives to introduce a service-based, event-driven, messaging architecture in the cloud, moving away from on-prem systems where possible. Your work will support Portfolio Management and Trading, enabling the business to operate more effectively and deliver better outcomes for clients.
Your Expertise
We are looking for candidates who have:
A Bachelor’s and/or Master’s degree or equivalent in Computer Science, Computer Engineering, or a related technical discipline.
Strong experience with ETL tools such as Informatica, Unix shell scripting, and job scheduling using tools like Autosys.
Solid experience with SQL Server, Azure SQL Server, and familiarity with NoSQL database technologies.
Experience with Cloud architecture, particularly on Microsoft Azure (Data Factory, Kubernetes Service, Container Apps, App Service, Functions, Event Grid, Service Bus).
Experience with messaging platforms, preferably Kafka.
Familiarity with containers and container orchestration, particularly Kubernetes.
Experience with Microservice and Event-Driven architecture is a plus.
Familiarity with Agile, Test Driven Development, and DevOps practices.
A focus on engineering best practices, including Clean Code and software craftsmanship.
Excellent communication skills, both written and verbal.
Strong analytical and problem-solving abilities, with a keen interest in understanding how systems work.
A passion for continuous learning and innovation.
#work #contract #developer