
Senior AI Engineer
On-site
London, United Kingdom
Freelance
20-02-2025
Job Specifications
Role Description: Senior Engineer – AI Tooling Assessment
Role Overview
We are seeking a highly experienced Senior Engineer to lead the assessment and adoption of AI tooling within our development teams. This role is integral to ensuring our organisation remains a leader in using AI to improve software delivery processes. As the Senior Engineer, you will evaluate emerging AI tools for code generation, guide their integration into our workflows, and ensure they provide measurable value to our teams.
The Person:
This is a pivotal role that will shape the future of our development practices, ensuring our teams are equipped with the best AI tools to deliver exceptional results.
Extensive full-stack experience in a senior engineering role, with a focus on innovation and emerging technologies.
Highly experienced in the use of GitHub Copilot and other tools/models that use AI for code generation.
Strong understanding of AI and machine learning concepts, including their practical application in software development.
Solid background in software engineering, CI/CD pipelines, and modern programming practices.
Start-up mindset, highly proactive and adaptable, and passionate about AI as an enabler for engineering; thrives in fast-paced environments.
Exceptional communication and leadership skills, with the ability to influence and collaborate with stakeholders at all levels.
Able to articulate the pros and cons of different tools and approaches to both technical and business stakeholders.
Strategic thinker with a forward-looking approach to technology adoption.
Able to commute up to 3 days a week to the Old Street office and (optionally) Waterside, if and when required. Until then, there is an expectation of 2 days per week in the office.
Key Responsibilities
Assessment of AI Tools
Research and evaluate AI tools and platforms that can enhance software development practices (e.g. code generation, error detection, testing, and DevOps optimisation).
Conduct hands-on testing and technical evaluations to determine each tool’s viability, performance, and scalability.
Compare tools against business and technical requirements, including cost-effectiveness, integration complexity, and compliance standards.
Create recommendations on which AI tools and models the client should or should not support, and measure the productivity benefit across a range of development tasks:
Perform a high-level, half-day overview of different tools and models to identify which warrant further investigation.
Complete one-week hands-on experiments with different tools and models across a range of tasks, producing recommendations and comparisons against the GitHub Copilot default model.
Complete a more in-depth evaluation of any selected AI tools and platforms to enhance code generation, error detection, testing, and DevOps optimisation.
Collaboration with Stakeholders
Engage with development teams, technical leads, and product managers to identify challenges and opportunities where AI tools can deliver value.
Present findings and recommendations to senior leadership, highlighting the strategic benefits of adopting specific AI tools.
Integration and Rollout
Develop strategies and best practices for introducing AI tools into existing processes, ensuring minimal disruption.
Lead proof-of-concept projects and pilot programmes to validate tool effectiveness.
Support teams with onboarding and training to maximise the benefits of new AI technologies.
Assist in rolling out and increasing adoption of AI software development tools through workshops, focus groups, etc.
Produce handbooks and documentation to support the increased adoption of AI code generation tools.
Performance Monitoring and Optimisation
Define metrics and KPIs to measure the impact of adopted AI tools.
Continuously review the performance of tools in use, identifying opportunities for further optimisation.
Stay abreast of industry trends and advances in AI tooling to ensure the organisation remains ahead of the curve.
Governance and Compliance
Ensure that all AI tools align with organisational policies, security protocols, and relevant regulations.
Establish guidelines for ethical and responsible use of AI in development processes.
About the Company
We are leaders in specialist recruitment and workforce solutions, offering advisory services such as learning and skill development, career transitions and employer brand positioning. As the Leadership Partner to our customers, we invest in lifelong partnerships that empower people and businesses to succeed. We help you achieve your career goals and deliver your business needs by combining meaningful innovation with our global scale and insights. Last year we helped over 280,000 people find their next career. Join the mill...
Related Jobs


- Company Name: Cornwallis Elt
- Job Title: Infrastructure Engineer - Low Latency, Market Data, Linux
- Job Description:
Infrastructure Engineer - Low Latency, Market Data, Linux, Data Centres, Financial Services. My client, a leading global financial organisation, is currently looking for an experienced Infrastructure Engineer to join the organisation. Working as part of the Market Data Architecture Team, you will be delivering end-to-end core engineering work for a migration project, providing hands-on design, implementation and maintenance of market data solutions that operate seamlessly across different low-latency platforms and operating systems (Linux and RedHat) within a trading environment. The successful candidate will be able to effectively combine long-term strategic vision with hands-on, practical architecture and problem-solving strengths, and the communication and leadership skills to help bring about a wide-reaching real-time data migration project. Successful candidates will have: a detailed understanding of market data and low-latency systems; experience in the design and implementation of technical solutions; experience leading end-to-end technical design for new data centre infrastructure (servers, virtualisation, automated applications); solid knowledge of operating systems (Linux, RedHat 8, etc.); experience working with virtualisation tooling such as Kubernetes, Docker, VMware, etc.; experience capturing requirements from the existing environment to drive improvements in the new; and a demonstrated ability to work effectively within a global team to deliver high performance and customer satisfaction. This is an excellent opportunity for an experienced engineer to join a rapidly evolving organisation.


- Company Name: Methods Analytics
- Job Title: (SC cleared) Senior Data Engineer - outside IR35
- Job Description:
On-site near Worcester/Great Malvern/Gloucester. Daily rate: up to £700 outside IR35. Duration: 6 months with possible extension. Clearance: must be SC cleared; DV preferred. We are seeking a seasoned Senior Data Engineer (Infrastructure) to join our team. This role is essential for designing, building, and maintaining sophisticated data infrastructure systems that operate across both on-premises and Azure cloud environments. The position involves deploying and managing scalable data operations that support advanced analytics and data-driven decision-making, crucial for our organisational growth and innovation.
Requirements: Develop and Manage Data Pipelines: You will design, construct, and maintain efficient and reliable data pipelines using Python, capable of supporting both streaming and batch data processing across structured, semi-structured, and unstructured data in on-premises and Azure environments. Hybrid Cloud and Data Storage Solutions: Implement and manage data storage solutions leveraging both on-premises infrastructure and Azure, ensuring seamless data integration and accessibility across platforms. Containerisation and Orchestration: Utilise Docker for containerisation and Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and enhance Elasticsearch setups for robust data searching and analytics capabilities in mixed infrastructure settings. Database Management: Administer and optimise PostgreSQL databases, ensuring high performance and availability across diverse deployment scenarios.
Essential Skills and Experience: Strong Python Skills: Expertise in Python for scripting and automating data processes across varied environments. Experience with ETL/ELT: Demonstrable experience in developing and optimising ETL or ELT workflows, particularly in hybrid (on-premises and Azure) environments. Expertise in Hybrid Cloud Data Architecture: Profound knowledge of integrating on-premises infrastructure with Azure cloud services. Containerisation and Orchestration Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka, Apache NiFi, and Apache Flink. Data Security Knowledge: Experience with implementing security practices and tools, including Keycloak, across multiple platforms. Search and Database Management Skills: Strong background in managing Elasticsearch and PostgreSQL in environments that span on-premises and cloud infrastructures.
Your Impact: In this role, you will empower business leaders to make informed decisions by delivering timely, accurate, and actionable data insights from a robust, hybrid infrastructure. Your expertise will drive the seamless integration of on-premises and cloud-based data solutions, enhancing both the flexibility and scalability of our data operations. You will champion the adoption of modern data architectures and tooling, and play a pivotal role in cultivating a data-driven culture within the organisation, mentoring team members, and advancing our engineering practices.
Desirable Skills and Experience: Certifications in Azure and Other Relevant Technologies: Certifications in cloud and on-premises technologies are highly beneficial and will strengthen your application. Experience in Data Engineering: A minimum of 5 years of experience in data engineering, with significant exposure to managing infrastructure in both on-premises and cloud settings. This role will require you to have or be willing to go through Security Clearance. As part of the onboarding process, candidates will be asked to complete a Baseline Personnel Security Standard check; details of the evidence required to apply may be found on the government website Gov.UK. If you are unable to meet this and any associated criteria, your employment may be delayed or rejected. Details of this will be discussed with you at interview.


- Company Name: Found Talent
- Job Title: BI Developer
- Job Description:
Are you a contract BI Specialist with experience of Microsoft Fabric and Microsoft Synapse Analytics? If so, Found Talent is looking for a BI Developer to join a Manchester-based organisation on an initial 6-month contract, outside IR35. Experience: proven experience as a BI Developer or in a similar role; strong knowledge of Microsoft Fabric and Microsoft Synapse Analytics; hands-on experience with data warehousing, ETL processes, and data modelling; proficiency in Power BI, SQL, and Azure-based data solutions; the ability to optimise large datasets; and strong problem-solving skills with a data-driven mindset. Nice to have: experience with Azure Data Factory, Databricks, or other cloud-based data services; familiarity with DAX. We require a UK-based candidate for this assignment.


- Company Name: N Consulting Global
- Job Title: Data Engineer with Oracle
- Job Description:
Role: Data Engineer with Oracle. Location: Edinburgh (hybrid). Duration: contract. Oracle as the primary skill, with Snowflake/other data engineering as secondary.
• Strong hands-on Oracle PL/SQL development and performance tuning skills. This person should ideally have some solution design experience, or be able to design based on requirements and discussions with cross-functional teams where needed.
• The candidate is expected to have some experience working on a software development/web application-based Agile project.
• UK experience is a must, and the candidate should preferably be able to come to the RBS Edinburgh office.
• Banking and lending experience preferred.