Brightsmith

Data Centre Engineer

Hybrid

England, United Kingdom

Freelance

20-02-2025

Job Specifications

Exciting Opportunity for Data Centre Engineers Across London!
Duration: 6 Months
Location: London
Start Date: ASAP
IR35: Outside

I'm currently working with a client looking for a Data Centre Engineer who is happy to travel to London.

Key Responsibilities:
Design, implement, and maintain data centre infrastructure.
Troubleshoot and resolve hardware and software issues.
Manage and monitor network, server, and storage systems.
Ensure top-tier security, reliability, and performance.
Collaborate with cross-functional teams on critical infrastructure projects.

Key Skills & Experience:
Proven experience in data centre operations, maintenance, and management.
Strong understanding of servers, storage, networking, and virtualization.
Hands-on experience with hardware configuration, installation, and troubleshooting.
Familiarity with data centre monitoring tools and automation (see the brief sketch after this list).
Excellent communication skills and the ability to work independently.
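
For illustration only, here is a minimal sketch of the kind of monitoring automation the skills list alludes to: a Python reachability check over a list of hosts. The hostnames are hypothetical placeholders, and the role itself does not prescribe any particular tooling.

```python
# Minimal monitoring sketch: ping each host once and report reachability.
# Hostnames are hypothetical; the ping flags below are the Linux variants
# (-c count, -W timeout in seconds).
import subprocess

HOSTS = ["rack01-sw01.example.net", "rack01-srv01.example.net"]


def is_reachable(host: str) -> bool:
    """Return True if a single ICMP echo to the host succeeds."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0


for host in HOSTS:
    print(f"{host}: {'up' if is_reachable(host) else 'DOWN'}")
```

In practice a data centre team would lean on dedicated DCIM or monitoring platforms; a script like this only shows the shape of the automation skill the listing asks for.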

Please apply if you are interested!

About the Company

We're a search firm focused on cleantech, clean energy & sustainability globally. We're also proud to be a B Corp™, highlighting our commitment to being a force for good and putting people before profits.

Our Mission: To create the most people-centric, purpose-driven search company, connecting diverse & passionate people who together have the power to accelerate the energy transition across the globe.

Our Values:
Be Ambitious – We strive for constant improvement
Be Authentic – We are always honest & transparent
Be...

Related Jobs

SystemsAccountants
SAP Data Scientist
£500/day Inside IR35 – 12-month contract

SystemsAccountants are currently working with a client implementing SAP S/4HANA Public Cloud and SAP Analytics Cloud, seeking a Data Scientist to join the team, collaborating with key stakeholders to create impactful reports and dashboards that enable key business intelligence decisions. The successful candidate will work closely with the database administrator to ensure data is clean and available, make meaningful reports available efficiently, and ensure a single source of truth behind all data decisions.

Role Responsibilities:
Data Warehousing: Collaborate with senior developers to design, develop, and maintain reports, dashboards, and visualizations that effectively communicate complex data insights.
Data Analysis: Work closely with business stakeholders to understand their reporting requirements and translate them into effective data models and visualizations.
Data Integration: Assist in integrating data from various sources into the data warehouse (DW) using appropriate Extract, Transform, Load (ETL) processes, ensuring data quality and integrity; transfer current DW reports into SAP Analytics Cloud (SAC).
Report Optimization: Optimize existing reports and dashboards for improved performance, usability, and user experience.
Testing and Troubleshooting: Conduct thorough testing of reporting solutions (SAC) to identify and resolve any issues or bugs, ensuring accuracy and reliability of the data.
Documentation: Document data models, report specifications, and development processes to ensure clear communication and knowledge sharing within the team.
Collaboration: Collaborate with cross-functional teams, including business analysts, data engineers, and stakeholders, to gather requirements and ensure successful project outcomes.
Continuous Learning: Stay up to date with the latest trends and best practices in DW development and business intelligence to enhance technical skills and contribute to ongoing process improvements.

Role Requirements:
Degree in a computing subject or equivalent.
Certification in data warehousing and related technologies is a plus.
Must be able to attain and hold National Security Vetting to a minimum of SC level.
Experience in data analysis, modelling, and management.
Prior experience developing analytical data models and ETL processes: adept at designing, implementing, and optimizing data models to drive insightful analytics and support decision-making, with skills in building and managing ETL workflows to ensure the efficient and accurate movement of data across systems (a short illustrative ETL sketch in Python follows this listing).
Experience with SAP data tools, particularly SAP Analytics Cloud and SAP Datasphere: proficiency in leveraging these tools to develop, manage, and optimize data models, analytics, and reporting solutions; expertise in integrating data from various sources, performing advanced data analysis, and generating actionable insights to support business decision-making.
Experience managing priorities and stakeholders, with skills in balancing multiple tasks and projects simultaneously and ensuring that critical deadlines are met.
Prior experience working with protected and sensitive data, such as ITAR (International Traffic in Arms Regulations) data: knowledgeable about the handling, storage, and transfer of sensitive information, ensuring compliance with relevant regulations and security protocols.
Experience working with MRP (Material Requirements Planning) and/or ERP (Enterprise Resource Planning) generated data, with skills in analysing and interpreting complex datasets from these systems to provide actionable insights and support decision-making processes.
Ability to communicate complex data insights to non-technical stakeholders and collaborate with cross-functional teams.
A solid understanding of project management principles, including the ability to plan, execute, and oversee data science & BI projects effectively.
Commitment to staying up to date with the latest trends and best practices in DW development and business intelligence.
Strong working knowledge of SQL and DAX; familiarity with VBA, PowerShell, and Python is also desirable.
Strong understanding of and proficiency in data modelling, statistical analysis, and predictive modelling using tools such as SSAS, Power BI, SAP Analytics Cloud, and SAP Datasphere.
An understanding of data privacy laws and regulations such as GDPR is essential, as is knowledge of data protection principles and practices to ensure compliance.
Knowledge of industry-standard security protocols and frameworks, such as ISO 27001, with experience implementing and maintaining security measures to protect data integrity and confidentiality.
Proficiency in generating regulatory reports and documentation as required by relevant authorities, with an understanding of reporting timelines and formats.
Experience using monitoring tools such as Splunk, Datadog, or New Relic to analyse and troubleshoot system performance and/or logs.
Understanding of business intelligence concepts and the ability to translate business needs into data-driven insights.
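
As a hedged illustration of the ETL work this listing describes, the sketch below shows a minimal extract-transform-load step in Python with pandas and SQLAlchemy. The file name, columns, and SQLite connection string are hypothetical stand-ins, not the client's actual systems.

```python
# Minimal ETL sketch: extract a flat-file export, apply light transforms,
# and load it into a staging table. All names here are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Extract: a CSV export from the source system.
orders = pd.read_csv("orders_export.csv", parse_dates=["order_date"])

# Transform: drop exact duplicates and derive a reporting column.
orders = orders.drop_duplicates()
orders["net_value"] = orders["gross_value"] - orders["tax"]

# Load: SQLite stands in for the warehouse staging area.
engine = create_engine("sqlite:///warehouse.db")
orders.to_sql("stg_orders", engine, if_exists="replace", index=False)
```

The same pattern scales up to the SAP Datasphere / SAC toolchain the role names, where dedicated connectors replace the pandas and SQLite pieces.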
Surrey, United Kingdom
Hybrid
Freelance
21-02-2025
Cornwallis Elt
AWS Data Engineer
Python, AWS, CloudFormation, Terraform, Containerisation, Financial Services – Contract – London/Remote – £550-£650 p.d. Inside IR35

A hands-on Senior AWS Data Engineer is sought by a leading financial institution based in London for a contract running until the end of the year, with the intention to extend or convert to permanent afterwards. In this role, you will have the opportunity to drive the architecture, development, and optimization of cutting-edge AWS-based data solutions in the financial services domain. You'll work with Agile delivery methods, developing serverless solutions, improving existing platforms, and collaborating with a skilled team to deliver top-tier results.

To be successful in this role, you will need the following:
Strong experience in Python development for data engineering tasks (a brief serverless sketch follows this listing).
Familiarity with AWS services including IAM, Step Functions, Glue, Lambda, managed databases (e.g. DynamoDB, Aurora Postgres on RDS), SQS, API Gateway, and Athena is highly beneficial.
Familiarity with container technologies (e.g. Docker, Kubernetes) is a plus.
Experience implementing infrastructure as code (IaC) using tools like CloudFormation or Terraform.
Experience with NoSQL and relational databases, including their integration within AWS environments, is highly beneficial.
Experience working in the financial services domain is essential.
Bachelor's degree in MIS, Computer Science, or an IT-related field, or equivalent IT-related experience.

This is an exciting opportunity to work in a collaborative environment, solve complex challenges, and contribute to the growth of cutting-edge data platforms.
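
To make the Python-plus-serverless requirement concrete, here is a minimal sketch of an AWS Lambda handler consuming an SQS-triggered event and landing normalised records in S3. The bucket name, environment variable, and payload fields are hypothetical; this is an illustrative pattern, not the client's codebase.

```python
# Minimal serverless sketch: an SQS-triggered Lambda that normalises
# messages and writes them to S3 as JSON lines. Names are hypothetical.
import json
import os

import boto3

s3 = boto3.client("s3")
BUCKET = os.environ.get("OUTPUT_BUCKET", "example-data-platform-landing")


def handler(event, context):
    """Entry point wired to an SQS event source mapping."""
    rows = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # Keep only the fields downstream consumers need, and stamp
        # the source message id for traceability.
        rows.append({
            "trade_id": payload.get("trade_id"),
            "amount": float(payload.get("amount", 0)),
            "message_id": record["messageId"],
        })
    if rows:
        key = f"landing/{context.aws_request_id}.jsonl"
        body = "\n".join(json.dumps(r) for r in rows)
        s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
    return {"written": len(rows)}
```

In a role like this, the function, queue, and bucket would themselves be declared in CloudFormation or Terraform rather than created by hand.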
London, United Kingdom
On site
Freelance
21-02-2025
Trust In SODA
Security Data Engineer
Location: Hybrid (2 days per week expected in the London office)
Contract Length: 6 months from start date
Rate: £430/day Inside IR35

About the Role:
The Control Tower is on an exciting journey to provide data-led security insights, and we are looking for an experienced Security Data Engineer to join the team. As a key member of this initiative, you will design, develop, and implement data solutions on the Azure cloud platform, contributing to the creation of scalable data pipelines and insights that empower leadership and colleagues across the organization.

Key Responsibilities:
Design and Implement Data Architectures: Create scalable data pipelines using Azure Data Factory and Databricks to ingest, transform, and load data from multiple sources (a brief PySpark sketch follows this listing).
Data Processing & Transformation: Develop and maintain integration processes, ensuring seamless data flow leveraging Databricks and Synapse Analytics.
Data Storage Management: Select and manage storage solutions such as Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
Data Warehousing: Support the building of scalable Azure Synapse Analytics solutions for large datasets.
Real-time Data Processing: Optimize streaming pipelines with Azure Stream Analytics.
Data Governance & Security: Implement data governance practices and ensure data quality within Azure.
DevOps & Automation: Automate data pipelines and manage deployments using Azure DevOps for CI/CD.
Performance Optimization: Fine-tune data pipelines, queries, and storage solutions for optimal performance.
Collaboration: Work closely with cross-functional teams (data scientists, software engineers, business analysts) and senior stakeholders to deliver end-to-end data solutions.
Troubleshooting & Continuous Improvement: Diagnose data-related issues and stay updated on emerging technologies in data engineering and banking.

Required Skills & Experience:
Proficiency in Azure Data Factory, Databricks, and Synapse.
Hands-on experience with Python, SQL, and Spark for data processing.
Strong ability to work with unstructured datasets and design high-quality code.
Experience automating tasks and deploying production-level code.
Familiarity with visualization tools and building insightful dashboards.
Excellent communication and collaboration skills in a team environment.
Azure certification or certification in related technologies (e.g. Microsoft Certified: Azure Data Engineer Associate).
Understanding of security tools, risk frameworks, and change management principles.
Experience with Agile development methodologies and data management principles.

Desirable Skills:
Knowledge of retail banking channels and products, especially related to data-driven insights.
Familiarity with industry standards, roadmaps, and best practices for data engineering and cloud computing.
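
As a hedged sketch of the pipeline work described above, the PySpark snippet below aggregates failed sign-in events per user per hour, the kind of security insight a Databricks job might surface. The input path, schema, and threshold are hypothetical assumptions.

```python
# Minimal PySpark sketch: flag users with bursts of failed sign-ins.
# Path and column names are hypothetical, not the client's schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("security-insights-sketch").getOrCreate()

# On Databricks this would typically be an Azure Data Lake path.
events = spark.read.json("/tmp/signin_events.json")

failed_by_user = (
    events.filter(F.col("result") == "failure")
    .groupBy(
        "user_id",
        F.window(F.col("timestamp").cast("timestamp"), "1 hour").alias("hour"),
    )
    .agg(F.count("*").alias("failed_attempts"))
    .filter(F.col("failed_attempts") >= 5)  # crude brute-force signal
)

# Parquet output for downstream Synapse or dashboard consumption.
failed_by_user.write.mode("overwrite").parquet("/tmp/failed_signins_hourly")
```

A production version would run as an orchestrated Azure Data Factory or Databricks Workflows job with governance and alerting wrapped around it.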
London, United Kingdom
Hybrid
Freelance
21-02-2025
Falcon Smart IT (FalconSmartIT)
Big Data Lead (Senior Data Engineer)
Job Type: Contract
Job Location: Wimbledon, UK

Job Description:
For this role, senior experience of data engineering and of building automated data pipelines on IBM DataStage & DB2, AWS, and Databricks is expected, from source systems to operational databases through to the curation layer, using the latest modern cloud technologies. Experience of delivering complex pipelines will be significantly valuable to maintaining and delivering world-class data pipelines.

Knowledge in the following areas is essential:
Databricks: Expertise in managing and scaling Databricks environments for ETL, data science, and analytics use cases (a brief sketch follows this listing).
AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM.
IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode.
Programming Languages: Proficiency in Python and SQL.
Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques.
DevOps & CI/CD: Familiarity with DevOps practices for data engineering, including infrastructure as code (e.g. Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g. CloudWatch, Datadog).
Big Data: Familiarity with big data technologies like Apache Spark, Hadoop, or similar.
Test Automation: Test automation skills across ETL/ELT tools, creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores.

Responsibilities:
Leadership & Strategy: Lead Data Engineering team(s) in designing, developing, and maintaining highly scalable and performant data infrastructures.
Customer Data Platform Development: Architect and manage our data platforms using IBM (legacy platform) & Databricks on AWS technologies (e.g. S3, Lambda, Glacier, Glue, EventBridge, RDS) to support real-time and batch data processing needs.
Data Governance & Best Practices: Implement best practices for data governance, security, and data quality across our data platform; ensure data is well documented, accessible, and meets compliance standards.
Pipeline Automation & Optimisation: Drive the automation of data pipelines and workflows to improve efficiency and reliability.
Team Management: Mentor and grow a team of data engineers, ensuring alignment with business goals, delivery timelines, and technical standards.
Cross-Company Collaboration: Work closely with business stakeholders at all levels, including data scientists, finance analysts, MI, and cross-functional teams, to ensure seamless data access and integration with various tools and systems.
Cloud Management: Lead efforts to integrate and scale cloud data services on AWS, optimising costs and ensuring the resilience of the platform.
Performance Monitoring: Establish monitoring and alerting solutions to ensure the high performance and availability of data pipelines and systems, with no impact to downstream consumers.
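
For illustration, here is a minimal sketch of a curation-layer batch step of the kind this listing describes, as it might run on Databricks: raw CSV is read from S3, lightly cleansed, and written as a Delta table. The bucket, columns, and table name are hypothetical, and Delta support outside Databricks needs the delta-spark package configured on the session.

```python
# Minimal curation sketch: S3 landing zone -> cleansed Delta table.
# All names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curation-sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-raw-zone/customers/")  # hypothetical landing path
)

curated = (
    raw.filter(F.col("customer_id").isNotNull())
    .dropDuplicates(["customer_id"])
    .withColumn("loaded_at", F.current_timestamp())
)

# Assumes a "curated" schema already exists in the metastore.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.customers")
```

In the hybrid estate the listing describes, an equivalent legacy path would run through IBM DataStage into DB2, with Tivoli Workload Scheduler handling orchestration.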
Wimbledon, United Kingdom
Hybrid
Freelance
21-02-2025