Data Technical Architect, Data Warehouse, Dunstable-Hybrid, £85k - £95k
Hybrid
Dunstable - Hybrid 2 Days Weekly, United Kingdom
Full Time
19-12-2024
Job Specifications
This is your chance to join our prestigious client and be part of an exciting transformation. As a Data Technical Architect, you will provide technical leadership to a data engineering team. You will also have the fantastic opportunity to play a major role in upholding big data best practices whilst building and maintaining reliable data pipelines and products across a range of exciting projects.
Essential
Provide technical data solutions, including the design and development of logical and physical data models and databases for business solutions.
Provide technical expertise and guidance to development teams around solution/system design and coding, ensuring standards are consistently upheld and quality practices are embedded early within delivery teams.
Lead the Data Engineering board by building and where necessary redefining patterns and standards for the data engineering teams.
Develop robust data pipelines to serve the analyst and data science communities.
Build a documentation library for all data pipelines and the data catalogue.
Extensive data modelling experience, from conceptual to logical models, taking long-term ownership of the data models and the analytical products built on top of them.
Experience with a combination of cloud-based big data technologies (e.g. HDFS, Blob Storage, Spark, Kafka, Delta, Hive, Airflow and dbt) and with OLTP and data warehousing in SQL Server or other RDBMSs.
Understanding of lakehouse architecture and Databricks.
Build solution blueprints.
Up-to-date knowledge of the data and analytics technical domain.
Awareness, evaluation and selection of new technologies; running or supporting vendor RFI and RFP processes.
Good understanding of software best practices, including design principles & patterns, security and performance.
Methodical, logical and flexible in approach, you will have gained significant experience as a Technical Architect.
Airline/retail experience.
Knowledge of analysis and development techniques, e.g. Scrum, Agile and Waterfall.
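The pipeline responsibilities listed above centre on the extract-transform-load pattern. A minimal sketch in plain Python (all table and field names are illustrative, not taken from the posting) shows the shape such a pipeline typically takes before being ported to tooling like Spark or Airflow:

```python
# Minimal ETL sketch: extract records, transform them, load into a target store.
# Source data, field names, and the in-memory "warehouse" are illustrative only.

def extract():
    # In a real pipeline this would read from an OLTP database, Kafka topic, or blob store.
    return [
        {"booking_id": 1, "fare": "120.50", "currency": "GBP"},
        {"booking_id": 2, "fare": "89.99", "currency": "GBP"},
    ]

def transform(rows):
    # Enforce types and derive fields; malformed rows would normally be quarantined.
    return [
        {"booking_id": r["booking_id"], "fare_pence": round(float(r["fare"]) * 100)}
        for r in rows
    ]

def load(rows, warehouse):
    # Idempotent upsert keyed on booking_id, so re-runs do not duplicate data.
    for r in rows:
        warehouse[r["booking_id"]] = r
    return warehouse

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse[1]["fare_pence"])  # 12050
```

In production the same extract/transform/load stages map onto the tools the role names: Airflow orchestrating the steps, Spark or dbt doing the transforms, and the idempotent-load discipline preventing duplicates on retries.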
Minorities, women, LGBTQ+ candidates, and individuals with disabilities are encouraged to apply.
Interviews will take place next week, so please apply immediately to be considered for this exciting contract role or call Bangura Solutions to discuss this role further.
About the Company
IT Recruitment Simplified. Experts in recruiting Technology and Digital Transformation specialists. Bangura Solutions is a Certified Ethnic Minority Owned Business; we are masters of recruitment organisational systems and processes, consistently delivering exceptional results for customers. We are agile throughout the recruitment process, swift to adapt pre-emptively in problematic Technology and Digital Transformation areas so that organisations can hit their growth targets and achieve their ambitions. Bangura Solutions was found...
Related Jobs
- Company Name
- Saragossa
- Job Title
- Senior Data Engineer – Commodities Trading – £130,000 Salary + Bonus
- Job Description
- Get stuck in immediately to a database migration using Snowflake. This company works across various energy and commodities markets around the world. We appreciate that not everyone wants to work within Oil and Gas trading; however, part of the data team’s role is to look at more sustainable options for trading all kinds of commodities products. You’re going to be involved with a number of newly launched data projects, with your initial project being this migration. You’ll face off with the business (Heads of Desk, Traders, Analysts), understanding what they need, discussing solutions with the Data Science team, then building out the best solution possible, whether with an off-the-shelf product or by building it completely from scratch using primarily Python and SQL.
The data team has grown over the past 12-18 months, with data engineering still being built out in London. There’s a strong opportunity to take on leadership responsibilities, so if management is in your sights and ambitions, you’ll be able to achieve that here. The team is adopting more advanced technology as time goes on, and you’ll be able to suggest potential tools to use. Snowflake is one example of this: it was recently brought in at the suggestion of one of the team and is now widely used. Alternatively, if there’s a ready-made tool within AWS that you feel is a better option, you can use that. There really is plenty of technical freedom here.
In terms of technical experience, you’ll need to have worked in a commercial data engineering role for a few years; this is a mid-level position. Strong Python, Snowflake and SQL experience will be required, and any experience with tools like Docker/Kubernetes and AWS would be a strong preference. Commodities experience/knowledge is not required but would be a plus. This is a global commodities firm with a strong history of performance and revenue.
Your starting salary will be up to £130,000 plus a performance related bonus. Benefits include medical, dental and life insurance, wellness programs, pension, generous parental leave and various other perks. Want to make sure data has an impact on the future of commodities trading? Get in touch. No up-to-date CV required.
- Company Name
- nisien.ai
- Job Title
- Data Scientist
- Job Description
- Make the Internet Safer: Join Nisien as a Data Scientist
Help build cutting-edge AI solutions to combat online harms and shape the future of digital safety. We are looking for a proactive, passionate, and highly collaborative Data Scientist to help us build greenfield products aimed at safeguarding the internet.
About Us
Nisien is an AI startup with a mission to harness artificial intelligence as a force for good, specifically to detect and counter online harms. We’re a newly established company spun out of Cardiff University’s HateLab, led by founders with expertise in combating hate, AI, and cybersecurity. With a leadership team boasting a proven track record in securing funding, scaling operations, and successfully exiting startups, we are well-positioned to drive meaningful growth and innovation in this critical space. The advent of the UK Online Safety Act presents a unique opportunity for us to make a tangible impact, addressing urgent challenges in online safety and fostering a healthier digital ecosystem.
About the Role
As a Data Scientist at Nisien.ai, you'll develop AI models to detect and prevent online harms, working with engineers and policy experts to build scalable tools for content analysis and insights.
Key Responsibilities:
- Design and optimize ML models to detect online harms (misinformation, hate speech, harmful behavior)
- Analyze data and create visualizations to uncover online harm patterns
- Develop NLP solutions for text classification and content moderation
- Handle large datasets from social media and digital platforms
- Work with teams to integrate AI solutions and define requirements
- Track AI research advances in online safety and digital ethics
- Help build scalable data infrastructure for rapid tool development
This is an exciting opportunity to make a significant impact on a project from the ground up while working in a supportive and innovative environment.
Requirements:
- Bachelor's or Master's in Computer Science, Data Science, Statistics, or a related field
- Strong Python proficiency with deep learning frameworks (PyTorch, TensorFlow)
- Experience with large-scale data processing and SQL
- Proven track record of deploying ML models to production with cloud platforms (AWS, Azure)
Nice to Have:
- PhD in a relevant field
- Experience working with datasets related to social media, content moderation or cyber risk
Ready to Make an Impact? Apply now to join our mission of making the internet safer through AI innovation.
- Company Name
- Future Talent Group
- Job Title
- Senior Ruby Engineer - (SaaS / Tech4Good / GIS Data) - Remote-First Team
- Job Description
- Senior Ruby Engineer - (SaaS / Tech4Good / GIS Data) - Remote-First Team
Salary: £80,000 - £85,000
Location: Remote, with team meet-ups in the UK several times per year
Industry: GIS Data and PropTech
Would you like to join a growing scale-up that is not only disrupting an entire industry in a positive way but also creating real benefits for its platform users? You’ll work alongside leading specialists in GIS data, Data Engineering, and SaaS to build cutting-edge solutions.
Responsibilities:
- Develop and maintain robust, scalable, and secure Ruby-based applications
- Build APIs and integrations to power our platform and deliver seamless user experiences
- Collaborate with cross-functional teams, including Product, Design, and DevOps, to ship features quickly and efficiently
- Write clean, maintainable, and testable code following best practices and coding standards
Technical Skills:
- Proven expertise as a Ruby/Ruby on Rails developer in a fast-paced environment
- Strong understanding of RESTful APIs, database design, and system architecture
- Experience working as part of an agile team, utilising TDD and DevOps practices
- Proficiency in modern development tools and practices (e.g., Git, CI/CD pipelines)
- Knowledge of front-end technologies (e.g., JavaScript, HTML, CSS) is a plus
- Familiarity with cloud platforms, ideally AWS
If this opportunity excites you, please apply or feel free to reach out directly at scott@futuretalent.io
- Company Name
- Infinity Quest
- Job Title
- Data Engineer
- Job Description
- Job Description: As a Data Engineer with Iceberg experience, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives.
Key Responsibilities:
- Data Integration: Develop and maintain data pipelines to extract, transform, and load (ETL) data from various sources into AWS data stores for both batch and streaming data ingestion.
- AWS Expertise: Utilize your expertise in AWS services such as Amazon EMR, S3, AWS Glue, Amazon Redshift, AWS Lambda, and more to build and optimize data solutions.
- Data Modeling: Design and implement data models to support analytical and reporting needs, ensuring data accuracy and performance.
- Data Quality: Implement data quality and data governance best practices to maintain data integrity.
- Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and storage solutions to ensure optimal performance.
- Documentation: Create and maintain comprehensive documentation for data pipelines, architecture, and best practices.
- Collaboration: Collaborate with cross-functional teams, including data scientists and analysts, to understand data requirements and deliver high-quality data solutions.
- Automation: Implement automation processes and best practices to streamline data workflows and reduce manual interventions.
- Experience working with big data ACID file formats to build a delta lake, particularly with the Iceberg file format and its loading methods.
- Good knowledge of Iceberg functionality, using its delta features to identify changed records and performing optimization and housekeeping on Iceberg tables in the data lake.
Must have: AWS, ETL, EMR, Glue, Spark/Scala, Java, Python.
Good to have: Cloudera (Spark, Hive, Impala, HDFS), Informatica PowerCenter, Informatica DQ/DG, Snowflake, Erwin.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- 5 to 8 years of experience in data engineering, including working with AWS services
- Proficiency in AWS services like S3, Glue, Redshift, Lambda, and EMR
- Knowledge of Cloudera-based Hadoop is a plus
- Strong ETL development skills and experience with data integration tools
- Knowledge of data modeling, data warehousing, and data transformation techniques
- Familiarity with data quality and data governance principles
- Strong problem-solving and troubleshooting skills
- Excellent communication and teamwork skills, with the ability to collaborate with technical and non-technical stakeholders
- Knowledge of best practices in data engineering, scalability, and performance optimization
- Experience with version control systems and DevOps practices is a plus
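The Iceberg housekeeping duties this role describes (identifying changed records between snapshots, compacting small files, expiring old snapshots) are typically driven through Spark SQL `CALL` procedures and snapshot-bounded reads. The helper below only builds those statements and options as plain strings, so it runs without a Spark cluster; the catalog and table names are illustrative, and the procedure and option names reflect the standard Iceberg Spark procedures as commonly documented, so they should be checked against the Iceberg version in use:

```python
# Builds illustrative Spark SQL for routine Iceberg table maintenance.
# Catalog/table names are placeholders; procedure names follow Iceberg's
# documented Spark procedures (rewrite_data_files, expire_snapshots).

def compaction_sql(catalog, table):
    # Compact many small data files into fewer, larger ones.
    return f"CALL {catalog}.system.rewrite_data_files(table => '{table}')"

def expire_snapshots_sql(catalog, table, older_than):
    # Housekeeping: drop snapshots (and unreferenced files) older than a timestamp.
    return (f"CALL {catalog}.system.expire_snapshots("
            f"table => '{table}', older_than => TIMESTAMP '{older_than}')")

def incremental_read_options(start_snapshot_id, end_snapshot_id):
    # Spark read options for an incremental (changed-records) scan between snapshots.
    return {"start-snapshot-id": str(start_snapshot_id),
            "end-snapshot-id": str(end_snapshot_id)}

print(compaction_sql("glue_catalog", "sales.orders"))
```

On a real cluster these strings would be executed via `spark.sql(...)`, and the options passed to `spark.read.format("iceberg").options(...)` to read only the records added between two snapshots.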