
Senior Business Analyst. Data Team. Market Data and Data Governance experience. Trading firm. London. 6-month rolling contract. £700/day inside IR35.
Hybrid
London, United Kingdom
Freelance
28-02-2025
Job Specifications
My client is a top-tier energy trading firm looking for a Data Business Analyst to come on board on a 6-month rolling contract paying £700/day inside IR35.
They are looking for a Senior BA with experience and knowledge of trading market data, ideally in energy/commodities, but they are open to candidates from any asset class as long as they are very experienced working as a BA in a trading setting. They are also keen for the BA to have experience working on Data Governance projects, e.g. policy setting, procedures, mandates, etc.
The Data team is looking for a Business Analyst to understand business problems, challenge existing practices and approaches, define scope and requirements, propose business improvement opportunities, and develop solutions that enable the Data Team to achieve its goals.
Key Accountabilities and Responsibilities
Engage key stakeholders to identify underlying business needs to be addressed
Translate business needs into project objectives and business goals
Accurately and exhaustively document requirements, information and insights
Ensure traceability of project objectives and business goals from requirements definition, design, and testing through to solution implementation
Identify and map stakeholders and actively manage them to ensure positive engagement throughout the project
Develop an approach to scope and elicit all potential requirements from stakeholders (functional and non-functional), prioritise and undertake requirements gathering, and ensure review/sign-off at the right level
Oversee the business analysis approach and agree governance, deliverables and resources, including time commitments from stakeholders
Manage any changes to requirements and stakeholder expectations to ensure agreement on the solution scope delivered
Validate proposed solutions to ensure they meet project objectives and business goals, are aligned with the strategy and the target operating model and can be implemented effectively
Provide input to testing preparations and activities to ensure the solution meets the identified business need
Review design and analyse impact of design decisions on requirements and stakeholder communications
Ensure compliance with Group policy, standards and reporting requirements together with all regulatory and statutory requirements
Provide domain knowledge and analysis capability, typically to identify additional business opportunities
Offer an independent review, including quality assurance
Manage business analysis activities throughout the project life cycle, including associated risks, issues and benefits delivery
Required Skills and Experience
Extensive experience of gathering technical and non-technical requirements.
Exposure to, and understanding of, big data technologies and messaging systems
Proven technical expertise in SQL for data extraction, transformation and analysis
Experience with financial services and/or commodities data, e.g. reference data, settlement data and market data, as well as private order feeds and trade capture
Experience with Agile and Waterfall methodologies, leading the work of others in a Project environment
Proven experience in dealing with stakeholders at all levels from analysts to senior management. Ability to build rapport, nurture business relationships and establish clear communication styles with business and technology teams.
Experience working with JIRA, Azure DevOps or other project management tools
Interested in and passionate about data, particularly understanding its sources and nomenclature, with strong attention to detail and precision.
Takes ownership of any issues that arise and facilitates their swift resolution, using own initiative while managing expectations.
A flexible approach and a willingness to move between multiple tasks and take on more responsibilities
Self-motivated and able to work remotely and unsupervised
Able to multitask, switch focus and prioritise own tasks, with exceptional time management and the ability to deliver under pressure with demanding front office users
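To give a flavour of the SQL proficiency called for above, here is a minimal, illustrative sketch of the kind of extraction and analysis a BA might run against trade-capture data. The table name, columns and figures are hypothetical, invented for this example only, and sqlite3 stands in for whatever database the firm actually uses.

```python
import sqlite3

# Hypothetical trade-capture table; schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trades (
        trade_id   INTEGER PRIMARY KEY,
        commodity  TEXT,
        volume_mwh REAL,
        price_gbp  REAL
    )
""")
conn.executemany(
    "INSERT INTO trades (commodity, volume_mwh, price_gbp) VALUES (?, ?, ?)",
    [("power", 100.0, 85.5), ("power", 50.0, 90.0), ("gas", 200.0, 60.0)],
)

# A typical analysis query: volume-weighted average price per commodity.
rows = conn.execute("""
    SELECT commodity,
           SUM(volume_mwh)                               AS total_volume,
           SUM(price_gbp * volume_mwh) / SUM(volume_mwh) AS vwap
    FROM trades
    GROUP BY commodity
    ORDER BY commodity
""").fetchall()

for commodity, total_volume, vwap in rows:
    print(f"{commodity}: {total_volume:.0f} MWh at VWAP {vwap:.2f} GBP")
```

The GROUP BY aggregation and weighted-average pattern shown here is representative of the extraction and transformation work the role describes, whatever the actual schema looks like.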
If you are interested, please send your CV for immediate consideration.
About the Company
CommuniTech are an exciting name in Tech Recruitment, seamlessly connecting the client and candidate communities to deliver exceptional technical talent to tech-driven companies, ensuring that together they thrive, exceed, and achieve. By striving to intertwine these communities, we get to know our clients and candidates better than ever before, providing recruitment solutions tailored to your individual needs.