BBC Studios

Data Analyst EXTEND (FTC 12 MONTHS)

On site

London, United Kingdom

Full Time

01-04-2025

Job Specifications

Package Description

Job Reference: 22595

Band: C

Salary: £32,567 to £35,000 base salary, plus London Weighting of £5,319, depending on relevant skills, knowledge and experience. The expected salary range for this role reflects internal benchmarking and external market insights.

Location: Office base is London TVC. (This is a hybrid role and the successful candidate will balance office working with home working.)

EXTEND

This role is advertised as part of our BBC Extend programme for disabled people. To apply, you should identify as deaf, disabled or neurodivergent and must meet either the definition of disability in the Equality Act (2010) or, if applying in Northern Ireland, the definition of disability in the Disability Discrimination Act (1995). You’re broadly defined as disabled under both acts if you have a physical or mental impairment that has a substantial and long-term adverse effect on your ability to do normal daily activities. This definition includes both apparent and non-apparent conditions and impairments, and medical conditions such as cancer, HIV or multiple sclerosis.

We are committed to making the process of applying for this role as accessible as possible. If you need to discuss adjustments or access requirements for the application process, or have any questions about our BBC Extend programme, please contact the BBC Extend team.

The BBC is fully committed to providing workplace adjustments to help eliminate the barriers in the workplace that disabled people face. To do this, we have our own dedicated BBC Access and Disability Service that provides assessments and support throughout employment with us. If you are successful in applying for this role and require workplace adjustments, we will work with you to get your adjustments in place. If you’d like more information on BBC Extend, please visit the BBC Extend webpage (Extend Code: EX2324).

About BBC Studios

BBC Studios is a world-renowned content studio and channels & streaming business, powered by British creativity, with a reach that touches audiences in every corner of the globe.

We work with outstanding creative talent who are responsible for platform-defining shows from Strictly Come Dancing to EastEnders, Prehistoric Planet to Planet Earth III.

The range and quality of our content is unsurpassed, creating critical and commercial successes and global phenomena.

From BAFTAs to RTS Awards, BBC Studios is Britain’s most awarded production company and the only producer with three of the top ten shows on IMDb; we’re the home of bbc.com, the widest-read English-language news website in the world; and the UK’s largest distributor of British content.

About Data & Insights

The Data & Insights team works with all parts of BBC Studios, but this role will focus in particular on Production. Partnering closely with colleagues in Insights, you’ll produce reporting and analysis to support this large and growing part of the business. You will:

Support the production community with insights that can inform future development and production
Support the content sales community, to promote our content by highlighting key facts for the programme and the audience
Deliver performance reporting for your stakeholders to inform decision making at all levels of the business from the strategic to the tactical.

THE ROLE

Are you passionate about turning data into actionable insights? We're looking for a Data Insights Analyst to help drive informed decision-making through robust analysis and compelling storytelling.

What You'll Do

Deliver inspiring and actionable insights and reporting, rooted in robust data, analysis and measurement.
Support Lead, Principal and Senior Analysts in delivering projects and fielding queries.
Help non-data-oriented colleagues understand where analysis can support their day-to-day roles, and build dashboards that let them self-serve their own data.
Maintain high standards of presentation, including the development of new or imaginative ways of using or communicating data and insights.
Develop and maintain documentation and a knowledge hub.

WHAT DOES IT TAKE?

Key Criteria

Strong analytical and problem-solving skills, demonstrated through a degree in a STEM subject, equivalent professional experience, or a proven track record in data-driven roles.
An analytical, naturally inquisitive mindset and an enjoyment of problem solving
Understanding of one or more data scripting languages such as Python, R or SQL (a brief illustrative sketch follows this list)
Experience of using one or more data visualisation tools (e.g. Tableau, Power BI or Looker)
Able to simplify complex problems into component parts and deal with them systematically
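
To give a flavour of the kind of data scripting and self-serve reporting these criteria describe, here is a minimal, hypothetical Python sketch. The file name and column names are assumptions made purely for illustration; they are not part of the role description or any BBC dataset.

```python
# Hypothetical sketch: summarising per-episode viewing figures with pandas.
# The file name and columns (programme, episode, platform, viewers) are
# illustrative assumptions, not real BBC data.
import pandas as pd

viewing = pd.read_csv("programme_viewing.csv")

# Aggregate to programme/platform level for a simple performance report
# that could feed a self-serve dashboard in Tableau, Power BI or Looker.
summary = (
    viewing.groupby(["programme", "platform"], as_index=False)
           .agg(total_viewers=("viewers", "sum"),
                avg_viewers_per_episode=("viewers", "mean"),
                episodes=("episode", "nunique"))
           .sort_values("total_viewers", ascending=False)
)

print(summary.head(10))  # top programmes by total viewers
```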

Desirable Criteria

Experience of working within the TV/media industry.
A good understanding of linear and on-demand television services, news media and online publishing, with knowledge of key competitors.
Understanding of how the digital and media landscape is evolving and the implications for all parts of the BBC

LIFE AT BBC STUDIOS

We don’t focus simply on what we do – we also care how we do it. Our values and the way we behave are important to us. Please make sure you’ve read about our values and behaviours here.

The BBC is committed to building a culturally diverse workforce that represents the audiences we serve, and encourages applications from candidates from any background, especially people from diverse communities. Equity of opportunity is important to us, and we endeavour to make our processes equal, fair and meritocratic for everybody. More information on our D&I plan can be found here.

BBC Studios puts sustainability at the heart of everything we do both onscreen and offscreen, including delivering against the BBC Group’s science-based Net Zero targets. More information on sustainability at BBC Studios can be found here.

We are proud to share that we are a Level 2 Disability Confident Employer and if you require any reasonable adjustments in order to apply please contact us on reasonable.adjustments@bbc.co.uk.

WHAT WILL YOU GAIN FROM WORKING AT BBC STUDIOS?

We offer a competitive salary package
Flexible 35-hour working week for work-life balance
26 days holiday (plus an additional day which is a Corporation Day) with the option to buy an extra 5 days
Up to a year of parental leave for new parents, regardless of gender, with 18 weeks fully paid
A defined contribution pension scheme with employer contributions of up to 10%, life assurance at four times annual salary, and eligibility for discounted dental, health care, gym membership and much more through salary sacrifice
Excellent career progression – access to courses, webinars, workshops and the opportunity to work in different areas of the organisation

NEXT STEPS

We appreciate your interest in this position and understand how important this opportunity is to you. Due to the high volume of interest, we may need to close the application period earlier than anticipated. This step is necessary to ensure we can manage and review applications effectively.


Related Jobs

Company Name: Derisk360
Job Title: GCCAI Architect
Job Description
Job Title: Google Dialogflow CX & CCAI Developer/Architect
Location: Remote/Hybrid (Pune, Bangalore, Hyderabad)
Experience Level: 5+ years
Job Type: Full-Time, Permanent

Job Summary
We are seeking a skilled Google Dialogflow CX & Contact Center AI (CCAI) Developer with hands-on experience in designing and implementing intelligent chatbots and voice bots. The ideal candidate should have strong expertise in Agent Assist, Event Handlers, NLP, API integrations and AI-driven automation. You will play a key role in building seamless conversational experiences for both chat and voice-based customer interactions.

Key Responsibilities
Google Dialogflow CX & CCAI Development: Design, develop and deploy Google Dialogflow CX-based chatbots and voice bots. Create natural conversation flows for diverse user intents across text and voice interfaces. Develop interactive and context-aware virtual agents with multi-turn conversations. Optimize Dialogflow CX models for improved accuracy, response handling and bot performance metrics.
Agent Assist & Event Handlers: Implement Google Agent Assist for real-time support to human agents. Configure Smart Reply, Event Handlers and real-time Agent Assist suggestions. Integrate knowledge bases to enhance Agent Assist recommendations. Analyze customer interaction data to fine-tune Agent Assist AI models.
API & System Integrations: Integrate Dialogflow CX chatbots with backend services, databases and enterprise systems. Develop custom API integrations to enable real-time data retrieval and process automation. Work with third-party telephony systems (Genesys, Avaya, Twilio, Amazon Connect). Implement authentication mechanisms (OAuth, JWT, API Gateway) for secure interactions.
AI, NLP & Machine Learning Optimization: Train and fine-tune Dialogflow CX models for better natural language understanding (NLU). Leverage Google Vertex AI and Generative AI to enhance chatbot responses. Utilize NLP concepts for both text and speech recognition to improve bot accuracy. Continuously monitor and refine AI-driven bot performance using real-world data.
Testing, Performance Monitoring & Analytics: Conduct unit and integration testing for chatbot and voice bot functionalities. Develop test cases and automation scripts to validate chatbot workflows. Monitor bot performance metrics, latency and accuracy to identify areas for improvement. Analyze customer interaction logs and apply data-driven improvements.
Best Practices & Documentation: Guide customers on best practices for chatbot and voice bot development. Maintain comprehensive documentation for chatbot flows, APIs and integrations. Collaborate with UX designers to ensure human-like and intuitive interactions. Provide technical support and troubleshooting for chatbot deployments.

Required Skills & Qualifications
5+ years of experience in chatbot development, preferably with Google Dialogflow CX. Hands-on experience with Agent Assist, Event Handlers and Smart Reply configuration. Strong expertise in NLP, AI and machine learning concepts for both text and voice-based bots. Experience in API integration and web development using Node.js, Python or Java. Familiarity with Google Cloud Platform (GCP) services such as Cloud Functions and Firebase. Experience with contact center solutions (Genesys, Amazon Connect, Avaya, Twilio, UJET). Strong analytical skills for chatbot performance monitoring and data analysis. Experience with unit and integration testing for chatbot and voice bot applications. Excellent problem-solving, debugging and troubleshooting skills. Strong communication and collaboration skills to work with cross-functional teams.

Nice To Have (Preferred Qualifications)
Experience with other chatbot and voice bot platforms (AWS Lex, Kore.ai, Rasa, Watson). Knowledge of Generative AI and Google Vertex AI for advanced chatbot enhancements. Experience with UX/UI principles to create human-like chatbot interactions. Exposure to machine learning models for speech and text processing.

Why Join Us?
Opportunity to work on cutting-edge AI-driven conversational experiences. Collaborate with industry experts in AI, NLP and cloud technologies. Work in an innovative and growth-focused environment. Competitive compensation and professional growth opportunities.
London, United Kingdom
Hybrid
Full Time
07-04-2025
Company Name: EPAM Systems
Job Title: Lead Data Engineer (Capital Markets Risk - Data Visualization)
Job Description
As one of the world’s leading digital transformation service providers, we are looking to actively expand our Data Practice across the UK to meet increasing client demand for our services. We are hiring a Lead Data Engineer with a specific focus on Capital Markets Risk. The ideal candidate will have extensive experience designing, developing and optimizing data pipelines and infrastructure within the capital markets risk domain. This role requires expertise in risk analytics, data engineering best practices and data visualization.

Responsibilities
Design, develop and optimize data pipelines and infrastructure for risk analytics in capital markets. Implement and manage workflows, DAGs and tasks in Apache Airflow, ensuring adherence to best practices. Deploy and manage cloud database services, with a preference for Snowflake. Utilize SQL and Python to manipulate and analyze data effectively. Develop and maintain risk technology solutions, including VaR, Stress Testing, Sensitivities and P&L Vectors. Create market risk and credit risk reports, defining and calculating relevant KPIs. Work as a bridge between business and technology teams to enable insights for better decision-making. Develop dashboards and visualizations using tools such as Qlik, Power BI or similar platforms. Implement cloud-based solutions using modern cloud technologies, preferably AWS. Ensure high data quality standards through validation and governance processes. Collaborate with cross-functional teams to deliver robust data solutions. Utilize modern SDLC tooling such as Git, Bamboo, Jira or similar. Troubleshoot and resolve complex data-related issues efficiently.

Requirements
Experience in risk analytics, including market risk or credit risk. Strong software development skills in Python or another common programming language (e.g. Java or Scala). Proficiency in deploying and managing cloud database services, particularly Snowflake. Advanced skills in SQL and Python for data manipulation and analysis. Expertise in data visualization tools such as Qlik, Power BI or similar platforms. Hands-on experience implementing cloud-based solutions with AWS or similar cloud platforms. Familiarity with modern SDLC tooling, including Git, Bamboo, Jira or similar.

Nice to have
Strong understanding of derivatives, pricing and risk management for structured products, options and exotic derivatives. Knowledge of additional data processing libraries and tools to enhance data engineering workflows. Experience with HTML, CSS and JavaScript, using libraries like React or Angular for enhanced visualizations. Expertise in Apache Airflow, including designing and managing workflows. Experience in real-time data processing frameworks such as Apache Flink or Kafka Streams.

We offer
EPAM Employee Stock Purchase Plan (ESPP). Protection benefits including life assurance, income protection and critical illness cover. Private medical insurance and dental care. Employee Assistance Program. Competitive group pension plan. Cyclescheme, Techscheme and season ticket loans. Various perks such as free Wednesday lunch in-office, on-site massages and regular social events. Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more. If otherwise eligible, participation in the discretionary annual bonus program. If otherwise eligible and hired into a qualifying level, participation in the discretionary Long-Term Incentive (LTI) Program.

*All benefits and perks are subject to certain eligibility requirements.
London, United Kingdom
On site
Full Time
06-04-2025
Company Name: Mayflower Recruitment Ltd
Job Title: Senior Data / Cloud Engineer - AWS - Manchester
Job Description
We are looking for a Senior Data/DevOps Engineer for a growing client near Manchester. Your role will primarily be to perform DevOps, backend and cloud development on the data infrastructure to develop innovative solutions to effectively scale and maintain the data platform. You will be working on complex data problems in a challenging and fun environment, using some of the latest open-source big data technologies like Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda, to develop scalable data solutions.

Responsibilities:
• Adhering to Company Policies and Procedures with respect to Security, Quality and Health & Safety.
• Writing application code and tests that conform to standards.
• Developing infrastructure automation and scheduling scripts for reliable data processing.
• Continually evaluating and contributing towards using cutting-edge tools and technologies to improve the design, architecture and performance of the data platform.
• Supporting the production systems running the deployed data software.
• Regularly reviewing colleagues' work and providing helpful feedback.
• Working with stakeholders to fully understand requirements.
• Being the subject matter expert for the data platform and supporting processes, and being able to present to others to share knowledge.

Here's what we're looking for:
The ability to problem-solve. Knowledge of AWS or equivalent cloud technologies. Knowledge of serverless technologies, frameworks and best practices. Apache Spark (Scala or PySpark). Experience using AWS CloudFormation or Terraform for infrastructure automation. Knowledge of Scala or an OO language such as Java or C#. SQL or Python development experience. High-quality coding and testing practices. Willingness to learn new technologies and methodologies. Knowledge of agile software development practices, including continuous integration, automated testing and working with software engineering requirements and specifications. Good interpersonal skills, a positive attitude and a willingness to help other members of the team. Experience debugging and dealing with failures on business-critical systems.

Preferred:
Exposure to Apache Spark, Apache Trino or another big data processing system. Knowledge of streaming data principles and best practices. Understanding of database technologies and standards. Experience working on large and complex datasets. Exposure to data engineering practices used in machine learning training and inference. Experience using Git, Jenkins and other CI/CD tools.

Mayflower is acting as an Employment Agency in relation to this vacancy.
Manchester, United Kingdom
Hybrid
Full Time
07-04-2025
Company Name: Cognitive Group
Job Title: Cloud Engineer - Databricks - Terraform - Azure
Job Description
Cloud Engineer (2 days per week on-site in Edinburgh)

About the Role
We are looking for a highly skilled Cloud Engineer with expertise in Databricks and Terraform to join our dynamic team. You will be responsible for designing, developing, testing and deploying Azure-based cloud platform components, ensuring seamless Continuous Integration and Continuous Deployment (CI/CD).

Key Responsibilities
Design and develop scalable cloud solutions using Azure, Databricks and Terraform. Implement and optimize CI/CD pipelines to facilitate seamless software deployment. Ensure high-quality validation of changes with a strong emphasis on automation. Conduct peer reviews of code and documentation to maintain accuracy and completeness. Provide second-line support, troubleshooting cloud-based services and infrastructure. Stay up to date with industry certifications and emerging cloud technologies. Identify and deliver innovative solutions to enhance cloud efficiency and performance.

What We’re Looking For
Strong experience with Azure, Databricks and Terraform. Expertise in CI/CD pipelines and DevOps methodologies. Proficiency in scripting and automation using Python, PowerShell or similar languages. Experience with IaC (Infrastructure as Code) best practices. Ability to troubleshoot and optimize cloud-based environments. Excellent communication and collaboration skills.
Edinburgh, United Kingdom
Hybrid
Full Time
03-04-2025