SuperAnnotate

QA Engineer/Code Evaluator – AI Chatbot Response Evaluations (Excel)

Remote

United Kingdom

Freelance

10-04-2025

Job Specifications

Role Overview:
We are seeking a QA Specialist with deep expertise in Microsoft Excel to audit evaluations performed by Data Annotators reviewing AI-generated spreadsheet tasks. You will be responsible for verifying that annotators correctly assess instruction-following and data accuracy, test formula correctness, and validate proof-of-work outputs.

Responsibilities:
Audit annotator evaluations of AI-generated Excel responses, including formulas, functions, charts, and data transformations.
Validate code functionality and test cases provided by annotators for correctness and completeness.
Review instruction-following accuracy and proper rubric application in evaluations.
Identify incorrect or incomplete assessments, errors in logic, or flawed proof-of-work.
Provide constructive, concise feedback to annotators to uphold evaluation standards.
Collaborate with QA leads to align on rubric interpretation and process improvements.
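To make the proof-of-work idea concrete, here is a minimal, purely illustrative sketch of one check an auditor might script: recompute a SUMIF-style aggregate from the raw data and compare it with the value an AI-generated formula claims. The data, column names, and claimed value are all hypothetical, not from any real task.

```python
# Hypothetical proof-of-work check: does =SUMIF(A:A,"North",B:B) really
# return what the annotator's evaluation says it does?
from decimal import Decimal

rows = [  # stands in for data read out of the spreadsheet under review
    {"region": "North", "sales": "120.50"},
    {"region": "South", "sales": "99.25"},
    {"region": "North", "sales": "80.25"},
]

claimed = Decimal("200.75")  # value the evaluation claims the formula returns

# Recompute the aggregate independently and compare.
recomputed = sum(Decimal(r["sales"]) for r in rows if r["region"] == "North")
assert recomputed == claimed, f"mismatch: {recomputed} != {claimed}"
print("SUMIF check passed:", recomputed)
```

Using `Decimal` rather than floats avoids false mismatches from binary rounding when auditing currency columns.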

Required Qualifications:
7+ years of experience working with Microsoft Excel for data analysis, reporting, automation, or QA.
Proficient in advanced formulas (e.g., VLOOKUP, INDEX-MATCH, IF, nested logic), charts, data validation, pivot tables, and error handling.
Experience auditing Excel-based outputs, validating logic, and ensuring consistency.
Familiarity with spreadsheet auditing tools and Excel testing methods.
Strong written communication and analytical skills.
English proficiency at B2, C1, C2, or Native level.

Preferred Qualifications:
Prior experience with QA of AI-generated content or Excel-based annotation workflows.
Understanding of LLM behavior in generating structured outputs.
Familiarity with macros/VBA or Excel integrations (optional but beneficial).

Contract Type: Freelance/Contract/Part-Time

Why Join Us?
Become part of an innovative team at the forefront of AI technology. Your QA expertise will play an essential role in creating accurate, reliable, and safe AI solutions. The role is fully remote with flexible hours and a milestone-based structure, and compensation is competitive and based on project deliverables.

If you are an experienced QA specialist or code reviewer with strong skills and a keen eye for detail in evaluating AI-generated content, we encourage you to apply.

#Excel #QA #DataQA #Spreadsheet #AI #RemoteJobs #Automation #TechHiring #Freelance

About the Company

SuperAnnotate is the leading platform for building, fine-tuning, iterating, and managing your AI models faster with the highest-quality training data. With advanced annotation and QA tools, data curation, automation features, native integrations, and data governance, we enable enterprises to build datasets and successful ML pipelines. Partner with SuperAnnotate’s expert and professionally managed annotation workforce that can help you quickly deliver high-quality data for building top-performing models.

Related Jobs

Company Name
Ballpoint
Job Title
Data Scientist / MMM Lead (Fractional / PT)
Job Description
About us and our clients:
We’re Ballpoint, a creative growth agency based in London. We’re young and ambitious: since launching two years ago, we’ve grown 2.5x year over year and plan to continue that trajectory for the next five years. Ballpoint was born as it became clear creativity would be crucial for modern paid social advertising. The best agencies of the 2010s were either ‘performance’ agencies or ‘creative’ agencies; we’ve been both since day one. Many of our clients are similar to us, early innovators in their fields. We work with growing start-ups through to household names, each with their own growth challenges for us to break through.

The role in a nutshell:
A freelance econometrician to own our Marketing Mix Models. You’ll iterate and productionise MMMs in Meta Robyn (R) and Google Meridian (Python), then pressure-test them with lift studies and hold-outs. The goal: bullet-proof budget recommendations our clients can take straight to the CFO. You’ll plug into our existing data engineering, media buying, and creative pods to make measurement a competitive edge.

Important info:
This is a flexible role in regard to days/hours for the right person.
This is a one-month full-time or three-month part-time contract, paying £400–£500 a day depending on experience and skill set.

What you’ll do day to day:
Pull granular spend/KPI data from BigQuery, Snowflake, GA4, Meta & Ads APIs.
Clean, transform, and QA datasets; code lives in Git + CI.
Fit Robyn (ridge + evolutionary) and Meridian (Bayesian + JAX) models; compare accuracy, stability, and ROI outputs.
Calibrate with geo-tests, holdouts, and causal-impact checks.
Turn outputs into dashboards and optimisation scenarios; brief media buyers on re-allocations.
Document everything so any analyst can reproduce the work in one command.

Skills you must have:
R (tidyverse, prophet, Robyn) and Python (Jupyter, numpy/pandas, JAX, Meridian).
Solid econometrics: Bayesian regression, adstock, saturation, priors, cross-validation.
SQL (BigQuery); dbt or similar for pipelines.
Experiment design: lift, geo-split, incrementality.
Clear storyteller who can defend a model to finance, founders, and creatives.

Nice-to-have skills:
GCP or AWS infra, Docker, Airflow.
ML-Ops (CI/CD for notebooks, Feature Store).
Causal inference libraries (DoWhy, CausalImpact).
Visualisation in Looker, Tableau, or Streamlit.
Experience pushing MMM results into bid modifiers or scripted budget allocations.

Why we think you’ll love it at Ballpoint:
A culture that celebrates big ideas and supports bold thinking.
Work across brands like Tilly Sveaas, Mother Root, Origin Coffee & Thriva.
London-based office with monthly socials, plus summer and Christmas parties.
You’ll join a small but mighty team and be able to make your mark from day one.

Application process:
Please fill in the Typeform; we aim to respond to all applicants within 14 days to let you know if we’ll be progressing your application. If selected, the remaining process will be:
30-minute intro call with our Head of Creative Studio.
In-person technical interview with Ballpoint’s founder.
Final decision and offer.
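The adstock effect mentioned in the skills list can be illustrated with a short sketch. This is a generic geometric adstock transform, the media-carryover effect that MMM frameworks like Robyn and Meridian estimate; the decay rate and spend series below are made-up numbers, not values from any client model.

```python
# Geometric adstock: each period keeps a decaying fraction (theta) of the
# previous period's transformed spend, modelling advertising carryover.
def geometric_adstock(spend, theta):
    """Return the adstocked series for a list of per-period spends."""
    out, carry = [], 0.0
    for x in spend:
        carry = x + theta * carry  # today's spend plus decayed carryover
        out.append(carry)
    return out

weekly_spend = [100.0, 0.0, 0.0, 50.0]  # hypothetical spend in four weeks
print(geometric_adstock(weekly_spend, theta=0.5))  # [100.0, 50.0, 25.0, 62.5]
```

A burst of spend keeps contributing to the model for weeks after it stops, which is exactly what the decayed tail in the output shows.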
London, United Kingdom
On site
Freelance
06-05-2025
Company Name
Queen Square Recruitment
Job Title
Senior GCP Data Engineer
Job Description
Senior GCP Data Engineer (Contract)

Position: Senior GCP Data Engineer
Location: London (Hybrid – 3 days onsite)
Contract Type: 12-month contract
Day Rate: £500 (Inside IR35)
Start Date: ASAP
Client: Leading investment bank (Cyber Security division)

Why Join Us?
Innovative projects: lead the development of cutting-edge data solutions within the cyber security domain.
Collaborative environment: work alongside a team of seasoned professionals in a supportive setting.
Professional growth: engage in projects that offer opportunities for skill enhancement and career advancement.

Role Overview:
We are seeking a seasoned Senior GCP Data Engineer to spearhead data initiatives within our cyber security division. The ideal candidate will possess a robust background in data engineering, particularly within Google Cloud Platform (GCP), and will be adept at designing and implementing scalable data solutions.

Key Responsibilities:
API development: design, develop, and maintain robust and scalable backend systems and APIs.
Data ingestion: develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
Data transformation: implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency.
Data modelling: develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
Data integration: integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
Data lakes: build data lakes using Google Cloud services such as BigQuery.
Performance optimization: optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
Collaboration: work with product managers to ensure superior product delivery that drives business value and transformation.
Documentation: document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.

Required Skills & Experience:
8+ years of experience in data engineering, with a strong focus on GCP-based solutions.
Proficiency in the GCP platform, particularly in Data & AI services (e.g., BigQuery, DataProc, Cloud SQL, DataFlow, Pub/Sub, Cloud Data Fusion, Cloud Composer, Python, SQL).
Designing, developing, and deploying scalable, reliable, and secure cloud-based solutions using GCP services.
Translating business requirements into technical specifications.
Proficiently utilizing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud Storage, Cloud Functions, Cloud SQL, and BigQuery.
Implementing and managing GCP networking configurations.
Using Infrastructure as Code (IaC) tools like Terraform to automate infrastructure provisioning and management.
Implementing and managing Continuous Integration/Continuous Delivery (CI/CD) pipelines and automating deployment processes.
Implementing security best practices to protect cloud infrastructure and data.
Identifying and resolving performance bottlenecks.

Preferred Qualifications:
Experience with data analytics and Big Data technologies.
Knowledge of cloud security best practices and compliance standards.
Experience with agile development methodologies.
GCP certifications (e.g., Google Cloud Certified Professional Cloud Developer) are a plus.
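The cleansing and normalization work described in the responsibilities above can be sketched generically. This is a hedged, standard-library-only illustration; the field names and timestamp format are hypothetical, and a real GCP pipeline would run such logic inside DataFlow or a Cloud Composer task rather than a bare script.

```python
# Illustrative cleanse/normalize step for one raw event record:
# trim whitespace, canonicalise casing, coerce types, and convert the
# timestamp to UTC ISO-8601 so downstream joins behave consistently.
from datetime import datetime, timezone

def normalize_event(raw: dict) -> dict:
    """Return a cleaned copy of a raw ingested record (hypothetical schema)."""
    ts = datetime.strptime(raw["timestamp"].strip(), "%d/%m/%Y %H:%M")
    return {
        "source": raw["source"].strip().lower(),
        "timestamp": ts.replace(tzinfo=timezone.utc).isoformat(),
        "bytes": int(raw["bytes"]),  # raw feeds often deliver numbers as strings
    }

event = {"source": " Firewall ", "timestamp": "06/05/2025 09:30", "bytes": "4096"}
print(normalize_event(event))
```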
London, United Kingdom
Hybrid
Freelance
06-05-2025
Company Name
LHH
Job Title
Data Engineer
Job Description
Contract Data Engineers (x3) – Defence Sector – Bristol
Inside IR35 | DV Clearance Required or Willing to Undergo
Location: Bristol (Onsite/Hybrid) | Contract Duration: 6–12 months+

We are recruiting three experienced Data Engineers for a leading defence client in Bristol. This is a contract opportunity operating inside IR35, offering a chance to work in a high-security, mission-critical environment. You must already hold UK DV Clearance, or be eligible and willing to undergo the vetting process.

Key Responsibilities:
Design, develop, and manage complex, large-scale data pipelines in secure or regulated environments.
Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana).
Build and maintain robust data flows with Apache NiFi.
Implement best practices for handling sensitive data, including encryption, anonymisation, and access control.
Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability.
Write efficient, secure scripts and code using Python, Bash, or similar languages.
Collaborate with cross-functional teams to meet technical and operational requirements.

Essential Skills and Experience:
3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments.
Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana).
Solid experience with Apache NiFi.
Strong understanding of data security, governance, and compliance requirements.
Experience building real-time, large-scale data pipelines.
Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments.
Experience using Infrastructure as Code tools.
Excellent communication and stakeholder management skills.
Detail-oriented with a strong focus on data accuracy, quality, and reliability.

Desirable (Nice to Have):
Background in defence, government, or highly regulated sectors.
Familiarity with Apache Kafka, Spark, or Hadoop.
Experience with Docker and Kubernetes.
Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK.
Understanding of machine learning algorithms and data science workflows.
Proven ability to deliver end-to-end data solutions.
Knowledge of Terraform, Ansible, or similar IaC tools.

Interested? If you're ready to work on cutting-edge data engineering projects that make a difference, get in touch to learn more.
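The anonymisation responsibility mentioned above can be illustrated with a generic sketch: replacing a sensitive identifier with a keyed hash (pseudonymisation), so records stay joinable without exposing the raw value. The key and record below are hypothetical; a real deployment would fetch the key from a managed secret store, not hard-code it.

```python
# Field-level pseudonymisation sketch using a keyed HMAC. Unlike a plain
# hash, the secret key prevents dictionary attacks on low-entropy values
# such as email addresses.
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-key"  # hypothetical; load from a KMS in practice

def pseudonymise(value: str) -> str:
    """Deterministic 16-hex-char token for a sensitive string field."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"user": "alice@example.com", "action": "login"}
record["user"] = pseudonymise(record["user"])
print(record)
```

Because the mapping is deterministic under one key, the same user yields the same token across events, which preserves joins and aggregation while the raw identifier never leaves the ingest boundary.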
Bristol, United Kingdom
On site
Freelance
06-05-2025
Company Name
TJ REC SOLUTIONS
Job Title
Technical Data Business Analyst (ETRM)
Job Description
We are seeking a skilled and experienced Technical Data Business Analyst to join our Data & Analytics portfolio team. This role requires a deep understanding of the energy trading domain, strong data analysis skills, and the ability to work cross-functionally to deliver high-impact digital analytics solutions. You must have the right to work in the UK.

Location: Hybrid – London, UK office (3 days onsite, 2 days remote)
Duration: 12-month initial contract
Rate: £350–£400 per day (gross), Inside IR35

Core Skills Required:
ETRM (Energy Trading and Risk Management)
SQL
Oil & Gas or Energy Trading domain experience
Power BI / Tableau
Python
Data visualization and modeling skills

Qualifications & Experience:
4+ years of experience as a Business Analyst or Product Manager delivering data-centric solutions.
Domain expertise in Energy Trading, Financial Services, or Oil & Gas is essential.
Hands-on experience with SQL and working knowledge of data analytics tools (Python is a plus).
Familiarity with data visualization platforms (Power BI, Tableau, etc.).
Strong knowledge of Agile methodologies and tools such as Azure DevOps or JIRA.
Bachelor’s or Master’s degree in Computer Science, Data Science, Business, or a related field.

Key Responsibilities:
Drive the delivery of data products across workstreams within the Data & Analytics portfolio.
Collaborate with platform engineers, data engineers, and business stakeholders to gather and define product requirements.
Translate business needs into technical specifications, considering system capabilities and limitations.
Provide high-level assurance and testing of data products before business release.
Manage the entire product development lifecycle from ideation through to delivery.
Prioritize product features and work items for incremental delivery of business value.
Act as a liaison between technical teams and business stakeholders, managing communications, feedback, and expectations.
Stay up to date with data product trends, emerging technologies, and best practices to drive innovation.
Champion user-centered design by incorporating user feedback and research into product development.

If interested, please apply with an up-to-date CV.
London, United Kingdom
Hybrid
Freelance
06-05-2025