
Coding Expertise for AI Training
Remote
Belgium
Full Time
03-04-2025
Job Specifications
Outlier helps the world’s most innovative companies improve their AI models by providing human feedback. Are you an experienced software engineer who would like to lend your coding expertise to train AI models?
We partner with organizations to train AI large language models, helping cutting-edge generative AI models write better code. Projects typically include discrete, highly variable problems that involve engaging with these models as they learn to code. There is no requirement for previous AI experience.
About The Opportunity
Outlier is looking for talented coders to help train generative artificial intelligence models
This freelance opportunity is remote and hours are flexible, so you can work whenever is best for you
You may contribute your expertise by…
Crafting and answering questions related to computer science in order to help train AI models
Evaluating and ranking code generated by AI models
Examples Of Desirable Expertise
Currently enrolled in or completed a bachelor's degree or higher in computer science at a selective institution
Proficiency working with one or more of the following languages: Java, Python, JavaScript / TypeScript, C++, Swift, and Verilog
Excellent attention to detail, including grammar, punctuation, and style guidelines
Payment
Currently, pay rates for core project work by coding experts range from USD $12.50 to $50 per hour.
Rates vary based on expertise, skills assessment, location, project need, and other factors. For example, higher rates may be offered to PhDs. For non-core work, such as during initial project onboarding or project overtime phases, lower rates may apply. Certain projects offer incentive payments. Please review the payment terms for each project.
About the Company
Outlier is a remote work platform that offers freelancers flexible, well-compensated opportunities across various domains. We've paid out over $100M to more than 50,000 freelancers around the world. Contributors on our site help improve the world's best AI systems. Get started by applying on our website today!
Related Jobs


- Company Name
- Tropos.io
- Job Title
- Data Analytics Solution Architect
- Job Description
- About Tropos
At Tropos, we deliver end-to-end data analytics solutions that lead to actionable business insights. Collaborating with clients across diverse industries, our innovative team leverages cutting-edge technologies to tackle real-world data challenges and drive long-term success. Within the company, we embrace a can-do, no-nonsense attitude. We work together as a team and maintain an innovative startup mentality.

About The Job
We are seeking an experienced Solution Architect in Data Analytics to lead the design and implementation of data-driven solutions for our clients. In this role, you will bridge the gap between business needs and technical execution, ensuring scalable, secure, and robust analytics solutions that deliver actionable insights.

Key Responsibilities
- Architecture Design: Develop and define end-to-end architecture for data analytics solutions, aligning with business requirements and technical constraints.
- Technology Leadership: Evaluate and recommend tools, frameworks, and platforms for data ingestion, storage, processing, and visualization.
- Collaboration: Partner with our project manager, technology partners, and our team of data engineers to ensure solutions align with business objectives.
- Data Strategy: Define data modeling, governance, and integration strategies to ensure high-quality, accessible data.
- Scalability & Performance: Design architectures that are scalable, cost-effective, and optimized for performance.
- Security & Compliance: Ensure that solutions comply with data security, privacy, and regulatory standards.

Profile
- Proven experience as a Solution Architect, Data Architect, or similar role in data analytics projects.
- Deep understanding of data warehousing, ETL/ELT processes, and analytics platforms (e.g. Snowflake, …).
- Proficiency in cloud platforms such as AWS, Azure, or Google Cloud (e.g., Redshift, BigQuery, Databricks).
- Strong knowledge of data modeling techniques, SQL, and big data technologies (e.g., dbt, Hadoop, Spark).
- Experience with business intelligence tools like Tableau, Power BI, or Looker.
- Familiarity with machine learning workflows and tools is a plus.
- Exceptional communication and stakeholder management skills.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- A team player with a coaching attitude and a growth mindset towards oneself and others.
- Fluent in Dutch and English; French is a plus and other languages a bonus.

IN RETURN, WE OFFER
A competitive salary package with extra-legal benefits, including:
- a company car / legal mobility budget
- hospitalization and group insurance
- flexible and tax-friendly options via a cafeteria plan
- a high-end laptop
- expense allowances
- meal and eco-vouchers
A flexible and hybrid way of working: a healthy mix between home and on-site work.

INTERESTED? APPLY NOW!
If we think there could be a match, you'll be invited for a short intake interview (video call). At Tropos, we embrace diversity in all its forms: backgrounds, experiences, talents, and perspectives. We believe that these differences enrich our team and fuel creativity.


- Company Name
- Vivid Resourcing
- Job Title
- Data Engineer
- Job Description
- Join a dynamic team where your expertise in Google Cloud Platform fuels innovation, drives smarter decisions, and supports business growth through powerful data engineering. We're on the lookout for a skilled GCP Data Engineer to design and implement efficient, secure, and scalable data pipelines. Your work will directly empower real-time analytics, optimise data storage, and enhance operational efficiency using the latest GCP tools.

What You'll Be Doing
- Design, develop, maintain, and optimize data pipelines on Google Cloud Platform.
- Build and manage data warehouses and lakes using tools like BigQuery, Cloud Storage, and Dataflow.
- Leverage GCP services including Cloud Pub/Sub, Cloud Composer, and Dataflow to ensure seamless data flow and processing.
- Use your expertise in SQL, along with experience in relational and NoSQL databases.
- Transform and process data using Python, Spark, or Scala.
- Implement secure, scalable ETL frameworks that deliver value and insights.

What's In It For You?
- Competitive salary.
- Flexible hybrid working: two days a week onsite.
- Work with the latest GCP data engineering technologies.
- Collaborate with a close-knit team of ~50 passionate data professionals.
- Explore a wide range of industries, tech stacks, and project types.

We're more than just colleagues; we're a village of data enthusiasts who love what we do, help each other grow, and make space for fun along the way. If this is the next big step in your career, reach out at +31 (0) 203 997 864 or euan.mccaul-gallimore@vividresourcing.com!


- Company Name
- ENGIBEX
- Job Title
- Software and AI Quality Assurance Engineer
- Job Description
- We are a Belgian engineering consulting firm that assists SMEs and large industrial organizations in bringing their innovation and R&D initiatives to life.

Job Description
- Gain a comprehensive understanding of the sorter's mechanical and optical systems to perform system integration tests.
- Ensure smooth integration and proactively identify potential integration issues.
- Develop and implement quality deliverables such as exploratory testing, automated test cases, reports, and documentation.
- Identify and document edge test cases and execute tests carefully, logging and documenting results meticulously.
- Analyse tests and report outcomes accurately and objectively to stakeholders.
- Design and maintain software quality assurance processes.
- Analyse and present test results; identify technical issues and their root causes.
- Conduct statistical analysis.
- Participate in technical meetings for requirement refinement.
- Gain hands-on work experience with industrial machines.
- Drive the automation test infrastructure strategy by researching methodologies and proposing new tools and equipment.
- Design automation infrastructure in collaboration with other engineers to streamline test case definition.

Profile / Requirements
- Bachelor's or master's degree in Software Engineering, Industrial Engineering, or equivalent.
- Ability to write production-quality code in Python and other programming/scripting languages as needed.
- Experience in manual testing for web-based and/or desktop applications.
- Familiarity with automation test frameworks and equipment (e.g., robot_python, hardware-in-the-loop systems).
- Knowledge of LAN networking and relevant troubleshooting.
- Experience with Git.
- Strong communication skills for effective interactions with project stakeholders, including presenting content and reporting issues.
- Analytical and structured problem-solving skills.
- Hands-on, proactive, and engaged in onsite work.
- Willing and able to travel.

Nice To Have
- Knowledge of Behaviour Driven Development (BDD) and writing feature files.
- Experience with product lifecycle systems such as Azure DevOps.
- Experience with Agile methodologies (Lean development).
- Knowledge of DevOps infrastructure administration and maintenance: TeamCity and/or GitLab.
- Experience testing in the C# or C++ ecosystem.
- Experience with PyTest.
- Experience with image processing and/or computer vision systems.
- Experience testing AI systems.
- ISTQB Foundation Level certificate.


- Company Name
- ING Belgium
- Job Title
- Senior DataStage Engineer
- Job Description
- What will you do?
You work in a squad as an IT Engineer to unlock a wide range of sources to our Information Warehouse and Data Marts. Sources deliver in batch but also in real time. Modeling in the Information Warehouse is based on a 'Data Vault'-like model. Data Marts are dimensionally modeled and are accessible to end users through Cognos Analytics. Resources and their metadata are first described by the Business in the Information Governance Catalog and are logically modeled where necessary using the IBM BDW (Banking Data Warehouse) reference model. Your work covers the end-to-end data integration process, from technical analysis to deployment and management.

Your work environment
ING works agile in multi-disciplinary teams (i.e. squads) based on Scrum and DevOps. As an engineer you work in a squad, but you are also part of a cross-squad, expertise-focused chapter: in your case, the Data Processing chapter. Check our way of working: https://www.youtube.com/watch?v=NcB0ZKWAPA0

Who are you?
You are an enthusiastic, motivated engineer who enjoys working in a team on complex data integration solutions. You automate your own work: instead of building the same job several times, standardizing and automating work is deeply rooted in your engineering DNA. This applies not only to job building but to complete end-to-end delivery, including automated testing and continuous delivery. You don't shy away from challenges; in fact, you take them on and share your solutions with your colleagues. You also endorse the values of ING, consider it natural to behave in accordance with these values in your work, and are, like all ING employees, prepared to take the banker's oath.

In addition, you recognize yourself in the following profile:
- Fluency in English
- Experience in end-to-end delivery of a data integration solution, including deployment and management
- Working experience with IBM InfoSphere DataStage is a must
- Knowledge of SAS Enterprise Guide is a big plus
- Experience with scheduling tools (preferably UAC)
- Working experience in SQL
- Knowledge of Oracle is a plus
- Knowledge of UNIX/Linux is a plus
- Experience in agile development (preferred)
- Familiarity with test-driven development (preferred)
- Good knowledge of dimensional modelling
- Working experience with Git
- Familiarity with Continuous Delivery