
Project Brains
About the Company
Project Brains is a Future of Work platform, helping ambitious businesses grow.
We match business needs with fractional specialists who have the expertise to deliver successful outcomes, so businesses can focus on core priorities with support from our vetted community.
Listed Jobs


- Company Name
- Project Brains
- Job Title
- Scala Developer
- Job Description
-
Company Description
Project Brains is a modern talent platform connecting independent experts to ambitious businesses.
We’re currently supporting a global financial services business in its search for Scala Engineers to help enhance its data infrastructure and streaming capabilities.
Role Description
We are looking for experienced Scala Developers (10+ years) to join the team on a 6-month fixed-term contract, with a strong likelihood of extension. This is a Northampton-based hybrid role (2 days a week onsite).
You will design, develop, and optimise large-scale data processing systems using Scala, Spark, Kafka, and related technologies. The work involves building real-time and batch data pipelines, working with cloud-native tools (AWS, Databricks), and supporting modern data lake architectures.
Key responsibilities include:
- Building scalable ETL and streaming pipelines with Scala and Spark (see the sketch after this list)
- Working with data lake technologies such as Databricks Delta Lake, Apache Iceberg, or Apache Hudi
- Implementing event-driven architectures and distributed data solutions
- Collaborating with data scientists, engineers, and architects
- Supporting governance, security, and performance best practices
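To make the pipeline work concrete, here is a minimal sketch of such a streaming job: events read from Kafka, parsed, and appended to a Delta Lake table. It is written in PySpark for brevity (the role itself is Scala-first, and the Structured Streaming API is near-identical in Scala); the broker, topic, schema, and paths are hypothetical placeholders, and the Kafka and Delta connectors are assumed to be available.
```python
# Minimal sketch of a streaming ETL job: read events from Kafka,
# parse them, and append to a Delta Lake table. Broker, topic,
# schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical event schema, for illustration only.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Checkpointing makes the stream restartable after failure.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                               # placeholder path
)
query.awaitTermination()
```
In the real role the same shape would be expressed in Scala against the identical DataFrame API.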
Qualifications
- 10+ years’ software development experience with a strong Scala focus
- Deep experience with Apache Spark (batch and streaming), Kafka, and Flink
- Strong grounding in distributed systems, functional programming, and cloud platforms (AWS preferred)
- Skilled in containerisation (Docker, Kubernetes) and IaC (Terraform)
- Familiarity with JSON, Parquet, Avro, ORC formats and both SQL and NoSQL databases
- A background in financial services, or previous experience with Hadoop-to-Databricks migrations, is a plus
Apply online or email us at info@projectbrains.io if you're interested or know someone great.


- Company Name
- Project Brains
- Job Title
- Senior SAS Engineer
- Job Description
-
Overview:
Project Brains is a Future of Work platform that helps ambitious businesses grow by matching their needs with fractional specialists who deliver successful outcomes. Businesses can focus on core priorities with the help of vetted fractional specialists from our community.
We are seeking a highly skilled Senior SAS Data Migration Engineer to lead the migration of an existing Data Lake into Google Cloud. This is a critical role on a fast-paced, deadline-driven project targeting completion by June 30th, with a possible 14-day extension.
Key Responsibilities:
• Architect and implement data migration strategies from a SAS-based Data Lake to Google Cloud.
• Utilize programming and data processing tools such as SAS, SQL, dbt, git, Airflow (Python), and the ODE Generic Export Framework (a minimal Airflow sketch follows this list).
• Work with tools and software such as KNIME, VS Code, DIL-Pipelines, and InnovatorX (MDD Software, Interfaces, Documentation), and manage Iceberg/BigLake tables.
• Collaborate with cross-functional teams to ensure a seamless and efficient migration process.
• Adhere to project deadlines and provide regular updates on progress.
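To make the orchestration side concrete, below is a minimal sketch of an Airflow DAG chaining a SAS export step to a Google Cloud load step, assuming Airflow 2.x. The DAG id, task names, and function bodies are hypothetical placeholders; the real tasks would call the ODE Generic Export Framework and the client's GCP tooling, whose interfaces are not shown here.
```python
# Minimal Airflow DAG sketch: export from the SAS data lake, then
# load into Google Cloud. Function bodies are placeholders; the real
# tasks would invoke the ODE Generic Export Framework and GCP tooling.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def export_from_sas():
    # Placeholder: run the SAS export and stage the files for upload.
    ...


def load_to_gcp():
    # Placeholder: load the staged files into Iceberg/BigLake tables.
    ...


with DAG(
    dag_id="sas_to_gcp_migration",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,                  # triggered manually, one run per migration wave
    catchup=False,
) as dag:
    export = PythonOperator(task_id="export_from_sas", python_callable=export_from_sas)
    load = PythonOperator(task_id="load_to_gcp", python_callable=load_to_gcp)

    export >> load  # run the load only after the export succeeds
```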
Key Qualifications:
• Minimum of 5 years of relevant experience in SAS and data engineering.
• Proficient in programming and data processing tools: SAS, SQL, dbt, git, Airflow (Python), and the ODE Generic Export Framework.
• Experienced with additional tools and software: KNIME, VS Code, DIL-Pipelines, InnovatorX (MDD Software, Interfaces, Documentation), and Iceberg/BigLake tables.
• Demonstrated ability to meet strict deadlines in a high-pressure environment.
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration abilities.
Application Instructions:
Qualified candidates are encouraged to apply by submitting a resume and a cover letter outlining their experience and suitability for the role. You can contact us at info@projectbrains.io.


- Company Name
- Project Brains
- Job Title
- Prompt Engineering Internship
- Job Description
-
Company Description
Project Brains is a Future of Work platform that matches business needs with fractional specialists to help ambitious businesses grow. Businesses can focus on core priorities with the support of fractional specialists from our vetted community.
Role Description
This is a remote contract role for a Prompt Engineering Intern at Project Brains. The intern will be responsible for developing and maintaining the prompts embedded in our tools, collaborating with the team to create innovative solutions, and assisting in the optimisation of the prompts used in our delivery processes.
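To give a feel for the deliverables, here is a minimal sketch of a parameterised prompt template of the kind the intern would write, test, and iterate on; the task, wording, and field names are hypothetical examples, not Project Brains' actual prompts.
```python
# Minimal sketch of a parameterised prompt template. The task and
# wording are hypothetical examples, not Project Brains' actual prompts.
SUMMARISE_BRIEF = (
    "You are an assistant for a fractional-talent platform.\n"
    "Summarise the client brief below in three bullet points,\n"
    "then list the specialist skills it calls for.\n\n"
    "Client brief:\n{brief}\n"
)

def build_prompt(brief: str) -> str:
    """Fill the template with a specific client brief."""
    return SUMMARISE_BRIEF.format(brief=brief)

print(build_prompt("We need help migrating our reporting stack to the cloud."))
```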
Qualifications
- Prompt writing and testing skills
- Experience in AI applications
- Strong problem-solving and analytical skills
- Ability to work in a team and collaborate effectively
- Familiarity with prompt engineering tools and techniques
- Excellent communication and interpersonal skills


- Company Name
- Project Brains
- Job Title
- Python Data Dev
- Job Description
-
Work with Us
We’re Project Brains – a fractional talent platform powering the future of work.
Our client is an international tech services major working with one of the world’s most respected financial institutions. They’re hiring a Python Data Engineer to join their Internal Audit Data Team in Birmingham.
This role is perfect for someone who thrives on building robust data infrastructure in high-trust, regulated environments.
The Role
You’ll be building scalable data pipelines to support risk and audit analytics.
Expect to work with Python (Pandas, PySpark, NumPy) and cloud tools like AWS or Azure, orchestrating data ingestion from structured and unstructured sources across the firm.
Your pipelines will power strategic insights for internal auditors and stakeholders, so data quality, governance, and compliance are front and centre.
You’ll also collaborate closely with non-technical users, document workflows, and ensure every transformation is auditable and reproducible.
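To give a flavour of the work, here is a minimal sketch of an auditable batch step in PySpark: ingest raw records, apply a simple validation rule, and write both the validated and the rejected rows so every transformation stays traceable. The paths and column names are hypothetical placeholders.
```python
# Minimal sketch of an auditable batch step: ingest raw JSON records,
# apply a basic completeness check, and persist both outcomes so no
# record silently disappears. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("audit-ingest").getOrCreate()

raw = spark.read.json("s3://audit-raw/transactions/")  # placeholder path

# Keep rows that pass the check; everything else goes to quarantine
# so auditors can trace exactly what was excluded and why.
valid = raw.filter(col("transaction_id").isNotNull() & (col("amount") >= 0))
rejected = raw.subtract(valid)

valid.write.mode("overwrite").parquet("s3://audit-curated/transactions/")
rejected.write.mode("overwrite").parquet("s3://audit-quarantine/transactions/")
```
Writing both outputs keeps the pipeline reproducible and the rejection logic reviewable, which matters in an internal audit context.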
You’ll Bring
- 3+ years of hands-on Python (ideally with PySpark or similar big data tooling)
- Proven experience building ETL/ELT pipelines in a cloud-native environment
- AWS, GCP, or Azure experience in real-world deployments
- Strong grasp of data validation, governance, and compliance standards
- Bonus if you've worked in financial services, risk, audit, or regulated industries
- Confident communicator who thrives in cross-functional teams
Ready to build with purpose?
Reach out to the team at info@projectbrains.io.