Data Analyst - Paris - 6 months+
On site
Paris, France
Freelance
11-11-2024
Job Specifications
Global Enterprise Partners is currently looking for a Paris-based Data Analyst to assist a global FMCG client with their AI-driven global supply chain forecasting tool. We are looking for passionate and proactive people who can make a difference and add value to the organization.
The Data Analyst should have the following skillset (a brief illustrative sketch of this kind of work follows the list):
5+ years of Data Analytics experience
Advanced level in SQL and Python (pandas library and visualization packages)
Experience with DBT is a strong plus
Advanced level in Power BI or similar data visualization solution
Understanding of data flow & data architecture principles
Fluent in both English AND French
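For illustration only (not part of the job spec): a minimal pandas sketch of the kind of preparation and sanity-checking work the role implies, turning raw shipment records into a monthly demand series that could feed a forecasting tool. The file name and the ship_date / sku / units columns are hypothetical assumptions, not details from the posting.

```python
# Illustrative only: the kind of pandas work the role implies.
# The CSV path and column names (ship_date, sku, units) are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

# Load raw shipment records (hypothetical file and schema).
shipments = pd.read_csv("shipments.csv", parse_dates=["ship_date"])

# Aggregate to a monthly demand series per SKU, a typical input
# for a supply chain forecasting tool.
monthly = (
    shipments
    .set_index("ship_date")
    .groupby("sku")
    .resample("MS")["units"]
    .sum()
    .reset_index()
)

# Quick visual sanity check for one SKU before it feeds the forecast model.
sku = monthly["sku"].iloc[0]
ax = monthly[monthly["sku"] == sku].plot(
    x="ship_date", y="units", title=f"Monthly units - {sku}"
)
ax.set_ylabel("units shipped")
plt.show()
```

In practice the raw data would more likely come from a SQL query or a DBT model rather than a flat file; the aggregation and plotting steps stay the same.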
Contract details:
Start date: ASAP - flexible
Type of contract: long-term rolling contract
Time zone: CET
Location: Paris - hybrid
Interested? Let's connect!
Razvan
About the Company
Global Enterprise Partners is a specialist enterprise technology recruitment agency that places specialists in Enterprise Software and Business Process globally. We supply consultants compliantly in over 100 countries. Our network reaches all major technology hubs across the world, and we are geared to provide compliant recruitment solutions for IT projects. Global Enterprise Partners is the safe and reliable partner of choice to support your global IT initiatives. We have exclusive focus across all Enterprise Software v...
Related Jobs
- Company Name: Innova Solutions
- Job Title: Data Architect
- Job Description: Data Engineer / Data Architect - Freelance. We are recruiting a Data Engineer / Data Architect for one of our clients located in the Paris region.
  Missions:
  - Provide technical and organizational support to the operations team to ensure an effective transition to Cloudera Data Platform.
  - Build and execute the architecture roadmap (rationalization of environments - Hadoop distribution - backup strategy - onboarding process: new use case, new user - secret management - SSSD - redesign of the LDAP organization - permission management - SSO - high availability - data and use-case catalogue...).
  - Develop and apply data governance principles to ensure optimal control and management of data.
  - Step in to resolve blockers related to deployment, tool installation, or debugging.
  - Provide technical expertise on technology choices, performance optimization, and development best practices.
  Technical stack:
  - Storage: Hadoop, Hive, HBase, PostgreSQL, Ceph
  - Distributed computing: Spark, Yarn
  - Prototyping: Jupyter, Dataiku, Hue, Airflow
  - Ingestion: Kafka, NiFi
  - Visualization: Grafana, Tableau
  - Administration: Kerberos, Zookeeper, Solr, Ranger, Atlas, LDAP
  - Proxying: Knox, HAProxy
  - Monitoring: Prometheus
  - Configuration management: Puppet, Ansible
  - Secret management: Vault
  - Cloud computing: OpenStack
  Desired profile:
  - At least 4 years of professional experience as a data engineer / data architect.
  - In-depth mastery of Python, Spark, Hadoop and Cloudera Data Platform.
  Welcome to Innova Solutions. Both a recruitment agency and a consulting firm, Innova Solutions has nationwide coverage in France. Based in Sophia Antipolis, at the heart of the French Riviera, we support our clients throughout France and offer our candidates positions in the IT, Telecoms and Engineering sectors. Our clients have access to more than 85 offices across the world: in the United States, in Europe (Belgium and Nice), in the United Kingdom, in Singapore and in India.
- Company Name: DATAPY
- Job Title: Data Engineer - Informatica Cloud (IICS/IDMC) & Snowflake
- Job Description:
  Key Responsibilities:
  - Data Integration Engineering: Design and implement robust data pipelines to integrate data into Snowflake using Informatica Cloud (IICS/IDMC). Ensure pipelines are efficient, scalable, and reliable to meet organizational requirements.
  - Monitoring & Optimization: Establish and manage key performance indicators (KPIs) for data processes, including volume of integrated data, processing speed and error rates, and success and failure rates of data pipelines. Continuously monitor the performance of workflows and implement optimizations as needed (a minimal KPI sketch follows this listing).
  - Data Quality Assurance: Define and enforce data quality standards, focusing on completeness, accuracy, and consistency of data, and on detection and resolution of data errors and anomalies. Troubleshoot integration issues to maintain a high level of data integrity.
  - Collaboration & Reporting: Collaborate with cross-functional teams to align data engineering efforts with business objectives. Provide actionable insights and regular reports on pipeline performance and data quality to stakeholders.
  Qualifications and Experience:
  - 8+ years of experience in data engineering or a related role.
  - Hands-on expertise in Informatica Cloud (IICS/IDMC) and Snowflake.
  - Strong experience in building and optimizing data pipelines.
  - In-depth knowledge of data quality assurance and monitoring best practices.
  - Excellent troubleshooting and problem-solving skills for data integration issues.
  Preferred Skills:
  - Previous experience in the pharmaceutical or life sciences industry is an advantage.
  - Effective communication skills for remote collaboration.
  Contract Type: Renewable.
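Purely as a hedged illustration of the monitoring responsibility above (none of this comes from the posting): a small pandas sketch that derives success/failure and error-rate KPIs from a hypothetical run-log export. The pipeline names, statuses, and column names are invented for the example.

```python
# Illustrative sketch only: computing pipeline KPIs of the kind the role
# describes (success/failure rates, error rates) from a hypothetical
# run-log table exported from the integration platform.
import pandas as pd

# Hypothetical schema: one row per pipeline run.
runs = pd.DataFrame(
    {
        "pipeline": ["sales_to_snowflake", "sales_to_snowflake", "crm_to_snowflake"],
        "status": ["SUCCESS", "FAILED", "SUCCESS"],
        "rows_loaded": [120_000, 0, 45_000],
        "rows_rejected": [15, 0, 3],
    }
)

kpis = runs.groupby("pipeline").agg(
    runs=("status", "size"),
    success_rate=("status", lambda s: (s == "SUCCESS").mean()),
    rows_loaded=("rows_loaded", "sum"),
    rows_rejected=("rows_rejected", "sum"),
)

# Error rate = rejected rows as a share of all rows seen by the pipeline.
kpis["error_rate"] = kpis["rows_rejected"] / (kpis["rows_loaded"] + kpis["rows_rejected"])
print(kpis)
```

In a real setup the run log would come from the integration tool's monitoring tables or API rather than an in-memory DataFrame; the KPI calculations themselves would be the same.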
- Company Name: Coforge U.K. Ltd
- Job Title: Big Data Engineer
- Job Description:
  Role: Big Data Engineer
  Skills: Big Data, MapR, Spark, Hadoop, Kafka and Impala
  Location: Nice, France
  Type: Contract
  We at Coforge are hiring a Big Data Engineer with MapR, Spark, Hadoop, Kafka and Impala experience (an illustrative PySpark sketch follows this listing).
  - Hands-on experience writing production-quality code in Scala (preferred) and Spark.
  - Hands-on experience with big data technologies, including the MapR filesystem, Apache Spark, Hadoop, Kafka and Impala.
  - Proven experience with the Apache Spark architecture and components.
  - Hands-on experience with scripting languages, including Bash and Python.
  - Knowledge of different database technologies, including SQL; previous work with NoSQL databases such as MongoDB is a plus.
  - Solid knowledge of build tools, in particular Maven and SBT.
  - Experience building and maintaining DevOps pipelines is a plus.
  - Hands-on experience with Azure cloud technologies, including Azure Databricks and Databricks SQL.
  - Professional experience of 4-9 years.
  - Knowledge of software design under Agile frameworks (Scrum, Kanban, SAFe).
  - Proven experience designing and maintaining applications using Scala (PySpark), the MapR distribution, the Hadoop ecosystem, Kafka and Impala.
  - Hands-on experience with Microsoft Azure and Databricks.
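A hedged illustration of the kind of Spark work the listing describes (the listing itself prefers Scala; this sketch uses PySpark to stay consistent with the Python used elsewhere on this page). The input/output paths and column names are hypothetical assumptions, not details from the posting.

```python
# Illustrative sketch only: a small PySpark batch job that rolls raw events
# up into a daily aggregate. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Read raw events (hypothetical dataset) and roll them up per day and type.
events = spark.read.parquet("/data/raw/events")
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("user_id").alias("users"),
    )
)

# Write the aggregate partitioned by date, a common layout on MapR/HDFS.
daily.write.mode("overwrite").partitionBy("event_date").parquet("/data/agg/daily_events")
spark.stop()
```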
- Company Name: Antler
- Job Title: ML / AI Engineer & Startup Founder (March 2024)
- Job Description:
  Note: This ad is not for a job, but rather for the opportunity to build your own VC-backed startup! Antler France is part of Antler Continental Europe. For the September Residency, you will need to relocate to Amsterdam, Berlin, or Munich for the entire 10-week duration. After forming your co-founding team, you will receive support from the Antler Paris office and can then relocate to France.
  Who are you? You are an ML or AI engineer with experience in API model integration, ML-powered model optimization, or AI-powered robotics. Do you have strong problem-solving abilities, a leadership track record, and a passion for developing tech solutions using cutting-edge technologies? Have you spent your career building game-changing software or hardware for early-stage startups, scale-ups or big corporations? Most importantly, are you looking to build a disruptive tech venture in the vibrant ecosystem of Paris? If so, we would like to hear from you!
  About Antler: Antler is the world's most active venture capital firm, providing funding from pre-seed to Series C. We invest in founders who seek to define our tomorrow and reshape the world we live in with groundbreaking technological innovation. Every year, we handpick the strongest founders across our 30 locations around the globe, which has led us to work with more than 8,000 founders to date and a portfolio of more than 1,200 startups.
  The Antler Founder Residency: Our 10-week, full-time program is designed to give you the best possible start for your entrepreneurial journey. Whether you have founded a company before or are just getting started, we will help you find a co-founding team among the 50-70 people joining the program. Founders who join our programs come from a variety of backgrounds and bring an assortment of skills and expertise. Our program helps you and your co-founder quickly build on an idea (either you come with an idea or you define it with your team). If you enjoy a fast-paced, diverse, demanding, and incredibly engaging environment, join us!
  - 10-week full-time immersive journey: kicking off March 2025 in Amsterdam, Berlin or Munich.
  - Co-founder discovery: in the first 10 weeks, collaborate with fellow founders to find your co-founders.
  - Pitch to invest: present your team and business idea to the Antler Investment Committee: a €300k investment commitment at Day Zero in two tranches, €100k for 10% without dilution protection and €200k converting at the valuation of your next round.
  - Accelerate growth: if funded, we work closely with you to make your startup a success!
  About you: For our upcoming residency program, we are scouting technical profiles across different areas of the IT industry who want to build the next disruptive tech venture from scratch! Technical profiles joining our program play an instrumental role as co-founders, mainly being responsible for the tech/product side of the business, which requires that you:
  - identify as someone with deep expertise in data engineering, machine learning, or artificial intelligence;
  - have strong analytical skills with hands-on experience in conducting technological analyses and applying advanced analytics and machine learning methodologies;
  - execute data collection plans from both structured and unstructured sources that help with data exploration, hypothesis testing and statistical modelling;
  - design, develop, and implement models, proofs-of-concept and pre-product prototypes.
  Application process: Our streamlined application process takes only 2-3 weeks. Apply through our website, participate in an introductory interview, progress to an in-depth interview with our partners, and receive a potential offer.
  Meet our success stories:
  - Treyd: a buy-now-pay-later solution for supplier invoices.
  - Two: a comprehensive B2B payment suite simplifying sales for online and offline businesses.
  - Evyon: a developer of eco-friendly battery energy storage systems.
  - Glint Solar: software accelerating identification of optimal solar installation sites.
  - PerPlant: AI-powered sensor systems creating a gateway to sustainable agriculture.