Responsibilities
- Develop Robust Data Pipelines
- Apply data engineering practices and standards to build scalable, maintainable, and efficient data pipelines.
- Optimize Raw Data Ingestion
- Analyze and structure raw data ingestion pipelines for improved efficiency and organization.
- Assess Business Needs
- Evaluate business requirements and objectives to align data solutions with strategic goals.
- Support Data-Driven Decision Making
- Collaborate with senior business stakeholders to define new data product use cases and assess their business value.
- Own and Maintain Data Products
- Take full ownership of data product pipelines, ensuring their continuous improvement and reliability.
- Ensure Data Quality and Reliability
- Act as the “Quality Gatekeeper” by implementing measures to enhance data accuracy, consistency, and trustworthiness.
- Apply Industry Best Practices
- Leverage insights from the Data One community and other industry standards to refine data engineering workflows.
- Continuously Improve Processes
- Identify opportunities for process optimization and efficiency improvements, providing concrete proposals for implementation.
- Lead Cross-Team Collaboration
- Proactively engage with other teams to align on data strategies, ensuring smooth integration and workflow continuity.
- Embrace Agile Responsibilities
- Take on additional responsibilities as needed within the Agile Team, demonstrating flexibility and adaptability.
- Implement Scalable Data Solutions
- Design and implement scalable data solutions that can support future growth and evolving business needs.
- Monitor and Troubleshoot Pipelines
- Continuously monitor data pipelines, identifying and resolving issues to maintain seamless data flow.
Requirements
- Proven Experience in Data Engineering
- Demonstrated experience as a Data Engineer, working on complex data solutions.
- Expertise in Data Modeling
- Strong technical knowledge of data modeling techniques, especially the Data Vault methodology.
- Mastery of ETL Tools
- Advanced expertise with ETL tools such as Talend, Alteryx, or similar solutions.
- Strong SQL Skills
- Advanced SQL programming experience for efficient data querying and transformation.
- Agile Development Experience
- Previous experience working with Agile methodologies in software and data engineering projects.
- Hands-On with Data Transformation Tools
- Experience using dbt (data build tool) for transforming and structuring data effectively.
- Familiarity with DevOps & DataOps
- Experience working with CI/CD pipelines, GitLab, and DataOps.live to streamline data operations.
- Snowflake Expertise
- Practical experience with Snowflake for data warehousing and analytics.
- Lifecycle Management of Data Products
- Experience managing end-to-end lifecycle of data products, from ingestion to consumption.
- Knowledge of Data Mesh Concepts
- Understanding of Data Mesh architecture and FAIR principles (Findability, Accessibility, Interoperability, and Reusability).
- ETL & Database Development Experience
- At least 2-3 years of experience in ETL and database development.
- Strong Problem-Solving and Analytical Skills
- Ability to diagnose issues, optimize data workflows, and implement scalable solutions.
What we offer
- B2B Contract
- Employment based on a B2B contract.
- Stable and Dynamic International Firm
- Opportunity to work in a stable, dynamically developing international company.
- Engaging Projects and Latest IT
- Chance to participate in interesting projects and work with the latest information technologies.
- Competitive Rates
- Attractive remuneration rates.
- Renowned International Projects
- Involvement in the most prestigious international projects.
- Multisport and Private Medical Care
- Access to Multisport benefits and private healthcare services.
Nice to have
- Python Programming Skills
- Experience using Python for data processing, automation, and scripting.
- Snowflake Certification
- Official Snowflake certification demonstrating expertise in cloud-based data warehousing.
- Experience with Advanced Data Governance
- Familiarity with metadata management, data cataloging, and governance frameworks for improved data accessibility.

Work with us
Apply & join the team
Didn’t find a role that fits? Send your CV to [email protected]