Careers At Seargin


Senior Data Engineer

Seargin is a dynamic multinational tech company operating in 50 countries. At Seargin, we drive innovation and create projects that shape the future and greatly enhance quality of life. You will find our solutions in the space industry, in tools that support scientists developing cancer drugs, and in innovative technological systems for industrial clients worldwide. These are just some of the areas in which we operate.


Location: European Union

Your responsibilities

  • Snowflake Data Integration

    Integrate Snowflake into existing data infrastructure, ensuring seamless data ingestion, storage, and querying capabilities

  • Implementation

    Implement a platform for streamlined data operations, automating data pipelines, monitoring, and governance processes

  • DBT Data Modeling

    Utilize DBT (Data Build Tool) for advanced data modeling, transformation, and orchestration, ensuring data consistency and reliability

  • Architecting Scalable Data Solutions

    Design scalable and efficient data architectures to support development initiatives in the affiliate space, ensuring flexibility and performance

  • Data Pipeline Optimization

    Optimize data pipelines for efficiency, reliability, and performance, leveraging technologies like Apache Spark and Airflow

  • ETL Process Enhancement

    Enhance Extract, Transform, Load (ETL) processes for improved data quality, accuracy, and timeliness, mitigating data inconsistencies and errors

  • Data Governance Implementation

    Implement data governance frameworks and policies to ensure data integrity, security, and compliance with regulatory standards

  • Mentorship and Collaboration

    Mentor junior team members, collaborate with cross-functional teams, and provide technical leadership in data engineering initiatives to drive project success
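The pipeline and ETL responsibilities above can be sketched, in miniature, with Python's standard library. This is an illustrative example only: all table and field names are hypothetical, and SQLite stands in for a warehouse such as Snowflake.

```python
import sqlite3

def extract(rows):
    """Extract step: in practice this might read from an API, file, or queue."""
    return list(rows)

def transform(rows):
    """Transform step: normalise values and drop records failing a quality gate."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # basic data-quality check
            continue                   # mitigate inconsistencies early
        cleaned.append({"name": row["name"].strip().title(),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows, conn):
    """Load step: write into a relational store (SQLite stands in for Snowflake)."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:name, :amount)", rows)
    conn.commit()

def run_pipeline(raw, conn):
    load(transform(extract(raw)), conn)

conn = sqlite3.connect(":memory:")
raw = [{"name": "  alice ", "amount": "10.5"},
       {"name": "bob", "amount": None}]  # fails the quality gate
run_pipeline(raw, conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (1, 10.5)
```

In a production setting the same extract-transform-load stages would typically be wrapped in orchestrated, monitored tasks (for example, Airflow operators) rather than called directly.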

Requirements


  • Programming Language Proficiency (Python or R)

    Possess over 4 years of experience working with Python or R, focusing on developing data pipelines for efficient data processing and analysis

  • SQL Expertise

    Demonstrate over 4 years of proficiency in SQL, ensuring the ability to effectively query and manipulate data from relational databases

  • Data Pipeline Maintenance

    Maintain and troubleshoot production data pipelines, supported by Python proficiency equivalent to 4-5 years of experience and advanced programming capabilities

  • Storage and Data Variety Experience

    Utilize over 3 years of experience with various storage types and data varieties, including filesystems, relational databases, NoSQL databases, and diverse data formats

  • Data Architecture Concepts Proficiency

    Demonstrate over 3 years of experience in data architecture concepts, such as data modeling, metadata management, ETL/ELT, and real-time streaming, ensuring comprehensive knowledge to design and implement robust data solutions

  • Cloud Technologies for Data Pipelines

    Possess over 3 years of experience with cloud technologies for data pipelines, including Airflow, AWS Glue, Google Dataflow, and comparable managed services for handling data in the cloud

  • Java/Scala Experience

    Possess over 1 year of experience in Java and/or Scala, complementing programming language skills for diverse data engineering tasks

  • Data Serialization Languages Mastery

    Exhibit excellent knowledge of data serialization languages such as JSON, XML, and YAML, ensuring compatibility and interoperability in data processing workflows

  • Git, DevOps, and Unix Skills

    Possess excellent knowledge of Git, Gitflow, and DevOps tools like Docker, Jenkins, and Terraform, along with proficiency in Unix, ensuring efficient collaboration, version control, and deployment processes
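The data serialization requirement above can be illustrated with Python's standard library alone; the record layout below is a made-up example, not something specified by the role.

```python
import json
import xml.etree.ElementTree as ET

def record_to_xml(record, root_tag="record"):
    """Serialise a flat dict to an XML string."""
    root = ET.Element(root_tag)
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_record(xml_text):
    """Parse the XML back into a dict (all values come back as strings)."""
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

payload = json.loads('{"id": "42", "status": "active"}')  # JSON in
xml_text = record_to_xml(payload)                          # XML out
roundtrip = xml_to_record(xml_text)                        # and back
print(xml_text)  # <record><id>42</id><status>active</status></record>
```

YAML is not covered by the standard library (it needs a third-party package such as PyYAML), which is why this sketch sticks to JSON and XML.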

Apply & join the team

    Ready to take the next step in your career? Let’s start the conversation.

    Reach out to learn more and apply