Senior Data Engineer
Company name: Seargin Sp. z o. o.
Company size: 500+ people
Wrocław/Warsaw/Katowice/Poznań, Poland
Job Specification
Seargin is looking for a Senior Data Engineer

  • Position: Senior Data Engineer
  • Technologies: Data Platforms, SQL, ETL
  • Location: Wrocław/Warsaw/Katowice/Poznań or remote
  • Country: Poland
  • Area: Project
  • Form of employment: Permanent
  • Experience level: Senior

The main tasks for the Senior Data Engineer will be:
  • Leading the delivery of
    • data models
    • data storage models
    • data migration to manage data inside the organization, for small to medium-sized ventures
  • Leading the production of high-quality data engineering deliverables, helping to ensure project timelines are met, and providing informal mentoring/training to junior members of the team
  • Providing technical expertise to maximize value from existing applications, solutions, infrastructure, and emerging technologies, and driving continuous improvement of internal processes
  • Leading data quality reviews, including data cleansing where needed, to guarantee integrity and quality
  • Resolving escalated design and implementation problems of moderate to high difficulty
  • Following appropriate processes, procedures, and standards for data engineering and data modeling
  • Examining the latest industry trends, such as cloud computing and distributed processing, and beginning to assess the risks and benefits of their use in the business
  • Developing working relationships with colleagues on other engineering teams and beginning to collaborate on leading-edge data engineering solutions

The Candidate should have:
  • Batch Processing – the ability to design an efficient way of processing large amounts of data in which a group of transactions is accumulated over a period of time
  • Data Integration (Sourcing, Storage, and Migration) – the ability to create and implement enterprise data management models, capabilities, and solutions (structured and unstructured, data archiving principles, data warehouses, data sourcing, etc.). This involves data models, storage requirements, and migration of data between systems
  • Data Quality, Profiling, and Cleansing – the ability to assess (profile) a data set to determine its quality with respect to a defined set of parameters and to identify data for which corrective action (cleansing) is required to improve it
  • Stream Systems – the ability to explore, integrate, and retrieve all available data from the machines that produce it, as fast as it’s produced, in any format, and at any quality
  • Bachelor’s degree in Computer Science, Cyber Security, Actuarial Science, Behavioral Economics, Data Analytics, Data Science, Econometrics, Engineering, IT, or a similar field
  • Background in designing and building Data Platforms integrating disparate data sources
  • Knowledge of distributed computing
  • Expertise in:
    • ETL and SQL
    • MPP solutions for dealing with massive amounts of data
    • Azure, Azure Databricks, Azure SQL, and Synapse
    • developing dataflows using NiFi/ADF and Databricks
  • Outstanding, practical experience in:
    • implementing large-scale data warehouses and data lakes
    • Spark architecture and implementation
  • Experience working with:
    • distributed message systems like Kafka
    • Python, PySpark, or R
  • Knowledge of:
    • working with structured and unstructured data sources
    • security measures like HTTPS and Kerberos
  • Proficiency in writing complex SQL and procedures
  • Familiarity with creating workflows using Oozie or similar tools
  • Team-oriented, detail-oriented, efficient, and solution-oriented attitude
  • Superb analytical and problem-solving skills
  • Excellent communication and interpersonal skills
  • Flexibility and ability to work independently and in a team
  • Great English skills (written and spoken)

It would be a plus if the Candidate had: 
  • Knowledge of:
    • Graph Databases, preferably Neo4j, Cypher, and Cosmos DB
    • Graph Data Modeling
    • Azure Data Lake Store and Databricks Delta Lake
    • Spark ML
  • Experience in developing Microservices

The Candidate can expect:
  • Permanent Contract
  • Challenging job in an international and multilingual environment
  • Professional development
  • Attractive and competitive compensation

If you meet the requirements described above, please send your application in English (.doc) at , stating the name of the position in the subject line, and/or call +(48) 735 246 583.