
Senior Data Engineer
Company name: Seargin Sp. z o. o.
Company size: 500+ people
Senior
Warsaw/Wroclaw/Poznan/Katowice/Remote, Poland
Job Specification


THE ELITE OF TOP IT EXPERTS

Seargin is looking for a Senior Data Engineer

  • Position: Senior Data Engineer
  • Technologies: Azure, ETL, SQL
  • Location: Warsaw/Wroclaw/Poznan/Katowice or remote
  • Country: Poland
  • Area: Project
  • Form of employment: B2B
  • Experience Level: Senior
The main tasks for the Senior Data Engineer will be:
  • Leading the delivery of high-quality data engineering deliverables, helping to ensure project timelines are met, and providing informal coaching / training to junior members of the team
  • Leading the delivery of data quality assessments, including data cleansing where necessary, to ensure reliability and quality
  • Leading the delivery of:
    • data models
    • data storage models
    • data migration to manage data within the organization, for a small to medium-sized project
  • Resolving escalated design and implementation issues of moderate to high complexity
  • Evaluating emerging technology trends such as:
    • cloud computing
    • distributed processing
    and beginning to assess the risks and benefits of their use in the business
  • Providing technical expertise to maximize the benefit from existing applications, solutions, infrastructure and emerging technologies, and seeking to continuously improve internal processes
  • Developing working relationships with peers across other engineering teams and starting to collaborate on leading data engineering solutions
  • Driving adherence to the appropriate data engineering and data modelling practices, methods and standards

The Candidate should have:
  • Ability to design an efficient way of processing large volumes of data where groups of transactions are accumulated over a period of time (see the sketch after this list)
  • Ability to design and implement:
    • models
    • capabilities
    • solutions to manage data within the business (structured and unstructured data, data archiving standards, data warehousing, data sourcing, etc.); this includes the data models, storage requirements and migration of data from one system to another
  • Ability to evaluate (profile) a data set to determine its quality against a defined set of criteria and to highlight data where remedial action (cleansing) is necessary
  • Ability to:
    • discover
    • integrate
    • ingest all available data from the systems that generate it, as fast as it is created, in any format and at any volume
  • B.S. in:
    • Actuarial Science
    • Behavioral Economics
    • Computer Science
    • Data Analytics
    • Data Science
    • Econometrics
    • Engineering
    • IT, Cyber Security
    • or a related field preferred
  • Experience designing and building data platforms that integrate disparate data sources
  • Knowledge of distributed computing
  • Proficiency in:
    • ETL
    • SQL
  • Experience working with MPP (massively parallel processing) solutions to handle large volumes of data
  • Proficiency in:
    • Azure
    • Azure Databricks
    • Azure SQL, Synapse
  • Experience developing data flows using NiFi/ADF and Databricks
  • Advanced, hands-on experience implementing large-scale data warehouses and data lakes
  • Advanced, hands-on experience in Spark development and implementation
  • Experience working with Distributed Message Systems like Kafka
  • Hands-on experience in:
    • Python
    • PySpark
    • R
  • Experience working with structured and unstructured data sources
  • Expertise in writing complex SQL queries and procedures
  • Experience designing ingestion workflows using Oozie or comparable tools
  • Understanding of security protocols such as HTTPS and Kerberos
  • Team-oriented, detail-oriented, efficient and solution-oriented attitude
  • Superb analytical and problem-solving skills
  • Excellent communication and interpersonal skills
  • Flexibility and ability to work independently and in a team
  • Great English skills (written and spoken)
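
For illustration only, below is a minimal PySpark sketch of the kind of batch processing and cleansing work described in the list above. It is not part of the role description: the storage path, table name and column names are hypothetical, and it assumes a Spark environment with Delta Lake available (e.g. Azure Databricks).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-transactions-batch").getOrCreate()

# Read a batch of raw transaction files accumulated over the period (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/transactions/2024-01-01/")

# Basic profiling/cleansing: drop exact duplicates, discard rows missing key fields,
# and cast amounts to a numeric type.
clean = (
    raw.dropDuplicates(["transaction_id"])
       .filter(F.col("transaction_id").isNotNull() & F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
)

# Append the cleansed batch to a curated Delta table (hypothetical name).
clean.write.format("delta").mode("append").saveAsTable("curated.transactions")
```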

It would be a plus if the Candidate had: 
  • Experience with graph databases, preferably:
    • Neo4J
    • Cypher
    • Cosmos DB
  • Graph Data Modelling
  • Azure Data Lake Store
    • Databricks Delta Lake
  • Spark ML
  • Experience developing microservices
The Candidate can expect:
  • B2B Contract
  • Challenging job in an international and multilingual environment
  • Professional development
  • Attractive and competitive compensation

If you meet the requirements described above, please send your application in English (.doc) to natalia.lewandowska@seargin.com, stating the name of the position in the subject line, and/or call +(48) 735 246 583.
