Endava

Senior Data Engineer

The listing is published in the following categories:

  • Anywhere
    Tech Stack / Requirements

    Company Description

    Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change.

    By combining world-class engineering, industry expertise and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses.

    From prototype to real-world impact – be part of a global shift by doing work that matters.

    Job Description

    Our data team has expertise across engineering, analysis, architecture, modeling, machine learning, artificial intelligence, and data science. This discipline is responsible for transforming raw data into actionable insights, building robust data infrastructures, and enabling data-driven decision-making and innovation through advanced analytics and predictive modeling.

    Responsibilities:

    • Develop the architecture for the data pipelines and implement it using Python
    • Collaborate with the data architect and data analyst to create and maintain source-to-target mapping documentation
    • Identify, document, and develop measures to visualize and track source data and data pipeline quality objectives
    • Work on data migration projects between systems hosted in different environments (on-premises, AWS, Azure, etc.)
    • Contribute to agile ceremonies; size tasks and deliverables
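    As an illustration only, and not part of the listing itself, the pipeline work described above might look like the following minimal Python sketch using pandas with an in-memory SQLite database; every table and column name here is hypothetical:

    ```python
    # Minimal ETL sketch: extract rows from a source table, transform them
    # with pandas, and load the result into a target table. The schema
    # (raw_orders, orders, amount, currency) is invented for illustration.
    import sqlite3
    import pandas as pd

    def run_pipeline(conn: sqlite3.Connection) -> int:
        # Extract: read raw records from the source table.
        raw = pd.read_sql_query("SELECT id, amount, currency FROM raw_orders", conn)
        # Transform: drop invalid amounts and normalize currency codes.
        clean = raw[raw["amount"] > 0].copy()
        clean["currency"] = clean["currency"].str.upper()
        # Load: write the cleaned data to the target table.
        clean.to_sql("orders", conn, if_exists="replace", index=False)
        return len(clean)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 10.0, "usd"), (2, -5.0, "eur"), (3, 7.5, "bgn")],
    )
    loaded = run_pipeline(conn)
    print(loaded)  # 2 rows survive the amount filter
    ```

    In a real engagement the same extract-transform-load shape would be spread across a cloud warehouse (Databricks, Snowflake, etc.) rather than a single script.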

    Qualifications

    • 5+ years of experience designing and developing data pipelines and managing data infrastructure
    • Understanding of Extract Transform Load (ETL) processes
    • Experience with one or more RDBMS (SQL Server, Oracle, DB2)
    • Experience with cloud-based data tools (Databricks, Snowflake, Synapse, Redshift, BigQuery, etc.)
    • Strong proficiency in SQL and Python (pandas, PySpark)
    • Experience with Infrastructure-as-Code (IaC) tools is an advantage
    • Open to working in an agile environment as part of a scrum team
    • Great interpersonal and communication skills

    Additional Information

    Discover some of the global benefits that empower our people to become the best version of themselves:

    • Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
    • Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
    • Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platforms subscriptions, pass-it-on sessions, workshops, conferences;
    • Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
    • Health: Global internal wellbeing programme, access to wellbeing apps;
    • Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.