
One of 15 Big Data job ads in Sofia


Xogito Group | remote
This position allows remote work

Only in

This job ad is published only on DEV.BG Jobs: we monitor the significant Bulgarian job-listing sites (those with at least 400 IT job ads), and this ad does not appear on any of them.
11 June
The ad is published on the following miniboards:
  • Sofia, Bulgaria


    Purpose of the Role

    The Data Engineer will support the development and maintenance of a big data marketing solution that handles huge volumes of data across scalable data stores. Beyond strong programming skills, we are looking for a true back-end and data engineer with a solid understanding of databases and the infrastructure requirements of such an environment.

    Duties and Responsibilities

    • Build and maintain data pipelines using Python, handle huge amounts of data in different storage tools
    • Process massive amounts of data in short bursts and find solutions to time-sensitive problems that can arise
    • Troubleshoot data quality issues and perform the necessary fixes/backfills
    • Work with data scientists and BI analysts to understand data models and data ingestion requirements
    • Enhance the codebase to fix bugs, add and document features
    • Participate and collaborate in daily scrum meetings, sprint planning, reviews, and retrospective meetings
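    The first duty above centers on building Python data pipelines. As a purely hypothetical illustration (the record schema, field names, and functions below are invented, not from Xogito's actual codebase), a minimal batch pipeline stage might parse raw JSON-lines input, skip malformed records, and aggregate counts:

```python
import json

def parse_records(lines):
    """Yield parsed records, silently skipping malformed JSON lines."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            # A production pipeline would route these to a dead-letter store
            # instead of dropping them, to support later backfills.
            continue

def count_by(records, key):
    """Aggregate record counts grouped by the given field."""
    counts = {}
    for rec in records:
        value = rec.get(key, "unknown")
        counts[value] = counts.get(value, 0) + 1
    return counts

raw = ['{"country": "BG"}', 'not json', '{"country": "US"}', '{"country": "BG"}']
result = count_by(parse_records(raw), "country")
# result -> {"BG": 2, "US": 1}
```

    Using generators keeps memory flat regardless of input size, which matters at the data volumes described here.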

    Required Experience & Knowledge

    • Strong experience working with Python in big data environments
    • Experience handling hundreds of terabytes of data in different databases across multiple services
    • Good knowledge of Elasticsearch, including query experience, infrastructure knowledge, and time-based indices
    • Deep experience with AWS Cloud Services such as DynamoDB, S3, EC2, SQS, ECS, and CloudWatch, and experience choosing the proper tool for storage
    • Experience creating infrastructures that must handle big-data throughputs, scale, and be time and cost-efficient
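    Regarding the time-based-index requirement above: a common pattern for high-volume time-series data in Elasticsearch is to write to one index per day, so that a date-bounded query touches only the relevant indices and old data can be dropped index-by-index. A small sketch of the naming scheme (the "events" prefix and functions are illustrative assumptions, not part of this role's actual stack):

```python
from datetime import date, timedelta

def daily_index(prefix, day):
    """Name of the daily index holding documents for one calendar day."""
    return f"{prefix}-{day:%Y.%m.%d}"

def indices_for_range(prefix, start, end):
    """List the daily index names covering the inclusive range [start, end]."""
    days = (end - start).days
    return [daily_index(prefix, start + timedelta(days=i)) for i in range(days + 1)]

names = indices_for_range("events", date(2022, 6, 9), date(2022, 6, 11))
# names -> ["events-2022.06.09", "events-2022.06.10", "events-2022.06.11"]
```

    A query over a three-day window can then be issued against just those three indices instead of the whole dataset.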

    Skills and Attributes

    • Good written and verbal communication skills
    • A self-motivated, curious person with good time management skills
    • Ability to learn technical concepts and communicate with multiple functional groups
    • Ability to team up with professionals in a multicultural, virtual work environment
    • Strong troubleshooting skills and high attention to detail

    Required Education & Qualifications

    • Advanced level of both spoken and written English language
    • Bachelor’s or Master’s degree in Computer Science or relevant experience
    • Any relevant certificate would be considered a plus