
Data Engineer

The listing is published on the following miniboards:

  • Anywhere

    Company Description

    dxFeed is a leading provider of data services for the capital markets industry. The company sources and stores direct market data feeds from a variety of exchanges and market participants around the world. dxFeed has built one of the most comprehensive ticker plants, in addition to offering the broadest range of data services for streaming, consolidation, storage, extraction, and analytics. These include index construction and maintenance for buy-side and sell-side institutions of the global financial industry.

    We are looking for an experienced Data Engineer to help us make better products for our customers. You will be joining a team of professionals with extensive experience in creating financial software products.

    Job Description


    • Contribute to the development of our in-house cross-functional data storage platform
    • Establish a data infrastructure strategy to capture and harness new data assets for the new products and technology practices
    • Collaborate with decision makers, project managers, data scientists, and other stakeholders
    • Build data pipelines using Apache Airflow, Redshift, Spark, Kafka, and more
    • Connect new data sources
    • Maintain data infrastructure (resource management, upgrades)
    • Monitor data pipelines and improve their performance
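The pipeline responsibilities above name Apache Airflow, Redshift, Spark, and Kafka; as a dependency-free sketch of the extract-transform-load shape one such pipeline step takes (hypothetical quote data, SQLite standing in for the warehouse):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: in a real pipeline this would arrive from Kafka or S3.
RAW = "symbol,price\nAAPL,100.5\nMSFT,200.25\nBAD,notanumber\n"

def extract(raw: str):
    """Parse the raw CSV payload into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Cast prices to float, dropping rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append((row["symbol"], float(row["price"])))
        except (KeyError, ValueError):
            continue  # skip malformed rows
    return clean

def load(rows, conn):
    """Write the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS quotes (symbol TEXT, price REAL)")
    conn.executemany("INSERT INTO quotes VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0])  # → 2
```

In production the same three stages would typically become separate Airflow tasks, with the load step targeting Redshift rather than SQLite.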


    Qualifications

    • You are proficient in writing SQL on analytical (OLAP) databases (preferably Redshift or ClickHouse)
    • You are proficient in writing SQL on transactional (OLTP) databases (preferably PostgreSQL)
    • You are experienced in transforming data model requirements into Big Data solutions
    • You have experience operating cloud-based data warehouses/data lakes efficiently
    • You are experienced in task automation, preferably in Python
    • You have experience with deployment and support of cloud-based solutions, preferably AWS + Terraform
    • You are able to work in a self-organizing environment
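To illustrate the OLAP-flavoured SQL the first requirement asks for, here is a window-function query that picks each symbol's latest price — hypothetical trades table, run against SQLite here only because it also supports window functions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (symbol TEXT, ts INTEGER, price REAL);
INSERT INTO trades VALUES
  ('AAPL', 1, 100.0), ('AAPL', 2, 101.0),
  ('MSFT', 1, 200.0), ('MSFT', 2, 199.0);
""")

# Analytical query: rank each symbol's trades by time, keep the latest one.
rows = conn.execute("""
SELECT symbol, price FROM (
  SELECT symbol, price,
         ROW_NUMBER() OVER (PARTITION BY symbol ORDER BY ts DESC) AS rn
  FROM trades
) WHERE rn = 1
ORDER BY symbol
""").fetchall()
print(rows)  # → [('AAPL', 101.0), ('MSFT', 199.0)]
```

On Redshift or ClickHouse the same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern applies, typically over far larger, distributed tables.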

    Life in Devexperts

    We will only achieve our mission if we live our culture. We start with becoming learners in all things—having a growth mindset. Then we apply that mindset to learning about our customers, being diverse and inclusive, working together as one, and—ultimately—making a difference in the world.