Data Engineer (with ETL, Azure)

Luxoft | Sofia

This listing is published only on DEV.BG Jobs: we monitor the major Bulgarian job-listing sites (those with at least 400 IT job listings), and this listing does not appear on any of them.
Published: June 4
The listing is published in the following miniboards:
  • Sofia, Bulgaria

    Project Description

    As part of our strategic partnership with one of the biggest financial institutions in the world, we are hiring various IT specialists who will join its new IT Service Center in Sofia. The bank is an international organization that provides financing, advice, and research to developing nations to support their economic advancement. It works primarily to fight poverty by offering developmental assistance to middle- and low-income countries, and positions itself as a unique financial institution that builds partnerships to reduce poverty and support economic development.

    Team specific:
    This position focuses on data architecture with a perspective of data integration through Extract/Transform/Load. The incumbent will work with business owners of data, business analysts, and data custodians, and will be expected to take ownership of data Extract/Transform/Load artifacts and deploy the detailed design through enterprise-standard tools.

    The selected candidate will work closely with the BI team to provide technical leadership for development projects involving the computing environment, which may include coordinating software upgrades and installing new products. He/she will design and validate data solutions that are practical, flexible, scalable, reusable, and strategic. These efforts enhance data quality, enrich access, and provide business decision makers with information upon which they can make more accurate and effective decisions across multiple domains.


    • Develop ETL mappings, interacting with large data processing pipelines in distributed data stores, using cloud-based ETL tools
    • Determine database structural requirements by analyzing client operations, applications, and programming, while reviewing objectives with clients and evaluating current systems
    • Work with application DBAs and modelers to construct data stores
    • Define database physical structure and functional capabilities to accommodate data integration requirements, security, back-up, and recovery specifications
    • Ensure data is ready for use by consuming applications, analysts, and scientists, using frameworks and microservices to serve data
    • Collaborate with data architects, modelers, and IT team members on project goals
    • Ensure optimum performance techniques for data integration, coordinate deployment actions, and document those actions
    • Integrate new data management technologies and software engineering tools into existing structures
    • Maintain overall performance of components involved in data integration through ETL by identifying and resolving production and application development problems
    • Answer user questions
    • Provide maintenance and support for data integration and ETL components by coding utilities and resolving problems


    Must have

    • Educational Qualifications and Experience:
      • Education: Bachelor’s degree in Computer Science/Engineering
      • Role Specific Experience: 7+ years of hands-on experience working on ETL mappings, interacting with large data processing pipelines in distributed data stores, and distributed file systems using cloud-based ETL tools such as IICS and Azure Data Factory/SSIS.
      • Extensive experience coding complex SQL queries in one or more leading RDBMSs, e.g., Oracle, Azure Synapse, MS SQL Server, Postgres
    • Required technologies:
      • Tools: Informatica Cloud Services (IICS), Informatica PowerCenter 10.x, Azure Data Factory
      • Database: Postgres, Oracle 19c, Azure Synapse / SQL DW / SQL DB, SQL Server 2016/2014/2012
      • Cloud Technologies: Microsoft Azure; operating systems: Unix, Linux, Windows
    • Required Skills/Abilities:
      • Advanced knowledge of Data Integration concepts and standard approaches
      • Strong leadership and communication skills
      • Ability to work independently once guidance and goals are provided

    Nice to have

    • Experience in one or more of the following technologies:
      • Data dictionaries
      • Data warehousing
      • Enterprise application integration
      • Metadata registry
      • Master Data Management (MDM)
      • Relational Databases
      • NoSQL
      • Semantics
      • Data retention
      • Structured Query Language (SQL)
      • Procedural SQL
      • Unified Modeling Language (UML)
      • XML, including schema definitions (XSD and RELAX NG) and transformations
      • Additional consideration will be given to those individuals who possess the following specific competencies (in no specific order): Tibco Data Virtualization, SAP BW/Hana


    English: B2 Upper Intermediate



    Relocation package

    If needed, we can help you with the relocation process.