A Data Engineer is needed to work remotely on an interesting project, as part of a big data team alongside other data engineers, data analysts, and data scientists.
Responsibilities:
Design, construct, install, test, and maintain highly scalable data management systems, ensuring proper integration within the Client’s cloud-based architecture.
Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders.
Build high-performance algorithms, prototypes, predictive models, and proof of concepts.
Research opportunities for data acquisition and new uses for existing data.
Develop data set processes for data modeling, mining, and production.
Integrate new data management technologies and software engineering tools into existing structures.
Collaborate with data architects, modelers, and IT team members on project goals.
Recommend ways to improve data reliability, efficiency, and quality.
Essential Skills and Qualifications:
Extensive experience with SQL and Python for complex data manipulation, transformation, and analysis.
Strong experience with AWS or GCP cloud services, particularly those related to data handling (e.g., AWS Glue, AWS Lambda, and Amazon S3).
Familiarity with data modeling, ETL development, and data warehousing techniques.
Strong analytical skills and problem-solving abilities.
Excellent verbal and written communication skills, capable of clearly articulating complex technical concepts to diverse audiences, including non-technical stakeholders, revenue managers, and senior leadership.
Creative problem-solving abilities and a proactive approach to identifying and addressing challenges.
Ability to work collaboratively in a fast-paced, team-oriented environment, as well as independently with minimal supervision.
Desirable Skills:
Proficiency in Databricks and Snowflake, with a solid understanding of the transition challenges between these platforms.
Experience with DBT (Data Build Tool) for managing data transformation workflows.
Knowledge of DevOps practices as they relate to data engineering, including CI/CD pipeline integration.
Background in handling large-scale data migration projects across different platforms.
Gamito is a licensed recruitment agency (license No. 1820/16.12.2014), and its services are free of charge for candidates.
Competitive benefits package, home office, and flexible working hours.
Guidance and onboarding will be provided.