
DHL Recruitment Drive: Hiring Data Engineer – Apply Now


DHL invites applications for the post of Data Engineer. Candidates with a Bachelor’s degree in Engineering/Technology, a minimum of 2-3 years of experience in a Data Engineer role, and experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement are eligible to apply. Candidates are advised to apply soon, before the link expires.

Name of the Organization: DHL

Requisition ID: APIN03230

Positions: Data Engineer

Location: Across India

Salary: As per company norms

Required Qualifications:

  • Bachelor’s degree in Engineering/Technology;
  • Minimum 2-3 years of experience in the Data Engineer role;
  • Expertise using relational database systems such as Oracle, MS/Azure SQL, MySQL, etc.;
  • Expert SQL knowledge; experience with the Snowflake SaaS data warehouse or alternative solutions is a plus;
  • Practical experience developing and/or supporting CDC data pipelines – we use Qlik Replicate but any other technology is welcome!
  • Excellent problem-solving, communication, and collaboration skills to provide effective support and assistance in data engineering projects;
  • Experience with development and/or support of Lakehouse architectures – we use Parquet / Delta, Synapse Serverless and Databricks/Databricks SQL;
  • Proficiency in Python programming and experience with PySpark libraries and APIs;
  • Very good understanding of Software Development Lifecycle, source code management, code reviews, etc.;
  • Experience in managing the incident lifecycle from ticket creation to closure (we use ServiceNow and JIRA);
  • Experience in performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement;
  • Experience in building processes supporting data transformation, data structures, metadata, dependency and workload management;
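The SQL and root-cause-analysis requirements above describe query-driven investigation of business data. As a minimal, hypothetical sketch (using Python's built-in sqlite3 in place of Oracle, Azure SQL, or Snowflake, with made-up shipment data), the kind of analysis involved might look like:

```python
import sqlite3

# Hypothetical shipment data; in the role this would live in Oracle,
# Azure SQL, or Snowflake rather than an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE shipments (id INTEGER, status TEXT, depot TEXT);
    INSERT INTO shipments VALUES
        (1, 'delivered', 'DEL-A'),
        (2, 'failed',    'DEL-B'),
        (3, 'failed',    'DEL-B'),
        (4, 'delivered', 'DEL-A');
""")

# Root-cause style question: which depot accounts for the failures?
rows = conn.execute("""
    SELECT depot, COUNT(*) AS failures
    FROM shipments
    WHERE status = 'failed'
    GROUP BY depot
    ORDER BY failures DESC
""").fetchall()

print(rows)  # [('DEL-B', 2)]
```

The same aggregate-and-rank pattern scales from a toy table to production warehouse queries.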

Skills Required:

  • Experience with Data Lake/Big Data Projects implementation in Cloud (preferably MS Azure) and/or On-premise platforms:
    • Cloud – Azure technology stack: ADLS Gen2, Databricks (proven experience is a big plus!), EventHub, Stream Analytics, Synapse Analytics, AKS, Key Vault;
    • On Premise: Spark, HDFS, Hive, Hadoop distributions (Cloudera or MapR), Kafka, Airflow (or any other scheduler)
  • Ability to develop, maintain and distribute the code in modularized fashion;
  • Working experience with DevOps framework;
  • Ability to collaborate across different teams/geographies/stakeholders/levels of seniority;
  • Energetic, enthusiastic and results-oriented personality;
  • Customer focus with an eye on continuous improvement;
  • Motivation and ability to perform as a consultant in data engineering projects;
  • Ability to work independently but also within a Team - you must be a team player!
  • Strong will to overcome the complexities involved in developing and supporting data pipelines;
  • Agile mindset.

Language Requirements:

  • English – fluent spoken and written (C1 level)
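The "modularized code" and PySpark bullets above point at writing transformations as small, testable units. A hedged pure-Python sketch (function and field names are hypothetical; in practice the input would be a PySpark DataFrame rather than a list of dicts):

```python
def add_delivery_flag(records):
    """Pure transformation: tag each record with an on-time flag.

    Written as a standalone function so it can be unit-tested and
    reused on its own -- the same modular style used for PySpark
    DataFrame transformations.
    """
    return [
        {**r, "on_time": r["transit_days"] <= r["sla_days"]}
        for r in records
    ]

sample = [
    {"id": 1, "transit_days": 2, "sla_days": 3},
    {"id": 2, "transit_days": 5, "sla_days": 3},
]
flagged = add_delivery_flag(sample)
print(flagged)  # first record gets on_time=True, second on_time=False
```

Keeping the logic side-effect free makes it easy to distribute and review, which is what the SDLC and code-review bullets ask for.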

Responsibilities:

  • Designing, developing and maintaining near-real time ingestion pipelines through Qlik Replicate (or alternative technology) replicating data from transactional Databases to our Data Eco-system powered by Azure Data Lake and Snowflake.
  • Monitoring and supporting batch data pipelines from transactional Databases to our Data Eco-system powered by Azure Data Lake and Snowflake.
  • Setting up new and monitoring of existing metrics, analyzing data, performing root cause analysis and proposing issue resolutions.
  • Managing the lifecycle of all incidents, ensuring that normal service operation is restored as quickly as possible and minimizing the impact on business operations.
  • Documenting data pipelines, data models, and data integration processes to facilitate knowledge sharing and maintain data lineage.
  • Cooperating with other Data Platform & Operations team members and our stakeholders to identify and implement system and process improvements.
  • Leveraging DevOps framework and CI/CD.
  • Supporting and promoting Agile way of working using SCRUM framework.
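The monitoring responsibilities above come down to automated checks on pipeline health. A hypothetical sketch of one such check, batch-load freshness (the threshold and names are illustrative, not DHL's actual tooling):

```python
from datetime import datetime, timedelta, timezone

def is_stale(last_loaded_at, max_age_hours=24, now=None):
    """Flag a batch pipeline whose last successful load is too old."""
    now = now or datetime.now(timezone.utc)
    return now - last_loaded_at > timedelta(hours=max_age_hours)

# Illustrative timestamps: one load 6 hours old, one 30 hours old.
now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
fresh = datetime(2024, 1, 2, 6, 0, tzinfo=timezone.utc)
stale = datetime(2024, 1, 1, 6, 0, tzinfo=timezone.utc)

print(is_stale(fresh, now=now))  # False
print(is_stale(stale, now=now))  # True
```

A stale result would feed the incident lifecycle described above: open a ticket, run root cause analysis, and close once normal service is restored.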


Apply Link – Click Here

For Regular Updates Join our Telegram – Click Here
