Data Engineering Analyst Job Openings in Hyderabad 2026
Optum has announced a job vacancy for the post of Data Engineering Analyst. The place of posting will be Hyderabad. Candidates who have completed a Graduate / Engineering / Post Graduate degree, whether Freshers or Experienced, are eligible to apply. More details about the qualifications, job description, and roles & responsibilities are given below.
Company Overview

| Name of the Company | Optum |
| Required Qualifications | Graduate / Post Graduate |
| Skills | Python, Scala, Spark, etc. |
| Category | Technology |
| Work Type | Onsite |
The Data Engineer (Grade 25) will support the design, development, and maintenance of scalable Big Data solutions on cloud platforms. This role is ideal for early-career data engineers with hands-on experience in Spark-based data processing, Azure cloud services, and data pipeline orchestration.

The individual will work closely with senior engineers, architects, and cross-functional teams to build reliable data pipelines, improve existing solutions, and ensure secure and efficient data operations. The role emphasizes solid fundamentals in data engineering, cloud technologies, and production support, while providing opportunities to grow into advanced Big Data and cloud-native architectures.
Job Details
- Position: Data Engineering Analyst
- Job Location: Hyderabad
- Salary: As per company standards
- Job Type: Full Time
- Requisition ID: 2348586
Roles and Responsibilities:
- Design, code, test, document, and maintain high-quality, scalable Big Data applications using PySpark and Scala Spark on Azure Cloud platforms
- Develop and manage data pipelines and schedule workflows using Apache Airflow, ensuring proper job dependencies and execution order
- Securely manage secrets and credentials using Azure Key Vault, following enterprise security best practices
- Analyze existing data pipelines and applications to identify gaps, risks, and opportunities for improvement
- Assist in analyzing data architecture and design frameworks, working with multiple databases and data warehouses
- Create prototypes and proof-of-concepts (POCs) and participate in design and code reviews
- Write and maintain technical documentation for data pipelines, workflows, and operational procedures
- Participate in production support activities, including monitoring, troubleshooting, and issue resolution
- Perform performance optimization of data pipelines and assist with application migration efforts across environments
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
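The responsibilities above mention scheduling workflows with Apache Airflow while "ensuring proper job dependencies and execution order". The core idea an Airflow DAG encodes is dependency-ordered execution, which can be sketched with Python's standard-library `graphlib` alone; the task names below are hypothetical and purely illustrative.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG declares execution order.
dependencies = {
    "extract_raw": set(),
    "validate_schema": {"extract_raw"},
    "transform_spark": {"validate_schema"},
    "load_warehouse": {"transform_spark"},
    "notify_team": {"load_warehouse"},
}

# static_order() yields each task only after all of its dependencies,
# which is the guarantee a scheduler like Airflow enforces at runtime.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

In a real deployment, the same dependency structure would be declared with Airflow operators (e.g. `extract >> validate >> transform`), and the scheduler, not application code, resolves the order.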
Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent, with more than 1 year of relevant work experience
- Hands-on experience with Python and Scala for data engineering and Big Data development
- Hands-on exposure to Azure Data Lake, Azure Databricks, Azure Data Factory, and Azure Key Vault
- Hands-on exposure to AI-assisted development tools such as Microsoft Copilot, with a basic understanding of prompt engineering to improve coding efficiency, data analysis productivity, and documentation quality
- Working experience with Apache Spark and good understanding of Hadoop ecosystem concepts
- Experience with job scheduling and orchestration tools, particularly Apache Airflow
- Experience with Snowflake and writing Shell scripts for automation and operational tasks
- Good experience working in cloud environments, preferably Microsoft Azure
- Solid experience in writing complex SQL and PL/SQL queries
- Exposure to CI/CD pipelines using tools such as Jenkins and GitHub Actions
- Basic understanding of software development best practices, version control, and collaborative development
- Proven analytical skills, attention to detail, and willingness to learn new technologies and frameworks
- Proven ability to work effectively in a team-oriented and fast-paced environment
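The qualifications above call for experience writing complex SQL queries. As a minimal illustration of the kind of analytical SQL this involves, the sketch below computes a per-group running total with a window function, using Python's built-in `sqlite3`; the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical pipeline-metrics table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pipeline_runs (pipeline TEXT, run_date TEXT, rows_loaded INTEGER);
INSERT INTO pipeline_runs VALUES
  ('sales',  '2025-01-01', 100),
  ('sales',  '2025-01-02', 150),
  ('claims', '2025-01-01', 80),
  ('claims', '2025-01-02', 60);
""")

# Window function: running total of rows loaded per pipeline,
# ordered by run date -- a common analytical SQL pattern.
rows = conn.execute("""
SELECT pipeline, run_date,
       SUM(rows_loaded) OVER (
           PARTITION BY pipeline ORDER BY run_date
       ) AS running_total
FROM pipeline_runs
ORDER BY pipeline, run_date
""").fetchall()

for r in rows:
    print(r)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to Spark SQL and Snowflake, both named in the posting.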
How to Apply
Apply Link – Click Here
For Regular Updates Join our WhatsApp – Click Here
For Regular Updates Join our Telegram – Click Here
Disclaimer:
The information provided on this page is intended solely for informational purposes for Students, Freshers & Experienced candidates. All recruitment details are sourced directly from the official website and pages of the respective company. Latest MNC Jobs does not guarantee job placement, and the recruitment process will follow the company's official rules and Human Resource guidelines. Latest MNC Jobs does not charge any fees for sharing job information. Latest MNC Jobs strongly advises Students, Freshers & Experienced candidates not to make any payments for any job opportunities.