
Azure Data Engineer

Cognizant

Software and Technology Jobs

Full-time | Posted: Dec 7, 2025

Job Description

Skill: Azure Databricks

Experience: 4 to 9 years

Location: AIA Noida

1. Job Title: Developer

2. Job Summary: We are seeking an experienced Developer with 7 to 10 years of expertise in Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. The role follows a hybrid work model with a focus on developing and optimizing data solutions. The ideal candidate will contribute to our data-driven initiatives, enhancing our capabilities and driving impactful outcomes.

3. Experience: 7 to 10 years

4. Required Skills:
  • Technical Skills: Databricks SQL, Databricks Delta Lake, Databricks Workflows, PySpark
  • Domain Skills:

5. Nice-to-have Skills:
  • Domain Skills:

6. Technology: Cloud Modernization/Migration

7. Shift: Day

8. Responsibilities:
  • Develop and optimize data solutions using Databricks SQL and Delta Lake to support business objectives and enhance data accessibility.
  • Collaborate with cross-functional teams to design and implement scalable data workflows using Databricks Workflows.
  • Utilize PySpark to process and analyze large datasets, ensuring data integrity and accuracy.
  • Provide technical expertise in Databricks to troubleshoot and resolve data-related issues efficiently.
  • Oversee the integration of data from various sources into the Databricks environment to ensure seamless data flow.
  • Implement best practices for data governance and security within the Databricks platform.
  • Contribute to the continuous improvement of data processes by identifying areas for automation and optimization.
  • Ensure timely delivery of data solutions by managing project timelines and coordinating with stakeholders.
  • Monitor and maintain the performance of data pipelines to ensure high availability and reliability.
  • Document technical specifications and processes to facilitate knowledge sharing and team collaboration.
  • Participate in code reviews and provide constructive feedback to peers to maintain code quality.
  • Stay updated with the latest advancements in Databricks and related technologies to drive innovation.
  • Support data-driven decision-making by providing insights and recommendations based on data analysis.

Qualifications:
  • Possess a strong understanding of Databricks SQL and Delta Lake for efficient data management.
  • Demonstrate proficiency in Databricks Workflows for designing robust data pipelines.
  • Exhibit expertise in PySpark for processing and analyzing large-scale datasets.
  • Have a proven track record of implementing data solutions in a hybrid work model.
  • Show experience in collaborating with cross-functional teams to achieve project goals.
  • Display knowledge of data governance and security best practices.
  • Be adept at troubleshooting and resolving technical issues in a timely manner.

9. Job Location:
  • Primary Location: INKABLRA03 (ITIND Manyata (MBP) Bld F3 SEZ)
  • Alternate Location: NA

10. Job Type: Associate - Projects [65PM00]

11. Demand Requires Travel?: No

12. Certifications Required: NA

The Cognizant community:
We are a high-caliber team that appreciates and supports one another. Our people maintain an energetic, collaborative, and inclusive workplace where everyone can thrive.

  • Cognizant is a global community with more than 300,000 associates around the world.
  • We don’t just dream of a better way – we make it happen.
  • We take care of our people, clients, company, communities and climate by doing what’s right.
  • We foster an innovative environment where you can build the career path that’s right for you.

About us:
Cognizant is one of the world's leading professional services companies, transforming clients' business, operating, and technology models for the digital era. Our unique industry-based, consultative approach helps clients envision, build, and run more innovative and efficient businesses. Headquartered in the U.S., Cognizant (a member of the NASDAQ-100 and one of Forbes World’s Best Employers 2025) is consistently listed among the most admired companies in the world. Learn how Cognizant helps clients lead with digital at www.cognizant.com.

Cognizant is an equal opportunity employer. Your application and candidacy will not be considered based on race, color, sex, religion, creed, sexual orientation, gender identity, national origin, disability, genetic information, pregnancy, veteran status or any other characteristic protected by federal, state or local laws.

If you have a disability that requires reasonable accommodation to search for a job opening or submit an application, please email CareersNA2@cognizant.com with your request and contact information.

Disclaimer:
Compensation information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

Applicants may be required to attend interviews in person or by video conference. In addition, candidates may be required to present their current state- or government-issued ID during each interview.


Locations

  • India

Salary

Estimated Salary Range (medium confidence)

800,000 – 1,500,000 INR per year

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.




Tags & Categories

Technology | IT Services | Consulting

