Databricks Architect - Remote

Cognizant
full-time

Posted: December 7, 2025

Number of Vacancies: 1

Job Description

Job Title - Databricks Architect

Job Summary

We are seeking a Databricks Architect with expertise in building scalable data solutions on AWS.

  • Supply Chain, Consumer Goods / Retail domain knowledge required
  • Experience in modernizing (solution design and hands-on execution) enterprise data platforms from legacy systems to the AWS cloud
  • Deep knowledge and experience in data modelling (OLAP and OLTP), data lakes, data warehousing, and ETL/ELT pipelines, both on legacy on-premises systems and on AWS
  • Lead migration and modernization of legacy ETL processes from Informatica, DataStage, and SQL Server to cloud-native solutions
  • Design and optimize data workflows for ingestion, transformation, and analytics using AWS-native services
  • Design and build data pipelines and solutions using Databricks (PySpark and Spark SQL) on AWS
  • Experience building medallion-architecture-based data estates
  • Experience building Databricks Delta Lake-based lakehouses using DLT, PySpark jobs, and Databricks Workflows
  • Proficient in SQL, Python, PySpark, S3, and Lambda
  • Working knowledge of Git, CI/CD, and VS Code
  • Proficient in the AWS data ingestion stack
  • Knowledge of, and the ability to scale up on, Glue, Lambda, Step Functions, Spark Streaming, and other services as needed
  • Implementation experience with key data concepts such as CDC (Change Data Capture), streaming and/or batch ingestion, pull vs. push paradigms, and source-to-target mapping
  • Collaborate with cross-functional teams to gather requirements and deliver scalable, secure, and high-performance data solutions
  • Strong semantic-layer modelling and implementation experience
  • Establish best practices for data governance, lineage, and quality across hybrid environments
  • Provide technical leadership and mentoring to data engineers and developers
  • Monitor and troubleshoot performance issues across Databricks and AWS services
  • Understanding of key reporting tools such as Power BI, Tableau, Alteryx, and Excel BI add-ins is a plus
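One of the concepts above, CDC (Change Data Capture), can be illustrated with a minimal pure-Python sketch. On Databricks this is typically done with a Delta Lake MERGE against change events; the record shapes and field names below are hypothetical and stand in for real tables:

```python
# Minimal sketch of CDC merge semantics: apply insert/update/delete
# change events to a target table keyed by "id". Illustrative only;
# not the Databricks MERGE INTO implementation itself.

def apply_cdc(target, events):
    """Apply change events to target rows; returns the merged table."""
    table = {row["id"]: row for row in target}
    for ev in events:
        op, row = ev["op"], ev["row"]
        if op in ("insert", "update"):
            table[row["id"]] = row          # upsert: last write wins
        elif op == "delete":
            table.pop(row["id"], None)      # delete is idempotent
    return sorted(table.values(), key=lambda r: r["id"])

target = [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]
events = [
    {"op": "update", "row": {"id": 1, "qty": 12}},
    {"op": "delete", "row": {"id": 2}},
    {"op": "insert", "row": {"id": 3, "qty": 7}},
]
print(apply_cdc(target, events))  # [{'id': 1, 'qty': 12}, {'id': 3, 'qty': 7}]
```

The same upsert-or-delete decision is what a Delta Lake `MERGE INTO ... WHEN MATCHED` statement expresses declaratively over distributed data.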

Certifications Required

Databricks Certified Data Engineer Associate

**Applications will be accepted until 10/17/25**

Salary and Other Compensation:

The annual salary for this position is between $88,200 and $139,500, depending on the experience and other qualifications of the successful candidate.

Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan

Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

The Cognizant community:
We are a high-caliber team who appreciate and support one another. Our people uphold an energetic, collaborative, and inclusive workplace where everyone can thrive.

  • Cognizant is a global community with more than 300,000 associates around the world.
  • We don’t just dream of a better way – we make it happen.
  • We take care of our people, clients, company, communities and climate by doing what’s right.
  • We foster an innovative environment where you can build the career path that’s right for you.

About us:
Cognizant is one of the world's leading professional services companies, transforming clients' business, operating, and technology models for the digital era. Our unique industry-based, consultative approach helps clients envision, build, and run more innovative and efficient businesses. Headquartered in the U.S., Cognizant (a member of the NASDAQ-100 and one of Forbes World’s Best Employers 2025) is consistently listed among the most admired companies in the world. Learn how Cognizant helps clients lead with digital at www.cognizant.com.

Cognizant is an equal opportunity employer. Your application and candidacy will not be considered based on race, color, sex, religion, creed, sexual orientation, gender identity, national origin, disability, genetic information, pregnancy, veteran status or any other characteristic protected by federal, state or local laws.

If you have a disability that requires reasonable accommodation to search for a job opening or submit an application, please email CareersNA2@cognizant.com with your request and contact information.

Disclaimer:
Compensation information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

Applicants may be required to attend interviews in person or by video conference. In addition, candidates may be required to present their current state- or government-issued ID during each interview.

About the Role/Company

  • Cognizant is a global community with more than 300,000 associates around the world
  • We don’t just dream of a better way – we make it happen
  • We take care of our people, clients, company, communities and climate by doing what’s right
  • We foster an innovative environment where you can build the career path that’s right for you
  • Cognizant is one of the world's leading professional services companies, transforming clients' business, operating, and technology models for the digital era
  • Headquartered in the U.S., Cognizant is a member of the NASDAQ-100 and one of Forbes World’s Best Employers 2025
  • Cognizant is consistently listed among the most admired companies in the world
  • Cognizant is an equal opportunity employer

Key Responsibilities

  • Lead migration and modernization of legacy ETL processes from Informatica, DataStage, and SQL Server to cloud-native solutions
  • Design and optimize data workflows for ingestion, transformation, and analytics using AWS-native services
  • Design and build data pipelines and solutions using Databricks (PySpark and Spark SQL) on AWS
  • Collaborate with cross-functional teams to gather requirements and deliver scalable, secure, and high-performance data solutions
  • Establish best practices for data governance, lineage, and quality across hybrid environments
  • Provide technical leadership and mentoring to data engineers and developers
  • Monitor and troubleshoot performance issues across Databricks and AWS services

Required Qualifications

  • Supply Chain, Consumer Goods / Retail domain knowledge
  • Experience in modernizing enterprise data platforms from legacy systems to the AWS cloud
  • Deep knowledge and experience in data modelling (OLAP and OLTP), data lakes, data warehousing, and ETL/ELT pipelines, both on legacy on-premises systems and on AWS
  • Lead migration and modernization of legacy ETL processes from Informatica, DataStage, and SQL Server to cloud-native solutions
  • Design and optimize data workflows for ingestion, transformation, and analytics using AWS-native services
  • Design and build data pipelines and solutions using Databricks (PySpark and Spark SQL) on AWS
  • Experience building medallion-architecture-based data estates
  • Experience building Databricks Delta Lake-based lakehouses using DLT, PySpark jobs, and Databricks Workflows
  • Proficient in SQL, Python, PySpark, S3, and Lambda
  • Working knowledge of Git, CI/CD, and VS Code
  • Proficient in the AWS data ingestion stack
  • Knowledge of, and the ability to scale up on, Glue, Lambda, Step Functions, Spark Streaming, and other services as needed
  • Implementation experience with key data concepts such as CDC (Change Data Capture), streaming and/or batch ingestion, pull vs. push paradigms, and source-to-target mapping
  • Collaborate with cross-functional teams to gather requirements and deliver scalable, secure, and high-performance data solutions
  • Strong semantic-layer modelling and implementation experience
  • Establish best practices for data governance, lineage, and quality across hybrid environments
  • Provide technical leadership and mentoring to data engineers and developers
  • Monitor and troubleshoot performance issues across Databricks and AWS services
  • Databricks Certified Data Engineer Associate certification
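The medallion architecture named above layers data as bronze (raw), silver (cleaned), and gold (business-level aggregates). A hedged pure-Python sketch of that layering is below; on Databricks these would be PySpark/DLT transformations over Delta tables, and the field names (`sku`, `qty`) are illustrative assumptions:

```python
# Bronze -> silver -> gold layering, sketched with plain Python dicts.

def to_silver(bronze_rows):
    """Bronze -> silver: drop malformed records and normalize types."""
    silver = []
    for r in bronze_rows:
        if r.get("sku") and r.get("qty") is not None:
            silver.append({"sku": r["sku"].strip().upper(),
                           "qty": int(r["qty"])})
    return silver

def to_gold(silver_rows):
    """Silver -> gold: aggregate to a per-SKU business summary."""
    totals = {}
    for r in silver_rows:
        totals[r["sku"]] = totals.get(r["sku"], 0) + r["qty"]
    return totals

bronze = [{"sku": " ab1 ", "qty": "3"},
          {"sku": None, "qty": "9"},   # malformed: filtered out in silver
          {"sku": "AB1", "qty": "2"}]
print(to_gold(to_silver(bronze)))  # {'AB1': 5}
```

The value of the pattern is that each layer is independently queryable and replayable: silver can be rebuilt from bronze, and gold from silver, without re-ingesting source systems.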

Preferred Qualifications

  • Understanding of key reporting tools such as Power BI, Tableau, Alteryx, and Excel BI add-ins

Skills Required

  • SQL
  • Python
  • PySpark
  • S3
  • Lambda
  • Git
  • CI/CD
  • VS Code
  • AWS data ingestion stack
  • Glue
  • Step Functions
  • Spark Streaming

Benefits & Perks

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long-term/Short-term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan

Additional Requirements

  • Applications will be accepted until 10/17/25
  • Applicants may be required to attend interviews in person or by video conference
  • Candidates may be required to present their current state or government issued ID during each interview

Locations

  • India

Salary

88,200 - 139,500 USD / yearly

Skill Proficiency Levels

  • SQL: intermediate
  • Python: intermediate
  • PySpark: intermediate
  • S3: intermediate
  • Lambda: intermediate
  • Git: intermediate
  • CI/CD: intermediate
  • VS Code: intermediate
  • AWS data ingestion stack: intermediate
  • Glue: intermediate
  • Step Functions: intermediate
  • Spark Streaming: intermediate



Tags & Categories

Technology, IT Services, Consulting
