
Data Engineer III - AWS / Databricks

JP Morgan Chase

Software and Technology Jobs

Full-time | Posted: Dec 2, 2025

Job Description

Location: Plano, TX, United States

Job Family: Data Engineering

About the Role

At JP Morgan Chase, we are at the forefront of financial innovation, leveraging cutting-edge technology to power global banking, investment, and asset management services. As a Data Engineer III - AWS / Databricks in our Plano, TX technology hub, you will play a pivotal role in developing, testing, and maintaining critical data pipelines and architectures that drive data-driven decisions across our enterprise. This position offers the opportunity to work on high-impact projects that handle petabytes of financial data, ensuring seamless integration and real-time processing for everything from fraud detection to portfolio optimization. You will collaborate with cross-functional teams in a dynamic environment that values innovation while adhering to stringent regulatory standards in the financial services industry.

Your core responsibilities will include architecting scalable solutions using AWS services such as S3, Glue, and EMR, alongside Databricks for advanced Spark-based processing. You will build robust ETL pipelines that ingest diverse data sources, including transaction logs, market data, and customer interactions, and transform them into actionable insights for JP Morgan's business lines. Emphasis will be placed on data governance, quality controls, and compliance with financial regulations such as Basel III and data privacy laws. This role demands a proactive approach to troubleshooting production issues and optimizing performance to support 24/7 global operations.

Joining JP Morgan Chase means becoming part of a world-class team committed to excellence and inclusion. We offer unparalleled opportunities for growth, with access to mentorship from senior engineers and involvement in strategic initiatives that shape the future of finance. In Plano, TX, you'll benefit from a collaborative culture, state-of-the-art facilities, and a focus on work-life balance, all while contributing to the stability and innovation of one of the world's leading financial institutions.

Key Responsibilities

  • Design, develop, and optimize scalable data pipelines using AWS and Databricks to support financial analytics and reporting
  • Collaborate with data scientists and analysts to ingest, transform, and store large volumes of financial data securely
  • Implement data quality checks and monitoring to ensure accuracy and compliance with regulatory standards like GDPR and SOX
  • Troubleshoot and resolve issues in production data environments, minimizing downtime for critical banking operations
  • Build and maintain data architectures that integrate multiple sources, including real-time transaction data and market feeds
  • Contribute to the evolution of JP Morgan's data platform by adopting best practices in cloud-native technologies
  • Document data pipelines and processes to facilitate knowledge sharing across global teams
  • Participate in code reviews and ensure adherence to security protocols in handling sensitive financial information
  • Support ad-hoc data requests for business units, such as risk management or investment advisory

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
  • 5+ years of experience in data engineering, with a focus on building and maintaining data pipelines
  • Proficiency in AWS services including S3, Glue, Lambda, and EMR
  • Hands-on experience with Databricks, Spark, and Delta Lake for large-scale data processing
  • Strong programming skills in Python, Scala, or Java
  • Experience with ETL processes and data modeling in a financial services environment
  • Ability to work in a fast-paced, regulated industry with strict compliance requirements

Preferred Qualifications

  • Experience with financial data systems such as transaction processing or risk modeling
  • Knowledge of machine learning pipelines and integration with Databricks MLflow
  • Familiarity with CI/CD tools like Jenkins or GitHub Actions in cloud environments
  • Previous work in agile teams within large financial institutions
  • Certifications such as AWS Certified Data Analytics or Databricks Certified Data Engineer

Required Skills

  • AWS cloud services (S3, Glue, EMR, Lambda)
  • Databricks platform and Apache Spark
  • Python or Scala programming
  • ETL/ELT pipeline development
  • Data modeling and SQL
  • Big data processing with Delta Lake
  • Version control with Git
  • CI/CD automation tools
  • Problem-solving and debugging
  • Agile methodologies
  • Financial data compliance and security
  • Collaboration and communication
  • Attention to detail in regulated environments
  • Time management in fast-paced settings
  • Analytical thinking for data optimization

Benefits

  • Competitive base salary and performance-based annual bonuses
  • Comprehensive health, dental, and vision insurance plans
  • 401(k) retirement savings plan with company matching contributions
  • Generous paid time off, including vacation, sick days, and parental leave
  • Professional development opportunities, including tuition reimbursement and access to internal training programs
  • Employee stock purchase plan and financial wellness resources
  • On-site fitness centers and wellness programs at JP Morgan locations
  • Flexible work arrangements, including hybrid options in Plano, TX

JP Morgan Chase is an equal opportunity employer.

Locations

  • Plano, TX, United States

Salary

Estimated Salary Range (high confidence)

180,000 - 250,000 USD / year

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.


Tags & Categories

Data Engineering · Financial Services · Banking · JP Morgan

