
Software Engineer III - Data Engineer, Spark, Databricks, SQL

JP Morgan Chase

Full-time · Posted: Dec 8, 2025

Job Description

Location: Plano, TX, United States

Job Family: Software Engineering

About the Role

At JP Morgan Chase, we are at the forefront of financial innovation, powering the world's leading financial services firm with cutting-edge technology. As a Software Engineer III - Data Engineer specializing in Spark, Databricks, and SQL, you will join our dynamic team in Plano, TX, to design, develop, and troubleshoot innovative solutions for our cloud data platforms. In this role, you will play a pivotal part in transforming vast amounts of financial data into actionable insights that drive business decisions, manage risks, and ensure regulatory compliance across our global operations. Our commitment to diversity, inclusion, and ethical technology practices makes this an exciting opportunity to contribute to a firm that serves millions of customers and institutions worldwide.

Your primary focus will be on building scalable data pipelines that handle high-velocity transaction data from trading platforms, customer interactions, and market analytics. Leveraging Spark for distributed processing and Databricks for collaborative environments, you will optimize SQL-based queries to support real-time reporting and advanced analytics essential to JP Morgan Chase's investment banking, asset management, and consumer banking divisions. You will collaborate with data scientists, analysts, and stakeholders to integrate these solutions into our hybrid cloud infrastructure, ensuring robustness against the demands of a 24/7 financial ecosystem. Troubleshooting complex issues, such as data latency or integration failures, will be key to maintaining the reliability of systems that underpin billions in daily transactions.

We value creativity and drive in our engineers, offering a supportive environment where you can innovate while adhering to stringent security and compliance standards such as those from the SEC and FINRA.
This position provides exposure to the latest in financial technology, with opportunities to work on projects that enhance fraud detection, portfolio optimization, and personalized banking services. Join JP Morgan Chase to advance your career in a role that combines technical excellence with meaningful impact on the global economy.

Key Responsibilities

  • Design and develop scalable data pipelines using Spark and Databricks to support financial analytics and reporting
  • Collaborate with cross-functional teams to integrate data solutions into JP Morgan Chase's cloud platforms
  • Optimize SQL queries and data workflows for performance in high-volume transaction environments
  • Troubleshoot and resolve data quality issues, ensuring accuracy for risk management and compliance
  • Implement data governance practices aligned with financial regulations like GDPR and SOX
  • Build and maintain ETL processes to handle diverse data sources from trading systems and customer databases
  • Contribute to innovation in cloud data architectures, leveraging JP Morgan's hybrid cloud strategy
  • Mentor junior engineers and participate in code reviews to uphold engineering best practices
  • Monitor and scale data platforms to support real-time financial decision-making
  • Document technical solutions and ensure seamless knowledge transfer within the team
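For candidates gauging fit, the pipeline responsibilities above reduce to a familiar ETL pattern: extract transaction records, transform and aggregate them with SQL, load the results, and tune the hot queries. The following toy sketch illustrates that pattern in plain Python with SQLite (not Spark or Databricks; the table, columns, and values are invented for illustration only):

```python
import sqlite3

# Toy ETL: extract raw transaction rows, transform (filter + aggregate), load results.
# Schema and values are invented; a real pipeline at this scale would use Spark
# DataFrames over cloud storage rather than an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("A-1", 120.0, "settled"), ("A-1", -30.0, "settled"), ("B-2", 55.5, "pending")],
)

# Transform + load: per-account net of settled transactions only.
conn.execute(
    """CREATE TABLE account_totals AS
       SELECT account, SUM(amount) AS net
       FROM transactions
       WHERE status = 'settled'
       GROUP BY account"""
)

# Optimization step: an index on the filter column lets repeated runs of the
# hot query avoid a full table scan.
conn.execute("CREATE INDEX idx_tx_status ON transactions(status)")

totals = dict(conn.execute("SELECT account, net FROM account_totals"))
print(totals)  # {'A-1': 90.0}
```

Only the settled rows for account A-1 survive the filter (120.0 - 30.0 = 90.0); the pending B-2 row is excluded. The same filter/aggregate/index reasoning carries over to Spark SQL on much larger datasets.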

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 5+ years of experience in software engineering with a focus on data engineering
  • Proficiency in Spark, Databricks, and SQL for large-scale data processing
  • Experience with cloud platforms such as AWS, Azure, or GCP
  • Strong understanding of data modeling, ETL processes, and data pipelines
  • Ability to troubleshoot complex data issues in a high-stakes financial environment
  • Demonstrated experience working in agile teams with version control systems like Git

Preferred Qualifications

  • Master's degree in a relevant technical field
  • Experience in the financial services industry, particularly with regulatory compliance
  • Knowledge of Python or Scala for data engineering tasks
  • Familiarity with machine learning frameworks integrated with Databricks
  • Certifications in cloud data platforms (e.g., Databricks Certified Data Engineer)

Required Skills

  • Expertise in Apache Spark for distributed data processing
  • Proficiency in Databricks for collaborative data engineering
  • Advanced SQL skills for querying large financial datasets
  • Experience with ETL tools and data orchestration (e.g., Airflow)
  • Knowledge of cloud computing services (AWS S3, Azure Data Lake)
  • Programming in Python, Scala, or Java for data manipulation
  • Understanding of data security and compliance in finance
  • Problem-solving and analytical thinking for complex data challenges
  • Collaboration and communication skills in agile environments
  • Version control with Git and CI/CD pipelines
  • Familiarity with big data technologies (Hadoop, Kafka)
  • Attention to detail for ensuring data integrity in transactions
  • Adaptability to fast-paced financial industry demands
  • Experience with data visualization tools (e.g., Tableau integration)

Benefits

  • Competitive base salary and performance-based annual bonuses
  • Comprehensive health, dental, and vision insurance plans
  • 401(k) retirement savings plan with company matching contributions
  • Generous paid time off, including vacation, sick days, and parental leave
  • Professional development opportunities, including tuition reimbursement and certifications
  • Employee stock purchase plan and financial wellness programs
  • On-site fitness centers and wellness initiatives at JP Morgan Chase locations
  • Flexible work arrangements, including hybrid remote options in Plano, TX

JP Morgan Chase is an equal opportunity employer.

Locations

  • Plano, TX, United States

Salary

Estimated Salary Range (high confidence)

180,000 - 250,000 USD / year

Source: AI estimate

* This is an estimated range based on market data and may vary based on experience and qualifications.


Tags & Categories

Software Engineering · Financial Services · Banking · JP Morgan

