
Software Engineer III - PySpark/Databricks

JP Morgan Chase

Full-time · Posted: Dec 4, 2025

Job Description

Location: Plano, TX, United States

Job Family: Software Engineering

About the Role

At JP Morgan Chase, we are a leading global financial services firm with a commitment to innovation and excellence in technology. We are seeking a talented Software Engineer III specializing in PySpark and Databricks to join our dynamic team in Plano, TX. In this role, you will play a pivotal part in building and maintaining advanced data platforms that power our banking, investment, and asset management operations. Leveraging your expertise in PySpark and Java, you will develop robust solutions to handle petabyte-scale financial datasets, ensuring seamless integration with our core systems for real-time decision-making and risk management.

Your primary focus will be on designing scalable data pipelines using PySpark within the Databricks environment, optimizing for performance in high-stakes financial applications such as fraud detection and portfolio analytics. You will collaborate with cross-functional teams, including quants, traders, and compliance experts, to translate complex business requirements into efficient engineering solutions. Optional experience with React will allow you to contribute to intuitive data visualization tools, enhancing how our stakeholders interact with critical insights.

You will work in a secure, regulated environment where data privacy and accuracy are paramount, adhering to stringent standards such as those set by the SEC and FDIC. This position offers an exciting opportunity to advance your career in a world-class organization known for its technological leadership in finance. You will gain exposure to cutting-edge tools and methodologies while contributing to initiatives that impact millions of customers worldwide. If you are passionate about big data technologies and eager to drive innovation in the financial sector, join us in Plano to help shape the future of banking at JP Morgan Chase.

Key Responsibilities

  • Design, develop, and optimize PySpark-based data pipelines for processing large volumes of financial transaction data
  • Collaborate with data scientists and analysts to build scalable ETL processes using Databricks
  • Integrate Java components into Spark applications to enhance performance and functionality in trading systems
  • Implement data quality checks and monitoring to ensure accuracy in risk assessment models
  • Contribute to the development of real-time analytics dashboards, potentially incorporating React for frontend elements
  • Work closely with cross-functional teams to align engineering solutions with business objectives in wealth management
  • Troubleshoot and resolve performance issues in distributed computing environments
  • Participate in code reviews, agile ceremonies, and continuous integration/continuous deployment (CI/CD) practices
  • Ensure compliance with JP Morgan Chase's security standards and regulatory requirements for financial data handling
  • Mentor junior engineers and contribute to knowledge sharing within the technology team

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 5+ years of professional experience in software engineering with a focus on big data technologies
  • Strong proficiency in PySpark and Apache Spark for data processing and analytics
  • Experience developing and maintaining Java-based applications in enterprise environments
  • Demonstrated ability to work with large-scale data pipelines in a financial services context
  • Familiarity with cloud platforms such as AWS, Azure, or Databricks for distributed computing
  • Proven track record of delivering high-quality code in agile development teams

Preferred Qualifications

  • Experience with React.js for building user interfaces in data visualization tools
  • Knowledge of financial data modeling, risk analytics, or regulatory compliance in banking
  • Certification in Databricks or Spark, such as Databricks Certified Developer
  • Prior work in a large financial institution handling sensitive customer data
  • Master's degree in a quantitative field or equivalent advanced education

Required Skills

  • PySpark and Apache Spark for big data processing
  • Java programming for backend development
  • Databricks platform for collaborative data engineering
  • SQL and data querying in distributed systems
  • ETL pipeline design and optimization
  • Cloud computing (AWS, Azure, or similar)
  • Agile methodologies and Scrum practices
  • Version control with Git
  • Problem-solving and debugging complex systems
  • Data modeling for financial applications
  • React.js for UI development (preferred)
  • Regulatory compliance in finance (e.g., GDPR, SOX)
  • Collaboration and communication in team settings
  • Performance tuning for large-scale data environments
  • Unit testing and code quality assurance

Benefits

  • Competitive base salary and performance-based annual bonuses
  • Comprehensive health, dental, and vision insurance coverage
  • 401(k) retirement savings plan with company matching contributions
  • Generous paid time off, including vacation, sick days, and parental leave
  • Professional development opportunities, including tuition reimbursement and certifications
  • Employee stock purchase plan and financial wellness programs
  • On-site fitness centers, wellness programs, and mental health support
  • Flexible work arrangements, including hybrid options in Plano, TX

JP Morgan Chase is an equal opportunity employer.

Locations

  • Plano, TX, United States

Salary

Estimated Salary Range (high confidence)

180,000 - 250,000 USD per year

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.

Tags & Categories

Software Engineering, Financial Services, Banking, JP Morgan

