Databricks Optimization - Lead Software Engineer

JP Morgan Chase

Full-time | Posted: Dec 10, 2025

Job Description

Databricks Optimization - Lead Software Engineer

Location: Jersey City, NJ, United States

Job Family: Software Engineering

About the Role

At JP Morgan Chase, we are at the forefront of financial innovation, leveraging cutting-edge technology to power global banking, investment, and asset management services. As a Lead Software Engineer in Databricks Optimization, you will play a pivotal role in enhancing our data infrastructure to support mission-critical applications in risk management, trading analytics, and customer insights. Based in our Jersey City, NJ office, you will lead efforts to optimize Databricks workloads so they deliver high performance while remaining cost efficient in a highly regulated environment. This position offers the opportunity to work with diverse teams across the firm, contributing to solutions that handle petabyte-scale financial data securely and reliably.

Your primary focus will be architecting and refining data processing pipelines built on Databricks and Apache Spark, tailored to the unique demands of the financial services industry. You will identify inefficiencies in current workloads, implement advanced tuning techniques, and integrate with JP Morgan Chase's hybrid cloud ecosystem to drive operational excellence. Collaborating with data engineers, data scientists, and business stakeholders, you will ensure that optimizations align with strategic goals such as faster fraud detection and more accurate portfolio risk assessments. Compliance with industry regulations, including data privacy and audit requirements, will be integral to your work, safeguarding sensitive client information.

This role demands a blend of technical depth and leadership: you will mentor team members, foster innovation, and champion best practices in data engineering. At JP Morgan Chase, we value engineers who thrive in dynamic settings and are passionate about using technology to solve real-world financial challenges. Join us to advance your career in a supportive environment that emphasizes growth, collaboration, and impact on a global scale.

Key Responsibilities

  • Lead the optimization of Databricks workloads to enhance performance and reduce costs for financial applications
  • Design and implement scalable data pipelines using Spark and Databricks for high-volume transaction data
  • Collaborate with data scientists and analysts to tune queries and models for efficiency in risk assessment and fraud detection
  • Monitor and analyze resource utilization to ensure cost-effective operations within JP Morgan Chase's cloud infrastructure
  • Develop best practices for Databricks cluster management and auto-scaling in a secure financial environment (an illustrative cluster configuration sketch follows this list)
  • Mentor junior engineers on optimization techniques and contribute to code reviews
  • Integrate Databricks solutions with JP Morgan's enterprise systems, ensuring compliance with regulatory standards like GDPR and SOX
  • Troubleshoot and resolve performance bottlenecks in real-time trading and analytics workloads
  • Stay updated on emerging Databricks features and apply them to improve financial services workflows
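
As an illustration of the cluster-level tuning this role centers on, here is a minimal sketch of a Databricks cluster specification with autoscaling and auto-termination, written as a Python dictionary in the shape accepted by the Databricks Clusters API. The cluster name, runtime version, node type, worker counts, and Spark settings are placeholder assumptions for illustration only, not values prescribed by JP Morgan Chase.

    import json

    # A minimal, hypothetical cluster specification in the shape accepted by the
    # Databricks Clusters API (clusters/create). All values are illustrative
    # placeholders, not firm standards.
    cluster_spec = {
        "cluster_name": "risk-analytics-etl",               # hypothetical name
        "spark_version": "14.3.x-scala2.12",                # example Databricks runtime
        "node_type_id": "i3.xlarge",                        # example AWS node type
        "autoscale": {"min_workers": 2, "max_workers": 8},  # scale with load, cap the spend
        "autotermination_minutes": 30,                      # shut down idle clusters
        "spark_conf": {
            # Adaptive query execution lets Spark right-size shuffle partitions at runtime.
            "spark.sql.adaptive.enabled": "true",
        },
    }

    print(json.dumps(cluster_spec, indent=2))

Autoscaling bounds and auto-termination are usually the first levers for cost control; the actual runtimes, node types, and Spark settings would follow the firm's platform standards.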

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred
  • 7+ years of experience in software engineering with a focus on big data technologies
  • Proven expertise in Databricks, Spark, and cloud-based data platforms
  • Strong understanding of performance tuning and optimization in distributed computing environments
  • Experience working in the financial services industry, particularly with regulatory compliance and data security
  • Ability to collaborate with cross-functional teams in a fast-paced environment

Preferred Qualifications

  • Master's degree in Computer Science or Data Engineering
  • Experience with AWS, Azure, or GCP cloud services integrated with Databricks
  • Knowledge of financial modeling and risk management applications
  • Certifications in Databricks or Apache Spark
  • Prior leadership experience in agile development teams

Required Skills

  • Proficiency in Python, Scala, or Java for Spark development
  • Expertise in Databricks Lakehouse architecture and Delta Lake (a brief PySpark sketch follows this list)
  • Strong knowledge of Apache Spark SQL, DataFrames, and RDDs
  • Experience with performance profiling tools like Spark UI and Ganglia
  • Understanding of cloud cost management and optimization strategies
  • Familiarity with financial data standards such as FIX protocol and market data feeds
  • Skills in CI/CD pipelines using Jenkins or GitHub Actions
  • Knowledge of data security practices, including encryption and access controls
  • Analytical problem-solving for complex optimization challenges
  • Effective communication for stakeholder presentations
  • Agile methodologies and scrum framework experience
  • SQL optimization and database tuning
  • Machine learning integration with Databricks MLflow
  • Version control with Git
  • Team leadership and mentoring abilities
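
To make a few of these skills concrete, the sketch below shows a common PySpark optimization pattern: reading a Delta table, pruning by a partition column, and broadcasting a small dimension table to avoid a shuffle-heavy join. The table paths, column names, and join key are hypothetical placeholders, and reading Delta tables this way assumes a Databricks or otherwise Delta-enabled Spark environment.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("txn-optimization-sketch").getOrCreate()

    # Hypothetical paths and column names, used only for illustration.
    transactions = (
        spark.read.format("delta")
        .load("/mnt/finance/transactions")             # placeholder Delta table path
        .filter(F.col("trade_date") == "2025-12-10")   # prune to a single partition of data
    )

    counterparties = spark.read.format("delta").load("/mnt/finance/counterparties")

    # Broadcast the small dimension table so the join avoids a full shuffle.
    enriched = transactions.join(
        F.broadcast(counterparties), on="counterparty_id", how="left"
    )

    # Inspect the physical plan to confirm the broadcast join and the pruned scan.
    enriched.explain()

Query-side tuning like this is typically paired with Delta table maintenance (for example, OPTIMIZE with Z-ordering on Databricks), which is scheduled separately from the read path.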

Benefits

  • Competitive base salary and performance-based annual bonuses
  • Comprehensive health, dental, and vision insurance coverage
  • 401(k) retirement savings plan with company matching contributions
  • Generous paid time off, including vacation, sick days, and parental leave
  • Professional development opportunities, including tuition reimbursement and access to internal training programs
  • Employee stock purchase plan and financial wellness resources
  • On-site fitness centers and wellness programs at JP Morgan Chase locations
  • Flexible work arrangements, including hybrid options in Jersey City

JP Morgan Chase is an equal opportunity employer.

Locations

  • Jersey City, US

Salary

Estimated Salary Range (high confidence)

250,000 - 400,000 USD per year

Source: AI estimate

* This is an estimated range based on market data and may vary based on experience and qualifications.

Tags & Categories

Software Engineering, Financial Services, Banking, JP Morgan

