
Data Engineer II - Databricks and Python

JP Morgan Chase

Software and Technology Jobs

Full-time · Posted: Sep 4, 2025

Job Description

Location: Glasgow, Lanarkshire, United Kingdom

Job Family: Data Engineering

About the Role

Join J.P. Morgan Chase's innovative Digital Intelligence team in Glasgow, where we harness cutting-edge technology to transform data into actionable insights for the global financial services industry. As a Data Engineer II specializing in Databricks and Python, you will play a pivotal role in building robust data infrastructure that supports everything from risk assessment to customer analytics. Our team drives the firm's digital transformation by leveraging advanced tools to process vast amounts of financial data securely and efficiently, ensuring we remain at the forefront of banking innovation. This position offers the opportunity to work on high-impact projects that directly influence J.P. Morgan's operations across markets worldwide.

In this role, you will design and implement scalable data pipelines using Databricks and Python, focusing on ETL processes that handle sensitive transaction data, market feeds, and regulatory reports. You will collaborate closely with cross-functional teams, including data scientists, business analysts, and compliance experts, to deliver solutions that enhance decision-making in areas like fraud detection, portfolio management, and personalized banking services. Your work will involve optimizing Spark-based workflows for performance, integrating with cloud platforms, and ensuring all data practices align with stringent financial regulations such as GDPR and SEC requirements. This is a chance to contribute to mission-critical systems that power one of the world's leading financial institutions.

At J.P. Morgan Chase, we value innovation, integrity, and inclusion, providing a supportive environment where your technical expertise can thrive. You will benefit from ongoing professional development, exposure to global projects, and the resources of a top-tier firm committed to sustainable growth in the financial sector. If you are passionate about data engineering and eager to make a tangible impact in finance, this role in our Glasgow office offers an exciting path to advance your career while contributing to the future of banking.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks and Python to support financial analytics and reporting
  • Collaborate with data scientists and analysts in the Digital Intelligence team to ingest, process, and transform large datasets from diverse financial sources
  • Implement data quality checks and governance practices to ensure accuracy and compliance with regulatory requirements in the banking sector
  • Optimize Spark jobs and Databricks workflows for performance in handling high-volume transaction data
  • Integrate Databricks with enterprise systems for real-time data streaming and batch processing in support of risk modeling and fraud detection
  • Troubleshoot and resolve data engineering issues, ensuring minimal downtime for critical financial operations
  • Contribute to the development of reusable data frameworks that enhance efficiency across J.P. Morgan's global data platforms
  • Stay updated on emerging technologies and apply them to improve data infrastructure for the firm's digital transformation initiatives
  • Document data pipelines and processes to facilitate knowledge sharing within the team and ensure audit readiness
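For a sense of what the pipeline and data-quality work above involves, here is a deliberately simplified sketch in plain Python. It is illustrative only, not J.P. Morgan code: in the role described, the equivalent logic would typically run as PySpark transformations in a Databricks job, and all record fields and function names here are invented for the example.

```python
def extract(rows):
    """Simulate ingesting raw transaction records from a feed."""
    return list(rows)

def quality_check(record):
    """Reject records with a missing ID or a non-positive amount."""
    return bool(record.get("txn_id")) and record.get("amount", 0) > 0

def transform(records):
    """Keep only valid records and normalize the currency field."""
    return [
        {**r, "currency": r.get("currency", "GBP").upper()}
        for r in records
        if quality_check(r)
    ]

def load(records, sink):
    """Append transformed records to a destination (here, a list)."""
    sink.extend(records)
    return len(records)

raw = [
    {"txn_id": "T1", "amount": 120.50, "currency": "gbp"},
    {"txn_id": "",   "amount": 75.00},   # fails quality check: no ID
    {"txn_id": "T3", "amount": -10.0},   # fails quality check: amount
    {"txn_id": "T4", "amount": 9.99, "currency": "usd"},
]

warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                             # 2
print([r["txn_id"] for r in warehouse])   # ['T1', 'T4']
```

The structure mirrors the responsibilities listed: a quality gate rejects bad records before they reach the destination, and the transform step is a pure function that is easy to test and reuse across pipelines.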

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
  • 3-5 years of professional experience in data engineering or a similar role
  • Proficiency in Python programming with experience in data manipulation libraries such as Pandas and NumPy
  • Hands-on experience with Databricks platform, including Delta Lake and Spark for big data processing
  • Strong understanding of data pipelines, ETL processes, and cloud-based data architectures
  • Experience working with SQL and relational databases in a financial services context
  • Ability to handle sensitive financial data with strict adherence to compliance and security standards
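As a rough illustration of the SQL and relational-database experience listed above, the snippet below runs an aggregation of the shape that reporting pipelines execute constantly. It uses the standard-library sqlite3 module purely for self-containment; the table and column names are invented for this example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trades (
        trade_id INTEGER PRIMARY KEY,
        desk     TEXT NOT NULL,
        notional REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO trades (desk, notional) VALUES (?, ?)",
    [("rates", 1_000_000.0), ("fx", 250_000.0), ("rates", 500_000.0)],
)

# Aggregate notional per desk, largest first.
rows = conn.execute("""
    SELECT desk, SUM(notional) AS total
    FROM trades
    GROUP BY desk
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('rates', 1500000.0), ('fx', 250000.0)]
conn.close()
```

In practice the same query pattern would run against an enterprise database or a Databricks SQL warehouse rather than SQLite, but the GROUP BY / aggregate shape carries over directly.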

Preferred Qualifications

  • Master's degree in a quantitative field or relevant certifications (e.g., Databricks Certified Data Engineer)
  • Experience in the financial services industry, particularly with regulatory reporting or risk management data
  • Familiarity with AWS or Azure cloud services integrated with Databricks
  • Knowledge of machine learning frameworks like MLflow within Databricks
  • Prior experience at a large-scale financial institution like J.P. Morgan Chase

Required Skills

  • Python programming expertise
  • Databricks platform proficiency
  • Apache Spark for distributed data processing
  • SQL and database management
  • ETL pipeline development
  • Data modeling and warehousing
  • Cloud computing (AWS/Azure)
  • Version control with Git
  • Problem-solving and analytical thinking
  • Collaboration and communication skills
  • Attention to detail for data accuracy
  • Knowledge of financial regulations (e.g., GDPR, Basel III)
  • Agile methodologies experience
  • Machine learning basics
  • Scripting for automation

Benefits

  • Competitive base salary and performance-based annual bonuses
  • Comprehensive health, dental, and vision insurance coverage
  • Generous retirement savings plan with company matching contributions
  • Paid time off including vacation, sick leave, and parental leave
  • Professional development opportunities with access to training and certifications
  • Employee stock purchase plan and financial wellness programs
  • Flexible hybrid work arrangements and modern office facilities in Glasgow
  • Global mobility support and career advancement paths within J.P. Morgan Chase

JP Morgan Chase is an equal opportunity employer.

Locations

  • Glasgow, GB

Salary

Estimated Salary Range (high confidence)

65,000 - 95,000 GBP per year

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.


Tags & Categories

Data Engineering · Financial Services · Banking · JP Morgan

