
Data Engineer Tech Lead (Hybrid)

Jones Lang LaSalle

Full-time | Posted: Jan 24, 2026

Job Description

JLL empowers you to shape a brighter way.  

Our people at JLL and JLL Technologies are shaping the future of real estate for a better world by combining world-class services, advisory and technology for our clients. We are committed to hiring the best, most talented people and empowering them to thrive, grow meaningful careers and find a place where they belong. Whether you’ve got deep experience in commercial real estate, skilled trades or technology, or you’re looking to apply your relevant experience to a new industry, join our team as we help shape a brighter way forward.

We are seeking a Data Engineer Tech Lead to join our capital markets data engineering teams, focusing on designing, building, and maintaining scalable data infrastructure on the Databricks platform. This role requires deep technical expertise in modern data stack technologies and the ability to work with complex financial data systems.

Location: Tel Aviv

Role type: Hybrid

Key Responsibilities

Technical Development

  • Design and implement robust, scalable data pipelines using Databricks, Apache Spark, and Delta Lake as well as BigQuery
  • Use SQL and Python to develop, scale, and optimize advanced data pipelines
  • Build and optimize ETL/ELT processes for capital markets data
  • Develop real-time and batch processing solutions to support trading and risk management operations
  • Implement data quality monitoring, validation, and alerting systems

Platform Engineering

  • Configure and optimize Databricks workspaces, clusters, and job scheduling
  • Work in a multi-cloud environment including Azure, GCP, and AWS
  • Implement security best practices including access controls, encryption, and audit logging
  • Build integrations with market data vendors, trading systems, and risk management platforms
  • Establish monitoring and performance tuning for data pipeline health and efficiency

Collaboration & Mentorship

  • Collaborate with various stakeholders across the company and support business insight requests
  • Work closely with quantitative researchers, risk analysts, and product teams to understand data requirements
  • Collaborate with other data engineering teams and infrastructure groups
  • Provide technical guidance to junior engineers and contribute to code reviews
  • Participate in architecture discussions and technology selection decisions

Required Qualifications

  • Engineering experience with strong expertise in Apache Spark and distributed computing
  • Strong programming skills in Python, including data-handling libraries (pandas, numpy, etc.), and data modeling
  • Proficient in Databricks platform and Delta Lake for data lake architecture
  • Proficient in writing complex SQL queries
  • Experience with cloud-based databases (Google BigQuery, Snowflake)
  • Advanced SQL skills and experience with both relational and NoSQL databases
  • Bachelor's degree in Computer Science, Engineering or related field
  • Experience integrating multiple data sources and working with various database technologies

Preferred Qualifications

  • Experience with Azure cloud platform and associated data services (Data Factory, Event Hubs, Storage)
  • Experience with EKS / AKS
  • Knowledge of data streaming platforms (Kafka, Azure Event Hubs) for real-time processing
  • Experience working in a multi-cloud environment

Location:

On-site – TEL AVIV, ISR

If this job description resonates with you, we encourage you to apply, even if you don’t meet all the requirements.  We’re interested in getting to know you and what you bring to the table!

At JLL, we harness the power of artificial intelligence (AI) to efficiently accelerate meaningful connections between candidates and opportunities. Using AI capabilities, we analyze your application for relevant skills, experiences, and qualifications to generate valuable insights about how your unique profile aligns with the specific requirements of the role you're pursuing.

JLL Privacy Notice

Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL’s recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely.

For more information about how JLL processes your personal data, please view our Candidate Privacy Statement.

For additional details please see our career site pages for each country.

For candidates in the United States, please see a full copy of our Equal Employment Opportunity policy here.

Jones Lang LaSalle (“JLL”) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities.  If you need a reasonable accommodation because of a disability for any part of the employment process – including the online application and/or overall selection process – you may email us at HRSCLeaves@jll.com. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page > I want to work for JLL.

Locations

  • TEL AVIV, Israel

Salary

Estimated Salary Range (medium confidence)

300,000 - 500,000 ILS / yearly

Source: AI Estimation

* This is an estimated range based on market data and may vary based on experience and qualifications.

Skills Required

  • Apache Spark (intermediate)
  • distributed computing (intermediate)
  • Python (intermediate)
  • pandas (intermediate)
  • numpy (intermediate)
  • data modeling (intermediate)
  • Databricks (intermediate)
  • Delta Lake (intermediate)
  • SQL (intermediate)
  • Google BigQuery (intermediate)
  • Snowflake (intermediate)
  • relational databases (intermediate)
  • NoSQL databases (intermediate)
  • data integration (intermediate)
  • multiple data sources (intermediate)

Tags & Categories

REQ453734, jllcareers

