Sr. Data Engineer - Chicago, IL or Denver, CO - ONSITE 4 X Week

Comcast

Software and Technology Jobs

Full-time
Posted: Dec 8, 2025

Job Description

Location: Chicago, Illinois / Englewood, Colorado
Req ID: R420129
Job Type: Full Time
Category: Analytics
Date Posted: 10/21/2025
FreeWheel, a Comcast company, provides comprehensive ad platforms for publishers, advertisers, and media buyers. Powered by premium video content, robust data, and advanced technology, we’re making it easier for buyers and sellers to transact across all screens, data types, and sales channels. As a global company, we have offices in nine countries and can insert advertisements around the world.

Job Summary

We are seeking a highly skilled Data Engineer with expertise in SQL and either Python or Scala, experienced in building large-scale data pipelines using Apache Spark and designing robust data architectures on AWS. The ideal candidate will have hands-on experience with data lake architectures, open table formats (Delta Lake/Iceberg), and modern data platforms that deliver self-service analytics to a large number of users. If you are a problem solver, a data platform enthusiast, and someone who thrives in fast-paced environments, we'd love to hear from you!

Job Description

Core Responsibilities:

  • Architect, design, and develop high-performance, scalable data pipelines.
  • Develop large-scale, complex big data ETL pipelines using modern frameworks such as dbt and Databricks Delta Live Tables (DLT).
  • Define and implement data lake architectures and open table formats (e.g., Delta Lake, Iceberg).
  • Establish and enforce best practices for data engineering, ensuring data quality, integrity, and performance.
  • Drive observability and monitoring for data pipelines, implementing data catalogs and lineage tracking.
  • Own critical modules of complex, large-scale batch ETL models and articulate those models to Product, Engineering, and Leadership.
  • Effectively context switch between multiple priorities, balancing long-term architectural goals with immediate business needs.
  • Develop complex semantic layers and help create customer-facing self-service analytics.
  • Consistent exercise of independent judgment and discretion in matters of significance.
  • Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
  • Other duties and responsibilities as assigned.

Qualifications:

  • At least 4 years of experience in Data Engineering.
  • Proficient in at least one language such as Python, Scala, Java, or Go (Golang).
  • Ability to craft SQL for complex lakehouse or warehouse environments.
  • Hands-on experience with Spark and/or Snowflake is preferred.
  • Firm grasp of lakehouse concepts (e.g., Apache Iceberg) and data warehouse concepts.
  • Firm grasp of data warehouse modeling paradigms.
  • Firm understanding of cloud-based architectures.
  • Problem-solving skills.


Preferred (but not required) qualifications:

  • Experience with Looker or a similar BI modeling tool.
  • Understanding of semantic layers such as Looker (LookML), Snowflake, or Databricks.


Employees at all levels are expected to:

  • Understand our Operating Principles; make them the guidelines for how you do your job.
  • Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
  • Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
  • Win as a team - make big things happen by working together and being open to new ideas.
  • Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers.
  • Drive results and growth.
  • Support a culture of inclusion in how you work and lead.
  • Do what's right for each other, our customers, investors and our communities.


Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications.

Comcast is an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.


Skills:

Data Engineering; Datasource; Big Data


Salary:

Primary Location Pay Range: $122,299.12 - $183,448.68

Additional Range: This job can be performed in Denver Campus with a Pay Range of $118,139.40 - $185,647.63

Comcast intends to offer the selected candidate base pay within this range, dependent on job-related, non-discriminatory factors such as experience. The application window is 30 days from the date the job is posted, unless the number of applicants requires it to close sooner or later.


Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance, and always-on tools that are personalized to meet the needs of your reality – to help support you physically, financially, and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.


Education

Bachelor's Degree

While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience

7-10 Years
