
Systems Developer (Data Warehouse)

Thermo Fisher Scientific

Full-time · Posted: Jan 12, 2026

Job Description

Work Schedule

Other

Environmental Conditions

Office

Summarized Purpose:

We are offering an outstanding opportunity to join Thermo Fisher Scientific as a Data Warehouse Developer. In this role, you will focus on designing, building, and optimizing database-centric solutions that power our analytical and operational data platforms. You will play a key role in developing robust, scalable data pipelines and warehouse structures, primarily on AWS Redshift, supporting data-intensive workloads and downstream reporting applications.

Education/Experience:

  • Bachelor’s degree or comparable experience in Computer Science, Information Science, or a related area of study
  • 3+ years of experience in database development, data engineering, or a related field
  • Equivalent combinations of education, training, and experience will also be considered

Major Job Responsibilities:

  • Design, implement, and refine data warehouse schemas (e.g., star/snowflake models) and data pipelines in AWS Redshift or similar RDBMS platforms.
  • Build and manage SQL-centric ETL/ELT procedures to process, transform, and merge data from diverse sources.
  • Improve database performance through query optimization and efficient data processing.
  • Collaborate with stakeholders to translate data needs into scalable and maintainable data structures.
  • Support data quality, validation, and reconciliation processes to ensure accurate and reliable datasets.
  • Engage with AWS services including Lambda, Step Functions, and S3 for orchestrating and automating data workflows.
  • Participate in design reviews, documentation, and testing activities to ensure adherence to quality and compliance standards.
  • Collaborate with Operations and DevOps teams to deploy and monitor data workflows using CI/CD pipelines where applicable.
  • Troubleshoot production issues, analyze root causes, and propose pragmatic solutions.
  • Leverage AI-assisted development tools to improve query efficiency, code refactoring, and documentation.

Knowledge, Skills and Abilities:

  • Strong hands-on SQL development skills including complex queries, window functions, joins, and analytical operations.
  • Proficiency in data modeling and a solid grasp of data warehousing principles (ETL/ELT, dimensional modeling, slowly changing dimensions).
  • Experience working with large relational databases (e.g., Redshift, PostgreSQL, SQL Server, MySQL, Oracle).
  • Knowledge of AWS cloud services, especially S3, Lambda, Redshift, and Step Functions.
  • Familiarity with Python or NodeJS for scripting, automation, or Lambda-based data workflows.
  • Excellent analytical and problem-solving skills with attention to detail.
  • Strong communication and collaboration skills in a team-oriented environment.

Must-Have Skills:

  • Advanced SQL and RDBMS experience – Ability to develop and optimize queries, stored procedures, and data transformations for large-scale data workloads.
  • Data warehousing and ETL/ELT – Practical experience designing and maintaining data warehouse environments and data pipelines.
  • Practical exposure to AWS services – Hands-on experience with AWS data services like Redshift, S3, Lambda, and Step Functions.
  • Skilled in Python or NodeJS programming – Capable of using a programming language for automation or data integration purposes.
  • Data modeling and schema creation – Experience crafting normalized and dimensional schemas for analytics and reporting.

Good-to-Have Skills:

  • Exposure to data lakehouse or big data environments (Databricks, Snowflake, or similar).
  • Knowledge of AI-assisted or modern query optimization tools and practices.

Working Hours:

India: 05:30 PM to 02:30 AM IST

Philippines: 08:00 PM to 05:00 AM PST

Locations

  • Global

Salary

Estimated Salary Range (medium confidence)

50,000 – 90,000 USD / year

* This is an estimated range based on market data and may vary based on experience and qualifications.

Skills Required

  • Advanced SQL (complex queries, window functions, joins) – intermediate
  • Data modeling (star/snowflake) – intermediate
  • ETL/ELT procedures – intermediate
  • AWS services (Redshift, S3, Lambda, Step Functions) – intermediate
  • Python or NodeJS scripting – intermediate
  • Query optimization – intermediate

Required Qualifications

  • Bachelor’s degree in Computer Science, Information Science, or a related field (or equivalent experience)
  • 3+ years of experience in database development or data engineering

Responsibilities

  • Design/implement data warehouse schemas and pipelines
  • Build/manage SQL ETL/ELT
  • Improve database performance
  • Collaborate on data needs
  • Support data quality/validation
  • Engage AWS services
  • Participate in reviews/documentation/testing
  • Deploy/monitor workflows
  • Troubleshoot issues
