Data Engineer - C

Capgemini

Software and Technology Jobs

Full-time · Posted: Aug 19, 2025

Job Description

📋 Job Overview

The Data Engineer - C role at Capgemini involves leading and managing a team of data engineers to build reliable and scalable data infrastructure. Responsibilities include overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders to deliver high-quality data solutions. This position drives data-driven objectives by unlocking the value of data assets for meaningful insights and decision-making.

📍 Location: Hyderabad

💼 Experience Level: Experienced Professionals

🏢 Business Unit: FS

🎯 Key Responsibilities

  • Leading and managing a team of data engineers
  • Overseeing data engineering projects
  • Ensuring technical excellence
  • Fostering collaboration with stakeholders
  • Driving the success of data engineering initiatives
  • Delivering reliable and high-quality data solutions

✅ Required Qualifications

  • Experience in leading and managing teams of data engineers
  • Background in overseeing data engineering projects
  • Knowledge of ensuring technical excellence in data solutions

🛠️ Required Skills

  • Ab Initio
  • Agile (Software Development Framework)
  • Apache Hadoop
  • AWS Airflow
  • AWS Athena
  • AWS Code Pipeline
  • AWS EFS
  • AWS EMR
  • AWS Redshift
  • AWS S3
  • Azure ADLS Gen2
  • Azure Data Factory
  • Azure Data Lake Storage
  • Azure Databricks
  • Azure Event Hub
  • Azure Stream Analytics
  • Azure Synapse
  • Bitbucket
  • Change Management
  • Client Centricity
  • Collaboration
  • Continuous Integration and Continuous Delivery (CI/CD)
  • Data Architecture Patterns
  • Data Format Analysis
  • Data Governance
  • Data Modeling
  • Data Validation
  • Data Vault Modeling
  • Database Schema Design
  • Decision-Making
  • DevOps
  • Dimensional Modeling
  • GCP Big Table
  • GCP BigQuery
  • GCP Cloud Storage
  • GCP DataFlow
  • GCP DataProc
  • Git
  • Google Big Table
  • Google Data Proc
  • Greenplum
  • HQL
  • IBM Data Stage
  • IBM DB2
  • Industry Standard Data Modeling (FSLDM)
  • Industry Standard Data Modeling (IBM FSDM)
  • Influencing
  • Informatica IICS
  • Inmon methodology
  • JavaScript
  • Jenkins
  • Kimball
  • Linux - Redhat
  • Negotiation
  • Netezza
  • NewSQL
  • Oracle Exadata
  • Performance Tuning
  • Perl
  • Platform Update Management
  • Project Management
  • PySpark
  • Python
  • R
  • RDD Optimization
  • Santos
  • SAS
  • Scala Spark
  • Shell Script
  • Snowflake
  • Spark
  • Spark Code Optimization
  • SQL
  • Stakeholder Management
  • Sun Solaris
  • Synapse
  • Talend
  • Teradata
  • Time Management
  • Ubuntu
  • Vendor Management
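
As a small illustration of the SQL and Data Validation skills listed above, the sketch below flags rows that fail simple quality rules. It uses Python's built-in sqlite3 module; the table and column names (`orders`, `amount`) are hypothetical and are not taken from the posting.

```python
# Illustrative sketch only: a minimal data-validation check in SQL.
# The "orders" table and its columns are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany(
    "INSERT INTO orders (id, amount) VALUES (?, ?)",
    [(1, 19.99), (2, -5.00), (3, 42.50), (4, None)],
)

# Two simple quality rules: amount must be present and non-negative.
bad_rows = conn.execute(
    "SELECT id FROM orders WHERE amount IS NULL OR amount < 0"
).fetchall()

print([r[0] for r in bad_rows])  # ids of rows failing validation
```

In a production pipeline the same kind of check would typically run inside the warehouse (Redshift, BigQuery, Synapse, etc.) or as a validation step in an orchestrated Airflow job.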

Locations

  • Hyderabad, India

Salary

Estimated Salary Range (medium confidence)

2,500,000 – 4,200,000 INR per year

Source: AI estimate

* This is an estimated range based on market data and may vary based on experience and qualifications.

Skills Required

All of the skills listed under Required Skills above are expected at an intermediate proficiency level.

Target Your Resume for "Data Engineer - C", Capgemini

Get personalized recommendations to optimize your resume specifically for Data Engineer - C. Takes only 15 seconds!

AI-powered keyword optimization
Skills matching & gap analysis
Experience alignment suggestions

Check Your ATS Score for "Data Engineer - C", Capgemini

Find out how well your resume matches this job's requirements. Get comprehensive analysis including ATS compatibility, keyword matching, skill gaps, and personalized recommendations.

ATS compatibility check
Keyword optimization analysis
Skill matching & gap identification
Format & readability score

Tags & Categories

FS · Data & AI · Experienced Professionals

Answer 10 quick questions to check your fit for Data Engineer - C @ Capgemini.

Quiz Challenge: 10 questions, ~2 minutes, instant score.

Related Books and Jobs

No related jobs found at the moment.
