
Staff Engineer - DataOps Engineer - Careers at Nagarro

Nagarro

Full-time | Posted: Feb 2, 2026

Job Description

Staff Engineer - DataOps Engineer - Careers at Nagarro

Location: Guadalajara (Hybrid), Mexico | Job ID: REF51093B

Elevate your career with Nagarro as a Staff Engineer - DataOps Engineer in Guadalajara, Mexico. This role blends DataOps with DevOps and SRE practices to ensure the reliability and performance of analytics platforms.

Why Join Nagarro?

At Nagarro, our Caring Mindset sets us apart: we prioritize your growth, well-being, and professional fulfillment in a Fluidic Enterprise environment. Enjoy hybrid work in Guadalajara, global opportunities, and a culture that values innovation and collaboration. With employee-centric policies, you'll thrive in a supportive ecosystem designed for long-term success.

Digital Engineering Excellence

Nagarro leads in digital engineering, empowering you to manage end-to-end data operations with Python, SQL, AWS, Azure DevOps, Datadog, Terraform, and more. You will leverage CI/CD pipelines with Jenkins and Git, implement infrastructure as code with Ansible, and ensure observability via Grafana and Prometheus. Our cloud-first approach (AWS mandatory; Azure/GCP a plus) supports scalable, secure data pipelines, ETL processes, and analytics workloads. You will join cross-functional squads working in Agile Scrum/Kanban models, collaborating with Data Engineering, Product, and Infrastructure teams to deliver high-performing systems.

Your Impact at Nagarro

As a DataOps Engineer, you'll optimize data governance, automate deployments, handle incident management via ServiceNow, and track metrics such as MTTR and uptime. Your expertise in containerization (Docker, Kubernetes) and data quality frameworks will support global business units. This is a high-impact role where your skills in monitoring, performance tuning, and root cause analysis deliver tangible results. Apply your 6+ years of experience to shape the future of data-driven innovation.

Role Description

We are seeking a DataOps Engineer to join the Tech Delivery and Infrastructure Operations teams, playing a key role in ensuring the reliability, automation, and performance of our analytics and data platforms. The role is primarily DataOps-focused, combining elements of DevOps and SRE to sustain and optimize data-driven environments across global business units.

You will manage end-to-end data operations, from SQL diagnostics and data pipeline reliability to automation, monitoring, and deployment of analytics workloads on cloud platforms. You'll collaborate with Data Engineering, Product, and Infrastructure teams to maintain scalable, secure, and high-performing systems.

Key Responsibilities

  • Manage and support data pipelines, ETL processes, and analytics platforms, ensuring reliability, accuracy, and accessibility
  • Execute data validation, quality checks, and performance tuning using SQL and Python/Shell scripting
  • Implement monitoring and observability using Datadog, Grafana, and Prometheus to track system health and performance
  • Collaborate with DevOps and Infra teams to integrate data deployments within CI/CD pipelines (Jenkins, Azure DevOps, Git)
  • Apply infrastructure-as-code principles (Terraform, Ansible) for provisioning and automation of data environments
  • Support incident and request management via ServiceNow, ensuring SLA adherence and root cause analysis
  • Work closely with security and compliance teams to maintain data governance and protection standards
  • Participate in Agile ceremonies within Scrum/Kanban models to align with cross-functional delivery squads

Required Skills & Experience

  • 6+ years in DataOps, Data Engineering Operations, or Analytics Platform Support, with good exposure to DevOps/SRE practices
  • Proficiency in SQL and Python/Shell scripting for automation and data diagnostics
  • Experience with cloud platforms (AWS mandatory; exposure to Azure/GCP a plus)
  • Familiarity with CI/CD tools (Jenkins, Azure DevOps), version control (Git), and IaC frameworks (Terraform, Ansible)
  • Working knowledge of monitoring tools (Datadog, Grafana, Prometheus)
  • Understanding of containerization concepts (Docker, Kubernetes)
  • Strong grasp of data governance, observability, and quality frameworks
  • Experience in incident management and operational metrics tracking (MTTR, uptime, latency)
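The data validation and quality-check duties described above can be sketched in a few lines of Python. This is a minimal illustration, not Nagarro's actual tooling: the table name, the `id` column, and the pass criteria are hypothetical, and sqlite3 stands in for whatever warehouse a DB-API connection would point at in practice.

```python
import sqlite3

def run_quality_checks(conn, table):
    """Minimal data-quality sketch: row count, null check, duplicate check."""
    cur = conn.cursor()
    results = {}
    # total rows loaded
    results["row_count"] = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    # rows missing a primary identifier
    results["null_ids"] = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE id IS NULL").fetchone()[0]
    # rows sharing an identifier with another row
    results["duplicate_ids"] = cur.execute(
        f"SELECT COUNT(*) - COUNT(DISTINCT id) FROM {table}").fetchone()[0]
    results["passed"] = (results["row_count"] > 0
                         and results["null_ids"] == 0
                         and results["duplicate_ids"] == 0)
    return results

# demo on an in-memory database with one duplicated id
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 3.0), (2, 3.0)])
report = run_quality_checks(conn, "orders")
```

In a real pipeline such checks would run inside the scheduler and publish their results to a monitoring backend such as Datadog rather than returning a dict.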

Must-have Skills: Python (Strong), SQL (Strong), DevOps - AWS (Strong), DevOps - Azure (Strong), Datadog.

Key Responsibilities

  • Manage and support data pipelines, ETL processes, and analytics platforms
  • Execute data validation, quality checks, and performance tuning using SQL and Python/Shell scripting
  • Implement monitoring and observability using Datadog, Grafana, and Prometheus
  • Collaborate with DevOps and Infra teams to integrate data deployments within CI/CD pipelines
  • Apply infrastructure-as-code principles (Terraform, Ansible) for provisioning and automation
  • Support incident and request management via ServiceNow
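Operational metrics such as MTTR, tracked as part of the incident-management duties above, reduce to simple arithmetic over incident records. A minimal sketch follows; the record format here is hypothetical, not ServiceNow's actual schema.

```python
from datetime import datetime, timedelta

def mttr_hours(incidents):
    """Mean time to repair: average (resolved - opened) across incidents, in hours."""
    repair_times = [inc["resolved"] - inc["opened"] for inc in incidents]
    total = sum(repair_times, timedelta())
    return total.total_seconds() / 3600 / len(incidents)

incidents = [
    {"opened": datetime(2026, 2, 1, 9, 0), "resolved": datetime(2026, 2, 1, 11, 0)},
    {"opened": datetime(2026, 2, 2, 14, 0), "resolved": datetime(2026, 2, 2, 18, 0)},
]
# (2h + 4h) / 2 = 3.0 hours
```

Uptime and latency percentiles follow the same pattern: pull the raw events, aggregate, and report against the SLA target.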

What You Bring (Qualifications)

  • 6+ years in DataOps, Data Engineering Operations, or Analytics Platform Support
  • Proficiency in SQL and Python/Shell scripting
  • Experience with cloud platforms (AWS mandatory)
  • Familiarity with CI/CD tools (Jenkins, Azure DevOps) and IaC (Terraform, Ansible)
  • Working knowledge of monitoring tools (Datadog, Grafana, Prometheus)

Core Skills

Python (Strong), SQL (Strong), DevOps - AWS (Strong), DevOps - Azure (Strong), DataDog, Terraform, Ansible, Jenkins, Grafana, Prometheus, Docker, Kubernetes, Git, Shell scripting

Why Nagarro? (Benefits)

  • Caring mindset with emphasis on work-life balance and employee well-being
  • Fluidic Enterprise model offering flexibility and agile career growth
  • Global opportunities across multiple countries and business units
  • Collaborative culture fostering innovation in digital engineering
  • Professional development through Scrum/Kanban and cross-functional teams
  • Hybrid work model in Guadalajara for optimal productivity

Locations

  • Guadalajara (Hybrid), Mexico

Salary

Salary details available upon request

Estimated Salary Range (medium confidence)

900,000 - 1,620,000 MXN / yearly

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.

Tags & Categories

Nagarro, IT Services, Digital Engineering, Senior, Others

