Staff Engineer - DataOps Engineer
Nagarro
Location: WFA/Remote, Colombia | Job ID: REF51091Q
We are seeking a DataOps Engineer to join our Tech Delivery and Infrastructure Operations teams, playing a key role in ensuring the reliability, automation, and performance of our analytics and data platforms. The role is primarily DataOps-focused, combining elements of DevOps and SRE to sustain and optimize data-driven environments across global business units. You will manage end-to-end data operations, from SQL diagnostics and data pipeline reliability to automation, monitoring, and deployment of analytics workloads on cloud platforms, and you will collaborate with Data Engineering, Product, and Infrastructure teams to maintain scalable, secure, and high-performing systems.

Key Responsibilities
- Manage and support data pipelines, ETL processes, and analytics platforms, ensuring reliability, accuracy, and accessibility
- Execute data validation, quality checks, and performance tuning using SQL and Python/shell scripting
- Implement monitoring and observability with Datadog, Grafana, and Prometheus to track system health and performance
- Collaborate with DevOps and Infrastructure teams to integrate data deployments into CI/CD pipelines (Jenkins, Azure DevOps, Git)
- Apply infrastructure-as-code principles (Terraform, Ansible) to provision and automate data environments
- Support incident and request management via ServiceNow, ensuring SLA adherence and root-cause analysis
- Work closely with security and compliance teams to maintain data governance and protection standards
- Participate in Agile ceremonies within Scrum/Kanban models to align with cross-functional delivery squads

Required Skills & Experience
- 6 years in DataOps, Data Engineering Operations, or Analytics Platform Support, with good exposure to DevOps/SRE practices
- Proficiency in SQL and Python/shell scripting for automation and data diagnostics
- Experience with cloud platforms (AWS mandatory; exposure to Azure/GCP a plus)
- Familiarity with CI/CD tools (Jenkins, Azure DevOps), version control (Git), and IaC frameworks (Terraform, Ansible)
- Working knowledge of monitoring tools (Datadog, Grafana, Prometheus)
- Understanding of containerization concepts (Docker, Kubernetes)
- Strong grasp of data governance, observability, and quality frameworks
- Experience with incident management and operational metrics tracking (MTTR, uptime, latency)
Must-have skills: Python (strong), SQL (strong), DevOps - AWS (strong), DevOps - Azure (strong), Datadog.
Additional skills: Terraform, Ansible, Jenkins, Grafana, Prometheus, Docker, Kubernetes, Git, shell scripting.
Salary: estimated 90,000,000 - 162,000,000 COP per year. This is an AI-estimated range based on market data and may vary with experience and qualifications; official salary details are available upon request.
