
Big Data Engineer

Northrop Grumman

Full-time

Posted: January 2, 2026

Job Description

Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.

Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.

Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.

Linux Internals: Utilize in-depth knowledge of Linux internals to optimize the performance and reliability of big data infrastructure.

Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.

Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow.

Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy.

Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively.

Data Science Support: Support data science, machine learning, and AI workloads using tools such as Jupyter, spaCy, Transformers, and NLTK.

Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing.

BI and Visualization: Implement and support business intelligence and visualization tools such as Tableau, Kibana, and Power BI to provide actionable insights.

Qualifications

  • Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
  • Experience with integration testing and ensuring seamless tool integration.
  • In-depth knowledge of Linux internals and system administration.
  • Understanding of TCP/IP and OSI models.
  • Hands-on experience with data pipeline tools such as NiFi and Airflow.
  • Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
  • Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
  • Familiarity with data science, machine learning, and AI tools such as Jupyter, spaCy, Transformers, and NLTK.
  • Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
  • Proficiency with business intelligence and visualization tools such as Tableau, Kibana, and Power BI.
  • Excellent communication and collaboration skills.
  • Certification in AWS or other cloud platforms.
  • Experience with additional data orchestration tools.
  • Familiarity with other big data tools and technologies.
  • Previous experience in a similar role within a dynamic and fast-paced environment.
  • Experience with Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), and Cloudera Machine Learning (CML) would be highly desirable.

Locations

  • London, United Kingdom
  • United States

Salary

Estimated Salary Range (medium confidence)

90,000 - 150,000 USD per year

Source: rule-based estimate

* This is an estimated range based on market data and may vary based on experience and qualifications.

Required Qualifications

  • Strong troubleshooting skills
  • Experience with integration testing
  • In-depth knowledge of Linux internals and system administration
  • Understanding of TCP/IP and OSI models
  • Hands-on experience with NiFi and Airflow
  • Experience designing scalable big data systems with a security/GDPR focus
  • Hybrid cloud experience (on-premise/AWS)
  • Familiarity with data science/ML/AI tools: Jupyter, spaCy, Transformers, NLTK

Responsibilities

  • Develop/maintain scripts for automating data processes/workflows
  • Diagnose/resolve issues in big data systems, pipelines, integrations
  • Conduct integration testing for new tools/technologies
  • Utilize Linux internals knowledge to optimize big data infrastructure
  • Apply network protocols knowledge in design/troubleshooting
  • Manage/optimize data pipelines/orchestration tools (NiFi, Airflow)
  • Design scalable big data systems focusing on security, GDPR, privacy
  • Manage hybrid cloud resources (on-premise/AWS)
  • Support data science, ML, AI workloads
  • Implement/support BI and visualization tools


Tags & Categories

Aerospace, Defense, Technology

