Sr Specialist Data/AI Engineering - Palantir Foundry

AT&T

Engineering Jobs

Full-time · Posted: Dec 2, 2025

Job Description

Key Responsibilities:

Data Engineering and Integration:

  • Design, develop, and optimize scalable data pipelines and ETL processes using Palantir Foundry for data integration, transformation, and data modeling to support analytics and business use cases.
  • Build and manage APIs and microservices using Python and Java to integrate data across systems and manage data processing and application logic.
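
As an illustration of the Foundry pipeline work described in the bullets above, here is a minimal sketch of a Palantir Foundry Python transform. The dataset paths and column names are hypothetical, and a production pipeline would add schema checks and data expectations.

```python
from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F

# Hypothetical dataset paths and columns, shown only to illustrate the shape of a Foundry transform.
@transform_df(
    Output("/examples/clean/customer_orders"),
    raw_orders=Input("/examples/raw/customer_orders"),
)
def clean_customer_orders(raw_orders):
    """Drop rows without an order id and derive an order_month column for analytics."""
    return (
        raw_orders
        .filter(F.col("order_id").isNotNull())
        .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    )
```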

Data Modeling and Architecture:

  • Design and implement robust, scalable, and efficient data models in Palantir Foundry.
  • Collaborate with data architects to define data governance, data lineage, and data quality standards.
  • Develop and maintain reusable pipelines and templates for data transformation and enrichment.

Azure Cloud Expertise:

  • Utilize Azure Synapse, Azure Data Lake, and Azure Storage for data storage and processing.
  • Implement secure and efficient data workflows on Azure, ensuring compliance with organizational and regulatory policies.
  • Monitor and troubleshoot Azure data pipelines for performance optimization.
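
To ground the Azure storage bullets above, the following is a small sketch of landing a file in Azure Data Lake Storage Gen2 with the azure-storage-file-datalake SDK. The account URL, file system, and file path are hypothetical; authentication is assumed to come from DefaultAzureCredential.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical account URL, file system, and path; credentials are resolved from the
# environment (managed identity, CLI login, or service principal) by DefaultAzureCredential.
service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_system = service.get_file_system_client("raw")
file_client = file_system.get_file_client("orders/2025/12/orders.parquet")

with open("orders.parquet", "rb") as data:
    file_client.upload_data(data, overwrite=True)  # upload (or replace) the landed file
```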

Programming and Automation:

  • Write clean, maintainable, and efficient code in Python and Java for data processing, automation, and integration tasks.
  • Develop scripts to automate repetitive tasks and improve overall system efficiency.
  • Hands-on experience with Snowflake, including writing procedures and functions, Snowpipe, data pipelines, and data transformation.
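
For the Snowflake bullet above, a minimal sketch using the official Python connector is shown below. The account, stage, table, and procedure names are hypothetical, and real code would pull credentials from a secrets manager rather than hard-coding them.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection parameters.
conn = snowflake.connector.connect(
    account="xy12345",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # A Snowpipe that auto-ingests files landing in an external stage (stage and table names are hypothetical).
    cur.execute("""
        CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
        COPY INTO staging_orders FROM @orders_stage FILE_FORMAT = (TYPE = PARQUET)
    """)
    # Invoke a stored procedure that applies downstream transformations (hypothetical procedure name).
    cur.execute("CALL transform_orders()")
finally:
    conn.close()
```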

Collaboration and Stakeholder Engagement:

  • Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
  • Provide technical support and training on Palantir Foundry to internal teams.
  • Participate in Agile development processes and collaborate with DevOps teams to ensure seamless deployment.

Performance Monitoring and Optimization:

  • Monitor data pipelines and applications for reliability, scalability, and performance.
  • Implement best practices for error handling, logging, and alerting to ensure system stability.
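
The monitoring and error-handling bullets above map naturally onto a retry-with-logging pattern. The sketch below is a generic illustration in plain Python, not tied to any particular scheduler or alerting tool; the function and parameter names are hypothetical.

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(name)s %(message)s")
log = logging.getLogger("pipeline")

def run_with_retries(step, retries=3, backoff_seconds=30):
    """Run a pipeline step, retrying transient failures and logging every attempt."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception:
            log.exception("step failed (attempt %d/%d)", attempt, retries)
            if attempt == retries:
                raise  # let the scheduler's alerting surface the final failure
            time.sleep(backoff_seconds * attempt)  # linear backoff between attempts
```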

Required Skills and Qualifications:

  • Technical Expertise:
    • Proficiency in Palantir Foundry for data integration, modeling, and pipeline development.
    • Strong programming skills in Python and Java.
    • Hands-on experience with Azure Data Engineering tools (e.g., Azure Data Factory, Databricks, Synapse, Data Lake).
    • Solid understanding of data structures, algorithms, and software engineering principles.
  • Data Engineering Skills:
    • Experience in building and optimizing ETL/ELT pipelines for large-scale data processing.
    • Proficiency in SQL for data querying and transformation.
    • Familiarity with data governance, data lineage, and data security practices.
  • Cloud Expertise:
    • Strong knowledge of Azure cloud services and infrastructure for data engineering.
    • Experience with CI/CD pipelines, containerization (e.g., Docker), and orchestration tools (e.g., Kubernetes).
  • Problem Solving and Collaboration:
    • Excellent problem-solving skills and ability to troubleshoot complex issues in data systems.
    • Strong communication skills to collaborate effectively with technical and non-technical stakeholders.

Preferred Qualifications:

  • Certification in Palantir, Microsoft Azure Data Engineering, or a related cloud certification.
  • Experience working in Agile/Scrum environments.

Weekly Hours:

40

Time Type:

Regular

Location:

Bangalore, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Locations

  • Bangalore, Karnataka, India

Salary

Estimated Salary Range (medium confidence)

45,000 - 85,000 USD per year

Source: AI-estimated

* This is an estimated range based on market data and may vary based on experience and qualifications.

Skills Required

  • Palantir Foundry - intermediate
  • Python - intermediate
  • Java - intermediate
  • Azure Data Engineering tools (Azure Data Factory, Databricks, Synapse, Data Lake) - intermediate
  • Data structures - intermediate
  • Algorithms - intermediate
  • Software engineering principles - intermediate
  • ETL/ELT pipelines - intermediate
  • SQL - intermediate
  • Data governance - intermediate
  • Data lineage - intermediate
  • Data security - intermediate
  • Azure cloud services - intermediate
  • CI/CD pipelines - intermediate
  • Docker - intermediate
  • Kubernetes - intermediate
  • Problem-solving - intermediate
  • Communication - intermediate

Tags & Categories

Telecommunications
