Staff Analytics Engineer

Coinbase

Employment Type: Full-time

Posted: July 11, 2025

Number of Vacancies: 1

Job Description

Responsibilities

  • Be the expert: Quickly build subject matter expertise in a specific business area and data domain. Understand how data flows from creation and ingestion through transformation and delivery.
  • Generate business value: Interface with stakeholders on data and product teams to deliver the most commercial value from data (directly or indirectly).
  • Focus on outcomes, not tools: Use a variety of frameworks and paradigms to identify the best-fit tools to deliver value.
  • Develop and maintain foundational data models that serve as the single source of truth for analytics across the organization.
  • Empower stakeholders by translating business requirements into scalable data models, dashboards, and tools.
  • Partner with engineering, data science, product, and business teams to ensure alignment on priorities and data solutions.
  • Build frameworks, tools, and workflows that maximize efficiency for data users, while maintaining high standards of data quality and performance.
  • Use modern development and analytics tools to deliver value quickly, while ensuring long-term maintainability.

Required Qualifications

  • Customer Support Data Experience: Familiarity with data elements and processes supporting successful Customer Support initiatives, including employee performance monitoring, workforce/staffing inputs, and the handling of sensitive PII across a broad stakeholder base.
  • Data Modeling Expertise: Strong understanding of best practices for designing modular and reusable data models (e.g., star schemas, snowflake schemas).
  • Prompt Design and Engineering: Expertise in prompt engineering and design for LLMs (e.g., GPT), including creating, refining, and optimizing prompts to improve response accuracy, relevance, and performance for internal tools and use cases.
  • Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
  • Intermediate to Advanced Python: Expertise in scripting and automation, with experience in Object-Oriented Programming (OOP) and building scalable frameworks.
  • Collaboration and Communication: Strong ability to translate technical concepts into business value for cross-functional stakeholders. Proven ability to manage projects and communicate effectively across teams.
  • Data Pipeline Development: Experience building, maintaining, and optimizing ETL/ELT pipelines, using modern tools like dbt, Airflow, or similar.
  • Data Visualization: Proficiency in building polished dashboards using tools like Looker, Tableau, Superset, or Python visualization libraries (Matplotlib, Plotly).
  • Development Tools: Familiarity with version control (GitHub), CI/CD, and modern development workflows.
  • Data Architecture: Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks.
  • Business Acumen: Ability to understand and address business challenges through analytics engineering.
  • Data Savvy: Familiarity with statistics and probability.

Preferred Qualifications

  • Experience with cloud platforms (e.g., AWS, GCP).
  • Familiarity with Docker or Kubernetes.

Required Skills

  • Customer Support Data Experience
  • Data Modeling (star schemas, snowflake schemas)
  • Prompt Engineering for LLMs (GPT)
  • Advanced SQL
  • Intermediate to Advanced Python (OOP, scripting, automation)
  • Collaboration and Communication
  • Data Pipeline Development (ETL/ELT, dbt, Airflow)
  • Data Visualization (Looker, Tableau, Superset, Matplotlib, Plotly)
  • Development Tools (GitHub, CI/CD)
  • Data Architecture (Snowflake, Databricks)
  • Business Acumen
  • Statistics and Probability

Benefits

  • Bonus eligibility
  • Medical
  • Dental
  • Vision
  • 401(k)

Salary Range

$207,000 - $244,100 USD per year

Locations

  • NAMER, United States (Remote)

Tags & Categories

Data Engineering, Cryptocurrency, Blockchain, Finance, Crypto, Web3
