Your opportunity
We seek a skilled Senior Manager of Data Engineering Platform to lead our efforts in constructing a scalable and user-friendly data platform tailored for various technical teams within New Relic. Additionally, you will oversee the development of data applications on this platform as needed.

In this role, you will shape the vision for our data infrastructure and analysis tools, establishing best practices for building the systems and datasets used across the company. Drawing on deep technical expertise, you will strengthen our data environment to create a robust foundation for the company's data. We are looking for someone enthusiastic about making a significant impact at New Relic, particularly around revenue generation and customer experience.
What you'll do
  • Collaborate with team members to design and implement scalable data infrastructure capable of handling petabytes of data daily through streaming and batch processing.
  • Lead efforts to deliver data to our data lake, facilitating its use by the Data Warehouse team, Analytics teams, and Data Scientists.
  • Maintain and support the Data Lakehouse system, overseeing data ingestion and pipelining, and implementing tools for automation and orchestration to improve performance, reliability, and operational efficiency (a brief illustrative sketch follows this list).
  • Define and develop batch and streaming data-parallel processing pipelines and distributed processing back-ends.
  • Establish CI/CD pipelines and manage configuration management.
  • Develop tools and services running on Kubernetes within our data ecosystem.
  • Regularly write efficient, well-commented Python code.
  • Communicate complex technical concepts clearly and effectively.
  • Contribute to scaling our data warehouse (utilizing Snowflake) to ensure clean, analysis-ready data delivery.
  • Collaborate closely with Analytic Engineers and Data Analysts to gather and analyze raw data for models that empower end users.
  • Enhance and scale our warehouse platform for data ingestion, logging, search, aggregation, viewing, and analysis.
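For a concrete flavor of the ingestion and orchestration work above, below is a minimal, hypothetical sketch of a daily batch pipeline expressed as an Apache Airflow DAG in Python. The DAG name, bucket path, and table name are illustrative placeholders rather than New Relic's actual systems, and a real task would call lakehouse and warehouse tooling instead of printing.

# Hypothetical Airflow DAG (assumes Airflow 2.4+); all names and paths are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(ds: str, **_) -> str:
    """Placeholder: would pull one day of raw events to object storage; returns the landing path."""
    path = f"s3://example-raw-bucket/events/{ds}/events.parquet"  # placeholder bucket
    print(f"extracting events for {ds} to {path}")
    return path


def load_to_lakehouse(ti, **_) -> None:
    """Placeholder: would register the extracted file with a staging table in the lakehouse."""
    path = ti.xcom_pull(task_ids="extract_events")
    print(f"loading {path} into staging.events_daily")  # placeholder table


with DAG(
    dag_id="events_daily_batch",  # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_lakehouse", python_callable=load_to_lakehouse)
    extract >> load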
This role requires
  • Senior leadership experience with data engineering platforms.
  • 12+ years of relevant work experience, including a minimum of 6 years managing and leading teams.
  • 3+ years of professional experience in Python and/or Java development.
  • Extensive scripting experience (Unix shell, Bash, Python).
  • AWS Certification or equivalent experience.
  • Proficiency in Terraform or similar Infrastructure as Code (IaC) tools (preference for Terraform).
  • Familiarity with Streaming Data technologies such as Apache Beam, Flink, Spark, and Kafka.
  • Experience with modern data technologies, including Airflow, Snowflake, Redshift, and Spark.
  • Knowledge of source control and branching workflows (git-flow, GitLab Flow) and CI/CD tooling (e.g., GitLab CI, CircleCI).
  • Experience working with Kubernetes, Docker, and Helm.
  • Proficiency with automation and orchestration tools such as Argo.
  • Bachelor’s degree in computer science, information systems, or a related field, or an equivalent combination of education and experience.
If you are passionate about leading data engineering initiatives and driving impactful outcomes in a dynamic environment, we encourage you to apply for this exciting opportunity.
Work arrangement: Hybrid (remote with required office time)

New Relic helps engineers and developers do their best work every day, using data, not opinions, at every stage of the software lifecycle.

Apply Now