Your opportunity
Do you enjoy creating and driving a vision across multiple teams? Do you get excited about complex distributed systems? Do you love data? We’re looking for an experienced technology leader to lead and grow our global engineering team and help our customers build better software. How do you know if this is the right opportunity for you? You are an experienced builder and leader of teams, with the attention to detail needed to work on some of our most critical systems, and you thrive in a dynamic, customer-obsessed, continuous deployment environment. You have a profound sense of ownership that drives enduring business success.

We are seeking a skilled engineering leader to guide a team building a groundbreaking data pipeline observability platform that enhances the integrity and reliability of enterprise data flows. The role requires a deep understanding of distributed systems and monitoring techniques, along with a solid grasp of ELT, EDW, and data pipelines. You'll combine technical and leadership skills to build user-centric products that enable data teams to establish reliability and trust in their data.
This is a hybrid role: some time working in the office is required, with optional work from home.

What you'll do
  • Lead a team of software and data engineers in the design, development, and deployment of ETL pipeline observability software.
  • Implement best practices for logging, tracing, and debugging to enhance visibility into data flow and pipeline execution (see the sketch after this list).
  • Define the technical architecture and roadmap for the ETL pipeline observability platform, ensuring scalability, reliability, and performance.
  • Collaborate with product management to gather requirements, prioritize features, and define the product vision and strategy.
  • Collaborate with cross-functional teams and translate technical concepts for diverse audiences.
  • Drive the recruitment, training, and development of top-tier talent in the region.
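
As a flavor of the logging and tracing work above, here is a minimal, hypothetical Python sketch of structured instrumentation around a single pipeline step. Everything in it (the run_step helper, the event field names) is an illustrative assumption, not a New Relic API.

    import json
    import logging
    import time
    import uuid

    logging.basicConfig(level=logging.INFO, format="%(message)s")
    log = logging.getLogger("pipeline")

    def run_step(step_name, step_fn, records):
        """Run one pipeline step, emitting structured start/end events."""
        run_id = str(uuid.uuid4())  # correlates the events for one execution
        start = time.monotonic()
        log.info(json.dumps({"event": "step_start", "step": step_name, "run_id": run_id}))
        try:
            result = step_fn(records)
            status, error = "success", None
        except Exception as exc:
            status, error = "failed", str(exc)
            raise
        finally:
            log.info(json.dumps({
                "event": "step_end", "step": step_name, "run_id": run_id,
                "status": status, "error": error,
                "duration_s": round(time.monotonic() - start, 3),
            }))
        return result

    # Example: a trivial transform step under observation.
    cleaned = run_step("drop_nulls", lambda rows: [r for r in rows if r is not None], [1, None, 2])

Machine-parseable events like these are what let an observability platform reconstruct pipeline execution and surface failures after the fact.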


This role requires
  • Experience in software engineering, with a focus on building scalable and reliable software systems.
  • Strong knowledge of software engineering best practices, including agile development methodologies, code review processes, and continuous integration/continuous deployment (CI/CD) pipelines.
  • Deep understanding of ETL processes and technologies, with experience working with tools such as Apache Spark, Apache Kafka, and Airflow (a minimal example follows this list).
  • Significant experience building and managing data pipelines (ETL/ELT processes, data streaming, etc.) is highly valued.
  • Strong experience with containerization technologies (e.g., Docker, Kubernetes) and building software on one of the major cloud platforms (e.g., AWS, Azure, GCP).
  • Strong programming skills, preferably in Python, Java, or similar languages. Experience with relevant data processing libraries or frameworks (e.g., Apache Spark, Kafka, Airflow, etc.).
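
As a sketch of the pipeline tooling mentioned above, here is a minimal ETL DAG in the style of a recent Airflow 2.x release; the dag_id, schedule, and task logic are placeholder assumptions, not a real pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        return [1, 2, 3]  # placeholder for a real source read

    def transform(**context):
        rows = context["ti"].xcom_pull(task_ids="extract")
        return [r * 2 for r in rows]

    def load(**context):
        rows = context["ti"].xcom_pull(task_ids="transform")
        print(f"loaded {len(rows)} rows")  # placeholder for a warehouse write

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        (PythonOperator(task_id="extract", python_callable=extract)
         >> PythonOperator(task_id="transform", python_callable=transform)
         >> PythonOperator(task_id="load", python_callable=load))

Return values flow between tasks via XCom here; a production pipeline would read from and write to real systems and carry the kind of instrumentation shown earlier.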


Bonus points if you have
  • Experience with AI/ML models

New Relic helps engineers and developers do their best work every day — using data, not opinions — at every stage of the software lifecycle. The world’s best engineering teams rely on New Relic to...

Apply Now