In this role, you will work with business leaders, data scientists, and software developers to build and optimize cloud data services.

You’ll get to work in a cloud and data development group that works across HP Inc. on printing devices and manufacturing processes.  This group practices agile methods and continuous integration/deployment.  We work as a supportive team and invest in your career development.  HP Inc. has an exciting 10-year plan that will have a big impact on the industry and the world.  If this sounds exciting and fun to you, please apply! 

 

Responsibilities

Develop technical strategies for new data projects and for the optimization of existing solutions which may include remediation of legacy solutions such as Teradata

Collaborate with architecture and platform teams on end-to-end designs for enterprise-scale data and analytics solutions and data products

Collaborate with/across data engineering teams on designs and implementation details for distributed data processing pipelines using tools and languages prevalent in the big data ecosystem

Solve complex technical problems

Keep a pulse on the industry and the enterprise. Research, evaluate, and adopt new technologies/tools/frameworks centered around high-volume data processing. Continually innovate to address both current and future customer needs

Define, design, and build utilities, user defined functions, libraries, and frameworks to better enable data flow patterns

Create strategies for building or improving continuous integration, test-driven development, and production deployment frameworks and software development lifecycles

 

Desired Skills: 

Experience rolling out self-service analytics solutions

Experience with cloud compute platforms (AWS, Azure)

Experience with Spring Boot and Databricks

 

In accordance with applicable law, an offer of employment is conditional upon you providing proof that you are fully vaccinated against COVID-19 (as defined by the CDC) as of your first day of employment.

BS or MS in Computer Science, Mathematics, or an equivalent field

Extensive experience working with Spark and related processing frameworks

Experience with messaging/streaming/complex event processing tooling and frameworks with an emphasis on Spark Streaming

Experience with data warehousing concepts, SQL, and SQL analytical functions, and working on solutions such as Teradata

Experience with machine learning processing and frameworks

Experience with workflow orchestration tools like Apache Airflow

Experience with source code control tools like GitHub or Bitbucket

Extensive experience with performance and scalability tuning

Experience with infrastructure-as-code tools such as Terraform or CloudFormation and automation tools such as Jenkins

Experience with practices like Continuous Integration, Continuous Delivery, and Automated Testing

Proven ability to influence and communicate effectively, both verbally and in writing, with team members and technical and business partners

Proven track record of rapidly learning new technologies and developing and implementing proof of concepts and practical, working code

Computer Science fundamentals in algorithm design, complexity analysis, problem solving and diagnosis

Knowledge of professional software engineering practices

Work Location: Hybrid (remote with required office time)
Employment Type: Full time

You’re out to reimagine and reinvent what’s possible—in your career as well as the world around you.  

So are we. We love taking on tough challenges, disrupting the status quo, and creating what’s...
