We are looking for a hybrid Data Product Engineer who sits at the intersection of Data Engineering, Cloud Architecture, and Software Development. Unlike traditional data roles that focus solely on SQL pipelines, you will be a "Builder"—writing Python code for AWS infrastructure, developing React frontends for internal apps, and integrating AI models to drive smarter business decisions. You will turn data into interactive software products.
This role is hybrid, based on-site three days per week in our Atlanta or Austin office.
• Data Engineering & Warehousing: Architect and maintain robust data
pipelines using Python, AWS Glue/Lambda, and SQL to efficiently move and
transform data between Snowflake and custom applications.
• AI & LLM Integration: Implement Generative AI solutions using AWS Bedrock
and LLMs to automate workflows and derive predictive insights from
commercial data.
• AWS Cloud Infrastructure: Deploy and manage secure cloud resources (APIs,
Databases, Storage) using Infrastructure as Code tools like Terraform.
• Data Consistency: Collaborate with the Core Data team to ensure applications
consume data from the single source of truth in strict alignment with
governance standards.
• API Development: Build secure, scalable REST/HTTP APIs to connect the data
layer with front-end applications.
• Full Stack Visualisation: Develop internal web apps using React, Vite, and
TypeScript to visualise datasets and capture user inputs.
• Rapid Prototyping: Execute short sprints to build Proof of Concepts (PoCs)
that validate new data and AI business ideas.
Skills, know-how and experience:
Must Have:
• Core Data Engineering: Strong coding skills in Python for backend data
processing and the ability to write complex, performant SQL to query data
warehouses (specifically Snowflake).
• AWS Expertise: Hands-on experience with core AWS services (S3,
Lambda, Glue, API Gateway) and an understanding of cloud security best
practices.
• AI/LLM Knowledge: A solid understanding of Large Language Models
(LLMs) and experience or strong familiarity with AI platforms such as AWS
Bedrock.
• Modern Development: Familiarity with version control (Git/GitHub) and
CI/CD deployment flows.
• Frontend Competence: Experience with React (or similar modern JS
frameworks) and TypeScript to support full-stack requirements.
Preferred:
• Data Modelling: Experience with dbt to understand and collaborate on
core data transformation layers.
• Advanced AI Engineering: Practical experience fine-tuning models or
engineering prompts for commercial applications.
Key performance indicators:
• Successful deployment of PoCs and applications within agreed timelines.
• Quality and reliability of code (bugs/uptime).
• Ability to translate a vague business problem into a working technical
solution.
Plenty of perks:
• Competitive salaries that landed us in the top 5% of similar-sized companies (according to Comparably)
• Comprehensive health, dental and vision coverage
• 401(k) retirement match (100% matching up to 4%)
• 32 days paid time off (21 personal days, 10 national holidays, 1 floating holiday)
• 18 weeks paid parental leave for birth, adoption or surrogacy offered 1 year after start date
• 5 days paid yearly to volunteer (through Sage Foundation)
• $5,250 tuition reimbursement per calendar year starting 6 months after hire date
• Sage Wellness Rewards Program ($600 wellness credit and $360 fitness reimbursement annually)
• Library of on-demand career development options and ongoing training offerings
What it’s like to work at Sage:
Careers homepage - https://www.sage.com/en-us/company/careers/
Glassdoor reviews - https://www.glassdoor.com/Reviews/Sage-Reviews-E1150.htm
LinkedIn page - https://www.linkedin.com/company/sage-software
#LI-RM1
At Sage, we knock down barriers with information, insights, and tools to help your business flow.
We provide businesses with software and services that are simple and easy to use, as we work with you...
Apply Now