At EY, we're all in to shape your future with confidence.

We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go.

Join EY and help to build a better working world.

Designation: AIOps Engineer

Job Description

- Develop and maintain scalable data pipelines using tools such as Apache Kafka, Apache Spark, or AWS Glue to ingest, process, and transform large datasets from various sources, ensuring efficient data flow and processing.
- Design and implement data models and schemas in data warehouses (e.g., Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake) to support analytics and reporting needs.
- Collaborate with data scientists and analysts to understand data requirements, ensuring data availability and accessibility for analytics, machine learning, and reporting.
- Use ETL tools and frameworks (e.g., Apache NiFi, Talend, or custom Python scripts) to automate data extraction, transformation, and loading processes, ensuring data quality and integrity.
- Monitor and optimize data pipeline performance using tools such as Apache Airflow or AWS Step Functions, implementing best practices for data processing and workflow management.
- Write, test, and maintain scripts in Python, SQL, or Bash for data processing, automation tasks, and data validation, ensuring high code quality and performance.
- Implement CI/CD practices for data engineering workflows using tools such as Jenkins, GitLab CI, or Azure DevOps, automating the deployment of data pipelines and infrastructure changes.
- Collaborate with DevOps teams to integrate data solutions into existing infrastructure, leveraging Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation for provisioning and managing resources.
- Manage containerized data applications using Docker and orchestrate them with Kubernetes, ensuring scalability and reliability of data processing applications.
- Implement monitoring and logging solutions using tools such as Prometheus, Grafana, or the ELK Stack to track data pipeline performance, troubleshoot issues, and ensure data quality.
- Ensure compliance with data governance, security best practices, and data privacy regulations, embedding DevSecOps principles in data workflows.
- Participate in code reviews and contribute to the development of best practices for data engineering, data quality, and DevOps methodologies.
- Contribute to the documentation of data architecture, processes, and workflows for knowledge sharing, compliance, and onboarding purposes.
- Demonstrate strong communication skills to collaborate effectively with cross-functional teams, including data science, analytics, and business stakeholders.

Desired Profile

- Expert proficiency in Terraform and extensive experience across at least two major cloud platforms (AWS, Azure, GCP).
- Strong hands-on experience with Kubernetes, Helm charts, and designing and optimizing CI/CD pipelines (e.g., Jenkins, GitLab CI) is essential.
- Proficiency in Python and scripting languages (Bash/PowerShell) is also a must.
- Experience with data-related tools and technologies, including Apache Kafka, Apache Spark, or AWS Glue for data processing, as well as familiarity with data storage solutions such as Amazon Redshift, Google BigQuery, or Azure Data Lake, is highly valued.
- Valued experience includes leading cloud migration initiatives, contributing to RFP/RFI processes, and mentoring teams.
- Excellent problem-solving, communication, and collaboration skills are critical.
- Experience with configuration management tools (Ansible, Puppet) and a solid understanding of DevSecOps principles are required; experience with OpenShift is a plus.

Experience: 6 years and above

Education: B.Tech. / BS in Computer Science

Technical Skills & Certifications

- Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Google Professional Data Engineer, Azure Data Engineer Associate).
- Proficiency in data processing tools and frameworks such as Apache Kafka, Apache Spark, and AWS Glue.
- Strong experience with data storage solutions such as Amazon Redshift, Google BigQuery, and Azure Data Lake.
- Expertise in ETL tools and frameworks (e.g., Apache NiFi, Talend) and scripting languages (Python, SQL, Bash).
- Familiarity with CI/CD practices for data engineering workflows using tools such as Jenkins, GitLab CI, or Azure DevOps.
- Experience with Infrastructure as Code (IaC) tools such as Terraform for provisioning and managing data resources.
- Knowledge of containerization technologies (Docker) and orchestration (Kubernetes) for managing data applications.
- Understanding of monitoring and logging solutions (e.g., Prometheus, Grafana, ELK Stack) to ensure data pipeline performance and quality.
- Awareness of data governance, security best practices, and data privacy regulations, embedding DevSecOps principles in data workflows.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.

Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.

EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.