AWS Data Engineering

Confidential | Hyderabad, TG, IN

Posted 18 days ago

Description

Hands-on experience with AWS data engineering services such as Glue, EMR, EC2, Lambda, Athena, SNS, and Redshift, along with Python, Spark, and Bitbucket/GitHub (a short illustrative sketch follows the responsibilities below).
• Build partnerships with key stakeholders and provide front line support for business requests from various divisions/groups
• As a Senior Data Engineer, coordinate with stakeholders on optimal effort estimates, timelines, and resource assignments
• For Agile projects, collaborate with the Product Owner on epic and user story definitions and deliver assigned user stories.
• Lead the design and development of data delivery solutions in various fields/domains
• Experience building complete data platform solutions, including storage, governance, security and supporting various read/write access patterns
• Outline and participate in producing team deliverables (including architecture and technical design documentation, standards, code development, and QA) to high quality standards
• Enforce sound development practices and ensure quality delivery of enterprise solutions, including but not limited to executing code reviews.
• Hands-on experience developing foundational frameworks across various technologies, including AWS cloud data lakes and Databricks.
• Act as a technical adviser to application development and internal teams within the data delivery organization to plan, implement, and support new and existing technologies.
• Take responsibility for delivering data-driven applications by leading several consultants.
• Serve as the technical liaison to one or two Agile delivery teams.
• Assist with the development of consistent technology frameworks, platforms, standards, and processes and identify current and future application development trends to be incorporated into a strategic road map.
• Deliver platform and architecture recommendations based on project requirements and industry best practices.
• As the subject matter expert, mentor junior team members in designing and building data delivery solutions.
• Resolve technical issues and identify risks by building consensus among technical stakeholders.
• Perform hands-on coding and lead a team to implement solutions.
• Track record of innovation and expertise in data engineering using various technologies, including AWS or any other cloud platform.
• Develop documentation and training materials to support the data delivery organization.
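
To give a concrete (and purely hypothetical) flavor of the work described above, the sketch below shows a minimal AWS Glue PySpark job that reads a table from the Glue Data Catalog, aggregates it with Spark, and writes partitioned Parquet back to S3. It is meant to run inside the Glue job environment; the database, table, column, and bucket names are illustrative placeholders, not values taken from this posting.

```python
# Minimal AWS Glue job sketch (PySpark). All names below are hypothetical
# placeholders used for illustration only.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve job arguments and initialize the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog.
orders_dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",       # hypothetical catalog database
    table_name="raw_orders",   # hypothetical catalog table
)

# Convert to a Spark DataFrame and aggregate daily totals per region.
orders = orders_dyf.toDF()
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write the curated output back to the data lake as partitioned Parquet.
(daily_totals
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/daily_order_totals/"))  # hypothetical bucket

job.commit()
```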
Education:
Bachelor's Degree in Computer Science or equivalent with 7+ years of experience
• 5+ years of experience in a data & analytics role with a strong understanding of technical, business, and operational process requirements.
Experience:
• Experience with requirements gathering and the ability to propose and translate requirements into technical solutions.
• Experience providing guidance on and facilitating user acceptance testing (UAT).
• Experience handling urgent matters and the ability to prioritize tasks to support urgent needs.
• Hands-on, working experience with writing complex SQL queries
• Expertise in data management practices in both legacy and modern data warehouse technologies.
• Expertise in cloud and on-premises data platform tools such as AWS data lakes, AWS Athena, Redshift, Informatica, Databricks, Spark, and data pipelines.
• Extensive experience building data lakes with Databricks on AWS, Apache Spark, and Python.
• 2+ years of working experience in a DevOps environment, data integration and pipeline development.
• 2+ years of experience with AWS cloud data integration using Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
• Demonstrated skill and ability in the development of data warehouse projects/applications (Oracle & SQL Server)
• Strong hands-on experience in Python development, especially PySpark in an AWS cloud environment (see the sketch after this list).
• Experience with Python and common Python libraries.
• Strong analytical database experience, including writing complex queries, query optimization, debugging, user-defined functions, views, and indexes.
• Experience with source control systems such as GitHub and Bitbucket, and with Jenkins build and continuous-integration tools.
• Knowledge of extract development against ERPs (SAP, Siebel, JDE, BAAN) preferred.
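
As a further illustration of the PySpark and SQL skills listed above, the following is a minimal, self-contained sketch that registers a DataFrame as a temporary view and runs a window-function query of the kind implied by "complex SQL queries". All paths and column names are hypothetical placeholders; in practice the source data would typically live in S3 rather than on local disk.

```python
# Minimal PySpark + Spark SQL sketch. Paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-metrics-sketch").getOrCreate()

# Load source data and expose it to SQL as a temporary view.
orders = spark.read.parquet("data/orders/")   # hypothetical path
orders.createOrReplaceTempView("orders")

# Window-function query: running spend and recency rank per customer.
customer_metrics = spark.sql("""
    SELECT
        customer_id,
        order_id,
        order_ts,
        amount,
        SUM(amount) OVER (
            PARTITION BY customer_id
            ORDER BY order_ts
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
        ) AS running_spend,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_ts DESC
        ) AS recency_rank
    FROM orders
""")

# Keep only each customer's most recent order together with its running spend.
latest_orders = customer_metrics.filter("recency_rank = 1")
latest_orders.write.mode("overwrite").parquet("data/customer_latest_orders/")

spark.stop()
```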

Location: Pune, Mumbai, Chennai, Kolkata, Delhi, and Bangalore

Skills: AWS, Spark, Python, EC2

Experience: 7-10 Years