Cloud Data Engineer
Purpose of the Position: As a Cloud Data Engineer, this position requires candidates who are enthusiastic about data and driven to help solve the organization's AI & Analytics challenges. As a member of the team, you will support our clients on their data and analytics journey by building data pipelines, improving data consistency, and building the infrastructure needed to support their analytics platform.
Work and Technical Experience:
- Strong understanding of cloud and data engineering concepts
- Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity
- Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
- Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it
- In-depth knowledge of key cloud services for data integration, BI, and data processing
- In-depth knowledge of cloud storage and compute services
- Can write unit/integration tests, contribute to the engineering wiki, and document work
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution
- Works closely with a team of frontend and backend engineers, product/project managers, and analysts
- Knowledge of containerization with Docker and its orchestration through Kubernetes
- Good expertise in big data tools such as Spark and Airflow
- Strong experience with event stream processing technologies such as Kafka
- Experience with at least one programming language (Java, Scala, Python)
- Experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR) is a plus
- Knowledge of operating systems (must have): any flavor of Linux; ETL tools (good to have): Informatica, Talend
- Deep understanding of cloud computing infrastructure and platforms.
- Experience enabling DevOps automation for AWS or Azure with appropriate security and privacy considerations
- Expertise in developing ETL workflows involving complex transformations such as SCDs, deduplication, and aggregations
- Good expertise in databases, including NoSQL; experience with cloud databases such as Snowflake and Redshift is a plus
- Proven ability to work across multiple requirements, design and development approaches, methodologies (agile, iterative, waterfall, etc.), and risk-mitigation strategies
- 4+ years of overall experience, with a minimum of 2 to 3 years on AWS- or Azure-related projects
- BS degree in IT, MIS, or a business-related discipline
- Experience with or knowledge of Agile Software Development methodologies