Scala & Spark Architect - Azure Data Factory

Job Description:

- Strong understanding of Big Data Analytics platforms and ETL in the context of Big Data

- Ability to frame architectural decisions, provide technology leadership & direction.

- Should be able to design complex, high-performance data architectures.

- Developing and maintaining strong client relations with senior executives & architects, developing new insights into the client's business model and pain points, and delivering actionable, high-impact results.

- Expertise in an object-oriented/scripting language: PySpark or Spark with Scala

- Experience with Big Data tools: Spark is a must; Hadoop and Kafka are good to have.

- Hands-on experience with Microsoft Azure Data Factory, Azure Logic Apps, and Azure Databricks.

- Extensive experience working with large data sets and hands-on technology skills to design and build robust Big Data solutions using the Spark framework, Azure Databricks, and Azure Data Factory.

- Extensive experience in data modeling and database design involving any combination of data warehousing and Business Intelligence systems and tools.

- Measure and analyze online metrics for business operations, performance reporting, and ad hoc analysis.

- Contributing to thought capital through the creation of executive presentations, architecture documents, and IT position papers.

- Ability to work well under aggressive deadlines while meeting requirements and deliverables, mentoring the team, and performing code reviews.

Job Requirements: Scala, Data Management, Data Analysis

Mandatory Skills: Scala, Data Management, Data Analysis

MUST HAVES:

SCALA AND SPARK - Minimum 2-3 years of experience

(ref:hirist.com)
Posted: 10/04/2021
Location: Chennai, TN, IN