
AI Data Architect

JOB IS OPEN ACROSS ALL LOCATIONS IN INDIA


We are looking for a highly motivated, credible, and trusted Data Platform Architect to drive high-priority customer initiatives on all major cloud platforms, in collaboration with customers and the various account segments of our business. The ideal candidate will have experience in customer-facing roles and a record of success leading deep technical architecture discussions with senior customer executives, enterprise architects, IT management, and developers to drive Data Platform and Advanced Analytics solutions to production.



Responsibilities

  • Provide technical oversight of solution delivery, creating business-driven solutions that adhere to enterprise architecture and data governance standards
  • Contribute to establishing a Data Lake, Enterprise Data Warehouse (EDW), and Operational Data Store (ODS)
  • Play a key role in the data transformation process (data collection, validation/quality, cleaning, exploration, and analysis) required for effective reporting, analytics, and visualization
  • Develop and evolve the enterprise-wide data architecture strategy and roadmap to support delivery
  • Build and maintain artefacts, including entity-relationship models, data dictionaries, taxonomies, and governance frameworks, to aid data traceability
  • Partner with other key teams to define, document, and communicate principles, patterns, and reference architectures that provide a clear vision for delivering value from data
  • Identify fit-for-purpose data stores (relational, NoSQL, document, graph, etc.) to meet business requirements
  • Oversee the migration of data from legacy systems to new solutions
  • Serve as a trusted advisor for the data domain, ensuring architectures enable operational and analytical use cases for a wide range of users
  • Communicate with, influence, and persuade peers and leadership


Required/Minimum Qualifications

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field AND 8+ years' experience in technology solutions, practice development, architecture, consulting, and/or cloud/infrastructure technologies
  • 3+ years of success in consultative/complex technical sales and deployment projects, architecture, design, implementation, and/or support of highly distributed applications
  • Proven experience architecting and implementing big data platforms such as Azure Synapse Analytics, Snowflake, Google BigQuery, and Amazon Redshift
  • A comprehensive understanding of the principles and best practices of data engineering and its supporting technologies, such as RDBMS, NoSQL, and cache/in-memory stores
  • Experience with distributed data and analytics architectures in cloud and hybrid environments, having handled data volumes exceeding 1 TB
  • Experience using Azure, GCP, or AWS cloud data platforms (or similar technologies) to design schemas, build views, and optimize data transformation processes
  • Experience in implementing standards, best practices, and latest technical advancements in areas of data engineering and big data processing
  • Experience optimizing infrastructure costs and application runtimes by monitoring utilization statistics with available tools and right-sizing the underlying compute
  • Awareness of visualization/reporting environments, including Tableau, Power BI, or similar tools
  • Deep knowledge of architectural patterns, such as data warehouse, data lake, and data hub, and the ability to leverage them to enable operational and analytical use cases
  • Familiarity with data modeling and data architecture concepts, including master data management, data curation, ETL/ELT, data pipelines, and data security best practices
  • Experience with the Hadoop development framework, including real-time processing using Spark, NoSQL, and other open-source big data technologies
  • Experience with data ingestion and processing tools such as Flume, Kafka, and Spark; data storage tools including Redshift, PostgreSQL, MongoDB, MarkLogic, and Cassandra; ETL tools like Pig and Hive; the Hadoop file system (HDFS); or other similar tools and technologies
  • Azure Data Engineer (DP-203) certification or equivalent preferred
  • Azure Solutions Architect (AZ-300/301 or AZ-303) certification or equivalent is a nice-to-have


The job entails sitting and working at a computer for extended periods of time. The candidate should be able to communicate by telephone, email, or face to face. Travel may be required per the needs of the role.

Company
Infosys
Posted
07/14/2022
Location
Bangalore, KA, IN