Hiring: Big Data Architects

HCLTech | Bangalore, KA, IN

Posted 6 days ago

Description

Role: Big Data Architect

Roles/Responsibilities

  • Should have skills in big data tools and technologies, including Hadoop, Accumulo, MapReduce, Hive, HBase, Panoply, and Redshift.
  • Works closely with the customer and the solutions architect to translate the customer's business requirements into a Big Data solution
  • Understands the complexity of data and can design systems and models to handle data of different varieties (structured, semi-structured, unstructured), volumes, velocities (including stream processing), and veracity
  • Understand Data Modeling
  • Address information governance and security challenges associated with systems.
  • Ability to identify/support non-functional requirements for the solution
  • Select and integrate any Big Data tools and frameworks
  • Understand Cluster Management, Network Requirements, Latency, Scalability
  • Understand Data Replication and Synchronization, High Availability, Disaster Recovery
  • Understand Overall performance (Query Performance, Workload Management, Database Tuning)
  • Propose recommended and/or best practices regarding the movement, manipulation, and storage of data in a big data solution, including, but not limited to:
      • Data ingestion technical options
      • Data storage options and their ramifications (for example, the additional requirements and challenges introduced by data in the cloud)
      • Data querying techniques and availability to support analytics
      • Data lineage and data governance
      • Data variety (social, machine data) and data volume
  • Understand/implement and provide guidance around data security to support implementation, including but not limited to:
      • LDAP Security
      • User Roles/Security
      • Data Monitoring
  • Implement ETL processes
  • Supervise the migration of data from legacy systems to new solutions
  • Monitor performance and advise on necessary infrastructure changes
  • Prepare database design and architecture reports
  • Define data retention policies
  • Test, troubleshoot, and integrate new features


Required Qualifications

  • Bachelor’s degree or equivalent experience in Computer Science
  • Should have a minimum of 10 years of IT experience.
  • Minimum of 3 years of experience in designing and implementing fully operational Big Data solutions on the cloud.
  • Knowledge of and hands-on experience with current data technologies such as Hadoop, MapReduce, HBase, Oozie, Flume, MongoDB, Cassandra, and Pig.
  • Keen interest in and experience with current programming languages and technologies; HTML5, CSS, JavaScript frameworks, RESTful services, Spark, Python, Hive, and Kafka are among the essential skills.
  • Should have experience and knowledge of cloud computing and working in cloud environments.