Hadoop Developer

About Capgemini

With more than 190,000 people, Capgemini is present in over 40 countries and celebrates its 50th Anniversary year in 2017. A global leader in consulting, technology and outsourcing services, the Group reported 2016 global revenues of EUR 12.5 billion. Together with its clients, Capgemini creates and delivers business, technology and digital solutions that fit their needs, enabling them to achieve innovation and competitiveness. A deeply multicultural organization, Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore®, its worldwide delivery model. Rightshore® is a trademark belonging to Capgemini.

Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.

Position Type: Full Time/Permanent

Role: Hadoop Developer

Job Description:

We are looking for a Hadoop Developer with 6 to 10 years of relevant experience and excellent hands-on expertise with the technology.
Work in Hadoop, including:
• Designing, building, installing, configuring and supporting Hadoop
• Creating a complete data lake that streams in data from multiple sources and enables analytics on it using standard BI analytic tools
• Proposing best practices and standards
• Maintaining security and data privacy
• Liaising with the business team for testing and related activities
• Working directly with the client
Basic Technical Requirements:
• Excellent understanding of HDFS and Big Data design
• Ability to write MapReduce jobs
• Hands-on experience writing Pig scripts
• Hands-on experience with Hive
• Familiarity with data loading tools such as Flume and Sqoop
• Knowledge of workflow schedulers such as Oozie
• Knowledge of Spark is a big plus
• Familiarity with tools such as Hue and Impala
• Familiarity with Hadoop infrastructure, installation and setup; the candidate will be required to quickly set up HDFS space for analytics and POCs
• Experience with Big Data on cloud services such as Amazon EMR (Elastic MapReduce) with S3 (Simple Storage Service) is desirable
• Experience with Unix shell scripts
• Bachelor's degree in Engineering, preferably Computer Science/Engineering, from a top-tier university
• Excellent written and oral communication skills in English are essential
Desirable Skills:
• Adaptable to working with new and different technologies

Disclaimer: Capgemini America Inc. and its U.S. affiliates are EEO/AA employers. Capgemini conducts all employment-related activities without regard to race, religion, color, national origin, age, sex, marital status, sexual orientation, gender identity/expression, disability, citizenship status, genetics, or status as a Vietnam-era, special disabled, or other covered veteran. Click the following link for more information on your rights as an Applicant: http://www.capgemini.com/resources/equal-

Company: Capgemini
Posted: 03/26/2018
Type: Full time
Location: New York, New York, US