Spark Developer

Industry IT
Location Pune
Experience Range 6 - 12 Years
Qualification Graduate

Functional IT Software-Other
Job Description
About Us
“Quess IT Staffing is India’s largest IT staffing company with over 20 years of experience in staffing IT professionals in 300+ companies across levels and skillsets. Our 10,000+ associates deployed in 80+ cities and towns are proficient in over 500 technological skills. Our associates help enable cutting-edge solutions for some of the biggest names across industries. Quess IT Staffing is a division of Quess Corp Limited, India’s leading business services provider and largest domestic private sector employer. Quess Corp Limited is ‘A Great Place to Work’ certified – a testament to our excellent culture, people, and processes.”
About Company
www.magna.in
Roles and Responsibilities

Must Have Skills:

  • Understand functional requirements related to data processing; deliver scalable, high-performance, real-time data processing applications
  • Development of various KPIs in Spark, MapReduce, and Hive (see the sketch after this list)
  • Good knowledge of Spark in Python/Scala/Java
  • Migration of Pig KPI scripts to Hadoop MapReduce (Java) jobs
  • Good knowledge of HBase, Hive, Pig, and Impala
  • Develop scripts/code to schedule jobs and automate file management
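
For illustration, a minimal sketch of the kind of Spark KPI aggregation listed above; the "orders" dataset, its region/amount columns, and the HDFS paths are hypothetical and not part of the requirement:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of a Spark KPI job. The input path, the "orders"
// dataset, and the `region`/`amount` columns are assumptions made
// purely for illustration.
object KpiSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("KpiSketch").getOrCreate()

    // Read a hypothetical CSV dataset from HDFS.
    val orders = spark.read
      .option("header", "true")
      .csv("hdfs:///data/orders")

    // KPI: total order amount per region.
    val kpi = orders
      .groupBy("region")
      .agg(sum(col("amount").cast("double")).alias("total_amount"))

    // Persist the KPI so downstream tools (e.g. Hive, Tableau) can pick it up.
    kpi.write.mode("overwrite").parquet("hdfs:///kpi/total_amount_by_region")

    spark.stop()
  }
}
```

Comparable KPIs could equally be expressed as Hive queries or MapReduce jobs, in line with the skills above.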

 

Good to Have Skills:

  • Knowledge of Hadoop 2.0 architecture
  • Knowledge of Java, Kafka, or any message broker (ActiveMQ, RabbitMQ)
  • Good knowledge of Linux and tools like Splunk, Tableau, and Datameer
  • Familiarity with open source configuration management and deployment tools such as Puppet or Chef.
  • Knowledge of any scripting language (Bash, Perl, Python). Experience in statistical methods, machine learning, and AI is a plus

 

Roles & Responsibilities:

  • Deliver solutions to user-story-based requirements
  • Engage with the Scrum team and deliver on sprint commitments
  • Share responsibility for all team deliverables and favor informal communication with the Product Owner
  • Responsible for implementation and support of the Hadoop environment
  • Involved in fine-tuning, performance tuning, and scaling
  • Hands-on experience with Cloudera or MapR
  • Monitoring Hadoop cluster connectivity and security
  • Scheduling jobs using tools like Oozie and monitoring the schedules
  • Should be able to perform backup, space management, and recovery tasks (a minimal sketch follows this list)
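
For illustration, a minimal sketch of the kind of file-management and space-management automation referred to above; the staging directory and the seven-day retention window are assumptions made only for this example:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Minimal sketch: remove stale files from a staging directory to
// reclaim HDFS space. The directory path and the retention window
// are hypothetical.
object HdfsCleanupSketch {
  def main(args: Array[String]): Unit = {
    val fs = FileSystem.get(new Configuration())
    val stagingDir = new Path("/data/staging") // hypothetical directory
    val cutoff = System.currentTimeMillis() - 7L * 24 * 60 * 60 * 1000

    // Delete files whose last modification time is older than the cutoff.
    fs.listStatus(stagingDir)
      .filter(status => status.isFile && status.getModificationTime < cutoff)
      .foreach(status => fs.delete(status.getPath, false))

    fs.close()
  }
}
```

A job like this would itself typically be scheduled through Oozie or cron and monitored, in line with the scheduling responsibilities listed above.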

 

Top 3 Skills Required:

  • Spark
  • MapReduce (MR)
  • Scala / Python

 
