Posted On 09 August

  • Hadoop Developer

    • Company Capgemini
    • No. of Openings 10+
    • Salary Not Disclosed
    • Work Type on-site

    Job Description:

    Urgent Requirement for Hadoop Developer


    Duties and Responsibilities:

    • Provide engineering support for upcoming projects and operations
    • Design, develop, and maintain Scala/Spark applications on the Hadoop ecosystem
    • Support database performance tuning in consultation with the client
    • Support engineering security inspections
    • Perform basic Hadoop administration and job monitoring

    Requirements:

    • Sound understanding of the Hadoop ecosystem
    • Hands-on experience of Hadoop components: HDFS, Hive, HBase, Phoenix, Solr, Oozie
    • Minimum 3 years of experience in Scala
    • Strong coding expertise in Scala and Spark
    • Good understanding of database concepts and SQL
    • Experience with Unix and shell scripts
    • Good knowledge of git and sbt
    • Experience with database performance tuning for Oracle and SQL Server
    • Working experience with the Hadoop framework, including HDFS, Hive, HBase, MapReduce, Oozie, Phoenix, and Solr
    • Experience with Scala, Spark 2.0, and IDEs such as Eclipse or IntelliJ
    • Experience with RDBMS and SQL, Unix/Linux shell scripting, Python, Java, and cloud computing
    • Experience with data transformation using Spark, including streaming


    Language Requirement:

    Japanese: Business level (mandatory); English: Business level

    Information

    • HR Name: Arushi Dhar
    • HR Email: cg_interview_helpdesk.in@capgemini.com
    • HR Phone: +81 3-6865-9510