
Hadoop Engineer / Expert Position Open (Munich)

Hi All,

There is an open position for a Hadoop Engineer or Expert to set up a big data platform. It is based in Munich.

Key Responsibilities
• Manage a very large-scale, multi-tenant, secure and highly available Hadoop infrastructure supporting rapid data growth for multiple internal customers; install operating system and Hadoop updates, patches and version upgrades
• Design, implement and maintain enterprise-level security (Kerberos, Active Directory, etc.)
• Troubleshoot Hadoop-related applications, components and infrastructure issues at large scale
• Identify new components, functions and features and drive exploration to implementation
• Create run books for troubleshooting, cluster recovery and routine cluster maintenance
• Design, configure and manage the strategy and execution for backup and disaster recovery of big data
• Provide hardware architecture guidance, plan and estimate cluster capacity, and create roadmaps for Hadoop cluster deployment
• Provide 3rd-level support (DevOps) for business-critical applications and use cases
• Evaluate and propose new tools and technologies to meet the needs of the extended organization (Allianz Group)
• Work closely with infrastructure, network, database, application, business intelligence and data science units

Key Requirements, Skills and Experience
• University degree in computer science, mathematics, business informatics or in another technical field of study
• Deep understanding of grid computing design principles and the factors determining and affecting distributed system performance
• Experience with implementing Hadoop in a large scale environment, preferably including multi-tenancy and security with Kerberos
• At least 2 years of excellent hands-on experience with the Hadoop ecosystem, including HDFS, MapReduce, Pig, Hive, Impala, Mahout, Oozie, ZooKeeper and Flume, as well as SQL, R, Python/shell scripting and Linux/Unix
• Well versed in installing, upgrading and managing Hadoop distributions (e.g. CDH 5.x with Cloudera Manager, MapR, etc.)
• Experience with cluster node configuration, connectivity, capacity planning, compute architecture, NameNode/DataNode/JobTracker deployment layout, server requirements, SAN and RAID configurations, etc.
• Experience working in an international environment
• Excellent communication skills and high level of initiative (self-starter)
• Strong sense of ownership to independently drive a topic to resolution
• Ability and willingness to go the extra mile and support the overall team
• Business-fluent English, spoken and written; German is a plus

Thanks in advance for spreading the word :-)