Life at Yahoo
Big Job Alert: Tech Lead
Posted: 29th of August, 2011
Are you motivated by the design challenges inherent in building highly
scalable, complex and reliable software systems? Do you relish the
opportunity to work on one of the largest distributed systems in the
world, running on thousands of machines and handling petabytes of data?
The Hadoop Engineering team in Yahoo Cloud Computing, Bangalore is looking for smart engineers to help develop hugely scalable, highly performant, and reliable platforms, and scheduling for such environments. These platforms handle the data manipulation, mining, and storage needs of applications that work with multi-terabyte data sets. Developing this infrastructure requires solving technical challenges in parallel and distributed computing, multi-terabyte storage systems, and high-performance computing. It calls for skills in distributed algorithms and file systems, software design principles, and systems programming, along with expertise in Java and C/C++. You will be expected to build scalable, modular systems; measure and optimize system performance; and ensure that systems run reliably in a 24/7 production environment.
Requisition Number: 34827
Location: Bangalore, IN - Bangalore Intermediate Ring Road
Our primary distributed computing platform is Hadoop (http://hadoop.apache.org/), an Apache Software Foundation open source project that is fast becoming the grid platform of choice. We are the primary contributors to Hadoop. The Bangalore Hadoop team is also responsible for developing and enhancing Hive (http://wiki.apache.org/hadoop/Hive).
- You should have a formal engineering degree in Computer Science
- You should have 5+ years of experience with specialization in distributed and parallel computing, addressing requirements like high performance, fault tolerance, maintainability, and serviceability. Your experience should demonstrate that you have successfully designed and built applications that fully exploit available computing resources to deliver maximum performance and scalability.
- 5+ years of experience in Java and/or C++ on UNIX/Linux platforms. Proficiency in core Java and multi-threaded programming is a must, along with hands-on experience using design patterns and best practices for developing scalable, high-performance applications.
- Strong debugging and troubleshooting skills.
- Strong analytical and problem solving skills.
- Collaborate with teams in the US to define the architecture and design of the next generation of Hadoop
- M.S. in Computer Science.
- Understanding of cloud computing, resource managers like Torque/Maui, and Hadoop
- Understanding of database internals, such as query parsing, execution plans, query optimization, and data storage