Funding: 20M-50M USD
Industry: Security, Software, Database, Enterprise, Open Source
Sr. Performance Engineer – Distributed Architectures

at Hortonworks Inc. in CA 94303   —   Jun 27, 2014
About Hortonworks, Inc.
Apache Hadoop™ is the principal open source Cloud Computing platform for storing, managing, processing and analyzing “Big Data”. Most leading Internet and Social Networking businesses use it, along with other companies around the globe. We believe that more than half of the world’s data will be stored in Hadoop environments within five years.

Hortonworks, Inc. is a dynamic startup created to accelerate the development and adoption of Apache Hadoop. Together with the Apache open source community, we are making Hadoop more powerful, more robust, and easier to install, manage, and use. We also provide support and training for Apache Hadoop.
The Hortonworks Engineering team is looking for experienced software engineers who are passionate about performance engineering to deliver focused improvements in the open source Apache Hadoop projects (HDFS, Map/Reduce, HBase, and others) and the related products we provide to our customers. You will do this through your deep understanding of performance test tools and methodology, performance-related infrastructure issues, and your well-developed problem-solving and software design skills.
Contributing strongly in this role will give you visibility in the prestigious Apache open source community and the opportunity to build a public reputation for your contributions.
Responsibilities include:
Work with customers and partners to help them analyze, model, and resolve performance issues, and bring requirements and knowledge back to Hortonworks Engineering
Propose patches to network, RPC, and file system implementations to improve performance
Communicate and “sell” design improvements for performance, both to internal architects and to customer IT staff
Hands-on engineering – you must be able and willing to implement, test, and validate what you design
Requirements:
Knows benchmarking and performance analysis tools forwards and backwards
A coding god, deeply familiar with implementations of one or more performance hotspots:
Network stacks
RPC and MP systems
File systems
Database optimization
Able to explain, in depth, the interactions of the above elements with specific distributed algorithms, in the Hadoop context
Additional skills:
Excellent verbal and written communication
Collaborative and self-driven
Knowledge of Hadoop and related technologies is desirable
Experience coding and improving performance in Hadoop is a big plus
BS in Computer Science minimum; MS and/or PhD is a plus