Adobe believes in hiring the very best. We are known for our vibrant, dynamic and rewarding workplace where personal and professional fulfillment and company success go hand in hand. We take pride in creating exceptional work experiences, encouraging innovation and being involved with our employees, customers and communities. We invite you to discover what makes Adobe a place where you can do and be your best.
Experience A Day in the Life at Adobe.
Adobe is seeking a Senior Hadoop Engineer with top-notch technical skills to build, configure and tune multi-petabyte Hadoop clusters. This individual will deploy and administer the Hadoop software, MapReduce, HDFS, HBase, Hue and related tools, and will ensure that the Hadoop infrastructure is implemented according to best practices, managed optimally and aligned with the strategic architectural roadmap. This individual will work closely with the big data and server admin teams.
- Design and deploy a Hadoop cluster production environment that can scale to petabytes
- Manage HDFS, Hue and related Hadoop tools
- Design, configure and manage the backup and disaster recovery for Hadoop data
- Optimize and tune the Hadoop environment to meet the performance requirements
- Install and configure monitoring tools for the Hadoop environment
- Work with big data developers, designers and scientists to troubleshoot MapReduce job failures and issues with Hive, Pig, HBase, Flume, etc.
- Be the subject-matter expert (SME) on Hadoop and related tools; provide guidance to the big data team on getting the best out of the Hadoop cluster
- Work with the Linux server admin team to administer the server hardware and operating system
- Minimum 2-3 years of experience deploying and administering a multi-petabyte Hadoop cluster
- BS degree (or equivalent) and a minimum of 10 years of overall experience in Linux systems and storage administration
- Well versed in installing and managing distributions of Hadoop (CDH3, CDH4, Cloudera Manager, MapR, Hortonworks, etc.)
- Excellent knowledge of Hadoop integration points with Enterprise BI and EDW tools
- Excellent knowledge of Python, Perl and shell scripting, with hands-on programming skills
- Advanced knowledge of performance troubleshooting and tuning for Hadoop clusters
- Good knowledge of Hadoop cluster connectivity and security.
- Development experience with Java programming.
- Development experience in Hive, Pig, Sqoop, Flume and HBase desired
- Can speak and understand the language of big data scientists
- Exposure to Cassandra, MongoDB or CouchDB is a plus
- Prior knowledge of scale-out storage systems and non-Hadoop distributed computing systems is a plus
Adobe has been a pioneer and innovator throughout its history and is recognized as one of the Top 100 Best Global Brands according to Interbrand. Adobe’s dynamic working environment is also well known and has received awards throughout the globe. Recognizing that employees are at the core of our success, Adobe recruits and retains highly qualified and motivated individuals, creates an environment where they can innovate and achieve their best, and rewards them for their performance by giving them an opportunity to share in the company’s success.
Build careers that change the world. Download the Adobe Life digital magazine and discover what our employees are saying about their career experiences at Adobe.
Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace.
About Adobe United States
Adobe has over 5,800 employees in the U.S. and is headquartered in San Jose, California, with other office locations nationwide.
Americas-USW-San Jose (Headquarters)
For the complete job posting, click here.