Hadoop Developer Intern

As a company, TekTree has strengths that are hard to find. For more than a decade, we have acquired, assembled and integrated the dynamic capabilities modern businesses need to succeed in the global marketplace. When we begin working with you, our strengths become your own.

Today's businesses need partners who can talk about strategy and technology in the same conversation. At TekTree, we believe true value from technology requires an in-depth understanding of business strategy. Our cross-industry consulting services help you craft a vision for your organization and then provide a specific, practical business and technology framework that will make that vision a reality.

Business Consulting:
TekTree's strategic business consulting services focus on enhancing the business performance of its clients by streamlining processes, reducing organizational risk, and leveraging the global sourcing/outsourcing organizational model.

Process Consulting:
Today's business environment is characterized by multiple challenges that need to be met and overcome by business leaders and executives. As they plan for developing capabilities for the emerging, complex world of tomorrow, business and IT leaders need answers.

Technology Consulting:
The technology management consulting team at TekTree improves the alignment of business and technology through improved process efficiency, reduced cost and enhanced business value of IT. Our consultants combine extensive technical experience with strong strategic and business focused leadership.


Job Description:
Role: Hadoop Developer
Location: Metro cities in the US
Duration: 6+ months

Job Description and Responsibilities:
The candidate's primary role will be Hadoop Developer. As a developer on the team, you will write Linux shell scripts, set up Autosys jobs, and develop Pig scripts, Hive queries, Oozie workflows, and MapReduce programs. In addition to development activities, you will participate in analysis and design and complete project documentation.

- Experience with emerging Hadoop-based big data and NoSQL technologies
- Hands-on experience with Pig and Hive
- Must have excellent, in-depth knowledge of SQL
- Analyzing data with Hive and Pig
- Importing and exporting data using Sqoop
- Developing MapReduce programs to format the data
- Experience with Cloudera Impala
- Must have excellent communication skills
- Financial/consumer experience is a plus
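As a rough illustration of the MapReduce skill listed above (not part of this role's actual codebase), the sketch below shows a minimal Hadoop Streaming-style mapper/reducer pair in Python. The "account_id,amount" input format and all names are hypothetical:

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming sketch: format records in the mapper,
aggregate per key in the reducer. Input format is a hypothetical
comma-separated "account_id,amount" record."""
import sys
from itertools import groupby

def mapper(lines):
    """Re-format each well-formed record as a tab-separated key/value pair."""
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) == 2:
            account_id, amount = parts
            yield f"{account_id}\t{amount}"

def reducer(lines):
    """Sum amounts per key; Hadoop delivers mapper output sorted by key."""
    keyed = (line.strip().split("\t") for line in lines)
    for key, group in groupby(keyed, key=lambda kv: kv[0]):
        total = sum(float(value) for _, value in group)
        yield f"{key}\t{total}"

if __name__ == "__main__":
    # Hadoop Streaming runs mapper and reducer as separate processes,
    # each reading stdin and writing stdout.
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    step = mapper if stage == "map" else reducer
    for out in step(sys.stdin):
        print(out)
```

With Hadoop Streaming, such a script is passed to the streaming jar via the `-mapper` and `-reducer` options; locally it can be smoke-tested with `cat data.csv | ./format_records.py map | sort | ./format_records.py reduce`.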

Contact Us:

Sunny at 734-661-7972 or Email: sunny@tektreeinc.com // sunny.pydi@gmail.com

Please visit our website at www.tektreeinc.com


Job Requirements:

Must have these skills:
1. Minimum 3-4 years of experience using Core Java

2. Must have 3+ years of development experience in an RDBMS such as MS SQL

3. Experience writing shell scripts in Linux or UNIX

4. Experience with Autosys

5. Must have excellent, in-depth knowledge of SQL

6. Experience analyzing text and streams with emerging Hadoop-based big data and NoSQL technologies:

- Hands-on experience running Pig and Hive queries

- Analyzing data with Hive, Pig, and HBase

- Hands-on experience with Oozie

- Importing and exporting data using Sqoop between HDFS and relational database systems or mainframes

- Loading data into HDFS

- Developing MapReduce programs to format the data

7. Experience developing a batch process

8. Have the zeal to contribute, collaborate and work in a team environment

9. Must have excellent communication skills.
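For candidates less familiar with Oozie (item 6 above), a workflow is a small XML file that chains Hadoop actions together. The fragment below is an illustrative sketch only; the workflow, node, and script names are hypothetical:

```xml
<workflow-app name="format-and-load" xmlns="uri:oozie:workflow:0.5">
    <start to="format-data"/>
    <action name="format-data">
        <pig>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>format_data.pig</script>
        </pig>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Pig action failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```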
