
How much time (approx.) does it take to learn and use Hadoop?

 
David Payne
Ranch Hand
Posts: 35
I want to learn Hadoop as soon as possible so that I can work on enterprise-level projects. Given that I have a "pretty good" knowledge of Java (OO design, threads, data structures, generics), how much time do I need to get a decent foundation in Hadoop? Is it like studying for the OCJP exam, which most people can prepare for in just 3-4 months?
 
Junilu Lacar
Bartender
Posts: 6548
Assuming your study style is a mix of reading and doing, I'd say the investment in Hadoop itself is roughly as much time as it takes you to work through a decent chunk of a good book on Hadoop. There are a lot of Hadoop subprojects, though, and depending on the enterprise, they may also be using some of those to make it easier to work with Hadoop. Just as the Spring Framework has many different parts, learning everything about Hadoop is almost a herculean task. But for just the basics, and getting to a level where you can be effective in an enterprise setting, you're probably looking at 3 to 6 months of serious play-around time.

Caveat: Everyone has their own definition of "can be effective in an enterprise setting." I set the bar pretty high, so your mileage may vary.
 
Srinivas Mupparapu
Greenhorn
Posts: 14
In addition to what Junilu has said above, going through the book "Hadoop: The Definitive Guide" by Tom White should give you a fair idea of what you need to learn. A typical use case of Hadoop involves the use of Hive and/or Pig (if Hadoop MapReduce alone is not sufficient) to process the data stored in HDFS. If you need random read/write access to your data stored in HDFS, then you also need to look into HBase.
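To make "Hadoop MapReduce" concrete for someone coming from plain Java, here is a minimal sketch of the classic WordCount job written against the org.apache.hadoop.mapreduce API. It is just an illustration, not anything specific to this thread; the input and output HDFS paths are placeholder command-line arguments.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in its input split
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sums the counts emitted for each word
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on the map side
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory (placeholder)
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory (placeholder)
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

You would package this into a jar and launch it with the hadoop jar command, passing the input and output directories as arguments. Hive and Pig let you express this same kind of aggregation in far fewer lines, and HBase only enters the picture when you need random reads and writes rather than batch scans.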
 
Anjali Singh Singh
Greenhorn
Posts: 1
How about taking an online class? Here is one: http://www.dezyre.com/Big-Data-and-Hadoop/19

Here is a sample session recording: https://www.youtube.com/watch?v=6wNXfzpe5HQ
 
Mehul Sharma
Greenhorn
Posts: 2
Hi,
To build a strong base in any technology, you need to go through good tutorials, study materials, and good training in that particular technology.
For more details, click on this: http://tinyurl.com/qap5grf
 