
Hadoop installation problem

 
Ranch Hand
Posts: 84
I am new to Hadoop and am running into problems installing it.
I am on CentOS 6.
I have extracted hadoop-0.20.2-cdh3u4.tar.gz into my home folder.

I used the following commands:
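The commands themselves were lost from the post. A typical sequence for unpacking this tarball into the home folder would look something like this (only the archive name is taken from the post; the rest is an assumption):

```shell
cd ~
tar -xzf hadoop-0.20.2-cdh3u4.tar.gz   # unpack the CDH3 tarball into the home folder
cd hadoop-0.20.2-cdh3u4                # the new install directory
```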





Then I created the folders pseudo/dfs/data and pseudo/dfs/name inside the hadoop-0.20.2-cdh3u4 folder.
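Those folders can be created in one step; the install path here is an assumption based on the post:

```shell
# Install dir from the post; override HADOOP_DIR if yours differs
HADOOP_DIR="${HADOOP_DIR:-$HOME/hadoop-0.20.2-cdh3u4}"
# Local directories that dfs.name.dir / dfs.data.dir will point at
mkdir -p "$HADOOP_DIR/pseudo/dfs/name" "$HADOOP_DIR/pseudo/dfs/data"
```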

Then I modified the XML files:

core-site.xml
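The file contents were not captured. For a 0.20.x pseudo-distributed setup, core-site.xml typically looks like the fragment below; hdfs://localhost:9000 is the conventional choice, not something confirmed by the post:

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```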


hdfs-site.xml:
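Again the actual contents are missing. A typical hdfs-site.xml for this setup points dfs.name.dir and dfs.data.dir at the folders created above; the /home/kumar path is inferred from the log paths quoted later in the thread and may differ on your machine:

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/kumar/hadoop-0.20.2-cdh3u4/pseudo/dfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/kumar/hadoop-0.20.2-cdh3u4/pseudo/dfs/data</value>
  </property>
</configuration>
```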


mapred-site.xml:
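The contents are missing here too. For 0.20.x, mapred-site.xml usually only needs the JobTracker RPC address; localhost:9001 is the conventional default, assumed here:

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```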


.bashrc:
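The actual .bashrc entries were not captured. The kind of lines this setup needs would look like the following; both paths are assumptions (JAVA_HOME in particular must point at your real JDK install):

```shell
# Assumed entries for ~/.bashrc; adjust paths to your layout
export JAVA_HOME=/usr/java/default
export HADOOP_HOME="$HOME/hadoop-0.20.2-cdh3u4"
export PATH="$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin"
```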


Then I ran the following command:
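The command itself was not captured. For a 0.20.x tarball install the usual sequence, run from the Hadoop install directory, would be something like:

```shell
# Format HDFS once, before the very first start (destroys any existing HDFS data)
bin/hadoop namenode -format
# Start NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker
bin/start-all.sh
```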


It shows:


Now when I open these two URLs in the browser:

http://localhost:50030/jobtracker.jsp
http://localhost:50070/dfshealth.jsp

the browser shows the pages in offline mode.


When I run the following command:
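The command and its output were not captured here. If it was jps, which is the usual check at this point, a healthy pseudo-distributed node would list one JVM per daemon:

```shell
# List the running Hadoop daemon JVMs (requires the JDK's jps on the PATH).
# On a healthy 0.20.x pseudo-distributed node you would expect entries for
# NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker.
jps
```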


It shows:


Please advise how to get Hadoop running on my machine.

Thank you.
 
Ranch Hand
Posts: 544
Check whether this file has any clue as to why the JobTracker did not start: /home/kumar/hadoop-0.20.2-cdh3u4/logs/hadoop-kumar-jobtracker-kumar.hadoop.out
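The .out file only holds stdout/stderr; the matching .log file in the same directory usually carries the full stack trace, so it is worth checking both, for example:

```shell
# Paths as given above; .out holds stdout/stderr, .log the detailed log output
tail -n 50 /home/kumar/hadoop-0.20.2-cdh3u4/logs/hadoop-kumar-jobtracker-kumar.hadoop.out
tail -n 50 /home/kumar/hadoop-0.20.2-cdh3u4/logs/hadoop-kumar-jobtracker-kumar.hadoop.log
```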

Regards,
Amit
 
Bartender
Posts: 2407
If you are just starting out and you just want to explore Hadoop and related tools like Hive, Pig etc, then you might find it easier to use one of the pre-packaged virtual machines from Hortonworks or Cloudera.

For example, I've been using the Hortonworks Sandbox. This gives you an integrated single-node Hadoop installation with tools like Hive, Pig, HCatalog and Hue, plus links to lots of well structured tutorials. The sandbox runs as a virtual machine e.g. inside Virtualbox or VMWare Player, and you can access a lot of the functionality very easily via the browser-based Hue interface. It's a lot easier than installing all these components by hand, and it's a great resource for learning about Hadoop, even if you plan to use a different Hadoop distribution for your project.
 
Babu Singh
Ranch Hand
Posts: 84
Hi,

I am not able to find the exact cause.

Please review it again and suggest how to solve the problem.

Thanks.

amit punekar wrote:Check whether this file has any clue as to why the JobTracker did not start: /home/kumar/hadoop-0.20.2-cdh3u4/logs/hadoop-kumar-jobtracker-kumar.hadoop.out

Regards,
Amit

 
Babu Singh
Ranch Hand
Posts: 84
Hi,

I am not using a VM. I have installed CentOS 6 directly on my machine, and I am now trying to set Hadoop up again.

Hadoop had already been running on this machine: I ran it for two months with Hive, Pig, and Sqoop. But then Hadoop stopped on its own and the Hadoop folder was removed; I don't know what happened.

Anyway, please suggest how to solve the issue.

Thanks for your reply.


chris webster wrote:If you are just starting out and you just want to explore Hadoop and related tools like Hive, Pig etc, then you might find it easier to use one of the pre-packaged virtual machines from Hortonworks or Cloudera.

For example, I've been using the Hortonworks Sandbox. This gives you an integrated single-node Hadoop installation with tools like Hive, Pig, HCatalog and Hue, plus links to lots of well structured tutorials. The sandbox runs as a virtual machine e.g. inside Virtualbox or VMWare Player, and you can access a lot of the functionality very easily via the browser-based Hue interface. It's a lot easier than installing all these components by hand, and it's a great resource for learning about Hadoop, even if you plan to use a different Hadoop distribution for your project.

 
Babu Singh
Ranch Hand
Posts: 84
One day my machine ran into a disk space problem (it did not have enough free space), so I removed some items from the Downloads folder at the command prompt. Some time later, when I tried to run Hadoop, it would not start, and the Hadoop folder no longer appeared in the home folder where I had extracted it.
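When disk space is the suspect, it can be checked before restarting anything, for example:

```shell
df -h           # free space on each mounted filesystem
du -sh "$HOME"  # total size of the home directory
```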

Thanks.


chris webster wrote:If you are just starting out and you just want to explore Hadoop and related tools like Hive, Pig etc, then you might find it easier to use one of the pre-packaged virtual machines from Hortonworks or Cloudera.

For example, I've been using the Hortonworks Sandbox. This gives you an integrated single-node Hadoop installation with tools like Hive, Pig, HCatalog and Hue, plus links to lots of well structured tutorials. The sandbox runs as a virtual machine e.g. inside Virtualbox or VMWare Player, and you can access a lot of the functionality very easily via the browser-based Hue interface. It's a lot easier than installing all these components by hand, and it's a great resource for learning about Hadoop, even if you plan to use a different Hadoop distribution for your project.

 
Babu Singh
Ranch Hand
Posts: 84
Hi Amit,

The log file gives this error:



Please suggest how to resolve this.

Thanks.

amit punekar wrote:Check whether this file has any clue as to why the JobTracker did not start: /home/kumar/hadoop-0.20.2-cdh3u4/logs/hadoop-kumar-jobtracker-kumar.hadoop.out

Regards,
Amit

 
Babu Singh
Ranch Hand
Posts: 84
Now I have set Hadoop up on my machine again.
I have changed the port numbers.
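One point worth checking here, assuming the edited properties were fs.default.name and mapred.job.tracker: those set the daemons' RPC ports, not the ports the web pages are served on. The jobtracker.jsp and dfshealth.jsp pages come from the HTTP addresses, which default to 50030 and 50070 and are configured by separate properties, for example:

```xml
<!-- Hypothetical fragment: in 0.20.x the web UI ports are set by the
     *.http.address properties, not by fs.default.name / mapred.job.tracker -->
<property>
  <name>mapred.job.tracker.http.address</name>
  <value>0.0.0.0:55531</value>
</property>
<property>
  <name>dfs.http.address</name>
  <value>0.0.0.0:55571</value>
</property>
```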
mapred-site.xml:


core-site.xml:


Now, when I stop all the nodes:


When I open these URLs in the browser, it shows "Server not found":
http://localhost:55531/jobtracker.jsp

http://localhost:55571/dfshealth.jsp


Log files:

hadoop-kumar-jobtracker-kumar.hadoop.log:


hadoop-kumar-datanode-kumar.hadoop.log:


hadoop-kumar-namenode-kumar.hadoop.log:


hadoop-kumar-tasktracker-kumar.hadoop.log:


hadoop-kumar-secondarynamenode-kumar.hadoop.log:


Please advise how to solve the issue.

Thanks.

chris webster wrote:If you are just starting out and you just want to explore Hadoop and related tools like Hive, Pig etc, then you might find it easier to use one of the pre-packaged virtual machines from Hortonworks or Cloudera.

For example, I've been using the Hortonworks Sandbox. This gives you an integrated single-node Hadoop installation with tools like Hive, Pig, HCatalog and Hue, plus links to lots of well structured tutorials. The sandbox runs as a virtual machine e.g. inside Virtualbox or VMWare Player, and you can access a lot of the functionality very easily via the browser-based Hue interface. It's a lot easier than installing all these components by hand, and it's a great resource for learning about Hadoop, even if you plan to use a different Hadoop distribution for your project.

 
Greenhorn
Posts: 8
I used the Apache Hadoop 2.5.0 distribution and was able to install it, configure it, start the nodes, and run MapReduce programs.

The concepts have changed since the version you are using: job scheduling is now YARN-based.

I hope the latest download will work for you. The version you are currently trying to install is very old.
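For what it's worth, in Hadoop 2.x the start scripts are split; a minimal single-node start, sketched from the standard 2.x tarball layout, looks like:

```shell
# Run from the Hadoop 2.x install directory after configuring
# core-site.xml, hdfs-site.xml and yarn-site.xml
bin/hdfs namenode -format   # first run only
sbin/start-dfs.sh           # NameNode, DataNode, SecondaryNameNode
sbin/start-yarn.sh          # ResourceManager and NodeManager
                            # (these replace the JobTracker/TaskTracker)
```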

Best Wishes,
Tushar Arsekar
Oracle SOA Architect Certified Expert.
Oracle Certified Master, Java EE 5 Enterprise Architect certification
IBM Certified Rational Software Architect.
IBM Certified Service Oriented Architecture (SOA) - Associate.
IBM WebSphere MQ 6.0 Certified - System Administrator
Sun Certified Web Component Developer for J2EE 5.0.
Sun Certified Java Programmer SE 5.0.
 