
hadoop problems with -files option when submitting a job from a remote node

 
Gg Francis
Greenhorn
Posts: 1
I run Hadoop MapReduce jobs from a remote machine (Windows) using the command

java -jar XMLDriver.jar -files junkwords.txt -libjars XMLInputFormat.jar

and submit the job to a Linux box which runs Hadoop.
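
For context, my understanding is that -files and -libjars are generic options parsed by GenericOptionsParser, which only happens when the driver goes through ToolRunner. A minimal sketch of that pattern (the run() body here is illustrative, not my actual driver code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class XMLDriver extends Configured implements Tool {
    public int run(String[] args) throws Exception {
        // getConf() already reflects -files/-libjars here, because
        // ToolRunner ran GenericOptionsParser before calling run()
        Job job = new Job(getConf(), "xml job");
        job.setJarByClass(XMLDriver.class);
        // ... mapper, input format, input/output paths ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options (-files, -libjars, -D ...)
        // and passes only the remaining arguments on to run()
        System.exit(ToolRunner.run(new Configuration(), new XMLDriver(), args));
    }
}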


I know that this distributed cache file will be sent to the HDFS on my remote box (am I right?).

But in the mapper code I am unable to retrieve the file name using the API:

// Local paths of the files shipped via the distributed cache (may be null)
Path[] cacheFiles = DistributedCache.getLocalCacheFiles(conf);
String fileName = cacheFiles[0].toString();
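
For completeness, here is roughly how I try to use it in the mapper's setup() (a sketch; the null check and reader code are simplified from my real class):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class XMLMapper extends Mapper<LongWritable, Text, Text, Text> {
    protected void setup(Context context) throws IOException {
        Path[] cacheFiles = DistributedCache.getLocalCacheFiles(context.getConfiguration());
        if (cacheFiles == null || cacheFiles.length == 0) {
            throw new IOException("junkwords.txt not found in the distributed cache");
        }
        // getLocalCacheFiles() returns *local* paths on the task node,
        // so the file can be read with plain java.io
        BufferedReader reader = new BufferedReader(new FileReader(cacheFiles[0].toString()));
        // ... load the junk words ...
        reader.close();
    }
}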

Should I use the DistributedCache.addCacheFile() API and the symlink API? If so, what URI parameter do I need to pass, since I don't know where the files will be copied by Hadoop?
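
If addCacheFile() is the way to go, I imagine the driver code would look something like this, where the HDFS path and the symlink name after # are my guesses:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;

Configuration conf = new Configuration();
// the "#junkwords" fragment asks Hadoop to create a symlink named
// "junkwords" in each task's working directory, so the mapper could
// simply open new FileReader("junkwords") without knowing the real path
DistributedCache.addCacheFile(new URI("/users/junkwords.txt#junkwords"), conf);
DistributedCache.createSymlink(conf);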

Also, I tried to copy the junkwords.txt file manually to HDFS and specified the HDFS path on the command line as

java -jar XMLDriver.jar -files /users/junkwords.txt -libjars XMLInputFormat.jar

This throws a FileNotFoundException when I run the job from my local Windows machine.
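
I also wondered whether -files wants a fully qualified HDFS URI instead of a bare path, something like this (the namenode host and port are placeholders, not my actual settings):

java -jar XMLDriver.jar -files hdfs://namenode:9000/users/junkwords.txt -libjars XMLInputFormat.jar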

What is the solution for accessing the distributed cache file in the mapper when it is passed from a remote machine using the -files command-line option?
 