hadoop problems with -files option when submitting a job from a remote node

 
I run Hadoop MapReduce jobs from a remote Windows machine using the command

java -jar XMLDriver.jar -files junkwords.txt -libjars XMLInputFormat.jar

and submit the job to a Linux box which runs Hadoop.
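
For reference, here is a minimal sketch of the kind of Tool/ToolRunner wiring I understand the driver needs, since it is GenericOptionsParser (invoked by ToolRunner) that picks up -files and -libjars; the class name matches my jar, but the rest of the job setup is trimmed down and only illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class XMLDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already carries whatever -files / -libjars put into the configuration
        Configuration conf = getConf();
        Job job = new Job(conf, "xml job");
        job.setJarByClass(XMLDriver.class);
        // ... mapper, XMLInputFormat, input/output paths go here ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options (-files, -libjars, ...) before calling run()
        System.exit(ToolRunner.run(new Configuration(), new XMLDriver(), args));
    }
}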


I understand that this distributed cache file will be sent to HDFS on my remote box (am I right?).

But in the mapper code I am unable to retrieve the file name using this API:

Path[] cacheFiles = DistributedCache.getLocalCacheFiles(conf);

fileName = cacheFiles[0].toString();
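
Here is a slightly fuller sketch of how I am trying to read it in the mapper's setup(); the class name, field name, and Mapper type parameters are just illustrative, and I added a null check because getLocalCacheFiles() can return null when nothing was distributed:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class XMLMapper extends Mapper<LongWritable, Text, Text, Text> {
    private String fileName;

    @Override
    protected void setup(Context context) throws IOException {
        Path[] cacheFiles = DistributedCache.getLocalCacheFiles(context.getConfiguration());
        if (cacheFiles != null && cacheFiles.length > 0) {
            fileName = cacheFiles[0].toString();  // local path on the task node
            BufferedReader reader = new BufferedReader(new FileReader(fileName));
            // ... load the junk words from the reader here ...
            reader.close();
        }
    }
}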

Should I use the DistributedCache.addCacheFile() API together with symlinks instead? If so, what URI parameter do I need to pass, since I don't know where the files will be copied by Hadoop?
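
In other words, something along these lines in the driver's run() before the job is submitted; the HDFS path below is only a made-up example, and that path is exactly the part I am unsure about:

import java.net.URI;
import org.apache.hadoop.filecache.DistributedCache;

// inside run(), before submitting the job:
// the "#junkwords" fragment plus createSymlink() exposes the file in the task's
// working directory under that name, so the mapper could open it as new File("junkwords")
DistributedCache.addCacheFile(new URI("/user/someuser/junkwords.txt#junkwords"), getConf());
DistributedCache.createSymlink(getConf());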

Also,

I tried copying the junkwords.txt file manually to HDFS and specifying the HDFS path on the command line as

java -jar XMLDriver.jar -files /users/junkwords.txt -libjars XMLInputFormat.jar

This throws a FileNotFoundException when I run the job from my local Windows machine.

What is the right way to access the distributed cache file in the mapper when it is passed from a remote machine using the -files command line option?
 