java.lang.IllegalStateException: "Reducer has been already set" Exception when invoking ChainReducer

Joseph Hwang
Greenhorn

Joined: Aug 17, 2013
Posts: 13
I chained several Mappers and Reducers as shown below:

MapperFirstRelocalize(Mapper) => MapperSecondSortNCalc(Mapper) => ReducerThirdSortNCalc(Reducer) => MapperFifthCarrierCode(Mapper) => ReducerFinalOutput(Reducer)

The last reducer, ReducerFinalOutput, throws an exception like this:

Exception in thread "main" java.lang.IllegalStateException: Reducer has been already set at org.apache.hadoop.mapred.lib.Chain.setReducer(Chain.java:278)

Is it impossible to invoke the ChainReducer.setReducer method twice on the same JobConf? How can I chain 2 reducers in one JobConf?

Any advice will be appreciated.

This is the driver class source.


// Imports needed to compile this driver. Application classes such as
// MapperFirstRelocalize, CarrierKey and GroupKeyPartitioner are assumed
// to be in the same package.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.mapred.lib.ChainMapper;
import org.apache.hadoop.mapred.lib.ChainReducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class HadoopETLDriver extends Configured implements Tool {

    @Override
    public int run(String[] arg0) throws Exception {
        JobConf conf1 = new JobConf(getConf(), HadoopETLDriver.class);
        conf1.setJobName("HadoopETLDriver_1");

        FileInputFormat.setInputPaths(conf1, new Path("/home/user01/input_temp"));
        FileOutputFormat.setOutputPath(conf1, new Path("/home/user01/output"));

        conf1.setInputFormat(TextInputFormat.class);
        conf1.setOutputFormat(TextOutputFormat.class);
        conf1.setPartitionerClass(GroupKeyPartitioner.class);

        JobConf mapperFirstConf = new JobConf(false);
        ChainMapper.addMapper(conf1, MapperFirstRelocalize.class, LongWritable.class, Text.class, CarrierKey.class, Text.class, true, mapperFirstConf);

        JobConf mapperSecondConf = new JobConf(false);
        ChainMapper.addMapper(conf1, MapperSecondSortNCalc.class, CarrierKey.class, Text.class, CarrierKey.class, IntWritable.class, true, mapperSecondConf);

        JobConf reducerThirdConf = new JobConf(false);
        ChainReducer.setReducer(conf1, ReducerThirdSortNCalc.class, CarrierKey.class, IntWritable.class, CarrierKey.class, IntWritable.class, true, reducerThirdConf);

        JobConf mapperFifthConf = new JobConf(false);
        ChainReducer.addMapper(conf1, MapperFifthCarrierCode.class, CarrierKey.class, IntWritable.class, Text.class, Text.class, true, mapperFifthConf);

        // It works well up to this line.

        JobConf reducerFinalConf = new JobConf(false);

        // java.lang.IllegalStateException: "Reducer has been already set" is thrown on this line.
        ChainReducer.setReducer(conf1, ReducerFinalOutput.class, Text.class, Text.class, Text.class, Text.class, true, reducerFinalConf);

        JobClient.runJob(conf1);

        return 0;
    }

    public static void main(String[] args) throws Exception {
        int res = ToolRunner.run(new Configuration(), new HadoopETLDriver(), args);
        System.exit(res);
    }
}
Joseph Hwang
Greenhorn

Joined: Aug 17, 2013
Posts: 13
I found the answer myself. ChainReducer.setReducer can set only ONE Reducer class per job.
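
In other words, within a single job the chain API only supports the pattern "one or more mappers, at most one reducer, then zero or more mappers", so every additional reduce stage means an additional job that reads the previous job's output. Below is a minimal, untested sketch of how the map -> map -> reduce -> map -> reduce flow above could be split into two jobs. The intermediate path, the SequenceFile input/output formats, the explicit key/value class settings and the use of IdentityMapper in the second job are assumptions made for illustration, not code from the original post; it reuses the imports from the driver above plus org.apache.hadoop.mapred.SequenceFileInputFormat, org.apache.hadoop.mapred.SequenceFileOutputFormat and org.apache.hadoop.mapred.lib.IdentityMapper.

@Override
public int run(String[] args) throws Exception {
    // Stage 1: map -> map -> reduce -> map, written to an intermediate directory.
    JobConf job1 = new JobConf(getConf(), HadoopETLDriver.class);
    job1.setJobName("HadoopETL_stage1");

    FileInputFormat.setInputPaths(job1, new Path("/home/user01/input_temp"));
    FileOutputFormat.setOutputPath(job1, new Path("/home/user01/output_stage1"));

    job1.setInputFormat(TextInputFormat.class);
    // SequenceFile output keeps the Text/Text key-value types intact for the second job.
    job1.setOutputFormat(SequenceFileOutputFormat.class);
    job1.setPartitionerClass(GroupKeyPartitioner.class);
    // Shuffle types: output of the last chained mapper before the reducer.
    job1.setMapOutputKeyClass(CarrierKey.class);
    job1.setMapOutputValueClass(IntWritable.class);
    // Final types: output of the mapper chained after the reducer.
    job1.setOutputKeyClass(Text.class);
    job1.setOutputValueClass(Text.class);

    ChainMapper.addMapper(job1, MapperFirstRelocalize.class, LongWritable.class, Text.class, CarrierKey.class, Text.class, true, new JobConf(false));
    ChainMapper.addMapper(job1, MapperSecondSortNCalc.class, CarrierKey.class, Text.class, CarrierKey.class, IntWritable.class, true, new JobConf(false));
    ChainReducer.setReducer(job1, ReducerThirdSortNCalc.class, CarrierKey.class, IntWritable.class, CarrierKey.class, IntWritable.class, true, new JobConf(false));
    ChainReducer.addMapper(job1, MapperFifthCarrierCode.class, CarrierKey.class, IntWritable.class, Text.class, Text.class, true, new JobConf(false));

    JobClient.runJob(job1);

    // Stage 2: identity map -> final reduce, reading stage 1's output.
    JobConf job2 = new JobConf(getConf(), HadoopETLDriver.class);
    job2.setJobName("HadoopETL_stage2");

    FileInputFormat.setInputPaths(job2, new Path("/home/user01/output_stage1"));
    FileOutputFormat.setOutputPath(job2, new Path("/home/user01/output"));

    job2.setInputFormat(SequenceFileInputFormat.class);
    job2.setOutputFormat(TextOutputFormat.class);
    job2.setMapperClass(IdentityMapper.class);      // just passes Text/Text pairs through
    job2.setReducerClass(ReducerFinalOutput.class); // the second reduce stage
    job2.setOutputKeyClass(Text.class);
    job2.setOutputValueClass(Text.class);

    JobClient.runJob(job2);

    return 0;
}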
Rasaq Otunba
Greenhorn

Joined: Mar 14, 2014
Posts: 1
I understand that the reducer can only be set once, but how did you end up running the second reduce stage? Does Hadoop automatically apply the reduce again? I want the same flow: map -> reduce -> map -> reduce. The last reduce causes the exception, but I need a second reduce stage.

Thanks
 
I agree. Here's the link: http://aspose.com/file-tools
 
subject: java.lang.IllegalStateException: "Reducer has been already set" Exception when invoking ChainReducer
 
Similar Threads
WordCount program giving exception.
Calling EJB Bean with WebSphere application server Scheduler
Hadoop key mismatch
ejb create() Error java.lang. IllegalStateEx: Failed to find method for hash
HADOOP Reducer code not working...!