Hadoop FileAlreadyExistsException: output directory hdfs://:9000/input already exists

I have set up Hadoop in fully distributed mode, with one master and three slaves. I am trying to execute a jar file named Tasks.jar, which takes arg[0] as the input directory and arg[1] as the output directory.

In my Hadoop environment I have the input files in the /input directory, and there is no /output directory.

I verified the above with the command hadoop fs -ls /
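As a hedged aside (the paths below are the ones from the question), a stale output directory left behind by an earlier run can be inspected and removed with the standard HDFS shell before resubmitting:

```shell
# Inspect the HDFS root to see which directories actually exist
hadoop fs -ls /

# If a previous run left an output directory behind, remove it recursively
# (only do this if you no longer need its contents)
hadoop fs -rm -r /output
```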

Now, when I try to execute my jar file with the following command:

hadoop jar Tasks.jar ProgrammingAssigment/Tasks /input /output

I get the following exception:

ubuntu@ip-172-31-5-213:~$ hadoop jar Tasks.jar ProgrammingAssignment/Tasks /input /output
16/10/14 02:26:23 INFO client.RMProxy: Connecting to ResourceManager at ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://ec2-52-55-2-64.compute-1.amazonaws.com:9000/input already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at ProgrammingAssignment.Tasks.main(Tasks.java:96)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Source code:

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job wordCount = new Job(conf, "Word Count");
        wordCount.setJarByClass(Tasks.class);
        FileInputFormat.addInputPath(wordCount, new Path(args[0]));   // input1
        FileOutputFormat.setOutputPath(wordCount, new Path(args[1])); // output1 & input2
        //FileInputFormat.addInputPath(wordCount, new Path("/input"));
        //FileOutputFormat.setOutputPath(wordCount, new Path("/output"));
        wordCount.setMapperClass(totalOccurenceMapper.class);
        wordCount.setReducerClass(totalOccurenceReducer.class);
        wordCount.setMapOutputKeyClass(Text.class);
        wordCount.setMapOutputValueClass(Text.class);
        wordCount.setOutputKeyClass(Text.class);
        wordCount.setOutputValueClass(Text.class);
        //wordCount.waitForCompletion(true);
        System.exit(wordCount.waitForCompletion(true) ? 0 : 1);
    }
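A common way to avoid FileAlreadyExistsException entirely is to delete the output path before submitting the job. This is only a sketch, under the assumption that the output of a previous run is disposable; the FileSystem calls used are the standard Hadoop API:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch: remove a stale output directory before job submission so that
// FileOutputFormat.checkOutputSpecs() does not reject the job.
public class CleanOutput {
    public static void deleteIfExists(Configuration conf, String dir) throws Exception {
        FileSystem fs = FileSystem.get(conf);
        Path out = new Path(dir);
        if (fs.exists(out)) {
            fs.delete(out, true); // recursive delete of the old output
        }
    }
}
```

Calling deleteIfExists(conf, args[1]) just before job submission would make reruns idempotent, at the cost of silently discarding previous results.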

If I hard-code the paths that are commented out in the code above, I get the following output:

ubuntu@ip-172-31-5-213:~$ hadoop jar Tasks.jar ProgrammingAssignment/Tasks 

16/10/14 15:51:19 INFO client.RMProxy: Connecting to ResourceManager at ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032
16/10/14 15:51:20 INFO ipc.Client: Retrying connect to server: ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/10/14 15:51:21 INFO ipc.Client: Retrying connect to server: ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/10/14 15:51:22 INFO ipc.Client: Retrying connect to server: ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/10/14 15:51:23 INFO ipc.Client: Retrying connect to server: ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)

Answer 1

When you run the following command:

hadoop jar Tasks.jar ProgrammingAssigment/Tasks /input /output

the args array will contain the following elements:

args[0]     ProgrammingAssigment/Tasks
args[1]     /input
args[2]     /output

Try omitting the ProgrammingAssigment/Tasks argument; my guess is that it is not needed. If for some reason it is required, use args[1] and args[2] in your code as the input and output directories, respectively.
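To make this concrete, here is a small sketch of the argument shift. The helper resolvePaths is hypothetical (not part of the original code); it assumes that when Tasks.jar's manifest declares a Main-Class, RunJar treats the class-name argument as an ordinary program argument and passes it through to main(), pushing the paths into args[1] and args[2]:

```java
public class ArgShift {
    // Hypothetical helper: return {input, output} whether or not the
    // class-name argument was passed through by 'hadoop jar'.
    public static String[] resolvePaths(String[] args) {
        int offset = (args.length == 3) ? 1 : 0; // skip class-name arg if present
        return new String[] { args[offset], args[offset + 1] };
    }

    public static void main(String[] args) {
        String[] paths = resolvePaths(
            new String[] { "ProgrammingAssignment/Tasks", "/input", "/output" });
        System.out.println(paths[0] + " " + paths[1]); // prints "/input /output"
    }
}
```

With the shift in place, the output path would be /output rather than the already-existing /input, which is exactly what the exception complains about.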

As for the timeouts you are getting, I don't know. You could try increasing the maxRetries or sleepTime values mentioned in the log.
