I want to start Hadoop on my MacBook Pro, and I followed the steps the Apache documentation requires. When I run the command "bin/start-all.sh", I get the following output:
starting namenode, logging to /Users/alibozorgkhan/Desktop/hadoop-0.20.203.0/bin/../logs/hadoop-alibozorgkhan-namenode-d142-058-172-111.wireless.sfu.ca.out
localhost: starting datanode, logging to /Users/alibozorgkhan/Desktop/hadoop-0.20.203.0/bin/../logs/hadoop-alibozorgkhan-datanode-d142-058-172-111.wireless.sfu.ca.out
localhost: starting secondarynamenode, logging to /Users/alibozorgkhan/Desktop/hadoop-0.20.203.0/bin/../logs/hadoop-alibozorgkhan-secondarynamenode-d142-058-172-111.wireless.sfu.ca.out
starting jobtracker, logging to /Users/alibozorgkhan/Desktop/hadoop-0.20.203.0/bin/../logs/hadoop-alibozorgkhan-jobtracker-d142-058-172-111.wireless.sfu.ca.out
localhost: starting tasktracker, logging to /Users/alibozorgkhan/Desktop/hadoop-0.20.203.0/bin/../logs/hadoop-alibozorgkhan-tasktracker-d142-058-172-111.wireless.sfu.ca.out
Hadoop fails to start. I checked the datanode log, which contains the following:
2011-10-06 18:03:45,513 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.NullPointerException
at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:136)
at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:176)
at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:206)
at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:200)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:306)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:268)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1480)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1419)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1437)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1563)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1573)
Is there a way to fix this? Thanks.
Answer 1
A few years ago I installed Hadoop on my MacBook Pro for testing. It was a bad idea. It took me about five hours to get everything running: the right Java version, the right PATH settings, and the right HDFS/Hadoop configuration and versions all had to line up.
Since I didn't need performance for testing, I switched to a virtual machine shortly afterwards. A quick Google search turns up plenty of free VM images, and you can run them with a free VM player.
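For what it's worth, the stack trace in the question (a NullPointerException thrown from NetUtils.createSocketAddr via NameNode.getAddress) usually means the datanode could not read the namenode address from its configuration, most often because fs.default.name is not set in conf/core-site.xml or the daemons are not picking that file up. Below is a minimal sketch of a pseudo-distributed configuration for Hadoop 0.20.x; the hostname and port 9000 are conventional defaults I am assuming, not values taken from the question:

    <!-- conf/core-site.xml: tells every daemon where the namenode listens.
         hdfs://localhost:9000 is an assumed, conventional value. -->
    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>

    <!-- conf/hdfs-site.xml: single-node setup, so one replica per block. -->
    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
    </configuration>

After fixing the configuration, format HDFS once with "bin/hadoop namenode -format" and rerun "bin/start-all.sh"; if everything comes up, "jps" should list NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker.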