DataNode fails to start

2018-11-25 00:31:24,717 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2018-11-25 00:31:25,369 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2018-11-25 00:31:25,439 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2018-11-25 00:31:25,448 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is localhost
2018-11-25 00:31:25,452 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2018-11-25 00:31:25,480 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2018-11-25 00:31:25,482 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 10485760 bytes/s
2018-11-25 00:31:26,021 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2018-11-25 00:31:26,159 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2018-11-25 00:31:26,365 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2018-11-25 00:46:09,109 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
        at org.apache.hadoop.ipc.Client.call(Client.java:1508)
        at org.apache.hadoop.ipc.Client.call(Client.java:1441)
STARTUP_MSG:   java = 1.8.0_91
************************************************************/
2018-11-25 00:46:50,590 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2018-11-25 00:46:51,234 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2018-11-25 00:46:51,302 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2018-11-25 00:46:51,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is hadoop000
2018-11-25 00:46:51,315 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2018-11-25 00:46:51,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2018-11-25 00:46:51,337 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 10485760 bytes/s
2018-11-25 00:46:51,938 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2018-11-25 00:46:52,096 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2018-11-25 01:00:24,965 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-28074851-192.168.199.233-1539551174228 blk_1073742285_1461 file /home/hadoop/app/tmp/dfs/data/current/BP-28074851-192.168.199.233-1539551174228/current/finalized/subdir0/subdir1/blk_1073742285
2018-11-25 01:00:24,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-28074851-192.168.199.233-1539551174228 blk_1073742286_1462 file /home/hadoop/app/tmp/dfs/data/current/BP-28074851-192.168.199.233-1539551174228/current/finalized/subdir0/subdir1/blk_1073742286
2018-11-25 01:00:24,966 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetAsyncDiskService: Deleted BP-28074851-192.168.199.233-1539551174228 blk_1073742287_1463 file /home/hadoop/app/tmp/dfs/data/current/BP-28074851-192.168.199.233-1539551174228/current/finalized/subdir0/subdir1/blk_1073742287
2018-11-25 01:13:25,281 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: IOException in offerService
java.io.EOFException: End of File Exception between local host is: "hadoop000/192.168.199.233"; destination host is: "hadoop000":8020; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
        at org.apache.hadoop.ipc.Client.call(Client.java:1508)
        at org.apache.hadoop.ipc.Client.call(Client.java:1441)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy16.sendHeartbeat(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.sendHeartbeat(DatanodeProtocolClientSideTranslatorPB.java:154)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.sendHeartBeat(BPServiceActor.java:406)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:509)

Teacher, how do I fix this?


6 Answers

Asker 96年的nash 2019-08-30 15:42:19

https://img1.sycdn.imooc.com//szimg/5d68d33309c12ff307720143.jpg

https://img1.sycdn.imooc.com//szimg/5d68d33309919eb306120165.jpg

Teacher, I have already changed both my IP and my hosts file. Where do I go to fix this 'between local host is: "hadoop000/192.168.199.233"'?

1 Reply
  • That 233 is definitely wrong, because it is my home IP. (2019-08-30 17:16:10)
  • Which file did you change? /etc/hosts? (2019-08-30 17:22:23)
  • Asker 96年的nash, replying to Michael_PK:
    Teacher, what exactly is still not changed correctly? Look at my screenshots: both my IP and my hosts file are set to my own values, so why does the DataNode still show your IP? Where should I check? I have spent half the afternoon on this and still cannot fix it. (2019-08-30 17:23:15)
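As an aside on where those two host strings actually come from (my reading, not stated in the thread): in Hadoop's RPC exceptions, "local host" is the name/address the DataNode resolved for itself through /etc/hosts, and "destination host" is the NameNode address taken from fs.defaultFS. A small sketch that pulls both out of the exception text copied from the log above:

```shell
# Exception text copied verbatim from the DataNode log in this thread
msg='java.io.EOFException: End of File Exception between local host is: "hadoop000/192.168.199.233"; destination host is: "hadoop000":8020'

# "local host" = hostname/IP the DataNode resolved for itself via /etc/hosts
local_host=$(echo "$msg" | sed -n 's/.*local host is: "\([^"]*\)".*/\1/p')

# "destination host" = NameNode address, which comes from fs.defaultFS in core-site.xml
dest_host=$(echo "$msg" | sed -n 's/.*destination host is: "\([^"]*\)".*/\1/p')

echo "$local_host"   # hadoop000/192.168.199.233
echo "$dest_host"    # hadoop000
```

So the places to check are /etc/hosts (local side) and core-site.xml (destination side), not the exception message itself.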
Asker 96年的nash 2019-08-30 17:22:15

https://img1.sycdn.imooc.com//szimg/5d68eab00946255609230613.jpg

This generated folder still has your IP in its name.

Asker 96年的nash 2019-08-30 17:20:11

After rebooting the VM, it errored again:

INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-28074851-192.168.199.233-1539551174228

2018-11-25 00:31:26,645 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /home/hadoop/app/tmp/dfs/data/current/BP-28074851-192.168.199.233-1539551174228

2018-11-25 00:31:26,647 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=1635790697;bpid=BP-28074851-192.168.199.233-1539551174228;lv=-56;nsInfo=lv=-60;cid=CID-7806dc12-b049-401e-9063

Why does your 192.168.199.233 still show up, teacher? (crying)
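A likely explanation (my reading of the logs, not something stated in the thread): the 192.168.199.233 here is embedded in the block-pool ID BP-28074851-192.168.199.233-1539551174228, which is written into the VERSION files under the data directory when the NameNode is formatted. Fixing /etc/hosts afterwards does not rewrite it; only wiping the storage directories and re-formatting does. A sketch:

```shell
# Block-pool IDs have the shape BP-<random>-<namenode-ip>-<format-timestamp>,
# so they preserve whatever IP the NameNode had at format time.
# (bpid copied verbatim from the log above)
bpid="BP-28074851-192.168.199.233-1539551174228"

format_ip=$(echo "$bpid" | cut -d- -f3)
echo "$format_ip"   # 192.168.199.233

# To make the old IP disappear, stop HDFS, wipe the storage dirs, and
# re-format -- this DESTROYS all existing HDFS data (path taken from the log):
#   stop-dfs.sh
#   rm -rf /home/hadoop/app/tmp/dfs
#   hdfs namenode -format
#   start-dfs.sh
```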

Asker 96年的nash 2019-08-30 16:29:56

https://img1.sycdn.imooc.com//szimg/5d68de6b094f897b07640268.jpg

But the IP shown here when I format is my local IP.

Michael_PK 2019-08-30 15:24:22

From the logs, the IP you are seeing is my home IP, which means the IP or hostname on your machine was not changed successfully. You need to change it to your own IP.
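For reference, the NameNode address the DataNode dials (the "hadoop000":8020 in the exception) comes from fs.defaultFS in core-site.xml. A minimal fragment, assuming the single-node layout used in this course:

```xml
<!-- core-site.xml: hadoop000 must resolve (via /etc/hosts) to this VM's own IP -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop000:8020</value>
  </property>
</configuration>
```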

Asker 96年的nash 2019-08-30 14:38:25

https://img1.sycdn.imooc.com//szimg/5d68c44109e7325811220245.jpg

Also, teacher, is my /etc/hosts file set up correctly like this? Or do I need to change the localhost entry?
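For comparison, a typical single-node /etc/hosts for this kind of setup looks like the sketch below; the 192.168.x.x address is a placeholder for the VM's actual IP, and the localhost lines should stay as they are:

```
127.0.0.1    localhost
::1          localhost
192.168.x.x  hadoop000
```

A common pitfall is also mapping hadoop000 to 127.0.0.1, which makes the DataNode bind to the loopback address; hadoop000 should appear only on the real IP's line.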
