Cannot connect to HDFS

Command-line operations work and the firewall is disabled:
[hadoop@hadoop001 cloudera]$ hadoop fs -copyFromLocal cdh_version.properties /
21/05/15 15:40:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[hadoop@hadoop001 cloudera]$ hadoop fs -ls /
21/05/15 15:41:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
-rw-r--r-- 1 hadoop supergroup 1366 2021-05-15 15:17 /README.txt
-rw-r--r-- 1 hadoop supergroup 548 2021-05-15 15:40 /cdh_version.properties
[hadoop@hadoop001 cloudera]$ sudo firewall-cmd --state
[sudo] password for hadoop:
not running
[hadoop@hadoop001 cloudera]$ hostname
hadoop001
[hadoop@hadoop001 cloudera]$ ip a
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: enp0s3: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP group default qlen 1000
link/ether 08:00:27:e7:3e:72 brd ff:ff:ff:ff:ff:ff
inet 10.8.11.200/24 brd 10.8.11.255 scope global noprefixroute dynamic enp0s3
valid_lft 43sec preferred_lft 43sec
inet6 fe80::1c96:e2a8:bb57:1267/64 scope link noprefixroute
valid_lft forever preferred_lft forever
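A check worth running on hadoop001 itself is whether a NameNode process is up at all, and which address it is listening on for port 8020. A minimal sketch (assuming jps and ss are available on this CentOS box; netstat -tlnp works the same way):

jps                    # should list a NameNode process
ss -tlnp | grep 8020   # shows the bind address, e.g. 10.8.11.200:8020 vs 127.0.0.1:8020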

HDFS connection error message:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.net.ConnectException: Call From caicm.local/127.0.0.1 to hadoop001:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
at org.apache.hadoop.ipc.Client.call(Client.java:1508)
at org.apache.hadoop.ipc.Client.call(Client.java:1441)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:575)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy11.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3155)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:3122)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1005)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1001)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1001)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:993)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1970)
at hadoop.hdfs.HDFSApp.main(HDFSApp.java:21)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:648)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
… 20 more
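The exception says the call originates from caicm.local/127.0.0.1 (the development machine) and is refused at hadoop001:8020, i.e. nothing accepts connections on that address from the client's point of view. Two things worth comparing on hadoop001 (a sketch, assuming the hdfs command is on the PATH; key and hostname are taken from the error above):

hdfs getconf -confKey fs.defaultFS   # the NameNode address the config advertises, e.g. hdfs://hadoop001:8020
getent hosts hadoop001               # what hadoop001 resolves to on the server; 127.0.0.1 here can leave the NameNode bound to loopback only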


1 Answer

Asker ccmfirst 2021-05-15 15:48:45

[hadoop@hadoop001 cloudera]$ ping hadoop001:8020

ping: hadoop001:8020: Name or service not known

Can't ping 8020
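Note that ping does not take a port, so ping hadoop001:8020 always fails with a name-resolution error regardless of the NameNode's state. A TCP-level check from the client machine would look more like this (a sketch, assuming nc or telnet is installed there):

nc -zv hadoop001 8020    # or: telnet hadoop001 8020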

  • Did you configure the mapping between hadoop001 and its IP in the hosts file on your machine? (see the hosts sketch after these comments)
    2021-05-15 23:49:35
  • Asker ccmfirst replying to Michael_PK #2
    Yes, it is mapped. I redeployed on another VM and it works now; I never found the cause on this one. Thanks, teacher.
    2021-05-16 18:57:46
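For reference, the mapping asked about above would be a line like the following in /etc/hosts on both hadoop001 and the client machine, using the address shown by ip a earlier (a sketch for this particular setup, not a general recipe):

10.8.11.200   hadoop001

After adding it, getent hosts hadoop001 should return 10.8.11.200 rather than 127.0.0.1 on both sides.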