Hi teacher,
I ran through the tutorial in a local VM without any problems, so I spun up a cloud host to try Spark Streaming there.
The procedure, order of steps, and configuration are all correct, and the security group has every port open.
Flume also starts normally: netstat -anp | grep shows ports 44444 and 41414 both occupied as expected, and telnet to both ports works.
But when I debug locally and have IDEA connect to port 41414, it throws an error.
What is the general troubleshooting approach for this kind of situation on a cloud host?
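For context, one quick check is to probe the ports from the machine running IDEA rather than from the cloud host itself: a service bound only to the loopback or the internal interface looks fine in netstat/telnet on the host but refuses outside connections. The helper below is a generic sketch (not part of the tutorial); the IP 96.30.196.34 and ports 44444/41414 are taken from this post.

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeout, unreachable network, etc.
        return False

if __name__ == "__main__":
    # Run this from the local machine (where IDEA runs), not on the cloud host.
    for port in (44444, 41414):
        state = "reachable" if port_open("96.30.196.34", port) else "refused/filtered"
        print(port, state)
```

If this prints "refused/filtered" from the local machine while telnet on the host succeeds, the Flume source/sink is likely bound to an address other than 0.0.0.0, or a firewall on the host (not the security group) is filtering the port.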
The error message is as follows:
20/09/11 13:39:33 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.io.IOException: Error connecting to hadoop/96.30.196.34:41414
at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:261)
at org.apache.avro.ipc.NettyTransceiver.&lt;init&gt;(NettyTransceiver.java:203)
at org.apache.avro.ipc.NettyTransceiver.&lt;init&gt;(NettyTransceiver.java:138)
at org.apache.spark.streaming.flume.FlumePollingReceiver$$anonfun$onStart$1.apply(FlumePollingInputDStream.scala:82)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.streaming.flume.FlumePollingReceiver.onStart(FlumePollingInputDStream.scala:82)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:597)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:597)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2173)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2173)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused: hadoop/96.30.196.34:41414
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:714)
at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:152)
at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:105)
at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:79)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:42)
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
... 3 more