spark-shell fails to start

The error log is as follows:
[hadoop@hadoop000 bin]$ ./spark-shell
20/05/28 12:53:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/05/28 12:53:12 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
(the warning above is repeated 16 times, once per retry)
20/05/28 12:53:12 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:745)
(the same java.net.BindException and stack trace are then printed a second time)
:14: error: not found: value spark
import spark.implicits._
^
:14: error: not found: value spark
import spark.sql
^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_91)
Type in expressions to have them evaluated.
Type :help for more information.

I don't know what went wrong. Teacher, could you take a look?


2 Answers

OliverSong 2021-03-08 05:44:22

Brother, did you ever get this fixed? In my case it failed the very first time I started Spark, so it doesn't look like it's caused by jobs already running.

I also googled it and added export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh (located in spark/bin), but that didn't work either.

Thanks!

  • "Service 'sparkDriver' failed after 16 retries": it has already gone past 16 retries, and 16 is the default maximum.
    2021-03-08 13:26:32
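    For context, that retry cap is Spark's spark.port.maxRetries setting, which defaults to 16. Raising it only helps when ports are genuinely contended; it does not help when the bind address itself is wrong. A minimal illustration:

        # allow up to 32 port attempts instead of the default 16
        ./spark-shell --conf spark.port.maxRetries=32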
  • Yep, it retried the full 16 times.
    
    For reference, for other students who hit a similar problem:
    
    This is the same issue as another student's question:
    https://coding.imooc.com/learn/questiondetail/75703.html
    In the link the teacher's answer points to, the answerer says to set SPARK_LOCAL_IP to "127.0.0.1". That doesn't actually work for those of us using the teacher's OOTB environment, because the IP behind the hostname isn't 127.0.0.1, it can change, and it differs from person to person. The point is to tell Spark which IP to start from.
    
    So what's ultimately needed:
    1. Add export SPARK_LOCAL_IP="xxxx" to load-spark-env.sh (located in spark/bin), where xxxx is the IP address of your machine's hostname, obtained with the ifconfig command, not "127.0.0.1". (See the sketch after this reply.)
    2. After that I hit a Java connection problem, because Hadoop wasn't started.
    To start it, see https://blog.csdn.net/u011495642/article/details/84063496
    
    Solved!
    2021-03-09 14:43:21
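    A minimal sketch of step 1 above (the interface name and IP here are placeholders; use whatever ifconfig reports on your own machine):

        # Find the IP your hostname resolves to (eth0 is just an example interface name)
        ifconfig eth0                  # note the inet address, e.g. 192.168.199.102

        # Append the export to load-spark-env.sh under $SPARK_HOME/bin,
        # substituting the address you found above
        echo 'export SPARK_LOCAL_IP="192.168.199.102"' >> $SPARK_HOME/bin/load-spark-env.sh

        # Then start the shell again
        $SPARK_HOME/bin/spark-shell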
Michael_PK 2020-05-28 14:36:44

Read the log carefully: it went past 16 retries. Run jps and check whether this machine has a lot of Spark jobs already running.
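
For example, a quick check (the PIDs and process list below are illustrative; jps prints one "PID ClassName" pair per running JVM):

    jps
    # 2210 NameNode
    # 2345 DataNode
    # 3921 SparkSubmit     <- each running spark-shell / spark-submit shows up as SparkSubmit
    # 4102 Jps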
