Error starting Spark in YARN mode

Hi teacher, after configuring spark-env.sh for YARN mode, I ran ./spark-shell --master yarn --name testapp and got the error below (172.24.177.102 is the Alibaba Cloud private IP):

20/03/18 13:45:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/03/18 13:45:45 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
20/03/18 13:45:56 ERROR client.TransportClient: Failed to send RPC RPC 7853641892381695594 to /172.24.177.102:53760: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
20/03/18 13:45:56 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to get executor loss reason for executor id 1 at RPC address 172.24.177.102:53768, but got no response. Marking as slave lost.
java.io.IOException: Failed to send RPC RPC 7853641892381695594 to /172.24.177.102:53760: java.nio.channels.ClosedChannelException
	at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
	at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
	at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
	at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetFailure(AbstractChannel.java:987)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(AbstractChannel.java:869)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.write(DefaultChannelPipeline.java:1316)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
	at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38)
	at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1081)
	at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1128)
	at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.channels.ClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
20/03/18 13:45:56 ERROR cluster.YarnScheduler: Lost executor 1 on zhangbt3: Slave lost
20/03/18 13:46:02 ERROR cluster.YarnClientSchedulerBackend: YARN application has exited unexpectedly with state UNDEFINED! Check the YARN application logs for more details.
20/03/18 13:46:02 ERROR cluster.YarnClientSchedulerBackend: Diagnostics message: Shutdown hook called before final status was reported.
20/03/18 13:46:02 ERROR client.TransportClient: Failed to send RPC RPC 8855881767670035725 to /172.24.177.102:53784: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
20/03/18 13:46:02 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC RPC 8855881767670035725 to /172.24.177.102:53784: java.nio.channels.ClosedChannelException
	at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
	at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
	at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
	at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetFailure(AbstractChannel.java:987)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(AbstractChannel.java:869)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.write(DefaultChannelPipeline.java:1316)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
	at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38)
	at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1081)
	at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1128)
	at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.channels.ClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
20/03/18 13:46:02 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
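
For context, in YARN mode spark-env.sh mainly needs to point Spark at the Hadoop client configuration. A minimal sketch, assuming a typical install layout (the paths below are placeholders, not the actual environment in this question):

# spark-env.sh -- minimal settings for --master yarn (paths are assumptions)
export JAVA_HOME=/usr/java/jdk1.8.0_91            # assumed JDK path
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop     # must contain core-site.xml and yarn-site.xml
export YARN_CONF_DIR=$HADOOP_CONF_DIR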

2 Answers

懂渊 2020-08-02 15:03:23

Add the following to yarn-site.xml:

<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
    <description>Whether virtual memory limits will be enforced for containers</description>
</property>
<property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>3</value>
    <description>Ratio between virtual memory to physical memory when setting memory limits for containers</description>
</property>
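
Disabling the physical/virtual memory checks (or raising yarn.nodemanager.vmem-pmem-ratio) stops the NodeManager from killing the ApplicationMaster and executor containers for exceeding their memory limits, which is a common cause of the ClosedChannelException shown above. The new settings only take effect after YARN is restarted; a rough sequence, assuming a standard Hadoop layout with $HADOOP_HOME and $SPARK_HOME set, would be:

$HADOOP_HOME/sbin/stop-yarn.sh     # stop the ResourceManager and NodeManagers
$HADOOP_HOME/sbin/start-yarn.sh    # restart so the new yarn-site.xml is picked up
$SPARK_HOME/bin/spark-shell --master yarn --name testapp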


叁金 2020-03-19 15:37:12

This looks like a resource problem and the application never actually ran. Is your YARN environment working normally?
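
One way to check is with the standard YARN command-line tools (shown here only as a suggestion; <applicationId> is a placeholder for the id of the failed spark-shell run):

yarn node -list -all                        # are the NodeManagers registered and RUNNING?
yarn application -list -appStates ALL       # was the spark-shell application submitted at all?
yarn logs -applicationId <applicationId>    # pull the container logs mentioned in the error above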

  • OP WaniuZhang_ITBoy #1
    Setting a few parameters in yarn-site.xml fixed it (an example fragment follows below):
    yarn.scheduler.maximum-allocation-mb
    yarn.scheduler.minimum-allocation-mb
    yarn.nodemanager.vmem-pmem-ratio
    2020-03-20 12:28:34
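
    For illustration only, a yarn-site.xml fragment with those three parameters might look like the following; the values are examples, not the settings actually used here:

    <property>
        <name>yarn.scheduler.minimum-allocation-mb</name>
        <value>1024</value><!-- example value -->
    </property>
    <property>
        <name>yarn.scheduler.maximum-allocation-mb</name>
        <value>4096</value><!-- example value; must be at least executor memory plus overhead -->
    </property>
    <property>
        <name>yarn.nodemanager.vmem-pmem-ratio</name>
        <value>3</value><!-- example value; the Hadoop default is 2.1 -->
    </property>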