
Spark 2.1.0 standalone mode: running the start-all.sh script under sbin reports "JAVA_HOME is not set"

After running the start-all.sh script under Spark 2.1.0's sbin directory, the following is reported:

starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out

localhost: failed to launch: nice -n 0 /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://hadoop001:7077

localhost:   JAVA_HOME is not set

localhost: full log in /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out


The master's log contents are as follows:

[hadoop@hadoop001 ~]$ tail -100f /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

Spark Command: /usr/lib/jvm/jdk1.8.0_161/bin/java -cp /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/conf/:/home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host hadoop001 --port 7077 --webui-port 8080

========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

18/06/04 09:57:06 INFO Master: Started daemon with process name: 2321@hadoop001

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for TERM

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for HUP

18/06/04 09:57:06 INFO SignalUtils: Registered signal handler for INT

18/06/04 09:57:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

18/06/04 09:57:07 INFO SecurityManager: Changing view acls to: hadoop

18/06/04 09:57:07 INFO SecurityManager: Changing modify acls to: hadoop

18/06/04 09:57:07 INFO SecurityManager: Changing view acls groups to: 

18/06/04 09:57:07 INFO SecurityManager: Changing modify acls groups to: 

18/06/04 09:57:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set()

18/06/04 09:57:09 INFO Utils: Successfully started service 'sparkMaster' on port 7077.

18/06/04 09:57:09 INFO Master: Starting Spark master at spark://hadoop001:7077

18/06/04 09:57:09 INFO Master: Running Spark version 2.1.0

18/06/04 09:57:10 INFO Utils: Successfully started service 'MasterUI' on port 8080.

18/06/04 09:57:10 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.222.200:8080

18/06/04 09:57:10 INFO Utils: Successfully started service on port 6066.

18/06/04 09:57:10 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066

18/06/04 09:57:11 INFO Master: I have been elected leader! New state: ALIVE

But the master log shows no worker registration information (no registered memory or cores).


The worker log is simpler; it just says: JAVA_HOME is not set

I checked the environment variables, and ~/.bash_profile does contain JAVA_HOME:


export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_161

export PATH=$PATH:$JAVA_HOME/bin
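One likely reason the `~/.bash_profile` export doesn't help (an assumption worth checking, not something stated in the thread): the sbin scripts launch each worker over ssh in a non-login, non-interactive shell, and such a shell never sources `~/.bash_profile`, so a `JAVA_HOME` exported only there is invisible to the daemon. The situation can be reproduced locally:

```shell
# `env -i` starts with an empty environment, and a non-login `bash -c`
# does not source ~/.bash_profile -- roughly the environment an
# ssh-launched worker process sees.
env -i bash -c 'echo "JAVA_HOME=${JAVA_HOME:-<unset>}"'
# prints: JAVA_HOME=<unset>
```

This is why the usual advice is to put `JAVA_HOME` in `conf/spark-env.sh`, which the Spark launch scripts source themselves.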

After making this change, the "JAVA_HOME is not set" error no longer appears; the last few lines of the log are:

18/06/04 09:57:09 INFO Utils: Successfully started service 'sparkMaster' on port 7077.

18/06/04 09:57:09 INFO Master: Starting Spark master at spark://hadoop001:7077

18/06/04 09:57:09 INFO Master: Running Spark version 2.1.0

18/06/04 09:57:10 INFO Utils: Successfully started service 'MasterUI' on port 8080.

18/06/04 09:57:10 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.222.200:8080

18/06/04 09:57:10 INFO Utils: Successfully started service on port 6066.

18/06/04 09:57:10 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066

18/06/04 09:57:11 INFO Master: I have been elected leader! New state: ALIVE

18/06/04 10:15:19 ERROR Master: RECEIVED SIGNAL TERM

2 Answers

Set JAVA_HOME in spark-env.sh, stop all the processes, start over, and then check the logs again. Standalone mode is fine for experimenting, but it is rarely used in production.
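A minimal sketch of the fix the answer describes. The JDK path is copied from the master log above; adjust it to your own install, and create `conf/spark-env.sh` from the shipped template if it doesn't exist yet:

```shell
# conf/spark-env.sh -- sourced by every Spark daemon launch script,
# so JAVA_HOME set here reaches both the master and the workers.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_161
```

Then restart everything so the workers pick it up: run `sbin/stop-all.sh` followed by `sbin/start-all.sh`, and check the worker log for a successful registration with the master.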

  • Asker 慕神816625 #1
    OK, got it. Thank you very much, teacher.
    2018-06-04 20:49:08
  • Teacher, I hit this error too. A method I found online was to add JAVA_HOME to spark-config.sh, and startup then worked. Is the difference between spark-env.sh and spark-config.sh a matter of scope?
    export JAVA_HOME=<local JAVA_HOME path>
    2018-08-18 10:11:33
Michael_PK 2018-08-18 13:36:38

Add JAVA_HOME in spark-env.sh.
