Both spark-shell (Scala) and pyspark fail on startup. It looks like a path problem?

17/12/21 17:55:40 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0

17/12/21 17:55:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

17/12/21 17:55:41 WARN Utils: Your hostname, hadoop001 resolves to a loopback address: 127.0.0.1; using 10.41.13.210 instead (on interface eth1)

17/12/21 17:55:41 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address

17/12/21 17:55:44 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory

javax.jdo.JDOFatalInternalException: Error creating transactional connection factory

at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)

at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
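(Side note on the Utils warning above: the log's own suggestion is to pin SPARK_LOCAL_IP. A minimal sketch, assuming the standard conf/spark-env.sh mechanism; 10.41.13.210 is the address the warning itself reported, so substitute your own:)

    # conf/spark-env.sh
    # Bind Spark to the real interface instead of the loopback 127.0.0.1.
    # The address below is the one the warning reported; adjust for your host.
    export SPARK_LOCAL_IP=10.41.13.210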




3 Answers

Asker Beebop 2017-12-22 10:54:55

Running start-all.sh also has problems:


starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out

localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-hadoop001.out
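Those two lines are just the normal startup banner; any real failure is written to the .out files they name. A quick way to check, using the master log path verbatim from the banner above:

    # inspect the tail of the master log for the actual error
    tail -n 50 /home/hadoop/app/spark-2.1.0-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop001.out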


Asker Beebop 2017-12-22 10:22:19

The SparkContext (sc) never came up.

Asker Beebop 2017-12-22 10:04:48


Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.


Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.


Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':


Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':

 

Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient


Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient


Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient



Caused by: java.lang.reflect.InvocationTargetException: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory


Caused by: java.lang.reflect.InvocationTargetException: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

  

<console>:14: error: not found: value spark

       import spark.implicits._

              ^

<console>:14: error: not found: value spark

       import spark.sql
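The two "not found: value spark" errors are a downstream symptom: the shell only defines spark and sc once the session initializes, and here initialization died in HiveExternalCatalog because the metastore could not load com.mysql.jdbc.Driver. A hypothetical sanity check, assuming the connector jar sits at ~/software/mysql-connector-java-5.1.27-bin.jar (the path and version are illustrative):

    # confirm the jar actually contains the driver class the metastore asks for
    jar tf ~/software/mysql-connector-java-5.1.27-bin.jar | grep com/mysql/jdbc/Driver.class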


  • Michael_PK: The log already tells you what is wrong: the MySQL driver is missing. See the steps in the video for details.
    2017-12-22 11:25:42
  • Asker Beebop replied to Michael_PK #2
    Right. I do not know what I changed, but today Hive and HDFS will not start either. Thanks, I am still troubleshooting.
    2017-12-22 12:17:41
  • Michael_PK replied to Asker Beebop #3
    Get HDFS up first, then launch spark-shell with --jars (see the command sketch below).
    2017-12-22 12:21:45
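A minimal sketch of that suggested fix, assuming the MySQL connector jar lives at ~/software/mysql-connector-java-5.1.27-bin.jar (the path and version are illustrative; use the JDBC driver your hive-site.xml connection URL requires):

    # start HDFS first, then launch spark-shell with the JDBC driver on its classpath
    start-dfs.sh
    spark-shell --master local[2] \
      --jars ~/software/mysql-connector-java-5.1.27-bin.jar

If the metastore still cannot see the driver, passing the same jar via --driver-class-path as well is a common companion step.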