
Running ./hivecontext.sh reports that the Hive table being accessed does not exist

The code is as follows:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

/**
  * Example of using Hive SQL through HiveContext.
  */
object HiveContextApp {

  def main(args: Array[String]): Unit = {

    // 1) Create the contexts
    val sparkConf = new SparkConf()
    // For local debugging only; in test and production, pass these in
    // through the launch script instead
    //   sparkConf.setAppName("SQLContextApp").setMaster("local[2]")
    val sc = new SparkContext(sparkConf)
    val hiveContext = new HiveContext(sc)

    // 2) Do the actual work: read the Hive table and print it
    hiveContext.table("hive_wordcount").show

    // 3) Release resources
    sc.stop()
  }

}
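
Since setAppName and setMaster are commented out above, those settings have to be supplied when the job is submitted. A minimal sketch of such a launch command (the jar path is a placeholder):

spark-submit \
    --class com.baidu.spark.HiveContextApp \
    --master local[2] \
    /path/to/sparksql-1.0.jar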

The Hive table exists:

https://img1.sycdn.imooc.com//szimg/5a95279c0001014119150457.jpg

The HDFS processes are also running:

https://img1.sycdn.imooc.com//szimg/5a9527ae00015fd509630120.jpg

The exception thrown is as follows; it complains that the corresponding Hive table does not exist:

https://img1.sycdn.imooc.com//szimg/5a9527bb0001786813880736.jpg


3 Answers

Did you add the driver?

  • OP 你猜一下 #1
    The driver was added; this is the launch script:
    spark-submit \
        --class com.baidu.spark.HiveContextApp \
        --jars /home/users/dxh/hadoop/mysql-connector-java-5.1.39-bin.jar \
        --master local[2] \
        /home/users/dxh/hadoop/lib/sparksql-1.0.jar
    2018-02-27 21:06:51
  • OP 你猜一下 #2
    Thank you so much; it turned out the driver's directory was written wrong.
    2018-02-27 21:09:12
Michael_PK 2018-02-27 20:21:23

Did you put hive-site.xml under spark/conf?
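
For reference, a minimal sketch of that step, assuming the usual HIVE_HOME and SPARK_HOME environment variables point at the two installations:

# Make the Hive metastore configuration visible to Spark SQL
cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/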

  • OP 你猜一下 #1
    I did. After putting it there, the error becomes:
    18/02/27 17:50:57 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory
    Is this caused by a Hive version incompatibility?
    The full error stack trace is in the post below.
    2018-02-27 20:38:56
OP 你猜一下 2018-02-27 18:35:17

I put hive-site.xml under the conf directory of the Spark installation, and it reports the following error:

18/02/27 17:50:57 INFO ObjectStore: ObjectStore, initialize called
18/02/27 17:50:57 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/02/27 17:50:57 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/02/27 17:50:57 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory
javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession.table(SparkSession.scala:574)
    at org.apache.spark.sql.SQLContext.table(SQLContext.scala:708)
    at com.baidu.spark.HiveContextApp$.main(HiveContextApp.scala:21)
    at com.baidu.spark.HiveContextApp.main(HiveContextApp.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Is this caused by a Hive version incompatibility?
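
This DataNucleus failure ("Error creating transactional connection factory") usually means the metastore's JDBC driver, MySQL in this setup, could not be loaded, rather than a Hive version mismatch; that matches the resolution above, where the driver jar's directory in the launch script turned out to be wrong. A minimal sketch of how to verify, reusing the paths from the script in this thread (adjust them to the actual machine):

# Check that the MySQL connector jar really exists at the path passed to --jars
ls -l /home/users/dxh/hadoop/mysql-connector-java-5.1.39-bin.jar

# If it is missing, point --jars (and --driver-class-path, which some setups
# also need) at the jar's real location; /path/to is a placeholder
spark-submit \
    --class com.baidu.spark.HiveContextApp \
    --jars /path/to/mysql-connector-java-5.1.39-bin.jar \
    --driver-class-path /path/to/mysql-connector-java-5.1.39-bin.jar \
    --master local[2] \
    /home/users/dxh/hadoop/lib/sparksql-1.0.jar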
