The error output is as follows:
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_table : db=default tbl=t
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=t
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_table : db=default tbl=t
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=t
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: create_table: Table(tableName:t, dbName:default, owner:hadoop, createTime:1595262226, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:value, type:string, comment:null)], location:file:/user/hive/warehouse/t, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"key","type":"string","nullable":true,"metadata":{}},{"name":"value","type":"string","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=create_table: Table(tableName:t, dbName:default, owner:hadoop, createTime:1595262226, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:value, type:string, comment:null)], location:file:/user/hive/warehouse/t, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"key","type":"string","nullable":true,"metadata":{}},{"name":"value","type":"string","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
20/07/21 00:23:46 WARN HiveMetaStore: Location: file:/user/hive/warehouse/t specified for non-external table:t
20/07/21 00:23:46 INFO FileUtils: Creating directory if it doesn't exist: file:/user/hive/warehouse/t
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/t is not a directory or unable to create one);
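
Note the file: scheme in the table location logged above: Spark is resolving the Hive warehouse path /user/hive/warehouse against the local filesystem rather than HDFS, so the metastore tries (and fails) to create that directory on the local disk. This typically happens when hive-site.xml, or an fs.defaultFS pointing at HDFS, is not on Spark's classpath. A minimal workaround sketch in Scala, assuming the warehouse actually lives at hdfs:///user/hive/warehouse (an assumed path; adjust to your cluster), is to set spark.sql.warehouse.dir explicitly when building the session:

import org.apache.spark.sql.SparkSession

// Sketch only: point the warehouse at HDFS instead of the local filesystem.
// "hdfs:///user/hive/warehouse" is an assumed location; match it to
// fs.defaultFS / hive.metastore.warehouse.dir on your cluster.
val spark = SparkSession.builder()
  .appName("create-table-t")
  .config("spark.sql.warehouse.dir", "hdfs:///user/hive/warehouse")
  .enableHiveSupport()
  .getOrCreate()

// Re-run the statement that failed above; the log shows table t with
// columns (key string, value string).
spark.sql("CREATE TABLE t (key STRING, value STRING)")

The same setting can be passed to the spark-sql shell as --conf spark.sql.warehouse.dir=hdfs:///user/hive/warehouse. Alternatively, if a local-filesystem warehouse is actually intended, creating /user/hive/warehouse on the local disk with write permission for the hadoop user should also clear the MetaException.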