Posted to dev@hive.apache.org by "Ashish Thusoo (JIRA)" <ji...@apache.org> on 2008/12/27 01:25:44 UTC
[jira] Commented: (HIVE-151) HiveQL Query execution bug: java.lang.NullPointerException
[ https://issues.apache.org/jira/browse/HIVE-151?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12659291#action_12659291 ]
Ashish Thusoo commented on HIVE-151:
------------------------------------
explain <query>;
shows that the fetch task (Stage-0), which should depend on Stage-3, is actually a root task. So I think what is happening is that the fetch task tries to read the output of Stage-3, which has not even started yet! This looks like a bad bug...
The output of the explain is as follows:
hive> explain SELECT t11.subject, t22.object , t33.subject , t66.object FROM ( SELECT t1.subject FROM triples t1 WHERE t1.pre
dicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t1.object='http://ontos/OntosMiner/Common.English/
ontology#Citation' ) t11 JOIN ( SELECT t2.subject , t2.object FROM triples t2 WHERE t2.predicate='http://sofa.semanticweb.org
/sofa/v1.0/system#__LABEL_REL' ) t22 ON (t11.subject=t22.subject) JOIN ( SELECT t3.subject , t3.object FROM triples t3 WHERE
t3.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_from' ) t33 ON (t11.subject=t33.object) JOIN ( SELECT t4.subject
FROM triples t4 WHERE t4.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t4.object='http://ont
os/OntosMiner/Common.English/ontology#Author' ) t44 ON (t44.subject=t33.subject) JOIN ( SELECT t5.subject, t5.object as obh F
ROM triples t5 WHERE t5.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_to' ) t55 ON (t55.subject=t44.subject) JOIN
( SELECT t6.subject, t6.object FROM triples t6 WHERE t6.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL'
) t66 ON (t66.subject=t55.obh);
OK
ABSTRACT SYNTAX TREE:
(TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_JOIN (TOK_JOIN (TOK_JOIN (TOK_JOIN (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF tripl
es t1)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t1 subject))) (TOK_WHERE (A
ND (= (TOK_COLREF t1 predicate) 'http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL') (= (TOK_COLREF t1 object) 'h
ttp://ontos/OntosMiner/Common.English/ontology#Citation'))))) t11) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF triples t2)
) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t2 subject)) (TOK_SELEXPR (TOK_CO
LREF t2 object))) (TOK_WHERE (= (TOK_COLREF t2 predicate) 'http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL')))) t22)
(= (TOK_COLREF t11 subject) (TOK_COLREF t22 subject))) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF triples t3)) (TOK_INSE
RT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t3 subject)) (TOK_SELEXPR (TOK_COLREF t3 obj
ect))) (TOK_WHERE (= (TOK_COLREF t3 predicate) 'http://www.ontosearch.com/2007/12/ontosofa-ns#_from')))) t33) (= (TOK_COLREF
t11 subject) (TOK_COLREF t33 object))) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF triples t4)) (TOK_INSERT (TOK_DESTINATI
ON (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t4 subject))) (TOK_WHERE (AND (= (TOK_COLREF t4 predicate) 'h
ttp://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL') (= (TOK_COLREF t4 object) 'http://ontos/OntosMiner/Common.Engl
ish/ontology#Author'))))) t44) (= (TOK_COLREF t44 subject) (TOK_COLREF t33 subject))) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK
_TABREF triples t5)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t5 subject)) (
TOK_SELEXPR (TOK_COLREF t5 object) obh)) (TOK_WHERE (= (TOK_COLREF t5 predicate) 'http://www.ontosearch.com/2007/12/ontosofa-
ns#_to')))) t55) (= (TOK_COLREF t55 subject) (TOK_COLREF t44 subject))) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF triple
s t6)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_COLREF t6 subject)) (TOK_SELEXPR (T
OK_COLREF t6 object))) (TOK_WHERE (= (TOK_COLREF t6 predicate) 'http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL'))))
t66) (= (TOK_COLREF t66 subject) (TOK_COLREF t55 obh)))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (T
OK_SELEXPR (TOK_COLREF t11 subject)) (TOK_SELEXPR (TOK_COLREF t22 object)) (TOK_SELEXPR (TOK_COLREF t33 subject)) (TOK_SELEXP
R (TOK_COLREF t66 object)))))
STAGE DEPENDENCIES:
Stage-1 is a root stage
Stage-2 is a root stage
Stage-3 depends on stages: Stage-2
Stage-0 is a root stage
STAGE PLANS:
Stage: Stage-1
Map Reduce
Alias -> Map Operator Tree:
t66:t6
Select Operator
expressions:
expr: subject
type: string
expr: object
type: string
expr: predicate
type: string
Filter Operator
predicate:
expr: (2 = 'http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL')
type: boolean
Select Operator
expressions:
expr: 0
type: string
expr: 1
type: string
Reduce Output Operator
key expressions:
expr: 0
type: string
sort order: +
Map-reduce partition columns:
expr: 0
type: string
tag: 1
value expressions:
expr: 0
type: string
expr: 1
type: string
Reduce Operator Tree:
Join Operator
condition map:
Inner Join 0 to 1
condition expressions:
0 {VALUE.0} {VALUE.1} {VALUE.2} {VALUE.3} {VALUE.4} {VALUE.5} {VALUE.6} {VALUE.7}
1 {VALUE.0} {VALUE.1}
Select Operator
expressions:
expr: 3
type: string
expr: 5
type: string
expr: 0
type: string
expr: 9
type: string
File Output Operator
compressed: true
table:
input format: org.apache.hadoop.mapred.TextInputFormat
output format: org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
Stage: Stage-2
Map Reduce
Alias -> Map Operator Tree:
t22:t2
Select Operator
expressions:
expr: subject
type: string
expr: object
type: string
expr: predicate
type: string
Filter Operator
predicate:
expr: (2 = 'http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL')
type: boolean
Select Operator
expressions:
expr: 0
type: string
expr: 1
type: string
Reduce Output Operator
key expressions:
expr: 0
type: string
sort order: +
Map-reduce partition columns:
expr: 0
type: string
tag: 1
value expressions:
expr: 0
type: string
expr: 1
type: string
t33:t3
Select Operator
expressions:
expr: subject
type: string
expr: object
type: string
expr: predicate
type: string
Filter Operator
predicate:
expr: (2 = 'http://www.ontosearch.com/2007/12/ontosofa-ns#_from')
type: boolean
Select Operator
expressions:
expr: 0
type: string
expr: 1
type: string
Reduce Output Operator
key expressions:
expr: 1
type: string
sort order: +
Map-reduce partition columns:
expr: 1
type: string
tag: 2
value expressions:
expr: 0
type: string
expr: 1
type: string
t11:t1
Select Operator
expressions:
expr: subject
type: string
expr: predicate
type: string
expr: object
type: string
Filter Operator
predicate:
expr: ((1 = 'http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL') and (2 = 'http://ontos/Ontos
Miner/Common.English/ontology#Citation'))
type: boolean
Select Operator
expressions:
expr: 0
type: string
Reduce Output Operator
key expressions:
expr: 0
type: string
sort order: +
Map-reduce partition columns:
expr: 0
type: string
tag: 0
value expressions:
expr: 0
type: string
Reduce Operator Tree:
Join Operator
condition map:
Inner Join 0 to 1
Inner Join 0 to 1
condition expressions:
0 {VALUE.0}
1 {VALUE.0} {VALUE.1}
2 {VALUE.0} {VALUE.1}
File Output Operator
compressed: true
table:
input format: org.apache.hadoop.mapred.SequenceFileInputFormat
output format: org.apache.hadoop.mapred.SequenceFileOutputFormat
name: binary_table
Stage: Stage-3
Map Reduce
Alias -> Map Operator Tree:
$INTNAME
Reduce Output Operator
key expressions:
expr: 3
type: string
sort order: +
Map-reduce partition columns:
expr: 3
type: string
tag: 0
value expressions:
expr: 3
type: string
expr: 4
type: string
expr: 0
type: string
expr: 1
type: string
expr: 2
type: string
t55:t5
Select Operator
expressions:
expr: subject
type: string
expr: object
type: string
expr: predicate
type: string
Filter Operator
predicate:
expr: (2 = 'http://www.ontosearch.com/2007/12/ontosofa-ns#_to')
type: boolean
Select Operator
expressions:
expr: 0
type: string
expr: 1
type: string
Reduce Output Operator
key expressions:
expr: 0
type: string
sort order: +
Map-reduce partition columns:
expr: 0
type: string
tag: 2
value expressions:
expr: 0
type: string
expr: 1
type: string
t44:t4
Select Operator
expressions:
expr: subject
type: string
expr: predicate
type: string
expr: object
type: string
Filter Operator
predicate:
expr: ((1 = 'http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL') and (2 = 'http://ontos/Ontos
Miner/Common.English/ontology#Author'))
type: boolean
Select Operator
expressions:
expr: 0
type: string
Reduce Output Operator
key expressions:
expr: 0
type: string
sort order: +
Map-reduce partition columns:
expr: 0
type: string
tag: 1
value expressions:
expr: 0
type: string
Reduce Operator Tree:
Join Operator
condition map:
Inner Join 0 to 1
Inner Join 1 to 1
condition expressions:
0 {VALUE.0} {VALUE.1} {VALUE.2} {VALUE.3} {VALUE.4}
1 {VALUE.0}
2 {VALUE.0} {VALUE.1}
Reduce Output Operator
key expressions:
expr: 7
type: string
sort order: +
Map-reduce partition columns:
expr: 7
type: string
tag: 0
value expressions:
expr: 0
type: string
expr: 1
type: string
expr: 5
type: string
expr: 2
type: string
expr: 3
type: string
expr: 4
type: string
expr: 6
type: string
expr: 7
type: string
Stage: Stage-0
Fetch Operator
limit: -1
Time taken: 0.69 seconds
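The broken dependency above can be sketched in a few lines. This is not Hive's actual scheduler, just an illustration under the assumption that every declared root stage is launched immediately: because Stage-0 (the fetch) is declared a root even though its input is produced by Stage-3, a naive scheduler would run it before its input exists.

```python
# Stage dependencies exactly as printed by EXPLAIN: stage -> declared parents.
declared = {
    "Stage-1": [],          # root
    "Stage-2": [],          # root
    "Stage-3": ["Stage-2"],
    "Stage-0": [],          # declared root -- but it reads Stage-3's output!
}

# What the fetch task actually consumes (the missing edge in the plan).
actual_inputs = {"Stage-0": ["Stage-3"]}

def runnable_first(deps):
    """Stages a naive scheduler would launch right away (no declared parents)."""
    return [stage for stage, parents in deps.items() if not parents]

for stage in runnable_first(declared):
    needed = actual_inputs.get(stage, [])
    if needed:
        # Nothing has run yet, so any real input dependency is unsatisfied.
        print(f"{stage} would read the output of {needed} before it runs")
```

With the missing `Stage-0 depends on Stage-3` edge restored, Stage-0 drops out of the initial runnable set and the fetch only executes after Stage-3 has written its output.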
> HiveQL Query execution bug: java.lang.NullPointerException
> ----------------------------------------------------------
>
> Key: HIVE-151
> URL: https://issues.apache.org/jira/browse/HIVE-151
> Project: Hadoop Hive
> Issue Type: Bug
> Components: Query Processor
> Environment: Ubuntu Linux 386, Hadoop 0.19.0, Hive trunk
> Reporter: Viacheslav Seledkin
> Fix For: 0.2.0
>
> Attachments: _user_hive_warehouse_triples_part-00000.tar.gz
>
>
> Executing a query
> ------------------------------------- query start ----------------------------------------------------
> SELECT t11.subject, t22.object , t33.subject , t55.object, t66.object
> FROM
> (
> SELECT t1.subject
> FROM triples t1
> WHERE
> t1.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL'
> AND
> t1.object='http://ontos/OntosMiner/Common.English/ontology#Citation'
> ) t11
> JOIN
> (
> SELECT t2.subject , t2.object
> FROM triples t2
> WHERE
> t2.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL'
> ) t22
> ON (t11.subject=t22.subject)
> JOIN
> (
> SELECT t3.subject , t3.object
> FROM triples t3
> WHERE
> t3.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_from'
>
> ) t33
> ON (t11.subject=t33.object)
> JOIN
> (
> SELECT t4.subject
> FROM triples t4
> WHERE
> t4.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL'
> AND
> t4.object='http://ontos/OntosMiner/Common.English/ontology#Author'
>
> ) t44
> ON (t44.subject=t33.subject)
> JOIN
> (
> SELECT t5.subject, t5.object
> FROM triples t5
> WHERE
> t5.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_to'
> ) t55
> ON (t55.subject=t44.subject)
> JOIN
> (
> SELECT t6.subject, t6.object
> FROM triples t6
> WHERE
> t6.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL'
> ) t66
> ON (t66.subject=t55.object)
> ------------------------------------- query end ----------------------------------------------------
> on table
> ------------------------------------- table start ----------------------------------------------------
> CREATE TABLE triples (foo string,subject string, predicate string, object string, foo2 string)
> ------------------------------------- table end -----------------------------------------------------
> gives the following output
> ------------------------------------ console output ----------------------------------------------
> INFO [main] (Driver.java:156) - Starting command: SELECT t11.subject, t22.object , t33.subject , t66.object FROM ( SELECT t1.subject FROM triples t1 WHERE t1.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t1.object='http://ontos/OntosMiner/Common.English/ontology#Citation' ) t11 JOIN ( SELECT t2.subject , t2.object FROM triples t2 WHERE t2.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL' ) t22 ON (t11.subject=t22.subject) JOIN ( SELECT t3.subject , t3.object FROM triples t3 WHERE t3.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_from' ) t33 ON (t11.subject=t33.object) JOIN ( SELECT t4.subject FROM triples t4 WHERE t4.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t4.object='http://ontos/OntosMiner/Common.English/ontology#Author' ) t44 ON (t44.subject=t33.subject) JOIN ( SELECT t5.subject, t5.object as obh FROM triples t5 WHERE t5.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_to' ) t55 ON (t55.subject=t44.subject) JOIN ( SELECT t6.subject, t6.object FROM triples t6 WHERE t6.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL' ) t66 ON (t66.subject=t55.obh)
> INFO [main] (ParseDriver.java:249) - Parsing command: SELECT t11.subject, t22.object , t33.subject , t66.object FROM ( SELECT t1.subject FROM triples t1 WHERE t1.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t1.object='http://ontos/OntosMiner/Common.English/ontology#Citation' ) t11 JOIN ( SELECT t2.subject , t2.object FROM triples t2 WHERE t2.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL' ) t22 ON (t11.subject=t22.subject) JOIN ( SELECT t3.subject , t3.object FROM triples t3 WHERE t3.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_from' ) t33 ON (t11.subject=t33.object) JOIN ( SELECT t4.subject FROM triples t4 WHERE t4.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__INSTANCEOF_REL' AND t4.object='http://ontos/OntosMiner/Common.English/ontology#Author' ) t44 ON (t44.subject=t33.subject) JOIN ( SELECT t5.subject, t5.object as obh FROM triples t5 WHERE t5.predicate='http://www.ontosearch.com/2007/12/ontosofa-ns#_to' ) t55 ON (t55.subject=t44.subject) JOIN ( SELECT t6.subject, t6.object FROM triples t6 WHERE t6.predicate='http://sofa.semanticweb.org/sofa/v1.0/system#__LABEL_REL' ) t66 ON (t66.subject=t55.obh)
> INFO [main] (ParseDriver.java:263) - Parse Completed
> INFO [main] (HiveMetaStore.java:126) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> INFO [main] (ObjectStore.java:124) - ObjectStore, initialize called
> INFO [main] (ObjectStore.java:146) - found resource jpox.properties at file:/home/vseledkin/workspace/HiveDrv/bin/jpox.properties
> WARN [main] (Log4JLogger.java:98) - Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
> WARN [main] (Log4JLogger.java:98) - Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
> INFO [main] (Log4JLogger.java:79) - ================= Persistence Configuration ===============
> INFO [main] (Log4JLogger.java:79) - JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
> INFO [main] (Log4JLogger.java:79) - JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
> INFO [main] (Log4JLogger.java:79) - ===========================================================
> INFO [main] (Log4JLogger.java:79) - Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of org.apache.hadoop.hive.metastore.model.MDatabase since it was managed previously
> INFO [main] (Log4JLogger.java:79) - No manager for annotations was found in the CLASSPATH so all annotations are ignored.
> WARN [main] (Log4JLogger.java:98) - MetaData Parser encountered an error in file "jar:file:/home/vseledkin/workspace/hive/build/hive_metastore.jar!/package.jdo" at line 282, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of org.apache.hadoop.hive.metastore.model.MStorageDescriptor since it was managed previously
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of org.apache.hadoop.hive.metastore.model.MSerDeInfo since it was managed previously
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of org.apache.hadoop.hive.metastore.model.MTable since it was managed previously
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of org.apache.hadoop.hive.metastore.model.MPartition since it was managed previously
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : SERDES, InheritanceStrategy : new-table]
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : SDS, InheritanceStrategy : new-table]
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : TBLS, InheritanceStrategy : new-table]
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MPartition [Table : PARTITIONS, InheritanceStrategy : new-table]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : SERDE_PARAMS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MPartition.parameters [Table : PARTITION_PARAMS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MPartition.values [Table : PARTITION_KEY_VALS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : TABLE_PARAMS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : PARTITION_KEYS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : BUCKETING_COLS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : COLUMNS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : SD_PARAMS]
> INFO [main] (Log4JLogger.java:79) - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : SORT_COLS]
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table SERDES
> INFO [main] (Log4JLogger.java:79) - Validating 0 foreign key(s) for table SERDES
> INFO [main] (Log4JLogger.java:79) - Validating 1 index(es) for table SERDES
> INFO [main] (Log4JLogger.java:79) - Validating 2 unique key(s) for table PARTITIONS
> INFO [main] (Log4JLogger.java:79) - Validating 2 foreign key(s) for table PARTITIONS
> INFO [main] (Log4JLogger.java:79) - Validating 4 index(es) for table PARTITIONS
> INFO [main] (Log4JLogger.java:79) - Validating 2 unique key(s) for table TBLS
> INFO [main] (Log4JLogger.java:79) - Validating 2 foreign key(s) for table TBLS
> INFO [main] (Log4JLogger.java:79) - Validating 4 index(es) for table TBLS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table SDS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table SDS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table SDS
> INFO [main] (Log4JLogger.java:79) - Validating 2 unique key(s) for table DBS
> INFO [main] (Log4JLogger.java:79) - Validating 0 foreign key(s) for table DBS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table DBS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table SORT_COLS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table SORT_COLS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table SORT_COLS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table TABLE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table TABLE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table TABLE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table COLUMNS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table COLUMNS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table COLUMNS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table PARTITION_KEYS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table PARTITION_KEYS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table PARTITION_KEYS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table SD_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table SD_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table SD_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table PARTITION_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table PARTITION_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table PARTITION_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table PARTITION_KEY_VALS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table PARTITION_KEY_VALS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table PARTITION_KEY_VALS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table SERDE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table SERDE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table SERDE_PARAMS
> INFO [main] (Log4JLogger.java:79) - Validating 1 unique key(s) for table BUCKETING_COLS
> INFO [main] (Log4JLogger.java:79) - Validating 1 foreign key(s) for table BUCKETING_COLS
> INFO [main] (Log4JLogger.java:79) - Validating 2 index(es) for table BUCKETING_COLS
> INFO [main] (Log4JLogger.java:79) - Catalog "", Schema "APP" initialised - managing 14 classes
> INFO [main] (Log4JLogger.java:79) - >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
> INFO [main] (ObjectStore.java:110) - Initialized ObjectStore
> INFO [main] (SemanticAnalyzer.java:3086) - Starting Semantic Analysis
> INFO [main] (SemanticAnalyzer.java:3088) - Completed phase 1 of Semantic Analysis
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:3091) - Completed getting MetaData in Semantic Analysis
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1, string reducesinkvalue2, string reducesinkvalue3, string reducesinkvalue4}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1, string reducesinkvalue2, string reducesinkvalue3, string reducesinkvalue4, string reducesinkvalue5, string reducesinkvalue6, string reducesinkvalue7}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:579) - Get metadata for source tables
> INFO [main] (HiveMetaStore.java:164) - 0: get_table : db=default tbl=triples
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct triples { string tid, string subject, string predicate, string object, string type}
> INFO [main] (SemanticAnalyzer.java:595) - Get metadata for subqueries
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (SemanticAnalyzer.java:602) - Get metadata for destination tables
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1, string reducesinkvalue2, string reducesinkvalue3, string reducesinkvalue4}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1, string reducesinkvalue2, string reducesinkvalue3, string reducesinkvalue4, string reducesinkvalue5, string reducesinkvalue6, string reducesinkvalue7}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string reducesinkkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string reducesinkvalue0, string reducesinkvalue1}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_sortable_table { string joinkey0}
> INFO [main] (SemanticAnalyzer.java:3107) - Completed partition pruning
> INFO [main] (SemanticAnalyzer.java:3111) - Completed sample pruning
> INFO [main] (MetaStoreUtils.java:461) - DDL: struct binary_table { string temporarycol0, string temporarycol1, string temporarycol2, string temporarycol3, string temporarycol4}
> INFO [main] (SemanticAnalyzer.java:3120) - Completed plan generation
> INFO [main] (Driver.java:173) - Semantic Analysis Completed
> Total MapReduce jobs = 3
> INFO [main] (SessionState.java:254) - Total MapReduce jobs = 3
> Number of reducers = 1
> INFO [main] (SessionState.java:254) - Number of reducers = 1
> In order to change numer of reducers use:
> INFO [main] (SessionState.java:254) - In order to change numer of reducers use:
> set mapred.reduce.tasks = <number>
> INFO [main] (SessionState.java:254) - set mapred.reduce.tasks = <number>
> WARN [main] (ExecDriver.java:109) - Number of reduce tasks not specified. Defaulting to jobconf value of: 1
> INFO [main] (ExecDriver.java:238) - Adding input file /user/hive/warehouse/triples
> WARN [main] (JobClient.java:547) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> INFO [main] (FileInputFormat.java:181) - Total input paths to process : 1
> Starting Job = job_200812091129_0144, Tracking URL = http://ubunder.avicomp.com:50030/jobdetails.jsp?jobid=job_200812091129_0144
> INFO [main] (SessionState.java:254) - Starting Job = job_200812091129_0144, Tracking URL = http://ubunder.avicomp.com:50030/jobdetails.jsp?jobid=job_200812091129_0144
> Kill Command = /home/vseledkin/workspace/HiveDrv/programs/hadoop-0.19.0 job -Dmapred.job.tracker=ubunder.avicomp.com:9001 -kill job_200812091129_0144
> INFO [main] (SessionState.java:254) - Kill Command = /home/vseledkin/workspace/HiveDrv/programs/hadoop-0.19.0 job -Dmapred.job.tracker=ubunder.avicomp.com:9001 -kill job_200812091129_0144
> map = 0%, reduce =0%
> INFO [main] (SessionState.java:254) - map = 0%, reduce =0%
> map = 50%, reduce =0%
> INFO [main] (SessionState.java:254) - map = 50%, reduce =0%
> map = 100%, reduce =0%
> INFO [main] (SessionState.java:254) - map = 100%, reduce =0%
> map = 100%, reduce =100%
> INFO [main] (SessionState.java:254) - map = 100%, reduce =100%
> ERROR [main] (SessionState.java:263) - Ended Job = job_200812091129_0144 with errors
> Ended Job = job_200812091129_0144 with errors
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver
> ERROR [main] (SessionState.java:263) - FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver
> ------------------------------------ console output end ----------------------------------------
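As the console hint above notes ("set mapred.reduce.tasks = <number>"), the reducer count can be set before re-running the query. A minimal session sketch, assuming the Hive CLI is on the PATH; the value 4 and the elided query are illustrative only:

```shell
# Illustrative only: override the default single reducer before re-running,
# per the hint in the console output above. Inspect stage dependencies with
# EXPLAIN first, since the report suggests Stage-0 is wrongly a root task.
hive -e "
set mapred.reduce.tasks=4;
EXPLAIN SELECT t11.subject, t22.object FROM ... ;
"
```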
> The stack trace from the Hadoop task logs follows:
> ------------------------------------ stack trace ---------------------------------------------------
> java.lang.NullPointerException
> at org.apache.hadoop.hive.ql.exec.ExecReducer.configure(ExecReducer.java:81)
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:58)
> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:83)
> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:337)
> at org.apache.hadoop.mapred.Child.main(Child.java:155)
> ------------------------------------ stack trace end ---------------------------------------------
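A plausible reading of the trace (an assumption, not verified against the Hive source of that era): ExecReducer.configure() dereferences the deserialized reduce plan without a null check, so when the fetch stage runs before Stage-3 has shipped its plan, the lookup comes back null and configure() throws. The class and method names below are hypothetical stand-ins, not actual Hive code; the sketch only demonstrates the failure mode and a guard:

```java
// Illustrative only: models the suspected failure mode from the stack trace.
// Names are hypothetical; this is NOT Hive's ExecReducer.
public class NullPlanDemo {
    // Stands in for the reduce plan ExecReducer would deserialize from the job conf.
    static Object deserializeReducePlan(boolean planWasShipped) {
        // If the dependent stage never ran, the plan lookup can come back null.
        return planWasShipped ? new Object() : null;
    }

    public static String configure(boolean planWasShipped) {
        Object plan = deserializeReducePlan(planWasShipped);
        if (plan == null) {
            // A guard like this would surface a clear error instead of an NPE
            // deep inside ReflectionUtils.setConf().
            return "missing reduce plan";
        }
        return "configured";
    }

    public static void main(String[] args) {
        System.out.println(configure(true));
        System.out.println(configure(false));
    }
}
```

The real fix, per the comment above, is in the planner (Stage-0 should depend on Stage-3, not be a root task); the guard only improves the error message.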
> The attached file contains table data for reproducing the problematic query.