Posted to user@hive.apache.org by Andre Araujo <ar...@pythian.com> on 2014/06/18 22:21:12 UTC
Re:
Could you send a "show create table" for the two tables involved?
On 19 June 2014 00:25, Clay McDonald <st...@bateswhite.com> wrote:
> I’m trying to run the following hive join query and get the following
> error. Any suggestions?
>
> hive> select count(B.txn_id) AS CNT FROM txn_hdr_combined AS B JOIN upc_table AS C ON B.txn_id = C.txn_id;
>
> com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector
>
> Serialization trace:
> objectInspector (org.apache.hadoop.hive.ql.exec.ColumnInfo)
> signature (org.apache.hadoop.hive.ql.exec.RowSchema)
> rowSchema (org.apache.hadoop.hive.ql.exec.ReduceSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
> mapWork (org.apache.hadoop.hive.ql.plan.MapredWork)
> at com.esotericsoftware.kryo.Kryo.newInstantiator(Kryo.java:1097)
> at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:1109)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.create(FieldSerializer.java:526)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:502)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
> at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
> at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
> at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
> at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
> at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
> at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
> at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:810)
> at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:718)
> at org.apache.hadoop.hive.ql.exec.Utilities.clonePlan(Utilities.java:748)
> at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinTaskDispatcher.processCurrentTask(CommonJoinTaskDispatcher.java:503)
> at org.apache.hadoop.hive.ql.optimizer.physical.AbstractJoinTaskDispatcher.dispatch(AbstractJoinTaskDispatcher.java:182)
> at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.dispatch(TaskGraphWalker.java:111)
> at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.walk(TaskGraphWalker.java:194)
> at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.startWalking(TaskGraphWalker.java:139)
> at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinResolver.resolve(CommonJoinResolver.java:79)
> at org.apache.hadoop.hive.ql.optimizer.physical.PhysicalOptimizer.optimize(PhysicalOptimizer.java:90)
> at org.apache.hadoop.hive.ql.parse.MapReduceCompiler.compile(MapReduceCompiler.java:300)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8410)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:284)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:441)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1000)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>
> FAILED: SemanticException Generate Map Join Task Error: Class cannot be created (missing no-arg constructor): org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector
>
> Serialization trace:
> objectInspector (org.apache.hadoop.hive.ql.exec.ColumnInfo)
> signature (org.apache.hadoop.hive.ql.exec.RowSchema)
> rowSchema (org.apache.hadoop.hive.ql.exec.ReduceSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
> mapWork (org.apache.hadoop.hive.ql.plan.MapredWork)
>
> hive>
>
>
>
> *Clay McDonald*
> Database Administrator
>
> Bates White, LLC
> 1300 Eye St, NW, Suite 600 East
> Washington, DC 20005
> Main: 202.408.6110
> Cell: 202.560.4101
> Direct: 202.747.5962
> Email: clay.mcdonald@bateswhite.com
>
> ****************************************************
> This electronic message transmission contains information from Bates
> White, LLC, which may be confidential or privileged. The information is
> intended to be for the use of the individual or entity named above. If you
> are not the intended recipient, be aware that any disclosure, copying,
> distribution, or use of the contents of this information is prohibited.
>
> If you have received this electronic transmission in error, please notify
> me by telephone at 202.747.5962 or by electronic mail at
> clay.mcdonald@bateswhite.com immediately.
>
> *****************************************************
>
--
André Araújo
Big Data Consultant/Solutions Architect
The Pythian Group - Australia - www.pythian.com
Office (calls from within Australia): 1300 366 021 x1270
Office (international): +61 2 8016 7000 x270 *OR* +1 613 565 8696 x1270
Mobile: +61 410 323 559
Fax: +61 2 9805 0544
IM: pythianaraujo @ AIM/MSN/Y! or araujo@pythian.com @ GTalk
“Success is not about standing at the top, it's the steps you leave behind.”
— Iker Pou (rock climber)
RE: Re:
Posted by Clay McDonald <st...@bateswhite.com>.
CREATE TABLE txn_hdr_combined(
txn_id varchar(20) COMMENT 'from deserializer',
txn_hdr_src_cd smallint COMMENT 'from deserializer',
card_nbr bigint COMMENT 'from deserializer',
store_id int COMMENT 'from deserializer',
txn_tm varchar(20) COMMENT 'from deserializer',
txn_type_id smallint COMMENT 'from deserializer',
checker_nbr int COMMENT 'from deserializer',
txn_dte date COMMENT 'from deserializer',
register_nbr smallint COMMENT 'from deserializer',
register_txn_seq_nbr int COMMENT 'from deserializer',
card_scan_type_cd smallint COMMENT 'from deserializer',
txn_indicat1 int COMMENT 'from deserializer',
total_gross_amt float COMMENT 'from deserializer',
total_tax_amt float COMMENT 'from deserializer',
total_margin_amt float COMMENT 'from deserializer',
total_mkdn_amt float COMMENT 'from deserializer',
total_mfr_cpn_amt float COMMENT 'from deserializer',
total_misc_amt float COMMENT 'from deserializer',
total_item_qty int COMMENT 'from deserializer',
total_disc_amt float COMMENT 'from deserializer',
cycle_id int COMMENT 'from deserializer',
offer_deliv_cd smallint COMMENT 'from deserializer',
hh_dflt_ind smallint COMMENT 'from deserializer',
tender_amt_unknown float COMMENT 'from deserializer',
tender_amt_cash float COMMENT 'from deserializer',
tender_amt_chk float COMMENT 'from deserializer',
tender_amt_wic float COMMENT 'from deserializer',
tender_amt_foodstamps float COMMENT 'from deserializer',
tender_amt_misc float COMMENT 'from deserializer',
tender_amt_gas float COMMENT 'from deserializer',
tender_amt_cr_card float COMMENT 'from deserializer',
tender_amt_corp_chrg float COMMENT 'from deserializer',
tender_amt_dr_card float COMMENT 'from deserializer',
tender_amt_ebt float COMMENT 'from deserializer',
tender_amt_smart_chk float COMMENT 'from deserializer',
tender_amt_gift_certificates float COMMENT 'from deserializer',
tender_amt_othr float COMMENT 'from deserializer',
tender_amt_not_defined float COMMENT 'from deserializer',
tender_amt_mc float COMMENT 'from deserializer',
tender_amt_visa float COMMENT 'from deserializer',
tender_amt_amex float COMMENT 'from deserializer',
tender_amt_discover float COMMENT 'from deserializer')
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
'hdfs://node5.bateswhite.com:8020/apps/hive/warehouse/ops.db/txn_hdr_combined'
TBLPROPERTIES (
'numPartitions'='0',
'numFiles'='12',
'transient_lastDdlTime'='1403043095',
'totalSize'='283763066645',
'numRows'='0',
'rawDataSize'='0')
CREATE TABLE upc_table(
txn_id varchar(13),
txn_dte date,
pharm_flag bigint,
check_flag bigint,
bottle_refund bigint,
money_order bigint)
ROW FORMAT SERDE
'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'hdfs://node5.bateswhite.com:8020/apps/hive/warehouse/ops.db/upc_table'
TBLPROPERTIES (
'numPartitions'='0',
'numFiles'='2856',
'transient_lastDdlTime'='1402562506',
'totalSize'='49693625256',
'numRows'='0',
'rawDataSize'='0')
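Note that both join keys are varchar columns (txn_id varchar(20) in txn_hdr_combined, varchar(13) in upc_table), and the trace fails inside Utilities.clonePlan while CommonJoinTaskDispatcher is converting the common join into a map join. A workaround often suggested for this class of Kryo plan-serialization error — offered only as an unverified sketch, not a confirmed fix for this cluster — is to disable the automatic map-join conversion, or to take varchar out of the join key by casting to string:

```sql
-- Sketch 1 (untested): skip the map-join conversion path whose
-- plan cloning triggers the Kryo no-arg-constructor failure.
SET hive.auto.convert.join=false;

SELECT count(B.txn_id) AS CNT
FROM txn_hdr_combined AS B
JOIN upc_table AS C
  ON B.txn_id = C.txn_id;

-- Sketch 2 (untested): cast the varchar keys to string so the
-- varchar object inspector never enters the serialized plan.
SELECT count(B.txn_id) AS CNT
FROM txn_hdr_combined AS B
JOIN upc_table AS C
  ON cast(B.txn_id AS string) = cast(C.txn_id AS string);
```

If the map-join conversion is indeed the trigger, the first setting trades map-join performance for a plan that never needs to be cloned; the second keeps the optimizer free to choose the join strategy.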