Posted to user@hive.apache.org by David Lerman <dl...@videoegg.com> on 2009/11/19 22:33:23 UTC

TransactionNotWritableException

Has anyone seen a TransactionNotWritableException during a LOAD DATA
command?  We get it very sporadically - I'd say one out of every couple
hundred loads - and if you rerun the same query, it runs fine.  We're using
Hive trunk r819727, and MySQL directly as the metastore.  Log below.
Thanks!
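
We haven't tried the setting the exception suggests yet. Presumably it would go in hive-site.xml alongside the other javax.jdo.option properties; the exact property name below is assumed from the JDO standard, and we haven't confirmed it actually prevents the error:

```xml
<!-- hive-site.xml: property name assumed from the JDO standard
     (javax.jdo.option.*); not verified to prevent this error -->
<property>
  <name>javax.jdo.option.NontransactionalWrite</name>
  <value>true</value>
</property>
```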

-Dave

2009-11-17 13:17:18,598 INFO  hive.HiveAPI (HiveAPI.java:executeUpdate(570))
- Executing Hive query: LOAD DATA INPATH '/vdat/tmp/scheduler/41619'
OVERWRITE INTO TABLE vdat_921 PARTITION (localday='2009-11-16')
2009-11-17 13:17:18,600 INFO  parse.ParseDriver
(ParseDriver.java:parse(347)) - Parsing command: LOAD DATA INPATH
'/vdat/tmp/scheduler/41619' OVERWRITE INTO TABLE vdat_921 PARTITION
(localday='2009-11-16')
2009-11-17 13:17:18,602 INFO  parse.ParseDriver
(ParseDriver.java:parse(362)) - Parse Completed
2009-11-17 13:17:18,604 INFO  metastore.HiveMetaStore
(HiveMetaStore.java:logStartFunction(167)) - 0: get_table : db=default
tbl=vdat_921
2009-11-17 13:17:18,723 INFO  hive.log
(MetaStoreUtils.java:getDDLFromFieldSchema(455)) - DDL: struct vdat_921 {
i32 flightcreativeid, string creativename, string flightname, string
campaignname, string advertisername, string service, i64 servicecount}
2009-11-17 13:17:18,782 ERROR ql.Driver (SessionState.java:printError(243))
- FAILED: Unknown exception: Cant write fields outside of transactions. You
may want to set 'NontransactionalWrite=true'.
org.datanucleus.jdo.exceptions.TransactionNotWritableException: Cant write
fields outside of transactions. You may want to set
'NontransactionalWrite=true'.
FailedObject:165711[OID]org.apache.hadoop.hive.metastore.model.MSerDeInfo
        at org.datanucleus.jdo.state.Hollow.transitionWriteField(Hollow.java:142)
        at org.datanucleus.state.AbstractStateManager.transitionWriteField(AbstractStateManager.java:565)
        at org.datanucleus.state.JDOStateManagerImpl.preWriteField(JDOStateManagerImpl.java:4239)
        at org.datanucleus.state.JDOStateManagerImpl.makeDirty(JDOStateManagerImpl.java:1161)
        at org.datanucleus.state.JDOStateManagerImpl.makeDirty(JDOStateManagerImpl.java:1182)
        at org.apache.hadoop.hive.metastore.model.MSerDeInfo.jdoMakeDirty(MSerDeInfo.java)
        at org.datanucleus.jdo.JDOAdapter.makeFieldDirty(JDOAdapter.java:1069)
        at org.datanucleus.sco.backed.Map.makeDirty(Map.java:409)
        at org.datanucleus.sco.backed.Map.put(Map.java:733)
        at org.apache.hadoop.hive.ql.metadata.Table.setSerdeParam(Table.java:563)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:382)
        at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:313)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer$tableSpec.<init>(BaseSemanticAnalyzer.java:263)
        at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:178)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:76)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:252)
        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:297)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:289)
        at [internal code which calls Driver]