Posted to dev@sqoop.apache.org by "Abraham Elmahrek (JIRA)" <ji...@apache.org> on 2015/03/27 00:27:52 UTC

[jira] [Updated] (SQOOP-2269) Sqoop2: Parquet integration tests

     [ https://issues.apache.org/jira/browse/SQOOP-2269?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Abraham Elmahrek updated SQOOP-2269:
------------------------------------
    Attachment: SQOOP-2269.001.patch

> Sqoop2: Parquet integration tests
> ---------------------------------
>
>                 Key: SQOOP-2269
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2269
>             Project: Sqoop
>          Issue Type: Bug
>          Components: sqoop2-kite-connector
>            Reporter: Abraham Elmahrek
>            Assignee: Abraham Elmahrek
>             Fix For: 2.0.0
>
>         Attachments: SQOOP-2269.001.patch
>
>
> Writing Hive integration tests with Parquet yields:
> {code}
> java.lang.RuntimeException: java.lang.NoSuchFieldError: doubleTypeInfo
>         at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:84)
>         at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>         at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>         at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>         at com.sun.proxy.$Proxy93.executeStatementAsync(Unknown Source)
>         at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>         at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>         at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NoSuchFieldError: doubleTypeInfo
>         at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.getObjectInspector(ArrayWritableObjectInspector.java:66)
>         at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.<init>(ArrayWritableObjectInspector.java:59)
>         at org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe.initialize(ParquetHiveSerDe.java:113)
>         at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:339)
>         at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:288)
>         at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:194)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1017)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1223)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1192)
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9209)
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:422)
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
>         at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:975)
>         at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:968)
>         at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:99)
>         at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:172)
>         at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>         at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>         ... 19 more
> {code}
> Let's figure that out and get a Hive/Parquet integration test.
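
A NoSuchFieldError such as the {{doubleTypeInfo}} one above typically means two incompatible copies of the class that should declare the field (here a Hive type-info class) ended up on the test classpath, and the older copy won. As a diagnostic sketch (a hypothetical helper, not part of the attached patch), one can print which jar a suspect class was actually loaded from:

{code:java}
// ClasspathProbe.java -- diagnostic sketch, not part of SQOOP-2269.001.patch.
// A NoSuchFieldError at runtime usually indicates the class was compiled
// against one version of a dependency but loaded from another (e.g. a
// shaded or bundled copy). Printing a class's code source shows which
// jar or directory the classloader actually resolved it from.
public class ClasspathProbe {
    public static String sourceOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // Core JDK classes have no code source; everything else reports its jar/dir.
        return className + " loaded from: "
                + (src == null ? "bootstrap classloader" : src.getLocation());
    }

    public static void main(String[] args) throws ClassNotFoundException {
        // On the integration-test classpath one would pass the Hive class
        // implicated in the stack trace; defaults to a JDK class for demo.
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(sourceOf(name));
    }
}
{code}

Run against the integration-test classpath with the class named in the stack trace, this should reveal whether it comes from {{hive-exec}} or from an older bundled copy pulled in transitively; the usual fix is then a Maven dependency exclusion on the offending artifact.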



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)