Posted to issues@drill.apache.org by "Ramana Inukonda Nagaraj (JIRA)" <ji...@apache.org> on 2015/01/12 01:41:34 UTC

[jira] [Commented] (DRILL-1980) Invalid FIXED_LEN_BYTE_ARRAY length for parquet file written by drill

    [ https://issues.apache.org/jira/browse/DRILL-1980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14273118#comment-14273118 ] 

Ramana Inukonda Nagaraj commented on DRILL-1980:
------------------------------------------------

drillbit.log contains the following stack trace:
{code}
org.apache.drill.exec.work.foreman.ForemanException: Unexpected exception during fragment initialization: Internal error: Error while applying rule DrillTableRule, args [rel#6060:EnumerableTableAccessRel.ENUMERABLE.ANY([]).[](table=[dfs, root, /parquet_all_types/0_0_0.parquet])]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:194) [drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.work.WorkManager$RunnableWrapper.run(WorkManager.java:254) [drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_45]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_45]
        at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]
Caused by: java.lang.AssertionError: Internal error: Error while applying rule DrillTableRule, args [rel#6060:EnumerableTableAccessRel.ENUMERABLE.ANY([]).[](table=[dfs, root, /parquet_all_types/0_0_0.parquet])]
        at org.eigenbase.util.Util.newInternal(Util.java:750) ~[optiq-core-0.9-drill-r12.jar:na]
        at org.eigenbase.relopt.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:246) ~[optiq-core-0.9-drill-r12.jar:na]
        at org.eigenbase.relopt.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:661) ~[optiq-core-0.9-drill-r12.jar:na]
        at net.hydromatic.optiq.tools.Programs$RuleSetProgram.run(Programs.java:165) ~[optiq-core-0.9-drill-r12.jar:na]
        at net.hydromatic.optiq.prepare.PlannerImpl.transform(PlannerImpl.java:276) ~[optiq-core-0.9-drill-r12.jar:na]
        at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.convertToDrel(DefaultSqlHandler.java:155) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.sql.handlers.ExplainHandler.getPlan(ExplainHandler.java:59) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:145) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:507) [drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:185) [drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        ... 4 common frames omitted
Caused by: org.apache.drill.common.exceptions.DrillRuntimeException: Failure creating scan.
        at org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:74) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:62) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.logical.DrillScanRule.onMatch(DrillScanRule.java:37) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.eigenbase.relopt.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:223) ~[optiq-core-0.9-drill-r12.jar:na]
        ... 12 common frames omitted
Caused by: java.io.IOException: Could not read footer: java.lang.IllegalArgumentException: Invalid FIXED_LEN_BYTE_ARRAY length: 0
        at parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:195) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:208) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at parquet.hadoop.ParquetFileReader.readFooters(ParquetFileReader.java:224) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.parquet.ParquetGroupScan.readFooter(ParquetGroupScan.java:208) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.parquet.ParquetGroupScan.<init>(ParquetGroupScan.java:167) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:157) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.parquet.ParquetFormatPlugin.getGroupScan(ParquetFormatPlugin.java:64) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.dfs.FileSystemPlugin.getPhysicalScan(FileSystemPlugin.java:124) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.store.AbstractStoragePlugin.getPhysicalScan(AbstractStoragePlugin.java:53) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.logical.DrillTable.getGroupScan(DrillTable.java:53) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        at org.apache.drill.exec.planner.logical.DrillScanRel.<init>(DrillScanRel.java:72) ~[drill-java-exec-0.7.0-SNAPSHOT-rebuffed.jar:0.7.0-SNAPSHOT]
        ... 15 common frames omitted
Caused by: java.lang.IllegalArgumentException: Invalid FIXED_LEN_BYTE_ARRAY length: 0
        at parquet.Preconditions.checkArgument(Preconditions.java:47) ~[parquet-common-1.5.1-drill-r4.jar:na]
        at parquet.schema.Types$PrimitiveBuilder.build(Types.java:285) ~[parquet-column-1.5.1-drill-r5.jar:na]
        at parquet.schema.Types$PrimitiveBuilder.build(Types.java:1) ~[parquet-column-1.5.1-drill-r5.jar:na]
        at parquet.schema.Types$Builder.named(Types.java:194) ~[parquet-column-1.5.1-drill-r5.jar:na]
        at parquet.format.converter.ParquetMetadataConverter.buildChildren(ParquetMetadataConverter.java:482) ~[parquet-hadoop-1.5.1-drill-r4.jar:na]
        at parquet.format.converter.ParquetMetadataConverter.fromParquetSchema(ParquetMetadataConverter.java:446) ~[parquet-hadoop-1.5.1-drill-r4.jar:na]
        at parquet.format.converter.ParquetMetadataConverter.fromParquetMetadata(ParquetMetadataConverter.java:390) ~[parquet-hadoop-1.5.1-drill-r4.jar:na]
        at parquet.format.converter.ParquetMetadataConverter.readParquetMetadata(ParquetMetadataConverter.java:385) ~[parquet-hadoop-1.5.1-drill-r4.jar:na]
        at parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:301) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at parquet.hadoop.ParquetFileReader$2.call(ParquetFileReader.java:185) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at parquet.hadoop.ParquetFileReader$2.call(ParquetFileReader.java:1) ~[parquet-hadoop-1.5.1-drill-r4.jar:0.7.0-SNAPSHOT]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) ~[na:1.7.0_45]
        ... 3 common frames omitted
{code}
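
For reference, the precondition that fires is the length check in parquet-column's Types$PrimitiveBuilder: a FIXED_LEN_BYTE_ARRAY must declare a length greater than zero, so a footer that records type_length = 0 (presumably for one of the columns Drill writes as fixed-length bytes, e.g. the wide decimals) cannot be rebuilt into a schema. Below is a minimal Java sketch of that check, assuming parquet-column 1.5.1-drill-r5 on the classpath; the column name is illustrative, not taken from the file.
{code}
import parquet.schema.PrimitiveType;
import parquet.schema.PrimitiveType.PrimitiveTypeName;
import parquet.schema.Types;

public class FixedLenFooterCheck {
  public static void main(String[] args) {
    // Rebuilding a primitive type the way ParquetMetadataConverter.buildChildren
    // does: a FIXED_LEN_BYTE_ARRAY with length 0 trips the precondition in
    // Types$PrimitiveBuilder.build and throws
    // "Invalid FIXED_LEN_BYTE_ARRAY length: 0".
    PrimitiveType t = Types.optional(PrimitiveTypeName.FIXED_LEN_BYTE_ARRAY)
        .length(0)                      // invalid: the length must be > 0
        .named("DECIMAL28SPARSE_col");  // hypothetical column name
    System.out.println(t);
  }
}
{code}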

> Invalid FIXED_LEN_BYTE_ARRAY length for parquet file written by drill
> ---------------------------------------------------------------------
>
>                 Key: DRILL-1980
>                 URL: https://issues.apache.org/jira/browse/DRILL-1980
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Storage - Parquet
>    Affects Versions: 0.7.0
>            Reporter: Ramana Inukonda Nagaraj
>            Assignee: Parth Chandra
>            Priority: Critical
>
> Created a parquet file from a JSON file containing columns of all the supported types.
> {code}
> 0: jdbc:drill:> CREATE TABLE parquet_all_types AS SELECT cast( INT_col as int) INT_col,cast( BIGINT_col as bigint) BIGINT_col,cast( DECIMAL9_col as decimal) DECIMAL9_col,cast( DECIMAL18_col as decimal(18,9)) DECIMAL18_col,cast( DECIMAL28SPARSE_col as decimal(28, 14)) DECIMAL28SPARSE_col,cast( DECIMAL38SPARSE_col as decimal(38, 19)) DECIMAL38SPARSE_col,cast( DATE_col as date) DATE_col,cast( TIME_col as time) TIME_col,cast( TIMESTAMP_col as timestamp) TIMESTAMP_col,cast( FLOAT4_col as float) FLOAT4_col,cast( FLOAT8_col as double) FLOAT8_col,cast( BIT_col as boolean) BIT_col,cast( VARCHAR_col as varchar(65000)) VARCHAR_col,cast( VAR16CHAR_col as varchar(65000)) VAR16CHAR_col,cast( VARBINARY_col as varbinary(65000)) VARBINARY_col,cast( INTERVALYEAR_col as interval year) INTERVALYEAR_col,cast( INTERVALDAY_col as interval day) INTERVALDAY_col FROM `/user/root/alltypes.json`;
> +------------+---------------------------+
> |  Fragment  | Number of records written |
> +------------+---------------------------+
> | 0_0        | 8                         |
> +------------+---------------------------+
> 1 row selected (0.595 seconds)
> {code}
> Tried reading the created parquet file from Drill. It fails with:
> {code}
> 0: jdbc:drill:> explain plan for select * from `/parquet_all_types/0_0_0.parquet`;
> Query failed: Query failed: Unexpected exception during fragment initialization: Internal error: Error while applying rule DrillTableRule, args [rel#6060:EnumerableTableAccessRel.ENUMERABLE.ANY([]).[](table=[dfs, root, /parquet_all_types/0_0_0.parquet])]
> Error: exception while executing query: Failure while executing query. (state=,code=0)
> {code}
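
To confirm the invalid length is baked into the file's footer metadata rather than being a planner-side problem, the footer of the written file can be read directly with parquet-hadoop; the same IllegalArgumentException surfaces there. A minimal sketch, assuming parquet-hadoop 1.5.1-drill-r4 and the Hadoop client libraries on the classpath; the file path is the one from the query above.
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import parquet.hadoop.ParquetFileReader;
import parquet.hadoop.metadata.ParquetMetadata;

public class DumpFooter {
  public static void main(String[] args) throws Exception {
    Path file = new Path("/parquet_all_types/0_0_0.parquet");
    // For the file written by the CTAS above, readFooter fails with
    // "Invalid FIXED_LEN_BYTE_ARRAY length: 0"; for a valid file it returns
    // the parsed footer.
    ParquetMetadata footer = ParquetFileReader.readFooter(new Configuration(), file);
    // Prints the declared schema, including the length of each
    // FIXED_LEN_BYTE_ARRAY column.
    System.out.println(footer.getFileMetaData().getSchema());
  }
}
{code}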



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)