Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/11/18 05:00:53 UTC

[GitHub] [hudi] bhushanamk opened a new issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk opened a new issue #2258:
URL: https://github.com/apache/hudi/issues/2258


   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://cwiki.apache.org/confluence/display/HUDI/FAQ)? Yes
   
   - Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org.
   
   - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   The table queries fine from Hive, but querying it from Presto fails with a ClassCastException:

   presto:smart_api> select * from cities;
   Query 20201117_173741_00013_27sh4, FAILED, 2 nodes
   Splits: 59 total, 0 done (0.00%)
   0:03 [0 rows, 0B] [0 rows/s, 0B/s]
   Query 20201117_173741_00013_27sh4 failed: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Create an EMR cluster.
   2. Run DeltaStreamer:
   spark-submit --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
     --conf spark.sql.hive.convertMetastoreParquet=false \
     --packages org.apache.spark:spark-avro_2.11:2.4.6 \
     --master yarn --deploy-mode client \
     s3://mrTeam-4/dev/hudi-utilities-bundle_2.11-0.6.0.jar \
     --table-type COPY_ON_WRITE \
     --source-ordering-field cdc_ts \
     --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
     --target-base-path s3://nxgen-dataplatform-4/prod/mysql/default/cities \
     --target-table cities \
     --enable-hive-sync \
     --transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer \
     --payload-class org.apache.hudi.payload.AWSDmsAvroPayload \
     --hoodie-conf hoodie.datasource.write.recordkey.field=id \
     --hoodie-conf hoodie.datasource.write.partitionpath.field=state_id \
     --hoodie-conf hoodie.datasource.hive_sync.partition_fields=state_id \
     --hoodie-conf hoodie.deltastreamer.source.dfs.root=s3a://nxgen-dataplatform-4-rawzone/mysql/default/cities \
     --hoodie-conf hoodie.datasource.hive_sync.enable=true \
     --hoodie-conf hoodie.datasource.hive_sync.database=default \
     --hoodie-conf hoodie.datasource.hive_sync.table=cities \
     --hoodie-conf hoodie.datasource.hive_sync.username=,hoodie.datasource.hive_sync.password= \
     --hoodie-conf hoodie.datasource.hive_sync.jdbcurl=jdbc:hive2://ip-10-0-2-11.ap-south-1.compute.internal:10000/ \
     --hoodie-conf hoodie.datasource.hive_sync.partition_extractor_class=org.apache.hudi.hive.MultiPartKeysValueExtractor
   3. Query the data in Hive:

   hive> select * from cities LIMIT 10;
   OK
   20201117172523  20201117172523_1_3      4595    1000    41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet        2020-11-16 05:19:05.773893   4595    Akhnoor 181204  NULL    32.8455 74.4756 true    NULL    NULL            1000
   20201117172523  20201117172523_1_6      3822    1000    41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet        2020-11-16 05:19:05.769144   3822    Anantnag        192101  NULL    33.7000 75.1100 true    NULL    NULL            1000
   20201117172523  20201117172523_1_21     1964    1000    41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet        2020-11-16 05:19:05.757823   1964    Diver Anderbugh 193223  NULL    34.4590 74.4486 true    NULL    NULL            1000
   20201117172523  20201117172523_1_23     1512    1000    41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet        2020-11-16 05:19:05.755653   1512    Jammu   180001  NULL    NULL    NULL    true    24166   24166           1000
   20201117172523  20201117172523_40_3024  5684    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.780776   5684    Pathankot       145001  NULL    32.2511 75.6575 true    NULL    NULL            1001
   20201117172523  20201117172523_40_3025  6048    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.783052   6048    Kot     175028  NULL    31.5535 77.0278 true    NULL    NULL            1001
   20201117172523  20201117172523_40_3026  6090    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.783305   6090    Shillihar       175125  NULL    31.9148 77.2184 true    NULL    NULL            1001
   20201117172523  20201117172523_40_3027  4603    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.773941   4603    Mant Khas       176215  NULL    32.2193 76.3226 true    NULL    NULL            1001
   20201117172523  20201117172523_40_3028  6355    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.785007   6355    Dharampur       173209  NULL    30.9005 77.0150 true    NULL    NULL            1001
   20201117172523  20201117172523_40_3029  1682    1001    f9886f5e-78cb-40b0-887a-fa59c330bfb6-0_40-23-12131_20201117172523.parquet       2020-11-16 05:19:05.756270   1682    Bharlar 176402  NULL    32.2098 75.7375 true    NULL    NULL            1001
   Time taken: 0.162 seconds, Fetched: 10 row(s)
   
   4. Query the same table in Presto:

   presto:smart_api> select * from cities limit 10;
   
   Query 20201118_045733_00001_27sh4, FAILED, 2 nodes
   http://localhost:8889/ui/query.html?20201118_045733_00001_27sh4
   Splits: 60 total, 0 done (0.00%)
   CPU Time: 0.0s total,     0 rows/s,     0B/s, 0% active
   Per Node: 0.0 parallelism,     0 rows/s,     0B/s
   Parallelism: 0.0
   Peak User Memory: 0B
   Peak Total Memory: 0B
   Peak Task Total Memory: 0B
   0:02 [0 rows, 0B] [0 rows/s, 0B/s]
   
   Query 20201118_045733_00001_27sh4 failed: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
   java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
           at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:49)
   
   
   
   **Environment Description**
   
   * Hudi version : 0.6.0
   
   * Spark version : 2.4.6
   
   * Hive version : Hive 2.3.7-amzn-1
   
   * Hadoop version : Hadoop 2.10.0-amzn-0
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : No
   
   
   **Additional context**
   
   Add any other context about the problem here.
   
   **Stacktrace**
   
   
   presto:smart_api> select * from cities limit  10;
   
   Query 20201118_045733_00001_27sh4, FAILED, 2 nodes
   http://localhost:8889/ui/query.html?20201118_045733_00001_27sh4
   Splits: 60 total, 0 done (0.00%)
   CPU Time: 0.0s total,     0 rows/s,     0B/s, 0% active
   Per Node: 0.0 parallelism,     0 rows/s,     0B/s
   Parallelism: 0.0
   Peak User Memory: 0B
   Peak Total Memory: 0B
   Peak Task Total Memory: 0B
   0:02 [0 rows, 0B] [0 rows/s, 0B/s]
   
   Query 20201118_045733_00001_27sh4 failed: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
   java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable
           at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:49)
           at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:26)
           at com.facebook.presto.hive.GenericHiveRecordCursor.parseDecimalColumn(GenericHiveRecordCursor.java:416)
           at com.facebook.presto.hive.GenericHiveRecordCursor.parseColumn(GenericHiveRecordCursor.java:511)
           at com.facebook.presto.hive.GenericHiveRecordCursor.isNull(GenericHiveRecordCursor.java:466)
           at com.facebook.presto.hive.HiveRecordCursor.isNull(HiveRecordCursor.java:233)
           at com.facebook.presto.spi.RecordPageSource.getNextPage(RecordPageSource.java:112)
           at com.facebook.presto.operator.TableScanOperator.getOutput(TableScanOperator.java:262)
           at com.facebook.presto.operator.Driver.processInternal(Driver.java:382)
           at com.facebook.presto.operator.Driver.lambda$processFor$8(Driver.java:284)
           at com.facebook.presto.operator.Driver.tryWithLock(Driver.java:672)
           at com.facebook.presto.operator.Driver.processFor(Driver.java:277)
           at com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1077)
           at com.facebook.presto.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:162)
           at com.facebook.presto.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:545)
           at com.facebook.presto.$gen.Presto_0_238_3_amzn_0____20201117_171112_1.run(Unknown Source)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
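   For reference, one way to dig into this (a sketch; it assumes parquet-tools is available, and the S3 key is just the --target-base-path plus the partition and file name shown in the Hive output above) is to compare the Hive-registered schema with the physical column order of one base file:

   # copy one base file locally and dump its physical column order
   aws s3 cp s3://nxgen-dataplatform-4/prod/mysql/default/cities/1000/41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet .
   parquet-tools schema 41d094d9-1154-4988-b197-be660eb5d8cd-0_1-23-12092_20201117172523.parquet

   # then compare against the column order Hive reports
   hive -e 'describe default.cities'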
   





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729426423


   presto:smart_api> describe cities;
            Column         |     Type      |     Extra     | Comment
   ------------------------+---------------+---------------+---------
    _hoodie_commit_time    | varchar       |               |
    _hoodie_commit_seqno   | varchar       |               |
    _hoodie_record_key     | varchar       |               |
    _hoodie_partition_path | varchar       |               |
    _hoodie_file_name      | varchar       |               |
    cdc_ts                 | varchar       |               |
    id                     | bigint        |               |
    name                   | varchar       |               |
    code                   | varchar       |               |
    alias                  | varchar       |               |
    lat                    | decimal(10,4) |               |
    lng                    | decimal(10,4) |               |
    is_active              | boolean       |               |
    created_by             | bigint        |               |
    updated_by             | bigint        |               |
    op                     | varchar       |               |
    state_id               | bigint        | partition key |
   (17 rows)
   
   Query 20201118_050841_00010_27sh4, FINISHED, 3 nodes
   Splits: 36 total, 36 done (100.00%)
   0:00 [17 rows, 1.19KB] [62 rows/s, 4.37KB/s]
   





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729420144


   hive> describe cities;
   OK
   _hoodie_commit_time     string
   _hoodie_commit_seqno    string
   _hoodie_record_key      string
   _hoodie_partition_path  string
   _hoodie_file_name       string
   cdc_ts                  string
   id                      bigint
   name                    string
   code                    string
   alias                   string
   lat                     decimal(10,4)
   lng                     decimal(10,4)
   is_active               boolean
   created_by              bigint
   updated_by              bigint
   op                      string
   state_id                bigint
   
   # Partition Information
   # col_name              data_type               comment
   
   state_id                bigint
   





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-732657229


   Using Presto 0.238.3, which came bundled with EMR release emr-5.31.0.





[GitHub] [hudi] bvaradar commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bvaradar commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-730026955


   @bhasudha: Can you take a look at this issue?





[GitHub] [hudi] bhushanamk edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729420144


   hive> describe cities;
   OK
   _hoodie_commit_time     string
   _hoodie_commit_seqno    string
   _hoodie_record_key      string
   _hoodie_partition_path  string
   _hoodie_file_name       string
   cdc_ts                  string
   id                      bigint
   name                    string
   code                    string
   alias                   string
   lat                     decimal(10,4)
   lng                     decimal(10,4)
   is_active               boolean
   created_by              bigint
   updated_by              bigint
   op                      string
   state_id                bigint
   
   # Partition Information
   # col_name              data_type               comment
   
   state_id                bigint
   





[GitHub] [hudi] trikota-kc edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

trikota-kc edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-740555412


   OK, for anyone out there struggling with data type issues caused by improper column ordering when querying DeltaStreamer output from Presto,
   this is what worked for me (a rough sketch of these settings follows the list):
   - Use the latest EMR release, with the jars that AWS provides on the master node.
   - Set hive.parquet.use-column-names = true, then restart presto-server.
   - Fix the order of columns with hoodie.deltastreamer.transformer.sql. When using DMS and the transformer, make sure the "Op" column comes first.
   - The partition column MUST be the last one in order.
   - You can also try casting the partition column to string in hoodie.deltastreamer.transformer.sql.
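   A rough sketch of what those settings could look like (the catalog file path and the column list are my assumptions, taken from a default EMR layout and the cities schema posted above; hoodie.deltastreamer.transformer.sql is read by org.apache.hudi.utilities.transform.SqlQueryBasedTransformer, so that transformer has to be added, chained after the DMS one if your Hudi version supports a comma-separated transformer list):

   # Presto: enable name-based parquet column mapping, then restart presto-server
   # (file path assumed for a default EMR install)
   #   /etc/presto/conf/catalog/hive.properties
   hive.parquet.use-column-names=true

   # DeltaStreamer: reorder columns so Op comes first and the (string-cast) partition
   # column comes last; column list copied from the cities schema above
   --transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer,org.apache.hudi.utilities.transform.SqlQueryBasedTransformer \
   --hoodie-conf "hoodie.deltastreamer.transformer.sql=SELECT Op, cdc_ts, id, name, code, alias, lat, lng, is_active, created_by, updated_by, CAST(state_id AS STRING) AS state_id FROM <SRC>"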





[GitHub] [hudi] trikota-kc commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

trikota-kc commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-740000850


   > I switched to a new cluster and I am able to query the data now. Thanks @bhasudha

   Would you mind sharing what setting worked for you? And maybe what the issue was?
   I'm facing the same issue and it's very confusing.





[GitHub] [hudi] bhushanamk closed issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk closed issue #2258:
URL: https://github.com/apache/hudi/issues/2258


   





[GitHub] [hudi] trikota-kc edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

trikota-kc edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-740555412


   OK, for anyone out there struggling with data type issues caused by improper column ordering in Presto,
   this is what worked for me:
   - Use the latest EMR release, with the jars that AWS provides on the master node.
   - Set hive.parquet.use-column-names = true, then restart presto-server.
   - Fix the order of columns with hoodie.deltastreamer.transformer.sql.
   - The partition column MUST be the last one in order.
   - You can also try casting the partition column to string in hoodie.deltastreamer.transformer.sql.





[GitHub] [hudi] bhushanamk edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729420144


   hive> describe cities;
   OK
   _hoodie_commit_time     string
   _hoodie_commit_seqno    string
   _hoodie_record_key      string
   _hoodie_partition_path  string
   _hoodie_file_name       string
   cdc_ts                  string
   id                      bigint
   name                    string
   code                    string
   alias                   string
   lat                     decimal(10,4)
   lng                     decimal(10,4)
   is_active               boolean
   created_by              bigint
   updated_by              bigint
   op                      string
   state_id                bigint
   
   # Partition Information
   # col_name              data_type               comment
   
   state_id                bigint
   





[GitHub] [hudi] bhushanamk edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-732657229


   @bhasudha, using Presto 0.238.3, which came bundled with EMR release emr-5.31.0.





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-737163257


   I switched to a new cluster and I am able to query the data now. Thanks @bhasudha





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729417187


   In Presto I did set the below, but to no avail:
   set session hive.parquet_use_column_names=false;
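   The same knob can also be set cluster-wide in the Hive catalog properties and enabled rather than disabled (the file path below is an assumption for a default EMR install), followed by a presto-server restart:

   # /etc/presto/conf/catalog/hive.properties  (path assumed for EMR)
   hive.parquet.use-column-names=true

   # or per session:
   presto:smart_api> set session hive.parquet_use_column_names=true;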
   
   
   





[GitHub] [hudi] trikota-kc commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

trikota-kc commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-740555412


   OK, for anyone out there struggling with data type issues caused by improper column ordering in Presto,
   this is what worked for me:
   - Use the latest EMR release, with the jars that AWS provides on the master node.
   - Set hive.parquet.use-column-names = true, then restart presto-server.
   - Fix the order of columns with hoodie.deltastreamer.transformer.sql.
   - The partition column MUST be the last one in order.
   





[GitHub] [hudi] bhasudha commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhasudha commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-737056877


   @bhushanamk Are you able to select the other fields, excluding the decimal ones? @bvaradar I believe this has to do with decimal type support. Need to dig further.
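   For example (hypothetical queries, just to check whether only the decimal columns lat/lng trip the cast error):

   presto:smart_api> select _hoodie_record_key, id, name, state_id from cities limit 10;
   presto:smart_api> select lat, lng from cities limit 10;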
   





[GitHub] [hudi] bhushanamk commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729436047


   message hoodie.source.hoodie_source {
     optional binary _hoodie_commit_time (STRING);
     optional binary _hoodie_commit_seqno (STRING);
     optional binary _hoodie_record_key (STRING);
     optional binary _hoodie_partition_path (STRING);
     optional binary _hoodie_file_name (STRING);
     optional binary cdc_ts (STRING);
     optional int64 id;
     optional binary name (STRING);
     optional binary code (STRING);
     optional binary alias (STRING);
     optional int64 state_id;
     optional fixed_len_byte_array(5) lat (DECIMAL(10,4));
     optional fixed_len_byte_array(5) lng (DECIMAL(10,4));
     optional boolean is_active;
     optional int64 created_by;
     optional int64 updated_by;
     required binary Op (STRING);
   }
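   Reading this together with the Hive schema above (my interpretation, not something confirmed elsewhere in this thread): with positional column mapping (hive.parquet_use_column_names=false) the two orders diverge right after alias, which lines up with the cast error:

   # 0-based column positions, Hive schema vs. physical parquet order
   # Hive:    ... alias(9)  lat(10, decimal)     lng(11, decimal)  is_active(12) ... op(15)
   # Parquet: ... alias(9)  state_id(10, int64)  lat(11, decimal)  lng(12, decimal) ... Op(16)
   # -> Hive's lat (decimal) gets read from the file's state_id (int64) column,
   #    hence "LongWritable cannot be cast to HiveDecimalWritable"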
   
   





[GitHub] [hudi] bhushanamk edited a comment on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhushanamk edited a comment on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-729436047


   parquet-tools schema output:
   message hoodie.source.hoodie_source {
     optional binary _hoodie_commit_time (STRING);
     optional binary _hoodie_commit_seqno (STRING);
     optional binary _hoodie_record_key (STRING);
     optional binary _hoodie_partition_path (STRING);
     optional binary _hoodie_file_name (STRING);
     optional binary cdc_ts (STRING);
     optional int64 id;
     optional binary name (STRING);
     optional binary code (STRING);
     optional binary alias (STRING);
     optional int64 state_id;
     optional fixed_len_byte_array(5) lat (DECIMAL(10,4));
     optional fixed_len_byte_array(5) lng (DECIMAL(10,4));
     optional boolean is_active;
     optional int64 created_by;
     optional int64 updated_by;
     required binary Op (STRING);
   }
   
   





[GitHub] [hudi] bhasudha commented on issue #2258: [SUPPORT] Unable to query hudi tables in Presto

bhasudha commented on issue #2258:
URL: https://github.com/apache/hudi/issues/2258#issuecomment-730205400


   @bhushanamk which version of Presto are you using? And if you are using your own version, how did you do the setup?

