Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/01/18 02:57:44 UTC

[GitHub] [hudi] logan-jun opened a new issue #4623: [SUPPORT] Delete is not working in python, parquet files

logan-jun opened a new issue #4623:
URL: https://github.com/apache/hudi/issues/4623


   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? y
   
   - Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org.
   
   - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   When running a delete operation against a Hudi table from a PySpark job on AWS Glue, with the records to delete read from a Parquet file, the write fails with `org.apache.avro.SchemaParseException`.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Prepare a Parquet file for the original dataset.
   2. Prepare a Parquet file containing the records to delete.
   3. Run the Python code below with the delete operation.
   
   
   **Expected behavior**
   
   Records in the original dataset whose primary keys appear in the delete dataset are deleted.
   
   **Environment Description**
   
   * Hudi version : 0.9.0 (Scala 2.12 bundle)
   
   * Spark version : 3
   
   * Hive version : x
   
   * Hadoop version : x
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : no, but running on AWS Glue
   
   
   
   **Stacktrace**
   
   ```
   org.apache.avro.SchemaParseException: Illegal initial character: 500m-h-pq_record
   ```
   
   This is my code:
   
   ```python
   import sys
   
   import pyspark
   from pyspark import SparkConf, SparkContext
   from pyspark.sql import SparkSession
   from pyspark.sql.functions import col, asc, desc
   from awsglue.utils import getResolvedOptions
   from awsglue.dynamicframe import DynamicFrame
   from awsglue.context import GlueContext
   
   # Glue provides the SparkSession.
   spark = GlueContext(SparkContext.getOrCreate()).sparkSession
   
   # Job arguments: source/target paths, shuffle parallelism, and the Hudi table name.
   args = getResolvedOptions(sys.argv, ['source', 'target', 'pal_num', 'tbl_nm'])
   
   # Records to delete; rename the partition column to match the table schema.
   df_deletes = spark.read.parquet(args['source'])
   df_deletes = df_deletes.withColumnRenamed("part_dt", "col_dt")
   
   hudi_options = {
       'hoodie.table.name': args['tbl_nm'],
       'hoodie.datasource.write.recordkey.field': 'col_pk',
       'hoodie.datasource.write.partitionpath.field': 'col_dt',
       'hoodie.datasource.write.table.name': args['tbl_nm'],
       'hoodie.datasource.write.operation': 'delete',
       'hoodie.datasource.write.precombine.field': 'col_pk',
       'hoodie.delete.shuffle.parallelism': args['pal_num'],
       'hoodie.upsert.shuffle.parallelism': args['pal_num'],
       'hoodie.insert.shuffle.parallelism': args['pal_num']
   }
   
   # Writing the Hudi table (delete operation against the existing table).
   df_deletes.write.format("hudi").options(**hudi_options).mode("append").save(args['target'])
   ```
   
   and it returns this error: `org.apache.avro.SchemaParseException: Illegal initial character: 500m-h-pq_record`
   
   There is no column "500m-h-pq_record" in my table; "500m-h-pq" is actually the table name (args['tbl_nm']).
   I do not understand where "500m-h-pq_record" came from.
   





[GitHub] [hudi] xushiyan commented on issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
xushiyan commented on issue #4623:
URL: https://github.com/apache/hudi/issues/4623#issuecomment-1015075092


   > @logan-jun The Hudi table name is used to generate the Avro schema record name, and a hyphen is not a valid character in an Avro schema name. See https://avro.apache.org/docs/1.8.2/spec.html#names
   > 
   > > The name portion of a fullname, record field names, and enum symbols must:
   > > start with [A-Za-z_]
   > > subsequently contain only [A-Za-z0-9_]
   > 
   > Making the table name start with a letter and changing the hyphens to underscores should resolve the problem.
   
   It needs to start with a letter or an underscore.
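   
   As an aside (not from the original thread), here is a minimal sketch that checks a generated record name such as `<table name>_record` against the Avro rule quoted above. The helper function and the compliant sample name 'deltest_500m' are made up for this example; '500m-h-pq' and '500mdeltest' are the names discussed in this thread.
   
   ```python
   import re
   
   # Avro name rule from the spec: start with [A-Za-z_],
   # then only [A-Za-z0-9_] (no hyphens, no leading digits).
   AVRO_NAME = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")
   
   def is_valid_avro_name(name: str) -> bool:
       """Return True if `name` is a legal Avro record name."""
       return AVRO_NAME.fullmatch(name) is not None
   
   # Hudi appears to derive the schema record name as "<table name>_record",
   # so the table name itself must satisfy the same rule.
   for table in ["500m-h-pq", "500mdeltest", "deltest_500m"]:
       record_name = f"{table}_record"
       status = "valid" if is_valid_avro_name(record_name) else "invalid"
       print(f"{record_name}: {status}")
   
   # 500m-h-pq_record: invalid    (hyphens, leading digit)
   # 500mdeltest_record: invalid  (leading digit)
   # deltest_500m_record: valid
   ```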





[GitHub] [hudi] xushiyan commented on issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
xushiyan commented on issue #4623:
URL: https://github.com/apache/hudi/issues/4623#issuecomment-1015070882


   @logan-jun The Hudi table name is used to generate the Avro schema record name, and a hyphen is not a valid character in an Avro schema name. See https://avro.apache.org/docs/1.8.2/spec.html#names
   
   > The name portion of a fullname, record field names, and enum symbols must:
   > start with [A-Za-z_]
   > subsequently contain only [A-Za-z0-9_]
   
   Making the table name start with a letter and changing the hyphens to underscores should resolve the problem.
   





[GitHub] [hudi] logan-jun closed issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
logan-jun closed issue #4623:
URL: https://github.com/apache/hudi/issues/4623


   





[GitHub] [hudi] logan-jun commented on issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
logan-jun commented on issue #4623:
URL: https://github.com/apache/hudi/issues/4623#issuecomment-1015075724


   Oh, I see. Let me test it again.





[GitHub] [hudi] logan-jun commented on issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
logan-jun commented on issue #4623:
URL: https://github.com/apache/hudi/issues/4623#issuecomment-1015074559


   Hi. I've already tested changing the table name to '500mdeltest', which does not have any hyphens or other special characters, but it returns the same error: org.apache.avro.SchemaParseException: Illegal initial character: 500mdeltest_record
   
   It seems that Hudi is appending an underscore and 'record' to the table name.
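   
   For reference (not from the original thread), a hedged sketch of applying the advice above, i.e. using a table name that starts with a letter or underscore. The name 'deltest_500m' is made up, and `df_deletes` / `target_path` stand in for the variables from the job shown earlier in this thread.
   
   ```python
   # Hypothetical compliant table name: starts with a letter, underscores only,
   # so the derived Avro record name 'deltest_500m_record' is legal.
   table_name = 'deltest_500m'
   
   hudi_options = {
       'hoodie.table.name': table_name,
       'hoodie.datasource.write.table.name': table_name,
       'hoodie.datasource.write.recordkey.field': 'col_pk',
       'hoodie.datasource.write.partitionpath.field': 'col_dt',
       'hoodie.datasource.write.precombine.field': 'col_pk',
       'hoodie.datasource.write.operation': 'delete',
   }
   
   # df_deletes and target_path are the DataFrame of keys to delete and the
   # Hudi table path from the job shown earlier.
   df_deletes.write.format("hudi").options(**hudi_options).mode("append").save(target_path)
   ```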





[GitHub] [hudi] logan-jun commented on issue #4623: [SUPPORT] Delete is not working in python, parquet files

Posted by GitBox <gi...@apache.org>.
logan-jun commented on issue #4623:
URL: https://github.com/apache/hudi/issues/4623#issuecomment-1015080879


   It works. Thank you very much.

