Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/31 11:32:39 UTC

[GitHub] [spark] HeartSaVioR commented on issue #24738: [WIP][SPARK-23098][SQL] Migrate Kafka Batch source to v2.

URL: https://github.com/apache/spark/pull/24738#issuecomment-497677198
 
 
   Looks like the Table interface requires a **static schema** along with read and write, for both batch and streaming. That would hold true for sources which are table-like or have a simple schema, but it doesn't seem to hold true for the Kafka source, where the reader and writer have different schemas.
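   To illustrate the mismatch, here's a rough sketch of the field lists involved (field names/types written as plain Scala data for brevity; the actual sources define these as Spark `StructType`s, and the exact writer contract may differ by version):

   ```scala
   // Sketch: the Kafka reader exposes a fixed record schema, while the writer
   // only consumes a small projection of it, so a single Table schema can't
   // describe both. Names below mirror the Kafka source's columns.
   object KafkaSchemasSketch {
     // Columns produced by the Kafka batch/streaming reader
     val readSchema: Seq[(String, String)] = Seq(
       "key"           -> "binary",
       "value"         -> "binary",
       "topic"         -> "string",
       "partition"     -> "int",
       "offset"        -> "long",
       "timestamp"     -> "timestamp",
       "timestampType" -> "int"
     )

     // Columns the Kafka writer accepts (topic can also come from options;
     // key is optional -- simplified here)
     val writeSchema: Seq[(String, String)] = Seq(
       "key"   -> "binary",
       "value" -> "binary",
       "topic" -> "string"
     )

     def fieldNames(s: Seq[(String, String)]): Seq[String] = s.map(_._1)
   }
   ```

   So the write schema is a strict subset of the read schema, and neither one alone is the "schema of the table" in the sense the Table interface seems to assume.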
   
   One other thing I've observed is that, even though the Table interface requires a schema, Spark doesn't throw an error on a streaming write when the data doesn't match that schema (unlike a batch write). For a streaming query, the `schema` method doesn't appear to be referenced at all.
   
   @cloud-fan @rdblue Could you please review this specific case of the Kafka source? I'd like to know whether this is one of the missed spots or a known limitation.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org