Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/09 15:53:17 UTC

[GitHub] [spark] hehuiyuan edited a comment on issue #24270: [SPARK-27343][KAFKA][SS]Avoid hardcoded for spark-sql-kafka-0-10

URL: https://github.com/apache/spark/pull/24270#issuecomment-490957492
 
 
   > @gaborgsomogyi do you have any other comments besides the one above and the two at [#24270 (comment)](https://github.com/apache/spark/pull/24270#discussion_r274827760) ?
   > 
   > What's the action for the one above, prefix the keys with "spark.sql."? The Kafka configs start with "spark.kafka." _except_ for "spark.sql.kafkaConsumerCache.capacity" which sort of looks like an error. That config isn't documented anywhere, but this was noticed and brought up at https://issues.apache.org/jira/browse/SPARK-25466 and #22138 proposes to keep the spark.sql. prefix I think.
   > 
   > I don't know if we want to change it here, but seems like we want to move away from spark.sql. prefixes here if anything?
   
   > What's the action for the one above, prefix the keys with "spark.sql."? The Kafka configs start with "spark.kafka." _except_ for "spark.sql.kafkaConsumerCache.capacity" which sort of looks like an error.
   
   In the Spark Streaming (DStreams) Kafka integration, the parameter `spark.streaming.kafka.consumer.cache.maxCapacity` is used.
   I think the author may have wanted to distinguish the Structured Streaming config from that parameter.
   
   But the `spark.kafka.` prefix would be clearer.
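   
   For context, the config keys under discussion could be summarized in a `spark-defaults.conf` fragment like the sketch below. The first two keys exist today; the third is only an illustration of what a `spark.kafka.`-prefixed name might look like, not an actual merged config:
   
   ```properties
   # DStreams Kafka integration (spark-streaming-kafka-0-10): existing key
   spark.streaming.kafka.consumer.cache.maxCapacity   64
   
   # Structured Streaming Kafka source (spark-sql-kafka-0-10): current, undocumented key
   spark.sql.kafkaConsumerCache.capacity              64
   
   # Hypothetical key under the proposed spark.kafka. prefix (illustrative only)
   spark.kafka.consumer.cache.capacity                64
   ```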
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org