Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/11 21:02:20 UTC

[GitHub] srowen commented on a change in pull request #23742: [SPARK-26835][DOCS] Documentation of Spark SQL Generic Load/Save Functions options extended

srowen commented on a change in pull request #23742: [SPARK-26835][DOCS] Documentation of Spark SQL Generic Load/Save Functions options extended
URL: https://github.com/apache/spark/pull/23742#discussion_r255691164
 
 

 ##########
 File path: docs/sql-data-sources-load-save-functions.md
 ##########
 @@ -41,6 +41,11 @@ name (i.e., `org.apache.spark.sql.parquet`), but for built-in sources you can al
 names (`json`, `parquet`, `jdbc`, `orc`, `libsvm`, `csv`, `text`). DataFrames loaded from any data
 source type can be converted into other types using this syntax.
 
+The available extra options are documented in the API documentation. For basic formats
+(like `json`, `parquet`, `jdbc`, `orc`, `csv`, `text`) you want to check the API documentation of
+`org.apache.spark.sql.DataFrameReader` and `org.apache.spark.sql.DataFrameWriter`. For other
 
 Review comment:
   You can link to, for example, http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.DataFrameWriter and http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=dataframewriter#pyspark.sql.DataFrameWriter. I think the Scala docs are probably the most comprehensive, though, and they would be relevant to PySpark users too.
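
   For readers of this thread, here is a minimal Scala sketch of how the format-specific options discussed in the diff are passed through DataFrameReader and DataFrameWriter. The file paths, option values, and the local master setting are illustrative assumptions, not taken from the PR; the option names themselves are the ones documented in the DataFrameReader/DataFrameWriter API docs linked above.

       import org.apache.spark.sql.SparkSession

       object LoadSaveOptionsExample {
         def main(args: Array[String]): Unit = {
           // Assumption: local mode, so the example runs self-contained.
           val spark = SparkSession.builder()
             .appName("LoadSaveOptionsExample")
             .master("local[*]")
             .getOrCreate()

           // Reading: format-specific options go through DataFrameReader.option(...)
           val df = spark.read
             .format("csv")
             .option("header", "true")       // CSV option, see DataFrameReader docs
             .option("inferSchema", "true")
             .load("examples/src/main/resources/people.csv")  // hypothetical path

           // Writing: format-specific options go through DataFrameWriter.option(...)
           df.write
             .format("parquet")
             .option("compression", "snappy")  // Parquet option, see DataFrameWriter docs
             .save("output/people_parquet")    // hypothetical path

           spark.stop()
         }
       }

   The same pattern applies to the other built-in short names (`json`, `jdbc`, `orc`, `libsvm`, `text`); only the set of recognized options changes per format.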

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org