Posted to issues@spark.apache.org by "Christopher Hoshino-Fish (Jira)" <ji...@apache.org> on 2019/09/05 21:28:00 UTC

[jira] [Commented] (SPARK-28977) JDBC Dataframe Reader Doc Doesn't Match JDBC Data Source Page

    [ https://issues.apache.org/jira/browse/SPARK-28977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16923741#comment-16923741 ] 

Christopher Hoshino-Fish commented on SPARK-28977:
--------------------------------------------------

thanks Sean!

> JDBC Dataframe Reader Doc Doesn't Match JDBC Data Source Page
> -------------------------------------------------------------
>
>                 Key: SPARK-28977
>                 URL: https://issues.apache.org/jira/browse/SPARK-28977
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.3
>            Reporter: Christopher Hoshino-Fish
>            Assignee: Sean Owen
>            Priority: Minor
>             Fix For: 2.4.5, 3.0.0
>
>
> [https://spark.apache.org/docs/2.4.3/sql-data-sources-jdbc.html]
> Specifically in the partitionColumn section, this page says:
> "{{partitionColumn}} must be a numeric, date, or timestamp column from the table in question."
>  
> But then in this doc: [https://spark.apache.org/docs/2.4.3/api/scala/index.html#org.apache.spark.sql.DataFrameReader]
> in def jdbc(url: String, table: String, columnName: String, lowerBound: Long, upperBound: Long, numPartitions: Int, connectionProperties: Properties): DataFrame
> we have:
> columnName
> the name of a column of integral type that will be used for partitioning.
>  
> This wording appears to go back at least as far as 1.6.3, but I'm not sure when it was last accurate.
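
For context, here is a minimal sketch of the two reader forms the quoted docs describe: the jdbc(...) overload with Long bounds, which only makes sense for a numeric partition column, and the option-based form, which also accepts date or timestamp columns. The URL, table, column names, and credentials below are hypothetical, and a running SparkSession plus a reachable JDBC endpoint are assumed.

```scala
import java.util.Properties
import org.apache.spark.sql.SparkSession

object JdbcPartitionedRead {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("jdbc-partition-demo").getOrCreate()

    val props = new Properties()
    props.setProperty("user", "demo")     // hypothetical credentials
    props.setProperty("password", "demo")

    // The DataFrameReader.jdbc overload quoted above: because lowerBound and
    // upperBound are Longs, columnName must be a numeric column here.
    val byId = spark.read.jdbc(
      url = "jdbc:postgresql://localhost/demo", // hypothetical URL
      table = "events",                         // hypothetical table
      columnName = "id",                        // numeric partition column
      lowerBound = 0L,
      upperBound = 1000000L,
      numPartitions = 8,
      connectionProperties = props)

    // The option-based form described on the JDBC data source page, where
    // partitionColumn may also be a date or timestamp column and the bounds
    // are passed as strings.
    val byTs = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://localhost/demo")
      .option("dbtable", "events")
      .option("partitionColumn", "event_time")  // timestamp partition column
      .option("lowerBound", "2019-01-01 00:00:00")
      .option("upperBound", "2019-12-31 23:59:59")
      .option("numPartitions", "8")
      .load()

    spark.stop()
  }
}
```

The sketch illustrates why the two passages disagree: the Long-bounded overload is inherently numeric-only, while the options form is the one that gained date/timestamp support.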



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org