Posted to issues@spark.apache.org by "huanghuai (JIRA)" <ji...@apache.org> on 2018/10/15 10:32:00 UTC

[jira] [Updated] (SPARK-25723) spark sql External DataSource question

     [ https://issues.apache.org/jira/browse/SPARK-25723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

huanghuai updated SPARK-25723:
------------------------------
    Description: 
trait PrunedFilteredScan {
  def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row]
}
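
For illustration, here is a minimal sketch of a relation implementing this trait; the class name and in-memory data are hypothetical, and the point is that buildScan should resolve columns by name rather than rely on the order of requiredColumns:

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, Filter, PrunedFilteredScan}
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

// Illustrative relation: it copes with requiredColumns arriving in any
// order by mapping each requested name back to its schema position.
class ExampleRelation(override val sqlContext: SQLContext)
    extends BaseRelation with PrunedFilteredScan {

  override def schema: StructType = StructType(Seq(
    StructField("id", LongType),
    StructField("name", StringType)))

  override def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row] = {
    // Do not assume requiredColumns follows the schema order; resolve by name.
    val indices = requiredColumns.map(schema.fieldIndex)
    allRows().map(row => Row.fromSeq(indices.map(row.get)))
  }

  // Hypothetical in-memory source standing in for a real backend.
  private def allRows(): RDD[Row] =
    sqlContext.sparkContext.parallelize(Seq(Row(1L, "a"), Row(2L, "b")))
}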

 

If I implement this trait, I find the requiredColumns param is different every time. Why is the order different?

You can use spark.read.jdbc to connect to your local MySQL DB and set a breakpoint at

org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation#buildScan (JDBCRelation.scala:130)

to inspect this param.
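
A minimal reproduction sketch, assuming a local MySQL instance; the JDBC URL, table name, and credentials below are placeholders:

import java.util.Properties
import org.apache.spark.sql.SparkSession

object RequiredColumnsRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("repro").getOrCreate()

    val props = new Properties()
    props.setProperty("user", "root")       // placeholder credentials
    props.setProperty("password", "secret")

    // spark.read.jdbc goes through JDBCRelation#buildScan; a breakpoint
    // there shows the requiredColumns argument varying across runs.
    val df = spark.read.jdbc("jdbc:mysql://localhost:3306/test", "my_table", props)
    df.select("name", "id").where("id > 0").show()

    spark.stop()
  }
}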

The attachment is my screenshot.
        Summary: spark sql External DataSource question  (was: spark sql External DataSource)

> spark sql External DataSource question
> --------------------------------------
>
>                 Key: SPARK-25723
>                 URL: https://issues.apache.org/jira/browse/SPARK-25723
>             Project: Spark
>          Issue Type: Question
>          Components: SQL
>    Affects Versions: 2.3.2
>         Environment: local mode
>            Reporter: huanghuai
>            Priority: Major
>         Attachments: QQ图片20181015182502.jpg
>
>
> trait PrunedFilteredScan {
>   def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row]
> }
>  
> If I implement this trait, I find the requiredColumns param is different every time. Why is the order different?
> You can use spark.read.jdbc to connect to your local MySQL DB and set a breakpoint at
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation#buildScan (JDBCRelation.scala:130)
> to inspect this param.
> The attachment is my screenshot.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org