Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/28 07:08:53 UTC

[GitHub] [spark] AngersZhuuuu commented on pull request #35662: [SPARK-38333][SQL] [3.1]DPP cause DataSourceScanExec java.lang.NullPointer…

AngersZhuuuu commented on pull request #35662:
URL: https://github.com/apache/spark/pull/35662#issuecomment-1053952820


   > > On the other hand, if we backport [SPARK-35798](https://issues.apache.org/jira/browse/SPARK-35798) to branch-3.1, can this issue be solved?
   > 
   > After backporting [SPARK-35798](https://issues.apache.org/jira/browse/SPARK-35798), a new NullPointerException is thrown:
   > 
   > ```
   > case class FileSourceScanExec(
   >     @transient relation: HadoopFsRelation,
   >     output: Seq[Attribute],
   >     requiredSchema: StructType,
   >     partitionFilters: Seq[Expression],
   >     optionalBucketSet: Option[BitSet],
   >     optionalNumCoalescedBuckets: Option[Int],
   >     dataFilters: Seq[Expression],
   >     tableIdentifier: Option[TableIdentifier],
   >     disableBucketedScan: Boolean = false)
   >   extends DataSourceScanExec {
   > 
   >   // Note that some vals referring the file-based relation are lazy intentionally
   >   // so that this plan can be canonicalized on executor side too. See SPARK-23731.
   >   override lazy val supportsColumnar: Boolean = {
   >     relation.fileFormat.supportBatch(relation.sparkSession, schema)
   >   }
   > ```
   > 
   > because `relation` is null (it is `@transient`, so it is dropped during serialization and is not available on the executor side)
   
   I think we need to find which PR fixes this issue correctly; then backporting it to Spark 3.1 would be the best way. Or is the code path not the same between 3.1 and master?
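
   The failure mode quoted above can be reproduced outside Spark with plain
   Java serialization. Below is a minimal sketch; `Relation`, `Plan`,
   `roundTrip`, and `TransientNpeDemo` are hypothetical names, not Spark
   classes. It only assumes what the quoted code already shows: a
   `@transient` constructor field is dropped during serialization, so a
   lazy val that dereferences it throws a NullPointerException the first
   time it is forced on the deserialized (executor-side) copy.

   ```scala
   import java.io._

   // Hypothetical stand-ins for HadoopFsRelation / FileSourceScanExec,
   // used only to illustrate the mechanism.
   case class Relation(format: String)

   case class Plan(@transient relation: Relation, name: String) {
     // Mirrors supportsColumnar: lazy, and it dereferences the @transient
     // field, so it fails when first evaluated after deserialization.
     lazy val supportsColumnar: Boolean = relation.format == "parquet"
   }

   object TransientNpeDemo {
     // Java-serialization round trip, standing in for Spark shipping a
     // plan from the driver to an executor.
     def roundTrip[T](obj: T): T = {
       val bos = new ByteArrayOutputStream()
       val oos = new ObjectOutputStream(bos)
       oos.writeObject(obj)
       oos.close()
       val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
       ois.readObject().asInstanceOf[T]
     }

     def main(args: Array[String]): Unit = {
       val driverSide = Plan(Relation("parquet"), "scan")
       // Do not force supportsColumnar here: once a lazy val is
       // initialized, its cached value is serialized with the object and
       // the NPE would no longer reproduce.
       val executorSide = roundTrip(driverSide)
       println(executorSide.relation)          // null: the field was @transient
       println(executorSide.supportsColumnar)  // throws NullPointerException
     }
   }
   ```

   This is also why the comment in the quoted code says those vals are lazy
   (SPARK-23731): laziness keeps canonicalization from touching `relation`
   on executors, but it does not help once something does force the val on
   the executor side.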


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


