Posted to issues@spark.apache.org by "Ahmed Metwally (Jira)" <ji...@apache.org> on 2020/09/29 17:15:00 UTC

[jira] [Created] (SPARK-33028) Fix ScalaReflection issue for Case Classes with complex type aliased data

Ahmed Metwally created SPARK-33028:
--------------------------------------

             Summary: Fix ScalaReflection issue for Case Classes with complex type aliased data
                 Key: SPARK-33028
                 URL: https://issues.apache.org/jira/browse/SPARK-33028
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.1
            Reporter: Ahmed Metwally


This bug was discovered by Ahmed Metwally and fixed by Gaurav Ahuja while both were working at Uber Inc.

Below is the context from Gaurav Ahuja.

Context:
 * Creating a DataFrame from an RDD[T] fails with scala.MatchError, where T is a class with a complex type-aliased param, e.g. type TypeAliasedArray = Array[_]. (A minimal repro sketch follows this list.)
 * After debugging, I found that the code path fails to create a serializer for class T.
 ** To create the serializer, Spark first finds all the params for this class, then uses reflection to find the DataType for each param.
 ** The private function dataTypeFor is called for each parameter's fieldType.
 ** dataTypeFor throws an error if the fieldType is a type-aliased array.
 ** This is because it fails to find the element type for this type-aliased array.
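
For illustration, a minimal repro sketch of the failure. The alias, case class, and app name here are hypothetical, and a concrete element type is used in place of Array[_]:

    import org.apache.spark.sql.SparkSession

    object TypeAliasRepro {
      // Hypothetical alias and case class, standing in for the real types.
      type TypeAliasedArray = Array[Int]
      case class Record(values: TypeAliasedArray)

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("SPARK-33028 repro")
          .getOrCreate()
        import spark.implicits._

        val rdd = spark.sparkContext.parallelize(Seq(Record(Array(1, 2, 3))))
        // On affected versions, deriving the serializer for Record throws
        // scala.MatchError because dataTypeFor cannot destructure the alias.
        val df = rdd.toDF()
        df.show()

        spark.stop()
      }
    }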

Fix:
 * Dealias the type before extracting the element type of the type-aliased array. (A small reflection sketch follows.)
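
To see why dealiasing helps, here is a small standalone sketch using Scala runtime reflection (the alias is hypothetical); it mirrors the pattern match used in ScalaReflection:

    import scala.reflect.runtime.universe._

    object DealiasDemo {
      type TypeAliasedArray = Array[Int] // hypothetical alias

      def main(args: Array[String]): Unit = {
        val tpe = typeOf[TypeAliasedArray]
        // The alias TypeRef carries no type arguments, so matching it against
        //   TypeRef(_, _, Seq(elementType))
        // throws scala.MatchError. Dealiasing first yields Array[Int], whose
        // single type argument is the element type.
        val TypeRef(_, _, Seq(elementType)) = tpe.dealias
        println(s"alias: $tpe, dealiased: ${tpe.dealias}, element: $elementType")
      }
    }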


Below is the fix from Gaurav Ahuja, in org/apache/spark/sql/catalyst/ScalaReflection.scala. Only one line needs to be changed, line 104:

`val TypeRef(_, _, Seq(elementType)) = tpe`

should be changed to

`val TypeRef(_, _, Seq(elementType)) = tpe.dealias`
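
Note that dealias returns a type unchanged when it is not an alias, so the change should be a no-op for the existing non-aliased code paths.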

 


