Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/05/03 05:15:57 UTC
[GitHub] [spark] sunchao commented on pull request #36427: [SPARK-39086][SQL] Support UDT in Spark Parquet vectorized reader
sunchao commented on PR #36427:
URL: https://github.com/apache/spark/pull/36427#issuecomment-1115746980
Looks like `CodeGenerator.getValueFromVector` and `CodeGenerator.getValue` need to be updated: previously a `ColumnVector` could not hold a UDT type, but now it can. If the input `dataType` is a UDT:
```scala
def getValueFromVector(vector: String, dataType: DataType, rowId: String): String = {
  if (dataType.isInstanceOf[StructType]) {
    // `ColumnVector.getStruct` is different from `InternalRow.getStruct`: it only takes an
    // `ordinal` parameter.
    s"$vector.getStruct($rowId)"
  } else {
    getValue(vector, dataType, rowId)
  }
}
```
this will call `getValue` instead, and
```scala
def getValue(input: String, dataType: DataType, ordinal: String): String = {
  val jt = javaType(dataType)
  dataType match {
    case _ if isPrimitiveType(jt) => s"$input.get${primitiveTypeName(jt)}($ordinal)"
    case t: DecimalType => s"$input.getDecimal($ordinal, ${t.precision}, ${t.scale})"
    case StringType => s"$input.getUTF8String($ordinal)"
    case BinaryType => s"$input.getBinary($ordinal)"
    case CalendarIntervalType => s"$input.getInterval($ordinal)"
    case t: StructType => s"$input.getStruct($ordinal, ${t.size})"
    case _: ArrayType => s"$input.getArray($ordinal)"
    case _: MapType => s"$input.getMap($ordinal)"
    case NullType => "null"
    case udt: UserDefinedType[_] => getValue(input, udt.sqlType, ordinal)
    case _ => s"($jt)$input.get($ordinal, null)"
  }
}
```
and `getValue` will recursively call itself on `udt.sqlType`, which is a struct type, so it emits `$input.getStruct($ordinal, ${t.size})`, which fails with the above error because `ColumnVector.getStruct` only accepts the single `ordinal` argument.
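One way to avoid the bad recursion (just a sketch, not the actual patch in this PR) would be to unwrap a UDT to its `sqlType` inside `getValueFromVector` before the `StructType` check, so a struct-backed UDT takes the `ColumnVector.getStruct` path instead of falling through to `getValue`:

```scala
// Sketch only; assumes the vector physically stores the UDT's underlying `sqlType`.
def getValueFromVector(vector: String, dataType: DataType, rowId: String): String = {
  // Unwrap a UDT to its underlying SQL type first, so a struct-backed UDT
  // hits the `ColumnVector.getStruct(rowId)` branch below rather than
  // recursing through `getValue`, which emits the two-argument
  // `InternalRow.getStruct(ordinal, size)` call.
  val physicalType = dataType match {
    case udt: UserDefinedType[_] => udt.sqlType
    case other => other
  }
  if (physicalType.isInstanceOf[StructType]) {
    s"$vector.getStruct($rowId)"
  } else {
    getValue(vector, physicalType, rowId)
  }
}
```

With this, a UDT whose `sqlType` is a struct generates `vector.getStruct(rowId)`, while non-struct UDTs still go through `getValue`, whose existing `UserDefinedType` case already unwraps to a leaf accessor.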
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org