Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/08/13 17:10:00 UTC
[jira] [Resolved] (SPARK-36503) Add RowToColumnConverter for BinaryType
[ https://issues.apache.org/jira/browse/SPARK-36503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-36503.
-----------------------------------
Fix Version/s: 3.3.0
Resolution: Fixed
Issue resolved by pull request 33733
[https://github.com/apache/spark/pull/33733]
> Add RowToColumnConverter for BinaryType
> ---------------------------------------
>
> Key: SPARK-36503
> URL: https://issues.apache.org/jira/browse/SPARK-36503
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Huaxin Gao
> Assignee: Huaxin Gao
> Priority: Minor
> Fix For: 3.3.0
>
>
> Currently, we have a RowToColumnConverter for every data type except BinaryType:
> {code:scala}
> private def getConverterForType(dataType: DataType, nullable: Boolean): TypeConverter = {
>   val core = dataType match {
>     case BooleanType => BooleanConverter
>     case ByteType => ByteConverter
>     case ShortType => ShortConverter
>     case IntegerType | DateType => IntConverter
>     case FloatType => FloatConverter
>     case LongType | TimestampType => LongConverter
>     case DoubleType => DoubleConverter
>     case StringType => StringConverter
>     case CalendarIntervalType => CalendarConverter
>     case at: ArrayType => ArrayConverter(getConverterForType(at.elementType, at.containsNull))
>     case st: StructType => new StructConverter(st.fields.map(
>       (f) => getConverterForType(f.dataType, f.nullable)))
>     case dt: DecimalType => new DecimalConverter(dt)
>     case mt: MapType => MapConverter(getConverterForType(mt.keyType, nullable = false),
>       getConverterForType(mt.valueType, mt.valueContainsNull))
>     case unknown => throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
>   }
>   // ...
> }
> {code}
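
For context, the fix would add a BinaryType arm to the match above, backed by a converter object in the style of the existing StringConverter. The sketch below is illustrative, not a quote of the merged patch from pull request 33733; the BinaryConverter name and the SpecializedGetters/WritableColumnVector signatures are assumed to mirror the surrounding converter objects in Spark's Columnar.scala:

{code:scala}
// Hypothetical sketch: a BinaryType row-to-column converter, modeled on the
// existing converter objects. Treat names and signatures as assumptions.
private object BinaryConverter extends TypeConverter {
  override def append(row: SpecializedGetters, column: Int,
      cv: WritableColumnVector): Unit = {
    // BinaryType values surface as raw byte arrays on the row side...
    val bytes = row.getBinary(column)
    // ...and are appended wholesale into the byte-backed column vector.
    cv.appendByteArray(bytes, 0, bytes.length)
  }
}

// The corresponding arm in getConverterForType would then be:
//   case BinaryType => BinaryConverter
{code}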
--
This message was sent by Atlassian Jira
(v8.3.4#803005)