Posted to issues@spark.apache.org by "PengLei (Jira)" <ji...@apache.org> on 2021/10/29 10:01:00 UTC

[jira] [Created] (SPARK-37161) RowToColumnConverter support AnsiIntervalType

PengLei created SPARK-37161:
-------------------------------

             Summary: RowToColumnConverter support AnsiIntervalType
                 Key: SPARK-37161
                 URL: https://issues.apache.org/jira/browse/SPARK-37161
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.3.0
            Reporter: PengLei


Currently, we have a RowToColumnConverter for every data type except the ANSI interval types (YearMonthIntervalType and DayTimeIntervalType):
{code:scala}
// RowToColumnConverter.getConverterForType in
// org.apache.spark.sql.execution.Columnar: no case arm handles
// YearMonthIntervalType or DayTimeIntervalType.
val core = dataType match {
  case BinaryType => BinaryConverter
  case BooleanType => BooleanConverter
  case ByteType => ByteConverter
  case ShortType => ShortConverter
  case IntegerType | DateType => IntConverter
  case FloatType => FloatConverter
  case LongType | TimestampType => LongConverter
  case DoubleType => DoubleConverter
  case StringType => StringConverter
  case CalendarIntervalType => CalendarConverter
  case at: ArrayType => ArrayConverter(getConverterForType(at.elementType, at.containsNull))
  case st: StructType => new StructConverter(st.fields.map(
    (f) => getConverterForType(f.dataType, f.nullable)))
  case dt: DecimalType => new DecimalConverter(dt)
  case mt: MapType => MapConverter(getConverterForType(mt.keyType, nullable = false),
    getConverterForType(mt.valueType, mt.valueContainsNull))
  case unknown => throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
}

if (nullable) {
  dataType match {
    case CalendarIntervalType => new StructNullableTypeConverter(core)
    case st: StructType => new StructNullableTypeConverter(core)
    case _ => new BasicNullableTypeConverter(core)
  }
} else {
  core
}

{code}
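One possible approach (a sketch, not a final patch): a year-month interval is stored physically as an int (the number of months) and a day-time interval as a long (the number of microseconds), so both could reuse the existing primitive converters by extending the corresponding case arms:
{code:scala}
// Sketch of a possible fix: route the ANSI interval types through the
// primitive converters that already match their physical storage.
case IntegerType | DateType | _: YearMonthIntervalType => IntConverter
case LongType | TimestampType | _: DayTimeIntervalType => LongConverter
{code}
The nullable path should need no change: ANSI intervals would fall into the existing BasicNullableTypeConverter case, like the other primitive-backed types.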