Posted to issues@spark.apache.org by "Yuming Wang (JIRA)" <ji...@apache.org> on 2018/07/02 15:47:00 UTC
[jira] [Updated] (SPARK-24718) Timestamp support pushdown to parquet data source
[ https://issues.apache.org/jira/browse/SPARK-24718?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-24718:
--------------------------------
Description:
Something like this:
{code:java}
// INT96 is deprecated and doesn't support pushdown, see: PARQUET-323
case ParquetSchemaType(TIMESTAMP_MICROS, INT64, _) if pushDownTimestamp =>
  (n: String, v: Any) => FilterApi.eq(
    longColumn(n),
    Option(v).map(t => (t.asInstanceOf[java.sql.Timestamp].getTime * 1000)
      .asInstanceOf[java.lang.Long]).orNull)
case ParquetSchemaType(TIMESTAMP_MILLIS, INT64, _) if pushDownTimestamp =>
  (n: String, v: Any) => FilterApi.eq(
    longColumn(n),
    Option(v).map(_.asInstanceOf[java.sql.Timestamp].getTime
      .asInstanceOf[java.lang.Long]).orNull)
{code}
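The two cases above differ only in the unit of the pushed-down Long: a Parquet TIMESTAMP_MILLIS column stores epoch milliseconds (exactly what java.sql.Timestamp.getTime returns), while TIMESTAMP_MICROS stores epoch microseconds, hence the * 1000. A minimal standalone sketch of that conversion (helper names timestampToMillis / timestampToMicros are illustrative, not from the patch):

```scala
import java.sql.Timestamp

object TimestampConversionSketch {
  // TIMESTAMP_MILLIS: getTime already yields epoch milliseconds.
  def timestampToMillis(t: Timestamp): java.lang.Long = t.getTime

  // TIMESTAMP_MICROS: scale milliseconds up to microseconds.
  // Note: this drops any sub-millisecond precision held in t.getNanos.
  def timestampToMicros(t: Timestamp): java.lang.Long = t.getTime * 1000L

  def main(args: Array[String]): Unit = {
    val t = Timestamp.valueOf("2018-07-02 00:00:00")
    // By construction the micros value is exactly 1000x the millis value.
    assert(timestampToMicros(t) == timestampToMillis(t) * 1000L)
  }
}
```

Because getTime * 1000 truncates to millisecond precision, sub-millisecond components of the Timestamp (exposed via getNanos) would not survive this conversion; that is a limitation of this sketch, not necessarily of the final implementation.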
> Timestamp support pushdown to parquet data source
> -------------------------------------------------
>
> Key: SPARK-24718
> URL: https://issues.apache.org/jira/browse/SPARK-24718
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Yuming Wang
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org