Posted to reviews@spark.apache.org by cloud-fan <gi...@git.apache.org> on 2016/07/25 08:52:39 UTC

[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

GitHub user cloud-fan opened a pull request:

    https://github.com/apache/spark/pull/14344

    [SPARK-16706][SQL] support java map in encoder

    ## What changes were proposed in this pull request?
    
    Finishes the TODO: adds a new expression, `ExternalMapToCatalyst`, that iterates the external map directly and converts its keys and values into the Catalyst map format.
    
    ## How was this patch tested?
    
    Added a new test in `JavaDatasetSuite`.
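
    A minimal, hypothetical usage sketch (not the actual test) of what this change enables: a Java bean with a `java.util.Map` field encoded via `Encoders.bean`. The class, field names, and session setup below are made up for illustration.

        import java.util.Arrays;
        import java.util.HashMap;
        import java.util.Map;
        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Encoders;
        import org.apache.spark.sql.SparkSession;

        public class MapEncoderSketch {
          // Illustrative bean; name and field are invented for this example.
          public static class MapBean {
            private Map<Integer, String> attrs = new HashMap<>();
            public Map<Integer, String> getAttrs() { return attrs; }
            public void setAttrs(Map<Integer, String> attrs) { this.attrs = attrs; }
          }

          public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                .master("local").appName("map-encoder-sketch").getOrCreate();

            MapBean bean = new MapBean();
            bean.getAttrs().put(1, "a");
            bean.getAttrs().put(2, "b");

            // With ExternalMapToCatalyst, the java.util.Map field is converted into a
            // Catalyst MapType column, which the bean encoder previously did not support.
            Dataset<MapBean> ds = spark.createDataset(
                Arrays.asList(bean), Encoders.bean(MapBean.class));
            ds.printSchema();
            ds.show();
            spark.stop();
          }
        }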


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/cloud-fan/spark java-map

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14344.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #14344
    
----
commit f23549314cd2558fda0f859418ef36e11d9fe9f9
Author: Wenchen Fan <we...@databricks.com>
Date:   2016-07-25T08:50:08Z

    support java map in encoder

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    Merged build finished. Test PASSed.


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62816/
    Test PASSed.


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72045939
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -501,6 +501,143 @@ case class MapObjects private(
       }
     }
     
    +object ExternalMapToCatalyst {
    +  private val curId = new java.util.concurrent.atomic.AtomicInteger()
    +
    +  def apply(
    +      inputMap: Expression,
    +      keyType: DataType,
    +      keyConverter: Expression => Expression,
    +      valueType: DataType,
    +      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
    +    val id = curId.getAndIncrement()
    +    val keyName = "ExternalMapToCatalyst_key" + id
    +    val valueName = "ExternalMapToCatalyst_value" + id
    +    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
    +
    +    ExternalMapToCatalyst(
    +      keyName,
    +      keyType,
    +      keyConverter(LambdaVariable(keyName, "false", keyType)),
    +      valueName,
    +      valueIsNull,
    +      valueType,
    +      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
    +      inputMap
    +    )
    +  }
    +}
    +
    +case class ExternalMapToCatalyst private(
    +    key: String,
    +    keyType: DataType,
    +    keyConverter: Expression,
    +    value: String,
    +    valueIsNull: String,
    +    valueType: DataType,
    +    valueConverter: Expression,
    +    child: Expression)
    +  extends UnaryExpression with NonSQLExpression {
    +
    +  override def foldable: Boolean = false
    +
    +  override def dataType: MapType = MapType(keyConverter.dataType, valueConverter.dataType)
    +
    +  override def eval(input: InternalRow): Any =
    +    throw new UnsupportedOperationException("Only code-generated evaluation is supported")
    +
    +  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
    +    val inputMap = child.genCode(ctx)
    +    val genKeyConverter = keyConverter.genCode(ctx)
    +    val genValueConverter = valueConverter.genCode(ctx)
    +    val length = ctx.freshName("length")
    +    val index = ctx.freshName("index")
    +    val convertedKeys = ctx.freshName("convertedKeys")
    +    val convertedValues = ctx.freshName("convertedValues")
    +    val entry = ctx.freshName("entry")
    +    val entries = ctx.freshName("entries")
    +
    +    val (defineEntries, defineKeyValue) = child.dataType match {
    +      case ObjectType(cls) if classOf[java.util.Map[_, _]].isAssignableFrom(cls) =>
    +        val javaIteratorCls = classOf[java.util.Iterator[_]].getName
    +        val javaMapEntryCls = classOf[java.util.Map.Entry[_, _]].getName
    +
    +        val defineEntries =
    +          s"final $javaIteratorCls $entries = ${inputMap.value}.entrySet().iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $javaMapEntryCls $entry = ($javaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry.getKey();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry.getValue();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +
    +      case ObjectType(cls) if classOf[scala.collection.Map[_, _]].isAssignableFrom(cls) =>
    +        val scalaIteratorCls = classOf[Iterator[_]].getName
    +        val scalaMapEntryCls = classOf[Tuple2[_, _]].getName
    +
    +        val defineEntries = s"final $scalaIteratorCls $entries = ${inputMap.value}.iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $scalaMapEntryCls $entry = ($scalaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry._1();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry._2();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +    }
    +
    +    val valueNullCheck = if (ctx.isPrimitiveType(valueType)) {
    +      s"boolean $valueIsNull = false;"
    +    } else {
    +      s"boolean $valueIsNull = $value == null;"
    +    }
    +
    +    val arrayCls = classOf[GenericArrayData].getName
    +    val mapCls = classOf[ArrayBasedMapData].getName
    +    val convertedKeyType = ctx.boxedType(keyConverter.dataType)
    +    val convertedValueType = ctx.boxedType(valueConverter.dataType)
    +    val code =
    +      s"""
    +        ${inputMap.code}
    +        ${ctx.javaType(dataType)} ${ev.value} = ${ctx.defaultValue(dataType)};
    +        if (!${inputMap.isNull}) {
    +          final int $length = ${inputMap.value}.size();
    +          final Object[] $convertedKeys = new Object[$length];
    +          final Object[] $convertedValues = new Object[$length];
    +          int $index = 0;
    +          $defineEntries
    +          while($entries.hasNext()) {
    +            $defineKeyValue
    +            $valueNullCheck
    +
    +            ${genKeyConverter.code}
    +            if (${genKeyConverter.isNull}) {
    +              throw new RuntimeException("Cannot use null as map key!");
    +            } else {
    +              $convertedKeys[$index] = ($convertedKeyType) ${genKeyConverter.value};
    --- End diff --
    
    Hm, I'd expect auto-boxing to happen here, but I could be wrong...


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    **[Test build #62864 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62864/consoleFull)** for PR 14344 at commit [`b4dad74`](https://github.com/apache/spark/commit/b4dad7433266caba0b7e9d6b0a88c5a9c3e3afa7).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72035534
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -501,6 +501,143 @@ case class MapObjects private(
       }
     }
     
    +object ExternalMapToCatalyst {
    +  private val curId = new java.util.concurrent.atomic.AtomicInteger()
    +
    +  def apply(
    +      inputMap: Expression,
    +      keyType: DataType,
    +      keyConverter: Expression => Expression,
    +      valueType: DataType,
    +      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
    +    val id = curId.getAndIncrement()
    +    val keyName = "ExternalMapToCatalyst_key" + id
    +    val valueName = "ExternalMapToCatalyst_value" + id
    +    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
    +
    +    ExternalMapToCatalyst(
    +      keyName,
    +      keyType,
    +      keyConverter(LambdaVariable(keyName, "false", keyType)),
    +      valueName,
    +      valueIsNull,
    +      valueType,
    +      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
    +      inputMap
    +    )
    +  }
    +}
    +
    +case class ExternalMapToCatalyst private(
    +    key: String,
    +    keyType: DataType,
    +    keyConverter: Expression,
    +    value: String,
    +    valueIsNull: String,
    +    valueType: DataType,
    +    valueConverter: Expression,
    +    child: Expression)
    +  extends UnaryExpression with NonSQLExpression {
    +
    +  override def foldable: Boolean = false
    +
    +  override def dataType: MapType = MapType(keyConverter.dataType, valueConverter.dataType)
    +
    +  override def eval(input: InternalRow): Any =
    +    throw new UnsupportedOperationException("Only code-generated evaluation is supported")
    +
    +  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
    +    val inputMap = child.genCode(ctx)
    +    val genKeyConverter = keyConverter.genCode(ctx)
    +    val genValueConverter = valueConverter.genCode(ctx)
    +    val length = ctx.freshName("length")
    +    val index = ctx.freshName("index")
    +    val convertedKeys = ctx.freshName("convertedKeys")
    +    val convertedValues = ctx.freshName("convertedValues")
    +    val entry = ctx.freshName("entry")
    +    val entries = ctx.freshName("entries")
    +
    +    val (defineEntries, defineKeyValue) = child.dataType match {
    +      case ObjectType(cls) if classOf[java.util.Map[_, _]].isAssignableFrom(cls) =>
    +        val javaIteratorCls = classOf[java.util.Iterator[_]].getName
    +        val javaMapEntryCls = classOf[java.util.Map.Entry[_, _]].getName
    +
    +        val defineEntries =
    +          s"final $javaIteratorCls $entries = ${inputMap.value}.entrySet().iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $javaMapEntryCls $entry = ($javaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry.getKey();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry.getValue();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +
    +      case ObjectType(cls) if classOf[scala.collection.Map[_, _]].isAssignableFrom(cls) =>
    +        val scalaIteratorCls = classOf[Iterator[_]].getName
    +        val scalaMapEntryCls = classOf[Tuple2[_, _]].getName
    +
    +        val defineEntries = s"final $scalaIteratorCls $entries = ${inputMap.value}.iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $scalaMapEntryCls $entry = ($scalaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry._1();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry._2();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +    }
    +
    +    val valueNullCheck = if (ctx.isPrimitiveType(valueType)) {
    +      s"boolean $valueIsNull = false;"
    +    } else {
    +      s"boolean $valueIsNull = $value == null;"
    +    }
    +
    +    val arrayCls = classOf[GenericArrayData].getName
    +    val mapCls = classOf[ArrayBasedMapData].getName
    +    val convertedKeyType = ctx.boxedType(keyConverter.dataType)
    +    val convertedValueType = ctx.boxedType(valueConverter.dataType)
    +    val code =
    +      s"""
    +        ${inputMap.code}
    +        ${ctx.javaType(dataType)} ${ev.value} = ${ctx.defaultValue(dataType)};
    +        if (!${inputMap.isNull}) {
    +          final int $length = ${inputMap.value}.size();
    +          final Object[] $convertedKeys = new Object[$length];
    +          final Object[] $convertedValues = new Object[$length];
    +          int $index = 0;
    +          $defineEntries
    +          while($entries.hasNext()) {
    +            $defineKeyValue
    +            $valueNullCheck
    +
    +            ${genKeyConverter.code}
    +            if (${genKeyConverter.isNull}) {
    +              throw new RuntimeException("Cannot use null as map key!");
    +            } else {
    +              $convertedKeys[$index] = ($convertedKeyType) ${genKeyConverter.value};
    +            }
    +
    +            ${genValueConverter.code}
    +            if (${genValueConverter.isNull}) {
    +              $convertedValues[$index] = null;
    +            } else {
    +              $convertedValues[$index] = ($convertedValueType) ${genValueConverter.value};
    --- End diff --
    
    Same as above.


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    Merged build finished. Test PASSed.


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    **[Test build #62864 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62864/consoleFull)** for PR 14344 at commit [`b4dad74`](https://github.com/apache/spark/commit/b4dad7433266caba0b7e9d6b0a88c5a9c3e3afa7).


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72035518
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -501,6 +501,143 @@ case class MapObjects private(
       }
     }
     
    +object ExternalMapToCatalyst {
    +  private val curId = new java.util.concurrent.atomic.AtomicInteger()
    +
    +  def apply(
    +      inputMap: Expression,
    +      keyType: DataType,
    +      keyConverter: Expression => Expression,
    +      valueType: DataType,
    +      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
    +    val id = curId.getAndIncrement()
    +    val keyName = "ExternalMapToCatalyst_key" + id
    +    val valueName = "ExternalMapToCatalyst_value" + id
    +    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
    +
    +    ExternalMapToCatalyst(
    +      keyName,
    +      keyType,
    +      keyConverter(LambdaVariable(keyName, "false", keyType)),
    +      valueName,
    +      valueIsNull,
    +      valueType,
    +      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
    +      inputMap
    +    )
    +  }
    +}
    +
    +case class ExternalMapToCatalyst private(
    +    key: String,
    +    keyType: DataType,
    +    keyConverter: Expression,
    +    value: String,
    +    valueIsNull: String,
    +    valueType: DataType,
    +    valueConverter: Expression,
    +    child: Expression)
    +  extends UnaryExpression with NonSQLExpression {
    +
    +  override def foldable: Boolean = false
    +
    +  override def dataType: MapType = MapType(keyConverter.dataType, valueConverter.dataType)
    +
    +  override def eval(input: InternalRow): Any =
    +    throw new UnsupportedOperationException("Only code-generated evaluation is supported")
    +
    +  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
    +    val inputMap = child.genCode(ctx)
    +    val genKeyConverter = keyConverter.genCode(ctx)
    +    val genValueConverter = valueConverter.genCode(ctx)
    +    val length = ctx.freshName("length")
    +    val index = ctx.freshName("index")
    +    val convertedKeys = ctx.freshName("convertedKeys")
    +    val convertedValues = ctx.freshName("convertedValues")
    +    val entry = ctx.freshName("entry")
    +    val entries = ctx.freshName("entries")
    +
    +    val (defineEntries, defineKeyValue) = child.dataType match {
    +      case ObjectType(cls) if classOf[java.util.Map[_, _]].isAssignableFrom(cls) =>
    +        val javaIteratorCls = classOf[java.util.Iterator[_]].getName
    +        val javaMapEntryCls = classOf[java.util.Map.Entry[_, _]].getName
    +
    +        val defineEntries =
    +          s"final $javaIteratorCls $entries = ${inputMap.value}.entrySet().iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $javaMapEntryCls $entry = ($javaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry.getKey();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry.getValue();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +
    +      case ObjectType(cls) if classOf[scala.collection.Map[_, _]].isAssignableFrom(cls) =>
    +        val scalaIteratorCls = classOf[Iterator[_]].getName
    +        val scalaMapEntryCls = classOf[Tuple2[_, _]].getName
    +
    +        val defineEntries = s"final $scalaIteratorCls $entries = ${inputMap.value}.iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $scalaMapEntryCls $entry = ($scalaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry._1();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry._2();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +    }
    +
    +    val valueNullCheck = if (ctx.isPrimitiveType(valueType)) {
    +      s"boolean $valueIsNull = false;"
    +    } else {
    +      s"boolean $valueIsNull = $value == null;"
    +    }
    +
    +    val arrayCls = classOf[GenericArrayData].getName
    +    val mapCls = classOf[ArrayBasedMapData].getName
    +    val convertedKeyType = ctx.boxedType(keyConverter.dataType)
    +    val convertedValueType = ctx.boxedType(valueConverter.dataType)
    +    val code =
    +      s"""
    +        ${inputMap.code}
    +        ${ctx.javaType(dataType)} ${ev.value} = ${ctx.defaultValue(dataType)};
    +        if (!${inputMap.isNull}) {
    +          final int $length = ${inputMap.value}.size();
    +          final Object[] $convertedKeys = new Object[$length];
    +          final Object[] $convertedValues = new Object[$length];
    +          int $index = 0;
    +          $defineEntries
    +          while($entries.hasNext()) {
    +            $defineKeyValue
    +            $valueNullCheck
    +
    +            ${genKeyConverter.code}
    +            if (${genKeyConverter.isNull}) {
    +              throw new RuntimeException("Cannot use null as map key!");
    +            } else {
    +              $convertedKeys[$index] = ($convertedKeyType) ${genKeyConverter.value};
    --- End diff --
    
    The type casting doesn't seem to be necessary here since `convertedKeys` is an `Object[]`.


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72100760
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -501,6 +501,143 @@ case class MapObjects private(
       }
     }
     
    +object ExternalMapToCatalyst {
    +  private val curId = new java.util.concurrent.atomic.AtomicInteger()
    +
    +  def apply(
    +      inputMap: Expression,
    +      keyType: DataType,
    +      keyConverter: Expression => Expression,
    +      valueType: DataType,
    +      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
    +    val id = curId.getAndIncrement()
    +    val keyName = "ExternalMapToCatalyst_key" + id
    +    val valueName = "ExternalMapToCatalyst_value" + id
    +    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
    +
    +    ExternalMapToCatalyst(
    +      keyName,
    +      keyType,
    +      keyConverter(LambdaVariable(keyName, "false", keyType)),
    +      valueName,
    +      valueIsNull,
    +      valueType,
    +      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
    +      inputMap
    +    )
    +  }
    +}
    +
    +case class ExternalMapToCatalyst private(
    --- End diff --
    
    need documentation to explain what this does


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/14344


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    **[Test build #62816 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62816/consoleFull)** for PR 14344 at commit [`f235493`](https://github.com/apache/spark/commit/f23549314cd2558fda0f859418ef36e11d9fe9f9).


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62864/
    Test PASSed.


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72035859
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -501,6 +501,143 @@ case class MapObjects private(
       }
     }
     
    +object ExternalMapToCatalyst {
    +  private val curId = new java.util.concurrent.atomic.AtomicInteger()
    +
    +  def apply(
    +      inputMap: Expression,
    +      keyType: DataType,
    +      keyConverter: Expression => Expression,
    +      valueType: DataType,
    +      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
    +    val id = curId.getAndIncrement()
    +    val keyName = "ExternalMapToCatalyst_key" + id
    +    val valueName = "ExternalMapToCatalyst_value" + id
    +    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
    +
    +    ExternalMapToCatalyst(
    +      keyName,
    +      keyType,
    +      keyConverter(LambdaVariable(keyName, "false", keyType)),
    +      valueName,
    +      valueIsNull,
    +      valueType,
    +      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
    +      inputMap
    +    )
    +  }
    +}
    +
    +case class ExternalMapToCatalyst private(
    +    key: String,
    +    keyType: DataType,
    +    keyConverter: Expression,
    +    value: String,
    +    valueIsNull: String,
    +    valueType: DataType,
    +    valueConverter: Expression,
    +    child: Expression)
    +  extends UnaryExpression with NonSQLExpression {
    +
    +  override def foldable: Boolean = false
    +
    +  override def dataType: MapType = MapType(keyConverter.dataType, valueConverter.dataType)
    +
    +  override def eval(input: InternalRow): Any =
    +    throw new UnsupportedOperationException("Only code-generated evaluation is supported")
    +
    +  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
    +    val inputMap = child.genCode(ctx)
    +    val genKeyConverter = keyConverter.genCode(ctx)
    +    val genValueConverter = valueConverter.genCode(ctx)
    +    val length = ctx.freshName("length")
    +    val index = ctx.freshName("index")
    +    val convertedKeys = ctx.freshName("convertedKeys")
    +    val convertedValues = ctx.freshName("convertedValues")
    +    val entry = ctx.freshName("entry")
    +    val entries = ctx.freshName("entries")
    +
    +    val (defineEntries, defineKeyValue) = child.dataType match {
    +      case ObjectType(cls) if classOf[java.util.Map[_, _]].isAssignableFrom(cls) =>
    +        val javaIteratorCls = classOf[java.util.Iterator[_]].getName
    +        val javaMapEntryCls = classOf[java.util.Map.Entry[_, _]].getName
    +
    +        val defineEntries =
    +          s"final $javaIteratorCls $entries = ${inputMap.value}.entrySet().iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $javaMapEntryCls $entry = ($javaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry.getKey();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry.getValue();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +
    +      case ObjectType(cls) if classOf[scala.collection.Map[_, _]].isAssignableFrom(cls) =>
    +        val scalaIteratorCls = classOf[Iterator[_]].getName
    +        val scalaMapEntryCls = classOf[Tuple2[_, _]].getName
    +
    +        val defineEntries = s"final $scalaIteratorCls $entries = ${inputMap.value}.iterator();"
    +
    +        val defineKeyValue =
    +          s"""
    +            final $scalaMapEntryCls $entry = ($scalaMapEntryCls) $entries.next();
    +            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry._1();
    +            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry._2();
    +          """
    +
    +        defineEntries -> defineKeyValue
    +    }
    +
    +    val valueNullCheck = if (ctx.isPrimitiveType(valueType)) {
    +      s"boolean $valueIsNull = false;"
    +    } else {
    +      s"boolean $valueIsNull = $value == null;"
    +    }
    +
    +    val arrayCls = classOf[GenericArrayData].getName
    +    val mapCls = classOf[ArrayBasedMapData].getName
    +    val convertedKeyType = ctx.boxedType(keyConverter.dataType)
    +    val convertedValueType = ctx.boxedType(valueConverter.dataType)
    +    val code =
    +      s"""
    +        ${inputMap.code}
    +        ${ctx.javaType(dataType)} ${ev.value} = ${ctx.defaultValue(dataType)};
    +        if (!${inputMap.isNull}) {
    +          final int $length = ${inputMap.value}.size();
    +          final Object[] $convertedKeys = new Object[$length];
    +          final Object[] $convertedValues = new Object[$length];
    +          int $index = 0;
    +          $defineEntries
    +          while($entries.hasNext()) {
    +            $defineKeyValue
    +            $valueNullCheck
    +
    +            ${genKeyConverter.code}
    +            if (${genKeyConverter.isNull}) {
    +              throw new RuntimeException("Cannot use null as map key!");
    +            } else {
    +              $convertedKeys[$index] = ($convertedKeyType) ${genKeyConverter.value};
    --- End diff --
    
    It handles the case where the key/value is primitive (this can happen for a Scala map), and we cannot assign primitive values to an `Object` slot without boxing them first.
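
    A hypothetical, simplified sketch of the loop shape this expression generates for a `scala.collection.Map[Int, String]`; variable names are illustrative, not the actual `freshName()` output. It shows the primitive-key case discussed above, where the explicit cast to the boxed type spells out the int-to-Integer boxing before the key is stored into an `Object[]` slot.

        import scala.Tuple2;

        public class GeneratedLoopSketch {
          // Rough shape of the generated conversion loop (not actual codegen output).
          static void convert(scala.collection.Map inputMap) {
            final int length = inputMap.size();
            final Object[] convertedKeys = new Object[length];
            final Object[] convertedValues = new Object[length];
            int index = 0;
            final scala.collection.Iterator entries = inputMap.iterator();
            while (entries.hasNext()) {
              final Tuple2 entry = (Tuple2) entries.next();
              int key = (java.lang.Integer) entry._1();          // ctx.javaType(IntegerType) is the primitive "int"
              java.lang.String value = (java.lang.String) entry._2();
              boolean valueIsNull = value == null;

              // The explicit cast to the boxed type makes the int -> Integer boxing visible
              // in the generated source before the assignment into an Object[] element.
              convertedKeys[index] = (java.lang.Integer) key;
              convertedValues[index] = valueIsNull ? null : value;
              index++;
            }
          }
        }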


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    cc @yhuai @liancheng


[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14344#discussion_r72036250
  
    --- Diff: sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java ---
    @@ -673,21 +694,24 @@ public void testJavaBeanEncoder() {
           new byte[]{1, 2},
           new String[]{"hello", null},
           Arrays.asList("a", "b"),
    -      Arrays.asList(100L, null, 200L)});
    +      Arrays.asList(100L, null, 200L),
    +      map1});
         Row row2 = new GenericRow(new Object[]{
           false,
           30,
           new byte[]{3, 4},
           new String[]{null, "world"},
           Arrays.asList("x", "y"),
    -      Arrays.asList(300L, null, 400L)});
    +      Arrays.asList(300L, null, 400L),
    +      map2});
         StructType schema = new StructType()
           .add("a", BooleanType, false)
           .add("b", IntegerType, false)
           .add("c", BinaryType)
           .add("d", createArrayType(StringType))
           .add("e", createArrayType(StringType))
    -      .add("f", createArrayType(LongType));
    +      .add("f", createArrayType(LongType))
    +      .add("g", createMapType(IntegerType, StringType));
    --- End diff --
    
    Maybe more test cases for nested key/value types?
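
    One possible shape for such a case, purely as a sketch: a map column whose value type is itself an array. The field name ("h") and types below are invented and only mirror the schema-building style already used in `JavaDatasetSuite`.

        import java.util.Arrays;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import org.apache.spark.sql.types.StructType;
        import static org.apache.spark.sql.types.DataTypes.*;

        public class NestedMapSchemaSketch {
          public static void main(String[] args) {
            // Hypothetical nested data: Integer keys mapping to lists of strings.
            Map<Integer, List<String>> nested = new HashMap<>();
            nested.put(1, Arrays.asList("a", "b"));
            nested.put(2, Arrays.asList("c", null));
            System.out.println(nested);

            // The corresponding schema entry: a map whose value type is an array,
            // in the same style as the "g" column added in the diff above.
            StructType schema = new StructType()
                .add("h", createMapType(IntegerType, createArrayType(StringType)));
            System.out.println(schema.treeString());
          }
        }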


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    LGTM, merging to master.


[GitHub] spark issue #14344: [SPARK-16706][SQL] support java map in encoder

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14344
  
    **[Test build #62816 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62816/consoleFull)** for PR 14344 at commit [`f235493`](https://github.com/apache/spark/commit/f23549314cd2558fda0f859418ef36e11d9fe9f9).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.

