Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/03/22 23:36:25 UTC

[jira] [Created] (SPARK-14083) Analyze JVM bytecode and turn closures into Catalyst expressions

Reynold Xin created SPARK-14083:
-----------------------------------

             Summary: Analyze JVM bytecode and turn closures into Catalyst expressions
                 Key: SPARK-14083
                 URL: https://issues.apache.org/jira/browse/SPARK-14083
             Project: Spark
          Issue Type: New Feature
          Components: SQL
            Reporter: Reynold Xin


In the Dataset API, we are relying more on user-defined functions, which are typically slower than expressions because we have more flexibility to optimize expressions (known data types, no virtual function calls, etc.).

In many cases, it's actually not going to be very difficult to look into the bytecode of these closures and figure out what they are trying to do. If we can understand them, then we can turn them directly into Catalyst expressions for more optimized execution.

Some examples are:

{code}
ds.map(_.name)  // equivalent to expression col("name")

ds.groupBy(_.gender)  // equivalent to expression col("gender")

ds.filter(_.age > 18)  // equivalent to expression GreaterThan(col("age"), lit(18))

ds.map(_.id + 1)  // equivalent to Add(col("id"), lit(1))
{code}
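
To make the intended equivalence concrete, here is a minimal sketch using the public Dataset API (the Person case class and the session setup are illustrative assumptions, not part of this ticket): the typed filter is what users write today, and the Column-based filter is what the bytecode analyzer should effectively rewrite it to.

{code}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Illustrative schema only; any case class with these fields would do.
case class Person(name: String, gender: String, age: Long, id: Long)

object ClosureEquivalenceDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    val ds = Seq(Person("Ann", "F", 30, 1), Person("Bob", "M", 17, 2)).toDS()

    // Today: the closure is a black box, so each row is deserialized into a
    // Person object and the lambda is invoked once per row.
    val viaClosure = ds.filter(_.age > 18)

    // What the analyzer should produce instead: a Catalyst comparison that
    // the optimizer can push down and the code generator can compile.
    val viaExpression = ds.filter(col("age") > lit(18))

    viaClosure.show()
    viaExpression.show()
    spark.stop()
  }
}
{code}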

The goal of this ticket is to design a small framework for bytecode analysis and use it to convert closures/lambdas into Catalyst expressions in order to speed up Dataset execution. It is a little bit futuristic, but I believe it is very doable. The framework should be easy to reason about (e.g. similar to Catalyst).
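
As a sketch of what the core of such a framework could look like: the Insn ADT below is a hypothetical stand-in for whatever simplified tree a bytecode reader (e.g. ASM) would recover from a closure body, and the Catalyst classes are Spark internals whose exact shape may vary across versions. The point is the Catalyst-like style: one small rule per node, with an Option so that anything not understood falls back to invoking the original closure.

{code}
import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute
import org.apache.spark.sql.catalyst.expressions.{Add, Expression, GreaterThan, Literal}

// Hypothetical simplified form of a closure body, assumed to be produced by
// a bytecode reader such as ASM. Real JVM bytecode is stack-based; a small
// abstract interpreter would be needed to recover this tree shape.
sealed trait Insn
case class GetField(name: String) extends Insn                       // _.age
case class Const(value: Any) extends Insn                            // 18
case class BinOp(op: String, left: Insn, right: Insn) extends Insn   // >, +

object ClosureToExpression {
  // One rule per node; None means "not understood", in which case the
  // plan keeps the original closure and executes it as before.
  def translate(insn: Insn): Option[Expression] = insn match {
    case GetField(name) => Some(UnresolvedAttribute(name))
    case Const(v)       => Some(Literal(v))
    case BinOp(">", l, r) =>
      for (le <- translate(l); re <- translate(r)) yield GreaterThan(le, re)
    case BinOp("+", l, r) =>
      for (le <- translate(l); re <- translate(r)) yield Add(le, re)
    case _ => None
  }
}

// Example: _.age > 18 would become
// GreaterThan(UnresolvedAttribute("age"), Literal(18)):
//   ClosureToExpression.translate(BinOp(">", GetField("age"), Const(18)))
{code}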

Note the big emphasis on "small" and "easy to reason about". A patch should be rejected if it is too complicated or difficult to reason about.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org