Posted to issues@spark.apache.org by "GokiMori (Jira)" <ji...@apache.org> on 2020/12/24 05:28:00 UTC

[jira] [Created] (SPARK-33897) Can't set option 'cross' in join method.

GokiMori created SPARK-33897:
--------------------------------

             Summary: Can't set option 'cross' in join method.
                 Key: SPARK-33897
                 URL: https://issues.apache.org/jira/browse/SPARK-33897
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.0.1
            Reporter: GokiMori


[The PySpark documentation|https://spark.apache.org/docs/3.0.1/api/python/pyspark.sql.html#pyspark.sql.DataFrame.join] says "Must be one of: inner, cross, outer, full, fullouter, full_outer, left, leftouter, left_outer, right, rightouter, right_outer, semi, leftsemi, left_semi, anti, leftanti and left_anti."
However, I get the following error when I pass the 'cross' option:

 
{code:scala}
scala> val df1 = spark.createDataFrame(Seq((1,"a"),(2,"b")))
df1: org.apache.spark.sql.DataFrame = [_1: int, _2: string]
scala> val df2 = spark.createDataFrame(Seq((1,"A"),(2,"B"), (3, "C")))
df2: org.apache.spark.sql.DataFrame = [_1: int, _2: string]
scala> df1.join(right = df2, usingColumns = Seq("_1"), joinType = "cross").show()
java.lang.IllegalArgumentException: requirement failed: Unsupported using join type Cross
 at scala.Predef$.require(Predef.scala:281)
 at org.apache.spark.sql.catalyst.plans.UsingJoin.<init>(joinTypes.scala:106)
 at org.apache.spark.sql.Dataset.join(Dataset.scala:1025)
 ... 53 elided

{code}
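For context (illustration only, not part of the original report): the using-columns overload does accept the other join types listed in the documentation, and a plain cross product is available through Dataset.crossJoin. A minimal sketch reusing df1/df2 from above:

{code:scala}
// Cross product via the dedicated API; crossJoin takes no join-type string
// and no using columns, so the UsingJoin check is never reached.
val crossed = df1.crossJoin(df2)
crossed.show()   // 2 x 3 = 6 rows

// The using-columns form works with e.g. "inner"; only "cross" is rejected.
val innerJoined = df1.join(df2, Seq("_1"), "inner")
innerJoined.show()
{code}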
 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org