Posted to issues@spark.apache.org by "Vitaly Larchenkov (JIRA)" <ji...@apache.org> on 2019/02/18 10:28:00 UTC

[jira] [Created] (SPARK-26911) Spark

Vitaly Larchenkov created SPARK-26911:
-----------------------------------------

             Summary: Spark 
                 Key: SPARK-26911
                 URL: https://issues.apache.org/jira/browse/SPARK-26911
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.1
         Environment: PySpark (Spark 2.3.1)
            Reporter: Vitaly Larchenkov


Spark fails to resolve a column (`id`) even though it appears in the list of input columns reported by the error:
{code:java}
org.apache.spark.sql.AnalysisException: cannot resolve '`id`' given input columns: [flid.palfl_timestamp, flid.id, flid.pal_state, flid.prs_id, flid.bank_id, flid.wr_id, flid.link_id]; {code}
{code:java}
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
/usr/share/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
     62         try:
---> 63             return f(*a, **kw)
     64         except py4j.protocol.Py4JJavaError as e:

/usr/share/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    327                     "An error occurred while calling {0}{1}{2}.\n".
--> 328                     format(target_id, ".", name), value)
    329             else:

Py4JJavaError: An error occurred while calling o35.sql.
: org.apache.spark.sql.AnalysisException: cannot resolve '`id`' given input columns: [flid.palfl_timestamp, flid.id, flid.pal_state, flid.prs_id, flid.bank_id, flid.wr_id, flid.link_id]; line 10 pos 98;
'Project ['multiples.id, 'multiples.link_id]{code}
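Judging from the last line of the trace, the query projects `multiples.id` while the relation is in scope only under the alias `flid`. A minimal PySpark sketch of that situation (the schema and data below are hypothetical, reconstructed from the column names in the error message) reproduces the same AnalysisException; qualifying by the current alias resolves it:

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = (SparkSession.builder
         .master("local[1]")
         .appName("SPARK-26911-sketch")
         .getOrCreate())

# Hypothetical two-column relation named after the projection in the trace.
df = spark.createDataFrame([(1, 10)], ["id", "link_id"])
df.createOrReplaceTempView("multiples")

# Once the relation is aliased, only the alias qualifier is in scope,
# so qualifying by the original name fails even though `id` exists.
caught = False
try:
    spark.sql(
        "SELECT multiples.id, multiples.link_id FROM multiples AS flid"
    ).collect()
except AnalysisException:
    caught = True

# Qualifying by the alias resolves the same columns correctly.
rows = spark.sql(
    "SELECT flid.id, flid.link_id FROM multiples AS flid"
).collect()
{code}

If the original query does the same thing, this would be expected analyzer behavior rather than a bug, but the full query (line 10 in the trace) would be needed to confirm.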



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
