Posted to user@spark.apache.org by dwgw <dw...@gmail.com> on 2020/09/02 04:39:56 UTC

value col is not a member of org.apache.spark.rdd.RDD

Hi
I am trying to generate a hierarchy table using Spark GraphX, but at runtime I am getting the following error.

error: value col is not a member of org.apache.spark.rdd.RDD[(Any, (Int, Any, String, Int, Int))]
       val empHirearchyDF = empHirearchyExtDF.join(empDF , empDF.col("emp_id") ===
         empHirearchyExtDF.col("emp_id_pk")).selectExpr("emp_id","first_name","last_name","title","mgr_id","level","root","path","iscyclic","isleaf")
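For context, as far as I understand, col is defined on Dataset/DataFrame but not on RDD, so the message seems to say that the compiler is treating one side of the join as an RDD. A minimal sketch that reproduces the same kind of error (the names spark and rdd below are stand-ins, not from my code):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("col-on-rdd").getOrCreate()
val rdd = spark.sparkContext.parallelize(Seq(("a", 1), ("b", 2)))

// rdd.col("id")   // does not compile: value col is not a member of org.apache.spark.rdd.RDD[(String, Int)]

import spark.implicits._
val df = rdd.toDF("id", "n")   // once converted to a DataFrame, col is available
df.col("id")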


*Scala code:*

import org.apache.spark._
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.DataFrame
import scala.util.hashing.MurmurHash3

...
...

// call the function
val empHirearchyExtDF = calcTopLevelHierarcy(empVertexDF, empEdgeDF)
  .map { case (pk, (level, root, path, iscyclic, isleaf)) =>
    (pk.asInstanceOf[String], level, root.asInstanceOf[String], path, iscyclic, isleaf)
  }
  .toDF("emp_id_pk", "level", "root", "path", "iscyclic", "isleaf")
  .cache()
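In case it is relevant: my understanding is that toDF on an RDD of tuples only resolves when the session implicits are in scope. A minimal sketch of that pattern, with made-up data in the same shape as the tuple above (the session name spark is an assumption; in spark-shell it is predefined):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("hierarchy-sketch").getOrCreate()
import spark.implicits._   // brings toDF for RDDs/Seqs of tuples into scope

val sample = spark.sparkContext.parallelize(Seq(
  ("1", 1, "1", "/1", 0, 1),
  ("2", 2, "1", "/1/2", 0, 1)))
val sampleDF = sample.toDF("emp_id_pk", "level", "root", "path", "iscyclic", "isleaf")
sampleDF.printSchema()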

// extend the original table with the new columns. *The error occurs on the following line*
val empHirearchyDF = empHirearchyExtDF
  .join(empDF, empDF.col("emp_id") === empHirearchyExtDF.col("emp_id_pk"))
  .selectExpr("emp_id", "first_name", "last_name", "title", "mgr_id",
    "level", "root", "path", "iscyclic", "isleaf")
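For reference, this is the shape of the join I am after once both sides are DataFrames; tiny stand-ins with made-up rows (assumes a SparkSession named spark with its implicits imported, as in the sketches above):

val empDFSample = Seq(("1", "Ann", "Lee", "CEO", "0"))
  .toDF("emp_id", "first_name", "last_name", "title", "mgr_id")
val extSample = Seq(("1", 1, "1", "/1", 0, 1))
  .toDF("emp_id_pk", "level", "root", "path", "iscyclic", "isleaf")

val joined = extSample
  .join(empDFSample, empDFSample.col("emp_id") === extSample.col("emp_id_pk"))
  .selectExpr("emp_id", "first_name", "last_name", "title", "mgr_id",
    "level", "root", "path", "iscyclic", "isleaf")
joined.show()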

...
...

The error occurs during the execution of the ResultStage. In the spark-shell I have found that the method 'col' is available on the DataFrame empDF (tab completion output below); a quick type check on empHirearchyExtDF itself is sketched after the listing.

scala> empDF.
!=                              
##                              
+                               
->                              
==                              
agg                             
alias                           
apply                           
as                              
asInstanceOf                    
cache                           
checkpoint                      
coalesce                        
*col*
colRegex                        
collect                         
collectAsList                   
columns 
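What I have not checked as directly is the type the compiler assigns to empHirearchyExtDF itself. My understanding is that the REPL's :type command prints the static type of an expression, so something like the following should show whether it is a DataFrame or still an RDD (sketch, run in spark-shell); if the second prints an RDD[...] type rather than org.apache.spark.sql.DataFrame, that would point at the toDF step.

scala> :type empDF
scala> :type empHirearchyExtDF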

Any insight into the issue would be appreciated.

Regards




