Posted to issues@spark.apache.org by "Xin Ren (JIRA)" <ji...@apache.org> on 2016/06/27 20:01:52 UTC

[jira] [Commented] (SPARK-16233) test_sparkSQL.R is failing

    [ https://issues.apache.org/jira/browse/SPARK-16233?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351703#comment-15351703 ] 

Xin Ren commented on SPARK-16233:
---------------------------------

I'm working on this

> test_sparkSQL.R is failing
> --------------------------
>
>                 Key: SPARK-16233
>                 URL: https://issues.apache.org/jira/browse/SPARK-16233
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR, Tests
>    Affects Versions: 2.0.0
>            Reporter: Xin Ren
>            Priority: Minor
>
> By running 
> {code}
> ./R/run-tests.sh 
> {code}
> Getting error:
> {code}
> 15. Error: create DataFrame from list or data.frame (@test_sparkSQL.R#277) -----
> java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/PreInsertCastAndRename$
> 	at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:69)
> 	at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
> 	at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
> 	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:533)
> 	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:293)
> 	at org.apache.spark.sql.api.r.SQLUtils$.createDF(SQLUtils.scala:135)
> 	at org.apache.spark.sql.api.r.SQLUtils.createDF(SQLUtils.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
> 	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
> 	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
> 	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
> 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
> 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
> 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
> 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
> 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
> 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
> 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
> 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
> 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
> 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
> 	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
> 	at java.lang.Thread.run(Thread.java:745)
> 1: createDataFrame(l, c("a", "b")) at /Users/quickmobile/workspace/spark/R/lib/SparkR/tests/testthat/test_sparkSQL.R:277
> 2: dispatchFunc("createDataFrame(data, schema = NULL, samplingRatio = 1.0)", x, ...)
> 3: f(x, ...)
> 4: callJStatic("org.apache.spark.sql.api.r.SQLUtils", "createDF", srdd, schema$jobj,
>        sparkSession)
> 5: invokeJava(isStatic = TRUE, className, methodName, ...)
> 6: stop(readString(conn))
> DONE ===========================================================================
> Execution halted
> {code}
> Cause: most probably these tests are still using the deprecated 'createDataFrame(sqlContext, ...)' API. The test method invocations should be updated to the Spark 2.x form.
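> A minimal sketch of the suspected fix, assuming the offending call sites still pass a SQLContext as the first argument (the exact call at test_sparkSQL.R#277 in the trace above already uses the new form, so which invocations are stale is an assumption here):
> {code}
> # Spark 1.x style (deprecated): SQLContext passed explicitly
> l <- list(list(a = 1, b = "x"))
> df <- createDataFrame(sqlContext, l, c("a", "b"))
>
> # Spark 2.x style: the active SparkSession is used implicitly
> df <- createDataFrame(l, c("a", "b"))
> {code}
> Note the deprecated form is rewritten by dispatchFunc() (visible as frame 2 in the R traceback), so the NoClassDefFoundError itself may instead point to stale build artifacts rather than the call style; a clean rebuild is worth trying before changing the tests.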



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org