Posted to user@spark.apache.org by Leon <pa...@gmail.com> on 2014/11/25 20:35:45 UTC

RE: Spark SQL parser bug?

Hello

I just stumbled on exactly the same issue as you are discussing in this
thread. Here are my dependencies:
<dependencies>
        <!-- Spark Cassandra Connector -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>

        <!-- Spark dependencies, provided by the cluster at runtime -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.1.2-SNAPSHOT</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
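
As a side note, a SNAPSHOT version such as 1.1.2-SNAPSHOT is not available
from Maven Central; it resolves only if Spark was built and installed
locally with "mvn install", or if the POM declares the ASF snapshots
repository. A sketch of that repository entry, assuming the standard ASF
snapshots URL:

<repositories>
        <repository>
            <id>apache-snapshots</id>
            <url>https://repository.apache.org/content/repositories/snapshots</url>
            <releases><enabled>false</enabled></releases>
            <snapshots><enabled>true</enabled></snapshots>
        </repository>
</repositories>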

As you can see, I am using the latest versions of Spark and the Spark
Cassandra Connector, and I still get the same error message:
Exception in thread "main" java.util.NoSuchElementException: head of empty
list

So, I don't believe this bug was really fixed in the Spark 1.1.1 release,
as reported above.

Did your problem get fixed with the latest Spark update?

Thanks,
Leon



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-parser-bug-tp15999p19793.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



RE: Spark SQL parser bug?

Posted by Mohammed Guller <mo...@glassbeam.com>.
Leon,

I solved the problem by creating a workaround for it, so I didn't need to upgrade to 1.1.2-SNAPSHOT.
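
One possible approach of that general shape (only a sketch with
hypothetical names; the details of the actual workaround are not shown
here) is to bypass the SQL-string parser entirely and use Spark 1.1's
language-integrated query DSL on SchemaRDD instead of sql():

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical row type, for illustration only.
case class Event(id: String, count: Int)

object DslWorkaround {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("DslWorkaround"))
    val sqlContext = new SQLContext(sc)
    import sqlContext._ // RDD to SchemaRDD implicits plus the Symbol-based query DSL

    val events = sc.parallelize(Seq(Event("a", 1), Event("b", 5)))

    // where()/select() build the query plan directly, so SqlParser (the
    // source of the "head of empty list" error) is never invoked.
    val filtered = events.where('count > 2).select('id)
    filtered.collect().foreach(println)
  }
}

Since the DSL never parses a SQL string, it sidesteps parser-level failures
altogether, at the cost of Scala-only, less portable query code.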

Mohammed



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org