Posted to user@spark.apache.org by shyla deshpande <de...@gmail.com> on 2016/11/04 21:00:56 UTC
java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$. Please Help!
object App {
  import org.apache.spark.sql.functions._
  import org.apache.spark.sql.SparkSession

  def main(args: Array[String]): Unit = {
    println("Hello World!")
    val sparkSession = SparkSession.builder
      .master("local")
      .appName("spark session example")
      .getOrCreate()
  }
}
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
  <encoding>UTF-8</encoding>
  <scala.version>2.11.8</scala.version>
  <scala.compat.version>2.11</scala.compat.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.1</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.1</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.specs2</groupId>
    <artifactId>specs2-core_${scala.compat.version}</artifactId>
    <version>2.4.16</version>
    <scope>test</scope>
  </dependency>
</dependencies>

<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
</build>
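One side note on the build section above: setting only sourceDirectory does not make Maven compile Scala sources; a Scala compiler plugin is also needed. A minimal sketch using scala-maven-plugin (the plugin version here is an assumption for a Scala 2.11 setup, not taken from the original post):

```xml
<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
  <plugins>
    <!-- Compiles src/main/scala and src/test/scala during the
         standard Maven compile/test-compile phases. -->
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```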
Re: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$. Please Help!
Posted by shyla deshpande <de...@gmail.com>.
I feel so good that Holden replied.
Yes, that was the problem. I was running from IntelliJ; once I removed the
provided scope, it works great.
Thanks a lot.
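For anyone hitting the same issue: removing the provided scope works, but it also bloats the jar you ship to a cluster. A common alternative is to keep provided in the main dependencies and re-add Spark at compile scope inside a Maven profile that you activate only for IDE runs. A sketch, assuming the same coordinates as the pom above (the profile id is made up):

```xml
<profiles>
  <!-- Activate with: mvn -Plocal-run ... (or select the profile in the
       IDE's Maven tool window) so Spark classes are on the runtime
       classpath during local runs only. -->
  <profile>
    <id>local-run</id>
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
        <scope>compile</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```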
On Fri, Nov 4, 2016 at 2:05 PM, Holden Karau <ho...@pigscanfly.ca> wrote:
> It seems like you've marked the Spark jars as provided; in that case they
> would only be provided if you run your application with spark-submit or
> otherwise have Spark's JARs on your class path. How are you launching your
> application?
Re: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$. Please Help!
Posted by Holden Karau <ho...@pigscanfly.ca>.
It seems like you've marked the Spark jars as provided; in that case they
would only be provided if you run your application with spark-submit or
otherwise have Spark's JARs on your class path. How are you launching your
application?
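To make the spark-submit route concrete, the launch would look roughly like this (the jar name and main-class name are assumptions based on the pom and code above, not taken from the thread):

```shell
# Build the jar with the provided-scope pom; spark-submit then puts
# Spark's own JARs on the classpath at launch time, so the provided
# dependencies resolve without being bundled into the application jar.
mvn package
spark-submit \
  --class App \
  --master local \
  target/myapp-1.0-SNAPSHOT.jar
```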
--
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau