Posted to user@spark.apache.org by Jerry <je...@gmail.com> on 2015/08/18 19:28:39 UTC

What am I missing that's preventing javac from finding the libraries (CLASSPATH is set up...)?

Hello,

So I set up Spark to run on my local machine to see if I can reproduce the
issue I'm having with DataFrames, but I'm running into issues with the
compiler.

Here's what I got:

$ echo $CLASSPATH
/usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar


$ javac Test.java
Test.java:1: package org.apache.spark.sql.api.java does not exist
import org.apache.spark.sql.api.java.*;
^
Test.java:6: package org.apache.spark.sql does not exist
import org.apache.spark.sql.*;
^
Test.java:7: package org.apache.spark.sql.hive does not exist
import org.apache.spark.sql.hive.*;
....


Let me know what I'm doing wrong.

Thanks,
        Jerry

Re: What am I missing that's preventing javac from finding the libraries (CLASSPATH is set up...)?

Posted by UMESH CHAUDHARY <um...@gmail.com>.
Just add spark-1.4.1-yarn-shuffle.jar to your classpath, or create a new Maven
project using the dependencies below:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.4.1</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>1.4.1</version>
</dependency>
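
With the two artifacts above in your pom.xml, a minimal Test.java along these
lines should compile when you run mvn compile from the project root (the
directory containing pom.xml). This is only a sketch to verify the
dependencies; the app name and the people.json path are placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class Test {
    public static void main(String[] args) {
        // Use a local master so no cluster is required.
        SparkConf conf = new SparkConf().setAppName("Test").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Read a JSON file into a DataFrame and print a few rows.
        DataFrame df = sqlContext.read().json("people.json");
        df.show();

        sc.stop();
    }
}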





On Tue, Aug 18, 2015 at 11:51 PM, Jerry <je...@gmail.com> wrote:

> So from what I understand, those usually pull dependencies for a given
> project? I'm able to run the Spark shell, so I'd assume I have everything.
> What am I missing in the big picture, and what directory do I run Maven in?
>
> Thanks,
>         Jerry
>
> On Tue, Aug 18, 2015 at 11:15 AM, Ted Yu <yu...@gmail.com> wrote:
>
>> Normally people establish a Maven project with the Spark dependencies, or
>> use sbt.
>>
>> Can you go with either approach?
>>
>> Cheers
>>
>> On Tue, Aug 18, 2015 at 10:28 AM, Jerry <je...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> So I set up Spark to run on my local machine to see if I can reproduce
>>> the issue I'm having with DataFrames, but I'm running into issues with the
>>> compiler.
>>>
>>> Here's what I got:
>>>
>>> $ echo $CLASSPATH
>>>
>>> /usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
>>>
>>>
>>> $ javac Test.java
>>> Test.java:1: package org.apache.spark.sql.api.java does not exist
>>> import org.apache.spark.sql.api.java.*;
>>> ^
>>> Test.java:6: package org.apache.spark.sql does not exist
>>> import org.apache.spark.sql.*;
>>> ^
>>> Test.java:7: package org.apache.spark.sql.hive does not exist
>>> import org.apache.spark.sql.hive.*;
>>> ....
>>>
>>>
>>> Let me know what I'm doing wrong.
>>>
>>> Thanks,
>>>         Jerry
>>>
>>
>>
>

Re: What am I missing that's preventing javac from finding the libraries (CLASSPATH is set up...)?

Posted by Jerry <je...@gmail.com>.
So from what I understand, those usually pull dependencies for a given
project? I'm able to run the Spark shell, so I'd assume I have everything.
What am I missing in the big picture, and what directory do I run Maven in?

Thanks,
        Jerry

On Tue, Aug 18, 2015 at 11:15 AM, Ted Yu <yu...@gmail.com> wrote:

> Normally people establish a Maven project with the Spark dependencies, or
> use sbt.
>
> Can you go with either approach?
>
> Cheers
>
> On Tue, Aug 18, 2015 at 10:28 AM, Jerry <je...@gmail.com> wrote:
>
>> Hello,
>>
>> So I set up Spark to run on my local machine to see if I can reproduce the
>> issue I'm having with DataFrames, but I'm running into issues with the
>> compiler.
>>
>> Here's what I got:
>>
>> $ echo $CLASSPATH
>>
>> /usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
>>
>>
>> $ javac Test.java
>> Test.java:1: package org.apache.spark.sql.api.java does not exist
>> import org.apache.spark.sql.api.java.*;
>> ^
>> Test.java:6: package org.apache.spark.sql does not exist
>> import org.apache.spark.sql.*;
>> ^
>> Test.java:7: package org.apache.spark.sql.hive does not exist
>> import org.apache.spark.sql.hive.*;
>> ....
>>
>>
>> Let me know what I'm doing wrong.
>>
>> Thanks,
>>         Jerry
>>
>
>

Re: What am I missing that's preventing javac from finding the libraries (CLASSPATH is set up...)?

Posted by Ted Yu <yu...@gmail.com>.
Normally people establish a Maven project with the Spark dependencies, or
use sbt.

Can you go with either approach?
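
One thing worth checking in the meantime: javac only falls back to the
CLASSPATH environment variable when no -cp / -classpath option is given, so a
quick sanity check is to pass the assembly jar to javac explicitly and see
whether the same errors appear:

$ javac -cp /home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar Test.java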

Cheers

On Tue, Aug 18, 2015 at 10:28 AM, Jerry <je...@gmail.com> wrote:

> Hello,
>
> So I set up Spark to run on my local machine to see if I can reproduce the
> issue I'm having with DataFrames, but I'm running into issues with the
> compiler.
>
> Here's what I got:
>
> $ echo $CLASSPATH
>
> /usr/lib/jvm/java-6-oracle/lib:/home/adminz/dev/spark/spark-1.4.1/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
>
>
> $ javac Test.java
> Test.java:1: package org.apache.spark.sql.api.java does not exist
> import org.apache.spark.sql.api.java.*;
> ^
> Test.java:6: package org.apache.spark.sql does not exist
> import org.apache.spark.sql.*;
> ^
> Test.java:7: package org.apache.spark.sql.hive does not exist
> import org.apache.spark.sql.hive.*;
> ....
>
>
> Let me know what I'm doing wrong.
>
> Thanks,
>         Jerry
>