Posted to user@spark.apache.org by Ashok Kumar <as...@yahoo.com.INVALID> on 2016/06/05 10:06:23 UTC
Basic question on using one's own classes in the Scala app
Hi all,
Appreciate any advice on this. It is about Scala.
I have created a very basic Utilities.scala that contains a test class and method. I intend to add my own classes and methods as I expand, and to reference these classes and methods in my other apps.
class getCheckpointDirectory {
  def CheckpointDirectory (ProgramName: String) : String = {
    var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
    return hdfsDir
  }
}
I have used sbt to create a jar file for it. It is created as a jar file
utilities-assembly-0.1-SNAPSHOT.jar
Now I want to make a call to that method CheckpointDirectory in my app code myapp.scala to return the value for hdfsDir
val ProgramName = this.getClass.getSimpleName.trim
val getCheckpointDirectory = new getCheckpointDirectory
val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
However, I am getting a compilation error as expected
not found: type getCheckpointDirectory
[error] val getCheckpointDirectory = new getCheckpointDirectory
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
So a basic question: in order for compilation to work, do I need to create a package for my jar file, or add a dependency like the following in sbt?
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
Any advice will be appreciated.
Thanks
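As an aside, the same utility could be written a little more idiomatically -- a sketch only, keeping the names from the message above except the enclosing object, whose name is made up for illustration:

  // Utilities.scala -- an object needs no instantiation, and the var/return are unnecessary
  object CheckpointUtils {
    def CheckpointDirectory(ProgramName: String): String =
      "hdfs://host:9000/user/user/checkpoint/" + ProgramName
  }

  // usage in the app:
  val hdfsDir = CheckpointUtils.CheckpointDirectory(this.getClass.getSimpleName.trim)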
Re: Basic question on using one's own classes in the Scala app
Posted by Ashok Kumar <as...@yahoo.com.INVALID>.
Thank you.
I added this as a dependency:
libraryDependencies += "com.databricks" % "apps.twitter_classifier" % "1.0.0"
I chose that number at the end arbitrarily. Is that correct?
Also, in my TwitterAnalyzer.scala I added this line:
import com.databricks.apps.twitter_classifier._
Now I am getting this error
[info] Resolving com.databricks#apps.twitter_classifier;1.0.0 ...
[warn] module not found: com.databricks#apps.twitter_classifier;1.0.0
[warn] ==== local: tried
[warn]   /home/hduser/.ivy2/local/com.databricks/apps.twitter_classifier/1.0.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/com/databricks/apps.twitter_classifier/1.0.0/apps.twitter_classifier-1.0.0.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.databricks#apps.twitter_classifier;1.0.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]   com.databricks:apps.twitter_classifier:1.0.0 (/home/hduser/scala/TwitterAnalyzer/build.sbt#L18-19)
[warn]     +- scala:scala_2.10:1.0
sbt.ResolveException: unresolved dependency: com.databricks#apps.twitter_classifier;1.0.0: not found
Any ideas?
regards,
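A likely reason for the resolution failure above: the assembly jar was built locally and never published to any Ivy or Maven repository, so there is nothing for the resolvers to find. A common sbt alternative -- only a sketch, not something suggested in the thread -- is to treat it as an unmanaged dependency:

  // sbt picks up every jar in the project's lib/ directory automatically, so copying
  // utilities-assembly-0.1-SNAPSHOT.jar into lib/ is enough; no libraryDependencies entry is needed.
  // If the jar lives elsewhere, the unmanaged directory can be redirected in build.sbt
  // (the directory name here is an assumption):
  unmanagedBase := baseDirectory.value / "custom_lib"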
On Sunday, 5 June 2016, 22:22, Jacek Laskowski <ja...@japila.pl> wrote:
On Sun, Jun 5, 2016 at 9:01 PM, Ashok Kumar
<as...@yahoo.com.invalid> wrote:
> Now I have added this
>
> libraryDependencies += "com.databricks" % "apps.twitter_classifier"
>
> However, I am getting an error
>
>
> error: No implicit for Append.Value[Seq[sbt.ModuleID],
> sbt.impl.GroupArtifactID] found,
> so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
> libraryDependencies += "com.databricks" % "apps.twitter_classifier"
> ^
> [error] Type error in expression
> Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Missing version element, e.g.
libraryDependencies += "com.databricks" % "apps.twitter_classifier" % "VERSION_HERE"
Jacek
Re: Basic question on using one's own classes in the Scala app
Posted by Jacek Laskowski <ja...@japila.pl>.
On Sun, Jun 5, 2016 at 9:01 PM, Ashok Kumar
<as...@yahoo.com.invalid> wrote:
> Now I have added this
>
> libraryDependencies += "com.databricks" % "apps.twitter_classifier"
>
> However, I am getting an error
>
>
> error: No implicit for Append.Value[Seq[sbt.ModuleID],
> sbt.impl.GroupArtifactID] found,
> so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
> libraryDependencies += "com.databricks" % "apps.twitter_classifier"
> ^
> [error] Type error in expression
> Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Missing version element, e.g.
libraryDependencies += "com.databricks" % "apps.twitter_classifier" % "VERSION_HERE"
Jacek
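A general sbt note that may also help here (not specific to this thread): % and %% behave differently. %% appends the project's Scala binary version to the artifact name, while % uses the artifact name exactly as written, and in either case the coordinate must exist in a repository the build can actually see:

  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"   // with scalaVersion 2.10.x this fetches spark-core_2.10
  libraryDependencies += "groupID" % "artifactID" % "revision"          // artifact name used as-is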
Re: Basic question on using one's own classes in the Scala app
Posted by Ashok Kumar <as...@yahoo.com.INVALID>.
Hello, for #1 I read the doc as:
libraryDependencies += groupID % artifactID % revision
jar tvf utilities-assembly-0.1-SNAPSHOT.jar|grep CheckpointDirectory
com/databricks/apps/twitter_classifier/getCheckpointDirectory.class
getCheckpointDirectory.class
Now I have added this
libraryDependencies += "com.databricks" % "apps.twitter_classifier"
However, I am getting an error
error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "com.databricks" % "apps.twitter_classifier"
^
[error] Type error in expression
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
Any ideas would be very appreciated.
Thanking you
On Sunday, 5 June 2016, 17:39, Ted Yu <yu...@gmail.com> wrote:
For #1, please find examples on the net, e.g.
http://www.scala-sbt.org/0.13/docs/Scala-Files-Example.html
For #2,
import <package-name>.getCheckpointDirectory
Cheers
On Sun, Jun 5, 2016 at 8:36 AM, Ashok Kumar <as...@yahoo.com> wrote:
Thank you sir.
At compile time can I do something similar to
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
I have these
name := "scala"
version := "1.0"
scalaVersion := "2.10.4"
And if I look at the jar file I have:
jar tvf utilities-assembly-0.1-SNAPSHOT.jar|grep Check
  1180 Sun Jun 05 10:14:36 BST 2016 com/databricks/apps/twitter_classifier/getCheckpointDirectory.class
  1043 Sun Jun 05 10:14:36 BST 2016 getCheckpointDirectory.class
  1216 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask$class.class
   615 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask.class
Two questions please:
What do I need to put in the libraryDependencies line?
And what do I need to add to the top of the Scala app, like:
import java.io.File
import org.apache.log4j.Logger
import org.apache.log4j.Level
import ?
Thanks
On Sunday, 5 June 2016, 15:21, Ted Yu <yu...@gmail.com> wrote:
At compilation time, you need to declare the dependency on getCheckpointDirectory.
At runtime, you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to pass the jar.
Cheers
On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <as...@yahoo.com.invalid> wrote:
Hi all,
Appreciate any advice on this. It is about Scala.
I have created a very basic Utilities.scala that contains a test class and method. I intend to add my own classes and methods as I expand, and to reference these classes and methods in my other apps.
class getCheckpointDirectory {
  def CheckpointDirectory (ProgramName: String) : String = {
    var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
    return hdfsDir
  }
}
I have used sbt to create a jar file for it. It is created as a jar file
utilities-assembly-0.1-SNAPSHOT.jar
Now I want to make a call to that method CheckpointDirectory in my app code myapp.scala to return the value for hdfsDir
val ProgramName = this.getClass.getSimpleName.trim
val getCheckpointDirectory = new getCheckpointDirectory
val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
However, I am getting a compilation error as expected
not found: type getCheckpointDirectory
[error] val getCheckpointDirectory = new getCheckpointDirectory
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
So a basic question: in order for compilation to work, do I need to create a package for my jar file, or add a dependency like the following in sbt?
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
Any advice will be appreciated.
Thanks
Re: Basic question on using one's own classes in the Scala app
Posted by Ted Yu <yu...@gmail.com>.
For #1, please find examples on the net
e.g.
http://www.scala-sbt.org/0.13/docs/Scala-Files-Example.html
For #2,
import <package-name>.getCheckpointDirectory
Cheers
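Concretely, given the jar listing shown elsewhere in this thread (the class sits under com/databricks/apps/twitter_classifier), the import and call might look like this -- a sketch that assumes nothing else about the app:

  import com.databricks.apps.twitter_classifier.getCheckpointDirectory

  val ProgramName = this.getClass.getSimpleName.trim
  val hdfsDir = new getCheckpointDirectory().CheckpointDirectory(ProgramName)

The jar itself still has to be visible to sbt at compile time and passed along at runtime (for example via --jars), as discussed in the other replies.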
On Sun, Jun 5, 2016 at 8:36 AM, Ashok Kumar <as...@yahoo.com> wrote:
> Thank you sir.
>
> At compile time can I do something similar to
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
>
> I have these
>
> name := "scala"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> And if I look at jar file i have
>
>
> jar tvf utilities-assembly-0.1-SNAPSHOT.jar|grep Check
> 1180 Sun Jun 05 10:14:36 BST 2016
> com/databricks/apps/twitter_classifier/getCheckpointDirectory.class
> 1043 Sun Jun 05 10:14:36 BST 2016 getCheckpointDirectory.class
> 1216 Fri Sep 18 09:12:40 BST 2015
> scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask$class.class
> 615 Fri Sep 18 09:12:40 BST 2015
> scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask.class
>
> two questions please
>
> What do I need to put in libraryDependencies line
>
> and what do I need to add to the top of scala app like
>
> import java.io.File
> import org.apache.log4j.Logger
> import org.apache.log4j.Level
> import ?
>
> Thanks
>
>
>
>
>
> On Sunday, 5 June 2016, 15:21, Ted Yu <yu...@gmail.com> wrote:
>
>
> At compilation time, you need to declare the dependence
> on getCheckpointDirectory.
>
> At runtime, you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to
> pass the jar.
>
> Cheers
>
> On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <as...@yahoo.com.invalid>
> wrote:
>
> Hi all,
>
> Appreciate any advice on this. It is about scala
>
> I have created a very basic Utilities.scala that contains a test class and
> method. I intend to add my own classes and methods as I expand and make
> references to these classes and methods in my other apps
>
> class getCheckpointDirectory {
> def CheckpointDirectory (ProgramName: String) : String = {
> var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
> return hdfsDir
> }
> }
> I have used sbt to create a jar file for it. It is created as a jar file
>
> utilities-assembly-0.1-SNAPSHOT.jar
>
> Now I want to make a call to that method CheckpointDirectory in my app
> code myapp.dcala to return the value for hdfsDir
>
> val ProgramName = this.getClass.getSimpleName.trim
> val getCheckpointDirectory = new getCheckpointDirectory
> val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
>
> However, I am getting a compilation error as expected
>
> not found: type getCheckpointDirectory
> [error] val getCheckpointDirectory = new getCheckpointDirectory
> [error] ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
>
> So a basic question, in order for compilation to work do I need to create
> a package for my jar file or add dependency like the following I do in sbt
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
>
>
> Any advise will be appreciated.
>
> Thanks
Re: Basic question on using one's own classes in the Scala app
Posted by Ashok Kumar <as...@yahoo.com.INVALID>.
Thank you sir.
At compile time can I do something similar to
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
I have these
name := "scala"
version := "1.0"
scalaVersion := "2.10.4"
And if I look at the jar file I have:
jar tvf utilities-assembly-0.1-SNAPSHOT.jar|grep Check
  1180 Sun Jun 05 10:14:36 BST 2016 com/databricks/apps/twitter_classifier/getCheckpointDirectory.class
  1043 Sun Jun 05 10:14:36 BST 2016 getCheckpointDirectory.class
  1216 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask$class.class
   615 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask.class
Two questions please:
What do I need to put in the libraryDependencies line?
And what do I need to add to the top of the Scala app, like:
import java.io.File
import org.apache.log4j.Logger
import org.apache.log4j.Level
import ?
Thanks
On Sunday, 5 June 2016, 15:21, Ted Yu <yu...@gmail.com> wrote:
At compilation time, you need to declare the dependency on getCheckpointDirectory.
At runtime, you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to pass the jar.
Cheers
On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <as...@yahoo.com.invalid> wrote:
Hi all,
Appreciate any advice on this. It is about Scala.
I have created a very basic Utilities.scala that contains a test class and method. I intend to add my own classes and methods as I expand, and to reference these classes and methods in my other apps.
class getCheckpointDirectory {
  def CheckpointDirectory (ProgramName: String) : String = {
    var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
    return hdfsDir
  }
}
I have used sbt to create a jar file for it. It is created as a jar file
utilities-assembly-0.1-SNAPSHOT.jar
Now I want to make a call to that method CheckpointDirectory in my app code myapp.scala to return the value for hdfsDir
val ProgramName = this.getClass.getSimpleName.trim
val getCheckpointDirectory = new getCheckpointDirectory
val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
However, I am getting a compilation error as expected
not found: type getCheckpointDirectory
[error] val getCheckpointDirectory = new getCheckpointDirectory
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
So a basic question: in order for compilation to work, do I need to create a package for my jar file, or add a dependency like the following in sbt?
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
Any advice will be appreciated.
Thanks
Re: Basic question on using one's own classes in the Scala app
Posted by Ted Yu <yu...@gmail.com>.
At compilation time, you need to declare the dependency on getCheckpointDirectory.
At runtime, you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to
pass the jar.
Cheers
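To make the runtime part concrete, the spark-submit call would take roughly this shape, using the same placeholder style as elsewhere in the thread (everything in angle brackets is for you to fill in):

  spark-submit \
    --class <your main class> \
    --master <your master URL> \
    --jars utilities-assembly-0.1-SNAPSHOT.jar \
    <path to your application jar>

Note that --jars only covers the runtime classpath; the compile-time dependency still has to be declared to sbt as described above.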
On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <as...@yahoo.com.invalid>
wrote:
> Hi all,
>
> Appreciate any advice on this. It is about scala
>
> I have created a very basic Utilities.scala that contains a test class and
> method. I intend to add my own classes and methods as I expand and make
> references to these classes and methods in my other apps
>
> class getCheckpointDirectory {
> def CheckpointDirectory (ProgramName: String) : String = {
> var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
> return hdfsDir
> }
> }
> I have used sbt to create a jar file for it. It is created as a jar file
>
> utilities-assembly-0.1-SNAPSHOT.jar
>
> Now I want to make a call to that method CheckpointDirectory in my app
> code myapp.dcala to return the value for hdfsDir
>
> val ProgramName = this.getClass.getSimpleName.trim
> val getCheckpointDirectory = new getCheckpointDirectory
> val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
>
> However, I am getting a compilation error as expected
>
> not found: type getCheckpointDirectory
> [error] val getCheckpointDirectory = new getCheckpointDirectory
> [error] ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
>
> So a basic question, in order for compilation to work do I need to create
> a package for my jar file or add dependency like the following I do in sbt
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
>
>
> Any advise will be appreciated.
>
> Thanks
Re: Fw: Basic question on using one's own classes in the Scala app
Posted by Marco Mistroni <mm...@gmail.com>.
Hi Ashok,
This is not really a Spark-related question, so I would not use this mailing list.
Anyway, my 2 cents here
As outlined in earlier replies, if the class you are referencing is in a different jar, then at compile time you will need to add that dependency to your build.sbt.
I'd personally leave $CLASSPATH alone.
AT RUN TIME, you have two options:
1 - as suggested by Ted, when you launch your app via spark-submit you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to pass the jar.
2 - Use the sbt assembly plugin to package your classes and jars into a 'fat jar', and then at runtime all you need to do is:
spark-submit --class <your spark app class name> <path to your fat jar>
I'd personally go for 1 as it is the easiest option. (FYI, for 2 you might encounter situations where you have dependencies referring to the same classes, and that will require you to define an assemblyMergeStrategy; a sketch follows below.)
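For completeness, a minimal sketch of what that might look like with the sbt-assembly plugin -- the plugin version and the merge rules are only examples, since the right rules depend on which jars actually clash:

  // project/plugins.sbt
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

  // build.sbt -- discard duplicate META-INF entries, fall back to the default strategy otherwise
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case other =>
      val oldStrategy = (assemblyMergeStrategy in assembly).value
      oldStrategy(other)
  }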
hth
On Mon, Jun 6, 2016 at 8:52 AM, Ashok Kumar <as...@yahoo.com.invalid>
wrote:
> Anyone can help me with this please
>
>
> On Sunday, 5 June 2016, 11:06, Ashok Kumar <as...@yahoo.com> wrote:
>
>
> Hi all,
>
> Appreciate any advice on this. It is about scala
>
> I have created a very basic Utilities.scala that contains a test class and
> method. I intend to add my own classes and methods as I expand and make
> references to these classes and methods in my other apps
>
> class getCheckpointDirectory {
> def CheckpointDirectory (ProgramName: String) : String = {
> var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
> return hdfsDir
> }
> }
> I have used sbt to create a jar file for it. It is created as a jar file
>
> utilities-assembly-0.1-SNAPSHOT.jar
>
> Now I want to make a call to that method CheckpointDirectory in my app
> code myapp.dcala to return the value for hdfsDir
>
> val ProgramName = this.getClass.getSimpleName.trim
> val getCheckpointDirectory = new getCheckpointDirectory
> val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
>
> However, I am getting a compilation error as expected
>
> not found: type getCheckpointDirectory
> [error] val getCheckpointDirectory = new getCheckpointDirectory
> [error] ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
>
> So a basic question, in order for compilation to work do I need to create
> a package for my jar file or add dependency like the following I do in sbt
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
>
>
> Or add the jar file to $CLASSPATH?
>
> Any advise will be appreciated.
>
> Thanks
Fw: Basic question on using one's own classes in the Scala app
Posted by Ashok Kumar <as...@yahoo.com.INVALID>.
Can anyone help me with this please?
On Sunday, 5 June 2016, 11:06, Ashok Kumar <as...@yahoo.com> wrote:
Hi all,
Appreciate any advice on this. It is about Scala.
I have created a very basic Utilities.scala that contains a test class and method. I intend to add my own classes and methods as I expand, and to reference these classes and methods in my other apps.
class getCheckpointDirectory {
  def CheckpointDirectory (ProgramName: String) : String = {
    var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
    return hdfsDir
  }
}
I have used sbt to create a jar file for it. It is created as a jar file
utilities-assembly-0.1-SNAPSHOT.jar
Now I want to make a call to that method CheckpointDirectory in my app code myapp.scala to return the value for hdfsDir
val ProgramName = this.getClass.getSimpleName.trim
val getCheckpointDirectory = new getCheckpointDirectory
val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
However, I am getting a compilation error as expected
not found: type getCheckpointDirectory
[error] val getCheckpointDirectory = new getCheckpointDirectory
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
So a basic question: in order for compilation to work, do I need to create a package for my jar file, or add a dependency like the following in sbt?
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
Or add the jar file to $CLASSPATH?
Any advice will be appreciated.
Thanks