Posted to user@spark.apache.org by ankit tyagi <an...@gmail.com> on 2015/07/29 14:20:45 UTC

Exception while submitting a Spark job through the YARN client

Hi,

I am using the YARN Spark client code below to submit a Spark job from a remote JVM:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;

@Override
public ApplicationId submitQuery(String requestId, String query, String fileLocations) {
    String driverJar = getDriverJar();
    String driverClass = propertyService.getAppPropertyValue(TypeString.QUERY_DRIVER_CLASS);
    String driverAppName = propertyService.getAppPropertyValue(TypeString.DRIVER_APP_NAME);

    String[] args = new String[] {
        // the name of your application
        "--name",
        driverAppName,

        // memory for driver (optional)
        "--driver-memory",
        "1000M",

        // path to your application's JAR file
        // required in yarn-cluster mode
        "--jar",
        "local:/home/ankit/Repository/Personalization/rtis/Cust360QueryDriver/target/SnapdealCustomer360QueryDriver.jar",

        // extra jars to be added to the classpath
        "--addJars",
        "local:/home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar",

        // name of your application's main class (required)
        "--class",
        driverClass,

        // arguments passed to the application's main class
        "--arg", requestId,
        "--arg", query,
        "--arg", fileLocations,
        "--arg", "yarn-client"
    };

    System.setProperty("HADOOP_CONF_DIR", "/home/hduser/hadoop-2.7.0/etc/hadoop");
    System.setProperty("SPARK_YARN_MODE", "true");

    Configuration config = new Configuration();
    SparkConf sparkConf = new SparkConf();
    ClientArguments cArgs = new ClientArguments(args, sparkConf);

    // create an instance of the YARN Client
    Client client = new Client(cArgs, config, sparkConf);

    return client.submitApplication();
}


The job is submitted to the YARN cluster, but the following exception is thrown while the Spark job runs:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 13 more
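The NoClassDefFoundError means org.apache.spark.Logging is not visible to the classloader of the JVM running the driver (in yarn-client mode, the driver runs in the submitting JVM itself). A minimal sketch for checking whether a class is loadable from a JVM's classpath; the class `ClassPresence` and its helper are hypothetical names for illustration:

```java
public class ClassPresence {
    // Returns true if the named class can be found on the current classpath
    // (loaded without running its static initializers).
    static boolean isLoadable(String className) {
        try {
            Class.forName(className, false, ClassPresence.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // In the failing setup this would print false for the driver JVM,
        // which matches the NoClassDefFoundError above.
        System.out.println("org.apache.spark.Logging loadable: "
                + isLoadable("org.apache.spark.Logging"));
    }
}
```

Running this from the same JVM (or with the same classpath) that calls submitQuery would confirm whether the assembly jar is actually on the driver's classpath.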


The class in question is present in /home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar, so it looks like the jar passed via --addJars is not being added to the driver's Spark context.
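To rule out a bad jar file, the jar's entries can be listed directly. A minimal sketch (the jar path and class name are taken from the post above; the class `JarCheck` is a hypothetical name for illustration):

```java
import java.io.File;
import java.util.jar.JarFile;

public class JarCheck {
    // Returns true if the jar at jarPath contains an entry for the given class.
    static boolean containsClass(String jarPath, String className) throws Exception {
        String entry = className.replace('.', '/') + ".class";
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getJarEntry(entry) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        String jarPath = "/home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar";
        if (new File(jarPath).isFile()) {
            System.out.println("contains org.apache.spark.Logging: "
                    + containsClass(jarPath, "org.apache.spark.Logging"));
        } else {
            System.out.println("jar not found: " + jarPath);
        }
    }
}
```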

Am I doing something wrong? Any help would be appreciated.

Regards,
Ankit Tyagi