Posted to user@spark.apache.org by Sumona Routh <su...@gmail.com> on 2016/07/21 22:43:08 UTC

Upgrade from 1.2 to 1.6 - parsing flat files in working directory

Hi all,
We are running into a classpath issue when we upgrade our application from
1.2 to 1.6.

In 1.2, we load properties from a flat file (in the working directory of the
spark-submit script) using the classloader resource approach. This was
executed up front (by the driver) before any processing happened.

  val confStream =
    Thread.currentThread().getContextClassLoader.getResourceAsStream(appConfigPath)
  confProperties.load(confStream)

In 1.6, the getResourceAsStream call returns null, which causes a subsequent
NullPointerException when loading the properties.
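One workaround we are considering, as a sketch: fall back to reading the file
straight from the filesystem when the classpath lookup misses. (loadAppProperties
is just an illustrative helper name, not anything in our codebase.)

```scala
import java.io.{File, FileInputStream, InputStream}
import java.util.Properties

// Hypothetical helper: try the context classloader first, then fall back to
// reading the file directly (e.g. from the spark-submit working directory).
def loadAppProperties(appConfigPath: String): Properties = {
  val props = new Properties()
  val fromClasspath: InputStream =
    Thread.currentThread().getContextClassLoader.getResourceAsStream(appConfigPath)
  val stream: InputStream =
    if (fromClasspath != null) fromClasspath
    else new FileInputStream(new File(appConfigPath)) // filesystem fallback
  try props.load(stream) finally stream.close()
  props
}
```

This avoids the NPE either way, though it sidesteps rather than answers the
classpath question below.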

How do we pass flat files (there are many, so we really want to add a whole
directory to the classpath) in Spark 1.6? We haven't had much luck with
--files, --driver-class-path, or spark.driver.extraClassPath, and we couldn't
find much documentation on this.
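For reference, this is roughly the spark-submit shape we tried (paths and the
main class are placeholders): pointing --driver-class-path at the config
directory itself rather than at individual files, since classpath entries must
be directories or jars for resources inside them to resolve.

```shell
# Sketch only; /path/to/conf and com.example.Main are placeholders.
# Adding the directory (not the files) to the driver classpath should let
# getResourceAsStream("app.properties") find files inside it.
spark-submit \
  --driver-class-path /path/to/conf \
  --class com.example.Main \
  app.jar
```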

Thanks!
Sumona

Re: Upgrade from 1.2 to 1.6 - parsing flat files in working directory

Posted by Sumona Routh <su...@gmail.com>.
Can anyone provide some guidance on how to get files onto the classpath for
our Spark job? This worked in 1.2; however, after upgrading, we get null when
attempting to load resources.

Thanks,
Sumona
