Posted to user@spark.apache.org by YaoPau <jo...@gmail.com> on 2015/10/19 22:08:37 UTC

Spark SQL Exception: Conf non-local session path expected to be non-null

I've connected Spark SQL to the Hive Metastore, and currently I'm running SQL
code via pyspark.  Typically everything works fine, but sometimes, after a
long-running Spark SQL job, I get the error below, and from then on I can no
longer run Spark SQL commands.  I still have both my sc and my sqlCtx.

Any idea what this could mean?

An error occurred while calling o36.sql.
: org.apache.spark.sql.AnalysisException: Conf non-local session path expected to be non-null;
    at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:260)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
    at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
    at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
    at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
    at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:139)
    at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:139)
    at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
    at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
    at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
    at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
    at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
    at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
    at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
    at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:235)
    at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:92)
    at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:92)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:92)
    at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Exception-Conf-non-local-session-path-expected-to-be-non-null-tp25127.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: Spark SQL Exception: Conf non-local session path expected to be non-null

Posted by Davies Liu <da...@databricks.com>.
Thread-local state does not work well with PySpark, because the JVM
thread that serves PySpark calls can change over time, so the Hive
SessionState can be lost.

This should be fixed in master by https://github.com/apache/spark/pull/8909
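
Until that lands in a release, here is a minimal workaround sketch from the
PySpark side (untested; the retry helper below is illustrative and not part
of the patch): rebuild the HiveContext from the still-live SparkContext,
which should attach a fresh Hive SessionState.

    # Illustrative workaround, not part of the actual fix: on the
    # session-path error, rebuild the HiveContext so a new SessionState
    # is created, then retry the query once.
    from pyspark.sql import HiveContext

    def sql_with_session_retry(sc, sqlCtx, query):
        try:
            return sqlCtx.sql(query), sqlCtx
        except Exception as e:
            if "session path expected to be non-null" not in str(e):
                raise
            sqlCtx = HiveContext(sc)  # fresh context -> fresh session
            return sqlCtx.sql(query), sqlCtx

    # Usage: keep the (possibly new) context for subsequent queries.
    # df, sqlCtx = sql_with_session_retry(sc, sqlCtx, "SELECT 1")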


On Mon, Oct 19, 2015 at 1:08 PM, YaoPau <jo...@gmail.com> wrote:
> I've connected Spark SQL to the Hive Metastore and currently I'm running SQL
> code via pyspark.  Typically everything works fine, but sometimes after a
> long-running Spark SQL job I get the error below, and from then on I can no
> longer run Spark SQL commands.  I still do have both my sc and my sqlCtx.
>
> Any idea what this could mean?
>
> [original stack trace and footer trimmed]



Re: Spark SQL Exception: Conf non-local session path expected to be non-null

Posted by Deenar Toraskar <de...@gmail.com>.
This seems to be set using hive.exec.scratchdir; is that set?

    hdfsSessionPath = new Path(hdfsScratchDirURIString, sessionId);
    createPath(conf, hdfsSessionPath, scratchDirPermission, false, true);
    conf.set(HDFS_SESSION_PATH_KEY, hdfsSessionPath.toUri().toString());
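
You can check and, if needed, set it from PySpark. The path below is a
placeholder, and since the scratch dir is read when the Hive session starts,
hive-site.xml on the driver classpath is the more reliable place; whether a
runtime setConf is honored is an assumption here.

    # Placeholder path; a runtime setConf may or may not be picked up,
    # depending on whether the Hive session has already been created.
    print(sqlCtx.getConf("hive.exec.scratchdir", "NOT SET"))
    sqlCtx.setConf("hive.exec.scratchdir", "/tmp/hive")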


On 20 October 2015 at 00:20, Ted Yu <yu...@gmail.com> wrote:

> [quoted text trimmed; Ted's reply appears in full below]

Re: Spark SQL Exception: Conf non-local session path expected to be non-null

Posted by Ted Yu <yu...@gmail.com>.
A brief search led me to ql/src/java/org/apache/hadoop/hive/ql/session/SessionState.java:

  private static final String HDFS_SESSION_PATH_KEY = "_hive.hdfs.session.path";
...
  public static Path getHDFSSessionPath(Configuration conf) {
    SessionState ss = SessionState.get();
    if (ss == null) {
      String sessionPathString = conf.get(HDFS_SESSION_PATH_KEY);
      Preconditions.checkNotNull(sessionPathString,
          "Conf non-local session path expected to be non-null");
      return new Path(sessionPathString);
    }
    Preconditions.checkNotNull(ss.hdfsSessionPath,
        "Non-local session path expected to be non-null");
    return ss.hdfsSessionPath;
  }
FYI
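
To see which branch you are hitting, here is a hypothetical probe through
py4j. It relies on a private attribute (sc._jvm), needs the Hive classes
visible to the driver JVM, and the gateway may serve the call on a
different thread than your queries, which is exactly the thread-local issue.

    # Hypothetical diagnostic: ask Hive for the SessionState bound to the
    # JVM thread handling this py4j call. A null result is consistent
    # with the thread-local SessionState having been lost.
    SessionState = sc._jvm.org.apache.hadoop.hive.ql.session.SessionState
    print("SessionState on gateway thread:", SessionState.get())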

On Mon, Oct 19, 2015 at 1:08 PM, YaoPau <jo...@gmail.com> wrote:

> [quoted original message trimmed]

Re: Spark SQL Exception: Conf non-local session path expected to be non-null

Posted by Jon Gregg <jo...@gmail.com>.
1.3 on CDH 5.4.4 ... I'll take the responses to mean that the fix will
probably be a few months away for us.  Not a huge problem, but something
I've run into a number of times.

On Tue, Oct 20, 2015 at 3:01 PM, Yin Huai <yh...@databricks.com> wrote:

> btw, what version of Spark did you use?
>
> [quoted original message trimmed]

Re: Spark SQL Exception: Conf non-local session path expected to be non-null

Posted by Yin Huai <yh...@databricks.com>.
btw, what version of Spark did you use?

On Mon, Oct 19, 2015 at 1:08 PM, YaoPau <jo...@gmail.com> wrote:

> [quoted original message trimmed]