Posted to user@spark.apache.org by Laurent Legrand <ll...@skapane.com> on 2016/10/05 12:12:08 UTC

pyspark: sqlContext.read.text() does not work with a list of paths

Hello,

When I try to load multiple text files with the sqlContext, I get the
following error:

spark-2.0.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/sql/readwriter.py",
line 282, in text
UnboundLocalError: local variable 'path' referenced before assignment

According to the code
(https://github.com/apache/spark/blob/master/python/pyspark/sql/readwriter.py#L291),
the variable 'path' is not set if the argument is not a string.
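
For completeness, this minimal script reproduces it (assuming a running
Spark 2.0.0 installation; the two file names below are only placeholders):

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="repro")   # or reuse an existing SparkContext
    sqlContext = SQLContext(sc)

    # A single path string works as expected.
    df_one = sqlContext.read.text("data/part-0.txt")

    # A list of paths fails in Spark 2.0.0 with:
    #   UnboundLocalError: local variable 'path' referenced before assignment
    df_many = sqlContext.read.text(["data/part-0.txt", "data/part-1.txt"])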

Could you confirm it is a bug?

Regards,
Laurent




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-sqlContext-read-text-does-not-work-with-a-list-of-paths-tp27838.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: pyspark: sqlContext.read.text() does not work with a list of paths

Posted by Laurent Legrand <ll...@skapane.com>.
Hello,

I've just created the issue: 
https://issues.apache.org/jira/browse/SPARK-17805

For the PR, I can work on it tomorrow.
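
For reference, the shape of the problem and of one possible fix is roughly
the following (a simplified sketch of the pattern in readwriter.py, not the
exact source and not the final patch):

    # Buggy pattern: when 'paths' is a list, the isinstance() branch is
    # skipped, 'path' is never bound, and the later reference raises
    # UnboundLocalError.  ('basestring' as in the Python 2-era source.)
    def text(self, paths):
        if isinstance(paths, basestring):
            path = [paths]                 # only set for a plain string
        return self._df(self._jreader.text(
            self._spark._sc._jvm.PythonUtils.toSeq(path)))

    # One way to fix it: normalise the argument and pass the same name on.
    def text(self, paths):
        if isinstance(paths, basestring):
            paths = [paths]
        return self._df(self._jreader.text(
            self._spark._sc._jvm.PythonUtils.toSeq(paths)))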

Laurent


On 06/10/2016 at 09:29, Hyukjin Kwon wrote:
>
> This is obviously a bug; it was introduced by my PR, 
> https://github.com/apache/spark/commit/d37c7f7f042f7943b5b684e53cf4284c601fb347
>
> +1 for creating a JIRA and a PR. If doing that is a problem for you, I 
> would like to handle it quickly myself.
>
>
> On 5 Oct 2016 9:12 p.m., "Laurent Legrand" <llegrand@skapane.com> wrote:
>
>     Hello,
>
>     When I try to load multiple text files with the sqlContext, I get the
>     following error:
>
>     spark-2.0.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/sql/readwriter.py",
>     line 282, in text
>     UnboundLocalError: local variable 'path' referenced before assignment
>
>     According to the code
>     (https://github.com/apache/spark/blob/master/python/pyspark/sql/readwriter.py#L291),
>     the variable 'path' is not set if the argument is not a string.
>
>     Could you confirm it is a bug?
>
>     Regards,
>     Laurent
>
>
>
>
>     --
>     View this message in context:
>     http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-sqlContext-read-text-does-not-work-with-a-list-of-paths-tp27838.html
>     Sent from the Apache Spark User List mailing list archive at
>     Nabble.com.
>
>     ---------------------------------------------------------------------
>     To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>


Re: pyspark: sqlContext.read.text() does not work with a list of paths

Posted by Hyukjin Kwon <gu...@gmail.com>.
This is obviously a bug; it was introduced by my PR,
https://github.com/apache/spark/commit/d37c7f7f042f7943b5b684e53cf4284c601fb347

+1 for creating a JIRA and a PR. If doing that is a problem for you, I would
like to handle it quickly myself.


On 5 Oct 2016 9:12 p.m., "Laurent Legrand" <ll...@skapane.com> wrote:

> Hello,
>
> When I try to load multiple text files with the sqlContext, I get the
> following error:
>
> spark-2.0.0-bin-hadoop2.4/python/lib/pyspark.zip/pyspark/sql/readwriter.py",
> line 282, in text
> UnboundLocalError: local variable 'path' referenced before assignment
>
> According to the code
> (https://github.com/apache/spark/blob/master/python/pyspark/sql/readwriter.py#L291),
> the variable 'path' is not set if the argument is not a string.
>
> Could you confirm it is a bug?
>
> Regards,
> Laurent
>
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-sqlContext-read-text-does-not-work-with-a-list-of-paths-tp27838.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>