Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/10/26 23:15:00 UTC

[jira] [Resolved] (SPARK-28092) Spark cannot load files with COLON(:) char if not specified full path

     [ https://issues.apache.org/jira/browse/SPARK-28092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-28092.
----------------------------------
    Resolution: Invalid

You'd have to provide the error here. Elsewhere, when this is reported, it's because people are providing file paths that can be interpreted as URIs with a scheme, and that usage is incorrect.
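
For context, a minimal sketch (not part of the original thread) of why such paths break: when Hadoop expands a glob it builds a child Path from the bare file name, and Path treats everything before the first ':' in a name with no preceding '/' as a URI scheme. The snippet below reproduces the parse failure through PySpark's private JVM gateway (spark._jvm), so treat it as illustrative only:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hadoop's Path parses everything before the first ':' (when it precedes
# any '/') as a URI scheme, so a bare name containing a colon becomes an
# "absolute URI" with a relative path and is rejected:
jvm = spark._jvm  # private API, used here only to demonstrate the parse
jvm.org.apache.hadoop.fs.Path("myfile_2019:04:05.csv")
# raises Py4JJavaError wrapping java.lang.IllegalArgumentException:
#   java.net.URISyntaxException: Relative path in absolute URI:
#   myfile_2019:04:05.csv

# A fully qualified URI parses fine, because the colon in the file name
# now comes after a '/' and is not taken as a scheme separator:
jvm.org.apache.hadoop.fs.Path("s3a://bucket/prefix/myfile_2019:04:05.csv")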

> Spark cannot load files with COLON(:) char if not specified full path
> ---------------------------------------------------------------------
>
>                 Key: SPARK-28092
>                 URL: https://issues.apache.org/jira/browse/SPARK-28092
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.3
>         Environment: Cloudera 6.2
> Spark latest parcel (I think 2.4.3)
>            Reporter: Ladislav Jech
>            Priority: Major
>
> Scenario:
> I have CSV files in an S3 bucket, like this:
> s3a://bucket/prefix/myfile_2019:04:05.csv
> s3a://bucket/prefix/myfile_2019:04:06.csv
> Now when I try to load the files with something like:
> df = spark.read.load("s3a://bucket/prefix/*", format="csv", sep=":", inferSchema="true", header="true")
>  
> It fails with an error about the URI (sorry, I don't have the exact exception here), but when I list all the files from S3 and provide the paths as an array:
> df = spark.read.load(path=["s3a://bucket/prefix/myfile_2019:04:05.csv","s3a://bucket/prefix/myfile_2019:04:06.csv"], format="csv", sep=":", inferSchema="true", header="true")
>  
> It works. As per my observations, the cause is the colon character in the file names.
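
For reference, a sketch of the workaround the report describes: enumerate the objects yourself and pass fully qualified s3a:// URIs, so the bare file names are never re-parsed as relative paths. The boto3 listing and the bucket/prefix names are illustrative assumptions, not from the report:

import boto3
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Build fully qualified URIs for every object under the prefix
# (first page only here; use a paginator for more than 1000 keys).
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="bucket", Prefix="prefix/")
paths = ["s3a://bucket/" + obj["Key"] for obj in resp.get("Contents", [])]

# With complete s3a:// paths, the colon in each file name follows a '/',
# so Hadoop does not mistake it for a URI scheme separator:
df = spark.read.load(path=paths, format="csv", sep=":",
                     inferSchema="true", header="true")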



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org