Posted to issues@spark.apache.org by "Sivakumar (Jira)" <ji...@apache.org> on 2020/01/17 12:41:00 UTC

[jira] [Issue Comment Deleted] (SPARK-30542) Two Spark structured streaming jobs cannot write to same base path

     [ https://issues.apache.org/jira/browse/SPARK-30542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sivakumar updated SPARK-30542:
------------------------------
    Comment: was deleted

(was: Hi Jungtaek,

I thought this might be a feature that should be added to Structured Streaming. Also, please let me know if you have any workaround for this.)

> Two Spark structured streaming jobs cannot write to same base path
> ------------------------------------------------------------------
>
>                 Key: SPARK-30542
>                 URL: https://issues.apache.org/jira/browse/SPARK-30542
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.3.0
>            Reporter: Sivakumar
>            Priority: Major
>
> Hi All,
> Spark Structured Streaming doesn't allow two structured streaming jobs to write data to the same base directory, which was possible with DStreams.
> Because a _spark_metadata directory is created by default for the first job, the second job cannot use the same directory as its base path; the _spark_metadata directory already created by the other job causes it to throw an exception.
> Is there any workaround for this, other than creating separate base paths for the two jobs? (One such approach is sketched after this message.)
> Is it possible to create the _spark_metadata directory elsewhere, or to disable it without any data loss?
> If I had to change the base path for both jobs, my whole framework would be impacted, so I don't want to do that.
>  
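
A common workaround, sketched here as an illustration rather than anything confirmed on this ticket, is to give each streaming query its own subdirectory under the shared base path: each sink then maintains its own _spark_metadata log, so the jobs no longer collide. In the Scala sketch below, the paths, the Kafka source options, and the trigger interval are all hypothetical placeholders (the Kafka source also needs the spark-sql-kafka package on the classpath):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    val spark = SparkSession.builder().appName("job-a").getOrCreate()

    // Illustrative source; any streaming source behaves the same way here.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
      .option("subscribe", "events")                    // hypothetical topic
      .load()

    // Job A writes to /data/output/job_a; a second application would write
    // to /data/output/job_b. Each sink directory gets its own _spark_metadata
    // folder, so the two queries never touch the same metadata log.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/output/job_a")        // per-job subdirectory
      .option("checkpointLocation", "/chk/job_a")  // must also be per-job
      .trigger(Trigger.ProcessingTime("1 minute"))
      .start()

    query.awaitTermination()

Downstream batch jobs can still treat the base path as one dataset by globbing over it, e.g. spark.read.parquet("/data/output/*"). The caveat: a glob read bypasses each sink's _spark_metadata log, so it may pick up files from in-progress micro-batches; whether that is acceptable depends on the pipeline's consistency needs.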



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org