Posted to issues@spark.apache.org by "antonkulaga (JIRA)" <ji...@apache.org> on 2017/07/25 17:39:01 UTC

[jira] [Updated] (SPARK-21531) CLONE - Spark build encounters "File name too long" on some encrypted filesystems

     [ https://issues.apache.org/jira/browse/SPARK-21531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

antonkulaga updated SPARK-21531:
--------------------------------
    Priority: Major  (was: Minor)

> CLONE - Spark build encounters "File name too long" on some encrypted filesystems
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-21531
>                 URL: https://issues.apache.org/jira/browse/SPARK-21531
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: antonkulaga
>            Assignee: Theodore Vasiloudis
>             Fix For: 1.4.0
>
>
> This was reported by Luchesar Cekov on GitHub along with a proposed fix. The fix has some potential downstream issues (it will modify the class names), so until we understand better how many users are affected we aren't going to merge it. However, I'd like to include the issue and workaround here. If you encounter this issue, please comment on the JIRA so we can assess the frequency.
> The issue produces this error:
> {code}
> [error] == Expanded type of tree ==
> [error] 
> [error] ConstantType(value = Constant(Throwable))
> [error] 
> [error] uncaught exception during compilation: java.io.IOException
> [error] File name too long
> [error] two errors found
> {code}
> The workaround in Maven is to add, under the compile options:
> {code}
>               <arg>-Xmax-classfile-name</arg>
>               <arg>128</arg>
> {code}
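> For reference, a minimal sketch of where these arguments live in the pom.xml, assuming the scala-maven-plugin is the Scala compiler plugin in use (the surrounding plugin declaration is illustrative, not copied from Spark's actual pom.xml):
> {code}
> <plugin>
>   <groupId>net.alchim31.maven</groupId>
>   <artifactId>scala-maven-plugin</artifactId>
>   <configuration>
>     <args>
>       <!-- Cap generated class file names at 128 characters so they fit on
>            filesystems with short file name limits (e.g. some encrypted filesystems) -->
>       <arg>-Xmax-classfile-name</arg>
>       <arg>128</arg>
>     </args>
>   </configuration>
> </plugin>
> {code}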
> In SBT add:
> {code}
>     scalacOptions in Compile ++= Seq("-Xmax-classfile-name", "128"),
> {code}
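> On newer sbt versions the slash syntax is equivalent; a minimal standalone build.sbt sketch (sbt 1.x syntax is an assumption, not part of the original workaround):
> {code}
> // Limit generated class file names to 128 characters so nested closures and
> // generated classes do not produce file names longer than the filesystem allows.
> Compile / scalacOptions ++= Seq("-Xmax-classfile-name", "128")
> {code}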



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
