Posted to issues@flink.apache.org by "Stephan Ewen (JIRA)" <ji...@apache.org> on 2018/02/21 14:08:00 UTC

[jira] [Commented] (FLINK-8720) Logging exception with S3 connector and BucketingSink

    [ https://issues.apache.org/jira/browse/FLINK-8720?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16371442#comment-16371442 ] 

Stephan Ewen commented on FLINK-8720:
-------------------------------------

That looks like a classloading / class duplication issue. I assume that happens because you (accidentally) have Hadoop packaged into your application fat jar.

You should be able to resolve this in any of these ways (see the sketch below the list):
  - Remove Hadoop as a dependency from your application jar
  - Upgrade to Flink 1.4.1, which should tolerate this
  - Set classloading to "parent-first"
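
For illustration, a minimal sketch of the first and third options (the Hadoop artifact and version placeholder are just examples; adjust them to whatever your build actually pulls in):

    <!-- pom.xml: mark Hadoop as provided so it stays out of the application fat jar -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
        <scope>provided</scope>
    </dependency>

    # conf/flink-conf.yaml: switch from the default child-first resolution to parent-first
    classloader.resolve-order: parent-first

With parent-first classloading, Hadoop and commons-logging classes are loaded from Flink's own classpath instead of the user jar, which avoids the duplicate-class conflict shown in the stack trace.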

> Logging exception with S3 connector and BucketingSink
> -----------------------------------------------------
>
>                 Key: FLINK-8720
>                 URL: https://issues.apache.org/jira/browse/FLINK-8720
>             Project: Flink
>          Issue Type: Bug
>          Components: Streaming Connectors
>    Affects Versions: 1.4.1
>            Reporter: dejan miljkovic
>            Priority: Critical
>
> Trying to stream data to S3. The code works from IntelliJ. When submitting the job through the UI on my machine (a single-node cluster started by the start-cluster.sh script), the stack trace below is produced.
>  
> Below is a link to a simple test app that streams data to S3: [https://github.com/dmiljkovic/test-flink-bucketingsink-s3]
> The behavior is a bit different, but the same error is produced. The job works only the first time it is submitted; if it is submitted a second time, the stack trace below is produced. If I restart the cluster, the job works again, but only the first time.
>  
>  
> org.apache.commons.logging.LogConfigurationException: java.lang.IllegalAccessError: org/apache/commons/logging/impl/LogFactoryImpl$3 (Caused by java.lang.IllegalAccessError: org/apache/commons/logging/impl/LogFactoryImpl$3)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:637)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:336)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:310)
> 	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:685)
> 	at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:76)
> 	at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:102)
> 	at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:88)
> 	at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:96)
> 	at com.amazonaws.http.ConnectionManagerFactory.createPoolingClientConnManager(ConnectionManagerFactory.java:26)
> 	at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:96)
> 	at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:158)
> 	at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:119)
> 	at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:389)
> 	at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:371)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:235)
> 	at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.createHadoopFileSystem(BucketingSink.java:1206)
> 	at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initFileSystem(BucketingSink.java:411)
> 	at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initializeState(BucketingSink.java:355)
> 	at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.tryRestoreFunction(StreamingFunctionUtils.java:178)
> 	at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.restoreFunctionState(StreamingFunctionUtils.java:160)
> 	at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.initializeState(AbstractUdfStreamOperator.java:96)
> 	at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:258)
> 	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
> 	at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
> 	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
> 	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.IllegalAccessError: org/apache/commons/logging/impl/LogFactoryImpl$3
> 	at org.apache.commons.logging.impl.LogFactoryImpl.getParentClassLoader(LogFactoryImpl.java:700)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1187)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:914)
> 	at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:604)
> 	... 26 more
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)