Posted to user@flink.apache.org by Andrea Sella <an...@radicalbit.io> on 2016/03/14 15:18:51 UTC

Integration Alluxio and Flink

Hi to all,

I'm trying to integrate Alluxio and Apache Flink. I followed Running Flink
on Alluxio
<http://www.alluxio.org/documentation/en/Running-Flink-on-Alluxio.html> to
set up Flink.

I tested it in local mode by executing:

bin/flink run ./examples/batch/WordCount.jar --input
alluxio:///flink/README.txt
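
For completeness: the input file was loaded into Alluxio beforehand with the
Alluxio fs shell, roughly like this (commands from the Alluxio 1.0 shell,
paths as in the run command above):

bin/alluxio fs mkdir /flink
bin/alluxio fs copyFromLocal README.txt /flink/README.txt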

But I ran into a TimeoutException (logs attached). It seems the trouble is
due to a netty dependency conflict.

Thank you,
Andrea

Re: Integration Alluxio and Flink

Posted by Andrea Sella <an...@radicalbit.io>.
Hi Robert,

I forgot to mention that I built a fat jar of the job using `sbt assembly`,
so my job includes alluxio-core-client.
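
If the client must not be in the job jar, I guess the fix on my side is to
mark it as "provided" in build.sbt so that `sbt assembly` leaves it out of
the fat jar, and to drop the client jar into Flink's lib folder instead.
Roughly (untested sketch):

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  // keep the Alluxio client out of the assembly; at runtime it comes from Flink's lib folder
  "org.alluxio" % "alluxio-core-client" % "1.0.0" % "provided"
)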

Re: Integration Alluxio and Flink

Posted by Robert Metzger <rm...@apache.org>.
Hi Andrea,

the filesystem class cannot be in the job jar. You have to put it into
Flink's lib folder.
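
For example, with a local setup, something along these lines (paths are only
illustrative):

# copy the Alluxio client jar (with its dependencies) into Flink's lib folder
cp alluxio-core-client-1.0.0-jar-with-dependencies.jar /path/to/flink-1.0.0/lib/
# restart the local cluster so that the new jar is picked up
/path/to/flink-1.0.0/bin/stop-local.sh
/path/to/flink-1.0.0/bin/start-local.sh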

Re: Integration Alluxio and Flink

Posted by Andrea Sella <an...@radicalbit.io>.
Hi Till,

I added the jar as a dependency of my job in build.sbt. Do I need to do
something else?

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  ("org.alluxio" % "alluxio-core-client" % "1.0.0").
    exclude("org.jboss.netty", "netty").
    exclude("io.netty", "netty").
    exclude("io.netty", "netty-all").
    exclude("org.slf4j", "slf4j-api").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("commons-logging", "commons-logging").
    exclude("com.esotericsoftware.minlog", "minlog").
    exclude("org.apache.hadoop","hadoop-yarn-common")
)

`show compile:dependencyClasspath` shows alluxio-core-client as a dependency.

https://github.com/alkagin/alluxio-wordcount/
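
To double-check what actually ends up in the fat jar, one can also inspect
the assembly output directly, e.g. (the jar name is just what my build
produces locally, so treat it as an example):

unzip -l target/scala-2.11/alluxio-wordcount-assembly-0.1.jar | grep "alluxio/hadoop/FileSystem"
unzip -l target/scala-2.11/alluxio-wordcount-assembly-0.1.jar | grep -c "netty"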

Thanks,
Andrea

Re: Integration Alluxio and Flink

Posted by Till Rohrmann <tr...@apache.org>.
Hi Andrea,

could it be that alluxio.hadoop.FileSystem is not on your classpath? Have
you put the respective jar file in Flink's lib folder?

Cheers,
Till

Re: Integration Alluxio and Flink

Posted by Andrea Sella <an...@radicalbit.io>.
Hi Till,

I've tried your suggestion (source code
<https://github.com/alkagin/alluxio-wordcount/>), and now it throws:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
alluxio.hadoop.FileSystem not found.
The core-site.xml has been set up correctly, and alluxio-client is present
in the alluxio-wordcount jar. Do I need to specify the Hadoop configuration
in code, or is core-site.xml enough?
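
For reference, the entry in my core-site.xml is more or less the one from
the guide linked in my first mail:

<configuration>
  <property>
    <name>fs.alluxio.impl</name>
    <value>alluxio.hadoop.FileSystem</value>
  </property>
</configuration>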

Thank you again,
Andrea

Re: Integration Alluxio and Flink

Posted by Till Rohrmann <tr...@apache.org>.
Hi Andrea,

the problem won’t be netty-all but netty, I suspect. Flink is using version
3.8 whereas alluxio-core-client uses version 3.2.2. I think you have to
exclude or shade this dependency away.
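
With sbt-assembly, the shading could look roughly like this (untested
sketch, using the shade rules of sbt-assembly 0.14):

// relocate the Netty 3.x packages that the Alluxio client drags in,
// so they no longer clash with the Netty 3.8 classes used by Flink
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.jboss.netty.**" -> "shaded.org.jboss.netty.@1").inAll
)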

Cheers,
Till

Re: Integration Alluxio and Flink

Posted by Andrea Sella <an...@radicalbit.io>.
Hi Till,
I tried downgrading Alluxio's netty version from 4.0.28.Final to
4.0.27.Final to align the Flink and Alluxio dependencies. First of all, is
it correct that Flink 1.0.0 uses 4.0.27.Final? In any case it doesn't work;
I get the same error as above.

BR,
Andrea

Re: Integration Alluxio and Flink

Posted by Till Rohrmann <tr...@apache.org>.
Yes, it seems as if you have a netty version conflict. Maybe the
alluxio-core-client.jar pulls in an incompatible netty version. Could you
check whether this is the case? But maybe you also have another dependency
which pulls in a wrong netty version, since the Alluxio documentation
indicates that it works with Flink (but I cannot tell for which version).
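
To check, you could count the netty classes that the client jar bundles,
for example (the jar name is just an example):

# Netty 3.x classes live under org/jboss/netty, Netty 4.x under io/netty
unzip -l alluxio-core-client-1.0.0-jar-with-dependencies.jar | grep -c "org/jboss/netty"
unzip -l alluxio-core-client-1.0.0-jar-with-dependencies.jar | grep -c "io/netty"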

Cheers,
Till