Posted to user@flink.apache.org by Aljoscha Krettek <al...@apache.org> on 2016/09/05 09:21:23 UTC

Re: flink dataStream operate dataSet

Hi,
right now it is not possible to mix the DataSet and the DataStream API. The
reason for the "task not serializable" error is that closing over the DataSet
inside the map function forces Flink to serialize the DataSet along with the
closure, and a DataSet is not serializable.
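(A common workaround, not part of the original reply: materialize the HBase
DataSet into an ordinary local collection first, e.g. with `DataSet.collect()`,
and close over that collection in the stream's map function, since a plain
`Seq` serializes fine. Below is a minimal sketch of the pattern in plain Scala,
with Flink types replaced by ordinary collections so it runs standalone;
`computeScore`, `table`, and the sample data are hypothetical stand-ins for the
user's `computerScore` and HBase scan.)

```scala
object ClosureSketch {
  // Hypothetical scoring function standing in for the user's computerScore.
  def computeScore(word: String, entry: (Int, String)): Int =
    if (entry._2.contains(word)) entry._1 else 0

  def main(args: Array[String]): Unit = {
    // In Flink this would be: val table = HBaseWrite.fullScan(...).collect()
    // collect() yields a plain Seq, which the closure can safely capture
    // because it is Java-serializable (unlike a DataSet).
    val table: Seq[(Int, String)] = Seq((3, "hello world"), (5, "flink"))

    // In Flink this would be: words.map { word => ... }
    // The closure now captures only `table`, not a DataSet.
    val words = Seq("hello", "flink")
    val scored = words.map { word =>
      table.map(entry => (word, computeScore(word, entry)))
    }
    scored.foreach(println)
  }
}
```

This keeps the per-record work inside a single serializable closure; it is only
practical when the scanned table is small enough to hold in memory on each task.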

Cheers,
Aljoscha

On Tue, 30 Aug 2016 at 16:31 <ri...@sina.cn> wrote:

> Hi,
>      i have a problem,a dataStream read from rabbitMQ,and others data from
> a hbase table,which is a dataSet.Those two data from follow:
>
>      val words = connectHelper.readFromRabbitMq(...)  // words is DataStream[String]
>      val dataSet = HBaseWrite.fullScan(....)          // dataSet is DataSet[(Int, String)]
>
>      words.map { word =>
>          val res = dataSet.map { y =>
>              val score = computerScore(word, y)
>              (word, score)
>          }
>          HBaseWrite.writeToTable(res, ...)
>      }
>
>    The error is "task not serializable". What is the solution?
>    Within a DataStream, how can I operate on a DataSet?
>