Posted to dev@spark.apache.org by Niranda Perera <ni...@gmail.com> on 2015/10/09 09:25:23 UTC

passing an AbstractFunction1 to sparkContext().runJob instead of a closure

hi all,

I want to run a job on the Spark context, and since I am calling it from the
Java environment, I cannot pass a Scala closure to sparkContext().runJob.
Instead, I am passing an AbstractFunction1 extension.
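
For context, the approach looks roughly like this. It is a simplified sketch
(the class is a stand-in for the actual AnalyticsWritingFunction, and it
assumes the Spark 1.x Java API with scala-library on the classpath):

```java
import java.io.Serializable;

import org.apache.spark.SparkContext;
import org.apache.spark.rdd.RDD;

import scala.collection.Iterator;
import scala.reflect.ClassTag$;
import scala.runtime.AbstractFunction1;

// Simplified stand-in for AnalyticsWritingFunction: an AbstractFunction1
// subclass used in place of a Scala closure. It must be Serializable,
// because Spark ships the function to the executors.
class WritingFunction extends AbstractFunction1<Iterator<String>, Object>
        implements Serializable {
    @Override
    public Object apply(Iterator<String> records) {
        while (records.hasNext()) {
            // write records.next() to the data store here
            records.next();
        }
        return null;
    }
}

class JobRunner {
    static void runWriteJob(SparkContext sc, RDD<String> rdd) {
        // Scala's runJob[T, U: ClassTag](rdd, func) compiles to a method
        // taking an explicit ClassTag, so it must be supplied from Java.
        sc.runJob(rdd, new WritingFunction(),
                ClassTag$.MODULE$.apply(Object.class));
    }
}
```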

While the jobs run without any issue, I constantly get the following
WARN message:

TID: [-1234] [] [2015-10-06 04:39:43,387]  WARN
{org.apache.spark.util.ClosureCleaner} -  Expected a closure; got
org.wso2.carbon.analytics.spark.core.sources.AnalyticsWritingFunction
{org.apache.spark.util.ClosureCleaner}


I want to know: what are the implications of this approach?
Could this WARN cause functional issues later on?

rgds
-- 
Niranda
@n1r44 <https://twitter.com/N1R44>
+94-71-554-8430
https://pythagoreanscript.wordpress.com/