Posted to issues@spark.apache.org by "Z.yinze (JIRA)" <ji...@apache.org> on 2016/07/14 01:18:20 UTC
[jira] [Updated] (SPARK-16527) Scala Error:Type Arguments
[ https://issues.apache.org/jira/browse/SPARK-16527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Z.yinze updated SPARK-16527:
----------------------------
Description:
I want to process images with HIPI on Spark, so I use hadoopFile to create an RDD.
Here is my code:
val conf = new SparkConf().setAppName("BundleTest")
val sc = new SparkContext(conf)
val bundle0 = sc.hadoopFile[HipiImageHeader,HipiImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)
But I got a "type arguments" error:
Error:(39, 22) type arguments [org.hipi.image.HipiImageHeader,org.hipi.image.HipiImage,org.hipi.imagebundle.mapreduce.HibInputFormat] conform to the bounds of none of the overloaded alternatives of
value hadoopFile: [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)] <and> [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String, minPartitions: Int)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)]
val bundle0 = sc.hadoopFile[HipiImageHeader,HipiImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)
^
Please give me some advice to solve this. Or, how can I read an image bundle another way?
Thanks for your help.
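A likely cause: sc.hadoopFile only accepts input formats extending the old org.apache.hadoop.mapred.InputFormat API, while HIPI's HibInputFormat (judging by its org.hipi.imagebundle.mapreduce package name) extends the new org.apache.hadoop.mapreduce API. A sketch of the fix, assuming that is the case, is to use newAPIHadoopFile instead:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.hipi.image.{HipiImage, HipiImageHeader}
import org.hipi.imagebundle.mapreduce.HibInputFormat

object BundleTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("BundleTest")
    val sc = new SparkContext(conf)

    // hadoopFile requires F <: org.apache.hadoop.mapred.InputFormat (old API).
    // HibInputFormat is a new-API (mapreduce) format, so use newAPIHadoopFile,
    // which takes the format class first, then the key and value classes.
    val bundle0 = sc.newAPIHadoopFile(
      "hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",
      classOf[HibInputFormat],
      classOf[HipiImageHeader],
      classOf[HipiImage])

    println(bundle0.count())
    sc.stop()
  }
}
```

Note that newAPIHadoopFile has no minPartitions parameter; partitioning for new-API formats is controlled through the Hadoop Configuration (e.g. split sizes) rather than a partition count.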
> Scala Error:Type Arguments
> --------------------------
>
> Key: SPARK-16527
> URL: https://issues.apache.org/jira/browse/SPARK-16527
> Project: Spark
> Issue Type: Bug
> Components: Build, Input/Output, Java API
> Affects Versions: 1.6.1
> Environment: CentOS
> Reporter: Z.yinze
> Labels: build, windows
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org