Posted to issues@spark.apache.org by "chrispy (JIRA)" <ji...@apache.org> on 2015/04/15 08:28:58 UTC

[jira] [Created] (SPARK-6925) Word2Vec's transform method throws IllegalStateException if a word is not in the vocabulary, but the findSynonyms(word: String, num: Int) method neither catches the exception nor declares that it throws it.

chrispy created SPARK-6925:
------------------------------

             Summary: Word2Vec's transform method throws IllegalStateException if a word is not in the vocabulary, but the findSynonyms(word: String, num: Int) method neither catches the exception nor declares that it throws it.
                 Key: SPARK-6925
                 URL: https://issues.apache.org/jira/browse/SPARK-6925
             Project: Spark
          Issue Type: Bug
          Components: MLlib
    Affects Versions: 1.3.0
            Reporter: chrispy
            Priority: Critical


    val doc = sc.textFile("/xxx/segrestxt", args(3).toInt).map(line => line.split(" ").toSeq)
    val model = new Word2Vec().setVectorSize(10).setSeed(42L).setNumIterations(3).fit(doc)
    val cpseeds = sc.textFile("/xxx/seeds/cp.tag", args(3).toInt)

    // This throws IllegalStateException and kills the app whenever a tag is
    // not in the vocabulary, yet findSynonyms neither catches the exception
    // internally nor documents that it throws it.
    cpseeds.map { tag =>
      val syn = model.findSynonyms(tag, 30).map(l => l._1)
      tag + ":" + Joiner.on(" ").join(JavaConversions.asJavaIterator(syn.toIterator))
    }.saveAsTextFile("/xxx/synonyms/cp")
	
    // As a workaround, the caller has to catch IllegalStateException itself.
    // Note that the catch branch must also yield a String; otherwise the try
    // expression has type Any and the result becomes an RDD[Any].
    cpseeds.map { tag =>
      try {
        val syn = model.findSynonyms(tag, 30).map(l => l._1)
        tag + ":" + Joiner.on(" ").join(JavaConversions.asJavaIterator(syn.toIterator))
      } catch {
        case e: IllegalStateException =>
          log.error(s"cp tag:$tag", e)
          tag + ":"
      }
    }.saveAsTextFile("/xxx/synonyms/cp")
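
    Another option, instead of catching the exception per record, is to filter out
    out-of-vocabulary tags before calling findSynonyms. A minimal sketch, assuming
    the model exposes its word-to-vector map via Word2VecModel.getVectors (this
    accessor may not exist in the affected 1.3.0 release, so treat it as an
    assumption):

    // Sketch only: assumes model.getVectors returns Map[String, Array[Float]];
    // if that accessor is unavailable, the vocabulary set would have to be
    // obtained some other way.
    val vocab = model.getVectors.keySet
    cpseeds
      .filter(tag => vocab.contains(tag))   // drop OOV tags up front
      .map { tag =>
        val syn = model.findSynonyms(tag, 30).map(_._1)
        tag + ":" + syn.mkString(" ")       // plain Scala join, no Guava needed
      }
      .saveAsTextFile("/xxx/synonyms/cp")

    This trades the per-record try/catch for one upfront membership check, at the
    cost of silently dropping OOV tags instead of logging them.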



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org