Posted to dev@mahout.apache.org by Pat Ferrel <pa...@occamsmachete.com> on 2014/07/04 03:08:39 UTC

Build broken mea culpa

I’m checking in some ScalaTest suites that run Spark on "local". I get a SparkDistributedContext from:

  def mahoutSparkContext(masterUrl: String, appName: String,
      customJars: TraversableOnce[String] = Nil,
      sparkConf: SparkConf = new SparkConf(),
      addMahoutJars: Boolean = true
      ): SparkDistributedContext = {
    val closeables = new java.util.ArrayDeque[Closeable]()

    try {

      if (addMahoutJars) {
        var mhome = System.getenv("MAHOUT_HOME")
        if (mhome == null) mhome = System.getProperty("mahout.home")

        if (mhome == null)
          throw new IllegalArgumentException("MAHOUT_HOME is required to spawn mahout-based spark jobs.")


But as you can see, it requires MAHOUT_HOME to be set, and it doesn’t seem to be set on the build machines. It seems odd that no one else needs MAHOUT_HOME in tests, so if I need to roll this back, let me know. I have no idea how to change the environment of the build machines.
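For local-mode tests, one possible workaround is to mirror the env-var/system-property fallback the snippet implements and set the property from the test itself, so the build machine's environment never needs to change. This is only a sketch; MahoutHomeResolver is a hypothetical name, not part of Mahout's API:

```scala
// Hypothetical helper mirroring the lookup order in mahoutSparkContext:
// the MAHOUT_HOME environment variable wins, then the mahout.home
// JVM system property is tried as a fallback.
object MahoutHomeResolver {
  def resolveMahoutHome(): Option[String] =
    Option(System.getenv("MAHOUT_HOME"))
      .orElse(Option(System.getProperty("mahout.home")))
}

// In a test's setup (e.g. a ScalaTest beforeAll), the property can
// stand in for the env var on machines where it is not exported:
// System.setProperty("mahout.home", "/path/to/mahout")
```

Since System.getenv cannot be changed from inside a running JVM, the mahout.home system property is the only in-process hook that lookup order leaves open to a test.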

Furthermore, the test creates mahout/spark/tmp/.. and then deletes it when finished. Hopefully this is allowed on the build machines.
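A minimal sketch of the create-then-delete lifecycle such a test fixture would follow, using plain JDK calls; TmpDirFixture is a hypothetical name, not the actual test code:

```scala
import java.io.File

object TmpDirFixture {
  // Create the scratch directory (and any missing parent directories).
  def createTmpDir(path: String): File = {
    val dir = new File(path)
    dir.mkdirs()
    dir
  }

  // Delete the directory and everything under it in teardown, so the
  // build machine is left clean even if the test wrote nested files.
  def deleteRecursively(f: File): Unit = {
    Option(f.listFiles()).foreach(_.foreach(deleteRecursively))
    f.delete()
  }
}
```

Pairing the two calls in a ScalaTest beforeAll/afterAll keeps the cleanup running even when individual tests fail, which matters if the build machines police leftover files.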