Posted to commits@mahout.apache.org by ak...@apache.org on 2016/03/20 04:11:32 UTC
mahout git commit: Adding instructions for MAHOUT-1794 to the readme.
Repository: mahout
Updated Branches:
refs/heads/flink-binding 1c1abbf3d -> c00b96ecc
Adding instructions for MAHOUT-1794 to the readme.
Conflicts:
examples/bin/README.txt
Project: http://git-wip-us.apache.org/repos/asf/mahout/repo
Commit: http://git-wip-us.apache.org/repos/asf/mahout/commit/c00b96ec
Tree: http://git-wip-us.apache.org/repos/asf/mahout/tree/c00b96ec
Diff: http://git-wip-us.apache.org/repos/asf/mahout/diff/c00b96ec
Branch: refs/heads/flink-binding
Commit: c00b96ecc909fe6bcd4dce5e365a7fcea8a2bfdf
Parents: 1c1abbf
Author: Andrew Musselman <ak...@apache.org>
Authored: Sat Mar 19 14:46:48 2016 -0700
Committer: Andrew Musselman <ak...@apache.org>
Committed: Sat Mar 19 20:10:50 2016 -0700
----------------------------------------------------------------------
examples/bin/README.txt | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/mahout/blob/c00b96ec/examples/bin/README.txt
----------------------------------------------------------------------
diff --git a/examples/bin/README.txt b/examples/bin/README.txt
index f47ab44..a30c35b 100644
--- a/examples/bin/README.txt
+++ b/examples/bin/README.txt
@@ -1,5 +1,8 @@
This directory contains helpful shell scripts for working with some of Mahout's examples.
+To set a non-default temporary work directory: `export MAHOUT_WORK_DIR=/path/in/hdfs/to/temp/dir`
+ Note that this requires the same path to be writable both on the local file system and on HDFS.
+
Here's a description of what each does:
classify-20newsgroups.sh -- Run SGD and Bayes classifiers over the classic 20 News Groups. Downloads the data set automatically.
@@ -8,4 +11,4 @@ cluster-syntheticcontrol.sh -- Cluster the Synthetic Control data set. Download
factorize-movielens-1m.sh -- Run the Alternating Least Squares Recommender on the Grouplens data set (size 1M).
factorize-netflix.sh -- (Deprecated due to lack of availability of the data set) Run the ALS Recommender on the Netflix data set.
run-rf.sh -- Create some synthetic data, build a random forest, and test performance.
-spark-document-classifier.mscala -- A mahout-shell script which trains and tests a Naive Bayes model on the Wikipedia XML dump and defines simple methods to classify new text.
\ No newline at end of file
+spark-document-classifier.mscala -- A mahout-shell script which trains and tests a Naive Bayes model on the Wikipedia XML dump and defines simple methods to classify new text.
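The work-directory override described in the diff above can be exercised roughly as follows. This is a sketch, not part of the commit: the path `/tmp/mahout-work` is a hypothetical example, and per the README note it must be writable on both the local file system and HDFS.

```shell
#!/bin/sh
# Point Mahout's example scripts at a non-default temporary work
# directory before running any of the scripts in examples/bin/.
# NOTE: /tmp/mahout-work is a placeholder path for illustration only;
# the same path must be writable locally and on HDFS.
export MAHOUT_WORK_DIR=/tmp/mahout-work
echo "MAHOUT_WORK_DIR=${MAHOUT_WORK_DIR}"
```

The example scripts (e.g. classify-20newsgroups.sh) would then pick up `MAHOUT_WORK_DIR` from the environment instead of using their default temp location.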