Posted to reviews@spark.apache.org by miccagiann <gi...@git.apache.org> on 2014/07/11 02:04:13 UTC

[GitHub] spark pull request: [SPARK-1945][MLLIB] Documentation Improvements...

Github user miccagiann commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1311#discussion_r14800873
  
    --- Diff: docs/mllib-linear-methods.md ---
    @@ -338,7 +429,76 @@ and [`LassoWithSGD`](api/scala/index.html#org.apache.spark.mllib.regression.Lass
     All of MLlib's methods use Java-friendly types, so you can import and call them there the same
     way you do in Scala. The only caveat is that the methods take Scala RDD objects, while the
     Spark Java API uses a separate `JavaRDD` class. You can convert a Java RDD to a Scala one by
    -calling `.rdd()` on your `JavaRDD` object.
    +calling `.rdd()` on your `JavaRDD` object. The Java equivalent of the Scala
    +snippet above is presented below:
    +
    +{% highlight java %}
    +import org.apache.spark.api.java.*;
    +import org.apache.spark.SparkConf;
    +import org.apache.spark.api.java.function.Function;
    +import org.apache.spark.mllib.regression.LabeledPoint;
    +import org.apache.spark.mllib.regression.LinearRegressionWithSGD;
    +import org.apache.spark.mllib.regression.LinearRegressionModel;
    +import org.apache.spark.mllib.linalg.Vectors;
    +import org.apache.spark.mllib.linalg.Vector;
    +
    +public class LinearRegression {
    +    public static void main(String[] args) {
    +        SparkConf conf = new SparkConf().setAppName("Linear Regression Example");
    +        JavaSparkContext sc = new JavaSparkContext(conf);
    +        
    +        // Load and parse the data
    +        String path = "{SPARK_HOME}/mllib/data/ridge-data/lpsa.data";
    +        JavaRDD<String> data = sc.textFile(path);
    +        JavaRDD<LabeledPoint> parsedData = data.map(
    +            new Function<String, LabeledPoint>() {
    +                public LabeledPoint call(String line) {
    +                    String[] parts = line.split(",");
    +                    String[] features = parts[1].split(" ");
    +                    double[] v = new double[features.length];
    +                    for (int i = 0; i < features.length; i++)
    +                        v[i] = Double.parseDouble(features[i]);
    +                    return new LabeledPoint(Double.parseDouble(parts[0]), Vectors.dense(v));
    +                }
    +            }
    +        );
    +
    +        // Building the model
    +        int numIterations = 100;
    +        final LinearRegressionModel model = LinearRegressionWithSGD.train(JavaRDD.toRDD(parsedData), numIterations);
    +
    +        // Evaluate model on training examples and compute training error
    +        JavaRDD<double[]> valuesAndPreds = parsedData.map(
    --- End diff --
    
    From my understanding, LabeledPoint instances consist of two fields:
    a) The first field is the label of the point.
    b) The second field is a double[] array (Vector[Double]) holding all of the attributes' values.
    I assume I should swap the values in the 'valuesAndPreds' variable so that the prediction precedes the actual label of each data point included in this JavaRDD<double[]>.
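    For what it's worth, the (prediction, label) ordering described above can be sketched without Spark. This is a minimal plain-Java illustration only: the class name and the sample label/prediction values are made up, and the `predictions` array stands in for calling `model.predict(point.features())` over `parsedData`.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical stand-alone sketch (no Spark dependency) of the
    // (prediction, label) pairing and the training-error computation
    // that the valuesAndPreds step leads into.
    public class ValuesAndPredsSketch {

        // Pair each prediction with its label -- mirroring the
        // double[]{prediction, label} elements of valuesAndPreds --
        // and return the mean squared error over all pairs.
        static double computeMse(double[] predictions, double[] labels) {
            List<double[]> valuesAndPreds = new ArrayList<>();
            for (int i = 0; i < labels.length; i++) {
                valuesAndPreds.add(new double[]{predictions[i], labels[i]});
            }
            double sum = 0.0;
            for (double[] pair : valuesAndPreds) {
                double diff = pair[0] - pair[1];
                sum += diff * diff;
            }
            return sum / valuesAndPreds.size();
        }

        public static void main(String[] args) {
            // Made-up labels and predictions standing in for real model output.
            double[] labels      = {1.0, 2.0, 3.0};
            double[] predictions = {1.1, 1.9, 3.2};
            System.out.println("training MSE = " + computeMse(predictions, labels));
        }
    }
    ```

    The key point is only the element order: each pair carries the prediction first and the actual label second, so any downstream error computation subtracts them consistently.
    
    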


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---