Posted to dev@mahout.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2010/09/22 09:10:33 UTC

[jira] Resolved: (MAHOUT-419) Convert decomposer code to Hadoop 0.20 API

     [ https://issues.apache.org/jira/browse/MAHOUT-419?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved MAHOUT-419.
------------------------------

         Assignee: Jake Mannix
    Fix Version/s:     (was: 0.4)
       Resolution: Duplicate

I agree, and purely for housekeeping purposes I want to roll this into mega-issue MAHOUT-167.

> Convert decomposer code to Hadoop 0.20 API
> ------------------------------------------
>
>                 Key: MAHOUT-419
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-419
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Math
>    Affects Versions: 0.3, 0.4
>            Reporter: Danny Leshem
>            Assignee: Jake Mannix
>
> org.apache.mahout.math.hadoop classes (MatrixMultiplicationJob, TimesSquaredJob, TransposeJob) all use the deprecated Hadoop API. In the spirit of MAHOUT-167 and MAHOUT-143, I suggest converting them to Hadoop's 0.20 API.
> The reason I'm raising this now is that this code no longer runs on my Hadoop 0.22-SNAPSHOT cluster. I'm not sure why: it was running fine about a month ago, but after updating to the latest Mahout trunk a few days ago, the code throws "java.lang.RuntimeException: Error in configuring object" at MapTask.runOldMapper.
> Also, the documentation at https://cwiki.apache.org/MAHOUT/dimensionalreduction.html is no longer accurate: the command-line parameters have changed (even without the new arguments from MAHOUT-308). This is partly due to the new argument parser, which accepts the input/output directories differently.
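For readers unfamiliar with the two Hadoop APIs, the conversion the issue asks for looks roughly like the sketch below. This is an illustrative example with made-up class names, not Mahout's actual decomposer code: the deprecated API lives in org.apache.hadoop.mapred (mappers implement an interface and write through an OutputCollector), while the 0.20 API lives in org.apache.hadoop.mapreduce (mappers extend a base class and write through a Context).

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;

// Old (deprecated) API, org.apache.hadoop.mapred: implement the Mapper
// interface and emit pairs through an OutputCollector.
class OldApiMapper extends org.apache.hadoop.mapred.MapReduceBase
    implements org.apache.hadoop.mapred.Mapper<LongWritable, Text, Text, IntWritable> {

  @Override
  public void map(LongWritable key, Text value,
                  org.apache.hadoop.mapred.OutputCollector<Text, IntWritable> output,
                  org.apache.hadoop.mapred.Reporter reporter) throws IOException {
    output.collect(value, new IntWritable(1));
  }
}

// New (0.20) API, org.apache.hadoop.mapreduce: extend the Mapper base class;
// the OutputCollector and Reporter are replaced by a single Context object.
class NewApiMapper
    extends org.apache.hadoop.mapreduce.Mapper<LongWritable, Text, Text, IntWritable> {

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    context.write(value, new IntWritable(1));
  }
}
```

Job setup changes in the same spirit: the old API configures a JobConf and submits it via JobClient.runJob, while the new API builds an org.apache.hadoop.mapreduce.Job and calls job.waitForCompletion(true). Since this sketch depends on the Hadoop jars, it is meant for reading alongside the issue, not as a standalone program.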

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.