Posted to common-dev@hadoop.apache.org by "Christophe Taton (JIRA)" <ji...@apache.org> on 2007/10/13 14:21:51 UTC

[jira] Updated: (HADOOP-1848) Redesign of Eclipse plug-in interface with Hadoop

     [ https://issues.apache.org/jira/browse/HADOOP-1848?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Christophe Taton updated HADOOP-1848:
-------------------------------------

    Attachment: 1848_2007-10-13_1.patch

Here is a major rewrite and cleanup of some parts of the Eclipse plug-in:
- The plug-in used to access Hadoop through its command-line interface: this has been replaced with direct RPC access (see the sketch after this list).
- The plug-in is aware of the SOCKS proxy configuration (which replaces the previous SSH tunneling mechanism). There is no dependency on JSch for now. To create a SOCKS proxy on {{localhost:port}} to a firewalled cluster, you can use {{ssh -D <port> cluster-frontend}} (or set up PuTTY to do the same thing under Windows).
- The configuration of a location now lets the user control all parameters available in a {{JobConf}} or a {{Configuration}} object.
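
For illustration, here is a minimal sketch of what direct RPC access through a SOCKS proxy could look like on the client side. The class name, host names and port numbers are placeholders, and the socket-factory configuration keys are my assumptions about the SOCKS support rather than something taken from the patch:

{code:java}
import org.apache.hadoop.mapred.ClusterStatus;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class LocationProbe {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf();

    // Any Configuration/JobConf parameter can be set for a location.
    conf.set("fs.default.name", "hdfs://cluster-frontend:9000");  // NameNode
    conf.set("mapred.job.tracker", "cluster-frontend:9001");      // JobTracker

    // Route RPC traffic through a local SOCKS proxy, e.g. one opened with
    // "ssh -D 1080 cluster-frontend". Key and class names are assumptions.
    conf.set("hadoop.rpc.socket.factory.class.default",
             "org.apache.hadoop.net.SocksSocketFactory");
    conf.set("hadoop.socks.server", "localhost:1080");

    // A direct RPC call replacing the former shell-script/SSH round trip.
    JobClient client = new JobClient(conf);
    ClusterStatus status = client.getClusterStatus();
    System.out.println("Task trackers: " + status.getTaskTrackers());
  }
}
{code}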


> Redesign of Eclipse plug-in interface with Hadoop
> -------------------------------------------------
>
>                 Key: HADOOP-1848
>                 URL: https://issues.apache.org/jira/browse/HADOOP-1848
>             Project: Hadoop
>          Issue Type: Improvement
>            Reporter: Christophe Taton
>            Assignee: Christophe Taton
>         Attachments: 1848_2007-10-13_1.patch
>
>
> The current Eclipse plug-in connects to Hadoop via shell scripts remotely executed using SSH and raw string marshaling. This is very inefficient and hard to maintain. The purpose of this issue is to let the plug-in directly use Hadoop's client API.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.