Posted to common-user@hadoop.apache.org by Tarandeep Singh <ta...@gmail.com> on 2010/01/17 22:41:28 UTC

Hadoop and X11 related error

Hi,

I am running an MR job that requires some java.awt.* classes that can't be
run in headless mode.

Right now, I am running Hadoop as a single-node cluster on my laptop, which
has an X11 server running. I have set up my ssh server and client to do X11
forwarding.

I ran the following java program to ensure that X11 forwarding is working-

public class Test
{
  public static void main( String[] args ) throws Exception
  {
    Process p = Runtime.getRuntime( ).exec( "/usr/bin/xterm" );
    p.waitFor( );
  }
}

I ran it as-
ssh localhost "java Test"
and it worked, confirming that X11 forwarding is working over SSH.

However, when I run a Map Reduce program that uses java.awt.* classes
(trying to create an object that extends Frame), I keep getting this error-

java.lang.InternalError: Can't connect to X11 window server using
'localhost:10.0' as the value of the DISPLAY variable

I printed the value of the DISPLAY variable-

echo $DISPLAY
:0.0
ssh localhost
echo $DISPLAY
localhost:10.0

I understand this is more of an SSH and X11 issue, but X11 forwarding is
working over SSH for my standalone programs and not for the MR program.
That's why I am posting the problem here.

Any help is greatly appreciated.

Thanks,
Tarandeep

Re: Hadoop and X11 related error

Posted by Steve Loughran <st...@apache.org>.
Tarandeep Singh wrote:

> I am using the Frame class and it can't be run in headless mode.
> 
> Brien's suggestion (running an X virtual frame buffer) is useful, and I
> guess I am going to do the same on my servers, but right now I am testing
> on my laptop, which is running an X server.
> 
> The problem here is, the test program runs over ssh without any problem,
> but when I run the map reduce program I keep getting the error. Both the
> standalone program and the MR program are run as the same user.
> 
> I followed Todd's suggestion and checked the value of the XAUTHORITY
> environment variable. It turns out the value of this variable is not set
> when I do ssh. So I am trying to see if I can set its value and whether
> the MR program runs after that. But if this is the problem, then the
> standalone program should also not run.
> 
> 

Logged in as the user at the console, open access to all clients:

xhost +

This should stop the xauthority settings from mattering, leaving only the
DISPLAY env variable as the binding issue.
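
A small sketch of the above with the revert step added (my own wrapper
around the standard xhost + / xhost - commands):

```shell
# Hedged sketch: temporarily disable X access control while debugging,
# then restore it. "xhost +" lets any client connect, so don't leave it on.
if [ -n "${DISPLAY:-}" ] && command -v xhost >/dev/null 2>&1; then
  xhost +            # open access control (debugging only; insecure)
  # ... run the MR job here ...
  xhost -            # restore access control
else
  echo "no X session available; skipping xhost"
fi
```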

Re: Hadoop and X11 related error

Posted by Allen Wittenauer <aw...@linkedin.com>.


On 1/18/10 10:42 AM, "Tarandeep Singh" <ta...@gmail.com> wrote:

> The problem here is, the test program runs over ssh without any problem but
> when I run the map reduce program I keep getting error. Both the standalone
> program and MR program are run as the same user.

My vote is for "ssh is forwarding the X11 connection".
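
For context, a hedged note: in 0.20-era Hadoop the task JVMs are forked by
the TaskTracker daemon, so they inherit the daemon's environment rather
than the ssh session's forwarded DISPLAY. JVM flags for those child
processes go through configuration instead, for example (0.20-era property
name; the headless flag is only an illustration):

```xml
<!-- mapred-site.xml: hedged sketch, 0.20-era property name -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Djava.awt.headless=true</value>
</property>
```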


Re: Hadoop and X11 related error

Posted by Tarandeep Singh <ta...@gmail.com>.
On Mon, Jan 18, 2010 at 2:52 AM, Steve Loughran <st...@apache.org> wrote:

> Tarandeep Singh wrote:
>
>> [...]
>
> The problem here is that you need to tell AWT to run headless instead of
> expecting an X11 server to be to hand. That means setting
> java.awt.headless=true in the processes that are using AWT to do bitmap
> rendering.
>
> see http://java.sun.com/developer/technicalArticles/J2SE/Desktop/headless/
> for details
>
>
I am using the Frame class and it can't be run in headless mode.

Brien's suggestion (running an X virtual frame buffer) is useful, and I
guess I am going to do the same on my servers, but right now I am testing
on my laptop, which is running an X server.

The problem here is, the test program runs over ssh without any problem,
but when I run the map reduce program I keep getting the error. Both the
standalone program and the MR program are run as the same user.

I followed Todd's suggestion and checked the value of the XAUTHORITY
environment variable. It turns out the value of this variable is not set
when I do ssh. So I am trying to see if I can set its value and whether the
MR program runs after that. But if this is the problem, then the standalone
program should also not run.

Thanks,
Tarandeep

Re: Hadoop and X11 related error

Posted by Steve Loughran <st...@apache.org>.
Tarandeep Singh wrote:
> [...]

The problem here is that you need to tell AWT to run headless instead of
expecting an X11 server to be to hand. That means setting
java.awt.headless=true in the processes that are using AWT to do bitmap
rendering.

see 
http://java.sun.com/developer/technicalArticles/J2SE/Desktop/headless/ 
for details
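
A minimal sketch of the suggestion above (class name is mine): set the
property before any AWT class initializes, and GraphicsEnvironment will
report headless instead of trying to open DISPLAY:

```java
// Hedged sketch: force AWT into headless mode programmatically. The property
// must be set before AWT initializes; -Djava.awt.headless=true on the
// command line achieves the same thing.
public class HeadlessCheck {
    static boolean forceHeadless() {
        System.setProperty("java.awt.headless", "true");
        return java.awt.GraphicsEnvironment.isHeadless();
    }

    public static void main(String[] args) {
        System.out.println("headless: " + forceHeadless());
    }
}
```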


Re: Hadoop and X11 related error

Posted by Wang Xu <gn...@gmail.com>.
Some Java programs are hard to run remotely via an ssh tunnel, but I do not
know the details either.

On Mon, Jan 18, 2010 at 5:57 AM, Vladimir Klimontovich
<kl...@gmail.com> wrote:
> Maybe, hadoop running MR jobs using different user? For example, if you followed installation instructions
> from official site or used rpm/deb packages hadoop using "hadoop" user to run jobs. And you probably using different user
> for running your test program.
>
> On Jan 18, 2010, at 12:41 AM, Tarandeep Singh wrote:
>
>> [...]
>



-- 
Wang Xu
Ted Turner  - "Sports is like a war without the killing." -
http://www.brainyquote.com/quotes/authors/t/ted_turner.html

Re: Hadoop and X11 related error

Posted by Edward Capriolo <ed...@gmail.com>.
On Sun, Jan 17, 2010 at 9:46 PM, Todd Lipcon <to...@cloudera.com> wrote:
> On Sun, Jan 17, 2010 at 6:19 PM, Tarandeep Singh <ta...@gmail.com>wrote:
>
>> On Sun, Jan 17, 2010 at 1:57 PM, Vladimir Klimontovich <
>> klimontovich@gmail.com> wrote:
>>
>> > Maybe, hadoop running MR jobs using different user? For example, if you
>> > followed installation instructions
>> > from official site or used rpm/deb packages hadoop using "hadoop" user to
>> > run jobs. And you probably using different user
>> > for running your test program.
>> >
>> >
>> Thanks Vladimir..
>> I am not running hadoop as the "hadoop" user. Both my test program and
>> hadoop are run using same user.
>> From the exception, it is clear the DISPLAY variable is set properly, but
>> java is not able to connect to the X server (not sure either some
>> permission
>> issues or what)
>>
>
> My guess is you're either missing the XAUTHORITY variable, or it's pointed
> at a file that the user executing the task can't read.
>
> (as a side note, this seems like a really bad idea - why are your hadoop
> tasks trying to talk to an X11 server in the first place?)
>
> -Todd
>
>
>> [...]
>

> (as a side note, this seems like a really bad idea - why are your hadoop
> tasks trying to talk to an X11 server in the first place?)

If he is using java.awt, that windowing needs an X server to draw. Going to
javax.swing with headless mode might help. I believe the entire Swing API
can work in headless mode, but that would involve a re-code.
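
A hedged sketch of the re-code this would involve (class name is mine): the
raster side of AWT works headless, so drawing that currently targets a
Frame can render into an off-screen BufferedImage instead:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Hedged sketch: off-screen rendering without any X server. BufferedImage
// and Graphics2D are headless-safe, unlike Frame.
public class OffscreenRender {
    static BufferedImage render() {
        System.setProperty("java.awt.headless", "true"); // no DISPLAY needed
        BufferedImage img =
            new BufferedImage(200, 100, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.drawString("rendered without X11", 20, 50); // text works headless
        g.dispose();
        return img;
    }

    public static void main(String[] args) {
        BufferedImage img = render();
        System.out.println(img.getWidth() + "x" + img.getHeight());
    }
}
```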

Re: Hadoop and X11 related error

Posted by Todd Lipcon <to...@cloudera.com>.
On Sun, Jan 17, 2010 at 6:19 PM, Tarandeep Singh <ta...@gmail.com> wrote:

> On Sun, Jan 17, 2010 at 1:57 PM, Vladimir Klimontovich <
> klimontovich@gmail.com> wrote:
>
> > Maybe, hadoop running MR jobs using different user? For example, if you
> > followed installation instructions
> > from official site or used rpm/deb packages hadoop using "hadoop" user to
> > run jobs. And you probably using different user
> > for running your test program.
> >
> >
> Thanks Vladimir..
> I am not running hadoop as the "hadoop" user. Both my test program and
> hadoop are run using same user.
> From the exception, it is clear the DISPLAY variable is set properly, but
> java is not able to connect to the X server (not sure either some
> permission
> issues or what)
>

My guess is you're either missing the XAUTHORITY variable, or it's pointed
at a file that the user executing the task can't read.
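
A small sketch (my own commands, plain POSIX shell) for checking both
possibilities from inside the environment the task actually runs in:

```shell
# Hedged sketch: is XAUTHORITY set, and is the auth file readable by
# the current user?
echo "DISPLAY=${DISPLAY:-<unset>}"
echo "XAUTHORITY=${XAUTHORITY:-<unset>}"
AUTH_FILE="${XAUTHORITY:-$HOME/.Xauthority}"  # Xlib falls back to ~/.Xauthority
if [ -r "$AUTH_FILE" ]; then
  echo "auth file readable: $AUTH_FILE"
else
  echo "auth file missing or unreadable: $AUTH_FILE"
fi
```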

(as a side note, this seems like a really bad idea - why are your hadoop
tasks trying to talk to an X11 server in the first place?)

-Todd


> [...]

Re: Hadoop and X11 related error

Posted by Tarandeep Singh <ta...@gmail.com>.
On Sun, Jan 17, 2010 at 1:57 PM, Vladimir Klimontovich <
klimontovich@gmail.com> wrote:

> Maybe, hadoop running MR jobs using different user? For example, if you
> followed installation instructions
> from official site or used rpm/deb packages hadoop using "hadoop" user to
> run jobs. And you probably using different user
> for running your test program.
>
>
Thanks Vladimir.
I am not running hadoop as the "hadoop" user. Both my test program and
hadoop are run using the same user.
From the exception, it is clear the DISPLAY variable is set properly, but
java is not able to connect to the X server (not sure whether it is some
permission issue or what).



> On Jan 18, 2010, at 12:41 AM, Tarandeep Singh wrote:
>
> > [...]

Re: Hadoop and X11 related error

Posted by Vladimir Klimontovich <kl...@gmail.com>.
Maybe hadoop is running MR jobs as a different user? For example, if you
followed the installation instructions from the official site or used
rpm/deb packages, hadoop uses the "hadoop" user to run jobs, and you are
probably using a different user to run your test program.

On Jan 18, 2010, at 12:41 AM, Tarandeep Singh wrote:

> [...]

---
Vladimir Klimontovich,
skype: klimontovich
GoogleTalk/Jabber: klimontovich@gmail.com
Cell phone: +7926 890 2349


Re: Hadoop and X11 related error

Posted by brien colwell <xc...@gmail.com>.
From memory, some parts of AWT won't run in headless mode. I used to run an
X virtual frame buffer on servers that created graphics. It's a standard
package on most Linux distros. I forget if there was anything special
needed to set it up, but it might be worth looking into.
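
A hedged sketch of that setup (display number :99 is arbitrary; the package
is usually called xvfb or xorg-x11-server-Xvfb):

```shell
# Hedged sketch: start an X virtual framebuffer so AWT has a display to
# render to, without any real graphics hardware or forwarding.
if command -v Xvfb >/dev/null 2>&1; then
  Xvfb :99 -screen 0 1024x768x24 &   # virtual display :99
  XVFB_PID=$!
  export DISPLAY=:99                 # point the JVM at the virtual display
  # java -cp job.jar YourAwtJob      # hypothetical job class would render here
  kill "$XVFB_PID"
else
  echo "Xvfb not installed; install the xvfb package first"
fi
```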




On Sun, Jan 17, 2010 at 4:41 PM, Tarandeep Singh <ta...@gmail.com> wrote:

> [...]