Posted to issues@spark.apache.org by "Daniel Pinyol (JIRA)" <ji...@apache.org> on 2015/08/18 00:15:46 UTC

[jira] [Created] (SPARK-10067) Long delay (16 seconds) when running local session on offline machine

Daniel Pinyol created SPARK-10067:
-------------------------------------

             Summary: Long delay (16 seconds) when running local session on offline machine
                 Key: SPARK-10067
                 URL: https://issues.apache.org/jira/browse/SPARK-10067
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.4.1
         Environment: Mac 10.10.5, java 1.8 from IntelliJ 14.1
            Reporter: Daniel Pinyol
            Priority: Minor


If I run this
{code:java}
        SparkContext sc = new SparkContext("local", "test");
{code}
on a machine with no network connection, it hangs for 15 or 16 seconds at this point and then resumes successfully. The problem appears to be the Kerberos default realm lookup (see the call stack below).
Is there any way to avoid this annoying delay? I reviewed https://spark.apache.org/docs/latest/configuration.html, but couldn't find any solution.
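One workaround I haven't fully verified, but which seems plausible from the stack trace, is to pin the Kerberos realm through the standard JDK system properties before creating the context, so sun.security.krb5.Config never falls back to the DNS lookup. These are JDK Kerberos properties, not Spark configuration, and the values below are only placeholders:

{code:java}
// Sketch of a possible workaround (untested): with both properties set,
// the JDK Kerberos code should take the realm/KDC from here instead of
// issuing the DNS query that blocks for ~15 s on an offline machine.
// "EXAMPLE.COM" and "localhost" are placeholder values.
System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
System.setProperty("java.security.krb5.kdc", "localhost");

SparkContext sc = new SparkContext("local", "test");
{code}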

thanks

{noformat}
"main@1" prio=5 tid=0x1 nid=NA runnable
  java.lang.Thread.State: RUNNABLE
	  at java.net.PlainDatagramSocketImpl.peekData(PlainDatagramSocketImpl.java:-1)
	  - locked <0x758> (a java.net.PlainDatagramSocketImpl)
	  at java.net.DatagramSocket.receive(DatagramSocket.java:787)
	  - locked <0x732> (a java.net.DatagramSocket)
	  - locked <0x759> (a java.net.DatagramPacket)
	  at com.sun.jndi.dns.DnsClient.doUdpQuery(DnsClient.java:413)
	  at com.sun.jndi.dns.DnsClient.query(DnsClient.java:207)
	  at com.sun.jndi.dns.Resolver.query(Resolver.java:81)
	  at com.sun.jndi.dns.DnsContext.c_getAttributes(DnsContext.java:434)
	  at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_getAttributes(ComponentDirContext.java:235)
	  at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.getAttributes(PartialCompositeDirContext.java:141)
	  at com.sun.jndi.toolkit.url.GenericURLDirContext.getAttributes(GenericURLDirContext.java:103)
	  at sun.security.krb5.KrbServiceLocator.getKerberosService(KrbServiceLocator.java:85)
	  at sun.security.krb5.Config.checkRealm(Config.java:1120)
	  at sun.security.krb5.Config.getRealmFromDNS(Config.java:1093)
	  at sun.security.krb5.Config.getDefaultRealm(Config.java:987)
	  at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
	  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	  at java.lang.reflect.Method.invoke(Method.java:483)
	  at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:75)
	  at org.apache.hadoop.security.authentication.util.KerberosName.<clinit>(KerberosName.java:85)
	  at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:225)
	  - locked <0x57d> (a java.lang.Class)
	  at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
	  at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
	  at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
	  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
	  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
	  at scala.Option.getOrElse(Option.scala:121)
	  at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2162)
	  at org.apache.spark.SparkContext.<init>(SparkContext.scala:301)
	  at org.apache.spark.SparkContext.<init>(SparkContext.scala:155)
	  at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
	  at DataFrameSandbox.<init>(DataFrameSandbox.java:31)
	  at DataFrameSandbox.main(DataFrameSandbox.java:45)

{noformat}
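For what it's worth, the delay can probably be reproduced without Spark at all by timing the same Hadoop call the stack trace goes through. A minimal sketch, assuming hadoop-common is on the classpath (class name is made up for illustration):

{code:java}
import org.apache.hadoop.security.UserGroupInformation;

public class UgiDelayCheck {
    public static void main(String[] args) throws Exception {
        // getCurrentUser() triggers UserGroupInformation initialization,
        // which loads KerberosName and performs the default-realm DNS
        // lookup seen in the stack trace above.
        long start = System.nanoTime();
        UserGroupInformation.getCurrentUser();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("getCurrentUser() took " + elapsedMs + " ms");
    }
}
{code}

If that call alone takes ~15 seconds offline, the delay is entirely inside the Hadoop/JDK Kerberos code rather than Spark itself.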



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
