Posted to issues@spark.apache.org by "Xin Ren (JIRA)" <ji...@apache.org> on 2016/07/08 06:54:10 UTC

[jira] [Created] (SPARK-16437) SparkR read.df() from parquet got error: SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"

Xin Ren created SPARK-16437:
-------------------------------

             Summary: SparkR read.df() from parquet got error: SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"
                 Key: SPARK-16437
                 URL: https://issues.apache.org/jira/browse/SPARK-16437
             Project: Spark
          Issue Type: Bug
          Components: SparkR, SQL
            Reporter: Xin Ren
            Priority: Minor
             Fix For: 2.0.0


Start the SparkR console:
{code}
./bin/sparkR
{code}

then the following error appears:
{code}
 Welcome to
    ____              __
   / __/__  ___ _____/ /__
  _\ \/ _ \/ _ `/ __/  '_/
 /___/ .__/\_,_/_/ /_/\_\   version  2.0.0-SNAPSHOT
    /_/


 SparkSession available as 'spark'.
>
>
> library(SparkR)
>
> df <- read.df("examples/src/main/resources/users.parquet")
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
>
>
> head(df)
16/07/07 23:20:54 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
    name favorite_color favorite_numbers
1 Alyssa           <NA>     3, 9, 15, 20
2    Ben            red             NULL
{code}
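
The message suggests no SLF4J binding is visible to the JVM at startup. One illustrative way to check locally, assuming a source build of Spark 2.0 where the launcher classpath comes from the assembly jars directory (paths are not confirmed for every environment):

{code}
# Illustrative check only: list the jars the launcher puts on the classpath
# and look for an SLF4J binding such as slf4j-log4j12.
ls assembly/target/scala-2.11/jars | grep -i slf4j
{code}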

It seems an SLF4J binding library needs to be added to the classpath; see:
http://stackoverflow.com/questions/7421612/slf4j-failed-to-load-class-org-slf4j-impl-staticloggerbinder
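
If a binding really is missing, a minimal sketch of the fix suggested in that thread is to put exactly one SLF4J binding jar on the driver classpath when launching; the jar path and version below are placeholders, not a verified fix for this build:

{code}
# Sketch only: supply a single SLF4J binding (e.g. slf4j-log4j12) explicitly.
# The jar location and version here are illustrative placeholders.
./bin/sparkR --driver-class-path /path/to/slf4j-log4j12-1.7.16.jar
{code}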


