Posted to issues@spark.apache.org by "Sergio Monteiro (JIRA)" <ji...@apache.org> on 2017/05/02 00:43:04 UTC

[jira] [Created] (SPARK-20551) ImportError adding custom class from jar in pyspark

Sergio Monteiro created SPARK-20551:
---------------------------------------

             Summary: ImportError adding custom class from jar in pyspark
                 Key: SPARK-20551
                 URL: https://issues.apache.org/jira/browse/SPARK-20551
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Shell
    Affects Versions: 2.1.0
            Reporter: Sergio Monteiro


The following imports are failing in PySpark, even when I set --jars or --driver-class-path:

import net.ripe.hadoop.pcap.io.PcapInputFormat
import net.ripe.hadoop.pcap.io.CombinePcapInputFormat
import net.ripe.hadoop.pcap.packet.Packet

Using Python version 2.7.12 (default, Nov 19 2016 06:48:10)
SparkSession available as 'spark'.
>>> import net.ripe.hadoop.pcap.io.PcapInputFormat
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named net.ripe.hadoop.pcap.io.PcapInputFormat
>>> import net.ripe.hadoop.pcap.io.CombinePcapInputFormat
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named net.ripe.hadoop.pcap.io.CombinePcapInputFormat
>>> import net.ripe.hadoop.pcap.packet.Packet
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named net.ripe.hadoop.pcap.packet.Packet
>>>

The same imports work fine in spark-shell.
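
For reference, a minimal PySpark sketch of the usual workaround: JVM classes are reached through the py4j gateway or referenced by fully qualified name as strings, not via Python imports. The input path, the key/value classes, and the choice of newAPIHadoopFile vs. hadoopFile below are assumptions about hadoop-pcap, not verified details:

# Sketch only; assumes pyspark was started with --jars <hadoop-pcap jar>.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Python's `import` cannot see JVM classes; the py4j gateway can.
PcapInputFormat = sc._jvm.net.ripe.hadoop.pcap.io.PcapInputFormat
Packet = sc._jvm.net.ripe.hadoop.pcap.packet.Packet

# Alternatively, hand the input format to Spark by its fully qualified
# class name. The key/value classes here are assumptions about
# hadoop-pcap; use sc.hadoopFile instead if PcapInputFormat implements
# the old (mapred) Hadoop API.
rdd = sc.newAPIHadoopFile(
    "hdfs:///path/to/capture.pcap",  # hypothetical path
    inputFormatClass="net.ripe.hadoop.pcap.io.PcapInputFormat",
    keyClass="org.apache.hadoop.io.LongWritable",
    valueClass="org.apache.hadoop.io.ObjectWritable")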



