Posted to user@spark.apache.org by Don Drake <do...@gmail.com> on 2015/10/05 17:35:41 UTC

Utility for PySpark DataFrames - smartframes

I would like to announce a Python package that makes creating rows for
PySpark DataFrames as easy as creating a Python object.

Code is available on GitHub and PyPI, and will soon be on spark-packages.org.


https://github.com/dondrake/smartframes

Motivation

Spark DataFrames provide a nice interface to datasets that have a schema.
Getting data from your code into a DataFrame in Python means creating a
Row() object with field names and respective values. Given that you already
have a schema with data types per field, it would be nice to easily take an
object that represents the row and create the Row() object automatically.

Smartframes allows you to define a class simply by declaring the schema that
describes its fields and data types. You can then create an object and set
the values like any other Python class. When you are ready to store it in
a DataFrame, just call the createRow() method, as in the sketch below.
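
For illustration, here is a minimal sketch of that workflow. The SmartFrames
base-class name and the schema class attribute are assumptions on my part;
see the README linked above for the package's actual API.

    from pyspark.sql.types import IntegerType, StringType, StructField, StructType
    from smartframes import SmartFrames  # base-class name assumed; see the README

    class Person(SmartFrames):
        # A standard PySpark schema declaring the fields and their data types
        schema = StructType([
            StructField("id", IntegerType()),
            StructField("name", StringType()),
        ])

    person = Person()
    person.id = 1             # set values like any other Python attribute
    person.name = "Alice"
    row = person.createRow()  # builds a pyspark.sql.Row matching the schema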

The createRow() method coerces values into the declared data types. For
example, if a field is defined as an IntegerType and the value set on the
object is a string, createRow() will attempt to convert the string to an
integer before building the Row().
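
Continuing the hypothetical Person class above, that coercion looks like this:

    person = Person()
    person.id = "42"          # a string assigned to an IntegerType field
    row = person.createRow()  # "42" is converted to the integer 42 here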

This was written after I created Row()s with LongType fields and found that
Spark did not convert Python integers to Java longs when passing values to
the JVM. I needed a consistent way to create a Row() for all of my
DataFrames.

Installation

pip install smartframes

Example
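
The README linked above has a complete example. As a sketch of the final
step, rows built this way feed straight into createDataFrame(); this assumes
the hypothetical Person class from the Motivation section and a Spark 1.x
sqlContext (e.g. the one provided by the pyspark shell):

    # Build a few Person objects and collect their rows
    people = []
    for pk, name in [(1, "Alice"), (2, "Bob")]:
        person = Person()
        person.id = pk
        person.name = name
        people.append(person)

    rows = [person.createRow() for person in people]
    df = sqlContext.createDataFrame(rows, Person.schema)
    df.show()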
Any feedback is appreciated.

-Don

-- 
Donald Drake
Drake Consulting
http://www.drakeconsulting.com/
800-733-2143