Posted to user@spark.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2016/10/25 15:59:04 UTC
Getting only results out of Spark Shell
Is it possible, when using the Spark Shell, to print only the actual output
without echoing the commands passed to it?
In the output below, all I am interested in are the three numbers at the end
(shown in red in the original message).
Spark context Web UI available at http://50.140.197.217:55555
Spark context available as 'sc' (master = local, app id =
local-1477410914051).
Spark session available as 'spark'.
Loading ./stocks.scala...
import org.apache.spark.sql.functions._
import java.util.Calendar
import org.joda.time._
import java.util.Calendar
import org.joda.time._
ticker: String = tsco
today: org.joda.time.DateTime = 2016-10-25T16:55:17.261+01:00
df1: org.apache.spark.sql.DataFrame = [_c0: string, _c1: string ... 6 more fields]
defined class columns
df2: org.apache.spark.sql.Dataset[columns] = [Stock: string, Ticker: string ... 6 more fields]
changeToDate: (TradeDate: String)org.apache.spark.sql.Column
rs: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [AverageDailyPrice: double]
328.0
327.13
325.63
I can filter this in the shell, but surely there must be a way of running the
commands silently?
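One workaround, sketched below under assumptions (I have not tested this against your exact setup): print the values you actually want with an explicit marker inside stocks.scala, run the script non-interactively, and strip everything else in the shell. The `RESULT:` prefix is a hypothetical marker I have introduced; it is not anything Spark produces.

```shell
# Inside stocks.scala, print only the wanted values with a marker, e.g.:
#   rs.collect().foreach(r => println(s"RESULT: ${r.getDouble(0)}"))
# Then run the script non-interactively and keep only the marked lines;
# 2>/dev/null drops log noise sent to stderr.
spark-shell -i stocks.scala 2>/dev/null | sed -n 's/^RESULT: //p'
```

If you are in an interactive session, the underlying Scala REPL also has a `:silent` command that toggles the automatic echoing of evaluated expressions, which may be enough on its own.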
Thanks
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.