Posted to dev@spark.apache.org by Al Pivonka <al...@gmail.com> on 2015/12/14 16:22:05 UTC

Dev Environment (again)

I've read through the mail archives and followed the different threads.

I believe there is a great deal of value in teaching others.

I'm a 14-year veteran of Java and would like to contribute to the different
Spark projects. Here are my dilemmas:

1) How does one quickly get a working environment up and running?

What do I mean by environment?

   1. An IDE. Not a problem; I can build using sbt.
   2. A working standalone Spark cluster (e.g., in Docker) to which I can
   deploy what I build in #1 and test out my changes (see the sketch after
   this list).
   3. An understanding of the dependencies between the projects internal to
   Spark.
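
For #2, here is roughly the kind of smoke test I have in mind: a trivial job
I could build with sbt and submit to the standalone cluster to prove that
both my build and the cluster work. This is only a sketch; the master URL
spark://master:7077 is a placeholder for whatever the Docker setup would
actually expose.

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal smoke-test job: submit it to a standalone cluster to verify
    // that the freshly built Spark distribution and the cluster both work.
    object ClusterSmokeTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("ClusterSmokeTest")
          // Placeholder master URL; substitute whatever the Docker setup exposes.
          .setMaster("spark://master:7077")
        val sc = new SparkContext(conf)
        try {
          // Sum 1..1000 on the executors; expect 500500.0 back on the driver.
          val sum = sc.parallelize(1 to 1000).sum()
          println(s"Sum of 1..1000 = $sum")
        } finally {
          sc.stop()
        }
      }
    }

If a job like that round-trips, the environment is probably healthy enough to
start iterating on real changes.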

2) How do we on-board a new developer and make them productive as soon as
possible?

I'm not looking for answers to just these questions/dilemmas.

There is a wealth of knowledge here among the existing Spark and sub-project
developers/architects.

My proposal is to document the on-boarding process and dependencies for a new
contributor: what someone will need in order to get a working dev environment
up and running so that they can add and test new functionality in the Spark
project. Also, document how to set up a test environment (Docker/Standalone)
in order to deploy and exercise that new functionality.
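
To make the "deploy and test" part concrete, that documentation could also
show the in-process route, which needs no cluster at all. A minimal sketch,
assuming ScalaTest is on the test classpath (the suite name and the toy
assertions are made up for illustration):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.FunSuite

    // Runs entirely in-process via local[2]; useful for fast iteration
    // before deploying to the Docker/standalone cluster.
    class MyFeatureSuite extends FunSuite {
      test("word count behaves as expected") {
        val conf = new SparkConf().setAppName("MyFeatureSuite").setMaster("local[2]")
        val sc = new SparkContext(conf)
        try {
          val counts = sc.parallelize(Seq("a", "b", "a"))
            .map(word => (word, 1))
            .reduceByKey(_ + _)
            .collectAsMap()
          assert(counts("a") === 2)
          assert(counts("b") === 1)
        } finally {
          sc.stop()
        }
      }
    }

Running in local[2] keeps the edit/compile/test loop fast; the
Docker/standalone deployment would then be the second, heavier verification
step.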

Suggestions?