[EDP] Add a Spark job type (instead of overloading Java)
Spark EDP was initially implemented using the Java job type. However, it is better to support a dedicated Spark job type, for several reasons:
* the semantics are slightly different: Spark requires a "main" application jar, with supporting libs optional, whereas the Java job type treats all of its binaries as libs (see the sketch after this list)
* the Spark job type may someday support Python apps
* the possible config set for Spark will differ from Java's (although both use configs in the edp.java. namespace)
* the Spark and Java job types may diverge in other ways in the future
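To make the first point concrete, here is a minimal sketch of registering a Spark job through python-saharaclient, roughly as the client stood around Juno. The credentials, endpoint, and job-binary IDs are placeholders, not values from this blueprint:

    # Hedged sketch: registering a Spark EDP job with python-saharaclient.
    # Credentials, URLs, and binary IDs below are illustrative placeholders.
    from saharaclient.api import client as sahara_client

    sahara = sahara_client.Client(username="demo",
                                  api_key="secret",
                                  project_name="demo",
                                  auth_url="http://keystone:5000/v2.0/")

    # A Spark job names exactly one "main" application jar; supporting
    # libs are optional. (A Java job, by contrast, carries everything
    # in "libs" with no distinguished main jar.)
    job = sahara.jobs.create(
        name="spark-wordcount",
        type="Spark",
        mains=["<main-application-jar-binary-id>"],  # required for Spark
        libs=["<supporting-jar-binary-id>"],         # optional
        description="Spark wordcount submitted through EDP")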
This will need support in sahara-api and the dashboard; there should be no data model or client impact.
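Because the change is confined to sahara-api and the dashboard, an existing client can launch the new job type with the usual job-execution call; only the configs differ. A hedged sketch continuing the example above (the cluster ID and main class are placeholders; edp.java.main_class is the shared config noted in the list):

    # Hedged sketch: launching the Spark job created above. Spark jobs in
    # EDP take no data-source IDs; input/output locations travel in "args".
    # "edp.java.main_class" is the config key shared with the Java job type.
    execution = sahara.job_executions.create(
        job_id=job.id,
        cluster_id="<spark-cluster-id>",  # placeholder
        input_id=None,
        output_id=None,
        configs={
            "configs": {"edp.java.main_class": "org.example.WordCount"},
            "args": ["hdfs://input/path", "hdfs://output/path"],
        })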
Blueprint information
- Status: Complete
- Approver: Sergey Lukjanov
- Priority: High
- Drafter: Trevor McKay
- Direction: Approved
- Assignee: Trevor McKay
- Definition: Approved
- Series goal: Accepted for juno
- Implementation: Implemented
- Milestone target: 2014.2
- Started by: Sergey Lukjanov
- Completed by: Sergey Lukjanov
Whiteboard
Gerrit topic: https:/
Addressed by: https:/
  [EDP] Add a Spark job type (instead of overloading Java)
Addressed by: https:/
  Implement EDP for a Spark standalone cluster
Addressed by: https:/
  Add a Spark job type for EDP