
Improve version detection

Description

Spark detects its own version upon startup by inspecting the metadata of its packaged code. This does not work when Spark runs from a development environment, where the code is not packaged. That in turn causes issues for plugins, which need to verify that they can run against the version of Spark that is being used.
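For illustration, a minimal sketch of the kind of metadata-based lookup described above, assuming the common approach of reading the jar manifest's Implementation-Version attribute (the class and method names here are hypothetical, not Spark's actual code):

```java
public class VersionDetector {
    /**
     * Reads the version from the jar manifest's Implementation-Version
     * attribute. When the classes are loaded from a directory instead of a
     * packaged jar (as in a development environment), there is no manifest
     * and the lookup returns null.
     */
    public static String detectVersion() {
        String version = VersionDetector.class.getPackage().getImplementationVersion();
        return version != null ? version : "unknown";
    }

    public static void main(String[] args) {
        System.out.println(detectVersion());
    }
}
```

When compiled classes are run directly from a development checkout, `getImplementationVersion()` returns null, which is exactly the failure mode this issue describes.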

Spark should obtain its version from something that's available earlier in the build process.
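One way to satisfy this, sketched below under assumptions not taken from the issue itself, is to generate a properties file during the build (for example via Maven resource filtering) and read it from the classpath at runtime; the resource name and key used here are hypothetical:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class BuildVersion {
    // Hypothetical resource, assumed to be generated at build time so that
    // it is present on the classpath even in a development checkout.
    private static final String RESOURCE = "/spark-version.properties";

    /**
     * Loads the version from a build-generated properties resource,
     * falling back to "unknown" when the resource is absent.
     */
    public static String load() {
        Properties props = new Properties();
        try (InputStream in = BuildVersion.class.getResourceAsStream(RESOURCE)) {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException e) {
            // Resource unreadable; fall through to the default below.
        }
        return props.getProperty("version", "unknown");
    }

    public static void main(String[] args) {
        System.out.println(load());
    }
}
```

Because the properties file would be produced early in the build rather than at packaging time, the same lookup works both for packaged releases and for code run directly from a development environment.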

Environment

None

Acceptance Test - Entry

None

Assignee

Guus der Kinderen

Reporter

Guus der Kinderen

Labels

None

Expected Effort

None

Ignite Forum URL

None

Fix versions

Affects versions

Priority

Minor