Spark detects its own version at startup by inspecting the metadata of the packaged code. This does not work when Spark is run from a development environment, where the code isn't packaged, so no version is detected. That breaks plugins, which need to verify that they can run against the version of Spark in use.
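For illustration, a minimal sketch of what this kind of detection typically looks like, assuming the metadata in question is the JAR manifest's `Implementation-Version` attribute (an assumption; the package and class names below are hypothetical, not Spark's actual code):

```java
package example; // hypothetical package name

// Package#getImplementationVersion() reads the Implementation-Version
// attribute from the JAR manifest. It returns null when the classes are
// loaded from a directory (e.g. in an IDE) rather than a packaged JAR,
// which is the failure mode described above.
public final class VersionDetection {
    public static String detectVersion() {
        return VersionDetection.class.getPackage().getImplementationVersion();
    }
}
```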
Spark should instead obtain its version from something that is produced earlier in the build process, so that it is available whether or not the code has been packaged.
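One common way to do this (a sketch under assumptions, not a proposal of Spark's exact implementation) is a properties file filtered at build time, e.g. via Maven resource filtering, and read from the classpath at runtime. Because resource filtering runs during compilation, the filtered file lands in `target/classes` and is present even when running unpackaged from an IDE. The file name `spark-version.properties` and the class below are hypothetical:

```java
package example; // hypothetical package name

// Assumes src/main/resources/spark-version.properties contains the line
//   version=${project.version}
// and that resource filtering is enabled, so the placeholder is replaced
// with the real version during the build.
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class BuildVersion {
    public static String read() {
        Properties props = new Properties();
        try (InputStream in = BuildVersion.class
                .getResourceAsStream("/spark-version.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (IOException ignored) {
            // fall through to the "unknown" default below
        }
        return props.getProperty("version", "unknown");
    }
}
```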