Spark should remove/overwrite old versions of libs during update
This is an old issue, but the recent switch to Maven has made it more critical, because libraries now arrive from Maven with version numbers in their file names. Spark's installer does not remove older files when performing an update, so it leaves old versions of Smack, JTattoo, etc. in the libs folder. It also leaves other stray files behind, such as the libs\windows folder on Linux (or the linux folder on Windows), although the build process has already been modified to avoid adding those for unsupported platforms. Another problem is that Spark appears to load the first version it finds, so when two versions of Smack are present it may use the older one. Perhaps the installer can be instructed to remove previous versions of files, or the files should use static names so the installer overwrites them (though that would not solve the problem of old, unneeded files being left behind).
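To illustrate the "first version found" hazard, here is a minimal sketch (a hypothetical helper, not part of Spark) that scans a list of jar names and flags artifacts present in more than one version, the situation the installer currently leaves behind:

```java
import java.util.*;
import java.util.regex.*;

public class DuplicateJarCheck {
    // Matches names like "smack-4.1.0.jar", capturing the artifact base name.
    private static final Pattern VERSIONED =
            Pattern.compile("^(.+?)-(\\d[\\w.\\-]*)\\.jar$");

    // Group jar file names by their artifact base name (version stripped).
    static Map<String, List<String>> groupByArtifact(List<String> jarNames) {
        Map<String, List<String>> groups = new TreeMap<>();
        for (String name : jarNames) {
            Matcher m = VERSIONED.matcher(name);
            String base = m.matches() ? m.group(1) : name;
            groups.computeIfAbsent(base, k -> new ArrayList<>()).add(name);
        }
        return groups;
    }

    public static void main(String[] args) {
        List<String> libs = Arrays.asList(
                "smack-4.1.0.jar", "smack-3.2.2.jar", "jtattoo-1.6.jar");
        groupByArtifact(libs).forEach((base, jars) -> {
            if (jars.size() > 1) {
                System.out.println("Multiple versions of " + base + ": " + jars);
            }
        });
        // Prints: Multiple versions of smack: [smack-4.1.0.jar, smack-3.2.2.jar]
    }
}
```

Which of the duplicates actually gets loaded depends on classpath ordering, which is exactly why it is unsafe to leave both on disk.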
Looks like uninstalling prior to installing the new version works fine. Only the newest files are left after the update. The -q switch works as well.
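For reference, this is the kind of sequence I tested (paths are hypothetical examples; -q is install4j's unattended mode switch):

```shell
REM Remove the old installation silently, then install the new version silently.
"C:\Program Files\Spark\uninstall.exe" -q
spark_2_8_0.exe -q
```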
Ah, here we go: as part of the installation process, this option can be added: "Uninstall previous installation." That's probably it.
At least on Windows, running the installer offers an "Update" option, which only overwrites files with the same name and leaves old files in place. I'm not sure whether this is by design. Openfire usually uninstalls the previous version before installing, even when you choose the Update option. Perhaps Spark could do the same.
I'm concerned that this happens. We can probably tweak the Maven project so that version numbers are not included in the library jar file names, but that will not resolve all problems. Since the installer does not delete files from an earlier version, it will likely not remove a library that was part of an old release but not of a newer one. That could lead to nasty problems.
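The Maven tweak could look something like the following POM fragment (a sketch, not the actual Spark build; it assumes the libs folder is populated with maven-dependency-plugin, whose copy-dependencies goal has a stripVersion parameter):

```xml
<!-- Hypothetical build tweak: copy dependencies without version numbers
     so the installer overwrites each jar in place on update. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-libs</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/libs</outputDirectory>
        <!-- stripVersion removes "-4.1.0" etc. from the copied jar names -->
        <stripVersion>true</stripVersion>
      </configuration>
    </execution>
  </executions>
</plugin>
```

As noted above, static names only fix the overwrite problem; a library dropped from a newer release would still linger unless the installer also deletes it.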
Surely, install4j (which we use to create our installers) has a fix for this. We're probably just using it wrong.