Spark's build should unpack .pack'd files at build time, not at first startup
End users don't always have write access to the directory where Spark is installed, so when the .pack files are automatically unpacked at first startup, the unpacked jars can't be written. We need to unpack the files at build time, not at startup time.
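As a sketch of the idea (the target and property names here are hypothetical, not taken from Spark's actual build.xml), a build-time unpack step in Ant could invoke the JDK's unpack200 tool on every .pack file before the installer is assembled:

```xml
<!-- Hypothetical Ant target: unpack .pack files during the build so the
     installed tree already contains plain jars. Assumes the JDK's
     unpack200 tool is on the PATH; ${build.lib.dir} is an assumed property. -->
<target name="unpack-pack200" description="Unpack .pack files at build time">
    <apply executable="unpack200" dest="${build.lib.dir}" parallel="false">
        <!-- every foo.jar.pack under the lib dir ... -->
        <fileset dir="${build.lib.dir}" includes="**/*.jar.pack"/>
        <!-- unpack200 takes the input .pack first, then the output jar -->
        <srcfile/>
        <targetfile/>
        <!-- ... becomes foo.jar next to it -->
        <mapper type="glob" from="*.jar.pack" to="*.jar"/>
    </apply>
</target>
```

With a step like this wired into the installer build, the installation directory never needs to be writable at runtime.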
Just about anything really
Pack200 is now disabled by default. It can be enabled in build.xml.
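The exact switch depends on Spark's build.xml; as a hedged illustration (the property and target names below are invented for the example), a disabled-by-default toggle in Ant often looks like this:

```xml
<!-- Hypothetical flag; Spark's real build.xml may use a different name.
     With classic Ant semantics, the target runs only if the property is set,
     so commenting the property out disables the pack200 step. -->
<!-- <property name="pack200.enabled" value="true"/> -->

<target name="pack200" if="pack200.enabled">
    <!-- pack200 compression steps would go here -->
</target>
```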
The .patch file is for the source, so you will have to compile Spark if you want to apply it. I'm not sure about the spark.install4j file, but I think you can't patch anything without compiling.
Hi guys, new installer of Spark here...
Can someone please advise how this patch actually works, i.e. how to patch the original installation file?
I am a little confused by the .patch file and the .install4j file ...
Any help appreciated...
Thank you !
I have used the evaluation version of install4j to update spark.install4j.
It now uses pack200 during the installation. The new installer is 9 MB,
a difference of 5 MB from before.
Pack200 compresses some (not all) of the jar files, to make Spark smaller for the installer.
The installer itself must support pack200; I have read in the install4j documentation that it does.
I think we should use install4j's pack200 support, which packs all jar files with pack200 compression while building the installer and decompresses them during the installation.
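install4j configures this through its own project file and compiler rather than through Ant, but the packing side of the round trip can be sketched as a hypothetical Ant step mirroring the unpack step (all names here are assumptions for illustration):

```xml
<!-- Hypothetical pack step (Ant, not install4j's own syntax): compress each
     jar with the JDK's pack200 tool while assembling the installer payload.
     ${installer.lib.dir} is an assumed property. -->
<target name="pack-jars">
    <apply executable="pack200" dest="${installer.lib.dir}" parallel="false">
        <arg value="--no-gzip"/>
        <!-- pack200 takes the output .pack first, then the input jar -->
        <targetfile/>
        <srcfile/>
        <fileset dir="${installer.lib.dir}" includes="**/*.jar"/>
        <mapper type="glob" from="*.jar" to="*.jar.pack"/>
    </apply>
</target>
```

At install time the reverse step (unpack200 on each .pack file) restores the original jars, which is what install4j does internally when its pack200 option is enabled.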
The Debian installer doesn't support pack200 at this time. If I have time, I will change the Debian installer so that it compresses all files with pack200 while building the installer and decompresses them during installation, like install4j does.