Spark's build should unpack .pack'd files at build time, not at first startup


End users don't always have write access to the directory where Spark is installed, so when the .pack files are automatically unpacked at first startup, the unpacked files can't be written. We need to unpack the files at build time, not at startup time.
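As a sketch of the proposed fix, a build-time Ant target could run unpack200 over the packed jars before the installer is assembled (target name, paths, and file layout here are hypothetical, not Spark's actual build.xml):

```xml
<!-- Sketch only: target name and directory layout are assumptions. -->
<target name="unpack-pack200" description="Unpack .pack.gz jars at build time">
    <!-- Run the JDK's unpack200 tool on every packed jar in the build's lib dir -->
    <apply executable="unpack200" dest="target/build/lib" parallel="false">
        <arg value="-r"/>                 <!-- remove the .pack.gz file after unpacking -->
        <srcfile/>
        <targetfile/>
        <fileset dir="target/build/lib" includes="**/*.jar.pack.gz"/>
        <mapper type="glob" from="*.jar.pack.gz" to="*.jar"/>
    </apply>
</target>
```

Running this during the build means the installed image already contains plain jars, so nothing needs to be written to the install directory at first startup.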


Just about anything really


Michael Will
October 28, 2009, 7:38 PM

The default is now disabled. It can be enabled in build.xml.
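A minimal sketch of what such a build.xml switch might look like, using Ant's standard if-property mechanism (the property and target names are assumptions, not the actual ones used):

```xml
<!-- Sketch: uncomment the property to enable pack200 compression in the build. -->
<!-- <property name="pack200.enabled" value="true"/> -->

<!-- The target only runs when the pack200.enabled property is set. -->
<target name="pack-jars" if="pack200.enabled">
    <apply executable="pack200" dest="target/build/lib" parallel="false">
        <arg value="--gzip"/>
        <targetfile/>
        <srcfile/>
        <fileset dir="target/build/lib" includes="**/*.jar"/>
        <mapper type="glob" from="*.jar" to="*.jar.pack.gz"/>
    </apply>
</target>
```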

July 3, 2009, 3:15 PM

The .patch file is for the source, so you will have to compile Spark if you want to apply it. I'm not sure about the spark.install4j file, but I think you can't patch anything without compiling.

Neil J
July 3, 2009, 12:22 PM

Hi guys, new installer of Spark here...
Can someone please advise how this patch actually works, i.e. how to patch the original installation file?

I am a little confused about using a .patch file and a .install4j file...
Any help appreciated...
Thank you!

Michael Will
February 12, 2009, 2:28 AM

I have used the evaluation version of install4j to update spark.install4j.
Now it uses pack200 during the installation. The new installer has a size of 9 MB.
That is a difference of 5 MB.

Michael Will
February 11, 2009, 6:37 AM

Pack200 compresses some (not all) jar files to make Spark smaller for the installer.

The installer must support pack200.

For Windows:
I have read in the install4j documentation that it supports pack200. I think we should use install4j's pack200 support, which packs all jar files for the installer with pack200 compression and
decompresses the jars during the installation.

For Linux:
The Debian installer doesn't support pack200 at this time. If I have time, I will change the Debian installer to support pack200,
so that it compresses all files with pack200 while building the installer and decompresses them during the installation, like install4j does.

Michael Will


Daniel Henninger