I've been using Spark for a while, and I can't remember exactly when I might have upgraded. I used to be able to use Spark to communicate with my MSN contacts (via an Openfire plugin) in Mandarin Chinese, but now, in Spark 2.6.3 under Linux Mint Debian Edition, any Chinese characters just come up as square boxes.
Other Java applications display Chinese fonts properly, so this isn't an issue with my Java or font installation in general, though maybe Spark is hard-coded to use a font I don't have? I've tried installing more Chinese fonts, but that has not fixed the problem.
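One quick way to confirm the desktop side of that (a hedged diagnostic, assuming the fontconfig tools are installed) is to ask fontconfig which font families claim Chinese coverage:

```shell
# List installed font families that cover Chinese (zh).
# If this prints several families but Spark still shows boxes, the
# problem is specific to the JRE Spark runs under, not the desktop.
fc-list :lang=zh family || echo "fc-list not available"
```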
The square boxes instead of Chinese characters are visible both in the main contact list and in chat windows.
The correct Unicode data is being transmitted by the server: if I copy/paste characters from the Spark chat window into another window, they display properly. Also, if I hover over the icon for a Spark chat window on my gnome-panel, I see the properly rendered Chinese characters for my chat contact's name. The improperly rendered characters appear only within Spark itself. Characters I type into a chat window also don't render properly, so the problem has nothing to do with the server side.
I'm pretty confident that this was working at some point, but I can't remember what version I was using before. I'm also experiencing this issue on another computer running Ubuntu 10.10.
If I go to the "Actions" menu and select "Languages," I also see boxes instead of characters for the relevant non-ASCII language names.
Linux (possibly Windows also)
I found a solution to this problem in my installation (run these from Spark's install directory, where the bundled jre folder lives):
mv jre jre.bak
ln -s /usr/lib/jvm/java-6-sun/jre/ jre
Then I restarted Spark, and it works. For anyone else having this problem, make sure you have a system JVM installed, and change /usr/lib/jvm/java-6-sun/jre/ to your $JAVA_HOME.
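The steps above can be sketched as a small helper. swap_jre is a hypothetical function name, and /opt/spark is a placeholder; substitute your actual Spark install directory and your $JAVA_HOME:

```shell
# Sketch of the fix above: swap Spark's bundled JRE for the system one.
swap_jre() {
  spark_home=$1   # directory containing Spark's bundled "jre" folder
  system_jre=$2   # the system JRE to link in instead
  cd "$spark_home" || return 1
  mv jre jre.bak              # keep the bundled JRE as a backup
  ln -s "$system_jre" jre     # Spark will now start with the system JRE
}

# Example invocation (adjust both paths, then restart Spark):
# swap_jre /opt/spark /usr/lib/jvm/java-6-sun/jre
```

Keeping jre.bak around means you can undo the change by removing the symlink and moving the backup back into place.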
I think the font.properties in Spark's bundled JRE may be wrong.
To test this bug, just check the language list under the Actions menu after installing on Linux. If the fonts are wrong, "中文-中国" will appear as a row of boxes instead of Chinese characters.
As the bundled JRE has been updated a few times in the Spark 2.7.0 builds, this should be fixed (the Chinese font in the Language selector now shows correctly).