Translations show corrupted non-English characters

Description

After the recent cleanup of all translation files, they somehow got broken. Although everything looks fine in the IDE and on GitHub, Spark is not showing special characters correctly. It only shows them correctly if I use \uXXXX escape codes for such characters. However, all translation files were converted to UTF-8 a while ago precisely to avoid such codes, and it was working fine for a while. I have done many translation edits since then and it was always OK. Not sure what happened this time.
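For context, the later comments point at java.util.ResourceBundle loading as the likely culprit: on Java 8, PropertyResourceBundle decodes .properties streams as ISO-8859-1, so raw UTF-8 characters come out corrupted while \uXXXX escapes still work. A minimal, self-contained sketch of that symptom (the key and value below are made up for illustration):

import java.io.InputStream;
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.PropertyResourceBundle;

public class Utf8BundleDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical translation entry, written to disk as raw UTF-8 bytes.
        Path file = Files.createTempFile("spark_i18n_", ".properties");
        Files.write(file, "greeting = Grüße, Привет\n".getBytes(StandardCharsets.UTF_8));

        // Java 8: the InputStream constructor decodes the stream as ISO-8859-1,
        // so the non-ASCII characters come out as mojibake (newer JDKs default to UTF-8).
        try (InputStream in = Files.newInputStream(file)) {
            System.out.println(new PropertyResourceBundle(in).getString("greeting"));
        }

        // Decoding with an explicit UTF-8 Reader keeps the characters intact.
        try (Reader reader = Files.newBufferedReader(file, StandardCharsets.UTF_8)) {
            System.out.println(new PropertyResourceBundle(reader).getString("greeting"));
        }

        Files.delete(file);
    }
}

Since Java 9 the default encoding for property resource bundles is UTF-8, which may be why the same files can look fine in one environment and broken in another.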

Environment

None

Activity

wroot
June 8, 2020, 5:14 AM

Thanks. Can you show exactly how your modified Res.java looks?

wroot
June 8, 2020, 5:34 AM

OK, I have figured it out and applied your workaround. Looks good with all languages. Thanks a lot!

Aleksandr Boyko
June 8, 2020, 6:30 AM

Also, the same workaround should be placed in org.jivesoftware.spark.plugin.flashing > FlashingResources.java, since it uses ResourceBundle too.
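The exact workaround is not quoted in this thread, but a common way to force UTF-8 loading of .properties bundles on Java 8 is a custom ResourceBundle.Control along these lines (the class name Utf8Control is a placeholder; the actual change applied to Res.java and FlashingResources.java may differ):

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Locale;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;

public class Utf8Control extends ResourceBundle.Control {
    @Override
    public ResourceBundle newBundle(String baseName, Locale locale, String format,
                                    ClassLoader loader, boolean reload) throws IOException {
        String bundleName = toBundleName(baseName, locale);
        String resourceName = toResourceName(bundleName, "properties");
        InputStream stream = loader.getResourceAsStream(resourceName);
        if (stream == null) {
            return null; // let ResourceBundle try its other candidates
        }
        try (InputStreamReader reader = new InputStreamReader(stream, StandardCharsets.UTF_8)) {
            // Decode the .properties file explicitly as UTF-8 instead of the
            // Java 8 default of ISO-8859-1. Cache/reload handling is omitted for brevity.
            return new PropertyResourceBundle(reader);
        }
    }
}

A bundle would then be requested with something like ResourceBundle.getBundle("i18n/spark_i18n", Locale.getDefault(), loader, new Utf8Control()); the base name here is a placeholder.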

wroot
June 8, 2020, 10:10 AM

Thanks. I have applied the same fix to the Flashing, ROAR, Transfer Guard, and Fastpath plugins.

wroot
June 8, 2020, 7:08 PM

Aleksandr, maybe you have an idea about ?

Resolution

Fixed

Assignee

wroot

Reporter

wroot

Labels

Expected Effort

None

Components

Affects versions

Priority

Blocker