Disclaimer:
This issue appears to have been solved by this commit from Mar 4, 2021:
https://github.com/apple/swift-corelibs-foundation/commit/5ed74d30b9a239af5f77dd6f74ae961dfe14b83e
On iOS 15 and newer, you can use the Decimal type for money decoding.
As easy and trivial as it seems, processing money — especially the amount part — in Swift can be a huge pain in the ass.
Imagine we’re given this simple task for the Money-Exchange-Rate app we’re developing in Swift:
Deserialize a transaction amount from JSON into a Swift struct
Easy!
This should take two minutes to code. Let’s dive into Xcode and write our proof of concept.
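Since the original screenshot isn’t reproduced here, here’s a minimal sketch of that proof of concept (the JSON payload and field names are assumptions):

```swift
import Foundation

// A naive first attempt: map the JSON number straight onto a Double.
struct Transaction: Decodable {
    let amount: Double
    let currency: String
}

let json = #"{"amount": 125.50, "currency": "EUR"}"#.data(using: .utf8)!
let transaction = try JSONDecoder().decode(Transaction.self, from: json)
print(transaction.amount) // 125.5
```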
Here comes the QA
Every QA engineer knows that input fields are the most common source of low-hanging bugs in any app.
“Let’s try to convert euros to Bahraini dinars,” he said.
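Roughly what QA ran into, sketched in code (using the amount from this story; the exact digits come from the platform’s Double-to-decimal conversion):

```swift
import Foundation

let json = #"{"amount": 9159795.995}"#.data(using: .utf8)!
struct Transaction: Decodable { let amount: Double }

let transaction = try JSONDecoder().decode(Transaction.self, from: json)
// print(transaction.amount) would look fine, but the error surfaces as soon as
// we need exact decimal digits, e.g. for a 3-decimal currency like BHD:
print(NSDecimalNumber(value: transaction.amount)) // 9159795.994999995
```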
Damn you, floating-point arithmetic!
Using a floating-point type is probably not the best choice here; let’s try Decimal. It was likely a mistake to use Double in the first place anyway.
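Switching the property to Decimal looks like the obvious fix. A sketch of that attempt (showing the pre-fix behavior this story is about):

```swift
import Foundation

let json = #"{"amount": 9159795.995}"#.data(using: .utf8)!
struct Transaction: Decodable { let amount: Decimal }

let transaction = try JSONDecoder().decode(Transaction.self, from: json)
print(transaction.amount) // still 9159795.994999995, not 9159795.995
```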
Why? Why isn’t Decimal with Codable working? It should!
Try “everything”
- Using NSDecimalNumber — nope, not Codable.
- Using Double and converting to NSDecimalNumber — nope, returns 9159795.994999995.
- Trying to get the amount as String — nope, cannot be deserialized (Expected to decode String but found a number instead.)
- Maybe there is something like DateDecodingStrategy for numbers! — nope, there is not.
- Searching for another number data type — nope, there is none that works.
- An hour-long googling session — nope, no solution (and not much information) on this problem.
- Questioning our programming skills and our knowledge of Swift and iOS itself — nope, still OK.
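Two of these dead ends, sketched (the struct names are just for illustration):

```swift
import Foundation

let json = #"{"amount": 9159795.995}"#.data(using: .utf8)!

// NSDecimalNumber is not Codable, so this doesn't even compile:
// struct Transaction: Decodable { let amount: NSDecimalNumber }

// Decoding the number as a String throws a typeMismatch error:
struct StringTransaction: Decodable { let amount: String }
do {
    _ = try JSONDecoder().decode(StringTransaction.self, from: json)
} catch {
    print(error) // "Expected to decode String but found a number instead."
}
```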
What is happening?
As we get desperate, we try to parse the number ourselves.
When we provide the string value of our problematic number to the Decimal initializer, everything is OK. So where is the problem? Why is the parser returning an inaccurate number when the “amount” data type is Decimal?
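A minimal check of the manual route:

```swift
import Foundation

// Feeding the string form of the problematic number straight into Decimal works fine:
let parsed = Decimal(string: "9159795.995")!
print(parsed) // 9159795.995, exactly as written
```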
Always Double
The problem behind this issue is JSONDecoder itself. It uses the JSONSerialization class for deserialization, and that class has really simple logic for parsing numbers: if the number is not an integer, it’s a Double. That’s it.
So as long as we’re using JSONDecoder, there is no way to tell it that we don’t want to go through Double first. When your data type is Decimal, JSONDecoder will parse the number as a Double and then convert it to Decimal. This is bullshit.
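A sketch of what happens under the hood, assuming the JSONSerialization behavior described above (the printed digits may vary slightly by platform):

```swift
import Foundation

let data = #"{"amount": 9159795.995}"#.data(using: .utf8)!

// JSONSerialization parses the non-integer number into a Double-backed NSNumber...
let object = try JSONSerialization.jsonObject(with: data) as! [String: Any]
let number = object["amount"] as! NSNumber

// ...so by the time the value becomes a Decimal, the precision is already gone:
print(Decimal(number.doubleValue)) // 9159795.994999995...
```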
A pull request that solves this issue exists on GitHub! Hurray! It will take some time to gut the necessary pieces, but with a little effort, we’ll end up with our own “forked” BetterJSONDecoder.
As you can see in this pull request, precision for Decimal is much better, but huge numbers (in terms of exponent) will still be represented as Double. That can cause trouble when handling really small fractions in cryptocurrencies or huge amounts caused by hyperinflation.
You can also dig deeper into this issue on the Swift bug tracker.
Solutions
In the end, you can solve this issue with one of four approaches.
Rounding 😡
You can round the parsed number to 2 (or, in some currencies, 3) decimal places. This is a low-effort, fast solution that will work in most cases.
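A minimal sketch of that workaround; the `rounded(scale:)` helper is our own, built on NSDecimalRound:

```swift
import Foundation

extension Decimal {
    /// Rounds to the given number of decimal places (e.g. 2 for EUR, 3 for BHD).
    func rounded(scale: Int) -> Decimal {
        var input = self
        var output = Decimal()
        NSDecimalRound(&output, &input, scale, .plain)
        return output
    }
}

let decoded = Decimal(9159795.994999995) // what JSONDecoder handed us
print(decoded.rounded(scale: 3))         // 9159795.995
```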
Own Deserialization ☹️
As seen above, you can write your own JSON deserializer, modify the one from the pull request, or use a 3rd-party lib (but be careful: the popular ObjectMapper has this same issue, as it uses the JSONSerialization class too). This can take some time to develop and could bring more issues in the future. But if you need to do calculations with the amount, right now there is no “native” way to do it. Just note that there can still be an issue with huge numbers.
Use String 😌
If you own the entire stack of your app (including the backend) or can get the backend developer to change it, use a string. Java and other languages handle big numbers much better than Swift. It’s sad, but this is the easiest and best solution (and one with a small impact on other platforms). It’s also the solution I ended up with.
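A sketch of the string-based contract, assuming the backend now sends the amount as "9159795.995":

```swift
import Foundation

struct Transaction: Decodable {
    let amount: Decimal

    private enum CodingKeys: String, CodingKey { case amount }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        let raw = try container.decode(String.self, forKey: .amount)
        // en_US_POSIX pins the decimal separator to "." regardless of device locale.
        guard let value = Decimal(string: raw, locale: Locale(identifier: "en_US_POSIX")) else {
            throw DecodingError.dataCorruptedError(forKey: .amount, in: container,
                                                   debugDescription: "Invalid decimal string: \(raw)")
        }
        amount = value
    }
}

let json = #"{"amount": "9159795.995"}"#.data(using: .utf8)!
print(try JSONDecoder().decode(Transaction.self, from: json).amount) // 9159795.995
```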
Use significand and exponent 🤓
Another option that needs changes on the backend is a more “scientific” notation, as one of the commenters on Reddit pointed out: you can send two integers, which removes the need for a decimal point. Decimal already provides an initializer that takes exactly these values.
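A sketch, assuming a hypothetical payload like {"significand": 9159795995, "exponent": -3}:

```swift
import Foundation

struct Amount: Decodable {
    let significand: Int
    let exponent: Int

    // 9159795995 * 10^(-3) == 9159795.995, with no decimal point in transit.
    // (Negative amounts would need a sign field or a negative significand.)
    var value: Decimal {
        Decimal(sign: .plus, exponent: exponent, significand: Decimal(significand))
    }
}

let json = #"{"significand": 9159795995, "exponent": -3}"#.data(using: .utf8)!
let amount = try JSONDecoder().decode(Amount.self, from: json)
print(amount.value) // 9159795.995
```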
The Apple way
As my former colleague once told me: Apple does things that look easy and nice on stage during WWDC and that fit the needs of 95% of cases. But there is always that one case you need, you’re unable to do it, and you end up with your own solution.
This is a pretty common pattern with Apple. We don’t have to go much further: just try the DateDecodingStrategy I mentioned above. For .iso8601, the documentation says: Decode the `Date` as an ISO-8601-formatted string (in RFC 3339 format). I dare you to deserialize this valid case: 2002-10-02T15:00:00.05Z.
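You can try it in a few lines; here is a sketch of both the failure and a custom-strategy workaround (ISO8601DateFormatter’s .withFractionalSeconds option requires iOS 11.2 or newer):

```swift
import Foundation

let json = #"{"date": "2002-10-02T15:00:00.05Z"}"#.data(using: .utf8)!
struct Event: Decodable { let date: Date }

let decoder = JSONDecoder()
decoder.dateDecodingStrategy = .iso8601
do {
    _ = try decoder.decode(Event.self, from: json)
} catch {
    print(error) // dataCorrupted: the .iso8601 strategy rejects fractional seconds
}

// Workaround: a custom strategy with fractional seconds enabled.
let formatter = ISO8601DateFormatter()
formatter.formatOptions = [.withInternetDateTime, .withFractionalSeconds]
decoder.dateDecodingStrategy = .custom { decoder in
    let string = try decoder.singleValueContainer().decode(String.self)
    guard let date = formatter.date(from: string) else {
        throw DecodingError.dataCorrupted(.init(codingPath: decoder.codingPath,
                                                debugDescription: "Invalid ISO 8601 date: \(string)"))
    }
    return date
}
print(try decoder.decode(Event.self, from: json).date) // 2002-10-02 15:00:00 +0000
```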