I was hoping to make my first ‘real’ blog post something interesting and useful.
Instead, it’s going to be a laughing rant over the wonders of bit lengths.
Once upon a time, it was necessary to count every byte of data you put into something. If an 8-bit number served your purpose, you used exactly 8 bits and tried to shave some memory out elsewhere. You didn’t use 16-bit — or wider — lengths unless you really needed them, because every last bit was precious.
These days, we’re mostly past that. We just say ‘int’ and are done with it, without really thinking about how wide it is. After all, the default of 32 bits is huge, and while some applications can hit that limit, many, many more won’t. (And with 64-bit OSes becoming more and more common, I would not be surprised to see the default int length become 64…).
So, when I created my Core Data schema, I picked a nice, sensible 16-bit length. Given that I was transitioning from storing a version date to a version number, it made sense. We’re not going to have hundreds or thousands of versions of an ad.
Unfortunately, the server side of the code hasn’t made that transition yet. It is still based on dates, and the datetime values are reduced to a 32-bit Unix timestamp to transfer them to the device.
And a 32-bit value, of course, doesn’t fit into a 16-bit field. So I just spent several hours debugging code which reduces, logically, to the following:
For some strange reason, I always got into the if block. It was only after a long stretch of debugging that I discovered the problem was the above code — at which point it didn’t take too long to figure out the actual cause. Just… frustrating to get there.