Fix signed/unsigned comparisons properly
Apparently, gcc only warns about [u]int16_t comparisons on platforms where sizeof(int) == sizeof(int16_t). Integer promotion happens here, and gcc only warns when it changes the comparison result.

The maximum buffer length is 1<<16 (when the window size is 15), in which case end takes values >= 0x8000, and start takes values between 0 and 0x8000, inclusive. This means we really want to compare unsigned pos to signed start (except when start is 0x8000), so comparing them as either int16_t or uint16_t would be incorrect. However, the difference between end-1 and start is always strictly less than 1<<window_size, which makes it positive when interpreted as int16_t.

The only other case in which the difference between pos and start may reach a positive value on 16-bit platforms is when indexing is used, the window size is 15, and the backlog is empty: start is then 0x8000, and pos gets the sentinel end-of-list value 0xFFFF from the index. This case is handled by changing the sentinel value to 0 when the backlog is not full. (Updated, see below.)

Conflicts:
	heatshrink_encoder.c

EDIT: Since commit 7e818d55 eliminates the backlog-full calculations by initially filling the backlog with zeroes (i.e., the backlog is always full), the special-case sentinel value of 0 is no longer necessary.

Closes #19.