What really frustrates me about that is that someone put in a lot of effort to spell these things out in proper words, and yet it still isn’t really more readable.
Like, sure, unsigned is very obvious. But short, int, long and long long don’t really tell you anything except “this can fit more or less data”. That same idea can be expressed with a growing number, e.g. i16, i32 and i64.
And when someone actually needs to know how much data fits into each type, the latter approach is just better, because it says so right on the tin.
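For what it’s worth, C itself has had these self-describing names since C99, via <stdint.h>, just spelled int16_t/int32_t/int64_t instead of i16/i32/i64. A minimal sketch (assuming a C99 toolchain):

```c
/* Fixed-width types from <stdint.h>: the width is right in the name,
   the same way i16/i32/i64 work in languages like Rust. */
#include <stdint.h>
#include <inttypes.h> /* PRId16/PRId32/PRId64 printf macros */
#include <stdio.h>

int main(void) {
    int16_t a = 32767;               /* exactly 16 bits, guaranteed */
    int32_t b = 2147483647;          /* exactly 32 bits */
    int64_t c = 9223372036854775807; /* exactly 64 bits */

    printf("a=%" PRId16 " b=%" PRId32 " c=%" PRId64 "\n", a, b, c);
    return 0;
}
```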

Huh, so if you don’t opt for these more specific number types, your program will explode sooner or later, depending on the architecture it’s run on…?
I guess times were different back when C was created, with register sizes still much more in flux. But yeah, from today’s perspective, that seems terrifying. 😅
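If anyone wants to see that architecture-dependence for themselves, here’s a rough sketch (again assuming a C99 compiler). The standard only pins down minimum widths (short and int at least 16 bits, long at least 32, long long at least 64), so the printed sizes genuinely vary between platforms:

```c
/* Print the actual sizes of the keyword types on this machine.
   These are only guaranteed minimums by the C standard, so the
   output differs across platforms. */
#include <stdio.h>

int main(void) {
    printf("short:     %zu bytes\n", sizeof(short));
    printf("int:       %zu bytes\n", sizeof(int));
    printf("long:      %zu bytes\n", sizeof(long));      /* 4 on 64-bit Windows, 8 on 64-bit Linux/macOS */
    printf("long long: %zu bytes\n", sizeof(long long));
    return 0;
}
```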