Alt text: Transcendence meme template

function main() {…}

int main() {…}

void main() {…}

U0 main() {…}

/* HolyC example */
U0 Main()                       /* U0 is HolyC's zero-size, void-like return type */
{
  U8 *message = "hello world";  /* U8 is an unsigned 8-bit integer */
  "%s\n",message;               /* a bare format-string expression implicitly calls Print */
}
Main;                           /* HolyC lets you call a no-argument function without parentheses */
      • Ephera@lemmy.ml · 3 days ago

        What really frustrates me about that is that someone put in a lot of effort to be able to write these things out using proper words, but the result still isn’t really any more readable.

        Like, sure, unsigned is very obvious. But short, int, long and long long don’t really tell you anything except “this can fit more or less data”. The same concept can be expressed with a growing number, e.g. i16, i32 and i64.

        And when someone actually needs to know how much data fits into each type, well, then the latter approach is just better, because it says so right on the tin.
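
        (For what it’s worth, C99’s <stdint.h> already provides those self-describing names. A minimal sketch of the two spellings side by side; the i16/i64 aliases are Rust-style names used purely for illustration, not standard C:)

        #include <stdint.h>
        #include <stdio.h>

        /* Rust-style aliases for illustration; not standard C spellings */
        typedef int16_t i16;
        typedef int64_t i64;

        int main(void)
        {
            /* the classic spelling only guarantees relative sizes... */
            printf("short: %zu, long long: %zu bytes\n", sizeof(short), sizeof(long long));
            /* ...while the numeric spelling says the width right on the tin */
            printf("i16: %zu, i64: %zu bytes\n", sizeof(i16), sizeof(i64));
        }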

        • labsin@sh.itjust.works · 2 days ago

          In C they do indeed just mean “shorter” and “longer” int: the size of an int is defined by the compiler and target, and originally reflected the hardware’s native word size.

          For fixed or minimum widths there are types like int32_t or int_least16_t in <stdint.h>.
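
          (A minimal sketch of the difference; the printed sizes depend on the target, and %zu is printf’s conversion for size_t:)

          #include <stdint.h>
          #include <stdio.h>

          int main(void)
          {
              int32_t exact = 0;        /* exactly 32 bits; only defined if the target has such a type */
              int_least16_t least = 0;  /* smallest type with at least 16 bits; always defined */
              int_fast16_t fast = 0;    /* “fastest” type with at least 16 bits; always defined */
              printf("%zu %zu %zu\n", sizeof exact, sizeof least, sizeof fast);
          }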

            • Ephera@lemmy.ml · 2 days ago

            Huh, so if you don’t opt for these more specific number types, then your program will explode sooner or later, depending on the architecture it’s being run on…?

            I guess times were different back when C was created, with register sizes still much more in flux. But yeah, from today’s perspective, that seems terrifying. 😅
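
            (A classic concrete example, assuming a target where int is 16 bits, as on many old DOS and embedded compilers:)

            #include <stdio.h>

            int main(void)
            {
                /* 86400 fits easily in a 32-bit int, but where int is 16 bits
                   this multiplication overflows, which is undefined behavior */
                int seconds_per_day = 24 * 60 * 60;
                printf("%d\n", seconds_per_day);
            }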

              • fiqusonnick@lemmy.blahaj.zone · 15 hours ago

              The C standard for the different ints is absolutely cursed, even after C99 tried to normalize it. The core requirement is just that sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long) and sizeof(char) == 1, plus some minimum ranges (short and int must hold at least 16 bits, long at least 32, long long at least 64). Mind you, they don’t pin down how many bits a byte is (CHAR_BIT only has to be at least 8), so you technically can have an architecture where all of those are 64 bits. Oh, and for that same reason the exact-width types (int32_t, uint16_t etc.) are not guaranteed to be defined

              Fuck
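
              (If you do depend on saner sizes, C11’s static_assert at least makes the build explode instead of the program; a minimal sketch:)

              #include <assert.h>   /* static_assert (C11) */
              #include <limits.h>   /* CHAR_BIT */

              /* refuse to compile on any target where these assumptions don't hold */
              static_assert(CHAR_BIT == 8, "byte is not 8 bits");
              static_assert(sizeof(int) == 4, "int is not 32 bits");
              static_assert(sizeof(long long) == 8, "long long is not 64 bits");

              int main(void) { return 0; }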

  • HiddenLayer555@lemmy.ml · edited · 3 days ago

    main =

    This message was brought to you by the Haskell gang

    let () =

    This message was brought to you by the OCaml gang

    This message was brought to you by the Python gang (only betas check __name__; assert your dominance and force every import to run your main routine /s)

  • Ephera@lemmy.ml · 3 days ago

    Oh man, a zero-byte-long unsigned integer? Lots of languages represent it as an empty tuple these days (the “unit” type), but from quickly scanning the documentation, it looks like HolyC doesn’t support tuples, so I guess you gotta get creative…