• 0 Posts
  • 38 Comments
Joined 2 years ago
Cake day: June 14th, 2023



  • What you can achieve in a couple of pages of Python can be pretty spectacular. It’s also mostly very easy to read, with the possible exception of class inheritance, which is a confusing mess.

    If you need to write more than a couple of pages, then its lack of types becomes a hindrance to me - doing refactors when functions can take basically any arguments is quite painful, for instance. Not requiring any particular structure is great, up until you start to struggle with the lack of structure.

    Ideal programming language for when you’re wanting to do something that would be a bit too unwieldy for a shell script. It also makes network requests and JSON parsing very straightforward, so it’s great for interacting with REST APIs and writing simple microservices. Fast to write and runs quite quickly, so a good choice for Advent of Code-like tasks. Would probably choose a different language for larger projects or when working in a team, though.
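
    As a small, hedged illustration of that convenience - standard library only, and the endpoint here is a made-up placeholder rather than any real service:

      # Fetch JSON from a (hypothetical) REST endpoint and pick a field out of it.
      import json
      import urllib.request

      url = "https://api.example.com/status"    # placeholder URL, not a real API
      with urllib.request.urlopen(url, timeout=5) as response:
          payload = json.load(response)          # parse the response body into a dict

      print(payload.get("status", "unknown"))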


  • I learned Z80 assembly back when the cutting edge of technology was a ZX Spectrum, and 68k assembly when I upgraded to an Amiga. That knowledge served me quite well in my early career in industrial automation - it was hard real-time coding on eZ80 and 65C02 processors, but the knowledge transfers.

    Back in the day, when input was mapped straight into a memory location and the display output was another memory location, assembly seemed like magic. Read the byte that corresponds to the right-hand middle row of the keyboard, check whether a certain bit is set in that byte, and you know a key is held down. Call your subroutine that copies a sequence of bytes into a known location. Boom, pressing a key updates the screen. Awesome.
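
    Very roughly, and sketched in Python rather than Z80 for readability - the addresses, bit position and glyph below are invented for illustration, not the Spectrum’s real memory map:

      # Pretend 64K address space: poll one “keyboard byte”, test a bit, and copy
      # a few bytes into “screen memory” when the key is held down.
      memory = bytearray(64 * 1024)

      KEYBOARD_ROW = 0xFDFE     # hypothetical location of one keyboard half-row
      SCREEN_BASE = 0x4000      # hypothetical start of the display file
      GLYPH = bytes([0x3C, 0x42, 0x42, 0x3C])   # a few bytes representing a character

      def key_held(bit):
          # Keyboard bits are typically active-low: a cleared bit means “pressed”.
          return not (memory[KEYBOARD_ROW] & (1 << bit))

      if key_held(0):
          # “Call your subroutine that copies a sequence of bytes into a known location.”
          memory[SCREEN_BASE:SCREEN_BASE + len(GLYPH)] = GLYPH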

    Modern assembly (x64 and the like) has masses of rules about stack pointer alignment, which you deal with so often that you might as well write a macro for it. Since the OS doesn’t let you write to system memory any more (a good thing), you need to make system calls and call library functions to do the same things. You do that so often that you might as well write a macro for that as well. Boom, now your assembly looks almost exactly like C. Might as well learn that instead.

    In fact, that’s almost the purpose of C - a more readable, somewhat portable assembly language. Experienced C developers will know which sequence of opcodes they’d expect from any language construct. It’s quite a simple mapping in that regard.

    It’s handy to know a little assembly occasionally, but unless you’re writing e.g. crypto implementations, which must take exactly the same time and power to execute regardless of the input, it’s impractical for almost any purpose nowadays.


  • Enough of that crazy talk - plainly WheeledDeviceServiceFactoryBeanImpl is where the dependency injection annotations are placed. If you can work out what the code does without stepping through it with a debugger, and any backtrace doesn’t have at least two hundred lines of Spring Boot in it, then plainly it isn’t enterprise enough.

    Fair enough, though. You can write stupid overly-abstract shit in any language, but Java does encourage it.



  • Well now. My primary exposure to Go would be using it to take first place in my company’s ‘Advent of Code’ several years ago, in order to see what it was like, after which I’ve been pleased never to have to use it again. Some of our teams have used it to provide microservices - REST APIs that do database queries, some lightweight logic, and conversion to and from JSON - and my experience of working with that is that they’ve inexplicably managed to scatter all the logic among dozens of files, for what might be done with 80 lines of Python. I suspect the problem in that case is the developers, though.

    It has some good aspects - I like how easy it is to do a static build that can be deployed in a container.

    The actual language itself I find fairly abominable. The lack of exceptions means that error handling is spread all through everything, and isn’t necessarily any better than in other modern languages. The lack of overloads means that you’ll have multiple definitions of e.g. math.Min cluttering things up. I don’t think the container classes are particularly good. And pointers seem to exist solely so that you can have null pointer exceptions - a pointless wart.

    If what you’re wanting to code is the kind of thing that Google do, in the exact same way that Google do it, and you have a team of hipsters who all know how it works, then it may be a fine choice. Otherwise I would probably recommend using something else.


  • I feel that Python is a bit of a ‘Microsoft Word’ of languages. Your own scripts are obviously completely fine, using a sensible and pragmatic selection of the language features in a robust fashion, but everyone else’s are absurd collections of hacks that fall to pieces at the first modification.

    To an extent, ‘other people’s C++ / Bash scripts’ have the same problem. I’m usually okay with ‘other people’s Java’, which to me is one of the big selling points of the language - the slight wordiness and the lack of ‘really stupid shit’ make collaboration easier.

    Now, a Python script that’s more than about two pages long? That makes me question its utility. The ‘duck typing’ everywhere makes any code that you can’t ‘keep in your head’ very difficult to reason about.
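
    A tiny, invented example of why - the annotated version is the same duck typing underneath, but at least gives a checker like mypy (and the next reader) something to hold on to:

      from typing import TypedDict

      def connect(config):
          # What is config? A dict? An object? A path? Only the call sites know.
          return config["host"], config["port"]

      class DbConfig(TypedDict):
          host: str
          port: int

      def connect_typed(config: DbConfig) -> tuple[str, int]:
          # Same behaviour, but a refactor that changes DbConfig now flags every caller.
          return config["host"], config["port"]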


  • Frezik has a good answer for SQL.

    In theory, Ansible should be used for creating ‘playbooks’ listing the packages and configuration files which are present on a server or collection of servers, and then ‘playing the playbook’ arranges it so that those servers exist and are configured as you specified. You shouldn’t really care how that is achieved; it is declarative.

    However, in practice it has input, output, loops, conditional branching, and the ability to execute subtasks recursively. (In fact, it can be quite difficult to stop people from using those features, since ‘declarative’ doesn’t necessarily come easily to everyone, and it makes for very messy config.) I think those are all the features required for Turing equivalence?

    Being able to deploy a whole fleet of servers in a very straightforward way comes as close to the ‘infinite memory’ requirement as any programming language can get, although you do need basically infinite money to do that on a cloud service.



  • A binary tree is one way of organising data, usually for sorting. Each node can have a left child, a right child, both, or neither.

      A
     / \
    B   C
       / \
      D   E
    

    “Inverting the tree” means swapping the children of each node, so that the order in which the nodes are visited is reversed. Depending on whether you want to copy the tree or invert it in place, the algorithm is slightly different. C++ provides iterators too, so an “order-reversed” iterator can be provided efficiently as well.

    You’re going to have to visit every node and do at least one swap for every node, and an efficient algorithm won’t do much more than that. Being unable to do it suggests that the student programmer doesn’t understand stacks or recursion yet, so they’ve more to learn.
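
    A minimal sketch of both variants, in Python rather than C++ for brevity - Node here is just an illustrative class, not from any particular library:

      class Node:
          def __init__(self, value, left=None, right=None):
              self.value, self.left, self.right = value, left, right

      def invert_in_place(node):
          # Swap the children at every node; each node is visited exactly once.
          if node is not None:
              node.left, node.right = invert_in_place(node.right), invert_in_place(node.left)
          return node

      def inverted_copy(node):
          # Build a new, mirrored tree and leave the original untouched.
          if node is None:
              return None
          return Node(node.value, inverted_copy(node.right), inverted_copy(node.left))

      # The tree from the diagram above: a pre-order walk that visited A, B, C, D, E
      # now visits A, C, E, D, B.
      root = Node("A", Node("B"), Node("C", Node("D"), Node("E")))
      invert_in_place(root)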


  • Programming a robust global date-time system and having transparent conversion between metric and imperial/traditional units is just a warm-up to show that you can work with the truly demented currency system. Make sure everything is rounded off to the nearest whole ha’penny.
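
    A toy sketch of quite how demented, with invented helper names - twelve pence to the shilling, twenty shillings to the pound, and everything rounded to the nearest ha’penny:

      from fractions import Fraction

      PENCE_PER_SHILLING = 12
      PENCE_PER_POUND = 20 * PENCE_PER_SHILLING    # 240 pence to the pound

      def to_pence(pounds=0, shillings=0, pence=0):
          return (Fraction(pounds) * PENCE_PER_POUND
                  + Fraction(shillings) * PENCE_PER_SHILLING
                  + Fraction(pence))

      def round_to_halfpenny(pence):
          # Round to the nearest whole ha'penny (half of one penny).
          return Fraction(round(pence * 2), 2)

      def format_lsd(pence):
          whole = int(pence)                   # whole pennies
          half = "½" if pence - whole else ""  # anything left over is the ha'penny
          pounds, rest = divmod(whole, PENCE_PER_POUND)
          shillings, d = divmod(rest, PENCE_PER_SHILLING)
          return f"£{pounds} {shillings}s {d}{half}d"

      total = to_pence(pounds=1, shillings=19, pence=Fraction(11, 3))
      print(format_lsd(round_to_halfpenny(total)))   # £1 19s 3½d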


  • addie@feddit.uk to Programmer Humor@lemmy.ml · IEEE 754

    You can only store rational numbers as a ratio of two integers, and there are infinitely many more irrational numbers than rational ones - as soon as you took (almost any) root or did (most) trigonometry, your accurate ratio would count for nothing. Hardcore maths libraries get around this by keeping the “value in progress” as an expression for as long as possible, but working with expressions is exceptionally slow by computer standards - it takes quite a long time to keep them in their simplest form whenever you manipulate them.
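
    A quick illustration of the trade-off using Python’s standard fractions module - exact ratios survive ordinary arithmetic, but a single square root drops you back into approximation:

      from fractions import Fraction
      import math

      third = Fraction(1, 3)
      print(third + third + third == 1)   # True - exact, no 0.999... drift

      root_two = math.sqrt(2)             # irrational, so this is a 64-bit approximation
      print(Fraction(root_two))           # an enormous exact ratio... of the wrong number
      print(root_two * root_two == 2)     # False - the representation error shows through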


  • To be fair, compiling C code with a C++ compiler gets you all the warnings from C++'s stricter typing rules. That’s a big bonus for me, even if it only highlights the areas of your C that are likely to become a maintenance hazard - all those void* casts want some documentation about what assumptions make them safe. Clang will compile variable-length arrays in C++, so you might want to switch off that warning if you’ve used them intentionally. It just means that you can’t use designated initialisers, since C++ uses constructors for that and there’s no C equivalent. I’d be happy describing code that compiles in either situation as “C+”.

    Also stops anyone using auto, constexpr or nullptr as variable names, which will help if you want to copy-paste some well-tested code into a different project later.


  • Man alive, don’t get the managers working with audio. “Doubling the stream” might work if you’re using a signed audio format rather than an unsigned one, and if the samples are in the same endianness as the host computer - neither of which is guaranteed when working with audio.

    But of course, the ear perceives loudness logarithmically (hence the decibel scale), so for something to be perceived as “twice as loud” it generally needs closer to a tenfold increase in power - roughly +10 dB - rather than a doubling. Very high and low frequencies need more still, since we’re less sensitive to them and don’t perceive increases so well.
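
    A rough sketch of both points, assuming plain signed 16-bit samples already in the host’s byte order (the easy case which, as above, you can’t actually rely on):

      import array

      def apply_gain_db(samples, gain_db):
          # Convert a decibel gain to a linear amplitude factor, then scale and clip.
          scale = 10 ** (gain_db / 20)
          return array.array("h", (max(-32768, min(32767, round(s * scale))) for s in samples))

      pcm = array.array("h", [1000, -2000, 3000])   # signed 16-bit samples
      doubled = apply_gain_db(pcm, 6.0)     # doubling the amplitude is only about +6 dB
      louder = apply_gain_db(pcm, 10.0)     # perceived “twice as loud” is nearer +10 dB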




  • Because if you disable browser autocomplete, what’s obviously going to happen is that everyone will have a text file open with every single one of their passwords in it, so that they can copy-paste them in. So prevent pasting too. But if you prevent that, then everyone will just choose terrible, weak passwords instead. Something like September2025! probably meets the ‘complexity’ requirement…
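
    For example, a naive ‘complexity’ check of the sort being described - an invented but typical rule set - is perfectly happy with it:

      import re
      import string

      def meets_complexity(pw):
          # At least 8 characters, one upper, one lower, one digit, one "special".
          return (len(pw) >= 8
                  and re.search(r"[A-Z]", pw) is not None
                  and re.search(r"[a-z]", pw) is not None
                  and re.search(r"\d", pw) is not None
                  and any(c in string.punctuation for c in pw))

      print(meets_complexity("September2025!"))                # True, and trivially guessable
      print(meets_complexity("correct horse battery staple"))  # False, despite being far stronger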


  • addie@feddit.uk to Programmer Humor@programming.dev · Psychopath Dev

    A bit like when we renamed all the master/slave terminology using different phrasing that’s frankly more useful a lot of the time, I think it’s about time we got rid of this “child” task nonsense. I suggest “subtask”. Then we can reword these books into something that no-one can make stupid jokes about any more, like “how to keep your subs in line” and “how to punish your subs when they’ve misbehaved”.


  • Well now. When setting password requirements at work, we’ve had to enforce a bizarre combination of “you must have a certain level of complexity”, but also, “you must be slightly vague about what the requirements actually are, because otherwise it lets an attacker tune a dictionary attack against you”. Which just strikes me as a way to piss off our users, but the security team say it’s a requirement, therefore it’s a requirement - no arguing.

    “One” special character is crazy; I’d have guessed that was a catch-all for the other strange password requirements:

    • can’t have the same character more than twice in a row
    • can’t be one of the ten-thousand most popular passwords (which is mostly a big list of swears in Russian)
    • all whitespace must be condensed into a single character before checking against the other rules (see the sketch below)
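
    A sketch of how those might be implemented - the blacklist here is a three-entry stand-in for the real ten-thousand-entry list:

      import re

      COMMON_PASSWORDS = {"password", "123456", "qwerty"}   # placeholder blacklist

      def passes_odd_rules(pw):
          pw = re.sub(r"\s+", " ", pw)        # condense each whitespace run to a single character
          if re.search(r"(.)\1\1", pw):       # the same character three or more times in a row
              return False
          if pw.lower() in COMMON_PASSWORDS:  # the "most popular passwords" list
              return False
          return True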

    We’ve had customers’ own security teams asking us if we can enforce “no right click” / “no autocomplete” to stop their in-house users from doing such things; I’ve been trying to push back on that as a security misfeature, but you can’t question the cult thinking.