I've found debuggers practically unusable with asynchronous code. If you've got a timeout in there, it keeps counting in real time, so it fires while you're stopped at a breakpoint.
Theoretically, this could be solved by 'pausing' the clock that drives the timeouts, but that's not trivial.
At least, I haven't seen it solved anywhere yet.
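To illustrate what I mean, here's a minimal Python/asyncio sketch (the function names are made up). The timeout is measured against the wall clock, so any time you spend sitting at a breakpoint counts against it:

```python
import asyncio

async def fetch_data():
    # Imagine a breakpoint in here: while you're stopped,
    # the wall clock keeps advancing even though no code runs.
    await asyncio.sleep(0.1)  # stand-in for some I/O call
    return "result"

async def main():
    try:
        # A 2-second timeout against the wall clock. Sit at a breakpoint
        # inside fetch_data() for 3 seconds, resume, and this raises
        # TimeoutError even though the code itself was "fast enough".
        result = await asyncio.wait_for(fetch_data(), timeout=2.0)
        print(result)
    except asyncio.TimeoutError:
        print("timed out - the debugger pause counted against the timeout")

asyncio.run(main())
```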
I mean, I'm still hoping I'm wrong about the above, but at this point I find it kind of ridiculous that debuggers are so popular to begin with.
Because it implies that synchronous code, or asynchronous code without timeouts, is still quite popular.
it isn't?
I'm sure it is, I'm just not terribly happy about that fact.
Thing is, any code you write ultimately starts via input and funnels into output. Those two ends have to be asynchronous, because IO fundamentally is.
That means if, at any point between the I and the O, you want to write synchronous code, then you have to block execution of that synchronous code while output is happening. And if you're not at least spawning a new thread per input, you may even block your ability to handle new input.
That can be fine if your program has only one job to do at a time. But as soon as it needs to do a second job, that blocking becomes a problem and you'll need to refactor lots of things into asynchronous code.
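Rough sketch of what that blocking looks like, using a hypothetical single-threaded socket server (everything here is made up for illustration):

```python
import socket

def handle(request: bytes) -> bytes:
    # The synchronous "business logic" in the middle.
    return b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok"

def serve_blocking(host="127.0.0.1", port=8080):
    # While this loop is busy reading from or writing to one client
    # (the "O" end), it cannot accept() the next connection (the "I" end).
    # One slow client stalls everyone.
    with socket.create_server((host, port)) as srv:
        while True:
            conn, _addr = srv.accept()      # blocks waiting for input
            with conn:
                request = conn.recv(4096)   # blocks on the client
                conn.sendall(handle(request))  # blocks until output is flushed
```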
If you just build it as asynchronous from the start, it's significantly less painful.
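And the same thing written asynchronously from the start, again just a sketch with asyncio's stream API: each connection becomes its own task, so a slow client only suspends its own coroutine while the loop keeps accepting new input.

```python
import asyncio

async def handle_conn(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
    request = await reader.read(4096)  # suspends instead of blocking the whole program
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    await writer.drain()               # output happens without stalling other jobs
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_conn, "127.0.0.1", 8080)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```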
But yeah, I guess it's the usual case of synchronous code being fine for small programs, so tons of programmers never learn to feel comfortable with asynchronous code...