• WalterBright 16 hours ago

    > Originally, if you typed an unknown command, it would just say "this is not a git command".

    Back in the 70s, Hal Finney was writing a BASIC interpreter to fit in 2K of ROM on the Mattel Intellivision system. This meant every byte was precious. To report a syntax error, he shortened the message for all errors to:

        EH?
    
    I still laugh about that. He was quite proud of it.
    • zubairq 5 hours ago

      Pretty cool... I had no idea Hal was such a hacker on personal computers in those days... it makes me think of Bitcoin whenever I hear Hal mentioned

      • vunderba 13 hours ago

        > EH?

        I feel like that would also make a good response from the text parser in an old-school interactive fiction game.

        Slightly related, but I remember some older variants of BASIC using "?" to represent the PRINT statement - though I think it was less about memory and more just to save time for the programmer typing in the REPL.

        • chuckadams 11 hours ago

          It was about saving memory by tokenizing keywords: '?' is how PRINT was actually stored in program memory; it just rendered as 'PRINT'. Most other tokens were typically the first two characters, the first lowercase, the second uppercase: I remember LOAD was 'lO' and DATA was 'dA', though in the C64's default character glyphs they usually looked like L<box char HN won't render> and D<spade suit char>.

          All this being on a C64 of course, but I suspect most versions of Bill Gates's BASIC did something similar.

          • egypturnash 10 hours ago

            C64 BASIC was tokenized into one byte per keyword, with the most significant bit set: https://www.c64-wiki.com/wiki/BASIC_token

            Each command could be typed in two ways: the full name, or the first two letters, with the second capitalized. Plus a few exceptions like "?" turning into the PRINT token ($99, nowhere near the PETSCII value for ?) and π becoming $FF.

            The tokens were expanded into full text strings when you LISTed the program. That was always amusing when a very dense multi-statement line expanded to more than the 80 characters the C64's tokenizer routine could handle: you'd have to go back and replace some or all commands with the short form before you could edit it.
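
            A rough sketch of the expansion half of that scheme (token values from the linked wiki page; table heavily abridged - an illustration, not the actual ROM routine):

                // Detokenize a stored BASIC line back to text, as LIST does.
                #include <cstddef>
                #include <cstdint>
                #include <string>

                struct Token { std::uint8_t code; const char* name; };
                // Token bytes have the most significant bit set; $99 is PRINT, $FF is pi.
                const Token kTokens[] = {{0x99, "PRINT"}, {0x8F, "REM"}, {0x89, "GOTO"}};

                std::string detokenize(const std::uint8_t* bytes, std::size_t n) {
                    std::string out;
                    for (std::size_t i = 0; i < n; ++i) {
                        if (bytes[i] & 0x80) {                   // token byte: expand to keyword
                            for (const Token& t : kTokens)
                                if (t.code == bytes[i]) { out += t.name; break; }
                        } else {
                            out += static_cast<char>(bytes[i]);  // ordinary character
                        }
                    }
                    return out;
                }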

            • mkesper 6 hours ago

              As far as I remember, you couldn't even run these programs anymore after listing them.

              • LocalH 29 minutes ago

                You could run them just fine as long as you didn't try to edit the listed lines if they were longer than two screen lines. The same is true for a C128 in C128 mode, except the limit is extended to 160 characters (four 40-column lines).

            • fragmede 10 hours ago

              D♠

          • nikau 15 hours ago

            How wasteful, ed uses just ? for all errors, a 3x saving

            • ekidd 14 hours ago

              Ed also uses "?" for "Are you sure?" If you're sure, you can type the last command a second time to confirm.

              The story goes that ed was designed for running over a slow remote connection where output was printed on paper, and the keyboard required very firm presses to generate a signal. Whether this is true or folklore, it would explain a lot.

              GNU Ed actually has optional error messages for humans, because why not.

              • llm_trw 10 hours ago

                So many computer conventions evolved for very good reasons rooted in physical limitations.

                When each line of code was its own punch card, having a { stand alone on a line was somewhere between stupid and pointless. That also explains why Lisps were so hated for so long.

                By the same token, today you can tell which projects treat an IDE as the only way to write code, because of the terrible documentation. It is, after all, not the end of the world to have to read a small function when you can just tab to see it. Which is true enough, until those small functions call other small functions and you're in a stack 30 deep, trying to figure out where the option you passed at the top went.

                • teraflop 10 hours ago

                  https://www.gnu.org/fun/jokes/ed-msg.en.html

                  "Note the consistent user interface and error reportage. Ed is generous enough to flag errors, yet prudent enough not to overwhelm the novice with verbosity."

                  • fsckboy 43 minutes ago

                    >not to overwhelm the novice with verbosity

                    That doesn't make complete sense: in unixland it's the old-timers who understand the beauty of silence and brevity, while novices scan the screen or page around the new prompt for evidence that something happened.

                    • Vinnl 14 minutes ago

                      If I didn't know any better, I'd have thought they weren't entirely serious.

                  • p_l 12 hours ago

                    /bin/ed did in fact evolve on very slow teletypes that used roll paper.

                    That made the option to print file contents with line numbers very useful. (I personally only used very dumb terminals rather than a physical teletype, but the experience is a bit similar, just with a shorter scrollback :D)

                    • euroderf 4 hours ago

                      Can confirm. Using ed on a Texas Instruments dial-up terminal (modem for phone handset) with a thermal printer.

                      And taking a printed listing before heading home with the terminal.

                  • nine_k 7 hours ago

                    There are really few systems where you can save a fraction of a byte! And if you need to output a byte anyway, it doesn't matter which byte it is. So you can indulge and use "?", "!", "*", or even "&" to signify various types of error conditions.

                    (On certain architectures, you could use 1-byte soft-interrupt opcodes to call the most used subroutine, but 8080 lacked it IIRC; on 6502 you could theoretically use BRK for that. But likely you had other uses for it than printing error diagnostics.)

                  • cma an hour ago

                    Earliest I've seen with 'Eh?' as an interpreter response is RAND's JOSS:

                    https://en.wikipedia.org/wiki/JOSS#/media/File:JOSS_Session....

                    https://en.wikipedia.org/wiki/JOSS

                    They had about 5KB of memory, but compared to the Intellivision, the machine weighed about 5,000 lbs.

                    • furyofantares 13 hours ago

                      I run a wordle spinoff, xordle, which involves two wordle puzzles on one board. This means you can guess a word and get all 5 letters green, but it isn't either of the target words. When you do this it just says "Huh?" on the right. People love that bit.

                      • dotancohen an hour ago

                        > People love that bit.

                        Add another seven Easter eggs, and people could love that byte.

                        • speerer 5 hours ago

                          Can confirm. I loved that bit.

                        • WalterBright 16 hours ago

                          I've been sorely tempted to do that with my compiler many times.

                          • euroderf 4 hours ago

                            Canadians everywhere.

                            • nl 15 hours ago

                              It'd be interesting and amusing if he'd made the private key to his part of Bitcoin a variation on that.

                              RIP.

                            • physicles 14 hours ago

                              The root cause here is poorly named settings.

                              If the original setting had been named something bool-y like `help.autocorrect_enabled`, then the request to accept an int (deciseconds) would've made no sense. Another setting, `help.autocorrect_accept_after_dsec`, would've been required. And `dsec` is so oddball that anyone who used it would've had to look it up.

                              I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

                              This is 100x more important when naming settings, because they're part of your public interface and you can't ever change them.

                              • TeMPOraL 14 hours ago

                                > I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

                                Same here. I'm still torn when this gets pushed into the type system, but my general rule of thumb in C++ context is:

                                  void FooBar(std::chrono::milliseconds timeout);
                                
                                is OK, because that's a function signature and you'll see the type when you're looking at it, but with variables, `timeout` is not OK, as 99% of the time you'll see it used like:

                                  auto timeout = gl_timeout; // or GetTimeoutFromSomewhere().
                                  FooBar(timeout);
                                
                                Common use of `auto` in C++ makes it a PITA to trace down the exact type when it matters.

                                (Yes, I use an IDE or a language-server-enabled editor when working with C++, and no, I don't have time to stop every 5 seconds to hover my mouse over random symbols to reveal their types.)

                                • codetrotter 8 minutes ago

                                  > Yes, I use an IDE or a language-server-enabled editor when working with C++, and no, I don't have time to stop every 5 seconds to hover my mouse over random symbols to reveal their types.

                                  JetBrains IDEs do a great thing where they show inferred types as inline labels all the time, instead of making you hover over everything.

                                  • OskarS 5 hours ago

                                    One of my favorite features of std::chrono (which can be a pain to use, but this part is pretty sweet) is that you don't have to specify an exact time unit in the signature, just a generic duration. So, combined with chrono literals, the first two calls below work just as expected (the third won't compile):

                                        using namespace std::chrono_literals; // needed for the ms / s suffixes

                                        std::this_thread::sleep_for(10ms); // sleep for 10 milliseconds
                                        std::this_thread::sleep_for(1s);   // sleep for one second
                                        std::this_thread::sleep_for(50);   // does not compile: a unit is required by the type system
                                    
                                    That's such a cool way to do it: instead of forcing you to specify the exact unit in the signature (milliseconds or seconds), you just say that it's a time duration of some kind, and let the user of the API pick the unit. Very neat!

                                    • theamk 10 hours ago

                                      It shouldn't matter though, because std::chrono durations are not int-convertible - so whether it's "milliseconds" or "microseconds" or whatever is a minor implementation detail.

                                      You cannot compile FooBar(5000), so there is never the kind of confusion C has. You have to write the explicit "FooBar(std::chrono::milliseconds(500))", or "FooBar(500ms)" if you have literals enabled. And conversions are handled where needed - you can always write FooBar(500ms) and it will work even if the actual parameter type is microseconds.

                                      Similarly, your "auto" example will only compile if gl_timeout is a compatible type, so you don't have to worry about units at all when all your intervals are using std::chrono.
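
                                      A minimal compile-check of the above (FooBar as in the upthread example; just a sketch):

                                          #include <chrono>
                                          using namespace std::chrono_literals;

                                          void FooBar(std::chrono::milliseconds timeout) { /* ... */ }

                                          int main() {
                                              // FooBar(5000);  // does not compile: no int -> duration conversion
                                              FooBar(std::chrono::milliseconds(500));
                                              FooBar(500ms);    // literal form
                                              FooBar(1s);       // seconds -> milliseconds converts implicitly (lossless)
                                          }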

                                      • physicles 12 hours ago

                                        Right, your type system can quickly become unwieldy if you try to create a new type for every slight semantic difference.

                                        I feel like Go strikes a good balance here with the time.Duration type, which I use wherever I can (my _msec example came from C). Go doesn't allow implicit conversion between distinct defined types, so your code ends up being very explicit about what's going on.

                                      • scott_w 4 hours ago

                                        Yes and it's made worse by using "deciseconds," a unit of time I've used literally 0 times in my entire life. If you see a message saying "I'll execute in 1ms," you'd look straight to your settings!

                                        • bmicraft 10 hours ago

                                          > Variables must have units in their names if there's any ambiguity

                                          Then you end up with something where you can write "TimeoutSec=60" as well as "TimeoutSec=1min" in the case of systemd :)

                                          I'd argue they'd have been better off not putting the unit there. But yes, aside from that particular weirdness I fully agree.

                                          • physicles 9 hours ago

                                            > Then you end up with something where you can write "TimeoutSec=60" as well as "TimeoutSec=1min" in the case of systemd :)

                                            But that's wrong too! If TimeoutSec is an integer, then don't accept "1min". If it's some sort of duration type, then don't call it TimeoutSec -- call it Timeout, and don't accept the value "60".

                                          • bambax 8 hours ago

                                            Yes! As it is, '1' is ambiguous, as it can mean "True" or '1 decisecond', and deciseconds are not a common time division. The units commonly used are either seconds or milliseconds. Using uncommon units should have a very strong justification.

                                            • yencabulator 14 hours ago

                                              I do that, but I can't help thinking that it smells like Hungarian notation.

                                              The best alternative I've found is to accept units in the values, "5 seconds" or "5s". Then just "1" is an incorrect value.
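
                                              A hand-rolled sketch of that validation (unit spellings are assumptions; a real config library would differ):

                                                  #include <cctype>
                                                  #include <chrono>
                                                  #include <optional>
                                                  #include <string>

                                                  // Accepts "500ms", "5s", "5 seconds", "1min"; a bare "1" is rejected.
                                                  std::optional<std::chrono::milliseconds> parse_duration(const std::string& s) {
                                                      std::size_t pos = 0;
                                                      long value = 0;
                                                      bool have_digit = false;
                                                      while (pos < s.size() && std::isdigit(static_cast<unsigned char>(s[pos]))) {
                                                          value = value * 10 + (s[pos] - '0');
                                                          have_digit = true;
                                                          ++pos;
                                                      }
                                                      while (pos < s.size() && s[pos] == ' ') ++pos;
                                                      const std::string unit = s.substr(pos);
                                                      if (!have_digit || unit.empty()) return std::nullopt;  // unitless values are errors
                                                      if (unit == "ms" || unit == "milliseconds") return std::chrono::milliseconds(value);
                                                      if (unit == "s" || unit == "seconds") return std::chrono::seconds(value);
                                                      if (unit == "min" || unit == "minutes") return std::chrono::minutes(value);
                                                      return std::nullopt;                                    // unknown unit
                                                  }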

                                              • physicles 12 hours ago

                                                That’s not automatically bad. There are two kinds of Hungarian notation: systems Hungarian, which duplicates information that the type system should be tracking; and apps Hungarian, which encodes information you’d express in types if your language’s type system were expressive enough. [1] goes into the difference.

                                                [1] https://www.joelonsoftware.com/2005/05/11/making-wrong-code-...

                                                • yencabulator 12 hours ago

                                                  And this is exactly the kind of thing the language should have a type for: Duration.

                                              • MrDresden 6 hours ago

                                                > I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

                                                Personally I flag any such use of int in code reviews, and instead recommend using value classes to properly convey the unit (think Second(2) or Millisecond(2000)).

                                                This of course depends on the language, its capabilities, and its norms.
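
                                                A bare-bones C++ rendering of the idea (names are illustrative, not from any particular library):

                                                    #include <cstdio>

                                                    // Distinct wrapper types so a raw int can't be passed by accident.
                                                    struct Millisecond {
                                                        explicit Millisecond(long v) : value(v) {}
                                                        long value;
                                                    };

                                                    struct Second {
                                                        explicit Second(long v) : value(v) {}
                                                        long value;
                                                        operator Millisecond() const { return Millisecond(value * 1000); } // lossless widening
                                                    };

                                                    void set_timeout(Millisecond t) { std::printf("timeout = %ld ms\n", t.value); }

                                                    int main() {
                                                        // set_timeout(2000);       // does not compile: constructors are explicit
                                                        set_timeout(Second(2));     // converts to Millisecond(2000)
                                                        set_timeout(Millisecond(2000));
                                                    }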

                                                • kqr 6 hours ago

                                                  I agree. Any time we annotate type information in the variable name, we've missed an opportunity to actually use the type system for it.

                                                  I suppose this is the "actual" problem with the git setting, insofar as there is an "actual" problem: the variable started out as a boolean, but then quietly turned into a timespan type, without triggering warnings on user configs whose values got reinterpreted as a result.

                                              • thedufer 12 hours ago

                                                > Now, why Junio thought deciseconds was a reasonable unit of time measurement for this is never discussed, so I don't really know why that is.

                                                xmobar uses deciseconds in a similar, albeit more problematic place - to declare how often to refresh each section. Using deciseconds is fantastic if your goal is for the numbers in example configs to be small enough that they clearly can't be milliseconds, leading people to the reasonable assumption that they must be seconds - and to running their commands 10 times as often as they intended. I've seen a number of accidental load spikes originating from this issue.

                                                • snet0 16 hours ago

                                                  This seems like really quite bad design.

                                                  EDIT: 1) is the result of my misreading of the article, the "previous value" never existed in git.

                                                  1) Pushing a change that silently breaks things by reinterpreting a previous configuration value (1 = true) as a different value (1 = 100ms confirmation delay) should pretty much always be avoided. Obviously you'd want to clear old values if they existed (maybe this did happen? it's unclear to me), but you'd also probably want to rename the configuration label.

                                                  2) Having `help.autocorrect`'s configuration argument be a time, measured in a non-standard (for most users) unit, is just plainly bad. Give me a boolean to enable, and a decimal to control the confirmation time.

                                                  • jsnell 16 hours ago

                                                    For point 1, I think you're misunderstanding the timeline. That change happened in 2008, during code review of the initial patch to add that option as a boolean, and before it was ever committed to the main git tree.

                                                    • iab 16 hours ago

                                                      “Design” to me implies an intentional, broad-context plan. This is not design, but an organic offshoot.

                                                      • snet0 16 hours ago

                                                        Someone thought of a feature (i.e. configurable autocorrect confirmation delay) and decided the interface should be identical to an existing feature (i.e. whether autocorrect is enabled). In my thinking, that second part is "design" of the interface.

                                                        • iab 11 hours ago

                                                          I think that is something that arose from happenstance, not thoughtful intent - as evidenced by how confusing the end result is.

                                                    • newman314 5 hours ago

                                                      For reference, Valtteri Bottas supposedly recorded a 40ms!!! reaction time at the 2019 Japanese Grand Prix.

                                                      https://www.formula1.com/en/video/valtteri-bottas-flying-fin...

                                                      • amai an hour ago

                                                        Most probably that was a false start:

                                                        "World Athletics rules that if an athlete moves within 100 milliseconds (0.1 seconds) of the pistol being fired to start the race, then that constitutes a false start."

                                                        https://www.nytimes.com/athletic/5678148/2024/08/03/olympics...

                                                        • arp242 an hour ago

                                                          That value has also been criticised as too high.

                                                        • dotancohen 34 minutes ago

                                                          I once had a .517 reaction time in a drag race. You know how I did that? By fouling too late. It was completely unrepeatable.

                                                          I'm willing to bet Bottas fouled that, too late (or late enough).

                                                          • voidUpdate 2 hours ago

                                                            Is there a random time between the red lights and the green lights, or is it always the same? Because that feels more like learning the timings than reacting to something

                                                            • jsnell 2 hours ago

                                                              Yes, the timing is random.

                                                          • userbinator 17 hours ago

                                                            IMHO this is a great example of "creeping featurism". At best it introduces unnecessary complexity, and at worst those reliant on it will be encouraged to pay less attention to what they're doing.

                                                            • cedws 16 hours ago

                                                              That's git in a nutshell. An elegant data structure masked by many layers of unnecessary crap that has accumulated over the years.

                                                            • zX41ZdbW 16 hours ago

                                                              > Which was what the setting value was changed to in the patch that was eventually accepted. This means that setting help.autocorrect to 1 logically means "wait 100ms (1 decisecond) before continuing".

                                                              The mistake was here. Instead of retargeting the existing setting for a different meaning, they should have added a new setting.

                                                                  help.autocorrect - enable or disable
                                                                  help.autocorrect.milliseconds - how long to wait
                                                              
                                                              There are similar mistakes in other systems, e.g., MySQL has

                                                                  innodb_flush_log_at_trx_commit
                                                              
                                                              which can be 0 (disabled) or 1 (enabled), with 2 later added as something special.

                                                              • stouset 15 hours ago

                                                                The “real” issue is an untyped configuration language which tries to guess at what you actually meant by 1. They’re tripling down on this by making 1 a Boolean true but other integers be deciseconds. This is the same questionable logic behind YAML’s infamous “no” == false.

                                                                • Dylan16807 14 hours ago

                                                                  I'd say the new addition is more of a special case of rounding than it is messing up types.

                                                                  • stouset 10 hours ago

                                                                    1 was also accepted as a Boolean true in this context, and it still is in other contexts.

                                                                    • Dylan16807 10 hours ago

                                                                      > 1 was also accepted as a Boolean true in this context, and it still is in other contexts.

                                                                      Is "was" before the change described at the end of the article, or after it?

                                                                      Before the change, any positive number implied that the feature is on, because that's the only thing that makes sense.

                                                                      After the change, you could say that 1 stops being treated as a number, but it's simpler to say it's still being treated as a number and is getting rounded down. The interpretation of various types is still messy, but it didn't get more messy.

                                                                      • stouset 6 hours ago

                                                                        In an earlier iteration the configuration value was Boolean true/false. A 1 was interpreted as true. They changed it to an integral value. This is the entire setup for the problem in the article.

                                                                        Elsewhere, 1 is still allowed as a true equivalent.

                                                                  • JBiserkov 11 hours ago

                                                                    NO is the country code for Norway.

                                                                  • smaudet 7 hours ago

                                                                     Not sure where the best place to mention this would be, but 0.1 seconds (1 decisecond) is not unreasonable either... yes, the fastest recorded reaction time to a random stimulus is maybe 1.5 ds (coincidentally, this is the average gamer reaction time), but non-random reaction times can be much faster (e.g. on a beat).

                                                                     So if you wanted to go that fast, you could; the invocation should have relatively stable timing (on the order of some milliseconds).

                                                                  • catlifeonmars 16 hours ago

                                                                     I enabled autocorrect (set to 3 sec) a year ago and have the following observations about it:

                                                                    1. it does not distinguish between dangerous and safe actions

                                                                    2. it pollutes my shell history with mistyped commands

                                                                     Reading this article gave me just enough of a nudge to finally disable it after a year.

                                                                    • layer8 16 hours ago

                                                                      If anything, it’s better to set up aliases for frequent typos. (Still “pollutes” the shell history of course.)

                                                                      • darkwater 15 hours ago

                                                                         About 2: well, you are the actual polluter, even if you just scroll back in history and use the same last wrong command because it works anyway.

                                                                        • bobbylarrybobby 14 hours ago

                                                                          The issue is if you accept the wrong command instead of retyping it correctly, you never get the correctly spelled command into your history — and even worse, you don't get it to be more recent than the mistyped command.

                                                                          • catlifeonmars 14 hours ago

                                                                            Well to put it into context, I use fish shell, which will only save commands that have an exit code of 0. By using git autocorrect, I have guaranteed that all git commands have an exit code of 0 :)

                                                                        • theginger 16 hours ago

                                                                           Reaction times differ by type of stimulus: auditory is slightly faster than visual, and tactile slightly faster than that, at 90-180 ms. So if git gave you a slap instead of an error message, you might just about have time to react.

                                                                          • orangepanda 16 hours ago

                                                                            The slapping device would need to build inertia for you to feel the slap. Is 10ms enough for that?

                                                                            • dullcrisp 16 hours ago

                                                                              I think if it's spring-loaded then definitely. (But it's 100ms, not 10ms.)

                                                                              • orangepanda 16 hours ago

                                                                                Assuming the best case scenario of feeling the slap in 90ms, it would leave 10ms to abort the command. Or did the 90-180ms range refer to something else?

                                                                                • dullcrisp 15 hours ago

                                                                                  Oh I see, you’re right.

                                                                              • Aerroon 9 hours ago

                                                                                This is why any reasonable engineer would go with zaps instead of slaps!

                                                                            • Reason077 17 hours ago

                                                                               Deciseconds is such an oddball choice of units. Better to specify the delay in either milliseconds or seconds - both are far more commonly used in computing.

                                                                              • ralgozino 4 hours ago

                                                                                I got really confused for a moment, thinking that "deciseconds" was some git-unit meaning "seconds needed to make a decision", like in "decision-seconds" xD

                                                                                 Note: English is not my mother tongue, but I am from the civilised part of the world that uses the metric system, FWIW.

                                                                                • cobbal 15 hours ago

                                                                                  It's a decent, if uncommon, unit for human reactions. The difference between 0 and 1 seconds is a noticeably long time to wait for something, but the difference between n and n+1 milliseconds is too fine to be useful.

                                                                                  • jonas21 14 hours ago

                                                                                    Milliseconds are a commonly-used unit. It doesn't really matter if 1 ms is too fine a granularity -- you'll just have to write "autocorrect = 500" in your config file instead of "autocorrect = 5", but who cares?

                                                                                    • bmicraft 10 hours ago

                                                                                       If you're going to store that unit in one byte (possibly even signed), suddenly deciseconds start making a lot of sense.

                                                                                      • zxvkhkxvdvbdxz 10 hours ago

                                                                                         Sure, yes. But for human consumption, a decisecond is something one can relate to.

                                                                                        I mean, you probably cannot sense the difference in duration between 20 and 30 ms without special equipment.

                                                                                        But you can possibly sense the difference between 2 and 3 deciseconds (200 ms and 300 ms) after some practice.

                                                                                         I think the issue in this case was rather the retrofitting of a boolean setting into a numerical one.

                                                                                        • LocalH 16 minutes ago

                                                                                           And then you have the rhythm gamers who can adjust their inputs by 5 or 10ms. Hell, I'm not even that good of a player, but in Fortnite Festival, which has a perfect indicator whenever you're within 50ms of the target note timestamp (and a debug display that shows a running average input offset), I can easily adjust my play to be slightly earlier or slightly later and watch my average fall or climb.

                                                                                          Several top players have multiple "perfect full combos" under their belt, where they hit every note in the song within 50ms of the target. I even have one myself on one of the easier songs in the game.

                                                                                          • adzm 7 hours ago

                                                                                            > But you can possibly sense the difference between 2 and 3 deciseconds (200 ms and 300 ms) after some practice.

                                                                                            At 120bpm a sixteenth note is 125ms, the difference is very obvious I would think

                                                                                            • fragmede 10 hours ago

                                                                                               The difference between 20ms and 30ms is the difference between 50 fps and 33 fps, which is entirely noticeable on a 1080p 60Hz screen.

                                                                                          • bobbylarrybobby 14 hours ago

                                                                                            But the consumers of the API aren't humans, they're programmers.

                                                                                        • kittikitti 18 hours ago

                                                                                          I sometimes have this realization as I'm pressing enter and reflexively press ctrl+c. As someone whose typing speeds range from 100 to 160 WPM, this makes sense. Pressing keys is much different from Formula One pit stops.

                                                                                          • otherme123 16 hours ago

                                                                                             Not about pit stops. They talk about pro drivers with highly trained reflexes, looking at a red light and knowing it will turn green in the next 3 seconds, so they must push the pedal to the metal as fast as they can. Reacting in less than 120ms is considered a jump start.

                                                                                             As for 100WPM, which is a very respectable typing speed, it translates to 500 CPM, less than 10 characters per second, and thus slightly above 100ms per keypress. But Ctrl+C is two key presses: reacting to type them both in under 100 ms is equivalent to a writing speed above 200WPM.

                                                                                            Even the fastest pro-gamers struggle to go faster than 500 actions (keypresses) per minute (and they use tweaks on repeat rates to get there), still more than 100ms for two key presses.

                                                                                            • mjpa86 4 hours ago

                                                                                              There is no green light at the start - it's the lights going out they react to. There's also no minimum time, you can get moving after 1ms - it's legal. In fact, you can move before the lights go out, there's a tolerance before you're classed as moving.

                                                                                              • Aerroon 9 hours ago

                                                                                                >But Ctrl+C are two key presses: reacting to type them both in under 100 ms is equivalent to a writting speed above 200WPM.

                                                                                                I think people don't really type/press buttons at a constant speed. Instead we do combos. You do a quick one-two punch because that's what you're used to ("you've practiced"). You do it much faster than that 100ms, but after that you get a bit of a delay before you start the next combo.

                                                                                                • otherme123 7 hours ago

                                                                                                   As mentioned, pro-gamers train combos for hours daily. The best of them can press up to 10 keys per second without thinking. For example, the fastest StarCraft II player, Reynor (Riccardo Romitti), can sustain 500 key presses per minute, with short bursts of 800. He has videos explaining how to tweak the Windows registry to achieve such a rate (it involves pressing some keys once and letting the OS autorepeat faster than you can press), because it can't be done with the standard config dialogs. And you are trying to tell me that you can do double that... not only double that, "much faster" than that.

                                                                                                   I dare anyone to make a script that, after launching, asks you to press Ctrl+C after a random wait between 1000 and 3000 ms, and records your reaction time measured after key release. You're allowed to "cheat" and have your fingers ready over the two keys. Unless you jump-start and get lucky, you won't get better than 150ms.
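
                                                                                                   A quick sketch of such a script (POSIX signals; note the terminal delivers SIGINT on key press, so "after key release" isn't really observable this way):

                                                                                                       #include <chrono>
                                                                                                       #include <csignal>
                                                                                                       #include <cstdio>
                                                                                                       #include <random>
                                                                                                       #include <thread>

                                                                                                       volatile std::sig_atomic_t got_sigint = 0;
                                                                                                       extern "C" void on_sigint(int) { got_sigint = 1; }

                                                                                                       int main() {
                                                                                                           std::signal(SIGINT, on_sigint);
                                                                                                           std::mt19937 rng{std::random_device{}()};
                                                                                                           std::uniform_int_distribution<int> wait_ms(1000, 3000);  // the random wait
                                                                                                           std::this_thread::sleep_for(std::chrono::milliseconds(wait_ms(rng)));
                                                                                                           if (got_sigint) { std::puts("jump start!"); return 1; }
                                                                                                           std::puts("NOW: press Ctrl+C");
                                                                                                           const auto start = std::chrono::steady_clock::now();
                                                                                                           while (!got_sigint) {}  // spin until the handler fires
                                                                                                           const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                                                                                                               std::chrono::steady_clock::now() - start).count();
                                                                                                           std::printf("reaction: %lld ms\n", static_cast<long long>(ms));
                                                                                                       }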

                                                                                                  • adzm 6 hours ago

                                                                                                    I actually took you up on this, and best I was able to get was about 250ms when I was really concentrating. Average was around 320!

                                                                                              • snet0 16 hours ago

                                                                                                That reflexivity felt a bit weird the first time I thought about it. I type the incorrect character, but reflexively notice and backspace it without even becoming aware of it until a moment later. I thought it'd be related to seeing an unexpected character appearing on the display, but I do it just as quickly and reflexively with my eyes closed.

                                                                                                That being said, there are obviously cases where you mistype (usually a fat-finger or something, where you don't physically recognise that you've pressed multiple keys) and don't appreciate it until you visually notice it or the application doesn't do what you expected. 100ms to react to an unexpected stimulus like that is obviously not useful.

                                                                                                • grogenaut 15 hours ago

                                                                                                   I type a lot while looking away from the monitors; it helps me think and avoid the stimulus of the text on the screen. I can tell when I fat-finger. It also pissed off the boomers at the bar who thought I was showing off, as I was a) typing faster than they could, b) not looking at the screen, and c) sometimes looking blankly past them (I'm really not looking when I do this sometimes).

                                                                                                   Also, I typed this entire thing that way without looking at it other than for red squiggles.

                                                                                                • schacon 17 hours ago

                                                                                                  I'm curious if the startup time, plus the overhead of Git trying to figure out what you might have meant is significant enough to give you enough time to realize and hit ctrl+c. In testing it quickly, it looks like typing the wrong command and having it spit out the possible matches without running it takes 0.01-0.03s, so I would venture to guess that it's still not enough time between hitting enter and then immediately hitting ctrl-c, but maybe you're very very fast?

                                                                                                  • johnisgood 2 hours ago

                                                                                                     I think most programs you execute have enough startup overhead to Ctrl-C them before they even begin, including CLI tools. I do this a lot (factoring in the time it takes to realize it was the wrong command, or not the flags I wanted, etc.).

                                                                                                    • rad_gruchalski 17 hours ago

                                                                                                      The command is already running, you ctrl+c THE command. But I agree, 100ms is short.

                                                                                                  • cardamomo 17 hours ago

                                                                                                     Reading this post, the terms "software archeology" and "programmer archeologist" come to mind. (Thank you, Vernor Vinge, for the latter concept.)

                                                                                                    • scubbo 10 hours ago

                                                                                                      Grrrr, this is such a bugbear for me. I was so excited to read "A Fire Upon The Deep" because hackers talked up the concept of "software archeology" that the book apparently introduced.

                                                                                                      The concept is briefly alluded to in the prologue, and then...nada, not relevant to the rest of the plot at all (the _effects_ of the archeology are, but "software archeologists" are not meaningful characters in the narrative). I felt bait-and-switched.

                                                                                                      • schacon 17 hours ago

                                                                                                        I can’t help but feel like you’re calling me “old”…

                                                                                                        • cardamomo 16 hours ago

                                                                                                          Not my intention! Just an esteemed git archeologist

                                                                                                        • choult 17 hours ago

                                                                                                          I like to say that the danger of software archaeology is the inevitable discovery of coprolites...

                                                                                                        • dusted 17 hours ago

                                                                                                           I think it makes sense: if I type something wrong, I often feel it before I can read it, and if I've already pushed enter, being able to ctrl+c within 100 ms is enough to save me. I'm pretty sure I've also aborted git pushes before they touched anything even before I turned this on, but this makes it more reliable.

                                                                                                          • Etheryte 17 hours ago

                                                                                                            Maybe worth noting here that 100ms is well under the human reaction time. For context, professional sprinters have been measured to have a reaction time in the ballpark of 160ms, for pretty much everyone else it's higher. And this is only for the reaction, you still need to move your hand, press the keys, etc.

                                                                                                            • dusted 6 hours ago

                                                                                                              There are different ways to measure reaction time. Circumstance is important.

                                                                                                               Reaction to unreasonable, unexpected events will be very slow, due to processing and trying to understand what is happening and how to respond. For example, say you are a racecar driver participating in a race, driving your car on a racetrack in a peaceful country:

                                                                                                              An armed attack: Slow reaction time, identifying the situation will take a long time, selecting an appropriate response will take longer.

                                                                                                              A kid running into the road on the far side of the audience stands: Faster.

                                                                                                              Kid running into the road near the audience: Faster.

                                                                                                              Car you're tailing braking with no turn to come: Faster.

                                                                                                              Crashed car behind a turn with bad overview: Faster.

                                                                                                              Guy you're slipstreaming braking before a turn: Even faster.

                                                                                                              For rhythm games, you anticipate and time the events, and so you can say these are no longer reactions, but actions.

                                                                                                               In the git context, where you typed something wrong, the lines are blurred: you're processing while you're acting, evaluating what you're typing as you type it. The first line of defence is feeling or sensing that you typed something wrong, either from the feedback of your fingers touching too many keys or from the rhythm of your typing being off; at least for me, this happens way faster than my visual input. I'm making errors as I type this, and they're corrected faster than I can really read them; sometimes I get it wrong and delete a word that was correct. But still, watching people type, I see this all the time: they're not watching and thinking about the letters exclusively, there's something else going on in their minds at the same time. 100 ms is a rather wide window in this context.

                                                                                                               Also, that said, we did a lot of experiments at work with a reaction time tester; most people got less than 70 ms after practice (an LED lights up at a random interval between 2 and 10 seconds).

                                                                                                              • shawabawa3 17 hours ago

                                                                                                                In this case the reaction starts before you hit enter, as you're typing the command

                                                                                                                So, you type `git pshu<enter>` and realise you made a typo before you've finished typing. You can't react fast enough to stop hitting enter but you can absolutely ctrl+c before 100 more ms are up

                                                                                                                • dusted an hour ago

                                                                                                                   Yes exactly! This is what I'm trying to argue as well: it happens quite often for me that I submit a typo because it's already "on its way out" when I catch it (but before, or at about the same time, it's finished and enter is pressed), so the ctrl+c is already on its way :)

                                                                                                                  • Etheryte 16 hours ago

                                                                                                                     I'm still pretty skeptical of this claim. If you type 60 wpm, which is faster than the average person but normal for people who type professionally, you spend on average 200ms per keystroke: 60 standard words per minute means 300 chars per minute [0], so 5 chars per second, which is 200ms per char. Many people type faster than this, yes, but it's still very much pushing it just to meet the 100ms limit, and that's without any reaction time or anything else on top.

                                                                                                                    [0] https://en.wikipedia.org/wiki/Words_per_minute

                                                                                                                    • pc86 14 hours ago

                                                                                                                       Even if you typed 120 wpm, which is "competitive typing" speed according to this thing[0], it's going to take you 200ms to type ctrl+c, and even if you hit both keys more-or-less simultaneously you're going to be above the 100ms threshold. So to realistically beat the threshold during normal work, rather than in a speed-centered environment, you're probably looking at regularly typing 160 wpm or more?

                                                                                                                      I'm not a competitive speed typist or anything but I struggle to get above 110 on a standard keyboard and I don't think I've ever seen anyone above the 125-130 range.

                                                                                                                      [0] https://www.typingpal.com/en/documentation/school-edition/pe...

                                                                                                                      • grayhatter 13 hours ago

                                                                                                                         For whatever it's worth: I'm not skeptical of it at all. I've done this in a terminal before without even looking at the screen, so I know it can't have anything to do with visual reaction.

                                                                                                                        Similar to the other reply, I also commonly do that when typing, where I know I've fat fingered a word, exclusively from the feeling of the keyboard.

                                                                                                                         But also, you're not just trying to beat the fork/exec. You can also successfully beat any number of other things: the pre-commit hook, the DNS lookup, the TLS handshake. Adding 100ms of latency on top of those could easily be the difference between preempting some action, interrupting it, or noticing only after it completed.

                                                                                                                      • yreg 16 hours ago

                                                                                                                        Let's say you are right. What would be a reason for pressing ctrl+c instead of letting the command go through in your example?

                                                                                                                        The delay is intended to let you abort execution of an autocorrected command, but without reading the output you have no idea how the typos were corrected.

                                                                                                                        • brazzy 16 hours ago

                                                                                                                          > you can absolutely ctrl+c before 100 more ms are up

                                                                                                                          Not gonna believe that without empirical evidence.

                                                                                                                          • dusted an hour ago

                                                                                                                             That'd be interesting, but I don't know how to prove that I'm not just "pretending" to make typos and correcting them instantly?

                                                                                                                            • burnished 16 hours ago

                                                                                                                               I think they are talking about times where you realize a mistake as you are making it, as opposed to in hindsight; given that, 100ms seems pretty reasonable.

                                                                                                                              • dusted an hour ago

                                                                                                                                 This is exactly what I'm trying to say. The actions are already underway in the muscles (or _just_ completed) when the brain catches that something's off, and so the ctrl+c is queued.

                                                                                                                                • brazzy 7 hours ago

                                                                                                                                  "seems pretty reasonable" is not evidence.

                                                                                                                                • bmacho 15 hours ago

                                                                                                                                   I am not sure you have read it properly. The scenario is that you are pushing enter, change your mind halfway, and switch to ctrl+c. So it is not a reaction-time scenario, but an enter-to-ctrl+c scenario.

                                                                                                                                  Regarding reaction time, below 120ms (on a computer, in a browser(!)) is consistently achievable, e.g. this random yt video https://youtu.be/EH0Kh7WQM7w?t=45 .

                                                                                                                                  For some reason, I can't find more official reaction time measurements (by scientists, on world champion athletes, e-athletes), which is surprising.

                                                                                                                                  • brazzy 6 hours ago

                                                                                                                                    That scenario seems fishy to me to begin with: is that something that actually happens, or just something people imagine? How would it work that you "change your mind halfway through" and somehow cannot stop your finger from pressing enter, but can move it over and hit ctrl-c in a ridiculously short time window?

                                                                                                                                    > So it is not a reaction time, but an enter to ctrl+c scenario.

                                                                                                                                    At minimum, if we ignore the whole "changing your mind" thing. And for comparison: the world record for typing speed (over 15 seconds and without using any modifier keys) is around 300 wpm, which translates to one keypress every 40ms. Do you really think 100ms to press two keys is something "you can absolutely" do? I'd believe that some people could sometimes do it, but certainly not just anyone.

                                                                                                                            • SOLAR_FIELDS 17 hours ago

                                                                                                                              100 ms is an insanely short window. I would say usually even 1000ms would be too short for me to recognize and kill the command, even if I realized immediately that I had done something wrong.

                                                                                                                              • jsjshsbd 17 hours ago

                                                                                                                                It's much too short to read an output, interpret it and realize you have to interrupt

                                                                                                                                But often you type something, realize it's wrong while you are typing but not fast enough to stop your hand from pressing [Enter]

                                                                                                                                That is one of the only situations where 100ms would be enough to save you.

                                                                                                                                That being said, the reasoning in the article for the 100ms is just confused. Why would anyone:

                                                                                                                                1) encode a Boolean value as 0/1 in a human-readable configuration

                                                                                                                                2) encode a duration as a numeric value without a unit in a human-readable configuration

                                                                                                                                Both are just lazy

                                                                                                                                • Reason077 16 hours ago

                                                                                                                                  > "Why would anyone ... encode a Boolean value as 0/1 in a human readable configuration"

                                                                                                                                  It may be lazy, but it's very common!

                                                                                                                                  • grayhatter 13 hours ago

                                                                                                                                    laziness is a virtue of a good programmer.

                                                                                                                                    why demand many char when few char do trick?

                                                                                                                                    also

                                                                                                                                    > Why would anyone [...] encode a duration as a numeric value without unit in a human readable configuration

                                                                                                                                    If I'm only implementing support for a single unit, why would you expect or want to provide a unit? What's the behavior when you provide a unit instead of a number?

                                                                                                                                    > but not doing that extra work is lazy

                                                                                                                                    No, because while I'm not implementing unit parsing for a feature I wouldn't use, I'm spending that time implementing a better, faster diff algorithm. Or implementing a new protocol with better security, or sleeping. It's not lazy to do something important instead of something irrelevant. And given we're talking about git, which is already very impressive software, provided for free by volunteers, I'm going to default to assuming they're not just lazy.

                                                                                                                                • frde_me 5 hours ago

                                                                                                                                  But the point here is not that you need to realize you typed something wrong and then cancel (in that case just don't enable the setting if you always want to abort). The point is that you need to decide if the autocorrect suggestion was the right one. Which you can't know until it tells you what it wants to autocorrect to.

                                                                                                                                  • dankwizard 14 hours ago

                                                                                                                                    Neo, get off HN and go destroy the agents!

                                                                                                                                  • kqr 6 hours ago

                                                                                                                                    This timeout makes me think about the type of scenario where I know I have mistyped the command, e.g. because I accidentally hit return prematurely, or hit return when I was trying to backspace away a typo. In those situations I reflexively follow return with an immediate ctrl-C, and might be able to get in before the 100 ms timeout. So it’s not entirely useless!

                                                                                                                                    • politelemon 17 hours ago

                                                                                                                                      I agree that 'prompt' should be the value to set if you want git autocorrect to work for you. However, I'd want Y to be the default rather than N, so that a user can just press enter once they've confirmed it.

                                                                                                                                      In any case it is not a good idea to have a CLI command happen without your approval, even if the intention was really obvious.
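
                                                                                                                                      For anyone who wants to try this behaviour: going by the discussion here, `prompt` is one of the accepted values alongside `immediate`, `never`, and a bare number (check `git help config` for the authoritative list on your version):

                                                                                                                                          git config --global help.autocorrect prompt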

                                                                                                                                      • misnome 17 hours ago

                                                                                                                                        Yes, absolutely this. If I don’t want it to run, I will hit ctrl-c.

                                                                                                                                        • junon 16 hours ago

                                                                                                                                          If prompt is the default, mistyped scripts will hang rather than exit 1 if they have stdin open. I think that causes more problems than it solves.

                                                                                                                                          • jzwinck 16 hours ago

                                                                                                                                            That's what isatty() is for. If stdin is not a TTY, prompting should not be the default. Many programs change their defaults or their entire behavior based on isatty().
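
                                                                                                                                            A minimal sketch of that pattern (Python here purely for illustration; git itself would do the equivalent check in C):

                                                                                                                                                import sys

                                                                                                                                                # pick a default based on whether stdin is attached to a terminal:
                                                                                                                                                # interactive users can be prompted, scripts and pipes must never hang
                                                                                                                                                mode = "prompt" if sys.stdin.isatty() else "never"
                                                                                                                                                print(mode)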

                                                                                                                                            • junon 14 minutes ago

                                                                                                                                              isatty() is spoofed in e.g. Make via PTYs. It's a light check at best and lies to you at worst.

                                                                                                                                        • mmcnl 16 hours ago

                                                                                                                                          The most baffling thing is that someone implemented deciseconds as a unit of time. Truly bizarre.

                                                                                                                                          • jakubmazanec 3 hours ago

                                                                                                                                            > introduced a small patch

                                                                                                                                            > introduced a patch

                                                                                                                                            > the Git maintainer, suggested

                                                                                                                                            > relatively simple and largely backwards compatible fix

                                                                                                                                            > version two of my patch is currently in flight to additionally

                                                                                                                                            And this is how interfaces become unusable: through a thousand small "patches" created without any planning or oversight.

                                                                                                                                            • olddustytrail 33 minutes ago

                                                                                                                                              Ah, if only the Git project had someone of your talents in charge (rather than the current band of wastrel miscreants).

                                                                                                                                              Then it might enjoy some modicum of success, instead of languishing in its well-deserved obscurity!

                                                                                                                                            • 1970-01-01 18 hours ago

                                                                                                                                              Deciseconds?? There's your problem. Always work in seconds when designing a forcing function for your users.

                                                                                                                                              • GuB-42 16 hours ago

                                                                                                                                                Deciseconds (100ms) are not a bad unit when dealing with UI, because it is about the fastest reaction time. We can't really feel the difference between 50ms and 150ms (both feel instant), but we can definitely feel the difference between 500ms and 1500ms. Centiseconds are too precise, seconds are not precise enough. It is also possible that the computer is not precise enough for centiseconds or less, making the extra precision a lie.

                                                                                                                                                Deciseconds are just uncommon. But the problem here is that the user didn't expect the "1" to be a duration at all, but rather a boolean value. He never wanted a timer in the first place.

                                                                                                                                                By the way, not making the unit of time clear is a pet peeve of mine. The unit is never obvious: seconds and milliseconds are the most common, but you don't know which one unless you read the docs, and it can be something else entirely.

                                                                                                                                                My preferred way is to specify the unit in the value itself (ex: "timeout=1s") with a specific type for durations; my second choice is to have it in the name (ex: "timeoutMs=1000"); documentation comes third (that's the case with git). If it's not documented in any way, you usually have to resort to trial and error, or look deep into the code, as these values tend to be passed around quite a bit before reaching a function that finally makes the unit of time explicit.
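
                                                                                                                                                As a sketch of that first option (illustrative names and unit table, not git's actual parser), requiring an explicit unit turns the ambiguous bare number into a hard error:

                                                                                                                                                    import re

                                                                                                                                                    UNITS = {"ms": 0.001, "s": 1, "m": 60, "h": 3600}

                                                                                                                                                    def parse_duration(text):
                                                                                                                                                        # accept "100ms", "1s", "5m", ...; reject a bare "5" as ambiguous
                                                                                                                                                        m = re.fullmatch(r"(\d+(?:\.\d+)?)(ms|s|m|h)", text.strip())
                                                                                                                                                        if not m:
                                                                                                                                                            raise ValueError("duration needs a unit, e.g. '1s' or '100ms': %r" % text)
                                                                                                                                                        return float(m.group(1)) * UNITS[m.group(2)]

                                                                                                                                                    parse_duration("1s")     # 1.0 (seconds)
                                                                                                                                                    parse_duration("100ms")  # 0.1
                                                                                                                                                    parse_duration("1")      # ValueError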

                                                                                                                                                • synecdoche 16 hours ago

                                                                                                                                                  This may be something specific to Japan, which is where the maintainer is from. In the Japanese industrial control systems that I’ve encountered time is typically measured in this unit (100 ms).

                                                                                                                                                  • gruez 17 hours ago

                                                                                                                                                    better yet, encode the units into the variable/config name so people don't have to guess. You wouldn't believe how often I have to guess whether "10" means 10 seconds (sleep(3) on Linux) or milliseconds (Sleep in Win32).

                                                                                                                                                    • userbinator 12 hours ago

                                                                                                                                                      My default for times is milliseconds, since that's a common granularity of system timing functions.

                                                                                                                                                      • dusted 17 hours ago

                                                                                                                                                        at least it's fractions of a second; 250 would already be much more noticeable... 100 is a nice compromise between "can't react" and "have to wait", assuming you're already realizing you probably messed up

                                                                                                                                                        • 331c8c71 18 hours ago

                                                                                                                                                          Seconds or milliseconds (e.g. if the setting must be integer) would've been fine as they are widely used. Deciseconds, centiseconds - wtf?

                                                                                                                                                          • atonse 17 hours ago

                                                                                                                                                            Falls squarely within the "They were too busy figuring out whether they could do it to ask whether they SHOULD do it"

                                                                                                                                                          • UndefinedRef 18 hours ago

                                                                                                                                                            Maybe he meant dekaseconds? Still weird though..

                                                                                                                                                            • TonyTrapp 18 hours ago

                                                                                                                                                              It reads like the intention was to turn the boolean (0/1) parameter into an integer parameter, where the previous "enabled" value of 1 should behave reasonably close to the old behaviour. 1 decisecond is arguably close enough to instant. If the parameter were measured in seconds, the command would always have to wait a whole second before executing, with no room for smaller delays.
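
                                                                                                                                                              Concretely, under that decisecond interpretation (as described in the thread; again, `git help config` has the authoritative wording):

                                                                                                                                                                  # value is in deciseconds: 10 = wait a full second before running the correction
                                                                                                                                                                  git config --global help.autocorrect 10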

                                                                                                                                                              • bot403 16 hours ago

                                                                                                                                                                No, smaller delays <1s are also a misdesign here. Have we all forgotten we're reacting to typos? It's an error condition. It's ok that the user feels it and is inconvenienced. They did something wrong.

                                                                                                                                                                Do some think that 900ms, or 800, or some other sub-second value is really what we need for this error condition? Instead of, you know, not creating errors?

                                                                                                                                                              • schacon 18 hours ago

                                                                                                                                                                We had this debate internally at GitButler. Deci versus deca (and now deka, which appears to also be a legit spelling). My assumption was that 1 full second may have felt too long, but who really knows.

                                                                                                                                                                • a3w 17 hours ago

                                                                                                                                                                  deci is 1/10, deca is 10/1. So decisecond is correct.

                                                                                                                                                                  • schacon 17 hours ago

                                                                                                                                                                    I understand. I meant that I tried to say the word "decisecond" out loud, and we debated whether that was a real word or whether I was attempting to say "deca", which would have been understandable.

                                                                                                                                                                    • zxvkhkxvdvbdxz 10 hours ago

                                                                                                                                                                      It's very standardized (SI), meaning 1/10th. Although it's not so commonly used with seconds.

                                                                                                                                                                      You might be more familiar with decimeters, deciliters, decibels or the base-10 (decimal) numbering system.

                                                                                                                                                                      • CRConrad 7 hours ago

                                                                                                                                                                        Sure, deca- as in "decade" is understandable. But why would deci- as in "decimal" be any less understandable?

                                                                                                                                                              • ocean_moist 3 hours ago

                                                                                                                                                                Fun fact: Professional gamers (esport players) have reaction times around 150ms to 170ms. 100ms is more or less impossible.

                                                                                                                                                                • NoPicklez 12 hours ago

                                                                                                                                                                  Cool, but I don't know why it needs to be justified that it's too fast even for an F1 driver. Why can't we just say it's too fast without all the fluff about being a race car driver? The guy isn't even an F1 driver, but a Le Mans one.

                                                                                                                                                                  • blitzar 3 hours ago

                                                                                                                                                                    My deodorant is good enough for an F1 driver, why wouldn't my git client adhere to the same standards?

                                                                                                                                                                    • benatkin 11 hours ago

                                                                                                                                                                      The author is someone who went to conferences that DHH also attended, so for some of the audience it's a funny anecdote.

                                                                                                                                                                    • IshKebab 15 hours ago

                                                                                                                                                                      > Junio came back to request that instead of special casing the "1" string, we should properly interpret any boolean string value (so "yes", "no", "true", "off", etc)

                                                                                                                                                                      The fact that this guy has been the Git maintainer for so long and designs settings like this explains a lot!

                                                                                                                                                                      • pmontra 17 hours ago

                                                                                                                                                                        According to the Formula 1 web site, drivers start on average 0.2 seconds after the red lights go out: https://www.formula1.com/en/latest/article/rapid-decisions-d...

                                                                                                                                                                        Anyway, 0.1 seconds would be far too short even for them, who have a job based on fast reaction times.

                                                                                                                                                                        • moogly 16 hours ago

                                                                                                                                                                          So Mercurial had something like this back in ancient times, but git devs decided to make a worse implementation.

                                                                                                                                                                          • mscdex 17 hours ago

                                                                                                                                                                            This seems a bit strange to me, considering the default behavior is to only show a suggested command if possible and do nothing else. That means they explicitly opted into the autocorrect feature, didn't bother to read the manual first, and just guessed at how it's supposed to be used.

                                                                                                                                                                            Even the original documentation for the feature back when it was introduced in 2008 (v1.6.1-rc1) is pretty clear what the supported values are and how they are interpreted.

                                                                                                                                                                            • inoffensivename 8 hours ago

                                                                                                                                                                              Maybe a not-so-hot take on this... The only option this configuration parameter should take is "never", which should also be the default. Any other value should be interpreted as "never".

                                                                                                                                                                              • Theodores 17 hours ago

                                                                                                                                                                                0.1 seconds is a long time in drag racing where the timing tree is very different to F1. With F1 there are the five red lights that have to go out, and the time this takes is random.

                                                                                                                                                                                With your git commands it is fairly predictable what happens next, it is not as if the computer is randomly taunting you with five lights.

                                                                                                                                                                                I suggest a further patch where you can put git into either 'F1 mode' or, for our American cousins, 'Drag Strip mode'. This puts it into a confirmation mode for everything, where the whole timing sequence is shown in simplified ASCII art.

                                                                                                                                                                                As a European, I would choose 'F1 mode' to have the five lights come on in sequence, wait a random delay and then go out, for 'git push' to happen.

                                                                                                                                                                                I see no reason not to also have other settings, such as 'Ski Sunday mode', where it does the 'beep beep beep BEEEP' of the skiing competition. 'NASA mode' could be cool too.

                                                                                                                                                                                Does anyone have any other timing sequences that they would like to see in the next 'patch'?

                                                                                                                                                                                • mike-the-mikado 18 hours ago

                                                                                                                                                                                  I'd be interested to know if any F1 drivers actually use git.

                                                                                                                                                                                  • schacon 18 hours ago

                                                                                                                                                                                    Not sure, but I do personally know two high profile Ruby developers who regularly race in the LMP2 (Le Mans Prototype 2) class - DHH and my fellow GitHub cofounder PJ Hyett, who is now a professional driver, owning and racing for AO (https://aoracing.com/).

                                                                                                                                                                                    I mostly say this because I find it somewhat fun that they raced _each other_ at Le Mans last year, but also because I've personally seen both of them type Git commands, so I know it's true.

                                                                                                                                                                                    • xeonmc 16 hours ago

                                                                                                                                                                                      Maybe we can pitch to Max Verstappen to use Git to store his sim racing setup configs.

                                                                                                                                                                                      • pacaro 17 hours ago

                                                                                                                                                                                        I've also worked with engineers who have raced LMP. It's largely pay-to-play and this is one of those professions where if you're the right person, in the right place, at the right time, you might be able to afford it.

                                                                                                                                                                                        • diggan 17 hours ago

                                                                                                                                                                                          Isn't Le Mans more of an "endurance" race though, especially compared to F1? It would be interesting to see the difference in reaction ability between racers from the two; I could see it being different.

                                                                                                                                                                                          • schacon 17 hours ago

                                                                                                                                                                                            I feel like in the "racing / git crossover" world, that's pretty close. :)

                                                                                                                                                                                      • meitham 16 hours ago

                                                                                                                                                                                        Really enjoyable read

                                                                                                                                                                                        • moffkalast 14 hours ago

                                                                                                                                                                                          > As some of you may have guessed, it's based on a fairly simple, modified Levenshtein distance algorithm

                                                                                                                                                                                          One day it'll dump the recent bash and git history into an LLM that will say something along the lines of "alright dumbass here's what you actually need to run"
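
                                                                                                                                                                                          For reference, the unmodified form of that algorithm is a short dynamic program (a Python sketch; git's "modified" C version also weights the different edit operations, which is omitted here):

                                                                                                                                                                                              def levenshtein(a, b):
                                                                                                                                                                                                  # classic edit distance: minimum number of single-character
                                                                                                                                                                                                  # insertions, deletions and substitutions to turn a into b
                                                                                                                                                                                                  prev = list(range(len(b) + 1))
                                                                                                                                                                                                  for i, ca in enumerate(a, 1):
                                                                                                                                                                                                      cur = [i]
                                                                                                                                                                                                      for j, cb in enumerate(b, 1):
                                                                                                                                                                                                          cur.append(min(prev[j] + 1,                # deletion
                                                                                                                                                                                                                         cur[j - 1] + 1,             # insertion
                                                                                                                                                                                                                         prev[j - 1] + (ca != cb)))  # substitution
                                                                                                                                                                                                      prev = cur
                                                                                                                                                                                                  return prev[-1]

                                                                                                                                                                                              print(levenshtein("stauts", "status"))  # 2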

                                                                                                                                                                                          • baggy_trough 14 hours ago

                                                                                                                                                                                            Whenever you provide a time configuration option, field, or parameter, always encode the units into the name.

                                                                                                                                                                                            • snvzz 14 hours ago

                                                                                                                                                                                              At 60fps that's 6 frames, which is plenty.

                                                                                                                                                                                              That aside, I feel the reason is to advertise the feature so that the user gets a chance to set the timer up to his preference or disable autocorrect entirely.

                                                                                                                                                                                              • ninjamuffin99 14 hours ago

                                                                                                                                                                                                6 frames is not enough to realize you made a typo / read whatever git is outputting telling you that you made a typo, and then respond to that input correctly.

                                                                                                                                                                                                in video games it may seem like a lot of time for a reaction, but a lot of that "reaction time" is based on prior context from the game, visuals, muscle memory and whatnot. If you're playing Street Fighter and, say, trying to parry an attack that has a 6-frame startup, you're already anticipating an attack to "react" to before their attack even starts. When typing git commands, you will never be on that type of alert, anticipating your own typos.

                                                                                                                                                                                                • snvzz 12 hours ago

                                                                                                                                                                                                  >6 frames is not enough

                                                                                                                                                                                                  git good.

                                                                                                                                                                                                  (the parent post was a set up for this)

                                                                                                                                                                                              • Pxtl 17 hours ago

                                                                                                                                                                                                Pet peeve: Timespan configs that don't include the unit in the variable name.

                                                                                                                                                                                                I'm so sick of commands with --timeout params where I'm left guessing if it's seconds or millis or what.

                                                                                                                                                                                                • hinkley 16 hours ago

                                                                                                                                                                                                  Be it seconds or milliseconds, eventually your program evolves to need tenths (or less) of that unit. At that point you can either support decimal points, create a new field and deprecate the old one, or make a breaking change, which gives a migraine to the poor SOB who has to validate the breaking upgrade in production and toggle back and forth more than a couple of times. Code isn't always arranged so that a config change and a build/runtime change can be tucked into a single commit that can be applied or rolled back atomically.

                                                                                                                                                                                                  All because someone thought surely nobody would ever want something to happen on a quarter of a second delay/interval, or a 250 microsecond one.

                                                                                                                                                                                                  • skykooler 15 hours ago

                                                                                                                                                                                                    I spent a while debugging a library with a chunk_time_ms parameter where it turned out "ms" stood for "microseconds".

                                                                                                                                                                                                    • grayhatter 13 hours ago

                                                                                                                                                                                                      I have a very hard time relating to everyone else complaining about ~~lack of units~~ being unable to read/remember API docs. But using `chunk_time_ms` where ms is MICROseconds?! That's unforgivable, and I hope for all our sakes, you don't have to use that lib anymore! :D

                                                                                                                                                                                                      • Pxtl 10 hours ago

                                                                                                                                                                                                        The sheer number of APIs in modern coding is exhausting. I can't imagine either trying to keep all the stuff I'm using in my head or having to go back to the docs every time, instead of being able to just read the code.

                                                                                                                                                                                                    • echoangle 16 hours ago

                                                                                                                                                                                                      Alternatively, you can also accept the value with a unit and return an error when a plain number is entered (so --timeout 5s or --timeout 5h is valid but --timeout 5 returns an error).

                                                                                                                                                                                                      • cratermoon 16 hours ago

                                                                                                                                                                                                        I'll bounce in with another iteration of my argument for avoiding language-primitive types and always using domain-appropriate value types. A Duration is not a number type, neither float nor integer. It may be implemented using whatever primitive the language provides, but for timeouts and sleeps, what is 1 Duration? The software always encodes some definition of 1 unit in the time domain; make it clear to the user or programmer.
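
                                                                                                                                                                                                        A minimal sketch of that idea (illustrative names, not any particular library's API):

                                                                                                                                                                                                            from dataclasses import dataclass

                                                                                                                                                                                                            @dataclass(frozen=True)
                                                                                                                                                                                                            class Duration:
                                                                                                                                                                                                                # one canonical internal representation; callers never touch a bare number
                                                                                                                                                                                                                _millis: int

                                                                                                                                                                                                                @classmethod
                                                                                                                                                                                                                def from_seconds(cls, s):
                                                                                                                                                                                                                    return cls(round(s * 1000))

                                                                                                                                                                                                                @classmethod
                                                                                                                                                                                                                def from_millis(cls, ms):
                                                                                                                                                                                                                    return cls(ms)

                                                                                                                                                                                                                def to_millis(self):
                                                                                                                                                                                                                    return self._millis

                                                                                                                                                                                                            # the unit is explicit at every call site, so a plain "100" can't sneak in
                                                                                                                                                                                                            timeout = Duration.from_millis(100)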

                                                                                                                                                                                                      • tester756 17 hours ago

                                                                                                                                                                                                        Yet another example where git shows its lack of user-friendly design

                                                                                                                                                                                                        • hinkley 16 hours ago

                                                                                                                                                                                                          Well it is named after its author after all.

                                                                                                                                                                                                          • yreg an hour ago

                                                                                                                                                                                                            At first I thought this was unnecessary name-calling, but apparently Linus has made the same joke:

                                                                                                                                                                                                            > "I'm an egotistical bastard, and I name all my projects after myself. First Linux, now git."