• cyanmagenta 5 hours ago

    I am going to be harsh here, but I think it’s necessary: I don’t think anyone should use Emit-C in production. Without proper tooling, including a time-traveling debugger and a temporal paradox linter, the use case just fails compared to more established languages like Rust.

    • doormatt 4 hours ago

      No one is claiming this is necessary. It's a toy language built for fun.

      • mygrant 4 hours ago

        Woosh

        • porcoda 4 hours ago

          Given how often people around here seriously say things like the top-level comment being responded to, an explicit /s is almost necessary, since it can be hard to distinguish from the usual cynical, dismissive comments.

          • fragmede 2 hours ago

            I downvote explicit /s's on principle. If you have to add it, you're not doing it right (imo).

        • hathawsh 3 hours ago

          Please ELI5... I know there's a joke in there, but I'm missing it.

          • skavi 3 hours ago

            The humor lies in the inherent absurdity of the critique itself. Obviously no one will use this in production. There’s nothing especially clever you’re missing.

            • ramon156 2 hours ago

              I was confused by the Rust part, which is what made me realize it was part of a joke.

              • 9dev 2 hours ago

                The temporal paradox linter could have given it away too :)

      • pcblues 2 hours ago

        I think a syntax highlighter and in-line documentation for future language features, before they are created, are also necessary. I'll stick with more established languages, too. Time in a single direction is already hard.

      • unquietwiki 5 hours ago

        Submitted to r/altprog; I love a language that can murder variables, heh.

        • roarcher 4 hours ago

          This one can murder its own grandfather.

        • deadbabe 4 hours ago

          For the real computer scientists out here: what would time complexity notation look like if time travel of information were possible (i.e., doing big, expensive computations and sending the result back into the past)?

          • openasocket 3 hours ago

            Surprisingly, there is prior work on this (https://www.scottaaronson.com/papers/ctchalt.pdf). Apparently a Turing machine with time travel can solve the halting problem.

            • fragmede 2 hours ago

              With time travel, isn't the halting problem trivially solvable? You start the program, and then just jump to after the end of forever and see if the program terminated.

              • JadeNB 2 hours ago

                > With time travel, isn't the halting problem trivially solvable? You start the program, and then just jump to after the end of forever and see if the program terminated.

                Some programs won't halt even after forever, in the sense of an infinite number of time steps. For example, if you want to test properties of (possibly infinite) sets of natural numbers, there's no search strategy that will go through all of them even in infinite time.

                (Footnote: I'm assuming, reasonably I think, though who knows what CSists have been up to, a model of computation that allows performing countably, but not uncountably, many steps.)
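
                A minimal sketch of the counting argument, assuming the countable-steps model from the footnote (the sequence S_1, S_2, ... and the set D are just illustrative names): suppose the search strategy examines subsets of the naturals in some order S_1, S_2, S_3, ..., one per step, so countably many in total. Cantor's diagonal set

                  D = \{\, n \in \mathbb{N} : n \notin S_n \,\}

                differs from every S_n on the element n, so it is never examined. There are uncountably many subsets but only countably many steps, so some inputs go untouched even "after forever".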

                • eddd-ddde an hour ago

                  But if you are at the present and don't receive a future result immediately, can't you assume it never halts? Otherwise you would have received a result.

                  • mgsouth 31 minutes ago

                    I don't think so. That's assuming the program will always be in a frame of reference that is temporally unbounded. If, for example, it fell into a black hole, it would, IIUC, never progress (even locally) beyond the moment of intercepting the event horizon.

                • HeliumHydride 2 hours ago

                  I think you can only time travel a finite time.

              • pcblues 2 hours ago

                If you shift the computational result back in time to the same moment you started it, your O notation is just a zero, and quite scalable. Actually, it would open the programming market up to more beginners, because they could brute-force any computation without caring about the time dimension. Algorithm courses will go broke, and the books will all end up in the remainder bin. Of course, obscure groups of purists will insist on caring about the space AND time dimensions of complexity, but no one will listen to them anymore.

                • JadeNB 2 hours ago

                  I whimsically imagine some version of bi-directional Hoare logic.