Every time I read a story involving terminals, I am amazed that they function in the first place. The sheer amount of absurd workarounds for backwards compatibility’s sake is mind-boggling.
Actually, what's really amazing is that the wheel has been reinvented so many times for everything except terminals.
Seriously, make something new that works with either ssh or WireGuard and cement your name in fame.
There are a lot of things that could be reworked in much better ways if you drop backwards compatibility and think a bit about usability. The real problem is the time it takes versus how "just fine" things work once you finally understand how to use the old tools.
To make things worse, tools only get much better once you start thinking hard about re-implementing them. It seems to take a lot of hatred to start such a project, and starting is the easy part, as it only needs the first 80% of the effort.
The reason terminal-land is like this, is precisely because the wheel has been reinvented a few hundred times. Basically XKCD 927, perpetually.
I'm amused by how quickly he shifts from "all existing image viewers have silly bugs, no-one knows what they're doing, I want to understand everything" to "here's some code I copy-pasted that seems to work". What a deeply quixotic approach to, uh, everything in this blog post.
After having read the entire thing, I don't find the same amusement. In fact, you concatenated disparate quotes and put them out of order.
1. Image viewer [libraries] have silly bugs
    a. no one knows what they're doing?
2. Copy and paste code to get images on the screen
    a. not very accurately
    b. very slowly
3. I want to understand everything
    a. here's sRGB, here's Y'UV
    b. here's tone mapping
    c. etc.
The prose and overarching methodology remind me of frinklang's data file[0] (https://frinklang.org/frinkdata/units.txt), which is hilariously and educationally opinionated about units, measurements, and conversions. The writing style comports with how I approach problems and may not jibe with other methodologies. If I have some problem, I'll generally try copying and pasting (or copilot.exe or whatever) just to see if the problem space has any state of the art. If that does what I want, great, I'm done. If I need to do what I want 10,000 times, I'll usually dig in and speed it up, or add features (event handling, etc.).

I also like scathing commentary about annoyances in technology, and this article had that, for sure. The WebP aside was biting!
[0] Huh, this file has been updated massively since I last viewed it (mid 2010s). It looks like Earth has made some advancements in definitions of primitives, like the ampere, and things might be more exact now. Hooray!
I guess we're reading two very different posts then.
If you want an idea of how tough it is, try writing a universal TIFF[0] reader (not a writer; writers are simple).
Fun stuff. BTDT. Got the T-shirt.
[0] https://www.itu.int/itudoc/itu-t/com16/tiff-fx/docs/tiff6.pd...
PNG and JPEG are simple enough that a single person can write a useful decoder for them from scratch in a weekend or two.
The newer formats achieve better compression by adding more and more stuff. They're year-long projects to reimplement, which makes almost everyone stick to their reference implementations.
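For a taste of why PNG in particular is approachable: before you even touch the compressed pixel data, the container is just a signature followed by length-tagged, CRC-checked chunks. A minimal sketch using only the standard library (the hand-built 1x1 grayscale PNG below is illustrative):

```python
import struct
import zlib

def png_chunks(data: bytes):
    """Iterate over (type, payload) chunks in a PNG byte stream."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG")
    pos = 8
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        payload = data[pos + 8:pos + 8 + length]
        (crc,) = struct.unpack(">I", data[pos + 8 + length:pos + 12 + length])
        if crc != zlib.crc32(ctype + payload):
            raise ValueError(f"bad CRC in {ctype!r} chunk")
        yield ctype.decode("ascii"), payload
        pos += 12 + length

def chunk(ctype: bytes, payload: bytes) -> bytes:
    """Assemble one chunk: big-endian length, type, payload, CRC of type+payload."""
    return (struct.pack(">I", len(payload)) + ctype + payload
            + struct.pack(">I", zlib.crc32(ctype + payload)))

# Build a minimal 1x1 8-bit grayscale PNG in memory to demonstrate.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)  # width, height, depth, gray
idat = zlib.compress(b"\x00\xff")  # one scanline: filter byte 0, one white pixel
png = (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
       + chunk(b"IDAT", idat) + chunk(b"IEND", b""))

print([t for t, _ in png_chunks(png)])  # ['IHDR', 'IDAT', 'IEND']
```

The weekend of work is in what comes after this layer: inflating IDAT, undoing the per-scanline filters, and handling bit depths and interlacing. But the framing really is this small.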
The newer ones are doomed to failure by complexity, then?
Nah, the space savings can significantly cut down on bandwidth costs at scale. They'll get (and have been?) pushed by Google and friends for that reason.
> [about the (R)IFF format] Having a generic container for everything doesn’t make sense. It’s a cool concept on paper, but when you have to write file loaders, it turns out that you have very specific needs, and none of those needs play well with “anything can happen lol”.
Well, it's not like PNG and SVG are any different.
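To be fair, the RIFF layer itself is tiny; the "anything can happen lol" pain the post describes lives in what the chunks contain, plus quirks like the word-alignment padding byte. A sketch of a top-level RIFF chunk walker, demonstrated against a hand-assembled minimal WAVE file (illustrative, not a real encoder):

```python
import struct

def riff_chunks(data: bytes):
    """Iterate (fourcc, payload) over the top-level chunks inside a RIFF file."""
    if data[:4] != b"RIFF":
        raise ValueError("not a RIFF file")
    (size,) = struct.unpack("<I", data[4:8])  # size of form type + all chunks
    pos, end = 12, 8 + size                   # skip the 4-byte form type (e.g. WAVE)
    while pos < end:
        fourcc, length = struct.unpack("<4sI", data[pos:pos + 8])
        yield fourcc.decode("ascii"), data[pos + 8:pos + 8 + length]
        pos += 8 + length + (length & 1)      # odd-length chunks get a pad byte

# Minimal WAVE: a PCM fmt chunk (mono, 8 kHz, 8-bit) and an empty data chunk.
fmt = struct.pack("<HHIIHH", 1, 1, 8000, 8000, 1, 8)
body = (b"WAVE"
        + b"fmt " + struct.pack("<I", len(fmt)) + fmt
        + b"data" + struct.pack("<I", 0))
wav = b"RIFF" + struct.pack("<I", len(body)) + body

print([c for c, _ in riff_chunks(wav)])  # ['fmt ', 'data']
```

A generic walker like this tells you nothing about what a `fmt ` or `data` chunk means, which is exactly the post's complaint: the container buys you very little, and every loader still needs format-specific logic for each chunk type it cares about.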
> It may be worth noting that Chafa uses more of the block symbols, but the printouts it makes look ugly to me, like a JPEG image compressed with very low quality.
It may be worth noting that Chafa can be configured to only use character 219 (the full block in CP437).
> Finally, remember that the only legal text you are bound by (if at all) is the actual text of the GPL license. It does not matter what Stallman says he intended with the GPL, or what the GPL FAQ says should happen in some fantasy land.
Tell that to the lawyers when they send you a cease and desist.
The reason non-GPL-compliant software doesn't touch GPL code is not that there might be a loophole; it's that there is no precedent set in court, and nobody wants to be the one to set it. That requires lawyers with expertise in both copyright law and contract law. It doesn't matter what is copyrightable if you agreed to a contract saying you wouldn't do that, and that is what the GPL is: a contract you agree to that governs how you are allowed to use the code in question.
In the end, whether the GPL is enforceable in these edge cases is up to the courts, not your interpretation of it. And if your project becomes a roaring success, do you really want to spend time and money on lawyers that you could rather spend on development?
The author quotes Google v. Oracle, where the case was about using headers for compatibility: IIRC, to provide an alternative implementation.
This is different from vv which uses the headers to link to the GPLed code.
In most jurisdictions the GPL is a license, not a contract, and it is deliberately designed not to be a contract.
That said, as far as I can see vv is in breach of the GPL. This is a case of someone who wants there to be a loophole convincing themselves there is one.
I would definitely not redistribute vv because of that. More importantly I think it likely that people packaging software for Linux repos are not going to want to take the risk, and many will object to trying to find a loophole in GPL on principle too.
Let me guess, doesn't support HDR images?
Let me guess, didn't read the article?