• yjftsjthsd-h 5 hours ago

    From https://www.tech-pubs.net/wiki/IRIX_101 -

    > Why are machines that run IRIX expensive?

    > There are no new machines being produced, and high-end machines are in short supply. Less expensive machines can be had. We do not recommend using eBay to search, as the prices are usually extremely inflated.

    Less expensive machines can be had on the non-eBay used market, or have we hit the point where the cheapest way to have such a machine is dedicated emulation like https://dmitry.gr/?r=05.Projects&proj=33.%20LinuxCard ?

    • ThatGuyRaion 4 hours ago

      If we can get an FPGA core of the Indy's XL/24 board and its helper chips, you could buy a cheap pack of R5ks on Alibaba (last I checked) and probably make a smol SGI board

    • neilv 2 hours ago

      Nice. Couple comments:

      * If you can find ways to collect and preserve copies of the software, that can be of big value. Every version and variation of a title can turn out to be important after the fact. Maybe use archive.org for historical archiving of software with unclear licensing status, and link to it from your wiki, with your wiki providing the background text and organization that isn't archive.org's strong suit. (I'm not talking about piracy, but just trying to ensure that any copy at all survives, which has been a real problem on some other platforms of this era. Also, it's easier to get a company to say that such-and-such software from a company three acquisitions ago is OK for people to run on vintage boxes and in emulators and museums than to ask them to find and provide a working copy, which they usually cannot.)

      * You might want to capture provenance/copyright info for uploaded photos, like Wikipedia kinda does. Easier to capture it at upload/linking time, than to try to reconstruct it later.

      • tombert 5 hours ago

        I guess I sort of get it, but I always kind of wondered why Silicon Graphics never made a "prosumer" computer. With the (kind of) exception of the Nintendo 64, it seems like most of the Silicon Graphics machines were tens of thousands of dollars in the 90's.

        I am curious what the world would be like now if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it. For that matter, I sometimes wonder what it would be like if Nintendo had released kits for the N64 that let you use it more as a computer.

        I'm just thinking that it's all about timing; if they had released something in 1997 for "prosumers", before OS X came out, would Apple have its same market position now? Would we all be using SGiPhones? Would every tech startup get all their engineers little SGI laptops?

        • sien 5 minutes ago

          They sort of tried with a machine running Windows.

          The SGI Visual Workstation was meant to be that. It was quite a bit more expensive than $2K though.

          https://en.wikipedia.org/wiki/SGI_Visual_Workstation

          Where I worked in 1999 we got one. They had a neat unified memory architecture. For some video tasks they were fantastic.

          But the issue was that Intergraph had better machines for PC 3D.

          https://en.wikipedia.org/wiki/Intergraph

          Then Intergraph got squeezed out of PC 3D as Nvidia leveraged the huge demand for gaming hardware to improve their GPUs again and again.

          • nyrikki 3 hours ago

            The SGI Indy was their low-end line, with a price starting around the $5K range, which was much cheaper than the Indigo2 line. That was about the same price range as a Dell Dimension Pro150 at the time.

            They went on an acquisition binge in the mid 90's, buying Cray, Alias, and Wavefront just when the PC industry buses were starting to catch up, and then they announced that they were dumping MIPS and moving to Itanium....

            But the sub $2k price wasn't really possible. Remember even the 256K version of the Pentium Pro was more than $1000 for the CPU alone in the mid 90's.

            The UNIX Wars, PC improvements, NT optimism, the Itanium mess and directionless M&A killed a lot of companies: DEC, Novell, SGI etc...

            Dell targeted that high-end consumer market, with enthusiasts as their target, to avoid the higher support costs and low profit of the home market.

            As someone who was running SGIs at the turn of the century in the movie industry...nothing about SGI really promoted brand loyalty; it was a tool to run software that you needed. Once Maya was ported to Linux, you were far better off running a cluster than buying more expensive SGI units.

            The good old days almost never were. I still remember having to re-insert the same CD what felt like dozens of times for even a simple IRIX install. I also remember that the remote install required bootp, tftp, and password-less root rsh! on the server...so you ended up using the CDs to install.

            That said, I do miss CXFS and CXVM, their clustered filesystem and volume manager.

            • tombert 3 hours ago

              Yeah that's fair, as I mentioned, I wasn't really doing anything with Silicon Graphics in the 90s, so I'm only able to look back at this stuff from a kind of historical perspective, and it's easy for these things to kind of feel more legendary than they actually were as a result.

              $2,000 might have been a bit ambitious, but sub-$3,000 does seem like it was achievable, though it does make sense that Dell kind of swooped in and took that market.

              You know, it's weird, because I always kind of considered Linux a bit of a niche geeky thing. I run it, I like it, but I always sort of felt like I was the weird one, and I think for consumers I am, but reading about this stuff it looks like Linux caught on in the workstation space pretty quickly. I noticed that expensive professional video editing software has been used on Linux for a while; it's just the consumer and prosumer side that struggled, and as you mentioned, Maya has had a Linux port for quite a while.

              • nyrikki an hour ago

                While way too complex to accurately flesh out here, Linux took off mainly because the GNU userland had been in progress for almost a decade before and was ready, while the legal issues and rift around 386BSD caused it to miss the window of common CD-ROM drives and the commercial internet.

                But yes, I remember being forced to do some silly radio show in ~1997-8 where I was brought in to talk about Linux, as I had saved him a lot of money setting up DNS/Sendmail/POP on PCs vs the DECstations he was using.

                Harley Hahn was the other person on the show, and despite NASA and others using it, he was insistent that it would never replace commercial UNIX.

                As I was maintaining OSF/1, SunOS, IRIX, Interactive, SCO, AIX, AT&T, and Coherent UNIXes at that point, along with systems like MPE/iX, OS/400, OS/2, NT, NetWare, etc. day to day, it was quite obvious how much time and money it saved.

                IIRC an SCO single-user desktop license w/networking was ~$800 and you had to pay another $600 to get the development kit...without updates.

                Obviously the huge bump in popularity was when LAMP became popular.

                • martinpw 2 hours ago

                  > it looks like Linux caught on in workstation space pretty quickly.

                  At least for film and visual effects, this is because the industry was previously dominated by SGI and most major studios had pipelines built on IRIX.

                  When NT systems became cost competitive, those studios were reluctant to switch even though the hardware was compelling, due to major challenges with the OS switch. However once Linux came along, it was easy since the OS was so similar. Cheaper hardware, same-ish OS. Done deal.

                  • giantrobot 2 hours ago

                    Even a sub-$3k SGI would have been extremely difficult to pull off. PC manufacturers, even ones making higher end workstations, benefitted from Intel's economies of scale. Even if they were ordering custom motherboards they were picking a lot of components out of a catalog.

                    SGI machines had a lot more bespoke components. Even if they chased higher volumes they would have had far smaller scale than the PC space. And how would SGI chase volume even if they had a "cheap" option? They had no experience with retail sales. None of their reps would have tried to sell machines with lower price tags and sacrifice a commission.

                    The Indy (at $5k to start) was derided as an "Indigo without the go". I don't think SGI would have had any success building their own take on the PowerMac 4400. If an Indy couldn't make SGI customers happy there would be no way for a lower spec machine to do so.

                • indymike 32 minutes ago

                  > I am curious what the world would be like now if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it.

                  I'm not sure it would have made much of a difference as the low end of the market was driven by gaming and multimedia (i.e. movies on PC screen), and the high end of the graphics market was driven by rendering for expensive video and film production. In short, what you needed to play Fallout (1997 release) at home was an order of magnitude less than you needed to render CGI for Men in Black (1997 release).

                  > if they had released something in 1997 for "prosumers", before OS X came out, would Apple have its same market position now

                  I believe Apple would have been just as successful as they are today. Software availability (e.g. Adobe and Microsoft Office) and ease of use were very good on Mac, and not so good on workstations like SGI. There was a very small, very demanding market for SGI's workstations (rendering super high quality graphics), and there was a huge market for what Apple supplied (business, print, web graphics, music production, video production). In short, the software ecosystem wouldn't have happened. When OS X came out, Apple did a great job of making it easy-ish for app developers to port to OS X, and it let users run their old Mac software on their new OS X-powered Macs.

                  > Would we all be using SGiPhones?

                  Probably not. System V UNIX was very expensive to license (hence Android being Linux-based and iOS being based on BSD and Mach) and would have added considerable cost to each mobile device based on licensing at the time. A lot of what made it possible to package up a modern smartphone was open source software + low-cost components with ridiculous capability (for their cost). None of this was of interest to SGI, which was focused on high-end equipment with little commodity appeal.

                  • browningstreet 5 hours ago

                    I had an Indy on my desk when it came out, with a 21" CRT monitor. I think they thought that's what the Indy was. I'd still go downstairs to another lab to play with the NeXT cube that was there.

                    It was the culture of the time -- they didn't want to move downmarket and seem like PCs, so they (and their competitors) kept the workstation class a bit more premium and hoped to get enough business/tech purchases from the likes of academia and entertainment. They weren't really trying to sell to devs so much.

                    • tombert 5 hours ago

                      I am sure there were a whole bunch of factors, but at least from my lay perspective, a recurring theme seems to be that these companies underestimated how good even relatively cheap Intel CPUs were going to get in the 90's. That made it so even consumers could afford a pretty powerful computer, or at least one powerful enough to be "useful", and then it becomes less obvious why you'd spend $5,000-15,000+ on a fancy workstation; if I can get 80% of the value of an SGI with just a decent Pentium and a bit of extra RAM, for 1/4 the cost, most people are going to go with that.

                      Of course, I don't know what I'm talking about; I'm confident a lot of people on HN know more about this than I do. This is just what a Wikipedia-level understanding seems to indicate to me.

                      Still, I do like to think about it. A part of me thinks, and I have no way to confirm this, that there might have been bigger ambitions for the N64, to convert it into a "real" computer, so you could have a "real" SGI machine at home (though obviously less powerful than an Indy or something).

                      • dekhn 2 hours ago

                        I got a lot of flak from leadership at the lab where I worked in '97 and '98 when I bought a dual Pentium Pro 200MHz and later a dual Pentium II 400MHz that ran Linux. They asked why I didn't buy HP or SGI desktops. I pointed out that the grant I had - for $5K - wasn't really enough to buy a single machine from HP or SGI (our lab had the cheapest SGI Indy, which IIRC was $10K), while for that amount of money I could get dual-processor experience and more RAM/disk than the lowest-end HP or SGI. I ported many codes over from SGI/Tru64 to Linux on those machines.

                        Later I got a $15K budget to build a cluster, and again was asked why I didn't just buy a single SGI machine, and I pointed out I was able to get six PCs, each of which was 50% the speed of a single SGI machine, and I could cluster them to get experience running MPI jobs. 10 years later, all the labs had retired their non-Linux machines and were running various white box PCs or HP servers running Linux, and I had 10+ years of experience running HPC jobs on Linux clusters.

                        If you had a very large budget you could get high-end SGIs with custom capabilities that no PC at the time could match, but even then I specifically chose not to need those capabilities until they were commodity.

                        • octorian 4 hours ago

                          One thing about this period that kinda annoyed me in the tech press is that it always felt like these companies were making new/better computers for "their existing customers", as if they were only ever competing against their own older products.

                          Another thing, which perhaps "grinds my gears" a lot more, is that this late-90's/early-00's shift to PCs happened before Linux was taken sufficiently seriously. So lots of high-end applications that started on UNIX migrated over to Windows NT. And once you're firmly on Windows, it's much harder to go to Linux. (Whereas commercial UNIX to Linux is easy.)

                          So now there are whole markets (that used to support commercial UNIX) where Linux users get the middle finger, and as someone who hates Windows, this really ticks me off.

                          • tombert 3 hours ago

                            I hate Windows now because I'm comparing it to modern Linux and macOS, but in the 90's wasn't Windows NT pretty competitive with some of the commercial Unix offerings? I thought it had some pretty cool stuff with regard to non-blocking I/O, and NTFS was ahead of its time compared to most of the Unix filesystems of the era, I think?

                            I wasn't really writing code in the 90s, so it's tough for me to say with confidence, but I thought I read somewhere that Windows NT was, in some regards, objectively better than most of its competition in the 90's.

                            • p_ing 2 hours ago

                              The NT kernel was designed around non-blocking async I/O; all I/O in NT is async and leverages completion ports. I believe Solaris is the only other OS with IOCP.
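
                              For illustration, the completion-port model in its minimal form looks roughly like the C sketch below, using only standard Win32 calls (CreateFileA with FILE_FLAG_OVERLAPPED, CreateIoCompletionPort, ReadFile, GetQueuedCompletionStatus); the "example.txt" path is a placeholder and error handling is pared down:

                                #include <stdio.h>
                                #include <windows.h>

                                int main(void)
                                {
                                    /* Open a file for overlapped (async) I/O; the path is a placeholder. */
                                    HANDLE file = CreateFileA("example.txt", GENERIC_READ, FILE_SHARE_READ,
                                                              NULL, OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);
                                    if (file == INVALID_HANDLE_VALUE) return 1;

                                    /* Create a completion port and associate the file handle with it. */
                                    HANDLE port = CreateIoCompletionPort(file, NULL, /* key */ 1, 0);
                                    if (port == NULL) return 1;

                                    char buf[4096];
                                    OVERLAPPED ov = {0};  /* read from offset 0 */

                                    /* The read returns right away; the kernel queues the result to the port. */
                                    if (!ReadFile(file, buf, sizeof buf, NULL, &ov) &&
                                        GetLastError() != ERROR_IO_PENDING)
                                        return 1;

                                    DWORD bytes = 0;
                                    ULONG_PTR key = 0;
                                    OVERLAPPED *done = NULL;

                                    /* A worker thread (here, main) blocks until some I/O completes. */
                                    if (GetQueuedCompletionStatus(port, &bytes, &key, &done, INFINITE))
                                        printf("read %lu bytes\n", (unsigned long)bytes);

                                    CloseHandle(port);
                                    CloseHandle(file);
                                    return 0;
                                }

                              The scaling win is that a whole pool of worker threads can call GetQueuedCompletionStatus on the same port, and the kernel hands each completion to whichever thread is free.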

                              Back then, Linux had a primitive scheduler and then an O(n) scheduler, until 2003. NT shipped with an O(1) scheduler.

                              I personally feel NT still does high pressure memory management better than Linux. No opinion about other OSes, although macOS will ask the user what to force quit -- then again, given people have been seeing memory ballooning in random apps, including OOTB apps on macOS 15... Apple has other issues.

                              NT is great. It's just plagued by Win32. And those designers...

                              https://users.soe.ucsc.edu/~scott/courses/Spring01/111/slide...

                              > In 1996, more NT server licenses were sold than UNIX licenses

                              • smackeyacky 3 hours ago

                                This is pretty much correct. Windows NT was far better than DOS and feature competitive with the Unices of the time, along with being available on more modest hardware.

                                The Unix wars were also raging, and compatibility between Unices still hadn't been sorted out (and arguably never was) so a company with workstation class software had to port their code between mostly compatible operating systems and wildly incompatible GUI frameworks. So shipping an NT product wasn't the big deal that it seemed.

                                • tombert 3 hours ago

                                  That makes sense; even within the Linux world in the year of our lord 2025, binary compatibility is still kind of an issue. I have had issues getting regular Linux binaries working in NixOS [1], and even getting stuff working between Ubuntu and OpenSUSE and Fedora can be a pain.

                                  It totally makes sense why developers would see Windows NT as the future here; it gave you most of the features you'd want from Unix-land (and I think some new stuff too that wasn't available in Unix?), and having a "platform to rule them all" is appealing to most developers for obvious reasons. It doesn't hurt that Win32 isn't too hard to code against (at least it wasn't when I played with it 15 years ago).

                                  I do find it a bit strange that there wasn't really a "de facto" Unix that people coded against, like a clear winner that people liked, but I guess if the hardware was too expensive to run it, that's going to cut down on usage; we didn't really get "standard Unix" until OS X.

                                  [1] https://news.ycombinator.com/item?id=42703720

                            • bombcar 3 hours ago

                              I feel there was a real mental block (that Linux partially helped destroy, mind you) that "real computing" couldn't be done on "commodity chips".

                              And then the Internet blew everything up and out, and Linux machines were cheap and capable of web serving just as well as any workstation.

                          • dekhn 2 hours ago

                            Closest thing I can think of is this: https://en.wikipedia.org/wiki/SGI_Visual_Workstation It had some custom hardware choices different from typical PCs of the day (IIRC it had a crossbar instead of a system bus). It needed a special HAL to boot NT; Windows 2000 was the last version that supported the machine. It was extremely expensive for its capabilities and everybody in my lab looked at it and concluded "SGI is dead."

                            • gregw2 2 hours ago

                              I spent years thinking about this at the time, and wrote up a crystallized super-long "why?" in an SGI retrospective comment on an earlier thread, https://news.ycombinator.com/item?id=39960660 which you might find of interest.

                              Feedback from SGI insiders or pointers to HBR case studies welcome.

                              • tomwheeler 5 hours ago

                                Application compatibility mattered a lot back then. There was no Microsoft Word or Excel for IRIX and Google Docs / Google Sheets didn't exist, for example. Many of the prosumers wouldn't have been able to use it as a daily driver. I recall that Adobe ported Photoshop to IRIX at one point, but I think it was gone by '97.

                                • tombert 4 hours ago

                                  I wonder how much of this is a chicken-egg problem though; IRIX didn't have enough of a user base to justify porting over a lot of these applications, and because they didn't have these applications it couldn't develop a user base big enough, etc.

                                  I think what you're saying is totally correct, if you're going to be spending a lot of money on a computer, you need to make sure it can actually do the stuff you need it to, and in the 90's that does more or less imply Microsoft Office compatibility.

                                  Still, I do kind of wonder if SGI had actually tried to penetrate this market, that maybe they could have made this work. They could have licensed and ported some of these applications over themselves, and if they had gotten big enough then maybe some of these companies would have ported these things over.

                                  It's tough to say.

                                  • kjellsbells 4 hours ago

                                    Well, they tried producing high-end Windows NT machines at one point that ran Office, had OpenGL support, etc. But no one bought them. So it's not clear that doing the same but with Irix would have gone any better.

                                    The last-gasp part of SGI I would have liked to see develop was Cellular IRIX. If I recall, it was a sort of distributed OS where the scheduler was node-aware and could parcel out tasks within a single process to other nodes. What NFS did for the local/remote file system split, Irix would do for compute. It never came to fruition.

                                    • tombert 4 hours ago

                                      That sounds pretty cool, I guess the idea was that the scheduler could figure out smartly where to put applications and minimize latency in the process?

                                      • kjellsbells 22 minutes ago

                                        Yes, much like you would in an HPC system (much of Cellular IRIX was supposed to have been influenced by what Cray and SGI learned from each other when they merged).

                                        More details in the PDF linked below. Remember this is from 1998. Reading it again in 2024, there is of course quite a bit of foreshadowing of what we ended up with in the cloud. And you won't be surprised to learn that the author ended up spending significant time working at both Microsoft and AWS after their SGI days.

                                        https://cug.org/5-publications/proceedings_attendee_lists/19...

                                • jefflinwood 5 hours ago

                                  That's kind of what they were trying to do with the 320 and 540.

                                  I worked on their e-commerce store; you could configure, buy, and have one shipped from the web site in 1999-2000.

                                  The prices were still really high.

                                  • MisterTea 3 hours ago

                                    > why Silicon Graphics never made a "prosumer" computer

                                    They were stuck in a hopeless corporate mindset where their customers were all deep-pocketed mega corps. You had to go through a sales process to buy one; there was no demo machine at the local computer shop. They also had a brain-damaged sales team that would do things like under-spec hardware with too little RAM to make a sale, leaving customers with costly hardware that struggled under load and damaged their reputation (e.g. imagine a classroom full of under-specced machines leaving students thinking "this POS made Jurassic Park?").

                                    > if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it.

                                    They would look exactly like the Indy or the later O2 which were their entry level machines.

                                    > I sometimes wonder what it would be like if Nintendo had released kits for the N64 that let you use it more as a computer.

                                    I think SD TVs of the time would have made rendering text a bit difficult, forcing you to use large fonts and making any substantial text work impractical (e.g. word processing or code editing). However, a better N64 fantasy IMO would be to include an optical drive in addition to the cart slot, double the RAM, increase the pitiful 4KB texture cache, and then for good measure - an N64-themed keyboard, mouse and Ethernet adapter ;-).

                                    > I'm just thinking that it's all about timing; if they had released something in 1997 for "prosumers",

                                    That line would be the SGI Visual Workstations. The first two, the 320 and 540, had custom motherboards with an SGI chipset, RAM and GPU. The GPU used unified memory and you needed to buy expensive custom SGI RAM DIMMs. They only ever ran Win NT 4.0 and 2k because of the GPU drivers. Then there were the 230, 330, and 550 - straight-up ATX Wintel boxes with Nvidia cards. My main desktop is a 550 case with a Threadripper in it. But again, IMO, they had no clue how to sell to small shops and individuals, and by then it was too little, too late.

                                    • tombert 3 hours ago

                                      > I think SD TV's of the time would have made rendering text a bit difficult forcing you to use large font making any substantial text work impractical

                                      Yeah, in this hypothetical conversion kit they'd have to give you some kind of better connection to plug into a monitor, obviously some kind of expansion kit, maybe a double Expansion Pak, and maybe some kind of external optical drive or something similar to the N64 DD but with a CD drive.

                                      I dunno, I don't think it would necessarily have failed, but we'll obviously never really know. Sony did that with the OtherOS stuff for awhile on the PS3, and I think there was a subset of people who really liked it for scientific computing.

                                    • MrBuddyCasino 5 hours ago

                                      They could have been NVidia (at best, more realistically 3dfx), but certainly not Apple.

                                      • Fnoord 4 hours ago

                                        Back then, when SGI quit the graphics business and went to supercomputers only (having NUMA as IP, and going with Linux), the GPU devs largely ended up at Nvidia.

                                        Apple had to pick Be, Inc. or NeXT (who had Jobs). They went with the latter. Be, later on, went to Palm, but that was too little, too late.

                                        SGI was one of the many Unix mastodons that fell victim to PC hardware (x86-32, then x86-64) and Windows NT, with Linux being good enough, and with Apple taking the graphic design and audio engineering crowd (many of whom came from Amiga, and SGI). Oh, and I forgot to mention: they bet on the Intel Itanic.

                                        I wouldn't pay a dime for these nowadays. Way too slow and inefficient per watt of performance. Which is really sad, but with current energy prices I just wouldn't be able to afford my Octane 2 using 1 kW anymore. I owned many SGI machines and really loved them (the sound card in the Indy was insanely good! And the IndyCam, from a time when hardly anyone had a webcam). Indy (various, including the Challenge S), Indigo (including the purple remake), Indigo 2, and Octane 2. But never an O2 :) nor a Fuel or Tezro ;) The cases, although plastic and sensitive to scratches, were aesthetically great, too. While I would not want such a machine anymore, I will cherish the memories! Plus, there are some people reusing the cases. Cool beans! IMNSHO, SGI machines like the Indy belong in any half decent Unix museum.

                                        • tombert 3 hours ago

                                          At a previous apartment, I managed to convince my wife we needed a giant server rack, so I started looking on Craigslist and eBay for one.

                                          I really wanted an SGI server rack because I thought it looked cool, and because I found them historically very interesting, but sadly those appear to have a bit of a collectors' market and are too expensive. Instead, I ended up buying a Sun Microsystems rack, which to be fair is also pretty cool historically.

                                          I would love to buy a broken SGI machine and put some modern hardware in there, just for the aesthetic. Maybe someday.

                                          • jamesy0ung an hour ago

                                            The old Sun Ultra 24s make cool-looking PC cases as well

                                            • MisterTea 3 hours ago

                                              > I would love to buy a broken SGI machine and put some modern hardware in there, just for the aesthetic. Maybe someday.

                                              The closest you can easily get is the 230/330/550 cases, which are straight-up ATX. I have a 550 case with a Threadripper in it. The only issue was that I had to modify the 750W PSU to fit in the case by replacing the IEC inlet with a short pigtail cord.

                                          • tombert 5 hours ago

                                            Genuine question...why not?

                                            The reason I ask: it's not like they were just making "components" like 3dfx or Nvidia at the time; they were making end-to-end workstations. Why is it such a stretch to think that they could have taken on Apple, especially since Apple in the 90s was kind of languishing?

                                            • bombcar 3 hours ago

                                              It was hard enough for Apple to take on Apple.

                                              NeXT basically tried, and only survived by being acquired.

                                              If they had wanted to survive, they needed to buy Dell at the right time.

                                        • foldor 5 hours ago

                                          How about adding information about IDO, like the static recompilation project being used in N64 game decompilation projects? It enables compiling code in a Linux environment and having it match 1:1 with what the original compiler produced. There's even significant progress in decompiling the compiler itself as well.

                                          https://github.com/decompals/ido-static-recomp

                                          https://github.com/decompals/ido-matching-decomp

                                          • ThatGuyRaion 4 hours ago

                                            As time permits! I'm the only article writer at this time.

                                          • siev 4 hours ago

                                            Nice job!

                                            Very Important: Are you gonna be adding an entry for the Japan-exclusive set-top box that had its own official port of DOOM done by Jonathan Blow? :p

                                            • ThatGuyRaion 3 hours ago

                                              Link?

                                            • blankx32 6 hours ago

                                              Images broken?

                                              • ThatGuyRaion 4 hours ago

                                                Where?

                                              • ur-whale 4 hours ago

                                                I can't believe your site does not include the notorious SGI screwdriver, the mighty SGI #9980915

                                                http://industrialarithmetic.blogspot.com/2011/04/sgis-finest...

                                                https://www.reddit.com/r/retrobattlestations/comments/fele3v...