|
https://www.youtube.com/watch?v=RuZUPpmXfT0
|
# ? Jun 8, 2019 00:07 |
|
|
|
I was just a little kid, but I remember when our local library replaced its paper card catalog with a system exactly like this. (I had completely forgotten about the beepy keyboards. Those are wonderful.) All the zillion cards from the drawers of the old paper catalog found a new life, becoming the slips of paper on which you could scribble down the Dewey decimal numbers you had found in the computer. Practical, and certainly better than just dumping the lot of them in the trash, but I still found it kind of sad somehow.
|
# ? Jun 8, 2019 00:48 |
|
Vanagoon posted:Haven't seen this mentioned in a while. The DeathStar HDD that failed so hard that it would lathe the magnetic coating off the platters: holy poo poo
|
# ? Jun 8, 2019 03:49 |
|
Does anybody have a link for that wireless modem that LGR used on that terminal? I'm having fantasies about making a terminal for my headless BSD server. How crazy would it be to use a terminal as an aws console?
|
# ? Jun 8, 2019 15:29 |
|
RVWinkle posted:Does anybody have a link for that wireless modem that LGR used on that terminal? I'm having fantasies about making a terminal for my headless BSD server. How crazy would it be to use a terminal as an aws console? Here you go! http://biosrhythm.com/?page_id=1453
|
# ? Jun 8, 2019 15:47 |
|
rockinricky posted:Here you go! More info under the video. https://www.youtube.com/watch?v=fsS0E4G310Y Hooray, my first ever Quote when I meant Edit.
|
# ? Jun 8, 2019 15:47 |
|
rockinricky posted:Here you go! Sold out and have been for some time if Twitter is anything to go by.
|
# ? Jun 8, 2019 15:51 |
|
They have them on Aliexpress. https://www.aliexpress.com/item/WIFI232-Eval-Kit-WiFi-module-board-kit-Development-Board-contain-WIFI232/1852767957.html I am not vouching for this seller or item, but there you go. Actually, looking at this more I am not sure that it's the same thing. Lowen SoDium has a new favorite as of 20:54 on Jun 9, 2019 |
# ? Jun 9, 2019 20:49 |
|
klafbang posted:My old university had a ton of SGI machines (Indys, O2s, Octanes, a few Onyxes and a handful of Origins in the server room). My brother and I scored a couple of SGI Crimsons and a bunch of loose boards for them from a surplus auction at our university back in the early 00s for almost nothing and how I loving wish I'd have just held onto them instead of selling them on as cheaply as we did at the time. ReidRansom has a new favorite as of 21:46 on Jun 11, 2019 |
# ? Jun 11, 2019 21:44 |
|
How does an SGI workstation stack up against a modern consumer graphics card?
|
# ? Jun 11, 2019 21:47 |
|
More or less like a 3DFX Voodoo graphics card
|
# ? Jun 11, 2019 21:50 |
|
Casimir Radon posted:How does an SGI workstation stack up against a modern consumer graphics card? The best graphics available for the Crimson was the Reality Engine option, which gave you resolutions of up to 1600x1200, textures up to 1024x1024, and a fill rate of around 2M pixels/second. That's about five orders of magnitude slower than a modern gaming video card. With the RE option, a Crimson would run you about US$100k. The only step up from there was the Onyx (with RealityEngine2 graphics), which would set you back about US$250k. Deskside SGI workstations without the highest-end options ran around US$30-40k, although if you were buying a shitload of 'em you'd tend to get a substantial price break (and miscellaneous other poo poo---big iron sales reps would throw all kinds of crazy poo poo your way if you were remotely involved in the purchasing process in those days).
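That "five orders of magnitude" holds up as back-of-the-envelope arithmetic; here is a quick sketch, where the RealityEngine figure is from the post above but the modern fill rate is an assumed ballpark, not a quoted spec:

```python
import math

# RealityEngine fill rate is from the post above; the modern GPU
# number is an assumed ballpark figure, not a measured spec.
re_fill_rate = 2e6        # RealityEngine: ~2M pixels/second
modern_fill_rate = 2e11   # assumed ~200 Gpixels/second for a recent card

ratio = modern_fill_rate / re_fill_rate
print(f"speedup: ~10^{math.log10(ratio):.0f}")  # roughly five orders of magnitude
```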
|
# ? Jun 11, 2019 22:09 |
|
Casimir Radon posted:How does an SGI workstation stack up against a modern consumer graphics card? I have no idea if this is true but I remember a few years ago reading that a modern gaming PC is roughly as powerful as the entire render farm ILM used to make Jurassic Park.
|
# ? Jun 11, 2019 23:00 |
|
That seems plausible - a modern gaming PC can render 4K 60fps scenes with a whole lot of layered effects going on. I don't know what resolution they rendered JP at, or how complex the models/textures/effects were, but I suspect you could render them in realtime now with a very comfortable margin.
|
# ? Jun 11, 2019 23:26 |
|
Monday_ posted:I have no idea if this is true but I remember a few years ago reading that a modern gaming PC is roughly as powerful as the entire render farm ILM used to make Jurassic Park. And really all of that is rounding error compared to the much more mature toolsets and workflows that a modern team has. Because with raw render power, if you need more you can either buy more or wait slightly longer, but having poo poo like automatic motion tracking and being able to preview poo poo on your desktop is huge in terms of what's actually feasible.
|
# ? Jun 11, 2019 23:50 |
|
SubG posted:And really all of that is rounding error compared to the much more mature toolsets and workflows that a modern team has. Because with raw render power, if you need more you can either buy more or wait slightly longer, but having poo poo like automatic motion tracking and being able to preview poo poo on your desktop is huge in terms of what's actually feasible. What, like an Xbox?
|
# ? Jun 12, 2019 00:07 |
|
tactlessbastard posted:What, like an Xbox? If you're asking about motion tracking, kinda. poo poo like the Kinect is the low-end, consumer-facing side of motion capture, match moving, and so on. The high end being Andy Serkis in a gimp suit. But none of that was available in 1993 when Jurassic Park was released, and all the animation, compositing, and so on was done by hand. Which means you had a bunch of people spending hundreds of hours of their time manually doing poo poo that can be done automagically in minutes these days. That makes a much bigger difference, in terms of what a production looks like and what projects are considered feasible, than whether it will take your render farm two minutes or twenty hours doing a render.
|
# ? Jun 12, 2019 00:22 |
|
SubG posted:If you're asking about motion tracking, kinda. poo poo like the Kinect is the low-end, consumer-facing side of motion capture, match moving, and so on. The high end being Andy Serkis in a gimp suit. But none of that was available in 1993 when Jurassic Park was released, and all the animation, compositing, and so on was done by hand. Which means you had a bunch of people spending hundreds of hours of their time manually doing poo poo that can be done automagically in minutes these days. That makes a much bigger difference, in terms of what a production looks like and what projects are considered feasible, than whether it will take your render farm two minutes or twenty hours doing a render. he means the xbox is huge, friend.
|
# ? Jun 12, 2019 00:26 |
Kinda nuts to think that there's only like 5-6 years between JP's release and the pre-production of LotR though.
|
|
# ? Jun 12, 2019 00:28 |
|
Pham Nuwen posted:he means the xbox is huge, friend. Old computer hardware gives you a different idea of what constitutes huge. I've broken elevators.
|
# ? Jun 12, 2019 00:43 |
|
SubG posted:
Quite the non sequitur there
|
# ? Jun 12, 2019 00:55 |
|
I remember when rendering Toy Story in real time was seen as being some kind of computer graphics holy grail, can we do that now? I mean rendering the movie exactly as it appeared in theatres on a desktop PC?
Vanagoon has a new favorite as of 05:14 on Jun 12, 2019 |
# ? Jun 12, 2019 05:12 |
|
We're probably starting to come close, as the big limitation was ray tracing, which murders graphics cards even now. RTX is the foot in the door, but it still comes with very substantial FPS loss and certain limits to maintain an acceptable level of performance. Shaders have made rendering materials a lot easier, which was a big thing at the time. Games are also very good at faking a lot of effects and culling a lot of detail that Toy Story would consider. VRAM is another consideration. I reckon something like an Nvidia RTX 5xxxti series in SLI would do it with an indistinguishable level of accuracy.
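The reason ray tracing murders cards is the sheer number of intersection tests: something like the function below runs for every ray, every bounce, and every object. A minimal sketch of the core ray-sphere test (the scene values are made up for illustration):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None.

    Assumes direction is normalized, so the quadratic's 'a' term is 1.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# A ray down the z axis hits a unit sphere centered at z=5 at distance 4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A full renderer repeats this (or a fancier accelerated version) millions of times per frame, which is why real-time ray tracing needed dedicated hardware.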
|
# ? Jun 12, 2019 06:41 |
|
We had a room for our usability+UX team at MS in the early 90s that had a bunch of cool hardware like a NeXT cube and a Sun workstation. You couldn’t actually do much of anything with them but they were fun to play with. Never got an SGI in there because $$ I guess
|
# ? Jun 12, 2019 06:51 |
|
Vanagoon posted:I remember when rendering Toy Story in real time was seen as being some kind of computer graphics holy grail, can we do that now? I mean rendering the movie exactly as it appeared in theatres on a desktop PC? For a fun comparison, watch a Kingdom Hearts 3 cutscene in the Toy Story world vs the actual movie. We're not quite there yet unless you have an absolute beast of a system, but hot drat there are some places that are close, and some where the modern tech looks better.
|
# ? Jun 12, 2019 06:59 |
|
The original Toy Story doesn't look good these days. It was rendered in sub-1080p, shadows are pretty noisy, and geometric and texture detail looks worse than in most games released today.
|
# ? Jun 12, 2019 07:04 |
|
Vanagoon posted:I remember when rendering Toy Story in real time was seen as being some kind of computer graphics holy grail, can we do that now? I mean rendering the movie exactly as it appeared in theatres on a desktop PC? Game graphics are built on a lot of the same underlying tech, so we’ve been there for a while and improving on it. Ray tracing actually wasn’t part of Pixar’s render pipeline in Renderman until Cars, which is insane to think about. Just like video games, they’d fake everything, but you don’t have to worry about anything other than a single frame at a time when doing a movie. The big clever thing Renderman did was drop a bunch of values into a single pixel space, then run a program on the data in each one. That program’s called a shader, and graphics cards have been good enough to do full screen lighting with detail shaders like Toy Story since around the original Xbox, and got way closer in the next gen. At this point it’s just a matter of pushing enough hardware at the problem to match theater quality.
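A shader really is just a little program run per pixel sample, turning the values stored there into a colour. Here's a toy sketch of the idea, stripped down to a single Lambertian diffuse term; the function name and inputs are illustrative, not anything from RenderMan:

```python
# Toy "shader": a function run once per pixel sample that turns the
# stored values (surface normal, light direction, base colour) into
# an output colour. Vectors are assumed normalized.

def diffuse_shader(normal, light_dir, base_color):
    # N·L clamped to zero: surfaces facing away from the light go dark.
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * n_dot_l for c in base_color)

# A surface facing the light head-on gets the full base colour...
print(diffuse_shader((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.2)))
# ...and one facing away gets black.
print(diffuse_shader((0, 0, -1), (0, 0, 1), (1.0, 0.5, 0.2)))
```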
|
# ? Jun 12, 2019 07:42 |
|
Dewgy posted:Game graphics are built on a lot of the same underlying tech, so we’ve been there for a while and improving on it. Ray tracing actually wasn’t part of Pixar’s render pipeline in Renderman until Cars, which is insane to think about. Just like video games, they’d fake everything, but you don’t have to worry about anything other than a single frame at a time when doing a movie. The only thing I know about shaders is that my vidya card is better the more cores it has (it's not, it's an EVGA GT 1030 that allows me to play Yooka-Laylee and A Hat in Time - better than the Sandy Bridge iGPU anyway). I've never really found a good and concise explanation of exactly what all those cores are doing. I remember having an ancient piece of poo poo computer that didn't have any pixel or vertex shaders, and a lot of stuff refusing to run because of it. Was it 3DMark 2001 that had the rolling waves pixel shader demonstration? I think that's what that is. Really impressed me at the time. This effect here, at 11:28 if the timestamp doesn't stick: https://www.youtube.com/watch?v=SDrFj8JK6P0&t=678s Also, the stylesheet on that site you linked is doing something horrible to the text and it makes it nearly unreadable. Had to copy-paste it into notepad to try and parse it.
|
# ? Jun 12, 2019 08:33 |
|
Very basically, they run shaders - little programs that have access to everything in RAM on the graphics card, including the textures, bump maps, 3D scene, the output picture at different stages of rendering, and whatever else the programmers have put in there. A modern rendering pipeline is less "put the scene in ram and let the card draw it" and more like running a whole little render farm on a card with a bunch of different programs tweaking and adding layers. Most of that can be done in parallel (just give a chunk of the image to each core), so more cores = faster. I'm not a graphics guy, so there are a lot of things I don't know here - like how the work is divided between the fixed-function dedicated rendering hardware vs programs on the more generic GPU cores.
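The "more cores = faster" part falls out of every pixel being an independent job: split the image into chunks, hand one chunk to each worker. A minimal CPU-side sketch of that structure (the shade function is a stand-in placeholder, not real GPU code):

```python
# Sketch of embarrassingly parallel shading: the image is split into
# row chunks and each chunk is shaded independently, like GPU cores
# each taking a slice of the screen.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 64, 64

def shade(pixel):
    x, y = pixel
    # Placeholder per-pixel work: a simple colour gradient.
    return (x / WIDTH, y / HEIGHT, 0.5)

def render_rows(rows):
    return [shade((x, y)) for y in rows for x in range(WIDTH)]

if __name__ == "__main__":
    rows = list(range(HEIGHT))
    chunks = [rows[i::4] for i in range(4)]  # 4 "cores", each gets 1/4 of the rows
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = pool.map(render_rows, chunks)
    image = [px for chunk in results for px in chunk]
    print(len(image))  # 64 * 64 shaded pixels
```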
|
# ? Jun 12, 2019 09:25 |
|
SCheeseman posted:The original Toy Story doesn't look good these days. It was rendered in sub-1080p, shadows are pretty noisy, and geometric and texture detail looks worse than in most games released today. Jerry Cotton posted:Quite the non sequitur there 1536x922. Low enough that they did a new render for the blu ray. And since it's a film real time means only 24 fps.
|
# ? Jun 12, 2019 09:39 |
Dewgy posted:Game graphics are built on a lot of the same underlying tech, so we’ve been there for a while and improving on it. Ray tracing actually wasn’t part of Pixar’s render pipeline in Renderman until Cars, which is insane to think about. Just like video games, they’d fake everything, but you don’t have to worry about anything other than a single frame at a time when doing a movie. That's awesome and hilarious. I used to play around with POVRAY back in the early 90s, making huge detailed scenes out of Constructive Solid Geometry (since that's how POVRAY worked, as opposed to meshes of surfaces made up of polygons of arbitrary shape — instead you would construct objects out of mathematically defined spheres and cylinders and planes and such). I still have the renders I made of the Mega Man villains. My first WWIV screen name was "Ray Tracey" lol Reading that stuff makes me feel like a starry-eyed dumbass high school kid again. Even though I know absolutely nothing about the graphics tech that has happened in the intervening time.
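CSG in the POV-Ray style is easy to sketch: a solid is just an inside/outside test, and union/intersection/difference are boolean combinations of those tests. A point-membership-only toy (no rendering; the shapes and names are illustrative):

```python
# POV-Ray-flavoured CSG sketch: solids as inside/outside predicates,
# combined with union, intersection, and difference.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def intersection(a, b):
    return lambda x, y, z: a(x, y, z) and b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# A hollow shell: a sphere with a smaller sphere carved out of its centre.
shell = difference(sphere(0, 0, 0, 2), sphere(0, 0, 0, 1))
print(shell(1.5, 0, 0))  # inside the shell wall -> True
print(shell(0.5, 0, 0))  # inside the carved-out core -> False
```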
|
|
# ? Jun 12, 2019 11:51 |
|
SubG posted:1536x922. Low enough that they did a new render for the blu ray. And since it's a film real time means only 24 fps. No I was implying you're fat lmao laffo lollerkopter
|
# ? Jun 12, 2019 12:59 |
|
Data Graham posted:
|
# ? Jun 12, 2019 13:41 |
|
SubG posted:And since it's a film real time means only 24 fps. Speaking of relics... Now that we've long since moved on to digital projection, why is it that we're still hanging onto 24fps as the standard for movies? I get that anything higher doesn't quite look right, like as in what we expect of a movie, but that's only because we're so accustomed to it, and I'd expect that everyone would quickly reset their expectations if more movies were being distributed in higher frame rates. Is there any other reason though, beyond Hollywood rear end in a top hat studio execs doing their usual thing projecting their own stubborn idiocy onto the wider public?
|
# ? Jun 12, 2019 14:30 |
|
It looks like cheap rear end?
|
# ? Jun 12, 2019 14:32 |
|
ReidRansom posted:Speaking of relics... Now that we've long since moved on to digital projection, why is it that we're still hanging onto 24fps as the standard for movies? I get that anything higher doesn't quite look right, like as in what we expect of a movie, but that's only because we're so accustomed to it, and I'd expect that everyone would quickly reset their expectations if more movies were being distributed in higher frame rates. Is there any other reason though, beyond Hollywood rear end in a top hat studio execs doing their usual thing projecting their own stubborn idiocy onto the wider public? It's a lot more of a complicated issue than "more FPS = better"
|
# ? Jun 12, 2019 14:39 |
|
VFX cost is the big one. All the matchmove, compositing, sims, animation, and rendering workflows are built around 24fps. Moving to 48/60 or whatever dramatically increases the cost, because now the artists have to deal with double or more the number of frames, thus needing way more time and possibly even new pipelines. I've done some 60fps/4K stuff for a few gigs, and it's sorta miserable, especially on a non-prestige budget. Especially with sims/animation, you need to be goddamn spot on because you can't exploit motion blur/quirks of the format anywhere near as well, so any mistakes are immediately apparent.
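The "double or more the frames" point is easy to put numbers on; a quick sketch, where the two-hour running time is an assumed round figure rather than any particular production:

```python
# Frame counts for a feature-length film at different frame rates.
# RUNTIME_MIN is an assumed round number, not a real production's length.
RUNTIME_MIN = 120

for fps in (24, 48, 60):
    frames = RUNTIME_MIN * 60 * fps
    print(f"{fps} fps: {frames:,} frames")
```

Every one of those frames potentially needs tracking, comp, and render passes, which is where the cost multiplies.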
|
# ? Jun 12, 2019 15:02 |
|
Trabant posted:It looks like cheap rear end? I heartily disagree - but I'm one of the weirdos that loved that one part of the Hobbit movies. (Actually being able to follow what happens in action scenes? Such luxuries!)
|
# ? Jun 12, 2019 15:06 |
|
ReidRansom posted:Speaking of relics... Now that we've long since moved on to digital projection, why is it that we're still hanging onto 24fps as the standard for movies? I get that anything higher doesn't quite look right, like as in what we expect of a movie, but that's only because we're so accustomed to it, and I'd expect that everyone would quickly reset their expectations if more movies were being distributed in higher frame rates. Is there any other reason though, beyond Hollywood rear end in a top hat studio execs doing their usual thing projecting their own stubborn idiocy onto the wider public? Because if you shoot in anything besides 24fps a bunch of morons scream themselves hoarse about how "it looks like teevee..."
|
# ? Jun 12, 2019 15:15 |
|
|
I used to work somewhere that would run that software to test video cards. I always wished those were real games, especially the car one with the walker
|
# ? Jun 12, 2019 15:32 |