feedmegin
Jul 30, 2008

ate all the Oreos posted:

how practical would it be to just make them like, a standard socket like the CPU?

It's sort of been covered already, but I have bad news about how 'standard' CPU sockets actually are.

josh04
Oct 19, 2008


"THE FLASH IS THE REASON
TO RACE TO THE THEATRES"

This title contains sponsored content.

Doc Block posted:

isn't part of the reason that opencl never caught on was because apple was the only one really invested in it?

like, didn't apple ask nvidia to do the initial design/implementation back when they were still friends, but nvidia would only do it if opencl was published as an open standard? and didn't intel insist on it being gimped and having all that dumb "BUT ALSO RUN KERNEL ON CPU!" garbage to get them to support opencl on their integrated GPUs, because intel still thought realtime raytracing on the CPU was gonna be a thing and that it would kill ATI and NVIDIA?

i could be (read: probably am) completely misremembering the previous paragraph, but it does seem like apple was the only company that tried to push opencl at all.

it was basically an apple invention, iirc, until they lost interest in favour of Metal. nvidia were interested exactly as far as making sure opencl support wasn't a reason not to buy an nvidia card, and amd were interested because they needed something to compete with cuda, but they've put more effort into various hokey run-cuda-by-parsing solutions over the last few years.

and a whole host of other companies (especially mobile ones) were briefly interested in it as a selling point, but google shot that down by not supporting opencl on the nexuses, in favour of something called renderscript, as part of an internal google turf war.

OzyMandrill
Aug 12, 2013

Look upon my words
and despair

pvr is tile-based, but internally uses a scanline renderer. tris within a tile are z-sorted, and then only the topmost pixel at each position along a scanline actually gets drawn. the actual fill rate is pretty poor compared to amd/nvidia as it uses more general processors instead of dedicated vertex/pixel pipelines, which are considerably simpler in terms of the instructions they can execute. pvr really falls down on multiple layered transparencies and full screen effects, which the normal gpus chomp through with much better performance.
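
(roughly, and purely as an illustration: every struct and name in the c++ sketch below is invented, it's not how the real pvr silicon is organised, it's just the "depth-test everything in the tile first, shade only the winner" idea)

code:
#include <array>
#include <cstdint>
#include <iostream>
#include <vector>

constexpr int TILE = 32;

struct Tri {
    int x0, y0, x1, y1;   // screen-space bounding box, a stand-in for real coverage
    float depth;          // pretend the triangle is depth-constant
    uint32_t colour;      // what the "expensive" shading would produce

    bool covers(int x, int y) const { return x >= x0 && x < x1 && y >= y0 && y < y1; }
    uint32_t shade() const { return colour; }   // the costly part on real hardware
};

void render_tile(const std::vector<Tri>& binned, std::array<uint32_t, TILE * TILE>& out)
{
    for (int y = 0; y < TILE; ++y)
        for (int x = 0; x < TILE; ++x) {
            const Tri* front = nullptr;
            for (const Tri& t : binned)                       // depth test every binned prim first...
                if (t.covers(x, y) && (!front || t.depth < front->depth))
                    front = &t;
            out[y * TILE + x] = front ? front->shade() : 0;   // ...then shade only the visible one
        }
}

int main()
{
    std::array<uint32_t, TILE * TILE> tile{};
    render_tile({{0, 0, 32, 32, 0.9f, 0xff0000u}, {8, 8, 24, 24, 0.1f, 0x00ff00u}}, tile);
    std::cout << std::hex << tile[0] << " " << tile[10 * TILE + 10] << "\n";  // ff0000 ff00
}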

deferred rendering is becoming ubiquitous. for those who don't follow this stuff, that's where you first draw everything with no textures just to work out the z-buffer: you have to run all the vertex shaders, but the pixels only write depth. then you render everything again with full pixel shaders, so only the visible pixels incur the cost of the pixel shader, but you don't write out final colours, you just store the normal/base colour/roughness/conductivity/etc. then you do a full screen pass that applies shadows/lighting/reflections based on those normals/physical parameters, once for each pixel in the final image. then you do the fullscreen post process effects (colour balance, tone mapping, antialiasing, depth of field, fog, etc). as the hw is designed to do multiple passes over the scene, it's optimised to do these passes very efficiently.
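
(in made-up pseudocode terms it's the pass ordering that matters; none of the types or calls below are a real engine api, they're stand-ins so the structure is visible)

code:
#include <vector>

struct Mesh {};
struct Light {};
struct GBuffer {};      // would hold depth + normal + base colour + roughness + ...
struct Framebuffer {};
struct Scene { std::vector<Mesh> meshes; std::vector<Light> lights; };

// stand-in draw calls, no-ops here
void draw_depth_only(const Mesh&, GBuffer&) {}             // vertex shaders run, pixels write depth only
void draw_gbuffer_attributes(const Mesh&, GBuffer&) {}     // store normal/base colour/roughness, not final colour
void shade_fullscreen(const GBuffer&, const std::vector<Light>&, Framebuffer&) {}  // lights/shadows/reflections, once per pixel
void postprocess_fullscreen(Framebuffer&) {}               // tone mapping, AA, depth of field, fog, ...

void render_frame(const Scene& scene, Framebuffer& screen)
{
    GBuffer gbuf;
    for (const Mesh& m : scene.meshes) draw_depth_only(m, gbuf);           // 1. z pre-pass
    for (const Mesh& m : scene.meshes) draw_gbuffer_attributes(m, gbuf);   // 2. geometry pass, only visible pixels pay
    shade_fullscreen(gbuf, scene.lights, screen);                          // 3. deferred lighting pass
    postprocess_fullscreen(screen);                                        // 4. fullscreen post-process pass
}

int main() { Scene s; Framebuffer fb; render_frame(s, fb); }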

recent hardware has introduced 'tiles' to these buffers too, but it's fundamentally different. pvr fills each tile with final colours from z-sorted lists of prims, whereas regular cards use tiles as cache units and to track simple state flags that accelerate the composite/clear passes. there's no storing or sorting of prims the way pvr does it.

Trimson Grondag 3
Jul 1, 2007

Clapping Larry
i'm irrationally happy that powerVR came out of the 3dfx days as the eventual victor. probably not the same company any more i know

Shame Boy
Mar 2, 2010

feedmegin posted:

It's sort of been covered already, but I have bad news about how 'standard' CPU sockets actually are.

eh, i was kinda thinking it wouldn't be that different from how PCIe gets a new revision every couple of years, but then i remembered that that's actually a proper standard and it's backwards compatible, and good luck doing that with a chip lol

feedmegin
Jul 30, 2008

Trimson Grondag 3 posted:

i'm irrationally happy that powerVR came out of the 3dfx days as the eventual victor. probably not the same company any more i know

Well, now it's Imagination, which is p much going to go broke now that Apple has decided to use their own GPU instead of licensing PowerVR.

Truga
May 4, 2014
Lipstick Apathy

ate all the Oreos posted:

eh, i was kinda thinking it wouldn't be that different from how PCIe gets a new revision every couple of years, but then i remembered that that's actually a proper standard and it's backwards compatible, and good luck doing that with a chip lol

tbh, it wouldn't be too big a stretch with hbm coming down the pipeline into consumer poo poo soon: have a big slot for the gpu on the mobo, call it the accelerated graphics slot or whatever, and have gpus basically be socs, like where amd and intel are already moving with their desktop cpus (amd even has the efi/bios in the chip now; apart from ram, everything on the motherboard is optional, and hbm is going to take care of that in the future). that'd probably work.

what won't work though, is getting 250+ watts of heat off of that chip :v:

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

in the hpc space nvidia is pushing nvlink, which can do gpu-gpu and gpu-cpu links; you can get it on a power8 or power9 cpu from ibm, but intel isn't really interested in that, obvs; anything that needs more than gen3 x16 is a competitor

still a lot slower than on-die hbm tho

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

josh04 posted:

it was basically an apple invention, iirc, until they lost interest in favour of Metal.

there's quite a bit of time between the two:

OpenCL was announced at WWDC 2008 and introduced in Snow Leopard in 2009
Metal was announced at WWDC 2014 and shipped that fall on iOS and the following fall on macOS

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
i keep working with amd engineers working on opencl, actually, so presumably they haven’t completely lost interest

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

rjmccall posted:

i keep working with amd engineers working on opencl, actually, so presumably they haven’t completely lost interest

opencl is their only hope of dealing with Cuda, which is the golden ticket to hpc and data center riches

for example, virtually none of the big dnn frameworks have decent (if any) opencl support

they do have amazing Cuda support

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
I wanna know about the bizarre architectures that must be used on the third tier GPUs like vivante

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
also I stumbled on a Verilog implementation of gcn on GitHub which is p cool

Workaday Wizard
Oct 23, 2009

by Pragmatica
as i understand it, nvidia and other companies don't manufacture their physical chips. what prevents the manufacturers from copying the design themselves?

Truga
May 4, 2014
Lipstick Apathy
patent/copyright laws.

SO DEMANDING
Dec 27, 2003

Malcolm XML posted:

third tier GPUs

you're talking embedded but this got me thinking about old desktop cards

matrox parhelia

SiS xabre

XGI volari (apparently they had a dual-GPU card :pwn:)


i still wonder who bought any of that crap because as i recall they were all total garbage compared to ATI and nvidia

Shame Boy
Mar 2, 2010

so i dabbled in a tiny bit of opencl a while ago and eventually i'd like to come back around to it and play around with it a bit because i'm a turbodork who finds this kinda thing interesting, should i even bother with it if it's going out of style or should i look into something else instead?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SO DEMANDING posted:

you're talking embedded but this got me thinking about old desktop cards

matrox parhelia

SiS xabre

XGI volari (apparently they had a dual-GPU card :pwn:)


i still wonder who bought any of that crap because as i recall they were all total garbage compared to ATI and nvidia

A special warning about Matrox: they are from Canada, and so their products do not meet the demanding electrical standards required of American computer equipment, and because their chipsets are mostly illegal counterfeits of American chips, you may encounter unexpected bugs and compatibility problems with standard APIs such as DirectX. ATI is also Canadian and has the same problems; counterfeit computer part manufacturers are attracted to Canada by its lax intellectual property laws.

Truga
May 4, 2014
Lipstick Apathy
the hottest take

Bulgakov
Mar 8, 2009


рукописи не горят

them bit boys in trouble again!

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

SO DEMANDING posted:

you're talking embedded but this got me thinking about old desktop cards

matrox parhelia

SiS xabre

XGI volari (apparently they had a dual-GPU card :pwn:)


i still wonder who bought any of that crap because as i recall they were all total garbage compared to ATI and nvidia

wasn't matrox notable for having video cards with three outputs back when most still had two at max?

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
somewhere in my boxes of boards i have what was an extremely expensive PCI 3D accelerator made by evans and sutherland. i oughta build a 90s pc workstation someday

freedom graphics or realimage 2000 or something? it has tiny fans on the main chips, a daughterboard, and at least one simm slot

looked somewhat similar to this AccelGraphics Eclipse, which is E&S based:



I also have some prototype of some "realizm" card or other IIRC. Intergraph, not the later 3DLabs ones

atomicthumbs fucked around with this message at 02:27 on Oct 5, 2017

Cocoa Crispies
Jul 20, 2001

Vehicular Manslaughter!

Pillbug

Farmer Crack-Ass posted:

wasn't matrox notable for having video cards with three outputs back when most still had two at max?

idk we have a whole rack of cpu-heavy 2016 servers at the office and they have like the matrox dorito "gpu" in 'em

Silver Alicorn
Mar 30, 2008

𝓪 𝓻𝓮𝓭 𝓹𝓪𝓷𝓭𝓪 𝓲𝓼 𝓪 𝓬𝓾𝓻𝓲𝓸𝓾𝓼 𝓼𝓸𝓻𝓽 𝓸𝓯 𝓬𝓻𝓮𝓪𝓽𝓾𝓻𝓮

atomicthumbs posted:

somewhere in my boxes of boards i have what was an extremely expensive PCI 3D accelerator made by evans and sutherland. i oughta build a 90s pc workstation someday

freedom graphics or realimage 2000 or something? it has tiny fans on the main chips, a daughterboard, and at least one simm slot

looked somewhat similar to this AccelGraphics Eclipse, which is E&S based:



I also have some prototype of some "realizm" card or other IIRC. Intergraph, not the later 3DLabs ones

bitchinfast 3d 2000

Arcteryx Anarchist
Sep 15, 2007

Fun Shoe
remember 3dfx and their crazy 4(+?) core cards that never made it to production iirc

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.
the voodoo5 6000 was supposed to have four separate discrete processors, not just cores

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

there's a few dozen prototype 6000s floating around apparently

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

atomicthumbs posted:

somewhere in my boxes of boards i have what was an extremely expensive PCI 3D accelerator made by evans and sutherland. i oughta build a 90s pc workstation someday

kind of like this one on eBay but PCI?

josh04
Oct 19, 2008


"THE FLASH IS THE REASON
TO RACE TO THE THEATRES"

This title contains sponsored content.

ate all the Oreos posted:

so i dabbled in a tiny bit of opencl a while ago and eventually i'd like to come back around to it and play around with it a bit because i'm a turbodork who finds this kinda thing interesting, should i even bother with it if it's going out of style or should i look into something else instead?

it still has the advantage of being multi-platform and multi-manufacturer. on my macbook pro i can run tasks on both of the internal gpus, etc., and nvidia added opencl 1.2 support to their driver in the last two years, so they all support roughly the same featureset.

it's just sad reading about cl 2.0 and not being able to get any of the nice features unless you have a specific AMD card and do the magic driver rain dance.
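
(for anyone curious, the multi-platform bit really is just the stock host api; a minimal c++ sketch below, error handling stripped and nothing vendor-specific, that lists every opencl platform and device it can see)

code:
#include <cstdio>
#include <vector>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main()
{
    cl_uint nplat = 0;
    clGetPlatformIDs(0, nullptr, &nplat);
    std::vector<cl_platform_id> plats(nplat);
    clGetPlatformIDs(nplat, plats.data(), nullptr);

    for (cl_platform_id p : plats) {
        char pname[256] = {};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(pname), pname, nullptr);

        cl_uint ndev = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &ndev);
        std::vector<cl_device_id> devs(ndev);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, ndev, devs.data(), nullptr);

        for (cl_device_id d : devs) {
            char dname[256] = {};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(dname), dname, nullptr);
            std::printf("%s: %s\n", pname, dname);   // e.g. the igpu, the dgpu, and the cpu device
        }
    }
}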

Shame Boy
Mar 2, 2010

josh04 posted:

it still has the advantage of being multi-platform and multi-manufacturer. on my macbook pro i can run tasks on both of the internal gpus, etc., and nvidia added opencl 1.2 support to their driver in the last two years, so they all support roughly the same featureset.

it's just sad reading about cl 2.0 and not being able to get any of the nice features unless you have a specific AMD card and do the magic driver rain dance.

yeah the thing that appealed to me was supposedly being able to compile it to run on an FPGA somehow??? which seems real cool since I have a few FPGA dev boards I'd like more excuses to play with

someone above mentioned they made (or are making?) some translation layer to translate it to the new thing (vulkan? i have no idea what these things are), so i guess it's reasonably future-proof to go with for at least a little while then?

josh04
Oct 19, 2008


"THE FLASH IS THE REASON
TO RACE TO THE THEATRES"

This title contains sponsored content.

altera sell fpgas which run opencl. i've only had a cursory look at them, though, and i suspect you'd still be doing a lot of work specific to the fpga to get decent results.

Xarn
Jun 26, 2015

josh04 posted:

altera sell fpgas which run opencl. i've only had a cursory look at them, though, and i suspect you'd still be doing a lot of work specific to the fpga to get decent results.

That is a pretty safe bet, just like with Xeon Phi. You can theoretically run unchanged CPU code there, but it will have hilariously bad perf.

Notorious b.s.d.
Jan 25, 2003

by Reene

atomicthumbs posted:

somewhere in my boxes of boards i have what was an extremely expensive PCI 3D accelerator made by evans and sutherland. i oughta build a 90s pc workstation someday

freedom graphics or realimage 2000 or something? it has tiny fans on the main chips, a daughterboard, and at least one simm slot

looked somewhat similar to this AccelGraphics Eclipse, which is E&S based:



I also have some prototype of some "realizm" card or other IIRC. Intergraph, not the later 3DLabs ones

i have a sun workstation that came with a 3dlabs wildcat card, as a cheaper alternative to Sun's own graphics. it is very, very slow, even by circa 2002 standards.

i'm not sure how intergraph or 3dlabs stayed in business so long

was OpenGL CAD/CAM just that fuckin impossible on consumer cards?

Cybernetic Vermin
Apr 18, 2005

in that era i think the intersection between what cad/cam needed and what was accelerated on a consumer graphics card was almost empty, yeah. hopefully someone knows this better than i do, and i don't dare effortpost on the matter, but i believe it comes down to the consumer option being a simple chip that can have (poorly) textured and lit screen-space triangles thrown at it by the cpu at a brisk pace (in total pixels/texels drawn), versus cad/cam chips dealing with abuse of huge display lists of stippled lines and simplistically rendered but hugely detailed meshes, which need gpu support for the transforms and somewhat specialized rendering
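
(to give a flavour of that workload, here's a tiny fixed-function-era sketch: a big compiled display list of stippled wireframe lines that the app just replays every frame. it assumes a legacy opengl context already exists, and the helper name and geometry are made up)

code:
#include <GL/gl.h>

GLuint build_wireframe_list(int segments)
{
    GLuint list = glGenLists(1);
    glNewList(list, GL_COMPILE);
    glEnable(GL_LINE_STIPPLE);
    glLineStipple(1, 0x00FF);              // dashed lines, the classic CAD look
    glBegin(GL_LINES);
    for (int i = 0; i < segments; ++i) {   // tens of thousands of these in a real model
        glVertex3f(0.0f, 0.0f, 0.0f);
        glVertex3f(i * 0.001f, 1.0f, 0.0f);
    }
    glEnd();
    glDisable(GL_LINE_STIPPLE);
    glEndList();
    return list;
}

// per frame the driver/board just replays the whole thing:
//   glCallList(list);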

otoh i learned opengl on one of those wildcat suns back in the day, and as i recall they were indeed pathetically slow, but i think they served entirely as a mostly pointless entry-level thing

Jimmy Carter
Nov 3, 2005

THIS MOTHERDUCKER
FLIES IN STYLE

Shinku ABOOKEN posted:

as i understand it, nvidia and other companies don't manufacture their physical chips. what prevents the manufacturers from copying the design themselves?

now that we're at like 10 nm fabs, the foundries have to spend more money and be bigger, so if TSMC ever gets caught passing poo poo around, everyone will immediately pull their business and they'll be bankrupt in the span of about 7 hours.

The_Franz
Aug 8, 2003

Notorious b.s.d. posted:

i have a sun workstation that came with a 3dlabs wildcat card, as a cheaper alternative to Sun's own graphics. it is very, very slow, even by circa 2002 standards.

i'm not sure how intergraph or 3dlabs stayed in business so long

was OpenGL CAD/CAM just that fuckin impossible on consumer cards?

i think i still have a 3dlabs oxygen card circa 1998 or 1999 somewhere that i got when a university was throwing them out. it had two glint chips and dual monitor support in 1999 so it probably more than doubled the price of the machine it was installed in. by 2001 they were already headed for the dumpster because they were so outdated and slow.

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Farmer Crack-Ass posted:

wasn't matrox notable for having video cards with three outputs back when most still had two at max?

I was legit excited about getting the parhelia for exactly this reason, except then the marquee game with support for it got cancelled (and then much, much later resurrected and rebranded), and the reviews called it garbage.

I could have had multiple monitor life over a decade ago. Alas.

Cybernetic Vermin
Apr 18, 2005

Volmarias posted:

I was legit excited about getting the parhelia for exactly this reason, except then the marquee game with support for it got cancelled (and then much, much later resurrected and rebranded), and the reviews called it garbage.

I could have had multiple monitor life over a decade ago. Alas.

the big question is: did you cast your head?

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Cybernetic Vermin posted:

the big question is: did you cast your head?

No because ultimately I never got it. Maybe it would have.

Shame Boy
Mar 2, 2010

i have an original Ageia PhysX card that i just found lying in a desk at my last job. i took the fan off and polished up the shiny chip to a mirror finish just for fun but idk what else to do with it

i guess i could try very carefully desoldering it and putting it in a little frame?
