One thing I was wondering about is how adding a discrete GPU always seems to add like a flat 20-40 watts at the wall even when idle... surely that can be improved? Edit: I thought NVidia had something for that but inexplicably locked it to laptops and not desktops.
# ? Jun 10, 2024 15:32
Rastor posted:Article specifically says it is built on their current Execution Units design and also that it aims for power efficiency, not performance. I don't think there's too much interesting here -- yet. With GPUs these days, power efficiency IS performance.
Anybody expecting Intel to target $300+ gaming GPUs is likely gonna be sorely disappointed.
Zero VGS posted:One thing I was wondering about is how adding a discrete GPU always seems to add like a flat 20-40 watts from the wall even when idle... surely that can be improved more? Try 10W and lower.
Cygni posted:Anybody expecting Intel to target $300+ gaming GPUs is likely gonna be sorely disappointed. It seems like a natural outgrowth of their gaming uarch, and their need for a big GPGPU chip. If you have all the pieces, you might as well put them together.
Cygni posted:Anybody expecting Intel to target $300+ gaming GPUs is likely gonna be sorely disappointed. That's not known at this time, but I don't see what is stopping them. They are bigger than nVidia and AMD combined. And GPUs have advanced beyond being "gamer products," with applications of machine learning exploding the growth of the market. Intel has never had a reason to compete with their graphics products beyond offering barebones 2D/video acceleration and the ability to play Peggle on your grandma's laptop or run WoW at 25fps, but machine learning/HPC has so many applications that sitting on the sidelines doesn't make any sense at all, if they believe they can make a competitive product.
Does Intel really need a high performance GPU to compete in ML/HPC though? They already have Xeon Phi for those markets, which covers similar ground and doesn't waste any die space on vestigial ROPs/TMUs/etc.
I dunno what methodology they're using for that, like multiple ammeters or something, but I've got a Watts Up and constantly switch cards around on my PC, and it's always over a 20 watt difference between having a card and not having one. Edit: But still, even 6 watts like in that test is too much for an idle card. A smartphone at idle is milliwatts, you know?
My AMD card self-reports at ~35W idle (at 300 MHz too, so it's probably mostly VRAM?). I've read that having two monitors of differing resolution increases the idle power significantly, no idea how true this is though.
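For what it's worth, on Linux the amdgpu driver exposes that same self-reported draw through sysfs, so you can log it yourself without trusting a vendor tool. A minimal sketch in Python (the hwmon layout below is the usual amdgpu convention, but the hwmon index varies per machine, so treat the glob and `card0` as assumptions):

```python
from pathlib import Path

def microwatts_to_watts(uw: int) -> float:
    # hwmon power*_average files report microwatts
    return uw / 1_000_000

def read_gpu_power(card: str = "card0"):
    # Typical amdgpu path: /sys/class/drm/card0/device/hwmon/hwmonN/power1_average
    # (N varies per boot/machine, hence the glob)
    base = Path(f"/sys/class/drm/{card}/device/hwmon")
    for f in base.glob("hwmon*/power1_average"):
        return microwatts_to_watts(int(f.read_text().strip()))
    return None  # no readout found (not Linux, not amdgpu, etc.)

if __name__ == "__main__":
    watts = read_gpu_power()
    print("no self-reported draw found" if watts is None else f"self-reported draw: {watts:.1f} W")
```

Comparing that number against a wall meter like the Watts Up is also a decent way to see how much of the extra draw is the card itself versus the VRM/PSU losses feeding it.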
repiv posted:Does Intel really need a high performance GPU to compete in ML/HPC though? They already have Xeon Phi for those markets, which covers similar ground and doesn't waste any die space on vestigial ROPs/TMUs/etc. Xeon Phi is a failure and Intel is discontinuing product development. So yeah, this is going to be their replacement. On the spectrum of "general-purpose processor" to "ASIC", Xeon Phi is too close to the "general-purpose processor" end to get good speedups on ML stuff, and it's too weak to perform well on general-purpose x86 stuff. It's a fence-sitting design that pleased no one and saw no uptake outside of some supercomputers, many of which are replacing them with GPUs or DSPs.
Alpha Mayo posted:That's not known at this time but I don't see what is stopping them. They are bigger than nVidia and AMD combined. And GPUs have advanced beyond being "gamer products" due to applications of machine learning exploding the growth of the market. ML is already going to dedicated architectures. It seems pretty clear that they are also targeting Edge computing, as well as custom designs to stick on EMIB packages for laptops. None of these markets necessitates or even really involves the high-end gaming market. There is a microscopic amount of profit at the high end, and although margins are nice, I doubt Intel will bother making a specific uarch for it. They can let Nvidia play in that little pond while going at them where it hurts: laptops and the $200 and below market. I'm just guessin'/theory crafting, but that's what I expect.
Paul MaudDib posted:Xeon Phi is a failure and Intel is discontinuing product development. So yeah, this is going to be their replacement. Oh, I thought Phi was still a thing in the compute space. It's hilarious that Larrabee failed as a GPU because it was too general-purpose, then got respun into a general-purpose accelerator but failed again because the design was still too general-purpose even for that. How did they ever convince themselves that Larrabee GPUs would be viable? repiv fucked around with this message at 20:59 on Feb 20, 2018
Also, there was just a general lack of uptake of AVX-512, and in particular of applications aware of Phi accelerator cards (so - the same thing AMD is fighting at the moment with OpenCL). The socketed/bootable version only became a thing with Knights Landing, and that's what the Phi's whole "transparently accelerate x86 with AVX-512" model really needed to see acceptance. By that point it was just too late. But yeah, it was just tough to see what it was really supposed to be used for. Besides HPC, and maybe as a second processor in a dual-socket workstation for video work/3d rendering/etc, most commercial workloads either need tons and tons of math (ML) or a regular processor was sufficient. In the VM and datamining worlds, a lot of systems run out of RAM before they run out of processor time, and database servers are limited by IO. And Phi was poo poo on general workloads because it was fundamentally a very weak P5- or Atom-derived core; without its AVX it sucked rear end. It was so general that it wasn't really useful in many markets, very ironic. If you do one thing really well, people will find a use for it. Maxwell was never intended to launch the machine-learning revolution. It's just that it was really good at FP32 (because that's what games use) and it was fast enough that people were willing to work around its limitations. Paul MaudDib fucked around with this message at 21:40 on Feb 20, 2018
And now NVidia is completely aware of the demand for accessible HPC that Intel fails to provide with their overpriced x86 products, and is tailoring their GPUs to all sorts of compute workloads. Meanwhile, x86's market share in general desktop/laptop usage is being challenged hard by ARM in the long term, and being challenged right now from within by AMD's Ryzen/Epyc. They can float on "nobody ever got fired for buying Intel" for only so long.
Ezekial posted:Nvidia and AMD should be in the fetal position. Also, it seems like in the coming year or two I'll finally be vindicated in shorting Nvidia's bubble of a stock.
Somehow a friend of mine, who has a few friends still at Intel, is apparently getting his hands on what will be the successor cards to the 1070. He will let me know when he gets them so I can nab one to tinker on for a little bit. I'd have to go through them for drivers, and all the apps will see it as some Null GPU board, but it will be interesting if he delivers. Might even be sooner than my ASUS RMA. One question I have on the Intel dGPU chat: with the current laptop gen that is VR capable, the Intel iGPU is disabled due to the way the HDMI is routed when the iGPU is on, or something. (Really no idea why they can't either have a physically switchable HDMI port if need be, or hell, just not care about the iGPU when HDMI is in use.) There may be more to it, but I was wondering if the next gen might have a way around that limitation, and whether the next laptop parts from Nvidia/Intel (and maybe AMD finally?) will have iGPU and dGPU together while still offering VR capabilities. It seems that even though dGPUs can idle pretty low, they still eat a ton more power than an iGPU when doing light laptop tasks, which makes the battery life of gaming laptops still suck compared to all the iGPU-powered devices out now. If they could solve this switchable conundrum, that should make for much more portable gaming laptops in the future, wouldn't it? Hell, why not just do a physically switchable setup like Sony did on the early SZs, which required a reboot to switch between iGPU/dGPU, so you can use one or the other, not both?
EdEddnEddy posted:In some way a friend of mine who has a few friends still at Intel, apparently are getting their hands on what will be the successor cards for the 1070. He will let me know when he gets them so I can nab one to tinker on for a little bit. Have to go through them for drivers and all the apps will see it as some Null GPU board, but will be interesting if he delivers. Might even be sooner than ASUS RMA. Haha so these are Turing 2070/1170s? Yeah, sounds like a Q2 launch then. EdEddnEddy posted:One question I have on the Intel dGPU chat. With the current laptop gen that is VR capable, the Intel iGPU is disabled due to the way the HDMI is routed if the iGPU is on or something. (really no idea why they can't either have a physical switchable HDMI port if need be, or hell just don't give a crap about iGPU when HDMI is in use.) There may be more too it but I was wondering if the next gen might have a way around that limitation and the next laptop parts from Nvidia/Intel (and maybe AMD finally?) will have iGPU and dGPU together while still offering VR capabilities? What you're talking about is Optimus. Specific windows can be designated to run on the dGPU and their output is blitted back to the iGPU and composited. GSync (and probably VR) need to have direct access to the display output, either for latency (VR) or for protocol/VRR sync reasons (GSync). For Optimus, the iGPU needs to have control of the display output (since there may be situations in which the dGPU does not run at all). So these options cannot be enabled at the same time. There are definitely laptops where you can switch which GPU has control of the port, but you do need a reboot.
Ugh at the next-gen Nvidia cards probably being just as hard to get as the current gen. The MSI Aero 1080 Ti is in stock atm on Newegg at $799 and I'm strongly tempted to buy it instead of sitting on this 670 for a few more months. edit: gently caress, gone again when I refreshed the page. Should have snapped one up when I had the chance.
Yeah, a Ti for anywhere near $800 is a buy-it-now thing. redeyes fucked around with this message at 01:24 on Feb 21, 2018
redeyes posted:Yeah a TI for anywhere near 800 is a buy it now thing. At $700, buy it now and sell it for $900-$1000, then enjoy your subsidized 1170 in a few months.
ConanTheLibrarian posted:I don't see why Intel wouldn't release mid/high end GPUs if they're going to manufacture discrete cards. There's no doubt Intel's main target is heavy compute workloads, so they will use a scalable design that suits large chips. It's not that far to go to make a gaming variant. Nvidia's profits are high enough to show that expensive GPUs are worth the time. NVidia's gaming profits are high because they basically own the entire dGPU market. For Intel to break into that, they'd have to: (1) Make a competitive uarch, presumably from basically scratch. (2) Scale it up to the point where the boards would be competitive with NVidia. (3) Develop substantial driver-support packages and relationships with developers to be able to release per-game optimized drivers. (4) Manage to do all of that on a budget small enough to allow them to make a profit off the small fraction of the market they'd likely capture. None of these steps are trivial, and none of them really make a lot of sense when they could basically ignore the high-end gaming market like they've been doing for years and specifically target high-performance, high-density, hilariously high-profit compute cards. Though it'll be interesting to see what their solution is for "but everyone already uses CUDA." I mean, if nothing else, if they had some great plan for a gaming-centric uarch, why would they be bothering to license AMD's tech for iGPUs in upcoming chips? Seems like they'd be better off slapping some cut-down version of their home-grown solution in there, instead, if they expected it to be competitive in that sector.
e; I mean, from a straight-up practical standpoint, if Intel really wanted to get into the gaming dGPU market, they'd be a lot better off just buying up or licensing or partnering for a chunk of AMD's GPU IP; if all that's keeping AMD from being able to "fix" their current issues is a lack of cash, Intel can solve that problem real fast, and now they've got a fairly quick-to-market product that would be viable in the marketplace. Certainly safer and faster than trying to do it all themselves, and the fact that AMD already agreed to license some iGPU tech shows that they're at least willing to consider such deals. DrDork fucked around with this message at 02:46 on Feb 21, 2018
There's no incentive for Nvidia and AMD (and even less for retailers) to sell to gamers when datacenters and mining colonies are willing to pay more for more or less the same products. Gamers are basically being gentrified out of the GPU market. Fakeish edit: I mean that as an analogy on a purely rhetorical level and not to paint basement virgins as an oppressed minority group.
Console gamers are all
Here are some ideas for those still lingering inside the proverbial box: 1. Charge $2000 or something but then rebate the purchaser $1 for every hour they spend gaming. Ti cards are reimbursed at 1/2 that amount to ensure that they only go to the most hardcore gaming enthusiasts who legitimately need the FPS. You're probably thinking "That is a fantastic and easily implemented idea, but won't miners just exploit it by running a game until they get their rebate?" Possibly, which is where scenario two comes in. 2. Continue to charge a premium on the cards but create a cryptocurrency that is earned for every hour spent playing a game and can only be traded in for loot boxes, rendering it useless to miners. This is a transparent market-based solution founded on provable contracts. The transactions will be processed on the loading screens of games and at the end of rounds so as not to increase the demand for miners.
So your solution is to sell $2000 cards with the promise that you'll be compensated via lovely loot boxes in games they very well might not want to play? Brilliant! I bet that really moves the *70 units with a quickness.
ItBurns posted:Here are some ideas for those still lingering inside the proverbial box: So, kind of like points you can earn for experience in a game....
Krailor posted:So, kind of like points you can earn for experience in a game.... Yes, but unlike regular experience points these would allow you to progress and gain unlocks
What's more likely to happen is people will sell consoles with crypto miners built in, discounted in exchange for accepting that they can be used to mine at any time, etc.
That's an expressway to me pulling the plug out of the wall when I'm not using it.
BIG HEADLINE posted:Console gamers are all Nintendo's time to shine!
BIG HEADLINE posted:Console gamers are all Console graphics solutions are fairly custom-made, so aside from GDDR5/6 prices loving the living daylights out of console makers, I don't think crypto will impact them that much imho. Unless someone figures out how to use consoles to mine. Kinda glad bitcoin wasn't a big thing back when the PS3 was out.
Uhh. I have a GT 730, refurbished, bought on Amazon from Zotac. Hefty fan, GDDR5 version. Getting random colors: screen goes all blue, switches to green, sometimes back to the OS. Last time I was just googling something [DuckDuckGo] and the screen went orange. Was weird. Then I figured I'd switch the card out, threw in a 3650 and it seemed to run fine, but I didn't leave it in to run full tests because it can't do the correct resolution, as AMD doesn't support drivers for it in Windows 10. So I did what everyone does when they have a potentially broken low-tier card: run the fuzzy donut test. Once FurMark brings it to exactly 49 degrees, all fans go full blast and the screen goes black, so I have to hard reset. I have run FurMark twice now; both times the crash happens at exactly 49. Is the card kaput? I'll be able to get my money back from Amazon most likely, just curious as to what the issue is.
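If you want to pin down whether that 49-degree lockup is really a fixed threshold (which would point at failing hardware hitting a trip point) rather than coincidence, log the temperature once a second while FurMark runs (e.g. `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader -l 1 > temps.log` on NVIDIA's driver) and compare the last reading before each crash. A small sketch of the comparison, with made-up sample runs standing in for real logs:

```python
def final_temp(samples):
    """Last per-second reading before the hard lock - best guess at the crash temp."""
    return samples[-1] if samples else None

def crashes_at_fixed_temp(runs, tolerance=1):
    """True if every run died within `tolerance` degrees C of the others,
    i.e. the failure tracks a temperature threshold rather than random timing."""
    finals = [final_temp(r) for r in runs if r]
    return bool(finals) and max(finals) - min(finals) <= tolerance

# Hypothetical logs for two FurMark runs that both lock up at 49 C,
# like the symptom described above:
runs = [[35, 41, 46, 48, 49], [36, 43, 47, 49]]
print(crashes_at_fixed_temp(runs))  # True - consistent threshold
```

Two or three runs all dying at the same reading is about as clear a "yes, it's the card" signal as you can get without a second known-good card to swap in.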
Sounds like it's hosed. Amazon refund should be a pretty painless process at least.
Ya, poo poo's hosed. Watch EVGA's midweek madness sale; it refreshes at 3AM EST (1h 20m from now). They often have poo poo-tier cards like the GT 710 or GTS 450 on sale with free shipping (normally shipping is up to $20). Make sure it's got the outputs you want (do you need DisplayPort?)
Usually it's hooked to a TV, so HDMI works fine. I had already bought a new 1030 on eBay on the cheap; I was going to use this 730 forever, but I got a second-chance bid on the 1030 and decided to just use the 730 until the new card arrives. Derp. Any harm in using it until the new card gets here? Or should I not risk it?
Ezekial posted:Usually hooked to a tv, hdmi works fine. I already had bought a new 1030 on ebay on the cheap, was going to use this forever but I got a second chance bid for it and decided to just use the 730 until I got the new card. Derp. Any harm in using it until new card gets here? Or should I not risk it? Why would you care? poo poo's hosed; if it totally croaks between now and when the 1030 gets here, that only makes your case with Amazon even more open-and-shut. Start the return now, but don't ship it until the replacement shows up. Amazon gives you like a month from the order to return it. (It's not going to blow up your monitor or anything, if that's what you're asking.) Paul MaudDib fucked around with this message at 07:55 on Feb 21, 2018
Hmm. Running FurMark off the DVI output lets it run a few degrees hotter. Also, my question was more about whether it could potentially hurt the motherboard, CPU, or RAM, as all of that is relatively expensive.
Ezekial posted:Hmm. Running FurMark off the DVI output lets it run a few degrees hotter. Also, my question was more about whether it could potentially hurt the motherboard, CPU, or RAM, as all of that is relatively expensive. No, it can't.