|
Of course I bought my computer a) just before they announced the first bundle (let alone this current one) and b) from the wrong retailer (since only one in Switzerland is part of the program), uncool guys
|
# ¿ May 6, 2015 10:50 |
|
Anime Schoolgirl posted:I like how much effort Nvidia is putting into avoiding saying that this card is 90% as good as their 960 for 75-80% of the price That's funny because the German benchmarks I was reading yesterday have it at like 75-80% of the performance for almost 90% of the price of a GTX 960. Also uses almost as much power as a GTX 960 (in some circumstances, even more). It's the cheapest option for HDMI 2.0, HDCP 2.2 and H.265 decoding, but I'm not sure that's worth it.
|
# ¿ Aug 21, 2015 10:28 |
|
SlayVus posted:Almost no details, just the picture really. https://www.techpowerup.com/231589/msi-geforce-gtx-1080-ti-gaming-x-pictured Unfortunately it has a 2.5-slot cooler now, so it won't fit into a Dan A4 (and even if it did, they rotated the heat sink fins by 90° so you'd push even less of the hot air out of the case). Oh well
|
# ¿ Mar 23, 2017 23:58 |
|
AVeryLargeRadish posted:lol, I know that AMD won't be releasing anything that matches the 1080Ti let alone the 1180 or whatever, but that does not change the fact that they need something that compelling to make any significant impact as far as market share goes. AMD's video card market share is down to a combination of things, and the lack of $600+ video cards that beat the competition obviously isn't helping. It's not as important as VR/4K/PCMR enthusiasts seem to think, though.
|
# ¿ Apr 11, 2017 08:34 |
|
Most people who occasionally play video games on Steam have monitors up to 1080p (monitors being one of the things people upgrade even less regularly than computers or video cards), only 5% use higher resolutions, and the majority of multi-monitor users have 2x 1080p displays. Only about 10% even have video cards with more than 4 GB VRAM, about the same percentage as the 5-10% with "faster than GTX 1060" cards. Joe User not having the option to buy a GTX 1070 or better equivalent from AMD is not a major reason for AMD's troubles and low market share. $300 cards are mid-range and have been since I can remember. A GTX 1070 is about 50% more, and that's a lot of money if you're not an enthusiast who needs 60+ fps at max details in every single game, at 1440p and beyond. Lockback posted:The median monitor that people game on is probably something like a 21" 1080p Dell. A quick glance through the Steam Survey looks like less than 20% of gamers have a GPU that is 970 equivalent or above, which I think is a good "1440p" lower bound. Yeah, pretty much.
|
# ¿ Apr 11, 2017 16:58 |
|
Paul MaudDib posted:The thing is that the RX 480 has declined to way way under MSRP while the 580 is going to be launching back at MSRP again. They're good deals when they're 30% less than a GTX 1060 6 GB, but priced head to head against the GTX 1060 6 GB the RX 480 has little merit of its own. What? The 580 is equal or faster at a similar price, and the main downside is that it uses more power at max load (the coolers are dealing with that thermal load just fine, though). You come across as having the weirdest AMD hateboner sometimes.
|
# ¿ Apr 18, 2017 16:24 |
|
Risky Bisquick posted:Maybe on consoles How does that work with modern games on mainstream video cards? And I don't mean this thread's moronic idea of mainstream = GTX 1070 and above.
|
# ¿ Apr 19, 2017 15:53 |
|
The GTX x60 range has been Nvidia's mainstream series for a while, and that has always been in the $250-350 range, at least here. The GTX 970 was popular partly because it was only slightly more expensive than that but much better than the alternatives (the GTX 760 was getting old, AMD had slower/hotter stuff instead), and system requirements had just been pushed up significantly by a new console generation, especially with regard to VRAM use. Hardware enthusiasts often live in this filter bubble where everyone else has the same sky-high requirements that call for and justify $500+ video cards, so it can be hard for them to realize that most people who want a general gaming PC are still happy if their equipment is half as expensive but occasionally lets games drop to 30 fps.
|
# ¿ Apr 19, 2017 16:49 |
|
No true gamerman!
|
# ¿ Apr 20, 2017 10:57 |
|
The Iron Rose posted:Let's not be hyperbolic. I thought this was the entire point of this thread, reading the last few pages anyway.
|
# ¿ May 11, 2017 09:44 |
|
Truga posted:I'm the stock intel cpu cooler that'll make any multicore cpu throttle on a warm day Yeah he should buy a real cooler for $50 for his $70 CPU
|
# ¿ Jun 8, 2017 14:41 |
|
I built an Athlon 64-based setup not long before the transition happened. It had a Radeon 9800 XT (with a whopping 256 MB VRAM!) but its factory overclock was a tad too much and it died after about two years. The availability of decent replacement AGP cards was not great; I think I went to a Radeon X1650, then an X1950, since all the enthusiast models were PCI-E by that time. I think there were some issues with AGP, ATI cards and Bink videos (which a lot of games used for their pre-rendered intros etc.) at the time, too - I remember having to watch e.g. the Oblivion intro out of game with the Bink viewer because of it The family computer I tinkered with had some Cirrus Logic video card, I can't even remember what bus. My first computer had a Matrox Millennium G200, which was good at Direct3D but very bad at OpenGL (I played lots of games based on the Quake 2 engine). Like 20 fps and less at 1024x768 bad. E: My Radeon 9800 XT didn't burn anything, but it was visibly struggling from the start and I had massive issues with alpha textures flickering and stretching, especially in WoW. I solved this by underclocking the card but it still died eventually - it was fitting to consider that WoW killed my first really expensive video card (I generally paid 250-300 bucks for mainstream cards before and after). orcane fucked around with this message at 20:42 on Jun 15, 2017 |
# ¿ Jun 15, 2017 20:31 |
|
What is the best option for a non-FE 2-slot GTX 1080 Ti, in terms of volume and temp? The ones I'd consider are all 2.5 slots at least, and the Dan A4 only takes 2-slot cards, max.
|
# ¿ Jun 21, 2017 13:58 |
|
I want Vega cards to be available so people in this thread can be idiots about that instead of e-sports, holy poo poo
|
# ¿ Aug 9, 2017 12:50 |
|
Measly Twerp posted:So long story short OC3D spun some bullshit, which got picked up by wccftech, and somehow this bastion of credibility was able to convince everyone that AMD was raising the MSRP of Vega. The video is gone, but there are several retailers in Europe claiming their first (now sold out) batch of Vega 64 at the original MSRP was an introductory offer and that they have to charge 100 bucks more now, despite AMD denying having raised the MSRP. Obviously retailers might be trying to profit from the (mining-driven?) demand and jack up prices, but it's 100% AMD's own fault that the idea of them lying to customers and secretly raising Vega's price is considered plausible in the first place.
|
# ¿ Aug 17, 2017 09:31 |
|
Judging by how hosed their margins have to be with regard to HBM2 stacks etc., I'm assuming AMD has the same pricing scheme planned for the Vega 56, and unless there's a huge outcry I expect them to go through with it. People will buy their "slightly beats GTX 1070 with 40% higher TDP at the same/lower price!" red team card for $400 for two days before the only stock left is $500 cards with game bundles, and then the buttcoin miners get in.
|
# ¿ Aug 18, 2017 11:57 |
|
Oh is that common? I occasionally encountered that with my GTX 980 and yeah, it wasn't throttling and sometimes it even fixed itself after a while, otherwise required a reboot.
|
# ¿ Aug 21, 2017 13:20 |
|
Rosoboronexport posted:It most definitely did not. Freespace 1 and 2 had Glide and D3D support. FS1 was released in 1998, FS2 in 1999 and first T&L cards were just out in 1999. I remember that Call of Duty 1 was one of the first games that required hardware T&L to run (and even then I think it could be bypassed) I had a G200 when it came out. It was good at D3D (slightly slower than the Riva TNT I think) but awful at OpenGL. Which bit me in the rear end, since I played a lot of games with the Quake 2/3 engines back then, frequently at sub-20 fps until I got a GeForce 2 MX. Later, I had a PowerVR Kyro 2 for a while - performance was good with its deferred rendering, but it lacked hardware T&L and its drivers lagged behind new titles more and more, which made them near unplayable until maybe a new driver fixed them. The last card I had in that P2/P3 build was a GeForce 4 Ti 4200. The reference Ti 4200 launched later with a smaller, cheaper PCB than the Ti 4400/4600, but mine was some Asus deluxe model for which they used a Ti 4400 PCB and high clocks, so it was basically a cheaper Ti 4400. I recently threw it away together with the bonus 3D shutter goggles and AV connection box. For my next PC I had a Radeon 9800 XT from Asus. Never had performance or driver issues as such, but it eventually died (thermal issues I think - I had problems with alpha textures flickering at factory overclock settings, which always went away if I clocked the card very slightly lower). Went through an X1650 (HIS with a lovely DVI output) and then an X1950 before I replaced the entire computer (AGP video card + single-core CPU). My computers since then always had Nvidia stuff. GTX 260 (died), GTX 560, GTX 980. And a 1050 Ti I put into a living room SFF PC.
|
# ¿ Aug 27, 2017 09:59 |
|
https://twitter.com/GFXChipTweeter/status/902666411897585664 What? Because your product does not lose a lot of performance by cutting its power draw from insane to "still too high"?
|
# ¿ Aug 30, 2017 11:25 |
|
RX 560 is slightly cheaper, slightly slower. How is it "straight up bad"?
|
# ¿ Sep 5, 2017 10:44 |
|
The RX 560 with 4 GB is still going to be at least twice as fast as a 6850. The GTX 1050 and 1050 Ti even more so. It really depends on how much someone wants to spend and what games they want to play/how tolerant they are of lower settings. The common view is that it's not worth saving another 30 or 50 bucks to downgrade from entry-level cards to even lower-end ones and lose another 25%+ of "already low" performance, but someone else might not be bothered by the performance hit and could spend those 30 bucks on something else that's important. Granted, it's a small niche, but that doesn't make the chip outright bad or useless (unlike, say, the RX 550 and GT 1030). That said, I do agree that you want at least a 1050 Ti for gaming anything remotely modern in 1080p. Or an RX 570 I guess, if buttcoin mining on GPUs didn't exist
|
# ¿ Sep 5, 2017 11:44 |
|
Surprise Giraffe posted:I mean it's cooler than my previous stock-cooled one, and that didn't crash total warhammer. Quieter too. It's not nearly as great as the Strix I got but that started malfunctioning. If all your video cards are malfunctioning, are you sure nothing else is broken in your PC?
|
# ¿ Sep 8, 2017 17:40 |
|
Basic "Gaming" (no X, Z etc.) is enough, but if you're not getting anything above a 1070 you can get an MSI Armor I guess. They're awful above that, though.
orcane fucked around with this message at 18:54 on Sep 17, 2017 |
# ¿ Sep 17, 2017 18:49 |
|
FaustianQ posted:Why even loving bother? To do a proper RX Vega custom you'd need to run obese 3 slot poo poo bricks, they'll never recoup costs and no AIB should be eager for this shitpile unless they know something we don't. Asus stuck a 1080 Ti cooler on their custom Vega 64 sample; that makes it a 2.5-slot card that does well (for a Vega 64, obviously). Demand for the card exists, for some reason or another. But no, they should just give up and fold, and Nvidia forever!
|
# ¿ Sep 20, 2017 19:45 |
|
VostokProgram posted:It definitely used to be in the system tray. I went looking for it last week as well and didn't realize it was in the right click menu until I googled where to find it They removed the tray icon in GFE update 3.9.0.97, according to their release notes: quote:Removed NVIDIA Tray Icon from Windows system tray in order to reduce the system footprint of NVIDIA software.
|
# ¿ Sep 23, 2017 19:41 |
|
Arivia posted:Yeah, this was mostly what I was responding to. The hope was that AIBs could fix some of the mess Vega is, and that's not the case. Even accounting for Jay losing the silicon lottery, it's just not any better performance-wise in any respect. It's cooler and quieter, but that's just closing the barn door on the fire after all the smoke floated out already. ComputerBase had one that was 8-9% faster, cooler and quieter. Silicon lottery yay!
|
# ¿ Sep 25, 2017 09:38 |
|
Zero VGS posted:Can confirm, that laptop is larger than my NFC S4 Mini, attached to the back of my monitor. Like I could hook a battery pack to my 12v Pico PSU and literally have a portable unit smaller than that, but with a 1080ti. It doesn't count if half the video card is hanging out of your case TBH.
|
# ¿ Sep 30, 2017 08:16 |
|
One correction: running memtest a couple of times takes way more than a few minutes
|
# ¿ Oct 8, 2017 20:39 |
|
Can't you disable telemetry in Nvidia's control panel anyway? Also, if you want to use the Ansel screenshot thing a lot of newer games have, you need the 3D driver IIRC.
|
# ¿ Oct 11, 2017 09:19 |
|
Sininu posted:You do not. So does it work with just the basic driver now, or what? It didn't work for me in Ghost Recon Wildlands back when I only installed the video and audio drivers plus the PhysX software - it only worked once I installed the 3D Vision (?) driver too.
|
# ¿ Oct 11, 2017 09:40 |
|
Generic Monk posted:yeah i find it pretty hard to justify android unless you're looking to save money. i'm ok with having the informed tradeoff of a lower entry price being ropier hardware and your data probably being mined by google and your chosen oem, but neither of those dissipate as much as i'd like as you go up in price. also the iphone can scroll without being a stuttery mess Not really for this thread, but I'm wondering WTF you do with your phones or what poo poo bricks you've used, because other than the security patch thing, none of these issues have been present in any of the Android phones I've used or seen friends and relatives use in the past five years Also the analogy doesn't really work for video cards, but hey, it's Hyperbole Paul so whatever.
|
# ¿ Nov 9, 2017 13:57 |
|
Arzachel posted:There are people that don't turn off all the scrolling animation crap the second they get a new phone? Same but in Windows.
|
# ¿ Nov 9, 2017 14:16 |
|
Laslow posted:Windows 10 is little bit better and MacOS HiDPI is good. I’m tired of the DPI of my computer getting dunked on so hard by my goddamn cellphone. Stop having your monitor in your face I guess?
|
# ¿ Nov 12, 2017 14:05 |
|
They can but not in firmware.
|
# ¿ Dec 5, 2017 18:15 |
|
They might if the above-MSRP price doesn't go to them, actually.
|
# ¿ Jan 18, 2018 16:20 |
|
Zero VGS posted:One thing I was wondering about is how adding a discrete GPU always seems to add like a flat 20-40 watts from the wall even when idle... surely that can be improved more? Try 10W and lower.
|
# ¿ Feb 20, 2018 20:13 |
|
Just yesterday I looked up how much I paid for my GTX 980 three years ago. The more expensive GTX 1060 models cost only slightly less than that now, and the price increases show no sign of slowing down, it's insane
|
# ¿ Feb 26, 2018 11:19 |
|
Used 980 (no Ti indeed) are $250 here but GTX 1060 are at least $350 (the 3 GB version) or over $400 for cheap 6 GB versions. Getting an Asus or MSI GTX 1060 6GB goes all the way up to $500 which is hilarious considering that's nearly what I paid for my GTX 980 shortly after it came out
|
# ¿ Mar 12, 2018 22:10 |
|
Wirth1000 posted:Sorry if this was a respost but da gently caress??? Pretty sure it was revealed that it's all about MXM stuff for their minis, not discrete PCIe GPUs like Asus, MSI etc.
|
# ¿ Mar 16, 2018 19:35 |
|
Truga posted:I was in a class of nerds and gamers and they bought absolute poo poo like that geforce4 that was actually gf2 and similar poo poo. GeForce 4 MX has "GeForce 4" in the name okay?
|
# ¿ Mar 20, 2018 22:29 |