|
Craptacular! posted:My point is many of these AMD loyalists aren't blind. They're waiting for Vega 20 at least. And if Ryzen and Coffee Lake cost the same thing they'd buy the better product. To use a personal anecdote, I used to use $350 Nexus phones and argue with iPhone-using friends who talked about how they don't mind spending twice as much to have the cushy, premium, industry-best smartphone experience. And as soon as the $350 Nexus became the $700 Pixel and flagship Android phones cost as much as iPhones do, here I am with an iPhone. I was accepting crashes, random reboots, etc. in order to have a phone half the price and felt it was an acceptable compromise. At the same price, forget it. That's the same math I'm doing. I want a good low-light camera so it's Pixel or iPhone and welp, I'll go for the one that actually has security updates. That or a $150 unlocked moto-whatever. I made my first-gen Moto G last almost 4 years. I also have an X34 and a 1080 so, I guess that doesn't say much as an a priori factor. I don't want to gently caress with it.
|
# ? Nov 9, 2017 04:57 |
|
Xae posted:It would only work with a Xeon(tm) Processor and Optane(tm) brand SSDs.
|
# ? Nov 9, 2017 05:05 |
|
Paul MaudDib posted:He's the tech version of Alex Jones. NVIDIA YOU DEVIL! I used to outright feel bad for the guy, because I never really came across much he said that was misguided; he was just very optimistic, but plausible. It all just ultimately turned out wrong, and that wasn't his fault. But I kind of cooled on that when I watched the second half of the history-of-GPUs thing he put out. Though not technically wrong, it was easy to read between the lines. But now... I mean... Anyhoo, what a rollercoaster ride for the GPU world this week.
|
# ? Nov 9, 2017 05:25 |
|
I’d have to think this is at least a little bad for AMD because it’s very easy for laptops with Polaris on board to opt for Freesync displays. I feel like the ubiquity of GeForce cards combined with the premium, luxury-good pricing of G-Sync has to be holding back adaptive sync somewhat, right? You give people iPads that can speed up their refresh rates dynamically and laptops (and soon maybe TVs?) with Freesync, and you increase the audience of people who expect this stuff as standard, and they’re going to balk at how much Nvidia’s solution adds to the price of a 1440p monitor.
|
# ? Nov 9, 2017 05:36 |
|
Craptacular! posted:I’d have to think this is at least a little bad for AMD because it’s very easy for laptops with Polaris on board to opt for Freesync displays. Oh god yes, Intel getting access to Freesync-capable tech immediately fucks NVIDIA even in the short term. We'll see FreeSync-compatible iGPUs no later than a year out, and there's a solid use-case for plebs buying a poo poo-tier FreeSync monitor and keeping it right through a series of FreeSync-compatible GPUs in the entry-level market. Microsoft will be pushing it in the living-room market and the XB1X supports it. How long do you think NVIDIA is going to stall on that? Volta will have it (but not Pascal). Paul MaudDib fucked around with this message at 05:42 on Nov 9, 2017 |
# ? Nov 9, 2017 05:39 |
|
Nvidia seems pretty good about giving features to legacy cards imo if they do open it up
|
# ? Nov 9, 2017 05:41 |
|
1gnoirents posted:Nvidia seems pretty good about giving features to legacy cards imo if they do open it up they love to give their old cards features when it makes sense and there is real competition
|
# ? Nov 9, 2017 05:53 |
|
Fauxtool posted:they love to give their old cards features when it makes sense and there is real competition yes, but say Volta does freesync because
|
# ? Nov 9, 2017 06:02 |
|
1gnoirents posted:yes, but say Volta does freesync because gently caress, the laptop chips already implement functionality identical to freesync
|
# ? Nov 9, 2017 06:21 |
|
Anime Schoolgirl posted:primitive discard made its way back to kepler and fermi for a 15-20% performance improvement across the board so this won't be surprising lol the nvidia devs are actually pretty great. and you know what? Who's going to teach people how to program your uarch if it isn't you? I learned to do CUDA on an NVIDIA cluster because... CUDA was where all the support was at that time. I could go to an NVIDIA presentation, learn a thing, run it on university hardware (provided by NVIDIA), have a question + email it to a library author/mailing list and get a prompt response back. Forget the library support... NVIDIA has a massive support base going on. The social factor is super hard to break here. The semi-custom business is actually super important for AMD right now. And in the world of DX12/Vulkan low-level programming it only gets more important. Like I said... I bet the NVIDIA devs could smack like 50-100% improvement out of the AMD hardware with their DX11 MT-queue driver and poo poo like that. From what I've heard the problem isn't that AMD doesn't do a good job threading and optimizing... they just don't do it. Paul MaudDib fucked around with this message at 06:36 on Nov 9, 2017 |
# ? Nov 9, 2017 06:33 |
|
1gnoirents posted:yes, but say Volta does freesync because People did testing; NVIDIA laptop chips already support FreeSync in all but name. (Variable VBLANK intervals are the basis of FreeSync.)
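A rough way to picture how a variable VBLANK turns into adaptive sync (a toy sketch only; the refresh range and numbers are illustrative, not any specific panel's or vendor's implementation):

```python
# Toy model of adaptive sync: instead of scanning out on a rigid clock,
# the panel stretches its blanking interval until the GPU delivers the
# next frame, within the panel's supported refresh range. Below that
# range, low framerate compensation (LFC) re-shows the previous frame.
# The 40-144 Hz range and all numbers here are purely illustrative.

def present_interval(frame_time_ms, min_hz=40, max_hz=144):
    """Return (interval_per_scanout_ms, extra_repeats) for one frame."""
    fastest = 1000.0 / max_hz   # panel can't refresh faster than this
    slowest = 1000.0 / min_hz   # ...or hold VBLANK longer than this
    if frame_time_ms < fastest:
        return fastest, 0       # GPU outran the panel; frame waits
    if frame_time_ms <= slowest:
        return frame_time_ms, 0  # VBLANK stretches to match exactly
    # frame took too long: LFC shows it multiple times at a legal rate
    repeats = int(frame_time_ms // slowest)
    return frame_time_ms / (repeats + 1), repeats

print(present_interval(16.7))   # (16.7, 0): ~60 fps shown as-is
print(present_interval(40.0))   # (20.0, 1): 25 fps frame shown twice
```

In the middle case the display simply waits for the GPU, which is the whole trick: no tearing, no back pressure, no fixed-rate quantisation.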
|
# ? Nov 9, 2017 06:36 |
|
if only AMD scared them enough to force it out
|
# ? Nov 9, 2017 06:38 |
|
1gnoirents posted:yes, but say Volta does freesync because As a guy still gaming at 60hz, Fast Sync has become my favorite feature of the past few years, and they announced it with Pascal but put it on the 900 series, too. (Shrug)
|
# ? Nov 9, 2017 06:48 |
|
Keplers below GK110 (780/780 Ti) did age significantly worse than the flagships. GK110 is a 7950 now at best - but it's a 7950 with GSync, while GCN 1.0 lacked FreeSync. And GK104 did not age well, it's probably below a 7770 at this point. And fermi sucks poo poo nowadays, even in DX11 but especially in DX12. It's a compatibility mode, not a performant option. Of course I suppose the AMD equivalent is Terascale... Paul MaudDib fucked around with this message at 07:30 on Nov 9, 2017 |
# ? Nov 9, 2017 06:54 |
|
SwissArmyDruid posted:Because that's where the paywall cut off the rest of the article. The Anandtech article was updated a while back; they think the HBM is only directly connected to the GPU and that the CPU->GPU connection is on-package PCIe. Also, the CPU die most likely still has its iGPU, so expect some form of switching.
|
# ? Nov 9, 2017 08:36 |
|
I really do wish they would just all sit down and accept a standard for adaptive sync and eventually just make it baseline in all tech, forever. It's barmy having the hardware forced to match the display's fixed output and not the other way around. HDMI 2.1 is a good step forward. I play all my PC games on my TV because it's a hugeass 4K 65" and it looks glorious, but it's a slave to vsync with all the associated performance downsides and input delays. I hope a third GPU vendor doesn't complicate things with their own Intelvision Sync 9000 implementation and instead just forces everyone to draw a line under the whole thing. Every other week or so since late 2015 I do a quick search for TVs that support adaptive sync/gsync/freesync, and it's nearly 2018 and there's still literally nothing whatsoever. It's maddening. Hurry up!!!
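The vsync penalty being complained about here is easy to model: with double buffering on a fixed-refresh display, a frame that misses a vblank has to wait for the next one, so the presented rate quantises down to integer divisors of the refresh rate. A toy sketch with illustrative numbers:

```python
# Toy model of the "slave to vsync" penalty on a fixed-refresh display:
# a frame that isn't ready at a vblank waits for the next one, so any
# render time just over the refresh interval halves the presented rate.
# Numbers are illustrative.
import math

def vsync_presented_fps(render_ms, refresh_hz=60):
    vblank_ms = 1000.0 / refresh_hz
    # number of refresh slots this frame occupies before it can swap
    slots = math.ceil(render_ms / vblank_ms)
    return refresh_hz / slots

print(vsync_presented_fps(15.0))   # 60.0: fits within one refresh
print(vsync_presented_fps(18.0))   # 30.0: just missed, rate halves
print(vsync_presented_fps(35.0))   # 20.0: two misses, a third of refresh
```

That cliff from 60 to 30 for a 1.5 ms miss is exactly what adaptive sync and HDMI 2.1 VRR are meant to eliminate.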
|
# ? Nov 9, 2017 11:56 |
|
My take is that Intel first wants Raja to scale up Gen10 EUs into a high-performance dGPU for AI/ML, and then with his input finally push something like a Gen12 (Gen11 is likely already in the pipeline) solution as a higher-performance replacement and for general consumer use. Again, we're still looking at early 2019 for this to start to matter with even the most basic of products, but Intel needs/wants something now.
|
# ? Nov 9, 2017 12:31 |
|
Craptacular! posted:As a guy still gaming at 60hz, Fast Sync has become my favorite feature of the past few years, and they announced it with Pascal but put it on the 900 series, too. (Shrug) is there an actual writeup on what fast sync actually does under the hood? when i had a gaming pc i forced it on for everything and there really was a palpable improvement in input latency and general... smoothness, but pretty much every mention i saw of it was just quoting the nvidia line of it being a 'supercharged version of vsync' or some poo poo like that. i totally expected it to be snake-oil from that but it was actually really great, i've just got no idea what it actually does Arivia posted:Yeah AdoredTV was going on about how the AMD iGPU deal was going to kill nVidia, which is just like what the gently caress. well it's going to put the screws on them at any rate, and more competition in the pretty languid world of consumer gpus is always good. honestly i find adoredtv relatively salient and informed, he's just prone to making massive sweeping generalisations, and probably panders to the amd fanboy caucus more than he really should Paul MaudDib posted:That's the same math I'm doing. I want a good low-light camera so it's pixel or iPhone and welp, I'll go for the one that actually has security updates. That or a $150 unlocked moto-whatever. I made my first-gen Moto G last almost 4 years. yeah i find it pretty hard to justify android unless you're looking to save money. i'm ok with the informed tradeoff of a lower entry price being ropier hardware and your data probably being mined by google and your chosen oem, but neither of those dissipates as much as i'd like as you go up in price. also the iphone can scroll without being a stuttery mess Generic Monk fucked around with this message at 13:23 on Nov 9, 2017 |
# ? Nov 9, 2017 13:13 |
|
Generic Monk posted:is there an actual writeup on what fast sync actually does under the hood? when i had a gaming pc i forced it on for everything and there really was a palpable improvement in input latency and general... smoothness, but pretty much every mention i saw of it was just quoting the nvidia line of it being a 'supercharged version of vsync' or some poo poo like that. i totally expected it to be snake-oil from that but it was actually really great, i've just got no idea what it actually does Nvidia "Fast Sync" is triple buffered V-Sync for DirectX games employed at the driver level. Nvidia designed it to cater to the Overwatch/CSGO crowd who often play at 300 fps on a 60 Hz display. Fast Sync allows the game engine to run at max rate, avoiding back pressure that can come with standard V-Sync while maintaining synchronisation with the display. Back pressure causes latency on your controls and is horrible for games like that. Say you're playing at 300 fps on a 60 Hz display with Fast Sync. Basically the Fast Sync algorithm is discarding most frames created by the GPU and choosing an appropriate frame to send to the display 60 times a second. This satisfies the display's requirement for V-Sync while the game processes your inputs at 300 fps. The trade-off is judder. As you're only shown 1/5 of the frames generated, you might see some slight hitching or stuttering. Whether you notice this will probably depend on your exposure to high hz + high frames gameplay.
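The frame-discarding behaviour described above is easy to simulate. A toy model (the numbers and structure are illustrative, not NVIDIA's actual driver logic):

```python
# Toy model of Fast Sync: the engine renders at full speed, each finished
# frame overwrites the "last rendered" buffer, and at every vblank the
# display scans out whichever frame finished most recently. Frames that
# never line up with a vblank are simply discarded, so the render loop
# never blocks on the display (no back pressure). Illustrative numbers.

def fast_sync(render_fps=300, refresh_hz=60, duration_s=1):
    displayed = []        # frame ids actually sent to the display
    last_complete = None  # newest fully rendered frame
    vblanks_fired = 0
    total_frames = render_fps * duration_s
    for frame in range(1, total_frames + 1):
        t = frame / render_fps            # completion time of this frame
        last_complete = frame
        # fire every vblank whose deadline has now passed
        while (vblanks_fired + 1) / refresh_hz <= t:
            displayed.append(last_complete)
            vblanks_fired += 1
    return total_frames, displayed

rendered, shown = fast_sync()
print(rendered, len(shown))   # 300 rendered, 60 shown
print(shown[:3])              # [5, 10, 15]: 4 of every 5 frames dropped
```

The judder mentioned above falls out of this model too: only every fifth frame survives, so the time step between displayed frames isn't perfectly uniform relative to the game simulation.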
|
# ? Nov 9, 2017 13:35 |
|
Generic Monk posted:yeah i find it pretty hard to justify android unless you're looking to save money. i'm ok with having the informed tradeoff of a lower entry price being ropier hardware and your data probably being mined by google and your chosen oem, but neither of those dissipate as much as i'd like as you go up in price. also the iphone can scroll without being a stuttery mess Not really one for this thread, but I'm wondering WTF you do with your phones, or what poo poo bricks you've used, because other than the security patch thing, none of these issues have been present in any of the Android phones I've used and have seen friends and relatives use in the past five years. Also the analogy doesn't really work for video cards, but hey, it's Hyperbole Paul so whatever.
|
# ? Nov 9, 2017 13:57 |
|
orcane posted:Not really for this thread, but I'm wondering WTF you do with your phones or what poo poo bricks you've used because other than the security patch thing, none of these issues have been present in any of the Android phones I've used and had seen friends and relatives use in the past five years Scrolling smoothly on my old Nexus 5X right now with Oreo, meanwhile my friend is raging about his iPhone 6 being a bag of poo poo with the latest iOS.
|
# ? Nov 9, 2017 14:07 |
|
There are people that don't turn off all the scrolling animation crap the second they get a new phone?
|
# ? Nov 9, 2017 14:08 |
|
Arzachel posted:There are people that don't turn off all the scrolling animation crap the second they get a new phone? These are the same people who leave their brand new TV at the default settings and wonder why the colors are all hosed up and the "motion looks wrong." They are a pox upon all of us.
|
# ? Nov 9, 2017 14:15 |
|
Arzachel posted:There are people that don't turn off all the scrolling animation crap the second they get a new phone? Same but in Windows.
|
# ? Nov 9, 2017 14:16 |
|
Enos Cabell posted:Is this when Nvidia announces their entrance into the CPU market? Nvidia announces Ai86: using the power of AI, Nvidia can decode x86 instruction sets into GPGPU instruction sets at greater than native x86 speeds.
|
# ? Nov 9, 2017 14:30 |
|
DrDork posted:These are the same people who leave their brand new TV at the default settings and wonder why the colors are all hosed up and the "motion looks wrong." They are a pox upon all of us. Then there are those who put max brightness of a million suns on everything and then are completely surprised when the set fails shortly after the warranty period.
|
# ? Nov 9, 2017 15:03 |
|
It is funny how attached and defensive people get over their overpriced mobile computers. Regardless of what you're paying for it, that device is not worth 10% of that money.
|
# ? Nov 9, 2017 15:09 |
|
Volguus posted:It is funny how attached and defensive people get of their overpriced mobile computer. Regardless of what you're paying for it, that device is not worth 10% of that money. I'll agree with you about people getting all stupid fanboy defensive over their various bits of tech. But if you're trying to argue that an unlimited connection to cat memes and nudie pics isn't worth at least $50, I think you need to re-evaluate your idea of what "worth" means.
|
# ? Nov 9, 2017 15:20 |
|
monday: AMD's hardware is going into Intel CPUs wednesday: Intel is going to make dGPUs friday: The Playstation 5 is a video card, made by Apple
|
# ? Nov 9, 2017 15:39 |
|
GRINDCORE MEGGIDO posted:Scrolling on my old Nexus5x smoothly right now with Oreo, meanwhile my friend is raging about his iPhone6 being a bag of poo poo with the latest IOS. Nexus 5X is great while it works, I was a big fan of mine up until last weekend. Then LG's manufacturing defect caused my CPU die to start separating from the BGA and whoops, it won't boot anymore. Oh, the Pixel is twice as expensive? Wonderful... At least Project Fi SIMs will work with any GSM phone after being activated, even if Google doesn't advertise it. I can use my old spare while waiting to see what LG support does for me.
|
# ? Nov 9, 2017 15:59 |
|
Eletriarnation posted:At least Project Fi SIMs will work with any GSM phone after being activated, even if Google doesn't advertise it. I can use my old spare while waiting to see what LG support does for me. AFAIK while they work, they effectively become T-Mobile SIMs when used with an unsupported device. You lose the ability to roam on to Sprint and US Cellular, as well as the SIM-authenticated access to certain public wifi networks.
|
# ? Nov 9, 2017 16:27 |
|
What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound.
|
# ? Nov 9, 2017 16:44 |
tehinternet posted:What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound. The MSI Gaming X, iirc.
|
# ? Nov 9, 2017 16:52 |
|
wolrah posted:AFAIK while they work, they effectively become T-Mobile SIMs when used with an unsupported device. You lose the ability to roam on to Sprint and US Cellular, as well as the SIM-authenticated access to certain public wifi networks. Ah, interesting; I had figured I'd lose CDMA and keep LTE across all carriers, and didn't yet notice anything to disprove that assumption so I didn't know. My old spare Zenfone 2 Laser (lol) actually has noticeably better cell reception than my 5X though, to the point that even while the 5X worked I was already using the Zenfone instead with the data-only SIM some of the time and just dialing over Hangouts. Shame it's 2.4GHz and 1A charging only, but better than my backup backup Nexus 4 which doesn't even have LTE. Anyway, it's just a stopgap until I hopefully get a replacement 5X. That will probably be subject to the same manufacturing defect and fail after another year and a half.
|
# ? Nov 9, 2017 19:11 |
|
Stanley Pain posted:Nvidia annouces Ai86, using the power of AI Nvidia can decode x86 instruction sets into GPGPU instructions sets at greater than native x86 speeds. After this week, Transmeta suddenly rising from the dead would be the LEAST weird thing so far.
|
# ? Nov 9, 2017 19:15 |
|
SwissArmyDruid posted:After this week, Transmeta suddenly rising from the dead would be the LEAST weird thing so far. Yes, please.
|
# ? Nov 9, 2017 19:33 |
|
Volguus posted:It is funny how attached and defensive people get of their overpriced mobile computer. Regardless of what you're paying for it, that device is not worth 10% of that money. depends what you do with it. my 10bit anime porn mkvs have rather depreciated the value yeah
|
# ? Nov 9, 2017 21:32 |
|
tehinternet posted:What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound.
|
# ? Nov 9, 2017 21:51 |
|
I have an Asus STRIX 1070 which has excellent thermals and noise due to being fuckoff huge; looking at some reviews it's probably one of the best cards in that area. The Gigabyte G1 and MSI Gaming X are both within a dB or two though, so any of those cards would probably be fine. MaxxBot fucked around with this message at 22:12 on Nov 9, 2017
# ? Nov 9, 2017 22:10 |
|
The Palit Gamerock 1070 must own thermally. The 1080 does.
|
# ? Nov 9, 2017 22:14 |