|
https://www.twitch.tv/twitch Did not realize AMD was releasing real info today. They showed the whole Polaris lineup, with the 460, 470, and 480 in hand
|
# ? Jun 13, 2016 20:08 |
|
Lisa Su has a sweet video card briefcase. More like a military case, but I'm going to carry around my GPUs like that.
|
# ? Jun 13, 2016 20:09 |
|
So RX480 appears to be the full chip, RX470 is the cutdown version, RX460 is full Polaris 11. No specs though so uuuuuuggggggghh.
|
# ? Jun 13, 2016 20:10 |
|
Backpack was pretty sick lol
|
# ? Jun 13, 2016 20:11 |
|
No gently caress you AMD, stop it, stop it with that loving "I am Zen" clip.
|
# ? Jun 13, 2016 20:12 |
|
I was hoping, at least, for teaser info about cards competing with 1070s and up... oh well. Sweet backpack.
|
# ? Jun 13, 2016 20:14 |
|
If I don't hear anything about my preordered Amazon FE 1070 soon, I will literally die in real life.
|
# ? Jun 13, 2016 20:17 |
|
Holyshoot posted:Lol if you bought a 9800 pro. The pro move was to get a 9800 non pro and unlock it to become a 9800 pro. Good times. Was that the card you could unlock with a pencil?
|
# ? Jun 13, 2016 20:23 |
|
Love the old video card talk. My first card was a 4MB Diamond Stealth II S220 that I ran in an old Acer Pentium 100 desktop to play Quake in PowerVR. Looked awesome, played like rear end until we upgraded to 8MB EDO from the 4MB that was installed. Once it was upgraded though, Quake deathmatch was loving amazing
|
# ? Jun 13, 2016 20:26 |
|
THE DOG HOUSE posted:https://www.twitch.tv/twitch Well now I am torn between grabbing a 480 or waiting several months for a 490 that might not even release until next year. Probably just going to grab the 480.
|
# ? Jun 13, 2016 20:33 |
|
FaustianQ posted:No gently caress you AMD, stop it, stop it with that loving "I am Zen" clip.
|
# ? Jun 13, 2016 20:36 |
|
Lungboy posted:Was that the card you could unlock with a pencil? Software unlock. fozzy fosbourne posted:If you watch a 60fps video on your second display your refresh on your gaming screen might get fuxxored It makes sense, because now that I think about it the 144Hz was down to 120Hz with the 690. However, it doesn't explain why VLC works fine.
|
# ? Jun 13, 2016 20:40 |
|
repiv posted:GD05B spec sheet says it supports 11" GPUs, and the reference 1070/1080 is 10.5" long.
|
# ? Jun 13, 2016 21:10 |
|
Tunga posted:Does this mean we definitely won't see "mini" versions of the 1070? The short cards fit so much better in my Node 304. I can cram a fullsize one in if that's all we're getting but if one might show up then I'll wait. Though I'm waiting for non-FE cards anyway right now. I don't think there's anything more restrictive than Maxwell in terms of size. Since it uses less power as well, I don't see why they wouldn't release small variants.
|
# ? Jun 13, 2016 21:18 |
|
Can anyone explain vulkan to me like I'm an idiot? Bleh Maestro fucked around with this message at 21:24 on Jun 13, 2016 |
# ? Jun 13, 2016 21:21 |
|
Also mentioned earlier was the HBM talk. HBM1 was apparently limited to 4GB max by design, where HBM2 can go way above that due to taller stacks, I believe. So the Fury was limited in memory size, but the bandwidth it got was enough that even at 4K the 4GB held up, where the Nvidia hardware needed >4GB to maintain performance at the same res. HBM2 will be an interesting time; yes, there isn't a direct "need" for it over GDDR5X, but at the same time, if we get 8GB of the stuff on the 1080 Ti/Titan, God knows what sort of performance we can get. Price isn't even the main issue with HBM anymore; currently it's supply, and right now Tesla is taking what seems like all the HBM2 chips.
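The bandwidth side of that is easy enough to sketch. A rough illustration using the published specs of the era (peak bandwidth is just bus width in bytes times effective transfer rate; real-world numbers vary):

```python
# Rough peak memory bandwidth: bus width (bits) / 8 * effective rate (GT/s).
# Card numbers below are the published specs, used purely for illustration.
def bandwidth_gbps(bus_bits: int, rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * rate_gtps

# Fury X: four HBM1 stacks, 4096-bit total bus at 1 GT/s effective
fury_x = bandwidth_gbps(4096, 1.0)    # 512 GB/s, from only 4GB of memory
# 980 Ti: 384-bit GDDR5 at 7 GT/s
gtx_980ti = bandwidth_gbps(384, 7.0)  # 336 GB/s
# GTX 1080: 256-bit GDDR5X at 10 GT/s
gtx_1080 = bandwidth_gbps(256, 10.0)  # 320 GB/s

print(fury_x, gtx_980ti, gtx_1080)
```

Which is why the 4GB Fury could still keep pace at 4K: the stacked bus is so wide that even a modest clock buries the narrower GDDR5/5X setups on raw throughput.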
|
# ? Jun 13, 2016 21:23 |
|
Bleh Maestro posted:Can anyone explain vulkan to me like I'm an idiot? You know how there's different ways to tell a GPU to do 3d poo poo, and the biggest 2 are DirectX and OpenGL? DirectX is used by Windows and (direct)Xbox while OpenGL is an open standard that anybody can use, including Windows, Macs, Linux things, your phone, and more. Vulkan is a new, open way to tell GPUs how to render stuff, and it's got much more specific, lower-level control. This has the potential to make telling the GPU what to do take less of the computer's resources, reducing the amount of time the CPU spends telling the GPU what to render and potentially improving performance.
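A toy sketch of why that lower-level control saves CPU time (purely illustrative Python, nothing here is a real OpenGL or Vulkan call): an immediate-style driver redoes its hidden bookkeeping on every draw call, while a recorded command buffer pays that cost once at record time and replays cheaply:

```python
# Toy illustration only - not real OpenGL/Vulkan APIs. The point is where the
# per-draw driver bookkeeping happens: on every call vs. once at record time.

class ImmediateStyleDriver:
    """OpenGL-ish: the driver validates/translates state on every draw."""
    def __init__(self):
        self.validations = 0
    def draw(self, mesh):
        self.validations += 1   # hidden driver work, every single call
        return f"drew {mesh}"

class CommandBuffer:
    """Vulkan-ish: record once (validated then), replay many times."""
    def __init__(self):
        self.commands = []
        self.validations = 0
    def record_draw(self, mesh):
        self.validations += 1   # cost paid once, up front
        self.commands.append(mesh)
    def submit(self):
        return [f"drew {m}" for m in self.commands]  # no re-validation

gl = ImmediateStyleDriver()
vk = CommandBuffer()
for mesh in ["level", "player", "hud"]:
    gl.draw(mesh)        # frame 0, plus per-call overhead
    vk.record_draw(mesh) # recorded once

# Replay 100 more frames: the immediate driver pays per draw per frame,
# the command buffer already paid at record time.
for _ in range(100):
    for mesh in ["level", "player", "hud"]:
        gl.draw(mesh)
    vk.submit()

print(gl.validations, vk.validations)  # 303 vs 3
```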
|
# ? Jun 13, 2016 21:25 |
|
Twerk from Home posted:You know how there's different ways to tell a GPU to do 3d poo poo, and the biggest 2 are DirectX and OpenGL? DirectX is used by Windows and (direct)Xbox while OpenGL is an open standard that anybody can use, including Windows, Macs, Linux things, your phone, and more. Think of it as a Mind Meld between the Game and the GPU.
|
# ? Jun 13, 2016 21:30 |
|
Phoneposting, so excuse my brevity, but this is also mandatory reading on DX12/Vulkan: http://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019 The tl;dr here is that the current driver model is a "leaky abstraction" (term of art) that attempts to make something inherently complicated (multithreaded apps, multiple GPUs each with their own separate memory/processor/etc) look simple (one giant fast GPU with a single-threaded API). This is a gigantic loving pain for everyone involved: gamedevs have no idea what the "right" thing to do is and don't have the powers they need, and also turn out horrifically broken code because they know the driver guys will fix it so NVIDIA or AMD doesn't have "broken drivers" on launch day. Driver devs get to rewrite games from within the driver stack to fix code so it runs at all, let alone hits the fast paths, all while maintaining a multimillion-line monstrosity of their own. DX12 and Vulkan throw that whole model out: they expose the complexity to the developer but also give the tools to write good code without divine Driver Team intervention. The idea is that the work will be divided - some smart guys will write game engines and handle this complexity for you, and gamedevs will write games on top of Unreal, Unity, CryEngine, or something like that. Paul MaudDib fucked around with this message at 02:48 on Jun 14, 2016 |
# ? Jun 13, 2016 21:37 |
|
Twerk from Home posted:You know how there's different ways to tell a GPU to do 3d poo poo, and the biggest 2 are DirectX and OpenGL? DirectX is used by Windows and (direct)Xbox while OpenGL is an open standard that anybody can use, including Windows, Macs, Linux things, your phone, and more. How do I use it? Does it just work when they develop for it, like how they're saying they're releasing a Vulkan update for Doom? Is it built into drivers and in the graphics options, or do you need to get Vulkan drivers or something?
|
# ? Jun 13, 2016 21:43 |
|
Bleh Maestro posted:How do I use it? Does it just work when they develop for it, or like how they are saying they are releasing a Vulkan update for doom. Is it built into drivers and in the graphics options or do you need to get vulkan drivers or something? Both AMD's and NVIDIA's current drivers include the Vulkan runtime. Games need to have a renderer for Vulkan, and yes, it'll be in the graphics options of a game. This poo poo makes me feel old, since old games used to ask about this all the time. Glide, D3D! Software rendering!
|
# ? Jun 13, 2016 21:48 |
|
HalloKitty posted:Both AMD's and NVIDIA's current drivers include the Vulkan runtime. I remember having to pick between D3D and OpenGL back in the day so it's just like that basically? I just haven't seen it yet.
|
# ? Jun 13, 2016 21:49 |
|
Double phone post.
|
# ? Jun 13, 2016 21:50 |
|
afkmacro posted:You had to flash the bios with the one from the pro. It's not like they shipped the cards with a hidden pro BIOS on it.
|
# ? Jun 13, 2016 21:50 |
|
Bleh Maestro posted:I remember having to pick between D3D and OpenGL back in the day so it's just like that basically? I just haven't seen it yet. Yeah, just like that, once a game has support for it.
|
# ? Jun 13, 2016 21:51 |
|
Bleh Maestro posted:I remember having to pick between D3D and OpenGL back in the day so it's just like that basically? I just haven't seen it yet. It's similar to choosing between DirectX versions for some games (Civ 5 comes to mind)
|
# ? Jun 13, 2016 21:52 |
|
If you've ever used a game that let you select between DX9/DX11/OpenGL/Mantle renderers, DX12 and Vulkan will be entries in that menu. They're graphics APIs, which are a set of function calls that are implemented by the driver, which conveys operations to the hardware. OpenGL is the oldest and has the highest-level function calls - it looks like a single-threaded state machine. It's a good place to start, google it if you don't know what I'm talking about. Paul MaudDib fucked around with this message at 22:42 on Jun 13, 2016 |
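The "single-threaded state machine" idea can be sketched in a few lines (illustrative Python only, not the actual GL API): you bind state into a hidden global context, and draw calls implicitly use whatever happens to be bound at that moment:

```python
# Toy sketch of the OpenGL-style state machine (illustration, not real GL):
# bind calls mutate hidden global state; draw reads it implicitly.

class StateMachineAPI:
    def __init__(self):
        self.bound_texture = None
        self.bound_shader = None
        self.log = []
    def bind_texture(self, name):
        self.bound_texture = name   # mutates hidden context state
    def bind_shader(self, name):
        self.bound_shader = name
    def draw(self):
        # draw takes no arguments - it uses whatever is currently bound
        self.log.append((self.bound_shader, self.bound_texture))

gl = StateMachineAPI()
gl.bind_shader("lit")
gl.bind_texture("bricks")
gl.draw()                  # draws with ("lit", "bricks")
gl.bind_texture("metal")
gl.draw()                  # same shader, new texture: ("lit", "metal")

print(gl.log)
```

That implicit global context is exactly what makes the model awkward to thread and what DX12/Vulkan replace with explicit per-call state.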
# ? Jun 13, 2016 21:53 |
|
HalloKitty posted:Both AMD's and NVIDIA's current drivers include the Vulkan runtime. And when the GeForce came out, there was Direct3D and Direct3D T&L (!), which was the option that replaced Glide in later days. T&L was worth what, 20+ FPS depending on the game, so it was a big deal back then. Man, that was a long time ago. I wrote a report on the importance of it in high school that probably went right over my teacher's head, but hey, I got that A. On a Vulkan note though, if the driver supports it, what hardware is Vulkan supported on? Anything that's DX10+ or DX11+ (so Nvidia 400 series/AMD 5000 series+)?
|
# ? Jun 13, 2016 21:53 |
|
Man, I had no idea that Half Life still had a software renderer. By the time I got around to playing it I had a Geforce 2 MX.
|
# ? Jun 13, 2016 22:05 |
|
Twerk from Home posted:Man, I had no idea that Half Life still had a software renderer. By the time I got around to playing it I had a Geforce 2 MX. Half life was released in 1998, the same year the Voodoo 2 was released. I got Command and Conquer Red Alert for Christmas in 1996, and had to go on the internet with my 19.2kbps modem to download new S3 drivers that supported DirectX 3.0. Quake II came out a year later. Half-life came out a year after that.
|
# ? Jun 13, 2016 22:31 |
|
Twerk from Home posted:Man, I had no idea that Half Life still had a software renderer. By the time I got around to playing it I had a Geforce 2 MX. Pentium-133 with 24MB RAM was the minimum requirement, although by that time I'd moved from that to a screamingly fast P2-400 with a Riva TNT. My first video card was a Diamond Monster 3D, on that aforementioned P-133. Tomb Raider at 30fps in 640x480 with bilinear filtering?
|
# ? Jun 13, 2016 22:34 |
|
Ludicrous Gibs! posted:Pentium-133 with 24MB RAM was the minimum requirement, although by that time I'd moved from that to a screamingly fast P2-400 with a Riva TNT. Man, those boxes back then had that old-school beige tower, "work PC that can actually do a bit more than Solitaire" feel to them, didn't they? lol Also, I finally beat Half-Life 1 on my P2 266 with a Voodoo 2 while my P4 was down with that motherboard burnout when I cooked that GF3 Ti 500 lol. Wasn't a terrible experience since I still had a CRT then, but that 48MB EDO RAM held me back. The game ran OK for the first few levels, then after you loaded about a dozen times it took freaking forever to finish loading and smooth out, requiring a reboot. Man, the amount of time wasted on data shuffling due to low RAM amounts. Now we have 2-6GB in our freaking phones, and can play HL2 on them too. I remember I was pretty hot stuff back in 2004 playing Quake Portable (Quake 1 software & 3D rendered, as well as Quake 3) on my Dell Axim X50v. Quake was quite playable with a BT keyboard and mouse on that 640x480 screen too. That PowerVR 16MB GPU was a beast back then in a PPC.
|
# ? Jun 13, 2016 22:41 |
|
Got my 1070 FE today; it's a really solid card, apart from a few quirks where certain games didn't know what the gently caress it was (looking at you, Fallout 4), and the most up-to-date version of GeForce Experience claiming it doesn't meet the minimum requirement for optimising games. Speccy also seems to think that it only has 4096 VRAM for some reason. I assume because the card is new out or something?
|
# ? Jun 13, 2016 22:41 |
|
Newegg is burning down the second-hand market for 980 Tis. Have to admit I'm thinking real seriously about it. Is the basic 1080 ACX model on EVGA Step Up yet? Is there a disadvantage besides having to pay shipping both ways on the return/replacement? Paul MaudDib fucked around with this message at 22:52 on Jun 13, 2016 |
# ? Jun 13, 2016 22:47 |
|
Linx posted:Got my 1070FE today, it's a really solid card. Apart from a few quirks where certain games didn't know what the gently caress it was (looking at you Fallout 4) and also the most up to date version of GeForce experience claiming to not meet the minimum requirement for optimising games. Speccy also seems to think that it only has 4096 VRAM for some reason. I assume because the card is new out or something? Have you checked for an update for GeForce Experience itself? Might be that, or just that it's so new it doesn't really know the 1070 yet. It is super new, so it will be fixed for sure. Also, you could just go into your games and set everything to high yourself too. :/ That is also a killer deal for a 980 Ti. Dammit Newegg. Wish they would do that for an ASUS Strix so I could run 2 for a while.
|
# ? Jun 13, 2016 22:51 |
|
EdEddnEddy posted:Have you checked for an update for Geforce Experience itself? Might be that or just that its so new it doesn't really know the 1070 yet. It is super new so it will be fixed for sure. Yeah it was the first thing I checked. Not really too fussed because I never used the optimisation tool anyway, I just thought it odd that the software didn't recognise their own hardware.
|
# ? Jun 13, 2016 22:54 |
|
For any UK types, Overclockers have cut all their 9xx prices pretty drastically. Still a bit high if the 480 delivers as expected, but who the hell knows with AMD?
|
# ? Jun 13, 2016 23:09 |
|
There was some overclocking utility I forget the name of (ATI Tray Tools? Omega Drivers?) that could be used to softmod your 9800 to Pro. You just enabled the disabled pipes in it, and it either worked (rarely) or artifacted like a mofo (most common). Similar story with the GeForce 6800LE to 6800GS.
|
# ? Jun 13, 2016 23:19 |
|
Lungboy posted:For any UK types, Overclockers have cut all their 9xx prices pretty drastically. Still a bit high if the 480 delivers as expected, but the hell knows with amd? I'm still not convinced enough that AMD will deliver that I wouldn't pull the trigger on an AIB 1070 if it got down to $400
|
# ? Jun 14, 2016 00:00 |
|
Sormus posted:There was some Overclocking utility i forget the name of (Ati tray tools? Omega Drivers?) that could be used to softmod your 9800 to pro. You just enabled disabled pipes in it and it either worked (rarely) or artifacted like a mofo (most common). Omega Drivers were the risk-free way to softmod if you didn't feel comfortable during the primal days of ATIFlash; ATT had a couple ways to tweak existing softmods though, since Omega had a habit of lowering image quality slightly in the name of performance. Every 9800 softmod relied on the hope that your card had excellent ASIC quality, so if your 9800 had trouble OC'ing to Pro/XT speeds, you probably got something that wasn't good enough to be one of the top-tier 9800s. So nothing has really changed in comparison to every other major unlock in the history of popular GPU unlocks. Despite the name change, it was still the same card with a huge base clock increase and timing tweaks, along with a couple Catalyst-centric improvements upon detection of a Pro/XT, as that lineup was essentially one big R3xx family of varying quality. The one big difference came from whether or not your card had the 128 or 256-bit memory bus, with 256 being mandatory if you had any hope of playing around with EATM MSAA, which was the only way to roll when HL2 came out. I think FEAR was the day of reckoning for all of us that thought nothing would ever get better than a 9800 XT
|
# ? Jun 14, 2016 00:07 |