Wistful of Dollars
Aug 25, 2009

Maybe I'll look for 1080 TI's on black Friday this year, since Vega is a hot, wet squeaker so far.


1gnoirents
Jun 28, 2014

hello :)

kirtar posted:

No, the problem is that people speculate based on limited and often vague information.

Yes, but this time the speculators are the reviewers themselves, who appear to have been lied to. Though technically unproven, a few retailers leaked that AMD's pricing to them was higher than "MSRP". AMD's own statement essentially confirmed bullshit was going on by way of Not-Saying-Anything. Forcing people to buy into "bundles" that simply cost more money is a pretty good indication that MSRP doesn't exist. Also it doesn't help that their track record for about 3 years now is total dogshit.

But yes, it's all speculation based on vague and often contradictory information (even if some of the direct contradictions stem from AMD), so you are not wrong.

I wish you luck buying a $400 Vega 56

Maxwell Adams
Oct 21, 2000

T E E F S

1gnoirents posted:

I wish you luck buying a $400 Vega 56

In my Radeon fanfiction, AMD was being honest about building up stock before putting Vega on sale, except they were talking about just Vega 56, and also they'll have enough supply to meet demand and keep the price low.

Or maybe AMD realized that they can gently caress up as bad as they want and it will hardly matter because miners will buy out everything anyway.

1gnoirents
Jun 28, 2014

hello :)
The reality of it to me is we really are talking about pushing the envelope of technology that didn't exist before. That probably isn't easy, and you can't win them all. And I bet a slippery slope is many times more devastating in that scenario, and they started sliding down it a while ago.

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo

1gnoirents posted:

The reality of it to me is we really are talking about pushing the envelope of technology that didn't exist before. That probably isn't easy, and you can't win them all. And I bet a slippery slope is many times more devastating in that scenario, and they started sliding down it a while ago.

Turns out my envelope was full of confetti. Good prank AMD 😁

Khorne
May 1, 2002

Maxwell Adams posted:

Or maybe AMD realized that they can gently caress up as bad as they want and it will hardly matter because miners will buy out everything anyway.
It's not just miners. AMD has rabid fanboys that buy their stuff whether it's good or bad. Look at the people clamoring for vega on r/amd. It's weird to me to fanboy any tech company. I can understand mindlessly buying Samsung or SanDisk for microsd cards or something, but those markets have nearly identical products and a high level of competition.

Khorne fucked around with this message at 20:23 on Aug 24, 2017

Craptacular!
Jul 9, 2001

Fuck the DH
I've been an Intel fanboy, I guess. I owned a K6-2 and it was a good processor held back by a poor choice in motherboards (VIA architecture), but I bought a P3 while tech magazines were certain CPUID was going to be used to spy on you, and then two P4s (both the early "gently caress overclockers" edition and the later Hyperthreading generation). When AMD introduced Athlon I basically never touched their CPUs again. I immediately got on Conroe and an Intel reference motherboard and used that until it was way too old. Now I'm still on Ivy Bridge and a Z77 mobo and doing fine for now, but I do less CPU-intensive tasks than I did in 2005.

Today I'll use Intel either until Apple stops Hackintosh or it becomes a worse choice for that. I hear you can do it with Ryzen now but the number of extra hoops are such that I'd rather just throw money at the problem.

It helps that Intel is no longer trying to follow the model of late-70s and early-80s IBM, and that the IBM-like things they did in the 90s and won't let go of (like going from 486 to Pentium just so you can trademark the name) are things the industry adjusted to and Intel is less lovely about now. That said, I don't think "fanboys" love court cases; I was an Apple fanboy forever, but the Apple/MS court case was dumb, and the subsequent Apple/Samsung court case is the worst horseshit I've seen from a tech company in years.

I would posit, based on the K6-2 MMX lawsuit, that if 1997's Intel were around today, they would be suing AMD right now for using the numbers 3/5/7 to differentiate their CPUs.

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo
This undervolting thing is great.

Running at 880 mV @ 1900 MHz core clock. Dropped ~5-8 C on some kind of buttcoin benchmark (which I know is ultra unrealistic because that test does not give a flip about memory transfer rates; it's uploading numbers to the matrix instead of megatextures), but it has been running overnight without breaking anything.
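The temperature drop is roughly what you'd expect: dynamic power scales with voltage squared times frequency. A minimal sketch of that relation, assuming a hypothetical stock voltage of 1050 mV (the undervolted point comes from the post; the stock point is an illustration, not a measured 1080 Ti value):

```python
# Rough dynamic-power model: P = C * V^2 * f.
# The capacitance constant C cancels when comparing two operating points.
def relative_power(v_new_mv, f_new_mhz, v_ref_mv, f_ref_mhz):
    """Power at the new operating point relative to the reference point."""
    return (v_new_mv / v_ref_mv) ** 2 * (f_new_mhz / f_ref_mhz)

# Hypothetical stock point of 1050 mV @ 1900 MHz vs the 880 mV undervolt above.
ratio = relative_power(880, 1900, 1050, 1900)
print(f"{ratio:.2f}")  # ~0.70: roughly 30% less dynamic power at the same clock
```

Since GPU temperature tracks dissipated power fairly closely at a fixed fan curve, a ~30% cut in dynamic power dropping the core a few degrees is plausible.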

Craptacular! posted:

I was an Apple fanboy forever, but the Apple/MS court case was dumb, and the subsequent Apple/Samsung court case is the worst horseshit I've seen from a tech company in years.

Please tell me you were anywhere close to our high school's I LOVE APPLE dude, who wore a trenchcoat with I<3IMAC on it because that machine had just been released that year.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Craptacular! posted:

I've been an Intel fanboy, I guess. I owned a K6-2 and it was a good processor held back by a poor choice in motherboards (VIA architecture), but I bought a P3 while tech magazines were certain CPUID was going to be used to spy on you, and then two P4s (both the early "gently caress overclockers" edition and the later Hyperthreading generation). When AMD introduced Athlon I basically never touched their CPUs again. I immediately got on Conroe and an Intel reference motherboard and used that until it was way too old. Now I'm still on Ivy Bridge and a Z77 mobo and doing fine for now, but I do less CPU-intensive tasks than I did in 2005.

It's kinda funny that you disavowed AMD exactly when they had their "Golden Era".

Craptacular!
Jul 9, 2001

Fuck the DH

EVIL Gibson posted:

Please tell me you were anywhere close to our high school's I LOVE APPLE dude, who wore a trenchcoat with I<3IMAC on it because that machine had just been released that year.

No. My dad's college friend bought a Mac dealer and my dad moved from HP to run his repair department. This meant occasionally we had really cool computers at home over the weekend for free, it meant I knew far more about how to optimize the system than the people trying to run the computers at school, and it means I have a few t-shirts and tchotchkes from Apple service seminars.

I tried to understand PCs around the time of DOS 6/Win 3.1, and I did own a 486 but IRQ issues and the shittiness of DUN pre-98 made it never my primary platform until USB happened. This meant on most BBSes I was the Mac guy hearing how great Doom was but never really allowed to understand why.

We abandoned Mac at home around System 8, by which point the store was sold to someone else and everyone left.

Craptacular! fucked around with this message at 22:06 on Aug 24, 2017

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

SourKraut posted:

It's kinda funny that you disavowed AMD exactly when they had their "Golden Era".

Isn't that also about the time Intel put out lovely processors and started their illegal rebate program to shove AMD out of the market?

GRINDCORE MEGGIDO
Feb 28, 1985


It's dedication to a cause, buying multiple P4s over Athlons.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

AMD chipsets were still a shitshow during their golden era. That should have all been in-house and not doing so caused a lot of headaches and kept corporate adoption low.

Craptacular!
Jul 9, 2001

Fuck the DH

SourKraut posted:

It's kinda funny that you disavowed AMD exactly when they had their "Golden Era".

I always wanted Intel because of marketing, but not only was Intel more expensive, Quake II added 3DNow! instructions. I watched a few games like Ubisoft's "Pod" that were coded around MMX, and basically AMD had a chip with MMX instructions plus their own secret sauce that John Carmack adopted in his newest engine, so why not buy AMD?

The reason I left AMD is the motherboard, which was an Asus with a VIA chipset. VIA joined early Rage-era ATI on my "do not buy under any circumstances" list after that mobo. My understanding is that Nvidia stepped in and created nForce and saved millions from having to live in a VIA wasteland, with AMD+Nvidia being the fabled "green team" for a few years.

GRINDCORE MEGGIDO posted:

It's dedication to a cause, buying multiple P4s over Athlons.

Every young man has that moment where they want to root for the empire. Some express it by voting Republican, I did it by buying P4s.

Craptacular! fucked around with this message at 22:38 on Aug 24, 2017

Enos Cabell
Nov 3, 2004


The best combo was NVidia NForce chipset boards with AMD cpus. Loved my old NForce/Opteron combo.

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo

Craptacular! posted:


I tried to understand PCs around the time of DOS 6/Win 3.1, and I did own a 486 but IRQ issues and the shittiness of DUN pre-98 made it never my primary platform until USB happened. This meant on most BBSes I was the Mac guy hearing how great Doom was but never really allowed to understand why.

We abandoned Mac at home around System 8, by which point the store was sold to someone else and everyone left.

You can blame Jobs and the rest of the exec board for not letting you experience DOOM. They paid gaming mags not to bring up or review any Mac games that came out, because they were trying to market it as a business machine and not a thing for games.

Then, when they finally tried to get into games, they found Bungie; MS, who were looking for a killer app for their console system, solved two problems with one acquisition.

Your IRQ conflicts I remember well: I didn't use a mouse until Windows 95 because my dad and I could never figure out how to get a mouse working while keeping our SoundBlaster working (really dumb, but there was no internet).

This is why Gateway was so drat popular because you could get a preconfigured system that might be slow with lots of tech support cruft in it, but it worked out of the box.

Now we are here where everything is basically plug and play, and only if you are really at the enthusiast level do you need to worry about doing something horrible to your PSU. There are either physical safeguards (the PSU connections to the motherboard) or smart ones (overclocking your GPU from 1500 MHz to 3 GHz just makes your drivers crash rather than bringing on the smell of magic blue smoke) preventing you from loving something up.

metallicaeg
Nov 28, 2005

Evil Red Wings Owner Wario Lemieux Steals Stanley Cup

Enos Cabell posted:

The best combo was NVidia NForce chipset boards with AMD cpus. Loved my old NForce/Opteron combo.

I had a dual-NIC nForce board with one of the Barton core XPs overclocked like crazy with what was essentially a giant boulder of copper in one of the most obnoxiously loud and heavy Thermaltake cases around when I was in high school. Nearly 60 pounds when fully assembled.

redeyes
Sep 14, 2002

by Fluffdaddy

BangersInMyKnickers posted:

AMD chipsets were still a shitshow during their golden era. That should have all been in-house and not doing so caused a lot of headaches and kept corporate adoption low.

This is exactly why I refused to use or sell their piece of poo poo systems. No matter how much faster than Intel they were, you could not rely on them. Yes, I am sure there were special snowflakes, but overall: gently caress AMD chipsets (back then).

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
I had a P4 but it was because I did not know much about PCs back then and it was my first one from Dell.

feedmegin
Jul 30, 2008

EVIL Gibson posted:

You can blame Jobs and the rest of the exec board for not letting you experience DOOM. They paid gaming mags to not bring up or review any Mac games that came out because they we're trying to market it as a business machine and not a thing for games.

Wasn't Jobs at NeXT in the Doom time frame?

LRADIKAL
Jun 10, 2001

Fun Shoe
How to underclock a 1080ti? Afterburner doesn't let me go below 0 overclock.

1gnoirents
Jun 28, 2014

hello :)

Jago posted:

How to underclock a 1080ti? Afterburner doesn't let me go below 0 overclock.

Press Ctrl+F. It's not the most user-friendly thing, and as far as I could tell, after you have a baseline set you really do have to manually drag each dot on the graph. Hold Shift on the highest data point to drag everything down the first time. I actually didn't get much underclocking out of my EVGA 1080 Ti (I truly lost on that silicon), but I could still shave off a few degrees despite that.
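The shift-drag trick above amounts to applying one uniform offset to every point on the voltage/frequency curve instead of dragging each dot. A toy model of that, with invented curve points (not real 1080 Ti values):

```python
# Toy voltage/frequency curve as (millivolts, MHz) pairs; values are made up.
curve = [(800, 1500), (850, 1650), (900, 1750), (950, 1850), (1050, 1950)]

def shift_curve(points, offset_mhz):
    """Mimic shift-dragging the top point: every point moves by the same offset."""
    return [(mv, mhz + offset_mhz) for mv, mhz in points]

undervolted = shift_curve(curve, -100)
print(undervolted[-1])  # top point drops from (1050, 1950) to (1050, 1850)
```

Fine-tuning individual points afterward is the per-dot dragging the post complains about.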

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Paul MaudDib posted:

NVIDIA stripped their uarch down to the bare essentials. This started back with Kepler, where they actually slowed down some instructions because making all the instructions take the same amount of time let them ditch the instruction-level scheduler. Maxwell/Pascal take this even further, there is basically nothing except the bare minimum of fixed-function hardware (ROPs/TMUs) that is actually necessary to do graphics. Everything else is either run on the shader cores (eg geometry) or pushed onto the CPU. And every time they strip things down further they plow all the gains right back into putting more cores on the die. Basically Maxwell/Pascal is as close to "nothing but graphics cores" as NVIDIA's engineers can manage.

In contrast AMD is still running an old-school compute-oriented uarch, they still have all that stuff on the chip. It was great against Fermi, it did fine against Kepler, but it's fallen further and further behind Maxwell/Pascal in gaming performance. AMD drastically under-invested during this time period - Raja actually mentioned that around 2012 (i.e. just prior to the Hawaii launch) AMD's leadership thought discrete GPUs were going away, and R&D appears to have drastically tailed off as a result. To make matters worse their drivers have acquired a reputation for sucking hard, especially right at launch (aka FineWine).

It would be better if AMD could make two separate chips, one for compute and one for gaming. But they can't afford that, and they still have fantasies (delusions) of competing in the compute/datacenter market. So what they've had to do instead is try and make gaming workloads more like compute workloads - eg things like having graphics kernels launch async tasks to run in the background to fill pipeline bubbles. But this effectively means that they have to fight an uphill battle to get gamedevs to use the dumb poo poo they come up with, and this requires lots of low-level programming that gives people lots of opportunities to write lovely code that doesn't run well.

So what it comes down to is that NVIDIA has advantages both in software and hardware. They are running the absolute minimum hardware that is necessary and using the die space to just cram on all the cores, and then finding ways to work around the resulting limitations. Whereas AMD is constantly trying to push some new bullshit to make their circa-2012 uarch perform acceptably, and they've more or less just finally hit the tipping point from "behind but still a reasonable budget pick" into "this is stupid, why would you buy this".

Is there a detailed analysis of Kepler/Maxwell/Pascal out there? I know some little stuff, like Pascal/Maxwell having tiled rasterization, has been teased out.

I thought GCN was pretty dece, but if it hasn't been overhauled in 5 years I am actually surprised Vega is competitive at all. That's basically Bulldozer-vs-Sandy Bridge levels of out-of-date microarch.

Also surprised that AMD didn't go full Zeppelin style MCM on interposer for Vega. That would at least cut their yield issues down and give them much needed margin/price competition. One step forward one step backward. Still, Zen appears successful enough that it will hopefully let them invest in a decent new graphics arch assuming Raja isn't fired post-Vegapocalypse


e: I found a Verilog impl of Southern Islands here if anyone wants to poke around, this is good poo poo: https://github.com/VerticalResearchGroup/miaow
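The "launch async tasks to fill pipeline bubbles" idea from the quoted post can be sketched as a toy scheduler. This is an illustration of the concept only, not how GCN's hardware queues actually work; the timeline and numbers are invented:

```python
# Toy shader timeline: 1 = graphics queue keeps the cores busy, 0 = idle bubble.
graphics = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # made-up workload with pipeline bubbles

def utilization(timeline):
    """Fraction of time slots in which the shader cores are doing work."""
    return sum(timeline) / len(timeline)

def fill_bubbles(timeline, async_tasks):
    """Drop async compute work into idle slots, the way AMD pitches async shaders."""
    filled = list(timeline)
    for i, busy in enumerate(filled):
        if not busy and async_tasks > 0:
            filled[i] = 1
            async_tasks -= 1
    return filled

print(utilization(graphics))                   # 0.6 with bubbles
print(utilization(fill_bubbles(graphics, 3)))  # 0.9 after filling three bubbles
```

The catch the post describes is that someone (the game developer) has to supply those async tasks, which is why the approach needs buy-in that NVIDIA's "just add more cores" strategy doesn't.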

Malcolm XML fucked around with this message at 23:25 on Aug 24, 2017

Arivia
Mar 17, 2011
MCM on interposer is supposed to be Navi, I believe.

Vega is such an odd duck because it really should have just been cancelled. AMD had a better gaming chip they already binned (Big Polaris) and now they're treading water until their next actual architecture. Vega really should have just been compute-only/mostly for professional/prosumer tasks like the Frontier Edition was, but AMD made some stupid choices and had to push it as a consumer gaming architecture instead.

GRINDCORE MEGGIDO
Feb 28, 1985


BangersInMyKnickers posted:

AMD chipsets were still a shitshow during their golden era. That should have all been in-house and not doing so caused a lot of headaches and kept corporate adoption low.

I had nothing but trouble with Via. But nForce / nForce2 were amazing to me. When I went over to them, I finally got what Intel users were saying about stability.
I left a system on for days just to see if it would crash, and it didn't.

canyoneer
Sep 13, 2005


I only have canyoneyes for you
I remember leaving a gaming PC on for 3 months in the Pentium 4/Windows XP era.
That was exciting for the time.
Running a windows 98 PC hard with games/other stuff meant you were restarting every day at least.

I have a GTX 660Ti that I want to replace, but I don't want to pay buttcoin miner prices for stuff and I have an old Dell 1080p 24" monitor, which I should also probably replace at the same time which makes it real expensive.

Just feels like a weird time to upgrade with the new Nvidia stuff 6 months away, but I want to play Battlefront 2 and Shadow of War later this year too :(

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

canyoneer posted:

I have a GTX 660Ti that I want to replace, but I don't want to pay buttcoin miner prices for stuff and I have an old Dell 1080p 24" monitor, which I should also probably replace at the same time which makes it real expensive.

Just feels like a weird time to upgrade with the new Nvidia stuff 6 months away, but I want to play Battlefront 2 and Shadow of War later this year too :(

You could look at a second hand 970 or 980 to tide you over - not sure how location dependent it is but they’re cheap as hell in my part of the world right now (I guess it isn’t worth mining with them?)

IanTheM
May 22, 2007
He came from across the Atlantic. . .

Arivia posted:

MCM on interposer is supposed to be Navi, I believe.

Vega is such an odd duck because it really should have just been cancelled. AMD had a better gaming chip they already binned (Big Polaris) and now they're treading water until their next actual architecture. Vega really should have just been compute-only/mostly for professional/prosumer tasks like the Frontier Edition was, but AMD made some stupid choices and had to push it as a consumer gaming architecture instead.

Is there anywhere showing how good Vega is exactly at compute?

canyoneer
Sep 13, 2005


I only have canyoneyes for you

dissss posted:

You could look at a second hand 970 or 980 to tide you over - not sure how location dependent it is but they’re cheap as hell in my part of the world right now (I guess it isn’t worth mining with them?)

is a 960 a meaningful upgrade from a 660 Ti? I mean, it's only like $75 used so I guess it doesn't need to be much of an upgrade

Craptacular!
Jul 9, 2001

Fuck the DH

canyoneer posted:

I have a GTX 660Ti that I want to replace, but I don't want to pay buttcoin miner prices for stuff and I have an old Dell 1080p 24" monitor, which I should also probably replace at the same time which makes it real expensive.

Just feels like a weird time to upgrade with the new Nvidia stuff 6 months away, but I want to play Battlefront 2 and Shadow of War later this year too :(

If you decide not to replace your monitor, you can get a 3GB 1060 for $200 easily and it will provide a significant boost at 1080p. If you go to 1440p, you can either step up to the 6GB 1060 for about $70 more (previously $20-50 more before buttminers) or buy a used 980 and get 1070-equivalent speed (can't gauge value there; I don't buy used cards).

Keep in mind all of these suggestions assume you'll be a rational person who understands that games sometimes ship with ridiculously high quality presets or VRAM-gobbling shadow garbage that makes for better screenshots but terrible gameplay, and that if you see two modes called High and Highest that look nearly the same but Highest plays like a slideshow, you can play on High and enjoy yourself without beating yourself up over the slightly more diffused shadows you're missing. For instance, if you stay at 1080p, the 3GB 1060 can't handle Mirror's Edge Catalyst's "Hyper" setting, and Doom's "Nightmare" setting has a hardware check for 5GB of VRAM. But both games play at 60+ and look nearly identical in side-by-side screenshots without those modes.

If you're the guy who absolutely must enable the soft shadow option or the hair option that uses as much RAM as the rest of the game combined, then you're already in 1080 territory even at 1080p because of the mining push on 1070s.
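A hardware gate like Doom's Nightmare VRAM check boils down to comparing detected VRAM against a per-preset requirement. A minimal sketch, where the preset names and sizes (other than the 5GB Nightmare figure mentioned above) are illustrative assumptions, not actual game requirements:

```python
# Hypothetical preset VRAM requirements in MB; only "Nightmare" = 5 GB comes
# from the post, the rest are invented for illustration.
PRESET_VRAM_MB = {"High": 2500, "Ultra": 3500, "Nightmare": 5120}

def available_presets(vram_mb):
    """Return the quality presets the detected VRAM is allowed to enable."""
    return [name for name, need in PRESET_VRAM_MB.items() if vram_mb >= need]

print(available_presets(3072))  # a 3GB card gets locked out of the top presets
print(available_presets(6144))  # a 6GB card clears every gate
```

This is also why the 3GB/6GB 1060 split matters more than the raw speed difference: the smaller card fails the gate outright rather than just running slower.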

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

canyoneer posted:

is a 960 a meaningful upgrade from a 660 Ti? I mean, it's only like $75 used so I guess it doesn't need to be much of an upgrade

The 960 is around the same speed as the 660 Ti.

Craptacular!
Jul 9, 2001

Fuck the DH
The 6GB 1060 likely would have been called the 1060ti had the 3GB version not launched at a later date. If you're wondering what today's version of "your card" is, it's that. If you want the card that can run Hyper/Nightmare settings in 2016/2017 games at 1080p, and play 1440p at High or Ultra depending on the game, that's your card.

Craptacular! fucked around with this message at 01:45 on Aug 25, 2017

Kazinsal
Dec 13, 2011


If the local computer parts market didn't blow I'd seriously consider selling my GTX 1070 and buying a GTX 1080 Ti. You can't find a 1070 in stock around here and they're $550+ CAD. A 1080Ti is $900 CAD.

I seriously think if I pushed it and there were enough thirsty butters around I could probably get one to pay $600 for a loving 1070.

craig588
Nov 19, 2005

by Nyc_Tattoo
Are you overclocking the 660 ti? Overclocking would probably get you 15% more performance and close to a stock 960 level of performance. I agree a 960 is too small of an upgrade to be worth spending money on, even accounting for its overclocked performance probably getting an extra 25% faster.

SlayVus
Jul 10, 2009
Grimey Drawer
This looks like a cool unboxing experience.

canyoneer
Sep 13, 2005


I only have canyoneyes for you

craig588 posted:

Are you overclocking the 660 ti? Overclocking would probably get you 15% more performance and close to a stock 960 level of performance. I agree a 960 is too small of an upgrade to be worth spending money on, even accounting for its overclocked performance probably getting an extra 25% faster.

Yeah, I fired up Afterburner today and just looked up online the settings people were boasting about, and then took it down about 15% less than the bragworthy ones and it appears to run fine.
I should have done this 3 years ago because I'm getting a pretty huge (20% average FPS) performance boost in the Rome 2 benchmark already

Ragingsheep
Nov 7, 2009

kirtar posted:

No, the problem is that people speculate based on limited and often vague information.

https://www.techpowerup.com/236422/retailers-are-buying-amd-rx-vega-64-at-usd-675-each

:shrug:

kirtar
Sep 11, 2011

Strum in a harmonizing quartet
I want to cause a revolution

What can I do? My savage
nature is beyond wild

Supposedly that has to do with MA Labs' typical pricing behavior, in that price is adjusted based on demand. This is also either a restock order or a really late one, since they ordered it on launch day.

kirtar fucked around with this message at 05:50 on Aug 25, 2017

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



PerrineClostermann posted:

Isn't that also about the time Intel put out lovely processors and started their illegal rebate program to shove AMD out of the market?
It was! The wonderful Prescott P4s and the disappointment that was the Netburst era.

Craptacular! posted:

The reason I left AMD is the motherboard, which was an Asus with a VIA chipset. VIA joined early Rage-era ATI on my "do not buy under any circumstances" list after that mobo. My understanding is that Nvidia stepped in and created nForce and saved millions from having to live in a VIA wasteland, with AMD+Nvidia being the fabled "green team" for a few years.
Yeah, VIA chipsets were bad, but nForce 2 was really good and as others mentioned, surprisingly stable. I remember my Athlon XP 3200+ and nForce 2 setup fondly. The original nForce had a few issues but was still an improvement over anything VIA or SiS.


SwissArmyDruid
Feb 14, 2014

by sebmojo

SlayVus posted:

This looks like a cool unboxing experience.



That's gotta be a galax board.
