Rabid Snake
Aug 6, 2004

I wonder why Intel would do this when AMD is trying to compete with them with their own CPUs

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
It's more about legal protection IMHO, although getting new patents related to GCN or VLIW to further improve their stuff could be a bonus.

However, the way Kyle was wording everything basically made it sound like Intel is giving up on Iris and AMD is doing all the GPU work in the future (likely Cannonlake and beyond). I don't see an issue for AMD, as I doubt they'll be under any obligation to give Intel the very best they make.

betterinsodapop
Apr 4, 2004

64:3

TheRationalRedditor posted:

I have an Asus P8Z68-V Pro/GEN3, so honestly I wouldn't be surprised if it was the culprit, as I've fully swapped my RAM, PSU and GPU since originally building the PC in 2012. I just updated to the most recent BIOS, from 2015, which I'd neglected to do until now. Like 2 out of the previous 4 firmware ROMs had the vague devnote "Increased Stability", so maybe that's very telling.
I hate ASUS so much. Their website, service, support...everything is so vague and not helpful.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rabid Snake posted:

I wonder why Intel would do this when AMD is trying to compete with them with their own CPUs

It almost makes sense if Intel is looking mostly at the laptop market, where being able to toss in a beefier AMD iGPU solution would help decrease total costs compared to a crappy iGPU + ok dGPU. You just have to assume that whatever AMD got out of the deal probably wasn't enormously valuable, since Intel is certainly in the driver's seat here.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Rabid Snake posted:

I wonder why Intel would do this when AMD is trying to compete with them with their own CPUs

Having had a few minutes to think about it, this is probably about patent protection, nothing more.

Intel and AMD have a cross-license for all the CPU tech; however, it may have vague wording about GPU tech, or tech acquired via purchase (the original CPU deal predates AMD buying ATI). This might be a simple CYA where Intel gives AMD a chunk of cash (they have plenty) to firm up the deal instead of ending up in court for 10 years.

Honestly, replacing the Intel GPU stuff with AMD embedded/integrated stuff would actually be a huge improvement and a boon to consumers, but I don't think internal politics at Intel would ever allow that. The best that might happen is some better use of silicon and some performance improvements.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

betterinsodapop posted:

I hate ASUS so much. Their website, service, support...everything is so vague and not helpful.

What the hell happened to ASUS? I remember people singing their praises and using them almost exclusively for builds years back and then all of a sudden nothing.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

betterinsodapop posted:

I hate ASUS so much. Their website, service, support...everything is so vague and not helpful.

I used to have an ASUS 1950... Something. And it was impossible to use the stock graphics card drivers with it; only the ASUS ones worked, and naturally they were massively out of date and never updated. I've avoided them since then; these days I only buy their motherboards.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

jisforjosh posted:

What the hell happened to ASUS? I remember people singing their praises and using them almost exclusively for builds years back and then all of a sudden nothing.

Their hardware is generally very solid still. It's just their support services that are terrible.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

DrDork posted:

Their hardware is generally very solid still. It's just their support services that are terrible.

And that ends up as a net negative when you have the likes of Sapphire, XFX, EVGA, MSI and even Zotac managing good build quality with average-to-fantastic customer service, while ASRock and MSI boards are usually of the same quality but with superior customer service.

They're not Gigabyte but man are there better alternatives at this point.

well why not
Feb 10, 2009

At one point a few years back I realised that the rig I'd built up over the years was mostly assembled from still-working ASUS stuff. I don't think their QC or manufacturing is bad, it's just that their support + prices aren't where they should be. Also, that ROG poo poo is tacky as gently caress.

Guigui
Jan 19, 2010
Winner of January '10 Lux Aeterna "Best 2010 Poster" Award
Wondering if I could get the opinion of some of the goon experts here...

I'm narrowing down my search for a graphics card replacement for my old 560, and so far I have come up with two cards that fit my budget and use:

An EVGA GTX 1060 SC Gaming, single fan, ACX, 6GB (06-p4-6163-kr), 1607 MHz base, 1835 MHz boost, for a total of $379 Canadian after taxes (and rebates)... or,

A Sapphire Nitro+ Radeon RX 480 (100406NT+8GL), 8GB 256-bit, 1208 MHz base, 1342 MHz boost, for $387 Canadian (shipping included).

Currently I have a P6P67 mobo and an i5 2400 chip. My monitor is a 1680 x 1050, which would be nice to upgrade in the future.

I have been hemming and hawing and biding my time, but some of these promotional offers expire tomorrow, so any thoughts or questions would be greatly appreciated!

(I could go with a cheaper Blower 480, but noise is a bit of an issue...)

well why not
Feb 10, 2009

If noise is an issue, the single fan 1060 is very quiet.

sauer kraut
Oct 2, 2004

Guigui posted:

Wondering if I could get the opinion of some of the goon experts here...

I'm narrowing down my search for a graphics card replacement for my old 560, and so far I have come up with two cards that fit my budget and use:

An EVGA GTX 1060 SC Gaming, single fan, ACX, 6GB (06-p4-6163-kr), 1607 MHz base, 1835 MHz boost, for a total of $379 Canadian after taxes (and rebates)... or,

A Sapphire Nitro+ Radeon RX 480 (100406NT+8GL), 8GB 256-bit, 1208 MHz base, 1342 MHz boost, for $387 Canadian (shipping included).

Currently I have a P6P67 mobo and an i5 2400 chip. My monitor is a 1680 x 1050, which would be nice to upgrade in the future.

I have been hemming and hawing and biding my time, but some of these promotional offers expire tomorrow, so any thoughts or questions would be greatly appreciated!

(I could go with a cheaper Blower 480, but noise is a bit of an issue...)

If budget is an issue and you're thinking about a monitor upgrade soon, a 480 with FreeSync is the way to go.
The Nitro+ is pretty loud according to reviews. Are you locked into a specific vendor with gift cards or something that limits your options?

These are better $349 CAD options imho:
http://www.newegg.ca/Product/Product.aspx?item=N82E16814131694 Red Devil
https://www.amazon.ca/dp/B01K1JTT8S MSI Gaming 8G

SwissArmyDruid
Feb 14, 2014

by sebmojo

eames posted:

That power consumption decrease seemed impressive until I read the slides.


and then in a sentence two paragraphs down:


:saddowns:
They're basically throttling framerates depending on movement on the screen. I guess this will be useful for laptops but I don't think it'll do much, if anything, with an RX 480 and a 100Hz 1440p screen.

If they are using HiAlgoBoost-derived code to do it, it is not just throttling framerates. It is probably doing dynamic resolution scaling as well. I imagine it's a combination of their VR technologies to smooth out headtracking along with framerate throttling, and whatever other voodoo they decided to add into the mix, then brought to a stationary monitor.

As someone who used HiAlgoBoost during the early days of Skyrim when I was a poor and had a potato PC badly in need of an upgrade, it was a very clever and well-implemented bit of kit then, and I imagine it has gotten better with lower-level access to the GPU.

I would like to see how it functions on a modern GPU, however. Several years ago, I was so happy just to be able to keep my framerate up with all my mods loaded that I could forgive a bit of motion-blur-but-not-really, but these days I won't tolerate it at all.
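
If I had to guess at the shape of it, it's a feedback loop along these lines. This is a toy sketch, not AMD's or HiAlgo's actual algorithm; the frame budget and thresholds are numbers I made up:

code:
# Toy dynamic-resolution scaler: shed render resolution when a frame
# blows its budget, creep back up when there's headroom.
TARGET_MS = 16.7                 # frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # clamp on the render scale

def update_render_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: drop pixels fast
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.80:  # headroom: sharpen slowly
        scale += 0.01                       # slow ramp-up hides the change
    return max(MIN_SCALE, min(MAX_SCALE, scale))

print(update_render_scale(1.0, 25.0))  # a 25 ms frame -> scale drops to 0.95

The asymmetry (fast down, slow up) is what keeps the resolution changes from being obvious - presumably the source of that motion-blur-but-not-really effect.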

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

SwissArmyDruid posted:

If they are using HiAlgoBoost-derived code to do it, it is not just throttling framerates. It is probably doing dynamic resolution scaling as well. I imagine it's a combination of their VR technologies to smooth out headtracking along with framerate throttling, and whatever other voodoo they decided to add into the mix, then brought to a stationary monitor.

As someone who used HiAlgoBoost during the early days of Skyrim when I was a poor and had a potato PC badly in need of an upgrade, it was a very clever and well-implemented bit of kit then, and I imagine it has gotten better with lower-level access to the GPU.

I would like to see how it functions on a modern GPU, however. Several years ago, I was so happy just to be able to keep my framerate up with all my mods loaded that I could forgive a bit of motion-blur-but-not-really, but these days I won't tolerate it at all.

Yes, AMD acquired HiAlgo, so this Chill feature is probably HiAlgo Chill - a "smart framerate limiter". Who knows precisely what it does in this new driver, but AMD's scant info indicates they're probably just lowering the maximum frame rate limit when the algorithm decides you don't need the performance.
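
Mechanically, a "smart framerate limiter" could be as simple as keying the FPS cap off recent input activity. Pure speculation on the mechanism; the caps and the idle window below are made-up numbers:

code:
import time

# Guess at a Chill-style limiter: low cap while input is idle,
# high cap while the player is actively doing something.
IDLE_CAP, ACTIVE_CAP = 40, 144  # invented values
IDLE_WINDOW_S = 0.5             # how recently input must have occurred

def fps_cap(last_input_time, now=None):
    now = time.monotonic() if now is None else now
    return ACTIVE_CAP if (now - last_input_time) < IDLE_WINDOW_S else IDLE_CAP

Whatever the driver actually keys off (on-screen motion, most likely), the power savings come from the same place: frames you never render cost zero watts.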

penus penus penus
Nov 9, 2014

by piss__donald

PerrineClostermann posted:

It's been consistent for years, so they must be doing something right.

Because they've been on the same architecture for years :v: It was actually them doing something wrong that made the 290s and the like get better and better (bad drivers made good)

"Better value over time" is something that can flip at the drop of a hat once new gens come through and there is zero reason to think one brand is better than another

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I wonder how much performance the 4000, 5000 and 6000 series had locked up behind bad drivers. Like, was their competitive edge even greater back then? Was the HD5870 capable of matching or beating a GTX480? We'll never know.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

FaustianQ posted:

I wonder how much performance the 4000, 5000 and 6000 series had locked up behind bad drivers. Like, was their competitive edge even greater back then? Was the HD5870 capable of matching or beating a GTX480? We'll never know.

Well it's not perfect, but you can go back and look at their raw GFLOPS performance vs their Nvidia counterparts; I like to sperg out on these two wiki articles sometimes.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
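
The headline single-precision number on those lists is just shaders × clock × 2 (one fused multiply-add, i.e. two FLOPs, per shader per clock), so you can sanity-check any entry yourself:

code:
# Peak SP GFLOPS = shaders * clock (GHz) * 2 FLOPs (one FMA per clock)
def peak_gflops(shaders, clock_mhz):
    return shaders * (clock_mhz / 1000.0) * 2

print(peak_gflops(1600, 850))   # HD 5870 -> 2720.0
print(peak_gflops(2304, 1266))  # RX 480 at boost -> ~5833.7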

Josh Lyman
May 24, 2009


FaustianQ posted:

I wonder how much performance the 4000, 5000 and 6000 series had locked up behind bad drivers. Like, was their competitive edge even greater back then? Was the HD5870 capable of matching or beating a GTX480? We'll never know.
Wouldn't raw FLOPS be a driver-free measure?

edit: ^ motherfucker!

Josh Lyman fucked around with this message at 00:48 on Dec 8, 2016

EdEddnEddy
Apr 5, 2012

MaxxBot posted:

Well it's not perfect, but you can go back and look at their raw GFLOPS performance vs their Nvidia counterparts; I like to sperg out on these two wiki articles sometimes.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

Based on that, it looks like a 5870 could hit up to a GTX 670's single-precision GFLOPS, and it wasn't even a contest with DP. Am I reading that wrong?

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
That's correct, but keep in mind that DP isn't used much for games and that Nvidia often deliberately gimps their DP and HP stuff for market segmentation reasons.
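
To put rough numbers on the gimping (ratios read off those same wiki lists, so treat them as ballpark):

code:
# DP peak = SP peak / the architecture's DP:SP divisor (approximate)
print(2720 / 5)   # HD 5870: DP at 1/5 of SP -> ~544 DP GFLOPS
print(2460 / 24)  # GTX 670 (GK104, ~2460 SP at base): 1/24 -> ~102 DP GFLOPS

So yeah, not even a contest.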

EdEddnEddy
Apr 5, 2012

Right, I understand, but drat the 5870 was an overachieving card.

I remember when the 4870 dropped as the first teraflop card, too. Good times for ATI at least, and I loved my 4870X2 after they got the initial drivers sort of dialed in. That thing was fast right up until the 5870 came out and matched it, all on a single GPU lol.


Both ran faster and more stably in things like Crysis than Tri-SLI 8800GTXs, too.

SwissArmyDruid
Feb 14, 2014

by sebmojo

MaxxBot posted:

Well it's not perfect, but you can go back and look at their raw GFLOPS performance vs their Nvidia counterparts; I like to sperg out on these two wiki articles sometimes.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

Scanning over those articles brings back a thought: If AMD doesn't break out the old All-In-Wonder branding for the next generation of APUs, holy poo poo what a wasted opportunity.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SwissArmyDruid posted:

Scanning over those articles brings back a thought: If AMD doesn't break out the old All-In-Wonder branding for the next generation of APUs, holy poo poo what a wasted opportunity.

"What a wasted opportunity" is basically AMD's business motto.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

MaxxBot posted:

Well it's not perfect, but you can go back and look at their raw GFLOPS performance vs their Nvidia counterparts; I like to sperg out on these two wiki articles sometimes.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

AMD FLOPs never equaled Nvidia FLOPs for some reason, but let's assume AMD could get a similar ratio of FLOPs to performance out of the HD5870 as they do out of the RX480. That'd convert the HD5870's 2.7 TFLOPs into 2.0 TFLOPs Nvidia-equivalent (again, an optimistic scenario).

An HD5870 would then be contesting with a GTX 1050 Ti in performance, somewhere between a GTX 760 and a GTX 670, slightly ahead of a GTX 580. That's loving madness; Nvidia would never have been able to live that down.
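
Running that conversion with actual numbers (RX 480 at ~5.8 SP TFLOPs performing like a ~4.4 TFLOPs GTX 1060) lands right on that figure:

code:
rx480_tflops, gtx1060_tflops = 5.8, 4.4  # boost-clock SP peaks
ratio = gtx1060_tflops / rx480_tflops    # ~0.76 "Nvidia-equivalent" factor
print(2.72 * ratio)                      # HD 5870 -> ~2.06 "Nvidia TFLOPs"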

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Is there a good abridged history of Nvidia/Intel vs. AMD/ATI? I got into PC gaming when Nvidia and Intel were on top, but apparently long ago it was better to go team red when you wanted to do gaming. What happened?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Dali Parton posted:

What happened?

ATI got bought by AMD, which did not work out well for ATI in terms of product focus. Also DX11 happened; ATI/AMD DX11 drivers suffered from terrible performance for a very long time (and still do, somewhat). NVidia's DX11 drivers were much better able to utilize the underlying hardware and the end result was noticeably better actual performance.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dali Parton posted:

Is there a good abridged history of Nvidia/Intel vs. AMD/ATI? I got into PC gaming when Nvidia and Intel were on top, but apparently long ago it was better to go team red when you wanted to do gaming. What happened?

NVIDIA and ATI jointly killed 3DFX (although 3DFX's management and Sega both share some blame), and they've gone back and forth ever since. Rage 128 was great, TNT2 was great, GeForce 3/4 was great, Radeon 9700/9800 was great. Then NVIDIA went through a bit of a dry spell with Fermi and Kepler, while AMD had the 4000 series, the 5000 series, then GCN. NVIDIA has finally caught up with Maxwell and especially Pascal, and they're much better cards for gaming than GCN/Polaris can offer, although AMD is still holding their ground in the console and low-end market thanks to the low-level optimizations they enjoy from consoles, an advantage in DX12 and Vulkan (which are essentially designed by AMD; they're Mantle 2.0), etc. Their DX11 drivers have always kind of sucked because they can't pay to retain the quality/quantity of talent that it takes to optimize a bunch of games written by terrible devs who need to get it in stores by Christmas.

Right now the biggest thing hurting AMD is they literally don't have anything that competes above a 1070, and even their 1070-class card (the R9 Fury) has some serious caveats attached (limited VRAM as we move into an era where VRAM usage is increasing fast, some weird microstutter problems, and high power consumption). The RX 480 is a pretty decent card, especially at the price, but it only competes with a GTX 1060, and it still consumes roughly 50% more power than its NVIDIA counterpart. They were planning to make a much bigger dent but they had a lovely fab and they overvolted all their cards to try and keep clock speeds up. Many cards actually undervolt pretty well and you can make up a chunk of the difference that way. They also did not hit the same clock speeds as NVIDIA did, and I think they were expecting more (part of the reason behind goosing the voltage so hard).

Roughly speaking, right now AMD is one generation behind in terms of power consumption (Polaris is only hitting Maxwell level efficiency out of the box) and they have nothing that is recommendable without major caveats past the lower-midrange performance bracket (the x60 series from NVIDIA have never been a high-end part). They have the advantage in terms of ecosystem (FreeSync is an open standard and many of the monitors that support it are much cheaper than their NVIDIA GSync equivalents), but that also comes with some major caveats too (many FreeSync monitors do not have very good sync range, and they do not have the same selection of really premium monitors at the high end). Really what AMD needs to compete right now is a big chip, because if you scaled Polaris up it wouldn't have fantastic power-to-performance but it would be at least something past the GTX 1060 range.

They really, really need something along the lines of the Kepler-to-Maxwell redesign where they fundamentally clean up their architecture, but it'll be a hard sell, especially since they have been coasting on the same basic architecture for the past 5 years and a redesign would likely nuke a lot of their advantage in terms of staffing costs and DX12/Vulkan. And they just sold a buttload of next-gen consoles, so they are committed to supporting that architecture.

As for Intel vs AMD... AMD had the lead in the 2000s. AMD was at Intel's heels, and then Intel made a huge misstep with Pentium 4. They spent 3 years trying to reverse that mistake and then went with a Pentium 3 derivative. AMD ran their design out of gas, culminating in the Phenom II architecture. Then they tried to switch to Bulldozer, made many of the same mistakes Intel did plus some extras (they bet the farm on parallelism), and spent the next 3 years trying to claw their way back to the same IPC they had with Phenom II. Intel has mostly coasted for the past 5 years with performance that solidly thrashes anything AMD can offer. It's not even close right now; Intel chips dominate AMD chips in both performance and power at any given price point.

Overall, AMD just could not keep up with Intel and TSMC's fabs. They spun off their in-house fab into GloFo and then made an even bigger mistake betting on GloFo with big wafer agreements. They also failed to hand Jim Keller a blank check and tell him to write as many zeroes as it takes to keep him there. He was a DEC guy who designed the Alpha chips before management ran that company into the ground, and then came to AMD to design Thunderbird/Athlon XP/Athlon64/X2 (essentially anything AMD is famous for in the last 30 years). He then left AMD, the company went to poo poo with Bulldozer/etc, and he returned to design Zen before disappearing into the night once more.

Paul MaudDib fucked around with this message at 03:23 on Dec 8, 2016

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Paul MaudDib posted:

They also failed to hand Jim Keller a blank check and tell him to write as many zeroes as it takes to keep him there. He was a DEC guy who designed the Alpha chips before management ran that company into the ground, and then came to AMD to design Thunderbird/Athlon XP/Athlon64/X2 (essentially anything AMD is famous for in the last 30 years). He then left AMD, the company went to poo poo with Bulldozer/etc, and he returned to design Zen before disappearing into the night once more.

I talked to Jim Keller the other day; he's at Tesla running Autopilot Engineering. Weird reality/forum crossover. I doubt he's doing processor design at Tesla, but I guess it's possible.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Subjunctive posted:

I talked to Jim Keller the other day; he's at Tesla running Autopilot Engineering. Weird reality/forum crossover. I doubt he's doing processor design at Tesla, but I guess it's possible.

Haha, solid. No, probably not processor design, although you never know - Google and FB get to "design their own chips" thanks to proprietary microcode to unlock functional units they engineer with Intel. But there's a whole bunch of systems integration work to be done there and most of it is novel at this point. Something like Drive PX2 is a complex system with a non-uniform architecture that will need a lot of micro-management.

I don't know Jim Keller except by reputation and what my dad told me about working at DEC at the time (his exact words were something like "AMD has spent the last 10 years unlocking the rest of Alpha's potential") but as far as I'm concerned the man is a legend in his own time and it honestly gives me confidence to know he's riding herd on something as dangerous as self-driving cars. I don't suppose he's hiring? I did a device-only CUDA simulation for my Master's thesis :v:

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

He didn't mention if he was hiring (he's sort of gruff), but I'm sure Tesla is!

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Wowie zowie. Thanks for the history lesson.

So it's safe to say ATI/AMD getting competitive again would force Intel to innovate and cut the selling price of their CPUs a little more? The same would happen with Nvidia, I assume?

vvv: and thanks too man! Reading two histories is twice as good.

buglord fucked around with this message at 03:29 on Dec 8, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Dali Parton posted:

Is there a good abridged history of Nvidia/Intel vs. AMD/ATI? I got into PC gaming when Nvidia and Intel were on top, but apparently long ago it was better to go team red when you wanted to do gaming. What happened?

Nvidia/AMD
Kepler to Maxwell happened. Prior to that Nvidia did in fact make the single fastest GPUs, but they were coupled with high heat, large die size, and a commensurate price tag, which meant AMD could offer 90% of the performance for 70% of the price; this was the case for the 4000, 5000 and 6000 series IIRC. This happened to Nvidia because they stagnated on the Tesla uarch for so long (8000/9000/200 series), all of which were pummeled by AMD's offerings once the 4000 series hit (and the 3000 series was competitive with the 8800GT and below). Despite these leads, poorer driver optimization and poorer marketing meant AMD still lost out on sales to indisputably inferior Nvidia products. Then GK104 hit, then Maxwell, and Nvidia has stolen the crown AMD held for the longest time - power efficiency, die efficiency, and gaming performance. With struggling financials, AMD just doesn't have the R&D anymore to innovate like they did 4-8 years ago and has been coasting on old GCN, which, taken with the driver improvements today, would mean AMD was ahead of Nvidia technologically up until Maxwell (and has reaped near zero benefit from it).

If AMD hadn't been staffed by greedy cartoon villains at the top and had good software development (something Nvidia went all in on), we'd definitely be rooting for an Nvidia underdog that's barely clinging on.

Intel/AMD
Bulldozer happened, Hector Ruiz happened. There are so many compounding reasons why AMD's star fell in the CPU world, but it's pretty easy to lay the blame at Ruiz's feet and just leave it there. AMD used to have a competitive uarch compared to Intel, especially when Intel went full gigatarded with Pentium 4, but poor management (a recurring theme) coupled with betting on the wrong horse (construction cores) and undercutting funding for continued development of their current uarch (K8, essentially) meant they regressed in performance, while Intel took a Pentium 3 derivative, iterated on the design repeatedly, and leaped ahead of AMD in basically one year. Sandy Bridge left AMD in the dust and they've never been competitive since, except at the very low end, because LOL Atoms are trash and construction cores and cat cores scale really well downwards.

There is hope that Zen will reverse this, or at least stop the company hemorrhaging - to the point that if Zen fails, AMD fails.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Intel/AMD
Bulldozer happened, Hector Ruiz happened. There are so many compounding reasons why AMD's star fell in the CPU world, but it's pretty easy to lay the blame at Ruiz's feet and just leave it there. AMD used to have a competitive uarch compared to Intel, especially when Intel went full gigatarded with Pentium 4, but poor management (a recurring theme) coupled with betting on the wrong horse (construction cores) and undercutting funding for continued development of their current uarch (K8, essentially) meant they regressed in performance, while Intel took a Pentium 3 derivative, iterated on the design repeatedly, and leaped ahead of AMD in basically one year. Sandy Bridge left AMD in the dust and they've never been competitive since, except at the very low end, because LOL Atoms are trash and construction cores and cat cores scale really well downwards.

There is hope that Zen will reverse this, or at least stop the company hemorrhaging - to the point that if Zen fails, AMD fails.

Yeah, again, it's hard to overemphasize how much AMD hosed up when they are getting beat by an evolved Pentium 3 - even after they saw what happened with Pentium 4's terrible IPC and Intel's recovery. Frankly, the early results from Bulldozer prototypes should have been the alarm bell telling them to keep Phenom in development until they knew they had an actual successor, but they went all in. In general AMD didn't have enough money and/or was too stingy with it to keep up R&D on two major fronts at the same time, let alone also developing bleeding-edge fab technology like Intel does to keep pace. I don't want to say the acquisition of ATI was necessarily a bad decision, and it opened up doors for a lot of semi-custom contracts... but they didn't spend the money to keep pace on all three fronts at once. And spinning off GloFo was pretty much the end, since the fab tech is a loss-leader for a company like Intel.

Literally the only AMD CPU that I can recommend whole-heartedly right now is the Athlon 5350. Because who can honestly say no to a 25W processor and motherboard with ECC RAM support for $50? It ain't even bad for your fileserver poo poo, apart from only having 2 SATA ports; right now you really need a SATA controller card. I would literally pay double or triple that for a server-class mITX motherboard with ECC support and another SATA controller onboard and maybe dual NICs. It would be easy to undercut Intel's C-series motherboards. AMD are idiots.

Hopefully Keller-senpai has put AMD on the right path. Haswell IPC/power would put AMD back on the map. They need a win with Vega too - we both agree that the Vega 10/4096-core refresh is not enough a year after Pascal; they desperately need either bigger chips or a new uarch to compete, at a minimum, and both to really excel. Otherwise they're circling the drain in both markets.

Paul MaudDib fucked around with this message at 03:47 on Dec 8, 2016

New Zealand can eat me
Aug 29, 2008

:matters:


What about the bit where Intel got caught bribing Dell not to use AMD and had to fork over >$1B USD, but apparently still hasn't paid?

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
EVGA finally got back to me about my step-up (to the streets. and a 1080) and it's at the phase where I have a week to pay. For anyone who's had experience doing it, how does it go from here? Are they willing to write package notation indicating that tax has already been paid on the product (as it's being sent from the US to Canada)? Do I actually have to send my card away before they even begin shipping the new one back?

SlayVus
Jul 10, 2009
Grimey Drawer
During the Step-Up process you will be given an RMA number; with this number you ship your card back to them. After they receive your card, they will ship the new Stepped-Up card out to you. The package will be shipped to you as an RMA replacement and should not incur any additional taxes. People on the EVGA forums suggest using UPS and not FedEx.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

SlayVus posted:

During the Step-Up process you will be given an RMA number; with this number you ship your card back to them. After they receive your card, they will ship the new Stepped-Up card out to you. The package will be shipped to you as an RMA replacement and should not incur any additional taxes. People on the EVGA forums suggest using UPS and not FedEx.
Any reason why? Is it super express? So far the receipt only has it listed as a massive non-specific $73 charge; I'm assuming that's because it's cross-border, but the more info about it the better.

I didn't count on having to revert to my dusty 670 while waiting for a handoff, so I'm trying to figure out all I can to decide if it's worth it. The currency conversion is murderous.

SlayVus
Jul 10, 2009
Grimey Drawer
If that is $73 for the Step-Up, it includes step-up costs, taxes, and shipping fees. As for what people said on the EVGA forum: UPS is just easier to deal with for cross-border shipments than FedEx is.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
That's specifically the shipping alone; it's listed as a separate deduction under the actual step-up price difference between my 1070 and the 1080.
