|
charity rereg posted:https://www.dell.com/en-us/work/shop/desktops-all-in-one-pcs/optiplex-5055-tower-and-small-form-factor/spd/optiplex-5055-desktop Hmm. Not what I was expecting. When I think of a workstation I mean a dual socket boat anchor. I want to see Ryzen in something like a 7000 series or HP Z8 series. I work in visual effects and my company has about 600 or so dual socket HP Z8 series deployed. We won't switch to AMD unless it can be done on monster dual socket desktops and high density blade servers made by HP, so there's parity for code that needs to behave the same on the desktop and on the CPU farm. So it likely won't happen for at least 5 years, if at all, and AMD needs to keep it up for as long. Funny, when I first started in 2010 they were phasing out AMD hardware, but that was also due to a corporate partnership more than an actual engineering decision. Shaocaholica fucked around with this message at 23:28 on Jan 4, 2020 |
# ? Jan 4, 2020 23:24 |
|
|
One factor keeping AMD from being more widely sold by Dell/HP/etc is that their APUs lag way behind their pure CPU releases. A 3700X might be great value but it becomes a lot less attractive once the cost of a dGPU is added on. Of course, I wouldn't be surprised if the big OEMs expect Intel to regain the top spot eventually, and they wouldn't want to sour the relationship over a relatively temporary state of affairs.
|
# ? Jan 4, 2020 23:38 |
|
I'm still not too clear on what graphics are in Renoir. Modified Vega? ConanTheLibrarian posted:One factor keeping AMD from being more widely sold by Dell/HP/etc is that their APUs lag way behind their pure CPU releases Just my hot take, but the graphics units make no sense. So far they have been overkill for office work/light video stuff, which Intel HD does very well and with better software support, but too little for games or serious Mac Pro workloads. sauer kraut fucked around with this message at 23:51 on Jan 4, 2020 |
# ? Jan 4, 2020 23:41 |
|
Intel competes real dirty in the mobile space. That's where they make the most revenue, and they have a lead that extends beyond just the CPU: they provide full turnkey designs for motherboards and the like (IIRC they actually design the MacBook logic boards). Stuff like integrated Thunderbolt 3 is going to keep them in the premium segment. They are also not above discounting or rebating suppliers. The lack of 10nm has hurt them though; apparently they can't do LPDDR4X in 14nm. AMD will have to really compete hard. GPU power is not as important for the business segment, but connectivity and battery life are.
|
# ? Jan 5, 2020 00:13 |
|
Khorne posted:Monolithic 8c/16t APU for the 4000 series allegedly leaked recently. Might be mobile though. No way are they exclusively mobile, maybe a mobile launch, but it's extremely likely these are also desktop processors and they'll come relatively soon. It does look like a really competitive launch though, and for the mobile space that's huge. It's good for a mobile focused launch, but they probably have a ton of high leakage dies as well; they'd just be looking at volume. Probably going to be a 4700G, 4600G, 4400G and 4200G (15, 13, 11 and 8 CU?). I'd be amazed if they had any type of L4, and I don't know how they can price an 8C/16T 15 CU at a point that makes sense for desktop - anything close to $300 doesn't make sense to me without it being HBM enabled and competitive with a GTX 1650, and 65% better puts mobile versions about as powerful as an HD 7850 (TPU charts). A 1650 is still nearly twice as fast, so if you don't price it at about $259 it's not competitive (2600/1650 combo).
|
# ? Jan 5, 2020 01:01 |
|
Why doesn't AMD get more recognition for the innovations they've brought to consumers of x86 technology? Especially considering that many of these breakthroughs were developed or delivered during a period where they were suffering from the effects of anti-competitive practices carried out by Intel, requiring all R&D to happen on a shoestring budget.
- First dual, quad, octo, etc., core desktop processors.
- Created the industry standard 64 bit extensions for x86 processors, without breaking compatibility with legacy 32 bit apps.
- Pioneered the use of chiplets, which greatly enhances the modularity and scalability of the platform.
- Infinity Fabric: high unidirectional throughput while maintaining impressively low latency. AMD has put a lot of R&D into improving HyperTransport over the years, and they are definitely reaping the rewards. This interconnect/bus is key to many design wins.
- Perhaps the most consumer friendly socket of all time, AM4, which has supported 3 generations of the Ryzen family so far since its release in 2016.
- First to move the memory controller from the motherboard onto the CPU (and back again).
- First to bring PCIe 4.0 to market.
- Implemented SMT in a way that is not only secure, but more performant than Intel's HT.
|
# ? Jan 5, 2020 01:19 |
|
FOLK STYLE FISTING posted:Why doesn't AMD get more recognition for the innovations they've brought to consumers of x86 technology? Especially considering that many of these breakthroughs were developed or delivered during a period where they were suffering from the effects of anti-competitive practices carried out by Intel, requiring all R&D to happen on a shoestring budget. Please support FOLK STYLE FISTING as the next AMD president
|
# ? Jan 5, 2020 01:28 |
|
I appreciate that you wrote all that down. Those are all good contributions from AMD
|
# ? Jan 5, 2020 01:31 |
|
What difference does recognition of technical improvements make?
|
# ? Jan 5, 2020 01:36 |
|
FOLK STYLE FISTING posted:Why doesn't AMD get more recognition for the innovations they've brought to consumers of x86 technology? Because normal people don't give a gently caress about that nerd poo poo. Second, a bunch of that list of "firsts" are not really worth much boasting. What's the big achievement in dual core on a consumer desktop when Power chips had been doing it for years in servers & workstations? Much like PCIe 4.0, it was gonna happen anyways (intel was well at work on Yonah when Athlon 64 X2 came out). Third, AMD does have fanboys. Ain't nobody got a "cyrix 4 lyfe" tattoo. x86-64 is worth saying thanks AMD for though, god knows where we'd be in the mirror universe where itanium was the only choice.
|
# ? Jan 5, 2020 01:38 |
|
FOLK STYLE FISTING posted:Why doesn't AMD get more recognition for the innovations they've brought to consumers of x86 technology? Especially considering that many of these breakthroughs were developed or delivered during a period where they were suffering from the effects of anti-competitive practices carried out by Intel, requiring all R&D to happen on a shoestring budget. Dont fanboy tech companies that wouldn't hesitate to grind your bones up and feed them to your dog for a .01% stock price increase.
|
# ? Jan 5, 2020 01:57 |
|
I'm pretty excited about an 8C/16T APU, but it's also completely correct that if it's much more than 260 USD then a discrete CPU+GPU combo would still be better value. FOLK STYLE FISTING posted:Why doesn't AMD get more recognition for the innovations they've brought to consumers of x86 technology? it didn't make Number Go Up enough
|
# ? Jan 5, 2020 02:20 |
|
Klyith posted:x86-64 is worth saying thanks AMD for though, god knows where we'd be in the mirror universe where itanium was the only choice. probably in a timeline where both microsoft and intel are worth $30 because without all the backward compatibility everyone hosed off to less hostile ecosystems, so really it's hard to say whether it's good or bad
|
# ? Jan 5, 2020 02:24 |
|
Alternate timeline PowerPC w/Debian as most popular OS.
|
# ? Jan 5, 2020 03:55 |
|
The true alternate timeline would be BeOS on PowerPC. Also, AM4 doesn't have anything on (Super) Socket 7. It ended up being a mainstream platform all the way from the 1994 Pentium at 75 MHz to the 1999 AMD K6-3 at 550 MHz and you can apparently still buy industrial/embedded motherboards using a derivative of that platform using AMD Geode CPUs with availability guaranteed through 2021.
|
# ? Jan 5, 2020 04:12 |
|
Windows and x86 were a mistake
|
# ? Jan 5, 2020 04:32 |
|
Truga posted:probably in a timeline where both microsoft and intel are worth $30 because without all the backward compatibility everyone hosed off to less hostile ecosystems, so really it's hard to say whether it's good or bad yeah if the tech industry moved away from a model where these two companies hold a virtual duopoly I think that might have been healthier in the long run
|
# ? Jan 5, 2020 04:40 |
|
Shaocaholica posted:Windows and x86 were a mistake Microsoft agreed that some of Windows (non-NT) was a mistake. Intel also agreed that x86 was a mistake and cooked up IA64 but in the middle of them and HP trying to force feed it to the industry AMD stole Intel's spell book and cast a necromancy curse that gave x86 immortality with the 64 bit extensions.
|
# ? Jan 5, 2020 05:56 |
|
Fabulousity posted:Microsoft agreed that some of Windows (non-NT) was a mistake. Intel also agreed that x86 was a mistake and cooked up IA64 but in the middle of them and HP trying to force feed it to the industry AMD stole Intel's spell book and cast a necromancy curse that gave x86 immortality with the 64 bit extensions. Which worked because the parts of x86 which were a mistake were no longer terribly relevant, but the parts of IA64 which were a mistake, oh boy were they a mistake.
|
# ? Jan 5, 2020 06:31 |
|
Killer robot posted:Which worked because the parts of x86 which were a mistake were no longer terribly relevant, but the parts of IA64 which were a mistake, oh boy were they a mistake. This is a little off topic for the thread, but for someone who isn't a programmer, I've always heard that Itanium was a mess, but what was the huge issue?
|
# ? Jan 5, 2020 13:45 |
|
KKKLIP ART posted:This is a little off topic for the thread, but for someone who isn't a programmer, I've always heard that Itanium was a mess but what was the huge issue? AFAIK, Intel tried to move OoO execution from the chip hardware to the compiler, but the former will always be more efficient because it can analyze and optimize itself dynamically, on the fly, against the code actually running, rather than blindly following the compiled code as in the latter.
|
# ? Jan 5, 2020 14:56 |
|
Compiler nerds: ugh this scheduling is so lame, I could do better ISA nerds: enjoy Compiler nerds: hmm! well, nevertheless It’s not clear that Itanium even approached the limits of static scheduling, let alone taking advantage of removing the real-time constraint on scheduling as compensation for lacking runtime signals. How did top-end JITs do, anyway? Palladium posted:AFAIK, Intel tried to move OoO execution from the chip hardware to the compiler, but the former will always be more efficient because it possesses the ability to analyze and optimize itself dynamically on-the-fly to the current code than to blindly follow the compiled code like in the latter. I’m not sure what it means to move OoO to the compiler: the compiler is what sets the order, so by definition its output is “in order”. Maybe I misunderstand what you mean. Mostly they tried to move the parallelism to the compiler and increase the degree of parallelism, and it wanted instruction parallelism rather than mostly thread parallelism, which most software architecture and languages were and are less well-suited for. It’s like if we got 4096-bit-wide SIMD in Zen2, instead of more cores. Subjunctive fucked around with this message at 15:08 on Jan 5, 2020 |
# ? Jan 5, 2020 14:58 |
|
All this nerd talk is boring me. All I need to know is how fast will it run
|
# ? Jan 5, 2020 15:24 |
|
KKKLIP ART posted:This is a little off topic for the thread, but for someone who isn't a programmer, I've always heard that Itanium was a mess but what was the huge issue? It needed a compiler that would have to be literally magic.
|
# ? Jan 5, 2020 15:34 |
|
Balliver Shagnasty posted:All this nerd talk is boring me. All I need to know is how fast will it run Parts thread that is that way
|
# ? Jan 5, 2020 15:36 |
|
Kerbtree posted:It needed a compiler that would have to be literally magic. More or less. All the compiler would need to do to use all the resources in the Itanium efficiently would be to solve P=NP; from that point on, the rest is a trivial exercise for the reader.
|
# ? Jan 5, 2020 15:55 |
|
Balliver Shagnasty posted:All this nerd talk is boring me. All I need to know is how fast will it run Actually we're back to trying to run quake again tbh
|
# ? Jan 5, 2020 16:32 |
|
Balliver Shagnasty posted:All this nerd talk is boring me. All I need to know is how fast will it run
|
# ? Jan 5, 2020 16:52 |
|
Wrong thread
|
# ? Jan 5, 2020 21:11 |
|
is there a way to watch like a broadcast or a livestream of the CES event later?
|
# ? Jan 6, 2020 01:49 |
|
gradenko_2000 posted:is there a way to watch like a broadcast or a livestream of the CES event later? https://www.amd.com/en/events/ces Expectations are the 3980/3990X, 5600XT, and maybe the Zen2 APUs/Laptop parts. Maybe the big Navi MI card? Or none of that, who knows!
|
# ? Jan 6, 2020 02:08 |
|
Balliver Shagnasty posted:All this nerd talk is boring me. All I need to know is how fast will it run There is still no system that can run Crysis on Mega Ultra settings.
|
# ? Jan 6, 2020 03:24 |
|
taqueso posted:There is still no system that can run Crysis on Mega Ultra settings. what would it take?
|
# ? Jan 6, 2020 03:33 |
|
a dyson sphere
|
# ? Jan 6, 2020 03:45 |
|
gradenko_2000 posted:what would it take? A near-event horizon system wherein the speed of light is 50000000000x the standard rate, but it's able to communicate with the outside world at whatever the PCIe 3 spec says the data rate should be. Blammo, arbitrary speed!
|
# ? Jan 6, 2020 06:29 |
|
gradenko_2000 posted:what would it take? Crysis was made at a time when game devs thought CPUs would hit 5GHz and then 6GHz clocks in the near future. So we'd need a breakthrough in CPU speed. OhFunny fucked around with this message at 11:05 on Jan 6, 2020 |
# ? Jan 6, 2020 08:31 |
|
Hail Mary here. Does anyone have a line on an Opteron 2224? I need 2 for a dual socket relic and I already got one but need its twin brother. There's 2 on ebay for like $65 but historically I think they've gone for as little as $10 so I'd rather hold out for that than pay $65.
|
# ? Jan 6, 2020 10:36 |
|
The AMD Zen2 Mobile embargo is going to lift in a few hours and some of the laptop makers are already posting stuff. Looks like those of you looking for premium AMD laptop designs are going to get their wish. https://www.anandtech.com/show/15282/ces-2020-lenovo-yoga-slim-7-with-amd-4000-apu-inside-coming-march Release date for the CPUs appears to be March.
|
# ? Jan 6, 2020 21:11 |
|
Is that really true about crysis? I thought I was just being silly.
|
# ? Jan 6, 2020 21:13 |
|
|
taqueso posted:Is that really true about crysis? I thought I was just being silly. I have a 9900k overclocked to an all core boost of 5.1ghz with an RTX 2080 and with everything turned all the way up (except MSAA which I have set to 2x) I get between 70 and 90 FPS on my 2560x1080 ultrawide monitor. It’s not as much as I would expect from a game from 2007 but it’s not horrible. AMD suffers more simply because of clocks. I was stuck around 50fps with the same settings on a 2700x/1080ti. GutBomb fucked around with this message at 21:37 on Jan 6, 2020 |
# ? Jan 6, 2020 21:19 |