|
jm20 posted:It looks mighty hard to fit a pcie card and use sata on that board, inventive indeed.
|
# ? Aug 19, 2016 20:08 |
|
With a single-slot cooler, amirite?
|
# ? Aug 19, 2016 21:52 |
|
Yes
|
# ? Aug 19, 2016 22:01 |
|
So, a while back, I tried to figure out what it would take to make a single-slot cooler for a low-profile video card. Long story short: the extruded heatsinks you get by taking a long-rear end bar and just chopping it off every couple of inches have a minimum order price, they really don't cool all that well when you cut them down severely, and there's a reason why every modern enthusiast GPU moved to really thin fins. Even on the reference coolers, the fins are vastly thinner than I would feel comfortable machining. It looks dire, I'm afraid. You're going to have to compromise on *one* thing. Even the Galax/Galaxy/KFA2 low-profile 750 Ti I have is a double-height cooler, and it uses an extruded heatsink with no heatpipes. 50 cents on the day. SwissArmyDruid fucked around with this message at 08:36 on Aug 20, 2016 |
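(For anyone wanting to sanity-check why a chopped-down extrusion can't keep up, here's an illustrative back-of-envelope sketch. Every number in it is an assumption picked for demonstration, not a measurement of any real card or heatsink.)

```python
# Back-of-envelope check: can a short chunk of extruded heatsink
# dissipate a low-profile GPU's heat? All figures below are assumed,
# illustrative values, not measurements of any real part.

def required_thermal_resistance(tdp_w, t_max_c=85.0, t_ambient_c=35.0):
    """Maximum allowable heatsink thermal resistance in C/W for a
    given heat load and temperature budget."""
    return (t_max_c - t_ambient_c) / tdp_w

def extrusion_resistance(r_per_100mm=1.0, length_mm=50.0):
    """Crude model: an extrusion's thermal resistance scales roughly
    inversely with its length, so halving the length doubles it."""
    return r_per_100mm * (100.0 / length_mm)

tdp = 60.0  # watts, ballpark for a low-profile 750 Ti class card
need = required_thermal_resistance(tdp)      # resistance we can afford
have = extrusion_resistance(length_mm=50.0)  # what a 50 mm chunk gives

print(f"need <= {need:.2f} C/W, a 50 mm cut gives ~{have:.2f} C/W")
print("ok" if have <= need else "not enough heatsink")
```

Under these assumed numbers the short chunk falls well short of the target, which matches the experience above: cut-down extrusions stop cooling well long before they stop fitting.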
# ? Aug 20, 2016 07:56 |
|
EdEddnEddy posted:
Since this seems to be a realistic scenario now: If this really happens, why the heck doesn't Apple just buy AMD? Surely a company like Apple has analysts and insiders that can gauge the performance of Zen by now. They'd free themselves from Intel and probably get a boatload of useful GPU/CPU patents for their ARM chips as well.
|
# ? Aug 20, 2016 09:46 |
|
Will save you the trouble of digging the answer out from many posts back: AMD's present arrangement with Intel contains a clause stating that if either party should be acquired (obviously this makes it only pertinent to AMD, and not Chipzilla), AMD's x86 license is terminated, as is Intel's x86-64 license. This makes acquisition untenable for anyone desiring to enter the x86 space.
|
# ? Aug 20, 2016 09:55 |
|
SwissArmyDruid posted:Will save you the trouble of digging the answer out from many posts back: AMD's present arrangement with Intel contains a clause stating that if either party should be acquired (obviously this makes it only pertinent to AMD, and not Chipzilla), AMD's x86 license is terminated, as is Intel's x86-64 license. This makes acquisition untenable for anyone desiring to enter the x86 space. Oh I see, thanks. That changes everything, as AMD minus the x86 license and designs isn't even attractive to Apple, even though they're rumored to switch to ARM. (thinner! lighter! ) I think there's a decent chance we'll see Zen in future Macbooks assuming the performance pans out, unless they really do switch to ARM before that.
|
# ? Aug 20, 2016 10:11 |
|
EdEddnEddy posted:I continue to really hope for the best here though. Intel needs a kick in the nuts and AMD needs a winning architecture that can bring them back into the game full swing. If Zen ends up being great, and they make some APUs with HBM2 for the mobile market that can swing within striking distance of, say, 25% slower than an Nvidia 1060 (would that be possible?), then they could really have some killer products on the market in the next year or two. I've been waiting for something like this since Llano.
|
# ? Aug 20, 2016 12:06 |
|
Palladium posted:Gotta love when Zen is rumored to be competitive with BDW-E, the minds of fanboys go straight to "gimme that for <$200", because competitive performance = competitive pricing doesn't apply to AMD. For general desktop, gaming, and 'real world' use purposes that sort of performance difference won't mean much. Even if they can only get Zen to clock to around 3.5GHz at stock clockspeeds, it won't matter much vs existing Skylake or future Purleylake chips. But they'll lose the synthetic benchmark battles by significant amounts, and they probably won't have as good perf/watt vs what Intel will be offering, which will matter when they go to try and sell to the server markets. Being less than the best typically means you have to reduce prices by quite a bit in the PC biz, unfortunately. The good news is that even selling their chips for hundreds less than Intel will still result in dramatically improved ASPs and revenue, so I'd hardly look at it as a 'bad' situation to be in. What will be really interesting to see is if Zen+ performs like AMD is saying. If Zen+ really does end up being ~15% faster per clock than Zen, it means they'll have essentially near identical performance to Purleylake even in synth benches. I wouldn't be surprised if AMD sold Zen+ for just a tad ($20-50) less than Intel's prices if they pull off that degree of performance. PC LOAD LETTER fucked around with this message at 12:16 on Aug 20, 2016 |
# ? Aug 20, 2016 12:09 |
|
Twerk from Home posted:Yes and no. It's massively memory bottlenecked and falls off at high resolution / AA / any other memory-intensive situations. I think on paper it's even faster than those! So basically it's even more retarded on Intel's part than I thought, as integrating an Iris Pro 580 onto a PCB with GDDR5 and a memory controller would give them a very competitive product for single slot/low profile. I mean, a 580 is paired with 4 Skylake cores, so we're talking about an ASIC which pulls what, 20W at most in a standalone configuration, paired with 2-4GB of memory pulling 10-20 more watts. That's a pretty slam-dunk perfect product from my perspective, and it clearly has more room for, say, a 108/144EU part that still fits inside PCIe power spec. Intel could be selling bus-powered 380X/960s to OEMs and it boggles my mind why it's not happening. EmpyreanFlux fucked around with this message at 22:18 on Aug 22, 2016 |
# ? Aug 20, 2016 14:36 |
|
Because there's just not that much profit in a market as small as consumer graphics cards, especially compared to all the other stuff Intel does
|
# ? Aug 20, 2016 15:14 |
|
Allstone posted:Because there's just not that much profit in a market as small as consumer graphics cards, especially compared to all the other stuff Intel does I'm not thinking consumer though, GPUs still have uses in professional and enterprise, and it's clear they could build an up-to-snuff GPU to challenge even Nvidia, so it's kind of baffling.
|
# ? Aug 20, 2016 16:07 |
|
Unfortunately, CUDA rules the roost in that space. There's a reason why Intel is instead stuffing lots of little Pentiums into a Tesla-like card.
|
# ? Aug 20, 2016 17:22 |
|
Anime Schoolgirl posted:Unfortunately, CUDA rules the roost in that space. There's a reason why Intel is instead stuffing lots of little Pentiums into a Tesla-like card. Is CUDA that important for an Autocad/Maya/Illustrator workstation? If I recall correctly, AMD's APUs have had a hybrid DDR3/GDDR5 memory controller for a few generations already, so there's likely some underlying issue as to why no one's selling a PS4/Xbone-style SoC.
|
# ? Aug 20, 2016 18:45 |
"So, what, this thing's basically a PS4/XBONE? Why not just buy one of those for a cheaper price and 'just works'-ness?"
|
|
# ? Aug 20, 2016 18:54 |
|
Consoles make for poor laptops
|
# ? Aug 21, 2016 07:50 |
|
Arzachel posted:Consoles make for poor laptops If there were a market for it, I'm pretty sure a first-party console laptop is more feasible right now than it's ever been. The die-shrunk Xbone / PS4 have got to be relatively efficient, and they're just x86 + GCN.
|
# ? Aug 22, 2016 21:35 |
|
Twerk from Home posted:If there were a market for it, I'm pretty sure that a first-party console laptop is the most possible it's ever been right now. The die shrunk xbone / PS4 have got to be relatively efficient, and they're just x86 + GCN. I would unironically buy a nintendo tablet that was basically just a web browser and a nintendo game unit, with a wii u pro controller. Maybe buy an optional tv dock for improved graphical performance.
|
# ? Aug 22, 2016 23:16 |
|
mediaphage posted:I would unironically buy a nintendo tablet that was basically just a web browser and a nintendo game unit, with a wii u pro controller. Maybe buy an optional tv dock for improved graphical performance. You're in luck! It seems like this is basically what the NX is, except it uses Tegra and not an AMD SoC. http://www.eurogamer.net/articles/2016-07-26-nx-is-a-portable-console-with-detachable-controllers
|
# ? Aug 23, 2016 00:20 |
|
WCCFt slide dump for processor nerds: http://wccftech.com/amd-zen-architecture-hot-chips/
|
# ? Aug 23, 2016 04:17 |
|
mediaphage posted:Maybe buy an optional tv dock for improved graphical performance. eGPU is the future.
|
# ? Aug 23, 2016 04:33 |
|
Still no word about how they're managing NUMA L3 between the two "core complex" units in an 8C/16T part. Also, I personally am waiting for that final form of APU where they put 4C/8T Zen, enough GPU power to be around X60 level, and a couple stacks of HBM2 into a thin-and-light. Up until earlier this month, there would have been no competition for such a product. SwissArmyDruid fucked around with this message at 05:05 on Aug 23, 2016 |
# ? Aug 23, 2016 04:45 |
|
cbirdsong posted:You're in luck! It seems like this is basically what the NX is, except it uses Tegra and not an AMD SoC. http://www.eurogamer.net/articles/2016-07-26-nx-is-a-portable-console-with-detachable-controllers I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever. I mean I'm still going to buy this almost no matter what so
|
# ? Aug 24, 2016 02:00 |
|
mediaphage posted:I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever. Well they already don't have much in the way of third party AAA titles on the Wii U to begin with, so they're not really losing out. Most Wii U third party stuff is ports from last-gen console versions, when the Wii U is supported at all, due to Wii U basically being an Xbox 360 with 2 GB of RAM and a slightly faster GPU.
|
# ? Aug 24, 2016 02:05 |
|
FaustianQ posted:95W for 8C/16T, 150W for 24C/48T and 180W for 32C/64T. I'm not sure why there isn't a 16C/32T chip for what seems to be a 125W slot but I'm not AMD. 143mm^2 is suuuuuuuuper small for a many core CPU- Broadwell Xeons range from 246-456mm^2, for the 10 core to 24 core versions.
|
# ? Aug 24, 2016 02:26 |
|
Gwaihir posted:143mm^2 is suuuuuuuuper small for a many core CPU- Broadwell Xeons range from 246-456mm^2, for the 10 core to 24 core versions. I was going off the original guesses based on the first die shot leak, but apparently it's been revised upwards towards 200mm^2.
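(For scale, here's the area-per-core arithmetic behind that comparison, using only the die sizes quoted in this thread. The Zen figure is itself a leak-based estimate, so treat this as illustrative.)

```python
# Rough mm^2-per-core comparison using the die sizes quoted above.
# The Zen entry is a leak-based estimate, not an official figure.

dies = {
    "Zen 8C (revised leak estimate)": (200.0, 8),
    "Broadwell Xeon 10C": (246.0, 10),
    "Broadwell Xeon 24C": (456.0, 24),
}

for name, (area_mm2, cores) in dies.items():
    print(f"{name}: {area_mm2 / cores:.1f} mm^2/core")
```

Not an apples-to-apples comparison, since uncore, iGPU, and cache shares differ between the designs, but it shows why the original 143mm^2 guess for an 8-core part looked implausibly small next to the Xeons.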
|
# ? Aug 24, 2016 02:54 |
|
SwissArmyDruid posted:Still no word about how they're managing NUMA L3 between the two "core complex" units in an 8C/16T part. I still think there'd be no real competition for this tragically hypothetical product. The Pascal-based laptop solutions would probably be double the price of such a device. It would probably be the ideal college laptop as well. Powerful enough for your low-power games and emulators (CS:GO, Overwatch, WoW, etc), drop a Freesync panel in there so it ages a little more gracefully, and get near-ultrabook battery life by eating up all that dGPU space with battery? Oh yeah, sign me right up. You could probably do some decent rendering work on it as well, modeling, and so forth. Pretty good for just about any major. This is the kind of stuff that makes me hnnnnngggggggg for Zen. AMD are so within striking distance of something incredible like this. But then again, it's up to the OEMs to make use of the capability if it's there. Even if it requires jamming a CPU-grade part into a laptop chassis.
|
# ? Aug 24, 2016 09:52 |
|
NewFatMike posted:But then again, it's up to the OEMs to make use of the capability if it's there. NewFatMike posted:it's up to the OEMs to make use of the capability if it's there. NewFatMike posted:it's up to the OEMs When has this ever bit AMD in the rear end?
|
# ? Aug 24, 2016 16:22 |
|
mediaphage posted:I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever. I have held off getting a Wii U for mostly this reason. I want one, but outside of the Mario games, what else is the draw of the U? My N3DS, however, I'm bummed I didn't get sooner. This thing is a ton of fun.
|
# ? Aug 24, 2016 16:42 |
|
Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras. To my knowledge, this would make Nintendo the first console corp to include an actual, honest-to-god COTS GPU in a system. That's... possibly attractive given the performance of Tegra-accelerated devices over Xbone / PS4. (I am only parroting articles citing stuff like SHIELD devices pushing an easy 1080p 60FPS for the new DOOM where the Xbone can barely deliver 720p and a very choppy 45-50 fps). If that actually works out and the power of their full-handheld-but-dockable console isn't intentionally cut in the name of battery life, it would be hilarious to see Nintendo possibly force the rest of the console market forward with good hardware.
|
# ? Aug 24, 2016 17:07 |
|
EdEddnEddy posted:I have held off getting a Wii U for mostly this reason. I want one, but outside of the Mario games, what else is the draw of the U? I have one mostly for the Mario games, which are really really good. The Zelda remakes are nice too.
|
# ? Aug 24, 2016 17:11 |
|
Potato Salad posted:Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras. That 1080/60fps demo was on Doom 3, actually
|
# ? Aug 24, 2016 17:37 |
|
NewFatMike posted:
I don't think I've ever seen an AMD-powered laptop that wasn't a complete piece of poo poo.
|
# ? Aug 24, 2016 17:39 |
|
Potato Salad posted:Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras. The new Doom only runs on x86-64 systems. There are no ARM builds, so it can't run natively on the Shield or any Tegra device. You were either watching people streaming it from a nice computer, or maybe a heavily stripped-down tech demo. You're also grossly overestimating the chipset Nvidia is offering here. It's just a particularly good tablet/smartphone SoC, maybe, which still puts it far behind the performance of the Xbox One or PS4, let alone the upgraded Xbox One and PS4 models that will be coming out next year. The current Nvidia Shield K1 tablet, for instance, has a quad-core 2.2 GHz 32-bit ARMv7-A CPU with a 192-core Kepler-based GPU, and 2 GB of RAM for the system. The Xbox One's AMD APU is 8 x86-64 cores at 1.6 GHz, with 768 GPU cores that don't have to be downclocked to prevent overheating in a fanless mobile device, and there's 8 GB of total RAM. The PS4 is similar. What Nintendo's on track to do is, for the third console in a row, end up with the slowest/least powerful system out there. The Wii was of course noticeably faster than the previous generation of consoles but was still quite a bit behind the 360 and PS3: it could only render in standard def, had 91 MB of RAM total (PS3 and 360 both have 512 MB), and was only single-core PowerPC when the 360 was triple-core/6-thread PowerPC and the PS3 was a weird setup with 1 main PowerPC core plus 7 specialized coprocessor cores available to software. The Wii U is basically just an Xbox 360's design with more RAM - 2 GB specifically. The CPU is similarly triple-core/six-thread PPC, and although it runs at ~1.25 GHz instead of the 360's 3.2 GHz, actual performance on most things is about the same because of improvements in the instruction set and the like. The GPU is only slightly more powerful than the 360's as well. 
So basically it launched as a 7-year-old design when it came out in 2012. Now, if Nintendo's next regular console really will be the NX, and the NX really will be any sort of current or near-future Nvidia Tegra chipset - it's not even going to be as fast as the 2013 Xbox One and PS4; it'll be a decent bit faster than the Wii U, but that's a very low bar. Depending on which particular Tegra setup they use, since it'll supposedly end up handheld too, it might even be only the performance of the Wii U! It's a very bad sign; their only real option for coming out ahead of the current Xbox One and PS4, let alone the upgraded models coming next year, would be securing a high core count Intel or AMD setup. Anime Schoolgirl posted:That 1080/60fps demo was on Doom 3, actually If that's what he saw, then yeah, that's just showing the hardware can handle a 12-year-old game which ran in 720p on the original Xbox. Not exactly impressive! fishmech fucked around with this message at 17:54 on Aug 24, 2016 |
# ? Aug 24, 2016 17:52 |
|
The current Tegra X1 in the Shield TV is an 8-core chip (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite to Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does. And it can handle video like a champ, up to 4K/60FPS H265. Since the Drive PX 2 is now out, which is an even meatier 12-core setup with Pascal in it, capable of 8TF (though it uses 250W and is water-cooled for use in cars), it is reasonable to assume that whatever Nvidia tech the NX uses should probably be around the X1, but possibly with Pascal tech for the GPU, I would guess (hope?). Which would bring it pretty close to current-gen console level graphics-wise, if not better.
|
# ? Aug 24, 2016 18:53 |
|
EdEddnEddy posted:The current Tegra X1 in the Shield TV is an 8-core chip (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite to Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does. And it can handle video like a champ, up to 4K/60FPS H265. The PX 2 still only uses 256 CUDA cores, yes? Unless they are clocked a lot higher than the TX1, I don't see how the GPU would be much of an improvement, outside of power efficiency - which is kind of a big deal for sure. Maxwell and Pascal IPC are virtually the same.
|
# ? Aug 24, 2016 19:42 |
|
I dramatically misunderstood the article, so I'm going to offer the lame excuse of "It was 12:15 am and I was trying to sleep with a cold"
|
# ? Aug 24, 2016 20:02 |
|
EdEddnEddy posted:The current Tegra X1 in the Shield TV is an 8-core chip (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite to Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does. And it can handle video like a champ, up to 4K/60FPS H265. It's nowhere close to the AMD GPU performance in the Xbox One or PS4, and being able to handle a video codec doesn't tell you much about games performance; it just tells you they have hardware codec support. Also, the 8 cores don't get used simultaneously: the system switches between the sets of 4 cores based on load - great at keeping battery draw or heat down when you're doing non-intensive things, but you can't use them all at once to increase performance. Plus, the NX is supposed to be usable portably if the rumors listing it as a Tegra chipset are true, which places serious constraints on what sort of graphics performance the games can expect, unless you've got the most amazing cooling tech for a handheld system in the world and a really good battery on it. Consider all this stuff combined, and getting performance out of the thing that's as good as the 3-year-old XBO/PS4 is a distant hope, let alone anything better - and once again, the improved-CPU/GPU XBO/PS4 models are due out next year. Both of those are expected to handle real-time gameplay at at least ~3K horizontal resolution if not full 4K, and of course better performance at 1080p regardless.
|
# ? Aug 24, 2016 20:39 |
|
fishmech posted:being able to handle a video codec doesn't tell you much about games performance, it just tells you they have hardware codec support. Nothing reminds me more of this than when I ran a Pentium 120 as a DVD player. Software decoding? Absolutely unusable. MPEG2 card installed? Perfect, of course.
|
# ? Aug 24, 2016 21:39 |
|
|
HalloKitty posted:Nothing reminds me more of this than when I ran a Pentium 120 as a DVD player. Software decoding? Absolutely unusable. MPEG2 card installed? Perfect, of course. It's still like this if you've somehow got your hands on a 4K 60 fps video source and are trying to play it back without hardware acceleration.
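(To put illustrative numbers on why high-resolution software decode is so brutal, from the Pentium 120 DVD case up to 4K60: raw pixel throughput alone. Pixel rate is only a proxy for decode cost, which also depends on codec complexity and bitrate, so treat this as a rough sketch.)

```python
# Raw pixel throughput of a few video formats, to illustrate why
# 4K60 playback is painful without a hardware decoder. Pixel rate
# is only a proxy: real HEVC decode cost also depends on bitrate
# and coding tools, so these numbers are purely illustrative.

def pixel_rate(width, height, fps):
    """Pixels that must be produced per second of playback."""
    return width * height * fps

formats = {
    "DVD (720x480 @ 30)": pixel_rate(720, 480, 30),
    "1080p30": pixel_rate(1920, 1080, 30),
    "4K60": pixel_rate(3840, 2160, 60),
}

base = formats["DVD (720x480 @ 30)"]
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.0f} Mpixel/s ({rate / base:.0f}x DVD)")
```

A 4K60 stream has to push roughly 48 times the pixels per second of the DVD that already choked a software-decoding Pentium 120, before even accounting for HEVC being a far heavier codec than MPEG-2 per pixel.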
|
# ? Aug 24, 2016 22:16 |