|
movax posted:The hardware has been around for a while, but most consumer stuff like Thunderbolt is kind of bandwidth starved (PCIe 2.1 x4). The software support isn't quite there yet either, unfortunately. Work needs to be done by both OS vendors and hardware vendors. I'm sure the BIOS guys have some work to do if they want to roll with ACPI-mediated PCIe hot plug. MSI actually has a functional demo. http://www.anandtech.com/show/5352/msis-gus-ii-external-gpu-via-thunderbolt Personally, if this existed I'd seriously consider just picking up a MacBook Air and using that as my main PC. If I ever needed to play games I'd dual-boot and hook up my external GPU.
|
# ? Aug 7, 2012 08:51 |
|
|
Happy_Misanthrope posted:Doesn't the "rise" (well, they're not exactly taking over from dedicated GPU's yet of course) of APU's necessitate more bandwidth eventually? The rumor on the street is that Haswell will have a large on-die cache (64MB?) in the form of eDRAM to be used as an interposer to help alleviate some of the traffic over the DRAM bus. If the rest of the rumors are true - going from 12 EUs on die to 40 - then Intel may have to make drastic changes to the memory hierarchy to keep them all fed with data.
|
# ? Aug 7, 2012 15:07 |
|
Shaocaholica posted:Did those external GPU boxes ever catch on? Seems like that would be cheaper than a whole desktop unless there are other bottlenecks to consider. I've linked this in another thread, but DIY external GPUs are a thing. You use a full-size desktop card and basically end up with performance equivalent to a mobile version of the same card. It's a pretty compelling use case: carry a 12" laptop, then go home, hook it up to your external GPU, and transform it into a gaming laptop.
|
# ? Aug 7, 2012 15:50 |
|
Gobbeldygook posted:You use a full-size desktop card and basically end up with performance equivalent to a mobile version of the same card. Because of bandwidth limitations on Thunderbolt or what?
|
# ? Aug 7, 2012 16:18 |
|
Toast Museum posted:Because of bandwidth limitations on Thunderbolt or what? Yeah, because of bandwidth limitations on ExpressCard/PCIe ports, which is how almost all eGPUs are currently done. An eGPU via a Thunderbolt port would achieve performance very close to the same card in a desktop.
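The gap is easy to put numbers on. A back-of-the-envelope sketch (link rates and 8b/10b encoding per the PCIe 1.x/2.0 specs; ExpressCard assumed to carry a single PCIe 1.x lane, which is the common case):

```python
# Effective PCIe bandwidth per direction: raw transfer rate * encoding
# efficiency * lane count. PCIe 1.x/2.0 use 8b/10b encoding (80% efficient).

def pcie_bandwidth_gbps(gt_per_s, lanes, efficiency=8/10):
    """Usable bandwidth in Gbit/s for one direction of a PCIe link."""
    return gt_per_s * efficiency * lanes

expresscard = pcie_bandwidth_gbps(2.5, lanes=1)   # PCIe 1.x x1
thunderbolt = pcie_bandwidth_gbps(5.0, lanes=4)   # ~PCIe 2.0 x4
desktop_x16 = pcie_bandwidth_gbps(5.0, lanes=16)  # PCIe 2.0 x16

print(f"ExpressCard: {expresscard:.1f} Gbit/s ({expresscard/8:.2f} GB/s)")
print(f"Thunderbolt: {thunderbolt:.1f} Gbit/s ({thunderbolt/8:.2f} GB/s)")
print(f"Desktop x16: {desktop_x16:.1f} Gbit/s ({desktop_x16/8:.2f} GB/s)")
```

So ExpressCard tops out around 0.25 GB/s per direction versus roughly 2 GB/s for a Thunderbolt-class PCIe 2.0 x4 link, an 8x difference, which is why the same card behaves so differently behind each.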
|
# ? Aug 7, 2012 17:44 |
|
Gobbeldygook posted:Yeah, because of bandwidth limitations on express card/pcie ports which is how almost all eGPUs are currently done. An eGPU via a Thunderbolt port would achieve performance very close to the same card in a desktop. A Thunderbolt port is still only equivalent to PCI-E 2.0 4X. Better than Expresscard and good enough for basic desktop work, but still a major bottleneck.
|
# ? Aug 7, 2012 17:53 |
|
Alereon posted:A Thunderbolt port is still only equivalent to PCI-E 2.0 4X. Better than Expresscard and good enough for basic desktop work, but still a major bottleneck. Kind of remarkable, but maybe not so much, at least with cards comparable in performance to a GTX 480. The testing I'm referring to was mainly to see what kind of issues bandwidth limitation would cause on a triple-monitor system; they found virtually no bandwidth issues at resolutions between 1920x1080 and 2560x1600 with last generation's highest-performance technology. I swear it was linked here somewhere recently. GPU thread, maybe. Edit: Found it, it was posted in the Anandtech comments: a video card bandwidth test of PCI-e 2.0 16x vs. 8x vs. 4x with GTX 480s. Agreed fucked around with this message at 18:48 on Aug 7, 2012 |
# ? Aug 7, 2012 18:28 |
|
Alereon posted:A Thunderbolt port is still only equivalent to PCI-E 2.0 4X. Better than Expresscard and good enough for basic desktop work, but still a major bottleneck. Right. And like others have mentioned, it's not graphics cards that are the bottleneck here, it's professional stuff like capture, transcoding, and RAID controllers. Our typical Mac Pro setup has an AJA Kona 3 video card and an ATTO RAID card, which both require a 4x slot. The RED suite has a RED Rocket that requires an 8x slot, plus the aforementioned RAID card. Apple keeps telling us Thunderbolt is the solution to all of our professional problems, and that just is not the case right now.
|
# ? Aug 7, 2012 22:31 |
|
mayodreams posted:Right. And like others have mentioned, its not graphics cards that are the bottleneck here, its professional stuff like capture, transcoding, and RAID controllers. Our typical Mac Pro setup has a AJA Kona 3 video card and an ATTO RAID card, which both require a 4x slot. The RED suite has a RED rocket that requires an 8x slot and the aforementioned RAID card. That reminds me of this demo from a while back: http://tv.adobe.com/watch/davtechtable/tech-demo-thunderbolt-macbook-air-red-rocket/
|
# ? Aug 7, 2012 22:45 |
|
japtor posted:That reminds me of this demo from a while back: Right, I've seen that, and they are not using any kind of storage along with that either. The BlackMagic box is also not as powerful as the Kona3. I'm not saying it won't work in any capacity, it's just not the miracle Apple claims it to be.
|
# ? Aug 7, 2012 23:16 |
|
I asked in the system building thread and didn't get an answer, so I figured I'd try here since it's relevant. I need a new mobo, and since I have an i7 920 I need an LGA1366 board, which is basically the equivalent of unicorn poo poo from a UK vendor unless I want to spend £200 on a board or play eBay roulette. For maybe another 30% more than the cost of a 1366 board I can get a 3000-series i5 and a decent LGA1155 board, which I don't mind doing IF I can really justify replacing my still pretty respectable overclocked 920 and spending the extra money. There's the problem of dropping a bunch of money on an already obsolete socket to consider, but the 920 has kept up surprisingly well for me and I feel kind of sick dumping a perfectly fine processor (that still has some unexplored OC headroom) for what looks like a sidegrade at best. Any thoughts?
|
# ? Aug 7, 2012 23:40 |
|
DicksToAsses posted:Any thoughts?
|
# ? Aug 8, 2012 01:26 |
|
mayodreams posted:Right, I've seen that, and they are not using any kind of storage along with that either. The BlackMagic box is also not as powerful as the Kona3. I'm not saying it won't work in any capacity, it's just not the miracle Apple claims it to be. Speaking of Kona though: http://www.aja.com/en/articles/183/
|
# ? Aug 8, 2012 02:20 |
|
For consumers, we just have to wait for the cost of fiber to come down, or see what Intel does with regards to their next generation TB controllers while still operating over copper. Having PCIe 3.0 x4 over copper would finally be enough to drive all but the most demanding resolutions/games, I think. I know I could do PCIe 3.0 x4 using a QSFP and some PLX hardware, and I'd love to try it if I had a spare $2k lying around. Really, we should have a bored goon here benchmark some stuff at x4/x8/x16 to get a better idea of what's going on; that HardOCP article is pretty old.
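For reference, the reason gen 3 x4 would be such a jump: PCIe 3.0 swapped 8b/10b encoding for 128b/130b, so going from 5 GT/s to 8 GT/s nearly doubles usable bandwidth rather than just gaining 60%. A quick sketch (rates and encodings per the PCIe specs):

```python
# PCIe 2.0 uses 8b/10b encoding (80% efficient); PCIe 3.0 uses 128b/130b
# (~98.5% efficient), so the 5 -> 8 GT/s bump nearly doubles throughput.

def usable_gbytes_per_s(gt_per_s, lanes, encoding):
    """One-direction bandwidth in GB/s for a PCIe link."""
    payload_bits, line_bits = encoding
    return gt_per_s * (payload_bits / line_bits) * lanes / 8

gen2_x4 = usable_gbytes_per_s(5.0, 4, (8, 10))     # Thunderbolt-class link
gen3_x4 = usable_gbytes_per_s(8.0, 4, (128, 130))  # hypothetical next-gen TB

print(f"PCIe 2.0 x4: {gen2_x4:.2f} GB/s")
print(f"PCIe 3.0 x4: {gen3_x4:.2f} GB/s")
```

That works out to 2 GB/s versus roughly 3.94 GB/s per direction, right around the x8 gen 2 level where the GTX 480 testing above found scaling losses become negligible.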
|
# ? Aug 8, 2012 03:09 |
|
How do you throttle a PCIe slot, anyway?
|
# ? Aug 8, 2012 05:47 |
|
Factory Factory posted:How do you throttle a PCIe slot, anyway? I think you can tape off some of the pins to force it to run at a lower speed. e: or just cut them http://allthemods.com/userinfo.php?userid=238&id=6246 e2: or find one of these adapters for 4x/8x/16x and mod it like they do and put a 16x card in it http://www.kegetys.fi/forum/index.php?topic=752.0 text editor fucked around with this message at 06:04 on Aug 8, 2012 |
# ? Aug 8, 2012 06:00 |
|
Factory Factory posted:How do you throttle a PCIe slot, anyway? If I recall correctly, the LTSSM (Link Training Status State Machine) is responsible for initiating link width up- or downsizing. Electrically, as many people have noticed or done, if you only have a certain number of the pins electrically connected, that will force a certain link width. You could have the end-point device also report certain link widths in software if you wished. (I think I've done this before) Certain devices, like modern Intel PCHs, are also capable of linking up, say, a x4 link if the lanes are ordered from 0-3 or 3-0, which makes life really simple for layout.
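Mechanically, the negotiated width ends up being the widest width both ends support across the lanes that actually trained, which is why the pin-taping trick works at all. A toy model of just that width-selection logic (not the real LTSSM state machine, purely illustrative):

```python
# Toy model of PCIe link-width negotiation: both ends advertise the link
# widths they support, and only electrically connected lanes can train.
# The link comes up at the widest mutually supported width that fits.

def negotiate_width(host_widths, device_widths, connected_lanes):
    """Widest width supported by both ends that does not exceed the
    number of electrically connected lanes; None if no link is possible."""
    common = set(host_widths) & set(device_widths)
    usable = [w for w in common if w <= connected_lanes]
    return max(usable) if usable else None

# A x16 card in a x16 slot with all pins connected:
print(negotiate_width([1, 4, 8, 16], [1, 4, 8, 16], 16))  # 16
# Same card with lanes 4-15 taped off (the mod discussed above):
print(negotiate_width([1, 4, 8, 16], [1, 4, 8, 16], 4))   # 4
```

In real hardware the downtrain happens during link training, so the taped card simply enumerates as a x4 device and the OS is none the wiser.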
|
# ? Aug 8, 2012 06:19 |
|
Well, I'd be willing to do some modding and investigate if someone were willing to supply me with 1) some GeForce 680s, 2) a 2560x monitor, and 3) a Z77 board and IVB CPU for PCIe 3.0 support. I promise it would be a REALLY GOOD writeup.
|
# ? Aug 8, 2012 06:30 |
|
text editor posted:e2: or find one of these adapters for 4x/8x/16x and mod it like they do and put a 16x card in it Dear lord, I've tested dongles of much shorter length than that which had pretty lovely electrical characteristics. I'd hate to see what those are like. These are much less lovely and don't require cutting anything: http://www.startech.com/PCI-Express-x1-to-Low-Profile-x16-Slot-Extension-Adapter~PEX1TO16
|
# ? Aug 8, 2012 06:58 |
|
WhyteRyce posted:Dear lord, I've tested dongles which much shorter length than that which had pretty lovely electrical characteristics. I'd hate to see what those are like Yeah, that ribbon cable... I almost want to buy one of those and slap a probe on it to see how lovely it is. I mean, granted, it's only six critical wires (TXP/TXN, RXP/RXN, REFCLK+/-), but I imagine the impedance discontinuities/etc. are not helping things. The eye probably looks really gross.
|
# ? Aug 8, 2012 07:10 |
|
movax posted:For consumers, we just have to wait for the cost of fiber to come down, or see what Intel does with regards to their next generation TB controllers and still operating over copper. Having PCIe 3.0 x4 over copper would finally be enough to drive all but the most demanding resolutions/games, I think. I know I could do PCIe 3.0 x4 using a QSFP and some PLX hardware, and I'd love to try it if I had a spare $2k laying around. Is this new enough for you? http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa quote:The good news is that even at 2GB/sec the bottlenecking is rather limited, and based on our selection of benchmarks it looks like a handful of games will be bottlenecked.
|
# ? Aug 8, 2012 08:03 |
|
Look at this Lexus of Z77 mobos. Tons of add-on goodies and a PCIe switch for x8/x8/x8/x8 graphics
|
# ? Aug 15, 2012 01:30 |
|
movax posted:Look at this Lexus of Z77 mobos. Tons of add-on goodies and a PCIe switch for x8/x8/x8/x8 graphics That little SSD is the most useless thing...
|
# ? Aug 15, 2012 02:02 |
|
hobbesmaster posted:That little SSD is the most useless thing... If it had been a 64GB Samsung PM830 then I would call that a pretty smart deal, but I guess it's kind of useful as a dedicated pagefile drive.
|
# ? Aug 15, 2012 03:20 |
|
Alereon posted:If it had been a 64GB Samsung PM830 then I would call that a pretty smart deal, but I guess it's kind of useful as a dedicated pagefile drive. Now that I think about it, bundling an SSD which is going to rapidly depreciate in value with a motherboard that might spend some time on a shelf is a pretty poor choice; I guess they're hoping to move these things quickly. Now they should sell a model where you can bring your own mSATA card.
|
# ? Aug 15, 2012 08:08 |
|
incoherent posted:Now they should sell a model where you can bring your own mSATA card. I thought they did; with this one, of course, you can just replace the LiteOn drive with something else, but I swear I remember another member of their alphabet soup of mobos that had an mSATA port open for a little cache drive (right when Z68 launched, I think).
|
# ? Aug 15, 2012 15:02 |
|
movax posted:I thought they did; this one of course you can just replace the LiteOn drive with something else, but I swear I remember another member of their alphabet soup of mobos that had a mSATA port open for a little cache drive (right when Z68 launched I think). I believe all the Z77 ROG (Maximus V Gene, Formula, and Extreme) boards have a PCIe/mSATA module that can take WLAN of your choice on one side and mSATA of your choice on the other. Comes pre-stocked with a WLAN card but not with an mSATA drive.
|
# ? Aug 15, 2012 15:06 |
|
hobbesmaster posted:That little SSD is the most useless thing... I ran Ubuntu for a long time with a 30GB partition for / and it never got more than half full, but if you're running Linux you probably don't need the creme de la creme motherboard. And if you're buying that motherboard you're probably expecting to be doing a lot of gaming and therefore running Windows primarily. I guess you could put the page file on it and/or keep a small linux install on it in case you gently caress up your windows install and need to look poo poo up to fix it.
|
# ? Aug 15, 2012 20:30 |
|
pienipple posted:I ran Ubuntu for a long time with a 30GB partition for / and it never got more than half full, but if you're running Linux you probably don't need the creme de la creme motherboard. And if you're buying that motherboard you're probably expecting to be doing a lot of gaming and therefore running Windows primarily. Huh, dual booting to a Linux install would be a pretty good use for it.
|
# ? Aug 15, 2012 20:36 |
|
It wouldn't be a bad place to stash bootable diagnostic tools, either, especially if you could disable that port when it's not needed to keep it from being compromised.
|
# ? Aug 15, 2012 22:26 |
|
hobbesmaster posted:That little SSD is the most useless thing... It's useful if you want to use it as a cache to back a mechanical drive. But, if you're buying a $450 super deluxe motherboard, you're probably going to have a nice big SSD to boot from, too.
|
# ? Aug 16, 2012 04:21 |
|
Here is a decent whitepaper from Intel about the boot process/initialization process for their x86 CPUs, if anyone is curious as to what black magic the BIOS does.
|
# ? Aug 17, 2012 17:17 |
|
Since there's already been talk about Haswell memory, has anything more surfaced yet about the transactional memory extension? It's not going to be server market only, is it?
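For anyone unfamiliar, the programming model the TSX/RTM extension is expected to expose is roughly "run a critical section speculatively; if the hardware detects a memory conflict, abort and retry, eventually falling back to a plain lock." A purely conceptual sketch of that pattern in software (no actual hardware transactions here; real TSX does the conflict detection in the cache coherence hardware, and the class and version-check scheme below are my own illustration):

```python
import threading

# Conceptual model of the RTM/TSX pattern: attempt an optimistic update,
# detect conflicts, retry on abort, and fall back to a conventional lock
# after repeated failures.

class VersionedCounter:
    def __init__(self):
        self.value = 0
        self.version = 0
        self.lock = threading.Lock()

    def increment(self, retries=3):
        for _ in range(retries):
            # Optimistic "transaction": read a snapshot without the lock...
            snap_value, snap_version = self.value, self.version
            new_value = snap_value + 1
            # ...then commit only if no other writer intervened
            # (a software stand-in for an RTM abort check).
            with self.lock:
                if self.version == snap_version:
                    self.value = new_value
                    self.version += 1
                    return True  # committed on the speculative path
            # Conflict: someone wrote in between. Retry, like an RTM abort.
        # Fallback after repeated aborts: plain locked update.
        with self.lock:
            self.value += 1
            self.version += 1
        return False
```

The appeal for Haswell is that uncontended critical sections would pay almost nothing, while the fallback path keeps correctness under contention, which is why people are curious whether it will ship outside the server parts.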
|
# ? Aug 18, 2012 09:45 |
|
movax posted:Yeah, that ribbon cable I almost want to buy one of those and slap a probe on it to see how lovely it is. I mean granted it's only six critical wires (TXP/TXN, RXP/RXN, REFCLK+/-), but I imagine the impedance discontinuities/etc are not helping things. The eye probably looks really gross. I'd be curious, since I just bought one of these to move a x1 eSATA card to a physical slot below my motherboard (mATX in an ATX case). The x16 version and x1->x16 versions of these were being used very heavily by the bitcoin mining community to max out the number of GPUs they could put in a single computer. In fact, when trying to find any stats at all about the reliability of these things, I couldn't find a single reference to their use outside of bitcoin mining.
|
# ? Aug 19, 2012 05:12 |
|
I had a thought about using one to get a sound card into a mini-ITX build via a mini-PCIe slot, but I'm stumped on mounting an expansion card in a 5.25" bay. That seems to be a need that even StarTech hasn't addressed.
|
# ? Aug 19, 2012 05:16 |
|
Chuu posted:The x16 version and x1->x16 versions of these were being used very heavily by the bitcoin mining community to max out the number of GPUs they could put in a single computer. In fact, when trying to find any stats at all about the reliability of these things, I couldn't find a single reference to their use outside of bitcoin mining. Hey, it's only correctable errors anyway, who cares. And ASPM? Why would bitcoiners even deal with that. Have any of those guys ever explored using a full-fledged PCIe expansion system? They're pricier than those poo poo dongles, but I figured cost was no object to them.
|
# ? Aug 19, 2012 07:08 |
|
This is a really interesting look at CPU performance in gaming. I am surprised there is that much difference in latency between Intel and AMD. http://arstechnica.com/information-technology/2012/08/inside-the-second-gaming-performance-with-todays-cpus/
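The methodology in that article (building on Tech Report's "inside the second" work) boils down to looking at per-frame times and their worst-case percentiles instead of average FPS. The gist in a few lines (the sample frame times are made up for illustration):

```python
# Average FPS hides stutter: two runs can have the same mean frame time
# while one has far worse worst-case frames. Frame-time percentiles, as
# in the "inside the second" methodology, expose the difference.

def frame_time_stats(frame_times_ms):
    times = sorted(frame_times_ms)
    mean = sum(times) / len(times)
    p99 = times[min(len(times) - 1, int(0.99 * len(times)))]
    return {
        "avg_fps": 1000 / mean,
        "mean_ms": mean,
        "99th_percentile_ms": p99,
    }

smooth  = [16.7] * 100              # steady ~60 FPS
stutter = [15.0] * 99 + [185.0]     # same average, one huge spike

print(frame_time_stats(smooth))
print(frame_time_stats(stutter))
```

Both runs report roughly 60 FPS on average, but the second has a 99th-percentile frame over ten times longer, which is exactly the kind of latency gap between the Intel and AMD parts that the averages were hiding.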
|
# ? Aug 24, 2012 17:56 |
|
mayodreams posted:This is a really interesting look at CPU performance in gaming. I am surprised there is that much difference in latency between Intel and AMD. It was not far-fetched to imagine Bulldozer wouldn't beat Sandy Bridge back then, but the thing nobody expected was that it would be worse than their own previous products. It really was a poor show.
|
# ? Aug 24, 2012 18:00 |
|
Well to be fair, technically there are a few random arcane workloads that Bulldozer performs better in. But even there it comes at the expense of higher power draw.
|
# ? Aug 24, 2012 18:44 |
|
|
mayodreams posted:This is a really interesting look at CPU performance in gaming. I am surprised there is that much difference in latency between Intel and AMD.
|
# ? Aug 24, 2012 18:50 |