|
rscott posted:That is a slot 1 cooler sir. Yes, but you could buy a socket 370 to slot 1 adapter. I forget if that cooler actually worked on the adapter, though.
|
# ? Sep 22, 2010 19:35 |
|
The Athlon XP days of huge rear end CPU coolers so the box didn't sound like a hair dryer. I had the aluminum/copper Zalman beast with the 120mm fan. It needed its own little support brackets that bolted to the board, and it just BARELY cleared the PSU. The stock cooler on my 64 X2 is quieter and smaller, and keeps everything at a satisfactory temp.
|
# ? Sep 22, 2010 20:09 |
|
COCKMOUTH.GIF posted:Look at us watching old people fall down stairs on YouTube via our Google Android phones. overclocked to 1.2ghz on a broadband connection in the woods "Can I get high on these mushrooms? Cmon google goggles, tell me yes!"
|
# ? Sep 22, 2010 20:53 |
|
WhyteRyce posted:Incorrect. The best cooler was the Glacier 4500C with the Arctic cap Haha I had a cooler similar to that on my 300A except it had three (3!) fans across the width of the slot. Worked great! I eventually traded that whole computer, case and all for a linksys wireless router. Doh!
|
# ? Sep 22, 2010 22:15 |
|
VOS32 was the best cooler from that time period IIRC. Giant hunk of aluminum almost half again as big as the entire Slot A module, and it only cost like $25 too. Put a few 80mm fans on it (or the ever-popular 10k RPM Black Label Delta screamers) and you were set to get those 600-700MHz Athlons up to nearly 1GHz. More if you were lucky. I think Golden Orbs and Alpha heatsinks were much more popular back then, though. Old skool OC chat is awesome, but anyone know how much those K-series SBs will cost anyways?
|
# ? Sep 22, 2010 23:48 |
|
I can still remember penciling in the L1 bridges on my duron.
|
# ? Sep 23, 2010 00:16 |
|
PC LOAD LETTER posted:old skool OC chat is awesome but anyone know how much those K series SB's will cost anyways?
|
# ? Sep 23, 2010 00:27 |
|
rscott posted:Yeah I was just sperging out because I miss the old days of using graphite pencils to unlock extra multipliers or setting jumpers to 2x to get 6x multipliers on my old super socket 7 boards.

That's the "old days"? Bah, you kids these days have no appreciation for the times when we had to desolder the oscillator module on the motherboard and replace it with a higher-frequency one. Of course, there were no provisions for mounting a cooler on the 386's socket, so you had to get creative with thermal adhesives (not easy to find in those days) and heatsinks designed for other devices. But, on the other hand, a 386-33 at 40MHz could be as fast as a low-end 486!

spanko posted:This isn't true anymore for a lot of popular games.

Mind providing some examples? Of course, bottlenecks are going to depend on the exact hardware configuration, but a system built right now with a roughly even balance between video card, CPU, and monitor is almost certain to bottleneck on the video card in any game recent enough that bottlenecks matter. Yeah, if you run an Athlon II X2 and a pair of GTX 480s on a 1280x1024 display it won't be the CPU holding you back, but as long as everything's in the same rough category ("budget," "midrange," "high-end," or "I burn piles of money for laughs") the CPU is rarely the limiting factor.

Admittedly, CPU bottlenecks can be a more serious concern as your system ages: it's generally possible to dodge a video card bottleneck that leads to unacceptable performance just by dropping settings, but a CPU bottleneck is much harder to work around. However, that's one place where overclocking is still a viable solution, at least for now. We'll see how it plays out as the market moves away from single-core performance and towards parallelism that can't be made up so easily by just cranking up the clocks.
|
# ? Sep 23, 2010 00:56 |
|
My experience is that the video card will tend to be the bottleneck most of the time, except under heavy load, where the CPU can frequently bottleneck. Most notably in the badly optimized new content in TF2: I had to overclock my C2D from 2.33GHz to 3GHz to get acceptable (not even great) frame rates on koth_viaduct, on a 9600GT, which should digest the game at medium settings at 1680x1050.
|
# ? Sep 23, 2010 01:45 |
|
rscott posted:Best HSFU for skt370 was the golden orb. It had a 60mm fan! Years later you came to realize what a piece of poo poo it was, as it had very little surface area compared to a normal-looking cooler. It did look 'cool' though..
|
# ? Sep 23, 2010 02:00 |
|
I bought a "huge" Thermalright AX-7 for my Athlon... it was big enough for an 80mm fan! I had to dremel off the bottom of a couple fins to fit it on my motherboard.
|
# ? Sep 23, 2010 04:32 |
|
rscott posted:Tualatin wasn't officially supported on 440BX but I had an Abit BH6 that had no problem running up to 150MHz. Basically the only reason I went from 440BX to 815 was my Radeon 8500 couldn't tolerate the overclocked AGP bus like my old GeForce 2 did. Yeah, and there were factory 440BX boards made expressly for a 133MHz FSB, with the right multipliers to keep AGP/PCI speeds correct. I had an MSI BX-Master like this - still have it, somewhere. Alereon posted:
gently caress VIA and gently caress that MOTHERFUCKING SHITSUCKING ASSDILDO of a chipset, those goddamn Apollo Pros had the IDE transfer rate of a sedated snail, a total inability to be stable using a Sound Blaster card, and general instability under heavy multitasking when you were hitting the HD. People everywhere bitched for years about it, VIA kept releasing new 4-in-1's that did absolutely nothing but give me temporary hope. Christ, that stupid chipset gave more hassles than any computer hardware I ever used, before or since. I had a dual-proc P-III/933 box that should have been the cat's meow, but that goddamn chipset was a constant thorn in my side. And gently caress Intel, too, I didn't have $1000 to buy an 820 or 840 board and RDRAM.
|
# ? Sep 23, 2010 06:01 |
|
You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. It'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer.
|
# ? Sep 23, 2010 13:28 |
|
mew force shoelace posted:You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. It'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer. JnnyThndrs posted:Yeah, and there were factory 440BX boards made expressly for a 133MHz FSB, with the right multipliers to keep AGP/PCI speeds correct. I had an MSI BX-Master like this - still have it, somewhere.
|
# ? Sep 23, 2010 14:10 |
|
If you want to take apparent slowness out of a computer, just get an SSD.
|
# ? Sep 23, 2010 14:57 |
|
Alereon posted:
That's true of the AGP, but not the PCI. I had an ABit BM6 (440BX) which had a 1/4 PCI divider, so you could run the PCI bus at 33MHz while the FSB was at 133. Your AGP would be a little over 88MHz at that point, but a lot of cards could take that.
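For the curious, the divider arithmetic being described works out like this (a quick sketch; the 2/3 AGP divider was the best the 440BX offered, and exact divider options varied by board):

```python
# Bus speeds on an overclocked 440BX board, per the dividers
# described above (illustrative; actual boards varied).
fsb = 133  # MHz, overclocked front-side bus

pci = fsb / 4      # 1/4 PCI divider keeps PCI near its 33MHz spec
agp = fsb * 2 / 3  # 2/3 AGP divider: well over the 66MHz spec

print(f"PCI: {pci:.2f} MHz")  # 33.25 MHz
print(f"AGP: {agp:.2f} MHz")  # 88.67 MHz
```

That ~89MHz AGP clock is exactly the out-of-spec stress being blamed for the Radeon 8500's troubles a few posts up.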
|
# ? Sep 23, 2010 16:05 |
|
Somebody please open an OC nostalgia thread. That picture of the Slot A Athlon with the little debug module (gold finger something?) can only be topped by a Celeron 300A with a peltier/air cooler. Content: I think I'm going to wait for the Sandy Bridge update of Apple's MBP line before I replace this 17" C2D. Should arrive roughly one year from now, no?
|
# ? Sep 23, 2010 16:44 |
|
Space Gopher posted:Mind providing some examples? SC2, WoW, Dragon Age, Mass Effect 2, Left 4 Dead (most Source engine games), and most MMOs are CPU-limited. In fact I'd say in general most new RPG/RTS games are CPU-limited on a $180-$220 video card, but for FPS games like AvP, Crysis, Bioshock, etc., what you said is true and they are GPU-limited. My main point is you are way better off spending extra to get a quad-core CPU than spending $250 or more on a video card, unless you mainly play the bleeding-edge FPSes.
|
# ? Sep 23, 2010 17:02 |
|
My first overclock was when I built my first computer: a Tualatin Celeron 1100A on an 815 motherboard (Soyo SY-TISU), both pulled out of the garbage. Got that sucker running at 1400MHz. This was around when the 2.53GHz Northwood P4 was a mid-range part, so I was pretty behind the times, but I was proud since most of the parts were scavenged and the only thing I actually bought was a GeForce4 4200. How did I cool this monstrosity? Well, the case didn't have any of its side panels, so I just stuck a small window fan next to it.
|
# ? Sep 23, 2010 17:05 |
|
Just reading the thread and I'm not too up on hardware, but thought I'd ask a few questions. My impression was that stuff like GPGPU was a big boon to low cost high performance computing applications like high frequency trading and scientific computing. What sort of market share do those applications represent and are they enough to sustain the discrete card makers? Also does this mean by this time next year that the sweet spot build in the system building thread might not have a discrete card?
|
# ? Sep 23, 2010 17:25 |
|
pseudointellectual posted:My impression was that stuff like GPGPU was a big boon to low cost high performance computing applications like high frequency trading and scientific computing. What sort of market share do those applications represent and are they enough to sustain the discrete card makers? quote:Also does this mean by this time next year that the sweet spot build in the system building thread might not have a discrete card?
|
# ? Sep 23, 2010 17:32 |
|
mew force shoelace posted:You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. It'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer. Yeah, on an application page fault what you really want to do is go off-chip for the handler instead of just running the OS right on that core. That'd make things zippy.
|
# ? Sep 23, 2010 18:02 |
|
It'd be even better if you could have it pull down the pages directly from Microsoft or kernel.org. Trusted computing, gently caress yeah.
|
# ? Sep 23, 2010 20:00 |
|
WhyteRyce posted:The 300As had a near guaranteed overclock to 450. The 366s had a higher multiplier which meant it had to go to 550 if you wanted a FSB of 100mhz, but the success rate of getting those to 550 was much lower than getting the 300s to 450. I hit 550. The last great Celeron processor: the only low-budget processor that outperformed the flagship CPU, thanks to its on-die L2 cache. The prelude to Coppermine.
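For anyone who wasn't there: those Mendocino Celerons had locked multipliers, so the whole trick was raising the front-side bus. A quick sketch of the arithmetic (the "66MHz" bus was nominally 66.6MHz):

```python
# Core clock = FSB x locked multiplier (Mendocino-era Celerons).
def core_clock(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

# Celeron 300A: locked 4.5x multiplier
print(core_clock(66.6, 4.5))  # ~300 MHz stock
print(core_clock(100, 4.5))   # 450 MHz, the near-guaranteed overclock

# Celeron 366: locked 5.5x, so a 100MHz FSB meant hitting 550
print(core_clock(100, 5.5))   # 550 MHz
```

Which is exactly why the 366's jump was the bigger gamble: same FSB bump, but the higher locked multiplier demanded 100MHz more out of the core.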
|
# ? Sep 24, 2010 12:56 |
|
freeforumuser posted:Let's face it, the only real apps left that are still primarily CPU limited are rendering and video encoding. Interestingly, both apps lend themselves well to massively parallel processing on GPUs, same for gaming physics. And now, we see Intel and AMD are pushing with CPUs with integrated GPUs. Coincidence? Me thinks no and let me proclaim the multicore era is already over and welcome our new GPU-dominant processor overlords. Video encoding, if you're looking for good quality, can't use very many threads. Frames reference each other in H.264; if you're encoding hundreds of frames at once, they obviously can't do that effectively, so you have to use another method of threading, like splitting frames into slices and encoding each frame with multiple threads. Sliced threading is inherently lower quality. On the whole, CPUs are better for video encoding. There are certain things GPUs can do better than CPUs in video encoding; there's been some study into what they can do, such as motion estimation (see the qual. task too). Note that it is listed as very difficult and needing a large amount of work. There have also probably been lots of papers, but papers are just theory and are very often not practical. Thanks a lot for making me sperg
|
# ? Sep 24, 2010 13:18 |
|
Aleksei Vasiliev posted:Video encoding, if you're looking for good quality, cannot have very many threads. Frames reference each other in H.264. If you're encoding hundreds of frames at once, they obviously can't effectively do that; you'll have to use another method of threading, like slicing frames into slices and encoding each frame with multiple threads. Sliced threading is inherently lower quality. Couldn't the GPU be used for a really-fast first pass, since it can look at tons of frames in parallel?
|
# ? Sep 24, 2010 15:05 |
|
TOOT BOOT posted:Couldn't the GPU be used for a really-fast first pass, since it can look at tons of frames in parallel? The last method tried was to move x264's lookahead thread onto the GPU; this runs about 50 frames ahead of actual encoding and is used to decide some things like frame types and visual importance. It also avoids one of the worst problems with GPGPU: the motion decisions and so on are chosen just as much for how well the MVs compress as for how well they match. This is really easy when you run all the decisions in the same order they're being encoded in... but GPUs do everything at the same time, so that doesn't work. Badaboom etc. most likely just don't bother doing any of it, so they're fast because they suck. Anyway, the two people working on it still haven't finished after months, so it can't be that easy...
|
# ? Sep 24, 2010 15:33 |
|
A couple of welcome updates: Sandy Bridge's 6-series chipsets will have native USB 3.0 support after all. Both notebook and desktop platforms will have it; no idea on how many ports, unfortunately. Apparently Intel kept this on the down-low because they weren't sure their native implementation would pass qualification, but it did.

Sandy Bridge graphics details, including branding: graphics on Sandy Bridge will come in two flavors, the 6 Execution Unit version called Intel HD Graphics 100 and the 12 EU version called Intel HD Graphics 200. HD Graphics 200 will only be available on Core i5 2500 and i7 2600 desktop processors (and presumably higher), though according to previous information all mobile processors will get HD Graphics 200. This does appear to confirm that Anandtech tested the 12 EU version.
|
# ? Oct 7, 2010 12:19 |
|
I just want it all to come out already so I can upgrade my PC for the first time in years
|
# ? Oct 7, 2010 16:29 |
|
Same here. I hope they release them in January but that's just wishful thinking. Which CPU got that 5GHz overclock on air cooling with a minor voltage increase? I'm hoping it was the i5-2500K because I want one of those so bad.
|
# ? Oct 7, 2010 22:33 |
|
I hope this chipset is finally the one to force motherboard makers to UEFI.
|
# ? Oct 8, 2010 02:50 |
|
incoherent posted:I hope this chipset is finally the one to force motherboard makers to UEFI. There are rumors that the next version of Windows will require UEFI, though I doubt that will happen, since almost no one has it right now and very few computers could be upgraded. It would be nice if they'd come out and say that the version after that will require it, though; that gives plenty of time for hardware manufacturers to get up to speed, and for most computers worth upgrading the OS on to have support.
|
# ? Oct 8, 2010 22:56 |
|
Very doubtful. The people who started that rumor are probably the same ones saying Microsoft will "allow" it next year, when it's been an official boot option since Vista SP1. It's doubtful because a major reason there's still a 32-bit version of Windows is that, up until a year or so ago, a decent number of Intel chips were 32-bit only. There's no way Microsoft would ignore millions of machines that could run the next OS fine but can't because of some technicality.
|
# ? Oct 9, 2010 00:09 |
|
wolrah posted:There's rumors that the next version of Windows will require UEFI, though I doubt that will be the case since almost no one has it right now so very few computers could be upgraded. It would be nice if they'd come out and say that the version after will require it though, that gives plenty of time for hardware manufacturers to get up to speed and for most computers that would be worth upgrading the OS on to have support. The only thing companies will do if you give them a couple extra years like that is spend a couple years hoping you change your mind.
|
# ? Oct 9, 2010 00:39 |
|
Zhentar posted:The only thing companies will do if you give them a couple extra years like that is spend a couple years hoping you change your mind. At least it will be more clear-cut than the Vista driver support fiasco: it'll either boot or it won't. That said, all those people like me with decent gaming rigs and no plans to upgrade will probably be able to run Windows 8 just fine, and will hold off on upgrading if UEFI is a requirement.
|
# ? Oct 9, 2010 00:41 |
|
Remember that UEFI is required in order to boot from HDDs larger than 2TB, and since we currently have 3TB HDDs that are relegated to external storage applications, that's kind of important.
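The 2TB figure isn't arbitrary, by the way: it falls out of the legacy MBR partition table that UEFI's GPT scheme replaces. MBR stores partition start and size as 32-bit sector counts, and with 512-byte sectors the arithmetic caps out right at 2TiB:

```python
# Why MBR tops out around "2TB": partition start/size are stored
# as 32-bit sector counts, and legacy drives use 512-byte sectors.
max_sectors = 2**32
sector_bytes = 512

limit = max_sectors * sector_bytes
print(limit)           # 2199023255552 bytes
print(limit / 2**40)   # exactly 2.0 TiB
print(limit / 10**12)  # ~2.2 TB in the decimal units drive makers use
```

Anything past that address simply can't be described by an MBR entry, which is why a 3TB drive needs GPT (and firmware that can boot from it) to be fully usable as a boot disk.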
|
# ? Oct 9, 2010 01:09 |
|
incoherent posted:I hope this chipset is finally the one to force motherboard makers to UEFI. MSI announced all their sandy bridge boards will be UEFI, don't know about other manufacturers. http://techpinger.com/2010/06/msi-working-on-uefi-that-will-kill-bios-in-three-years/
|
# ? Oct 9, 2010 02:08 |
|
incoherent posted:I hope this chipset is finally the one to force motherboard makers to UEFI. I hope so, because I'm tired of dealing with AMI's BIOS development environment and x86 assembly. Really looking forward to being lazy as poo poo and being able to use C for BIOS development.
|
# ? Oct 9, 2010 20:01 |
|
Alereon posted:Remember that UEFI is required in order to boot from HDDs larger than 2TB, and since we currently have 3TB HDDs that are relegated to external storage applications, that's kind of important. This is the primary example of why I'm looking for a uefi board.
|
# ? Oct 10, 2010 21:21 |
|
|
incoherent posted:This is the primary example of why I'm looking for a uefi board. Well, right now all the 3TB HDDs are 5400RPM, so not exactly something you'd want as a boot drive. Still, hopefully 1155/new AMD boards will make UEFI standard.
|
# ? Oct 11, 2010 07:38 |