|
Yeah, Windows 8.1 and 10 will run on any CPU that's at least 1 GHz and supports PAE, SSE2, and the NX bit. That means almost every Athlon 64 and most Intel chips after the later Prescott Pentium 4s. So as early as a fall 2003 computer with AMD, and as early as a summer 2004 computer with Intel. (There's also hacky poo poo where you install with a compatible CPU then swap in one that lacks the NX bit, which would let you run Windows 10 on CPUs from about 2001.)
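For the curious, a rough sketch of that feature check in Python. The flag names follow the Linux /proc/cpuinfo convention ("pae", "sse2", "nx"), and the example flag strings below are abridged and illustrative, not real dumps:

```python
# Required CPU features for Windows 8.1/10, as Linux flag names.
REQUIRED = {"pae", "sse2", "nx"}

def missing_features(cpuinfo_flags: str) -> set:
    """Return the set of required features absent from a flags line."""
    present = set(cpuinfo_flags.split())
    return REQUIRED - present

# An Athlon 64-era flag set (abridged): has all three, so it qualifies.
athlon64 = "fpu pae mce cx8 sse sse2 nx mmx 3dnow lm"
print(missing_features(athlon64))   # empty set: qualifies

# A circa-2001 Pentium III lacks both SSE2 and NX.
pentium3 = "fpu pae mce cx8 sse mmx"
print(missing_features(pentium3))   # missing sse2 and nx
```

On a real Linux box you'd feed this the `flags` line from /proc/cpuinfo; the split-and-compare logic is the same.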
|
# ¿ Apr 20, 2017 14:37 |
|
|
|
eames posted:If Destiny 2 is any indication then consoles are becoming CPU bottlenecked, which would explain why the game is locked to 30 fps at any resolution (correct me if I'm wrong). The Scorpio project has a major increase in GPU compute units as well as their speed, unlike the PS4 Pro that mainly upclocked the CPU/GPU and precisely doubled the GPU compute units.
|
# ¿ May 21, 2017 15:55 |
|
FaustianQ posted:Ryzen is apparently related to the cat cores, or at least began its design stages as "make cat cores better". I'm still flummoxed on why the Scorpio went with Jaguar and not Puma+ or even the rumored Basilisk if they were going to have an issue with Zen cores. It's not even like Zen would be particularly throttled in performance here; they could easily do 8 cores at 2.5GHz or 4 cores at 3.0GHz and still be inside the overclocked Jaguar core thermal envelope, Zen's just that much better of a design. The hardware had to be finalized by either the end of last year or the beginning of this year, in order to ramp up production of the actual hardware and have working devkits available for the game developers. An architecture that isn't out yet certainly wouldn't work, and Puma/Puma+ simply might not have been judged a worthwhile change compared to the option Microsoft actually took: customizing Jaguar cores further.
|
# ¿ May 22, 2017 20:45 |
|
Eletriarnation posted:Maybe the Atom qualifies? I know that a lot of the idea behind it was "Make a chip as simple as the original Pentium but running as fast as we can get it these days, plus any improvements which give you at least a 2:1 return on performance:wattage." The Atom design developed from the "Stealey" cores used in the A100 and A110 processors in 2007, meant for use in those weird Ultra Mobile PC things that were a thing for a few years before netbooks. One of the only devices they went into was the HTC Shift. The Stealey cores were based on Dothan Pentium M-based Celeron CPUs, obviously heavily cut down and slowed down to fit the 5 watt power envelope. So Atom wasn't really a new microarchitecture; it was just another variant of the P6, with more similarity to older designs in that family. Also remember that with Atom and the Stealey cores of the year before, Intel gave the directive to just cut power by any means possible while maintaining compatibility. Once they got that with the Stealey core, they spent time refining Atom to slowly bring back performance without massively increasing power. And then they eventually gave up on that line in favor of the new Atom chips that derive from the Core i series.
|
# ¿ May 28, 2017 00:14 |
|
GRINDCORE MEGGIDO posted:I'm surprised AMD don't sell a lightweight compute card that's just a standard card, cheaper, with no video outputs, and separate the markets. Don't they, for the normal high performance computing market?
|
# ¿ May 30, 2017 03:21 |
|
FaustianQ posted:So basically say gently caress it, embrace that buttcoin miners won't stop doing this and sell buttcoin edition cards that are theoretically useless for anything but compute? They're more expensive upfront, but that's partially because they're low volume sellers, and partially because they're a decent bit more powerful for pure compute tasks than the typical graphics cards of equivalent chipsets.
|
# ¿ May 30, 2017 03:40 |
|
eames posted:From what I understand "mining" efficiency (coin per watt) is always on a steady decline, the price per coin is highly volatile and power costs are constant. The only thing I'd add is that there's constantly "new" coins being made, and a certain kind of sucker buying GPUs for the new stuff before anyone has had time to figure out the return vs. electricity ratios, so that drives demand even when prices aren't particularly good for the main established coins.
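The coin-per-watt math being argued about here is just revenue minus power cost. A back-of-envelope sketch, with every number hypothetical (real hashrates, rewards, and prices swing wildly with difficulty and the market):

```python
def daily_profit(hashrate_mhs, watts, reward_per_mhs_day, coin_price, kwh_price):
    """Net USD/day for one GPU: mining revenue minus electricity cost."""
    revenue = hashrate_mhs * reward_per_mhs_day * coin_price
    power_cost = (watts / 1000) * 24 * kwh_price
    return revenue - power_cost

# A card doing 30 MH/s at 150 W, earning 0.0004 coin/MH/day,
# coin at $250, power at $0.12/kWh:
print(round(daily_profit(30, 150, 0.0004, 250.0, 0.12), 2))  # 2.57
```

The reward-per-hashrate term is the one that steadily declines as difficulty rises, which is why the same card drifts from profitable to not without anything changing on the buyer's end.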
|
# ¿ May 30, 2017 10:32 |
|
Truga posted:loving this. Websites are turning into steaming piles of poo poo that eat 1000 gigs of ram and 37 cores just by opening them. If you don't want to close your browser every time you want to play a game, buy something with 8 threads. Yeah, totally. That's why it's always funny when you see people asking for laptop suggestions for their mom or something and they say "oh, she's only going to be surfing Facebook!". That'll loving eat a huge chunk out of a decent CPU these days.
|
# ¿ Jun 11, 2017 16:18 |
|
thechosenone posted:
Sure, if you can find some way to cram dozens of gigabytes of high speed video RAM and a massive amount of cooling onto the CPU. It's a whole lot simpler to make that sort of thing work on a separate card.
|
# ¿ Jun 15, 2017 19:16 |
|
thechosenone posted:so like you could use stacked memory to get a lot of data, and stitch parts together or something, like some sort of super-chip inspired by Dr. Frankenstein. 1 gigabyte of video RAM is already insufficient for replacing dedicated GPUs, let alone for future uses by the time such a processor would be practical. Even 3 or 4 GB would be stretching things for something meant to replace most dedicated GPUs in use.
|
# ¿ Jun 15, 2017 19:44 |
|
Wirth1000 posted:Alright, so just got home and booted up into the UEFI (when the hell did UEFI replace BIOS?) Started about 12 years ago as far as PCs go, basically finished up by like 3 or 4 years ago.
|
# ¿ Jun 16, 2017 21:19 |
|
Palladium posted:I think most of us forgot laptop CPUs constitute maybe like 70% of Intel's consumer market and AMD has been non-existent there for a decade. A decade? They've been noncompetitive there a lot longer than that. ultrabay2000 posted:I wouldn't buy one but they seem to do alright in the Walmart laptop segment. Seems like a large chunk of cheap laptops I see have AMD CPUs. Those also sell less than you'd expect. There are tons of Atom, Pentium, and Celeron laptops going out nearly as cheap or even cheaper, but more widely available. And most consumers don't really understand that AMD's better integrated GPU might actually be better for what they want to do on a real low-end device.
|
# ¿ Jul 4, 2017 04:16 |
|
Khorne posted:I was real serious about this. The reason is twofold: one, you get a higher clock under half-load, and two, the FPU thing probably plays some role. Due to this processor's design you get insanely inconsistent run times for processes that run on the scale of a week or more. As in, "hey, why does my process that takes 7 days sometimes take 14 days?" You set process-processor affinity in Windows on a per-program basis. This guide uses Win 7, but the process is the same in Vista, 8, or 10. http://www.techrepublic.com/blog/windows-and-office/change-the-processor-affinity-setting-in-windows-7-to-gain-a-performance-edge/ Simply put, you add a command line option to your shortcuts which tells Windows which cores are allowed to be used. You may need to experiment first to find the correct sets that don't split FPUs. Unfortunately, you can't easily set this for everything on your system.
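For the "don't split FPUs" part: on construction-core (Bulldozer-family) chips, two integer cores per module share one FPU, so picking every other core gives each thread its own FPU. A sketch of building the hex bitmask that `start /affinity` expects; the helper name and the program path in the usage comment are made up for illustration:

```python
def one_core_per_module(num_cores: int) -> int:
    """Bitmask selecting cores 0, 2, 4, ... so no two picks share a module."""
    mask = 0
    for core in range(0, num_cores, 2):
        mask |= 1 << core
    return mask

mask = one_core_per_module(8)
print(hex(mask))  # 0x55 -> cores 0, 2, 4, 6 on an 8-core FX chip

# Usage in a shortcut (hypothetical program path):
# cmd /c start /affinity 55 "C:\sims\longrun.exe"
```

Whether the even or odd cores are the "first" core of each module can vary, which is why the post says to experiment.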
|
# ¿ Jul 6, 2017 04:54 |
|
SourKraut posted:Dividends are for companies in good financial situations. That's not AMD. AMD doesn't appear to have ever paid a dividend, at least since January 13, 1978, which is as far back as Google tracks their stock. Surely you wouldn't count them as never having been in a good financial situation since 1978?
|
# ¿ Jul 8, 2017 21:49 |
|
Combat Pretzel posted:"PCIe" is a registered trademark? wat? Probably the last major one that wasn't is ISA, and then only because that was a retroactive name applied after it'd already been in use for several years with no real name besides "the expansion bus IBM PCs use".
|
# ¿ Jul 28, 2017 00:00 |
|
Most of the big warehouse setups are in rural China, in places with dirt cheap electricity because generating capacity was overbuilt and the grid can't carry the power out to the bigger cities that need it (usually around new hydro dam installations). The warehouses get tossed together in a few days with no safety considerations at all, so it's no surprise when they catch fire.
|
# ¿ Aug 10, 2017 02:55 |
|
It was never really writing 40 GB a day, due to how disk caches work.
|
# ¿ Aug 21, 2017 19:34 |
|
NewFatMike posted:Alternate architecture chat is making me think about x86 emulation running on ARM that got Intel all in a tizzy earlier this year. I mean, that's been a thing for a long time. Its problem is that it compounds existing ARM performance issues with the speed penalty emulation imposes.
|
# ¿ Aug 26, 2017 02:48 |
|
MaxxBot posted:Explicitly parallel instruction computing design for CPUs just turned out to be a pretty lovely and unworkable solution and would have sucked even without x86 momentum. ARM on the other hand is inherently better but nowhere near enough to overcome said momentum. What's supposed to be inherently better about ARM that isn't a failed promise like PA-RISC, PowerPC, Alpha, etc?
|
# ¿ Aug 27, 2017 16:35 |
|
VostokProgram posted:Was DEC Alpha supposed to be super badass or something back in the day? I find mention of it in a lot of places but no explanation of why it was so interesting Ultimately DEC Alpha was important because DEC was important, and they positioned it as the direct successor to their popular VAX family of processors. Volguus posted:As far as I remember it was a RISC architecture and extremely fast. While I haven't used it personally, the rumor was that WinNT for x86 was faster on a VM on a DEC than on a native x86 cpu. This was true, but early NT was also specifically designed not to favor any particular CPU design, and x86 processors weren't all that fast themselves back in the day (conversely, DEC Alpha hardware was hardly inexpensive, and IIRC DEC never brought Alphas down to their "low-end" systems in their heyday). And Alpha support was only in NT 3.1/3.5/4.0, and by 4.0 things were already looking sketchy for the architecture. I guess you could think of it like this: what if Intel's CPU lines currently stopped at today's "Pentium"-branded chips, everything i3 and up and all the Xeons were absent, and they were also a few generations behind current? That's kinda what putting 486s or early Pentiums up against the DEC Alphas was like, with the Alphas in the role of the very high-end Intel chips of today. fishmech fucked around with this message at 16:15 on Aug 29, 2017 |
# ¿ Aug 29, 2017 16:12 |
|
Google's IPv6 usage statistics (gathered on the basis of how many connections they see to all their sites over v6 vs v4) are very interesting for that. For one thing, IPv6 adoption is consistently higher on Saturdays, Sundays and holidays than on normal workdays. For another, Belgium, the US and Greece are the top 3 IPv6 users, in that order. https://www.google.com/intl/en/ipv6/statistics.html
|
# ¿ Aug 30, 2017 17:52 |
|
Anime Schoolgirl posted:can't wait for a bastardized 4-6 core version of one of those to show up in the PS5 and Xbone Z Consoles won't go back from the 8 cores they have now. At the very least the Xbox Aleph-4 is going to keep an 8-core minimum, cuz Microsoft is all about the backwards compatibility stuff. And then Sony would be hard pressed to keep parity with a 4-6 core variant of the same hardware.
|
# ¿ Sep 19, 2017 03:30 |
|
SwissArmyDruid posted:Well... now that AMD has come back to the land of sanity and re-embraced SMT, I'm thinking a 4c/8t or 6c/12t part would still be a net step up. Remember: Construction cores may have been polished to a mirror sheen, but they are still garbage. Yeah, but the games and especially the emulation packages for 360 titles rely on having 8 full cores to work against (or rather 6 or 7 exclusive cores, with the system holding a core or two for its own use), and trying to shove that into hyperthreading is unlikely to work out too well. If you're OK with breaking compatibility then sure, a 4c/8t system could be faster, but the compatibility issue makes it a no-go for MS.
|
# ¿ Sep 19, 2017 04:10 |
|
SwissArmyDruid posted:I think of one thing that has probably kept everyone here at least appreciative of AMD even in its worst years, is loving not having to upgrade motherboards every time a new goddamn chip comes out. And what good was that really when the chips had the same performance for 7 years straight?
|
# ¿ Sep 28, 2017 01:25 |
|
PerrineClostermann posted:How is the tablet market doing in 2017? Still declining for the 14th quarter in a row, last I checked.
|
# ¿ Oct 21, 2017 20:00 |
|
redeyes posted:Let me know when they allow mouse control on them. http://bgr.com/2017/09/04/xbox-one-games-keyboard-mouse-support-release-date/ SwissArmyDruid posted:Speaking of DOOM. Xbox One X can certainly do that for patched games, and the PS4 Pro does it for a lot too.
|
# ¿ Oct 29, 2017 16:20 |
|
SwissArmyDruid posted:I'm not saying I think you're crazy.... but I think a blind A-B experiment should be set up with multiple pairs of identical monitors running at different maximum refresh rates should be done to prove your assertion. Two things: 1) The most important aspect of a 240 Hz monitor is that it can display 120 Hz 3D content using a glasses-based method, which is nice for a certain style of games and other media. 2) The human eye and brain can tell the difference between quite high framerates just fine, even if the higher you go, the less of the additional frames per second you can really perceive. And depending on the game in question, there can be some really nice improvements in how responsive the controls feel.
|
# ¿ Oct 29, 2017 17:40 |
|
Otakufag posted:But aren't its 8 jaguar cpu cores garbage? I think the Destiny 2 devs said it horribly bottlenecked the gpu. Less of a bottleneck than the original PS4 and Xbox One 8 Jaguar cores were, and the Xbox One X's got further improvements on the processors beyond what the PS4 Pro did.
|
# ¿ Oct 30, 2017 00:17 |
|
Happy_Misanthrope posted:XB1X is in 1070 range even. Hell Outlast2 is apparently native 4K and 60fps on the 1X - don't think a 1070 can do that, albeit that may be an outlier. The mining boom is making the value proposition of PC GPU's looking not that great against mid-release console cycles now. Painful to think if the mining impact didn't exist where we would be on prices now, at least with the low/midrange. Mid-release console upgrades weren't really a thing before; there was simply nothing comparable in previous generations. There were tiny, minor speedups, usually accompanying cost-reduction revisions, but those were rarely noticeable outside constrained scenarios. And you'd have random poo poo like being able to double the RAM in the N64, or adding extra RAM to the Saturn through a cartridge, but nothing really affecting processing speeds. And especially on the Microsoft side of things, they seem really dedicated to a "your old games will work on any of our consoles, and eventually we'll stop releasing new games for old hardware" approach that they probably meant to do from the start but got lost among other launch-day missteps. So probably the Xbox XP or whatever that comes after the Xbox One X and goes against a PS5 is just going to straight up play the Xbox One stuff natively - no more typical console lifespan.
|
# ¿ Oct 30, 2017 05:29 |
|
SlayVus posted:I doubt any consoles could do legitimate native 4k. Well you're wrong. It's as simple as that, especially now that the Xbox One X is out in a week. But even before that the PS4 Pro was doing quite well at native 4k/30 in many updated games. The original stock PS4 and Xbox One of 2013 couldn't do it, sure, but that was 4 years ago.
|
# ¿ Oct 31, 2017 17:51 |
|
taqueso posted:There is no ethical consumption under capitalism. Yes, there are a bunch of cheap, high quality, no-frills Korean monitors with panels and major components manufactured and assembled in Korea. However, it's not like that means they don't have any parts manufactured in China in them: capacitors, etc.
|
# ¿ Nov 8, 2017 04:48 |
|
My understanding was that non-competes are enforceable in California only when the original company keeps paying the employee their full salary until the non-compete term is up. That would put companies on the hook for hundreds of thousands to millions of dollars for every employee covered, so few companies ever actually had them.
|
# ¿ Nov 11, 2017 02:06 |
|
Mofabio posted:It's like when the powers-that-be go away, everybody's memory of the scandal goes blank. Well, it does need constant refreshing.
|
# ¿ Dec 31, 2017 19:46 |
|
Combat Pretzel posted:If the average hit turns out to be more than 5%, it'll get quite a little ironic that Intel's ongoing performance advantage came from cutting corners in regards to security. Er, but they don't seem to have? It appears to affect all x86-64 supporting Intel CPUs, which means it goes all the way back to the first 64-bit Pentium 4s, and covers the whole line from the first Core 2 Duo chips onward.
|
# ¿ Jan 3, 2018 06:27 |
|
SourKraut posted:Maybe Intel can pay everyone in I certainly didn't get any rebates for RAM when rowhammer was discovered.
|
# ¿ Jan 4, 2018 04:47 |
|
Seamonster posted:Perhaps not as outright awful as RAM but SSD prices have increased per GB as well. And innovation in terms of drive size has utterly stagnated too. Where the hell are our 4TB drives for ~$800? Where are you seeing SSD prices go up? It sure looks like prices have either held or gone down anywhere I look. And as far as innovation in pure size, Samsung's got 16TB drives that fit in the 2.5-inch SAS form factor, and Seagate has 64TB demonstrators in the 3.5-inch form factor (though you can't buy those Seagates on the market). Considering you can still only get 12 terabytes in a single spinning-disk drive, that's pretty impressive.
|
# ¿ Jan 6, 2018 04:23 |
|
Anime Schoolgirl posted:I just want Raven Ridge to show up on more than two laptops That kind of thing is why I'm skeptical that AMD can really capitalize on the exploits for a short-term boost in market share. Laptops are an absolutely huge market for x86-64, and AMD still barely has anything to offer in it.
|
# ¿ Jan 7, 2018 17:20 |
|
pixaal posted:I'd rather see folding at home turned into some kind of buttcoin like currency so the mining would actually do something and people could trade them on the value that they did something good and there is a limited amount of research. This is impossible. The tasks folding@home, SETI@home, etc. do can't be verified without the verifier redoing the same processing all over again. Bitcoin-style mining, by contrast, is designed so that the person doing the "work" takes a lot of time to do it, but anyone else can come along and verify it by running a much faster check.
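That asymmetry is easy to see in miniature. A sketch in Python: finding a nonce whose hash has a given prefix takes many attempts, but checking a claimed nonce is a single hash. The "0000" difficulty target here is purely illustrative:

```python
import hashlib

def mine(data: bytes, prefix: str = "0000") -> int:
    """Slow part: try nonces until the hash starts with the prefix."""
    nonce = 0
    while not hashlib.sha256(data + str(nonce).encode()).hexdigest().startswith(prefix):
        nonce += 1   # tens of thousands of attempts on average for "0000"
    return nonce

def verify(data: bytes, nonce: int, prefix: str = "0000") -> bool:
    """Fast part: one hash, no matter how long mining took."""
    return hashlib.sha256(data + str(nonce).encode()).hexdigest().startswith(prefix)

nonce = mine(b"block")
print(verify(b"block", nonce))  # True
```

Protein folding has no equivalent of that cheap `verify` step; the only way to check a folding result is to redo the computation, which is the whole problem the post describes.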
|
# ¿ Jan 22, 2018 23:09 |
|
Kazinsal posted:It's never going back down. Welcome to the idiot hellfucker dimension where the cost of building a PC is going back to 1980s levels. Gotta remember that $1 in the 80s is between $2 and $3 now, though. People would pay the modern equivalent of like $16,000 for high-end PC clones back then; parts prices could be entirely crazy.
|
# ¿ Jan 26, 2018 21:13 |
|
|
|
Craptacular! posted:The bus between the things was Intel’s tech. Regardless, I don’t think it matters, because it’s designed for pre-builts and you’re not going to see desktop motherboards with that socket. It's telling that the best the "ARM ANY DAY NOW" crew can come up with is that some of Apple's most neglected laptop models are vaguely comparable to ARM on certain single-core-only benchmarks. For instance, the MacBook Air is still running ultra low power Broadwell, and the only CPU change they've made since 2015 has been bumping the base model up to a 1.8 GHz base clock.
|
# ¿ Feb 19, 2018 00:55 |