fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Yeah, Windows 8.1 and 10 will run on any CPU that's at least 1 GHz and supports PAE, SSE2, and the NX bit. That means almost every Athlon 64 and most Intel chips after the later Prescott Pentium 4s, so as early as a fall 2003 computer on the AMD side and a summer 2004 computer on the Intel side.


(there's also hacky poo poo where you install with a compatible CPU and then swap it out for one that lacks the NX bit, which would let you run Windows 10 on a CPU from about 2001)
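
If you want to check whether a given machine clears that bar, here's a minimal sketch assuming a Linux box, where the kernel reports those flags in /proc/cpuinfo (on Windows you'd check the same bits with something like Sysinternals Coreinfo instead):

```python
# Quick check for the CPU features Windows 8.1/10 need: PAE, SSE2, and NX.
# Reads /proc/cpuinfo, so this only works on Linux; on Windows you'd check
# the same bits with a tool like Sysinternals Coreinfo instead.

REQUIRED_FLAGS = {"pae", "sse2", "nx"}  # the kernel's names for these features

def cpu_flags(path="/proc/cpuinfo"):
    """Return the feature flag set reported for the first CPU."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

missing = REQUIRED_FLAGS - cpu_flags()
if missing:
    print("Missing:", ", ".join(sorted(missing)), "- no Windows 8.1/10 for this chip")
else:
    print("PAE, SSE2 and NX all present - meets the Windows 8.1/10 CPU baseline")
```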

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

eames posted:

If Destiny 2 is any indication then consoles are becoming CPU bottlenecked which would explain why the game is locked to 30 fps at any resolution (correct me if I'm wrong).
Guess that's what happens when you bump up the GPU performance and neglect the CPU during the mid-cycle refreshes. Scorpio is also based on Jaguar with a clockspeed bump so same story there.
We'll have to wait for Zen APUs to see another meaningful improvement. (PS5 next year?)


The good thing to come out of this for the PC is that multithreading will be more important than ever.

The Scorpio project has a major increase in the number of GPU compute units as well as their speed, unlike the PS4 Pro, which mainly upclocked the CPU/GPU and exactly doubled the GPU compute unit count.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

FaustianQ posted:

Ryzen is apparently related to the cat cores, or at least began its design stages as "make cat cores better". I'm still flummoxed on why the Scorpio went with Jaguar, and not Puma+ or even the rumored Basilisk, if they were going to have an issue with Zen cores. It's not even like Zen would be particularly throttled in performance here, they could easily do 8 cores at 2.5GHz or 4 cores at 3.0GHz and still be inside the overclocked Jaguar core thermal envelope, Zen's just that much better of a design.

The hardware had to be finalized either at the end of last year or the beginning of this year, in order to ramp up production of the actual hardware and have working devkits available for game developers. An architecture that isn't out yet certainly wouldn't work, and Puma/Puma+ simply might not have been judged a worthwhile change compared to the option Microsoft actually took, of customizing Jaguar cores further.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Eletriarnation posted:

Maybe the Atom qualifies? I know that a lot of the idea behind it was "Make a chip as simple as the original Pentium but running as fast as we can get it these days, plus any improvements which give you at least a 2:1 return on performance:wattage."


The Atom design developed from the "Stealey" cores used in the A100 and A110 processors in 2007, meant for use in those weird Ultra Mobile PC things that were a thing for a few years before netbooks. One of the only devices they went into was the HTC Shift.


The Stealey cores were based on Dothan Pentium M-based Celeron CPUs, obviously heavily cut down and slowed down to fit the 5 watt power envelope. So Atom wasn't really a new microarchitecture, it was just another variant of the P6, with more similarity to the older designs in that family.

Also remember that with Atom and the Stealey cores of the year before, Intel's directive was just to cut power by any means possible while maintaining compatibility. Once they got that with the Stealey core, they spent time refining Atom to slowly bring back performance without massively increasing power. And then they eventually gave up on that line in favor of the newer Atom chips that derive from the Core i series.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

GRINDCORE MEGGIDO posted:

I'm surprised AMD don't sell a lightweight compute card that's just a standard card, cheaper, with no video outputs, and separate the markets.

Don't they, for the normal high performance computing market?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

FaustianQ posted:

So basically say gently caress it, embrace that buttcoin miners won't stop doing this and sell buttcoin edition cards that are theoretically useless for anything but compute?


Aren't those much more expensive? That'd be why.

They're more expensive upfront, but that's partially because they're low volume sellers, and partially because they're a decent bit more powerful for pure compute tasks than the typical graphics cards of equivalent chipsets.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

eames posted:

From what I understand "mining" efficiency (coin per watt) is always on a steady decline, the price per coin is highly volatile and power costs are constant.

The spikes in GPU demand happen when the price goes up beyond the power costs of calculating a coin. Assuming a constant price, efficiency will catch up and inevitably make it unprofitable again, at which point people sell GPUs.
My understanding is that prices would need to rise forever to keep demand constant and a sudden crash leads to cheap used GPUs for all gamers.

(correct me if I'm wrong, seasoned cryptogoons)

The only thing I'd add is that there are constantly "new" coins being made, and a certain kind of sucker buys GPUs for the new stuff before anyone has had time to figure out the return vs. electricity ratio, so that drives demand even when prices aren't particularly good for the main established coins.
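
For what it's worth, the profitability math eames describes fits in a few lines; every number in this sketch (hash rate, network size, block reward, coin price, power draw, electricity rate) is a made-up assumption for illustration, not a real figure for any coin:

```python
# Back-of-the-envelope GPU mining profitability, per the description above.
# Every number here is a made-up illustration, not a real figure for any coin.

hashrate_mhs = 30.0           # one card's hash rate, MH/s (assumption)
network_mhs = 250_000_000.0   # whole network's hash rate, MH/s (assumption)
block_reward = 3.0            # coins paid per block (assumption)
blocks_per_day = 5760         # one block roughly every 15 seconds (assumption)
coin_price = 300.0            # USD per coin (assumption)
card_watts = 150.0            # card's power draw under load (assumption)
electricity_kwh = 0.12        # USD per kWh (assumption)

coins_per_day = (hashrate_mhs / network_mhs) * block_reward * blocks_per_day
revenue_per_day = coins_per_day * coin_price
power_per_day = card_watts / 1000.0 * 24 * electricity_kwh

print(f"revenue/day: ${revenue_per_day:.2f}")
print(f"power/day:   ${power_per_day:.2f}")
print(f"profit/day:  ${revenue_per_day - power_per_day:.2f}")
# As network_mhs climbs ("efficiency catches up"), coins_per_day shrinks and
# the profit line goes negative at a fixed coin price - which is the crash point.
```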

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Truga posted:

loving this. Websites are turning into steaming piles of poo poo that eat 1000 gigs of ram and 37 cores just by opening them. If you don't want to close your browser every time you want to play a game, buy something with 8 threads.

Yeah totally. That's why it's always funny when you see people asking for suggestions for a laptop for their mom or something and they say "oh she's only going to be surfing Facebook!". That'll loving eat a huge chunk out of a decent CPU these days.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

thechosenone posted:


Ultimately I guess it is a bit much to ask of it, but I've always been interested in the idea that iGPUs will eventually be strong enough that dedicated graphics will go the way of soundcards (i.e. relegated to specific situations that need optimized processors).

Sure, if you can find some way to cram dozens of gigabytes of high speed video RAM and a massive amount of cooling onto the CPU. It's a whole lot simpler to make that sort of thing work on a separate card.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

thechosenone posted:

so like you could use stacked memory to get a lot of data, and stitch parts together or something, like some sort of super-chip inspired by Dr.Frankenstein.

But like, I figure if 128 MB can fit, then if you can stack it, then you could at least get a gigabyte or two. This would lead to one only needing a dGPU if they needed buttloads of memory beyond what would fit.

Wow, I guess I know how people manage to click on the quote button instead of the editing button now. I always thought it would be hard to miss that.

1 gigabyte of video RAM is already insufficient for replacing dedicated GPUs, let alone by the time such a processor would actually be practical. Even 3 or 4 GB would be stretching it for something meant to replace most dedicated GPUs in use.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Wirth1000 posted:

Alright, so just got home and booted up into the UEFI (when the hell did UEFI replace BIOS?)

Started about 12 years ago as far as PCs go, basically finished up by like 3 or 4 years ago.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Palladium posted:

I think most of us forgot laptop CPUs constitutes maybe like 70% of Intel's consumer market and AMD has been non existent there for a decade.

A decade? They've been noncompetitive there a lot longer than that.


ultrabay2000 posted:

I wouldn't buy one but they seem to do alright in the Walmart laptop segment. Seems like a large chunk of cheap laptops I see have AMD CPUs.

Those also sell less than you'd expect. There are tons of Atom, Pentium, and Celeron laptops going for nearly as cheap or even cheaper, and they're more widely available. And most consumers don't really understand that the AMD chip's better integrated GPU might actually be better for what they want to do on a real low-end device.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Khorne posted:

I was real serious about this. The reason is twofold, one you get a higher clock under half-load, and two the fpu thing plays some role probably. Due to this processor's design you have insanely inconsistent run times for processes that run on the scale of a week+. As in, "hey, why does my process that takes 7 days sometimes take 14 days".

You can set processor affinity in Windows on a per-program basis. This guide uses Windows 7, but the process is the same in Vista, 8, or 10: http://www.techrepublic.com/blog/windows-and-office/change-the-processor-affinity-setting-in-windows-7-to-gain-a-performance-edge/

Simply put, you add a command line option to your shortcuts which tells Windows which cores a program is allowed to use. You may need to experiment first to find the right sets so a process doesn't get split across FPUs.

Unfortunately, you can't easily set this for everything on your system.
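
The shortcut approach in that guide is presumably cmd's built-in start /affinity switch; if you'd rather script it per process, here's a rough sketch using the third-party psutil package. The process name and core list are just example assumptions, the idea being to pick one core out of each FPU-sharing pair:

```python
# Pin a long-running process to specific cores so it never straddles a shared
# FPU module. Needs the third-party psutil package (pip install psutil); works
# on Windows and Linux. The process name and core list are example assumptions:
# on a two-cores-per-module chip you'd pick one core out of each pair.

import psutil

TARGET_NAME = "my_long_job.exe"   # hypothetical process to pin
CORES = [0, 2, 4, 6]              # example affinity set, adjust for your CPU

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET_NAME:
        proc.cpu_affinity(CORES)  # same effect as the shortcut / Task Manager trick
        print(f"Pinned PID {proc.pid} to cores {CORES}")
```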

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SourKraut posted:

Dividends are for companies in good financial situations. That's not AMD.

AMD doesn't appear to have ever paid a dividend, at least not since January 13, 1978, which is as far back as Google tracks their stock. Surely you wouldn't say they've never been in a good financial situation since 1978?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Combat Pretzel posted:

"PCIe" is a registered trademark? wat?

Probably the last major one that wasn't is ISA, and then only because that was a retroactive name applied after the bus had already been in use for several years with no real name besides "the expansion bus IBM PCs use".

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Most of the big warehouse setups are in rural China, in places with dirt cheap electricity because generation was overbuilt (usually new hydro dam installations) and the grid can't carry the power out to the bigger cities that need it. The warehouses get tossed together in a few days with no safety considerations at all, so it's no surprise when they catch fire.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
It was never really writing 40 GB a day, due to how disk caches work.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

NewFatMike posted:

Alternate architecture chat is making me think about x86 emulation running on ARM that got Intel all in a tizzy earlier this year.

Would be cool to have AMD and Qualcomm competing with Intel. But also rip in peace AMD for selling Adreno.

I mean, that's been a thing for a long time. Its problem is that it compounds existing ARM performance issues with the speed penalty emulation imposes.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

MaxxBot posted:

Explicitly parallel instruction computing design for CPUs just turned out to be a pretty lovely and unworkable solution and would have sucked even without x86 momentum. ARM on the other hand is inherently better but nowhere near enough to overcome said momentum.

What's supposed to be inherently better about ARM that isn't a failed promise like PA-RISC, PowerPC, Alpha, etc?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

VostokProgram posted:

Was DEC Alpha supposed to be super badass or something back in the day? I find mention of it in a lot of places but no explanation of why it was so interesting

e: besides it having bizarre super weak memory ordering

Ultimately DEC Alpha was important because DEC was important, and they positioned it as the direct successor to their popular VAX family of processors.

Volguus posted:

As far as I remember it was a RISC architecture and extremely fast. While I haven't used it personally, the rumor was that WinNT for x86 was faster on a VM on a DEC than on a native x86 cpu.

This was true, but early NT was also specifically designed not to favor any particular CPU design, and x86 processors just weren't all that fast themselves back then (conversely, DEC Alpha hardware was hardly inexpensive, and IIRC DEC never brought Alphas down to their "low-end" systems in their heyday). And Alpha support was only in NT 3.1/3.5/4.0, and by 4.0 things were already looking sketchy for the architecture.

I guess you could think of it like this: what if the Intel CPU lineup currently stopped at the "Pentium" branded chips of today, everything i3 and up and all the Xeons were absent, and they were also a few generations behind current? That's roughly what putting 486s or the early Pentiums up against the DEC Alphas was like, with the Alphas in the role of today's very high-end Intel chips.

fishmech fucked around with this message at 16:15 on Aug 29, 2017

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Google's IPv6 usage statistics (gathered on the basis of how many connections they see to all their sites over v6 vs v4) are very interesting for that. For one thing, IPv6 adoption is consistently higher on Saturdays, Sundays and holidays than on normal workdays. For another, Belgium, the US and Greece are the top 3 IPv6 users, in that order.

https://www.google.com/intl/en/ipv6/statistics.html

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Anime Schoolgirl posted:

can't wait for a bastardized 4-6 core version of one of those to show up in the PS5 and Xbone Z

Consoles won't go back from the 8 cores they have now. At the very least the Xbox Aleph-4 is going to keep an 8 core minimum, because Microsoft is all about the backwards compatibility stuff, and then Sony would be hard pressed to maintain parity with a 4-6 core variant of the same hardware.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

Well... now that AMD has come back to the land of sanity and re-embraced SMT, I'm thinking a 4c/8t or 6c/12t part would still be a net step up. Remember: Construction cores may have been polished to a mirror sheen, but they are still garbage.

Yeah, but the games, and especially the emulation packages for 360 titles, rely on having 8 full cores to work against (or rather 6 or 7 exclusive cores, with the system reserving a core for itself), and trying to shove that into hyperthreading is unlikely to work out well.

If you're OK with breaking compatibility then sure, a 4c/8t setup could make for a faster system, but the compatibility issue would make it a no-go for MS.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

I think one thing that has probably kept everyone here at least appreciative of AMD, even in its worst years, is loving not having to upgrade motherboards every time a new goddamn chip comes out. Remember: Construction cores may have been polished to a mirror sheen, but they are still garbage.

And what good was that, really, when the chips had the same performance for 7 years straight?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

PerrineClostermann posted:

How is the tablet market doing in 2017?

Still declining for the 14th quarter in a row, last I checked.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

redeyes posted:

Let me know when they allow mouse control on them.

http://bgr.com/2017/09/04/xbox-one-games-keyboard-mouse-support-release-date/

SwissArmyDruid posted:

Speaking of DOOM.

This, hopefully, is not what goes into the next generation of consoles.

I don't know how much more I can deal with consoles still not being able to do 1080p60, despite claiming so.

Xbox One X can certainly do that for patched games, and the PS4 Pro does it for a lot too.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

I'm not saying I think you're crazy.... but I think a blind A-B experiment with multiple pairs of identical monitors running at different maximum refresh rates should be set up to prove your assertion.

Two things:
1) The most important aspect of a 240 Hz refresh rate monitor is that it's sufficient to display 120 Hz 3D content using a glasses method, which is nice for a certain style of games and other media.

2) The human eye and brain can tell the difference between quite high framerates just fine, even if the higher you go, the less of each additional frame per second you can really perceive. And depending on the game in question, there can be some really nice improvements in how responsive the controls feel.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Otakufag posted:

But aren't its 8 jaguar cpu cores garbage? I think the Destiny 2 devs said it horribly bottlenecked the gpu.

Less of a bottleneck than the original PS4 and Xbox One 8 Jaguar cores were, and the Xbox One X's got further improvements on the processors beyond what the PS4 Pro did.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Happy_Misanthrope posted:

XB1X is in 1070 range even. Hell Outlast2 is apparently native 4K and 60fps on the 1X - don't think a 1070 can do that, albeit that may be an outlier. The mining boom is making the value proposition of PC GPU's looking not that great against mid-release console cycles now. Painful to think if the mining impact didn't exist where we would be on prices now, at least with the low/midrange.

Mid-cycle console refreshes weren't really a thing before, there was simply nothing comparable in previous generations. There were tiny, minor speedups, usually accompanying cost-reduction revisions, but those weren't usually noticeable outside constrained scenarios. And you'd have random poo poo like being able to double the RAM in the N64, or adding extra RAM to the Saturn through a cartridge, but nothing really affecting processing speeds.

And especially on the Microsoft side, they seem really dedicated to a "your old games will work on any of our consoles, and eventually we'll stop releasing new games for the old hardware" approach that they probably meant to do from the start but that got lost among other launch-day missteps. So the Xbox XP or whatever comes after the Xbox One X and goes up against a PS5 is probably just going to straight up play Xbox One stuff natively - no more typical console lifespan.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SlayVus posted:

I doubt any consoles could do legitimate native 4k.

Well you're wrong. It's as simple as that, especially now that the Xbox One X is out in a week. But even before that the PS4 Pro was doing quite well at native 4k/30 in many updated games.

The original stock PS4 and Xbox One of 2013 couldn't do it, sure, but that was 4 years ago.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

taqueso posted:

There is no ethical consumption under capitalism.

I thought some monitors were made in S Korea like those QNIX and Crossover? My info is old, I haven't been monitor shopping in a few years.

Yes, there are a bunch of cheap, high quality, no-frills Korean monitors with panels and major components manufactured and assembled in Korea. However, it's not like that means they have no parts manufactured in China in them: capacitors, etc.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
My understanding was that non-competes are enforceable in California, but only the ones where the original company keeps paying the employee their full salary until the non-compete period is up.

Which would essentially mean a company would be on the hook for hundreds of thousands to millions of dollars per employee covered, so few companies ever actually had them.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Mofabio posted:

It's like when the powers-that-be go away, everybody's memory of the scandal goes blank.

Well, it does need constant refreshing.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Combat Pretzel posted:

If the average hit turns out to be more than 5%, it'll get quite a little ironic that Intel's ongoing performance advantage came from cutting corners in regards to security.

Er, but they don't seem to have? It appears to affect all x86-64 capable Intel CPUs, which would mean it goes all the way back to the first 64-bit Pentium 4s and covers the whole line from the first Core 2 Duo chips onward.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SourKraut posted:

Maybe Intel can pay everyone in bribes rebates to compensate for the massive security flaw.

I certainly didn't get any rebates for RAM when rowhammer was discovered.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Seamonster posted:

Perhaps not as outright awful as RAM but SSD prices have increased per GB as well. And innovation in terms of drive size has utterly stagnated too. Where the hell are our 4TB drives for ~$800?

Where are you seeing SSD prices go up? It sure looks like prices have either held or gone down anywhere I look. And as far as innovation in pure capacity goes, Samsung has 16TB drives that fit the 2.5 inch SAS form factor, and Seagate has 64TB demonstrators in the 3.5 inch form factor (though you can't buy those Seagates on the open market). Considering you can still only go up to 12 terabytes in a single spinning disk drive, that's pretty impressive.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Anime Schoolgirl posted:

I just want Raven Ridge to show up on more than two laptops

That kind of thing is why I'm skeptical of AMD being able to really capitalize on the Intel exploits for a short-term boost in market share. Laptops are an absolutely huge market for x86-64, and AMD still barely has anything to offer in it.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

pixaal posted:

I'd rather see folding at home turned into some kind of buttcoin like currency so the mining would actually do something and people could trade them on the value that they did something good and there is a limited amount of research.

This is impossible.

The tasks folding@home, SETI@home, etc. do are things that can't be verified without the verifier redoing the same processing all over again. Bitcoin-style mining, meanwhile, is designed so that producing the "work" takes a lot of time, but anyone else can verify it by running a much faster check.
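
As a toy illustration of that asymmetry (a sketch of the general proof-of-work idea, not how any particular coin actually works):

```python
# Toy hash-based proof-of-work: finding the nonce takes many hash attempts,
# while checking a claimed nonce takes exactly one. Illustrative only.

import hashlib

DIFFICULTY = 4  # require this many leading hex zeros (toy setting)

def mine(data: str) -> int:
    """Expensive: keep hashing until the digest has enough leading zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce
        nonce += 1

def verify(data: str, nonce: int) -> bool:
    """Cheap: a single hash call checks the claimed work."""
    digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)

block = "some transactions"
nonce = mine(block)                 # tens of thousands of attempts on average
print(nonce, verify(block, nonce))  # verification is one hash, effectively instant
```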

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Kazinsal posted:

It's never going back down. Welcome to the idiot hellfucker dimension where the cost of building a PC is going back to 1980s levels.

Gotta remember that $1 in the 80s is between $2 and $3 now, though. People would pay the modern equivalent of like $16,000 for high-end PC clones back then, and part prices could be entirely crazy.
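
Rough illustration of that math, with an assumed 80s sticker price and a midpoint inflation factor rather than precise figures:

```python
# Rough 1980s-to-today price conversion; both numbers are just assumptions
# picked for illustration ("$1 then is $2-$3 now", midpoint-ish factor).
sticker_price_1980s = 6500   # a loaded high-end PC clone back then (assumption)
inflation_factor = 2.5

today = sticker_price_1980s * inflation_factor
print(f"${sticker_price_1980s:,} back then is roughly ${today:,.0f} today")
```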

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Craptacular! posted:

The bus between the things was Intel’s tech. Regardless, I don’t think it matters, because it’s designed for pre-builts and you’re not going to see desktop motherboards with that socket.

The whole point of the enterprise seems to be halting the advance of ARM in laptops, since the latest iPads have single core performance that rival MacBooks and have had Apple evangelism outposts like Gruber‘s blog and ATP mulling the necessity of Intel in MacBooks. System builders are a smaller segment than the people buying laptops like disposable objects, and both Intel and AMD suffer as X86 stakeholders if that market even begins a shift off the platform.

It's telling that the best the "ARM ANY DAY NOW" crew can come up with is that some of Apple's most neglected laptop models are vaguely comparable on certain single-core-only benchmarks. For instance, the MacBook Air is still running ultra low power Broadwell, and the only CPU change they've made since 2015 has been to bump the base model up to a 1.8 GHz base clock.
