|
Combat Pretzel posted:Is there even a reliable unified list of POST codes? I thought most BIOS manufacturers did their own thing, and even changed them up from time to time. I've always looked them up. The few I've memorized were common problems with a specific system that I had to support. Most of the time they arrive broken on delivery, in which case I just call in the warranty; I'm not going to open it and waste my time. If I'm building a new computer, well, it's for myself and I'm going to have the manual with all the error codes within arm's reach if I happen to need it. If it starts beeping a few years later I just google the error code for that model of motherboard. pixaal fucked around with this message at 15:39 on Mar 15, 2019 |
# ? Mar 15, 2019 15:36 |
|
|
|
Pshhhh standardizing post codes/beep codes/light codes would take EFFORT! Just like front panel connector design, why don't we just all do our own thing for decades and then eventually remove all the old documentation from our sites so that nobody can diagnose anything? Now that's good thinkin.
|
# ? Mar 15, 2019 16:40 |
|
Combat Pretzel posted:Is there even a reliable unified list of POST codes? No, but the motherboard manual should have a fairly comprehensive list of codes for that specific motherboard.
|
# ? Mar 15, 2019 19:21 |
|
Cygni posted:Pshhhh standardizing post codes/beep codes/light codes would take EFFORT! Just like front panel connector design, why dont we just all do our own thing for decades and then eventually remove all the old documentation from our sites so that nobody can diagnose anything? Now thats good thinkin. A few places tried to do synthetic voice errors back in 2004-2006. It was pretty creepy, or maybe a fever dream; I can only find a little evidence of it on Google. Pretty sure it was an ASUS board that screamed MEMORY ERROR at me.
|
# ? Mar 15, 2019 19:28 |
|
Cygni posted:Pshhhh standardizing post codes/beep codes/light codes would take EFFORT!
|
# ? Mar 15, 2019 21:56 |
|
pixaal posted:A few places tried to do synthetic voice errors back in 2004-2006. It was pretty creepy, or maybe a fever dream; I can only find a little evidence of it on Google. Pretty sure it was an ASUS board that screamed MEMORY ERROR at me. I had one of those, it was quite weird.
|
# ? Mar 16, 2019 00:39 |
|
doomisland posted:I had one of those it was quite weird. With modern text to speech it would probably be an improvement over beep codes. Then again, I hate getting new hardware in now; gently caress you Cortana, go away. Just need to tell her to shut up 29 more times.
|
# ? Mar 16, 2019 02:26 |
|
I thought at least on aftermarket boards beep codes were standardized based on whether you had a Phoenix, Award, or AMI BIOS, but OEMs un-standardize poo poo just to gently caress with people (see Dell PSU connectors). But I can't remember a single time I have ever used beep codes to troubleshoot a PC; if it doesn't POST, just reseat the RAM and hope the smoke is still inside the CPU.
|
# ? Mar 16, 2019 03:42 |
|
I have a dumb question: I OCed my 1600 with the Wraith Spire (pretty simple OC, 3.7 @ 1.14vcore and medium LLC) and now the fan drone kicks up an octave and then back down regularly and that's annoying. I can actually make it happen by loading certain web sites. Is this a fan curve thing I ought to adjust or something else?
|
# ? Mar 16, 2019 07:56 |
|
I use Argus Monitor to control my fans, and it'll let you set the fan speed based on a 10-second average rather than however fast these things normally respond. Alternatively, you can raise the temperature at which the fan spins up from idle speeds, to help prevent spikes from kicking up the fan speed.
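For anyone curious what "fan speed based on a 10-second average" looks like in practice, here's a rough sketch in plain Python. The curve points and window size are made up for illustration, and this is not Argus Monitor's actual algorithm — just the general idea: smooth the temperature over a rolling window before running it through the fan curve, so a one-second load spike barely moves the output.

```python
from collections import deque

class SmoothedFanCurve:
    """Fan curve driven by a rolling temperature average.

    Hypothetical sketch; curve points are (temp in deg C, PWM %) pairs,
    linearly interpolated between points."""

    def __init__(self, window_samples=10,
                 curve=((30, 20), (50, 35), (70, 70), (85, 100))):
        self.samples = deque(maxlen=window_samples)
        self.curve = curve

    def pwm_for(self, temp_c):
        # Smooth the reading over the window, then interpolate the curve.
        self.samples.append(temp_c)
        avg = sum(self.samples) / len(self.samples)
        pts = self.curve
        if avg <= pts[0][0]:
            return float(pts[0][1])
        for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
            if avg <= t1:
                return p0 + (p1 - p0) * (avg - t0) / (t1 - t0)
        return float(pts[-1][1])
```

With these numbers, a single 75°C sample after nine idle 35°C readings only nudges the average to ~39°C, so the fan creeps up a few percent instead of jumping to 80% PWM the way an instantaneous curve would.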
|
# ? Mar 16, 2019 08:24 |
|
I figured if I did that, the temp would simply rise to the new threshold and do the same thing. Dunno. I considered simply raising the bottom RPMs a bit because I'm okay with a little noise, but action-based noise levels annoy the poo poo out of me (this is why I can't stand GPU coil whine either). EDIT: Default fan curve seems fine; the MSI BIOS does let you adjust how long the temperature needs to stay at a point before it kicks up the fans, and going from 0.1 seconds to 0.3 seems to have fixed the fan that couldn't figure out what speed it wanted. Craptacular! fucked around with this message at 10:08 on Mar 16, 2019 |
# ? Mar 16, 2019 09:07 |
|
Craptacular! posted:I have a dumb question: I OCed my 1600 with the Wraith Spire (pretty simple OC, 3.7 @ 1.14vcore and medium LLC) and now the fan drone kicks up an octave and then back down regularly and that's annoying. I can actually make it happen by loading certain web sites. Your CPU basically spins up when minor background tasks perform basic things, and because it spins down again afterwards, your fans and their corresponding curves can't really do anything about it.
|
# ? Mar 16, 2019 15:24 |
|
AFAIK it's Precision Boost creating the heat. If you disable it in the BIOS, it stops doing that. The only real alternative, at least if you don't want a fan curve that results in higher idle temperatures, is water cooling with the fans controlled by the water temperature. At least that's what I had to do, because ASRock BIOSes don't do time constants for fan spooling.
|
# ? Mar 16, 2019 15:36 |
|
You noobs. He fixed it!
|
# ? Mar 16, 2019 18:06 |
|
LRADIKAL posted:You noobs. He fixed it! ufarn fucked around with this message at 19:01 on Mar 16, 2019 |
# ? Mar 16, 2019 18:59 |
|
Combat Pretzel posted:AFAIK it's Precision Boost creating heat. If you disable it in the BIOS, it stops doing that. I don’t believe I’m using PBoost. The only setting I know for PBO in my BIOS is the PB Overdrive setting, which is disabled and pointless anyway, since this is a 1600 and not a 2600(X). And no, it’s still happening, and I’ll try some of the suggestions here. I did move to Windows Balanced as soon as I installed the chipset drivers. This isn’t my first Ryzen chip, but it is my first Ryzen OC, as I run the home server at stock.
|
# ? Mar 16, 2019 21:35 |
|
Keep in mind the +20°C offset on Ryzens. On my motherboard (MSI) the fan curve responds to Tdie (actual temperature) when I'm in the BIOS setting up the fan curve, and Tctl (+20 offset) once the OS boots. My CPU idles below 30°C (actual) but spikes up about 10 degrees whenever it clocks up to do the smallest amount of work. So given the offset, that means anything below ~60°C on the fan curve graph is effectively idle temperature. So my fan curve is flat at 20% PWM up to 50, a very shallow slope up to 30% at 65, and then some actual fan above that. Now I have a big 120mm tower heatsink, so you'll probably need a higher minimum setting than that, but don't be afraid to go ham on that fan curve if you think it's spinning up for no reason.
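The Tdie/Tctl mismatch described above is just a constant shift, so you can re-express BIOS curve points for whichever reading the board actually uses. A tiny sketch (curve points are made up for illustration; the flat +20°C applies to the chips with the offset, per the posts above):

```python
TCTL_OFFSET_C = 20  # Tctl = Tdie + 20 on affected Ryzen chips

# Curve points as set in the BIOS against Tdie: (temp deg C, PWM %)
bios_curve = [(30, 20), (50, 20), (65, 30), (80, 70)]

# The same points re-expressed against Tctl, which the board reads once
# the OS boots; a 40 deg C Tdie idle shows up as 60 deg C Tctl, hence
# "anything below ~60 on the graph is effectively idle".
os_curve = [(t + TCTL_OFFSET_C, pwm) for t, pwm in bios_curve]
```

Shifting every point by the offset keeps the fan behavior identical regardless of which sensor the firmware happens to feed the curve.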
|
# ? Mar 17, 2019 00:53 |
|
ufarn posted:Not really, they just increased the spin-up time to make it (ie hysteresis) less audible. Right. The issue of the fan constantly throttling is gone.
|
# ? Mar 17, 2019 01:21 |
|
Craptacular! posted:I don’t believe I’m using PBoost. The only setting I know for PBO is my BIOS is the PB Overdrive setting, which is disabled and pointless since this is a 1600 and not a 2600(x). Klyith posted:Keep in mind the +20°C offset on ryzens. On my motherboard (MSI) the fan curve responds to Tdie (actual temperature) when I'm in the bios setting up the fan curve, and Tctl (+20 offset) once the OS boots. I'm glad I eventually sprang for a Commander Pro and a water temp sensor to bypass all this Tdie, Tctl, and ratcheting bullshit. Wish my mainboard had support for external 10K sensors, to save that money, but alas. Combat Pretzel fucked around with this message at 01:46 on Mar 17, 2019 |
# ? Mar 17, 2019 01:42 |
|
Klyith posted:Keep in mind the +20°C offset on ryzens. On my motherboard (MSI) the fan curve responds to Tdie (actual temperature) when I'm in the bios setting up the fan curve, and Tctl (+20 offset) once the OS boots. I think this is only for Ryzen 7s, at least for the original generation? When AMD blogged about it, they only acknowledged the 1700/1800.
|
# ? Mar 17, 2019 03:10 |
|
Craptacular! posted:I think this is only for Ryzen 7s, at least for the original generation? When AMD blogged about it, they only acknowledged the 1700/1800. All of the X series chips have the temperature offset.
|
# ? Mar 17, 2019 03:23 |
|
Measly Twerp posted:All of the X series chips have the temperature offset. Is the temp offset a good thing? It just seems very weird for AMD to choose to report that the CPU is running hotter than it actually is since I think most enthusiasts care about temperatures and want to see lower numbers.
|
# ? Mar 17, 2019 04:39 |
|
Crotch Fruit posted:Is the temp offset a good thing? It just seems very weird for AMD to choose to report that the CPU is running hotter than it actually is since I think most enthusiasts care about temperatures and want to see lower numbers. I've heard some suggest that the +20 adjustment was to combat hot spots on the die. For AMD to sell higher-clocked X chips they felt they needed to be sure those hot spots were accounted for, as opposed to non-X series chips where if you overclock it too far and too hot, that's on you.
|
# ? Mar 17, 2019 04:57 |
|
Measly Twerp posted:All of the X series chips have the temperature offset. ah, I have a 1600X. didn't know it wasn't the whole 1st gen series. NVM craptacular, guess that doesn't have anything to do with your deal. Combat Pretzel posted:That's certainly an interesting way to go about it by MSI. The offset behavior is weird but not difficult once I figured out what was happening. Fan curves & hysteresis are really not that hard to figure out and set to be non-annoying... but you can't do that when you need to reboot into the BIOS every time you want to change something. Luckily for me I was using SpeedFan for the last decade or so; sad that it stopped being updated to work with modern mobos. But all the mobo brands have their own utilities that can set fan curves, AFAIK. (The MSI one is crap tho. Afterburner good, MSI commander bad.)
|
# ? Mar 17, 2019 06:17 |
|
Argus Monitor is a decent modern alternative to SpeedFan, and less obnoxious than most motherboard vendor fan control software.
|
# ? Mar 17, 2019 13:23 |
|
TheFluff posted:Argus Monitor is a decent modern alternative to SpeedFan, and less obnoxious than most motherboard vendor fan control software. Is there laptop compatibility there? Jw
|
# ? Mar 17, 2019 13:37 |
|
Statutory Ape posted:Is there laptop compatibility there? Jw No idea, never tried it on anything other than a desktop machine. e: the motherboard compatibility list does say it supports "Lenovo / IBM Thinkpad Notebooks" and "Dell Notebooks". There's a free trial, so try it I guess? TheFluff fucked around with this message at 14:18 on Mar 17, 2019 |
# ? Mar 17, 2019 14:15 |
|
Some Ryzen 3000 news based on people riffling through the newest BIOS updates (MSI, Asus, Biostar): https://www.overclock.net/forum/13-amd-general/1640919-new-dram-calculator-ryzena-1-4-1-overclocking-dram-am4-414.html#post27895416 quote:Translation into simple language. We have: quote:The XFR is actually quite impressive. If I read it right, they added FCLK, which we need to find out what that bus is doing. On Intel Skylake, Anand wrote the following:
|
# ? Mar 18, 2019 01:38 |
|
EmpyreanFlux posted:Some Ryzen 3000 news based on people riffling through the newest BIOS updates (MSI, Asus, Biostar) https://www.overclock.net/forum/13-amd-general/1640919-new-dram-calculator-ryzena-1-4-1-overclocking-dram-am4-414.html#post27895416 lol at amd fanboys still having a grudge with PCperspective but zen2 finally solving the IF / memory clock is good
|
# ? Mar 18, 2019 01:49 |
|
Interesting if they manage to backport all that functionality to current AM4 mobos. Or at least X470.
|
# ? Mar 18, 2019 13:25 |
|
Current scuttlebutt says B350 mobos won't get support for Ryzen 3k -- only X470, X370 and B450. https://www.youtube.com/watch?v=ezyTaUnXJkQ
|
# ? Mar 18, 2019 16:00 |
|
Heresy, b450 is the same as b350
|
# ? Mar 18, 2019 16:28 |
|
They made a promise once. :-(
|
# ? Mar 19, 2019 07:31 |
|
So, as rumored earlier, AMD is powering the Google game streaming service they just launched. People are digging through the specs, but it's hard to tell if it's a new semi-custom design or traditional Rome + Vega servers running VMs. Or something else entirely. There is a lot of goofy stuff, like the slides mentioning HyperThreading by name (which would suggest Intel, as AMD can't use the term), but then Intel not being on the 'partners' slide at all. It'll be interesting to see what Google is actually using, and if it's actually usable for latency-sensitive games. https://www.anandtech.com/show/14105/google-announces-stadia-a-game-streaming-service
|
# ? Mar 19, 2019 22:44 |
|
Intel can't touch AMD's stuff for cost-effective core density right now. The only way they could set that up at scale is with something running on zen cores.
|
# ? Mar 19, 2019 22:56 |
|
DF has details:
2.7 GHz custom CPU with SMT, AVX2 SIMD, 9.5 MB L2+L3
AMD 10.7 TF GPU, 56 CU
16 GB HBM2 shared by CPU and GPU, 484 GB/s bandwidth
The GPU seems very similar to a Vega 56 and, as they rightfully pointed out, what a coincidence that Crytek published their RT tech demo on a Vega 56 of all cards. It almost seems like AMD has given Google early access to next-gen console silicon scaled up to DC dimensions. Good for AMD, though I don't think this is good for general high-performance computing when gaming moves to the cloud (which seems inevitable once connection and bandwidth issues are solved). They quote 20GB per hour of gameplay. eames fucked around with this message at 00:07 on Mar 20, 2019 |
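Those leaked numbers hang together if you assume a GCN-style shader layout. A quick back-of-envelope check (56 CUs with 64 lanes each and 2 FLOPs per fused multiply-add per cycle is an assumption based on how Vega is built, not anything DF stated):

```python
# Implied GPU clock from the quoted 10.7 TFLOPS, assuming a GCN/Vega layout
cus = 56
lanes_per_cu = 64          # shader lanes per compute unit on GCN (assumption)
flops_per_lane_cycle = 2   # one fused multiply-add = 2 FLOPs per cycle

flops_per_cycle = cus * lanes_per_cu * flops_per_lane_cycle  # 7168
implied_clock_ghz = 10.7e12 / flops_per_cycle / 1e9
# works out to roughly 1.49 GHz, right around Vega 56 boost clocks

# And the quoted 20 GB per hour of gameplay, as an average bitrate
stream_mbps = 20e9 * 8 / 3600 / 1e6  # roughly 44 Mbps
```

So the "it's basically a Vega 56" read checks out on clocks, and the 20 GB/hour figure works out to roughly a 44 Mbps average stream.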
# ? Mar 20, 2019 00:04 |
|
That's going to be an assload of EPYCs and Vega 10s.
|
# ? Mar 20, 2019 04:36 |
|
I'm not sure how to feel about this. Now that consoles are halfway decent and not holding back the PC ports/builds, suddenly a new graphics hardware bottleneck shows up for devs to target. While it's easy to toss tons of RAM and cores into a server, sufficient graphics hardware is harder to condense.
|
# ? Mar 20, 2019 05:14 |
|
Combat Pretzel posted:I'm not sure how to feel about this. Now that consoles are half way decent and not holding back the PC ports/builds, suddenly a new graphics hardware bottleneck shows up for devs to target. While it's easy to toss tons of RAM and cores into a server, sufficient graphics hardware is harder to condense. I don't see the next-gen consoles having higher specs than that thing. Even with google launching a year ahead of them, all that poo poo isn't gonna fit in a $400 box. And really, graphics hardware is getting to the point where actually targeting the leading edge is a problem in itself. Those ultra-detailed high-res assets cost a lot of money. Cost of development is already a problem right now; if you expect the next gen to set the baseline up to what Rockstar does now then I think you're gonna be disappointed. What you should be worried about is streaming itself, because making games for streaming play requires changes to gameplay. No matter how good your bandwidth is you can't avoid the speed of light. Input lag is an unavoidable problem and that makes entire categories of games & gameplay either impossible or very different from how they are now.
|
# ? Mar 20, 2019 05:44 |
|
|
|
I just read that it's entirely Linux-based, and it reads like they want game developers to make Linux builds of their games (or I guess something that'll work fine in Wine/DXVK). Maybe something good will come from it. That is, if EA, Ubisoft et al. even care.
|
# ? Mar 20, 2019 06:58 |