|
cat doter posted:This is the post I got when I asked the same question, which worked out for me. Turns out mine isn't flashable, but you know, I needed to know. Yeah, but all those posts link to the HawaiiInfo utility at http://rghost.ru, which is down, and I cannot find the HawaiiInfo tool anywhere else.
|
# ? Dec 12, 2013 11:03 |
|
|
# ? Jun 5, 2024 06:29 |
|
Here we go: http://www.mediafire.com/download/balyl7c1i8hfv0m/r9+290+flash.rar
Ghostpilot fucked around with this message at 11:31 on Dec 12, 2013 |
# ? Dec 12, 2013 11:07 |
|
cat doter posted:Doesn't seem to be, I ran OCCT again with GPU-Z and under 51% load for 3 minutes it reached 91C. The only memory options I see in OCCT are for the CPU. For the GPU, there are memtestCL and memtestG80. But I don't know how well these work, since I can put a several-hundred-MHz increase on my memory and not see an error, yet games will absolutely poo poo themselves the second they try to render at the same memory speed. https://simtk.org/home/memtest/ As far as the issue you are having, that sounds exactly like the video driver stopping. Try different drivers with clean installs. Other things: re-seat the card in the motherboard and try another power supply (preferably larger).
|
# ? Dec 12, 2013 14:38 |
|
Phuzun posted:The only memory options I see in OCCT are for the CPU. I doubt it's the power supply; it's an 850 W non-generic unit, so that can't be it. I think there's a new beta driver out, I'll try that, then move on to the GPU memory tests if that doesn't work.
|
# ? Dec 12, 2013 15:00 |
|
Everyone stop what you are doing! The G-Sync review is out at Anandtech http://www.anandtech.com/show/7582/nvidia-gsync-review -edit- What occurred to me reading that is that GSync would be fantastic for laptops. My Macbook Pro with a 750m would benefit incredibly, much more so than my monster of a desktop. Animal fucked around with this message at 15:37 on Dec 12, 2013 |
# ? Dec 12, 2013 15:03 |
|
Anandtech posted:Enabling G-Sync does have a small but measurable performance impact on frame rate. After the GPU renders a frame with G-Sync enabled, it will start polling the display to see if it’s in a VBLANK period or not to ensure that the GPU won’t scan in the middle of a scan out. The polling takes about 1ms, which translates to a 3 - 5% performance impact compared to v-sync on. NVIDIA is working on eliminating the polling entirely, but for now that’s how it’s done. Did they mean compared to V-Sync off? I thought if V-Sync is on, there isn't really going to be tearing anyways? 5% lower framerates than that other thing that eliminates tearing makes it awfully hard to justify the premium they charge when it can be put towards a GPU upgrade. And I'm one of the guys who has the upgradable Asus monitor and was looking forward to this.
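As a rough sanity check on those numbers (a sketch; the ~1 ms poll per frame is the only figure taken from the article, the rest is plain arithmetic):

```python
# Estimate the frame-rate cost of a fixed ~1 ms VBLANK poll per rendered
# frame, per the Anandtech quote above. Everything else is arithmetic.

def fps_with_polling(base_fps: float, poll_ms: float = 1.0) -> float:
    """Frame rate after adding a fixed per-frame polling delay."""
    frame_time_ms = 1000.0 / base_fps
    return 1000.0 / (frame_time_ms + poll_ms)

for fps in (30, 60, 120):
    slowed = fps_with_polling(fps)
    drop = 100.0 * (1.0 - slowed / fps)
    print(f"{fps} fps -> {slowed:.1f} fps ({drop:.1f}% slower)")
```

At 30-60 fps this works out to roughly a 3-6% drop, which lines up with the article's 3-5% figure; at higher frame rates the relative cost grows, since the fixed 1 ms poll is a larger share of each shorter frame.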
|
# ? Dec 12, 2013 15:51 |
|
So UPS was kind enough to leave my 800 dollar package from Mountain Mods at my door last night without even having the courtesy to ring the doorbell. God I loving hate UPS. Where should I post a build thread?
|
# ? Dec 12, 2013 16:03 |
|
Zero VGS posted:Did they mean compared to V-Sync off? I thought if V-Sync is on, there isn't really going to be tearing anyways? 5% lower framerates than that other thing that eliminates tearing makes it awfully hard to justify the premium they charge when it can be put towards a GPU upgrade. And I'm one of the guys who has the upgradable Asus monitor and was looking forward to this. Vsync on eliminates tearing, but at the cost of introducing lag and stuttering any time your FPS hits something other than an exact divisor of your monitor's refresh rate (60 FPS, 30 FPS, etc.). Gsync also eliminates tearing, but has no lag or stuttering at all, so long as your FPS is anywhere between 30 FPS and your screen's maximum refresh rate (60-144 Hz depending on the panel). Since it's way more common, especially for folks running 2560 monitors, to have FPS ranging between 35-55ish, gsync is a godsend. Of course, they haven't introduced any 2560 IPS monitors with gsync yet. e: The other thing is, with this tech, so long as you're getting over 30 FPS, it seems like extra FPS is effectively wasted? I know with current monitors I see a big difference between 30 and 60 FPS, but I wonder if that will still be true with a Gsync screen? It seems possible, at least. (I never turn on vsync presently; I play on a 30" screen and don't have the GPU power to keep it at 60 FPS all the time with a single GTX 680.) Gwaihir fucked around with this message at 16:13 on Dec 12, 2013 |
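To make that concrete, here's a toy model (an illustration only, not how any driver is actually implemented) of what ends up on screen with plain double-buffered vsync versus G-Sync:

```python
import math

def displayed_fps(render_fps: float, refresh_hz: int = 60,
                  gsync: bool = False) -> float:
    """On-screen frame rate for a GPU rendering at `render_fps`.

    Models strict double-buffered vsync: each frame is held for a whole
    number of refresh intervals, so the rate snaps down to refresh_hz / n.
    Triple buffering behaves differently and isn't modeled here.
    """
    if gsync:
        return min(render_fps, refresh_hz)  # the panel follows the GPU
    n = math.ceil(refresh_hz / render_fps)  # refresh intervals per frame
    return refresh_hz / n

print(displayed_fps(45))              # vsync on a 60 Hz panel: snaps to 30.0
print(displayed_fps(45, gsync=True))  # G-Sync: stays at 45
```

This is exactly the 35-55 fps band mentioned above: under this model, anything between 30 and 60 fps on a 60 Hz panel collapses to 30 with vsync on.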
# ? Dec 12, 2013 16:08 |
|
Jesus, Kingpin got his 780 Ti to 1.9 GHz on LN2.
|
# ? Dec 12, 2013 16:25 |
|
veedubfreak posted:So UPS was kind enough to leave my 800 dollar package from Mountain Mods at my door last night without even having the courtesy to ring the doorbell. God I loving hate UPS. Where should I post a build thread? Option 1: poo poo this thread up Option 2: poo poo the OC thread up Option 3: We start a new "Enthusiasts" thread for highlighting our crazy bullshit and fill it to the brim. I like option 3, personally - we could make it part mock thread too. It'd be a relatively low traffic thread, I think, but that's fine.
|
# ? Dec 12, 2013 16:36 |
|
Factory Factory posted:Option 1: poo poo this thread up I have been thinking about option 3 for months but
|
# ? Dec 12, 2013 16:42 |
|
We have a post the pictures of your ~real~ desktop thread, what kind of hardware forum would we be without a "Post the pictures of your desktop(pc this time)" thread?
|
# ? Dec 12, 2013 16:52 |
|
In this forum, we'd have to have a rule about "No useless specdumps for run-of-the-mill systems; you have to have put some effort into it" and enforce it by probation. Otherwise we'd let the riff-raff in.
|
# ? Dec 12, 2013 17:02 |
|
Gwaihir posted:e: The other thing is, with this tech, so long as you're getting over 30 FPS, it seems like extra FPS is effectively wasted? I know with current monitors I see a big difference between 30 and 60 FPS, but I wonder if that will still be true with a Gsync screen? It seems possible, at least. (I never turn on vsync presently, I play on a 30" screen and don't have the GPU power to keep it at 60 FPS all the time with a single GTX680). Not wasted, because with V-Sync off I can still tell a clear difference between 30, 40, 50, and 60 FPS. The difference between 60 and 120 is like night and day as well, but from 120 to 144hz I can't pretend I can see much of a difference from there. In first person shooters or anything with fast motion, every bit does help. Wouldn't something like a 240hz IPS, if those come to market in time with G-Sync, eliminate a lot of the perceived tearing just as well?
|
# ? Dec 12, 2013 17:17 |
|
No more than a 120 Hz monitor would, because it only adds three additional Vsync targets that 120 Hz doesn't have: 240 Hz, 80 Hz, and 48 Hz.
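The divisor logic is easy to check (a quick sketch: with standard vsync, a fixed-refresh panel can only show frame rates that divide its refresh rate evenly):

```python
def vsync_targets(refresh_hz: int, min_fps: int = 24) -> list[int]:
    """Even frame rates a fixed-refresh panel can show with vsync:
    the integer divisors refresh_hz / n, down to min_fps."""
    return [refresh_hz // n
            for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(vsync_targets(120))  # [120, 60, 40, 30, 24]
print(vsync_targets(240))  # [240, 120, 80, 60, 48, 40, 30, 24]
```

Comparing the two lists, 240 Hz adds 240, 80, and 48 Hz as new targets over 120 Hz, while a variable-refresh display makes every rate in between a valid target.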
|
# ? Dec 12, 2013 17:20 |
|
Gwaihir posted:We have a post the pictures of your ~real~ desktop thread, what kind of hardware forum would we be without a "Post the pictures of your desktop(pc this time)" thread? Yeah, some people might see it as posing, but I genuinely enjoy seeing what people have done with their systems.
|
# ? Dec 12, 2013 17:31 |
|
Zero VGS posted:Not wasted, because with V-Sync off I can still tell a clear difference between 30, 40, 50, and 60 FPS. The difference between 60 and 120 is like night and day as well, but from 120 to 144hz I can't pretend I can see much of a difference from there. In first person shooters or anything with fast motion, every bit does help.
|
# ? Dec 12, 2013 17:46 |
|
Yea, I dunno. I don't really mind tearing so much, but I hate lurching/stuttering. So I play with vsync disabled, and usually get between 40 and 60 FPS, which is about as much as you can reasonably get on a 30" screen without spending $1000s on video cards. A gsync module would basically mean I get the same lack of lurching/stuttering, but would also get rid of tearing, too. If the price was reasonable I'd go for it. Considering I'm a crazy person who already has a 30" monitor, I guess I'd consider $75, maybe $100, reasonable for the monitor module? I think it just comes down to having to see it in person. This is such a personal-preference thing, and some people just can't see the differences at all, for who knows what reason.
|
# ? Dec 12, 2013 18:10 |
|
The cool thing about running 3 1440 monitors is that I never have to worry about tearing, since I'll never break 60 fps
|
# ? Dec 12, 2013 18:26 |
|
veedubfreak posted:The cool thing about running 3 1440 monitors is that I never have to worry about tearing, since I'll never break 60 fps You still get tearing below 60FPS.
|
# ? Dec 12, 2013 18:41 |
|
Yea, tearing happens at any FPS other than "exactly 30/60 all the time" (for 60 Hz monitors).
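A toy model of why that is (illustrative only; real scanout timing is messier): with vsync off, each buffer swap lands partway through the panel's scanout, and that scanline is where the tear shows up. Exact rationals keep the modulo arithmetic clean.

```python
from fractions import Fraction

def tear_lines(fps: int, refresh_hz: int = 60, lines: int = 1080,
               frames: int = 5) -> list[int]:
    """Scanline at which each of the first `frames` buffer swaps lands."""
    period = Fraction(1, refresh_hz)
    result = []
    for i in range(1, frames + 1):
        # fractional position within the current scanout at swap time
        phase = (Fraction(i, fps) % period) / period
        result.append(int(phase * lines))
    return result

print(tear_lines(60))  # [0, 0, 0, 0, 0] - swaps align with refresh, no tear
print(tear_lines(50))  # [216, 432, 648, 864, 0] - a tear marches down the screen
```

At exactly 60 fps on a 60 Hz panel every swap lands at line 0; at 50 fps the swap point drifts through the frame, which is the moving tear line people notice.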
|
# ? Dec 12, 2013 18:44 |
|
GSync is awesome but it really needs to come down in price to where monitor manufacturers won't think twice about making it a part of their screens. NVIDIA needs to start handing these out very close to free so that the technology becomes a standard. As much as I want it, I don't think I am ready to give up my 30" 2560x1600 IPS screen.
|
# ? Dec 12, 2013 18:49 |
|
It sounds like much of it is similar to a standard scaler, hardware-wise, except for the FPGA. No doubt if Nvidia could sell more, they could make each unit cheaper, but right now it's brand-new hardware. The 248Q + G-Sync motherboard shot was pretty empty; I doubt that a scaler board and driver in a regular monitor is much less complex than a G-Sync setup. I have to wonder: why do this instead of a DisplayPort revision? Too big a change to have a prayer of catching on at a reasonable price?
|
# ? Dec 12, 2013 19:01 |
|
Gwaihir posted:We have a post the pictures of your ~real~ desktop thread, what kind of hardware forum would we be without a "Post the pictures of your desktop(pc this time)" thread? veedubfreak posted:So UPS was kind enough to leave my 800 dollar package from Mountain Mods at my door last night without even having the courtesy to ring the doorbell. God I loving hate UPS. Where should I post a build thread? Go for it, and if it turns into a twisted parody of [H] build threads, I'm OK with that too. We could use more threads than just the megathreads floating around.
|
# ? Dec 12, 2013 19:18 |
|
I can't imagine AMD not trying to copy G-Sync as fast as they can. Being AMD, they'll probably manage to give it micro stutter somehow.
|
# ? Dec 12, 2013 19:30 |
|
movax posted:Go for it, and if it turns into a twisted parody of [H] build threads, I'm OK with that too. We could use more threads than just the megathreads floating around. Veedub, obviously you need to start this thread then. I enjoy both nice meticulous builds as well as "Living like poo poo" builds, so hey, room for all to post!
|
# ? Dec 12, 2013 19:43 |
|
What's confusing me about G-Sync is what it offers in practical benefits over just having a 120 Hz monitor. I've had one for years for 3D gaming and I can't remember the last time I actively saw tearing in a modern game. Maybe I'm just used to tearing, but I thought that if I wasn't cranking FPS up around the 120 mark I wouldn't need to worry. I always have the quality on max, which keeps things in most modern releases at 60-80 fps.
|
# ? Dec 12, 2013 20:54 |
|
Do you play with vsync enabled?
|
# ? Dec 12, 2013 21:08 |
|
Incredulous Dylan posted:What's confusing me about G-Sync is what it offers in practical benefits over just having a 120 Hz monitor. I've had one for years for 3D gaming and I can't remember the last time I actively saw tearing in a modern game. Maybe I'm just used to tearing, but I thought that if I wasn't cranking FPS up around the 120 mark I wouldn't need to worry. I always have the quality on max, which keeps things in most modern releases at 60-80 fps. For games it means that you won't see tearing or drop to 30 fps if your card can't maintain 60 fps. For video playback it means that you no longer have to choose between tearing, audio resampling, or the occasional missed frame due to the audio and video clocks not being in perfect sync. For more niche applications like emulation it means that you can properly emulate systems that don't run at exactly 60 Hz. Even arcade machines that run at oddball refresh rates like 53 Hz will be silky smooth and tear-free.
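Rough arithmetic on that last point (a sketch, not tied to any particular emulator or player): a fixed 60 Hz panel has to duplicate or drop frames whenever the content runs at any other rate, which is exactly the mismatch variable refresh removes.

```python
def frame_mismatch_per_second(content_hz: float,
                              display_hz: float = 60.0) -> float:
    """Frames per second that must be duplicated (positive) or dropped
    (negative) to show `content_hz` material on a fixed `display_hz` panel."""
    return display_hz - content_hz

print(frame_mismatch_per_second(53))     # 7.0 duplicated frames/s: visible judder
print(frame_mismatch_per_second(59.94))  # ~0.06/s: a repeated frame every ~17 s
```

Seven duplicated frames every second is very visible judder on a 53 Hz arcade game, while the ~0.06/s mismatch of 59.94 Hz video is the "occasional missed frame" mentioned above.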
|
# ? Dec 12, 2013 21:12 |
|
Gwaihir posted:Do you play with vsync enabled? No - since the monitor is 120 hz I never saw a need for vsynch, especially after realizing vsynch often tanks your FPS. It always seemed really smooth without vsynch. I wasn't aware that your FPS could still take a hit if you didn't have high enough frames. Thanks for the explanation, Franz. I could see some emulation enthusiasts enjoying that!
|
# ? Dec 12, 2013 21:24 |
|
Without vsync turned on you still have tearing, it likely just doesn't bother you very much. It's never really bothered me a ton either. It's likely to be much more noticeable if you're playing FPS games vs something like an RTS though.
|
# ? Dec 12, 2013 21:29 |
|
Or you can double-buffer without vsync, but that still gives you stutter. If you're reliably getting 90 FPS, though, the stutter may not be noticeable.
|
# ? Dec 12, 2013 21:35 |
|
Thanks a lot. Looks like my card isn't unlockable, but it's good to know one way or the other.
|
# ? Dec 12, 2013 22:10 |
|
veedub, the only way you wouldn't notice it is if you're constantly under 30 FPS, since that's the longest it can tell the GPU to hold a frame for. Or if you just don't "see" tearing (some people don't, god bless you). And FF, it is basically a direct-replacement FPGA for the scaler unit, good eye.
|
# ? Dec 12, 2013 23:01 |
|
I sometimes see it, but it's not all that often. Half the time I'm playing I'm drunk as poo poo anyway so maybe that is the solution.
|
# ? Dec 12, 2013 23:11 |
|
movax posted:Go for it, and if it turns into a twisted parody of [H] build threads, I'm OK with that too. We could use more threads than just the megathreads floating around. Now I just browse it to see their SoapBox forum go full retard like it's Free Republic Jr.
|
# ? Dec 13, 2013 00:53 |
|
So, um, speaking of [H], does this type of thing happen often? It doesn't, but is it a known problem for DirectCU cards? This was brought up in another non-[H] forum that I frequent, and it was put forth alongside some other threads detailing lovely encounters with ASUS's customer service over returns. Does ASUS have a known bad reputation for this kind of stuff, or is it an example of a vocal minority spreading poo poo?
|
# ? Dec 13, 2013 02:53 |
|
Yeah, their videocards have been very inconsistent for as long as they've been making them. They try to leverage their motherboard design experience and stray very far from the reference designs, but they aren't very good at it yet.
|
# ? Dec 13, 2013 03:15 |
|
I'm a bit sad that this first-generation G-Sync does away with the hardware scaler. I know I'm a small minority, but I like to plug consoles into my monitor too. Those often need scaling-up.
|
# ? Dec 13, 2013 03:25 |
|
I think every board maker's CS except EVGA is godawful.
|
# ? Dec 13, 2013 04:35 |