Enos Cabell
Nov 3, 2004


wolrah posted:

I'm a block and a half from corn fields in two different directions and I have 500 Mbit/sec cable service. The next town over has gigabit muni fiber. In between us, however, are a lot of unfortunate souls whose only choices are LTE-based services or Frontier DSL that advertises 24/3 and might do 10/1 if you're lucky.

There's very little that anyone can accurately generalize about American internet service, other than that the vast majority of households do not have meaningful competition between wireline providers.

Nebraska? I've got gigabit up/down here, and my in-laws 30 minutes away have 1 Mbit satellite as their only option.


Cao Ni Ma
May 25, 2010



EdEddnEddy posted:

The question I have to ask: have you actually tried VR? And not just phone VR or a roller coaster experience, but actually strapped on a Vive in a proper room-scale setup and played any of the excellent experiences that are currently available? (Or sat in a game such as Elite Dangerous or Project Cars? Yes, a flight sim and a driving sim, which VR was good for even before the room-scale stuff came about.)

It always seems the detractors of the tech are people that have no actual experience with it, or only a short bit of poor experience.

VR has a lot of downsides, such as the price of entry, hardware requirements, and headset visual quality, but the good experiences are already there and continue to get better as we learn a hell of a lot more than we knew prior to ~2014.

Is it perfect yet? Hell no, but it is really, really good when it is set up well and designed from a VR-focused base. (Normal screen games converted to VR don't count, as it's a crapshoot whether they will be converted in a way that is actually good or be complete garbage. Similar to how 3D movies made out of 2D movies were usually terrible compared to a movie actually shot in 3D in the first place.)

I'd buy VR just for something like Elite, but it's a steep price of entry for just one game. The Rift S being $400 is making me consider buying one though.

Enos Cabell
Nov 3, 2004


Cao Ni Ma posted:

I'd buy VR just for something like Elite, but it's a steep price of entry for just one game. The Rift S being $400 is making me consider buying one though.

Hit up the VR thread in Games, but for simulators one of the cheaper WMR headsets can be a great option.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

I mean, PC gaming might be dying but no more than desktop PCs in general are dying, in favor of tablets and mobiles

joe football
Dec 22, 2012

Comfy Fleece Sweater posted:

I mean, PC gaming might be dying but no more than desktop PCs in general are dying, in favor of tablets and mobiles

Tablet sales are also declining and smartphone sales are leveling off. I'm afraid the entire concept of electronic computing is on death's door. We're going to have to go back to rooms full of classy ladies doing math soon

wolrah
May 8, 2006
what?

Enos Cabell posted:

Nebraska? I've got gigabit up/down here, and my in-laws 30 minutes away have 1 Mbit satellite as their only option.

Ohio. I used to live in one of those kinds of areas, about 20 miles from where I am now. Verizon actually accidentally sold us DSL out there when we were officially out of range. It'd work at sub-megabit speeds most of the time, but it got pretty lossy at times and would totally drop out on hot summer days, because the above-ground lines apparently stretched out and we were so marginal it'd just stop working. We kept it because our other choices were 26.4 kbit/sec dialup or satellite. Since that experience I have had ISP availability as a primary criterion when looking at places to live.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Comfy Fleece Sweater posted:

I mean, PC gaming might be dying but no more than desktop PCs in general are dying, in favor of tablets and mobiles

The percentage of gaming PCs versus home desktops for productivity and general usage shows that gaming PCs are gaining.

https://www.youtube.com/watch?v=qioXYSED-MM

(if you can't speak Spanish, you know what this clip is anyway)

fknlo
Jul 6, 2009


Fun Shoe
I live in a town of 80,000 in the middle of the country and have fiber. It's also run by the city and costs $50/month. Before that I also had a fiber connection in the middle of flyover country. You can get good internet away from the coasts, but I know it's not even remotely common, and I've been very fortunate to have that kind of connection for almost a decade now.

Dodoman
Feb 26, 2009



A moment of laxity
A lifetime of regret
Lipstick Apathy

Happy_Misanthrope posted:

I thought Nvidia's multi-display GPU clock issue was fixed? My clocks go from 300 to 1200 MHz when I extend my display to my TV. Both are '60' Hz; the only difference is my TV is likely at 59.94 Hz vs the true 60 Hz of my monitor. Both are 4K.

Edit: Tried the multi-monitor performance mode setting in the control panel btw, no diff.

Try the Multi Display Power Saver that comes with NVIDIA Inspector; it's the only tool that has reliably worked for me

sauer kraut
Oct 2, 2004
I haven't seen this posted yet, so behold Nvidia's fix for the big RTX cards burning up inside:



Yes this is how they come from the factory now.
(German) https://www.tomshw.de/2019/04/17/ka...ueck-der-woche/

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Bazooka Joe to the rescue!

(Bazooka Joe is bubble gum)

eames
May 9, 2009

That is an RTX Titan in the picture. Nothing says $2,500 like five different thermal interface compounds on one cooler.

The EVGA 2080ti Kingpin has a much more elegant copper heatspreader; it's briefly visible in a GN video.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

eames posted:

That is an RTX Titan in the picture. Nothing says $2,500 like five different thermal interface compounds on one cooler.

The EVGA 2080ti Kingpin has a much more elegant copper heatspreader; it's briefly visible in a GN video.

On a $300 card, yeah, okay I get that. Kind of.

On the fuckin Titan? lol gently caress off with that poo poo

E: I mean for $2500 it should have poo poo I haven’t even heard of for cooling

Icept
Jul 11, 2001

Rexxed posted:

Bazooka Joe to the rescue!

(Bazooka Joe is bubble gum)

Yeah, no joke, that looks like the underside of an elementary school desk

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

tehinternet posted:

On a $300 card, yeah, okay I get that. Kind of.

On the fuckin Titan? lol gently caress off with that poo poo

E: I mean for $2500 it should have poo poo I haven’t even heard of for cooling

The one thing you can always be sure of about companies is that they love saving money more than you do

eames
May 9, 2009

Does anybody understand why they went with this solid cooling plate covering the entire front of the PCB rather than letting the airflow from the fans hit the components and PCB directly? Surely that would have been more effective than the contraption pictured above? Even if this thermal pad workaround solves the "sudden death" issue, there's little doubt in my mind that these cards will have a shorter lifetime than they could have had with better cooling.

I found a picture of EVGA's memory heatspreader; apparently they already used it for their watercooled Pascal cards.


Stanley Pain
Jun 16, 2001

by Fluffdaddy

sauer kraut posted:

I haven't seen this posted yet, behold Nvidias fix for the big RTX cards burning up inside:



Yes this is how they come from the factory now.
(German) https://www.tomshw.de/2019/04/17/ka...ueck-der-woche/

:dogbutton: :sad:


So glad I cancelled my 2080ti preorder and got a regular 2080 instead.

VelociBacon
Dec 8, 2009

Stanley Pain posted:

:dogbutton: :sad:


So glad I cancelled my 2080ti preorder and got a regular 2080 instead.

Just never buy first-party cards.

Cygni
Nov 12, 2005

raring to post

Thermal pad spam is what all of the high-end cards do, and have done for a few generations, if you watch the GN teardowns. That one seems a bit over the top, but that cooler is over the top in a whole lot of ways (like screw count/type).

Y'all complain about weird stuff.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's a very reasonable way to dissipate heat from a bunch of different surfaces that don't need great thermal assistance, but definitely need some.

Purgatory Glory
Feb 20, 2005
So how big of a deal is PCIe 4.0? Is the throughput increase noticeable everywhere? And now 5.0 is coming soon after...

VelociBacon
Dec 8, 2009

Purgatory Glory posted:

So how big of a deal is PCIe 4.0? Is the throughput increase noticeable everywhere? And now 5.0 is coming soon after...

Only extreme edge cases (some 2080ti workloads) saturate x16 PCIe 3.0 lanes, and only then by a tiny amount, so for GPUs it's not really news of interest.

Setset
Apr 14, 2012
Grimey Drawer

VelociBacon posted:

Only extreme edge cases (some 2080ti workloads) saturate x16 PCIe 3.0 lanes, and only then by a tiny amount, so for GPUs it's not really news of interest.

Unless your CPU doesn't have many PCIe lanes and you want to use them for other things; then x8 lanes of PCIe 4.0 would be sufficient.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
What situations cause the need for PCIe lanes? I hear about it a lot but I don’t really see it mentioned outside these forums. Is it less a PC gaming thing and more some other thing?

Setset
Apr 14, 2012
Grimey Drawer

buglord posted:

What situations cause the need for PCIe lanes? I hear about it a lot but I don’t really see it mentioned outside these forums. Is it less a PC gaming thing and more some other thing?

NVMe uses 4, Thunderbolt can use 4, and 10-gigabit Ethernet can use 4. And your chipset (not CPU lanes) can only assign lanes in groups of 4, so anything that takes 1 lane is going to pull 4.

Yeah, it's not a huge deal, but the 9900K only has 16 lanes to begin with. Chipsets provide another ~24 or so.
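To make the lane math concrete, here's a rough budget sketch in Python. The totals come from the post above (16 CPU lanes on a 9900K, ~24 from the chipset); the device list and its lane counts are illustrative assumptions, not a spec sheet.

```python
# Rough PCIe lane budget for a 9900K-class system.
# Totals are from the post above; the device list is a made-up example.

CPU_LANES = 16      # typically x16 to the GPU, or split x8/x8
CHIPSET_LANES = 24  # shared devices hang off the chipset

cpu_devices = {"GPU (x16 slot)": 16}
chipset_devices = {"NVMe SSD": 4, "Thunderbolt": 4, "10GbE NIC": 4}

cpu_used = sum(cpu_devices.values())
chipset_used = sum(chipset_devices.values())

print(f"CPU lanes:     {cpu_used}/{CPU_LANES} used")
print(f"Chipset lanes: {chipset_used}/{CHIPSET_LANES} used")
```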

Cygni
Nov 12, 2005

raring to post

buglord posted:

What situations cause the need for PCIe lanes? I hear about it a lot but I don’t really see it mentioned outside these forums. Is it less a PC gaming thing and more some other thing?

Bandwidth needs. Or, if you want to think about it another way, communication needs.

A little Wi-Fi card or sound card doesn't need a ton of bandwidth, because the data it's working on is relatively small and doesn't need to be passed around to a bunch of other locations in the system. On the other side of the spectrum, something like a video card (or the interconnect between two CPUs in a server) could be moving huge amounts of data between places like system RAM, compute units, etc.
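As a rough illustration of that spectrum, here's a quick sketch comparing link capacity against ballpark device appetite. PCIe 3.0 moves about 0.985 GB/s per lane; the per-device estimates are my own rough assumptions for illustration.

```python
# Link capacity vs. rough bandwidth appetite on PCIe 3.0 (~0.985 GB/s/lane).
# Device needs are ballpark assumptions, not measurements.
PCIE3_PER_LANE_GBS = 0.985

devices = {              # name: (lanes, rough peak GB/s needed)
    "Wi-Fi card":   (1,  0.1),
    "Sound card":   (1,  0.01),
    "NVMe SSD":     (4,  3.5),
    "High-end GPU": (16, 12.0),
}

for name, (lanes, need) in devices.items():
    cap = lanes * PCIE3_PER_LANE_GBS
    print(f"{name:12s} x{lanes:<2d} = {cap:5.2f} GB/s capacity, ~{need} GB/s needed")
```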

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

VelociBacon posted:

Only extreme edge cases (some 2080ti) saturate 16x PCI-E 3.0 lanes, and only then by a tiny amount, so for GPUs it's not really any news of interest.

2080Tis saturate PCIe 2.0 lanes on occasion. It's going to be quite some time before anything saturates x16 3.0 lanes, and even the first crop of PCIe 4.0 storage solutions are only adding about 500-750 MB/sec on top of the current top-tier NVMe drives.

In short, PCIe 4.0 might be nice if there's an easy way to control lane allocation to enable more/better use of add-in cards. CF/SLI/NVLink might not be worth it, but for ML applications and the like, being able to theoretically run four GPGPUs on a *consumer* board at PCIe 4.0 x4/x4/x4/x4 (which would give each card the equivalent of PCIe 3.0 x8) would help a lot of hobbyists and students not have to spend a small fortune on a super-expensive PC.
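The generation math checks out: each PCIe generation doubles the per-lane signalling rate, so a 4.0 x4 link carries the same bandwidth as 3.0 x8 or 2.0 x16. A quick sanity check, using the raw transfer rates and encodings from the PCIe specs:

```python
# Per-lane PCIe bandwidth by generation: transfer rate (GT/s) times
# encoding efficiency (8b/10b for 2.0, 128b/130b for 3.0 and 4.0).
GENS = {
    "2.0": (5.0,  8 / 10),
    "3.0": (8.0,  128 / 130),
    "4.0": (16.0, 128 / 130),
}

def link_gbs(gen: str, lanes: int) -> float:
    rate, eff = GENS[gen]
    return rate * eff / 8 * lanes  # Gbit/s -> GB/s, times lane count

print(f"4.0 x4  = {link_gbs('4.0', 4):.2f} GB/s")   # ~7.88
print(f"3.0 x8  = {link_gbs('3.0', 8):.2f} GB/s")   # ~7.88
print(f"2.0 x16 = {link_gbs('2.0', 16):.2f} GB/s")  # 8.00
```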

VelociBacon
Dec 8, 2009

Oh sorry. I thought the GN test was on 3.0?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

VelociBacon posted:

Oh sorry. I thought the GN test was on 3.0?

It might have been, but I remember the results showing that at the absolute limit, the 2080Ti would *just* barely saturate and slightly exceed the maximum bandwidth of a PCIe 2.0 x16 connection.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

BIG HEADLINE posted:

It might have been, but I remember the results showing that at the absolute limit, the 2080Ti would *just* barely saturate and slightly exceed the maximum bandwidth of a PCIe 2.0 x16 connection.

Only took ~12 years to saturate the PCIe 2.0 lanes on the X38 chipset

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Re PCIe 4.0: it isn't only the throughput. The faster and wider the link is, the lower the latency on the link as well. This is more important for machine learning GPU applications than graphics in general, but the link also responds quicker to turnaround data transfers.
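One way to see the latency point: the serialization delay alone (the time just to clock a payload's bits onto the wire, ignoring protocol overhead) shrinks as the link gets faster or wider. A toy calculation using effective per-lane rates:

```python
# Serialization delay for a payload: time to push the bytes through
# the link, ignoring protocol overhead. Note 1 GB/s ~ 1 byte/ns.
def serialization_ns(payload_bytes: int, lane_gbs: float, lanes: int) -> float:
    return payload_bytes / (lane_gbs * lanes)

# 4 KiB transfer over x16 links (effective per-lane GB/s)
for label, rate in [("PCIe 3.0", 0.985), ("PCIe 4.0", 1.969)]:
    print(f"{label} x16: {serialization_ns(4096, rate, 16):6.1f} ns")
```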

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Looking at the description of PCIe 5.0, it says x16 will run at 63 GB/s. Considering dual-channel 3200 DDR4 only has 51.2 GB/s of bandwidth, it sounds like the kind of thing that won't feature in the consumer space for years, even though the spec is supposed to be finalised shortly. Won't the motherboards cost a fortune too, given that the traces for PCIe 4.0 are supposed to be pretty difficult to create already?
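Both of those figures check out, for what it's worth:

```python
# PCIe 5.0 x16: 32 GT/s per lane with 128b/130b encoding
pcie5_x16 = 32 * (128 / 130) / 8 * 16   # GB/s
# Dual-channel DDR4-3200: 3200 MT/s * 8 bytes/transfer * 2 channels
ddr4_dual = 3200e6 * 8 * 2 / 1e9        # GB/s

print(f"PCIe 5.0 x16:           {pcie5_x16:.1f} GB/s")   # ~63.0
print(f"DDR4-3200 dual channel: {ddr4_dual:.1f} GB/s")   # 51.2
```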

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
We'll have DDR5 before PCIe 5.0, and Tom's Hardware just put out caps from a DDR5 presentation yesterday: https://www.tomshardware.com/news/what-we-know-ddr5-ram,39079.html

SwissArmyDruid
Feb 14, 2014

by sebmojo
oh loving brother, that one guy has a loving Navi video out.

ConanTheLibrarian posted:

Looking at the description of PCIe 5.0, it says x16 will run at 63 GB/s. Considering dual-channel 3200 DDR4 only has 51.2 GB/s of bandwidth, it sounds like the kind of thing that won't feature in the consumer space for years, even though the spec is supposed to be finalised shortly. Won't the motherboards cost a fortune too, given that the traces for PCIe 4.0 are supposed to be pretty difficult to create already?

No, because PCIe 4.0 is electrically identical to PCIe 3.0. The issue is signalling, not physical traces. It's why there is the prospect, with Zen 2, that PCIe 4.0 could come to existing motherboards in a limited capacity, that is, to the first slot only, with everything else remaining PCIe 3.0.

To get PCIe 4.0 out to all the slots you would need booster chips, which obviously won't be retrofitted to older boards but will probably be a thing on newer boards.

SwissArmyDruid fucked around with this message at 21:32 on Apr 19, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

What does a booster do / how does it do it?

Is it basically just running a second bidirectional garden hose from the well?

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Statutory Ape posted:

What does a booster do / how does it do it?

Is it basically just running a second bidirectional garden hose from the well?

A repeater, basically.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
PCIe repeaters fall into two categories: redrivers and retimers.

Redrivers are electrical-only: they boost the signal and reduce random jitter. They are just signal conditioners. PCIe equalization will be performed on the entire link, through the redriver.

Retimers are aware of the PCIe link equalization procedure, so each link segment connected to a retimer goes through equalization. They convert the high-speed analog signal to digital to extract the equalization training sets and then back to high-speed analog, so they're quite a bit more complex. They shouldn't show up in the PCIe device tree. However, another method would be to use two-port PCIe switches to retime links as well, which would show up in the tree and can add some additional latency depending on how the switch works.

IDT has a good explainer if you’re into that kind of thing:

https://www.idt.com/document/msc/pcie-gen3-retimer-frequently-asked-questions
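To condense that distinction into a cheat sheet, here's a small sketch; the field values just paraphrase the explanation above, as a mnemonic rather than a spec.

```python
# Redriver vs. retimer cheat sheet, paraphrasing the explanation above.
from dataclasses import dataclass

@dataclass
class Repeater:
    kind: str
    recovers_digital_data: bool  # retimers decode and re-transmit
    equalization_scope: str
    in_pcie_device_tree: bool

repeaters = [
    Repeater("redriver", False, "entire link, straight through", False),
    Repeater("retimer", True, "each segment equalizes separately", False),
]

for r in repeaters:
    print(r)
```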

Worf
Sep 12, 2017

If only Seth would love me like I love him!

A few pages back (I think it was this thread) we talked about going to the NV control panel and turning fast sync on. Seems to work great for me on a GTX 1060 hooked into a TCL TV.

Tried the same thing on the 1060 in my laptop, and the option wasn't there for the laptop screen.

Is it just not available, or? It's a Helios 300 with the 7th-gen i7, a 15.6" screen, and a 1060.

defaultluser
Jan 13, 2007

The person can drink sake for the following five reasons. First of all, for the national holiday. Moreover, it fills with the nectar. Finally, for reasons. Next, to heal the dryness of the place. After that, to refuse the future
Fun Shoe

Statutory Ape posted:

A few pages back (I think it was this thread) we talked about going to the NV control panel and turning fast sync on. Seems to work great for me on a GTX 1060 hooked into a TCL TV.

Tried the same thing on the 1060 in my laptop, and the option wasn't there for the laptop screen.

Is it just not available, or? It's a Helios 300 with the 7th-gen i7, a 15.6" screen, and a 1060.

If your laptop uses Optimus, you might not be able to use it, because all display outputs usually go through the Intel integrated graphics.

It's the same reason you can't drive a G-Sync external display on most Optimus laptops. You need the whole chain to support the feature, or you need a laptop where the discrete card has a dedicated display output.

defaultluser fucked around with this message at 16:46 on Apr 20, 2019


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

HalloKitty posted:

Interesting - GPU usage is higher and CPU usage lower with Vulkan, as one might expect, but there's clearly optimisation still needed

Stop thinking of low CPU utilization as being a good thing; think of it as "low CPU occupancy" instead. Your cores can't get enough work to keep them occupied.

The reason the CPU utilization is lower on Vulkan is probably that the dev sucks at optimizing drawcall submission, and NVIDIA's DX11 multithreading driver is clawing back some performance by spreading it across multiple threads. As a result you've got a single-thread bottleneck in Vulkan, which is why the framerate is lower.

Low CPU utilization with a low framerate is actually a bad thing; high utilization with a high framerate is a good thing (as long as utilization is not so high that you are getting stutter). Otherwise you've just got a bunch of cores (and your GPU) sitting there idle while the main thread tries to churn through fast enough to keep them busy. Single-thread bottlenecks are a practical reality in many titles, even in multithreaded engines.

(Low CPU utilization and a high framerate is great, of course! But given the choice I'll take a high framerate over a high percentage of my processor sitting idle any day.)

Paul MaudDib fucked around with this message at 19:18 on Apr 22, 2019
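A toy model of that argument: frame time is gated by the slowest serial stage, so spreading drawcall submission across threads can raise both utilization and framerate at once. All the numbers here are invented for illustration.

```python
# Frame time is gated by the slowest serial stage. Made-up numbers:
# 20 ms of total drawcall CPU work per frame, 8 ms of GPU work.
CORES = 8

def frame_stats(drawcall_ms: float, submit_threads: int, gpu_ms: float):
    cpu_ms = drawcall_ms / submit_threads   # idealized perfect scaling
    frame_ms = max(cpu_ms, gpu_ms)          # bottleneck sets the pace
    utilization = drawcall_ms / frame_ms / CORES * 100
    return 1000 / frame_ms, utilization

for threads in (1, 4):
    fps, util = frame_stats(20, threads, 8)
    print(f"{threads} submit thread(s): {fps:5.0f} fps, {util:4.1f}% total CPU")
```

With one submission thread the model gives 50 fps at ~12% total CPU; with four it gives 125 fps at ~31%, i.e. higher utilization and a higher framerate together, which is the post's point.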
