|
power crystals posted:This is the problem with USB now. It can do a lot of things, but who knows what any given device or port supports. Maybe your device can fast charge at 20V, but the charger only supports 15V. Maybe a USB port supports displayport alternate mode, maybe it doesn't. Maybe it's just a USB 2 port in a C shape. Maybe it's thunderbolt. Who the gently caress knows! But at least now it's all the same cable, except when those cables get the power delivery spec wrong and fry your device. ???
|
# ? Jan 15, 2023 21:25 |
|
|
power crystals posted:This is the problem with USB now. It can do a lot of things, but who knows what any given device or port supports. Maybe your device can fast charge at 20V, but the charger only supports 15V. Maybe a USB port supports displayport alternate mode, maybe it doesn't. Maybe it's just a USB 2 port in a C shape. Maybe it's thunderbolt. Who the gently caress knows! But at least now it's all the same cable, except when those cables get the power delivery spec wrong and fry your device. On the data side, I agree with you; all the optional parts suck and it should be clear what a port supports in terms of transfer and display. But for charging, it's totally fine. Some devices only want 18w and can use a cheap 18w charger, just like before. What's better than before though, is if you want you can take one 100w charger and use it for all your devices, including the lower power ones. And you can buy a handful of chargers and seed them throughout your home, knowing they'll work for everything. And you can buy devices without chargers, since you can assume backwards compatibility with what you had before. Like yes there's no magic bullet that will make a 10w adapter put out 100w, but you can buy a few 60w ones and they will cover all of your needs, including low power ones. It is way, way better. I bring one cable and one adapter when I travel now and it's glorious.
|
# ? Jan 15, 2023 21:45 |
|
in a pinch you can take advantage of under-speced chargers too, a phone charger won't charge a laptop particularly quickly but it will charge it, and that's better than nothing if it's all you have to hand
|
# ? Jan 15, 2023 21:59 |
|
repiv posted:in a pinch you can take advantage of under-speced chargers too, a phone charger won't charge a laptop particularly quickly but it will charge it, and that's better than nothing if it's all you have to hand I specifically use underpowered chargers on airplanes because airlines will often shut off outlets if you try to pull over 65w, and I can actually power my laptop on that fine if I'm not pushing it. It's more flexible than anything we've had before, and can legitimately reduce e-waste as well.
|
# ? Jan 15, 2023 22:05 |
|
VorpalFish posted:The only acceptable alternative to usbc for charging small consumer electronics is magsafe, and even then only if usbc charging is also enabled. no this is the correct opinion
|
# ? Jan 15, 2023 22:09 |
|
repiv posted:in a pinch you can take advantage of under-speced chargers too, a phone charger won't charge a laptop particularly quickly but it will charge it, and that's better than nothing if it's all you have to hand IIRC a USB-C port (without a PD marking) could be in “spec” and provide as little as 100mA in USB 1.1 mode. Normally you’d expect any USB-C port to provide 1.5A/5V at a minimum which is still only 7.5W and not enough to trickle charge a laptop from what I’ve seen.
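For reference, the way a plain (non-PD) USB-C source advertises current is just a pull-up resistor (Rp) on the CC line that the sink measures. A rough sketch, with Rp values from my reading of the Type-C spec (5V pull-up case) — treat the numbers as illustrative:

```python
# Non-PD USB-C current advertisement: the source's Rp pull-up on CC
# tells the sink what it may draw at 5V. No negotiation involved.
RP_ADVERTISEMENT = {
    56_000: 0.5,   # "Default USB power": USB 2.0 500mA (900mA for USB 3.x)
    22_000: 1.5,   # 1.5A @ 5V
    10_000: 3.0,   # 3.0A @ 5V
}

def max_watts(rp_ohms, volts=5.0):
    """Max power a sink may draw given the source's Rp pull-up."""
    return volts * RP_ADVERTISEMENT[rp_ohms]

print(max_watts(22_000))  # 7.5 -- the "barely enough for a laptop" case
print(max_watts(10_000))  # 15.0
```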
|
# ? Jan 15, 2023 22:13 |
|
i meant more along the lines of using a 25w PD charger (typical for newer phones) to charge a laptop that expects 65w or more. there is a lower bound where the laptop just won't charge if you give it comically little wattage, yeah
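For anyone curious how that works under the hood: a PD source advertises a list of fixed voltage/current profiles (PDOs) and the sink requests the best one it can use. A toy model (the profile numbers are illustrative, not from any real charger):

```python
# Toy model of USB-PD negotiation: the source advertises fixed
# voltage/current profiles; the sink requests the highest-wattage
# profile at a voltage it supports.

def negotiate(source_pdos, sink_voltages):
    """Return the (volts, amps) profile the sink would request, or None."""
    usable = [(v, a) for (v, a) in source_pdos if v in sink_voltages]
    if not usable:
        return None
    return max(usable, key=lambda p: p[0] * p[1])  # pick max wattage

# A 25w phone charger: 5V/3A and 9V/2.77A
phone_charger = [(5, 3.0), (9, 2.77)]

# A laptop whose input circuitry accepts 5V, 9V, 15V, or 20V
laptop = {5, 9, 15, 20}
print(negotiate(phone_charger, laptop))   # (9, 2.77) -> ~25W, slow but it charges

# A laptop that only takes 20V gets nothing at all
picky_laptop = {20}
print(negotiate(phone_charger, picky_laptop))  # None
```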
|
# ? Jan 15, 2023 22:14 |
|
Apparently the USB-C labeling situation is supposed to improve, finally. https://www.howtogeek.com/864375/psa-updated-usb-logos-make-selecting-the-right-cable-simple/
|
# ? Jan 15, 2023 23:04 |
|
CaptainSarcastic posted:Apparently the USB-C labeling situation is supposed to improve, finally. quote:Manufacturers have the option to print these logos on devices in addition to the box, so you may see them clearly indicated on a laptop’s USB ports, for example—but that’s not required. You won’t necessarily see them printed physically on your devices. So, almost nobody will actually do it, got it.
|
# ? Jan 15, 2023 23:24 |
|
power crystals posted:So, almost nobody will actually do it, got it. Yeah, that's not great, but what I found more important was this: quote:For USB cables, USB-C cables will have updated packaging and cable logo, indicating both the speed capacity of the cable plus the wattage capacity for charging. So rather than having to decode what some initials and a squiggle mean, you can simply read “Certified USB 80Gbps 60W” right on logo, which you’ll find both on the box and the cable itself.
|
# ? Jan 15, 2023 23:30 |
|
CaptainSarcastic posted:Yeah, that's not great, but what I found more important was this: Yeah that part is a massive improvement. Insane that that wasn’t in the spec from day 1.
|
# ? Jan 15, 2023 23:31 |
|
The vast majority of no-name USB cables from non-parties to USB-IF will still ignore it, but the retail cables from real brands will hopefully follow it. USB is still an absolute mess, but hopefully this will cut down on a small part of the confusion in the near future where the USB-C connector itself will be used for data, power, display, and everything else for every device.
|
# ? Jan 15, 2023 23:38 |
|
Cygni posted:The vast majority of no-name USB cables from non-parties to USB-IF will still ignore it, but the retail cables from real brands will hopefully follow it. USB is still an absolute mess, but hopefully this will cut down on a small part of the confusion in the near future where the USB-C connector itself will be used for data, power, display, and everything else for every device. Agreed. As it is I have carefully separated out my various USB-C cables and keep the ones that can carry higher power and faster data apart from the generic ones, but it makes organizing my cables that much more of a pain.
|
# ? Jan 16, 2023 01:39 |
|
hobbesmaster posted:IIRC a USB-C port (without a PD marking) could be in “spec” and provide as little as 100mA in USB 1.1 mode. Normally you’d expect any USB-C port to provide 1.5A/5V at a minimum which is still only 7.5W and not enough to trickle charge a laptop from what I’ve seen. Isn't that minimum spec current from pre-USB-C/PD days? it's hard to find those things anymore. Even the cheap USB-C non-PD-aware charging ports on random chargers or those lovely international plug adapters I buy because I forgot to pack the good one will usually do 15 W (5V x 3A) or 18W without PD. I too seed the house with a few chargers and don't worry about it anymore. It's not perfect but it's a ~better world~. CaptainSarcastic posted:Agreed. As it is I have carefully separated out my various USB-C cables and keep the ones that can carry higher power and faster data apart from the generic ones, but it makes organizing my cables that much more of a pain. The 5A cables are palpably thicker, that's how I distinguish them so far. The only USB C data stuff I do is pigtails on hubs which then usually distribute to USB A ports, so that's a lot simpler too. Do wish most USB cables carried actual data rate markings like e.g. Cat 5/6 cable does. E: I do have one USB C to DisplayPort cable which can do 8K/60Hz or 32.4Gbps and only costs $19. That's pretty stellar and I was quite surprised. v1ld fucked around with this message at 03:23 on Jan 16, 2023 |
# ? Jan 16, 2023 03:17 |
|
power crystals posted:This is the problem with USB now. It can do a lot of things, but who knows what any given device or port supports. Maybe your device can fast charge at 20V, but the charger only supports 15V. Maybe a USB port supports displayport alternate mode, maybe it doesn't. Maybe it's just a USB 2 port in a C shape. Maybe it's thunderbolt. Who the gently caress knows! But at least now it's all the same cable, except when those cables get the power delivery spec wrong and fry your device. or burn your house down! repiv posted:i meant more along the lines of using a 25w PD charger (typical for newer phones) to charge a laptop that expects 65w or more if the 25w phone charger only outputs 9v, and the laptop expects 20v from the power brick it came with, it may or may not charge. depends if the maker spent the extra buck fifty to add the needed converters.
|
# ? Jan 16, 2023 04:57 |
|
Unrelated, but my new car lets me toggle between Normal and Sport driving modes, the latter of which turns the speedometer and tachometer from blue to red. I have dubbed it RAGEMODE.
|
# ? Jan 17, 2023 00:58 |
|
eagerly awaiting the reintroduction of "maxx" to AMD branding
|
# ? Jan 17, 2023 04:07 |
|
Can we bring back the hot 3d ladies on the boxes too
|
# ? Jan 17, 2023 04:18 |
|
Anime Schoolgirl posted:eagerly awaiting the reintroduction of "maxx" to AMD branding "Eco mode" to be renamed "Goblin mode"
|
# ? Jan 17, 2023 04:21 |
Klyith posted:Yeah your mobo is what's hosed. Both A slots connect to the same pins on the CPU, so if only one of them works it's gotta be something on the mobo. You were correct. I tried to take the CPU back to Micro Center to exchange it and they were out of stock but they did have the same mobo, so I went ahead and bought that to switch out. Everything's working great now, just have to return the old mobo to Amazon.
|
|
# ? Jan 17, 2023 06:26 |
|
New Zealand can eat me posted:Can we bring back the hot 3d ladies on the boxes too What about the hot Palit GPU frog?
|
# ? Jan 17, 2023 08:29 |
|
Criss-cross posted:What about the hot Palit GPU frog? Sir, behave.
|
# ? Jan 17, 2023 11:01 |
|
gradenko_2000 posted:"Eco mode" to be renamed "Goblin mode" ?
|
# ? Jan 17, 2023 16:02 |
|
perfect in every way
|
# ? Jan 17, 2023 16:59 |
|
90s and 00s GPU boxes are my favourite genre of art.
|
# ? Jan 17, 2023 18:05 |
|
Criss-cross posted:What about the hot Palit GPU frog? There's something wrong with my frog but I'm okay with it
|
# ? Jan 17, 2023 22:27 |
|
^ I have this card. It's on the shroud too. Wait, no, I'm wrong. Mine has that elf but she's holding a laser weapon of some sort. Sorry everyone!
|
# ? Jan 18, 2023 02:37 |
|
I know that Type R stickers add 5HP, but how many FPS do Space Elf Titties add?
|
# ? Jan 18, 2023 05:21 |
|
To horny jail with all of you, by decree of GPU Gandalf
|
# ? Jan 18, 2023 10:08 |
|
Only ever bought one weird GPU box, and it was unconventional even to try to get open. (mine was the GT, though ) I also had one motherboard that looked like it had a Yu-Gi-Oh monster on the box, and it was bar none the worst motherboard I've ever owned in history. Even including my Socket 7 board with the VIA chipset.
|
# ? Jan 18, 2023 13:50 |
|
"Defenders of the Gaming World"
|
# ? Jan 18, 2023 14:11 |
|
https://www.tomshardware.com/news/micron-unveils-24gb-and-48gb-ddr5-memory-modules disgusting, RAM comes in powers of two
|
# ? Jan 18, 2023 15:32 |
|
someone explain that article to me, I got to this part and my brain just ground to a halt. quote:Typically, 24GB and 48GB capacities are considered to be optimal for new-generation server platforms as they allow systems to precisely balance memory capacity and the number of cores, which ultimately means lower costs. Meanwhile, support for AMD EXPO and Intel XMP 3.0 profiles designed primarily for enthusiasts in mind indicate that these modules are indeed aimed at desktops.
|
# ? Jan 18, 2023 16:19 |
It sounds like someone copied marketing speak, because it's absolutely as idiotic as you think it is. Even if virtual memory didn't exist, main memory isn't partitioned.
|
|
# ? Jan 18, 2023 16:23 |
|
More charitably, I'd suggest that a given parallelizable workload will use additional memory for each additional process or core that it's able to use, because that's certainly true for my group? So you've got process X, that can only saturate 1 core, and uses a max of 4GB of RAM. If you want to effectively fill 24 cores, you had better have at least 96GB of RAM. It makes sense to me?
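A sketch of that back-of-envelope sizing (the 24-core and 4GB-per-process figures are just the example from the post; real workloads also need OS and cache headroom on top):

```python
def ram_needed_gb(cores, gb_per_process):
    """Minimum RAM to keep every core busy with one single-threaded worker."""
    return cores * gb_per_process

# Process X: saturates 1 core, uses at most 4GB
print(ram_needed_gb(24, 4))  # 96
```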
|
# ? Jan 18, 2023 16:27 |
|
Twerk from Home posted:More charitably, I'd suggest that a given parallelizable workload will use additional memory for each additional process or core that it's able to use, because that's certainly true for my group? So you've got process X, that can only saturate 1 core, and uses a max of 4GB of RAM. If you want to effectively fill 24 cores, you had better have at least 96GB of RAM. It makes sense to me? In that scenario how is 96GB of RAM "more optimal" than 128GB of RAM though?
|
# ? Jan 18, 2023 16:32 |
|
Charles Leclerc posted:In that scenario how is 96GB of RAM "more optimal" than 128GB of RAM though? It's enough for your job and is cheaper. "More than you need" isn't optimal for anyone but enthusiasts.
|
# ? Jan 18, 2023 16:33 |
|
I think this is a much bigger problem in servers, and these weird-capacity DIMMs are going to be super useful there. Current Xeons have 8 memory channels, and current Epycs have 12. You need a matched set of DIMMs to get full memory bandwidth out of these. What do you do if your 2-socket workload needs 1TB of RAM, but 32GB DIMMs * 12 channels * 2 sockets gives you 768GB of RAM, and 64GB DIMMs * 12 channels * 2 sockets gives you 1.5TB? Are you really going to pay for an extra 512GB of memory that you don't need?
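For concreteness, here's that capacity math with the new 48GB parts added in (a quick sketch; one DIMM per channel assumed, channel and socket counts from the post):

```python
# Total capacity of a fully populated dual-socket box at one DIMM per
# channel, for each DIMM size. 12 channels/socket, 2 sockets.
CHANNELS_PER_SOCKET = 12
SOCKETS = 2

def total_gb(dimm_gb):
    return dimm_gb * CHANNELS_PER_SOCKET * SOCKETS

for dimm in (32, 48, 64):
    print(f"{dimm}GB DIMMs -> {total_gb(dimm)}GB total")
# 32GB DIMMs -> 768GB  (short of the 1TB target)
# 48GB DIMMs -> 1152GB (just clears 1TB)
# 64GB DIMMs -> 1536GB (512GB you didn't need)
```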
|
# ? Jan 18, 2023 16:37 |
Twerk from Home posted:More charitably, I'd suggest that a given parallelizable workload will use additional memory for each additional process or core that it's able to use, because that's certainly true for my group? So you've got process X, that can only saturate 1 core, and uses a max of 4GB of RAM. If you want to effectively fill 24 cores, you had better have at least 96GB of RAM. It makes sense to me? Even putting aside how you can know exactly how much memory a process uses, modern memory systems will page out inactive data into swap in order to fit more often-used pages. If you aren't using FreeBSD which has a unified buffer cache for everything, you also have to account for everything from filesystem caching (both for your primary filesystem as well as things like NFS), SysV/POSIX shared memory segments, and lots of other things. If you're also doing substantial network traffic, there's the kernel itself to consider too. Ultimately, if you're doing any kind of computing, keeping things in memory cache is a lot better for performance than keeping it on disk - so you always want to max out on memory, unless what you're working on fits entirely within a cacheline.
|
|
# ? Jan 18, 2023 16:37 |
|
|
BlankSystemDaemon posted:That absolutely makes sense on paper, but in reality memory allocation doesn't work like that. Sure, I simplified it some and you need overhead for the OS, but once you've got enough RAM for your workload + the most active pages in cache / buffers, adding additional RAM is not going to do very much and memory is expensive. You don't want to just buy 50% more RAM than you can effectively use. quote:Ultimately, if you're doing any kind of computing, keeping things in memory cache is a lot better for performance than keeping it on disk - so you always want to max out on memory, unless what you're working on fits entirely within a cacheline. Everything is a trade-off. Should we be recommending 128GB kits for everyone in the PC building thread? Should I be more than doubling the cost of our compute nodes by throwing 16x 128GB LRDIMMs in them, because "you always want to max out on memory"?
|
# ? Jan 18, 2023 16:40 |