ExcessBLarg!
Sep 1, 2001

Splode posted:

I'm really enjoying pop as my first distro, out of curiosity why is ubuntu more popular than straight debian? It seems like lots of things are debian based but few people use it directly, why is that?
Debian has been around for a long time. Its biggest contribution was figuring out how to actually build a working community software distribution in a decentralized fashion, including all the ancillary tooling (package managers, distributed-build systems, bug reporting, etc.) needed to do it. Beyond that, Debian doesn't do much to actually brand itself, and it strives to be generalist: you can use the same distribution on a server, a desktop, a Raspberry Pi, or anything with a compatible CPU, and once you figure out how to run software on it, Debian has thousands of packages to install. So while people do run straight Debian, it's really better as a base on which purpose-specific distributions build themselves.

Ubuntu started as a desktop-focused distribution. Its original advantage over Debian was that it came with all the glue scripts necessary to make all the components of your laptop work out of the box (for a sufficiently popular laptop). In the quest for profitability Ubuntu has since branched off in other directions, namely cloud and IoT, often dragging its desktop users along with changes they're not necessarily interested in. Others have branched off of Ubuntu with either a more specific focus (gaming) or just to avoid the more controversial changes that Canonical was/is making.

I run Ubuntu (specifically Lubuntu) on desktops where I want things to just work without having to configure much, and Debian most everywhere else (servers, Chrome OS chroots, etc.).

ExcessBLarg!
Sep 1, 2001

Emong posted:

People are mostly annoyed that they're replacing normal packages with snaps invisibly.
It's only happened twice that I'm aware of: the LXD container manager and Firefox. In both cases it was done at the request of the upstream project.

LXD is annoying. I don't use it, but it's part of the default installation on Ubuntu Server, so the server install now invisibly pulls in the lxd snap. The "at the request of the upstream project" part may be true, but LXD is sponsored by Canonical, so they're also pretty clearly throwing their weight around here.
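If you don't want it on a server, removing it after the fact is at least straightforward (standard snap commands):

code:
snap list                        # see what the installer pulled in
sudo snap remove --purge lxd     # --purge skips saving a snapshot of its data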

Firefox makes some sense. Browsers are a priority target for 0-days, so keeping them up to date via a "multi-distribution" (or at least, multi-release) build makes sense versus Mozilla volunteers having to backport security updates to multiple LTS firefox-esr builds. Conveniently this gives an excuse for Ubuntu desktop installations to pull in the firefox snap, but I don't know, there are worse things in the world.

Snaps themselves are technically sound and the client is free software. The two major complaints are that the backend, the snapcraft.io store, is proprietary and owned by Canonical, and that snaps do forced updates, which isn't compatible with enterprises that have policies against unscheduled/untested updates. Oh, and I guess that flatpak already exists, although it doesn't fully cover the use cases for snaps.
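You can at least see and constrain the automatic refresh schedule, though not disable it outright; these are the standard snapd knobs as I understand them:

code:
snap refresh --time                                   # show the current refresh schedule
sudo snap set system refresh.timer=fri,23:00-01:00    # constrain automatic refreshes to a window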

ExcessBLarg! fucked around with this message at 19:38 on Sep 21, 2022

ExcessBLarg!
Sep 1, 2001
I don't think that's a fair characterization. They didn't change the format, just the compressor used for packages. You can still decompress Ubuntu-flavored debs on a Debian system "manually" (via ar, zstd, and tar; rough sketch after the list below), to the extent that using Ubuntu-flavored debs on a Debian system is of any utility in the first place. Debian (or a specific maintainer) is just dragging their feet on adding support that was requested four years ago. It's worth keeping in mind a couple of things:
  • Debian already supports multiple compression methods: gzip, lzma (deprecated), and xz, although their past experience with lzma might be contributing to the delay on zstd.
  • Literally everyone else has added support for zstd as a lossless compression method for their file formats, because the decompression speed gains are significant relative to the increase in compressed size. Even Linux has an in-kernel zstd decompressor now.
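The manual route is roughly this; the package filename is hypothetical, and the control member may be .xz or .zst depending on who built it:

code:
mkdir unpacked
ar x example_1.0_amd64.deb          # yields debian-binary, control.tar.*, data.tar.zst
zstd -d data.tar.zst -o data.tar    # decompress the payload
tar -xf data.tar -C unpacked        # unpack the actual files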

Why does Ubuntu want to change the compression method so "badly"? My guess is cloud-init. If you use AWS to boot Ubuntu AMIs with a cloud-init configuration that installs needed packages as part of your autoscaling solution, the time to decompress and install debs is actually a significant part of getting an instance up and running. Moving to zstd makes this significantly faster, which makes Ubuntu's cloud offerings more competitive (so, part of their monetization) but is a general benefit to anyone who uses Ubuntu on AWS, even in a free capacity. This is something Debian just doesn't do, so they're not as motivated to move quickly on it.
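To make that concrete, the kind of setup I mean looks something like this; the AMI ID and package list are made up:

code:
# Hypothetical user-data: cloud-init installs these on first boot, which is
# exactly where deb decompression time shows up in instance startup.
cat > user-data <<'EOF'
#cloud-config
package_update: true
packages:
  - nginx
  - postgresql-client
EOF
aws ec2 run-instances --image-id ami-00000000 --instance-type t3.micro --user-data file://user-data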

Sibling distributions won't agree on everything, but where they disagree, this is exactly the form of disagreement I want to see take place: the discussion is public and technically rooted.

ExcessBLarg!
Sep 1, 2001

Truga posted:

what i don't get is, multithreaded xz dpkg decompression is in works, and since it scales almost linearly with core count, it will make zstd speedup irrelevant unless you somehow only have a single core in tyool 2022, while still keeping the smaller filesize, which is always nice. instead of contributing to this solution, ubuntu just pushed their own through instead.
Again, everyone else has already switched to zstd, so there are clearly use cases where multithreaded decompression isn't sufficient.

And yes, Ubuntu is frequently run on single-core or CPU-constrained VM instances (AWS micro instances, or micro-VMs in general). It would suck to hit your CPU limit while booting an instance if cloud-init has to install xz-compressed packages. Plus there's the whole Ubuntu IoT initiative.
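If you want to see the gap yourself, a single-threaded comparison is easy enough to run; the archive names are made up and the actual numbers depend heavily on the CPU and package contents:

code:
time xz -dkc big-package.tar.xz > /dev/null
time zstd -dkc big-package.tar.zst > /dev/null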

Truga posted:

also, ubuntu packages are often useful on desktop debians, because many people still only package desktop apps for ubuntu despite the difference mostly not being there anymore
Are the people packaging out-of-tree Ubuntu packages even building them with zstd compression anyway? I don't think it's actually the default for dpkg-buildpackage.
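(If you're curious whether a given third-party deb uses zstd at all, it's easy to check; the filename is a placeholder:)

code:
ar t some-app_1.2.3_amd64.deb    # data.tar.xz means xz, data.tar.zst means zstd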

But all that aside, does Ubuntu not have a say in the dpkg format? Is it forever "owned" by Debian maintainers? Ubuntu waited three years for Debian to act or offer a sufficient compromise before making the switch for their own repos. I think that's a reasonable enough time to avoid introducing a "breaking" change.

Mind you, I've been running Debian for 25 years and I appreciate their stability and uh, "deliberateness" to change. But the zstd thing is a really silly thing to get hung up over.

ExcessBLarg!
Sep 1, 2001
I've been following the Intel Arc stuff a bit. I don't have a gaming PC, and even if I did, the last time I purchased a discrete GPU was a Riva TNT2. However, I've been using Intel's integrated graphics on Linux for quite a while, and they're to be commended for driving much of the effort on Linux DRM and improving the state of Linux graphics in general.

So Intel recently announced the Arc A770 for sale. The A770 16 GB is $349 (!). Price-wise that's insanely competitive. The real issues with Arc are that it's a new card that isn't "trusted" yet, and that its Windows OpenGL and DX 9/10/11 drivers underperform significantly.

I don't know if anyone has tried it for Linux gaming. Anyone know if review units have gone out? I'd be curious how it performs with DXVK--it might be an insanely good option if everything you're doing is Vulkan (or MesaGL I guess) anyways.

ExcessBLarg!
Sep 1, 2001
Anyone know if there's a "better" way to change the name of the current user in wine, other than symlinking the $WINEPREFIX/drive_c/users/$USER dir?

I need to make C:\Users\deck behave like C:\Users\steamuser, at least temporarily. The symlink does seem to work fine though.
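For reference, the symlink in question is roughly this; the prefix path and the direction of the link are specific to my setup:

code:
# Make apps that look for C:\Users\steamuser land in the existing "deck" profile.
ln -s "$WINEPREFIX/drive_c/users/deck" "$WINEPREFIX/drive_c/users/steamuser"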

ExcessBLarg!
Sep 1, 2001
Anyone here use DuckStation (standalone)? Is it possible to get it to automatically resize the render window?

Unless I start it in fullscreen, it always renders to a 640x480 window even if I change the internal resolution. I'm looking for something like Dolphin's "Auto-Adjust Window Size" option but I can't find something like that in the settings.

ExcessBLarg!
Sep 1, 2001

Klyith posted:

I would assume that a PC that is totally stable in Windows but has hard locks in Linux is not suffering from bad hardware.
It's entirely possible for bad hardware to be tickled by Linux in ways that it isn't by Windows, without it being due to a "bug" in any Linux driver.

ExcessBLarg!
Sep 1, 2001

Klyith posted:

Anything's possible. How likely is that though? If you had a PC that was 100% stable in Windows and was not so in Linux, would you spend your time doing hardware diagnostics and swapping out the PSU?
If I wanted to run Linux on it, yes, I would. The issue is either bad hardware or a buggy driver, and isolating the cause to the hardware or the relevant driver gives you more information to work with to make an informed decision on how to proceed.

Also, in my experience Linux rarely "hard locks", although I don't play around with Nvidia GPUs/nouveau. At least you usually get a panic message with a backtrace or a BUG() that suggests a culprit.

The last time I did have a Linux machine hard lock, it turned out to be an Intel microcode bug on a new processor stepping that was only tickled by a particular Linux driver for a PCIe peripheral card. I knew the card wasn't bad because it worked in other machines with the same driver. I didn't personally try Windows on it, but my understanding is that there were no reports of the issue from Windows users. It came down to Linux tickling the PCIe controller in ways that Windows didn't, which triggered the microcode bug. Unfortunately it took months for the card vendor to work with Intel and get them to issue a fix. If you read the errata for Intel firmware updates, though, you'll see that in the grand scheme these things do happen.
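(If you ever want to check what microcode revision you're actually running, something like this works on most x86 distros:)

code:
grep -m1 microcode /proc/cpuinfo    # currently loaded microcode revision
dmesg | grep -i microcode           # early-load messages from boot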

ExcessBLarg!
Sep 1, 2001
The whole point of AppImages is that you can pretend they're just one large binary. The only unusual dependency is libfuse2.
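On recent Ubuntu releases (22.04 and later ship fuse3 by default) you may need to install it yourself; the AppImage filename below is a placeholder:

code:
sudo apt install libfuse2      # FUSE 2 runtime that AppImages mount themselves with
chmod +x Some.AppImage
./Some.AppImage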

Flatpaks also run totally fine on Ubuntu. I get that you might not want both flatpaks and snaps installed, but functionally it's not an issue.
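Getting flatpak going there is just this, assuming you want Flathub as the remote (the VLC ID is only an example):

code:
sudo apt install flatpak
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install flathub org.videolan.VLC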
