Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


What are everyone's thoughts on OpenLogic? I'm not a linux expert, but this is the only Fedora-family distro supported in Azure. How do you guys feel about this?

evol262
Nov 30, 2010
#!/usr/bin/perl

Tab8715 posted:

What are everyone's thoughts on OpenLogic? I'm not a linux expert, but this is the only Fedora-family distro supported in Azure. How do you guys feel about this?

Never heard of it, but Google says they're just providing an SLA for CentOS on Azure?

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


evol262 posted:

Never heard of it, but Google says they're just providing an SLA for CentOS on Azure?

Appears to be. I've also heard that while RHEL isn't technically supported on Azure, if you upload a VM it'll work and support will still try to help.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

evol262 posted:

The use case for virtual machines seems to be totally lost on you either way.

I thought the post was written by Jon Hendren until I checked the username

karl fungus
May 6, 2011

Baeume sind auch Freunde
How good is Wacom Intuos tablet support on Linux? Also, which programs work the best with it?

evol262
Nov 30, 2010
#!/usr/bin/perl

Misogynist posted:

I thought the post was written by Jon Hendren until I checked the username

Is he on SA?

Tab8715 posted:

Appears to be. I've also heard that while RHEL isn't technically supported on Azure, if you upload a VM it'll work and support will still try to help.

I'm actually the maintainer of the RHEL guest images (the qcow for openstack/KVM, not the EC2 AMI or the docker image), so if you upload our official cloud image and it doesn't work, you can harass me. Also, if you use the official 6.6 image and it doesn't work, you can harass me and file a bug against 6.7, which may be able to get fast-tracked and fixed in time for the 6.7 GA.
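
For reference, registering the qcow on the OpenStack side looks something like this with the old glance CLI (image and file names here are just placeholders):

code:
glance image-create --name "rhel-guest-6.6" \
    --disk-format qcow2 --container-format bare \
    --is-public False --file rhel-guest-image-6.6.qcow2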

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

evol262 posted:

Is he on SA?

You serious? He's DocEvil and he wrote for the site for ten years.

evol262
Nov 30, 2010
#!/usr/bin/perl

Misogynist posted:

You serious? He's DocEvil and he wrote for the site for ten years.

To be honest, I don't think I've read the front page for almost a decade...

effika
Jun 19, 2005
Birds do not want you to know any more than you already do.

karl fungus posted:

How good is Wacom Intuos tablet support on Linux? Also, which programs work the best with it?

I think you asked this a page or two ago, so I did some Googling. It might work pretty ok? There's stuff about them on the Arch and Ubuntu wikis, and Wacom has a page about Linux.

GIMP should have some level of support; I remember seeing some plugins about tablets.

Try it and see, would be my advice.
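
A quick sanity check if you can get your hands on one first (xsetwacom ships with the xf86-input-wacom driver):

code:
lsusb | grep -i wacom        # does the kernel see the tablet on USB?
xinput list                  # does X see it as an input device?
xsetwacom --list devices     # does the wacom driver claim it?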

FWT THE CUTTER
Oct 16, 2007

weed
Krita is supposed to be good for tablets.

Progressive JPEG
Feb 19, 2003

evol262 posted:

Darktable is supposed to be ok

I use it a lot for editing raws and I think it's great, though I've never used lightroom, so maybe that's better still.

CaptainSarcastic posted:

I personally have gotten used to GIMP's multi-window mode

I tolerated multi window when it was the only option

nonathlon
Jul 9, 2004
And yet, somehow, now it's my fault ...
Misposted, never mind.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Suspicious Dish posted:

What I've been doing for the last year has finally come to fruition: https://endlessm.com/

Congrats on the Wired article, I just ran into it by accident reading about iRobot's lawnmower. :)

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
I'm really happy with that one. It felt more like it was trying to open a discussion by mentioning Facebook, Google, and us, rather than trying to sell us, bash us, or make it all about the Kickstarter.

Of course, as you're probably well aware, we did reach out to lots of journalists to get this covered, but that can have wacky results. We weren't overly happy with the BBC's tone, but we're told that's just how the BBC is.

Access Violation
Jul 21, 2003
Maybe there should be a "linux server administration megathread"?

I am a programmer at a small startup. There are two linux servers, a DB server and a web server. Everyone here is just about a novice at Linux administration but knows enough to be dangerous, and really, we're coders and would really like to focus on that instead of server administration. The budget is pretty limited right now and I'm not sure we could stretch it to hire a server admin full-time; realistically, he'd be idle 95% of the time.

Since I'm kind of unofficially in charge of the programming team, this is a huge headache for me: the server setup is so complicated, and worse, I can't figure out how to simplify it given the need to keep access limited between environments.

We have three public websites hosted on the same server. Each website has three environments (live, dev, staging). So there are 9 environments in total.

Each of our coders (four of them) has their own user account for each and every one of these environments. In addition, each environment runs its own Apache instance to make sure they can't write to each other's directories and to provide some protection in case one site gets hacked somehow. Also, we use a deployment system that logs into our servers to deploy code, and this has its own account for every environment too.

So there are 9*6 = 54 user accounts in total, and god knows how many SSH keys in the respective authorized_keys files.

Now we need to set up a new web server to do some load balancing, and I get a migraine just thinking about how to manage it all and make sure that the users and account structures are consistent across environments.

How the hell is this supposed to work? What if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.?

Many thanks for any suggestions.

captkirk
Feb 5, 2010

Access Violation posted:

Maybe there should be a "linux server administration megathread"?

I am a programmer at a small startup. There are two linux servers, a DB server and a web server. Everyone here is just about a novice at Linux administration but knows enough to be dangerous, and really, we're coders and would really like to focus on that instead of server administration. The budget is pretty limited right now and I'm not sure we could stretch it to hire a server admin full-time; realistically, he'd be idle 95% of the time.

Since I'm kind of unofficially in charge of the programming team, this is a huge headache for me: the server setup is so complicated, and worse, I can't figure out how to simplify it given the need to keep access limited between environments.

We have three public websites hosted on the same server. Each website has three environments (live, dev, staging). So there are 9 environments in total.

Each of our coders (four of them) has their own user account for each and every one of these environments. In addition, each environment runs its own Apache instance to make sure they can't write to each other's directories and to provide some protection in case one site gets hacked somehow. Also, we use a deployment system that logs into our servers to deploy code, and this has its own account for every environment too.

So there are 9*6 = 54 user accounts in total, and god knows how many SSH keys in the respective authorized_keys files.

Now we need to set up a new web server to do some load balancing, and I get a migraine just thinking about how to manage it all and make sure that the users and account structures are consistent across environments.

How the hell is this supposed to work? What if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.?

Many thanks for any suggestions.

You want to learn puppet or chef. That allows you to manage those configuration files and distribute the public keys. You might also want to look into using docker to handle the different apache processes.
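
For a taste of puppet, a standalone manifest like this one (user name and key are made up) gives you one account plus its key on every box you apply it to:

code:
cat > users.pp <<'EOF'
user { 'alice':
  ensure     => present,
  managehome => true,
}

ssh_authorized_key { 'alice@laptop':
  ensure => present,
  user   => 'alice',
  type   => 'ssh-rsa',
  key    => 'AAAAB3...paste-the-actual-public-key-here...',
}
EOF
sudo puppet apply users.pp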

Bhodi
Dec 9, 2007

Oh, it's just a cat.
Pillbug
No and no. Puppet/chef doesn't address his user/account issues and docker is neither secure enough for internet-facing webservers nor production ready.

Your best bet for your user account problem, with what you have, is to look at LDAP, active directory, or some other shared RBAC system. I think redhat's pimping something these days but I haven't kept up on all the options out there. I've used proprietary solutions at two different jobs, but they were both homegrown. In this modern day, you're likely better served by some sort of cloud product that handles all of this for you.

Think of chef/puppet as a way to push out and maintain customizable configs on all your systems. It can help, but it won't really solve your current problem on its own; it's more of a delivery vehicle than a solution in itself.
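
To give a flavor of the centralized route: pointing every box at one LDAP server with sssd is roughly this much config (the server and search base here are placeholders):

code:
cat > /etc/sssd/sssd.conf <<'EOF'
[sssd]
config_file_version = 2
services = nss, pam
domains = default

[domain/default]
id_provider = ldap
auth_provider = ldap
ldap_uri = ldap://ldap.example.com
ldap_search_base = dc=example,dc=com
EOF
chmod 600 /etc/sssd/sssd.conf   # sssd refuses to start if the file is world-readable
service sssd restart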

Thalagyrt
Aug 10, 2006

Access Violation posted:

Maybe there should be a "linux server administration megathread"?

I am a programmer at a small startup. There are two linux servers, a DB server and a web server. Everyone here is just about a novice at Linux administration but knows enough to be dangerous, and really, we're coders and would really like to focus on that instead of server administration. The budget is pretty limited right now and I'm not sure we could stretch it to hire a server admin full-time; realistically, he'd be idle 95% of the time.

Since I'm kind of unofficially in charge of the programming team, this is a huge headache for me: the server setup is so complicated, and worse, I can't figure out how to simplify it given the need to keep access limited between environments.

We have three public websites hosted on the same server. Each website has three environments (live, dev, staging). So there are 9 environments in total.

Each of our coders (four of them) has their own user account for each and every one of these environments. In addition, each environment runs its own Apache instance to make sure they can't write to each other's directories and to provide some protection in case one site gets hacked somehow. Also, we use a deployment system that logs into our servers to deploy code, and this has its own account for every environment too.

So there are 9*6 = 54 user accounts in total, and god knows how many SSH keys in the respective authorized_keys files.

Now we need to set up a new web server to do some load balancing, and I get a migraine just thinking about how to manage it all and make sure that the users and account structures are consistent across environments.

How the hell is this supposed to work? What if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.?

Many thanks for any suggestions.

Don't create a new account for each user for each environment. One account for each administrator, and filesystem ACLs will do everything you need with regard to granting specific users rights to specific applications - and you may not even need ACLs to do this; proper usage of POSIX groups should do the trick. I'm not sure what programming language you're using, but you don't need separate web servers for each environment either - you can very easily set up each application server under its own user and proxy back to it from a single frontend web server whose group membership allows it read access to whatever it needs. Since you seem to only be running Apache, I'm going to assume PHP. If that's the case, look into PHP-FPM and set up a worker pool for each application.
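
A sketch of the groups approach, with made-up user and site names:

code:
groupadd site1-dev
usermod -aG site1-dev alice
chgrp -R site1-dev /srv/site1
chmod -R g+rwX /srv/site1
find /srv/site1 -type d -exec chmod g+s {} +   # new files inherit the group
# only if plain groups aren't granular enough:
setfacl -R -m g:site1-dev:rwX /srv/site1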

With a proper deployment system in place your developers shouldn't even have access to prod. Your ops guy(s) (that's you and only you right now, from what I understand) should have access, and the deployment system should have access. Giving developers who lack ops experience direct access to production is a recipe for disaster. I definitely wouldn't advise running prod and dev/staging on the same server, either. It's too easy for dev/staging to hose things up on the server (code in development going into an infinite loop and hitting the database with thousands of queries per second, for one example) and cause a prod outage or service degradation. Use identical installs of the OS, but not the same server, for each environment.
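
If it is PHP, a per-app FPM pool is only a few lines (the user, socket path, and pool name here are hypothetical), and Apache just proxies to the socket:

code:
cat > /etc/php-fpm.d/site1.conf <<'EOF'
[site1]
user = site1
group = site1
listen = /var/run/php-fpm-site1.sock
pm = dynamic
pm.max_children = 10
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 4
EOF
service php-fpm restart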

evol262
Nov 30, 2010
#!/usr/bin/perl

Access Violation posted:

Maybe there should be a "linux server administration megathread"?

I am a programmer at a small startup. There are two linux servers, a DB server and a web server. Everyone here is just about a novice at Linux administration but knows enough to be dangerous, and really, we're coders and would really like to focus on that instead of server administration. The budget is pretty limited right now and I'm not sure we could stretch it to hire a server admin full-time; realistically, he'd be idle 95% of the time.

Since I'm kind of unofficially in charge of the programming team, this is a huge headache for me: the server setup is so complicated, and worse, I can't figure out how to simplify it given the need to keep access limited between environments.

We have three public websites hosted on the same server. Each website has three environments (live, dev, staging). So there are 9 environments in total.

Each of our coders (four of them) has their own user account for each and every one of these environments. In addition, each environment runs its own Apache instance to make sure they can't write to each other's directories and to provide some protection in case one site gets hacked somehow. Also, we use a deployment system that logs into our servers to deploy code, and this has its own account for every environment too.

So there are 9*6 = 54 user accounts in total, and god knows how many SSH keys in the respective authorized_keys files.

Now we need to set up a new web server to do some load balancing, and I get a migraine just thinking about how to manage it all and make sure that the users and account structures are consistent across environments.

How the hell is this supposed to work? What if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.?

Many thanks for any suggestions.

You should be using puppet or chef.

If all of this is local, use kerberos instead of a zillion SSH keys. Even if not, you don't need 54 SSH keys.
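
With kerberos working, the client side is just a ticket plus two ssh options (the host pattern here is made up):

code:
kinit alice                    # grab a ticket once a day
cat >> ~/.ssh/config <<'EOF'
Host *.internal.example.com
    GSSAPIAuthentication yes
    GSSAPIDelegateCredentials yes
EOF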

No passwords. Anywhere.

You have the same accounts on every server right? Or the same names? With the same SSH keys?

It's ridiculous to have an account for "each and every environment". Since you're already using some kind of automated deployment (capistrano or fabric or whatever), there's no reason for them to have accounts for those environments at all. You should never be touching live.

Development flow goes like this:

You build up a docker container, vagrant image, or AMI which has an identical environment for everyone. They don't need dev environments on the server. There are a million and one tools for populating dev databases from example data. Use one of these, and add your example schema/json/whatever to source control so everyone uses the same thing.
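
For example, a bare-bones docker version of that might look like this (the base image and packages are guesses about your stack):

code:
cat > Dockerfile <<'EOF'
FROM centos:6
RUN yum -y install httpd php php-mysql && yum clean all
EXPOSE 80
CMD ["/usr/sbin/httpd", "-DFOREGROUND"]
EOF
docker build -t mystartup/devenv .
docker run -d -p 8080:80 mystartup/devenv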

You write code and commit it. Travis (which is stupidly easy to set up and already ties into github, if you guys are using that -- if you're not, it may be worth the tiny monthly fee to ease your lives) runs integration tests, unit tests, etc. If it fails, reject their commit and go back to zero. If it passes, it waits for code review. If it gets +2, it gets an automated build which gets deployed to staging where QE or whoever tests it. Staging is also on separate servers, VMs, or docker containers.
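
The travis config for that is tiny; assuming PHP and phpunit (adjust for whatever you actually run), something like:

code:
cat > .travis.yml <<'EOF'
language: php
php:
  - "5.5"
script: phpunit --configuration phpunit.xml
EOF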

If you don't have QE, you don't need a staging environment at all. If you only have staging so it can do something with live data, this is a ticking bomb, and you should set up replication to an isolated database that staging can use.

Once someone signs off on whatever is happening in staging (assuming you don't scrap the idea of staging entirely), the package gets pushed live.

That done...

Don't run individual apache instances, and don't co-host websites on the same server. This isn't security in any real sense of the word, since any proper exploit is going to give them shell, where they'll do whatever they want with the permissions of the apache/httpd user in the best case or go looking for local root vulnerabilities in the worst case. If any of those sites gets hacked, you can't trust anything on it, much less your 8 other environments.

My real advice to you would be to stop doing this entirely. PaaS is viable and worth the money, especially in cases where you're winging it without a real admin and any idea of what best practice is. No joking, migrate your poo poo to Heroku or Openshift or whatever and stop worrying about all of this until when/if you get big enough to hire some devops/SRE/admins/whatever to wrangle this.
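
The whole point of the PaaS route is that deployment collapses to roughly this (app name made up):

code:
heroku create mystartup-site1
git push heroku master
heroku ps:scale web=1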

captkirk
Feb 5, 2010

Bhodi posted:

No and no. Puppet/chef doesn't address his user/account issues and docker is neither secure enough for internet-facing webservers nor production ready.

Your best bet for your user account problem, with what you have, is to look at LDAP, active directory, or some other shared RBAC system. I think redhat's pimping something these days but I haven't kept up on all the options out there. I've used proprietary solutions at two different jobs, but they were both homegrown. In this modern day, you're likely better served by some sort of cloud product that handles all of this for you.

Think of chef/puppet as a way to push out and maintain customizable configs on all your systems. It can help, but it won't really solve your current problem on its own; it's more of a delivery vehicle than a solution in itself.

I did not say it addresses his user/account issues (though it could). I said it could help with pushing out public keys and his configs.

Also, while I agree docker has some maturing to do, it is still worth him looking into, particularly with his dev/test stacks and possibly with his production stack. While you may assert it is not production ready, it is already used in production in the wild.

evol262
Nov 30, 2010
#!/usr/bin/perl

captkirk posted:

Also, while I agree docker has some maturing to do, it is still worth him looking into, particularly with his dev/test stacks and possibly with his production stack. While you may assert it is not production ready, it is already used in production in the wild.

It is, and containers are evolving, but it takes an incredible amount of work to keep on top of them, since there's not a good mechanism (yet) for letting you know about/updating images. There's a whole external ecosystem of docker-pull and webhooks and home-grown scripts for checking whether you're running the latest image, but there's no "docker check-update somelayer..." or equivalent. That's not really what you want at a shop with little/no admin knowledge.
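
In practice those home-grown scripts boil down to something like this (the image name is just an example):

code:
before=$(docker inspect -f '{{.Id}}' centos:6)
docker pull centos:6 > /dev/null
after=$(docker inspect -f '{{.Id}}' centos:6)
[ "$before" != "$after" ] && echo "base image changed -- rebuild your containers"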

The UID0 issue is also a serious problem for security.

Docker is used in the wild both because of the hype machine and because shops which don't already have devops practices in place find that they'd rather just deal with docker than AWS/openstack+puppet/salt/whatever+... (mostly, you need a config management system with docker anyway, so this is a bust). Or to ship containers that have updated configs. Or Kubernetes+Mesos shops trying to maximize utilization on public cloud instances. There are valid use cases, and "give your developers identical dev environments, then ship it to prod" certainly is one, but "it's used in production in the wild so it's production ready" ignores a whole lot of issues.

Ellie Crabcakes
Feb 1, 2008

Stop emailing my boyfriend Gay Crungus

Access Violation posted:

How the hell is this supposed to work? What if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.?

I'd go with Ansible. I wish it had been around the last time I had to push code and configs to 60+ servers.
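
A playbook hitting exactly the OP's two pain points (key distribution and consistent apache configs) is about ten lines; the hostnames and paths here are made up:

code:
cat > site.yml <<'EOF'
---
- hosts: webservers
  sudo: yes
  tasks:
    - name: distribute alice's public key
      authorized_key: user=alice key="{{ lookup('file', 'keys/alice.pub') }}"
    - name: keep the apache config consistent
      copy: src=files/httpd.conf dest=/etc/httpd/conf/httpd.conf
      notify: restart apache
  handlers:
    - name: restart apache
      service: name=httpd state=restarted
EOF
ansible-playbook -i hosts site.yml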

ProfessorBooty
Jan 25, 2004

Amulet of the Dark
I use JACK to record multiple sound sources simultaneously. One of the USB headsets broke today, and as a temporary fix I'm using an external USB microphone (a Blue Yeti) in tandem with the headset. It's important to record these devices in the same Audacity session so I don't have to worry about syncing recordings or any sort of echo in the recording.

To start JACK I have a .sh with this command:

code:
jackd -r -d alsa -r 44100   # the second -r is the alsa backend's sample rate
Then I have a shell command that contains this for when the USB headsets are working:

code:
alsa_out -j "Headphone 1" -d hw:LX3000 -q 4 > /dev/null 2>&1 &
alsa_out -j "Headphone 2" -d hw:LX3000_1 -q 4 > /dev/null 2>&1 &
alsa_in -j "Microphone 1" -d hw:LX3000 -q 4 > /dev/null 2>&1 &
alsa_in -j "Microphone 2" -d hw:LX3000_1 -q 4 > /dev/null 2>&1 &
I got the hardware names by using the 'aplay -l' and 'arecord -l' commands.
code:
...
card 2: LX3000 [Microsoft LifeChat LX-3000], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 3: LX3000_1 [Microsoft LifeChat LX-3000], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
But with the Yeti plugged in, aplay and arecord show this:

code:
...
card 2: LX3000 [Microsoft LifeChat LX-3000], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 3: Microphone [Yeti Stereo Microphone], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
Unfortunately, replacing the 'LX3000_1' with 'Microphone' results in an error message - how the heck do I find out what the device driver for this microphone is?

Edit: I answered my own question. I found the device number (using cat /proc/asound/cards), and instead of using the device name in my alsa_in/out script, I used the device number (hw:2).
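
For anyone else who hits this, the working lines ended up looking something like this (the jack port name is whatever you pass to -j):

code:
cat /proc/asound/cards    # shows each card's number
alsa_in -j "Microphone 2" -d hw:2 -q 4 > /dev/null 2>&1 &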

ProfessorBooty fucked around with this message at 19:12 on Apr 19, 2015

Not Wolverine
Jul 1, 2007
Can anyone tell what version of Red Hat this video is likely to be using? I think it looks like Gnome 2, but I am a KDE fan, so for all I know it could be Gnome 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat or use CentOS or another distro?

https://www.youtube.com/watch?v=bG0mMOteVR8

spankmeister
Jun 15, 2008

Crotch Fruit posted:

Can anyone tell what version of Red Hat this video is likely to be using? I think it looks like Gnome 2, but I am a KDE fan, so for all I know it could be Gnome 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat or use CentOS or another distro?

https://www.youtube.com/watch?v=bG0mMOteVR8

That's a REALLY old version; it looks like Red Hat 7 (not RHEL, just Red Hat).

evol262
Nov 30, 2010
#!/usr/bin/perl

Crotch Fruit posted:

Can anyone tell what version of Red Hat this video is likely to be using? I think it looks like Gnome 2, but I am a KDE fan, so for all I know it could be Gnome 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat or use CentOS or another distro?

https://www.youtube.com/watch?v=bG0mMOteVR8

It appears to be gnome 1, which matches up with the archaic version of emacs they're using. Centos didn't even exist at this point. RHEL didn't either.

I would guess it's Red Hat 7.2 (not RHEL 7.2, which isn't out yet). Maybe 8. You should follow a modern LPIC guide to start with.

Not Wolverine
Jul 1, 2007

evol262 posted:

I would guess it's Red Hat 7.2 (not RHEL 7.2, which isn't out yet). Maybe 8. You should follow a modern LPIC guide to start with.

I know it's not a good tutorial by any means; it was just something related to a video I watched, so I clicked it and it looked interesting.

Yeah, I thought it was pre-RHEL, but the video was published in November 2014, so why would they use a version over 10 years old? The only thing I can think of is that it's a re-upload of an older video.

The_Franz
Aug 8, 2003

Crotch Fruit posted:

I know it's not a good tutorial by any means; it was just something related to a video I watched, so I clicked it and it looked interesting.

Yeah, I thought it was pre-RHEL, but the video was published in November 2014, so why would they use a version over 10 years old? The only thing I can think of is that it's a re-upload of an older video.

Who is 'they'? That YouTube channel isn't related to Red Hat, it's just some random dude.

The_Franz fucked around with this message at 15:20 on Apr 20, 2015

Not Wolverine
Jul 1, 2007
Well, if you google the watermark in the video, 'they' are cbtnuggets.com, an online IT video training service. :v:

The_Franz
Aug 8, 2003

Crotch Fruit posted:

Well, if you google the watermark in the video, 'they' are cbtnuggets.com, an online IT video training service. :v:

That company doesn't seem to have anything to do with that YouTube channel either. It's just an old training video that, for some reason, some random guy put on YouTube last year.

Robo Reagan
Feb 12, 2012

by Fluffdaddy
The other day I was compiling my first program ever. The fucker kept failing on me, and after a bunch of Googling I got a Microsoft help page where they said that some drivers got hosed up because another program tried to install the same drivers and now all the drivers are sad or something. So I applied the fix, and it didn't do anything and my files are still just sitting there. I'm not a programmer, but I'm learning how to code for something to do, and that kind of gave me a really ominous feeling about any future attempts while I'm still running Windows.

So uhh, should I give Linux a shot or what? Windows is starting to get pretty frustrating, but like I said I just started with this kind of stuff so I don't know how well I'll be able to work with Linux.

karl fungus
May 6, 2011

Baeume sind auch Freunde
Compiling is incredibly easy on Linux. I feel bad for you people that have to compile on Windows.
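
For example, on Ubuntu the whole C toolchain is one package away (the package name differs on other distros):

code:
sudo apt-get install build-essential
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello, linux\n"); return 0; }
EOF
gcc -o hello hello.c && ./hello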

Robo Reagan
Feb 12, 2012

by Fluffdaddy
I guess I'll give Ubuntu a shot then? That seems like the most newbie-friendly. Should I keep a partition of Win7 since I'm a gigantic nerd and like video games?

evol262
Nov 30, 2010
#!/usr/bin/perl

Robo Reagan posted:

The other day I was compiling my first program ever. The fucker kept failing on me, and after a bunch of Googling I got a Microsoft help page where they said that some drivers got hosed up because another program tried to install the same drivers and now all the drivers are sad or something. So I applied the fix, and it didn't do anything and my files are still just sitting there. I'm not a programmer, but I'm learning how to code for something to do, and that kind of gave me a really ominous feeling about any future attempts while I'm still running Windows.

So uhh, should I give Linux a shot or what? Windows is starting to get pretty frustrating, but like I said I just started with this kind of stuff so I don't know how well I'll be able to work with Linux.

What are you compiling? In what language? With what compiler? And what's the error?

Drivers should have nothing to do with anything, and while compiler suites used to be easier to come by on Linux, compiling is pretty much the same process there.

spankmeister
Jun 15, 2008

Use a modern distro like ubuntu or fedora and keep your Windows around for games, although gaming on linux has improved greatly over the last few years, so you might want to install steam on Linux and give it a go.

karl fungus
May 6, 2011

Baeume sind auch Freunde
Wait two days until the 23rd and you'll have a whole new Ubuntu release to install.

waffle iron
Jan 16, 2004
Ubuntu, Debian, and Fedora all have major releases in the next 3 weeks.

Crack
Apr 10, 2009
I'm on Ubuntu (I've used the .10 version for a few months) and I want to try out Debian KDE when it releases in a few days. In preparation I ran "tar -cvpzf backup.tar.gz --one-file-system /" and moved the file to an external disk. However, I encrypted my home directory on install. Is there an easy way to decrypt all these files from another system to pull the individual files I want from it? Or should I delete all the encrypted stuff from the tar, manually move my home folder to it using the GUI, and then encrypt the archive? Or would this create the same problem? Thanks.

E: also, at the end it said "tar: /: file changed as we read it". Does this mean everything backed up successfully apart from the changes, or did it stop when it ran into something changing?

Crack fucked around with this message at 13:22 on Apr 21, 2015

karl fungus
May 6, 2011

Baeume sind auch Freunde
Why not just install the kubuntu-desktop package?

evol262
Nov 30, 2010
#!/usr/bin/perl

Crack posted:

I'm on Ubuntu (I've used the .10 version for a few months) and I want to try out Debian KDE when it releases in a few days. In preparation I ran "tar -cvpzf backup.tar.gz --one-file-system /" and moved the file to an external disk. However, I encrypted my home directory on install. Is there an easy way to decrypt all these files from another system to pull the individual files I want from it? Or should I delete all the encrypted stuff from the tar, manually move my home folder to it using the GUI, and then encrypt the archive? Or would this create the same problem? Thanks.

E: also, at the end it said "tar: /: file changed as we read it". Does this mean everything backed up successfully apart from the changes, or did it stop when it ran into something changing?

Why not keep your home directory and reuse it on Debian?

tar didn't stop when that happened, and it was probably a log, but backing up live systems works better with consecutive rsyncs.
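
Something like this, with a made-up destination path; the second pass is fast and catches whatever changed during the first:

code:
rsync -aAXH --one-file-system / /mnt/external/backup/
rsync -aAXH --one-file-system / /mnt/external/backup/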
