Thalagyrt
Aug 10, 2006

evol262 posted:

Even a fully patched system can still be subject to problems from 2000 because nobody bothered to label it as security. As usual, defense in depth wins

Proper network segregation definitely can go a long way, but I'd wager a lot of people don't do that the right way. The glibc thing from today is a great example of the quoted bit here. It was fixed in May 2013, but nobody labeled it security, so here we are today scrambling on it!
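(For anyone checking their own boxes: a quick way to see which glibc you're running, to compare against whatever fixed version your distro's advisory names. Package names vary by distro; the rpm/dpkg line below is just the common pair.)

```shell
# first line of ldd's banner includes the glibc version
ldd --version | head -n 1
# package version, on RPM- and dpkg-based systems respectively
rpm -q glibc 2>/dev/null || dpkg -l libc6 2>/dev/null || true
```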

RFC2324
Jun 7, 2012

http 418

fatherdog posted:

A company that holds large government contracts and has had 2 compromises in the past year still refusing to patch their internal servers because completely unrelated poor planning in a hardware spec cost them more money is some impressive levels of head-in-the-sand-ism from your managers. (please understand that as being stated in a tone of sympathy rather than derision)

We don't have those contracts anymore; as I understand it, we finally sold the last of them to Raytheon.

That said, the habits are ingrained in corporate culture in quite a few areas, and the Sr admins insist that the current patching practices are a legacy of that.

I'm more inclined to think it has to do with fear of disks welding themselves.

evol262
Nov 30, 2010
#!/usr/bin/perl

Thalagyrt posted:

Proper network segregation definitely can go a long way, but I'd wager a lot of people don't do that the right way. The glibc thing from today is a great example of the quoted bit here. It was fixed in May 2013, but nobody labeled it security, so here we are today scrambling on it!

:thejoke:

Longinus00
Dec 29, 2005
Ur-Quan

RFC2324 posted:

We don't have those contracts anymore; as I understand it, we finally sold the last of them to Raytheon.

That said, the habits are ingrained in corporate culture in quite a few areas, and the Sr admins insist that the current patching practices are a legacy of that.

I'm more inclined to think it has to do with fear of disks welding themselves.

This is really just one of those culture things that will take time and tragedy to sort out. Once this sort of thing becomes more common, managers might start authorizing faster updates and/or pressuring suppliers to actually write more secure software in the first place. Kind of like how it takes airline crashes to push through reforms in aviation safety.

Lukano
Apr 28, 2003

Can anyone point me in the direction of an incremental, encrypted, backup tool to use with the 1TB of Google Drive space that I have?

I'd like to be able to have it back up the obvious stuff like family photos, music, ebooks, etc... but I want to have control over the key, so it should be encrypted on the Google side of things.

I tried Duplicati, but I was really really underwhelmed. Maybe I'm expecting too much, as I used to use JungleDisk for S3 backups a bunch of years back, but Duplicati seemed slow, clunky, uninformative... and honestly running a windows app in mono for backing up on my linux boxes just seems silly. I'm also not entirely sure, after reading a bunch of Duplicati documentation, if it really wants to re-upload the entire backup once every <iteration> (day/week/month). I mean, I have the bandwidth for it now (100/40 fiber), but the backups currently total >500 gigs.

But I've done a bunch of googling and can't come up with another solution. Is there a more manual way of going about it (encrypting and then using another tool to send to Google Drive) while still leaving the files open and accessible to my family locally?

edit - or failing that, another reputable and fast service that has better support for linux, and will let me get the same amount of space for the same $ ($10/month for 1TB).

edit 2 - well poo poo, I was closing a bunch of tabs I had open after writing this post, and noticed one that I hadn't gotten to for 'cloudsync' which appears to be exactly what I'm looking for. https://github.com/HolgerHees/cloudsync

Lukano fucked around with this message at 14:55 on Jan 29, 2015

midnightclimax
Dec 3, 2011

by XyloJW
I'd like to install Linux on an old laptop, but it doesn't allow for "boot via USB" and I don't have a blank CD at my disposal. I think there's wubi, but that's probably a performance hit on an already older machine. Any other options out there? It's still running XP.

Spazz
Nov 17, 2005

midnightclimax posted:

I'd like to install Linux on an old laptop, but it doesn't allow for "boot via USB" and I don't have a blank CD at my disposal. I think there's wubi, but that's probably a performance hit on an already older machine. Any other options out there? It's still running XP.

Are you sure the "Boot to USB" isn't hidden somewhere in the BIOS? I had to dig into the BIOS for an older laptop and enable it.

midnightclimax
Dec 3, 2011

by XyloJW

Spazz posted:

Are you sure the "Boot to USB" isn't hidden somewhere in the BIOS? I had to dig into the BIOS for an older laptop and enable it.

Hmm I'll check again, but it's one of those Sony VAIO laptops, and I remember hearing not many good things about their BIOS settings. Afaik there's hard drive, floppy, cd-rom, and network.

Spazz
Nov 17, 2005

You could always try PXE for the install, but it'll require another computer. I'd also try Googling "Sony Vaio <model> USB boot" and see if anything comes up, it could just require a BIOS update to get that boot option.

telcoM
Mar 21, 2009
Fallen Rib

midnightclimax posted:

Hmm I'll check again, but it's one of those Sony VAIO laptops, and I remember hearing not many good things about their BIOS settings. Afaik there's hard drive, floppy, cd-rom, and network.

A network boot to Linux installer might be an option. The standard network boot method in practically all modern PC hardware is known as PXE.

Basically, to implement a PXE netboot server you'll need two things, which can both run on a single host:
- a very basic DHCP server, for responding to the laptop's network boot query with a "to boot, download this file from this IP address" response (basically one or two DHCP options)
- a TFTP server, for supplying the netboot loader file, its configuration file, and the Linux kernel and initrd files (maybe some small boot menu image files too)

For Debian/Ubuntu, the above might be enough if the installer can have Internet connectivity: it will download the rest from the distribution's standard mirrors.
For some other distributions, you may have to provide the rest of the installation ISO image from an NFS or HTTP server (or even an FTP server or a Windows network share).

http://tftpd32.jounin.net/ is a Windows application that contains practically all of the basic server-side functionality for PXE network booting. (If your distribution of choice requires a local NFS/HTTP/FTP service to download the main installation image from, that is not included.)

With a bit of Googling, you'll probably find step-by-step instruction documents for setting up a netboot installation server for your Linux distribution of choice. However, they will usually assume you already have at least one Linux system you can use as a netboot server.
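For what it's worth, dnsmasq bundles the DHCP (including proxy-DHCP) and TFTP pieces into one daemon, so on an existing Linux box the server side can be a few lines of config. A minimal sketch, assuming the LAN already has a DHCP server handing out leases and the pxelinux files sit in /srv/tftp (the subnet and paths here are made up, adjust to taste):

```ini
# /etc/dnsmasq.conf -- proxy-DHCP plus TFTP for a PXE netboot
port=0                        # disable the DNS part; we only want DHCP/TFTP
dhcp-range=192.168.1.0,proxy  # proxy mode: the existing DHCP server keeps handing out leases
dhcp-boot=pxelinux.0          # netboot loader file to tell clients to fetch
pxe-service=x86PC,"Install Linux",pxelinux
enable-tftp
tftp-root=/srv/tftp
```

Proxy mode is handy here because it doesn't fight with the router's DHCP server; dnsmasq only supplies the boot-file options.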

Longinus00
Dec 29, 2005
Ur-Quan

Lukano posted:

Can anyone point me in the direction of an incremental, encrypted, backup tool to use with the 1TB of Google Drive space that I have?

I'd like to be able to have it back up the obvious stuff like family photos, music, ebooks, etc... but I want to have control over the key, so it should be encrypted on the Google side of things.

I tried Duplicati, but I was really really underwhelmed. Maybe I'm expecting too much, as I used to use JungleDisk for S3 backups a bunch of years back, but Duplicati seemed slow, clunky, uninformative... and honestly running a windows app in mono for backing up on my linux box(s) just seems silly. I'm also not entirely sure after reading a bunch of Duplicati documentation, if it really does want to have to re-upload the entire backup once every <iteration> (day/week/month). I mean, I have the bandwidth for it now (100/40 fiber) but the backups currently size >500gigs in total.

But I've done a bunch of googling and can't come up with another solution. Is there a more manual way of going about it (encrypting and then using another tool to send to Google Drive) while still leaving the files open and accessible to my family locally?

edit - or failing that, another reputable and fast service that has better support for linux, and will let me get the same amount of space for the same $ ($10/month for 1TB).

edit 2 - well poo poo, I was closing a bunch of tabs I had open after writing this post, and noticed one that I hadn't gotten to for 'cloudsync' which appears to be exactly what I'm looking for. https://github.com/HolgerHees/cloudsync

Why did you use duplicati instead of duplicity (which duplicati is a c# reimplementation of)? https://en.wikipedia.org/wiki/Duplicity_%28software%29

midnightclimax
Dec 3, 2011

by XyloJW

Spazz posted:

You could always try PXE for the install, but it'll require another computer. I'd also try Googling "Sony Vaio <model> USB boot" and see if anything comes up, it could just require a BIOS update to get that boot option.


telcoM posted:

A network boot to Linux installer might be an option. The standard network boot method in practically all modern PC hardware is known as PXE.

Basically, to implement a PXE netboot server you'll need two things, which can both run on a single host:
- a very basic DHCP server, for responding to the laptop's network boot query with a "to boot, download this file from this IP address" response (basically one or two DHCP options)
- a TFTP server, for supplying the netboot loader file, its configuration file, and the Linux kernel and initrd files (maybe some small boot menu image files too)

For Debian/Ubuntu, the above might be enough if the installer can have Internet connectivity: it will download the rest from the distribution's standard mirrors.
For some other distributions, you may have to provide the rest of the installation ISO image from an NFS or HTTP server (or even an FTP server or a Windows network share).

http://tftpd32.jounin.net/ is a Windows application that contains practically all of the basic server-side functionality for PXE network booting. (If your distribution of choice requires a local NFS/HTTP/FTP service to download the main installation image from, that is not included.)

With a bit of Googling, you'll probably find step-by-step instruction documents for setting up a netboot installation server for your Linux distribution of choice. However, they will usually assume you already have at least one Linux system you can use as a netboot server.

I've got another Linux machine up and running, so that shouldn't be too difficult. Will try googling for BIOS update first, though, ta

Lukano
Apr 28, 2003

Longinus00 posted:

Why did you use duplicati instead of duplicity (which duplicati is a c# reimplementation of)? https://en.wikipedia.org/wiki/Duplicity_%28software%29

Honestly, I don't know -- Duplicati ends up being the primary search result in most of my previous searches for a solution, so I totally forgot Duplicity was an option.

That said, the way duplicati / duplicity do the backups isn't ideal for me: having to re-up 500+ gigs every <interval> because it doesn't encrypt per-file is a pain and a waste of bandwidth (even if I have it to spare).

enthe0s
Oct 24, 2010

In another few hours, the sun will rise!
Is this the thread to ask about command line stuff? If not, I'll take this down.

I'm having trouble getting a cronjob to write a textfile to the index.html file located in /var/www

I'm not a seasoned linux user or scripter by any means, but I'm not totally new to linux either.

This is the cronjob I have currently and I don't know why it's not working.

code:
* * * * * sudo ~/recent500.txt > /var/www/index.html
This should write the contents from recent500.txt to index.html every minute, correct?

fatherdog
Feb 16, 2005

enthe0s posted:

Is this the thread to ask about command line stuff? If not, I'll take this down.

I'm having trouble getting a cronjob to write a textfile to the index.html file located in /var/www

I'm not a seasoned linux user or scripter by any means, but I'm not totally new to linux either.

This is the cronjob I have currently and I don't know why it's not working.

code:
* * * * * sudo ~/recent500.txt > /var/www/index.html
This should write the contents from recent500.txt to index.html every minute, correct?

Currently, it's trying to execute ~/recent500.txt and sending the output to the index file.

You want either

code:
* * * * * sudo cat ~/recent500.txt > /var/www/index.html
or

code:
* * * * * cat ~/recent500.txt > /var/www/index.html
Depending on which user your cronjob runs as, and on your permissions/sudo setup.

Powered Descent
Jul 13, 2008

We haven't had that spirit here since 1969.

Lukano posted:

Can anyone point me in the direction of an incremental, encrypted, backup tool to use with the 1TB of Google Drive space that I have?

I'd like to be able to have it back up the obvious stuff like family photos, music, ebooks, etc... but I want to have control over the key, so it should be encrypted on the Google side of things.

I tried Duplicati, but I was really really underwhelmed. Maybe I'm expecting too much, as I used to use JungleDisk for S3 backups a bunch of years back, but Duplicati seemed slow, clunky, uninformative... and honestly running a windows app in mono for backing up on my linux box(s) just seems silly. I'm also not entirely sure after reading a bunch of Duplicati documentation, if it really does want to have to re-upload the entire backup once every <iteration> (day/week/month). I mean, I have the bandwidth for it now (100/40 fiber) but the backups currently size >500gigs in total.

But I've done a bunch of googling and can't come up with another solution. Is there a more manual way of going about it (encrypting and then using another tool to send to Google Drive) while still leaving the files open and accessible to my family locally?

edit - or failing that, another reputable and fast service that has better support for linux, and will let me get the same amount of space for the same $ ($10/month for 1TB).

edit 2 - well poo poo, I was closing a bunch of tabs I had open after writing this post, and noticed one that I hadn't gotten to for 'cloudsync' which appears to be exactly what I'm looking for. https://github.com/HolgerHees/cloudsync

There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers.

That cloudsync utility looks pretty nice, but you can also always roll your own solution. For ideas, see my post history in this thread for a quick script I wrote that uses encfs and rsync to send encrypted backups out to an SFTP host.

RFC2324
Jun 7, 2012

http 418

Powered Descent posted:

There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers.

That cloudsync utility looks pretty nice, but you can also always roll your own solution. For ideas, see my post history in this thread for a quick script I wrote that uses encfs and rsync to send encrypted backups out to an SFTP host.

I suddenly want to build up a tiny windows vm set up to mount a linux directory in the gdrive location.

So I can have the single most inefficient solution to the gdrive problem.

enthe0s
Oct 24, 2010

In another few hours, the sun will rise!

fatherdog posted:

Currently, it's trying to execute ~/recent500.txt and sending the output to the index file.

You want either

code:
* * * * * sudo cat ~/recent500.txt > /var/www/index.html
or

code:
* * * * * cat ~/recent500.txt > /var/www/index.html
Depending on which user your cronjob runs as, and on your permissions/sudo setup.

Ok, that makes sense as to why it wasn't doing anything before. The previous 2 cronjobs I made both run a command, so I guess I wasn't fully grasping what was happening here. I updated the line to use cat like you said, but it's still not working. Here's what I have currently:

code:
* * * * * ~/myscript.sh >> ~/output.txt 2>&1
* * * * * tail -n 2500 ~/output.txt > ~/recent500.txt
* * * * * sudo cat ~/recent500.txt > /var/www/index.html
So I run my bash script and append to output.txt, and then I take the last 2500 lines from output.txt and create recent500.txt. (I know this is dirty right now, I'm gonna put it all into a single bash script later once I have it working).

I think I need the sudo version because I can't just move files into that /www/ directory without it, but I tried it without sudo just to see and still nothing. I also tried putting ~ in front of the path (~/var/www/index.html), but that didn't work either, so I'm at a loss.

Lukano
Apr 28, 2003

Powered Descent posted:

There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers.

That cloudsync utility looks pretty nice, but you can also always roll your own solution. For ideas, see my post history in this thread for a quick script I wrote that uses encfs and rsync to send encrypted backups out to an SFTP host.

Awesome, thank you very much. I totally forgot about encfs, but now that you've reminded me I recall that it was what I had originally intended to use when I signed up for the 1TB gdrive service 6 months ago (and then promptly procrastinated setting up, thus forgetting what I was going to do to utilize it).

I suspect the encfs / rsync approach will end up being faster than cloudsync (which is currently running pretty slow). Or is that just the gdrive api rate-limiting?

edit - I don't suppose you happen to have a link to the post / script handy? I'm going back through your post history, but you're a fairly prolific poster :P

Lukano fucked around with this message at 23:38 on Jan 29, 2015

RFC2324
Jun 7, 2012

http 418

enthe0s posted:

Ok, that makes sense as to why it wasn't doing anything before. The previous 2 cronjobs I made both run a command, so I guess I wasn't fully grasping what was happening here. I updated the line to use cat like you said, but it's still not working. Here's what I have currently:

code:
* * * * * ~/myscript.sh >> ~/output.txt 2>&1
* * * * * tail -n 2500 ~/output.txt > ~/recent500.txt
* * * * * sudo cat ~/recent500.txt > /var/www/index.html
So I run my bash script and append to output.txt, and then I take the last 2500 lines from output.txt and create recent500.txt. (I know this is dirty right now, I'm gonna put it all into a single bash script later once I have it working).

I think I need the sudo version because I can't just move files into that /www/ directory without it, but I tried it without sudo just to see and still nothing. I also tried putting ~ in front of the path (~/var/www/index.html), but that didn't work either, so I'm at a loss.

Put it in the root crontab and it will run as root, saving you from needing sudo (sudo may also be what's breaking it, if it's asking for a password).

enthe0s
Oct 24, 2010

In another few hours, the sun will rise!

RFC2324 posted:

Put it in the root crontab and it will run as root, saving you from needing sudo (sudo may also be what's breaking it, if it's asking for a password).

This worked!

code:
* * * * * /home/myusername/myscript.sh >> /home/myusername/output.txt 2>&1
* * * * * tail -n 2500 /home/myusername/output.txt > /home/myusername/recent500.txt
* * * * * cat /home/myusername/recent500.txt > /var/www/index.html
I didn't even know there was a root crontab, but I imagine it's as you said RFC2324, it was most likely breaking because of a password request. Huge thanks to you and fatherdog!

Powered Descent
Jul 13, 2008

We haven't had that spirit here since 1969.

Lukano posted:

Awesome, thank you very much. I totally forgot about encfs, but now that you've reminded me I recall that it was what I had originally intended to use when I signed up for the 1TB gdrive service 6 months ago (and then promptly procrastinated setting up, thus forgetting what I was going to do to utilize it).

I suspect the encfs / rsync approach will end up being faster than cloudsync (which is currently running pretty slow). Or is that just the gdrive api rate-limiting?

edit - I don't suppose you happen to have a link to the post / script handy? I'm going back through your post history, but you're a fairly prolific poster :P

Sure thing, right here.

Note that my rsync options include a nonstandard port number for ssh, so you'd need to either get rid of that, or just dump ssh entirely if you have your cloud storage mounted another way. Also, the rsync command is inside an eval statement because I was a stickler for pulling the options out into a variable. Scroll up a few posts from there for details on that, and feel free to simplify to your taste.

In normal mode, encfs keeps files in an encrypted rootdir, of which you get a decrypted "view" in a mountpoint of your choosing. But if your stuff is all kept in plaintext, you can use encfs in --reverse mode to get an encrypted "view" of a normal folder. My script then rsyncs this view to a remote host.

Before this will work, you'll obviously need to set up the encfs password on the directory you want to sync. Something like:

code:
mkdir ~/mountpoint
encfs --reverse ~/BackThisShitUp ~/mountpoint
# Hit enter to select standard mode, then enter the encryption password you want
# See all the nice encrypted files in your encrypted view:
ls ~/mountpoint
# Clean up:
fusermount -u ~/mountpoint
At this point, ~/BackThisShitUp has a new file in it, called .encfs6.xml. BACK THIS FILE UP SEPARATELY. You'll need it if you ever need to decrypt your backup, and since it appears only in the rootdir, not the mountpoint, it won't automatically come along with the rsync.

The rest of the script should be pretty straightforward, but give a shout if you have questions.

e: Adding a quick warning that this script is a one-way sync and intended for backing up a single box. It doesn't do any kind of nice multi-host sync, but simply makes the remote host into an encrypted mirror of whatever you're backing up. Add something to that folder manually, on the gdrive website? It's toast on the next run of the backup script. Have two machines backing up to the same folder on your cloud storage? They'll be undoing each other's changes every single time.

Powered Descent fucked around with this message at 00:34 on Jan 30, 2015

Lukano
Apr 28, 2003

Perfect!

Yeah, this is the only device that will be backing up to gdrive. It's an rPi with a sata>usb attached drive that just sits there pulling in stuff to archive via rsync, then will shuttle it out to gdrive. I tried cloudsync, but it would have taken ~260 days to finish backing up the ~500 gigs I anticipate. 90% of the delay was the program doing openpgp aes-256 encryption on each file. I'm hoping that having some granular control over the encryption aspect with encfs will allow me to pick something more suited to / faster on the rpi.

edit - Actually I do have a question for you Powered Descent. Why encfs instead of ecryptfs, which if I am reading correctly should potentially be faster, and the metadata is stored in the files themselves, as opposed to requiring that separate .xml?

Lukano fucked around with this message at 01:32 on Jan 30, 2015

Powered Descent
Jul 13, 2008

We haven't had that spirit here since 1969.

Lukano posted:

edit - Actually I do have a question for you Powered Descent. Why encfs instead of ecryptfs, which if I am reading correctly should potentially be faster, and the metadata is stored in the files themselves, as opposed to requiring that separate .xml?

Mostly because ecryptfs doesn't have the "reverse" mode that I wanted to use. (Or if it does, I couldn't find anything about it.) I prefer to keep my backups encrypted when I send them off to the cloud (AKA someone else's computer), but I'm not quite paranoid enough to bother with keeping it all encrypted on my local box. With encfs --reverse, I can generate an encrypted view of my stuff on the fly, taking up zero disk space.

If you're already keeping things encrypted locally, then you're right, ecryptfs has definite advantages.

karl fungus
May 6, 2011

Baeume sind auch Freunde
Is there a Linux program that will download attachments from my Gmail account, preferably with the ability to set a filter?

Like, let's say I want to download all attachments sent to me from fart@email.com, and only that address. Is there a way to mass download them on Linux?

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

karl fungus posted:

Is there a Linux program that will download attachments from my Gmail account, preferably with the ability to set a filter?

Like, let's say I want to download all attachments sent to me from fart@email.com, and only that address. Is there a way to mass download them on Linux?

You could probably whip something up in python using the Gmail API to do this: https://developers.google.com/api-client-library/python/apis/gmail/v1

Superdawg
Jan 28, 2009

enthe0s posted:

This is the cronjob I have currently and I don't know why it's not working.

code:
* * * * * sudo ~/recent500.txt > /var/www/index.html

I know others have suggested alternatives, but assuming you were trying to cat ~/recent500.txt, the > won't pass the root permissions along. You will need to use '| sudo tee /var/www/index.html'.

RFC2324
Jun 7, 2012

http 418

enthe0s posted:

This worked!

code:
* * * * * /home/myusername/myscript.sh >> /home/myusername/output.txt 2>&1
* * * * * tail -n 2500 /home/myusername/output.txt > /home/myusername/recent500.txt
* * * * * cat /home/myusername/recent500.txt > /var/www/index.html
I didn't even know there was a root crontab, but I imagine it's as you said RFC2324, it was most likely breaking because of a password request. Huge thanks to you and fatherdog!

Every user has their own personal crontab that runs with their permissions, so you could use this to set up the cron job to run as, say, your apache user, for a more secure setup. Right now your script is running as root, so if anyone other than you can edit the script, they can use it to take control of the system.

enthe0s
Oct 24, 2010

In another few hours, the sun will rise!

Superdawg posted:

I know others have suggested alternatives, but assuming you were trying to cat ~/recent500.txt, the > won't pass the root permissions along. You will need to use '| sudo tee /var/www/index.html'.

Is 'sudo tee' any different than just doing 'sudo cat'? Cause the only difference I see is tee displays the info while also letting you write to a file.

RFC2324 posted:

Every user has their own personal crontab that runs with their permissions, so you could use this to set up the cron job to run as, say, your apache user, for a more secure setup. Right now your script is running as root, so if anyone other than you can edit the script, they can use it to take control of the system.

Yeah I was a bit wary of using the root crontab for that very reason, but it's a learning exercise and not going into production anywhere. The user I was trying to run this with initially has root access, but it still wouldn't let me do it from the user's crontab for some reason. Like I was using sudo to install apache2 earlier and it didn't even prompt me for a password.

Powered Descent
Jul 13, 2008

We haven't had that spirit here since 1969.

enthe0s posted:

Is 'sudo tee' any different than just doing 'sudo cat'? Cause the only difference I see is tee displays the info while also letting you write to a file.


Yeah I was a bit wary of using the root crontab for that very reason, but it's a learning exercise and not going into production anywhere. The user I was trying to run this with initially has root access, but it still wouldn't let me do it from the user's crontab for some reason. Like I was using sudo to install apache2 earlier and it didn't even prompt me for a password.

The problem with doing "sudo cat file.txt > /root/access/required/foo.txt" is that the sudo rights-elevation only applies to the cat command that's outputting the contents of the file. The redirection done by a > or a >> is done by your regular shell, which won't have elevated privileges. By using a program (tee in this case) to write to the new file, and using sudo on that command, you can get around this.
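You can see the same mechanics even without sudo: the '>' target is opened by the shell before the command ever runs, while tee opens its output file itself. A quick demo (throwaway /tmp paths):

```shell
# the shell opens the '>' target with *its* privileges, before cat even runs;
# tee opens the file itself, which is why 'sudo tee' works where 'sudo cat >' doesn't
echo "hello" > /tmp/tee_demo_src.txt
cat /tmp/tee_demo_src.txt | tee /tmp/tee_demo_out.txt > /dev/null
cat /tmp/tee_demo_out.txt
# prints: hello
```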

The default way sudo is configured is to require a password from the user that's invoking it, but once you've entered it, it won't bother asking you again for a little while. That's probably why you weren't prompted before when you did the install.

It's possible to set up a user such that specific commands (or all commands) won't ask for your password at all when you sudo them; it'll just go and do it. This is what you'll want if you need a script to sudo something. Take a look in /etc/sudoers to see how you're currently set up.

Warning: if you're going to experiment with changing this file, make sure to use the command visudo instead of just editing the file with a text editor. Screw up the file format and you might end up with no users on the box able to use sudo... including to go back in and fix the sudoers file! visudo does a basic sanity check before it saves the file.
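For the passwordless-sudo setup described above, a sudoers sketch (the username and command are hypothetical; edit via visudo as noted, never with a plain text editor):

```text
# added with: visudo -f /etc/sudoers.d/webupdate
# lets 'myusername' run exactly this tee command as root, with no password prompt
myusername ALL=(root) NOPASSWD: /usr/bin/tee /var/www/index.html
```

Scoping NOPASSWD to one exact command line like this is much safer than a blanket `NOPASSWD: ALL`.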

Takes No Damage
Nov 20, 2004

The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far.


Grimey Drawer
Is there a distro with a decent GUI that's significantly lighter than Lubuntu? I recently upgraded from 11 to 14 on an ancient XP laptop, and while it's probably more secure and stable now it's also become unusably slow through VNC. Like I'll click on something and 15 seconds later it will select.

I don't need much, I just use it to remote into and browse dumb crap from work. So I'd need to go in and set up SSH and a VNC server, and be able to run at least a lighter browser like Opera Mini or similar.

So far I've been reading about Porteus but I've never used a 'live' distro before, are those going to be lighter than an installed OS?

e: Should also mention I would like to run WINE for a few things, Foobar and the like. That's probably the most resource heavy thing I'd still want to be able to do, don't need to watch videos or play games or anything.

Takes No Damage fucked around with this message at 23:39 on Feb 2, 2015

evol262
Nov 30, 2010
#!/usr/bin/perl

Takes No Damage posted:

Is there a distro with a decent GUI that's significantly lighter than Lubuntu? I recently upgraded from 11 to 14 on an ancient XP laptop, and while it's probably more secure and stable now it's also become unusably slow through VNC. Like I'll click on something and 15 seconds later it will select.

I don't need much, I just use it to remote into and browse dumb crap from work. So I'd need to go in and set up SSH and a VNC server, and be able to run at least a lighter browser like Opera Mini or similar.

So far I've been reading about Porteus but I've never used a 'live' distro before, are those going to be lighter than an installed OS?

e: Should also mention I would like to run WINE for a few things, Foobar and the like. That's probably the most resource heavy thing I'd still want to be able to do, don't need to watch videos or play games or anything.

"Live" tends to be significantly slower.

Try a BSD, honestly.

karl fungus
May 6, 2011

Baeume sind auch Freunde
How does BSD compare to Linux as a desktop?

spankmeister
Jun 15, 2008

karl fungus posted:

How does BSD compare to Linux as a desktop?

BSD is about 5 years behind Linux on the desktop, maybe more.

Experto Crede
Aug 19, 2008

Keep on Truckin'
What's the general consensus regarding setting ssh on a port other than 22?

On the one hand I know that having a port <1024 is considered better as these are privileged, but on the other doesn't having it on a port other than 22 provide an extra level of protection?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Any port scanner in use today would find SSH on a non-standard port instantly, so no it doesn't really add any protection.

Docjowles
Apr 9, 2009

Really all it's good for is reducing log spam from drive-by idiots brute forcing you. Disable root login, disable passwords (SSH keypairs only), congrats you're now more locked down than 90% of Linux servers on the internet.
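In sshd_config terms that lockdown is two lines (restart sshd after changing them, and keep a working session open while you verify key login still works):

```text
# /etc/ssh/sshd_config
PermitRootLogin no
PasswordAuthentication no
```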

evol262
Nov 30, 2010
#!/usr/bin/perl

spankmeister posted:

BSD is about 5 years behind Linux on the desktop, maybe more.

I don't even know what this is supposed to mean.

CaptainSarcastic
Jul 6, 2013



evol262 posted:

I don't even know what this is supposed to mean.

I was thinking 5 years might be a bit too much, but having run some desktop BSD relatively recently I am pretty much in agreement. I ran PCBSD 9.2 as my main desktop for a while, and while I haven't tried the latest release yet (10.1.1 just came out today), I really doubt it would hold a candle to a good Linux desktop. I have pretty decent Internet, and the updates on PCBSD were so huge and so slow that it really detracted from the experience.

In all fairness, maybe PCBSD is unfair to use as an example, but it is the BSD most directly aimed at the desktop to my knowledge.

VictualSquid
Feb 29, 2012

Gently enveloping the target with indiscriminate love.
I am looking for a Dropbox replacement in terms of Linux compatibility.
I currently have Dropbox started through init, use it as a directory, and have an encfs partition running on top of it.

Is there another one that's good and offers something like this? Or at least a directory mode with full client-side encryption?
