|
evol262 posted:Even a fully patched system can still be subject to problems from 2000 because nobody bothered to label it as security. As usual, defense in depth wins. Proper network segregation definitely can go a long way, but I'd wager a lot of people don't do that the right way. The glibc thing from today is a great example of the quoted bit here. It was fixed in May 2013, but nobody labeled it security, so here we are today scrambling on it!
|
# ? Jan 27, 2015 22:42 |
|
|
|
fatherdog posted:A company that holds large government contracts and has had 2 compromises in the past year still refusing to patch their internal servers because completely unrelated poor planning in a hardware spec cost them more money is some impressive levels of head-in-the-sand-ism from your managers. (please understand that as being stated in a tone of sympathy rather than derision) We don't have those contracts anymore, as I understand we finally sold the last of them to Raytheon. That said, the habits are ingrained in corporate culture in quite a few areas, and the Sr admins insist that the current patching practices are a legacy of that. I'm more inclined to think it has to do with fear of disks welding themselves.
|
# ? Jan 27, 2015 22:45 |
|
Thalagyrt posted:Proper network segregation definitely can go a long way, but I'd wager a lot of people don't do that the right way. The glibc thing from today is a great example of the quoted bit here. It was fixed in May 2013, but nobody labeled it security, so here we are today scrambling on it!
|
# ? Jan 27, 2015 23:37 |
|
RFC2324 posted:We don't have those contracts anymore, as I understand we finally sold the last of them to Raytheon. This is really just one of those culture things that will take time and tragedy to sort out. Once this sort of thing becomes more common, managers might start authorizing faster updates and/or pressuring suppliers to actually write more secure software in the first place. Kind of like how it takes airline crashes to push through reforms in aviation safety.
|
# ? Jan 28, 2015 01:26 |
|
Can anyone point me in the direction of an incremental, encrypted backup tool to use with the 1TB of Google Drive space that I have? I'd like to be able to have it back up the obvious stuff like family photos, music, ebooks, etc... but I want to have control over the key, so it should be encrypted on the Google side of things. I tried Duplicati, but I was really underwhelmed. Maybe I'm expecting too much, as I used to use JungleDisk for S3 backups a bunch of years back, but Duplicati seemed slow, clunky, uninformative... and honestly, running a Windows app in mono to do backups on my linux box(es) just seems silly. I'm also not entirely sure, after reading a bunch of Duplicati documentation, whether it really wants to re-upload the entire backup once every <iteration> (day/week/month). I mean, I have the bandwidth for it now (100/40 fiber), but the backups currently total more than 500 gigs. I've done a bunch of googling and can't come up with another solution. Is there a more manual way of going about it (encrypting and then using another tool to send to Google Drive) while still leaving the files open and accessible to my family locally? edit - or failing that, another reputable and fast service that has better support for linux, and will let me get the same amount of space for the same $ ($10/month for 1TB). edit 2 - well poo poo, I was closing a bunch of tabs I had open after writing this post, and noticed one that I hadn't gotten to for 'cloudsync' which appears to be exactly what I'm looking for. https://github.com/HolgerHees/cloudsync Lukano fucked around with this message at 14:55 on Jan 29, 2015 |
# ? Jan 29, 2015 14:50 |
|
I'd like to install Linux on an old laptop, but it doesn't allow for "boot via USB" and I don't have a blank CD at my disposal. I think there's wubi, but that's probably a performance hit on an already older machine. Any other options out there? It's still running XP.
|
# ? Jan 29, 2015 16:55 |
|
midnightclimax posted:I'd like to install Linux on an old laptop, but it doesn't allow for "boot via USB" and I don't have a blank CD at my disposal. I think there's wubi, but that's probably a performance hit on an already older machine. Any other options out there? It's still running XP. Are you sure the "Boot to USB" isn't hidden somewhere in the BIOS? I had to dig into the BIOS for an older laptop and enable it.
|
# ? Jan 29, 2015 17:32 |
|
Spazz posted:Are you sure the "Boot to USB" isn't hidden somewhere in the BIOS? I had to dig into the BIOS for an older laptop and enable it. Hmm I'll check again, but it's one of those Sony VAIO laptops, and I remember hearing not many good things about their BIOS settings. Afaik there's hard drive, floppy, cd-rom, and network.
|
# ? Jan 29, 2015 18:03 |
|
You could always try PXE for the install, but it'll require another computer. I'd also try Googling "Sony Vaio <model> USB boot" and see if anything comes up, it could just require a BIOS update to get that boot option.
|
# ? Jan 29, 2015 18:16 |
|
midnightclimax posted:Hmm I'll check again, but it's one of those Sony VAIO laptops, and I remember hearing not many good things about their BIOS settings. Afaik there's hard drive, floppy, cd-rom, and network. A network boot to a Linux installer might be an option. The standard network boot method in practically all modern PC hardware is known as PXE. Basically, to implement a PXE netboot server you'll need at minimum two services, which can both run on a single host:
- a very basic DHCP server, for responding to the laptop's network boot query with a "to boot, download this file from this IP address" response (basically one or two DHCP options)
- a TFTP server, for supplying the netboot loader file, its configuration file, and the Linux kernel and initrd files (maybe some small boot menu image files too)
For Debian/Ubuntu, the above might be enough if the installer can have Internet connectivity: it will download the rest from the distribution's standard mirrors. For some other distributions, you may have to provide the rest of the installation ISO image from an NFS or HTTP server (or even an FTP server or Windows network share with some distributions). http://tftpd32.jounin.net/ is a Windows application that contains practically all of the basic server-side functionality for PXE network booting. (If your distribution of choice requires a local NFS/HTTP/FTP service to download the main installation image from, that is not included.) With a bit of Googling, you'll probably find step-by-step instructions for setting up a netboot installation server for your Linux distribution of choice. However, they will usually assume you already have at least one Linux system to use as the netboot server.
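For a one-box setup, dnsmasq can provide both of the pieces telcoM describes (the DHCP responder and the TFTP server). A minimal sketch, assuming a 192.168.1.0/24 LAN whose existing router keeps handing out addresses, and a syslinux `pxelinux.0` loader dropped into /srv/tftp:

```shell
# /etc/dnsmasq.conf -- minimal PXE boot helper (network range is an assumption)
port=0                        # disable DNS; we only want DHCP/TFTP here
dhcp-range=192.168.1.0,proxy  # proxy mode: answer PXE queries, hand out no leases
dhcp-boot=pxelinux.0          # the netboot loader file to advertise
enable-tftp                   # serve files over TFTP...
tftp-root=/srv/tftp           # ...from here (put pxelinux.0, kernel, initrd in it)
pxe-service=x86PC,"Install Linux",pxelinux
```

Proxy-DHCP mode means dnsmasq only answers the PXE part of the boot request, so it can coexist with the router's normal DHCP server.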
|
# ? Jan 29, 2015 18:32 |
|
Lukano posted:Can anyone point me in the direction of an incremental, encrypted, backup tool to use with the 1TB of Google Drive space that I have? Why did you use duplicati instead of duplicity (which duplicati is a c# reimplementation of)? https://en.wikipedia.org/wiki/Duplicity_%28software%29
|
# ? Jan 29, 2015 19:30 |
|
Spazz posted:You could always try PXE for the install, but it'll require another computer. I'd also try Googling "Sony Vaio <model> USB boot" and see if anything comes up, it could just require a BIOS update to get that boot option. telcoM posted:A network boot to Linux installer might be an option. The standard network boot method in practically all modern PC hardware is known as PXE. I've got another Linux machine up and running, so that shouldn't be too difficult. Will try googling for BIOS update first, though, ta
|
# ? Jan 29, 2015 20:04 |
|
Longinus00 posted:Why did you use duplicati instead of duplicity (which duplicati is a c# reimplementation of)? https://en.wikipedia.org/wiki/Duplicity_%28software%29 Honestly, I don't know -- Duplicati ends up being the primary search result in most of my previous searches for a solution, so I totally forgot Duplicity was an option. That said, the way that duplicati / duplicity do backups is not ideal for me: having to re-up 500+ gigs every <interval> because they don't encrypt per-file is a pain and a waste of bandwidth (even if I have it to spare).
|
# ? Jan 29, 2015 21:51 |
|
Is this the thread to ask about command line stuff? If not, I'll take this down. I'm having trouble getting a cronjob to write a textfile to the index.html file located in /var/www I'm not a seasoned linux user or scripter by any means, but I'm not totally new to linux either. This is the cronjob I have currently and I don't know why it's not working. code:
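Judging from fatherdog's diagnosis in the next post, the entry probably looked roughly like this (the schedule field is a guess):

```shell
# Likely shape of the broken crontab line: the .txt file itself is being
# *executed*, and only its (nonexistent) output is redirected to index.html.
0 * * * * ~/recent500.txt > /var/www/index.html
```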
|
# ? Jan 29, 2015 23:02 |
|
enthe0s posted:Is this the thread to ask about command line stuff? If not, I'll take this down. Currently, it's trying to execute ~/recent500.txt and sending the output to the index file. You want either code:
code:
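A sketch of the two fixes being suggested, with the schedule field as a placeholder:

```shell
# Option 1: cat the file's contents into index.html
0 * * * * cat ~/recent500.txt > /var/www/index.html
# Option 2: just copy the file over it
0 * * * * cp ~/recent500.txt /var/www/index.html
```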
|
# ? Jan 29, 2015 23:09 |
|
Lukano posted:Can anyone point me in the direction of an incremental, encrypted, backup tool to use with the 1TB of Google Drive space that I have? There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers. That cloudsync utility looks pretty nice, but you can also always roll your own solution. For ideas, see my post history in this thread for a quick script I wrote that uses encfs and rsync to send encrypted backups out to an SFTP host.
|
# ? Jan 29, 2015 23:13 |
|
Powered Descent posted:There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers. I suddenly want to build up a tiny windows vm set up to mount a linux directory in the gdrive location. So I can have the single most inefficient solution to the gdrive problem.
|
# ? Jan 29, 2015 23:25 |
|
fatherdog posted:Currently, it's trying to execute ~/recent500.txt and sending the output to the index file. Ok, that makes sense as to why it wasn't doing anything before. The previous 2 cronjobs I made both run a command, so I guess I wasn't fully grasping what was happening here. I updated the line to use cat like you said, but it's still not working. Here's what I have currently: code:
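The updated entry presumably resembled the following (schedule is illustrative); note it has two separate problems:

```shell
# Presumed shape of the updated user-crontab entry:
0 * * * * sudo cat ~/recent500.txt > /var/www/index.html
# Problem 1: cron has no terminal, so it can't answer sudo's password prompt.
# Problem 2: the `>` redirection is performed by the unprivileged shell,
#            not by sudo, so the write to /var/www still fails.
```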
I think I need the sudo version because I can't just move files into that /www/ directory without it, but I tried it without sudo just to see and still nothing. I also tried putting ~ in front of the path (~/var/www/index.html), but that didn't work either, so I'm at a loss.
|
# ? Jan 29, 2015 23:30 |
|
Powered Descent posted:There was a discussion a little while ago all about gdrive and its various alternatives. It starts on page 512. Google hasn't released an official Linux client that will let you properly mount your gdrive, but there are several unofficial clients. And of course other cloud storage providers. Awesome, thank you very much. I totally forgot about encfs, but now that you've reminded me I recall that it was what I had originally intended to use when I signed up for the 1TB gdrive service 6 months ago (and then promptly procrastinated setting up, thus forgetting what I was going to do to utilize it). I suspect the encfs / rsync solution will end up being a faster solution than cloudsync (which currently is running pretty slow). Or is that just the gdrive api rate-limiting? edit - I don't suppose you happen to have a link to the post / script handy? I'm going back through your post history, but you're a fairly prolific poster :P Lukano fucked around with this message at 23:38 on Jan 29, 2015 |
# ? Jan 29, 2015 23:35 |
|
enthe0s posted:Ok, that makes sense as to why it wasn't doing anything before. The previous 2 cronjobs I made both run a command, so I guess I wasn't fully grasping what was happening here. I updated the line to use cat like you said, but it's still not working. Here's what I have currently: Put it in the root crontab, and it will run as root, saving you from needing to use sudo (sudo may be breaking it as well, if it is asking for a password)
|
# ? Jan 29, 2015 23:43 |
|
RFC2324 posted:Put it in the root crontab, and it will run as root, saving you needing to use sudo(sudo may be breaking it as well if it is asking for a password) This worked! code:
|
# ? Jan 30, 2015 00:05 |
|
Lukano posted:Awesome, thank you very much. I totally forgot about encfs, but now that you've reminded me I recall that it was what I had originally intended to use when I signed up for the 1TB gdrive service 6 months ago (and then promptly procrastinated setting up, thus forgetting what I was going to do to utilize it). Sure thing, right here. Note that my rsync options include a nonstandard port number for ssh, so you'd need to either get rid of that, or just dump ssh entirely if you have your cloud storage mounted another way. Also, the rsync command is inside an eval statement because I was a stickler for pulling the options out into a variable. Scroll up a few posts from there for details on that, and feel free to simplify to your taste. In normal mode, encfs keeps files in an encrypted rootdir, of which you get a decrypted "view" in a mountpoint of your choosing. But if your stuff is all kept in plaintext, you can use encfs in --reverse mode to get an encrypted "view" of a normal folder. My script then rsyncs this view to a remote host. Before this will work, you'll obviously need to set up the encfs password on the directory you want to sync. Something like: code:
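A sketch of that one-time encfs setup and the sync it enables (paths and the remote host are placeholders, not values from the original script):

```shell
# One-time setup: create the keying material and mount an encrypted *view*
# of the plaintext directory (reverse mode; you'll be prompted for a password).
encfs --reverse /home/user/data /mnt/encrypted-view

# Back up the encrypted view; destination is a placeholder SFTP host.
rsync -av --delete /mnt/encrypted-view/ backup@example.com:gdrive-backup/

# Drop the view when finished.
fusermount -u /mnt/encrypted-view
```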
The rest of the script should be pretty straightforward, but give a shout if you have questions. e: Adding a quick warning that this script is a one-way sync and intended for backing up a single box. It doesn't do any kind of nice multi-host sync, but simply makes the remote host into an encrypted mirror of whatever you're backing up. Add something to that folder manually, on the gdrive website? It's toast on the next run of the backup script. Have two machines backing up to the same folder on your cloud storage? They'll be undoing each other's changes every single time. Powered Descent fucked around with this message at 00:34 on Jan 30, 2015 |
# ? Jan 30, 2015 00:28 |
|
Perfect! Yeah, this is the only device that will be backing up to gdrive. It's an rPi with a sata>usb attached drive that just sits there pulling in stuff to archive via rsync, then will shuttle it out to gdrive. I tried cloudsync, but it would have taken ~260 days to finish backing up the ~500 gigs I anticipate. 90% of the delay was the program doing OpenPGP AES-256 encryption on each file. I'm hoping that having some granular control over the encryption aspect with encfs will allow me to pick something more suited to / faster on the rPi. edit - Actually I do have a question for you Powered Descent. Why encfs instead of ecryptfs, which if I am reading correctly should potentially be faster, and stores the metadata in the files themselves, as opposed to requiring that separate .xml? Lukano fucked around with this message at 01:32 on Jan 30, 2015 |
# ? Jan 30, 2015 01:14 |
|
Lukano posted:edit - Actually I do have a question for you Powered Descent. Why encfs instead of ecryptfs, which if I am reading correctly should potentially be faster, and stores the metadata in the files themselves, as opposed to requiring that separate .xml? Mostly because ecryptfs doesn't have the "reverse" mode that I wanted to use. (Or if it does, I couldn't find anything about it.) I prefer to keep my backups encrypted when I send them off to the cloud (AKA someone else's computer), but I'm not quite paranoid enough to bother with keeping it all encrypted on my local box. With encfs --reverse, I can generate an encrypted view of my stuff on the fly, taking up zero disk space. If you're already keeping things encrypted locally, then you're right, ecryptfs has definite advantages.
|
# ? Jan 30, 2015 01:53 |
|
Is there a Linux program that will download attachments from my Gmail account, preferably with the ability to set a filter? Like, let's say I want to download all attachments sent to me from fart@email.com, and only that address. Is there a way to mass download them on Linux?
|
# ? Jan 30, 2015 05:49 |
karl fungus posted:Is there a Linux program that will download attachments from my Gmail account, preferably with the ability to set a filter? You could probably whip something up in python using the Gmail API to do this: https://developers.google.com/api-client-library/python/apis/gmail/v1
|
|
# ? Jan 30, 2015 06:59 |
|
enthe0s posted:This is the cronjob I have currently and I don't know why it's not working. I know others have suggested alternatives, but assuming you were trying to cat ~/recent500.txt, the > won't pass the root permissions along. You will need to use '| sudo tee /var/www/index.html'.
|
# ? Jan 30, 2015 15:11 |
|
enthe0s posted:This worked! Every user has their own personal crontab that runs with their permissions, so you could use this to set up the cron job to run as, say, your apache user, for a more secure setup. Right now your script is running as root, so if anyone other than you can edit the script, they can use it to take control of the system.
|
# ? Jan 31, 2015 03:40 |
|
Superdawg posted:I know others have suggested alternatives, but assuming you were trying to cat ~/recent500.txt, the > won't pass the root permissions along. You will need to use '| sudo tee /var/www/index.html'. Is 'sudo tee' any different than just doing 'sudo cat'? Cause the only difference I see is tee displays the info while also letting you write to a file. RFC2324 posted:Every user has their own personal crontab that runs with their permissions, so you could use this to set up the cron job to run as, say, your apache user, for a more secure setup. Right now your script is now running as root, so if anyone other than you can edit the script, they can use it to take control of the system. Yeah I was a bit wary of using the root crontab for that very reason, but it's a learning exercise and not going into production anywhere. The user I was trying to run this with initially has root access, but it still wouldn't let me do it from the user's crontab for some reason. Like I was using sudo to install apache2 earlier and it didn't even prompt me for a password.
|
# ? Jan 31, 2015 18:56 |
|
enthe0s posted:Is 'sudo tee' any different than just doing 'sudo cat'? Cause the only difference I see is tee displays the info while also letting you write to a file. The problem with doing "sudo cat file.txt > /root/access/required/foo.txt" is that the sudo rights-elevation only applies to the cat statement that's outputting the contents of the file. The redirection done by a > or a >> is done by your regular shell, which won't have elevated privileges. By using a program (tee in this case) to write it to the new file, and using the sudo on that command, you can get around this. The default way sudo is configured is to require a password from the user that's invoking it, but when you've done so once, it won't bother asking you again for a little while. That's probably why you weren't prompted before when you did the install. It's possible to set up a user such that specific commands (or all commands) won't ask you for your password at all when you sudo them, and it'll just go and do it. This is what you'll want to do if you need a script to sudo something. Take a look in /etc/sudoers to see how you're currently set up. Warning: if you're going to experiment with changing this file, make sure to use the command visudo instead of just editing the file with a text editor. Screw up the file format and you might end up with no users on the box able to use sudo... including to go back in and fix the sudoers file! visudo does a basic sanity check before it saves the file.
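The redirection point is easy to demonstrate without root: tee itself performs the file write, so it is the process that needs the privileges, not the shell doing the `>`. A minimal illustration (file path is arbitrary):

```shell
# `sudo cmd > file` fails because the redirection happens in the calling,
# unprivileged shell before sudo ever runs; `cmd | sudo tee file` moves the
# file write into the elevated process. Plain tee shows the mechanics:
echo "<h1>hello</h1>" | tee /tmp/index.demo.html > /dev/null
```

With sudo in front of tee (and a NOPASSWD sudoers entry for cron use), the same pipeline can write to root-owned paths like /var/www.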
|
# ? Jan 31, 2015 20:59 |
|
Is there a distro with a decent GUI that's significantly lighter than Lubuntu? I recently upgraded from 11 to 14 on an ancient XP laptop, and while it's probably more secure and stable now it's also become unusably slow through VNC. Like I'll click on something and 15 seconds later it will select. I don't need much, I just use it to remote into and browse dumb crap from work. So I'd need to go in and set up SSH and a VNC server, and be able to run at least a lighter browser like Opera Mini or similar. So far I've been reading about Porteus but I've never used a 'live' distro before, are those going to be lighter than an installed OS? e: Should also mention I would like to run WINE for a few things, Foobar and the like. That's probably the most resource heavy thing I'd still want to be able to do, don't need to watch videos or play games or anything. Takes No Damage fucked around with this message at 23:39 on Feb 2, 2015 |
# ? Feb 2, 2015 19:42 |
|
Takes No Damage posted:Is there a distro with a decent GUI that's significantly lighter than Lubuntu? I recently upgraded from 11 to 14 on an ancient XP laptop, and while it's probably more secure and stable now it's also become unusably slow through VNC. Like I'll click on something and 15 seconds later it will select. "live" tends to be significantly slower. Try a BSD, honestly
|
# ? Feb 3, 2015 00:39 |
|
How does BSD compare to Linux as a desktop?
|
# ? Feb 3, 2015 00:41 |
|
karl fungus posted:How does BSD compare to Linux as a desktop? BSD is about 5 years behind on Linux for Desktop, maybe more.
|
# ? Feb 3, 2015 00:42 |
|
What's the general consensus regarding setting ssh on a port other than 22? On the one hand I know that having a port <1024 is considered better as these are privileged, but on the other doesn't having it on a port other than 22 provide an extra level of protection?
|
# ? Feb 3, 2015 01:32 |
|
Any port scanner in use today would find SSH on a non-standard port instantly, so no it doesn't really add any protection.
|
# ? Feb 3, 2015 01:39 |
|
Really all it's good for is reducing log spam from drive-by idiots brute forcing you. Disable root login, disable passwords (SSH keypairs only), congrats you're now more locked down than 90% of Linux servers on the internet.
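In sshd_config terms, the lockdown described above amounts to something like this (a sketch; exact option defaults vary by distribution):

```shell
# /etc/ssh/sshd_config
Port 22                     # moving this mostly just reduces log noise
PermitRootLogin no          # no direct root logins
PasswordAuthentication no   # key pairs only
PubkeyAuthentication yes
```

Remember to add your public key to ~/.ssh/authorized_keys and test a key-based login *before* restarting sshd with passwords disabled.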
|
# ? Feb 3, 2015 03:27 |
|
spankmeister posted:BSD is about 5 years behind on Linux for Desktop, maybe more. I don't even know what this is supposed to mean.
|
# ? Feb 3, 2015 06:27 |
|
evol262 posted:I don't even know what this is supposed to mean. I was thinking 5 years might be a bit too much, but having run some desktop BSD relatively recently I am pretty much in agreement. I ran PCBSD 9.2 as my main desktop for a while, and while I haven't tried the latest release yet (10.1.1 just came out today), I really doubt it would hold a candle to a good Linux desktop. I have pretty decent Internet, and the updates on PCBSD were so huge and so slow that it really detracted from the experience. In all fairness, maybe PCBSD is unfair to use as an example, but it is the BSD most directly aimed at the desktop to my knowledge.
|
# ? Feb 3, 2015 09:24 |
|
|
|
I am looking for a Dropbox replacement in terms of linux compatibility. I currently have Dropbox started through init, use it as a directory, and I have an encfs partition running on it. Is there another one that is good and offers something like this? Or at least a directory mode with full client-side encryption?
|
# ? Feb 3, 2015 10:39 |