fuf
Sep 12, 2004

haha
Nice, those are both really good ideas thanks :)

I have been obsessively trying to secure those wordpress sites as much as possible since they got hacked, so I should definitely implement that second idea. gently caress, I am always confused by permissions though... what's the best way to only give www-data read access? Should I set the owner to my user account and change the group to www-data? Then only give the group read permissions? I use wp-cli to do all the wordpress updates...


Thalagyrt
Aug 10, 2006

fuf posted:

Nice, those are both really good ideas thanks :)

I have been obsessively trying to secure those wordpress sites as much as possible since they got hacked, so I should definitely implement that second idea. gently caress, I am always confused by permissions though... what's the best way to only give www-data read access? Should I set the owner to my user account and change the group to www-data? Then only give the group read permissions? I use wp-cli to do all the wordpress updates...

Assuming a Wordpress install in /var/www/blagoblag.com, cd /var/www && chown -R root:www-data blagoblag.com && chown -R www-data:www-data blagoblag.com/wp-content/uploads should do the trick. So long as www-data isn't the owner and doesn't have write access via the group permissions it'll be fine. Leave www-data as the group so you can ensure that www-data gets read permissions.

cd first simply to avoid accidentally hitting enter and chowning /. Basically never start a chown -R with /, unless you really like to live dangerously. :)

Also, I don't know if you have or not, but make sure you never chmod 777 stuff, that's just asking for trouble and is terrible advice that seems to live only in the PHP community.
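To spell that scheme out end to end, here's a sketch (demonstrated in a scratch directory, since the chown steps need root; the blagoblag.com path is just the example from above, so substitute your real /var/www path):

```shell
# Scratch-directory demo of the owner/group scheme above. The chown lines
# need root, so they're left as comments; as root you'd run them against
# the real install path.
site=$(mktemp -d)/blagoblag.com
mkdir -p "$site/wp-content/uploads"
touch "$site/wp-config.php"
#   chown -R root:www-data "$site"
#   chown -R www-data:www-data "$site/wp-content/uploads"
find "$site" -type d -exec chmod 750 {} +   # group can read/traverse, not write
find "$site" -type f -exec chmod 640 {} +   # group gets read-only files
stat -c '%a' "$site/wp-config.php"          # prints 640
```

With that in place www-data can read everything via the group but write nothing outside uploads, and wp-cli run as the owning user still works normally.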

fuf
Sep 12, 2004

haha
Thanks Thalagyrt, super helpful.

btw I was moments away from signing up for your managed vps service the other day, but all of my traffic is from the UK. you should get a european location! :)

pipebomb
May 12, 2001

Dear God, what is it like in your funny little brains?
It must be so boring.
Can you guys recommend a way for me to monitor mysql, and essentially ensure it is always running? I don't know why, but it occasionally crashes, in which case I get a Pingdom alert. I've set up a script that will allow me to quickly log in from my phone (using WorkflowHQ) and restart either the process or the machine, but I really need to figure out why it is happening and find a way to respond to it as needed.

Where to start?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Something as small as supervisord or runit might be enough to ensure it's always running.

Monit might be a better choice for error recovery and telling you when it fell over.
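For reference, a minimal monitrc stanza for this looks roughly like the following (the pidfile/socket paths and service commands are assumptions; they vary by distro, so check where yours puts them):

```
check process mysqld with pidfile /var/run/mysqld/mysqld.pid
  start program = "/usr/sbin/service mysql start"
  stop program  = "/usr/sbin/service mysql stop"
  if failed unixsocket /var/run/mysqld/mysqld.sock then restart
  if 5 restarts within 5 cycles then alert
```

The socket test catches the case where the process exists but has stopped answering, which a plain pidfile check would miss.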

pipebomb
May 12, 2001

Dear God, what is it like in your funny little brains?
It must be so boring.

minato posted:

Something as small as supervisord or runit might be enough to ensure it's always running.

Monit might be a better choice for error recovery and telling you when it fell over.

Thanks. I was actually just looking at Monit.

JHVH-1
Jun 28, 2002
This might work on wordpress too, but I remember using it for other applications that had a directory for images and other media where you knew for sure scripts were never going to run. The place I was working at the time had customers who just wouldn't switch to an application that wasn't written horribly. There is also an htaccess directive that disables just php, "php_flag engine off", which you can put in apache's config or .htaccess depending on how your server is set up.

code:
#prevent cgi scripts in .htaccess
AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi 
Options -ExecCGI
#prevent files placed in directory via web server
<Limit POST PUT>
 order deny,allow
 deny from all
</Limit>

#prevent by extension
<FilesMatch "\.(php|php5|php4|sh|pl|sql|gz|exe|bat|rb|asp|aspx|cgi|java)$">
Deny from all
</FilesMatch>

waffle iron
Jan 16, 2004
Hooray, my wish came true and rpmfusion-nonfree started repackaging Dropbox's horrible RPMs.

Mr Shiny Pants
Nov 12, 2012
Is it me, or is Docker really awesome? I just installed a Wordpress container on a fresh Debian install. Smooooth.....

Is it too good to be true? Am I being retarded for installing and running Wordpress through Docker?

Mr Shiny Pants fucked around with this message at 20:29 on Dec 27, 2014

kujeger
Feb 19, 2004

OH YES HA HA

Mr Shiny Pants posted:

Is it me, or is Docker really awesome? I just installed a Wordpress container on a fresh Debian install. Smooooth.....

Is it too good to be true? Am I being retarded for installing and running Wordpress through Docker?

Docker is great. Just, you know, be aware that the docker image will not be updated automatically when you update the host OS.
You'll need to either update things inside the running container, or download and run a newer docker image (and figure out how to keep your data. Either export/import it, or -- much better -- keep the data on a separate docker volume).
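The second option is usually just a pull and a re-run. Roughly, and era-appropriately using a data-only container (the blog/blog-data names here are made up for illustration):

```
docker pull wordpress                      # fetch the updated image
docker stop blog && docker rm blog         # drop the old container, not the data
docker run -d --name blog --volumes-from blog-data wordpress
```

Because the volumes live on the blog-data container, the replacement container picks up the same files.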

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
I am a huge Docker fanboy (well, specifically a containerization fanboy). Compared to the host + configuration hell I've been lumped with for the last few years, managing and running containers is a breath of fresh air. Just being able to have a multitude of apps available on a single system without any chance of them conflicting is a huge benefit for both Dev and Ops.

It's not a panacea, it's still early days and it's yet to reach maturity. But I truly think that most Linux apps will be delivered this way in the future.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.
As awesome as Docker is, it is amusing to see the containerization community rediscover all the problems that traditional packaging systems worked through fifteen years ago.

Mr Shiny Pants
Nov 12, 2012

Misogynist posted:

As awesome as Docker is, it is amusing to see the containerization community rediscover all the problems that traditional packaging systems worked through fifteen years ago.

True, but it is pretty nice when it works. It seems like it has a lot of mindshare. I've just installed a couple of programs that had Docker containers readily available.

It feels a bit wasteful though, creating almost complete Linux installations to run an app.

The part about keeping my data is also something I am not really keen on yet. The Wordpress container runs its own MySQL installation. How do I extract my stuff?
Another thing is that you don't know how well they packaged the container and the application settings. Maybe they left dumb defaults in the container; how do you figure this out? Might be my newbness though.

Janitor Prime
Jan 22, 2004

PC LOAD LETTER

What da fuck does that mean

Fun Shoe
Go check the security fuckup thread in yospos for some docker hilarity. It's the wave of the future no doubt but someone is going to do it better

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

Mr Shiny Pants posted:

True, but it is pretty nice when it works. It seems like it has a lot of mindshare. I've just installed a couple of programs that had Docker containers readily available.

It feels a bit wasteful though, creating almost complete Linux installations to run an app.
If it's done correctly, it's pretty far from a complete Linux installation. It shouldn't be significantly heavier than an omnibus install of an app running on a native image, but people are still figuring out the best way to Dockerize their applications.

Mr Shiny Pants posted:

The part of keeping my data is also something I am not really keen on yet. The Wordpress container runs it's own MySQL installation. How do I extract my stuff?
The standard Docker solution to this is to expose MySQL to a partner container that's responsible for running some kind of backup process, but this isn't a perfect solution for everyone. For example: let's say your database is gigantic, too big to reliably mysqldump. What's the standard Docker way to signal the database to quiesce so you can snapshot the underlying volume? Nobody knows.
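That partner-container pattern, sketched (the blog-db container name and paths are hypothetical):

```
# One-shot backup container that shares the database container's volumes
docker run --rm --volumes-from blog-db -v "$PWD":/backup \
    busybox tar czf /backup/mysql-data.tgz /var/lib/mysql
```

Note this is only a file-level copy; for a consistent snapshot you still have to quiesce or lock the database first, which is exactly the gap being described.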

Mr Shiny Pants posted:

Another thing is that you don't know how well they packaged the container and the application settings. Maybe they left dumb defaults in the container, how do you figure this out? Might be my newbness though.
This is a problem that should largely go away as containers are released by first parties instead of enthusiastic non-experts in the community, but this is pretty much exactly what I meant by people rediscovering all the problems of traditional packaging. There have been security issues in packages since the dawn of package management, and that's exactly why Fedora, Debian, and other distributions have formal review processes for packages. Docker will continue to grow in complexity to handle edge cases until it resembles building RPMs, the community standards will become more stringent until they look exactly like Fedora's packaging guidelines, and then we're right back where we started.

When Shellshock hit, I was really glad my applications weren't from some loving community Docker repository.

Janitor Prime posted:

Go check the security fuckup thread in yospos for some docker hilarity. It's the wave of the future no doubt but someone is going to do it better
Docker's recent checksum non-validation comedy shitshow aside, the LXC model still has a long way to go for sensitive data. There are lots of paths to sensitive information disclosure from another container on the same host, and there are still lots of issues surrounding mandatory access control, auditing, cross-container resource accounting, and so forth. I think Docker is great for creating deployable artifacts that work reliably between dev and prod environments (with all the operational security issues that necessarily entails), but we're a long way off from being able to run containers in a generic but still operationally responsible way the same way that we run virtual machines.

Vulture Culture fucked around with this message at 21:32 on Dec 27, 2014

Mr Shiny Pants
Nov 12, 2012

Misogynist posted:

If it's done correctly, it's pretty far from a complete Linux installation. It shouldn't be significantly heavier than an omnibus install of an app running on a native image, but people are still figuring out the best way to Dockerize their applications.

The standard Docker solution to this is to expose MySQL to a partner container that's responsible for running some kind of backup process, but this isn't a perfect solution for everyone. For example: let's say your database is gigantic, too big to reliably mysqldump. What's the standard Docker way to signal the database to quiesce so you can snapshot the underlying volume? Nobody knows.

This is a problem that should largely go away as containers are released by first parties instead of enthusiastic non-experts in the community, but this is pretty much exactly what I meant by people rediscovering all the problems of traditional packaging. There have been security issues in packages since the dawn of package management, and that's exactly why Fedora, Debian, and other distributions have formal review processes for packages. Docker will continue to grow in complexity to handle edge cases until it resembles building RPMs, the community standards will become more stringent until they look exactly like Fedora's packaging guidelines, and then we're right back where we started.


Thanks for the post, seems like I will be installing Wordpress on its own machine.

Isn't that IT in a nutshell? Old is new again? I've seen it countless times now.

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
One of Docker's hooks is "wow, it's so easy, I just downloaded this community Docker build of application X and had it running in seconds!" and that's great for experimentation, but the community is really not where people should be getting their production containers. To my mind, the value of Docker is "wow, it's so easy, I just downloaded this build of application X from our internal build server* and had it running in seconds!"

* where the build server is maintained by a competent DevOps group who will audit the build process and take care of any Shellshock-style vulnerabilities.

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

minato posted:

One of Docker's hooks is "wow, it's so easy, I just downloaded this community Docker build of application X and had it running in seconds!" and that's great for experimentation, but the community is really not where people should be getting their production containers. To my mind, the value of Docker is "wow, it's so easy, I just downloaded this build of application X from our internal build server* and had it running in seconds!"

* where the build server is maintained by a competent DevOps group who will audit the build process and take care of any Shellshock-style vulnerabilities.
I've said it before, and I'll say it again: Docker is a tool for rapid prototyping, and a tool for highly complex release engineering. Anything outside these two problem domains is likely doomed to failure.

p.s. gently caress "devops groups"


kujeger
Feb 19, 2004

OH YES HA HA

GregNorc posted:

Technically an OSX question, but I'd rather do this on the CLI than use some proprietary app.

Is there a way to easily delete the contents of a folder after a specified time?

I created a folder called ~/screenshots, and tweaked OSX so all screenshots go in there.

I'd like to make a script to delete the contents of that folder bi-weekly. (Actually, to be more specific, I'd like the files moved to the Trash rather than disappear completely)

Delete everything in that folder every two weeks, or delete files older than two weeks every two weeks?


e: In any case, the Trash is located in /Users/username/.Trash/, so what you would want is to move the files there instead of outright deleting them.

If what you want is to move everything in that folder to trash, something like "mv ~/screenshots/* ~/.Trash/", possibly creating a new directory to sort them, e.g.:
code:
mkdir ~/.Trash/$(date +%Y_%m_%d) && mv ~/screenshots/* ~/.Trash/$(date +%Y_%m_%d)/
Here's a convenient website that will create a launchd configuration file for you to execute the above whenever you need it:
http://launched.zerowidth.com/

Launchd will make sure that if your laptop is off or suspended when the command was supposed to run, the command will run when your laptop wakes up.

For example, this will run the command at 1800 hours, on the 1st and 15th every month: http://launched.zerowidth.com/plists/2f9afac0-7049-0132-b41f-3625a680fe4a
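The generated file is just a launchd property list; it boils down to something like this (the Label is arbitrary; save it under ~/Library/LaunchAgents/ and load it with launchctl load):

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key><string>com.example.trash-screenshots</string>
  <key>ProgramArguments</key>
  <array>
    <string>/bin/sh</string><string>-c</string>
    <string>mkdir -p ~/.Trash/$(date +%Y_%m_%d) &amp;&amp; mv ~/screenshots/* ~/.Trash/$(date +%Y_%m_%d)/</string>
  </array>
  <key>StartCalendarInterval</key>
  <array>
    <dict><key>Day</key><integer>1</integer><key>Hour</key><integer>18</integer></dict>
    <dict><key>Day</key><integer>15</integer><key>Hour</key><integer>18</integer></dict>
  </array>
</dict>
</plist>
```

StartCalendarInterval with an array of dicts is what gives you the "1st and 15th at 1800" behavior.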

kujeger fucked around with this message at 23:56 on Dec 27, 2014

Death Vomit Wizard
May 8, 2006
Bottom Feeder
I want to do a minimal (headless) linux install for a dedicated VM host. I will use KVM. Do any particular distros stand out for this use case? Obvious concerns that come to mind are stability and security. It's a home server that will only ever be a single host.

I am not a book
Mar 9, 2013

Death Vomit Wizard posted:

I want to do a minimal (headless) linux install for a dedicated VM host. I will use KVM. Do any particular distros stand out for this use case? Obvious concerns that come to mind are stability and security. It's a home server that will only ever be a single host.

The standard Ubuntu/Debian/CentOS choices would be good. Go with what you're familiar with.

Death Vomit Wizard
May 8, 2006
Bottom Feeder
That is good news. Coming from a background of Debian and Ubuntu desktop use, I look forward to trying out CentOS for this project. Checking out the amount of free, quality documentation out there for RHEL has made me pretty excited.

spankmeister
Jun 15, 2008

Red Hat is pretty heavily invested in KVM which makes RHEL and by extension CentOS a good platform for it.

Liam Emsa
Aug 21, 2014

Oh, god. I think I'm falling.
I have a server running right beneath my laptop. It's connected to the same network. Both are running Ubuntu. Short of hooking up a monitor, how do I find out its IP?

Docjowles
Apr 9, 2009

If the server is configured to use DHCP, just check the leases on whatever device hands out DHCP. If not, grab the network info from your laptop and try running a scan of your subnet with nmap. It should locate the server unless it's severely firewalled off. Assuming it's a home lab or something else you're in control of, that is. Randomly port scanning other people's networks is frowned upon.
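Concretely, that scan looks something like this (the 192.168.1.0/24 subnet is just an example; use whatever network your laptop's address is on):

```
ip addr show              # note your laptop's address, e.g. 192.168.1.42/24
nmap -sn 192.168.1.0/24   # ping scan: lists live hosts without port scanning
```

The -sn flag does host discovery only, which is all you need to spot the server's IP.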

vvv cool

Docjowles fucked around with this message at 03:53 on Dec 30, 2014

Liam Emsa
Aug 21, 2014

Oh, god. I think I'm falling.

Docjowles posted:

If the server is configured to use DHCP, just check the leases on whatever device hands out DHCP. If not, grab the network info from your laptop and try running a scan of your subnet with nmap. It should locate the server unless it's severely firewalled off. Assuming it's a home lab or something else you're in control of, that is. Randomly port scanning other people's networks is frowned upon.

Yeah found it with nmap, thanks.

jaegerx
Sep 10, 2012

Maybe this post will get me on your ignore list!


Liam Emsa posted:

Yeah found it with nmap, thanks.

http://bash.org/?5273

ChaiCalico
May 23, 2008


That's what the eject command is for.


Are there any decent linux news and discussion podcasts?

Cidrick
Jun 10, 2001

Praise the siamese

madpanda posted:

Are there any decent linux news and discussion podcasts?

Seconding this. I listened to This Week in Enterprise Tech a bit, but it's very... broad. I'm still looking for something good and a bit more technically-focused.

mod sassinator
Dec 13, 2006
I came here to Kick Ass and Chew Bubblegum,
and I'm All out of Ass
TWiT's FLOSS weekly podcast is good in my opinion. It's always on top of recent events and interviews people from interesting projects. http://twit.tv/show/floss-weekly

Liam Emsa
Aug 21, 2014

Oh, god. I think I'm falling.

I heard a story once about a local university where they were tracking down and replacing old computers. They had found every machine except for one. Eventually they had to follow the cable physically, and it led them straight into a wall. The machine had been behind the wall, built over, and been just sitting there turned on for like a decade with no one noticing.

Docjowles
Apr 9, 2009

Liam Emsa posted:

I heard a story once about a local university where they were tracking down and replacing old computers. They had found every machine except for one. Eventually they had to follow the cable physically, and it led them straight into a wall. The machine had been behind the wall, built over, and been just sitting there turned on for like a decade with no one noticing.

Is this on Snopes yet? Not saying I doubt you personally but I've probably heard this exact story ten different times.

I will say that (before I worked there) my company went through a full data center move. They found a couple servers that had been racked, powered on, connected to the network... And then never used :saddowns: We had been in that DC for many years so those boxes spent their entire 3 year amortization (or whatever) just sitting idle. loving inventory, how does it work?

Liam Emsa
Aug 21, 2014

Oh, god. I think I'm falling.

Docjowles posted:

Is this on Snopes yet? Not saying I doubt you personally but I've probably heard this exact story ten different times.

I will say that (before I worked there) my company went through a full data center move. They found a couple servers that had been racked, powered on, connected to the network... And then never used :saddowns: We had been in that DC for many years so those boxes spent their entire 3 year amortization (or whatever) just sitting idle. loving inventory, how does it work?

Nah, I think this one was real, I was pretty sure it was UNC, and this came up:

http://www.theregister.co.uk/2001/04/12/missing_novell_server_discovered_after/

quote:

In the kind of tale any aspiring BOFH would be able to dine out on for months, the University of North Carolina has finally located one of its most reliable servers - which nobody had seen for FOUR years.

One of the university's Novell servers had been doing the business for years and nobody stopped to wonder where it was - until some bright spark realised an audit of the campus network was well overdue.

According to a report by Techweb it was only then that those campus techies realised they couldn't find the server. Attempts to follow network cabling to find the missing box led to the discovery that maintenance workers had sealed the server behind a wall.

Things buried behind walls belong more to the world of Edgar Alan Poe than that of the BOFH. And think of the horror facing the college techies if they ever replace this old Novell server with an NT box.

In that case, the terror of the Blue Screen of Death awaits you, fellas. ®

RFC2324
Jun 7, 2012

http 418

Docjowles posted:

Is this on Snopes yet? Not saying I doubt you personally but I've probably heard this exact story ten different times.

I will say that (before I worked there) my company went through a full data center move. They found a couple servers that had been racked, powered on, connected to the network... And then never used :saddowns: We had been in that DC for many years so those boxes spent their entire 3 year amortization (or whatever) just sitting idle. loving inventory, how does it work?

Ask again in the poo poo that pisses you off thread. I have seen 2 people claim it happened to them there.

Or just read the thread, it was in the current incarnation that I read it (or possibly the Ticket Came in thread)


My Rhythmic Crotch
Jan 13, 2011

I am having a hell of a time figuring out what switch I need to flip in order to get kernel debugging messages to show up. This is under Debian 7 with the following kernel:
code:
Linux arm 3.14.26-ti-r41 #1 SMP PREEMPT Mon Dec 15 20:01:07 UTC 2014 armv7l GNU/Linux
And an example message that I need to show up is:
code:
printk(KERN_INFO "Test message\n");
I have tried enabling all levels of messages and restarting rsyslog:
code:
echo "7" > /proc/sys/kernel/printk
service rsyslog restart
But that is not helping. I'm thinking I've got to adjust the rsyslog config but my google skills are failing me. Can anyone shove me in the right direction?

Docjowles
Apr 9, 2009

If you run "sudo dmesg" do you see it there?

I mostly work on Red Hat so it may not be true of Debian, but IIRC by default syslog is not configured to log kernel messages. You want to add an entry like this to /etc/rsyslog.conf, which will capture those messages into /var/log/kern.log:

code:
kern.*                                                  /var/log/kern.log


My Rhythmic Crotch
Jan 13, 2011

Yes, I have been using dmesg to check the logs.

Apparently something in my driver code is hosed - because a simple Hello World driver prints kernel messages as expected. So it appears to be a false alarm. Thanks for your feedback anyway!
