|
What are everyone's thoughts on OpenLogic? I'm not a Linux expert, but this is the only supported Fedora-family distro in Azure. How do you guys feel about this?
|
# ? Apr 17, 2015 03:48 |
|
Tab8715 posted:What are everyone thoughts on Openlogic? I'm not a linux expert but this is the only Fedora supported distro in Azure how do you guys feel about this? Never heard of it, but Google says they're just providing an SLA for CentOS on Azure?
|
# ? Apr 17, 2015 04:46 |
|
evol262 posted:Never heard of it, but Google says they're just providing an SLA for CentOS on Azure? Appears to be. I also heard that while RHEL isn't technically supported on Azure, if you upload a VM it'll work and support will still try to help.
|
# ? Apr 17, 2015 04:51 |
|
evol262 posted:The use case for virtual machines seems to be totally lost on you either way.
|
# ? Apr 17, 2015 05:46 |
|
How good is Wacom Intuos tablet support on Linux? Also, which programs work the best with it?
|
# ? Apr 17, 2015 06:04 |
|
Misogynist posted:I thought the post was written by Jon Hendren until I checked the username Is he on SA? Tab8715 posted:Appears to be, I also heard while RHEL isn't technically supported on Azure if you upload a VM it'll work and support will still try to help. I'm actually the maintainer of the RHEL guest images (qcow for OpenStack/KVM, not the EC2 AMI or the Docker image), so if you upload our official cloud image and it doesn't work, you can harass me. Also, if you use the official 6.6 image and it doesn't work, you can harass me and file a bug against 6.7, which may get fast-tracked and fixed in time for the 6.7 GA.
|
# ? Apr 17, 2015 06:47 |
|
evol262 posted:Is he on SA?
|
# ? Apr 17, 2015 07:01 |
|
Misogynist posted:You serious? He's DocEvil and he wrote for the site for ten years. To be honest, I don't think I've read the front page for almost a decade...
|
# ? Apr 17, 2015 07:31 |
|
karl fungus posted:How good is Wacom Intuos tablet support on Linux? Also, which programs work the best with it? I think you asked this a page or two ago, so I did some Googling. It might work pretty ok? There's stuff about them on the Arch and Ubuntu wikis, and Wacom has a page about Linux. GIMP should have some level of support; I remember seeing some plugins about tablets. Try it and see, would be my advice.
|
# ? Apr 17, 2015 14:04 |
|
Krita is supposed to be good for tablets.
|
# ? Apr 17, 2015 14:49 |
|
evol262 posted:Darktable is supposed to be ok I use it a lot for editing raws and I think it's great, though I've never used Lightroom so maybe that's better still CaptainSarcastic posted:I personally have gotten used to GIMP's multi-window mode I tolerated multi-window when it was the only option
|
# ? Apr 17, 2015 16:29 |
|
Misposted, never mind.
|
# ? Apr 17, 2015 16:49 |
|
Suspicious Dish posted:What I've been doing for the last year has finally come to fruition: https://endlessm.com/ Congrats on the Wired article; I just ran into it by accident while reading about iRobot's lawnmower.
|
# ? Apr 17, 2015 17:22 |
|
I'm really happy with that one. Felt more like it was trying to open a discussion by mentioning Facebook, Google, and us, rather than trying to sell us or bash us or do something about Kickstarter. Of course, as you're probably well aware, we did reach out to lots of journalists to get this covered, but that can have wacky results. We weren't overly happy with the BBC's tone, but we're told that's just how the BBC is.
|
# ? Apr 17, 2015 17:26 |
|
Maybe there should be a "linux server administration megathread"? I am a programmer at a small startup. There are two Linux servers, a DB server and a web server. Everyone here is just about a novice with Linux administration but knows enough to be dangerous, and really, we're coders and would like to focus on that instead of server administration. Budget is pretty limited right now and I'm not sure we could stretch it to hire a server admin full time; realistically, he'd be idle 95% of the time. Since I'm kind of unofficially in charge of the programming team, this is a huge headache for me: the server setup is so complicated, but worse, I can't figure out how to simplify it given the need to keep access limited between environments.

We have three public websites hosted on the same server. Each website has three environments (live, dev, staging), so there are 9 environments in total. Each of our coders (four of them) has their own user account for each and every one of these environments. In addition, each environment runs its own Apache instance to make sure they can't write to each other's directories and to provide some protection in case one site gets hacked somehow. We also use a deployment system that logs into our servers to deploy code, and this also has its own account for every environment. So there are 9*6 = 54 user accounts in total, and god knows how many SSH keys in the respective authorized_keys files.

Now we need to set up a new web server to do some load balancing, and I get a migraine just thinking about how to manage it all and make sure that the users and account structures are consistent across environments. How the hell is this supposed to work? What about if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing SSH keys, etc.? Many thanks for any suggestions.
|
# ? Apr 17, 2015 18:41 |
|
Access Violation posted:Maybe there should be a "linux server administration megathread"? You want to learn Puppet or Chef. That allows you to manage those configuration files and distribute the public keys. You might also want to look into using Docker to handle the different Apache processes.
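For the SSH key part specifically, what a Puppet/Chef run ends up doing on each host boils down to something like this idempotent shell function (a sketch only; real config management adds templating, reporting, and so on, and the paths here are illustrative):

```shell
# Sketch of the key-distribution step of a config-management run:
# idempotently install one public key into an authorized_keys file.
install_key() {
    keyfile=$1
    dest=$2
    mkdir -p "$(dirname "$dest")"
    # only append if this exact key line isn't already present
    grep -qxF "$(cat "$keyfile")" "$dest" 2>/dev/null || cat "$keyfile" >> "$dest"
    chmod 600 "$dest"
}

# e.g. install_key alice.pub /home/alice/.ssh/authorized_keys
```

Run it as many times as you like; the file ends up with exactly one copy of the key, which is the property that lets a management tool converge every host to the same state.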
|
# ? Apr 17, 2015 18:47 |
|
No and no. Puppet/chef doesn't address his user/account issues, and docker is neither secure enough for internet-facing webservers nor production ready.

Your best bet for your user account problem with what you have is to look at LDAP, Active Directory, or some other shared RBAC system. I think Red Hat's pimping something these days but I haven't kept up on all the options out there. I've used proprietary solutions in two different jobs, but they were both homegrown. In this modern day, you're likely better served with some sort of cloud product that handles this all for you.

Think of chef/puppet as a way to push out and maintain customizable configs on all your systems. It can help, but it won't really solve your current problem on its own; it's more of a delivery vehicle than a solution itself.
|
# ? Apr 17, 2015 18:54 |
|
Access Violation posted:Maybe there should be a "linux server administration megathread"? Don't create a new account for each user for each environment. One account for each administrator, and filesystem ACLs will do everything you need with regards to granting specific users rights to specific applications - and you may not even need the facls module to do this - proper usage of POSIX groups should do the trick.

I'm not sure what programming language you're using, but you don't need separate web servers for each environment either - you can very easily set up each application server under its own user and proxy back to it from a single frontend web server with group membership that allows it read access to whatever it needs. Since you seem to only be running Apache, I'm going to assume PHP. Look into PHP-FPM, and set up a worker pool for each application if that's the case.

With a proper deployment system in place, your developers shouldn't even have access to prod. Your ops guy(s) (that's you and only you right now, from what I understand) should have access, and the deployment system should have access. Giving developers who lack ops experience direct access to production is a recipe for disaster.

I definitely wouldn't advise running prod and dev/staging on the same server, either. It's too easy for dev/staging to hose things up and cause a prod outage or service degradation (code in development that goes into an infinite loop hitting the database with thousands of queries/second, for one example). Use identical installs of the OS, but not the same server, for each environment.
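The POSIX-groups approach looks roughly like this for one app directory. Group and path names here are made up for illustration, and the groupadd/usermod/chgrp steps need root on a real box:

```shell
# One shared group per application; a setgid directory makes new files
# under it inherit that group, so everyone in the group can collaborate.
# Root-only steps, shown as comments (names are illustrative):
#   groupadd site1-dev
#   usermod -aG site1-dev alice
#   chgrp -R site1-dev /srv/site1/dev

setup_app_dir() {
    dir=$1
    mkdir -p "$dir"
    # 2775 = setgid bit + rwx for owner and group:
    # files created inside keep the directory's group
    chmod 2775 "$dir"
}

# e.g. setup_app_dir /srv/site1/dev
```

With that in place, per-user-per-environment accounts become unnecessary: membership in the right group is what grants write access.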
|
# ? Apr 17, 2015 18:55 |
|
Access Violation posted:Maybe there should be a "linux server administration megathread"? You should be using puppet or chef. If all of this is local, use kerberos instead of a zillion SSH keys. Even if not, you don't need 54 SSH keys. No passwords. Anywhere. You have the same accounts on every server, right? Or the same names? With the same SSH keys? It's ridiculous to have an account for "each and every environment". Since you're already using some kind of automated deployment (capistrano or fabric or whatever), there's no reason for them to have accounts for those environments at all. You should never be touching live.

Development flow goes like this: You build up a docker container, vagrant image, or AMI which has an identical environment for everyone. They don't need dev environments on the server. There are a million and one tools for populating dev databases from example data. Use one of these, and add your example schema/json/whatever to source control so everyone uses the same thing.

You write code and commit it. Travis (which is stupidly easy to set up and already ties into GitHub, if you guys are using that -- if you're not, it may be worth the tiny monthly fee to ease your lives) runs integration tests, unit tests, etc. If it fails, reject the commit and go back to zero. If it passes, it waits for code review. If it gets +2, it gets an automated build which gets deployed to staging, where QE or whoever tests it. Staging is also on separate servers, VMs, or docker containers. If you don't have QE, you don't need a staging environment at all. If you only have staging so it can do something with live data, this is a ticking bomb, and you should set up replication to an isolated database that staging can use. Once someone signs off on whatever is happening in staging (assuming you don't scrap the idea of staging entirely), the package gets pushed live.

That done... Don't run individual apache instances, and don't co-host websites on the same server. This isn't security in any real sense of the word, since any proper exploit is going to give them shell, where they'll do whatever they want with the permissions of the apache/httpd user in the best case or go looking for local root vulnerabilities in the worst case. If any of those sites gets hacked, you can't trust anything on it, much less your 8 other environments.

My real advice would be to stop doing this entirely. PaaS is viable and worth the money, especially in cases where you're winging it without a real admin or any idea of what best practice is. No joking: migrate your poo poo to Heroku or OpenShift or whatever and stop worrying about all of this until when/if you get big enough to hire some devops/SRE/admins/whatever to wrangle it.
|
# ? Apr 17, 2015 19:00 |
|
Bhodi posted:No and no. Puppet/chef doesn't address his user/account issues and docker is neither secure enough for internet-facing webservers nor production ready. I did not say it addresses his user/account issues (though it could). I said it could help with pushing out public keys and his configs. Also, while I agree docker has some maturing to do, it is still worth him looking into, particularly with his dev/test stacks and possibly with his production stack. While you may assert it is not production ready, it is already used in production in the wild.
|
# ? Apr 17, 2015 19:04 |
|
captkirk posted:Also, while I agree docker has some maturing to do it is still worth him looking into, particularly with his dev/test stacks and possibly with his productions stack. While you may assert it is not production ready, it is already used in production in the wild. The UID 0 issue is also a serious problem for security. Docker is used in the wild both because of the hype machine and by shops which don't already have devops practices in place and would rather just deal with docker than AWS/openstack+puppet/salt/whatever (mostly, you need to use a config management system with docker anyway, so this is a bust). Or to ship containers that have updated configs. Or Kubernetes+Mesos shops trying to maximize utilization on public cloud instances. There are valid use cases, and "give your developers identical dev environments then ship it to prod" certainly is one, but "it's used in production in the wild so it's production ready" ignores a whole lot of issues.
|
# ? Apr 17, 2015 19:24 |
|
Access Violation posted:How the hell is this supposed to work? What about if there were 100 servers and 100 users? Is there some kind of centralized management system for this stuff? I've heard of Chef and Puppet but I don't have experience using them. Would that be the best way forward? How does that help with things like keeping Apache config files consistent, distributing ssh keys, etc?
|
# ? Apr 18, 2015 04:51 |
|
I use JACK to record multiple sound sources simultaneously. One of the USB headsets broke today, and as a temporary fix I'm using an external USB microphone (a Blue Yeti) in tandem with the headset. It's important to record these devices in the same Audacity session so I don't have to worry about syncing recordings or any sort of echo in the recording. To start jack I have a .sh with this command: code:
code:
code:
code:
Edit: I answered my own question. I found the device number (using cat /proc/asound/cards), and instead of the device driver name in my alsa_in/out script, I used the device number (hw:2). ProfessorBooty fucked around with this message at 19:12 on Apr 19, 2015 |
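For anyone else hitting this: the device-number lookup from /proc/asound/cards can be scripted instead of eyeballed. The "Yeti" name and the alsa_in options below are just examples for this setup:

```shell
# Pull the ALSA card number for a device out of /proc/asound/cards,
# so alsa_in can be given hw:<N> instead of a driver/device name.
# Lines in that file look like:
#   " 2 [Yeti           ]: USB-Audio - Blue Yeti"
card_num() {
    # first field of the first line matching the given name
    awk -v name="$1" '$0 ~ name { print $1; exit }'
}

# Example usage (device name and jack client name are illustrative):
#   N=$(card_num Yeti < /proc/asound/cards)
#   alsa_in -j yeti -d "hw:$N" &
```

The advantage over hardcoding hw:2 is that USB devices can change card numbers between boots or re-plugs, and the lookup always finds the current one.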
# ? Apr 19, 2015 15:00 |
|
Can anyone tell what version of Red Hat this video is likely to be using? It looks like GNOME 2 to me, but I'm a KDE fan, so for all I know it could be GNOME 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat, or use CentOS or another distro? https://www.youtube.com/watch?v=bG0mMOteVR8
|
# ? Apr 20, 2015 14:24 |
|
Crotch Fruit posted:Can anyone tell what version of Red Hat this video is likely to be using? I think it looks like Gnome 2 but I am a KDE fan so for all I know it could be Gnome 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat or use Centos or another distro? That's a REALLY old version; it looks like Red Hat 7 (not RHEL, plain Red Hat).
|
# ? Apr 20, 2015 14:57 |
|
Crotch Fruit posted:Can anyone tell what version of Red Hat this video is likely to be using? I think it looks like Gnome 2 but I am a KDE fan so for all I know it could be Gnome 3 with a skin. If I want to follow this series of tutorials, should I look for an old version of Red Hat or use Centos or another distro? It appears to be GNOME 1, which matches up with the archaic version of emacs they're using. CentOS didn't even exist at that point; RHEL didn't either. I would guess it's Red Hat 7.2 (not RHEL 7.2, which isn't out yet). Maybe 8. You should follow a modern LPIC guide to start with.
|
# ? Apr 20, 2015 15:06 |
|
evol262 posted:I would guess its Red Hat 7.2 (not RHEL 7.2, which isn't out yet). Maybe 8. You should follow a modern LPIC guide to start with I know it's not a good tutorial by any means; it was just related to a video I watched, so I clicked it and it looked interesting. Yeah, I thought it was pre-RHEL, but the video was published in November 2014, so why would they use a version 10 years old? The only thing I can think of is that it's a re-upload of an older video.
|
# ? Apr 20, 2015 15:10 |
|
Crotch Fruit posted:I know it's not a good tutorial by any means, it was just something related to a video I watched so I clicked it and it looked interesting. Who is 'they'? That YouTube channel isn't related to Red Hat; it's just some random dude. The_Franz fucked around with this message at 15:20 on Apr 20, 2015 |
# ? Apr 20, 2015 15:15 |
|
Well, if you Google the watermark in the video, 'they' are cbtnuggets.com, an online IT video training service.
|
# ? Apr 20, 2015 15:20 |
|
Crotch Fruit posted:Well, if you google the water mark in the video, 'they' are cbtnuggets.com, an online IT video training service. That company doesn't seem to have anything to do with that YouTube channel either. It's just an old training video that, for some reason, some random guy put on YouTube last year.
|
# ? Apr 20, 2015 15:27 |
|
The other day I was compiling my first program ever. The fucker kept failing on me, and after a bunch of Googling I got to a Microsoft help page that said some drivers got hosed up because another program tried to install the same drivers, and now all the drivers are sad or something. So I applied the fix, and it didn't do anything; my files are still just sitting there. I'm not a programmer, but I'm learning how to code for something to do, and that kind of gave me a really ominous feeling about any future attempts while I'm still running Windows. So uhh, should I give Linux a shot or what? Windows is starting to get pretty frustrating, but like I said, I just started with this kind of stuff, so I don't know how well I'll be able to work with Linux.
|
# ? Apr 21, 2015 06:15 |
|
Compiling is incredibly easy on Linux. I feel bad for you people who have to compile on Windows.
|
# ? Apr 21, 2015 06:24 |
|
I guess I'll give Ubuntu a shot then? That seems like the most newbie friendly. Should I keep a partition of Win7 since I'm a gigantic nerd and like video games?
|
# ? Apr 21, 2015 06:34 |
|
Robo Reagan posted:The other day I was compiling my first program ever. The fucker kept failing on me and after a bunch of Googling I got a Microsoft help page where they said that some drivers got hosed up because another program tried to install the same drivers and now all the drivers are sad or something. So I applied the fix and it didn't do anything and my files are still just sitting there. I'm not a programmer, but I'm learning how to for something to do and that kind of gave me a really ominous feeling for any future attempts while I'm still running Windows. What are you compiling? In what language? With what compiler? And what's the error? Drivers should have nothing to do with anything, and while compiler suites used to be easier to come by on Linux, compiling is pretty much the same process there.
|
# ? Apr 21, 2015 06:35 |
|
Use a modern distro like Ubuntu or Fedora and keep your Windows around for games, although gaming on Linux has improved greatly over the last few years, so you might want to install Steam on Linux and give it a go.
|
# ? Apr 21, 2015 06:37 |
|
Wait two days until the 23rd and you'll have a whole new Ubuntu release to install.
|
# ? Apr 21, 2015 06:47 |
|
Ubuntu, Debian, and Fedora all have major releases in the next 3 weeks.
|
# ? Apr 21, 2015 07:56 |
|
I'm on Ubuntu (I've used the .10 version for a few months) and I want to try out Debian KDE when it releases in a few days. In preparation I ran "tar -cvpzf backup.tar.gz --one-file-system /" and moved the file to an external disk. However, I encrypted my home directory on install. Is there an easy way to decrypt all these files from another system to pull out the individual files I want? Or should I delete all the encrypted stuff from the tar, manually move my home folder into it using the GUI, and then encrypt the archive? Or would this create the same problem? Thanks. E: also, at the end it said "tar: /: file changed as we read it". Does this mean everything backed up successfully apart from the changes, or did it stop when it ran into something changing? Crack fucked around with this message at 13:22 on Apr 21, 2015 |
# ? Apr 21, 2015 13:10 |
|
Why not just install the kubuntu-desktop package?
|
# ? Apr 21, 2015 14:26 |
|
Crack posted:I'm on Ubuntu (I've used the .10 version for a few months) and I want to try out Debian KDE when it releases in a few days. In preparation I ran "tar -cvpzf backup.tar.gz --one-file-system /" and moved the file to an external disk. However, I encrypted my home directory on install. Is there an easy way to decrypt all these files from another system to pull individual files I want from it? or should I delete all the encrypted stuff from the tar and manually move my home folder to it using the GUI then encrypt the archive? Or would this create the same problem? Thanks. Why not keep your home directory and reuse it on Debian? tar didn't stop when that happened; the file that changed was probably a log. But backing up live systems works better with consecutive rsyncs.
|
# ? Apr 21, 2015 14:39 |