bitprophet
Jul 22, 2004
Taco Defender

Carthag posted:

I have a bunch of servers running Debian, some prod, some dev.

Currently, we deploy to production by calling scripts on dev that copy the relevant files to prod, but that's a bit of a pain since perms tend to get hosed up.

I'm thinking about having some cronjob or something on the prod servers poll a file on the dev servers and pull the changes if indicated.

Is that a horrible idea?

You should probably use source control (cue someone storming in to say "Source control isn't a deployment tool!" :rolleyes:) or some dedicated deployment- or configuration-oriented tool such as Capistrano, Fabric, Chef, Puppet or similar.

Might also need more details -- what sort of code are you talking about (or are you talking about server config files instead of executable application code)? What does it do? What language is it? etc.
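If you stay on plain scripts in the meantime, rsync can at least stop mangling perms. A rough sketch of the shape I mean (host, paths, and flags all made up, adjust to taste):

# push files to a prod host, normalizing perms and skipping temp files
rsync -az --delete --chmod=D755,F644 --exclude='*.tmp' \
    /path/to/files/ deploy@prodhost:/path/to/destination/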

bitprophet fucked around with this message at 20:04 on Jan 7, 2010

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



We mainly deploy files for the web servers (images, JavaScript, etc.) and a number of WebObjects applications and frameworks. The layout of prod & dev is identical, but they have different configurations (pointing to different databases, etc.).

The current deploy scripts are simple bash scripts that scp or rsync files. They're a mess because they're old and have been modified by 3 or 4 people in succession before I signed on. For example:

deploy-app Name 192.168.0.10
deploy-frm Name 192.168.0.10

These are in turn wrapped in scripts like

deploy-frm-all Name (which would deploy Name to all prod servers)

The main issue with the current scripts is that they screw up perms and leave tmp files lying around, causing problems for the next user who tries a deployment.

I'll try looking into your suggestions tomorrow at work -- we're switching away from our old Mac servers, so it's a perfect time to set up a new deployment method; the files will need updated paths anyway. I pretty much have free rein in implementing this, as long as it's not harder to deploy than it was before and security isn't compromised.

Carthag Tuek fucked around with this message at 23:08 on Jan 7, 2010

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



Fabric looks like it'll alleviate a lot of the problems we have with the current setup.

One question: is this a bad idea security-wise?

A user on the production servers who can sudo without a password, and who can't log in with a password, only with an SSH key.

bitprophet
Jul 22, 2004
Taco Defender

Carthag posted:

Fabric looks like it'll alleviate a lot of the problems we have with the current setup.

One question: is this a bad idea security-wise?

A user on the production servers who can sudo without a password, and who can't log in with a password, only with an SSH key.

With tools like Fabric and Capistrano, sudo's password prompt is passed through to you (and your first answer is remembered, so you generally only have to enter the password once per invocation of the tool), so if you were worried about that aspect of it, it's not a problem. (They can't handle other kinds of remote interactivity, though -- at least not yet -- but they do pick up sudo prompts.)

So I'd say it's safer to keep passworded sudo but still use an SSH key, since that's always a good idea -- even better if you can make the machine accessible only by SSH key.
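The key-only part lives in sshd_config; from memory it's roughly these lines (double-check against your distro's defaults):

# /etc/ssh/sshd_config -- allow key-based logins only
PasswordAuthentication no
ChallengeResponseAuthentication no
PermitRootLogin no

Then just leave the deploy user's sudoers entry without NOPASSWD, so sudo still asks.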

Mister Biff
Dec 26, 2006

Misogynist posted:

For household stuff, it really doesn't matter, and whatever's cheap and works will probably do you fine. But you can't go wrong with the Intel PRO/1000 desktop adapters.

You should give some more information about the network problems, though. Even a cheap switch shouldn't be choking under the weight of gigabit unless you're totally maxing out the backplane somehow, and I doubt that's happening with torrents unless you somehow have a gigabit Metro Ethernet link to the outside. If torrents are killing your network performance, the NAT tables in your router are probably too small. You can alleviate this by turning down the idle connection pruning timeout from your NAT table, if your router allows you to tweak that.

I don't think I was very clear, then. The router I have now is an Asus 10/100 running DD-WRT, and when I have more than two torrents running along with web browsing and streaming media from one computer to another, the router runs at 100% load and the streaming video stutters.

In my mind, that just means my router (or server NIC) can't keep up with the volume of traffic, so I was going to upgrade to one with more RAM and a faster processor, and decided to take the plunge to gigabit while I was at it and soup up the streaming server as well.

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



bitprophet posted:

With tools like Fabric and Capistrano, sudo's password prompt is passed through to you (and your first answer is remembered, so you generally only have to enter the password once per invocation of the tool), so if you were worried about that aspect of it, it's not a problem. (They can't handle other kinds of remote interactivity, though -- at least not yet -- but they do pick up sudo prompts.)

So I'd say it's safer to keep passworded sudo but still use an SSH key, since that's always a good idea -- even better if you can make the machine accessible only by SSH key.

It turns out that with some careful planning, we don't even need sudo in the deployment scripts. You can do some simple interactive remote stuff with prompt() + run(), but it's not needed for us.

Gonna work out with the others whether we prefer a single deployment user, or for all of us to have access to all machines. Gotta say the former appears more manageable, but it could be an issue if the key were compromised (otoh, one key to replace versus potentially many).

Thanks for the pointers!

bitprophet
Jul 22, 2004
Taco Defender

Carthag posted:

Gonna work out with the others whether we prefer a single deployment user, or for all of us to have access to all machines. Gotta say the former appears more manageable, but it could be an issue if the key were compromised (otoh, one key to replace versus potentially many).

We have this same problem at my workplace (albeit from a Rails/Capistrano perspective). No clear best solution yet, unfortunately. I tend to ask that people sign in under their own names when connecting manually, and otherwise channel things through a deploy user to make actual app deployment easier (file permissions, etc.)
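One thing that softens the compromised-key worry: give each person their own key in the deploy user's authorized_keys, so revoking someone is a one-line edit. Sketch, with made-up keys and names:

# ~deploy/.ssh/authorized_keys -- one entry per human
ssh-rsa AAAAB3...snip... alice@workstation
ssh-rsa AAAAB3...snip... bob@laptop
# to revoke a compromised key, delete its line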

Horse Clocks
Dec 14, 2004


Mister Biff posted:

Wasn't sure if this was the best place, but it seemed like a good bet.

Anyone have a quick recommendation for a reliable PCI gigabit NIC? My network is choking under the weight of torrents + streaming, and I'm finally going to upgrade my router to a gigabit switch with wireless-N, but will need my media server to have a gigabit NIC to keep up.

I don't know if it's been fixed or not, but when I built my file server with a Realtek gigabit NIC, it'd fall over as soon as you started doing anything over Samba. It'd happily fly along transferring files over FTP, but as soon as you started talking to it over Samba, its throughput would drop to a trickle.

I put in an Intel PRO nic and haven't had any issues since.

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



bitprophet posted:

We have this same problem at my workplace (albeit from a Rails/Capistrano perspective). No clear best solution yet, unfortunately. I tend to ask that people sign in under their own names when connecting manually, and otherwise channel things through a deploy user to make actual app deployment easier (file permissions, etc.)

I wrote up a Fabric deployment setup that works from my user account; gonna test it with one of the other guys on Monday.

But honestly, my main issue with having everybody deploying is that I'll have to create a bunch of accounts, and I don't feel like it.

Exioce
Sep 7, 2003

by VideoGames
I installed Ubuntu 9.10 yesterday on a PC running Windows XP that I had pretty much written off as almost dead in hardware terms (due to constant hard crashes). Turns out that Windows XP is just bloated and poo poo - Ubuntu runs as smooth as gently caress, starts up quick, and doesn't crash.

I think I want to use this Ubuntu box to play all those old Windows 95/98 RPGs I could never get to work on Windows XP. I know there are some native games, such as Beneath A Steel Sky, that won't require emulation, but if I run something like Wine, am I going to see a performance hit?

The PC in question has hardware specs that are a few years old admittedly, but it's Windows XP generation hardware rather than Windows 95/98 generation. I used to run the pre-Dominion Eve Online on it perfectly fine.

spiritual bypass
Feb 19, 2008

Grimey Drawer

Exioce posted:

am I going to see a performance hit?

Depends on the game, really. Check out other people's test results on winehq.org to see whether the games you want to play are actually going to work.

HolyDukeNukem
Sep 10, 2008

Exioce posted:

I installed Ubuntu 9.10 yesterday on a PC running Windows XP that I had pretty much written off as almost dead in hardware terms (due to constant hard crashes). Turns out that Windows XP is just bloated and poo poo - Ubuntu runs as smooth as gently caress, starts up quick, and doesn't crash.

I think I want to use this Ubuntu box to play all those old Windows 95/98 RPGs I could never get to work on Windows XP. I know there are some native games, such as Beneath A Steel Sky, that won't require emulation, but if I run something like Wine, am I going to see a performance hit?

The PC in question has hardware specs that are a few years old admittedly, but it's Windows XP generation hardware rather than Windows 95/98 generation. I used to run the pre-Dominion Eve Online on it perfectly fine.

Most games won't run at native speed; that said, a lot will run near it. It's worth checking out. I usually just Google "appdb game i want to play", which pulls up the game's entry in Wine's application database.

Big Dick Cheney
Mar 30, 2007
Before my Windows OS crapped out, I used my netbook to connect my 360 to the internet by bridging the networks. I just installed Ubuntu Netbook Remix and was wondering if anyone could tell me how to do the same thing in Ubuntu.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Exioce posted:

I installed Ubuntu 9.10 yesterday on a PC running Windows XP that I had pretty much written off as almost dead in hardware terms (due to constant hard crashes). Turns out that Windows XP is just bloated and poo poo - Ubuntu runs as smooth as gently caress, starts up quick, and doesn't crash.

:ssh: A fresh XP install is even faster.

spiritual bypass
Feb 19, 2008

Grimey Drawer

Bob Morales posted:

:ssh: A fresh XP install is even faster.

Sure, if it's a clean retail install. The bullshit that comes with the computer never is, though.

Megaman
May 8, 2004
I didn't read the thread BUT...

Bob Morales posted:

:ssh: A fresh XP install is even faster.

:ssh: no it's not, buddy :ssh:

HondaCivet
Oct 16, 2005

And then it falls
And then I fall
And then I know


I've done as much googling as I can stand and I'm still not finding very good answers... How well does Linux dual-booting work on a MacBook, specifically the newest white unibody? It sounds like lots of people can get it to sort of work, but they hit random problems: sound not working right, the MacBook getting hot enough to melt lead, battery life cut in half. Is it just not worth running Linux on a MacBook, or have things gotten better? Also, which distros have the best support/success on MacBooks?

Rated PG-34
Jul 1, 2004




The GNOME panel window selector is bugging out on me (intermittently unresponsive). What's a good alternative window manager?

HolyDukeNukem
Sep 10, 2008

HondaCivet posted:

I've done as much googling as I can stand and I'm still not finding very good answers... How well does Linux dual-booting work on a MacBook, specifically the newest white unibody? It sounds like lots of people can get it to sort of work, but they hit random problems: sound not working right, the MacBook getting hot enough to melt lead, battery life cut in half. Is it just not worth running Linux on a MacBook, or have things gotten better? Also, which distros have the best support/success on MacBooks?

From what I've heard, it's not really worth it, since Apple puts zero effort into getting Linux to work on their hardware. At least with other PCs you'll get some form of generic chip that has Linux drivers, but Apple uses mostly proprietary hardware, which means it's up to Apple to write the drivers. If I remember correctly, Ubuntu keeps a MacBook compatibility page:

https://help.ubuntu.com/community/MacBook

Megaman
May 8, 2004
I didn't read the thread BUT...

Rated PG-34 posted:

The GNOME panel window selector is bugging out on me (intermittently unresponsive). What's a good alternative window manager?

upgrade it, also XFCE

maskenfreiheit
Dec 30, 2004
Edit: Double Post

maskenfreiheit fucked around with this message at 21:02 on Mar 13, 2017

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



Check out find:

find /home -name "*.tex" -or -name "*HW*"

You could also just do find /home and then do the grepping afterwards, if the find terms become too complex to manage.
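Something like:

find /home | grep -E '\.tex$|HW'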


But really it ought to be possible to fix this with groups.
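By groups I mean roughly this (group name and path made up, run as root):

groupadd coursework
chgrp -R coursework /home/shared
chmod -R g+rwX /home/shared
chmod g+s /home/shared    # new files inherit the group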

peepsalot
Apr 24, 2007

        PEEP THIS...
           BITCH!

Anyone know a nice app for ripping video clips to animated GIF in Linux? I know Gimp can do it, but it feels very tedious from the times I've tried. Maybe I'm doing it wrong, but it never sets the frame speed right, and then I have to go through and manually edit the name of each frame to change the timing. Optimizing a GIF in Gimp is confusing, too. Wondering if there are any more streamlined/specialized apps for this purpose.

JHVH-1
Jun 28, 2002

peepsalot posted:

Anyone know a nice app for ripping video clips to animated GIF in Linux? I know Gimp can do it, but it feels very tedious from the times I've tried. Maybe I'm doing it wrong, but it never sets the frame speed right, and then I have to go through and manually edit the name of each frame to change the timing. Optimizing a GIF in Gimp is confusing, too. Wondering if there are any more streamlined/specialized apps for this purpose.

You can do it with ffmpeg, I think; you just need to find the right flags. By default it's unoptimized. There may be flags to get it the way you need it.

Here's a tutorial that uses another method, with mplayer:
http://blog.ahfr.org/2008/03/making-animated-gifs-with-free-software.html
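If you go the ffmpeg route, the basic shape is something like this (flags from memory, check your version's man page):

# straight to GIF at 10 fps, scaled down; output will be unoptimized
ffmpeg -i input.avi -r 10 -s 320x240 output.gif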

covener
Jan 10, 2004

You know, for kids!

JHVH-1 posted:

You can do it with ffmpeg, I think; you just need to find the right flags. By default it's unoptimized. There may be flags to get it the way you need it.

Here's a tutorial that uses another method, with mplayer:
http://blog.ahfr.org/2008/03/making-animated-gifs-with-free-software.html

ImageMagick has an "animate" command-line utility that you may be able to swap in here for the Gimp step.
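"convert" is the one that actually writes files, if I remember right -- something like this, with made-up frame names:

# stitch numbered frames into a looping GIF, 10/100ths of a second per frame
convert -delay 10 -loop 0 frame_*.png output.gif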

Megaman
May 8, 2004
I didn't read the thread BUT...

covener posted:

ImageMagick has an "animate" command-line utility that you may be able to swap in here for the Gimp step.

:ssh: graphicsmagick is better

emf
Aug 1, 2002



Megaman posted:

:ssh: graphicsmagick is better
I found an interesting "bug" in GraphicsMagick's affine transformation. I say "bug" because it isn't technically incorrect (the specification doesn't say it should), but it will not perform sub-pixel affine transformations. ImageMagick will perform sub-pixel transformations, but its matrix is defined with the X-axis pointing the opposite way from GM's, which is opposite to its documented specification.

Somewhere I have a bash script that shows the difference. I should dig it up.

corgski
Feb 6, 2007

Silly goose, you're here forever.

Is there any kernel parameter I can use to override the auto-detected IRQ for my computer's IDE bus? The 2.6 kernel broke the auto-config routine in the driver for the IDE adapter in my laptop, and also conveniently removed the idex=base,ctl,irq kernel parameter that would let me override its faulty detection. Genius move there, devs.
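(For illustration, the old syntax was along the lines of ide0=0x1f0,0x3f6,14 -- those being the classic primary-channel values.)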

yippee cahier
Mar 28, 2005

thelightguy posted:

Is there any kernel parameter I can use to override the auto-detected IRQ for my computer's IDE bus? The 2.6 kernel broke the auto-config routine in the driver for the IDE adapter in my laptop, and also conveniently removed the idex=base,ctl,irq kernel parameter that would let me override its faulty detection. Genius move there, devs.

Isn't everyone using libata now?

corgski
Feb 6, 2007

Silly goose, you're here forever.

sund posted:

Isn't everyone using libata now?

All the distros I tried to install used ide-core. Vector Linux and DSL-N are the two I remember off the top of my head.

Are there any lightweight 2.6-based distros that use libata? It's a Geode-based system (well, actually MediaGX, the precursor to the Geode), so anything too bulky isn't going to do well.


Just looked it up: libata is a Serial ATA driver. This system is Geode-based, so it's good ol' parallel ATA only. And I can't find any docs on ide-core that would explain how to override its interface auto-detection.

corgski fucked around with this message at 08:10 on Jan 17, 2010

Seraphic
Sep 16, 2005
Atropine
If I installed something via manual compile rather than via a package manager, what is the best way to go about installing a new version? Could I just download the new source and compile, or do I need to do something to the existing installation that I want to replace?

covener
Jan 10, 2004

You know, for kids!

Seraphic posted:

If I installed something via manual compile rather than via a package manager, what is the best way to go about installing a new version? Could I just download the new source and compile, or do I need to do something to the existing installation that I want to replace?

That works just fine, but consider checkinstall or stow in the future.
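Roughly (package name and prefix made up):

# checkinstall: build as usual, then wrap "make install" in a package
./configure && make
sudo checkinstall

# stow: install each version into its own tree, then symlink it into place
./configure --prefix=/usr/local/stow/foo-1.2 && make && sudo make install
cd /usr/local/stow && sudo stow foo-1.2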

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

covener posted:

That works just fine, but consider checkinstall or stow in the future.
Or just build a package correctly in the first place: most of the time it really isn't difficult, it makes it much easier to apply patches consistently, and once you have a spec you can generally upgrade using the same spec without changing anything except the source archive.

checkinstall is okay, but it really doesn't keep proper track of dependencies, which is a lot of the point of a package manager. It also doesn't keep a revision history or track patches or other customizations.

covener
Jan 10, 2004

You know, for kids!

Misogynist posted:

Or just build a package correctly in the first place

what are you contrasting against when you say correctly?

Vulture Culture
Jul 14, 2003

I was never enjoying it. I only eat it for the nutrients.

covener posted:

what are you contrasting against when you say correctly?
Building a package using checkinstall or a similar tool that doesn't appropriately store package metadata -- description, change history, applied patches, or anything else besides literally a list of binaries and their checksums.

checkinstall also often doesn't (can't) abide by distro-specific packaging guidelines, but how much you want to abide by that is your choice. My Red Hat packages always conform to Red Hat/Fedora's packaging guidelines, but I'm anal like that.

I'm absolutely not saying that building packages is the only right way to install software ever.

Edit: It's also important to note that checkinstall doesn't correctly flag config files, so when upgrading you can easily clobber configs that a "real" RPM would have tagged %config(noreplace).
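For reference, the spec side of that is just a line like this in %files (paths hypothetical):

%files
%config(noreplace) %{_sysconfdir}/myapp/myapp.conf
%{_bindir}/myapp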

Vulture Culture fucked around with this message at 15:24 on Jan 20, 2010

other people
Jun 27, 2004
Associate Christ
Is there an mp3 tagger out there that is worth a poo poo?

I have been using easytag, which is quite nice for batch operations, but it only supports 14 of the dozens of fields that id3v2.4 defines.

'Ex Falso' comes with Quod Libet (which I love), but it is a bitch to work with, and it is still missing a lot of tag fields.

'Audio Tag Tool' supports every field, but it is a pain in the rear end to use on anything but single files.

eyeD3 is neat, but it is command line and far from ideal for every day use.

Hell, if easytag just added the 'publisher' field I would be content.
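(eyeD3 can set it from the shell, if I remember the flag right -- something like:

eyeD3 --set-text-frame="TPUB:Some Label" *.mp3

but doing that for every field gets old fast.)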

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

What's the proper way to have something happen when a process ends?

For example, sound a bell or send an email AFTER a long operation finishes, such as a large file transfer or the make of a large project...

Just use && after the command, and then put the next command after that?

Harokey
Jun 12, 2003

Memory is RAM! Oh dear!

Bob Morales posted:

What's the proper way to have something happen when a process ends?

For example, sound a bell or send an email AFTER a long operation finishes, such as a large file transfer or the make of a large project...

Just use && after the command, and then put the next command after that?

This will work as long as the first command succeeds -- && won't run the second command unless the first one exits successfully.
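e.g.:

# the bell only sounds if make exits 0 (success)
make && printf '\a'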

Dr. Despair
Nov 4, 2009


39 perfect posts with each roll.

Ok, so I've got a question that Google isn't helping me with at all. Basically, I'm trying to set up Geant4 on my netbook, which has an ATI 3200 GPU. I've got the fglrx drivers installed and everything seems to be working: glxinfo shows that OpenGL is there, glxgears works, and I can get the fancy graphics options to work on the desktop.

But when I go to build the Geant4 package, it can't find OpenGL (specifically, it says "You have selected to build one or more drivers that require OpenGL. But OpenGL was not found in /usr/lib.") and then prompts for the location. On my main computer with an nvidia card it finds what it needs in /usr/lib without any trouble, but it just doesn't seem to see what it needs with the ati drivers.

Pointing it to /usr/lib64 and /usr/lib/fglrx doesn't seem to help much either, so I'm kinda stumped.
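(For reference, this is roughly how I've been hunting for the library:

find /usr -name 'libGL.so*' 2>/dev/null

and then pointing the build at whatever directories that turns up.)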

Edit: Also, I'm currently running Kubuntu 9.10

Dr. Despair fucked around with this message at 02:22 on Jan 24, 2010

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall roll by,
generation shall follow generation



Harokey posted:

This will work as long as the first command succeeds -- && won't run the second command unless the first one exits successfully.

Also, if you want the second command to run regardless of how the previous one finishes, use a semicolon.

You can also use || to run something only if the previous command failed (at least in bash this works, thanks to short-circuit evaluation).
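e.g.:

make ; printf '\a'                 # bell sounds no matter how make exits
make || echo 'build failed' >&2    # second command runs only if make failed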
