ToxicFrog
Apr 26, 2008


fuf posted:

Thanks for these responses. By home directory I mean config files, but also source files for projects I'm working on. I think my best option will be to use rsync and run it with cron or incrond when something changes.

I generally use btsync for bulk data synchronization between my computers, e.g. ~/devel and ~/Documents. This will go catastrophically wrong if you're ever editing something on two machines at once, but if you aren't it works great.

For things I want a bit more control over, like my dotfiles, I keep them in git; I init my entire home directory as a git repo and then create a .gitignore containing "/*" so that it only tracks things I explicitly tell it to. There's a master branch that has all my common configuration (editor configs, common bash aliases, etc), and then some machines have computer-specific branches containing changes specific to that machine, e.g. work-specific aliases on my work desktop.
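A minimal sketch of that setup (the file names and the branch name here are examples, not my exact config):

```shell
cd ~
git init
echo '/*' > .gitignore                  # ignore everything by default
git add -f .gitignore .bashrc .vimrc    # -f overrides the ignore for files you DO want tracked
git commit -m "initial dotfiles"
git checkout -b work-desktop            # machine-specific branch for local tweaks
```

With the `/*` pattern, `git status` stays quiet about everything that hasn't been force-added, so the repo only ever tracks what you name explicitly.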

evol262
Nov 30, 2010
#!/usr/bin/perl

Salt Fish posted:

This is some kind of scheme. Fixing a server is probably going to be a lot quicker than shipping anything anywhere for most providers. It sounds like what they really want is a high-availability active-passive setup using 2 servers and a DRBD volume syncing data between the two. You can use something like heartbeat with pacemaker to automate failovers if the site goes down. (hire me)

You recommended clustering. That disqualifies you from jobs!

In all seriousness, pacemaker and heartbeat are fine for the website itself. Don't use DRBD for this, though. Use Ceph, Gluster, or another tool. Set up replication between the databases.

Hollow Talk
Feb 2, 2014

fuf posted:

Thanks for these responses. By home directory I mean config files, but also source files for projects I'm working on. I think my best option will be to use rsync and run it with cron or incrond when something changes.

If you need source files as in programming, using version control would be a good thing to do in general!

ToxicFrog posted:

For things I want a bit more control over, like my dotfiles, I keep them in git; I init my entire home directory as a git repo and then create a .gitignore containing "/*" so that it only tracks things I explicitly tell it to. There's a master branch that has all my common configuration (editor configs, common bash aliases, etc), and then some machines have computer-specific branches containing changes specific to that machine, e.g. work-specific aliases on my work desktop.

I think this is the right way to go. You can replicate that as often as you want. You can work on it on different machines etc. I use one git repository for my main config files (zsh/bash/emacs/i3wm) and then one git repository per specific use case, namely one for my writing (~/Documents/XYZ), one for my scripts (~/bin) etc.

You can of course just copy things around, but that will cause problems once you change different things on different computers without syncing them first, at which point things like rsync's diffing algorithm will spontaneously combust. Git was made for exactly these use cases, and I don't think there will be any better option, and especially nothing that will be more resilient.

For a bit of a walkthrough for everyday git usage: https://schacon.github.io/git/everyday.html

fuf
Sep 12, 2004

haha

Hollow Talk posted:

If you need source files as in programming, using version control would be a good thing to do in general!

I do actually use git for programming, but never thought of using it for documents...

I like the idea of rsync or btsync because it's automatic. If I was using git to work on the same document in two places I'd have to do something like this every time, right?

on PC:
create document
git add
git commit
git push to remote (some vps that stores all my repos)

on Laptop:
git pull from remote

Any way I could automate / streamline this a bit? (gitwatch?)

ToxicFrog
Apr 26, 2008


Hollow Talk posted:

If you need source files as in programming, using version control would be a good thing to do in general!


I think this is the right way to go. You can replicate that as often as you want. You can work on it on different machines etc. I use one git repository for my main config files (zsh/bash/emacs/i3wm) and then one git repository per specific use case, namely one for my writing (~/Documents/XYZ), one for my scripts (~/bin) etc.

You can of course just copy things around, but that will cause problems once you change different things on different computers without syncing them first, at which point things like rsync's diffing algorithm will spontaneously combust. Git was made for exactly these use cases, and I don't think there will be any better option, and especially nothing that will be more resilient.

I should clarify that all the stuff in ~/devel/ etc is itself in git, I just use btsync to sync the whole directory and all the repos in it rather than pushing/pulling individual projects.

Hollow Talk
Feb 2, 2014

ToxicFrog posted:

I should clarify that all the stuff in ~/devel/ etc is itself in git, I just use btsync to sync the whole directory and all the repos in it rather than pushing/pulling individual projects.

Ah, that makes sense as well, I didn't catch that! My further reply was more directed towards fuf (other than agreeing with you), which I probably should have said! :downs: Could this also be a solution for fuf, though?

fuf posted:

Any way I could automate / streamline this a bit? (gitwatch?)

If you look at ToxicFrog's comment, there is always the option of using git locally and then simply syncing over the whole git directory via something like rsync. Another option would be to automatically run a git pull via a shell script upon login, via ~/.profile or some such. This should be pretty easy, either by simply cd'ing into the appropriate folders or by using $GIT_DIR and $GIT_WORK_TREE, and could be as simple as a list of paths that you loop through for every major project.
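The login-time pull could be as little as this in ~/.profile (the repo paths are placeholders; --ff-only keeps a forgotten local commit from silently turning into a merge):

```shell
# repos to freshen at login -- substitute your own paths
repos="$HOME/devel/project-a $HOME/Documents/XYZ $HOME/bin"
for r in $repos; do
    if [ -d "$r/.git" ]; then
        git -C "$r" pull --ff-only || echo "pull failed in $r" >&2
    fi
done
```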

There is little that can be done about the git add && git commit stage other than something like gitwatch (which you know more about than I do), I suppose, though you could use a post-commit hook to push to your remote, or a script to push to multiple servers at once. My usual process is to commit and then run a shell script that deals with pushing everything to three or four different servers.
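A post-commit hook along those lines might look like this; it lives in .git/hooks/post-commit (marked executable), and the remote names are examples:

```shell
#!/bin/sh
# mirror every commit to several remotes so no single server
# is a point of failure
for remote in origin backup vps; do
    git push --quiet "$remote" HEAD || echo "push to $remote failed" >&2
done
```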

Lysidas
Jul 26, 2002

John Diefenbaker is a madman who thinks he's John Diefenbaker.
Pillbug

ToxicFrog posted:

For things I want a bit more control over, like my dotfiles, I keep them in git; I init my entire home directory as a git repo and then create a .gitignore containing "/*" so that it only tracks things I explicitly tell it to. There's a master branch that has all my common configuration (editor configs, common bash aliases, etc), and then some machines have computer-specific branches containing changes specific to that machine, e.g. work-specific aliases on my work desktop.

This makes me uneasy for reasons that I can't really articulate. I also dislike always being inside a Git repository since I have the branch name included as part of my prompt. I prefer creating a directory .home-git and symlinking everything from there:

code:
$ ls -al | grep home-git
lrwxrwxrwx 1 lysidas lysidas         22 Dec  8 12:51 .bash_aliases -> .home-git/bash_aliases                            
lrwxrwxrwx 1 lysidas lysidas         16 Dec  8 12:52 .bashrc -> .home-git/bashrc                                        
lrwxrwxrwx 1 lysidas lysidas         19 Dec  8 12:50 .gitconfig -> .home-git/gitconfig
lrwxrwxrwx 1 lysidas lysidas         15 Dec  8 12:50 .gnupg -> .home-git/gnupg
drwx------ 1 lysidas lysidas        108 May  5 11:10 .home-git
lrwxrwxrwx 1 lysidas lysidas         13 Dec  8 12:50 .ssh -> .home-git/ssh
lrwxrwxrwx 1 lysidas lysidas         15 Dec  8 12:51 .vimrc -> .home-git/vimrc
I really don't synchronize this across multiple machines since I also have my SSH keys in this repository on each.
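That symlink farm is easy to (re)build with a loop; a sketch, assuming the repo lives at ~/.home-git and the file names match the listing above:

```shell
cd ~
for f in bash_aliases bashrc gitconfig vimrc; do
    ln -sfn ".home-git/$f" ".$f"   # -f replaces stale links, -n won't descend into directories
done
```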

ToxicFrog
Apr 26, 2008


Lysidas posted:

This makes me uneasy for reasons that I can't really articulate. I also dislike always being inside a Git repository since I have the branch name included as part of my prompt. I prefer creating a directory .home-git and symlinking everything from there:

I also have my git branch in my prompt and it doesn't bother me. :shrug: I evaluated the hassle of doing this vs. the hassle of maintaining symlinks on all five machines I use regularly and decided the latter was much easier.

quote:

I really don't synchronize this across multiple machines since I also have my SSH keys in this repository on each.

Yeah, .ssh/config is in git but the private keys and authorized_keys aren't.

To clarify, I wanted this in git because I do want to synchronize it across multiple machines, but with finer control over what gets synchronized where than I get with btsync, which just indiscriminately syncs everything -- I didn't want to sync my entire homedir or even all of my dotfiles.
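In a whitelist-style home repo the keys simply never get force-added; in a repo that tracks .ssh wholesale, hypothetical ignore entries would be:

```
# never commit private keys or per-machine trust files
.ssh/id_*
.ssh/authorized_keys
.ssh/known_hosts
```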

ToxicFrog fucked around with this message at 22:13 on Jun 7, 2014

nescience
Jan 24, 2011

h'okay
why is it that my hostname keeps resetting whenever I restart? I can't get the setting to hold after editing /etc/hosts and /etc/hostname, or after using the command hostname <mychoice>.

nescience fucked around with this message at 21:37 on Jun 7, 2014

nitrogen
May 21, 2004

Oh, what's a 217°C difference between friends?

Suspicious Dish posted:

It's such a controversial feature that we might add it back. We've talked about it before. We're not sure yet.

PLEASE please at least make it an option.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
Can I ask a Solaris 10 question here, or am I better off looking elsewhere? I am more rusty than I'd like to admit on my Solaris.

evol262
Nov 30, 2010
#!/usr/bin/perl

Dilbert As gently caress posted:

Can I ask a Solaris 10 question here, or am I better off looking elsewhere? I am more rusty than I'd like to admit on my Solaris.

Probably the best thread

RFC2324
Jun 7, 2012

http 418

Dilbert As gently caress posted:

Can I ask a Solaris 10 question here, or am I better off looking elsewhere? I am more rusty than I'd like to admit on my Solaris.

What do you need to know?

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
Wow, never mind, I was WAY overthinking this; it's just some driver installs and automated backups. I'm dumb.

Xik
Mar 10, 2011

Dinosaur Gum
How would I go about making all my functions in .bashrc available to a non-interactive shell?

Is there a reason I shouldn't be trying to do this?

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
Source a specific file from your script. Relying on bashrc makes your scripts less portable and harder to debug.
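A minimal version of that, with the file name as an example:

```shell
# near the top of any script that needs the shared functions
if [ -f "$HOME/.bash_functions" ]; then
    . "$HOME/.bash_functions"
fi
```

bash also reads the file named by $BASH_ENV when it starts non-interactively, which could point at the same file, but an explicit source line keeps the dependency visible in the script itself.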

Xik
Mar 10, 2011

Dinosaur Gum
That's how I currently do it, but was trying to tidy up some really small scripts (few lines) in ~/bin and migrate them to .bashrc as functions with all my other little things.

Then I realised that I call those little scripts from multiple places, including other config files and scripts. Maybe I should be doing the exact opposite and migrating them out into small scripts in ~/bin, leaving .bashrc for aliases?

telcoM
Mar 21, 2009
Fallen Rib

nescience posted:

why is it that my hostname keeps resetting whenever I restart? I can't get the setting to hold after editing /etc/hosts and /etc/hostname, or after using the command hostname <mychoice>.

The "hostname" command tells the kernel the new hostname, but to make the change persistent, it also needs to be specified in a configuration file. Unfortunately, different Linux distributions use different locations for that file.

The old name is probably still present wherever your distribution's startup scripts read it from.
For example, RedHat keeps the hostname in /etc/sysconfig/network.

/etc/hostname is the equivalent Debian-style configuration file, so maybe you're using some version of Debian or Ubuntu. (It would be easier to answer if you mentioned the name and version of the Linux distribution you're using.)

If your network configuration is by DHCP, then the DHCP server may be configured to assign a hostname for your system, and your DHCP client might be configured to use that instead of your locally-set hostname. If you are using the "dhclient" DHCP client package (the Debian default one), check the /etc/dhcp/dhclient* files (or /etc/dhcp3/dhclient* in some versions) for any hostname-related things.
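For reference, the files involved look roughly like this (Debian-flavored examples; exact paths and syntax vary by distribution and dhclient version):

```
# /etc/hostname (Debian/Ubuntu) -- just the bare name:
mybox

# /etc/sysconfig/network (RedHat):
HOSTNAME=mybox

# /etc/dhcp/dhclient.conf -- keep the locally set name even if
# the DHCP server offers one:
supersede host-name "mybox";
```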

If that's not it, then it is possible that the old hostname setting is still embedded in the initrd/initramfs. In that case, you'll need to update your initramfs to make the new setting take effect at boot time. In Debian, this is the command to do it (probably in Ubuntu too):
code:
sudo update-initramfs -u -k $(uname -r)

fuf
Sep 12, 2004

haha
What's up dudes another annoying question.

Is there a fast window manager that'll just switch between various programs running in fullscreen? No status bar or tiling or overlapping windows.

I'm setting up a media pc in our living room and I want to make it as idiot-proof as possible - ideally pressing the windows key would switch between chrome, spotify and netflix.

e: looks like awesome might be pretty good.

fuf fucked around with this message at 13:08 on Jun 8, 2014

FWT THE CUTTER
Oct 16, 2007

weed

fuf posted:

What's up dudes another annoying question.

Is there a fast window manager that'll just switch between various programs running in fullscreen? No status bar or tiling or overlapping windows.

I'm setting up a media pc in our living room and I want to make it as idiot-proof as possible - ideally pressing the windows key would switch between chrome, spotify and netflix.

e: looks like awesome might be pretty good.

I'd use openbox.

fuf
Sep 12, 2004

haha
Actually I realised I can do it by just switching between workspaces in gnome shell.

I made a web page with nice big buttons for netflix, spotify, etc, with a Node backend which executes commands to open the app on the right workspace in fullscreen and switch to that workspace. Too bad there's no terminal command to switch workspaces. I had to use xdotool to fake key presses. Kinda clunky but it works ok.

evol262
Nov 30, 2010
#!/usr/bin/perl

fuf posted:

Actually I realised I can do it by just switching between workspaces in gnome shell.

I made a web page with nice big buttons for netflix, spotify, etc, with a Node backend which executes commands to open the app on the right workspace in fullscreen and switch to that workspace. Too bad there's no terminal command to switch workspaces. I had to use xdotool to fake key presses. Kinda clunky but it works ok.

Node.
Clunky.
Shock

ZHamburglar
Aug 24, 2006
I have a penis.
Is there an IRC chat where I can get help from you guys? I'm working on figuring out how to correctly do the permissions for Plex on Ubuntu, but I'm not entering it all in correctly and don't want to clog up the thread with a bunch of super basic poo poo.

Varkk
Apr 17, 2004

There are a bunch of SH/SC regulars in #bofh on SynIRC

nescience
Jan 24, 2011

h'okay

telcoM posted:

hostname stuff

Thanks for the explanation! It turned out to be an unrelated issue: the machine in question is an OpenVZ container running Debian 7, and the hostname was set in one of the control panels for my VM, which overrode whatever I changed in the OS.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
I guess this question belongs here:

Is there a good guide, or can someone point me in the right direction, on packaging a driver into a Solaris ISO so the driver is picked up natively?

evol262
Nov 30, 2010
#!/usr/bin/perl

Dilbert As gently caress posted:

I guess this question belongs here:

Is there a good guide, or can someone point me in the right direction, on packaging a driver into a Solaris ISO so the driver is picked up natively?

This is really well documented.

General_Failure
Apr 17, 2005
I'm still running lubuntu 14.04 x64 on the upgrade install I did a while back. Got most of the issues sorted pretty easily, but there's still one big one: the AMD graphics drivers. Although it is my pleasure to report that, for the first time, the open source ones don't cause constant GPU locks rendering the system unusable, it's still not much use because I want to use CrossfireX.

In 13.04 the proprietary AMD drivers were working well ...enough. Apport would explode all over the place on startup / resume / coming back from a screensaver, and OpenGL would break after a screensaver had run. But performance was good, settings persisted across reboots, and CrossfireX seemed to work fine.

In 14.04, performance is not great, settings like dual monitors don't persist (which throws CrossfireX out of whack), and weird things happen, like chromium's window contents ending up in a shuffled tile pattern if something with OpenGL is running in another window.

I can't even remember what driver I was using before updating. I want to say xorg-edgers with bumblebee ripped out because it's just a desktop with AMD cards, but I'm not sure any more because I did it ages ago.

What can I do to fix this driver issue? I'm getting close to saying to hell with it, shoving my nVidia 8800GTX in, struggling with its driver issues, and going back to the AMD dance again.

CaptainSarcastic
Jul 6, 2013



What graphics card is it? AMD still has awful driver support in general, and proprietary driver support was mangled for cards as recent as HD 4000.

I dealt with a similar issue on an HP all-in-one running AMD graphics, and just had to settle for the open source driver for the time being.

Looks like this describes a similar situation on Ubuntu: https://bugs.launchpad.net/ubuntu/+source/fglrx-installer/+bug/1058040

It is stuff like this that has been a significant reason that, when possible, I have run Nvidia cards in all my builds for many years now.

General_Failure
Apr 17, 2005

CaptainSarcastic posted:

What graphics card is it? AMD still has awful driver support in general, and proprietary driver support was mangled for cards as recent as HD 4000.

I dealt with a similar issue on an HP all-in-one running AMD graphics, and just had to settle for the open source driver for the time being.

Looks like this describes a similar situation on Ubuntu: https://bugs.launchpad.net/ubuntu/+source/fglrx-installer/+bug/1058040

It is stuff like this that has been a significant reason that, when possible, I have run Nvidia cards in all my builds for many years now.

Kind of embarrassing but I don't know. They are Radeon HD 5700 series but beyond that I don't remember. I want to say 5770 but I'm not sure. I got them as a pair on eBay a while back because my nVidia card chews up and spits out PSUs given the chance. I got a much better PSU in the interim but even 1x AMD card > nVidia card benchmark wise so I tried to stick with them.

What I've never understood is on paper my AMD(s) are much better than the nVidia but in real world I swear the nVidia gives a better experience even though it's not capable of everything that the AMDs are. What I mean is if I'm working within what both the AMDs and the nVidia are capable of, the nVidia has a far more stable framerate and screams for more, but the AMD gives the impression it's constantly going "ohshitohshitohshit!" with a less than stable framerate. Plus there's other quirks. For example the main menu screen on Saints Row the Third in Win8. Using my nVidia it's rock solid. The background city is smooth from the moment the menu appears. With the AMDs, CrossfireX enabled or not the city judders along at a couple of FPS until it catches up. There's lots of other examples of that sort of behaviour too.

Back on topic. The open source drivers for nVidia and AMD just plain didn't work for me until 14.04. Haven't tried the nVidia one yet though.

That bug does seem semi-relevant, although the card generation is wrong. But you didn't know that before.

One frustrating thing is that it's a super tough year, so I'm stuck with the hardware I have and can't just sidestep the whole thing by getting a better nVidia card to stuff in and forgetting about it.

CaptainSarcastic
Jul 6, 2013



That should be new enough that the regular Catalyst drivers should work. I'm not sure exactly how great the Crossfire support is - that is not something I have ever had to deal with.

Aside from on-board AMD graphics (my netbook and aforementioned HP all-in-one have newer and older AMD graphics, respectively) the last discrete card I ran that wasn't Nvidia was a Radeon 9600 All-In-Wonder about a decade ago. Great card, but terrible drivers and software even then.

I'd really hoped that driver and software support would improve when AMD bought ATI, but that never really seemed to happen.

General_Failure
Apr 17, 2005

CaptainSarcastic posted:

That should be new enough that the regular Catalyst drivers should work. I'm not sure exactly how great the Crossfire support is - that is not something I have ever had to deal with.

Aside from on-board AMD graphics (my netbook and aforementioned HP all-in-one have newer and older AMD graphics, respectively) the last discrete card I ran that wasn't Nvidia was a Radeon 9600 All-In-Wonder about a decade ago. Great card, but terrible drivers and software even then.

I'd really hoped that driver and software support would improve when AMD bought ATI, but that never really seemed to happen.

The AMD drivers always give me a headache, and nVidia aren't much better: nVidia still use pretty much the same installer they were using around 2000. With the advent of the average Linux PC booting straight into X, it became a real fight to find whatever method works on a particular distribution to stop the desktop manager / X from loading.
The AMD drivers are like playing Russian roulette with an SMG.

Out of curiosity I checked what version Catalyst was reporting: driver version 13.35.1005-blahblah.
AMD's current beta drivers on their site are 14.6. The trouble is, every time I have tried installing one of their drivers like that, all hell breaks loose and the bastard is impossible to remove, because it embeds itself like a gestating chestbursting xenomorph.

peepsalot
Apr 24, 2007

        PEEP THIS...
           BITCH!

General_Failure posted:

Kind of embarrassing but I don't know. They are Radeon HD 5700 series but beyond that I don't remember. I want to say 5770 but I'm not sure. I got them as a pair on eBay a while back because my nVidia card chews up and spits out PSUs given the chance. I got a much better PSU in the interim but even 1x AMD card > nVidia card benchmark wise so I tried to stick with them.

What I've never understood is on paper my AMD(s) are much better than the nVidia but in real world I swear the nVidia gives a better experience even though it's not capable of everything that the AMDs are. What I mean is if I'm working within what both the AMDs and the nVidia are capable of, the nVidia has a far more stable framerate and screams for more, but the AMD gives the impression it's constantly going "ohshitohshitohshit!" with a less than stable framerate. Plus there's other quirks. For example the main menu screen on Saints Row the Third in Win8. Using my nVidia it's rock solid. The background city is smooth from the moment the menu appears. With the AMDs, CrossfireX enabled or not the city judders along at a couple of FPS until it catches up. There's lots of other examples of that sort of behaviour too.

Back on topic. The open source drivers for nVidia and AMD just plain didn't work for me until 14.04. Haven't tried the nVidia one yet though.

That bug does seem semi-relevant, although the card generation is wrong. But you didn't know that before.

One frustrating thing is that it's a super tough year, so I'm stuck with the hardware I have and can't just sidestep the whole thing by getting a better nVidia card to stuff in and forgetting about it.

Try lspci if you don't know what card you have installed.
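With numeric IDs, lspci can usually pin down the exact model even when the marketing name only says "5700 Series"; a sketch:

```shell
# -nn prints names plus [vendor:device] IDs
lspci -nn | grep -iE 'vga|3d'
# the trailing [1002:xxxx] pair identifies the exact chip and can be
# looked up in the PCI ID database
```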

General_Failure
Apr 17, 2005

peepsalot posted:

Try lspci if you don't know what card you have installed.

Everything just tells me 5700 series.

I've hopefully got the current beta fglrx building as a .deb right now.

Oh sweet! It just finished. Took a while. And that's all in RAM too. That's kind of sad. Anyway, hopefully I'll check in in a day or three after I've installed it, or after it's horribly corrupted everything and I've reinstalled. Or on the off chance that the Linux driver didn't bend the system over and give it a good rogering, for once.

CaptainSarcastic
Jul 6, 2013



General_Failure posted:

Everything just tells me 5700 series.

I've hopefully got the current beta fglrx building as a .deb right now.

Oh sweet! It just finished. Took a while. And that's all in RAM too. That's kind of sad. Anyway, hopefully I'll check in in a day or three after I've installed it, or after it's horribly corrupted everything and I've reinstalled. Or on the off chance that the Linux driver didn't bend the system over and give it a good rogering, for once.

For what it's worth, the fglrx driver on my netbook has been running beautifully (it's got an HD 6000 series chip) under openSUSE 12.3 and 13.1 (the current release).

Longinus00
Dec 29, 2005
Ur-Quan

General_Failure posted:

The AMD drivers always give me a headache, and nVidia aren't much better: nVidia still use pretty much the same installer they were using around 2000. With the advent of the average Linux PC booting straight into X, it became a real fight to find whatever method works on a particular distribution to stop the desktop manager / X from loading.
The AMD drivers are like playing Russian roulette with an SMG.

Out of curiosity I checked what version Catalyst was reporting: driver version 13.35.1005-blahblah.
AMD's current beta drivers on their site are 14.6. The trouble is, every time I have tried installing one of their drivers like that, all hell breaks loose and the bastard is impossible to remove, because it embeds itself like a gestating chestbursting xenomorph.

"Driver version 13.35.1005-blahblah" is just Ubuntu's designation for it. In reality, I'm pretty sure trusty is using the 14.4 driver.

One thing you may have to do, if you're installing the driver off AMD's website, is recompile it on every kernel update. I don't know if they've added DKMS support yet.

General_Failure
Apr 17, 2005
drat versioning, eh?
Anyway, the compile worked okay. I saw it mention DKMS while the driver was installing, so it's possible that it's working.

The good:
The corruption I was seeing in chromium when I had an OpenGL program (KSP) running has gone away.
It's definitely a newer driver, even if only by a couple of revision points.
OpenGL seems to be faster and smoother. Where it should be.

The bad:
amdcccle still doesn't remember my multi-monitor settings, forcing me to reconfigure on every reboot. It used to save them in 13.04.
amdcccle is installed in /usr/lib/fglrx/bin, which isn't in my PATH. Easy to fix, but that's probably just because the driver is generic Linux rather than the Ubuntu-fied one in the repository.

CaptainSarcastic
Jul 6, 2013



I'm not sure if this is at all applicable, but a couple years ago (on a machine running PCLinuxOS) I found that I had to run the Nvidia control panel as root in order for it to retain settings. I don't know if sudoing into it might give you persistent results or not, but it's akin to what it took for my Nvidia settings to persist on that system.

General_Failure
Apr 17, 2005

CaptainSarcastic posted:

I'm not sure if this is at all applicable, but a couple years ago (on a machine running PCLinuxOS) I found that I had to run the Nvidia control panel as root in order for it to retain settings. I don't know if sudoing into it might give you persistent results or not, but it's akin to what it took for my Nvidia settings to persist on that system.

The only way to change settings like multiple monitor modes in Catalyst is to run amdcccle as su. I was hoping at first it was just a permissions issue, but su not doing it pretty much nixes that argument.


Longinus00
Dec 29, 2005
Ur-Quan

General_Failure posted:

The only way to change settings like multiple monitor modes in Catalyst is to run amdcccle as su. I was hoping at first it was just a permissions issue, but su not doing it pretty much nixes that argument.

I'm not sure if you're using unity or what but with xubuntu I just set the monitors from the xfce control panel and it works fine. I also had to manually set a larger virtual desktop in xorg.conf of course because nothing is ever that simple.

Why don't you post your xorg.conf in here and let us take a look at it?
