FlapYoJacks
Feb 12, 2009
Considering the guy who wrote systemd got death threats, I wouldn't say anything either. lovely script kiddies get really angry when you take away their lovely bash scripts.

James Baud
May 24, 2015

by LITERALLY AN ADMIN

Suspicious Dish posted:

also i should probably stop talking in this thread because i think at this point im officially done with linux and desktops and linux desktops

Endless already went under?

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
nah im just working on the cloud software side of things and not the desktop part

Soricidus
Oct 21, 2010
freedom-hating statist shill
congratulations on your promotion

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
thank you!!! unfortunately the cloud is garbage but im still happy

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Suspicious Dish posted:

nah im just working on the cloud software side of things and not the desktop part

so does this mean gnome is pivoting from having tablet apps as its primary target to having webcloud apps as its primary target?

mike12345
Jul 14, 2008

"Whether the Earth was created in 7 days, or 7 actual eras, I'm not sure we'll ever be able to answer that. It's one of the great mysteries."





eschaton posted:

so does this mean gnome is pivoting from having tablet apps as its primary target to having webcloud apps as its primary target?

the year of the desktop cloud?

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


laffo about the guy complaining about fglrx. yeah, ati has poo poo software and fglrx has been a pita since before bush left office. don't use ati cards if you wanna do opencl/opengl on linux

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Condiv posted:

laffo about the guy complaining about fglrx. yeah, ati has poo poo software and fglrx has been a pita since before bush left office. don't use ati cards if you wanna do opencl/opengl on linux

great suggestion, buy a new graphics card so i can gently caress around with dumb text poo poo using a different driver

fglrx works fine now btw, it was fedora that was the issue. installed in one step on ubuntu.

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


atomicthumbs posted:

great suggestion, buy a new graphics card so i can gently caress around with dumb text poo poo using a different driver

fglrx works fine now btw, it was fedora that was the issue. installed in one step on ubuntu.

nah. fglrx is really buggy and bad, even on ubuntu (ubuntu makes it easier to get running, but a lot of the difficulty comes from how absolutely poo poo fglrx is). considering you can get the proprietary drivers for nvidia just as easily on ubuntu, and they won't crash your system or be dogshit at rendering 2d, yes, you should get a more appropriate graphics card.

oh, another thing to note: the proprietary drivers for ati cards have a much worse performance degradation relative to their windows versions than the nvidia drivers do, so don't expect your ati graphics card to be able to handle anything near what it can on windows.

Truga
May 4, 2014
Lipstick Apathy
opencl on linux runs really loving well with ati, there's a non-insignificant performance advantage compared to windows

though I'm guessing that's mainly due to buttcoiners

ahmeni
May 1, 2005

It's one continuous form where hardware and software function in perfect unison, creating a new generation of iPhone that's better by any measure.
Grimey Drawer

Condiv posted:

don't use ati cards

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


Truga posted:

opencl on linux runs really loving well with ati, there's a non-insignificant performance advantage compared to windows

though I'm guessing that's mainly due to buttcoiners

if opencl works well with the opensource ati drivers then that would be the best situation for him. the opensource drivers were always much better and more stable for me than the proprietary (though they were slower at 3d, but if you're not playing games it doesn't matter much).

Sapozhnik
Jan 2, 2005

Nap Ghost
Is open source OpenCL even a thing yet because it always seems to be just around the corner.

Vulkan is supposed to do both render and compute from one API anyway.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

eschaton posted:

so does this mean gnome is pivoting from having tablet apps as its primary target to having webcloud apps as its primary target?

no. what im doing has nothing to do with gnome.

Celexi
Nov 25, 2006

Slava Ukraini!

Mr Dog posted:

Is open source OpenCL even a thing yet because it always seems to be just around the corner.

Vulkan is supposed to do both render and compute from one API anyway.

vulkan will be better because games and consumer software will also be able to use it more easily. on most consumer computers the drivers installed from shitbuntu repos and rpmfusion and others strip out the opencl parts of the closed source drivers, and the opencl drivers for intel or amd cpus are also not installed by default, so it's really poo poo to try to make consumer software that uses gpu compute and graphics at the same time
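for context, the first thing a consumer app has to do is probe at runtime whether an opencl implementation is even present. a rough sketch of that check in plain C against the stock headers and ICD loader (the messages and the 8-platform cap are just made up for the example):

code:

#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_uint nplat = 0;

    /* with no ICD installed this call fails or reports zero platforms */
    cl_int err = clGetPlatformIDs(0, NULL, &nplat);
    if (err != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "no opencl platforms found, falling back to cpu path\n");
        return 1;
    }

    cl_platform_id plats[8];
    if (nplat > 8)
        nplat = 8;
    clGetPlatformIDs(nplat, plats, NULL);

    for (cl_uint i = 0; i < nplat; i++) {
        char name[256] = {0};
        clGetPlatformInfo(plats[i], CL_PLATFORM_NAME, sizeof name, name, NULL);

        /* count gpu devices on this platform; none found is an error return */
        cl_uint ndev = 0;
        err = clGetDeviceIDs(plats[i], CL_DEVICE_TYPE_GPU, 0, NULL, &ndev);
        printf("platform '%s': %u gpu device(s)\n", name,
               err == CL_SUCCESS ? ndev : 0);
    }
    return 0;
}

and if libOpenCL.so itself isn't installed you don't even get this far unless you dlopen() it yourself, so in practice you ship a cpu fallback or tell the user to go install vendor drivers, which is exactly the problem.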

Athas
Aug 6, 2007

fuck that joker

Mr Dog posted:

Is open source OpenCL even a thing yet because it always seems to be just around the corner.

Vulkan is supposed to do both render and compute from one API anyway.

The OpenCL driver for Intel GPUs (called "Beignet" for whatever reason) is fully open source and works pretty well. AMD is also working on opening as much of their driver as possible. Well, actually writing a new one that seems to be called amdgpu... but also a headless one that I think is called ROCK, for Radeon Open Compute Kernel, but AMDs entire GPU strategy is covered in a byzantine layer of acronyms that I have yet to fully understand.

Athas fucked around with this message at 06:34 on Apr 22, 2016

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Athas posted:

AMDs entire GPU strategy is covered in a byzantine layer of acronyms that I have yet to fully understand.

its the results of neverending waves of ~(not so) brilliant ideas~ to try to get people more interested in amd products

like for a while they had this one relatively high up dude who really really wants to make THE HOLODECK from st:tng because that is a super important thing for humanity to have so they pushed massive multihead as a major super important feature that would somehow eventually be an important part of creating the brave new world of cmdr data: sherlock holmes edition. this is why you could get amd cards with like six video outs, it was pushed as a great awesome feature and of course it failed to make a dent because like 0.1% of people care about having six displays attached to one card

not even kidding about this btw it was covered on anandtech and other pc tech websites

Smythe
Oct 12, 2003

BobHoward posted:

its the results of neverending waves of ~(not so) brilliant ideas~ to try to get people more interested in amd products

like for a while they had this one relatively high up dude who really really wants to make THE HOLODECK from st:tng because that is a super important thing for humanity to have so they pushed massive multihead as a major super important feature that would somehow eventually be an important part of creating the brave new world of cmdr data: sherlock holmes edition. this is why you could get amd cards with like six video outs, it was pushed as a great awesome feature and of course it failed to make a dent because like 0.1% of people care about having six displays attached to one card

not even kidding about this btw it was covered on anandtech and other pc tech websites

six video outs on 1 card sounds badass for driving something like qlab or watchout or whatever that other one is that does the same thing

Sapozhnik
Jan 2, 2005

Nap Ghost
I don't understand what OpenCL on Intel GPUs is good for.

Bitcoin randroids do GPGPU on AMD hardware and grownups do GPGPU on nVidia. Nobody does GPGPU on Intel hardware, it's what, a 2x improvement over just using all the CPU cores, if that?

Video games might want to do compute offload in addition to rendering, but gaming on Linux is either indie poo poo or it's a de-facto nVidia monopoly. On Windows you'd probably use DirectCompute or whatever since Khronos APIs haven't been a going concern in Windows gaming since before Khronos was even incorporated. Even then it was basically Carmack's poo poo using OpenGL vs literally everybody else using D3D.

Sapozhnik
Jan 2, 2005

Nap Ghost
Speaking of Carmack, remember back when he said Direct3D 5 sucked and he was only going to use OpenGL from now on?

His brief technical justifications from 1999 showed D3D 5 operating in a manner crudely equivalent to Vulkan's shiny new hotness of 2016: you construct a command buffer, upload it to the GPU, then execute it. Which is apparently omg so cumbersome. His sole technical complaint was "The driver knows better than me how big a command buffer I should be using", which is complete loving nonsense because it is the game engine itself that knows where one piece of geometry ends and another begins. Obviously, 3D accelerators of that era were just fixed-function vertex interpolators and texture samplers with none of the shader pipeline fun times of a modern GPU, but it still seems like the D3D5 style was the correct architecture. At least in retrospect. Everything old is new again!

Also he showed OpenGL code that used the old glBegin/glEnd and display list style which was a complete architectural dead end (even when it was written: texture uploads in display lists were a total non-starter so OpenGL already had texture objects, quickly followed by vertex buffer objects and a wholesale sea change in OpenGL's entire way of doing things).

But guys, I can draw a triangle in 10 lines as opposed to 100! In my commercial-grade game engine consisting of 100+ KLOC of code. Because that's something that loving matters.

Maybe the libertarian wunderkind ... might have got it wrong.
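For reference, the two styles being compared, as a rough sketch in C (assumes a GL context already exists; the buffer-object entry points are GL 1.5+, so on some platforms you would pull them in through an extension loader):

code:

#include <GL/gl.h>

/* the "10 lines" way: immediate mode, one API call per vertex,
 * and the driver rebuilds its internal command stream every frame */
void draw_triangle_immediate(void)
{
    glBegin(GL_TRIANGLES);
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glVertex3f( 0.0f,  1.0f, 0.0f);
    glEnd();
}

/* the direction everything actually went: geometry goes into a buffer
 * object once, and the per-frame work is just bind-and-draw */
static GLuint vbo;

void setup_triangle_buffered(void)
{
    static const GLfloat verts[] = {
        -1.0f, -1.0f, 0.0f,
         1.0f, -1.0f, 0.0f,
         0.0f,  1.0f, 0.0f,
    };
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
}

void draw_triangle_buffered(void)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const void *) 0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
}

The second form is also the one that maps onto a command-buffer model: the data already lives GPU-side, so the driver (or, in Vulkan, you) can record the draw work up front and replay it.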

VAGENDA OF MANOCIDE
Aug 1, 2004

whoa, what just happened here?







College Slice

Mr Dog posted:

Speaking of Carmack, remember back when he said Direct3D 5 sucked and he was only going to use OpenGL from now on?

His brief technical justifications from 1999 showed D3D 5 operating in a manner crudely equivalent to Vulkan's shiny new hotness of 2016: you construct a command buffer, upload it to the GPU, then execute it. Which is apparently omg so cumbersome. His sole technical complaint was "The driver knows better than me how big a command buffer I should be using", which is complete loving nonsense because it is the game engine itself that knows where one piece of geometry ends and another begins. Obviously, 3D accelerators of that era were just fixed-function vertex interpolators and texture samplers with none of the shader pipeline fun times of a modern GPU, but it still seems like the D3D5 style was the correct architecture. At least in retrospect. Everything old is new again!

Also he showed OpenGL code that used the old glBegin/glEnd and display list style which was a complete architectural dead end (even when it was written: texture uploads in display lists were a total non-starter so OpenGL already had texture objects, quickly followed by vertex buffer objects and a wholesale sea change in OpenGL's entire way of doing things).

But guys, I can draw a triangle in 10 lines as opposed to 100! In my commercial-grade game engine consisting of 100+ KLOC of code. Because that's something that loving matters.

Maybe the libertarian wunderkind ... might have got it wrong.

the rage engine's bad decisions still haunt a carmack-less id software to this day

megatextures/etc make it impossible to be a hobbyist modder of id games

VAGENDA OF MANOCIDE fucked around with this message at 20:11 on Apr 21, 2016

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Condiv posted:

nah. fglrx is really buggy and bad, even on ubuntu (ubuntu makes it easier to get running, but a lot of the difficulty comes from how absolutely poo poo fglrx is). considering you can get the proprietary drivers for nvidia just as easily on ubuntu, and they won't crash your system or be dogshit at rendering 2d, yes, you should get a more appropriate graphics card.

oh, another thing to note: the proprietary drivers for ati cards have a much worse performance degradation relative to their windows versions than the nvidia drivers do, so don't expect your ati graphics card to be able to handle anything near what it can on windows.

No, fglrx works fine.

Mr Dog posted:

I don't understand what OpenCL on Intel GPUs is good for.

Bitcoin randroids do GPGPU on AMD hardware and grownups do GPGPU on nVidia. Nobody does GPGPU on Intel hardware, it's what, a 2x improvement over just using all the CPU cores, if that?

Video games might want to do compute offload in addition to rendering, but gaming on Linux is either indie poo poo or it's a de-facto nVidia monopoly. On Windows you'd probably use DirectCompute or whatever since Khronos APIs haven't been a going concern in Windows gaming since before Khronos was even incorporated. Even then it was basically Carmack's poo poo using OpenGL vs literally everybody else using D3D.

I tested Beignet with my 4790K's GPU and it was 60% as fast on my workloads as my 7870.

atomicthumbs fucked around with this message at 21:07 on Apr 21, 2016

Smythe
Oct 12, 2003

atomicthumbs posted:

No, fglrx works fine.

lol

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Mr Dog posted:

Speaking of Carmack, remember back when he said Direct3D 5 sucked and he was only going to use OpenGL from now on?

His brief technical justifications from 1999 showed D3D 5 operating in a manner crudely equivalent to Vulkan's shiny new hotness of 2016: you construct a command buffer, upload it to the GPU, then execute it. Which is apparently omg so cumbersome. His sole technical complaint was "The driver knows better than me how big a command buffer I should be using", which is complete loving nonsense because it is the game engine itself that knows where one piece of geometry ends and another begins. Obviously, 3D accelerators of that era were just fixed-function vertex interpolators and texture samplers with none of the shader pipeline fun times of a modern GPU, but it still seems like the D3D5 style was the correct architecture. At least in retrospect. Everything old is new again!

Also he showed OpenGL code that used the old glBegin/glEnd and display list style which was a complete architectural dead end (even when it was written: texture uploads in display lists were a total non-starter so OpenGL already had texture objects, quickly followed by vertex buffer objects and a wholesale sea change in OpenGL's entire way of doing things).

But guys, I can draw a triangle in 10 lines as opposed to 100! In my commercial-grade game engine consisting of 100+ KLOC of code. Because that's something that loving matters.

Maybe the libertarian wunderkind ... might have got it wrong.

This was an era where GPUs were slow and CPUs were fast and we weren't hitting the major memory bandwidth limitations of today. Texture uploads never stalled your game because you were trying hard to simply hit 60fps to begin with. On-GPU vertex processing was still in its infancy. Per-pixel shading, the *basis* of modern GPUs, wouldn't exist until the GeForce 256 and D3D7.

So OpenGL was correct. D3D5 was garbage on the machines that it was on at the time. It couldn't properly schedule command buffers. It didn't max out your fill rate. Drivers knew a lot more about their crazy architectures. We weren't sure about shader execution units and several companies played around with pipelined models which is why glTexEnv had driver-defined limits on the number of texture environments you could have.

If you squint, the models look similar. In context, with all the added details, they are radically different.

pseudorandom name
May 6, 2007

The GeForce name is such extreme '90s teen garbage.

Notorious b.s.d.
Jan 25, 2003

by Reene

pseudorandom name posted:

The GeForce name is such extreme '90s teen garbage.

remember the boxes

[GeForce box art images]

Smythe
Oct 12, 2003

Notorious b.s.d. posted:

remember the boxes

[GeForce box art images]

cool

hobbesmaster
Jan 28, 2008

Mr Dog posted:

I don't understand what OpenCL on Intel GPUs is good for.

its for doing development on a macbook air on a plane

AWWNAW
Dec 30, 2008

getting a new geforce used to give me huge turgid teen boners

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

hobbesmaster posted:

its for doing development on a macbook air on a plane

That is a fire hazard and is illegal.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

AWWNAW posted:

getting a new geforce used to give me huge turgid teen boners

Because the tech demos always had hot as gently caress cgi girls :swoon: :fap:

Athas
Aug 6, 2007

fuck that joker

Mr Dog posted:

I don't understand what OpenCL on Intel GPUs is good for.

Bitcoin randroids do GPGPU on AMD hardware and grownups do GPGPU on nVidia. Nobody does GPGPU on Intel hardware, it's what, a 2x improvement over just using all the CPU cores, if that?

How is a 2x improvement over all CPU cores not a good performance increase? You might as well ask why it's worth using parallel code at all. It's even likely that this ratio will improve more in the future, as GPU designs are easier to scale up.

I do GPGPU at work, and even though I mostly deploy on AMD and NVIDIA hardware, it is still very useful to have fast OpenCL on my laptop, which only has an Intel GPU.

The major problem with Intel GPUs is that they have tiny memory - something like 128MiB on die. It will transparently swap to main memory I think, but come on now.

Of course, that does not solve the problem that manually using the OpenCL API is total rear end.
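For a sense of scale, here is roughly what the host side looks like just to run a one-line kernel that fills a buffer; error checking is mostly skipped, and the kernel and variable names are just made up for the example:

code:

#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void fill(__global int *out) {"
    "    out[get_global_id(0)] = (int)get_global_id(0);"
    "}";

int main(void)
{
    enum { N = 16 };
    cl_int err;

    /* pick the first platform/device the runtime offers */
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    /* context, queue, program, kernel, buffer: all built by hand */
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "fill", &err);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, N * sizeof(int), NULL, &err);

    /* launch, then read the result back */
    clSetKernelArg(k, 0, sizeof buf, &buf);
    size_t gws = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &gws, NULL, 0, NULL, NULL);

    int out[N];
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof out, out, 0, NULL, NULL);
    for (int i = 0; i < N; i++)
        printf("%d ", out[i]);
    printf("\n");

    /* and the teardown is manual too */
    clReleaseMemObject(buf);
    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}

And that is the short version, with no error handling and no build-log retrieval, which is why most people end up behind a wrapper library rather than calling the API directly.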

Asymmetric POSTer
Aug 17, 2005

ratbert90 posted:

Why on earth are you using a file browser that isn't the CLI in Linux? :psyduck:

because some people are fucktarded enough to use linux as their desktop os

they're fail aids positive and too poor to afford the antiretrovirals

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

eschaton posted:

so does this mean gnome is pivoting from having tablet apps as its primary target to having webcloud apps as its primary target?

2016 year of user facing linux in the butt

FlapYoJacks
Feb 12, 2009

blowfish posted:

2016 year of user facing linux in the butt

In my butt. :eng101:

FlapYoJacks
Feb 12, 2009
Lmbo:

http://www.pcworld.com/article/3058857/linux/ubuntu-1604-lts-gives-fans-new-reasons-to-love-this-popular-linux-desktop.html

quote:


There’s good and bad news for AMD graphics users. Ubuntu 16.04 LTS will support AMD’s new AMDGPU graphics driver architecture, which should help close the gap with Nvidia’s exceptional Linux drivers. Unfortunately, Ubuntu 16.04 LTS no longer supports the current AMD Catalyst driver, also known as “fglrx.” AMD graphics card users may want to stick with Ubuntu 14.04 until AMDGPU has matured. That is, if you’re using the card for gaming or other demanding chores. AMD graphics will work just fine if you’re performing standard desktop activities.


So in addition to Ubuntu being bad, now fglrx won't "just work" like AtomicThumbs likes to brag.

Blue Train
Jun 17, 2012

jfc unity looks like poo poo

akadajet
Sep 14, 2003

Notorious b.s.d. posted:

remember the boxes

[GeForce box art images]

photoshop bevel and drop shadow effects rule

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


ratbert90 posted:

Lmbo:

http://www.pcworld.com/article/3058857/linux/ubuntu-1604-lts-gives-fans-new-reasons-to-love-this-popular-linux-desktop.html


So in addition to Ubuntu being bad, now fglrx won't "just work" like AtomicThumbs likes to brag.

i guess they got tired of supporting fglrx since it tends to break if you look at it wrong and leave you without a desktop
