Zlodo
Nov 25, 2006

Suspicious Dish posted:

i write c because it's just fine.

so fine that you need to add your own semantics through annotations parsed from comments

https://wiki.gnome.org/GObjectIntrospection/Annotations

im dying

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
ms did the same thing with winrt and called it projection

its not a new concept

Zlodo
Nov 25, 2006

Suspicious Dish posted:

ms did the same thing with winrt and called it projection

its not a new concept

ms does a lot of bad things

Zlodo
Nov 25, 2006

Zlodo posted:

ms does a lot of bad things

jfc i just looked into that "projection" stuff and i was wondering why they called it that instead of "bindings" like the rest of the world and it turns out thats because its not just bindings, it comes with a set of retarded and completely unnecessary language extensions

like an object is created like Butt^ butt = ref new Butt;

except that this ^ poo poo is merely a refcounted pointer, something you can do perfectly well in normal c++

and thanks to this retarded c++/cli like syntax people can't use a compiler that is c++11 compliant and not slow as poo poo (ie clang) to develop for w8

gnome devs: "great concept lets emulate it"

lol just lol

Zlodo fucked around with this message at 19:42 on Oct 28, 2013

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
except thats not at all what projection is. projection is just bindings plus the facility to automate binding development through automatically generated metadata.

what you're talking about is either c++/cli or c++/cx. they are v. dumb yes.

Shaggar
Apr 26, 2006
yeah c++ is retarded and old and no one should use it when theres c# and java

Zlodo
Nov 25, 2006

Suspicious Dish posted:

except thats not at all what projection is. projection is just bindings plus the facility to automate binding development through automatically generated metadata.

i much prefer to automate binding generation by using a separate description file instead of having to put it with a special syntax inside comments or rolling my own bizarre custom language extensions

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
write the gir file manually then. the annotations are just scanned and a gir file is produced from that, which is our interchange format.

google for "Gtk-3.0.gir" for an example. it's a fairly simple format

Zlodo
Nov 25, 2006

Suspicious Dish posted:

write the gir file manually then. the annotations are just scanned and a gir file is produced from that, which is our interchange format.

google for "Gtk-3.0.gir" for an example. it's a fairly simple format

oh god yeah let me write xml manually sign me right the gently caress in

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
what would you prefer

Notorious b.s.d.
Jan 25, 2003

by Reene

Suspicious Dish posted:

write the gir file manually then. the annotations are just scanned and a gir file is produced from that, which is our interchange format.

google for "Gtk-3.0.gir" for an example. it's a fairly simple format

so you've borrowed the worst and most annoying thing from CORBA while abandoning all the good things about CORBA

great hustle guys. glad you deprecated CORBA and bonobo instead of documenting them

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
except for the fact that it has nothing to do with corba at all?

Notorious b.s.d.
Jan 25, 2003

by Reene

Mr Dog posted:

Building a desktop environment on top of Java or C# is loving stupid, the politics are just coincidental.

nobody argued you would want to build the entire environment in java or c#.

but at some point you have user applications, which will differ per-user and may in fact not come bundled with gnome. wild idea, i know. gnome 3 doesn't really encourage that mode of thinking.

Mr Dog posted:

They are entire Qt-like universes in their own right, so have fun churning out mountains of boilerplate converting back and forth between java.util.List or System.Collections.IList objects or w/e and the native collection types used by the system frameworks. Same issue if you want to interface with GStreamer or any other such native API. Mountains of loving boilerplate so you're going to have to hope that whoever does the tedious shitwork of keeping your C#/Java bindings up to date has a binding for the exact API that you want to use and it's been tested and shown to work.

the point of bindings is to handle all this bullshit. java-gnome and gtk# were essentially seamless and worked really great

yes, if you wanted to roll it yourself and do your own JNI or C# marshalling for gtk+ that would suck. no one would do that.

Mr Dog posted:

GObject still exists, there's still C-side boilerplate you have to write if you're programming for GObject in C, but you do that once and then dynamic languages can dynamically bind to it using GObject introspection

lol just lol

gobject introspection was layering bad ideas on bad ideas. reinventing the wheel. gnome abandoned CORBA so now it has to re-invent it badly

Mr Dog posted:

It's a large part of why the official "default" language for GNOME is now JavaScript. JS sucks but at least it's just a programming language as opposed to a programming language with a gigantic compulsory standard library attached to it that duplicates a large chunk of the Gtk/GLib stack.

...and here we reach the end game

when every loving stupid thing has been tried and is hopelessly broken we just start pretending the world is a web browser and borrow a language that was "designed" in four weeks

when you're completely out of ideas and no one can write applications for your constantly-shifting platform obviously js is the next stop on the crazy train

Notorious b.s.d.
Jan 25, 2003

by Reene

Suspicious Dish posted:

except for the fact that it has nothing to do with corba at all?

you reinvented the IDL in its entirety and you generate it from C code and you don't see how this is like CORBA?

CORBA was originally core to the gnome project and then they went on a crusade against it and jesus christ i hate open sores desktops. gnu network object model environment, now without the network, object, or model.

Notorious b.s.d.
Jan 25, 2003

by Reene

Suspicious Dish posted:

what would you prefer

i would prefer that people had actually finished the work begun with gnome 1 and gnome 2
i would prefer that bonobo had been documented for use by other people
i would prefer that eazel and ximian had developed actual business models

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Notorious b.s.d. posted:

you reinvented the IDL in its entirety and you generate it from C code and you don't see how this is like CORBA?

IDLs weren't invented by CORBA, nor does the IDL accurately model high-level semantics of C APIs. the problem that this needs to solve is "if I have a method that takes a char***, it really means that it's an out pointer to an array of strings and so treat it as a return value from python or js"

the reason we dropped CORBA was because it was synchronous, and it really hurt to do 30 method calls for every part in the desktop. it hasn't been replaced with gobject-introspection. in fact, it was replaced by DBus, a separate project.

vala started out as a project to make an IDL that describes gobject. there is also a "vapi" IDL that's similar to GIR in some ways. vapi supports some things that GIR doesn't, and vice versa.

Notorious b.s.d. posted:

the point of bindings is to handle all this bullshit. java-gnome and gtk# were essentially seamless and worked really great

in the past, pygtk, java-gnome and gtk# all had to invent their own different ways of making bindings. all three of them had some hacked up c parser that would generate a bunch of data, and they'd edit it manually to deal with some hacks. each binding would bind some method calls but not others, and it was inconsistent as to what actually got bound in each language.

the pygtk, java-gnome and gtk# developers wanted to share the work of mapping the api. they invented the gir interchange format and the typelib binary format to be usable by bindings at runtime. later, they realized they could apply this to other random libraries in the stack, so a convenient way of generating gir from C was invented, using some annotations in doc comments. they're actually the same annotations that we added a long time ago for documentation purposes in gtk-doc, telling you stuff like "this function returns a new reference, so you need to unref with g_object_unref". they were already in libraries like gtk+ for doc purposes, so it made sense to reuse that data which bindings could also use.
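
for the curious, one of those doc comments looks roughly like this (a minimal sketch with a made-up "foo" library, so the names are mine and not from any real header):

    /* sketch: hypothetical header showing a gtk-doc comment with a
     * (transfer full) annotation on the return value */
    #include <glib-object.h>

    typedef struct _FooWidget FooWidget;   /* hypothetical types */
    typedef struct _FooStyle  FooStyle;

    /**
     * foo_widget_get_style:
     * @self: a #FooWidget
     *
     * Returns: (transfer full): a new #FooStyle; the caller owns the
     *   reference and must drop it with g_object_unref() when done
     */
    FooStyle *foo_widget_get_style (FooWidget *self);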

as a result of this effort, pygtk, java-gnome and gtk# are now all based on introspection data.

it was driven by the bindings authors as a way of sharing effort, not imposed on them from above.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Notorious b.s.d. posted:

i would prefer that people had actually finished the work begun with gnome 1 and gnome 2

which means ...?

Notorious b.s.d. posted:

i would prefer that bonobo had been documented for use by other people

we phased out bonobo by gnome 2.6 or so. that was released over ten years ago. it didn't work out very well in practice.

dbus is a lot better, so just use that instead.

Notorious b.s.d. posted:

i would prefer that eazel and ximian had developed actual business models

i heard that the yorba guys want to start a new business based on developing add-on services around nautilus. let's hope that eazel mk. ii has more success.

PrBacterio
Jul 19, 2000
reading this all I can think of is epicycles
like, yes, it IS possible to come up with an utterly convoluted but ultimately "logical" explanation for all this poo poo but at the end of the day, why bother when theres another approach thats straightforward and makes sense right from the start

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

PrBacterio posted:

reading this all I can think of is epicycles
like, yes, it IS possible to come up with an utterly convoluted but ultimately "logical" explanation for all this poo poo but at the end of the day, why bother when theres another approach thats straightforward and makes sense right from the start

which is?

Notorious b.s.d.
Jan 25, 2003

by Reene

Suspicious Dish posted:

we phased out bonobo by gnome 2.6 or so. that was released over ten years ago. it didn't work out very well in practice.

it was doomed from day one. if it's not documented, no one can use it

if no one can use it, it doesn't improve.

Suspicious Dish posted:

dbus is a lot better, so just use that instead.

notably dbus came from kde

which is where all the sane users/developers went

Notorious b.s.d.
Jan 25, 2003

by Reene

Suspicious Dish posted:

IDLs weren't invented by CORBA, nor does the IDL accurately model high-level semantics of C APIs. the problem that this needs to solve is "if I have a method that takes a char***, it really means that it's an out pointer to an array of strings and so treat it as a return value from python or js"

the reason we dropped CORBA was because it was synchronous, and it really hurt to do 30 method calls for every part in the desktop. it hasn't been replaced with gobject-introspection. in fact, it was replaced by DBus, a separate project.

this is actually a sane and reasonable explanation rather than a lengthy screed about corba sucking

i appreciate it


Suspicious Dish posted:

as a result of this effort, pygtk, java-gnome and gtk# are now all based on introspection data.

it happened by the bindings authors as a way of sharing effort, not the other way around.

still super fuckin hacky

if you want a global calling convention that rises above the level of C, define that calling convention. don't generate magic xml from comments in C files



edit: early in the .net project, some people kinda thought of the CLR as that "above C" calling convention. C++/CLI and the genuinely multilanguage nature of the CLR made it way more than just a JVM clone or a new language for windows -- it was supposed to be legit possible to have EVERYTHING communicating through CLR calling convention and getting away from win32 forever. welp we see how that panned out

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Notorious b.s.d. posted:

notably dbus came from kde

dbus was inspired by both dcop and bonobo, but it was actually havoc pennington of gnome who wrote the spec and all the code for it.

Notorious b.s.d. posted:

still super fuckin hacky

if you want a global calling convention that rises above the level of C, define that calling convention. don't generate magic xml from comments in C files

calling conventions aren't high-level enough

if i take an 'int **' as an argument, that either means an array of int or an out pointer to an int.

if i return a string like a 'char *', do i need to free it afterwards?

this is normally covered by documentation (or a stronger type system), but a binding needs to figure this out.

we did discuss alternate solutions like doing #define out and putting 'out int** foo' in your param list, or adding a new idl language, but we decided that piggybacking on the standard annotations already provided by our doc tool was good enough, gave us a very easy testbed for prototyping (since 90% of gtk+ already used these annotations), and made it a relatively easy transition for library authors.

it's a bit awkward at first, but you warm up to it quite soon. add a (transfer none) here and a (out) there and suddenly wow, automatic javascript, python, mono, java, lua, perl bindings.
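
to make it concrete, here's the kind of thing a binding can't guess from the C signature alone (again a sketch with made-up function names, not real gtk+ api):

    /* sketch: hypothetical functions showing why the annotations exist */
    #include <glib.h>

    /**
     * foo_get_sizes:
     * @sizes: (out) (array length=n_sizes) (transfer full): return location
     *   for an array of ints; free with g_free()
     * @n_sizes: (out): length of @sizes
     *
     * without (out)/(array), a binding only sees "int **" and can't tell
     * an out-array apart from a pointer to a pointer.
     */
    void foo_get_sizes (int **sizes, int *n_sizes);

    /**
     * foo_get_name:
     *
     * Returns: (transfer none): the name. (transfer none) tells the binding
     *   the string is owned by the library, so don't free it.
     */
    const char *foo_get_name (void);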

Bream
Feb 3, 2013

Farmer's Barket
guys what was wrong with gnome 2? i still happily use it and fear for the day when it's discontinued. i guess there will always be xfce or some kind of rhel-supported legacy mode, but do we really need to rethink how desktop environments work?

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
there were issues with how it was architected that made features like "lock the screen if I have a menu open" impossible.

we also did a lot of ui research on gnome used by our workstation customers and found out lots of stuff. a large problem for a lot of real workstation customers was that they could accidentally delete the start menu that had all their apps or drag their clock so it was 1px wide and they had no idea what they did and how to undo it.

lots of random apps stuffed themselves into the notification area using the gtktrayicon api, and some of our customers had over 30 icons in that little area. it wasn't designed to do that, and the tray icon api gives the app way too much power over what that icon is allowed to do, and isn't accessible in the slightest. it also wasn't designed for transparent panels or anything that our customers were asking us for.

gnome3 was done with lots of ui research and testing, and it was mostly about changing the architecture so it was easier to add new features, and to get away from the panel/applet model of desktop development

you're free to use xfce or cinnamon or lxde or unity or mate if you don't like gnome3 (though gnome 3.10 is much better than gnome 3.0 if you haven't tried it recently)

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
if it isn't clear already i work at red hat on gnome3 all day so i'm happy to answer any questions you may have about this stuff.

Sapozhnik
Jan 2, 2005

Nap Ghost

Notorious b.s.d. posted:

edit: early in the .net project, some people kinda thought of the CLR as that "above C" calling convention. C++/CLI and the genuinely multilanguage nature of the CLR made it way more than just a JVM clone or a new language for windows -- it was supposed to be legit possible to have EVERYTHING communicating through CLR calling convention and getting away from win32 forever. welp we see how that panned out

Yes, you can skin the most banal aspects of the syntax to look like Visual Basic or like Java.

This is a technical triumph, gj microsoft.

Here's the thing: they tried that with COM, and it was... well, not entirely a nightmare; all it really defined was a calling convention for 'this' plus the IUnknown interface which gave you dynamic casts and reference counting, but it was rather bureaucratic with regards to memory management as a result. Then they wanted to make it a bit less bare-bones and added "COM Automation", which was basically a fancy way of saying "The Visual Basic type system" and welp that was an absolute screaming horror show. Ask me about SAFEARRAYs sometime. Specifically how they must always be contained in VARIANTs and they can have multiple dimensions and the range of each dimension is allowed to start at -12 and go up to 4 or whatever and SAFEARRAYs themselves must always contain VARIANTs and there's probably VT_BYREF VARIANTs involved and aaaaaaaaaaaaaaaaaaaaarrhrejwihglkfhgfdslgh you absolute fucker I'd repressed my memories of that poo poo.

The whole thing was also bodged into becoming an RPC/marshalling system as well on the side (even an intra-process marshalling system for "apartments", which are basically isolated logical processes within a process), and also built out a bunch of enterprise distributed transaction co-ordination and clustering stuff and called it COM+. Remote object access and RPC are fundamentally loving unworkable ideas and it took the industry the whole of the 90s and a good chunk of the noughts to figure this out, both fall afoul of basically every one of the distributed systems fallacies, this is an idea that really needs to die forever and not keep getting zombified anew in the form of SOAP and WSDL and whatever other poo poo.

And yeah it had an IDL too.

Anyway, then they tried again with CLR, made the type system look like something that wasn't designed by a cocaine fuelled psychopath, and also added a managed heap. It also made no attempt at any magical-distributed-system-for-free stupidity.

Except now you've married what was once just an ABI to a managed heap, which means that managed heap needs to know a particular type system, carnally, including crystallising decisions such as how generics are handled, covariance and contravariance, exception specifications and unwinding, single vs multiple inheritance, closures and continuations or absence thereof, every aspect of a language's semantics that makes a given language unique, and it's baked into this "common" runtime. All this ultimately leaves you with, once you've nailed that stuff down, is syntax skins. Or horrible square-peg-in-round-hole abominations like Scala and its phalanx of synthesized interfaces on the most trivial of objects (yes I know you can probably mitigate this by not writing idiomatic Scala, but if you're not using a given language idiomatically then what the gently caress is the point). The Parrot guys also tilted at this "universal managed heap" windmill and got a faceful of dirt ten years down the line for their trouble. There's no silver bullet for some sort of high-level Esperanto for software components just like there is no silver bullet that magically cooks up distributed systems as easily as linking between DLLs (because message serialisation barely even matters, whereas distributed state and the CAP theorem are important and they're kind of really loving hard).

For a while I was daydreaming about making a toy managed OS kernel that implemented a single address space managed LISP VM, and of course every managed imperative language is just a subset of LISP really so it would totally be the most awesome thing ever. Maybe allow users to create unmanaged processes that you could exchange byte streams or memory pages with for high-performance kernels written in C like video codecs or whatever. Then I realised how stupid the whole idea was.

Bream
Feb 3, 2013

Farmer's Barket

Suspicious Dish posted:

if it isn't clear already i work at red hat on gnome3 all day so i'm happy to answer any questions you may have about this stuff.

HA HA OKAY YEAH I DO. what is up with nvidia not having any plans to support wayland with their proprietary driver according to some community rep's post on their forums as quoted on phoronix? what does this mean for the state of opengl linux in gnome? glx is no picnic, but i feel like it's a solved problem.

i'm honestly surprised to hear about any of the use cases you mentioned being problems for people, but here in the world of desktop linux tm, we are all our own edge cases. the needs i most need met are accelerated 3d and support for autodesk products. right now rhel meets or exceeds them in all categories, but the landscape is such that i have concerns about the future making my usage patterns more niche than they already are.

on your recommendation i'll give 3.10 a shot. there's another guy where i work who's been using it and seems to like it; thus far i'd just chalked it up to bad taste.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Bream posted:

HA HA OKAY YEAH I DO. what is up with nvidia not having any plans to support wayland with their proprietary driver according to some community rep's post on their forums as quoted on phoronix? what does this mean for the state of opengl linux in gnome? glx is no picnic, but i feel like it's a solved problem.

nvidia wants to be in the business of letting you use their graphics card and doing cool direct rendering, not in the business of supporting new window systems. a bunch of people from nvidia and from rh/intel/canonical all got together and talked about this at xdc, and we have a pretty good idea of what to do to allow nvidia to work together with wayland/mir.

they proposed a new egldevice abstraction layer so you can find egl devices to render on in a window system-abstract way, along with eglstreams which allow transferring an egl surface from one process to another, which solves the window compositing case, again without coding anything wayland-specific.

they haven't adopted kms since it's not complete enough an api for them and doesn't have enough features that they want to support. they've talked about making a new standard modesetting api so that we can use their cards, potentially in egl.

if you want me to go more in depth here i can, if you're curious about how direct rendering and modesetting work.

Bream posted:

i'm honestly surprised to hear about any of the use cases you mentioned being problems for people, but here in the world of desktop linux tm, we are all our own edge cases. the needs i most need met are accelerated 3d and support for autodesk products. right now rhel meets or exceeds them in all categories, but the landscape is such that i have concerns about the future making my usage patterns more niche than they already are.

on your recommendation i'll give 3.10 a shot. there's another guy where i work who's been using it and seems to like it; thus far i'd just chalked it up to bad taste.

just curious, do you happen to work at a movie studio?

Zlodo
Nov 25, 2006

Suspicious Dish posted:

calling conventions aren't high-level enough

if i take an 'int **' as an argument, that either means an array of int or an out pointer to an int.

if i return a string like a 'char *', do i need to free it afterwards?

guess what, its c that isnt high level enough, which is why the entire gnome project is such a clown show

oh but yeah ~c is fine~ lol

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
its the only language thats sane and compiles to native machine code

well i mean theres like rust. i'd love to write new gnome components in rust and i already started working on that.

but do you have any better ideas?

Bream
Feb 3, 2013

Farmer's Barket

Suspicious Dish posted:

nvidia wants to be in the business of letting you use their graphics card and doing cool direct rendering, not in the business of supporting new window systems. a bunch of people from nvidia and from rh/intel/canonical all got together and talked about this at xdc, and we have a pretty good idea of what to do to allow nvidia to work together with wayland/mir.

they proposed a new egldevice abstraction layer so you can find egl devices to render on in a window system-abstract way, along with eglstreams which allow transferring an egl surface from one process to another, which solves the window compositing case, again without coding anything wayland-specific.

they haven't adopted kms since it's not complete enough an api for them and doesn't have enough features that they want to support. they've talked about making a new standard modesetting api so that we can use their cards, potentially in egl.

if you want me to go more in depth here i can, if you're curious about how direct rendering and modesetting work.


just curious, do you happen to work at a movie studio?

i haven't looked at it in depth, but maybe correct a misconception of mine and tell me that you can get a real, say, opengl 4.4 context from egl, because i could swear that you could at most hope for gles 3 from it. i really am interested in where the future of that is going.

what would the modesetting stuff actually fix? i've read a lot of stuff complaining about how randr etc. are kludgy hacks to solve an architecturally intractable problem, but i've only ever done x stuff either via qt, motif, sdl, or freeglut.

and no, i don't work at a movie studio, maya etc. is just a hobby.

Zlodo
Nov 25, 2006

Suspicious Dish posted:

its the only language thats sane and compiles to native machine code

well i mean theres like rust. i'd love to write new gnome components in rust and i already started working on that.

but do you have any better ideas?

c++

oh wait, you're implying that c++ is not sane

because int** is saner than vector< int >, or int&

and char* is saner than string

right

Nomnom Cookie
Aug 30, 2009



im so glad my work isn't open sores. its bad enough working on a hacked together creaky 5 year old product w/o having forums people telling me what i already know, to wit, that the bad decisions of others are my fault

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Zlodo posted:

c++

oh wait, you're implying that c++ is not sane

because int** is saner than vector< int >, or int&

and char* is saner than string

right

they learn so fast

Zlodo
Nov 25, 2006

Suspicious Dish posted:

they learn so fast

shut up, C Tiny Bug Child

double sulk
Jul 2, 2010

Zlodo posted:

c++

oh wait, you're implying that c++ is not sane

because int** is saner than vector< int >, or int&

and char* is saner than string

right

kill yourself

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Bream posted:

i haven't looked at it in depth, but maybe correct a misconception of mine and tell me that you can get a real, say, opengl 4.4 context from egl, because i could swear that you could at most hope for gles 3 from it. i really am interested in where the future of that is going.

egl is just a platform-agnostic layer for glx/agl/wgl. egl is actually quite bad right now. it's just a bunch of typedefs so EGLNativeWindowType is typedef'd to "Window" on x11, and "NSWindowSurface" or w/e on os x, and "HWND" on windows, etc.

egl came around after glx/agl/wgl were finalized, so there's no motivation to add it to existing desktop apps. but on mobile platforms like android, they don't have any existing bindings, so the first egl implementations were tied together with gles/android simply so they didn't have to write a custom droidgl binding or something when egl already existed.

nvidia and amd are starting to support egl on windows and x11. apple writes their own drivers for everything so they probably won't support egl any time soon. but on mir/wayland, we don't write our own bindings so we can piggyback on android.

it's going to be full gl in that case. the way you create a gl context is you ask it for all the possible contexts it can create with eglGetConfigs/eglChooseConfig, and you find one you want, and say "cool, create that one" with eglCreateContext.

you can look for attributes with EGL_OPENGL_BIT / EGL_OPENGL_ES_BIT / EGL_OPENGL_ES2_BIT to see what you're going to get coming out of it.
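
in code that flow looks roughly like this (a minimal sketch, no error checking, and it stops short of creating an actual window surface since that part is platform-specific):

    /* sketch: pick an EGLConfig that can do desktop GL, then create a context */
    #include <EGL/egl.h>
    #include <stdio.h>

    int main(void)
    {
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        eglInitialize(dpy, NULL, NULL);

        /* only ask for configs whose renderable type includes desktop GL;
         * swap in EGL_OPENGL_ES2_BIT here if you want GLES 2 instead */
        static const EGLint attribs[] = {
            EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
            EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
            EGL_NONE
        };
        EGLConfig config;
        EGLint n_configs = 0;
        eglChooseConfig(dpy, attribs, &config, 1, &n_configs);

        /* say which API the context is for, then create it */
        eglBindAPI(EGL_OPENGL_API);
        EGLContext ctx = eglCreateContext(dpy, config, EGL_NO_CONTEXT, NULL);

        printf("matching configs: %d, context: %p\n", n_configs, (void *) ctx);

        eglTerminate(dpy);
        return 0;
    }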

Bream posted:

what would the modesetting stuff actually fix? i've read a lot of stuff complaining about how randr etc. are kludgy hacks to solve an architecturally intractable problem, but i've only ever done x stuff either via qt, motif, sdl, or freeglut.

right now modesetting is on a per monitor basis. so you ask "what are all the modes for monitor 1?", "cool, let's set that one". "what are all the modes for monitor 2?" "cool, let's set that one"...

that doesn't work when you have configurations like "we support three monitors, one with a full 1920x1200 res, and the other two at 800x600. or we support two monitors, both at a full 1920x1200 res". we're working on atomic modesetting where you can query the list of valid combinations and set all configs at once.
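
the shape of it is something like this (a sketch; the function and flag names are from libdrm's atomic api and are meant as illustration only, the object/property ids are placeholders you'd look up with drmModeObjectGetProperties() first, and there's no error handling):

    /* sketch: stage a whole display configuration, then commit it atomically */
    #include <stdint.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static void atomic_flip(int drm_fd,
                            uint32_t crtc_id, uint32_t mode_prop, uint32_t mode_blob_id,
                            uint32_t conn_id, uint32_t crtc_prop)
    {
        drmModeAtomicReq *req = drmModeAtomicAlloc();

        /* stage every change: which mode each crtc gets, which crtc each
         * connector is wired to, plane framebuffers, and so on */
        drmModeAtomicAddProperty(req, crtc_id, mode_prop, mode_blob_id);
        drmModeAtomicAddProperty(req, conn_id, crtc_prop, crtc_id);

        /* TEST_ONLY asks "would this whole combination work?" without touching
         * the hardware; drop the flag to actually apply the configuration */
        drmModeAtomicCommit(drm_fd, req, DRM_MODE_ATOMIC_TEST_ONLY, NULL);
        drmModeAtomicCommit(drm_fd, req, DRM_MODE_ATOMIC_ALLOW_MODESET, NULL);

        drmModeAtomicFree(req);
    }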

the other thing right now that's missing in kms is sophisticated hw overlay support. a lot of chips now have a way of doing simple hardware rendering/scaling so you can say "please display this yuv 640x480 buffer and scale it up to scanout res, and put a black fill around the edges so it's letterboxed" without doing any scaling or letterboxing or rendering on the cpu except decoding right into the yuv buffer.

pro nvidia chips can even do color management and gamma correction on-board.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Zlodo posted:

shut up, C Tiny Bug Child

i've never been so honored

compuserved
Mar 20, 2006

Nap Ghost

Zlodo posted:

shut up, C Tiny Bug Child

tiny c child :3:

compuserved fucked around with this message at 00:08 on Oct 29, 2013

Bream
Feb 3, 2013

Farmer's Barket

Suspicious Dish posted:

it's going to be full gl in that case. the way you create a gl context is you ask it for all the possible contexts it can create with eglGetConfigs/eglChooseConfig, and you find one you want, and say "cool, create that one" with eglCreateContext.

you can look for attributes with EGL_OPENGL_BIT / EGL_OPENGL_ES_BIT / EGL_OPENGL_ES2_BIT to see what you're going to get coming out of it.

that makes considerably more sense than whatever i had going on in my head. okay, i'll subscribe to your newsletter. presuming i was working in c++, would you recommend gtk+ or gtkmm?
