Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
Is there a term for a generic that has had its generic types parameterized but has not been instantiated or called with arguments? I'm juggling that around with reflection a lot, and I'm finding naming the intermediate stuff confusing if I say it's "parameterized." I've been using "realized" for this case, but I pulled that out of my rear end.

Potassium Problems
Sep 28, 2001
Are you talking about an open generic? Like typeof(List<>)?

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

List<T> is a parametrically polymorphic type.

The process of assigning a specific type to T, e.g. String, is sometimes called the monomorphization of List<T>.

So List<String> is a monomorphic type, but so is String and every other non-generic type, so it's not a distinctive term. I suppose the correct term would be "monomorphed", although I've never seen it used.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!

Potassium Problems posted:

Are you talking about an open generic? like typeof(List<>)

I'll just lay it out but it's going to be ugly.

When I'm invoking .NET methods from my interpreter, I have to figure out which method I'm invoking based on arguments. This is now muddier with generics. I'm implementing it such that the first arguments become the generic arguments. Internally, what happens is that I pin down the generic, construct it using the types of those arguments, and then I'm holding... something. In long English, it's, like, an instantiation of the generic type with all its generic parameters filled in, but the arguments are still unsatisfied. However, using any variation of that makes for a lovely variable name, and I'm trying to give the intermediate steps something that means a thing without taking a gajillion characters. Stating it's "parameterized" or whatever in this case is bad because I'm about to parameterize the actual parameters too.
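
In code terms, the intermediate thing looks roughly like this (Target/SomeCall are made-up stand-ins for what the interpreter binds to):

```csharp
using System;
using System.Reflection;

class Target
{
    // Stand-in for the kind of generic method the interpreter calls.
    public static string SomeCall<T>(T value) => typeof(T).Name + ":" + value;
}

class Sketch
{
    static void Main()
    {
        // The generic method definition: type parameters still free.
        MethodInfo definition = typeof(Target).GetMethod("SomeCall");

        // Pin down the generic using the first argument's type.
        MethodInfo pinned = definition.MakeGenericMethod(typeof(int));

        // `pinned` is the "...something": generic parameters filled in,
        // arguments still unsatisfied until Invoke.
        Console.WriteLine(pinned.Invoke(null, new object[] { 42 })); // Int32:42
    }
}
```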

NihilCredo posted:

List<T> is a parametrically polymorphic type.

The process of assigning a specific type to T, e.g. String, is sometimes called the monomorphization of List<T>.

So List<String> is a monomorphic type, but so is String and every other non-generic type, so it's not a distinctive term. I suppose the correct term would be "monomorphed", although I've never seen it used.

Yeah sure, I'll take that unless somebody drums up something else. That at least googles in the ballpark, so the cockroaches that inherit the planet and see this on GitHub should be able to deduce what I meant. As for why I did it, nobody knows. :ghost:

Chrungka
Jan 27, 2015
How about using genericType exclusively for closed generics and genericTypeDefinition for open ones? It might not be 100% correct, but at least it's consistent with .NET reflection naming.

code:
var genericType = typeof(List<>).MakeGenericType(typeof(string));            // closed: List<string>
var genericTypeDefinition = typeof(List<string>).GetGenericTypeDefinition(); // open: List<>
Or just go with open and closed type/method. I think most of the cockroaches should be familiar with those two concepts.
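
To make the open/closed split concrete, a quick sanity check:

```csharp
using System;
using System.Collections.Generic;

class OpenClosedDemo
{
    static void Main()
    {
        Type open = typeof(List<>);                          // open: generic type definition
        Type closed = open.MakeGenericType(typeof(string));  // closed: List<string>

        Console.WriteLine(open.IsGenericTypeDefinition);     // True
        Console.WriteLine(closed.IsGenericTypeDefinition);   // False
        Console.WriteLine(closed.ContainsGenericParameters); // False - nothing left to fill in
    }
}
```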

Chrungka fucked around with this message at 00:53 on Mar 10, 2020

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!

Chrungka posted:

How about using genericType exclusively for closed generics and genericTypeDefinition for open ones? It might not be 100% correct, but at least it's consistent with .NET reflection naming.

code:
var genericType = typeof(List<>).MakeGenericType(typeof(string));            // closed: List<string>
var genericTypeDefinition = typeof(List<string>).GetGenericTypeDefinition(); // open: List<>

I preferred monomorphized because it's a very different word, distinct from tacking on "definition" or whatever. Nonetheless, I wanted to come back and emphasize this post because I did just get bit by that naming convention while polishing the code. I was testing for IsGenericMethod to determine if I needed special handling. However, both the polymorphic and the monomorphic methods evaluate true: both SomeCall<T> and SomeCall<int> pass IsGenericMethod. I had to instead test for IsGenericMethodDefinition to make sure I was dealing with a generic that was still juggling runtime type information. In that case, SomeCall<T> is true as a definition but SomeCall<int> is false.
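
In other words, a minimal repro of the gotcha:

```csharp
using System;
using System.Reflection;

class Host
{
    public static T SomeCall<T>(T x) => x;
}

class GenericFlagsDemo
{
    static void Main()
    {
        MethodInfo open = typeof(Host).GetMethod("SomeCall");    // SomeCall<T>
        MethodInfo closed = open.MakeGenericMethod(typeof(int)); // SomeCall<int>

        Console.WriteLine(open.IsGenericMethod);             // True
        Console.WriteLine(closed.IsGenericMethod);           // True  - the trap
        Console.WriteLine(open.IsGenericMethodDefinition);   // True
        Console.WriteLine(closed.IsGenericMethodDefinition); // False - what I actually wanted
    }
}
```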

A little bit of trivia for anybody that ever wades through this peat bog after me.

brap
Aug 23, 2004

Grimey Drawer
List<string> is a constructed type, List<T> is an open type. List<T> is the original definition of List<string>.

Red Mike
Jul 11, 2011
There is a page with terminology in the documentation. "Open" isn't explicitly used there, but it's what the rest of the documentation tends to use. I think the same terminology applies to method definitions, though that isn't explicitly stated.

Couple more quirks to be aware of from just that page:

quote:

It is important to note that a method is not generic just because it belongs to a generic type, or even because it has formal parameters whose types are the generic parameters of the enclosing type.

quote:

The common language runtime considers nested types to be generic, even if they do not have generic type parameters of their own.

I recommend you read this: https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/generics/generics-and-reflection
Followed by this (linked from the previous one), this one has the breakdown of terminology despite it being on an API ref article: https://docs.microsoft.com/en-us/dotnet/api/system.type.isgenerictype?view=netframework-4.8#remarks

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!

Red Mike posted:

There is a page with terminology in the documentation. "Open" isn't explicitly used in there, but it's what the rest of the documentation tends to use. I think the same terminology is used for method definitions, however that isn't really explicitly stated.

I recommend you read this: https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/generics/generics-and-reflection
Followed by this (linked from the previous one), this one has the breakdown of terminology despite it being on an API ref article: https://docs.microsoft.com/en-us/dotnet/api/system.type.isgenerictype?view=netframework-4.8#remarks

I think I saw "open types" and "closed types" used throughout, so I'll see if it makes sense to use those terms instead. One thing I don't like about "monomorph" is that "polymorphic" as the alternative implies run-time abstraction.

It also looks like I should use ContainsGenericParameters instead of IsGenericMethod/IsGenericMethodDefinition.

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION

Rocko Bonaparte posted:

I think I saw "open types" and "closed types" used throughout, so I'll see if it makes sense to use those terms instead. One thing I don't like about "monomorph" is that "polymorphic" as the alternative implies run-time abstraction.

It also looks like I should use ContainsGenericParameters instead of IsGenericMethod/IsGenericMethodDefinition.

Open/closed is pretty common terminology, so you'll do yourself a favour by adopting that phrasing, since many StackOverflow questions, DI library documentation, etc. all use it. (e.g. https://simpleinjector.readthedocs.io/en/latest/advanced.html)

Mata
Dec 23, 2003
This is maybe more of a reactive programming question, but I'm using Rx.NET, and I've got a background thread doing some heavy work and pushing its data out via BehaviorSubjects, which the GUI thread subscribes to. I'd like to do the actual GUI updating on the GUI thread, of course, and as far as I can tell the ObserveOn operator is supposed to do this, but how?
Am I supposed to write my own IScheduler implementation? My current setup, which doesn't work, looks like this:
code:
public void UpdateGUIWithDataFromBackgroundThread(IObservable<ViewModel> someObservableFromABackgroundThread) {
	var sub = someObservableFromABackgroundThread
		.Select(vm => doSomeHeavyWork(vm))
		.ObserveOn(Scheduler.CurrentThread)
		.Do(vm => updateGUI(vm))
		.Subscribe();
}
I want everything downstream of the ObserveOn operator to be done on the GUI thread (where the Subscribe method is called) and everything above ObserveOn to be done wherever the Observable's next value is coming from (in my case, always the background thread). Instead, it seems like the whole thing is executed on the background thread. It's not clear whether the ObserveOn operator is having any effect with the Scheduler.CurrentThread parameter, but it's certainly not doing what I want.
The IScheduler interface seems a bit clumsy and I can't find any good docs about it.
It would be super cool if I could just use the Buffer operator and pass in an observable stream that generates values on the GUI thread, but I guess it's going to cause concurrency issues if you have a bufferCloseBySelector that's being fed values from a different thread from the observable it's buffering. Anyone done this kind of thing before or can point me in the right direction?
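
For reference, the pattern I keep seeing suggested (untested sketch, same made-up ViewModel/doSomeHeavyWork/updateGUI names as above) is to hand ObserveOn the GUI thread's SynchronizationContext rather than Scheduler.CurrentThread:

```csharp
// Untested sketch: SynchronizationContext.Current must be captured on the
// GUI thread (it's null on a plain background thread), so call this from
// the GUI thread. WPF also has DispatcherScheduler / ObserveOnDispatcher()
// in System.Reactive for the same purpose.
public void UpdateGUIWithDataFromBackgroundThread(IObservable<ViewModel> source)
{
    var sub = source
        .Select(vm => doSomeHeavyWork(vm))          // runs on the producing thread
        .ObserveOn(SynchronizationContext.Current)  // everything below hops to the GUI thread
        .Subscribe(vm => updateGUI(vm));
}
```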

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

Does anybody know if it's possible to configure ASP.NET Core (with Newtonsoft.Json, not System.Text.Json) in such a way that:

1) by default, missing fields in a request body return 400, unless that particular field is decorated with some kind of [NotRequired] attribute

2) by default, unrecognized fields in a request body return 400, unless that particular class is decorated with some kind of [AdditionalPropertiesAllowed] attribute

Most documentation seems to push you towards doing the exact opposite, i.e. have liberal defaults and then apply restrictions on specific objects.

But I want the first implementation of a client to be forced to be 100% conformant, and then retain the ability to selectively and judiciously apply backwards-compatible changes to the API.

Chrungka
Jan 27, 2015
1) This might be doable by implementing IContractResolver. I've used a custom ContractResolver to implement de/serialization of Optional<T>.
2) Look into https://www.newtonsoft.com/json/help/html/SerializationSettings.htm#MissingMemberHandling and its interaction with https://www.newtonsoft.com/json/help/html/DeserializeExtensionData.htm

So you might end up with something like this:
code:
public void ConfigureServices(IServiceCollection services)
{
     services.AddControllers()
        .AddNewtonsoftJson(options =>
        {
            options.SerializerSettings.MissingMemberHandling = Newtonsoft.Json.MissingMemberHandling.Error;
            options.SerializerSettings.ContractResolver = RequiredContractResolver.CreateReplacement(options.SerializerSettings.ContractResolver);
        });
}

public class RequiredContractResolver : CamelCasePropertyNamesContractResolver
{
    public static RequiredContractResolver Instance { get; } = new RequiredContractResolver();

    public static RequiredContractResolver CreateReplacement(IContractResolver original)
    {
        if (original is DefaultContractResolver defaultContractResolver)
            return new RequiredContractResolver()
            {
                IgnoreIsSpecifiedMembers = defaultContractResolver.IgnoreIsSpecifiedMembers,
                IgnoreSerializableAttribute = defaultContractResolver.IgnoreSerializableAttribute,
                IgnoreSerializableInterface = defaultContractResolver.IgnoreSerializableInterface,
                IgnoreShouldSerializeMembers = defaultContractResolver.IgnoreShouldSerializeMembers,
                NamingStrategy = defaultContractResolver.NamingStrategy,
                SerializeCompilerGeneratedMembers = defaultContractResolver.SerializeCompilerGeneratedMembers
            };
        else
            return Instance;
    }

    protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
    {
        JsonProperty property = base.CreateProperty(member, memberSerialization);
        property.Required = Required.Always; // or Required.AllowNull if you want to allow nullable based on member.MemberType
        return property;
    }
}

Chrungka fucked around with this message at 16:11 on Mar 13, 2020

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

Thanks, that's a pretty good starting place.

For my #1 requirement, I guess I would check the MemberInfo for some specific attribute and then set .Required. I'm concerned that .Required doesn't have a 'not required' value - it's only got Default, which sounds like it would not override a global setting - or it might just be legacy naming. I'll need to test it.

For #2 though, I can't find a place to locally override the MissingMemberHandling setting. It feels like it should be in JsonObjectContract, but it isn't there.
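
The closest thing I've found for #2 is to flip it to the class level: with MissingMemberHandling.Error as the global default, putting a [JsonExtensionData] member on a class seems to act as the per-class opt-out, since unmatched fields get routed into the extension-data bag instead of erroring. Untested sketch (class name made up):

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Globally strict (MissingMemberHandling.Error rejects unknown fields),
// but this class effectively gets [AdditionalPropertiesAllowed]:
// unmatched fields land in ExtraFields rather than triggering a 400.
public class LenientCustomer
{
    public string Name { get; set; }

    [JsonExtensionData]
    public IDictionary<string, JToken> ExtraFields { get; set; }
}
```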

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
Big announcement: https://devblogs.microsoft.com/dotnet/announcing-net-5-0-preview-1/

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I think I've hit the first point in my life where multimethods might be a thing. I'm trying to make my interpreter work directly with .NET types too. I was able to use dynamic to fairly compactly handle basic arithmetic operators that were getting mixed between interpreter and .NET types. However, subscripting is a whole other beast because the handling is so varied. The containers could be:

List types:
Internal list type
.NET Arrays
ILists
Types that implement the index operator

Dictionary types:
Internal dictionary type
.NET dictionary type

The keys could be:
Lists: Internal number type, or anything that can be cast to a .NET integer
Dictionary Types: Just about anything, really.

Routing between the different permutations with conditionals is going to quickly get me into some Pepe Silvia logic, so I'm hoping somebody knows a way to organize it so it isn't so gross. Actually reflecting on everything isn't particularly hard; implementing any particular pair isn't hard either, but adding on the permutations just makes it disgusting!
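
One way to fake multimethods in C# without the conditional ladder: cast both operands to dynamic and let the runtime binder pick the most specific overload, so each container/key pairing is one small method. Hypothetical sketch (Subscript/GetImpl are made-up names):

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

static class Subscript
{
    // Single entry point; runtime overload resolution does the routing.
    public static object Get(object container, object index)
        => GetImpl((dynamic)container, (dynamic)index);

    static object GetImpl(Array a, int i) => a.GetValue(i);  // .NET arrays
    static object GetImpl(IList l, int i) => l[i];           // ILists
    static object GetImpl(IDictionary d, object k) => d[k];  // dictionaries
    static object GetImpl(object c, object k) =>             // fallback
        throw new NotSupportedException($"can't subscript {c.GetType()} with {k.GetType()}");
}

class MultimethodDemo
{
    static void Main()
    {
        Console.WriteLine(Subscript.Get(new[] { 10, 20, 30 }, 1));                        // 20
        Console.WriteLine(Subscript.Get(new List<string> { "a", "b" }, 0));               // a
        Console.WriteLine(Subscript.Get(new Dictionary<string, int> { ["x"] = 5 }, "x")); // 5
    }
}
```

Adding a new container or key type is then just another GetImpl overload rather than another branch.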

Boz0r
Sep 7, 2006
The Rocketship in action.
I've moved over to generating client proxies with NSwag in MSBuild, but my XML comments aren't included in the swagger definition. How do I fix this? I'm using the Web API via reflection swagger generator.

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

Boz0r posted:

I've moved over to generating client proxies with NSwag in MSBuild, but my XML comments aren't included in the swagger definition. How do I fix this? I'm using the Web API via reflection swagger generator.

Disclaimer: I'm using the NSwag middleware so I don't know if the generation via NSwagStudio + reflection even supports XML comments.

Are you outputting the .xml files along with the .dlls? It's a project option to publish those, and the XML comments are printed there.

Boz0r
Sep 7, 2006
The Rocketship in action.

NihilCredo posted:

Disclaimer: I'm using the NSwag middleware so I don't know if the generation via NSwagStudio + reflection even supports XML comments.

Are you outputting the .xml files along with the .dlls? It's a project option to publish those, and the XML comments are printed there.

Yeah, we previously used Swashbuckle to expose the swagger API and NSwag to generate proxies from another project, which was a hassle, so I wanted to do it at compile time instead. That method included the XML comments.

fankey
Aug 31, 2001

I'm trying to support user-supplied custom fonts in my application. The fonts will be embedded in the document as a byte[], and it can't be required that they be installed through regular Windows font management. I need to create a Typeface based on the font, which will be used directly via DrawingContext calls as well as assigned to regular Controls. As far as the user is concerned, the fonts will be specified using the same syntax as the CSS @font-face directive, where the src() is the file name used when the font was embedded in the document. I'm running into two issues:

1. WPF really wants the fonts to be either resources or files on disk. I haven't figured out a way to make it load from a byte[]. The best I've found is saving to a temp file which I don't like doing if it's not absolutely necessary.

2. I can't figure out how to load the font via filename. Even if I pass in a file to Fonts.GetTypefaces, it returns all the fonts in the same directory as the file, with their names resolved to the names defined in the font files themselves. This would be great for populating a dropdown with the available fonts, but that's not what I'm trying to do - I want to get the Typeface based on the filename, not what's defined in the font file. For added confusion, if you pass in just a directory to GetTypefaces, nothing is returned, but you can pass in a file that doesn't exist, and as long as it ends with .ttf it'll find every font in the folder.

EDIT : looks like I can create a GlyphTypeface based on a specific file but I'm not seeing a way to get a Typeface from a GlyphTypeface.

EDIT2 : I've come up with what seems like a possibly horrible and fragile solution
  • Save all the byte[] to temporary files in a given path
  • load the GlyphTypeface based on each file name
  • from the glyph get the name of the FontFamily and create a family->path dictionary
  • use Fonts.GetTypefaces(temppath) to load all the typefaces
  • look up the path based on FontFamily, now create a path->Typeface dictionary
In my extensive testing of 2 fonts it seems to work but I'm open to any other better ideas that don't seem as hacky.
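
For reference, the bullet steps above in code (tempPath is wherever the temp files went; only lightly tested, so treat it as a sketch):

```csharp
// Map each font file's own family name back to its path...
var familyToPath = new Dictionary<string, string>();
foreach (var file in Directory.GetFiles(tempPath, "*.ttf"))
{
    var glyph = new GlyphTypeface(new Uri(file));
    familyToPath[glyph.FamilyNames.Values.First()] = file;
}

// ...then join against the Typefaces WPF loads from the same folder,
// so we can later look up a Typeface by the original filename.
var pathToTypeface = new Dictionary<string, Typeface>();
foreach (var tf in Fonts.GetTypefaces(tempPath))
{
    var family = tf.FontFamily.FamilyNames.Values.First();
    if (familyToPath.TryGetValue(family, out var path))
        pathToTypeface[path] = tf;
}
```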

fankey fucked around with this message at 17:03 on Mar 20, 2020

raminasi
Jan 25, 2005

a last drink with no ice

fankey posted:

I'm trying to support user-supplied custom fonts in my application. The fonts will be embedded in the document as a byte[], and it can't be required that they be installed through regular Windows font management. I need to create a Typeface based on the font, which will be used directly via DrawingContext calls as well as assigned to regular Controls. As far as the user is concerned, the fonts will be specified using the same syntax as the CSS @font-face directive, where the src() is the file name used when the font was embedded in the document. I'm running into two issues:

1. WPF really wants the fonts to be either resources or files on disk. I haven't figured out a way to make it load from a byte[]. The best I've found is saving to a temp file which I don't like doing if it's not absolutely necessary.

2. I can't figure out how to load the font via filename. Even if I pass in a file to Fonts.GetTypefaces, it returns all the fonts in the same directory as the file, with their names resolved to the names defined in the font files themselves. This would be great for populating a dropdown with the available fonts, but that's not what I'm trying to do - I want to get the Typeface based on the filename, not what's defined in the font file. For added confusion, if you pass in just a directory to GetTypefaces, nothing is returned, but you can pass in a file that doesn't exist, and as long as it ends with .ttf it'll find every font in the folder.

EDIT : looks like I can create a GlyphTypeface based on a specific file but I'm not seeing a way to get a Typeface from a GlyphTypeface.

EDIT2 : I've come up with what seems like a possibly horrible and fragile solution
  • Save all the byte[] to temporary files in a given path
  • load the GlyphTypeface based on each file name
  • from the glyph get the name of the FontFamily and create a family->path dictionary
  • use Fonts.GetTypefaces(temppath) to load all the typefaces
  • look up the path based on FontFamily, now create a path->Typeface dictionary
In my extensive testing of 2 fonts it seems to work but I'm open to any other better ideas that don't seem as hacky.

This solution just goes around WPF to load the font family, which might look kind of scary but should work fine in practice. Have you investigated that kind of approach?

fankey
Aug 31, 2001

raminasi posted:

This solution just goes around WPF to load the font family, which might look kind of scary but should work fine in practice. Have you investigated that kind of approach?

Yeah, I tried that approach and I couldn't get it to work - all the fonts ended up being Arial. It's possible I was doing something wrong; in my case I need to end up with a Typeface, not a FontFamily, which is what that approach produces. Even if it worked, I'm not sure that approach would let me differentiate between different weights and styles of the same Family that came from different font files. I'll play around more with it, since if it works it's a lot better than saving files out to disk - especially with awesome bugs in GlyphTypeface that require every font to be in its own directory.

raminasi
Jan 25, 2005

a last drink with no ice

fankey posted:

Yeah, I tried that approach and I couldn't get it to work - all the fonts ended up being Arial. It's possible I was doing something wrong; in my case I need to end up with a Typeface, not a FontFamily, which is what that approach produces. Even if it worked, I'm not sure that approach would let me differentiate between different weights and styles of the same Family that came from different font files. I'll play around more with it, since if it works it's a lot better than saving files out to disk - especially with awesome bugs in GlyphTypeface that require every font to be in its own directory.

With the enormous caveat that I've never done anything like this myself, is this Typeface constructor not what you want? Or does that always yield Arial?

fankey
Aug 31, 2001

raminasi posted:

With the enormous caveat that I've never done anything like this myself, is this Typeface constructor not what you want? Or does that always yield Arial?

Yep, that's the one I was using. For whatever reason it just doesn't work. I see other people complaining about PrivateFontCollection not working in WPF - perhaps it's .NET version related or something.

Boz0r
Sep 7, 2006
The Rocketship in action.
I have a console application calling a Core WebApi calling a Framework WebApi. Both WebApis have NSwag generated proxies. We'd like to stamp each call with an ID so we can trace the calls all the way through. What's the best way of doing this? One of our guys really wants to use the built-in IoC container, but I don't know if that's the best solution.

Mr Shiny Pants
Nov 12, 2012
Create a Guid and stuff it in the headers and pick it up on the receiving end? That's what I used in the past.
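
e.g. a DelegatingHandler on the client side so nobody has to remember to set it by hand (the header name is just a common convention, not a standard):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Stamps every outgoing request with a correlation id, creating one only
// if the caller didn't already set it; the receiving service reads the
// same header and passes it along to the next hop.
public class CorrelationIdHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (!request.Headers.Contains("X-Correlation-ID"))
            request.Headers.Add("X-Correlation-ID", Guid.NewGuid().ToString());
        return base.SendAsync(request, cancellationToken);
    }
}
```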

Opulent Ceremony
Feb 22, 2012
Does anyone here have experience with Azure Event Grid? I'm having a weird problem.

My setup is: an Azure Media Services account and an Azure Function App, with two instances of each, each set of Media Service/Function App representing a different environment, let's call them environment 1 and 2. They are all a part of the same Azure Subscription, Location, Resource Group, and for the Media Services, Storage Account.

My goal was, per environment, to create an Event Grid subscription from the Media Service to a method on the Function App. With environment 1 (env 1 Media Service to env 1 Function App), this works perfectly. With environment 2, this fails.

My initial idea was to use the Azure web shell CLI to investigate the details of those existing subscriptions to see if env 2's was correct, but following along with (https://docs.microsoft.com/en-us/cli/azure/eventgrid/event-subscription?view=azure-cli-latest#az-eventgrid-event-subscription-list), the tool doesn't appear to work, as none of my existing subscriptions (including the one that works) show up there. It will show me Event Grid subscriptions whose endpoint is a webhook (used when testing locally), but the Function App endpoint doesn't show up at all. The subscriptions show up in the Azure Portal, but it relays a very limited amount of info once they're already created.

I've tried simple things like re-creating the env2 Media Service and deleting and re-creating the Event Grid subscriptions a bunch of times.

My investigation so far has yielded these results:

Does env 2 Media Service publish the events? Yes, I can see the events displayed on the Service's Time Graph, and I can receive the correct events if I add another webhook subscription to my local system; they just aren't getting to the env 2 Function App.

Does env 2 Function App receive any Event Grid events? Yes. I was initially filtering for just JobFinished and JobErrored in the subscription, but after re-creating the subscription without event type filtering (of which there are like 20 different kinds), env 2 Function App receives those others, just not the two I care about.

Basically, there isn't (or shouldn't be) any real difference in how env 1 and env 2 are set up and connected, yet one works where the other doesn't, and I'm going a little crazy.

EssOEss
Oct 23, 2006
128-bit approved
I can't help you but wow, Azure Media Services is still alive? I remember when they fired everybody but like 1 developer. What do you use it for? What are its strengths?

Boz0r
Sep 7, 2006
The Rocketship in action.

Mr Shiny Pants posted:

Create a Guid and stuff it in the headers and pick it up on the receiving end? That's what I used in the past.

You're right, of course, but our architecture is over-engineered and the customer wants to use the IoC container for it, so the issue was to get it from the controller to the call to the next proxy. I got it working, though.


I have a new problem where I'd like your help to make our project a little less poo poo. We have a web service that acts as a middle man for communication with CRM 2011 (being replaced by D365 soon). The web service is built around a classic 3-layer architecture:

API layer, which contains controllers that just hold references and call through to the
Business Logic layer, containing business managers that just hold references and call through to the
Service Access Layer, containing the reference to CRM. Only this layer knows CRM's data structure and is responsible for mapping to DTOs, which get returned all the way back to the caller.

The API layer was getting more and more controllers and methods with unique DTOs, so one of their architects got the idea of making more standard DTOs per entity, with varying degrees of fields and references. So we'd end up with something like CustomerBasic, CustomerNormal, CustomerFancy, sometimes more, sometimes less.

It's not my decision, but I thought it was a terrible idea. I'd much rather have a DTO per use case so I can pick exactly the fields I need - ideally with some sort of query language so we wouldn't need all those endpoints. The problem with this, though, is that only the SAL can know CRM's data model, it has to work with Swagger, and it needs to work with NSwag-generated proxies.

What are best practices for these types of setups? Is the entire thing trash?

Opulent Ceremony
Feb 22, 2012

EssOEss posted:

I can't help you but wow, Azure Media Services is still alive? I remember when they fired everybody but like 1 developer. What do you use it for? What are its strengths?

Well this sure doesn't give me a lot of confidence! We use it to encode and DASH stream our own video content.

Mr Shiny Pants
Nov 12, 2012

Boz0r posted:

You're right, of course, but our architecture is over-engineered and the customer wants to use the IoC container for it, so the issue was to get it from the controller to the call to the next proxy. I got it working, though.


I have a new problem where I'd like your help to make our project a little less poo poo. We have a web service that acts as a middle man for communication with CRM 2011 (being replaced by D365 soon). The web service is built around a classic 3-layer architecture:

API layer, which contains controllers that just hold references and call through to the
Business Logic layer, containing business managers that just hold references and call through to the
Service Access Layer, containing the reference to CRM. Only this layer knows CRM's data structure and is responsible for mapping to DTOs, which get returned all the way back to the caller.

The API layer was getting more and more controllers and methods with unique DTOs, so one of their architects got the idea of making more standard DTOs per entity, with varying degrees of fields and references. So we'd end up with something like CustomerBasic, CustomerNormal, CustomerFancy, sometimes more, sometimes less.

It's not my decision, but I thought it was a terrible idea. I'd much rather have a DTO per use case so I can pick exactly the fields I need - ideally with some sort of query language so we wouldn't need all those endpoints. The problem with this, though, is that only the SAL can know CRM's data model, it has to work with Swagger, and it needs to work with NSwag-generated proxies.

What are best practices for these types of setups? Is the entire thing trash?

This sounds like something GraphQL would/can fix. Or something like Clojure, to be honest. I don't know much about Swagger - I know it generates REST API documentation or something, but I've never actually used it. There are some Swagger-to-GraphQL tools, it seems, but I don't think it's something you'd do on a Friday.

As for the Clojure thing: it doesn't work with classes as such; you just build maps of key-value pairs, which were especially invented for stuff like you describe - we have an entity, and sometimes we want more data from it and sometimes less, but to us it's essentially still the same thing. I thought it was really elegant. I've never used Clojure, but I've been watching some talks by Rich Hickey, and it's a good way to understand some high-level thinking about software systems. Your post reminded me of it. :)

EssOEss
Oct 23, 2006
128-bit approved

Opulent Ceremony posted:

Well this sure doesn't give me a lot of confidence! We use it to encode and DASH stream our own video content.

I see. We used it off and on since it was in private preview, and back in 2016 or so they fired just about everyone who hadn't already left for AWS. Even the private forums where Media Services people used to hang out became a ghost town, with a desperate promoter posting some tutorial videos once a month at best.

Microsoft tried hard to make streaming work, but instead of focusing on simplicity and throughput they kept trying to sell the product via pointless bells and whistles. They had a few developers with good ideas, but the undertaking seemed fairly doomed from the start due to the prioritization: a lack of basics like supporting different input formats (do they actually accept MPEG2-TS live streams now?) while instead spending on things such as automated transcripts (which remain, to this day, utter garbage). Because "machine learning" was the hot buzzword, and hot buzzwords are the only way they know to sell stuff.

Sad stuff. Then again AWS also did not really pan out as well as they hoped and had to resort to buying Elemental to really get their game going. For whatever reason, the encoding/streaming ecosystem is rife with companies that hire smart developers but completely butcher it when it comes to making a good product. I say that looking at the announcement just posted in Slack about my own employer's 4th attempt at a media encoding service going live. Yet another service that tries hard not to try hard...

EssOEss fucked around with this message at 14:36 on Mar 27, 2020

Opulent Ceremony
Feb 22, 2012

I appreciate you sharing your experiences with various media services. The main factor in us choosing Azure was simply that we already have a bunch of other Azure services in place that are a good fit and working as expected. This latest experience is a pain, though, and I'm sad to hear it sounds like it doesn't get a ton of support.

adaz
Mar 7, 2009

Boz0r posted:


It's not my decision, but I thought it was a terrible idea, and I'd much rather have a DTO per use case so I can pick exactly the fields I need, but I'd rather have some sort of query language so we didn't need all those endpoints. The problem with this, though, is that only the SAL can know CRM's data model; it has to work with Swagger and needs to work with NSwag-generated proxies.

What are best practices for these types of setups? Is the entire thing trash?

The entire thing is trash, yes, but oftentimes we don't get a choice in what we have to work with, so here are some suggestions!

1.) Add another library project that just contains your models. If you really want to be ridiculous (and based on your architecture you might need to be) it can just contain the abstract classes or interfaces and leave concrete implementations up to each real project. So your Controllers / SAL / Business logic all have dependencies on _it_, and it has references to nothing.
2.) Change your SAL layer to accept a type parameter <T> and rely on reflection to populate the DTO. Your SAL becomes an ORM mapper for the upstream projects. Your controller supplies the type it wants to your SAL, and your SAL relies on reflection and discovery to populate it with data from your CRM. This is tough both practically and for performance reasons, but it's a common solution to this problem and meets some of your goals, is more flexible, etc.
3.) Use something like AutoMapper (NOTE: read Jimmy's best practice guidelines _and do not violate them!!_) to map your SAL DTO -> Business DTO -> Controller DTOs.
In general I would avoid this because it's a whole lot of object creation/destruction for absolutely (0) benefit, but you might not have a choice depending on your architect.
4.) As another poster mentioned, something like GraphQL or Elasticsearch/Solr is basically built to be a generic data query language around existing data sources. But that's a complete rebuild of your app, which is probably not practical!
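To make option 2 concrete, here's a rough sketch of a reflection-based mapper. The types here (CrmContact, ContactDto) are made-up placeholders, not your real entities, and this is the naive version — take it as a starting point, not a drop-in:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Placeholder types purely for illustration
public class CrmContact { public string Name { get; set; } public int Age { get; set; } }
public class ContactDto { public string Name { get; set; } public int Age { get; set; } }

public static class SalMapper
{
    // Copy same-named, assignable properties from the source entity onto a new T.
    // Naive version: real code would cache the PropertyInfo lookups per type pair
    // instead of re-reflecting on every call.
    public static T Map<T>(object source) where T : new()
    {
        var target = new T();
        var sourceProps = source.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);
        foreach (var targetProp in typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance))
        {
            if (!targetProp.CanWrite) continue;
            var match = sourceProps.FirstOrDefault(p =>
                p.Name == targetProp.Name && targetProp.PropertyType.IsAssignableFrom(p.PropertyType));
            if (match != null)
                targetProp.SetValue(target, match.GetValue(source));
        }
        return target;
    }
}
```

So the controller asks the SAL for a ContactDto, the SAL fetches the CRM entity, and Map<ContactDto>(entity) does the population.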

adaz fucked around with this message at 21:58 on Mar 27, 2020

Boz0r
Sep 7, 2006
The Rocketship in action.

adaz posted:

The entire thing is trash, yes, but oftentimes we don't get a choice in what we have to work with, so here are some suggestions!

1.) Add another library project that just contains your models. If you really want to be ridiculous (and based on your architecture you might need to be) it can just contain the abstract classes or interfaces and leave concrete implementations up to each real project. So your Controllers / SAL / Business logic all have dependencies on _it_, and it has references to nothing.
2.) Change your SAL layer to accept a type parameter <T> and rely on reflection to populate the DTO. Your SAL becomes an ORM mapper for the upstream projects. Your controller supplies the type it wants to your SAL, and your SAL relies on reflection and discovery to populate it with data from your CRM. This is tough both practically and for performance reasons, but it's a common solution to this problem and meets some of your goals, is more flexible, etc.
3.) Use something like AutoMapper (NOTE: read Jimmy's best practice guidelines _and do not violate them!!_) to map your SAL DTO -> Business DTO -> Controller DTOs.
In general I would avoid this because it's a whole lot of object creation/destruction for absolutely (0) benefit, but you might not have a choice depending on your architect.
4.) As another poster mentioned, something like GraphQL or Elasticsearch/Solr is basically built to be a generic data query language around existing data sources. But that's a complete rebuild of your app, which is probably not practical!

At the moment we have a library that generates IQueryable sets with typed entities for interaction with CRM. Would it be realistic to build a DTO library that maps DTOs to the entities we need, and be able to send an IQueryable expression as a REST call?

adaz
Mar 7, 2009

Boz0r posted:

At the moment we have a library that generates IQueryable sets with typed entities for interaction with CRM. Would it be realistic to build a DTO library that maps DTOs to the entities we need, and be able to send an IQueryable expression as a REST call?

Yes; the hard part is then mapping REST -> IQueryable. REST is about resources and state, and defers how to handle querying, sorting, pagination, etc. basically solely to the implementer. With that said, sure, many frameworks expose REST endpoints that under the hood rely on IQueryable to sort and get the data that the REST endpoint exposes. However, if the point is to expose IQueryable<T> to the outside world from the CRM, why not just let them query the CRM, which surely already has a REST interface, or index the data and expose it as Lucene/Elasticsearch queries, which are industry-standard-ish and people are more used to? There are also a million clients available for those.

Boz0r
Sep 7, 2006
The Rocketship in action.

adaz posted:

Yes; the hard part is then mapping REST -> IQueryable. REST is about resources and state, and defers how to handle querying, sorting, pagination, etc. basically solely to the implementer. With that said, sure, many frameworks expose REST endpoints that under the hood rely on IQueryable to sort and get the data that the REST endpoint exposes. However, if the point is to expose IQueryable<T> to the outside world from the CRM, why not just let them query the CRM, which surely already has a REST interface, or index the data and expose it as Lucene/Elasticsearch queries, which are industry-standard-ish and people are more used to? There are also a million clients available for those.

What's the general consensus about having the CRM data model publicly known to all our services? I think it's pretty odd since we're already just mapping them to our own DTOs, and there's built-in authentication in CRM. I found a way of exposing REST endpoints as IQueryable, so maybe I can take DTOs and map them to CRM's data model before sending the query.

EDIT: I got frustrated because I had to write three simple queries to CRM for entities we hadn't used yet, so I had to implement this entire chain three times:

All of the calls just flow through to the last SAL layer.

Boz0r fucked around with this message at 12:29 on Mar 30, 2020

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!
I have a Json.NET question:

A web API I'm calling returns some JSON including one attribute that is an int64. I'm taking this data and deserializing it into a .NET object, but all the consumers of this object will actually want to read it as a string after converting to hex. Is there a super simple way to augment JsonConvert.DeserializeObject to just do this for me so the deserialized object looks as it should? I could have two properties, one with a setter for deserializing and one with a getter that hexifies the other, but I'd rather avoid that if possible, keep the object interface clean, and bundle this complexity with the deserialization code.

LOOK I AM A TURTLE
May 22, 2003

"I'm actually a tortoise."
Grimey Drawer

Eggnogium posted:

I have a Json.NET question:

A web API I'm calling returns some JSON including one attribute that is an int64. I'm taking this data and deserializing it into a .NET object, but all the consumers of this object will actually want to read it as a string after converting to hex. Is there a super simple way to augment JsonConvert.DeserializeObject to just do this for me so the deserialized object looks as it should? I could have two properties, one with a setter for deserializing and one with a getter that hexifies the other, but I'd rather avoid that if possible, keep the object interface clean, and bundle this complexity with the deserialization code.

There are probably several ways to do that. One way is to create a custom JsonConverter that does the needful. You will need to either add it to the JsonSerializerSettings that you pass to DeserializeObject, or apply it as an attribute to the class or property in question. See https://www.newtonsoft.com/json/help/html/CustomJsonConverter.htm and related articles.
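For the int64-to-hex case specifically, the converter can be a one-pager. The names here (Int64ToHexConverter, ApiResponse, DeviceId) are made up for illustration; only the JsonConverter<T> base class and its overrides are real Json.NET API:

```csharp
using System;
using Newtonsoft.Json;

// Reads a JSON integer and surfaces it on the .NET side as an uppercase hex string.
public class Int64ToHexConverter : JsonConverter<string>
{
    public override string ReadJson(JsonReader reader, Type objectType, string existingValue,
        bool hasExistingValue, JsonSerializer serializer)
    {
        // Integer tokens arrive as long; convert and hexify
        var value = Convert.ToInt64(reader.Value);
        return value.ToString("X");
    }

    public override void WriteJson(JsonWriter writer, string value, JsonSerializer serializer)
    {
        // Round-trip the hex string back to a number on serialization
        writer.WriteValue(Convert.ToInt64(value, 16));
    }
}

// Placeholder model: apply the converter to the one property that needs it
public class ApiResponse
{
    [JsonConverter(typeof(Int64ToHexConverter))]
    public string DeviceId { get; set; }
}
```

Then JsonConvert.DeserializeObject&lt;ApiResponse&gt;(json) hands you the hex string directly, and the class keeps a single clean property instead of the two-property workaround.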


Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

LOOK I AM A TURTLE posted:

There are probably several ways to do that. One way is to create a custom JsonConverter that does the needful. You will need to either add it to the JsonSerializerSettings that you pass to DeserializeObject, or apply it as an attribute to the class or property in question. See https://www.newtonsoft.com/json/help/html/CustomJsonConverter.htm and related articles.

Thanks, looks like exactly what I need! Don’t know how I whoopsed over this when browsing the docs.
