StoicFnord
Jul 27, 2012

"If you want to make enemies....try to change something."


College Slice

EssOEss posted:

Can you post an example token and decryption key (better yet, example code)? I find it hard to follow your description but have successfully used Jose-JWT in the past so I can give it a try.

Unless what you're asking is just about how to use some unsupported algorithm (is it?).

I'll see what I can do (much of it is company confidential).

Basically the issue is that Jose-JWT doesn't implement ConcatKDF on .NET Standard. Ref: https://github.com/dvsekhvalnov/jose-jwt/blob/e54de3bb706edf294053b4b86f0db47333d433ef/jose-jwt/crypto/ConcatKDF.cs
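For reference, the algorithm itself is small enough that a fallback isn't crazy: it's the single-step KDF from NIST SP 800-56A, which you can hand-roll on .NET Standard with IncrementalHash. Rough sketch, SHA-256 only - the class name is mine and this is unreviewed crypto code, so treat accordingly:

```csharp
using System;
using System.Security.Cryptography;

static class ConcatKdfSketch
{
    // NIST SP 800-56A single-step KDF: T(i) = Hash(counter_be32 || Z || OtherInfo), counter = 1..reps
    public static byte[] DeriveKey(byte[] z, byte[] otherInfo, int keyBits)
    {
        int keyBytes = keyBits / 8;
        const int hashLen = 32; // SHA-256 output size
        var result = new byte[keyBytes];
        int reps = (keyBytes + hashLen - 1) / hashLen;

        for (uint counter = 1; counter <= reps; counter++)
        {
            using (var hash = IncrementalHash.CreateHash(HashAlgorithmName.SHA256))
            {
                byte[] counterBytes = BitConverter.GetBytes(counter);
                if (BitConverter.IsLittleEndian) Array.Reverse(counterBytes); // big-endian counter
                hash.AppendData(counterBytes);
                hash.AppendData(z);
                hash.AppendData(otherInfo);

                byte[] block = hash.GetHashAndReset();
                int offset = (int)(counter - 1) * hashLen;
                Buffer.BlockCopy(block, 0, result, offset, Math.Min(hashLen, keyBytes - offset));
            }
        }
        return result;
    }
}
```

Building OtherInfo (AlgorithmID || PartyUInfo || PartyVInfo || SuppPubInfo) per the JWE spec is the fiddly part, so check that against RFC 7518 before trusting any of this.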


adaz
Mar 7, 2009

Boz0r posted:

I've been messing around with AutoFixture and AutoMoq and it's really cool, but I've hit a snag that I don't know how to solve. My code gets a bunch of proxies from a static factory class that I switch out with a mock factory in my test base class, and I use AutoDataAttribute to inject fixtures into my tests.

My mocking factory looks like this:
code:
public class ProxyFactoryMock : IProxyFactory
{
    private readonly IFixture _fixture;

    public ProxyFactoryMock(IFixture fixture)
    {
        _fixture = fixture;
    }

    public T GetProxy<T>(string url) where T : IProxyBase
    {
        return _fixture.Create<T>();
    }
}
I create an AutoMoq attribute like in their guide:
code:
public class AutoMoqDataAttribute : AutoDataAttribute
{
    public AutoMoqDataAttribute() : base(() => new Fixture().Customize(new AutoMoqCustomization()))
    {
    }
}
And I use it in my test definition like so.
code:
[Theory, AutoMoqData]
public void AutoProxyTest([Frozen] Mock<ISomethingProxy> somethingProxy)
{
	// Arrange
	somethingProxy.Setup(proxy => proxy.Method(It.IsAny<string>())).ReturnsAsync(() => new SwaggerResponse<string>(default, default, "test"));
	
	...
}
My problem here is, that the attribute and the factory have two different instances of IFixture, so the factory doesn't use the proxy I just set up.

How do I fix this in the neatest way, without having to add extra code to each unit tests, and also having the instance be unique per test?

I'm sorry, it looks like you are missing some code here, or the pseudocode didn't quite get translated. In your example test you are injecting a Moq mock of ISomethingProxy. Why would you expect that to use your factory? There's nothing telling AutoFixture to do that - it sees an object of type ISomethingProxy and will create you an auto-mocked object of that type. What are you wanting it to do here? Should that ISomethingProxy instead be an IProxyFactory?

insta
Jan 28, 2009

Boz0r posted:

I've been messing around with AutoFixture and AutoMoq and it's really cool, but I've hit a snag that I don't know how to solve. My code gets a bunch of proxies from a static factory class that I switch out with a mock factory in my test base class, and I use AutoDataAttribute to inject fixtures into my tests.

My mocking factory looks like this:
code:
public class ProxyFactoryMock : IProxyFactory
{
    private readonly IFixture _fixture;

    public ProxyFactoryMock(IFixture fixture)
    {
        _fixture = fixture;
    }

    public T GetProxy<T>(string url) where T : IProxyBase
    {
        return _fixture.Create<T>();
    }
}
I create an AutoMoq attribute like in their guide:
code:
public class AutoMoqDataAttribute : AutoDataAttribute
{
    public AutoMoqDataAttribute() : base(() => new Fixture().Customize(new AutoMoqCustomization()))
    {
    }
}
And I use it in my test definition like so.
code:
[Theory, AutoMoqData]
public void AutoProxyTest([Frozen] Mock<ISomethingProxy> somethingProxy)
{
	// Arrange
	somethingProxy.Setup(proxy => proxy.Method(It.IsAny<string>())).ReturnsAsync(() => new SwaggerResponse<string>(default, default, "test"));
	
	...
}
My problem here is, that the attribute and the factory have two different instances of IFixture, so the factory doesn't use the proxy I just set up.

How do I fix this in the neatest way, without having to add extra code to each unit tests, and also having the instance be unique per test?

Make the factory class injectable instead of static and swap factory implementations?

putin is a cunt
Apr 5, 2007

Not sure if this is suitable for this thread but not sure where else to post it. We have a .NET Framework API that is using Azure Cache for its caching layer, which is Azure's managed implementation of Redis. We occasionally get these errors where the connection to Redis has dropped and we get hundreds of messages in the logs like this:

code:
No connection is available to service this operation: EVAL; It was not possible to connect to the redis
server(s). To create a disconnected multiplexer, disable AbortOnConnectFail. ConnectTimeout; IOCP:
(Busy=0,Free=1000,Min=4,Max=1000), WORKER: (Busy=94,Free=32673,Min=4,Max=32767), Local-CPU:
n/a One or more errors occurred. It was not possible to connect to the redis server(s). To create a
disconnected multiplexer, disable AbortOnConnectFail.
This problem has persisted through a migration from Redis on a VM to Azure Cache, and it doesn't necessarily hit all of our instances at the same time (so we know Azure Cache itself is available, because other instances aren't getting dropped). The problem usually resolves itself within a few minutes, and if it doesn't, an application restart in IIS always fixes it.

What I've tried
Googling turned up a lot of talk that the StackExchange.Redis library can throw errors like this, but unfortunately this seems to be a 'catch-all' error that can arise from many different causes. The two suggestions I've seen and tried, to no avail:
  • Set abortConnect=false in the connection string
  • Update the StackExchange.Redis package; we're now on the latest version

Monitoring for Azure Cache shows no unusual data at all during the times the connections are failing. The CPU load for example is peaking at 20%, so I don't think it's an issue of underprovisioning or the like.

This is creating a big problem for us because without Redis the load on our DB is ridiculous and it's creating cascading problems. I hope someone can help, I'm at the end of my ideas for how to even begin to debug this.
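For reference, one more suggestion I keep seeing but haven't tried yet is the "force reconnect" pattern: keep the multiplexer behind a swappable Lazy<> and rebuild it when the errors persist, since a wedged multiplexer sometimes never recovers on its own. Rough sketch - the connection string and names are placeholders:

```csharp
using System;
using StackExchange.Redis;

public static class RedisConnection
{
    private static Lazy<ConnectionMultiplexer> _lazy = Create();
    private static readonly object _lock = new object();
    private static DateTimeOffset _lastReconnect = DateTimeOffset.MinValue;

    public static ConnectionMultiplexer Connection => _lazy.Value;

    private static Lazy<ConnectionMultiplexer> Create() =>
        new Lazy<ConnectionMultiplexer>(() =>
            ConnectionMultiplexer.Connect(
                "yourcache.redis.cache.windows.net:6380,ssl=true,abortConnect=false,password=<placeholder>"));

    // Call from the catch block when RedisConnectionExceptions keep firing.
    public static void ForceReconnect(TimeSpan minInterval)
    {
        lock (_lock)
        {
            // Throttle so a burst of failures doesn't cause a reconnect storm
            if (DateTimeOffset.UtcNow - _lastReconnect < minInterval) return;

            var old = _lazy;
            _lazy = Create();
            _lastReconnect = DateTimeOffset.UtcNow;

            // Dispose the old multiplexer so in-flight operations fail fast
            if (old.IsValueCreated) old.Value.Dispose();
        }
    }
}
```

No idea if it would fix our root cause, but it would at least turn "restart in IIS" into something automatic.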

Boz0r
Sep 7, 2006
The Rocketship in action.

adaz posted:

I'm sorry, it looks like you are missing some code here, or the pseudocode didn't quite get translated. In your example test you are injecting a Moq mock of ISomethingProxy. Why would you expect that to use your factory? There's nothing telling AutoFixture to do that - it sees an object of type ISomethingProxy and will create you an auto-mocked object of that type. What are you wanting it to do here? Should that ISomethingProxy instead be an IProxyFactory?

When I need a proxy somewhere in my code I get it like this:
code:
ProxyFactory.Instance.GetProxy<ISomethingProxy>(_serviceUrl)
I solved the problem by changing the constructor of my attribute to set my proxy factory like this:
code:
public class AutoMoqDataAttribute : AutoDataAttribute
{
    public AutoMoqDataAttribute() : base(() =>
    {
        IFixture fixture = new Fixture().Customize(new AutoMoqCustomization());
        ProxyFactory.Instance = new ProxyFactoryMock(fixture);
        return fixture;
    }) { }
}
I don't know if I like this way, though, it doesn't seem too elegant.

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

Mata posted:

Cool. Did you get any speedup from this or did you just want to get the response piecemeal to make it easier to work with?

Speed-up wasn't a big concern for a one-time download, I just wanted to make it possible for the clients to display a progress bar as the collection was being downloaded.

Actually, I thought about your problem again and it might in fact correspond to what I observed. When you're using Newtonsoft to serialize, the JSON is sent as a single huge HTTP response so the browser can't begin deserializing it until it has got the entire, uncorrupted response.

But if you pre-serialize it as a string, then Newtonsoft is out of the picture and ASP.NET is free to chunk the string response. Then your browser can start deserializing the response as soon as it receives a single piece, instead of all together at the end.

Can you inspect your application's raw HTTP requests/responses with Fiddler and see if Transfer-Encoding is set to chunked?

Also, since you were asking for low-hanging fruit: the response is already gzipped, right? It should be by default.

necrotic
Aug 2, 2005
I owe my brother big time for this!
The browser can begin reading as soon as the headers are sent. If the backend doesn't send anything until the whole blob is serialized, then the client has to wait that long to even see headers.

You don't even need chunked responses to stream/show progress, the backend just has to start sending data asap or you can only show a "waiting" (or 0% progress) until that TTFB (when the headers come in).
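On .NET Framework Web API the usual way to get bytes flowing before serialization finishes is PushStreamContent. Rough sketch - LoadItems() is a made-up stand-in for the real data source:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Web.Http;
using Newtonsoft.Json;

public class ItemsController : ApiController
{
    public HttpResponseMessage Get()
    {
        var response = Request.CreateResponse(HttpStatusCode.OK);
        response.Content = new PushStreamContent((stream, content, context) =>
        {
            using (var writer = new StreamWriter(stream, Encoding.UTF8))
            using (var json = new JsonTextWriter(writer))
            {
                var serializer = JsonSerializer.CreateDefault();
                json.WriteStartArray();
                foreach (var item in LoadItems())
                {
                    serializer.Serialize(json, item);
                    json.Flush(); // push each element to the client immediately
                }
                json.WriteEndArray();
            } // disposing the writer closes the stream, which ends the response
        }, "application/json");
        return response;
    }

    private IEnumerable<object> LoadItems() => new object[0]; // stand-in
}
```

That way the client sees headers and the first array elements as soon as the first item is loaded, instead of after the whole collection is serialized.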

adaz
Mar 7, 2009

Boz0r posted:

When I need a proxy somewhere in my code I get it like this:
code:
ProxyFactory.Instance.GetProxy<ISomethingProxy>(_serviceUrl)
I solved the problem by changing the constructor of my attribute to set my proxy factory like this:
code:
public class AutoMoqDataAttribute : AutoDataAttribute
{
    public AutoMoqDataAttribute() : base(() =>
    {
        IFixture fixture = new Fixture().Customize(new AutoMoqCustomization());
        ProxyFactory.Instance = new ProxyFactoryMock(fixture);
        return fixture;
    }) { }
}
I don't know if I like this way, though, it doesn't seem too elegant.


Ahhh! I see. Yeah, this isn't working quite right because you're relying on what is basically the service locator pattern for your ProxyFactory, and AutoFixture can't really hook into that pipeline. In general, for AutoFixture you want everything to be - as much as possible - constructor-injected and hidden behind an interface or abstract class. Once you start violating that, it becomes harder and harder to test.

Can I ask why you don't just inject an instance of IProxyFactory into your class constructors? If you did that, then AutoFixture (and your DI framework, for that matter) could control the entire creation pipeline, and you wouldn't need to set the instance explicitly like you are doing in AutoMoqData.
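To make that concrete, here's roughly what I mean - the names are guessed from your snippets and the Method signature is assumed, so treat it as a sketch:

```csharp
// Hypothetical consumer that takes the factory via its constructor
public class OrderService
{
    private readonly IProxyFactory _factory;
    public OrderService(IProxyFactory factory) => _factory = factory;

    public Task<SwaggerResponse<string>> DoWork(string url) =>
        _factory.GetProxy<ISomethingProxy>(url).Method("input");
}

[Theory, AutoMoqData]
public void AutoProxyTest(
    [Frozen] Mock<IProxyFactory> factory,       // frozen: same instance everywhere in this test
    [Frozen] Mock<ISomethingProxy> somethingProxy,
    OrderService sut)                           // AutoFixture builds this with the frozen factory
{
    // Arrange: the frozen factory hands out the frozen proxy mock
    factory.Setup(f => f.GetProxy<ISomethingProxy>(It.IsAny<string>()))
           .Returns(somethingProxy.Object);
    somethingProxy.Setup(p => p.Method(It.IsAny<string>()))
                  .ReturnsAsync(new SwaggerResponse<string>(default, default, "test"));

    // Act + assert against sut...
}
```

Because [Frozen] pins one instance per test run, every test gets its own fixture and you don't touch any static state.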

Boz0r
Sep 7, 2006
The Rocketship in action.

adaz posted:

Ahhh! I see. Yeah, this isn't working quite right because you're relying on what is basically the service locator pattern for your ProxyFactory, and AutoFixture can't really hook into that pipeline. In general, for AutoFixture you want everything to be - as much as possible - constructor-injected and hidden behind an interface or abstract class. Once you start violating that, it becomes harder and harder to test.

Can I ask why you don't just inject an instance of IProxyFactory into your class constructors? If you did that, then AutoFixture (and your DI framework, for that matter) could control the entire creation pipeline, and you wouldn't need to set the instance explicitly like you are doing in AutoMoqData.

It's custom plugin code running on Dynamics 365, so I have no idea how to go about doing DI in that context, and none of our other consulting teams have done it.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I recently found myself doing some C# copy-paste and wondered if I could avoid it. I was wrapping a lot of private fields with getters and setters so I could strobe an event when the setters were called. The event is different per field and the data type varies per field, but it's still pretty generic. I can't think of any way to represent that other than to still type it out without getting into ugly reflection poo poo that is far worse, but I thought I'd ask anyways.

I could probably compromise and have a more overall "this thing changed but I can't tell you what" kind of event and make that standard but I'm still stuck messing with the fields. I can only think of a generic helper that encapsulates the field but I'd rather have the class trying to do all this contain the field itself.

biznatchio
Mar 31, 2001


Buglord

Rocko Bonaparte posted:

I recently found myself doing some C# copy-paste and wondered if I could avoid it. I was wrapping a lot of private fields with getters and setters so I could strobe an event when the setters were called. The event is different per field and the data type varies per field, but it's still pretty generic. I can't think of any way to represent that other than to still type it out without getting into ugly reflection poo poo that is far worse, but I thought I'd ask anyways.

I could probably compromise and have a more overall "this thing changed but I can't tell you what" kind of event and make that standard but I'm still stuck messing with the fields. I can only think of a generic helper that encapsulates the field but I'd rather have the class trying to do all this contain the field itself.

Well, you could do something like this, which reduces the amount of boilerplate you need to write, but still requires you to write a field, event, and one-line getter and setter for each property.

Or, you could use the .Net standard INotifyPropertyChanged interface and simplify things a little further into something like this; though you'll pay a little bit of runtime cost with this because it's adding a dictionary lookup and boxing/unboxing to your gets and sets.

You could combine the two approaches to remove that extra overhead, though, at the cost of having to define the backing field for each property like this.
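The middle approach usually looks something like this generic sketch - a shared Get/Set pair keyed by property name, which is where the dictionary lookup and boxing mentioned above come from:

```csharp
using System.Collections.Generic;
using System.ComponentModel;
using System.Runtime.CompilerServices;

public class ObservableModel : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    // All property values live here, keyed by property name (values get boxed)
    private readonly Dictionary<string, object> _values = new Dictionary<string, object>();

    protected T Get<T>([CallerMemberName] string name = null) =>
        _values.TryGetValue(name, out var v) ? (T)v : default;

    protected void Set<T>(T value, [CallerMemberName] string name = null)
    {
        if (EqualityComparer<T>.Default.Equals(Get<T>(name), value)) return; // no-op on same value
        _values[name] = value;
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
    }

    // Each property is now a one-liner
    public string Name { get => Get<string>(); set => Set(value); }
    public int Count { get => Get<int>(); set => Set(value); }
}
```

CallerMemberName fills in the property name for you, so the per-property boilerplate shrinks to a single get/set line each.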

insta
Jan 28, 2009
Sounds like a use-case for T4

Canine Blues Arooo
Jan 7, 2008

when you think about it...i'm the first girl you ever spent the night with

Grimey Drawer

biznatchio posted:

Well, you could do something like this, which reduces the amount of boilerplate you need to write, but still requires you to write a field, event, and one-line getter and setter for each property.

Or, you could use the .Net standard INotifyPropertyChanged interface and simplify things a little further into something like this; though you'll pay a little bit of runtime cost with this because it's adding a dictionary lookup and boxing/unboxing to your gets and sets.

You could combine the two approaches to remove that extra overhead, though, at the cost of having to define the backing field for each property like this.

Taking the INotify approach a bit further, you can use Prism and simplify it a bit since some of the boilerplate is just handled. The 'simple' version here still has you signing up for private/public locals, but the class is simple to read and maintain and it's a pretty reasonable amount of boilerplate, all things considered.
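The simple version being roughly this (sketch, using Prism.Mvvm.BindableBase):

```csharp
using Prism.Mvvm;

public class PersonViewModel : BindableBase
{
    private string _name;
    public string Name
    {
        get => _name;
        set => SetProperty(ref _name, value); // raises PropertyChanged only when the value actually changes
    }
}
```

Still one backing field per property, but the equality check and event plumbing are handled by the base class.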

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I suppose it's safe to say there isn't any particular magic built in for it but I did forget about using actions and refs. Thanks everybody! This is actually more than I expected.

Also:

insta posted:

Sounds like a use-case for T4

Are there any other preprocessor-like things like this that are pretty common? I think I was suggested this a few months ago for some particular bit of Rocko insanity. If it's the same thing then it's probably about time I get into this particular brand of crazy.

redleader
Aug 18, 2005

Engage according to operational parameters

Rocko Bonaparte posted:

Are there any other preprocessor-like things like this that are pretty common? I think I was suggested this a few months ago for some particular bit of Rocko insanity. If it's the same thing then it's probably about time I get into this particular brand of crazy.

Source Generators are coming Real Soon Now

ljw1004
Jan 18, 2005

rum

redleader posted:

Source Generators are coming Real Soon Now

In addition to Source Generators, you can also use something like PostSharp:
https://doc.postsharp.net/inotifypropertychanged-add

(my take on all of these things is that generated code solves problems, but always creates more problems than it solves...)

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
For something like templates or metaprogramming I was kind of assuming something like macros, but that INotifyPropertyChanged thing was pretty neat. And yeah, I haven't used code generation for anything, due to all the issues I've had in the past with step debugging in particular.

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

I'm trying to clean up a massive git folder from an old solution that is full of useless crap, like old versions of code, never-used assets, and so on.

Is there a way to make MSBuild print out a clean list of every file it actually used during the build process, so I can delete the rest?

Mr Shiny Pants
Nov 12, 2012

NihilCredo posted:

I'm trying to clean up a massive git folder from an old solution that is full of useless crap, like old versions of code, never-used assets, and so on.

Is there a way to make MSBuild print out a clean list of every file it actually used during the build process, so I can delete the rest?

Shouldn't your csproj or fsproj list all the files it uses?

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

Mr Shiny Pants posted:

Shouldn't your csproj or fsproj list all the files it uses?

I think so, but it's like 25 project files each with a slightly different mix of content include, none include, compile include, etc., plus it references a bunch of local .DLLs (ancient hardware vendor libraries, mainly), plus SOAP client support files that I don't fully understand (what's a *.datasource file?).

Point is, walking through all of that XML to grab just the file paths would be a bore and I would be very worried about forgetting some rarely-used shim or some tiny icon file.

So I was more looking for something clever like setting all of the folder's last accessed date back by a few years, running a full rebuild, and seeing what got accessed.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

NihilCredo posted:

I'm trying to clean up a massive git folder from an old solution that is full of useless crap, like old versions of code, never-used assets, and so on.

Is there a way to make MSBuild print out a clean list of every file it actually used during the build process, so I can delete the rest?

Have you done analysis to see where the massiveness comes from? In my experience, it's binaries 99% of the time.

"Old versions of code" is suspect -- that's what source control is for? Unless you mean someone copied Foo/* to Foo.bak/* and committed that, but that's immediately obvious.

I'd take an incremental, is-it-good-enough-yet? approach. Find something big and clearly bad. Remove it with BFG. Repeat until the repo meets your size/speed requirements.

Worst case, just take the past X weeks or months of history and archive the old version of the repo.

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

New Yorp New Yorp posted:

"Old versions of code" is suspect -- that's what source control is for? Unless you mean someone copied Foo/* to Foo.bak/* and committed that, but that's immediately obvious.

It's more like: the file ProductView.vb used to live in project A, but at some point in the aughties it got moved out to Project B, but the old ProductView.vb still exists in project A's folder even though it's not actually being referenced by the project file anymore - it's dead code. Or just as often, some Feature.vb file was left unfinished and the incomplete file full of stubs never got added to the project but it's still lying around the folder

Most of the time, in Visual Studio, it's quite invisible and harmless. But the dead code still shows up whenever you do a grep / find-in-folder, or when I use Everything to quickly open a certain file, and it's rather annoying because there's nothing indicating it's dead code - the path looks legit (who remembers "wait, ObscureProductView.vb is supposed to be in Foo/Main/Views/, not in Bar/Support/Product/View" or whatever).

There's hundreds of files like these, since this whole solution wasn't put under source control until... I think 2012 or something like that (apparently the programmers at the time treated the project file like a master branch). I would like to clean the repo up before I finally move it from TFS to Git, especially because these files will be much more annoying once people are going through the repo using GitLab's web UI and search.

NihilCredo fucked around with this message at 15:53 on Jul 27, 2020

biznatchio
Mar 31, 2001


Buglord
Turn on file access auditing for the repo folder, do a build and a run; then turn the auditing back off and use Event Viewer to search for and extract the file access audit log entries to an EVTX file, then write a small C# program using System.Diagnostics.Eventing.Reader.EventLogReader to iterate through the entries and build a list of files.
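Sketch of that last step - the .evtx path is made up, and 4663 is the "an attempt was made to access an object" audit event ID:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics.Eventing.Reader;

class ExtractAuditedFiles
{
    static void Main()
    {
        var files = new SortedSet<string>(StringComparer.OrdinalIgnoreCase);

        // Query the exported EVTX file for file-system access audit events
        var query = new EventLogQuery(@"C:\temp\build-audit.evtx", PathType.FilePath,
                                      "*[System/EventID=4663]");
        using (var reader = new EventLogReader(query))
        {
            for (EventRecord rec = reader.ReadEvent(); rec != null; rec = reader.ReadEvent())
            {
                using (rec)
                {
                    // The ObjectName property holds the accessed path; its index varies
                    // by event template, so matching on the value shape is the lazy-but-robust option
                    foreach (var prop in rec.Properties)
                        if (prop.Value is string s && s.Contains(@"\"))
                            files.Add(s);
                }
            }
        }

        foreach (var f in files) Console.WriteLine(f);
    }
}
```

Diff that list against a directory listing of the repo and you have your delete candidates.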

Mr Shiny Pants
Nov 12, 2012

biznatchio posted:

Turn on file access auditing for the repo folder, do a build and a run; then turn the auditing back off and use Event Viewer to search for and extract the file access audit log entries to an EVTX file, then write a small C# program using System.Diagnostics.Eventing.Reader.EventLogReader to iterate through the entries and build a list of files.

Good one, or use process monitor. :)
https://docs.microsoft.com/en-us/sysinternals/downloads/procmon

It will show you all file access.

EssOEss
Oct 23, 2006
128-bit approved
Surely the compiler will still compile/copy even the files that contain useless code. Or what is the rationale behind this attempt?

Cuntpunch
Oct 3, 2003

A monkey in a long line of kings

NihilCredo posted:

It's more like: the file ProductView.vb used to live in project A, but at some point in the aughties it got moved out to Project B, but the old ProductView.vb still exists in project A's folder even though it's not actually being referenced by the project file anymore - it's dead code. Or just as often, some Feature.vb file was left unfinished and the incomplete file full of stubs never got added to the project but it's still lying around the folder

Most of the time, in Visual Studio, it's quite invisible and harmless. But the dead code still shows up whenever you do a grep / find-in-folder, or when I use Everything to quickly open a certain file, and it's rather annoying because there's nothing indicating it's dead code - the path looks legit (who remembers "wait, ObscureProductView.vb is supposed to be in Foo/Main/Views/, not in Bar/Support/Product/View" or whatever).

There's hundreds of files like these, since this whole solution wasn't put under source control until... I think 2012 or something like that (apparently the programmers at the time treated the project file like a master branch). I would like to clean the repo up before I finally move it from TFS to Git, especially because these files will be much more annoying once people are going through the repo using GitLab's web UI and search.

It'll take a little while, but it should be able to just cleansweep through with Solution Explorer in (full) Visual Studio:

Make sure you're in Solution View, then turn on Show All Files up at the top, and then you're going to be able to see in any given place where there are files-on-disk that aren't referenced by the project/solution, like so:

TIP
Mar 21, 2006

Your move, creep.



I'm trying to write a fairly simple piece of code but I have been working ridiculous hours and my brain is currently tapioca. Every time I go to even sketch out the logic of it my brain gives up.

What I'm trying to do is take a List containing any number of Dictionary<string, uint> objects and then return a Dictionary<string, uint> containing every way that you can combine the keys and values in order.

So, for example if I put something like this into the function:
[{ "A1":1, "A2": 2},{"B1":1, "B2": 2}];



I'd get back something like this:
{"A1B1": 2, "A1B2": 3, "A2B1": 3, "A2B2": 4}


If I was going to do this for a set number of entries like 2 or 3, I'd probably just do some nested loops iterating over each collection and combining the values in the deepest loop, but I need to do it over an arbitrary number of entries.

I know that a solution to this is to do it with recursion, but my brain is just hard locking every time I try to sketch out the basic logic.

Would any of you please help me here? Just a rough pseudocode explanation of the steps to take would really help me right now.

TIP fucked around with this message at 08:51 on Aug 3, 2020

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
The key insight to coming up with a recursive solution is to look hard at your incremental step - suppose you have a solution for three different dictionaries. As in, you've run the code, and you've got your solution back out of it.

Now someone comes along with a fourth dictionary, and asks you what the solution would be if you had had that fourth dictionary in there from the start. Can you do that? And can you generalize that so you could do exactly the same steps if someone then came along with a fifth dictionary, and a sixth, and so on?

Once you know what your incremental step is, the base case is frequently pretty obvious. Then you can write your recursive solution:

code:

Solution solveRecursively(Problem p) {
  if (isBaseCase(p))
    return baseCaseSolution(p);
  Solution smallerSolution = solveRecursively(smallerProblem(p));
  return incrementalStep(p, smallerSolution);
}
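Concretely, since your list-of-dictionaries shape makes the recursion a simple fold, the whole thing collapses to an Aggregate. One way to write it (assumes the concatenated keys never collide):

```csharp
using System.Collections.Generic;
using System.Linq;

static class Combiner
{
    // Combine for [{A1:1,A2:2},{B1:1,B2:2}] gives {A1B1:2, A1B2:3, A2B1:3, A2B2:4}
    public static Dictionary<string, uint> Combine(List<Dictionary<string, uint>> dicts)
    {
        // Base case: one "empty" combination to build on
        var seed = new Dictionary<string, uint> { [""] = 0u };

        // Incremental step: cross every partial result with the next dictionary,
        // concatenating keys and summing values
        return dicts.Aggregate(seed, (acc, next) =>
            acc.SelectMany(a => next, (a, b) =>
                    new KeyValuePair<string, uint>(a.Key + b.Key, a.Value + b.Value))
               .ToDictionary(kv => kv.Key, kv => kv.Value));
    }
}
```

Same logic as the recursive version, just with the "solve smaller problem first" step expressed as the running accumulator.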

TIP
Mar 21, 2006

Your move, creep.



Jabor posted:

The key insight to coming up with a recursive solution is to look hard at your incremental step - suppose you have a solution for three different dictionaries. As in, you've run the code, and you've got your solution back out of it.

Now someone comes along with a fourth dictionary, and asks you what the solution would be if you had had that fourth dictionary in there from the start. Can you do that? And can you generalize that so you could do exactly the same steps if someone then came along with a fifth dictionary, and a sixth, and so on?

Once you know what your incremental step is, the base case is frequently pretty obvious. Then you can write your recursive solution:

code:

Solution solveRecursively(Problem p) {
  if (isBaseCase(p))
    return baseCaseSolution(p);
  Solution smallerSolution = solveRecursively(smallerProblem(p));
  return incrementalStep(p, smallerSolution);
}

Thanks! This post was just what I needed to get my thoughts in order on it. Only took a minute to get working once I had the logic figured out.

Boz0r
Sep 7, 2006
The Rocketship in action.
We use early bound entities for developing plugins for Dynamics 365. We upgraded our csproj files to the new 2017 format, and we've just discovered that the early bound types have stopped working. Usually, we have to add the following line to an AssemblyInfo.cs:

code:
[assembly: Microsoft.Xrm.Sdk.Client.ProxyTypesAssemblyAttribute()]
But after we've converted to the new format that file no longer exists. I've tried adding the tag somewhere else in our code, but it doesn't work. Any ideas?

ThePeavstenator
Dec 18, 2012


Boz0r posted:

We use early bound entities for developing plugins for Dynamics 365. We upgraded our csproj files to the new 2017 format, and we've just discovered that the early bound types have stopped working. Usually, we have to add the following line to an AssemblyInfo.cs:

code:
[assembly: Microsoft.Xrm.Sdk.Client.ProxyTypesAssemblyAttribute()]
But after we've converted to the new format that file no longer exists. I've tried adding the tag somewhere else in our code, but it doesn't work. Any ideas?

If by "2017 format" you mean SDK-style csproj (what .NET Standard 2.0+ and .NET Core projects use), you can add assembly info stuff to the csproj file. Example: https://stackoverflow.com/a/44502158
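You can also declare assembly-level attributes as MSBuild items instead of writing them in a .cs file; something like this in the csproj (untested against the Xrm SDK specifically):

```xml
<ItemGroup>
  <!-- Emits [assembly: Microsoft.Xrm.Sdk.Client.ProxyTypesAssembly] into the
       auto-generated AssemblyInfo at build time -->
  <AssemblyAttribute Include="Microsoft.Xrm.Sdk.Client.ProxyTypesAssemblyAttribute" />
</ItemGroup>
```

That only works while GenerateAssemblyInfo is enabled (the SDK-style default), since it's that target that writes the code fragment.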

raminasi
Jan 25, 2005

a last drink with no ice

Boz0r posted:

We use early bound entities for developing plugins for Dynamics 365. We upgraded our csproj files to the new 2017 format, and we've just discovered that the early bound types have stopped working. Usually, we have to add the following line to an AssemblyInfo.cs:

code:
[assembly: Microsoft.Xrm.Sdk.Client.ProxyTypesAssemblyAttribute()]
But after we've converted to the new format that file no longer exists. I've tried adding the tag somewhere else in our code, but it doesn't work. Any ideas?

And if adding directly to the .csproj doesn't work, you can always create AssemblyInfo.cs yourself. It's no longer automatically generated, but it's still used if it exists.

EssOEss
Oct 23, 2006
128-bit approved
There is nothing special about AssemblyInfo.cs - you can put that stuff into any .cs file and the result will be the same. Having AssemblyInfo.cs for it is just a convention, not a technical requirement. Whatever changed is not because AssemblyInfo.cs is missing - if adding the stuff elsewhere does not work, you've got some other gremlins, possibly specific to whatever Dynamics SDKs you are using.

I find your mention of "2017 format" confusing, though. The file format does not change just because of VS version - are you now targeting a different .NET version or something, which brings a different project file format? Different .NET runtimes do use different file formats but this is a way bigger change than that of a file format and has many compatibility implications. Maybe explain in detail what you are trying to do.

EssOEss fucked around with this message at 20:21 on Aug 6, 2020

raminasi
Jan 25, 2005

a last drink with no ice

EssOEss posted:

I find your mention of "2017 format" confusing, though. The file format does not change just because of VS version - are you now targeting a different .NET version or something, which brings a different project file format? Different .NET runtimes do use different file formats but this is a way bigger change than that of a file format and has many compatibility implications. Maybe explain in detail what you are trying to do.

The SDK-style project file format is commonly colloquially referred to as “2017-style” because VS 2017 is the first version that supported it.

Boz0r
Sep 7, 2006
The Rocketship in action.
I tried doing that, but our problem remains, so I think it's something else. For some reason, when we register our plugins in D365 we get unknown type errors when using early bound types. We tried using a clean project template that we usually use for new customers, and that works. So we probably screwed something else up.

LongSack
Jan 17, 2003

Question about APIs.

When coding a desktop app, my ECL classes (this is the layer that produces observable DTO objects from entity objects) have a method similar to
code:
IEnumerable<TDTO> Get<TDTO, TEntity>(Expression<Func<TEntity, bool>> predicate = null)
which allows me to do something like
code:
var results = FooECL.Get(x => x.BarId == barid);
This works well in a monolithic desktop app.

However, now I’m working on a demonstration app that is a management system for a restaurant. The lower layers look like MS Sql -> EF Core -> DAL layer -> API layer. There is a WPF based management desktop app as well as a web site. All access to data is through the API layer, since that’s where the authentication and authorization are done.

I’m thinking about how I can compose those Expression<Func<TEntity, bool>> elements and send them to the API controllers.

My first (naïve) thought was to serialize them and send them in the request body, but this failed spectacularly.

So my other idea is to expose more endpoints in the API controllers.

But I’m wondering if there’s another way to do this?

Ideas?

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

LongSack posted:

Question about APIs.

When coding a desktop app, my ECL classes (this is the layer that produces observable DTO objects from entity objects) have a method similar to
code:
IEnumerable<TDTO> Get<TDTO, TEntity>(Expression<Func<TEntity, bool>> predicate = null)
which allows me to do something like
code:
var results = FooECL.Get(x => x.BarId == barid);
This works well in a monolithic desktop app.

However, now I’m working on a demonstration app that is a management system for a restaurant. The lower layers look like MS Sql -> EF Core -> DAL layer -> API layer. There is a WPF based management desktop app as well as a web site. All access to data is through the API layer, since that’s where the authentication and authorization are done.

I’m thinking about how I can compose those Expression<Func<TEntity, bool>> elements and send them to the API controllers.

My first (naïve) thought was to serialize them and send them in the request body, but this failed spectacularly.

So my other idea is to expose more endpoints in the API controllers.

But I’m wondering if there’s another way to do this?

Ideas?

Read up on REST API design. If anything, you should be translating the Expressions to REST calls to get the appropriate data, not passing them to REST calls. If you want to do it right, you're going to have to re-examine your approach; trying to directly translate a set of patterns that worked fine in a monolith is not going to work in the REST world.
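To make the "translate Expressions to REST calls" idea concrete, here is one hedged sketch of a client-side helper: each filter the monolith expressed as a predicate becomes an explicit, named query parameter on an endpoint (the route, parameter names, and `FooApiClient` type are invented for illustration, not from the original posts):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical client helper: instead of shipping a serialized
// Expression<Func<TEntity, bool>>, each supported filter is an explicit,
// named query parameter that the server validates and applies itself.
public static class FooApiClient
{
    public static string BuildGetUri(int? barId = null, string nameContains = null)
    {
        var parts = new List<string>();
        if (barId.HasValue)
            parts.Add($"barId={barId.Value}");
        if (nameContains != null)
            parts.Add($"nameContains={Uri.EscapeDataString(nameContains)}");
        var query = parts.Count > 0 ? "?" + string.Join("&", parts) : "";
        return "/api/foos" + query;   // e.g. /api/foos?barId=42
    }
}
```

The key design shift: the server owns the set of allowed filters and translates them into EF Core queries behind the API boundary, so the client never hands over executable logic.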

[edit] Also, you're going to go crazy trying to maintain feature parity between two UIs. Consider using Electron or something similar and hosting the web front end within a desktop app.

New Yorp New Yorp fucked around with this message at 18:35 on Aug 8, 2020

LongSack
Jan 17, 2003

New Yorp New Yorp posted:

Read up on REST API design. If anything, you should be translating the Expressions to REST calls to get the appropriate data, not passing them to REST calls. If you want to do it right, you're going to have to re-examine your approach; trying to directly translate a set of patterns that worked fine in a monolith is not going to work in the REST world.

Yeah, that's what I meant by exposing more endpoints. I didn't explain myself properly.

quote:

[edit] Also, you're going to go crazy trying to maintain feature parity between two UIs. Consider using Atom or Electron or something similar and hosting the web front end within a desktop app.

It's a demo app, so once it's written not much will change. Also, the two front ends have different purposes. The WPF app is for back-end management intended for use by the staff. The web front end is intended for customers of the restaurant, to view the menu, make reservations, etc.

biznatchio
Mar 31, 2001


Buglord
I don't know if I'd recommend it, but you can serialize an Expression to JSON using Aq.ExpressionJsonSerializer and then deserialize it on the other side into something you can execute. But if you're going to expose something like that in a public API, you'd better make sure you have your ducks in a row so you're not just allowing anyone to run arbitrary code on your server.


Chrungka
Jan 27, 2015
If you're comfortable with opening your API to arbitrary projection/predicate queries, maybe OData could work for you. A cursory search suggests there is a LINQ-enabled client in Microsoft.OData.Client.
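For reference, the OData v4 URL conventions carry the predicate in the `$filter` system query option, so a call like `FooECL.Get(x => x.BarId == barid)` maps to a query string. A tiny sketch (the `Foos` entity set name and service root are hypothetical):

```csharp
using System;

// Sketch: the OData $filter system query option expresses the predicate
// that the monolith wrote as an Expression<Func<TEntity, bool>>.
public static class ODataUriSketch
{
    public static string FilterByBarId(string serviceRoot, int barId)
    {
        // Per the OData v4 URL conventions: $filter=BarId eq {value}.
        // (Spaces shown literally for readability; a real client
        // percent-encodes them before sending the request.)
        return $"{serviceRoot}/Foos?$filter=BarId eq {barId}";
    }
}
```

The appeal is that an OData-aware server translates `$filter`, `$select`, and friends into the underlying EF Core query, which is exactly the "arbitrary predicate" surface area - hence the caveat about being comfortable exposing it.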
