Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Inverness posted:

There's an update to the .NET Framework blog about the status of open source.

I'm surprised that they've only moved 25% of things to GitHub so far.

I'm curious about what exactly is involved in moving those libraries and the CLR repository to GitHub that is consuming their time.

Probably removing all the obscenities.


Polio Vax Scene
Apr 5, 2009



EssOEss posted:

There is probably something wrong with your project setup. I have never seen a case where "Setting copy local to true" is a meaningful step for a well-formed project, only cases where it can be sort of used to hide some deeper underlying issue.

I believe re-adding references fixed it; copy local was a panic move.

epswing
Nov 4, 2003

Soiled Meat
Why would re-adding references fix something?

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



Inverness posted:

There's an update to the .NET Framework blog about the status of open source.

I'm surprised that they've only moved 25% of things to GitHub so far.

I'm curious about what exactly is involved in moving those libraries and the CLR repository to GitHub that is consuming their time.

Lawyers

Also

Bognar posted:

Probably removing all the obscenities.

Dromio
Oct 16, 2002
Sleeper
I'm still fighting with passing complex objects from JavaScript to ASP.NET MVC through the querystring. I have a complex object like this:

code:
public class PricingContext
{
  public long ProductId { get; set; }
  public List<ChoiceContext> Choices { get; set; }
}
public class ChoiceContext
{
  public long ChoiceId { get; set; }
  public string ChoiceValue { get; set; }
}

public class PricingController : Controller
{
  public JsonResult GetPrice(PricingContext context, List<int> otherStuff)
  {
    // Do some stuff here, it works!
  }
}
On the client side I use knockout.js and jQuery to build up a matching JSON object and POST it to an action on the server, and "it just works":
code:
$.ajax({
  type: "POST",
  url: theEndpoint,
  data: ko.toJSON({context: somePricingContext, otherStuff: anArrayOfIntegers})
});
But now I'm tasked with changing many of these API calls to use the querystring because they really are GET operations (no part of the system changes due to this call) and we need to cache the responses. But ASP.NET MVC seems to really HATE trying to decode these objects from the querystring.
I've URLEncoded the JSON and changed the call to GET:
code:
$.ajax({
  type: "GET",
  url: theEndpoint,
  data: encodeURIComponent(ko.toJSON({context: somePricingContext, otherStuff: anArrayOfIntegers}))
});
My action IS called and has an instantiated PricingContext, but it's as if it called the parameterless constructor and never populated any of the properties. The ID is 0, there are no choices, etc.

What's the best way to attack this? What's the best way to get MVC to deal with complex parameters in GET operations?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Dromio posted:

But now I'm tasked with changing many of these API calls to use the querystring because they really are GET operations (no part of the system changes due to this call) and we need to cache the responses. But ASP.NET MVC seems to really HATE trying to decode these objects from the querystring.
I've URLEncoded the JSON and changed the call to GET:
code:
$.ajax({
  type: "GET",
  url: theEndpoint,
  data: encodeURIComponent(ko.toJSON({context: somePricingContext, otherStuff: anArrayOfIntegers}))
});
My action IS called and has an instantiated PricingContext, but it's as if it called the parameterless constructor and never populated any of the properties. The ID is 0, there are no choices, etc.

What's the best way to attack this? What's the best way to get MVC to deal with complex parameters in GET operations?
I thought jQuery handled the encoding and stuff for you?

code:
$.ajax({
  type: "GET",
  url: theEndpoint,
  data: {context: somePricingContext, otherStuff: anArrayOfIntegers}
});

mastersord
Feb 15, 2001

Gold Card Putty Fan Club
Member Since 2017!
Soiled Meat

epalm posted:

Why would re-adding references fix something?

I had a similar problem when moving a project from one dev environment to another. It could be that the reference was pointing to something that changed or was deleted. The object may still exist elsewhere in the system.

Dromio
Oct 16, 2002
Sleeper

Inverness posted:

I thought jQuery handled the encoding and stuff for you?

Doesn't seem like it. The querystring ends up with a lot of :, ", and {} in it. Even so, MVC doesn't actually figure out how to bind it correctly; I end up with an empty instance of the object.

Che Delilas
Nov 23, 2009
FREE TIBET WEED

Bognar posted:

Probably removing all the obscenities.

Don't forget that they have to split off all the spyware functionality into separate modules so we can't confirm it exists. :tinfoil:

wwb
Aug 17, 2004

Lawyers, spyware, obscenities, and also making sure it stands on its head and is readable to the general public. Let's give these guys some credit for the work they are doing.

That jQuery thing should be working to create query strings; I'd walk through it carefully with a JS debugger and see what is going on.

Dromio
Oct 16, 2002
Sleeper

wwb posted:

Lawyers, spyware, obscenities, and also making sure it stands on its head and is readable to the general public. Let's give these guys some credit for the work they are doing.

That jQuery thing should be working to create query strings; I'd walk through it carefully with a JS debugger and see what is going on.
Here's the simplest I can do, where the data passed is a complex javascript object:
code:
      $.ajax( {
        url: window._resources["load-siteproductpricing"],
        type: "GET",
        data: aJavascriptPricingContextWith6Choices
      });
I get querystrings like this: ProductId=51&Choices%5B0%5D%5BChoiceId%5D=25&Choices%5B0%5D%5BOptionId%5D=11&Choices%5B0%5D%5BOptionType%5D=width&Choices%5B0%5D%5BChoiceValue%5D=101.125&....

My MVC action recognizes this as a pricing context, but fails to hydrate it properly. If I look at it in the debugger, the action is called and passed a PricingContext, and that PricingContext ends up with a ProductId of 51. It does have the 6 choices, but those end up with ChoiceIds of 0, OptionTypes of "", and ChoiceValues of "". It looks like the default model binding in MVC just isn't up to the task of handling these more complex types, unless I'm missing something simple.

At this point I'm tempted just to write a nasty javascript serialization routine that makes something stupid like a "|" delimited string, then split it back up manually on the server side. But it feels like the wrong thing to do and fraught with peril.

Dromio fucked around with this message at 19:50 on Jan 29, 2015

bpower
Feb 19, 2011

wwb posted:

Lawyers, spyware, obscenities, and also making sure it stands on its head and is readable to the general public. Let's give these guys some credit for the work they are doing.

That jQuery thing should be working to create query strings; I'd walk through it carefully with a JS debugger and see what is going on.

If my boss told me to open source all our code I'd go to the bathroom to throw up. So I definitely appreciate what they're doing. I'm really impressed by the change in Microsoft regarding open source. It's like they hit a midlife crisis, quit the job and bought a motorbike, but less lame.

wwb
Aug 17, 2004

Dromio posted:

Here's the simplest I can do, where the data passed is a complex javascript object:
code:
      $.ajax( {
        url: window._resources["load-siteproductpricing"],
        type: "GET",
        data: aJavascriptPricingContextWith6Choices
      });
I get querystrings like this: ProductId=51&Choices%5B0%5D%5BChoiceId%5D=25&Choices%5B0%5D%5BOptionId%5D=11&Choices%5B0%5D%5BOptionType%5D=width&Choices%5B0%5D%5BChoiceValue%5D=101.125&....

My MVC action recognizes this as a pricing context, but fails to hydrate it properly. If I look at it in the debugger, the action is called and passed a PricingContext, and that PricingContext ends up with a ProductId of 51. It does have the 6 choices, but those end up with ChoiceIds of 0, OptionTypes of "", and ChoiceValues of "". It looks like the default model binding in MVC just isn't up to the task of handling these more complex types, unless I'm missing something simple.

At this point I'm tempted just to write a nasty javascript serialization routine that makes something stupid like a "|" delimited string, then split it back up manually on the server side. But it feels like the wrong thing to do and fraught with peril.

It kind of looks like it is doing the right thing -- ProductId looks right. Choices is fubar, though; I'm wondering if it's too complex, so it gets JSON-encoded? See http://api.jquery.com/jQuery.param/ for some info on what might be happening, but there is a "traditional" flag you might want to set. $.get() might also behave a bit better; that is what I've typically used rather than $.ajax().

twodot
Aug 7, 2005

You are objectively correct that this person is dumb and has said dumb things

Inverness posted:

I'm curious about what exactly is involved in moving those libraries and the CLR repository to GitHub that is consuming their time.
I'm sure there is some cleanup that others have mentioned, but I think most of us have just resigned ourselves to being exposed as horrible human beings. I think most of the effort is that Microsoft has a history of using horrible build environments/scripts, and we don't want to export that practice. On the runtime side, there's also simultaneous work being done to enable building for Linux, and it'd be a waste of effort to open source everything and then suddenly change everything.

twodot fucked around with this message at 20:48 on Jan 29, 2015

Cervix-A-Lot
Sep 29, 2006
Cheeeeesy
I have a question about storing images in a database.

In the past, we would store the image for, say, a website product in a database. We would upload a high-quality image, say a 3500x3500 image of the product (they were always square), and we would store it in a database. Then we would have another table called ImageConfigurations which held the specified image configurations. So say on the product website we wanted to display this image as 200x200. The code would pull that image, use ImageMagick to resize it, then display it. However, after it resized, it would then store the result in another table called ImageCache. So the code would see the image request and check the ImageCache table; if it's there, use that, as it's already resized and ready to go; if not, pull it from ImageDatabase, resize, and display. Sounds like a decent system in theory, but it seemed too slow to me.

Here are the models we had minus a few properties.
code:
public class ImageDatabase : BaseOrderableObject
{
    public string imageID { get; set; }

    public byte[] Original { get; set; }

    public virtual ImageContentType ImageContentType { get; set; }

    public int ImageContentTypeId { get; set; }
}
code:
public class ImageConfiguration : BaseObject
{
    [DefaultValue(0), Required]
    public int Height { get; set; }

    [DefaultValue(0), Required]
    public int Width { get; set; }
}
code:
public class ImageCache : BaseObject
{
    public virtual ImageDatabase Image { get; set; }

    public virtual ImageConfiguration ImageConfiguration { get; set; }

    [Display(Name = "Configuration")]
    public int? ImageConfigurationId { get; set; }

    public byte[] ImageContent { get; set; }

    [Display(Name = "Image"), ForeignKey("Image")]
    public int? ImageId { get; set; }
}
So, imageId would essentially be the image name and url to that image.

ImageConfiguration would hold something like:
_frontPageWeb with height and width at 200.

So this system worked because I didn't have to resize images all the time for specific parts of the website. However, it is kind of slow sometimes, especially for the first time it pulls the images from the database. So what I'd like to know is, how would you handle images?

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Easy Mac posted:

So this system worked because I didn't have to resize images all the time for specific parts of the website. However, it is kind of slow sometimes, especially for the first time it pulls the images from the database. So what I'd like to know is, how would you handle images?

If storage isn't a problem I would put them into a queue and have some other process resize them and store them into the database. That way when the user goes to access the product the image is already resized. The queue is used so that you can have multiple workers resizing images in case volume increases. It's slightly YAGNI but I think the extra effort up front is worth it even if you keep it at one worker.
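For illustration, a rough sketch of what that background worker might look like, assuming an in-process BlockingCollection as the queue stand-in and System.Drawing for the actual resize (ResizeJob, IImageStore, and their members are hypothetical names, not from anyone's actual code):

code:
using System.Collections.Concurrent;
using System.Drawing;
using System.IO;

// Hypothetical job description: which image to resize and to what configuration.
public class ResizeJob
{
    public int ImageId { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
}

// Hypothetical data-access wrapper over the ImageDatabase/ImageCache tables.
public interface IImageStore
{
    byte[] LoadOriginal(int imageId);
    void SaveCached(int imageId, int width, int height, byte[] content);
}

public class ResizeWorker
{
    private readonly BlockingCollection<ResizeJob> _queue; // swap in MSMQ/Azure queues/etc. as needed
    private readonly IImageStore _store;

    public ResizeWorker(BlockingCollection<ResizeJob> queue, IImageStore store)
    {
        _queue = queue;
        _store = store;
    }

    public void Run()
    {
        // Each worker just loops pulling jobs; start more workers if volume increases.
        foreach (var job in _queue.GetConsumingEnumerable())
        {
            byte[] original = _store.LoadOriginal(job.ImageId);
            using (var input = new MemoryStream(original))
            using (var source = new Bitmap(input))
            using (var resized = new Bitmap(source, job.Width, job.Height))
            using (var output = new MemoryStream())
            {
                resized.Save(output, source.RawFormat);
                _store.SaveCached(job.ImageId, job.Width, job.Height, output.ToArray());
            }
        }
    }
}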

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Dromio posted:

... deserializing complex models in GET requests with MVC ...

It's really not worth bothering with MVC's default model binding here. If you really have to do it, the best way is to JSONify all of your request data before sending it, then send that as a parameter. For example:

code:
public ActionResult MyMVCMethod(string json)
{
    var data = JsonConvert.DeserializeObject<MyDataType>(json);
    ....
}

...

$.get('/mymvcmethod/', { json: JSON.stringify(data) });

kingcrimbud
Mar 1, 2007
Oh, Great. Now what?
Enterprise Library Semantic Logging question here. We're logging out of process to a few rolling files and it works great except for one small thing: the logs aren't written to the file unless the service is stopped/restarted or the file rolls over. This doesn't let us read the files in 'real time' and it's a pain. This is apparently not standard behavior, and it's happening both locally and on the server.

I don't think too many people use SLAB but maybe someone has an idea?

karms
Jan 22, 2006

by Nyc_Tattoo
Yam Slacker

Easy Mac posted:

I have a question about storing images in a database.

...

So this system worked because I didn't have to resize images all the time for specific parts of the website. However, it is kind of slow sometimes, especially for the first time it pulls the images from the database. So what I'd like to know is, how would you handle images?

There's very little reason to store the images in the db instead of on disk. Use any kind of caching solution (write resized images to a specific path, or something like memcached) to store your resized images.
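If it helps picture it, here's a bare-bones version of the write-to-a-path approach, with made-up folder paths and JPEG-only handling for brevity (none of this comes from anyone's actual setup):

code:
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

public static class ImageDiskCache
{
    // Hypothetical folder layout: originals in one place, resized copies keyed by dimensions.
    private const string OriginalsRoot = @"C:\images\originals";
    private const string CacheRoot = @"C:\images\cache";

    public static string GetResizedPath(string imageName, int width, int height)
    {
        string cachedPath = Path.Combine(CacheRoot,
            string.Format("{0}_{1}x{2}.jpg", imageName, width, height));
        if (File.Exists(cachedPath))
            return cachedPath; // already resized on an earlier request

        string originalPath = Path.Combine(OriginalsRoot, imageName + ".jpg");
        using (var source = new Bitmap(originalPath))
        using (var resized = new Bitmap(source, width, height))
        {
            resized.Save(cachedPath, ImageFormat.Jpeg);
        }
        return cachedPath;
    }
}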

Destroyenator
Dec 27, 2004

Don't ask me lady, I live in beer
You may also run into the browser/webserver/load balancer/cache/whatever limiting your max URL length.

Dromio
Oct 16, 2002
Sleeper

Destroyenator posted:

You may also run into the browser/webserver/load balancer/cache/whatever limiting your max URL length.

Yeah, I've been worrying about that. We're looking at URLs of less than 900 characters. Everything I read says that's probably OK, but it does still bother me. We're using Akamai to cache, Amazon ELB for load balancing, and IIS 8.5 for the web servers. So far they all seem to say they can handle it.
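For reference, if those ~900-character URLs ever do start getting rejected, the usual knobs on the IIS/ASP.NET side live in web.config; a sketch with arbitrary numbers (the attribute names are the standard ones; ASP.NET's httpRuntime limits are in characters, IIS's request filtering limits are in bytes):

code:
<configuration>
  <system.web>
    <!-- ASP.NET request limits (characters) -->
    <httpRuntime maxUrlLength="4096" maxQueryStringLength="4096" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS request filtering limits (bytes) -->
        <requestLimits maxUrl="4096" maxQueryString="4096" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>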

Dromio fucked around with this message at 13:47 on Jan 30, 2015

wilderthanmild
Jun 21, 2010

Posting shit




Grimey Drawer
So I am interested in learning how to build ASP.net MVC websites. I've built asp.net webforms websites before and have some experience in general web development, but my primary experience is with desktop development using WPF and winforms. Anyone experienced on learning on their own have any tutorials and resources they used to learn? I prefer written guides over videos and most of the suggestions I got from people I know were videos.

Chill Callahan
Nov 14, 2012

wilderthanmild posted:

So I am interested in learning how to build ASP.net MVC websites. I've built asp.net webforms websites before and have some experience in general web development, but my primary experience is with desktop development using WPF and winforms. Anyone experienced on learning on their own have any tutorials and resources they used to learn? I prefer written guides over videos and most of the suggestions I got from people I know were videos.

http://www.asp.net/mvc/overview/getting-started/introduction/getting-started is good.

bobua
Mar 23, 2003
I'd trade it all for just a little more.

wilderthanmild posted:

So I am interested in learning how to build ASP.net MVC websites. I've built asp.net webforms websites before and have some experience in general web development, but my primary experience is with desktop development using WPF and winforms. Anyone experienced on learning on their own have any tutorials and resources they used to learn? I prefer written guides over videos and most of the suggestions I got from people I know were videos.

I started trying to learn the same thing this week. Can't stand videos.

http://www.asp.net/mvc/overview/getting-started/introduction/getting-started this tutorial got me off and running.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Inverness posted:

There's an update to the .NET Framework blog about the status of open source.

I'm surprised that they've only moved 25% of things to GitHub so far.

I'm curious about what exactly is involved in moving those libraries and the CLR repository to GitHub that is consuming their time.

There is a 30+ step process to open source things at Microsoft, for good reasons.

You don't want some patent/licensing agreement to bite you in the rear end a decade after it was signed.

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.

Malcolm XML posted:

There is a 30+ step process to open source things at Microsoft, for good reasons.

You don't want some patent/licensing agreement to bite you in the rear end a decade after it was signed.

Possibly relevant: How many Microsoft employees does it take to change a lightbulb?

epswing
Nov 4, 2003

Soiled Meat

When employees here whine about procedure, that's the link I direct them to. No, we're not as big as Microsoft, BUT STILL.

bpower
Feb 19, 2011

wilderthanmild posted:

So I am interested in learning how to build ASP.net MVC websites. I've built asp.net webforms websites before and have some experience in general web development, but my primary experience is with desktop development using WPF and winforms. Anyone experienced on learning on their own have any tutorials and resources they used to learn? I prefer written guides over videos and most of the suggestions I got from people I know were videos.

https://curah.microsoft.com/198908/aspnet-mvc-dos-and-donts-best-practices

This was very useful for me.

raminasi
Jan 25, 2005

a last drink with no ice
If I need a collection I can numerically index into but won't need to resize, are there any non-stylistic/habitual reasons for choosing an array over a list or vice versa? I know there are plenty of cases that call for one or the other, but I'm wondering if there's a good way to choose when it doesn't seem to really matter. There's Arrays considered somewhat harmful, but it seems to be basically about mutability, and lists aren't any less mutable than arrays.

ljw1004
Jan 18, 2005

rum

Inverness posted:

There's an update to the .NET Framework blog about the status of open source.
I'm surprised that they've only moved 25% of things to GitHub so far.

:( I think that 350k lines of code in 37 workdays is pretty darned impressive! (assuming reasonable time off for Thanksgiving and Christmas).


I asked Immo for more details on what's taking the time. He says he touched on it a bit in this part of his recent video interview
http://channel9.msdn.com/Series/NET-Framework/Immo-Landwerth-and-David-Kean-Open-sourcing-the-NET-Framework#time=11m6s
but I'm hoping he'll blog more about the process in detail.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

ljw1004 posted:

:( I think that 350k lines of code in 37 workdays is pretty darned impressive! (assuming reasonable time off for Thanksgiving and Christmas).


I asked Immo for more details on what's taking the time. He says he touched on it a bit in this part of his recent video interview
http://channel9.msdn.com/Series/NET-Framework/Immo-Landwerth-and-David-Kean-Open-sourcing-the-NET-Framework#time=11m6s
but I'm hoping he'll blog more about the process in detail.

Yeah for real. It's not a trivial process to even open source code within the drat company, let alone take a large component and clear it for public consumption. Kudos to everyone involved.

Did you know, for example, that "prd" is a Czech word for fart and could be offensive? It took down an internal portal for a day since it had prd in the URL and didn't pass some extended check.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

GrumpyDoctor posted:

If I need a collection I can numerically index into but won't need to resize, are there any non-stylistic/habitual reasons for choosing an array over a list or vice versa? I know there are plenty of cases that call for one or the other, but I'm wondering if there's a good way to choose when it doesn't seem to really matter. There's Arrays considered somewhat harmful, but it seems to be basically about mutability, and lists aren't any less mutable than arrays.

Is the size a maximum size, or a fixed size? If it's a maximum size, List. If it's a fixed size, array. You can always return an IList (or if mutability is a concern, an IReadOnlyList).
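A minimal sketch of that rule of thumb, with made-up names (arrays implement IReadOnlyList<T> as of .NET 4.5):

code:
using System.Collections.Generic;

public class SampleBuffer
{
    // Fixed size known up front: an array is the natural fit.
    private readonly double[] _window = new double[256];

    // Grows as data arrives: a List handles the resizing for you.
    private readonly List<double> _history = new List<double>();

    private int _next;

    public void Add(double sample)
    {
        _history.Add(sample);
        _window[_next] = sample;
        _next = (_next + 1) % _window.Length;
    }

    // Callers can index numerically but can't replace elements through this view.
    public IReadOnlyList<double> Window { get { return _window; } }
}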

ljw1004
Jan 18, 2005

rum

GrumpyDoctor posted:

If I need a collection I can numerically index into but won't need to resize, are there any non-stylistic/habitual reasons for choosing an array over a list or vice versa? I know there are plenty of cases that call for one or the other, but I'm wondering if there's a good way to choose when it doesn't seem to really matter. There's Arrays considered somewhat harmful, but it seems to be basically about mutability, and lists aren't any less mutable than arrays.

If you're doing immutable stuff then you should use ImmutableList!
https://msdn.microsoft.com/en-us/library/dn467185(v=vs.111).aspx
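A tiny illustration of the immutable style, nothing project-specific: every "mutation" returns a new list and the original is untouched.

code:
using System;
using System.Collections.Immutable;

class ImmutableListDemo
{
    static void Main()
    {
        var oneTwoThree = ImmutableList<int>.Empty.Add(1).Add(2).Add(3);
        var withFour = oneTwoThree.Add(4);    // returns a NEW list

        Console.WriteLine(oneTwoThree.Count); // 3 -- the original didn't change
        Console.WriteLine(withFour[3]);       // 4 -- numeric indexing still works
    }
}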


When I profile my apps, I've never observed significant differences between List and Array because there's always other more significant stuff that I'm doing inside my loops. HOWEVER in my FFT code I did notice a big perf increase when I switched to linked-lists over arrays/lists. I guess that's because when .NET follows a ".Next" pointer it doesn't have to do bounds-checks like lists/arrays nor covariance-checks like writing to an array.

Here's an interesting fact about following a .Next pointer "x.y" and the possibility of NullReferenceException. The CLR reserves a bottom segment of memory in each process. From the OS point of view, this memory is always accessible by the CLR. An access of this area always results in a NullReferenceException generated by the CLR's execution environment rather than an AV generated by the OS. (In fact, the CLR doesn't translate an AV into an NRE).

For instance, when emitting code for "(x == null ? null : x.y)" then the JIT generates something like this
code:
mov rax, X
test rax, rax
beq :lbl
mov rax, [rax + k]
:lbl
The bit that computes x.y never actually needs to even perform a null-check on x! I guess that's why it's so fast to iterate over a linked-list.

ljw1004 fucked around with this message at 23:10 on Jan 30, 2015

raminasi
Jan 25, 2005

a last drink with no ice

ljw1004 posted:

If you're doing immutable stuff then you should use ImmutableList!
https://msdn.microsoft.com/en-us/library/dn467185(v=vs.111).aspx

This is dead sexy, but not worth pulling another dependency in, even if it's from Microsoft.

ljw1004
Jan 18, 2005

rum

GrumpyDoctor posted:

This is dead sexy, but not worth pulling another dependency in, even if it's from Microsoft.

We're actually planning to adopt NuGet wholesale, so the .NET Framework itself will just be represented as a collection of NuGet dependencies. In such a world, you might have a dependency on a System.Core NuGet package, one on a System.Xml NuGet package, and one on a System.Collections.Immutable NuGet package. http://blogs.msdn.com/b/dotnet/archive/2014/12/04/introducing-net-core.aspx

So I hope that, once this finally becomes reality (like it already is in ASP.NET vNext), concerns like yours will fade away...

Dietrich
Sep 11, 2001

ljw1004 posted:

We're actually planning to adopt NuGet wholesale, so the .NET Framework itself will just be represented as a collection of NuGet dependencies. In such a world, you might have a dependency on a System.Core NuGet package, one on a System.Xml NuGet package, and one on a System.Collections.Immutable NuGet package. http://blogs.msdn.com/b/dotnet/archive/2014/12/04/introducing-net-core.aspx

So I hope that, once this finally becomes reality (like it already is in ASP.NET vNext), concerns like yours will fade away...

Yesssssssssss!!!

fankey
Aug 31, 2001

I'm stuck on 4.0 and want to async-ually POST form data. It looks like UploadValuesTaskAsync was added in 4.5. Is there a way to convert a NameValueCollection to the same series of bytes that would be sent in UploadValues, so I can use UploadDataTaskAsync to post my data?
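On the "same series of bytes" part specifically: UploadValues sends a standard application/x-www-form-urlencoded body, so something along these lines produces an equivalent payload (ToFormUrlEncodedBytes is an invented helper name, and I won't promise it's byte-for-byte identical to what UploadValues emits; remember to set the Content-Type header yourself when posting it via an UploadData* method):

code:
using System;
using System.Collections.Specialized;
using System.Text;

public static class FormEncoding
{
    // Builds an application/x-www-form-urlencoded payload from a NameValueCollection,
    // the same format WebClient.UploadValues posts.
    public static byte[] ToFormUrlEncodedBytes(NameValueCollection values)
    {
        var sb = new StringBuilder();
        foreach (string key in values.AllKeys)
        {
            foreach (string value in values.GetValues(key) ?? new string[0])
            {
                if (sb.Length > 0) sb.Append('&');
                sb.Append(Uri.EscapeDataString(key)).Append('=').Append(Uri.EscapeDataString(value));
            }
        }
        return Encoding.UTF8.GetBytes(sb.ToString());
    }
}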

ljw1004
Jan 18, 2005

rum

fankey posted:

I'm stuck on 4.0 and want to async-ually POST form data. It looks like UploadValuesTaskAsync was added in 4.5. Is there a way to convert a NameValueCollection to the same series of bytes that would be sent in UploadValues, so I can use UploadDataTaskAsync to post my data?

One option is to write it yourself. All the *TaskAsync methods are just thin wrappers around the corresponding legacy callback-based *Async methods, like these...
code:
   wc.UploadValuesCompleted += handler;
   wc.UploadValuesAsync(...);
All of the "thin wrappers" all look more or less like this:
code:
public static async Task<byte[]> UploadValuesTaskAsync(this WebClient wc, string addr, NameValueCollection v)
{
    // Bridge the event-based UploadValuesAsync into a Task via TaskCompletionSource.
    var tcs = new TaskCompletionSource<byte[]>();

    UploadValuesCompletedEventHandler lambda = (sender, e) => {
         if (e.Cancelled) tcs.TrySetCanceled();
         else if (e.Error != null) tcs.TrySetException(e.Error);
         else tcs.TrySetResult(e.Result);
    };

    wc.UploadValuesCompleted += lambda;
    try
    {
        wc.UploadValuesAsync(new Uri(addr), v);
        return await tcs.Task;
    }
    finally
    {
        wc.UploadValuesCompleted -= lambda;
    }
}
If you add a reference to the NuGet package "Microsoft.Net.Http" which includes many *TaskAsync wrappers in its Microsoft.Threading.Tasks.Extension.Desktop.dll assembly, you can examine them in reflector to see that all the wrappers are basically just like this (albeit slightly more efficient because they avoid a few extra heap allocations).
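To make the call site concrete, here's a hypothetical usage of the wrapper above (the URL and form fields are invented; on 4.0 you'd also need the async/await compiler support, e.g. via the Microsoft.Bcl.Async package):

code:
using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;
using System.Threading.Tasks;

public static class UploadExample
{
    public static async Task PostFormAsync()
    {
        var values = new NameValueCollection
        {
            { "name", "goon" },
            { "quantity", "3" }
        };

        using (var wc = new WebClient())
        {
            // Await the extension method defined above instead of wiring up the event by hand.
            byte[] response = await wc.UploadValuesTaskAsync("http://example.com/submit", values);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}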


Sorry to sound like a broken record :) but I made a short video that explains this "TaskCompletionSource" technique for wrapping up event-based APIs into Task-returning APIs. It's a technique that's needed all over the place.
http://channel9.msdn.com/Series/Three-Essential-Tips-for-Async/Lucian03-TipsForAsyncThreadsAndDatabinding



Note that these kinds of slim wrappers are incapable of handling cancellation nicely, in the "CancellationToken" style. That's because the underlying event-based API doesn't expose any means of doing per-operation cancels. It only allows you to cancel every pending async operation via WebClient.CancelAsync.

ljw1004 fucked around with this message at 00:29 on Jan 31, 2015

EssOEss
Oct 23, 2006
128-bit approved

ljw1004 posted:

We're actually planning to adopt NuGet wholeself, so the .NET framework itself will be just represented as a collection of NuGet dependencies.

I hope NuGet gets some serious upgrades and fixes before this happens. In recent years, I see ever more half-baked features and mystery errors. Bcl.Build not playing nicely with the Azure SDK, random unexplained failures to execute Install.ps1 in certain projects if installing via the GUI but not via the Package Manager Console, the public repository being unstable if accessed from a Windows Server 2012 machine with the default TCP configuration, etc. etc. etc. It was cool when it was just an easy file distribution mechanism, but now that people are piling extra logic and fancy features onto it, I am very wary of going NuGet-heavy in any project.

EssOEss fucked around with this message at 00:43 on Jan 31, 2015


Inverness
Feb 4, 2009

Fully configurable personal assistant.

ljw1004 posted:

:( I think that 350k lines of code in 37 workdays is pretty darned impressive! (assuming reasonable time off for Thanksgiving and Christmas).
That was directed at the size of the framework, not the progress rate in open sourcing it.

So it's said that this new stack is going to be the future of .NET going forward. I think it's well thought out and hope that is the case. Does that mean .NET Core is actually going to replace the other non-desktop stacks at some point?

It's always bothered me how the framework, or at least parts of it, were never kept consistent between platforms. One of the things that bothered me was reading some blog post about how they were changing what type Type derived from on one of the stacks.

Inverness fucked around with this message at 04:10 on Jan 31, 2015
