Knyteguy
Jul 6, 2005

YES to love
NO to shirts


Toilet Rascal
One more quick question about async:
C# code:
public virtual async Task<JsonObject> GetJsonValueFromServerAsync(string outJson)
{
    var serverUrl = GetApiUrl();

    var http = new HttpClient();
    try
    {
        http.BaseAddress = new Uri(serverUrl);
        var response = await http.PostAsync("", new StringContent(outJson)); // problems here see note below
        response.EnsureSuccessStatusCode();

        return JsonObject.Parse(response.Content.ToString());
    }
    catch { } // Just for testing
    return null; // ^
}
Calling Method:
code:
public void GetCustomerItemData()
{
    ServerConnector sc = new ServerConnector();
    var serverValuesTask = sc.GetJsonValueFromServerAsync("");
    Task.WaitAll(serverValuesTask);
    
    JsonObject serverValues = serverValuesTask.Result; // Debugger never reaches here
}
Ok, so here's what's happening at the "problems here see note below" comment:
The first param is the URI, which isn't needed yet. The thread exits here, the UI locks up, there's no error, and everything sits in stasis until I terminate the debug session.

Any ideas? I don't have the server API set up yet and the param outJson == string.Empty, but why am I not getting any feedback from the debugger? Why would everything just lock up?


New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Knyteguy posted:

One more quick question about async:
C# code:

public virtual async Task<JsonObject> GetJsonValueFromServerAsync(string outJson)
{
    var serverUrl = GetApiUrl();

    var http = new HttpClient();
    try
    {
        http.BaseAddress = new Uri(serverUrl);
        var response = await http.PostAsync("", new StringContent(outJson)); // problems here see note below
        response.EnsureSuccessStatusCode();

        return JsonObject.Parse(response.Content.ToString());
    }
    catch { } // Just for testing
    return null; // ^
}
Calling Method:
code:

public void GetCustomerItemData()
{
    ServerConnector sc = new ServerConnector();
    var serverValuesTask = sc.GetJsonValueFromServerAsync("");
    Task.WaitAll(serverValuesTask);
    
    JsonObject serverValues = serverValuesTask.Result; // Debugger never reaches here
}

Ok, so here's what's happening at the "problems here see note below" comment:
The first param is the URI, which isn't needed yet. The thread exits here, the UI locks up, there's no error, and everything sits in stasis until I terminate the debug session.

Any ideas? I don't have the server API set up yet and the param outJson == string.Empty, but why am I not getting any feedback from the debugger? Why would everything just lock up?

Don't use Task.WaitAll. Make the method async Task and await the call.
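
Roughly, the calling side ends up looking like this (untested sketch, keeping the names from the post above; it assumes whatever calls this method can itself be made async):
C# code:
public async Task GetCustomerItemDataAsync()
{
    var sc = new ServerConnector();

    // Await instead of blocking with Task.WaitAll / .Result. Blocking deadlocks
    // because the await's continuation is waiting to get back onto the same UI
    // thread that Task.WaitAll is sitting on.
    JsonObject serverValues = await sc.GetJsonValueFromServerAsync("");

    // ... use serverValues here ...
}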

Knyteguy
Jul 6, 2005

YES to love
NO to shirts


Toilet Rascal

Ithaqua posted:

Don't use Task.WaitAll. Make the method async Task and await the call.

Thanks, that fixed the problems.

ljw1004
Jan 18, 2005

rum

Knyteguy posted:

Thanks, that fixed the problems.

Also, don't use ".Result" in code that also uses await. It will usually run into the same problems.


Question: if you have a method that absolutely CAN'T be an async method for whatever reason, but it still needs to call an async method, how do you do this?

Answer: you basically can't. It has to be async all the way down and all the way up. More here: http://blogs.msdn.com/b/pfxteam/archive/2012/04/13/10293638.aspx
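
To make "all the way up" concrete: whatever calls the now-async method awaits it too, until you reach a top-level entry point. In UI code that's typically an event handler, which is the one place async void is reasonable. A sketch (the handler name is made up; GetCustomerItemDataAsync is the async version of the calling method from above):
C# code:
private async void RefreshButton_Click(object sender, EventArgs e)
{
    // The chain of awaits ends here. The UI thread stays free while the
    // request runs, instead of blocking on .Result or Task.WaitAll.
    await GetCustomerItemDataAsync();
}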

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

I know why this is happening, but I thought I'd pick youse guys brains on how to possibly get around it. I'm grabbing some images, resizing them, copying them somewhere, and then deleting them. The code I do it with is like so:
code:
Dim myClient As New System.Net.WebClient
dim dsImagesToGet as dataset = 'sql to get some images
For Each dr As DataRow In dsImagesToGet.Tables(0).Rows
  Dim fn As String = RegexFound("\/([A-Z0-9]+\.JPG)", dr("first_image"))
  Dim fna As String = RegexFound("\/([A-Z0-9]+a\.JPG)", dr("second_image"))

  myClient.DownloadFile(dr("first_image"), "d:\temp\images\small\" & fn)
  myClient.DownloadFile(dr("second_image"), "d:\temp\images\alt\" & fna)
  Dim source As New System.Drawing.Bitmap("d:\temp\images\small\" & fn)
  Dim target As Bitmap = ResizeImage(source, 1000, 1000) 'External function that does resizing
  target.Save("d:\images\pub\1000\" & fn, System.Drawing.Imaging.ImageFormat.Jpeg)
  File.Copy("d:\temp\images\alt\" & fna, "d:\images\pub\alt\" & fna)
  File.Delete("d:\temp\ruby\images\small\" & fn)
  File.Delete("d:\temp\ruby\images\alt\" & fna)
Next
Everything works except for the last step of deleting the file, where I get "access denied". I've run into this before when dealing with CSV/TXT/XML files I've created: they basically get locked until the entire process is over. I could set up a scheduled task to clear the directory at (x) PM every night, but I was wondering, is there a way to do it inside the For...Next loop without having to make another one?

Essential
Aug 14, 2003

Scaramouche posted:

I know why this is happening, but I thought I'd pick youse guys brains on how to possibly get around it. I'm grabbing some images, resizing them, copying them somewhere, and then deleting them. The code I do it with is like so:
code:
Dim myClient As New System.Net.WebClient
dim dsImagesToGet as dataset = 'sql to get some images
For Each dr As DataRow In dsImagesToGet.Tables(0).Rows
  Dim fn As String = RegexFound("\/([A-Z0-9]+\.JPG)", dr("first_image"))
  Dim fna As String = RegexFound("\/([A-Z0-9]+a\.JPG)", dr("second_image"))

  myClient.DownloadFile(dr("first_image"), "d:\temp\images\small\" & fn)
  myClient.DownloadFile(dr("second_image"), "d:\temp\images\alt\" & fna)
  Dim source As New System.Drawing.Bitmap("d:\temp\images\small\" & fn)
  Dim target As Bitmap = ResizeImage(source, 1000, 1000) 'External function that does resizing
  target.Save("d:\images\pub\1000\" & fn, System.Drawing.Imaging.ImageFormat.Jpeg)
  File.Copy("d:\temp\images\alt\" & fna, "d:\images\pub\alt\" & fna)
  File.Delete("d:\temp\ruby\images\small\" & fn)
  File.Delete("d:\temp\ruby\images\alt\" & fna)
Next
Everything works except for the last step of deleting the file, where I get "access denied". I've run into this before when dealing with CSV/TXT/XML files I've created: they basically get locked until the entire process is over. I could set up a scheduled task to clear the directory at (x) PM every night, but I was wondering, is there a way to do it inside the For...Next loop without having to make another one?

Is it just the first File.Delete that has the access denied? I wonder if it's the ResizeImage() that's locking that file. I'm almost positive (I don't have the code in front of me at the moment) that I've called webclient.downloadfile and then deleted the file after moving it. Can you wrap the webclients in using statements to make sure they get disposed?

Or possibly you need to dispose target first?

I'm pretty sure any object that touches the images has to be disposed before you can delete. If that's the case, then one (or more) of those objects is what's locking the file.
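
The shape of that would be something like this (a C# sketch rather than VB, untested; ResizeImage is the existing external function from the post, and the url/path parameters are placeholders):
C# code:
// Needs System.Net, System.Drawing, System.Drawing.Imaging and System.IO.
void DownloadResizeAndCleanUp(string imageUrl, string tempPath, string resizedPath)
{
    using (var client = new WebClient())
    {
        client.DownloadFile(imageUrl, tempPath);
    }

    // new Bitmap(path) keeps the file on disk open until the Bitmap is
    // disposed, which is the usual cause of a later "access denied".
    using (var source = new Bitmap(tempPath))
    using (var target = ResizeImage(source, 1000, 1000))
    {
        target.Save(resizedPath, ImageFormat.Jpeg);
    }

    // Everything that touched the temp file has been disposed, so the delete works.
    File.Delete(tempPath);
}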

Essential fucked around with this message at 00:42 on Sep 11, 2014

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!
This is the TFS thread too, right? We're migrating from TFVC to Git and I have some questions about how to handle post-build activities. Our custom build workflow currently runs some exes after the build that we used to have checked into source control. Since checking binaries into Git is a bad thing, what are the alternative ways to make the exes available to be run on the build agent?

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Eggnogium posted:

This is the TFS thread too, right? We're migrating from TFVC to Git and I have some questions about how to handle post-build activities. Our custom build workflow currently runs some exes after the build that we used to have checked into source control. Since checking binaries into Git is a bad thing, what are the alternative ways to make the exes available to be run on the build agent?

TFS-hosted Git? It has (not surprisingly) totally different build process templates to handle Git. Try not to modify the build process template if you can; it's a big pain in the rear end to upgrade them (although I hear TFS 2014/2015 will have some changes in this regard that will make life easier). The 2013 templates have pre- and post-build extension points where you can tell it to run something.

In any case, why not put the binaries in NuGet? The current thinking is to not put binaries in TFVC, either.

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

Ithaqua posted:

TFS-hosted Git? It has (not surprisingly) totally different build process templates to handle Git. Try not to modify the build process template if you can; it's a big pain in the rear end to upgrade them (although I hear TFS 2014/2015 will have some changes in this regard that will make life easier). The 2013 templates have pre- and post-build extension points where you can tell it to run something.

In any case, why not put the binaries in NuGet? The current thinking is to not put binaries in TFVC, either.

TFS-hosted Git, yeah. I'm working off the Git template to reconstruct our custom workflows. They're not terribly different from the defaults, but we have one custom workflow activity, so I have to make a few minor changes.

Yeah, NuGet was my first thought. The problem is that said binary has some binary dependencies of its own, which are already in their own NuGet package. So I'm left with either putting all these binaries back together from their installed packages into a single folder as part of the build, or creating a single package with all the binaries, both of which seem icky.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Eggnogium posted:

TFS-hosted Git, yeah. I'm working off the Git template to reconstruct our custom workflows. They're not terribly different from the defaults, but we have one custom workflow activity, so I have to make a few minor changes.

Yeah, NuGet was my first thought. The problem is that said binary has some binary dependencies of its own, which are already in their own NuGet package. So I'm left with either putting all these binaries back together from their installed packages into a single folder as part of the build, or creating a single package with all the binaries, both of which seem icky.

You can have NuGet packages require other NuGet packages via the <dependency> element.

http://docs.nuget.org/docs/reference/nuspec-reference#Specifying_Dependencies

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

Ithaqua posted:

You can have NuGet packages require other NuGet packages via the <dependency> element.

http://docs.nuget.org/docs/reference/nuspec-reference#Specifying_Dependencies

But won't NuGet install the dependencies to separate directories? How can I avoid FileNotFoundExceptions at runtime when the framework tries to load the dependencies and they aren't in the same directory as the EXE?

darthbob88
Oct 13, 2011

YOSPOS
This is probably a stupid question, but is there a good way to write data to multiple files simultaneously? I'm trying to create a system for cataloging my media collections, so that if I ever lose the files I can rebuild/replace them, and I'm putting the catalogs in cloud storage, so I can retrieve them even if my main computer dies. I'd like to be even more redundant, and store the files in multiple cloud storage folders, so even if Dropbox and Microsoft OneDrive fail, I can still get them from Google Drive. At the moment, I do that by writing the files to one folder and manually copying them to the other folders, but that's reliant on me remembering, and it'd be good if I could have the program do it automatically. I probably could also use File.Copy, but that seems inelegant, so I'd prefer to just have a StreamWriter that can write multiple streams at once, without too much juggling.

RICHUNCLEPENNYBAGS
Dec 21, 2010

darthbob88 posted:

This is probably a stupid question, but is there a good way to write data to multiple files simultaneously? I'm trying to create a system for cataloging my media collections, so that if I ever lose the files I can rebuild/replace them, and I'm putting the catalogs in cloud storage, so I can retrieve them even if my main computer dies. I'd like to be even more redundant, and store the files in multiple cloud storage folders, so even if Dropbox and Microsoft OneDrive fail, I can still get them from Google Drive. At the moment, I do that by writing the files to one folder and manually copying them to the other folders, but that's reliant on me remembering, and it'd be good if I could have the program do it automatically. I probably could also use File.Copy, but that seems inelegant, so I'd prefer to just have a StreamWriter that can write multiple streams at once, without too much juggling.

Sure, this kind of thing is easy with async methods, which StreamWriter exposes... just add a bunch of tasks to a list (or Select them or whatever) and use Task.WaitAll when you need to.
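
A tiny sketch of that (untested; the paths and catalogText are placeholders, and it needs System.IO, System.Linq and System.Threading.Tasks):
C# code:
var paths = new[] { @"C:\Dropbox\catalog.txt", @"C:\OneDrive\catalog.txt", @"C:\GoogleDrive\catalog.txt" };

var tasks = paths.Select(async path =>
{
    using (var writer = new StreamWriter(path))
    {
        // The WriteAsync calls overlap instead of running one after another.
        await writer.WriteAsync(catalogText);
    }
});

// Fine in a console app; in UI code prefer 'await Task.WhenAll(tasks)' inside
// an async method so you don't block the UI thread.
Task.WaitAll(tasks.ToArray());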

darthbob88
Oct 13, 2011

YOSPOS

RICHUNCLEPENNYBAGS posted:

Sure, this kind of thing is easy with async methods, which StreamWriter exposes... just add a bunch of tasks to a list (or Select them or whatever) and use Task.WaitAll when you need to.
That might do it, but I don't understand async that well. I don't suppose I can get a demonstration, just a toy program?

ETA: I think I might get it. You're suggesting three StreamWriters to write to three files, with asyncs and awaits to keep them vaguely in sync. That'd work, but what I really wanted was something that'd let me do
code:
using (var comicsLog = File.CreateText(file1, file2, file3)) {comicsLog.Write(things)}

darthbob88 fucked around with this message at 05:40 on Sep 11, 2014

raminasi
Jan 25, 2005

a last drink with no ice

darthbob88 posted:

That might do it, but I don't understand async that well. I don't suppose I can get a demonstration, just a toy program?

ETA: I think I might get it. You're suggesting three StreamWriters to write to three files, with asyncs and awaits to keep them vaguely in sync. That'd work, but what I really wanted was something that'd let me do
code:
using (var comicsLog = File.CreateText(file1, file2, file3)) {comicsLog.Write(things)}

What specific problem do you think that will solve that three separate async writes won't?

darthbob88
Oct 13, 2011

YOSPOS

GrumpyDoctor posted:

What specific problem do you think that will solve that three separate async writes won't?
It'd save creating and disposing of two StreamWriters per media collection, and it'd be marginally easier to have the StreamWriter write to another folder than to create and dispose of another StreamWriter, but that only matters to my crippling Asperger's. The async is probably unnecessary; at the moment the catalogs total only about 400K, with the largest being 180K, so I can just do it with three regular writes.

EssOEss
Oct 23, 2006
128-bit approved

Eggnogium posted:

But won't NuGet install the dependencies to separate directories? How can I avoid FileNotFoundExceptions at runtime when the framework tries to load the dependencies and they aren't in the same directory as the EXE?

I am detecting some confusion here regarding how NuGet works. I will try to explain.

NuGet is a way to deliver binaries (+ other irrelevant crap) into the solution. Everything is downloaded into a neat little packages directory. These are just files and do not have any intrinsic behavior associated with them.

When you install a NuGet package into a project in your solution, in addition to just downloading the package files as above, NuGet will also scan the package for any assemblies that match the platform of your project (as defined by the folder names in the package), after which it will create references to these assemblies.

When you build a project or solution, these referenced assemblies (unless configured otherwise) are copied to your binary output directory. The end result is that you will have all the files you need to run your app in one directory.

If the behavior you are seeing is different, something in the above workflow has been broken. Is a NuGet package not correctly referencing its dependency packages? Are the binaries marked with an incorrect platform, so the references are not created? If you give more information, we can assist in figuring it out. But at no point should using a NuGet package require any more prerequisite setup than installing that package.

A Tartan Tory
Mar 26, 2010

You call that a shotgun?!
Ignore me, fixed my own problem.

A Tartan Tory fucked around with this message at 10:25 on Sep 11, 2014

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

darthbob88 posted:

That might do it, but I don't understand async that well. I don't suppose I can get a demonstration, just a toy program?

ETA: I think I might get it. You're suggesting three StreamWriters to write to three files, with asyncs and awaits to keep them vaguely in sync. That'd work, but what I really wanted was something that'd let me do
code:
using (var comicsLog = File.CreateText(file1, file2, file3)) {comicsLog.Write(things)}

C# code:
public class MultiFileWriter
{
    public async Task WriteFilesAsync(byte[] data, params string[] files)
    {
        var tasks = files.Select(f => WriteAsync(data, f));
        await Task.WhenAll(tasks);
    }

    private async Task WriteAsync(byte[] data, string path)
    {
        using (var fs = new FileStream(path, FileMode.Create))
        {
            await fs.WriteAsync(data, 0, data.Length);
        }
    }
}
Bugs + typos may exist, code from the internet warnings still apply.

raminasi
Jan 25, 2005

a last drink with no ice

darthbob88 posted:

It'd save creating and disposing of two StreamWriters per media collection, and it'd be marginally easier to have the StreamWriter write to another folder than to create and dispose of another StreamWriter, but that only matters to my crippling Asperger's. The async is probably unnecessary; at the moment the catalogs total only about 400K, with the largest being 180K, so I can just do it with three regular writes.

Computers exist to do boring, repetitive work. Wait for the profiler to start whining before you worry about this kind of optimization.

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

EssOEss posted:

I am detecting some confusion here regarding how NuGet works. I will try to explain.

NuGet is a way to deliver binaries (+ other irrelevant crap) into the solution. Everything is downloaded into a neat little packages directory. These are just files and do not have any intrinsic behavior associated with them.

When you install a NuGet package into a project in your solution, in addition to just downloading the package files as above, NuGet will also scan the package for any assemblies that match the platform of your project (as defined by the folder names in the package), after which it will create references to these assemblies.

When you build a project or solution, these referenced assemblies (unless configured otherwise) are copied to your binary output directory. The end result is that you will have all the files you need to run your app in one directory.

If the behavior you are seeing is different, something in the above workflow has been broken. Is a NuGet package not correctly referencing its dependency packages? Are the binaries marked with an incorrect platform, so the references are not created? If you give more information, we can assist in figuring it out. But at no point should using a NuGet package require any more prerequisite setup than installing that package.

Okay, yeah, I was totally unaware of all that auto-referencing behavior because I've been mainly working with solution-level packages. Thank you for clarifying. If I understand right, I would add a dummy project that references all the packages I need to assemble my application, and the build will copy them all into the output directory, where I can run it from after the build is done.

raminasi
Jan 25, 2005

a last drink with no ice
Why can't you just have the projects that need NuGet packages directly reference them?

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

GrumpyDoctor posted:

Why can't you just have the projects that need NuGet packages directly reference them?

Well the tool kicks off deployment for the whole application to a test environment, so logically it seems more appropriate to associate it with the solution, or with a special project within the solution. I'd basically just be picking a project at random in each of our repositories if I associated it with an existing project, which seems confusing.

EssOEss
Oct 23, 2006
128-bit approved
I don't get it. Does this NuGet package contain things that are not actually needed by any of the projects, but by something external to the solution? I would be interested in hearing more details about your scenario. In my experience, NuGet packages provide stuff (usually code in assemblies) for projects, so you simply install them into whichever projects require the stuff.

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

EssOEss posted:

I don't get it. Does this NuGet package contain things that are not actually needed by any of the projects, but by something external to the solution? I would be interested in hearing more details about your scenario. In my experience, NuGet packages provide stuff (usually code in assemblies) for projects, so you simply install them into whichever projects require the stuff.

Yes, this NuGet package isn't used to build projects at all. It contains (or would contain, haven't built the package yet) an exe that is run after all the projects are built to deploy the combined application, in a post-build step of the TFS build workflow.

Right now I'm leaning towards just breaking best practices and throwing the exe and all its dependencies in a single solution-level package, then running it out of the packages folder.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Eggnogium posted:

Well the tool kicks off deployment for the whole application to a test environment

Don't do this from build. Use a real release tool.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Essential posted:

Is it just the first File.Delete that has the access denied? I wonder if it's the ResizeImage() that's locking that file. I'm almost positive (I don't have the code in front of me at the moment) that I've called webclient.downloadfile and then deleted the file after moving it. Can you wrap the webclients in using statements to make sure they get disposed?

Or possibly you need to dispose target first?

I'm pretty sure any object that touches the images has to be disposed before you can delete. If that's the case, then one (or more) of those objects is what's locking the file.

I'm looking into this now since I have to do this with another image source that already has images of the right size. However, that leads to:

Another Question for .NET Experts

I'm downloading these image files using a webclient as above; however, this is an 'image server' that I have to give parameters to, like so:
code:
myClient.DownloadFile("http://blah.com/imgsrc/" & filename & "?wid=1000&hei=1000", "d:\temp\images\1000\" & filename)
However, someone on the other end is being a clever dick. If the file doesn't exist, it returns an 'image not available' image called missimg.jpg@wid=1000&hei=1000.

The problem being, I don't want those crappy images. And given how myClient.DownloadFile works, they're going to be invisibly renamed to be legit images, since the second parameter provided is what filename to save it as. Is there a way to check what filename is being provided after the Address gets downloaded, but before the Filename gets saved? Kind of hijack it in the middle, as it were.

raminasi
Jan 25, 2005

a last drink with no ice
Is there a way to get more detailed information about what's causing a TypeLoadException? I know the type it can't find, but I haven't the faintest idea why it can't find the type, and I'm pulling my hair out trying to figure out what's going on.

No Safe Word
Feb 26, 2005

GrumpyDoctor posted:

Is there a way to get more detailed information about what's causing a TypeLoadException? I know the type it can't find, but I haven't the faintest idea why it can't find the type, and I'm pulling my hair out trying to figure out what's going on.

It's always a binding redirect :v:

No seriously, whenever it hasn't been some NuGet package (or rogue developer) doing something stupid with binding redirects, I've been able to fairly easily step through the normal places .NET looks for poo poo and figure it out manually, but we also don't do much mucking about with assembly loading.

Sedro
Dec 31, 2008
If you do think assembly loading is the problem, there's fusion logs.

Scaramouche posted:

I'm looking into this now since I have to do this with another image source that already has images of the right size. However, that leads to:
Pull the filename out of the response header, examples here.

raminasi
Jan 25, 2005

a last drink with no ice

No Safe Word posted:

It's always a binding redirect :v:

No seriously, whenever it hasn't been some NuGet package (or rogue developer) doing something stupid with binding redirects, I've been able to fairly easily step through the normal places .NET looks for poo poo and figure it out manually, but we also don't do much mucking about with assembly loading.

Oh, it's found the assembly. It loads all the other types in it fine. I just can't add any more, apparently.

Sedro
Dec 31, 2008
Does the type have a .cctor or any static fields?

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Sedro posted:

Pull the filename out of the response header, examples here.

Yeah, that's definitely looking like a thing I'll be checking out. I ran it through Fiddler to see what's coming back, and those sneaks are using a 302:
code:
HTTP/1.1 302 Found
Date: Fri, 12 Sep 2014 01:09:42 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 4.0.30319
Location: /images/missimg.jpg?wid=1000&hei=1000
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Length: 192

<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="/images/missimg.jpg?wid=1000&amp;hei=1000">here</a>.</h2>
</body></html>
So theoretically this will catch it:
code:
Using myClient
 myClient.OpenRead("http://blah.com/images/" & fn & "?w=1000&h=1000")
 Dim strHeader As String = myClient.ResponseHeaders("status-code")
 If Not strHeader.Contains("302") Then
  'do file saving stuff
 End If
End Using
The only thing I'm not sure of is how to peel the file off the webclient OpenRead without making another DownloadFile/request operation, something to do with StreamReader maybe.

Scaramouche fucked around with this message at 02:28 on Sep 12, 2014

Sedro
Dec 31, 2008

Scaramouche posted:

The only thing I'm not sure of is how to peel the file off the webclient OpenRead without making another DownloadFile/request operation, something to do with StreamReader maybe.

C# code:
using (var contents = webClient.OpenRead(...))
using (var fs = File.Create(@"\path\to\file"))
{
    contents.CopyTo(fs);
}
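
And if you also want to catch that 302 before saving anything: WebClient won't surface it (it follows redirects automatically, so ResponseHeaders never shows the 302), but a plain HttpWebRequest with auto-redirects turned off will. A rough, untested sketch using the URL and paths from your post:
C# code:
// Needs System.Net and System.IO.
var request = (HttpWebRequest)WebRequest.Create("http://blah.com/imgsrc/" + fn + "?wid=1000&hei=1000");
request.AllowAutoRedirect = false;

using (var response = (HttpWebResponse)request.GetResponse())
{
    if (response.StatusCode == HttpStatusCode.Found)
    {
        // 302: their "missing image" redirect. response.Headers["Location"]
        // points at missimg.jpg, so just skip this file.
    }
    else
    {
        using (var contents = response.GetResponseStream())
        using (var fs = File.Create(@"d:\temp\images\1000\" + fn))
        {
            contents.CopyTo(fs);
        }
    }
}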

raminasi
Jan 25, 2005

a last drink with no ice

Sedro posted:

Does the type have a .cctor or any static fields?

Yep, static class with .cctor.

Sedro
Dec 31, 2008
Obvious question, does it throw an exception?

epswing
Nov 4, 2003

Soiled Meat
I'd like to understand a bit better how to deal with 3rd party libraries, and logging. I use System.Diagnostics.Trace throughout my application, for my own application logging purposes. When I suck in a 3rd party library via NuGet, and that library does some logging, do I generally want to see their logs in my logs? If the answer is yes, what if the 3rd party lib uses a different logging mechanism, like log4net? If I have my facts straight, DiagnosticsTraceAppender will just act as a pass-through to pipe log4net logs into System.Diagnostics logs, so everything ends up in the same place.

How is this usually handled?

XML code:
<log4net>
  <appender name="DiagnosticsTraceAppender" type="Blah.DiagnosticsTraceAppender">
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%message" />
    </layout>
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="DiagnosticsTraceAppender" />
  </root>
</log4net>
C# code:
using System.Diagnostics;

using log4net.Appender;
using log4net.Core;

namespace Blah
{
    public class DiagnosticsTraceAppender : TraceAppender
    {
        protected override void Append(LoggingEvent loggingEvent)
        {
            var level = loggingEvent.Level;
            var message = RenderLoggingEvent(loggingEvent);
            if (level >= Level.Error)
                Trace.TraceError(message);
            else if (level >= Level.Warn)
                Trace.TraceWarning(message);
            else if (level >= Level.Info)
                Trace.TraceInformation(message);
            else
                Trace.Write(message);
            if (ImmediateFlush)
                Trace.Flush();
        }
    }
}

RICHUNCLEPENNYBAGS
Dec 21, 2010

Bognar posted:

C# code:
public class MultiFileWriter
{
    public async Task WriteFilesAsync(byte[] data, params string[] files)
    {
        var tasks = files.Select(f => WriteAsync(data, f));
        await Task.WhenAll(tasks);
    }

    private async Task WriteAsync(byte[] data, string path)
    {
        using (var fs = new FileStream(path, FileMode.Create))
        {
            await fs.WriteAsync(data, 0, data.Length);
        }
    }
}
Bugs + typos may exist, code from the internet warnings still apply.

More or less what I had in mind, yeah. I like async/await a whole lot; I miss it when working in JS-land and having a million callbacks.

raminasi
Jan 25, 2005

a last drink with no ice

Sedro posted:

Obvious question, does it throw an exception?

I forgot that I'd disabled the .cctor to try to solve this problem. The class at this point has a single method which is a no-op.

edit: Problem solved, the wrong .dll was getting loaded. (It was an old version in a different location.) ProcMon, you've saved my bacon yet again!

raminasi fucked around with this message at 06:25 on Sep 12, 2014


Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

epalm posted:

When I suck in a 3rd party library via NuGet, and that library does some logging, do I generally want to see their logs in my logs? If the answer is yes, what if the 3rd party lib uses a different logging mechanism, like log4net? If I have my facts straight, DiagnosticsTraceAppender will just act as a pass-through to pipe log4net logs into System.Diagnostics logs, so everything ends up in the same place.

How is this usually handled?

Yes, that will insert other libraries' log messages into your trace stream. Doesn't log4net include a trace appender already though?

With log4net, one configuration applies to all logging instances in the entire AppDomain. (For example, as part of my own application's log configuration, I set the NHibernate loggers to only be included at the error level and above.)
