gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Ithaqua posted:

It isn't. It works fine in an arbitrary console application. What's this "Dump" extension method?

I can confirm that it works fine. It outputs "H012345678901" twice. gently caress them, what are you expecting as output? You have a NullReferenceException in your Where clause. You should check String.IsNullOrWhiteSpace first and then run the Regex on your string.

The Dump extension method is a LINQPad feature that tells it what to output.


gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

gently caress them posted:

Just what exactly are ALL of the places one can change how requests are returned in .NET? I've gone through every single decorator I can find for my methods in .svc and .asmx and .whatever files, and I want to return JSON, and I keep either returning a single XML string which itself contains JSON, or I return JSON in a SOAP envelope.

Is there something in web.config I've overlooked? I've scoured MSDN, Stack Overflow, Google, and even tried restarting VS a few times, and it's getting very frustrating.

I just want to spit out a JSON string :(

Stop using ASMX. It's really freaking old. If you are starting fresh today, you want to use Web API. That's the easiest way to send JSON or XML over HTTP to a client. If you can't use Web API, use WCF.
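For anyone landing here with the same problem, a minimal Web API controller sketch looks something like this (class and property names are made up). Web API content-negotiates, so returning a plain object gives JSON to clients that ask for it — no SOAP envelope, no XML-wrapped strings:

```csharp
// Hypothetical example: Widget/WidgetsController are not from the thread.
using System.Web.Http;

public class Widget
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class WidgetsController : ApiController
{
    // GET api/widgets/5 -> serialized as JSON or XML based on the Accept header
    public Widget Get(int id)
    {
        return new Widget { Id = id, Name = "example" };
    }
}
```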

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Newf posted:

I've got a class with some static and some non-static methods, and I'd like to lazily check the existence of system setup / default configurations whenever the class is referenced. Is the following a bad idea for any reason?

C# code:
class Foo
{
 private static bool initialized = InitClass();
 private static bool InitClass()
 {
  // check that necessary directory structure is in place
  // check that a 'default' foo is in place
  return true;
 }
}
I figure that InitClass() will now be called only when the class is first referenced at runtime for any given run of the program, and will run before any static method or constructor since the class needs to initialize all of its static variables before running any of its meat code.

InitClass will be called at startup because the runtime wants to create the variable initialized. It sounds like you want Lazy&lt;T&gt;. Without a better sample it's hard to say if this is good or bad. I'd lean toward not doing it, because anything static is hard to mock out or hide during a unit test.
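A Lazy&lt;T&gt; version of the quoted class might look like this (a sketch, keeping Newf's names; the checks are still just comments):

```csharp
// Lazy<T> defers InitClass() until the first time .Value is read,
// and is thread-safe by default.
using System;

class Foo
{
    private static readonly Lazy<bool> initialized =
        new Lazy<bool>(InitClass);

    private static bool InitClass()
    {
        // check that necessary directory structure is in place
        // check that a 'default' foo is in place
        return true;
    }

    public void DoWork()
    {
        // First touch runs InitClass(); later touches read the cached value.
        bool ok = initialized.Value;
    }
}
```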

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

ljw1004 posted:

In practice, the JIT compiles a method only at the first time the method is invoked. At that moment it first chases down every type that's touched by the method, and ensures they're all loaded. The type-load process will invoke all static constructors (hence, will initialize all static variables).

You are correct. I was dumb. I created a console application and put the static items in the Program class, so InitClass is called at application startup because the .NET runtime creates the Program class. Creating another non-static class and putting the static member and method in it defers initialization until you touch the class.

EDIT: Was the initialization behavior changed since Skeet's blog post? I tried the "Eager initialization: .NET 3.5" example and it printed "Type initialized" when the class was touched. Changing the targeted .NET version didn't help. I don't have VS2008 installed and I'm too lazy to grab it

gariig fucked around with this message at 18:36 on Aug 8, 2014

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
You can, but I'd set up a server that you push to for integration. That way you can bring your local IIS Express up and down, and your co-workers don't need to worry about whether you left the API up or not. If you can't get another box, deploy to your local IIS instance and leave that running on port 80.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

RICHUNCLEPENNYBAGS posted:

So for people who do this, how do you handle different branches working with modified database schemas?

I'd spin up a VM for each branch to deploy to. What problem are you trying to solve?

EDIT: VVVV That's another way to do it. Also, deploying to Azure/AWS/etc would work

gariig fucked around with this message at 17:56 on Sep 7, 2014

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Can you post a code sample of what you are doing?

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
What are the values of the parameters passed in? I'm guessing you are getting a negative number for Take/Skip. Also, have you tried calling ToList() on the query to force it to be executed earlier?

gently caress them posted:

Edit: Naturally if I just put my "make an empty string having object and return that thang" logic into the catch block of a try catch It Just Works!

What?

gariig fucked around with this message at 18:33 on Sep 8, 2014

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

gently caress them posted:

That if else statement? I made it a Try/Catch And Now It Just Works™.


THAT makes sense.

Is the Try/Catch working because you are catching the Exception and returning a good value, or because the Exception no longer occurs inside the try/catch? Which part of it "Just Works™"?

Well, what are the values? That makes sense, but is it the root cause?

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Knyteguy posted:


C# code:
public virtual async Task<JsonObject> GetSettingsFromConfigFile(string configFilePath = @"../../Config/Settings.json")
{
    string content = String.Empty;

    var myStream = await ApplicationData.Current.LocalFolder.OpenStreamForReadAsync(configFilePath);
    using (StreamReader reader = new StreamReader(myStream))
    {
        content = reader.ReadToEnd();
    }

    return JsonObject.Parse(content);
}

public virtual JsonObject GetJsonValueFromServer(string outJson)
{
    Task.Factory.StartNew(async () =>
    {
        JsonObject settings;
        settings = await GetSettingsFromConfigFile();
        ServerUrl = settings["Server"].ToString();

        HttpWebRequest request =
        (HttpWebRequest)HttpWebRequest.Create(ServerUrl);

        request.BeginGetResponse(GetServerCallback, request);
    });
    return null;
}

What is happening is the Task you are creating from the Task.Factory is starting to run and you are immediately returning null. You need to capture the Task<T> created from the Task.Factory and await it.

EDIT: That also means GetJsonValueFromServer should return a Task<JsonObject>. Once you go async/await it will spread through your code like a virus.
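A rough sketch of where that ends up once async/await spreads through it (keeping the quoted names; HttpClient swapped in for the BeginGetResponse callback, which is my substitution, not Knyteguy's code):

```csharp
// Hedged sketch: returns Task<JsonObject> and awaits the work instead of
// fire-and-forget + return null. Assumes System.Net.Http is referenced and
// GetSettingsFromConfigFile/ServerUrl exist as in the quoted code.
public virtual async Task<JsonObject> GetJsonValueFromServerAsync()
{
    JsonObject settings = await GetSettingsFromConfigFile();
    ServerUrl = settings["Server"].ToString();

    using (var client = new HttpClient())
    {
        // GetStringAsync replaces the HttpWebRequest/BeginGetResponse dance.
        string body = await client.GetStringAsync(ServerUrl);
        return JsonObject.Parse(body);
    }
}
```

The caller then awaits this instead of getting back null, and the compiler forces the signature change all the way up the call chain.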

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Since so many people have asked about asynchronous and parallel programming this week, I'll plug Concurrency in C# as a great book on modern (.NET 4+) concurrency techniques. It takes a cookbook-style approach where each chapter has a theme (async/await, TPL, Rx) and a bunch of examples that build on each other. Excellent book.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

GrumpyDoctor posted:

Was the old behavior originally intended, or was it a bug? That behavior seems incredibly counterintuitive to me, especially because, if I understand right, there's no way to explicitly request it anywhere in the language.

(Also, how did I not know that?)

C# code:
foreach (string keyword in keywords)
{
  query = query.Where (p => p.Description.Contains (keyword));
}
var results = query.ToList();
The old way makes complete sense if you think about it (as a compiler). I recreated the old code without the temporary variable. What you are creating is a closure over the variable keyword, not the value of keyword. When the code evaluates query via ToList, all of the Where clauses have captured keyword, which by then holds the last value in keywords.

I'm glad Microsoft "fixed" this behavior to do what a normal human being would expect: have each Where clause capture the value of keyword by using the temporary variable.
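For anyone who never hit this, the pre-C# 5 workaround was exactly that temporary copy inside the loop:

```csharp
// Pre-C# 5 workaround: copy the loop variable so each lambda closes over
// its own variable instead of the single shared "keyword".
foreach (string keyword in keywords)
{
    string captured = keyword;   // fresh variable per iteration
    query = query.Where(p => p.Description.Contains(captured));
}
var results = query.ToList();    // each Where sees its own captured value
```

C# 5 made foreach behave as if each iteration had its own loop variable, so the copy is no longer needed (for foreach; plain for loops still capture the shared variable).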

Also, I didn't realize multiple Where clauses were cumulative (is that the right term?). I learned something new.

EDIT: VVVV It totally makes sense. I just never thought of it that way. At least Ithaqua didn't realize it. I take solace in that

gariig fucked around with this message at 17:12 on Sep 16, 2014

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

A Tartan Tory posted:

Ah right, let me try again.

1. Pre-populate three lists with single values, base the lists on the current year and month, with every value corresponding to a specific month.
2. Use those lists of values to do a sumproduct, but do it for every month so I can get the sumproduct for every month. (the month starts by using the next month's list value, so for the first month in the first year, it would start with the values from the second month in the first year until it got to the eleventh month in the second year, the second month in the first year would use the values from the third month in the first year until the eleventh month of the second year etc etc, the eleventh month of the second year would just use its own list numbers for that month, where any month after that would start from the eleventh month of the second year for its sumproduct and end at the current month's values)
3. Use the sumproduct for every month generated and divide it by two list values from that month, in order to get the final value.
4. Save the final value in another list based on the current month for future display.

Is that a little better?

No, I still don't understand what you are trying to do. Pretend I don't know what you are trying to calculate and describe the process. Also, add a couple of small examples of input and output. Maybe even the manual steps to turn your input into the output.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
You are (probably) overestimating how slow a database trip is. If the database is close to your web server (same host), you are looking at a couple of milliseconds per call. I would start with the database and then see what you need to do to optimize the process.

I'm also unsure what you mean by "doing it badly". Just appending everything with a StringBuilder sounds OK to me (until it's not)

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

wwb posted:

I'll just be contrarian here and note that while the concept of using a database makes a lot of sense you might not want an actual database server involved -- if it is just a key-value lookup then using a static Dictionary<string, GemInfo> isn't the most horrible thing in the world. Said "database" could be hydrated from an xml or json file which gets compiled in as a static resource. The key advantage here is that the app can travel with the data internally so you don't need to stand up the gem DB to do testing / developing. Downside is you've got a big object in memory but RAM is cheap these days.

The biggest problem for Scaramouche is that the data isn't 100% static. If someone updates a description, do you have to recycle the app pools so Application_Start rehydrates your Dictionary? I would start with a database; it could even be a SQL Server Express instance hosted on the web server. A database is nice in that it gives you a point of integration. Later you might need to allow updating descriptions or adding new gems not in the original dump of data. If you are constantly hitting the database you don't need to worry about caching.

Quite honestly, with a fairly low volume (like 8-17 items/second) there's not really a "bad choice". I doubt you could add measurable latency to your process when the whole thing takes 40 minutes; what's another 1-2 seconds in the pipeline? Really, the whole process should take about 10 ms to look up the database, transform the data into a description, and pass the work item and description to the next process.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Have you tried the Import/Export Data feature in SSMS, or SQL Server Integration Services? I'm not sure how much manipulation is required to move the data from MySQL to SQL Server, but either tool should be able to do it.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
How would a fake DbContext know that your relationship should fail? It's just going to let your code call into it, return pre-recorded behavior (Setup/Returns), and allow you to Verify calls. If you want to test what happens when your code creates a bad relation, you can have Moq throw an exception.

What you want is an integration test where you aren't mocking your DbContext but using a MyContext against a real database.

The Art of Unit Testing is a great book that goes over unit testing and touches on integration testing.
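The "have Moq throw" part might look roughly like this (a sketch; IOrderRepository, Order, and OrderService are hypothetical names, and I'm using a plain InvalidOperationException as a stand-in for whatever EF would actually throw):

```csharp
// Hedged Moq sketch: force the fake dependency to throw so the error
// path can be unit tested without a real database.
using System;
using Moq;

var repo = new Mock<IOrderRepository>();
repo.Setup(r => r.Save(It.IsAny<Order>()))
    .Throws(new InvalidOperationException("FK constraint violated"));

var service = new OrderService(repo.Object);
// xUnit-style assertion that the service surfaces (or handles) the failure:
Assert.Throws<InvalidOperationException>(() => service.PlaceOrder(new Order()));
```

That tests how your code reacts to a failure you scripted; only the integration test tells you whether the real database would actually produce that failure.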

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

RICHUNCLEPENNYBAGS posted:

Do you? Frankly I think a mocked out DbContext is pretty useful in a lot of cases and with the integration test you run into consistency problems.

It depends on what you want to test. I've honestly not done much EF. Can you mock a DbContext and still get information like an invalid FK, as Ochowie wanted? I'm too lazy to go look up the information.

To your point about integration tests being hard: they are, and they might not be worth the effort. Depends on the code base

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Try updating your NuGet packages. Microsoft pushed a new version of the MVC4 DLL today that is breaking things.

StackOverflow
Microsoft bulletin

EDIT: Rereading this, it probably won't help. However, it might be good information for other developers

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

ljw1004 posted:

We launched "VS2015 Preview" today: download

Can I run this side-by-side with VS2013/2012, or should I still stick it on a VM?

EDIT: I didn't read the KB article

quote:

Although this release is intended to be installed side-by-side with earlier versions of Visual Studio, complete compatibility is not guaranteed.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
With all of this unit testing talk if you want a book on it The Art of Unit Testing is excellent and I highly recommend it.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

mastersord posted:

I noticed you create a new connection and other ADO objects every time you connect. I set up a connection pool (we have multiple databases) and reuse oledbconnection objects. I am not sure which is better, as I am in no way an expert in how ADO works.

Don't do this. ADO.NET already does connection pooling. The only reason to do your own object pooling is if you are having trouble with GC and creating/destroying new objects is too expensive (it generally isn't).

A "General network error. Check your network documentation" is something of a legend. Every person who has solved it has a different story. I think Richard Campbell from .NET Rocks/RunAs Radio once said it was a bad keystone jack. You can check out Polly for handling and retrying transient errors like this

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

GrumpyDoctor posted:

If I need to kick off a long-running, CPU-bound task and not care about the results, is just calling Task.Run(whatever) the right way to do it?

Yes, but don't have the implementation call Task.Run; have the caller do it (here's why). Just remember you won't get any exception messages unless the code in the Task handles them. It's truly fire-and-forget if the task isn't awaited.

If you are in ASP.NET, you are sucking up a thread from the thread pool that is used to handle requests, and IIS might shut down your app while it's running. You can use QueueBackgroundWorkItem to get around that, or ship the work off the box.

gariig fucked around with this message at 22:05 on Jan 23, 2015

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Newf posted:

Can anyone here tell me about discount options for hosting an extremely low traffic MVC site with a very small database? I'm looking to make a quick and dirty attendance/merit tracking webapp for someone managing the homework room of a Boys and Girls Club, but the local network isn't accessible for me to host the thing due to national Boys and Girls club bureaucracy which I'd rather sidestep.

If you are in school you can use DreamSpark (or GitHub's student pack, which includes DreamSpark) or BizSpark. I'm pretty sure BizSpark has very few requirements to join; I think you just need to sign up.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Easy Mac posted:

So this system worked because I didn't have to resize images all the time for specific parts of the website. However, it is kind of slow sometimes, especially for the first time it pulls the images from the database. So what I'd like to know is, how would you handle images?

If storage isn't a problem, I would put them into a queue and have some other process resize them and store them in the database. That way, when the user goes to access the product, the image is already resized. The queue is used so that you can have multiple workers resizing images in case volume increases. It's slightly against YAGNI, but I think the extra effort up front is worth it even if you keep it at one worker.
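In-process, the queue-plus-workers shape can be sketched with a BlockingCollection (a sketch; ResizeAndStore and the paths are hypothetical — a real message queue buys you durability this doesn't have):

```csharp
// Producer/consumer sketch: uploads enqueue paths, a worker resizes them.
using System.Collections.Concurrent;
using System.Threading.Tasks;

var queue = new BlockingCollection<string>();

// Worker -- add more Task.Run calls if volume grows.
// GetConsumingEnumerable blocks until items arrive (or CompleteAdding is called).
Task.Run(() =>
{
    foreach (string imagePath in queue.GetConsumingEnumerable())
        ResizeAndStore(imagePath);   // hypothetical resize + DB write
});

// Producer: called when a product image is uploaded.
queue.Add(uploadedImagePath);
```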

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Space Whale posted:

Also, in general, how much of a PITA is it to finally whip a solution into building properly and using NuGet correctly?

Not that bad. If you have ReSharper, right-click on the reference and pick Project Hierarchy; it will show you all of the projects referencing the assembly. You can then right-click on the solution, manage NuGet packages for the solution, and add the package to your projects. I recheck the references to make sure they are correct because I've had NuGet not do the right thing. If your build system doesn't restore NuGet packages automatically you'll have to check nuget.org for instructions. For internal assemblies you can run your own NuGet server if needed.

I'm literally in the process of doing this at work for some of our external dependencies. Someone added a new project with log4net through NuGet, but other projects had references to the assembly in a lib folder. Everything compiled, but programs crashed at runtime. Very annoying.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

GrumpyDoctor posted:

Ok, I guess I should provide the entire problem description so I'm not X/Ying this. It's kind of gnarly to explain, though. I was trying to originally only ask about one piece of it, but I guess I should put it all on the table.

I've got a collection of scene objects. Three things need to happen:
1) Each scene object is preprocessed into a format that is written to disk. The computation cost is not trivial, but not insane; I want to parallelize this.
2) The disk versions of the scene objects are collected into a unified scene data structure S. This happens via separate program, which is why the inputs need to be represented on-disk.
3) For each scene object X in some subset of the original set of scene objects, run another (relatively) expensive computation that takes as input X and S. This also happens via a separate program.

My original question was about the movement between steps 1 and 2. What I was trying to do was set up a way to do this whereby neither the original object set nor the subset in step 3 needed to be known beforehand; clients could add scene objects willy-nilly until step 2 kicked off, at which point additional attempts to add or update scene objects would either block or fail. (In practice, I wouldn't ever expect to get there - the idea would be that a request to begin step 2 would block until any remaining scene object additions were complete.) In describing it, I'm realizing that I could just drop this particular goal (and have clients pass in the object set as a single batch) to simplify everything, so I guess that's what I'll go with. I think the original reason I was reluctant to do that was that parallelization of step 1 would necessarily be hidden from client code, but I guess that's not the end of the world.

I don't understand where the ConcurrentDictionary comes into play. I would do some sort of producer-consumer model (basically, use queues), where one part of the pipeline sticks in data and the next part does work on it and pushes it to the part after. How to implement this depends on how much parallelization you need: do you need the cores on one machine, or 50 machines? You could look into MSMQ or another message queue system for coordination across process boundaries. You can use the disk as transactional storage, but it's definitely a tough problem; you have to make sure files are written completely or you'll get locking issues. I would look into something else to coordinate between steps 2 and 3 (could be queues again, handling the transaction as: write to file, then insert work into the queue)

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

GrumpyDoctor posted:

I've got a desktop application that runs through a loop in which each loop iteration creates a directory, uses it for some scratch work, and then deletes it. On some loop iterations the call to Directory.CreateDirectory fails with an UnauthorizedAccessException, but this makes no sense to me, because the directory name doesn't change, and nothing else uses it. (I know that nothing uses it because it's a temp directory.) What could be going on here?

Can you post some code? It could be a logic error. If it's a transient error, you could use something like Polly to retry.

EDIT: Anyone here use OzCode? I'm wondering if it's worth the cash

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

bobua posted:

If you have the property set to nullable, yes. Properties like DateTime can't be set to nullable though.

Because while the program originally gets its data from a SQL database, it saves (and reimports) that data to XML files. Each strongly typed DataTable is a different step in a long process. Sometimes the tables need to be mixed and matched... table 1 from the live database, table 2 from an XML file, the resulting table pushed back up to the database... etc. It's always different.

Can you stop using DataTable? It's a very leaky abstraction, and I think you are having trouble because you are using three different technologies: DataTable, EF, and LINQ-to-objects. My suggestion is to make a POCO (Plain Old CLR Object) model, have each step project the data into your POCO model, do your model mutations, then project that into the database. At the very least, dumping DataTable will probably help. Also, posting more code could be helpful, but I think this is more of an architecture problem than a coding-error problem.

You could also move to something like SSIS (SQL Server Integration Services) or some other ETL (Extract, Transform, Load) system.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

crashdome posted:

App.Config question:

I have a WPF Application with a single project for Views/VMs/Services/Etc. I also have two additional projects referenced by the main project. Each sub-project is basically an EF project, each with its own connection string and data models.

When I build the solution, the connection strings are not merged up in to my main project App.Config. What am I doing wrong?

Optionally, is there a way I can have a single App.Config all projects can reference without EF wizard complaining a connection doesn't exist?

Building doesn't merge the app.configs. However, you can have one app.config reference an external file (example).
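The configSource approach looks roughly like this (file names are made up; the shared file has to be copied to the output directory of each project that uses it):

```xml
<!-- App.config in the main project: pull connectionStrings from a shared file -->
<configuration>
  <connectionStrings configSource="connections.config" />
</configuration>

<!-- connections.config (set "Copy to Output Directory" so it ships): -->
<connectionStrings>
  <add name="ModelA"
       connectionString="..."
       providerName="System.Data.SqlClient" />
</connectionStrings>
```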

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
I'd probably post it to Stack Overflow and hope Jon Skeet answers it. You can also check out Noda Time (from Jon Skeet)

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

enthe0s posted:

I'm using DateTime to get the day of the month as a number, but I haven't been able to find a good way to format the day with suffixes. For instance, if the date is 5, I want to format the string to output "5th", 1 becomes "1st", etc.

Is this something custom I'll have to write?

Use Humanizer (pulled from The Hanselman)
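Humanizer's Ordinalize extension does exactly this (NuGet package: Humanizer):

```csharp
// Ordinalize turns a number into its ordinal string form.
using System;
using Humanizer;

Console.WriteLine(1.Ordinalize());  // "1st"
Console.WriteLine(5.Ordinalize());  // "5th"
Console.WriteLine(12.Ordinalize()); // "12th"
```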

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Boz0r posted:

Yeah. I know C# too from programming in Unity3D. I just got my master's in computer science, so I'm into all that OOP and poo poo, so I just need a quick and dirty version, I think.

If you know programming but want to learn what makes C# stand out, I suggest C# in Depth. It goes over all of the changes in .NET 2/3/4/4.5 (generics, LINQ, async/await), which are what make .NET stand out from Java/C++. It also goes into a lot of things that can seem odd unless someone told you, like LINQ being lazily evaluated. As far as learning WinForms, WCF, WebForms, MVC, Web API, etc. you are screwed, but the employer knows that.

Also, the Microsoft Virtual Academy has some hit or miss classes. I wouldn't dismiss all of them if you didn't like the one you saw.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Calidus posted:

I have an MVC webpage which is used to add items to an order. Think of it like a stupid version of those design-your-own-car webpages every manufacturer has now. It has 3 sections: the first section is an html form with some inputs used to select and add a single item to the order. The second section is a preview image rendered by the server based on the user's current order. The third section is a second html form which is used to remove a previously added part from the order. This is clunky as hell with two different html forms being posted.

I am thinking I have a few ways I think I could make this better:

  • I could change the page to use MVCs Ajax partial postbacks and just refresh the sections that need refreshed
  • I could also try and use a single form with multiple submit buttons
  • I could also use Json results, JS and Ajax to update each individual part of my model and my complex session variable.

No one really answered you, but I would go with a SPA (Single Page App) approach. I think that's the kind of UI users expect today. Postbacks work best as a workflow (view -> edit -> view update), but this sounds like a very iterative page. If you are able, I would go with a more traditional SPA approach of using DOM manipulation on the front end and sending JSON to the server. The Ajax partial stuff doesn't seem to be in vogue at Microsoft, so it could be dropped in a version or two (pure speculation).

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
I found this http://forums.iis.net/post/1992357.aspx, which sounds like what you wanted

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Scaramouche posted:

Then I parse them, and move them to here:
d:\invoices\processed

The way I'm avoiding double parsing is like this currently:
code:
Dim files
files = Directory.EnumerateFiles("d:\invoices", "*.csv")
For Each filename As String In files
  If Not File.Exists("d:\invoices\processed\" & Path.GetFileName(filename).ToString) Then
    'do stuff
    'move file to processed
  Else
   'File is duplicate delete
   File.Delete(filename)
  End If
Next
This works, but I'm wondering, will the .Exists method eventually introduce some overhead lag if the contents of \Processed ends up being thousands of files? There would only ever be 5-6 files in \Invoices at a time. Is this premature optimization, or is there a better way?

I wouldn't worry about File.Exists taking too long. It's just going to ask the file system whether the file exists. If you want to test it, make a bunch of fake "invoices" (like 100,000) and see what happens.

I think you should save this in a database so there is some history that the work was done.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

Brady posted:

Yeah I tried the multi-threading thing and didn't really know what I was doing, so I abandoned that route. I want to make a button that, when clicked, will pause the operation after the current block has been executed, and not pause it in the middle of the loop, so I don't think the multi-threaded thing would suit me anyway. Basically I'm just relocating files from one server to another and renaming them along the way, and I want the user to be able to pause at their leisure, but only after the current file rename is completed, to avoid any issues. I'm checking out Await but I'm lost when it comes to attaching it to a button click, especially since the UI is locked during execution... in which case, is making the application double-threaded the only way to go then? I know this is kind of what async is for, but I can't find any example online that actually uses Await to pause anything completely until the button is clicked again; all I can find are examples of a wait timer.

All code in .NET is executed on a thread. In a GUI, by default everything runs on the UI thread. The UI thread is responsible for listening to UI events (button click, window drag/drop) and coordinating repaints of the UI. Since it's handling button clicks, if you don't move long-running work to another thread, that work will consume all of the UI thread's CPU time and the UI thread can't listen to UI events or schedule repaints. If you want to do long-running work and have a responsive UI, you need to do the work on another thread. The best options right now are BackgroundWorker, the Task Parallel Library (using Task.Run), or async/await if the library exposes async (Task-returning) methods like HttpClient does. Those options are listed from worst to best.

Letting the user stop the action, without killing the process, is something the programmer has to build in. This is a more advanced scenario that requires a CancellationToken. I found a console application example; the same process applies to the UI. I would have Start and Cancel buttons. When you Start, create a CancellationToken and pass it into your long-running job, which is probably spawned from Task.Run. Now the Cancel button is clickable and will cancel the CancellationToken. You control where the CancellationToken is observed in your long-running process, so you can stop when you finish a unit of work. Combine this with ljw1004's IProgress example and you have an awesome application!
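The Start/Cancel shape I'm describing, as a rough WinForms-flavored sketch (MoveOneFile and filesToMove are hypothetical; the point is that the token is only checked between files, so a cancel never interrupts a rename mid-flight):

```csharp
// Hedged sketch of cooperative cancellation between units of work.
using System.Threading;
using System.Threading.Tasks;

private CancellationTokenSource cts;

private void StartButton_Click(object sender, EventArgs e)
{
    cts = new CancellationTokenSource();
    Task.Run(() =>
    {
        foreach (var file in filesToMove)
        {
            if (cts.Token.IsCancellationRequested)
                break;              // stop only between files, never mid-rename
            MoveOneFile(file);      // hypothetical unit of work
        }
    }, cts.Token);
}

private void CancelButton_Click(object sender, EventArgs e)
{
    cts.Cancel();                   // observed at the top of the next iteration
}
```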

If you want to learn more about concurrent programming, I suggest the Concurrency in C# Cookbook. It skips raw Threads, which you should 99.99% never use in .NET 4+, but covers everything I mentioned above plus Reactive Extensions.

gariig fucked around with this message at 15:41 on Apr 14, 2015

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
A read through the Roslyn Contribution Guide might help some; a lot of this stuff will make your code harder to read. There's also LinqOptimizer, which will compile your LINQ queries down to IL. I have not used LinqOptimizer, but I think someone on SA has

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
Can you post more code? I'm still not sure what is going on. You saying "multicast delegate holding 2 old filewriters and the one current one" has me worried. A Bitmap is not very thread-safe (SO) and it's possible you are getting into a deadlock.

For the code Ithaqua mentioned: having unbraced loops and if statements can be a "bad" thing (see the goto fail SSL exploit). I would at least change it to a LINQ statement so that it's more declarative. Really, I would have the constructor find the ImageCodecInfo and cache it, or wrap it in a Lazy&lt;T&gt;, just to reduce the work done when saving a JPEG; it's one less call (plus null checking) inside saveJpeg. I would also throw an exception if no ImageCodecInfo is found instead of silently giving up, or at least log it.


gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug

bobua posted:

Sorry, I meant more from a best practices\what are the pitfalls point of view.

I don't quite understand garbage collection and database resource handling outside the desktop application environment.

The pitfall is you are munging your business logic and data access into one spot. There's no separation of concerns, so testing this without firing up a database is impossible. For a small application, creating your DbContext in the constructor or injecting it via IoC isn't that bad.
