brap
Aug 23, 2004

Grimey Drawer
Why would parallelizing it make it faster than a delimited stream of XML documents? Bandwidth is bandwidth, right?


Mr Shiny Pants
Nov 12, 2012

fleshweasel posted:

Why would parallelizing it make it faster than a delimited stream of XML documents? Bandwidth is bandwidth, right?

If he is bandwidth constrained it wouldn't matter, but if the program runs the requests sequentially, parallelizing will make a difference.

EssOEss
Oct 23, 2006
128-bit approved
Presumably the cost is in the XmlDocument parsing (large documents?). Of course, that is an assumption, and optimizing based on assumptions is a bad thing to do. Perhaps making this a learning opportunity on using a profiler would be good here - use the Visual Studio performance explorer to get a nice analysis of what parts of the program are costing time, for example. If that verifies the assumption, you now have some real data to base design decisions on.

But fixing that dumb as poo poo communication protocol should be the first priority. Optimizing a system that has a built-in design flaw seems questionable. Assuming that part is under your control (and if not, send hatemail to whoever made it).

darthbob88
Oct 13, 2011

YOSPOS

Mr Shiny Pants posted:

If he is bandwidth constrained it wouldn't matter, but if the program runs the requests sequentially, parallelizing will make a difference.
Yeah, the problem definitely isn't bandwidth, since this totals about 110KB of data both ways.

gariig posted:

Unless your latency can be under 60ms, you're going to have to parallelize this. Has the course gone into this? Using the Task Parallel Library (TPL) with a concurrent collection is probably all you need, unless ordering is important. Just a little async/await will probably get you doing this in a few seconds.

Don't forget Stopwatch for timing!
Unfortunately, ordering does matter, at least in how I send the requests. The responses can be read in any order, since they do still include a requestID tying them to the request I sent. And yeah, async/await is probably the right way to do it, but I haven't worked out the right way to handle this in async. Obviously I need BeginRead, since that is the async method for reading from NetworkStreams, but the callback methods on that page aren't working properly for me. Either I need to wait half a second for the data to actually be there, or I get an array of empty bytes. Basically, async is causing the problems EssOEss described earlier, where I have no good way of knowing that there's any data to read.

EssOEss posted:

Presumably the cost is in the XmlDocument parsing (large documents?). Of course, that is an assumption and optimizing based on assumptions is a bad thing to do. Perhaps making this a learning opportunity on using profiler would be good here - use the Visual Studio performance explorer to get a nice analysis of what parts of the program are costing time, for example. If that verifies the assumption, you now have some real data to base design decisions on.

But fixing that dumb as poo poo communication protocol should be the first priority. Optimizing a system that has a built-in design flaw seems questionable. Assuming that part is under your control (and if not, send hatemail to whoever made it).
No, I did some simple profiling of my own, and it's taking half a second just on "stream.Read(data, 0, data.Length);" even if I don't parse the response. It's just latency. I've already sent a request for clarification on whether the problem is in my code or in the fact that I'm 1200 miles from their server.

brap
Aug 23, 2004

Grimey Drawer
110 KB total of data is negligible to parse on anything but an old mobile device. Definitely look into what's going on with that stream.Read call. There must be a way to read from the stream without introducing something like a round-trip time delay each time you try to read.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
IIRC if you await ReadAsync on a stream, it will resume when data is available.
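
For example, a sketch along those lines (the helper name and buffer size are made up, and the DataAvailable check is a placeholder for real message framing):

```csharp
using System.IO;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

static async Task<string> ReadResponseAsync(NetworkStream stream)
{
    var buffer = new byte[8192];
    using (var ms = new MemoryStream())
    {
        // Awaiting ReadAsync means the task completes only once bytes have
        // actually arrived - no polling, no fixed half-second waits.
        int read = await stream.ReadAsync(buffer, 0, buffer.Length);
        while (read > 0)
        {
            ms.Write(buffer, 0, read);
            if (!stream.DataAvailable)
                break; // crude; a real protocol should scan for its delimiter
            read = await stream.ReadAsync(buffer, 0, buffer.Length);
        }
        return Encoding.UTF8.GetString(ms.ToArray());
    }
}
```

DataAvailable is racy as an end-of-message signal, so for a delimited stream of XML documents you would really want to accumulate bytes and scan for the delimiter instead.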

raminasi
Jan 25, 2005

a last drink with no ice
I'm having a problem with Progress<T>. I've got something like this:
C# code:
var report = new Progress<int>(ticks => UpdateUIOrWhatever());

Parallel.ForEach(things, x =>
{
    DoLongRunningThing(x);
    report.Report(1);
});
What happens is that the first 20 or so reports pile up somewhere and then eventually go off all at once. After that, there's one after each long running thing completes (what I expected). Using debug stepping I can see that the calls to report.Report just vanish into the ether. Where are they going, and how can I get them to show up when I want them to?

ninjeff
Jan 19, 2004

GrumpyDoctor posted:

I'm having a problem with Progress<T>. I've got something like this:
C# code:
var report = new Progress<int>(ticks => UpdateUIOrWhatever());

Parallel.ForEach(things, x =>
{
    DoLongRunningThing(x);
    report.Report(1);
});
What happens is that the first 20 or so reports pile up somewhere and then eventually go off all at once. After that, there's one after each long running thing completes (what I expected). Using debug stepping I can see that the calls to report.Report just vanish into the ether. Where are they going, and how can I get them to show up when I want them to?

Judging from http://referencesource.microsoft.com/#mscorlib/system/progress.cs, Progress<T> saves the SynchronizationContext it's created in and then posts every report back to that. You're seeing the Report calls "vanish" because they're going to another SynchronizationContext for it to handle whenever (not synchronously). If that SynchronizationContext is the WPF dispatcher or something similar, then it might still be dealing with your button click or whatever triggered this code before it gets around to handling any Reports.

If you want an IProgress<T> that acts exactly how you want it to, you could just implement the interface yourself - you'll then be responsible for handling concurrency. Keep in mind that if you're updating a single-threaded UI, you're going to have to either learn to deal with updates "vanishing" into a queue or synchronously update it (with e.g. Dispatcher.Invoke), and the latter is going to kill the performance benefit of Parallel.ForEach.
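
A minimal do-it-yourself implementation might look like this (sketch; the class name is made up):

```csharp
using System;

// Unlike Progress<T>, this does NOT capture a SynchronizationContext:
// Report runs the handler immediately on whichever thread calls it,
// so the handler must be thread-safe (or marshal to the UI itself,
// e.g. via Dispatcher.Invoke).
public sealed class SynchronousProgress<T> : IProgress<T>
{
    private readonly Action<T> _handler;

    public SynchronousProgress(Action<T> handler)
    {
        if (handler == null) throw new ArgumentNullException("handler");
        _handler = handler;
    }

    public void Report(T value)
    {
        _handler(value);
    }
}
```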

raminasi
Jan 25, 2005

a last drink with no ice

ninjeff posted:

Judging from http://referencesource.microsoft.com/#mscorlib/system/progress.cs, Progress<T> saves the SynchronizationContext it's created in and then posts every report back to that. You're seeing the Report calls "vanish" because they're going to another SynchronizationContext for it to handle whenever (not synchronously). If that SynchronizationContext is the WPF dispatcher or something similar, then it might still be dealing with your button click or whatever triggered this code before it gets around to handling any Reports.

If you want an IProgress<T> that acts exactly how you want it to, you could just implement the interface yourself - you'll then be responsible for handling concurrency. Keep in mind that if you're updating a single-threaded UI, you're going to have to either learn to deal with updates "vanishing" into a queue or synchronously update it (with e.g. Dispatcher.Invoke), and the latter is going to kill the performance benefit of Parallel.ForEach.

See, that makes total sense in theory, but I can't figure out what's actually causing the holdup on the original SynchronizationContext. To explain more, there's an additional wrinkle: The thread calling Parallel.ForEach isn't the UI thread, but a worker thread spawned by the UI thread with Task.Run. Here's what it looks like:
code:
                               long-running task
                              /
UI thread - background thread  - long-running task
                              \
                               long-running task
My understanding of the way this works is that each long-running task will post its updates back to the default SynchronizationContext. The background thread is then responsible for updating the UI. (It does this using Dispatcher.Invoke.) I can see where the queue backlog is, but I don't understand why it's happening, because the delay is several seconds long, and the UI is totally responsive the whole time. Is the default SynchronizationContext not what I want here? Which one should I use instead?

ninjeff
Jan 19, 2004

GrumpyDoctor posted:

See, that makes total sense in theory, but I can't figure out what's actually causing the holdup on the original SynchronizationContext. To explain more, there's an additional wrinkle: The thread calling Parallel.ForEach isn't the UI thread, but a worker thread spawned by the UI thread with Task.Run. Here's what it looks like:
code:
                               long-running task
                              /
UI thread - background thread  - long-running task
                              \
                               long-running task
My understanding of the way this works is that each long-running task will post its updates back to the default SynchronizationContext. The background thread is then responsible for updating the UI. (It does this using Dispatcher.Invoke.) I can see where the queue backlog is, but I don't understand why it's happening, because the delay is several seconds long, and the UI is totally responsive the whole time. Is the default SynchronizationContext not what I want here? Which one should I use instead?

OK, that's a bit more strange. My only thought is that perhaps the thread pool's getting saturated - IIRC it can only spin up one new thread per second by default, so if all of the current threads are busy doing your Parallel.ForEach work then the Report calls might have to wait a little while to get any time. In this case it might actually be better to create the Progress<T> on the Dispatcher, as that's not going to be tied down by thread pool work. Alternatively, you could set the MaxDegreeOfParallelism on your ForEach to one less than the number of cores to effectively reserve a thread for other work, but that'll have a pretty significant effect on your throughput.
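
The MaxDegreeOfParallelism variant would look something like this (a sketch, reusing `things`, `DoLongRunningThing`, and `report` from the earlier post):

```csharp
using System;
using System.Threading.Tasks;

var options = new ParallelOptions
{
    // Leave roughly one core's worth of thread-pool capacity free so
    // other queued work (like the posted Report callbacks) isn't starved
    // by the loop body.
    MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount - 1)
};

Parallel.ForEach(things, options, x =>
{
    DoLongRunningThing(x);
    report.Report(1);
});
```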

raminasi
Jan 25, 2005

a last drink with no ice

ninjeff posted:

OK, that's a bit more strange. My only thought is that perhaps the thread pool's getting saturated - IIRC it can only spin up one new thread per second by default, so if all of the current threads are busy doing your Parallel.ForEach work then the Report calls might have to wait a little while to get any time. In this case it might actually be better to create the Progress<T> on the Dispatcher, as that's not going to be tied down by thread pool work. Alternatively, you could set the MaxDegreeOfParallelism on your ForEach to one less than the number of cores to effectively reserve a thread for other work, but that'll have a pretty significant effect on your throughput.

Moving the Progress<T> instantiation into the UI thread did the trick, thanks!

Essential
Aug 14, 2003
I'm implementing a REST service to allow an outside company CRUD access to some of our data. There's probably a lot of things I'm missing in setting all this up, but what I'm currently scratching my head on is what to do when a complex object type is passed in as an update.

For instance, if it is a person object, how many of the fields should be optional? Should they only include things they want changed? Let's say they want to change the person's last name, when they pass in the DTO, should I expect only the personID & LastName properties to be populated and everything else will be null? Then I ignore everything except the LastName? I'd like to avoid having a bunch of "if (dto.property != null)" for checking what they've populated, but right now that's the only thing I can see to do.

I'm trying to keep this pretty simple, so I have my Web API 2 project with very simple controllers (GET, POST, PUT) that go off to a data access layer. I tried to ignore complex data types at first, but realized I had to have them for the update. I don't have anything except the default Web API 2 stuff.

Also, am I correct in assuming they will be sending the complex data types in the request body, and I extract them from there? Is there any way (AutoMapper/binding?) to extract from the body right into an object/entity on my end? I think what I want is model binding?
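
(For what it's worth, Web API 2's default binding will already deserialize a JSON request body into a complex parameter; a sketch, with the DTO and controller names made up:)

```csharp
using System.Web.Http;

public class PersonDto
{
    public int PersonId { get; set; }
    public string LastName { get; set; }
}

public class PeopleController : ApiController
{
    // POST api/people
    // Complex types are read from the request body by default;
    // [FromBody] just makes that explicit.
    public IHttpActionResult Post([FromBody] PersonDto person)
    {
        if (person == null)
            return BadRequest("Body could not be deserialized.");
        // ... hand off to the data access layer ...
        return Ok(person);
    }
}
```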

Essential fucked around with this message at 19:46 on Jan 28, 2016

Mr Shiny Pants
Nov 12, 2012

Essential posted:

I'm implementing a REST service to allow an outside company CRUD access to some of our data. There's probably a lot of things I'm missing in setting all this up, but what I'm currently scratching my head on is what to do when a complex object type is passed in as an update.

For instance, if it is a person object, how many of the fields should be optional? Should they only include things they want changed? Let's say they want to change the person's last name, when they pass in the DTO, should I expect only the personID & LastName properties to be populated and everything else will be null? Then I ignore everything except the LastName? I'd like to avoid having a bunch of "if (dto.property != null)" for checking what they've populated, but right now that's the only thing I can see to do.

I'm trying to keep this pretty simple, so I have my Web API 2 project with very simple controllers (GET, POST, PUT) that go off to a data access layer. I tried to ignore complex data types at first, but realized I had to have them for the update. I don't have anything except the default Web API 2 stuff.

Also, am I correct in assuming they will be sending the complex data types in the request body, and I extract them from there? Is there any way (AutoMapper/binding?) to extract from the body right into an object/entity on my end? I think what I want is model binding?

Nope, this is pretty hard.

You can check the properties (the != null variant), or you could create multiple update endpoints so you know what will be updated, like /url/model/lastname. Or have multiple update commands that can come in, like /url/models, where instead of having the model posted you expect a command to be posted that you deserialize to a type.

Like: ChangeAddressCommand, UpdateNameCommand, and do your appropriate actions based on the incoming command. <-- I like this one. It ties into the whole CQRS stuff.

Otherwise you can do what they do in the XML world and have a property that specifies if something is explicitly specified. So you have a model and some extra properties like: "LastNameSpecified" and if this is set to "true" you know which properties to expect and which to update.
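
The command-per-change option could be sketched like this (all type names hypothetical):

```csharp
// Each command names exactly what changes, so there's no
// null-vs-omitted guessing on the server side.
public abstract class PersonCommand
{
    public int PersonId { get; set; }
}

public class UpdateNameCommand : PersonCommand
{
    public string LastName { get; set; }
}

public class ChangeAddressCommand : PersonCommand
{
    public string Street { get; set; }
    public string City { get; set; }
}

// The endpoint deserializes to the concrete command type (e.g. via a
// discriminator field in the JSON) and dispatches on it:
public void Handle(PersonCommand command)
{
    var rename = command as UpdateNameCommand;
    if (rename != null) { /* update LastName only */ return; }

    var move = command as ChangeAddressCommand;
    if (move != null) { /* update address fields only */ return; }
}
```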

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Mr Shiny Pants posted:

Nope, this is pretty hard.

You can check the properties (the != null variant), or you could create multiple update endpoints so you know what will be updated, like /url/model/lastname. Or have multiple update commands that can come in, like /url/models, where instead of having the model posted you expect a command to be posted that you deserialize to a type.

Like: ChangeAddressCommand, UpdateNameCommand, and do your appropriate actions based on the incoming command. <-- I like this one. It ties into the whole CQRS stuff.

Otherwise you can do what they do in the XML world and have a property that specifies if something is explicitly specified. So you have a model and some extra properties like: "LastNameSpecified" and if this is set to "true" you know which properties to expect and which to update.

Ugh just make it simple and have your resource have a replace endpoint via PUT or POST so the client has to return the entire changed resource and then maybe run validation

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Malcolm XML posted:

Ugh just make it simple and have your resource have a replace endpoint via PUT or POST so the client has to return the entire changed resource and then maybe run validation

This really is the simplest way. Trying to figure out whether null means null or if it means "don't update this value" is too much headache to deal with.
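
In Web API 2, the full-replace approach is just a PUT action that requires the whole DTO (a sketch; the controller, DTO, and repository are all hypothetical names):

```csharp
using System.Web.Http;

public class PeopleController : ApiController
{
    // PUT api/people/5 -- the client sends the ENTIRE person,
    // so a null field genuinely means null.
    public IHttpActionResult Put(int id, PersonDto person)
    {
        if (person == null || !ModelState.IsValid)
            return BadRequest(ModelState);

        // Overwrite every attribute of the stored resource.
        // _repository stands in for whatever data access layer you have.
        _repository.Replace(id, person);
        return Ok(person);
    }
}
```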

Essential
Aug 14, 2003

Mr Shiny Pants posted:

Nope, this is pretty hard.

Malcolm XML posted:

Ugh just make it simple and have your resource have a replace endpoint via PUT or POST so the client has to return the entire changed resource and then maybe run validation

Bognar posted:

This really is the simplest way. Trying to figure out whether null means null or if it means "don't update this value" is too much headache to deal with.

Got it, thanks guys, that makes sense.

EDIT: Changing postBody to: string postBody = "{\"LastName\":\"Clark\"}"; works; I guess I didn't have properly formatted JSON before.

Do I just provide them a sample of how the request body should look, or provide them a copy of the properties? How will they know how to format the correct object as JSON? In this case they are using Java, but of course a REST service shouldn't care what anyone who has access is using. I understand this is getting into documentation, but is a simple example of what the JSON object will look like sufficient?

Essential fucked around with this message at 21:44 on Jan 28, 2016

Sedro
Dec 31, 2008

Malcolm XML posted:

Ugh just make it simple and have your resource have a replace endpoint via PUT or POST so the client has to return the entire changed resource and then maybe run validation

Bognar posted:

This really is the simplest way. Trying to figure out whether null means null or if it means "don't update this value" is too much headache to deal with.
APIs use PUT and PATCH for this. PUT updates a whole object, replacing its attributes. PATCH interprets missing values as the current value.

Don't bother implementing PATCH if it's just premature optimization.

zerofunk
Apr 24, 2004

Essential posted:

Do I just provide them a sample of how the request body should look or provide them a copy of the properties? How are they going to properly format into json the correct object? In this case they are using java, but of course a REST service shouldn't care what anyone who has access is using. I understand this is getting into documentation, but is a simple example of what the json object will look like sufficient?

The WebAPI Help Page nuget package makes it pretty easy to generate documentation. Whether or not the auto generated stuff is enough, probably just depends on the user and what they're expecting. You can of course customize it if necessary. I found it to be a good start when doing something similar for a client. Of course, said client never actually ended up using the API from what I understand. Go figure.

Essential
Aug 14, 2003

Sedro posted:

APIs use PUT and PATCH for this. PUT updates a whole object, replacing its attributes. PATCH interprets missing values as the current value.

Don't bother implementing PATCH if it's just premature optimization.

Thanks, yeah just having a PUT requiring the full object does make sense.

zerofunk posted:

The WebAPI Help Page nuget package makes it pretty easy to generate documentation. Whether or not the auto generated stuff is enough, probably just depends on the user and what they're expecting. You can of course customize it if necessary. I found it to be a good start when doing something similar for a client. Of course, said client never actually ended up using the API from what I understand. Go figure.

Awesome thanks, I'm looking into that right now.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Sedro posted:

APIs use PUT and PATCH for this. PUT updates a whole object, replacing its attributes. PATCH interprets missing values as the current value.

Don't bother implementing PATCH if it's just premature optimization.

doing PATCH correctly is a giant PITA, since you really have to have another meta-language that describes the operations you want to send (a diff), and it has issues if client and server aren't consistent on the same version of the resource you're PATCH-ing


in short: don't do PATCH if you can avoid it, and make resources and subresources modular and self-contained

Essential
Aug 14, 2003

Calidus posted:

Anyone ever had to use the QuickBooks .NET API/SDK? General thoughts? Easy to use?

I'm in the same boat now and wondering if anyone has anything they can share. We're doing this specifically for QB Online, and it's been easy enough to connect to a test dev company, but figuring out what exactly to query isn't very easy. For example, I'd like to get all expenses for a given date range. I think those are Payment Transactions? Also wondering if I can get historical data on things like: Chart of Accounts, Profit/Loss statement.

On a more .NET note, the WebAPI Help Page that zerofunk pointed me to worked great. It's not perfect, but it is a huge help in getting something up and running. There are quite a few examples of people extending it and making it much more informative. One thing that is really nice: if you have a complex type passed in the body of the request, the help document will show the body parameters and multiple sample examples (JSON/XML). That was the thing I most wanted, and I got it without any trickery.

epswing
Nov 4, 2003

Soiled Meat

Calidus posted:

Anyone ever had to use the QuickBooks .NET API/SDK? General thoughts? Easy to use?

I'm interested in this just to sync clients between QuickBooks and my application.

Begby
Apr 7, 2005

Light saber? Check. Black boots? Check. Codpiece? Check. He's more machine than kid now.
We need to roll a new REST server at work. It doesn't need to be some sort of rapid development framework, we want something that is easy to maintain and well written first and foremost.

ASP.NET Web API v2 looks pretty cool, and so does ASP.NET v5, which is currently an RC. Is the former doable on a Linux platform in production? The new ASP.NET looks nice since it's supposed to be written for Mac/Linux, but we have a hard 6-month release date to get the initial feature list complete. Anyone have experience with either?

brap
Aug 23, 2004

Grimey Drawer
You need to use ASP.NET 5 to deploy to Linux. And I think it's getting renamed to ASP.NET Core.
Edit: well, mono can probably wrangle something up for the earlier versions, but the release that's actually intended by Microsoft to target Linux is ASP.NET 5.

kitten emergency
Jan 13, 2008

get meow this wack-ass crystal prison
The new .net core stuff is nowhere near prime time on Linux and Mono isn't really that hot. If you're committed to .NET I'd use MVC 5 WebAPI or maybe something in Java if you'd rather not do .NET

Comedy response; Use Python + Flask

epswing
Nov 4, 2003

Soiled Meat

Begby posted:

REST server.. easy to maintain.. in production..

That new Microsoft release candidate that supposedly runs on Linux is probably not the thing you want.

(I'm not saying it won't work, but based on your requirements I would probably go the tried/tested route, using the least number of unknowns.)

Edit: unsolicited rant:

At some point in the last couple years, I've stopped cleverly "hacking" on things.

I posted:

You know, if we just bend over backwards here, and use this stack in an unusual way there, we can prematurely optimize this 10-day old framework to do something it was never designed to do, in half the time complexity, but double the space requirement, and it will totally mostly work and be awesome.

Who is going to maintain that mess when you quit or die? Sometimes "boring" is the right thing to do. Especially at your day job.

epswing fucked around with this message at 07:09 on Feb 2, 2016

Mr Shiny Pants
Nov 12, 2012

Begby posted:

We need to roll a new REST server at work. It doesn't need to be some sort of rapid development framework, we want something that is easy to maintain and well written first and foremost.

ASP.NET Web API v2 looks pretty cool, and so does ASP.NET v5, which is currently an RC. Is the former doable on a Linux platform in production? The new ASP.NET looks nice since it's supposed to be written for Mac/Linux, but we have a hard 6-month release date to get the initial feature list complete. Anyone have experience with either?

You could take a look at NancyFX: http://nancyfx.org/

It runs on Mono and it is really nice. Sure, it is not the Microsoft way, but it has been around for awhile and has very active development.

EssOEss
Oct 23, 2006
128-bit approved
We use ASP.NET Web API on both Windows and Linux (with Mono) and I can definitely recommend it - Web API makes web services a breeze! Mono is also quite mature these days - I cannot recall us running into a Windows vs. Linux bug.

Stay far away from ASP.NET Core (formerly ASP.NET 5), though - it is indeed far from usable or reliable.

Essential
Aug 14, 2003
Is there a difference between using the VS built in code signing (Project Properties->Signing->Sign the assembly) vs. using the SignTool utility, to sign an assembly?

Begby
Apr 7, 2005

Light saber? Check. Black boots? Check. Codpiece? Check. He's more machine than kid now.

uncurable mlady posted:

Comedy response; Use Python + Flask

Ha! Someone recommended this to me. I have been recommended all kinds of poo poo.

This is what is tough about this. If you ask someone for the best solution, 90% of the time they will tell you whatever they currently use is the best solution. It's hard to filter out those sorts of replies. I have been told to do it in PHP, node.js, Django, spring.io, etc. etc.

It is looking more and more like C# something or other, and not ASP.NET 5 Core. We'll dick around with it on Mono and take some other platforms for a spin. We can host it on a Windows server if need be, but our current Windows platform, where we have another C# service, gives us more headaches than any of our other servers. That one has poo poo running as Windows services and also under IIS, so at the very least it looks like OWIN/Katana will make it overall easier, and also easier to try out on both Linux and Windows.

Ochowie
Nov 9, 2007

epalm posted:

That new Microsoft release candidate that supposedly runs on Linux is probably not the thing you want.

(I'm not saying it won't work, but based on your requirements I would probably go the tried/tested route, using the least number of unknowns.)

Edit: unsolicited rant:

At some point in the last couple years, I've stopped cleverly "hacking" on things.


Who is going to maintain that mess when you quit or die? Sometimes "boring" is the right thing to do. Especially at your day job.

It seems like now is a particularly bad time to try this out since it's being migrated from running on DNX/DNVM to the new dotnet CLI. It seems like it would make sense to wait a bit for RC2 to come out and be more or less stable before trying this.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Yeah, as much as I love the direction Microsoft is heading with ASP .NET Core, it's not ready for prime-time yet.

Begby
Apr 7, 2005

Light saber? Check. Black boots? Check. Codpiece? Check. He's more machine than kid now.

Ithaqua posted:

Yeah, as much as I love the direction Microsoft is heading with ASP .NET Core, it's not ready for prime-time yet.

Good advice, I am not going to waste time on it then. As long as we don't create poo poo code, it shouldn't be hard to port it over to the new platform in the future should we decide it has some super new stuff we can't live without.

Thanks all for your input.

epswing
Nov 4, 2003

Soiled Meat
Trying to start a service with C#:

C# code:
using (var sc = ServiceController.GetServices().FirstOrDefault(s => s.ServiceName == SERVICE_NAME))
{
    if (sc != null)
    {
        sc.Start();
        sc.WaitForStatus(ServiceControllerStatus.Running);
    }
}
I'm getting

quote:

Cannot start service on computer '.'
with an InnerException message of

quote:

Access is denied.

Google is showing me lots of solutions involving running As Administrator. The application is running As Administrator. So... now what. Any suggestions?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

epalm posted:

Trying to start a service with C#:

C# code:
using (var sc = ServiceController.GetServices().FirstOrDefault(s => s.ServiceName == SERVICE_NAME))
{
    if (sc != null)
    {
        sc.Start();
        sc.WaitForStatus(ServiceControllerStatus.Running);
    }
}
I'm getting
with an InnerException message of


Google is showing me lots of solutions involving running As Administrator. The application is running As Administrator. So... now what. Any suggestions?
Does the service itself have permissions to do what it is doing? Try setting it to run under the local system account first.

Inverness fucked around with this message at 02:15 on Feb 4, 2016

epswing
Nov 4, 2003

Soiled Meat

Inverness posted:

Does the service itself have permissions to do what it is doing? Try setting it to run under the local system account first.

You're right, the exception pertained to the permissions of the service, not the permissions of the app installing the service. We had the account set to ServiceAccount.NetworkService, and switching that to ServiceAccount.LocalSystem worked. I'm now looking into why NetworkService was used, and there may be a good reason (or there may have been one at the time that's no longer true), but it's been running under NetworkService for years, so I'm not sure why it's complaining now.

Thanks for the tip.

Sedro
Dec 31, 2008

epalm posted:

You're right, the exception pertained to the permissions of the service, not the permissions of the app installing the service. We had the account set to ServiceAccount.NetworkService, and switching that to ServiceAccount.LocalSystem worked. I'm now looking into why NetworkService was used, and there may be a good reason (or have been a good reason at the time, which is no longer true), but it's been run under NetworkService for years, not sure why it's complaining now.

Thanks for the tip.
NetworkService is more restrictive/more secure than LocalSystem. LocalSystem is basically root.
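
The account is normally picked in the service's installer class; a sketch (the service name is hypothetical):

```csharp
using System.ServiceProcess;

// Prefer NetworkService and grant it the specific rights it needs
// (file/registry ACLs, etc.) rather than falling back to LocalSystem,
// which is effectively an all-powerful local account.
var processInstaller = new ServiceProcessInstaller
{
    Account = ServiceAccount.NetworkService
};

var serviceInstaller = new ServiceInstaller
{
    ServiceName = "MyService", // hypothetical name
    StartType = ServiceStartMode.Automatic
};
```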

Inverness
Feb 4, 2009

Fully configurable personal assistant.

epalm posted:

You're right, the exception pertained to the permissions of the service, not the permissions of the app installing the service. We had the account set to ServiceAccount.NetworkService, and switching that to ServiceAccount.LocalSystem worked. I'm now looking into why NetworkService was used, and there may be a good reason (or have been a good reason at the time, which is no longer true), but it's been run under NetworkService for years, not sure why it's complaining now.

Thanks for the tip.
Do you have a stack trace? I assume the problem is something like NetworkService not having access to some file or folder on disk.

It's safer to update things so NetworkService is suitable than to just run under LocalSystem.

Squall
Mar 10, 2010

"...whatever."
Has anyone tried to make a class library for ASP.NET Core/ASP.NET 5 and a Universal Windows application? I tried using a portable class library but usually its mere presence is enough to stop anything from building in Visual Studio.


epswing
Nov 4, 2003

Soiled Meat

Sedro posted:

NetworkService is more restrictive/more secure than LocalSystem. LocalSystem is basically root.

Inverness posted:

Do you have a stack trace? I assume the problem is something like NetworkService not having access to some file or folder on disk.

It's safer to update things so NetworkService is suitable than to just run under LocalSystem.

Ah gotcha. So it's in my interest to find out why NetworkService is complaining, rather than cop out by running under a more (too) powerful account.

The stack trace just looks like this:

quote:

at System.ServiceProcess.ServiceController.Start(String[] args)
at System.ServiceProcess.ServiceController.Start()
at Project.Model.<StartService>b__1c() in c:\Path\To\Blah.cs:line 163
at System.Threading.Tasks.Task.InnerInvoke()
at System.Threading.Tasks.Task.Execute()

How can I tell which resource is causing the permission problem?
