Essential
Aug 14, 2003
What do you guys use for installing and updating distributed/commercial applications? Not huge-scale commercial, but installed across thousands of computers around the world and able to update to the latest version. Here's what I've used:

ClickOnce for installing/updating.

InstallShield for installing. When the app needs to update, it launches an update.exe and closes so update.exe can do its thing, then update.exe re-launches the app.

InstallShield for installing, then a plugin model. When the app detects an update it can download & overwrite the file without needing the extra update.exe (and doesn't have to close & re-open).

All of those have their pros and cons, but I'm really interested in what you guys have done.

Essential
Aug 14, 2003

chmods please posted:

We distribute our products as MSI packages (I think we use WiX), but we have the luxury of our update model being sales emailing the clients and telling them to grab the new MSI when they get around to it. I've toyed around with the idea of using MEF or child AppDomains for reloading code without exiting the process, mostly for services, which seemed promising.

Child AppDomains are how the 'plugin' update model I'm using was built. MEF was too difficult for me to implement at the time, and the child AppDomain was relatively simple to use. It's also proven to be quite effective.

I haven't used WiX; I've heard it's really, really powerful and awesome, but has a steep learning curve.

Essential
Aug 14, 2003
I have a data upload process I need to run at a certain point, and I need to check every 30 seconds whether it's time to upload. While I'm uploading data I don't want to check again until the process is complete. Once complete, I need to start checking every 30 seconds again.

EDIT: This DOES appear to be working as intended; I had some bad test code firing it off more than once.
code:
uploadTimer = new System.Threading.Timer(t => CheckDataUpload(), null, 30000, System.Threading.Timeout.Infinite);
What do you guys typically use in this situation?
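For reference, the one-shot-and-re-arm pattern that the timer code above relies on can be sketched like this. The interval and the work callback are parameterized purely for illustration; `UploadScheduler` is a made-up name, not anything from the actual app:

```csharp
using System;
using System.Threading;

class UploadScheduler
{
    private readonly int _intervalMs;
    private readonly Action _checkDataUpload;
    private Timer _timer;

    public UploadScheduler(int intervalMs, Action checkDataUpload)
    {
        _intervalMs = intervalMs;
        _checkDataUpload = checkDataUpload;
    }

    public void Start()
    {
        // A period of Timeout.Infinite means: fire once, then stay off
        // until explicitly re-armed.
        _timer = new Timer(_ => Tick(), null, _intervalMs, Timeout.Infinite);
    }

    private void Tick()
    {
        try
        {
            _checkDataUpload(); // runs to completion; no second tick can overlap
        }
        finally
        {
            // Re-arm only after the work finishes, restarting the countdown.
            _timer.Change(_intervalMs, Timeout.Infinite);
        }
    }
}
```

Because the timer is only re-armed in the `finally`, a slow upload can never overlap the next check, which is exactly the behavior described above.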

Essential fucked around with this message at 18:09 on Aug 9, 2014

Essential
Aug 14, 2003

Ithaqua posted:

This sounds like a case for a separate data-uploading service. You put a message in a queue saying "i need to upload some data". The uploader takes care of business, and when it's done puts a message in a different queue saying "task is done" that your source application will read from.

This may be a perfect solution, I'm looking into it now. Thanks for the suggestion!

Essential
Aug 14, 2003

ljw1004 posted:

I hate callbacks. What you're describing is a "process" or "workflow". I do all of these with async, because that way I can code my workflow algorithm using familiar constructs like "while":

code:
// launch the async worker at application startup:
StartUploadWorkerAsync();


async void StartUploadWorkerAsync()
{
   while (true)
   {
      await Task.Delay(30000);
      if (!timeToUpload) continue;
      await DoUploadAsync(...);
   }
}

Thank you for the suggestion Lucian! Is there a similar async approach I can use on the 4.0 framework? We have to support Win XP machines, so I can't go above the 4.0 framework :(

Essential
Aug 14, 2003

Malcolm XML posted:

Microsoft.bcl.async

Ithaqua posted:

Async BCL on NuGet should work assuming you're developing on VS2012 or later.

Yep, I'm on VS2013. Thanks, you guys!

Essential
Aug 14, 2003
Can MEF be used to load & unload a library from a running application? I want to be able to silently update files while a system tray app runs.

I've done the same thing with a windows service and AppDomain, however I'm wondering if MEF is a better option.

I thought I read before that MEF cannot unload a library once it's loaded (without closing the app down of course), but now I can't find anything on that.
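For what it's worth, the child-AppDomain approach mentioned above boils down to something like this sketch (.NET Framework only; `PluginProxy`, `PluginHost`, and the domain name are all made up for the demo):

```csharp
using System;

// The type crossing the boundary derives from MarshalByRefObject, so
// the main domain talks to a remoting proxy instead of loading the
// plugin assembly itself (which would pin the file until process exit).
public class PluginProxy : MarshalByRefObject
{
    public string Describe()
    {
        return "running in " + AppDomain.CurrentDomain.FriendlyName;
    }
}

public static class PluginHost
{
    public static string RunAndUnload()
    {
        AppDomain child = AppDomain.CreateDomain("PluginDomain");
        try
        {
            var plugin = (PluginProxy)child.CreateInstanceAndUnwrap(
                typeof(PluginProxy).Assembly.FullName,
                typeof(PluginProxy).FullName);
            return plugin.Describe();
        }
        finally
        {
            // Unloading the child domain releases its loaded assemblies,
            // so the plugin DLL on disk can be overwritten with a new version.
            AppDomain.Unload(child);
        }
    }
}
```

In a real plugin host you'd keep the child domain alive between calls and only unload it when an update arrives; this compresses the lifecycle into one method to show the load/call/unload shape.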

Essential fucked around with this message at 17:35 on Aug 19, 2014

Essential
Aug 14, 2003

Gul Banana posted:

The only way to unload an assembly is to kill its app domain - MEF doesn't provide any special facilities for this.

Thanks, that's what I was thinking.

Essential
Aug 14, 2003

Scaramouche posted:

I know why this is happening, but I thought I'd pick youse guys brains on how to possibly get around it. I'm grabbing some images, resizing them, copying them somewhere, and then deleting them. The code I do it with is like so:
code:
Dim myClient As New System.Net.WebClient
dim dsImagesToGet as dataset = 'sql to get some images
For Each dr As DataRow In dsImagesToGet.Tables(0).Rows
  Dim fn As String = RegexFound("\/([A-Z0-9]+\.JPG)", dr("first_image"))
  Dim fna As String = RegexFound("\/([A-Z0-9]+a\.JPG)", dr("second_image"))

  myClient.DownloadFile(dr("first_image"), "d:\temp\images\small\" & fn)
  myClient.DownloadFile(dr("second_image"), "d:\temp\images\alt\" & fna)
  Dim source As New System.Drawing.Bitmap("d:\temp\images\small\" & fn)
  Dim target As Bitmap = ResizeImage(source, 1000, 1000) 'External function that does resizing
  target.Save("d:\images\pub\1000\" & fn, System.Drawing.Imaging.ImageFormat.Jpeg)
  File.Copy("d:\temp\images\alt\" & fna, "d:\images\pub\alt\" & fna)
  File.Delete("d:\temp\ruby\images\small\" & fn)
  File.Delete("d:\temp\ruby\images\alt\" & fna)
Next
Everything works, except for the last step of deleting the file, where I get "access denied". I've run into this before when dealing with CSV/TXT/XML files I've created in the past: they basically get locked until the entire process is over. I can set up a scheduled task to clear the directory at (x) PM every night, but I was wondering, is there a way to do it inside the For...Next loop without having to make another one?

Is it just the first File.Delete that gets the access denied? I wonder if it's the ResizeImage() that's locking that file. I'm almost positive (I don't have the code in front of me at the moment) that I've called WebClient.DownloadFile and then deleted the file after moving it. Can you wrap the WebClients in Using statements to make sure they get disposed?

Or possibly you need to dispose target first?

I'm pretty sure any object that touches the images has to be disposed before you can delete them. If that's the case, then one (or more) of those objects is what's locking the file.
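The dispose-before-delete advice can be sketched like this (in C# rather than the original VB; `ResizeAndPublish` and the use of the scaling `Bitmap` constructor in place of the external `ResizeImage()` are my own stand-ins):

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

static class ImagePipeline
{
    // The Bitmap constructor keeps the source file open until Dispose
    // runs, so both bitmaps are wrapped in using blocks that close
    // before File.Delete is called.
    public static void ResizeAndPublish(string tempPath, string outPath)
    {
        using (var source = new Bitmap(tempPath))
        using (var target = new Bitmap(source, 1000, 1000)) // stand-in for ResizeImage()
        {
            target.Save(outPath, ImageFormat.Jpeg);
        }

        // Both bitmaps are disposed here, so the lock on tempPath is released.
        File.Delete(tempPath);
    }
}
```

The same shape works in VB with `Using ... End Using` blocks around `source` and `target` before the `File.Delete` calls.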

Essential fucked around with this message at 00:42 on Sep 11, 2014

Essential
Aug 14, 2003

ljw1004 posted:

Unrelated: I'm having fun with animated gifs and the new language features...

and another:

and another:

and another:


This is awesome! I have many VB apps that have IsNot Nothing all over the place. This would be a really cool feature to have!

Essential
Aug 14, 2003

twodot posted:

Is there any particular reason you need to unload assemblies?

I've used it for a Windows service, to allow the service to update a library without having to stop/restart the service. As far as I know, there isn't a way to update a library once it's been loaded. I don't know if that's a security restriction, but at the time it was the only thing I found that would work.

Essential
Aug 14, 2003

twodot posted:

There's nothing inherently wrong with doing this, it's just that Windows services is not really a target scenario for .NET Core. When you're thinking about .NET Core think about things like Windows Store apps (Phone and Client) or ASP.NET.

Ah, got it thanks. I should have read the article first :S

Essential
Aug 14, 2003
I'm laptop shopping and hoping no one minds me asking a few questions here. I'll be using this strictly for Visual Studio, SQL Mgmt Studio and a few other business apps (FileZilla, Notepad++, etc).

I'm not sure what size screen to get. Do any of you work on a 15" screen and feel it is sufficient? Or is 17" the way to go? What about graphics cards: do I need a dedicated one, or is the built-in one OK?

Right now all my development is done on pretty high-end PCs so everything works super fast. I have dual 22" monitors where I can do all the UI work. I'd prefer a smaller laptop just for ease of traveling.

I'm looking at Dell because I'm not sure what else I would buy.

Essential
Aug 14, 2003
Thanks everyone for the help and suggestions!

Seems the consensus is that an SSD is a must. I'd like to get the smallest laptop possible while still being productive. I'm going to Best Buy after work to play around with different laptop sizes. I'd like to see how small < 15" really is. It would be nice to have a really portable laptop.

The Surface Pro 3 is very tempting, but it's quite a bit more expensive than a laptop with the same specs.

Essential
Aug 14, 2003
^^^^^^Listen to Ithaqua over me; I guess I'm wrong. However, I'll leave what I've experienced here.


aBagorn posted:

Should I be disposing context every chance I get to avoid race conditions?

I can't speak to caching/load balancing, but yes, you should dispose of the context every time you use it. Always wrap it in a using statement. I gather it's meant to be thrown away, with a new one created each time.

If you don't, you'll get weird poo poo happening, and it won't always manifest itself right away, making the problem that much less obvious. I also had some really, really bad performance issues that were solved when I stopped trying to reuse a context.


VVVV Hah, well yeah that's what I meant. :)

Essential fucked around with this message at 01:14 on Feb 10, 2015

Essential
Aug 14, 2003

TheEffect posted:

Can I use File.Copy to copy directories, or should I be looking at something else? I noticed it says the destination can't be a directory, but I can work past that.

Maybe a better way to ask this is- my application lets users select either files or directories and the plan is to then copy said selected files/directories to user-specified locations. Is File.Copy what I should be using?

You can use the Directory class to enumerate the files in a directory (and create the destination directory) and then, yes, File.Copy them to the destination. There is also a FileSystem class that has a CopyDirectory method: https://msdn.microsoft.com/en-us/library/ms127957%28v=vs.110%29.aspx I've never used it myself, but it may be a more direct route.

The FileSystem method might include all subfolders and files, so you'll want to check that.
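A recursive sketch of the Directory-class approach (`DirCopy.CopyDirectory` here is my own helper, not the FileSystem one linked above):

```csharp
using System.IO;

static class DirCopy
{
    // Create the destination, copy each file, then recurse into subfolders.
    public static void CopyDirectory(string sourceDir, string destDir)
    {
        Directory.CreateDirectory(destDir);

        foreach (string file in Directory.GetFiles(sourceDir))
            File.Copy(file, Path.Combine(destDir, Path.GetFileName(file)), true);

        foreach (string sub in Directory.GetDirectories(sourceDir))
            CopyDirectory(sub, Path.Combine(destDir, Path.GetFileName(sub)));
    }
}
```

The `true` argument to File.Copy overwrites existing files; drop it if that's not the behavior the user expects.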

Essential
Aug 14, 2003

fleshweasel posted:

Hahahahaha how do people use this loving language

A lot of people (myself included) use VB.NET. We're not weirdos/idiots.

Essential
Aug 14, 2003
I have a Team Foundation Server question:

If I edit a file while logged into Team Foundation Server (via Code->Browse->Edit) how does versioning work with that? Does it save the changes as a new version? What happens if I have the file checked out on a different computer?

The reason I ask is because I want to make some edits but don't have my dev machine with me. Would this be a good time to look into Visual Studio online, that way I get intellisense and possibly full version control?

Edit: I should clarify, I have a visual studio online account by virtue of having TFS. Maybe I'm wrong, but I thought that there was a web based VS version and that's why they migrated TFS into visual studio online a while back. I only use my VS online account for TFS.

Essential fucked around with this message at 18:53 on Apr 3, 2015

Essential
Aug 14, 2003

crashdome posted:

e: ^^^ I think he's referring to the VS Online ability to view code and edit on the fly right on the website.

I know that in Visual Studio online when you edit a file, the save button is effectively a 'check in' and adds a new change set with the comment "Updated <filename>."

Yep, that's exactly what I'm referring to. So it sounds like I can edit and "check in" the file.

I have an MSDN subscription, so it's not a matter of not having access to VS, but I'm using my in-laws computer and don't want to install VS just to make a couple of changes.

crashdome posted:

The web editor is nice but very, very far off from using even Visual Studio Express (which I don't think supports TFS??). You can always get a single license of VS through VS Online by adding one paid user subscription or by paying for a VS Azure VM (anyone have any experience with this?).

For some reason I thought that a true web based version of VS was already out (not a full featured version, but something you could get intellisense & compile code in). I must be way off on that. It would be awesome if the web editor had intellisense though.

Ithaqua posted:

VSO stuff

Yep, that's exactly how I use it now on my dev machines. In this case, my wife made me leave my laptop at home for this trip and I'm trying to get a few things done via the web based VSO editor.

Essential fucked around with this message at 19:46 on Apr 3, 2015

Essential
Aug 14, 2003

Bognar posted:

I know you just came here for technical advice and probably don't want to hear opinions from some rear end in a top hat who knows both jack and poo poo about you, but: have you considered that your wife made you leave your laptop so you wouldn't try to work on your trip?

I feel bad for my mean response, so I'll just say "Yes Bognar, I'm aware of that".

crashdome posted:

You're probably thinking of "Monaco" which is AFAIK only available for Azure websites and not a full product yet.

Hm, maybe that's it. I'll check it out.

Essential fucked around with this message at 23:07 on Apr 3, 2015

Essential
Aug 14, 2003
Anybody have any experience setting up an https endpoint for an Azure cloud service? I followed this guide: https://support.microsoft.com/en-us/kb/2990804

I've set up the CNAME record on my public site, the certificate is loaded into Azure for that cloud service, and I can browse to the https cloud service directly (although I get a certificate error warning that the certificate was issued to another domain). However, I'm not sure how to force the CNAME redirect to the https endpoint. I think I may just need a redirect rule in the web.config or csdef file, but I can't find anything on how to set that up.

Essential
Aug 14, 2003

Bognar posted:

I'm not entirely sure what you're asking, but I think there may be a misunderstanding? The CNAME doesn't actually redirect HTTP requests, it just redirects the DNS request. If you have another server somewhere that needs to redirect to Azure, then you'll need to set that up separately. I've set up TLS on all of our Azure sites, but I need a bit more information about what you're doing to help.

Thanks Bognar.

Sorry for the confusion; I believe I'm mixing up terms here. My end goal is to access my WCF Azure cloud service over https from my Visual Studio projects. I'd like to set up a Service Reference from Visual Studio over https.

I have a CNAME record on my public site which redirects the DNS request to my Azure cloud service, site.cloudapp.net (WCF cloud service). I wasn't sure how to get that DNS request to access my https cloud service (https://site.cloudapp.net/service.svc). I've since got that solved by adding this to the web.config of my wcf azure cloud service project (this comes from Steve Marx's blog):

code:
    <rewrite>
      <rules>
        <rule name="Redirect to HTTPS">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTPS}" pattern="off" ignoreCase="true" />
          </conditions>
          <action type="Redirect" url="https://{SERVER_NAME}/{R:1}" redirectType="SeeOther" />
        </rule>
      </rules>
    </rewrite>
That seems to have solved the issue of the DNS request going to my https cloud service url.

However, I'm unable to add a service reference from Visual Studio to my https endpoint. I just got off the phone with Azure support and they're escalating it to a WCF expert. The support rep narrowed it down to what he believes is a problem with the httpBinding and/or with the metadata file pointing to http rather than https. The metadata is available over http, just not https. In the http metadata I can see an http url, and the https svc page is trying (I think) to link back to the http metadata file and can't.

I think I need to set up an https binding in my WCF project and/or my client app. It looks like I need an https metadata file.

EDIT: All I needed to do was expose the metadata over https: httpsGetEnabled="true". That seems to have worked!
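For reference, the fix described in that edit is one attribute on the service behavior. A minimal fragment (the behavior name and the rest of the `<system.serviceModel>` section are omitted here):

```xml
<behaviors>
  <serviceBehaviors>
    <behavior>
      <!-- publish the service metadata over HTTPS so clients can
           Add Service Reference against the https endpoint -->
      <serviceMetadata httpsGetEnabled="true" />
    </behavior>
  </serviceBehaviors>
</behaviors>
```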

Essential fucked around with this message at 18:05 on Jun 17, 2015

Essential
Aug 14, 2003

epalm posted:

Is it just me, or is Azure just irreparably broken? I need to vent a little bit.

Unfortunately I can't provide any help with your current situation, but I can tell you that I pay the $30-per-month support fee for one of my Azure accounts, and the help was tremendous both times I've needed it. It's seriously been the best tech support I've ever received. It was so good the first time that I've kept the subscription going for 6 months now, and I ended up using it again a few weeks back. I think you can pay the $30 for one month, get support, and then cancel if you don't want to continue paying.

Both times I've used it the support rep has connected to my pc and then to my azure account and solved my issues. Each time they showed me various things/tricks/tools that I would never know about and both times they were able to get me going within 2 hours of when I first starting talking to them. They were super knowledgeable and explained everything they were doing. They have experts on all things azure and they may have someone who can solve your exact scenario.

Essential
Aug 14, 2003

epalm posted:

I decided to bite the bullet and purchase developer support. When I try to open a support ticket, I'm asked to choose a subscription, and developer support isn't in the list. If I choose my BizSpark subscription, I'm asked to choose a support plan, and developer support isn't in the list. Now what.

I just tried this and was able to get there by doing this:

Log into windowsazure (this is the old portal). Select username from top-right, select Contact MS Support (this now redirects to the new azure portal: portal.azure). Select Create Support...On step 4 choose Additional Options from dropdown and below the BizSpark Access ID/Contract ID there should be a link "Purchase support plan". When I click that link I see the different support plans, Developer being one of them.

Essential
Aug 14, 2003
One of my apps uses a third-party REST API and they've recently implemented a 5-requests-per-second limit. We're hitting that limit a lot, mostly because we have 15 systems spread out across the US and some of them run at the same time (each one fires off about 25 requests). So while an individual account doesn't have the issue, when 5 run at the same time it becomes a problem. Currently we use HttpClient to send the request async and await the HttpResponseMessage result. The requests do not come from the client, but from an Azure cloud service. The clients send an alert to the cloud service, which then builds & sends the requests on behalf of the client. This is fire-and-forget; the clients do not need to know the response.

I'm guessing that we need to implement a queue for this? We have an Azure account, so 2 queue systems are available (Queue Storage and Service Bus Queue). The queue will only be for these requests, so I'm not worried about multiple apps being able to subscribe. Or maybe this isn't a good fit for a queue and instead there is something else we should do? Maybe we can use the existing cloud service and implement rate limiting on it?

Whatever we come up with needs to be fair to their servers, so we can't retry over and over until it goes through. We want to make sure that we're being responsible on our end and eliminate any unnecessary calls to their server.

Anyone have any advice/direction?

Essential fucked around with this message at 16:52 on Jul 31, 2015

Essential
Aug 14, 2003

Bognar posted:

Assuming these 15 systems are expecting some sort of response from that server, then I don't think a queue is a great thing to use. If what you're doing is more fire-and-forget then a queue would be fine.

My immediate thought is to set up a proxy server for the API. Your servers make requests to the proxy, it forwards the requests on to the API but also rate limits itself.

The other option is to get on the phone and see if you can work something out with the API provider so you have a higher limit.

Sorry, I forgot to include that it is fire-and-forget; nothing about these requests needs to make its way back to the systems. I probably should have explained this better, but the requests come from an Azure cloud service, not the clients themselves. The clients just send an alert to the cloud service that they are finished, and then the cloud service builds and sends the requests on behalf of the client. So in effect, I already have a central spot where the requests are being built & sent, but I don't have any rate limiting.

I'll take a look into the proxy server as well, I wasn't aware I could do that.

And I've already been on the phone with them and this is my only option.

Edit: I edited my first post on this to better explain how the current system works.

Essential fucked around with this message at 16:44 on Jul 31, 2015

Essential
Aug 14, 2003

Bognar posted:

Here's a dead simple async rate-limiter I just threw together:

Wow, you're awesome, thank you!

I've never used SemaphoreSlim before; can you clarify how this works? Is the rate limiter basically a 250-millisecond delay that creates its own thread via SemaphoreSlim? And is that why it won't block the main thread?

This is a WCF cloud service and, to be honest, I don't have any real understanding of which thread(s) are in play. Because other methods in this WCF service are being called constantly, does using SemaphoreSlim make sure that no other methods are blocked? I'm not sure I'm even thinking about this right; I just know that all other methods handled by the cloud service must continue to function while the rate limiter is doing its thing.

I also don't know what kind of performance impact this will have. We can always increase the performance on this server or I suppose create a new server just to handle this if it was ever necessary.

Essential
Aug 14, 2003

Bognar posted:

A semaphore is a synchronization primitive that only lets N things access it at once (where N here is one, aka a mutex). The C# SemaphoreSlim implementation allows us to wait on it asynchronously, so there shouldn't be any thread blocking.

The RateLimiter just uses the semaphore as a barrier to entry for its WaitAsync method (_semaphore.WaitAsync()), so only one task can be executing that code at once (released with _semaphore.Release()). The other two lines just wait on a delay task, then create a new delay task for the next caller to wait on.

The best real world example I can give is this. It's a room with two doors that has enough space for one person. There's a line out the front door for people waiting to get in. The front door only opens when the person inside the room has left through the back door. Inside the room is a timer - the back door unlocks only once the timer goes off and the person in the room restarts the timer.

Regarding the main thread, I don't think WCF Cloud Services have a "main" thread - not one that's relevant anyway. Requests are handled via different threads on the thread pool. When waiting asynchronously, threads are returned to the thread pool so you don't need to worry about blocking other methods. This also shouldn't impact performance in any noticeable way.
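Bognar's actual snippet isn't quoted in this thread, but a minimal async rate limiter matching his room-with-two-doors description (a one-at-a-time semaphore plus a chained delay task) looks something like this sketch; the interval is parameterized rather than hard-coded to 250ms:

```csharp
using System.Threading;
using System.Threading.Tasks;

public class RateLimiter
{
    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
    private readonly int _intervalMs;
    private Task _lastDelay = Task.FromResult(0); // starts already completed

    public RateLimiter(int intervalMs)
    {
        _intervalMs = intervalMs;
    }

    // Callers line up on the semaphore (the front door); each one waits
    // out the previous caller's delay (the timer), then starts a fresh
    // delay for whoever comes next (restarting the timer on the way out).
    public async Task WaitAsync()
    {
        await _semaphore.WaitAsync();
        try
        {
            await _lastDelay;
            _lastDelay = Task.Delay(_intervalMs);
        }
        finally
        {
            _semaphore.Release();
        }
    }
}
```

Because both waits are asynchronous, no thread-pool thread is blocked while callers queue up, which is the property discussed above.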

Got it, again thank you so much!

Regarding the aggressiveness: if I'm understanding this right, I could probably drop the RateLimiter down to 100ms or even a bit less. The actual request to the API will always take a minimum of 100ms to return, just due to latency etc. Therefore, if the RateLimiter gives me a 100ms delay and the API request takes 100ms, I'm at most at 5 per second. Yes, this adds a delay even when one may not be necessary, but given how simple this is to implement, I'm not sure that really matters.


VVVV I see, thanks! VVVV

Essential fucked around with this message at 20:51 on Jul 31, 2015

Essential
Aug 14, 2003
I found this extension a while back when I needed an epoch date. In my case I'm lucky in that I don't care about the time; I'm simply converting the date to epoch:

code:
        internal static long GetUnixEpoch(this DateTime dateTime)
        {
            var unixTime = dateTime.ToUniversalTime() -
                new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

            return (long)unixTime.TotalSeconds;
        }
Jon Skeet has created http://nodatime.org/ , I've never used it but it may have something more akin to what you guys are trying to do.

Essential
Aug 14, 2003
What options are available for security when connecting a web service to a remote MySQL database, specifically on the web service side? I'm not aware of any. I have a client who is finally starting to get serious about security, so they bought and installed an SSL cert and now only allow secure connections to their database.

My understanding is that security lies entirely on their end: implementing SSL and restricting users to only the db/tables/roles that are needed. I have added SslMode=Required to my connection string, but I have no idea what else they expect me to do.

However, their IT guy is insistent that I must implement security measures on my end, and I'm really at a loss as to what to do. The connection string is stored in web.config (not encrypted). He sent me their PEM cert file for their SSL, which they've already implemented to secure connections coming into their server; I'm trying to explain that I don't need the cert, but he thinks I do.

Am I missing something obvious/important here that I really do need to do?

Essential
Aug 14, 2003

EssOEss posted:

Set goals, define the attack vectors you want to defend against, plan the security approach. Just throwing security measures at the wall is better than nothing but I recommend you have a sit-down with your client and talk through what the purpose of this is and then figure out the appropriate steps to tackle it.

Good point and that's effectively what the conversation I had with them turned into. What they are ultimately after is secure connections to their server and db users having the minimum rights necessary to do what they need to.

Essential
Aug 14, 2003
Has anyone created Task Scheduler jobs from code? It looks like it's possible from a command line. Another option seems to be to export a task as a file and then copy it into the correct folder.

I need something that works from XP on up. Are there any recommended libraries?

edit: found this: https://taskscheduler.codeplex.com/ It looks pretty good and is available via NuGet.

Essential fucked around with this message at 19:23 on Dec 8, 2015

Essential
Aug 14, 2003

Ithaqua posted:

It sucks. There's a com api but it's incomprehensible. Windows server 2012 has powershell cmdlets.

Personally, I just export /import them when I need to have scheduled tasks exist as part of environment configuration.

Bummer to hear that library sucks. I'll look into the export/import stuff.

In case I'm heading in the wrong direction and there's a better solution: all I'm attempting to do is make sure my app is up and running. It uploads data every 15 minutes and sometimes it gets turned off for various reasons. A lot of the computers that run it never get rebooted, so run-at-startup never fires. I'd like a task that runs once an hour or so and, if the app's not running, starts it up. It really only needs to run M-F from 8am-6pm, but I don't know that I need to get that complex with the task scheduler.

It can't be a Windows Service for various reasons. It's a system tray app and has to stay that way. And ideally I can set it all up automatically without anyone having to manually do anything.

Also Ithaqua, I sent you a PM. If it's not something you can help with then please feel free to ignore.

wwb posted:

FWIW we have moved most things that we triggered via scheduled tasks into the app as Quartz.NET jobs. Self contained is cool.

Cool, thanks, I'll check that out as well! Now that I've explained my use case, would Quartz.NET work?

Essential
Aug 14, 2003
When there's an error inside a "using" statement wrapping a connection object, does the connection still get closed? In my case I have an OdbcConnection in a using statement that's throwing an error occasionally, and I'm not sure if the connection is being closed. We have an open connection somewhere, so I'm trying to narrow down the issue.

I should add that the using statement is wrapped by a try/catch.

Essential
Aug 14, 2003

gariig posted:

A using statement is syntactic sugar for a try/finally. Being inside the try/catch is fine; just realize the object will already be disposed.

And Dispose calls Close. Got it, thanks!
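A self-contained check of that try/finally expansion, using a throwaway IDisposable in place of the connection (`TrackedResource` is made up for the demo):

```csharp
using System;

class TrackedResource : IDisposable
{
    public bool Disposed;
    public void Dispose() { Disposed = true; }
}

static class UsingDemo
{
    // Because 'using' compiles down to try/finally, Dispose (which for
    // a connection calls Close) runs even when the body throws.
    public static bool DisposeRunsOnThrow()
    {
        var res = new TrackedResource();
        try
        {
            using (res)
            {
                throw new InvalidOperationException("boom");
            }
        }
        catch (InvalidOperationException)
        {
            // swallowed here, like the try/catch wrapping the real code
        }
        return res.Disposed;
    }
}
```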

Essential
Aug 14, 2003
I'm implementing a REST service to allow an outside company CRUD access to some of our data. There's probably a lot of things I'm missing in setting all this up, but what I'm currently scratching my head on is what to do when a complex object type is passed in as an update.

For instance, if it is a person object, how many of the fields should be optional? Should they only include things they want changed? Let's say they want to change the person's last name, when they pass in the DTO, should I expect only the personID & LastName properties to be populated and everything else will be null? Then I ignore everything except the LastName? I'd like to avoid having a bunch of "if (dto.property != null)" for checking what they've populated, but right now that's the only thing I can see to do.

I'm trying to keep this pretty simple, so I have my Web API 2 project with very simple controllers (GET, POST, PUT) that go off to a data access layer. I tried to ignore complex data types at first, but realized I had to have them for the update. I don't have anything except the default Web API 2 stuff.

Also, am I correct in assuming they will be sending the complex data types in the request body, and I extract them from there? Is there any way (AutoMapper/binding?) to extract from the body right into an object/entity on my end? I think what I want is model binding?

Essential fucked around with this message at 19:46 on Jan 28, 2016

Essential
Aug 14, 2003

Mr Shiny Pants posted:

Nope, this is pretty hard.

Malcolm XML posted:

Ugh just make it simple and have your resource have a replace endpoint via PUT or POST so the client has to return the entire changed resource and then maybe run validation

Bognar posted:

This really is the simplest way. Trying to figure out whether null means null or if it means "don't update this value" is too much headache to deal with.

Got it, thanks guys, that makes sense.

EDIT: Changing postBody to string postBody = "{\"LastName\":\"Clark\"}"; works, so I guess I didn't have properly formatted JSON.

Do I just provide them a sample of how the request body should look, or provide them a copy of the properties? How will they properly format the correct object into JSON? In this case they are using Java, but of course a REST service shouldn't care what language anyone with access is using. I understand this is getting into documentation, but is a simple example of what the JSON object will look like sufficient?

Essential fucked around with this message at 21:44 on Jan 28, 2016

Essential
Aug 14, 2003

Sedro posted:

APIs use PUT and PATCH for this. PUT updates a whole object, replacing its attributes. PATCH interprets missing values as the current value.

Don't bother implementing PATCH if it's just premature optimization.

Thanks, yeah just having a PUT requiring the full object does make sense.

zerofunk posted:

The WebAPI Help Page nuget package makes it pretty easy to generate documentation. Whether or not the auto generated stuff is enough, probably just depends on the user and what they're expecting. You can of course customize it if necessary. I found it to be a good start when doing something similar for a client. Of course, said client never actually ended up using the API from what I understand. Go figure.

Awesome thanks, I'm looking into that right now.

Essential
Aug 14, 2003

Calidus posted:

Anyone ever had to use the QuickBooks .NET API/SDK? General thoughts? Easy to use?

I'm in the same boat now and wondering if anyone has anything they can share. We're doing this specifically for QB Online, and it's been easy enough to connect to a test dev company, but figuring out exactly what to query isn't very easy. For example, I'd like to get all expenses for a given date range. I think those are Payment Transactions? I'm also wondering if I can get historical data on things like the Chart of Accounts and Profit/Loss statements.

On a more .NET note, the Web API Help Page that zerofunk pointed me to worked great. It's not perfect, but it's a huge help in getting something up and running. There are quite a few examples of people extending it and making it much more informative. One thing that's really nice: if you have a complex type passed in the body of the request, the help document will show the body parameters plus sample payloads (JSON/XML). That was the thing I most wanted, and I got it without any trickery.

Essential
Aug 14, 2003
Is there a difference between using the VS built-in code signing (Project Properties->Signing->Sign the assembly) vs. using the SignTool utility to sign an assembly?
