Opulent Ceremony
Feb 22, 2012

gently caress them posted:

I basically want to have a searchable, jQuery DataTables-type table that shows results from ~1200 total rows of a DB, which come down to "metadata about admin orders and a file path to a scan of the Judge's order bla bla bla." Do I want a WCF data service or a WCF service? Is there even a real difference? I was assuming just fork over the entire table and search/paginate, but I've also seen tutorials where every time someone changes the search string it actually queries the DB and sends the results. Maybe it's how green I am, but it seems like if the list of rows is that low you'd be better off just sending it all once at page load, especially if it's an intranet app.

If you are using jQ DataTables you can choose to load partially from the server (sending a new request each time the user changes a filter) or load it in its entirety at once and let jQDT do all of the paging and filtering client-side. I think 'bServerSide' is the initialization option that determines this.

If you choose to let jQDT load all the data at once and filter client-side, you can also override and extend the built-in filtering via their API as well.


Opulent Ceremony
Feb 22, 2012

gently caress them posted:

in LINQ, what's the idiom for saying "where c.category is 1, 2, 13, 20, 21, or 23?"

If I'm reading you right: new List<int>{ 1, 2, 13, 20, 21, 23 }.Contains(c.category). Works for LINQ to Entities if you define the list outside the LINQ code.
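Something like this, as a sketch (names hypothetical; written against an in-memory sequence so it runs anywhere, but the same Contains shape is what LINQ to Entities translates into a SQL IN clause when the list is defined outside the query):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class ContainsDemo
{
    // Defined outside the query expression, as suggested above, so
    // LINQ to Entities could translate Contains into a SQL IN clause.
    public static readonly List<int> Categories = new List<int> { 1, 2, 13, 20, 21, 23 };

    public static List<int> Filter(IEnumerable<int> categoryIds)
    {
        // Keeps only the ids that appear in the Categories list.
        return categoryIds.Where(c => Categories.Contains(c)).ToList();
    }

    public static void Main()
    {
        var result = Filter(new[] { 1, 5, 20, 99 });
        Console.WriteLine(string.Join(",", result)); // 1,20
    }
}
```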

Opulent Ceremony
Feb 22, 2012

gently caress them posted:

Would items = items.Where(Function(x) x.trTypeID.Equals(CategoryMapper(_company))) work?

CategoryMapper returns the List, right? So it doesn't make sense to compare a single int to a List of ints. If you just want to further filter your items so that only those whose trTypeID is within the list returned from CategoryMapper remain, this should work:

items = items.Where(Function(x) CategoryMapper(_company).Contains(x.trTypeID))

Although you should probably assign the return value of CategoryMapper to a variable beforehand if that function does any real work; you don't want to execute it for every item in your items list.
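In C# terms, the hoisting looks like this (CategoryMapper and trTypeID are the names from the post; everything else is a made-up stand-in, with a call counter just to show the mapper runs once):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class HoistDemo
{
    public static int Calls; // counts how often the mapper actually runs

    // Stand-in for the CategoryMapper function described above.
    public static List<int> CategoryMapper(string company)
    {
        Calls++;
        return new List<int> { 1, 2, 13 };
    }

    public static List<int> FilterHoisted(IEnumerable<int> trTypeIds, string company)
    {
        var categories = CategoryMapper(company); // evaluated once, up front
        return trTypeIds.Where(id => categories.Contains(id)).ToList();
    }

    public static void Main()
    {
        Calls = 0;
        var filtered = FilterHoisted(new[] { 2, 7, 13, 99 }, "acme");
        Console.WriteLine(string.Join(",", filtered)); // 2,13
        Console.WriteLine(Calls);                      // 1
    }
}
```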

Opulent Ceremony
Feb 22, 2012

Bognar posted:

In our own applications at work, we have used Expressions in numerous powerful ways to generalize common query operations and reduce code complexity.

Care to share any examples of this? These always interest me.

Opulent Ceremony
Feb 22, 2012
I appreciate you taking the time to write all that out! Very neat stuff (that I'm going to need to re-read a couple times).

Opulent Ceremony
Feb 22, 2012

aBagorn posted:

Should I be flattening in the service layer? I am mainly using that for data access and some calculations that need to be done before objects are saved to the DB.

We do all our DTO flattening at the Service/EF layer because we use AutoMapper projections to select only the fields we want from the database.

aBagorn posted:

I had been serializing to JSON at that layer and passing up to the Web API in that format

What? You are serializing before you get to the WebAPI layer? One of the main points of WebAPI is that you can have a Controller Action like
code:
public IEnumerable<Junk> GetJunks()
and the client can decide what format they would like it serialized in via headers. If your WebAPI Actions look like
code:
public string GetJunks()
because the Service method GetJunks uses has already turned the data into a JSON string, that is wrong.

Opulent Ceremony
Feb 22, 2012

In your first example, it looks like you were simply creating some Entity objects with new. While they may be the same class you see in a DbSet on your DbContext, they are detached entities, and your current DbContext doesn't know anything about them; that is why it tries to insert a row when you add these detached entities to the DbContext.

Your second example works because you are directly modifying properties on attached entities: they are in your FooCollection (which I'm guessing is your DbContext in some capacity). The way to make your code work in the first example is to explicitly attach the entities to your DbContext, which can be done a couple of ways: http://stackoverflow.com/questions/20451461/save-detached-entity-in-entity-framework-6

I noticed in your first post you actually tried one of these suggestions and it didn't work, and I would guess it has something to do with the way your keys are set up as someone else said.

Opulent Ceremony
Feb 22, 2012

Bognar posted:

This is true, and I didn't consider it in my post since I haven't used EF with lazy loading in years. I've seen it cause way too many performance problems to even consider using it again. It turns idiomatic C# code that looks like it's just in-memory into code that touches the database, which inevitably leads to round trip DB calls in a foreach. Nasty.


As long as you're using a profiling tool in conjunction, it's no problem. MiniProfiler for ASP.NET MVC will even alert in red at the highest level to indicate a duplicated query, which usually means an N+1 EF lazy-loading issue.
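For illustration, here's a toy stand-in (no EF involved, all names made up) for the 1 + N round-trip shape that shows up as duplicated queries: code that looks like plain in-memory iteration but fires one extra "query" per parent row.

```csharp
using System;
using System.Collections.Generic;

// A toy stand-in for a lazy-loading ORM, just to show the N+1 shape
// that a profiler like MiniProfiler flags as duplicated queries.
public static class FakeDb
{
    public static int QueryCount;

    public static List<int> GetBlogIds()
    {
        QueryCount++; // one query for the parent rows
        return new List<int> { 1, 2, 3 };
    }

    public static List<string> GetPostsFor(int blogId)
    {
        QueryCount++; // one more round trip per parent: the "+N"
        return new List<string> { "post for blog " + blogId };
    }
}

public static class NPlusOneDemo
{
    public static int Run()
    {
        FakeDb.QueryCount = 0;
        foreach (var id in FakeDb.GetBlogIds())
        {
            var posts = FakeDb.GetPostsFor(id); // looks in-memory, hits the "db"
        }
        return FakeDb.QueryCount; // 1 + N round trips
    }

    public static void Main()
    {
        Console.WriteLine(NPlusOneDemo.Run()); // 4 round trips for 3 blogs
    }
}
```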

Opulent Ceremony
Feb 22, 2012

I don't even know where to start with any of this craziness. You know what makes sharing queries easy? Putting them somewhere outside of a controller in the first place. When you write unit tests for your multitude of microcontroller classes that are also doing EF, you are testing the same class, and probably the same method, both when you test whether the user is being routed to the proper place and when you test whether some state change occurred in your data based on user-provided inputs. And you can't make a separate project for model-related stuff, to allow re-use and quick extension by other projects that might want to use the same data, because you mixed your model stuff directly into ASP.NET.

Opulent Ceremony
Feb 22, 2012
How is that different from an endpoint that implements OData? They've already got nice tools for that, like Breeze.js.

Opulent Ceremony
Feb 22, 2012
I've got an issue that I hope someone can give me some ideas about. I've been hooking up a 3rd party vendor's webhooks to our own web application (that is, something event-worthy occurs with our data on their server, they send a POST request to our server to tell us about it) and I'm facing an issue in my staging environment that I didn't encounter in my dev.

This vendor sends a big pseudo base64-looking thing that contains a bunch of JSON and a JWS at the end. The JWS fails to validate every time (per their own SDK library I'm using, same one being used in both environments) on the staging server but never on my dev. Assuming they aren't doing something weird specifically for the requests going to our staging, failing JWS validation makes me think that somewhere along the way their POST body is getting modified by the staging server before it gets to the application level.

The dev server is IIS Express from VS2013, and staging is remotely hosted on Windows Server 2008 R2, IIS 7.5.

I've tried disabling any middleware-type things I could find that mess with requests (turned off UrlScan), and am fairly certain it isn't just truncating the POST data at a certain length. Also, I'm just a developer and know very little about servers; we have an actual server admin here who is trying to look into this as well.

Any thoughts of where else to look or think about would be very helpful, thanks.

Opulent Ceremony
Feb 22, 2012

Bognar posted:

Have you inspected the actual POST data to see if it looks like it's being modified? I'd be surprised if your server is doing any modification to it before it reaches your application. I'm more inclined to think that there's something environment specific in the JWS validation that's causing it to fail. Have you tried capturing a request to your staging server and running it through dev to see if the validation still fails?

Yes, I've inspected the POST from both dev and staging, and they don't look different as far as I can tell (this involves decoding it as base64 with a method that doesn't break on imperfect base64 encoding); they both have a bunch of JSON that looks about the same and the JWS at the end. I agree that an environmental difference in the JWS validation should be considered, but I'm not sure what else I should be checking. I know IIS servers identify the .NET version (or is it the ASP.NET version?) of their app pool, but I don't think that corresponds to the version of .NET the application is running on, since the app pool is 4.0-something while this application targets 4.5.2 and we don't have any issues.

Unfortunately I can't take a request captured on staging and run it through dev, because a different exception gets thrown from the vendor library, before the JWS fail exception (confirmed with ILDASM), if their timestamp is off from system time. Maybe I can just change my dev system time to get around that and try to validate the JWS that way.

Opulent Ceremony
Feb 22, 2012

EssOEss posted:

Do I understand this right that there is broken base64-encoded data coming in?

I'm sorry, it looks like the data is a JWT, I thought it was base64-encoded because it looked like it and attempting to base64 decode provided mostly good results.

EssOEss posted:

At any rate, can you manually validate the JWS?

I tried putting in my data captures to http://jwt.io/ and neither validated there, even the data that the vendor library validated, so now I'm just more confused. Either way you are right, I'm in the process of talking to folks at the vendor company, still trying to jump through filtering hoops to find someone knowledgeable.

Edit: Probably didn't validate on that page since I didn't have the vendor secret!

Opulent Ceremony fucked around with this message at 21:59 on Sep 17, 2015

Opulent Ceremony
Feb 22, 2012

Ithaqua posted:

I have extensive experience with VSO (it's a big part of my job). VSO is awesome. Work management, SCM, build, release, testing, and soon package management all integrated together is rad.

We currently use VSO for our Git repositories and work tracking, and we want to add some automated build and deployment to our process. Is the VSO offering for that available yet, and if not, is it worth waiting for? Other things we've looked at are TeamCity, CC.NET and Jenkins. Basically, we want a commit to a Git repo to trigger a build, run unit tests, and, if they pass, deploy to our remote server. Would be nice if it could handle the db migrations from EF Code First as well.

Opulent Ceremony
Feb 22, 2012
Is there something about React I'm not understanding? It seems like the purposeful removal of 2-way data binding just means a lot more boilerplate code for common scenarios.

I'm imagining any sort of CRUD app whose purpose is to pull an entity from the DB, let someone modify its properties on the front-end, then eventually save it back. It seems like to do this with React, you need to make an event handler for every input field tied to a property so you can determine what to do with the change. With Knockout, a little hooking up in the DOM and it keeps track of all the changes done on the front-end within a JS object, and then I can write a little function to serialize this object and send it to the server to be updated in the DB on a Save button click or something.

Opulent Ceremony
Feb 22, 2012

Ithaqua posted:

You shouldn't do either of those in a web application since long-running threads are subject to arbitrary cancellation by the web server.

This is correct, and the current built-in suggestion for ASP.NET on .NET 4.5.2+ is HostingEnvironment.QueueBackgroundWorkItem.

Opulent Ceremony
Feb 22, 2012

Calidus posted:

Is there any way to prevent the EF templates from overwriting modified classes? I have added constructors to the majority of my classes, and I don't want those changes getting nuked every time I run update model from database.

Haven't done DB-first EF in a while, but aren't the model classes partials? You can just make a new partial class file somewhere else to hold the things you don't want the code generator to touch.
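For illustration (hypothetical Order entity; in a real DB-first project the first part would live in the designer-generated .cs file, the second in your own file):

```csharp
using System;

// --- What the EF code generator would emit; safe to regenerate. ---
public partial class Order
{
    public int ID { get; set; }
    public string Customer { get; set; }
}

// --- Your own file, which "update model from database" never touches. ---
public partial class Order
{
    // Keep a parameterless constructor so EF can still materialize entities
    // (adding any constructor removes the implicit default one).
    public Order() { }

    // Hand-written convenience constructor that survives regeneration.
    public Order(int id, string customer)
    {
        ID = id;
        Customer = customer;
    }
}

public static class PartialDemo
{
    public static void Main()
    {
        var order = new Order(42, "Goon Industries");
        Console.WriteLine(order.ID + " " + order.Customer); // 42 Goon Industries
    }
}
```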

Opulent Ceremony
Feb 22, 2012

spiderlemur posted:

The biggest hindrance in making that Repository class mockable is the hard tie-in with Identity Framework. The repo is initialized automatically from a base class that each controller inherits from, which calls GetUserId() to set up some permissions inside the Repository and its LINQ queries.

There's nothing hard about mocking that; we have the same setup. Just make the property that you attach the Repository to in your base controller an interface that your EF DbContext Repo class also implements. Have the constructor for the base controller instantiate the EF one. Then make a separate unit-test project with a fake Repo class that implements the same interface, and write your unit tests against that.
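A rough sketch of that shape (all names hypothetical; no EF or Identity here, just the interface seam that makes the fake possible):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// The interface both the EF-backed repository and the test fake implement.
public interface IRepository
{
    IEnumerable<string> GetDocumentsFor(string userId);
}

// Hypothetical base controller: in production its constructor would new up
// the EF-backed repository; tests hand in a fake through the same seam.
public class BaseController
{
    public IRepository Repo { get; }
    public BaseController(IRepository repo) { Repo = repo; }

    public int CountDocuments(string userId) => Repo.GetDocumentsFor(userId).Count();
}

// Fake repository for the unit-test project: no database, no Identity.
public class FakeRepository : IRepository
{
    public IEnumerable<string> GetDocumentsFor(string userId) =>
        userId == "admin" ? new[] { "a.pdf", "b.pdf" } : Array.Empty<string>();
}

public static class MockDemo
{
    public static void Main()
    {
        var controller = new BaseController(new FakeRepository());
        Console.WriteLine(controller.CountDocuments("admin")); // 2
    }
}
```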

Opulent Ceremony
Feb 22, 2012
The main reason to use AutoMapper is ProjectTo with EF, which lets you select specific columns from a table without having to write the same Select(x => new {}) in multiple places.
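For illustration, this is the duplicated projection it saves you from (hypothetical Person entity and DTO; AutoMapper's ProjectTo would build an equivalent Select expression so EF only queries the projected columns):

```csharp
using System;
using System.Linq;

// Hypothetical entity and DTO, just to illustrate the projection.
public class Person { public int ID; public string Name; public byte[] Photo; }
public class PersonDto { public int ID; public string Name; }

public static class ProjectDemo
{
    // The hand-written Select you would otherwise repeat at every call site;
    // note it never touches the large Photo column, so EF wouldn't fetch it.
    public static IQueryable<PersonDto> ToDtos(IQueryable<Person> people) =>
        people.Select(p => new PersonDto { ID = p.ID, Name = p.Name });

    public static void Main()
    {
        var source = new[] { new Person { ID = 1, Name = "Ada", Photo = new byte[1024] } }.AsQueryable();
        Console.WriteLine(ToDtos(source).First().Name); // Ada
    }
}
```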

Opulent Ceremony
Feb 22, 2012
I've got an ASP.NET question. I'm looking at the .NET SDK for AWS S3 (http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrievingObjectUsingNetSDK.html) for downloading a file from S3, which allows you to grab onto the ResponseStream of a file coming down from AWS.

If I were to, within my web server, grab that ResponseStream and just return it to the client in an MVC FileStreamResult (or whatever is necessary), I'm guessing that avoids loading the entire file into memory at once on my web server. But does it require my web server to sit around babysitting that stream so it can pass through to the client, or would it hand off the AWS stream to the client and end the process for my web server so it can get to other requests?

Thanks for the quick response VVV

Opulent Ceremony fucked around with this message at 19:42 on Feb 20, 2017

Opulent Ceremony
Feb 22, 2012
I'm looking for a little guidance into figuring out a problem we're having. Once or twice a day, we will get a quick series of EF db timeouts on a very specific set of methods. I can't reproduce it even with the same data so it probably has something to do with concurrency related to the database, but I feel like EF plays a role as well. We are using EF 6.1.3 CodeFirst and SQL Server 2008 R2.

The timeouts always occur in the SaveChanges() method, meaning updating or writing. Two of the update queries look almost exactly like this:

code:
UPDATE [dbo].[Things]
SET [ViewStatus] = 1
WHERE ([ID] = '339f1405-d956-40d8-b548-886dc09b7a84')
Where ID is a Guid primary key; there's also an insert query that times out in the same manner. So something is preventing it from updating that table, but what? There are of course other select queries on that table happening at the same time (paging through, grabbing individual rows, etc.), but not a big load, and there are only ~130k rows; plus we have similar tables running similar queries that never time out this way. We are using whatever transaction defaults exist with the EF context.

What makes this table different? Possibly a few more select queries going on, and it has a column that can get relatively large (2-10KB of text data), though the updates aren't touching that specific column and most selects don't select it either.

I realize this reads as extremely open-ended but I'm hoping someone here has a suggestion on what is worth exploring to identify the problem, thanks.

Edit: The unraveled Exceptions look like this:

System.Data.Entity.Infrastructure.DbUpdateException: An error occurred while updating the entries. See the inner exception for details.
---> System.Data.Entity.Core.UpdateException: An error occurred while updating the entries. See the inner exception for details.
---> System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
---> System.ComponentModel.Win32Exception: The wait operation timed out

Opulent Ceremony fucked around with this message at 17:37 on May 12, 2017

Opulent Ceremony
Feb 22, 2012

B-Nasty posted:

You could possibly try sample update/insert queries outside of the application (in SSMS) to isolate it to the database, but I don't think EF is the issue here.

Thanks for your ideas. Is it your guess that some select is locking the table too long (especially if it is table scanning as a result of not using an index), which is preventing the update from running up until it dies from a timeout?

Opulent Ceremony
Feb 22, 2012
Thanks for the additional info. I was aware such a thing was a bad idea, but a quick check shows that, unfortunately, this table was already set up with the Guid primary key as clustered.

Opulent Ceremony
Feb 22, 2012

redleader posted:

Do the timeouts always occur at the same time of day? Do you run any periodic processes (in or out of the DB)?

They tend to be around similar times as previous days, though those tend to overlap with the high-volume times. We do run some periodic processes within the DB, but I've checked to make sure those aren't taking longer at those times. Our DBA says the DB backups happen at much earlier, lower-traffic times, so I might just be stuck with tuning up the various queries (which are in definite need of it), which is what I've been working on. If I can narrow it down any further I'll probably take it to the SQL thread; thanks for everyone's input.

Opulent Ceremony
Feb 22, 2012

a hot gujju bhabhi posted:

Am I going down the wrong track here? I'm kind of confused, surely people do this kind of thing all the time, it's making me feel kind of stupid!

VSTS has its own CI tools, which I think would be considered an alternative to Jenkins. For an example of how to use some TFS services without the whole MS pipeline: we have TeamCity on a server local to us that watches our TFS repositories for changes. When it finds a change, it starts the build process, which is just whatever series of steps we've set up. In our case, this is MSBuild, running unit tests, running some npm and node stuff, and then using MSDeploy to publish to a web server. In your case, you could add a step to deploy to Azure. TeamCity can construct these steps in a simple wizard-like fashion for common tasks, but ultimately you can just have it run whatever CLI command or script you want, and I imagine that is the case for any CI tool.

Opulent Ceremony
Feb 22, 2012

dick traceroute posted:

The issue was throttling on the data stores... Since been replaced by binding table storage and sql as function parameters

Are you talking about using the specific available bindings vs conventional coding to a db at a specific ip? https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference

Opulent Ceremony
Feb 22, 2012

roflsaurus posted:

Anyone have experience with Azure Cosmos DB / Document DB?

I'm looking at building a new email distribution engine, and wondering if it would be suitable. It would certainly make things nice being able to use Azure functions with a Cosmos DB trigger.

I'm new to NoSQL, so not sure how this would handle aggregate functions.

I have experience with Cosmos DB insomuch as I investigated it as a potential tool for a solution of my own several months ago, and I immediately ditched the idea when I read that the only aggregate function it supported was Count.

Edit: looks like it does COUNT, MIN, MAX, SUM, and AVG with the DocumentDB API, but that is still far too limited for what I needed.

Opulent Ceremony fucked around with this message at 17:07 on Feb 2, 2018

Opulent Ceremony
Feb 22, 2012

Space Whale posted:

Are there any known libraries to easily "log all the requests" or just grab the requests as strings to just shove in a database? I've seen some things to roll your own but this seems like an obvious candidate for "someone already did this."

https://www.elastic.co/elk-stack is great for this. Your web servers probably generate some form of request log file; you just feed that into ELK after you've found a Logstash template for your particular web server's log format, and it does the rest. If you are in a cloud environment where you don't have access to those files, there are generally other ways to hook web request logs into ELK that should be easily found by googling.

Opulent Ceremony
Feb 22, 2012
Is anyone in a position to give me a brief overview on whether it is a good idea to migrate our ASP.NET project from .NET Framework to Core? This is a medium-sized application that we hope to continue to maintain and add to for the foreseeable future, likely still running on Windows servers. One of the more serious concerns is whether MS will continue to sufficiently support regular ASP.NET.

The main potential downside to migrating, other than whatever time it would take, is that I'm sure we've got more than a few dependencies that don't have a .NET Standard version. It appears you can have an ASP.NET Core application that still targets .NET Framework, which I think would avoid those kinds of dependency issues, but would that hinder any benefits of the migration? We have a ton of Entity Framework code as well, which I know has a port in .NET Core, but I'm definitely afraid of how it might affect our existing queries: it looks like some earlier versions of the port didn't support some types of SQL generation, though I'm not seeing people complain about that as much now.

Has anyone gone through this process and found it worthwhile or abandoned it? Any thoughts are appreciated, since I'm not all that clear on what we would get out of it, other than to be working with the latest tech that MS is working on.

Opulent Ceremony
Feb 22, 2012
Thanks for the responses mystes and EssOEss; sounds like it won't be a trivial undertaking but probably worth doing for us.

Opulent Ceremony
Feb 22, 2012
Does anyone here have experience with Azure Event Grid? I'm having a weird problem.

My setup is: an Azure Media Services account and an Azure Function App, with two instances of each, each set of Media Service/Function App representing a different environment, let's call them environment 1 and 2. They are all a part of the same Azure Subscription, Location, Resource Group, and for the Media Services, Storage Account.

My goal was, per environment, to create an Event Grid subscription from the Media Service to a method on the Function App. With environment 1 (env 1 Media Service to env 1 Function App), this works perfectly. With environment 2, this fails.

My initial idea was to use the Azure web shell CLI to investigate the details of those existing subscriptions and see if env 2's was correct, but following along with (https://docs.microsoft.com/en-us/cli/azure/eventgrid/event-subscription?view=azure-cli-latest#az-eventgrid-event-subscription-list), the tool doesn't appear to work: none of my existing subscriptions with a Function App endpoint (including the one that works) show up there, although it will show me Event Grid subscriptions whose endpoint is a webhook (used when testing locally). The subscriptions do show up in the Azure Portal, but it relates a very limited amount of info once they're already created.

I've tried simple things like re-creating the env2 Media Service and deleting and re-creating the Event Grid subscriptions a bunch of times.

My investigations so far have yielded these results:

Does env 2 Media Service publish the events? Yes, I can see the events displayed on the Service's Time Graph, and I can receive the correct events if I add another webhook subscription to my local system; they just aren't getting to the env 2 Function App.

Does env 2 Function App receive any Event Grid events? Yes. I was initially filtering for just JobFinished and JobErrored in the subscription, but after re-creating the subscription without event type filtering (which lets through something like 20 different kinds), env 2 Function App receives all those others, just not the two I care about.

Basically, there isn't (or shouldn't be) any real difference in how env 1 and env 2 are set up and connected, yet one works where the other doesn't, and I'm going a little crazy.

Opulent Ceremony
Feb 22, 2012

EssOEss posted:

I can't help you but wow, Azure Media Services is still alive? I remember when they fired everybody but like 1 developer. What do you use it for? What are its strengths?

Well this sure doesn't give me a lot of confidence! We use it to encode and DASH stream our own video content.

Opulent Ceremony
Feb 22, 2012

I appreciate you sharing your experiences with various media services. The main factor for us choosing Azure was simply we already have in place a bunch of other Azure services that are a good fit and are working as expected. This latest experience is a pain though, and I'm sad to hear it sounds like it doesn't get a ton of support.


Opulent Ceremony
Feb 22, 2012
Can anyone recommend a vendor for generating PDFs? Nothing terribly fancy: we just need to take a byte array, treat it as a PDF, locate input fields on it, stamp data into them, and flatten it to render a new PDF as a byte array with the data filled in.

We had been using iTextSharp but are now having licensing problems with it, and a replacement we've been working with appears to have unsolvable memory leak issues that become a problem at a high concurrent load.
