Kyte
Nov 19, 2013

Never quacked for this

epswing posted:

What's the Right Way to trigger some code frequently (every X seconds) in an ASP.NET site? While there are plans to move it to an Azure App Service, and maybe WebJobs is what I'm looking for, it's running as a Site on IIS for now.

You could try https://www.quartz-scheduler.net/ which is a bit more lightweight than Hangfire and runs through IHostedService. It can even share a host with ASP.NET and be accessed directly via DI. IIRC it also supports in-memory, database and redis backing stores.

In summary, it's basically everything explained above but without having to roll it yourself.
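For reference, the wiring looks roughly like this, going off memory so double-check against the Quartz docs (MyJob and the 30-second interval are made-up placeholders):

```csharp
// In your startup code, assuming the Quartz.Extensions.Hosting package.
services.AddQuartz(q =>
{
    var jobKey = new JobKey("my-recurring-job");
    q.AddJob<MyJob>(opts => opts.WithIdentity(jobKey));
    q.AddTrigger(opts => opts
        .ForJob(jobKey)
        .StartNow()
        .WithSimpleSchedule(s => s.WithIntervalInSeconds(30).RepeatForever()));
});
services.AddQuartzHostedService();

// The job itself is just a DI-activated class.
public class MyJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // do the periodic work here
        return Task.CompletedTask;
    }
}
```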

Kyte fucked around with this message at 20:21 on Nov 17, 2021


Kyte
Nov 19, 2013

Never quacked for this

fuf posted:

Thanks for the suggestions guys.
[etc]

ToList is not atomic; it assumes the source list stays static throughout the lifetime of the operation. What's happening here is that the source list is getting new items between the destination list's backing store being allocated and the items being copied over, so eventually the copy reaches what it thought was the end, finds there's new items past it, tries to copy them and discovers it's gone beyond the allocated space. Then it crashes.
You need a lock or a similar synchronization mechanism to prevent new insertions while you're copying, or you need to switch to a collection that guarantees thread-safe enumeration (or maybe an immutable collection, which is thread-safe by definition).

Using a simple for works, but it'll behave differently depending on compiler optimizations.
If the compiler checks .Length every iteration, it'll loop over everything, including whatever was added while the loop was executing.
If the compiler hoists .Length into a local, it'll only loop over the length the list had at the start of the loop.
And either way, you're only being saved from a crash because this list happens to only grow. If it ever shrinks, the whole thing will blow up in your face. Better to do things properly.
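A rough sketch of the lock option (all names made up):

```csharp
using System.Collections.Generic;

public class SafeList
{
    private readonly object _gate = new object();
    private readonly List<int> _items = new List<int>();

    // Writers and the snapshotting reader take the same lock,
    // so the copy always sees a stable list.
    public void Add(int item)
    {
        lock (_gate) _items.Add(item);
    }

    public List<int> Snapshot()
    {
        lock (_gate) return new List<int>(_items);
    }
}
```

The other route is something like ConcurrentQueue&lt;T&gt; or ImmutableList&lt;T&gt;, which make the snapshot problem go away entirely.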

Kyte fucked around with this message at 15:53 on Nov 26, 2021

Kyte
Nov 19, 2013

Never quacked for this
My guess is that because you were initializing a new DbContext, the change tracker started at zero, so it saw all the elements in the Tags collection and thought "these are all new Tags, gotta insert them all". The problem wasn't the tags being added or removed, but rather the tags that stayed.
Loading the entity into the change tracker solves that because EF now knows which tags already exist in the DB.

This is going off memory so might be wrong but I'm pretty sure the procedure is to create your dbContext, attach the Video to the context so EF takes it as the base state, then do your change, then save.

Btw I'm fairly sure you can do db.Videos.Include(v => v.Tags) and save yourself the tag load. (Or configure the navigation property to auto-include)
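Again going off memory, the shape would be something like this (video and tagId are placeholders):

```csharp
using var db = contextFactory.CreateDbContext();

// Attach tracks video (and any Tags reachable from it with keys set)
// as Unchanged, so EF has a baseline state to diff against.
db.Videos.Attach(video);

// Find() checks the change tracker first and only queries the DB on a
// miss, so the added Tag is a tracked instance, not a "new" one to INSERT.
video.Tags.Add(db.Tags.Find(tagId));

db.SaveChanges();
```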

Kyte fucked around with this message at 16:20 on Dec 12, 2021

Kyte
Nov 19, 2013

Never quacked for this

Rocko Bonaparte posted:

Is there a way to use the built-in .NET sorting algorithms with async functions? My scripting runtime is using async function calls for its callable stuff. Theoretically, any of these calls could actually block. It would be dumb to happen when comparing for a sort, but the prototype is async nonetheless. I am assuming I couldn't use the Sort functions in the collections with comparers that are awaiting on this stuff, but I thought I'd ask before having to pull out my own sorting algorithm.

Actual performance is not a big consideration here despite talking about sorting.

I can't begin to imagine how you'd have, say, a quicksort work without preloading all the data it needs. Which is effectively a ToListAsync().

Kyte
Nov 19, 2013

Never quacked for this

Rocko Bonaparte posted:

The sorting problem is that I need to specify an IComparable that invokes functions in the scripting runtime that do the comparisons. The functions are async categorically, even if I don't expect them to actually block in this function. However, that's the interface. The problem is I don't know if I can reasonably switch the IComparables prototype to become async. I'm assuming I have to write my own sort implementation, which isn't the end of the world.

Edit: The data is all there. The comparison should normally have everything it needs. However, the internals for invoking the call to do the comparison is async.

If you are willing to risk people loving up, you could write an IComparable wrapper around some kinda IAsyncComparable interface that just does GetAwaiter().GetResult()?
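Something like this (the IAsyncComparable idea spelled as a plain delegate here; all names made up):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Wraps an async comparison in the synchronous IComparer<T> interface by
// blocking on the task. Danger: this deadlocks if the task needs the
// current thread's synchronization context to finish (the classic
// UI/legacy-ASP.NET trap).
public sealed class BlockingComparer<T> : IComparer<T>
{
    private readonly Func<T, T, Task<int>> _compareAsync;

    public BlockingComparer(Func<T, T, Task<int>> compareAsync)
        => _compareAsync = compareAsync;

    public int Compare(T x, T y)
        => _compareAsync(x, y).GetAwaiter().GetResult();
}
```

Usage would be e.g. list.Sort(new BlockingComparer&lt;int&gt;((a, b) => Task.FromResult(a.CompareTo(b)))).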

Kyte
Nov 19, 2013

Never quacked for this
This is kinda overkill but what about making a self-hosted webapp. Then you can build the frontend with HTML, grab some JS library for dashboard widgets and plug everything together with ajax.

Kyte
Nov 19, 2013

Never quacked for this
According to the docs, HashSet.Equals compares the object identities (it just inherits from Object). SetEquals compares the contents. You clearly want the latter.
(Element equality is checked using IEqualityComparer btw)
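Quick illustration:

```csharp
using System;
using System.Collections.Generic;

var a = new HashSet<int> { 1, 2, 3 };
var b = new HashSet<int> { 3, 2, 1 };

Console.WriteLine(a.Equals(b));    // False: inherited Object.Equals, reference identity only
Console.WriteLine(a.SetEquals(b)); // True: same elements, order irrelevant
```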

Kyte
Nov 19, 2013

Never quacked for this

Boz0r posted:

Some people from my work have a boner for MS Power Apps, but I hate working with them, and they always become a pain to maintain beyond a PoC stage. Are there any good alternatives in C# frameworks that can match the PoC speed, but also be robust, where I can still control the code?

Isn't that what the azure functions integration stuff is for?

Kyte
Nov 19, 2013

Never quacked for this

distortion park posted:

Instead of depending directly on those transient view models, you could depend on a view model factory (or one per VM depending on if the dependencies have much overlap). There are a few different ways to set this up depending on exact requirements and preferences, there are a bunch explained here:
https://stackoverflow.com/a/2280289


The Microsoft DI framework has explicit support for scoped dependencies but idk if it's easy to use outside of asp.net

I use MS DI all the time for small console applications. All you need to do is use a .NET Generic Host.
https://docs.microsoft.com/en-us/dotnet/core/extensions/generic-host

A lot of aspnet core's basic mechanisms are actually baked into the generic host, like configuration, DI and logging. It makes it really easy to bootstrap the app and lets you share configuration files, use secrets.json, etc. The defaults even plug into the windows event log so I can easily follow up on errors in, say, a scheduled task.

In theory you're expected to use the generic host to launch an IHostedService, but nothing stops you from using AddTransient to register your main class and then using a single service-locator call to load it and execute it manually. You can even open a new scope to do prelaunch tasks in their own scope before you move onto the actual program. I use the latter to do "on boot" tasks in asp.net programs, too. (Stuff like rolling back uncommitted file uploads/deletions in case the server shut down or crashed midway.)
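A minimal console skeleton of that (MainApp is a stand-in for whatever your entry class is):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args) // wires up config, logging, DI, secrets.json
    .ConfigureServices(services =>
    {
        services.AddTransient<MainApp>();
    })
    .Build();

// Service-locate the main class and run it manually,
// instead of registering it as an IHostedService.
using (var scope = host.Services.CreateScope())
{
    var app = scope.ServiceProvider.GetRequiredService<MainApp>();
    await app.RunAsync();
}
```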

Kyte fucked around with this message at 22:19 on Jan 23, 2022

Kyte
Nov 19, 2013

Never quacked for this

LongSack posted:

Interesting, I'll definitely check it out. I currently use this to generate a configuration:
C# code:
public class ConfigurationFactory : IConfigurationFactory
{
    public IConfiguration Create(string filename, bool isOptional = true, string? directory = null)
    {
        var ret = new ConfigurationBuilder()
            .SetBasePath(string.IsNullOrWhiteSpace(directory) ? Directory.GetCurrentDirectory() : directory)
            .AddJsonFile(filename, optional: isOptional, reloadOnChange: true)
            .AddEnvironmentVariables()
            .Build();
        return ret;
    }
}
Which works, but I don't see a way to get user secrets in there.

I usually let CreateDefaultBuilder() add it, but if you don't wanna use it you can manually install the Microsoft.Extensions.Configuration.UserSecrets nuget, add the AddUserSecrets() line and then refer to the asp.net docs.
(I think if you use the right click -> manage user secrets option in Visual Studio it'll actually wire up most of it for you)
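In the factory from the quote, it'd just be one more line, something like this (the generic argument is any type from the assembly that holds the UserSecretsId):

```csharp
var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .AddUserSecrets<Program>() // needs the Microsoft.Extensions.Configuration.UserSecrets package
    .AddEnvironmentVariables()
    .Build();
```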

Kyte fucked around with this message at 17:34 on Jan 24, 2022

Kyte
Nov 19, 2013

Never quacked for this

epswing posted:

I've got an ASP.NET (Framework 4.8) MVC site, and blob storage, both running on Azure. I'm displaying a list of files in storage that the user can click on to download. When they do, I'm basically doing this (this code is spread across a few classes, but I've boiled it down to one method in a controller):

C# code:
public async Task<FileResult> Download(string name)
{
    using (var stream = new MemoryStream())
    {
        CloudBlobContainer container = GetBlobContainer();
        CloudBlockBlob blob = container.GetBlockBlobReference(name);
        await blob.DownloadToStreamAsync(stream);
        byte[] bytes = stream.ToArray();
        return File(bytes, "application/octet-stream", name);
    }
}
The problem is obviously that, especially for large files, I download/buffer the entire file first (MemoryStream), and then feed it to the user. What I want to do is just stream it directly from Azure storage.

I've been attempting various solutions, but the usual googling is resulting in lots of different answers, half of which are .NET Core and don't apply, because the Azure storage libs are quite different. Which Stream should I be giving DownloadToStreamAsync? And how do I feed that back to the client without buffering?
One option is to use OpenReadAsync and pull data using FileStreamResult with Response.BufferOutput set to false.

Another option is to use DownloadToFile() to a temp file and then use Response.TransmitFile.

Or return a redirect to a SAS URI and the browser then downloads directly off Azure servers.

Or use DownloadToStream and point it at Response.OutputStream? I'm not sure what you'd need to do there wrt buffering and headers and whatnot.
You'd basically have to set up the headers manually, then do a loop of stream.Read(), Response.OutputStream.Write(), Response.Flush(), etc.
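The first option would look more or less like this (untested sketch, assuming the classic WindowsAzure.Storage SDK from the quoted code):

```csharp
public async Task<ActionResult> Download(string name)
{
    CloudBlobContainer container = GetBlobContainer();
    CloudBlockBlob blob = container.GetBlockBlobReference(name);

    Response.BufferOutput = false; // flush to the client as we go

    // OpenReadAsync returns a stream that pulls from Azure in chunks
    // instead of downloading the whole blob up front.
    Stream stream = await blob.OpenReadAsync();

    // FileStreamResult copies the stream to the response and disposes it.
    return File(stream, "application/octet-stream", name);
}
```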

Kyte fucked around with this message at 01:50 on Jan 28, 2022

Kyte
Nov 19, 2013

Never quacked for this
ControllerBase has Request which has both Form and Body?
Are you not inheriting from ControllerBase?

Kyte
Nov 19, 2013

Never quacked for this

epswing posted:

VS is not required. At minimum DevOps is just git repos + pull request management.

So why did I have to wait to have a VS sub attached to my account in the client's DevOps before I could see the repo page? I'm honestly curious.

Kyte
Nov 19, 2013

Never quacked for this

epswing posted:

DevOps comes with X free seats (I think 5?), beyond which the admin needs to pay for additional seats. Unless they’re VS subs in which case there’s no charge. If you needed to provide your own VS sub I’d guess your client didn’t want to pay for your seat, which was my dilemma above.

At least, that’s my understanding, I’m still new at DevOps, we’ve been using it for less than a year.

I actually don't know if it was my management or the client that provided the sub but that makes sense.

Kyte
Nov 19, 2013

Never quacked for this

fuf posted:

Thank you, that's helpful.

I actually took a hammer to the problem and ignored the official Blazor advice to use a separate dbcontext for every operation. I replaced the dbcontextfactory with just a single dbcontext and it has basically solved all of my problems because now the context is just keeping track of everything. Maybe it will cause loads of other problems down the line but we'll see...

I think the Blazor advice is because they assume you will be doing db operations within components, but I have all my db stuff in a separate service so maybe it'll be ok...
The problem is DbContext and especially its change tracker is not designed to last longer than a single unit of work, and it'll start getting wonky after a while.


fuf posted:

man I still have such issues with EF and saving related entities. I just don't get it at all.

I have a couple of related classes like this:
C# code:
public class Playlist{
   public Guid Id {get;set;}
   public string Name {get;set;}
   public Filter Filter {get;set;}
}

public class Filter{
   public Guid Id {get;set;}
   public string Name {get;set;}
}
If I try and insert a new Playlist with a Filter that is already in the DB:
C# code:
using var db = contextFactory.CreateDbContext();
db.Playlists.Add(playlist);
db.SaveChanges();
Then I always get this error:
SQLite Error 19: 'UNIQUE constraint failed: Filter.Id'.

Because for some reason it's trying to insert the Filter as a new row even though it already exists.

So then I think ok, maybe I need to attach the Filters in the db to the current context so that it doesn't try to insert duplicates? Like this:

C# code:
using var db = contextFactory.CreateDbContext();
var Filters = db.Filters.ToList();
db.Playlists.Add(playlist);
db.SaveChanges();
But then the error is:
"The instance of entity type 'Filter' cannot be tracked because another instance with the same key value for {'Id'} is already being tracked."

I think this is because the context now has two versions of the same Filter object: one from the DB and one attached to the Playlist object.

How can I tell the context that they're the same object??

The only way I can get it to work is like this, by actually changing the Playlist's Filter to the DB version:
C# code:
using var db = contextFactory.CreateDbContext();
playlist.Filter = db.Filters.Where(f => f.Id == playlist.Filter.Id).FirstOrDefault();
db.Playlists.Add(playlist);
db.SaveChanges();
But this is a pretty laborious workaround and gets really complicated as the models get more complex. There's gotta be an easier way of just making the context aware of what's already in the DB when inserting or updating rows and related entities.

e: actually I just realised my "solution" doesn't work anyway because it's not actually gonna save any changes to the Filter lol, just revert it back to the DB version

The problem in the first one is that you didn't load the Filter from database but rather created a new Filter object and added it to the navigation property without attaching it to the context. Since it doesn't exist in the context, the change tracker believes it's something new and tries to INSERT.

That can be solved by attaching it, as your second example effectively does, but that errors out if the context already had an instance with the same key attached from a previous operation. You correctly understood the problem in the second example: you loaded every filter from the database, but then, instead of picking out the filter DbContext already knows about from the loaded list, you tried to attach your own untracked filter.

For the third case: it's easier if, rather than using Where().FirstOrDefault() (which will always query the database, btw), you use Find() to get the entity from the context (falling back to a DB query if it wasn't already loaded).

The correct method to add a preexisting related entity to a new object is to:
1) Obtain your related entity from the context, usually through Find().
2) Add that entity object to the navigation property. Also do whatever updates you wanna do to it.
3) Save.
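In your Playlist/Filter terms, that procedure would look something like:

```csharp
using var db = contextFactory.CreateDbContext();

// 1) Find() checks the change tracker first and only queries the DB on a miss.
var filter = db.Filters.Find(playlist.Filter.Id);

// 2) Point the navigation property at the tracked instance.
playlist.Filter = filter;

// 3) The Playlist gets INSERTed; the tracked Filter is left alone
//    (unless you modified it, in which case it gets UPDATEd).
db.Playlists.Add(playlist);
db.SaveChanges();
```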


That said, sometimes you don't want to load a whole entity when a simple attach would work just fine. For that case, I made an extension method.

C# code:
public static TEntity FindTrackedOrAttach<TEntity>(this DbSet<TEntity> dbSet, int id)
    where TEntity : EntityBase, new()
{
    var entity = dbSet.Local.FirstOrDefault(x => x.Id == id);
    if (entity is null)
    {
        entity = new TEntity { Id = id };
        dbSet.Attach(entity);
    }
    return entity;
}
It does have a very big disadvantage: if a later operation in the same DbContext (a Find() or navigation load or such) happens to ask for the object, EF will happily hand me the empty object I attached, and since it's empty I get all kinds of exciting NPEs and such.
(Incidentally this is one of the reasons why you limit your DbContext to a single unit of work)

Kyte fucked around with this message at 20:06 on Feb 5, 2022

Kyte
Nov 19, 2013

Never quacked for this

TheBlackVegetable posted:

What kind of NRE, do you mean like null lists of children references? Isn't the correct thing to initialise those in the entity's constructor?

And/or pass in an (optional) Action<TEntity> init to
FindTrackedOrAttach to initialise the entity if it is created during the call.
I didn't mean literally NREs every time, it could just be that some part of the code didn't expect uninitialized values, but same thing really.
Also I used to be lazy about adding collection initializers to my entity classes because EF takes care of it when it loads an entity.

I meant stuff like, say you have an application form for, dunno, social media or whatever. And there's a dropdown to select countries. You received in your viewmodel the country IDs, so you do something like:
C# code:
var newEntity = new Entity {
  [copy over props from the viewmodel]
};
newEntity.Countries.AddRange(vm.CountryIds.Select(id => db.Countries.FindTrackedOrAttach(id)));
db.SaveChanges();
I have no need to load or initialize anything about the country entity and I'd like to avoid the queries.

LongSack posted:

Also, the DbContext is as ephemeral as I can possibly make it. This is hard on dependency injection because a Service needs an instance of a Repository which needs an instance of the DbContext, so if I inject a service directly into a viewmodel or controller, it leads to problems with the tracker, so I end up having to inject a ServiceFactory and creating services for single operations to make sure the DbContext doesn't outlive its usefulness.

Isn't this what transient or scoped dependencies are for? (DbContext is scoped by default)
Or you like them to be even more short-lived than per-request?

Kyte fucked around with this message at 04:39 on Feb 6, 2022

Kyte
Nov 19, 2013

Never quacked for this

LongSack posted:

It’s not really an issue with web apps, where a controller is being instantiated for a single request anyway - get me a list of Foos, update this Bar, delete this Dongle.

But in a desktop app, where you say go to the account type maintenance window, and can create / update / delete account types all in the same window with the same ViewModel, then the DbContext tracker can get in the way.

Ah, makes sense. You mentioned controllers and viewmodels so I assumed you were talking web. Yeah, for desktop it makes much more sense to use a factory.

Kyte
Nov 19, 2013

Never quacked for this
Given the usage pattern, wouldn't it make more sense to make it a property?

C# code:
public List<Tag> TagsOnPage => this.CustomTags ?? this.FilteredTags;
Literally the same minus the parentheses. Not all properties are autoprops.

Kyte
Nov 19, 2013

Never quacked for this
A github page, perhaps?

Kyte
Nov 19, 2013

Never quacked for this
You don't need to do that

Kyte
Nov 19, 2013

Never quacked for this
Just upgrade to .NET 6 :v:

Kyte
Nov 19, 2013

Never quacked for this

worms butthole guy posted:

Is there a way to access the value in a SortedList and then iterate on that value? So i'd like to do (pseudo code):

code:
	CansOfDogFoodtoFeed = new SortedList<string, int>(){
			{"Buttercup", 1},
			{"Fido", 5},
			{"Kennedy", 2}
		}

	// This is what i'm not sure the right way to do
	foreach(dog in CansOfDogFoodtoFeed){
				dog.value++;
		}
So that effectively Buttercup has 2 cans now to eat.

Thanks!

SortedList.Values doesn't work for you?
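e.g. something like this, snapshotting the keys first so you're not mutating the collection you're enumerating:

```csharp
using System.Collections.Generic;
using System.Linq;

var cans = new SortedList<string, int>
{
    { "Buttercup", 1 },
    { "Fido", 5 },
    { "Kennedy", 2 },
};

// Keys.ToList() materializes a copy; writing through the indexer
// while enumerating the SortedList itself would break the enumerator.
foreach (var dog in cans.Keys.ToList())
{
    cans[dog]++;
}
// cans["Buttercup"] is now 2
```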

Kyte
Nov 19, 2013

Never quacked for this

rarbatrol posted:

Man I wish we had performance problems as subtle as that. I'm dealing with legacy code continuously being glued together in new and terrifying ways, and we keep finding code that performs a database lookup per record, which itself usually comes from an initial database lookup.

Oh man that reminded me.
Some time ago we received a project made in C# that we were supposed to add new stuff to. To get around the natural limitations wrt cross-database queries and get nice simple LINQ queries, they'd tacked a ToList() to every DbSet call. Before any applicable Wheres, of course.

We didn't fix every instance, because it was out of scope and out of budget and frankly I'd've left it as-is as a "go gently caress yourself", but it was so drat slow to test I'd have to fix some anyways just to get things done in time.

Kyte fucked around with this message at 00:43 on Mar 7, 2022

Kyte
Nov 19, 2013

Never quacked for this

insta posted:

Yeah I got it wrong in the rant, and decided to keep it since it was quoted like 5 times. There are many things like this, that show up in other constructs you wouldn't think of:

str1.ToUpperInvariant() == str2.ToUpperInvariant() vs string.Equals(str1, str2, StringComparison.OrdinalIgnoreCase),

I find this all the time, for random reasons. On the surface, they look functionally equivalent. The first one allocates 2 new strings and will traverse the length of both of them to build the culture-invariant uppercase version, then do an equality check. The second will first short-circuit by testing length, then will go character-by-character with an ordinal case-insensitive comparison (which itself is faster than invariant culture, if that's applicable), and fail as early as it can.

I'm going to stick with my method of converting the former to the latter, even if they're not on a hot-spot in a profiler. There is no case where the first one is better, and it will prolong the time until it does show up on a hot-spot in a profiler. It's something I can do on auto-pilot as I'm in the rest of the code, and if nothing else it helps me read it better since "string.Equals" is more appropriate to the logic instead of "ToUpperInvariant()"

The relative efficiency of the two isn't nearly as relevant as the fact they are not actually equivalent. Some languages do weird things with case conversions. ToUpper == ToUpper is less correct than Equals(IgnoreCase).

Kyte
Nov 19, 2013

Never quacked for this

worms butthole guy posted:

Thank you for this answer! It got my program running but now when I run it I get:

code:
Response : System.Collections.Generic.List`1[CityBreaks.Pages.CatData]
:suicide: so time to figure out how to destructure this lol.

That's normal. A List<T>.ToString() gives you that, since it doesn't (want to) know how to convert the contents to a string.
Options:
1) Create a DisplayTemplate for CatData and then use @Html.DisplayFor(list) (it will automatically iterate across the list).
2) Create a DisplayTemplate for CatData, then foreach across the list and use @Html.DisplayFor(item) (if you want to customize the stuff around/between each item).
3) Use JsonSerializer.Serialize() to get a JSON representation you can throw on the screen.
4) Roll your own thing using foreach and whatnot.

Kyte
Nov 19, 2013

Never quacked for this

LongSack posted:

Pretty sure this is an async related question.

I have an extension method:
C# code:
public static void ForEach<T>(this IEnumerable<T> list, Action<T> action)
{
  if (list is null)
  {
    throw new ArgumentNullException(nameof(list));
  }
  if (action is null)
  {
   throw new ArgumentNullException(nameof(action));
  }
  foreach (var item in list)
  {
    action(item);
  }
}
it seems to have issues in asynchronous code. For example, in a data repository, this code works:
C# code:
var offers = await conn.QueryAsync<OfferEntity>(sql, BuildParameters(parameters));
if (offers is not null && offers.Any())
{
  foreach (var offer in offers)
  {
    offer.Bean = await _beanRepository.ReadAsync(offer.BeanId);
    offer.User = await _userRepository.ReadAsync(offer.UserId);
  }
}
but this does not:
C# code:
var offers = await conn.QueryAsync<OfferEntity>(sql, BuildParameters(parameters));
if (offers is not null && offers.Any())
{
  offers.ForEach(async x =>
  {
    x.Bean = await _beanRepository.ReadAsync(x.BeanId);
    x.User = await _userRepository.ReadAsync(x.UserId);
  });
}
Any explanation for why it doesn't work? I am not strong on async stuff. TIA

The language-level foreach integrates with the async state machine of the method that contains it, so it'll loop and suspend as needed.
Your ForEach method does not have an async signature, so it'll run synchronously.
async x => { } produces a Func<T, Task>, so basically your ForEach goes through each element, runs your action, gets a Task back and moves on. Since ForEach never awaits those tasks, it won't wait for completion before moving onto the next item in the loop. (Hence why the compiler isn't asking you to make ForEach async.)

What you need is:
C# code:
public static async Task ForEachAsync<T>(this IEnumerable<T> list, Func<T, Task> action)
{
  if (list is null)
  {
    throw new ArgumentNullException(nameof(list));
  }
  if (action is null)
  {
    throw new ArgumentNullException(nameof(action));
  }
  foreach (var item in list)
  {
    await action(item);
  }
}
And then you call it with await items.ForEachAsync(async x => { ... }).

Kyte fucked around with this message at 16:58 on Mar 14, 2022

Kyte
Nov 19, 2013

Never quacked for this
I think a good rule of thumb is that if you're passing an async lambda in the first place, you should check whether the method can actually do anything with it. The async keyword kinda stands out.

Kyte
Nov 19, 2013

Never quacked for this

NihilCredo posted:

Yeah, I have the runtime switch, but multithreading Is the problem. This is our cronjob service, and it's running dozens of different jobs, so if we want to "drill down" into a particular one by increasing the switch to Debug/Verbose then all of them start logging at Debug/Verbose and it produces an absolute shitload of log data.

It's fine to do so for a bit, but if I could set the log level per-job we could leave a new / buggy / complicated job on Verbose level for a week or so and look for anomalies without running up our ingestion costs with terabytes of worthless crap from the other stable jobs.

The most direct solution is to get rid of the global ILogger instance and instead spawn a dedicated instance (which comes with its own level switch) for each job. This requires injecting it as a parameter into literally every single object or function that might want to emit a log, including common libraries. Great for purity but one heck of a refactor task.

(Comedy option: since almost all our jobs already operate in the AsyncSeq monad (IAsyncEnumerable, basically), I'm almost tempted to write a custom one that combines it with the Reader monad. Would be a giant type tetris mess though.)

Hmm, I might have a legit use case for the service locator pattern? If I take a look at how Serilog implements the execution-context-scoped PushProperty, I might be able to use the same technique to push an ILogger instance into the execution context, and then alias the Log property accessor to locate that instance if one exists (falling back to the global ILogger otherwise).

This is only a vague idea, but what about replacing the global with something that hangs off the current synchronization context, and then setting up a new context with a new logger for the code path you want to log differently? Async/await should flow the context across invocations (unless you've got everything on ConfigureAwait(false), in which case rip I guess).

Or maybe an async local or thread local depending on whether it's async or not.

These are all hacky, but if you want to avoid a major restructuring it'd at least give you global-like behavior without making it fully global.
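The async-local version could look something like this (sketch only; AmbientLog and Push are made-up names, Log.Logger is Serilog's static global):

```csharp
using System.Threading;
using Serilog;

// An ambient per-job logger that flows with the async execution context.
// Code paths that never call Push() keep falling back to the global logger.
public static class AmbientLog
{
    private static readonly AsyncLocal<ILogger> _current = new AsyncLocal<ILogger>();

    public static ILogger Logger => _current.Value ?? Log.Logger;

    // Call once at the top of a job, with a logger built around
    // that job's own LoggingLevelSwitch.
    public static void Push(ILogger logger) => _current.Value = logger;
}
```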

Kyte
Nov 19, 2013

Never quacked for this
He did already consider that but it'd involve refactoring a ton of code to include the new parameter.
And this is not for a specific code path, so it'd need everything refactored.

Kyte
Nov 19, 2013

Never quacked for this
There seem to be two actors involved here. The request is sent to EndpointRoutingMiddleware, which matches it to the 405 Method Not Allowed endpoint. However, midstream during the response processing, the CorsService evaluates the request/response, recognizes it as a preflight, and modifies the response appropriately.

Perhaps if you set up your own endpoint for OPTIONS it'd shut the message up. Supposing CorsService continues to intercept the request/response, the clientside result should be the same.
(Or maybe a middleware? I'm not sure where does CorsService sit in the request pipeline)

Kyte fucked around with this message at 17:33 on Apr 27, 2022

Kyte
Nov 19, 2013

Never quacked for this
At least centralize your service location and member initialization into the viewmodel factory (if necessary add and call an InitAsync after creation and before returning).
IMO the combobox should have a simple binding to ObservableCollections in the viewmodel and nothing more.

wrt the original issue: Does the Loaded event not fire at all or is the problem somewhere down the chain? Is your loaded event actually being attached by OnWindowLoadedBehaviorChanged?

Kyte
Nov 19, 2013

Never quacked for this

Variable 5 posted:

Can someone tell me why

code:
foreach (var c in categories)
{
  <h2>@c</h2>
  <hr />
  foreach (var a in Model.Where(sa => sa.Category == c).OrderBy(sa => sa.Order))
  {
    <h3>@a.Title</h3>
    <h6>@Html.Raw(a.Content)</h6>
    <p class="mb-5 text-muted"><small>Last updated @a.LastUpdated.ToShortDateString()</small></p>
  }
}
works but

code:
foreach (var c in categories)
{
  <div>
    <h2>@c</h2>
    <hr />
    foreach (var a in Model.Where(sa => sa.Category == c).OrderBy(sa => sa.Order))
    {
      <h3>@a.Title</h3>
      <h6>@Html.Raw(a.Content)</h6>
      <p class="mb-5 text-muted"><small>Last updated @a.LastUpdated.ToShortDateString()</small></p>
    }
  </div>
}
doesn't? I get "CS0103: The name 'a' does not exist in the current context" when I try to compile.

I'm running ASP.NET Core 6.0. I tried a bunch of googling, but kept coming up blank. I feel like I'm missing something stupid.
The first one is nested like @foreach foreach <tag>.
The second one is nested like @foreach <div> @foreach <tag>. You're missing that second @.

Opening a HTML tag switches the parser to HTML mode, so you need to preface the inner foreach with a @ to change back to C# mode or it'll treat the foreach as text.
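For completeness, the fixed version of your second block only needs that one character:

```cshtml
@foreach (var c in categories)
{
  <div>
    <h2>@c</h2>
    <hr />
    @foreach (var a in Model.Where(sa => sa.Category == c).OrderBy(sa => sa.Order))
    {
      <h3>@a.Title</h3>
      <h6>@Html.Raw(a.Content)</h6>
      <p class="mb-5 text-muted"><small>Last updated @a.LastUpdated.ToShortDateString()</small></p>
    }
  </div>
}
```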

As an aside, there's a stupid rule in Razor where you can't write @foreach when you're already inside C# mode, and it constantly trips me up. The parser can already tell exactly what's going on, so why not allow the @ everywhere? Then you could put it everywhere and not get stupid errors when you find you have to wrap your block in another tag.

Kyte fucked around with this message at 08:19 on Aug 30, 2022

Kyte
Nov 19, 2013

Never quacked for this

worms butthole guy posted:

doh I think I figured it out thanks to both of you - I realized i'm overwriting gem every draw update. I still haven't figured out the best way to spawn something and not have it keep spawning over itself but that's my bad lol. still need to figure out OOP better.

Thanks you two!

Add a flag to the object that holds your thing, so you can do if (!thingHasSpawned && otherRelevantConditions) { spawnTheThing(); } // we assume spawnTheThing() sets thingHasSpawned to true.
If your program is particularly simple, the flag could be the variable itself, like if (this.thing is null && otherRelevantConditions) { spawnTheThing(); }

Kyte
Nov 19, 2013

Never quacked for this

Munkeymon posted:

Is there a best practice for setting session data after you've logged a user in in ASP MVC 5 (yep, still on Framework)? Logging in invalidates the current session so anything added to it just gets dropped on the floor when the request is done, but I don't see a way to add to the new session before the next request, so I'm planning on stuffing some stuff in a short-lived cookie and adding it to the session on the next request which feels kinda gross and like there ought to be a better way.

fwiw I recall that TempData by default works through a short-lived cookie (at least in Core; MVC 5's default provider is session-backed), so it's not like you're alone there

Kyte
Nov 19, 2013

Never quacked for this

Red Mike posted:

code:
{
    _ = RunThingAsync(); //this starts and then runs in the background
    DoMoreStuff();
}
Generally you'll want to be safe and instead do:

code:
{
    _myBackgroundTask = RunThingAsync(); //this starts and then runs in the background
    DoMoreStuff();

    //at some point in the process/regularly/etc
    if(_myBackgroundTask.IsFaulted) 
    { ... }
}

Why not await the background task?

Kyte
Nov 19, 2013

Never quacked for this

Jabor posted:

Because then you can't do anything else on the awaiting thread until the background task finishes?

No I mean like:
code:
var bgTask = BackgroundTask.Execute();

(other work)

try {
   await bgTask;
} catch (...) {
   (...)
}
While yeah, IsFaulted is non-blocking, you can't just let go of the task or you risk an unobserved task exception. You could put the task into a queue so you can check it later, but that has its own issues.

Kyte
Nov 19, 2013

Never quacked for this

rarbatrol posted:

What do you mean by unobserved task exceptions? I thought that back when async/await was introduced, tasks would no longer cause unhandled exceptions unless of course you're doing something nasty like async void.

Maybe my knowledge is wrong but I'm pretty sure a faulted task eventually needs to be unwrapped and observed (whether via await or older mechanisms) or it'll eventually bubble up to the runtime when the task is finalized? I know async void is dangerous because the task cannot be captured and therefore observed, but if you deliberately let a task go aren't you causing essentially the same behavior?
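For reference, the global hook for those is TaskScheduler.UnobservedTaskException. Since .NET 4.5 an unobserved fault no longer tears down the process by default; it just raises this event when the task gets finalized, and the exception is silently lost unless you hook it (Log here is a hypothetical logger):

code:
TaskScheduler.UnobservedTaskException += (sender, e) =>
{
    // e.Exception is an AggregateException wrapping the task's fault(s)
    Log(e.Exception); // hypothetical: however you log things
    e.SetObserved();  // mark the exception as observed/handled
};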

Kyte
Nov 19, 2013

Never quacked for this
Depending on the usage you might do well with FileStreamResult via the File(stream) overload, possibly with EnableRangeProcessing turned on.
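Something like this sketch, assuming an ASP.NET Core controller (GetVideoStream is a made-up helper for however you obtain the stream):

code:
[HttpGet("video/{id}")]
public IActionResult GetVideo(int id)
{
    Stream stream = GetVideoStream(id); // hypothetical stream source
    // enableRangeProcessing lets clients send Range headers (seeking, resumable downloads)
    return File(stream, "video/mp4", enableRangeProcessing: true);
}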

Kyte
Nov 19, 2013

Never quacked for this
I don't use tuples for code that crosses file boundaries or anything public, but I find them very useful for returning values from local or private functions, as well as from Task.Run and cousins (ArcGIS add-in development requires a looooooot of switching to the main CIM thread via QueuedTask.Run), where I can immediately unpack them into locals.

Oh yeah they're also often more useful than anonymous types for LINQ queries that end in a First/FirstOrDefault.

Basically anything that uses tuple unpacking, really.
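e.g. a sketch of the Task.Run case (LoadItems is made up):

code:
var (count, total) = await Task.Run(() =>
{
    var items = LoadItems(); // hypothetical expensive work off the calling thread
    return (count: items.Count, total: items.Sum(i => i.Price));
});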


Kyte
Nov 19, 2013

Never quacked for this
This might be a bit of a dirty trick but you can have an awaiter for tuples like https://gist.github.com/jnm2/3660db29457d391a34151f764bfe6ef7.

And then use it like:
code:
var tasks = (
    apiMaestros.ObtenerCuentaAsync(participante.beneficiario_rut),
    apiMaestros.ObtenerEtniasAsync(),
    apiMaestros.ObtenerDiscapacidadesAsync(),
    apiMaestros.ObtenerRegionAsync(contacto.id_region),
    apiMaestros.ObtenerProvinciaAsync(contacto.id_provincia),
    apiMaestros.ObtenerComunaAsync(contacto.id_comuna)
);

var (cuenta, etnias, discapacidades, region, provincia, comuna) = await tasks;
At first I was afraid I'd invite the ghosts of race conditions or something, but from what I can tell RestSharp works just fine with multiple parallel async requests to the same RestClient. (Apparently in practice you can only run two parallel requests at once without some extra configuration, but that's more of an internal matter.)
