LongSack
Jan 17, 2003

Cuntpunch posted:

Sure - but if FooViewModel *requires* an IBarService, that starts to get messy.

:doh: Of course, hadn't considered that.

I'm liking the Locator idea more and more. It solves the question of where to put the initialization code - it goes in the constructor of the Locator. And if I put my VMs into DI, I can use constructor injection to pass any services needed. Looks very clean. Thanks for that idea.

Cuntpunch
Oct 3, 2003

A monkey in a long line of kings

LongSack posted:

:doh: Of course, hadn't considered that.

I'm liking the Locator idea more and more. It solves the question of where to put the initialization code - it goes in the constructor of the Locator. And if I put my VMs into DI, I can use constructor injection to pass any services needed. Looks very clean. Thanks for that idea.

Protip: Start a side WPF project and add the MVVM Light package to it; it'll bootstrap the couple of files/changes so you can look at them. It also includes its own tiny SimpleIoc implementation for registration, which may well suit your needs as well.

ThePeavstenator
Dec 18, 2012

:burger::burger::burger::burger::burger:

Establish the Buns

:burger::burger::burger::burger::burger:

LongSack posted:

Oh, and if you’re wondering why use Expression<Func<foo, bool>> rather than just Func<foo, bool>, it’s because Where with the latter returns an IEnumerable<foo>, so you can’t add, say, an .AsNoTracking(). Where with the former returns an IQueryable<foo>, so you can.

Expression<Func<foo, bool>> is an expression tree, which means it's a declarative set of instructions that are interpreted at runtime. Your LINQ code on an IQueryable is really just adding more instructions to the expression tree. Then at runtime something like EF can interpret the expression, turn it into something like a SQL query, and send it to a database. AsNoTracking() is just an extension method on IQueryable<> that adds to the tree and will be interpreted at runtime.

Func<foo, bool> is an imperative function that gets compiled and executed as part of your program, which means it can only operate on .NET types, which in turn means you need actual data structures in memory (or IEnumerables emitting them). So going from Expression to Func (or IQueryable to IEnumerable) means that your expression tree gets interpreted and executed, and anything chained after that will be run in your program.
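
To make the boundary concrete, here's a small illustration (a sketch assuming an EF Core context db with a DbSet<Foo> Foos; all the names are made up):
C# code:
Expression<Func<Foo, bool>> predicate = f => f.IsActive;

// Still IQueryable<Foo>: the predicate is spliced into the expression
// tree and EF translates the whole chain into SQL.
List<Foo> fromSql = db.Foos.Where(predicate).AsNoTracking().ToList();

// Now IEnumerable<Foo>: rows are pulled from the database and the
// compiled delegate runs in-process. AsNoTracking() is unavailable
// past this point.
Func<Foo, bool> compiled = predicate.Compile();
List<Foo> inMemory = db.Foos.AsEnumerable().Where(compiled).ToList();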

beuges posted:

Just so you know, there are .AsQueryable<T> and .AsEnumerable<T> methods that are available from Entity Framework that will transform from one to the other.

Converting from IEnumerable to IQueryable just means that your expression tree will get interpreted by the CLR, invoking methods as if your IQueryable were an enumerable. It won't tack your LINQ onto the upstream expression tree.

bobua
Mar 23, 2003
I'd trade it all for just a little more.

Figured this was a good place for a visual studio question.

In some web based projects, visual studio will create a hierarchy for files with a . naming convention, like javascript.part2.js will become a sub file of javascript.js.

Is there a way to get it to do that for wpf apps?

spaced ninja
Apr 10, 2009


Toilet Rascal

bobua posted:

Figured this was a good place for a visual studio question.

In some web based projects, visual studio will create a hierarchy for files with a . naming convention, like javascript.part2.js will become a sub file of javascript.js.

Is there a way to get it to do that for wpf apps?

You can use this VS extension to do it.

https://marketplace.visualstudio.com/items?itemName=MadsKristensen.FileNesting

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!

raminasi posted:

Oh, Powershell closures apparently capture by value. One way around this is something called a reference object (apparently, I don't really know Powershell):
I don't really know Powershell either and this just ended the honeymoon.

mystes
May 31, 2006

Once you're doing sufficiently complicated stuff, PowerShell quickly becomes more complicated than just using C#.

bobua
Mar 23, 2003
I'd trade it all for just a little more.


thank you!

redleader
Aug 18, 2005

Engage according to operational parameters

mystes posted:

Once you're doing sufficiently complicated stuff, PowerShell quickly becomes more complicated than just using C#.

And the bar for "sufficiently complicated" is surprisingly low

GI_Clutch
Aug 22, 2000

by Fluffdaddy
Dinosaur Gum
I'll admit that I have a few preprocessors for files we ingest that are PowerShell scripts with C# source code in them that performs all the work. It made it easier to make changes without having to build an executable in VS, copy and paste into a VM, then copy into an RDP session. It also prevents me from having to repeatedly Google how to do basic things since I write PowerShell scripts so infrequently that I forget most of it.

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION

LongSack posted:

DAL.Initialize()

No!

Xik
Mar 10, 2011

Dinosaur Gum

mystes posted:

Once you're doing sufficiently complicated stuff, PowerShell quickly becomes more complicated than just using C#.

The AD, IIS and Exchange modules all make PS an awesome tool, but outside of environment management at an MS-heavy enterprise I've never really thought "oh I know, PowerShell would be a good tool to solve this problem". I've gone as far as importing custom DLLs and using reflection, but only in the context of environment management. If you're doing practically anything else, there will be a better way that doesn't involve PowerShell.

EssOEss
Oct 23, 2006
128-bit approved
I make multiplatform software and PowerShell is pretty sweet for making OS-independent scripts. Doesn't matter what OS the software is deployed on, just run this PowerShell script to make stuff happen!

That being said, I agree that it is a major pain in the rear end to work with, mostly due to syntax/language rear end-backwardness in many areas. It has so much promise to be a great tool but all of this promise is squandered by a design that just has the wrong bells and whistles. You can do cool things in 5-line snippets with those language warts but for real world use, I want consistency and clarity.

Also, it sort of manages but never really reaches interoperability with non-powershell executables. I never want to wonder "why does piping input to this command suddenly not work?" and reach the conclusion "microsoft magic, gently caress it, it works for some user accounts so just switch accounts" after 3 days of investigations. Unrelated, "Exit code failure is an error" should be an automatic behavior, not something I have to implement manually on every command (oops, missed one in the pipe!).

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
Literal clueless idiot baby speaking here, but what is the cross-platform scripting situation like now that Windows has the Linux subsystem thing? Does this kind of make Bash scripts cross-platform now? (At least Linux, Mac and Windows?)

nielsm
Jun 1, 2009



You can do some things but definitely not everything with WSL. I wouldn't try to depend on it for system configuration. Where PowerShell really scores on Windows is its built-in capabilities for scripting WMI objects and for loading and using pretty much any .NET assembly, and those are basically required for any non-trivial configuration on Windows. A much better cross-platform bet (though it would still require special-casing for Windows) would be Python or Perl, except those obviously require installation.

EssOEss
Oct 23, 2006
128-bit approved
I have not encountered anyone brave enough to use Windows Subsystem for Linux (WSL) in production so far. Also, WSL just got re-written from a "native" implementation to a VM-backed implementation, in the form of WSL2, completely changing the architecture.

Windows Server 2019 does apparently support WSL1, but now that WSL2 is completely different, WSL1 seems to be a technology that graduated from beta version to legacy garbage without ever entering the "production" phase. WSL2 is still a beta release and won't be seen in a production release of Windows Server for a few years yet, I expect.

You could still use 3rd party Bash builds like Cygwin (eww) but the extent of what you can do is still limited (and indeed, might still be limited in WSL). Hence why I use PowerShell.

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



I use LinqPad for shell-like things fairly often because it's generally easier to write some throwaway C# with LP's fairly decent code completion features than it is to figure out how to do complex things in PS. I even have a bunch of helper functions to expand environment vars, read/write files, search directory structures and work with file/directory objects.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

I'll generally do PowerShell for quick and easy jobs, but usually it's because I have to access an MSSQL database and it's real fast and easy, and WinNT 3.51 ODBC left my brain with scars.

LongSack
Jan 17, 2003

Question about EF Core and navigation properties. My Character class has 4 navigation properties: Category, Gender, Native Language, and Race. When I create a new character, those values are chosen from combo boxes.

When I go to insert the new character, EF tries to create each navigation property as if it were new (probably related to the state). Since they already exist, this fails.

The Insert method looks like
C# code:
public void Insert(Character dto)
{
    CharacterEntity c = _mapper.Map<CharacterEntity>(dto);
    using (var context = new Context())
    {
        context.Characters.Add(c);
        context.SaveChanges();
    }
    dto.Id = c.Id;
}
I can work around it by nulling all the navigation properties before I insert, but that feels like a kluge. Is there a better way? TIA
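
For reference, a common way out is to tell the change tracker that the related rows already exist rather than nulling them: after Add() marks the whole graph as new, flip each navigation target back to Unchanged. A sketch of the method above with that change (assuming EF Core's Entry()/References API):
C# code:
public void Insert(Character dto)
{
    CharacterEntity c = _mapper.Map<CharacterEntity>(dto);
    using (var context = new Context())
    {
        context.Characters.Add(c);   // marks c AND its navigation targets Added

        // The Category/Gender/Language/Race rows already exist, so tell
        // EF not to insert them; their keys become plain FK values.
        foreach (var reference in context.Entry(c).References)
        {
            if (reference.TargetEntry != null)
                reference.TargetEntry.State = EntityState.Unchanged;
        }

        context.SaveChanges();
    }
    dto.Id = c.Id;
}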

roadhead
Dec 25, 2001

Anyone ever had the displeasure of working with XpsDocumentWriter? Getting anything out of it that isn't just 8.5x11 portrait?

I have what I thought was a fairly simple idea: taking the output of my third-party control as three separate XPS files that are long and skinny, and then combining them into a single XPS - which sort of works, except the page size is always the default no matter what I try.

I'm adding the FixedPage I find inside each of the XPS files the third-party DLL renders out for me into a System.Windows.Media.ContainerVisual, then using the System.Windows.Documents.Serialization.SerializerWriterCollator I created from the System.Windows.Xps.XpsDocumentWriter. Now, the collator has some options for a System.Printing.PrintTicket object, both when you create it and when you write to it, but it doesn't seem to do anything for me.

code:

using (var fs = new FileStream("C:\\temp\\output.xps", FileMode.CreateNew))
{
    Package package = Package.Open(fs, FileMode.Create, FileAccess.ReadWrite);

    System.Windows.Xps.Packaging.XpsDocument myDocument =
        new System.Windows.Xps.Packaging.XpsDocument(package, CompressionOption.Fast, "C:\\temp\\output.xps");

    System.Windows.Xps.XpsDocumentWriter writer =
        System.Windows.Xps.Packaging.XpsDocument.CreateXpsDocumentWriter(myDocument);

    ContainerVisual newPage = new ContainerVisual();

    // Lift the FixedPages out of the source document and arrange them
    // onto one oversized visual.
    XpsDocument xpsOld = new XpsDocument(track.FileLocation, FileAccess.Read);
    FixedDocumentSequence seqOld = xpsOld.GetFixedDocumentSequence();
    foreach (DocumentReference r in seqOld.References)
    {
        FixedDocument d = r.GetDocument(false);
        foreach (PageContent pc in d.Pages)
        {
            FixedPage fixedPage = pc.GetPageRoot(false);
            fixedPage.Arrange(new Rect(new System.Windows.Point(0, vertOffset), new System.Windows.Size(width, height)));
            newPage.Children.Add(fixedPage);
        }
    }
    xpsOld.Close();

    // The print ticket the collator appears to ignore.
    System.Printing.PrintTicket printTicket1 = new System.Printing.PrintTicket()
    {
        PageMediaSize = new System.Printing.PageMediaSize(System.Printing.PageMediaSizeName.Roll36Inch),
        PageOrientation = System.Printing.PageOrientation.Landscape,
        PageMediaType = System.Printing.PageMediaType.Continuous,
        PageResolution = new System.Printing.PageResolution(System.Printing.PageQualitativeResolution.Default)
    };
    System.Windows.Documents.Serialization.SerializerWriterCollator collator =
        writer.CreateVisualsCollator(printTicket1, printTicket1);

    collator.BeginBatchWrite();
    collator.Write(newPage, printTicket1);
    collator.EndBatchWrite();

    package.Close();
}

I'll even take wild rear end guesses: is XPS obnoxious, and should I be combining into something else? My best option for source files is XPS, as everything else is a raster...

raminasi
Jan 25, 2005

a last drink with no ice
Anyone have any Azure Pipelines experience? I'm trying to get a pipeline to build against a .NET Core 3.0 preview SDK, and it's not working. I had this exact problem:
code:
error NETSDK1045: The current .NET SDK does not support targeting .NET Core 3.0. Either target .NET Core 2.2 or lower, or use a version of the .NET SDK that supports .NET Core 3.0.
and the answers given there "fixed" it, in the sense that I got an SDK installed as part of my pipeline and it didn't complain about version numbers. But then the NuGet restore fails:
code:
error : Unable to locate the .NET Core SDK. Check that it is installed and that the version specified in global.json (if any) matches the installed version.
It's now this problem. The answers there say that I need to manually set MSBuildSDKsPath, but that seems dumb (shouldn't the task do that?), and I tried it and it didn't fix anything anyway. I'm so close to getting this to work!

Nth Doctor
Sep 7, 2010

Darkrai used Dream Eater!
It's super effective!


raminasi posted:

Anyone have any Azure Pipelines experience? I'm trying to get a pipeline to build against a .NET Core 3.0 preview SDK, and it's not working. I had this exact problem:
code:
error NETSDK1045: The current .NET SDK does not support targeting .NET Core 3.0. Either target .NET Core 2.2 or lower, or use a version of the .NET SDK that supports .NET Core 3.0.
and the answers given there "fixed" it, in the sense that I got an SDK installed as part of my pipeline and it didn't complain about version numbers. But then the NuGet restore fails:
code:
error : Unable to locate the .NET Core SDK. Check that it is installed and that the version specified in global.json (if any) matches the installed version.
It's now this problem. The answers there say that I need to manually set MSBuildSDKsPath, but that seems dumb (shouldn't the task do that?), and I tried it and it didn't fix anything anyway. I'm so close to getting this to work!
Are you using azure-pipelines.yml? If so what does that look like?

raminasi
Jan 25, 2005

a last drink with no ice

Nth Doctor posted:

Are you using azure-pipelines.yml? If so what does that look like?

There are more steps after this, but this is as far as it's getting:
code:
trigger:
  tags:
    include:
    - v*

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  MSBuildSDKsPath: 'C:\Program Files\dotnet\sdk\3.0.100-preview7-012821\Sdks' # manually setting this path is the last thing i tried

steps:
- task: UseDotNet@2
  displayName: 'Use .NET Core SDK'
  inputs:
    packageType: sdk
    version: 3.0.100-preview7-012821
    installationPath: $(MSBuildSDKsPath)

- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

roflsaurus
Jun 5, 2004

GAOooooooh!! RAOR!!!
I'm trying to make an Azure webjob handle failure/restart a bit more gracefully for batch processing. In an ideal world, I'd convert to Azure Durable Functions to leverage fan-in/fan-out, but I don't think I'll have the time to properly convert all the webjobs (it does some funky stuff with caching that I'll need to re-architect to move over to functions).

Currently - the webjob gets a message to process 50 documents at a time from an Azure SQL database. It performs 1 SELECT to retrieve all the records, and then generates and emails the documents. After it's processed the whole batch it performs 1 UPDATE to mark all the records as complete.

I think it's hitting some memory limits and the webjob dies, or it terminates unexpectedly. I lose track of which of those 50 documents were actually sent (0, 10, all 50?) and can't gracefully recover.

I can't do individual record updates one by one in SQL, as this can run into 200k documents at a time and the DB is under a fair bit of load as it is. I'm thinking that when I send each document, I could write to Azure Table storage, an Azure Storage Queue, or Azure Redis. Then, if a webjob fails, its message will move to the poison queue, and I can read the temp storage (Table, Queue, or Redis) to identify which documents were sent and recover (update the SQL DB and then requeue the batch).

I just need simple key storage, e.g. <tenant-name>-<batch-id>-<document-id> = SENT. If I used queues, I would create a queue per batch and just push the messages into that queue.

Any recommendations as to Table, Queue or Redis for this scenario? Performance is key - aiming to send 200k documents per hour. I would lean towards Redis, but I'm worried about potential data loss as it's not persisted - or is this something not really to be concerned with? For it to be an issue the webjob would have to fail, and both Redis nodes would have to fail / be restarted before the webjob recovered.
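
For what it's worth, the Table option is only a few lines (a sketch, assuming the Azure.Data.Tables client; the keys are invented, following the scheme above):
C# code:
using Azure.Data.Tables;

var table = new TableClient("<connection string>", "sentmarkers");
table.CreateIfNotExists();

// After each document actually goes out:
table.UpsertEntity(new TableEntity("tenant-a_batch-42", "doc-1001") { ["Status"] = "SENT" });

// On recovery, enumerate what was really sent for the poisoned batch,
// mark those rows complete in SQL, and requeue the rest:
foreach (TableEntity e in table.Query<TableEntity>(x => x.PartitionKey == "tenant-a_batch-42"))
{
    // e.RowKey is the document id
}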

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
Humor my shenanigans a bit here with IEnumerator and coroutine madness. I am toying around with integrating some scripting into Unity and I was trying to maintain scheduling for it separate from Unity's own schedulers. A lot of that is just trying to decouple, but also because I don't know what Unity is doing with synchronization contexts. That right there has kept me away from using async-await, although I'm about to poke the synchronization context with a stick and see what happens.

Anyways, I wound up trying to use IEnumerator coroutines because of that. Internally, that means a lot of calls are returning IEnumerators because the things they invoke might yield. But then sometimes I need to recover a return value and conclude the coroutine. This is leading to a very common copy-paste, and I don't like that. I'm not in front of the code but it's basically a foreach against the invocation of the subordinate code. If it's not a return value, I end up yielding it up a layer--usually to another foreach block. I'd use a macro for this if I wasn't working in C#. I can't write a helper function because then I'd have to wrap that in the same foreach. So I'm pondering out loud here how I might be able to maintain the IEnumerator model while not having to do this foreach stuff.
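
It's roughly this shape (from memory; all the names are invented, and since these are IEnumerators rather than IEnumerables, it's really a manual MoveNext loop rather than a literal foreach):
C# code:
// "FinalValue" is a made-up marker meaning "this yield is the result, not a pause."
class FinalValue
{
    public object Value;
    public FinalValue(object v) { Value = v; }
}

// The boilerplate: drive the child, re-yield every pause, capture the result.
IEnumerator<object> Caller()
{
    object result = null;
    IEnumerator<object> child = ChildCoroutine();   // hypothetical subordinate call
    while (child.MoveNext())
    {
        if (child.Current is FinalValue fv) { result = fv.Value; break; }
        yield return child.Current;   // punt the pause up toward the scheduler
    }
    // ...continue the frame using result, then surface our own result:
    yield return new FinalValue(result);
}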

I'm actually starting to bank on overriding the synchronization context to defer all my own generated awaiters to my own scheduling because I think it would ultimately look cleaner, so please don't X instead of Y me on using async-await here. I'm starting to look into that too.

This gets particularly ugly if any scripts are written recursively. I don't plan to do that. However, I wonder if switching to async-await would make that any better. Internally, does the .NET runtime manage that better?

epswing
Nov 4, 2003

Soiled Meat
Only semi-related to .NET but...

We need to purchase a Visual Studio Pro 2017 license (standalone) for a new developer, but I'm only seeing Visual Studio Pro 2019 in the MS Store. It looks like they make it really hard to (a) download and (b) buy a license for older VS versions. I've spoken to a couple clueless MS Store and VS Site reps (via in-browser chat), and they don't know what's up. One told me to physically go to a brick/mortar MS Store at a nearby mall. I might actually try this, but drat...really? Just take my money already and give me a license for some slightly older software!

In the last few years there generally haven't been any must-have features, so we tend to upgrade to the latest VS after it's been out for a while, and the rest of the world has worked out the obvious bugs. Usually "Update 1", "Update 2", etc has been released by then.

Shy
Mar 20, 2010

Removing older versions sounds like a good idea because some people may try to buy them.

ljw1004
Jan 18, 2005

rum

Rocko Bonaparte posted:

Humor my shenanigans a bit here with IEnumerator and coroutine madness. I am toying around with integrating some scripting into Unity and I was trying to maintain scheduling for it separate from Unity's own schedulers. A lot of that is just trying to decouple, but also because I don't know what Unity is doing with synchronization contexts.

If you're doing your own co-routines, I'd expect you to use your own tasklike objects instead of tasks: you'll write "await myCoroutineFunction()". The C# compiler translates this into "myCoroutineFunction().GetAwaiter()". Therefore it's entirely up to you how you implement your GetAwaiter method.

The stuff about synchronization contexts lives solely in Task.GetAwaiter(), i.e. the default awaiter for task objects. If you do your own coroutines, then your work will be unrelated to synchronization contexts (sticks or otherwise...). It'll be solely up to you where coroutines get scheduled.
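
For example, here's a minimal sketch of an awaitable that routes continuations to a hand-rolled scheduler (all names invented; the GetAwaiter/IsCompleted/OnCompleted/GetResult shape is the real compiler contract):
C# code:
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;

// Hand-rolled work queue standing in for your own scheduling:
// pump it wherever you like, e.g. once per frame.
public static class MyScheduler
{
    static readonly Queue<Action> work = new Queue<Action>();
    public static void Enqueue(Action a) => work.Enqueue(a);
    public static void Pump() { while (work.Count > 0) work.Dequeue()(); }
}

// Awaiting this suspends the async method and parks its continuation
// on MyScheduler. No SynchronizationContext is consulted anywhere.
public struct YieldToScheduler : INotifyCompletion
{
    public YieldToScheduler GetAwaiter() => this;
    public bool IsCompleted => false;   // always suspend
    public void OnCompleted(Action continuation) => MyScheduler.Enqueue(continuation);
    public void GetResult() { }
}

// Usage: async Task Script() { Step1(); await new YieldToScheduler(); Step2(); }
// Calling MyScheduler.Pump() later resumes Script at Step2, on your schedule.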

Rocko Bonaparte posted:

This gets particularly ugly if any scripts are written recursively. I don't plan to do that. However, I wonder if switching to async-await would make that any better. Internally, does the .NET runtime manage that better?

so please don't X instead of Y me on using async-await here. I'm starting to look into that too.

Note that async+await is fundamentally doing the same thing as an enumerator. With async+await it ends up calling ".MoveNext()" on the AsyncStateMachine object. This is conceptually no different from calling ".MoveNext()" on an enumerator object. (For the first prototype of async+await I actually implemented it with IEnumerator and tried to persuade the rest of the C# language design team that this would be a good idea. Anders Hejlsberg was having none of it, so we abandoned it in favor of introducing a new family of types. IEnumerator came with too much inefficiency baggage; the async types allow for vastly less memory allocation.)

I don't know what is the source of your ugliness with recursive scripts. If you post some code, maybe that would clarify. The .NET runtime (CLR) isn't at all involved in async+await. It's entirely accomplished in libraries which can be written by you, or by the framework designers. They are the ones that implemented Task.GetAwaiter and decided to make it use synchronization contexts. When you say "ugliness" I instead assume you're talking about language-level ugliness, i.e. is your paradigm for coroutines compositional in the same way that async+await is. Again it's too hard to speculate without seeing your code.

I'm sorry to "X instead of Y" you here, but I'm 100% sure that async+await is the right way to do co-routines. Here's some co-routines I wrote which use async+await to hibernate a method and resume it at a later date. This isn't what you're doing, but it might help familiarize you with how to play around with coroutines. https://blogs.msdn.microsoft.com/lucian/2016/04/20/async-workflow-2/

LongSack
Jan 17, 2003

Downloaded VS2019 preview today, and started messing around with WPF on .NET Core 3.0. It’s going surprisingly well, with one glaring exception: Properties.Settings.Default is gone. I use settings extensively for things that the user can change and that they would expect to stay changed the next time they run the app. Things like most recent Foo, application colors, sort directions and orders, etc.

I liked settings because they were strongly typed and were read-write (unless you made them otherwise).

I’m currently using an appsettings.json file for connection strings, but they’re read-only, right?

I’m currently thinking about putting all these user-configurable items into a Settings table, then at app startup loading up a singleton class that offers them as properties and knows how to persist changes back to the database.

Better ideas? TIA

WorkerThread
Feb 15, 2012

I'd be curious if you find the current Microsoft-blessed replacement for settings/profile.

That said, your solution is totally fine; you could even do something simple like using a SQLite database to store the settings locally.

I again recommend against explicit singletons. :) Why not just write an IProfileService and inject it where needed?
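
Something like this is all the surface area it needs (a sketch; the names are made up):
C# code:
public interface IProfileService
{
    T Get<T>(string key, T fallback = default(T));
    void Set<T>(string key, T value);   // persist the change immediately
}

// Registered once at startup, injected wherever settings are needed:
// services.AddSingleton<IProfileService, SqlProfileService>();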

LongSack
Jan 17, 2003

WorkerThread posted:

I'd be curious if you find the current Microsoft-blessed replacement for settings/profile.

That said, your solution is totally fine; you could even do something simple like using a SQLite database to store the settings locally.

I again recommend against explicit singletons. :) Why not just write an IProfileService and inject it where needed?

I’m already using MSSQL for the rest of the app, so that’s fine for the settings.

As for the Singleton, there's no need for multiple instances of a Settings class, and instantiating one wherever needed seems like wasteful and unnecessary database hits. Granted, on my desktop with its i9 processor and SSD, it's probably moot, but why not strive for efficiency? I put my MainViewModel in as a singleton, since there should only ever be one (the other view models are transients). I thought that was what singletons were for.

Cuntpunch
Oct 3, 2003

A monkey in a long line of kings

LongSack posted:

I’m already using MSSQL for the rest of the app, so that’s fine for the settings.

As for the Singleton, there's no need for multiple instances of a Settings class, and instantiating one wherever needed seems like wasteful and unnecessary database hits. Granted, on my desktop with its i9 processor and SSD, it's probably moot, but why not strive for efficiency? I put my MainViewModel in as a singleton, since there should only ever be one (the other view models are transients). I thought that was what singletons were for.

This returns to your earlier questions about dependency injection. What he is saying is that at the point at the start of the app where you're standing up your dependency graph, either via an IOC container or manually, you can simply register a single instance of IProfileService, rather than the type itself, to ensure that the same instance gets used by all consuming classes.

LongSack
Jan 17, 2003

Cuntpunch posted:

This returns to your earlier questions about dependency injection. What he is saying is that at the point at the start of the app where you're standing up your dependency graph, either via an IOC container or manually, you can simply register a single instance of IProfileService, rather than the type itself, to ensure that the same instance gets used by all consuming classes.

Oh, that's what I was going to do anyways - register the concrete class as an implementation of an Interface. I probably misstated what I intended. At least, that's what I think I'm doing. It looks something like this:
C# code:
public class SystemSettings : ISettingsService
{
    ...
}

services.AddSingleton<ISettingsService, SystemSettings>();

raminasi
Jan 25, 2005

a last drink with no ice

raminasi posted:

Anyone have any Azure Pipelines experience? I'm trying to get a pipeline to build against a .NET Core 3.0 preview SDK, and it's not working. I had this exact problem:
code:
error NETSDK1045: The current .NET SDK does not support targeting .NET Core 3.0. Either target .NET Core 2.2 or lower, or use a version of the .NET SDK that supports .NET Core 3.0.
and the answers given there "fixed" it, in the sense that I got an SDK installed as part of my pipeline and it didn't complain about version numbers. But then the NuGet restore fails:
code:
error : Unable to locate the .NET Core SDK. Check that it is installed and that the version specified in global.json (if any) matches the installed version.
It's now this problem. The answers there say that I need to manually set MSBuildSDKsPath, but that seems dumb (shouldn't the task do that?), and I tried it and it didn't fix anything anyway. I'm so close to getting this to work!

Quoting myself because I fixed this problem - dotnet appears to get stupid when either a preview or 3.x SDK is installed. I had to explicitly install it next to the rest of my SDKs, and provide a global.json naming it. dotnet pack still isn't working though, so I'm not done yet!
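
For reference, the global.json part of that fix is just a file next to the solution pinning the SDK version (matching the version from the pipeline YAML above):
code:
{
  "sdk": {
    "version": "3.0.100-preview7-012821"
  }
}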

brap
Aug 23, 2004

Grimey Drawer

epalm posted:

Only semi-related to .NET but...

We need to purchase a Visual Studio Pro 2017 license (standalone) for a new developer, but I'm only seeing Visual Studio Pro 2019 in the MS Store. It looks like they make it really hard to (a) download and (b) buy a license for older VS versions. I've spoken to a couple clueless MS Store and VS Site reps (via in-browser chat), and they don't know what's up. One told me to physically go to a brick/mortar MS Store at a nearby mall. I might actually try this, but drat...really? Just take my money already and give me a license for some slightly older software!

In the last few years there generally haven't been any must-have features, so we tend to upgrade to the latest VS after it's been out for a while, and the rest of the world has worked out the obvious bugs. Usually "Update 1", "Update 2", etc has been released by then.

VS 16.2 is out I think, give it a try :)

Cuntpunch
Oct 3, 2003

A monkey in a long line of kings

brap posted:

VS 16.2 is out I think, give it a try :)

His entire question is predicated on needing a VS 15.x version - presumably owing to wanting to keep the entire development team in-sync.

To the OP's question - am I confused? For years now hasn't Visual Studio been effectively a subscription based license? And per https://visualstudio.microsoft.com/wp-content/uploads/2017/11/Visual-Studio-2018-Licensing-Whitepaper-November-2017.pdf

p.12 posted:

For Visual Studio Professional 2017 standalone licenses, the software included in the license is the current version of the
software, Visual Studio Professional 2017, plus downgrade rights to simultaneously run prior versions of Visual Studio
Professional to which you may otherwise have access.

If you have an active subscription for VS 2019, you should be just fine to install 2017 side-by-side.

nielsm
Jun 1, 2009



Yeah, when I log into my account I can download at least VS 2010, 2013, 2015, 2017, and 2019, and also a few even older versions.

Cuntpunch
Oct 3, 2003

A monkey in a long line of kings

nielsm posted:

Yeah, when I log into my account I can download at least VS 2010, 2013, 2015, 2017, and 2019, and also a few even older versions.

Yeah, I think that the only oddity would be going back to 2013 or older - where there was a 4-tier model. If you've got an Enterprise sub you're fine, but if you're on Professional I wouldn't know whether that would be considered Pro or Premium.

leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

Rocko Bonaparte posted:

Humor my shenanigans a bit here with IEnumerator and coroutine madness. I am toying around with integrating some scripting into Unity and I was trying to maintain scheduling for it separate from Unity's own schedulers. A lot of that is just trying to decouple, but also because I don't know what Unity is doing with synchronization contexts. That right there has kept me away from using async-await, although I'm about to poke the synchronization context with a stick and see what happens.

Anyways, I wound up trying to use IEnumerator coroutines because of that. Internally, that means a lot of calls are returning IEnumerators because the things they invoke might yield. But then sometimes I need to recover a return value and conclude the coroutine. This is leading to a very common copy-paste, and I don't like that. I'm not in front of the code but it's basically a foreach against the invocation of the subordinate code. If it's not a return value, I end up yielding it up a layer--usually to another foreach block. I'd use a macro for this if I wasn't working in C#. I can't write a helper function because then I'd have to wrap that in the same foreach. So I'm pondering out loud here how I might be able to maintain the IEnumerator model while not having to do this foreach stuff.

I'm actually starting to bank on overriding the synchronization context to defer all my own generated awaiters to my own scheduling because I think it would ultimately look cleaner, so please don't X instead of Y me on using async-await here. I'm starting to look into that too.

This gets particularly ugly if any scripts are written recursively. I don't plan to do that. However, I wonder if switching to async-await would make that any better. Internally, does the .NET runtime manage that better?

You can grab Unity’s UnitySynchronizationContext by querying for the current synchronization context anywhere you’re guaranteed to be on the main thread (any lifecycle function or a coroutine).

You can then use that synchronization context to synchronize back to the unity thread when you’re off the unity thread.

If you just use coroutines, everything is running on the main thread.
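
For example (a sketch: the holder class is made up, but SynchronizationContext.Current and Post are the standard APIs):
C# code:
using System.Threading;
using UnityEngine;

public class MainThreadContext : MonoBehaviour
{
    public static SynchronizationContext Unity;

    void Awake()
    {
        // Awake runs on the main thread, so Current is Unity's
        // UnitySynchronizationContext here.
        Unity = SynchronizationContext.Current;
    }
}

// Later, from a worker thread:
// MainThreadContext.Unity.Post(_ => Debug.Log("back on the main thread"), null);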

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
Thanks leper khan and ljw1004. I think I was talking to myself for a while there. I'm still experimenting.

ljw1004 posted:

I don't know what is the source of your ugliness with recursive scripts. If you post some code, maybe that would clarify. The .NET runtime (CLR) isn't at all involved in async+await. It's entirely accomplished in libraries which can be written by you, or by the framework designers. They are the ones that implemented Task.GetAwaiter and decided to make it use synchronization contexts. When you say "ugliness" I instead assume you're talking about language-level ugliness, i.e. is your paradigm for coroutines compositional in the same way that async+await is. Again it's too hard to speculate without seeing your code.
It's conceptual. In my case, a lot of calls end up leaning on something returning IEnumerator. I could then define recursive calls to the same one and, say, have a child 100 layers down yield. That would then yield 100 times up while my scheduler moves on to something else, presumably before coming back, going 100 times back down, and picking up again. I haven't studied whether that's what happens for such things normally, but I'm pretty sure I screwed myself with that for recursive calls due to the boilerplate foreach I have around them everywhere.

Edit: Okay, so to maybe explain more of what's going on. If you haven't tuned in before, I've been nibbling on a supposedly scaled-down Python interpreter in .NET here so I can write pseudo Python NPC scripting and stuff for Unity. Well, that and also study Python internals for potential job application stuff. A major requirement for me is something like half-sync/half-async; I want blocking calls on other parts of the engine for things like prompts without making the scripts re-entrant in clumsier ways like with state machines. So that's left me with a lot of shenanigans for doing coroutines.

Right now every callable thing returns an IEnumerator and blocking calls like this would just drift up and out of the current frame. Not every such call does this; a lot of stuff just returns immediately. However, I can't always tell that and it's always possible for something to get overridden. I mean, I'm the main rear end in a top hat using this so I don't think I'd override __add__ for the base integer class, but it's theoretically possible and I currently support it. Side fact: doing it this way was easier than some of the alternatives after implementing a certain amount of Pythonism. I didn't really expect that. Anyways, this leaves me with foreaching on IEnumerators all the time with a fairly boilerplate block that I can't do much about but clone and clone and clone. So I'm thinking that maybe I don't need to bother with it? I need something to deal with code that might yield before producing its final value. I'm not yielding in the sense of a classic generator but instead of trying to use it to punch out of the current frame and have something else run for awhile. So I need to pass those up before I get the one that represents the stuff I actually care about. Note that this is all going on under the hood so some messiness is acceptable, but the replication of that boilerplate is getting ridiculous. For example, every arithmetic opcode will defer to a dunder in Python and those are all callables that are getting this treatment.

The synchronization context came up when I was looking up what I might be getting into if I wanted the coroutines generated by the interpreter to be managed and scheduled by it instead of the hosting system directly. A big reason for that is to moderate how much some of the scripts run and interfere with them if a script is being naughty. Some of that is I just don't want the rest of the system to have to mess with it in general. What I thought from a cursory reading of SynchronizationContext is that there's a global one and I'd have to create my own implementation of something for it, replace the default one with mine, and use it to whittle out the ones I don't want to go on to normal processing. Between some stuff I read the other day and some comments here, I think I really didn't get it right with that. The synchronization context goes beyond basic coroutine task scheduling and into regular multithreaded stuff. So I'd prefer if I could do that interception without dealing with that.

So some of the requirements I'm balancing are:
1. Trying to avoid boilerplate.
2. Shuffling paused/yielded coroutines to a separate scheduler.

Rocko Bonaparte fucked around with this message at 08:07 on Aug 13, 2019
