Mr Shiny Pants
Nov 12, 2012

Bognar posted:


You can define queries on the schema using an arbitrary number of parameters as arguments. Parsing of the query is done in F# using FParsec, which is a port of the awesome Haskell library Parsec that's used for building parsers. From there, we map the parsed query to a query in the schema, then use the fields from the parsed query to build up a selector expression. Right now, this is all coded around using Entity Framework, but halfway through I realized it could pretty easily be built solely around IQueryable, so you could use any .NET ORM (or even query in-memory). I think that's going to be next on my list of things to do. There's a lot of work that needs to be done, though... this only superficially implements the spec and leaves out a lot of things. I'm also going to have to think hard about how to handle some aspects of it, such as Fragments.

What do you guys think?

I was thinking about this, and what it does, and I am wondering: how is this different from exposing a SQL server over HTTP? It reads like it is just a database endpoint with its own SQL-like dialect that happens to use HTTP instead of a TCP socket.

I am genuinely curious.

Do we get GraphQL injection now?

Mr Shiny Pants fucked around with this message at 14:59 on Aug 8, 2015

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

mortarr posted:

That looks real interesting, like being able to use it would be pretty handy, but also implementing the spec itself looks like a sweet-as piece of work though. How far are you through the spec?

So far I'm supporting all query operations aside from Fragments, Variables, and nested input arguments (e.g. a top-level query can have arguments, but not fields beneath it). The GQL type system is superficially implemented - I didn't spend a whole lot of time on this since most of my effort went into building expressions from the query. Most things in the type system shouldn't be too hard to support, aside from Union types. Not really sure how I'm going to get that to work yet. Introspection kind of relies on the type system, but that will be pretty simple to add once it's finished. Those are the big missing pieces, the rest of the spec is mostly clarification and details.

Facebook hasn't revealed how they solve problems like over-requesting causing performance issues. I have some ideas of my own for that, but I'd be interested to see what they are using.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Mr Shiny Pants posted:

I was thinking about this, and what it does, and I am wondering: how is this different from exposing a SQL server over HTTP? It reads like it is just a database endpoint with its own SQL-like dialect that happens to use HTTP instead of a TCP socket.

I am genuinely curious.

Do we get GraphQL injection now?

SQL Server supports arbitrary joins across arbitrary tables with arbitrary filters, as well as arbitrary updates and deletes. Super dangerous to expose to the internet.

In GraphQL you explicitly define what the user is allowed to query for and join across. Permissions are still up to you to handle on the query end, but it's nothing like exposing SQL Server. Think of it more like this:

A REST API has multiple endpoints for different resources. Each endpoint exposes a certain number of fields, potentially joined to a certain number of other related resources. Maybe you have some pages on your UI that require more fields/joins than others, so for performance concerns you create a separate endpoint for that page to query. 6 months of development pass and you have a shitload of endpoints for returning different resources, populated to various degrees.

In GraphQL, each of those endpoints would become a query. The fields and joins that are available on each resource can then be specified explicitly in the query. In that sense, it's (theoretically) no more dangerous than a REST API. Now, your UI can explicitly request only what it needs from the server instead of you having to modify or create a new endpoint for when something needs an additional field. It simplifies your development since you don't have to switch back and forth between front-end and back-end to constantly change endpoints for your UI requirements.
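
For a rough illustration, a page's request might look something like this (invented query/field names and a made-up /graphql endpoint - not tied to any particular implementation):

C# code:
using System.Net.Http;
using System.Threading.Tasks;

class GraphQLClientSketch
{
    // The page asks for exactly the fields it needs; the server only honours
    // fields its schema actually exposes. Query/field names here are made up.
    public static async Task<string> GetOrderSummaries(HttpClient http)
    {
        const string query = @"
            {
              orders(customerId: 42) {
                id
                total
                lineItems { productName quantity }
              }
            }";

        // Assumes http.BaseAddress is set and the server listens on /graphql.
        var response = await http.PostAsync("/graphql", new StringContent(query));
        return await response.Content.ReadAsStringAsync();
    }
}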

As mentioned in the above post, there are things that can cause performance problems such as requesting a shitload of fields/joins, but there are ways to mitigate that. However, there's no inherent security risk like there would be with exposing SQL Server.

amotea
Mar 23, 2008
Grimey Drawer
Just use Reactive Extensions and ReactiveUI for the dependent property stuff. In fact, use it for all your WPF/MVVM needs. It prevents you from ending up with a gigantic spaghetti ball of change notifications.

We've moved from the traditional WPF + MVVM paradigm to using ReactiveUI for Views and ViewModels (and other places where applicable) and it really helps to keep your code clean. It also allows you to do more in your code-behind without making a mess. This really helps because XAML isn't very powerful when it comes to custom expressions/behaviour. I would never want to go back to the old ways of doing things.
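
A minimal sketch of what I mean, using ReactiveUI's WhenAnyValue/ToProperty (class and property names made up):

C# code:
using ReactiveUI;

// Made-up view model: FullName is derived from FirstName/LastName, and ReactiveUI
// raises the change notification for it automatically - no hand-wired dependencies.
public class PersonViewModel : ReactiveObject
{
    private string _firstName;
    public string FirstName
    {
        get { return _firstName; }
        set { this.RaiseAndSetIfChanged(ref _firstName, value); }
    }

    private string _lastName;
    public string LastName
    {
        get { return _lastName; }
        set { this.RaiseAndSetIfChanged(ref _lastName, value); }
    }

    private readonly ObservableAsPropertyHelper<string> _fullName;
    public string FullName
    {
        get { return _fullName.Value; }
    }

    public PersonViewModel()
    {
        // Recompute FullName whenever FirstName or LastName changes.
        _fullName = this.WhenAnyValue(x => x.FirstName, x => x.LastName,
                                      (first, last) => (first + " " + last).Trim())
                        .ToProperty(this, x => x.FullName);
    }
}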

Mr Shiny Pants
Nov 12, 2012

Bognar posted:

SQL Server supports arbitrary joins across arbitrary tables with arbitrary filters, as well as arbitrary updates and deletes. Super dangerous to expose to the internet.

In GraphQL you explicitly define what the user is allowed to query for and join across. Permissions are still up to you to handle on the query end, but it's nothing like exposing SQL Server. Think of it more like this:

A REST API has multiple endpoints for different resources. Each endpoint exposes a certain number of fields, potentially joined to a certain number of other related resources. Maybe you have some pages on your UI that require more fields/joins than others, so for performance concerns you create a separate endpoint for that page to query. 6 months of development pass and you have a shitload of endpoints for returning different resources, populated to various degrees.

In GraphQL, each of those endpoints would become a query. The fields and joins that are available on each resource can then be specified explicitly in the query. In that sense, it's (theoretically) no more dangerous than a REST API. Now, your UI can explicitly request only what it needs from the server instead of you having to modify or create a new endpoint for when something needs an additional field. It simplifies your development since you don't have to switch back and forth between front-end and back-end to constantly change endpoints for your UI requirements.

As mentioned in the above post, there are things that can cause performance problems such as requesting a shitload of fields/joins, but there are ways to mitigate that. However, there's no inherent security risk like there would be with exposing SQL Server.

Thanks for the info, though it still seems like they have created a sort of DBMS with REST.

I don't want to sound pedantic, but what you've described sounds a lot like "you have a DB connection, and with your current credentials and access permissions you are allowed to arbitrarily query the information set exposed by the connection". Instead of running stored procedures (REST endpoints, if you will, giving a fixed set of information back), we will let you write raw SQL (GraphQL) to query the data.

So the queries are defined on the server or the client?

I can see why you would want this; I am just wondering if this is a reinvention of something like a SQL server that you could query directly and that exposes its results in JSON format. It almost looks like PHP with its SQL queries directly in code, except that instead of raw SQL we get a JSON-like dialect that talks to an HTTP endpoint instead of MySQL.

Ok, reading some more about this. You craft the queries on the client and send them to the server.

One final question: Is there a schema? How do you know which properties you can query?

Thinking about this some more: this could be really helpful. So if I understand correctly, you have written a server-side parser for it? Nice.

Mr Shiny Pants fucked around with this message at 16:21 on Aug 8, 2015

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
Phone posting so I'll be lazy and point to the intro and the spec:

http://facebook.github.io/react/blog/2015/05/01/graphql-introduction.html

https://facebook.github.io/graphql/

TL;DR: Yes, there's a schema (which itself is queryable), queries are defined on the server, the client specifies which query it wants to call and which fields that query should return.

What I put together is a (partial) implementation of the spec for .NET and anything supporting IQueryable. The neat thing is that, so far, I haven't seen any other GraphQL implementations that actually talk to a database. Most of them just work in memory, which is pretty useless.
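
To give a rough idea of the IQueryable side, the core trick is building a selector expression from the parsed field names - something like this simplified sketch (not my actual code; it flattens each row to an object[] rather than building a nested result):

C# code:
using System;
using System.Linq;
using System.Linq.Expressions;

// Simplified idea: take the field names parsed out of a GraphQL query and build a
// Select() over whatever IQueryable the ORM exposes, so only those columns come back.
static class QueryProjector
{
    public static IQueryable<object[]> SelectRequestedFields<T>(IQueryable<T> source, string[] fields)
    {
        var item = Expression.Parameter(typeof(T), "x");

        // Builds: x => new object[] { (object)x.Field1, (object)x.Field2, ... }
        var accessors = fields.Select(name =>
            (Expression)Expression.Convert(Expression.PropertyOrField(item, name), typeof(object)));
        var body = Expression.NewArrayInit(typeof(object), accessors);

        var selector = Expression.Lambda<Func<T, object[]>>(body, item);
        return source.Select(selector);
    }
}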

Opulent Ceremony
Feb 22, 2012
How is that different from an endpoint that implements OData? They've already got nice tools for that like Breeze.js

crashdome
Jun 28, 2011
Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority?

Is it still System.Timers.Timer?

Dietrich
Sep 11, 2001

crashdome posted:

Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority?

Is it still System.Timers.Timer?

NCron or Quartz.NET will take care of scheduling jobs to be executed on a calendar or repeating basis. You could wire it all up with Timer, but there are lots of subtle gotchas you need to worry about.

As a bonus, NCron will even take care of the whole windows service install for you.
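
For comparison, the hand-rolled Timer version looks roughly like this - the interval and the polling method are placeholders, and the comments call out a couple of the gotchas I mean:

C# code:
using System;
using System.Threading;
using System.Timers;

// Rough sketch of hand-rolled polling. The classic gotchas: exceptions thrown inside
// Elapsed are silently swallowed, and a slow poll can overlap the next tick unless
// you guard against re-entrancy yourself. A scheduler library handles this for you.
public class PollingService
{
    // Fully qualified because System.Threading also has a Timer type.
    private readonly System.Timers.Timer _timer = new System.Timers.Timer(5 * 60 * 1000); // every 5 minutes
    private int _running;

    public void Start()
    {
        _timer.Elapsed += (sender, e) =>
        {
            // Skip this tick if the previous one is still in progress.
            if (Interlocked.Exchange(ref _running, 1) == 1) return;
            try
            {
                GrabTheData(); // placeholder for the actual web service call
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex); // log it yourself - the timer won't surface the error
            }
            finally
            {
                Interlocked.Exchange(ref _running, 0);
            }
        };
        _timer.AutoReset = true;
        _timer.Start();
    }

    private void GrabTheData() { /* call the web service, hand results to the WPF app */ }
}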

Dietrich fucked around with this message at 20:39 on Aug 11, 2015

epswing
Nov 4, 2003

Soiled Meat

crashdome posted:

Is it still System.Timers.Timer?

Seconding NCron.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

crashdome posted:

Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority?

Is it still System.Timers.Timer?

Get the web service guys to use NServiceBus, it's perfect for this.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Opulent Ceremony posted:

How is that different from an endpoint that implements OData? They've already got nice tools for that like Breeze.js

OData and GraphQL are solving similar problems. GraphQL is definitely more human-readable than OData, but OData has a much more rigid specification (for better or worse). OData seems like it's only meant to be used with a RESTful-style, resource-based API, whereas GraphQL can map to any sort of object graph since the spec is a bit looser (again, for better or worse).

The big thing that I can see is that GraphQL is more composable than OData. This is important for applications using a Component-style UI model (e.g. React) where data is passed down from one component to its children. Without composability, the top-level component must know what all child components will need data-wise, so it has tight coupling. However, with a composable query language, the child components can define what data they need and the parent component can just bolt those requirements onto its query, reducing coupling.
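
As a toy illustration of that composability (invented component and field names, and plain string concatenation rather than anything a real client library does):

C# code:
static class ComposedQuerySketch
{
    // Each child component declares the fragment it needs...
    const string AvatarFragment = @"
        fragment AvatarFields on User { avatarUrl }";

    const string ProfileFragment = @"
        fragment ProfileFields on User { name bio }";

    // ...and the parent just splices those fragments into its own query,
    // without having to know every field its children want.
    public static string BuildUserQuery()
    {
        return @"
            {
              user(id: 123) {
                ...AvatarFields
                ...ProfileFields
              }
            }" + AvatarFragment + ProfileFragment;
    }
}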

EDIT: How timely, Facebook just posted this: http://facebook.github.io/react/blog/2015/08/11/relay-technical-preview.html

Bognar fucked around with this message at 21:06 on Aug 11, 2015

crashdome
Jun 28, 2011

Ithaqua posted:

Get the web service guys to use NServiceBus, it's perfect for this.

I'll try!


Dietrich posted:

NCron or Quartz.NET will take care of scheduling jobs to be executed on a calendar or repeating basis. You could wire it all up with Timer, but there are lots of subtle gotchas you need to worry about.

As a bonus, NCron will even take care of the whole windows service install for you.

epalm posted:

Seconding NCron.

Oh dearie me... a whole scheduling framework? I'll look into it, but isn't that a bit overkill for something that executes a single operation once every few minutes?

epswing
Nov 4, 2003

Soiled Meat

crashdome posted:

Oh dearie me... a whole scheduling framework? I'll look into it, but isn't that a bit overkill for something that executes a single operation once every few minutes?

All it really takes is

Install package:
code:
Install-Package NCron
Create task:
code:
public class GrabTheDataTask : NCron.ICronJob
{
    public void Execute()
    {
        // grab the data
    }

    public void Initialize(NCron.CronContext context)
    {
    }
}
Schedule task:
code:
var schedService = new SchedulingService();
schedService.At("0 * * * *").Run<GrabTheDataTask>(); // once per hour

Dietrich
Sep 11, 2001

epalm posted:

-Good stuff-

Also, make sure you use the console application template, and when you have it built, you just copy the .exe and .dlls to wherever, open a command prompt, and run ProgramName.exe install to have it install as a windows service.

https://code.google.com/p/ncron/wiki/Deployment

epswing
Nov 4, 2003

Soiled Meat
C# code:
public void DoStuff()
{
    try
    {
        await CheckAdmin("epalm@example.com", "abc");
        // do some stuff
    }
    catch (Exception e)
    {
        // log exception
    }
}

private async void CheckCredentials(string email, string pass)
{
    // check user

    var userManager = HttpContext.GetOwinContext().GetUserManager<ApplicationUserManager>();

    var user = await userManager.FindByEmailAsync(email);
    if (user == null)
        throw new AdminNotFoundException();

    // check password

    var result = await userManager.CheckPasswordAsync(user, pass);
    if (!result)
        throw new InvalidPasswordException();
}
  • The user manager only offers async methods, so I have to prefix them with "await"
  • After doing so, the CheckCredentials method needs to be marked as "async void" (the alarm bells are already starting to ring)
  • At runtime, calling DoStuff barfs with an InvalidOperationException, stating "An asynchronous operation cannot be started at this time" which "may also indicate an attempt to call an 'async void' method, which is generally unsupported within ASP.NET request processing. Instead, the asynchronous method should return a Task, and the caller should await it"
  • The nature of the CheckCredentials method is to complain if necessary, rather than returning anything. Do I need to change this because the user manager uses async? What Task would I even return?


Edit: I guess I need that to be
C# code:
public async Task DoStuff()
and
C# code:
await CheckAdmin("epalm@example.com", "abc");
and
C# code:
private async Task CheckCredentials(string email, string pass)

epswing fucked around with this message at 22:20 on Aug 11, 2015

Sedro
Dec 31, 2008
You can simply change the return type to Task, then await it to receive the exceptions.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
To clarify, change the CheckCredentials return type to Task. I don't know where DoStuff is being called, but that should probably be async Task as well. Task is the async equivalent of void, and Task<T> is the async equivalent of some return type T. Async void is a hack for event handlers that can't have a return type - if you're doing ASP.NET stuff then you can probably ignore this and never use async void.

Also, async is great and all, but it's hard to fit into an already synchronous application due to it requiring everything from top to bottom to be async.
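
Something like this, using the names from your snippet with made-up stand-ins for the user manager calls so it compiles on its own:

C# code:
using System;
using System.Threading.Tasks;

class CredentialCheckSketch
{
    public async Task DoStuff()
    {
        try
        {
            await CheckCredentials("epalm@example.com", "abc");
            // do some stuff
        }
        catch (Exception e)
        {
            Console.WriteLine(e); // exceptions from CheckCredentials surface here now
        }
    }

    private async Task CheckCredentials(string email, string pass)
    {
        var user = await FindByEmailAsync(email);      // stand-in for userManager.FindByEmailAsync
        if (user == null) throw new Exception("admin not found");

        var ok = await CheckPasswordAsync(user, pass); // stand-in for userManager.CheckPasswordAsync
        if (!ok) throw new Exception("invalid password");
    }

    // Made-up stand-ins so the sketch is self-contained.
    private Task<object> FindByEmailAsync(string email) { return Task.FromResult(new object()); }
    private Task<bool> CheckPasswordAsync(object user, string pass) { return Task.FromResult(true); }
}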

epswing
Nov 4, 2003

Soiled Meat

Ahh gotcha, thanks.

Bognar posted:

Also, async is great and all, but it's hard to fit into an already synchronous application due to it requiring everything from top to bottom to be async.

My thoughts exactly.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Bognar posted:

Async void is a hack for event handlers that can't have a return type

Yeah, the rule is basically "never ever ever use async void unless it's an event handler"

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Ithaqua posted:

Yeah, the rule is basically "never ever ever use async void unless it's an event handler"

Or something resembling an event handler, e.g. ICommand.

Ciaphas
Nov 20, 2005

> BEWARE, COWARD :ovr:


I put that babby's-first-reflection INPCDepends thing on github (and hoping like hell I did it right, never used github before), over here, because I had a question about it. The relevant bits are INPCDependsAttribute.cs and ObservableBase.cs.

Basically, as written, [INPCDepends] is 100% useless on anything that doesn't inherit from ObservableBase. Is there any way to make the compiler throw up an error if that attribute is used anywhere else, or do I just have to wait for runtime? Or should I just not bother? :v:

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Ciaphas posted:

I put that babby's-first-reflection INPCDepends thing on github (and hoping like hell I did it right, never used github before), over here, because I had a question about it. The relevant bits are INPCDependsAttribute.cs and ObservableBase.cs.

Basically, as written, [INPCDepends] is 100% useless on anything that doesn't inherit from ObservableBase. Is there any way to make the compiler throw up an error if that attribute is used anywhere else, or do I just have to wait for runtime? Or should I just not bother? :v:

Nitpick:

code:
catch (Exception e)
{
    Console.WriteLine(input);
    throw e;
}
throw e; will make you lose the stack trace. Just throw; instead if you want to rethrow an exception. In general you don't want to throw a bare Exception; you want to either reuse an existing, more specific exception type or implement your own.
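
A quick illustration of the difference (hypothetical helper method):

code:
using System;

static class RethrowExample
{
    static void MightFail()
    {
        throw new InvalidOperationException("boom");
    }

    public static void RethrowCorrectly()
    {
        try { MightFail(); }
        catch (InvalidOperationException)
        {
            // log, then...
            throw;   // stack trace still points at MightFail
        }
    }

    public static void RethrowBadly()
    {
        try { MightFail(); }
        catch (InvalidOperationException e)
        {
            throw e; // stack trace now starts here; the original frame is gone
        }
    }
}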

As for your question, no. Attributes are a runtime thing, the compiler doesn't know or care what's going to happen at runtime when you reflect the attribute out and start doing stuff with it.

Ciaphas
Nov 20, 2005

> BEWARE, COWARD :ovr:


Yeah that bit was part of the cycle check, and was kind of half done when I printed my notes from work :v: Thanks though.

And that's too bad. I sort of hoped there'd be a way to make the compiler care, but I guess it makes sense that you can't. :shobon:

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Ciaphas posted:

Yeah that bit was part of the cycle check, and was kind of half done when I printed my notes from work :v: Thanks though.

And that's too bad. I sort of hoped there'd be a way to make the compiler care, but I guess it makes sense that you can't. :shobon:

What industry do you work in where you aren't allowed internet access at your workstation?

Ciaphas
Nov 20, 2005

> BEWARE, COWARD :ovr:


Defense contractor. I don't do any :ninja: poo poo (or usually any C# stuff for that matter, but things are quiet) but we all get treated pretty equally in that regard :(

Gul Banana
Nov 28, 2003

nuget in vs2015 is unusable crap :/

ljw1004
Jan 18, 2005

rum

Ciaphas posted:

Is there any way to make the compiler throw up an error if that attribute is used anywhere else?

Yes there's a great way to do this but it only works in VS2015. Look up "Roslyn Analyzers". Basically, you'll distribute your attribute&stuff as a NuGet package, and you'll also distribute your analyzer as part of the same package. The analyzer lets you provide error/warning squiggles about incorrect use of your API.
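
A rough sketch of what such an analyzer could look like, using the INPCDependsAttribute/ObservableBase names from your repo (illustrative only - I haven't run it against your code):

C# code:
using System.Collections.Immutable;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Diagnostics;

// Flags any type that puts [INPCDepends] on a property without inheriting ObservableBase.
[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class INPCDependsUsageAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        id: "INPC001",
        title: "INPCDepends requires ObservableBase",
        messageFormat: "'{0}' uses [INPCDepends] but does not inherit from ObservableBase",
        category: "Usage",
        defaultSeverity: DiagnosticSeverity.Error,
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
    {
        get { return ImmutableArray.Create(Rule); }
    }

    public override void Initialize(AnalysisContext context)
    {
        context.RegisterSymbolAction(AnalyzeType, SymbolKind.NamedType);
    }

    private static void AnalyzeType(SymbolAnalysisContext context)
    {
        var type = (INamedTypeSymbol)context.Symbol;

        // Does any property on this type carry the attribute?
        var usesAttribute = type.GetMembers().OfType<IPropertySymbol>()
            .Any(p => p.GetAttributes().Any(a => a.AttributeClass != null &&
                                                 a.AttributeClass.Name == "INPCDependsAttribute"));
        if (!usesAttribute) return;

        // Walk the base type chain looking for ObservableBase.
        for (var baseType = type.BaseType; baseType != null; baseType = baseType.BaseType)
        {
            if (baseType.Name == "ObservableBase") return;
        }

        context.ReportDiagnostic(Diagnostic.Create(Rule, type.Locations[0], type.Name));
    }
}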

ljw1004
Jan 18, 2005

rum

Gul Banana posted:

nuget in vs2015 is unusable crap :/

Please say more? I've been working with the NuGet team on UWP stuff. Are you referring to the UI? (they know it needs improvement...) Or to project.json stuff in UWP apps? Or to project.json in ASP.NET5 apps?

kitten emergency
Jan 13, 2008

get meow this wack-ass crystal prison
Speaking of ASP.NET 5, I'm having a hell of a time getting it to play nice with EF6. I'm registering my context into the request pipeline just fine, but it looks like it's not actually connecting to the DB during initialization. Unfortunately since this is all beta still I haven't found a lot of information about the best way to do this - has anyone else run into something similar?

Gul Banana
Nov 28, 2003

ljw1004 posted:

Please say more? I've been working with the NuGet team on UWP stuff. Are you referring to the UI? (they know it needs improvement...) Or to project.json stuff in UWP apps? Or to project.json in ASP.NET5 apps?

The issues i've been having are about features and performance. the new UI looks nice, actually - a bit sparse but it feels more integrated into VS than before. i'm not attempting to use any new features, just the old ones- unfortunately, nuget 3 has broken my workflow in several ways.

i develop libraries in an internal (enterprise-type) ecosystem. there are solutions from which some projects are built into nuget packages, and other solutions which consume some of those packages. when i'm testing a bundle of jointly-versioned packages it's an iterative process - i build them into a 'staging' package source, then upgrade packages from that source into a product's repo, test and repeat. by the nature of library development this can expose bugs, breaking changes and api design issues, so there can be a number of iterations.

problem 1: "Upgrade All" no longer exists. the functionality is straight up gone, which is a nightmare when i'm deploying different prerelease combinations of 5-10 libraries into 20-30 projects. they have internal and external dependency trees and i've been relying on the package manager to figure it all out- the power of that button was the value of nuget for me, turning declaratively specified dependencies into whatever operations are necessary to get everything into a consistent state. after enough googling to figure out that this is no longer a thing, i tried to script around the problem- *building* libraries is already automated, so it would only be a moderate hassle to update them the same way. this led to...

problem 2: the powershell commands no longer resolve cross-package-source dependencies. this makes them useless - we have internal packages, partner/vendor packages, and the public nuget.org ecosystem. if i try to update an internal library which depends on e.g. microsoft.codeanalysis (because roslyn is great), it doesn't resolve- and if you were trying a multi-project command it halts the whole thing. there's no ability to continue on and apply changes to projects which *can* resolve their dependencies. that leaves me updating each project by hand each iteration, which reveals...

problem 3: it's very very slow. upgrading a simple project which references 3 nuget packages, all internal, takes about half a minute during which the VS ui locks up. this is on a core i5-4790 workstation with an SSD. this is not a great scenario for nuget to deal with- we have our server feeds, my staging feed and the public feed on another continent - but previous versions of the package manager handled it far better. right now our build/push processes are still using nuget 2 because i tested 3's version of nuget.exe and that was also way slower. it's just dealing with a local network share and an http symbol server, but seems to make many more expensive roundtrip calls than nuget 2.

some of these problems are listed as nuget team github issues, with milestone 3.1 attached. so i tried to update to that, following this link from the nuget blog. however, it leads to a VS Extension Gallery "This item is not yet published." page.

Gul Banana
Nov 28, 2003

apologies for the negativity. i'm all for the idea of project.json, .net core's package restore workflow and so on - unifying projects and packages will ultimately make my life better. it's just very annoying in the moment when an update to infrastructural software removes crucial functionality and seems basically half-baked- an impression i may be getting unfairly from the more open new development process. at the moment i have a better time with even SBT (in the Scala ecosystem) than NuGet 3.

Ika
Dec 30, 2004
Pure insanity

I'm just starting to use .NET / mixed mode C++, and have an assembly reference question.

When I need to reference a third party assembly I just add it as a DLL to the project references, and everything works fine as a local build. However, when I want to compile on a build server, am I forced to include the referenced DLL alongside my source code and check it in, or is there an alternative way to pull the type information from a file which does not contain any implementation, which I could add to the repository instead?

ljw1004
Jan 18, 2005

rum

Gul Banana posted:

Problems with NuGet 3

"problem 1: "Upgrade All" no longer exists." - Understood. The NuGet team is working bringing back UpgradeAll right at the moment.

"problem 2: the powershell commands no longer resolve cross-package-source dependencies." - slated for NuGet v3.2

"problem 3: it's very very slow." - there are a few things to do here. (1) Update the package source to V3. (2) If you have a local package source (directory) and it's slow, you can speed it up by putting it behind a server. (3) The NuGet guy wrote "Nuget3 is calling all sources in parallel rather than in order, that's a bug fix from NuGet2 that indeed makes things slower, but results in the right behavior." I don't understand this last comment but don't want to distract him for clarification any further since he should be heads-down implementing UpgradeAll...

"i tried to update to milestone 3.1, following this link from the nuget blog. however, it leads to a VS Extension Gallery "This item is not yet published." page." - I let the team know. Hopefully they'll fix the blog. [edit: they have]. But it's easiest just to update NuGet within VS via Tools > Extensions and Updates, which will also get you 3.1.


Edit: More notes from the guy in charge of NuGet, which I'm not expert enough in NuGet to understand: "Also note that with project.json and a better folder layout (similar to the packages folder) the update story becomes a lot better with star specifiers. Also update-package works from powershell exactly how update all/upgrade all worked from the UI. It's just a matter of adding a button for all packages (not so for updating individual packages)"

ljw1004 fucked around with this message at 17:46 on Aug 12, 2015

GoodCleanFun
Jan 28, 2004
General NuGet question. Started using an IIS hosted NuGet feed running on a virtual server for libraries and I've run into a weird issue that may be due to how I'm handling my package updates.

Example:

Say I have a package called Library and the version is 1.0.0.1. I pull that into a program called LibraryTest and then make an update to Library afterwards and change the version to 1.0.0.2 and push to the feed. Now say I make another change to Library and the version is now 1.0.0.3, etc. Now in LibraryTest, which is currently running 1.0.0.1, I get restore issues with NuGet. However, if I update to 1.0.0.2 in LibraryTest before pushing 1.0.0.3, it seems to work. While I can update LibraryTest after each new package publish, this becomes an impossible and absurd task when you have hundreds of programs using Library.

I've been removing my old packages from the packages folder in my NuGet IIS feed and adding the new one and pushing. I initially tried adding folders inside the Packages folder to hold all versions, but my packages were not picked up once pushed.

How should one go about this?

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
I've got a pretty basic best practice question. I develop almost exclusively in MVC and I like to handle all exceptions using an exception filter, or in the case of my latest project I'm using the "OnException" override on a base controller. The thing is, I want as many of these exceptions as possible to arrive with some kind of friendly message that can be shown to the user on an error page, or returned in a JSON response, so that the message doesn't reveal sensitive details about the inner workings of the code (otherwise I may as well just stick with the yellow screen of death!). To achieve this, what I tend to do is something like this:

code:
try
{
    // do the thing, and maybe encounter some kind of exception!
}
catch (ExceptionICanHandle e)
{
    // handle exception
}
catch (FriendlyException)
{
    throw;
}
catch (Exception e)
{
    Logger.Writer.Write(string.Format("There was an unexpected exception while doing the thing. Message: {0}",
                    e.Message));
    throw new FriendlyException("There was an unexpected exception while doing the thing.", e);
}
Please note that obviously ExceptionICanHandle isn't literal, it's a placeholder for any type of exception I can log and then handle without needing to bubble it up to the top layer.

I know that the generally accepted guideline is to catch only those exceptions that you can actually handle, and that all other exceptions should be left to bubble upward. If I can move to that approach I'd love to; I would much prefer to be doing this "the right way". But I feel like I have good reasoning behind my current approach, so I would need to know how to achieve the same benefits while not catching the exceptions I can't handle.

My reasoning behind catching these exception types:

ExceptionsICanHandle
This is obvious and not at all controversial, I catch these so I can... handle them.

FriendlyException
In the try block there are often calls to my own code where I may have already encountered an exception and packaged it up into a FriendlyException, so I don't need to repackage it; I just need to rethrow it.

Exception
This is a way for me to intercept the nasty Exception that I don't want shown to the user and transform it into a FriendlyException that provides a user-friendly message that is (more importantly) contextual. At this point I know the context of the otherwise unhandled exception and I can report on it in the logging and in the friendly message the user receives. For example if the unhandled exception occurred while trying to update a user's details, I can report back the user id and the details that were sent for updating. I can't do that if the exception bubbles all the way up and is caught at the last minute.

My problem with what I'm doing is that I'm sure I'm violating a couple of principles I hear a lot: "throw early, catch late" and "only catch exceptions you can handle" and I'm also essentially hiding the unhandled exceptions from higher levels of the application. But I'm not sure how to achieve the same usefulness without using the approach I've used. In other words, I know my approach is wrong, but I can't quite hit on the correct approach.
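
For context, the kind of OnException override I'm talking about looks roughly like this (simplified sketch - FriendlyException is stubbed out so it stands alone, and I've left out the JSON-response branch):

C# code:
using System;
using System.Web.Mvc;

// Stand-in for my real exception type.
public class FriendlyException : Exception
{
    public FriendlyException(string message, Exception inner = null) : base(message, inner) { }
}

public class BaseController : Controller
{
    protected override void OnException(ExceptionContext filterContext)
    {
        var friendly = filterContext.Exception as FriendlyException;
        var message = friendly != null
            ? friendly.Message
            : "Something went wrong.";   // never show the raw exception text

        filterContext.Result = new ViewResult
        {
            ViewName = "Error",
            ViewData = new ViewDataDictionary(message)
        };
        filterContext.ExceptionHandled = true;
    }
}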

putin is a cunt fucked around with this message at 01:27 on Aug 13, 2015

Gul Banana
Nov 28, 2003

ljw1004 posted:

fixes to nuget 3
sounds like things are on the right track! just a couple of clarifications:

quote:

"problem 3: it's very very slow." - there are a few things to do here. (1) Update the package source to V3. (2) If you have a local package source (directory) and it's slow, you can speed it up by putting it behind a server. (3) The NuGet guy wrote "Nuget3 is calling all sources in parallel rather than in order, that's a bug fix from NuGet2 that indeed makes things slower, but results in the right behavior." I don't understand this last comment but don't want to distract him for clarification any further since he should be heads-down implementing UpgradeAll...
not sure how an http package source could be faster than a local file system directory, but I'll try it!
unfortunately, I'm guessing your coworker means that all package sources are contacted rather than checking each only if the previous ones failed to resolve a dependency. this means that every operation now requires a round trip to America rather than to my ssd or to our LAN... I can see why that might be a bug fix but it's definitely going to be a permanent slowdown :( previously, we'd sped things up by mirroring, in our corporate feed, packages from nuget.org, and it sounds like that won't work anymore.

quote:

But it's easiest just to update NuGet within VS via Tools > Extensions and Updates, which will also get you 3.1.
this doesn't actually work- the nuget extension is marked as non-updatable or something and VS tells you to visit the gallery. I'll use the new link or install the UWP tools (which also come with 3.1, I think)


quote:

Edit: More notes from the guy in charge of NuGet, which I'm not expert enough in NuGet to understand: "Also note that with project.json and a better folder layout (similar to the packages folder) the update story becomes a lot better with star specifiers. Also update-package works from powershell exactly how update all/upgrade all worked from the UI. It's just a matter of adding a button for all packages (not so for updating individual packages)"

to use project.json I'd have to build new-style PCLs, right? I'll look into whether those can target .net 4.5. if so, might be worthwhile.

EssOEss
Oct 23, 2006
128-bit approved
One thing that I noticed (and sent a frown about) is some bad behavior in the situation where I have a custom repository that is only sometimes accessible (i.e. when connected to VPN). I was not connected to VPN and tried to install Newtonsoft.Json to a project. Fairly simple operation, right? Well, NuGet failed to do it because it could not connect to my custom repository...

Do I really need to be connected to all repositories all the time? Surely it cannot be that silly. Indeed, I did not manage to reproduce this today. Perhaps it only does that if the project already includes packages from the custom repository? Even then it is not desirable when I am not actually installing anything new from the custom repository.

amotea
Mar 23, 2008
Grimey Drawer
Re: package managers, I found Paket (http://fsprojects.github.io/Paket/) to be very lean and fast. It probably doesn't support many of the advanced scenarios you're talking about here though.

Gul Banana
Nov 28, 2003

it's a new world out there.



sadly, the only one of these which supports .NET 4.5 is the first one, Class Library (Package) - this is an xproj rather than a csproj, a DNX project. no good for my purposes since some of the libraries I'd like to build are written in VB (which DNX does not support). so, no project.json yet.

i was amused to see the slightly different schemas of these files- presumably nuget uses the common 'dependencies' and 'frameworks' keys

Class Library (Package) posted:

code:
{
  "version": "1.0.0-*",
  "description": "ClassLibrary-Package Class Library",
  "authors": [ "banana" ],
  "tags": [ "" ],
  "projectUrl": "",
  "licenseUrl": "",

  "dependencies": {
    "System.Collections": "4.0.10-beta-23019",
    "System.Linq": "4.0.0-beta-23019",
    "System.Threading": "4.0.10-beta-23019",
    "System.Runtime": "4.0.10-beta-23019",
    "Microsoft.CSharp": "4.0.0-beta-23019"
  },

  "frameworks": {
    "net452": { }
  }
}

Class Library (Portable) posted:

code:
{
  "supports": {
    "net46.app": {},
    "uwp.10.0.app": {},
    "dnxcore50.app": {}
  },
  "dependencies": {
    "Microsoft.NETCore": "5.0.0",
    "Microsoft.NETCore.Portable.Compatibility": "1.0.0"
  },
  "frameworks": {
    "dotnet": {
      "imports": "portable-net452+win81"
    }
  }
}

Class Library (Universal Windows) posted:

code:
{
  "dependencies": {
    "Microsoft.NETCore.UniversalWindowsPlatform": "5.0.0"
  },
  "frameworks": {
    "uap10.0": {}
  },
  "runtimes": {
    "win10-arm": {},
    "win10-arm-aot": {},
    "win10-x86": {},
    "win10-x86-aot": {},
    "win10-x64": {},
    "win10-x64-aot": {}
  }
}
