|
Bognar posted:
I was thinking about this, and what it does, and I am wondering: How is this different from exposing a SQL server over HTTP? It reads like it is just a database endpoint with its own SQL-like dialect that happens to use HTTP instead of a TCP socket. I am genuinely curious. Do we get GraphQL injection now? Mr Shiny Pants fucked around with this message at 14:59 on Aug 8, 2015 |
# ? Aug 8, 2015 14:56 |
|
|
# ? Jun 6, 2024 10:51 |
|
mortarr posted:That looks real interesting, like being able to use it would be pretty handy, but also implementing the spec itself looks like a sweet-as piece of work though. How far are you through the spec? So far I'm supporting all query operations aside from Fragments, Variables, and nested input arguments (e.g. a top-level query can have arguments, but not the fields beneath it). The GQL type system is superficially implemented - I didn't spend a whole lot of time on this since most of my effort went into building expressions from the query. Most things in the type system shouldn't be too hard to support, aside from Union types. Not really sure how I'm going to get that to work yet. Introspection kind of relies on the type system, but that will be pretty simple to add once it's finished. Those are the big missing pieces, the rest of the spec is mostly clarification and details. Facebook hasn't revealed how they solve problems like over-requesting and the performance issues it causes. I have some ideas of my own for that, but I'd be interested to see what they are using.
|
# ? Aug 8, 2015 15:13 |
|
Mr Shiny Pants posted:I was thinking about this, and what it does, and I am wondering: How is this different from exposing a SQL server over HTTP? I reads like it is just an database endpoint with its own SQL like dialect that happens to use HTTP instead of a TCP socket. SQL Server supports arbitrary joins across arbitrary tables with arbitrary filters, as well as arbitrary updates and deletes. Super dangerous to expose to the internet. In GraphQL you explicitly define what the user is allowed to query for and join across. Permissions are still up to you to handle on the query end, but it's nothing like exposing SQL Server. Think of it more like this: A REST API has multiple endpoints for different resources. Each endpoint exposes a certain number of fields, potentially joined to a certain number of other related resources. Maybe you have some pages on your UI that require more fields/joins than others, so for performance concerns you create a separate endpoint for that page to query. 6 months of development pass and you have a shitload of endpoints for returning different resources, populated to various degrees. In GraphQL, each of those endpoints would become a query. The fields and joins that are available on each resource can then be specified explicitly in the query. In that sense, it's (theoretically) no more dangerous than a REST API. Now, your UI can explicitly request only what it needs from the server instead of you having to modify or create a new endpoint for when something needs an additional field. It simplifies your development since you don't have to switch back and forth between front-end and back-end to constantly change endpoints for your UI requirements. As mentioned in the above post, there are things that can cause performance problems such as requesting a shitload of fields/joins, but there are ways to mitigate that. However, there's no inherent security risk like there would be with exposing SQL Server.
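For readers who haven't seen the syntax, a hedged illustration of the point above. The `user` query, its fields, and the `friends` join here are hypothetical examples, not from any real schema:

```graphql
# One query replaces several purpose-built REST endpoints: the client asks
# for exactly the fields (and joins) the server has chosen to expose.
{
  user(id: 4) {
    name
    email
    friends {
      name
    }
  }
}
```

The server only ever answers for fields it explicitly defined, which is the difference from handing out raw SQL.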
|
# ? Aug 8, 2015 15:26 |
|
Just use Reactive Extensions and ReactiveUI for the dependent property stuff. In fact, use it for all your WPF/MVVM needs. It prevents you from ending up with a gigantic spaghetti ball of change notifications. We've moved from the traditional WPF + MVVM paradigm to using ReactiveUI for Views and ViewModels (and other places where applicable) and it really helps to keep your code clean. It also allows you to do more in your code-behind without making a mess. This really helps because XAML isn't very powerful when it comes to custom expressions/behaviour. I would never want to go back to the old ways of doing things.
|
# ? Aug 8, 2015 15:50 |
|
Bognar posted:SQL Server supports arbitrary joins across arbitrary tables with arbitrary filters, as well as arbitrary updates and deletes. Super dangerous to expose to the internet. Thanks for the info, though it still seems like they have created a sort of DBMS with REST. I don't want to sound pedantic, but what you've described sounds a lot like "you have a DB connection, and with your current credentials and access permissions you are allowed to arbitrarily query the information set exposed by the connection". Instead of running stored procedures (REST endpoints, if you will, giving a fixed set of information back) we will let you write raw SQL (GraphQL) to query the data. So the queries are defined on the server or the client? I can see why you would want this, I am just wondering if this is a reinvention of something like an SQL server that you could query directly that exposes its results in JSON format. It almost looks like PHP with its SQL queries directly in code, but instead of raw SQL we get a JSON dialect that talks to an HTTP endpoint instead of MySQL. Ok, reading some more about this. You craft the queries on the client and send them to the server. One final question: Is there a schema? How do you know which properties you can query? Thinking about this some more: This could be really helpful. So if I understand correctly you have written a server parser for it? Nice. Mr Shiny Pants fucked around with this message at 16:21 on Aug 8, 2015 |
# ? Aug 8, 2015 16:08 |
|
Phone posting so I'll be lazy and point to the intro and the spec: http://facebook.github.io/react/blog/2015/05/01/graphql-introduction.html https://facebook.github.io/graphql/ TL;DR: Yes, there's a schema (which itself is queryable), queries are defined on the server, the client specifies which query it wants to call and which fields that query should return. What I put together is a (partial) implementation of the spec for .NET and anything supporting IQueryable. The neat thing is that, so far, I haven't seen any other GraphQL implementations that actually talk to a database. Most of them just work in memory, which is pretty useless.
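As a rough sketch of what "talking to a database" means here: a GraphQL-style layer ultimately has to turn a query's requested field list into an expression tree that a LINQ provider can translate into SQL. This is my own minimal illustration of that idea, not code from the project under discussion; the `User` type and `FieldProjector` helper are invented for the example:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

static class FieldProjector
{
    // Builds x => new Dictionary<string, object> { { "Id", (object)x.Id }, ... }
    // so only the requested columns are materialized by the query provider.
    public static IQueryable<Dictionary<string, object>> SelectFields<T>(
        IQueryable<T> source, params string[] fields)
    {
        var param = Expression.Parameter(typeof(T), "x");
        var add = typeof(Dictionary<string, object>).GetMethod("Add");
        var inits = fields.Select(f => Expression.ElementInit(
            add,
            Expression.Constant(f),
            Expression.Convert(Expression.Property(param, f), typeof(object))));
        var body = Expression.ListInit(
            Expression.New(typeof(Dictionary<string, object>)), inits);
        var lambda = Expression.Lambda<Func<T, Dictionary<string, object>>>(body, param);
        return source.Select(lambda);
    }
}

class Program
{
    static void Main()
    {
        var users = new[]
        {
            new User { Id = 1, Name = "ada", Email = "ada@example.com" }
        }.AsQueryable();

        // The "query" asked for Id and Name only; Email is never selected.
        var row = FieldProjector.SelectFields(users, "Id", "Name").Single();
        Console.WriteLine(row.Count); // 2
    }
}
```

Against a real provider like Entity Framework, the same lambda becomes a `SELECT Id, Name` rather than pulling whole rows, which is where the over-fetching savings come from.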
|
# ? Aug 8, 2015 16:52 |
|
How is that different from an endpoint that implements OData? They've already got nice tools for that like Breeze.js
|
# ? Aug 10, 2015 23:52 |
|
Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority? Is it still System.Timers.Timer?
|
# ? Aug 11, 2015 19:57 |
|
crashdome posted:Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority? Ncron or Quartz.net will take care of scheduling jobs to be executed on a calendar or repeating basis. You could wire it all up with Timer, but there are lots of subtle gotchas you need to worry about. As a bonus, NCron will even take care of the whole windows service install for you. Dietrich fucked around with this message at 20:39 on Aug 11, 2015 |
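For anyone who does go the raw Timer route, two of those subtle gotchas are overlapping callbacks (a poll that runs longer than the interval) and exceptions escaping the callback. A minimal sketch of guarding against both — the `PollGuard` class and names are illustrative, not from any library:

```csharp
using System;
using System.Threading;

class PollGuard
{
    private int _running; // 0 = idle, 1 = a poll is in flight

    // Returns false if a previous poll is still running (this tick is skipped).
    public bool TryRun(Action poll)
    {
        if (Interlocked.CompareExchange(ref _running, 1, 0) != 0)
            return false;
        try
        {
            poll();
        }
        catch (Exception ex)
        {
            // Swallow after logging: an unhandled exception here would either
            // crash the process (System.Threading.Timer) or vanish silently
            // (System.Timers.Timer), stopping your updates with no trace.
            Console.Error.WriteLine(ex);
        }
        finally
        {
            Interlocked.Exchange(ref _running, 0);
        }
        return true;
    }
}

class Program
{
    static void Poll() => Console.WriteLine("polled web service");

    static void Main()
    {
        var guard = new PollGuard();
        // Wire the guard into a timer; the service's real poll goes here.
        using var timer = new Timer(_ => guard.TryRun(Poll), null, 0, 60_000);
        Thread.Sleep(100); // keep the demo alive long enough for one tick
    }
}
```

A scheduling library handles this (plus calendars, service install, and restart-after-failure), which is the argument for NCron/Quartz over hand-rolling it.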
# ? Aug 11, 2015 20:03 |
|
crashdome posted:Is it still System.Timers.Timer? Seconding NCron.
|
# ? Aug 11, 2015 20:12 |
|
crashdome posted:Quick question: I'm in discussions about a new project that is a 24/7 windows service which grabs data from a web service and then communicates/displays that data on a standard WPF app. I am trying to convince the web service developers to send push-style notifications to my windows service for real-time updates. My question, though, is: if I cannot get the web service guys to hand out push-style notifications of updates, what is the best timer (or other way) to use in a Windows Service for interval updates that has long-term reliability as a priority? Get the web service guys to use NServiceBus, it's perfect for this.
|
# ? Aug 11, 2015 20:35 |
|
Opulent Ceremony posted:How is that different from an endpoint that implements OData? They've already got nice tools for that like Breeze.js OData and GraphQL are solving similar problems. GraphQL is definitely more human-readable than OData, but OData has a much more rigid specification (for better or worse). OData seems like it's only meant to be used with a RESTful-style, resource-based API, whereas GraphQL can map to any sort of object graph since the spec is a bit looser (again, for better or worse). The big thing that I can see is that GraphQL is more composable than OData. This is important for applications using a Component-style UI model (e.g. React) where data is passed down from one component to its children. Without composability, the top-level component must know what all child components will need data-wise, so it has tight coupling. However, with a composable query language, the child components can define what data they need and the parent component can just bolt those requirements onto its query, reducing coupling. EDIT: How timely, Facebook just posted this: http://facebook.github.io/react/blog/2015/08/11/relay-technical-preview.html Bognar fucked around with this message at 21:06 on Aug 11, 2015 |
# ? Aug 11, 2015 21:04 |
|
Ithaqua posted:Get the web service guys to use NServiceBus, it's perfect for this. I'll try! Dietrich posted:Ncron or Quartz.net will take care of scheduling jobs to be executed on a calendar or repeating basis. You could wire it all up with Timer, but there are lots of subtle gotchas you need to worry about. epalm posted:Seconding NCron. Oh dearie me... a whole scheduling framework? I'll look into it, but isn't that a bit overkill for something that executes a single operation once every few minutes?
|
# ? Aug 11, 2015 21:06 |
|
crashdome posted:Oh dearie me... a whole scheduling framework? I'll look into it but, isn't that a bit overkill for something that executes a single operation once every few minutes? All it really takes is Install package: code:
code:
code:
|
# ? Aug 11, 2015 21:11 |
|
epalm posted:-Good stuff- Also, make sure you use the console application template, and when you have it built, you just copy the .exe and .dlls to wherever, open a command prompt, and run ProgramName.exe install to have it install as a windows service. https://code.google.com/p/ncron/wiki/Deployment
|
# ? Aug 11, 2015 21:32 |
|
C# code:
Edit: I guess I need that to be C# code:
C# code:
C# code:
epswing fucked around with this message at 22:20 on Aug 11, 2015 |
# ? Aug 11, 2015 22:08 |
|
You can simply change the return type to Task, then await it to receive the exceptions.
|
# ? Aug 11, 2015 22:19 |
|
To clarify, change the CheckCredentials return type to Task. I don't know where DoStuff is being called, but that should probably be Task as well. Task is the async equivalent for void, and Task<T> is the async equivalent for some return type T. Async void is a hack for event handlers that can't have a return type - if you're doing ASP.NET stuff then you can probably ignore this and never use async void. Also, async is great and all, but it's hard to fit into an already synchronous application due to it requiring everything from top to bottom to be async.
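A sketch of the shape being described, reusing the CheckCredentials/DoStuff names from the thread; the method bodies are invented for illustration:

```csharp
using System;
using System.Threading.Tasks;

class AuthDemo
{
    // async Task instead of async void: a thrown exception is captured in the
    // returned Task and surfaces at the await, instead of being re-thrown on
    // the synchronization context where nothing can catch it.
    public static async Task CheckCredentials()
    {
        await Task.Delay(10); // stand-in for the real credential check
        throw new UnauthorizedAccessException("bad credentials");
    }

    public static async Task<bool> DoStuff()
    {
        try
        {
            await CheckCredentials(); // the fault is observed right here
            return true;
        }
        catch (UnauthorizedAccessException)
        {
            return false; // caller gets a result instead of a crash
        }
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(AuthDemo.DoStuff().GetAwaiter().GetResult()); // False
    }
}
```

Had CheckCredentials been `async void`, the try/catch in DoStuff would never see the exception.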
|
# ? Aug 11, 2015 22:25 |
|
Ahh gotcha, thanks. Bognar posted:Also, async is great and all, but it's hard to fit into an already synchronous application due to it requiring everything from top to bottom to be async. My thoughts exactly.
|
# ? Aug 11, 2015 22:27 |
|
Bognar posted:Async void is a hack for event handlers that can't have a return type Yeah, the rule is basically "never ever ever use async void unless it's an event handler"
|
# ? Aug 11, 2015 22:28 |
|
Ithaqua posted:Yeah, the rule is basically "never ever ever use async void unless it's an event handler" Or something resembling an event handler, e.g. ICommand.
|
# ? Aug 11, 2015 22:44 |
|
I put that babby's-first-reflection INPCDepends thing on github (and hoping like hell I did it right, never used github before), over here, because I had a question about it. The relevant bits are INPCDependsAttribute.cs and ObservableBase.cs. Basically, as written, [INPCDepends] is 100% useless on anything that doesn't inherit from ObservableBase. Is there any way to make the compiler throw up an error if that attribute is used anywhere else, or do I just have to wait for runtime? Or should I just not bother?
|
# ? Aug 12, 2015 02:25 |
|
Ciaphas posted:I put that babby's-first-reflection INPCDepends thing on github (and hoping like hell I did it right, never used github before), over here, because I had a question about it. The relevant bits are INPCDependsAttribute.cs and ObservableBase.cs. Nitpick: code:
As for your question, no. Attributes are a runtime thing; the compiler doesn't know or care what's going to happen at runtime when you reflect the attribute out and start doing stuff with it.
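One pre-Roslyn workaround is a one-off runtime scan, run at startup or from a unit test. This sketch uses stand-in versions of the attribute and base class from the post, with the attribute constructor and helper names invented for the example:

```csharp
using System;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Property)]
class INPCDependsAttribute : Attribute
{
    public INPCDependsAttribute(params string[] dependencies) { Dependencies = dependencies; }
    public string[] Dependencies { get; }
}

class ObservableBase { }

// [INPCDepends] on a type that doesn't inherit ObservableBase -- exactly the
// misuse the compiler can't flag.
class NotObservable
{
    [INPCDepends("Other")]
    public int Broken { get; set; }
}

class FineViewModel : ObservableBase
{
    [INPCDepends("Other")]
    public int Ok { get; set; }
}

static class AttributeValidator
{
    // Find every [INPCDepends] property whose declaring type is not an
    // ObservableBase subclass.
    public static string[] FindMisuses(Assembly asm) =>
        asm.GetTypes()
           .Where(t => !typeof(ObservableBase).IsAssignableFrom(t))
           .SelectMany(t => t.GetProperties(BindingFlags.Public | BindingFlags.Instance))
           .Where(p => p.IsDefined(typeof(INPCDependsAttribute)))
           .Select(p => $"{p.DeclaringType.Name}.{p.Name}")
           .ToArray();
}

class Program
{
    static void Main()
    {
        foreach (var misuse in AttributeValidator.FindMisuses(typeof(Program).Assembly))
            Console.WriteLine(misuse); // NotObservable.Broken
    }
}
```

It's still a runtime check, but failing fast in a test is the next best thing to a compile error.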
|
# ? Aug 12, 2015 02:37 |
|
Yeah that bit was part of the cycle check, and was kind of half done when I printed my notes from work. Thanks though. And that's too bad. I sort of hoped there'd be a way to make the compiler care, but I guess it makes sense that you can't.
|
# ? Aug 12, 2015 02:42 |
|
Ciaphas posted:Yeah that bit was part of the cycle check, and was kind of half done when I printed my notes from work Thanks though. What industry do you work in where you aren't allowed internet access at your workstation?
|
# ? Aug 12, 2015 02:44 |
|
Defense contractor. I don't do any poo poo (or usually any C# stuff for that matter, but things are quiet) but we all get treated pretty equally in that regard
|
# ? Aug 12, 2015 02:48 |
|
nuget in vs2015 is unusable crap :/
|
# ? Aug 12, 2015 11:29 |
|
Ciaphas posted:Is there any way to make the compiler throw up an error if that attribute is used anywhere else? Yes, there's a great way to do this, but it only works in VS2015. Look up "Roslyn Analyzers". Basically, you'll distribute your attribute and supporting types as a NuGet package, and you'll also distribute your analyzer as part of the same package. The analyzer lets you provide error/warning squiggles about incorrect use of your API.
|
# ? Aug 12, 2015 12:41 |
|
Gul Banana posted:nuget in vs2015 is unusable crap :/ Please say more? I've been working with the NuGet team on UWP stuff. Are you referring to the UI? (they know it needs improvement...) Or to project.json stuff in UWP apps? Or to project.json in ASP.NET5 apps?
|
# ? Aug 12, 2015 12:44 |
|
Speaking of ASP.NET 5, I'm having a hell of a time getting it to play nice with EF6. I'm registering my context into the request pipeline just fine, but it looks like it's not actually connecting to the DB during initialization. Unfortunately since this is all beta still I haven't found a lot of information about the best way to do this - has anyone else run into something similar?
|
# ? Aug 12, 2015 13:10 |
|
ljw1004 posted:Please say more? I've been working with the NuGet team on UWP stuff. Are you referring to the UI? (they know it needs improvement...) Or to project.json stuff in UWP apps? Or to project.json in ASP.NET5 apps? The issues i've been having are about features and performance. the new UI looks nice, actually - a bit sparse but it feels more integrated into VS than before. i'm not attempting to use any new features, just the old ones- unfortunately, nuget 3 has broken my workflow in several ways. i develop libraries in an internal (enterprise-type) ecosystem. there are solutions from which some projects are built into nuget packages, and other solutions which consume some of those packages. when i'm testing a bundle of jointly-versioned packages it's an iterative process - i build them into a 'staging' package source, then upgrade packages from that source into a product's repo, test and repeat. by the nature of library development this can expose bugs, breaking changes and api design issues, so there can be a number of iterations. problem 1: "Upgrade All" no longer exists. the functionality is straight up gone, which is a nightmare when i'm deploying different prerelease combinations of 5-10 libraries into 20-30 projects. they have internal and external dependency trees and i've been relying on the package manager to figure it all out- that power of that button was the value of nuget for me, turning declaratively specified dependencies into whatever operations are necessary to get everything into a consistent state. after enough googling to figure out that this is no longer a thing, i tried to script around the problem- *building* libraries is already automated, so it would only be a moderate hassle to update them the same way. this lead to.. problem 2: the powershell commands no longer resolve cross-package-source dependencies. this makes them useless - we have internal packages, partner/vendor packages, and the public nuget.org ecosystem. 
if i try to update an internal library which depends on e.g. microsoft.codeanalysis (because roslyn is great), it doesn't resolve- and if you were trying a multi-project command it halts the whole thing. there's no ability to continue on and apply changes to projects which *can* resolve their dependencies. that leaves me updating each project by hand each iteration, which reveals... problem 3: it's very very slow. upgrading a simple project which references 3 nuget packages, all internal, takes about half a minute during which the VS ui locks up. this is on a core i5-4790 workstation with an SSD. this is not a great scenario for nuget to deal with- we have our server feeds, my staging feed and the public feed on another continent - but previous versions of the package manager handled it far better. right now our build/push processes are still using nuget 2 because i tested 3's version of nuget.exe and that was also way slower. it's just dealing with a local network share and an http symbol server, but seems to make many more expensive roundtrip calls than nuget 2. some of these problems are listed as nuget team github issues, with milestone 3.1 attached. so i tried to update to that, following this link from the nuget blog. however, it leads to a VS Extension Gallery "This item is not yet published." page.
|
# ? Aug 12, 2015 13:38 |
|
apologies for the negativity. i'm all for the idea of project.json, .net core's package restore workflow and so on - unifying projects and packages will ultimately make my life better. it's just very annoying in the moment when an update to infrastructural software removes crucial functionality and seems basically half-baked- an impression i may be getting unfairly from the more open new development process. at the moment i have a better time with even SBT (in the Scala ecosystem) than NuGet 3.
|
# ? Aug 12, 2015 13:40 |
|
I'm just starting to use .NET / mixed mode C++, and have an assembly reference question. When I need to reference a third party assembly I just add it as a DLL to the project references, and everything works fine as a local build. However, when I want to compile on a build server, am I forced to include the referenced DLL in my source code and check it in, or is there an alternative way to pull the type information from a file which does not contain any implementation, which I can add to the repository?
|
# ? Aug 12, 2015 14:37 |
|
Gul Banana posted:Problems with NuGet 3 "problem 1: "Upgrade All" no longer exists." - Understood. The NuGet team is working on bringing back UpgradeAll right at the moment. "problem 2: the powershell commands no longer resolve cross-package-source dependencies." - slated for NuGet v3.2 "problem 3: it's very very slow." - there are a few things to do here. (1) Update the package source to V3. (2) If you have a local package source (directory) and it's slow, you can speed it up by putting it behind a server. (3) The NuGet guy wrote "Nuget3 is calling all sources in parallel rather than in order, that's a bug fix from NuGet2 that indeed makes things slower, but results in the right behavior." I don't understand this last comment but don't want to distract him for clarification any further since he should be heads-down implementing UpgradeAll... "i tried to update to milestone 3.1, following this link from the nuget blog. however, it leads to a VS Extension Gallery "This item is not yet published." page." - I let the team know. Hopefully they'll fix the blog. [edit: they have]. But it's easiest just to update NuGet within VS via Tools > Extensions and Updates, which will also get you 3.1. Edit: More notes from the guy in charge of NuGet, which I'm not expert enough in NuGet to understand: "Also note that with project.json and a better folder layout (similar to the packages folder) the update story becomes a lot better with star specifiers. Also update-package works from powershell exactly how update all/upgrade all worked from the UI. It's just a matter of adding a button for all packages (not so for updating individual packages)" ljw1004 fucked around with this message at 17:46 on Aug 12, 2015 |
# ? Aug 12, 2015 16:15 |
|
General NuGet question. Started using an IIS hosted NuGet feed running on a virtual server for libraries and I've run into a weird issue that may be due to how I'm handling my package updates. Example: Say I have a package called Library and the version is 1.0.0.1. I pull that into a program called LibraryTest and then make an update to Library afterwards and change the version to 1.0.0.2 and push to the feed. Now say I make another change to Library and the version is now 1.0.0.3, etc. Now in LibraryTest, that is currently running 1.0.0.1, I get restore issues with NuGet. However, if I update to 1.0.0.2 in LibraryTest before pushing 1.0.0.3, it seems to work. While I can update LibraryTest after each new package publish, this becomes an impossible and absurd task when you have hundreds of programs using Library. I've been removing my old packages from the packages folder in my NuGet IIS feed and adding the new one and pushing. I initially tried adding folders inside the Packages folder to hold all versions, but my packages were not picked up once pushed. How should one go about this?
|
# ? Aug 13, 2015 00:06 |
|
I've got a pretty basic best practice question. I develop almost exclusively in MVC and I like to handle all exceptions using an exception filter, or in the case of my latest project I'm using the "OnException" override on a base controller. The thing is, I want as many of these exceptions as possible to arrive with some kind of a friendly message that can be shown to the user on an error page, or returned in a JSON response, so that the message doesn't reveal sensitive details about the inner workings of the code (otherwise I may as well just stick with the yellow screen of death!) To achieve this, what I tend to do is something like this:code:
I know that the generally accepted guideline is to catch only those exceptions that you can actually handle, and that all other exceptions should be left to bubble upward. If I can move to that approach I'd love to, I would much prefer I'm doing this "the right way", but I feel like I have good reasoning behind my current approach so I would need to know how to achieve the same benefits while not catching those exceptions I can't handle.

My reasoning behind catching these exception types:

ExceptionsICanHandle: This is obvious and not at all controversial, I catch these so I can... handle them.

FriendlyException: In the try block there are often some calls to my own code where I may have already encountered an exception and packaged it up into a FriendlyException, so I don't need to repackage this, I just need to rethrow it.

Exception: This is a way for me to intercept the nasty Exception that I don't want shown to the user and transform it into a FriendlyException that provides a user-friendly message that is (more importantly) contextual. At this point I know the context of the otherwise unhandled exception and I can report on it in the logging and in the friendly message the user receives. For example if the unhandled exception occurred while trying to update a user's details, I can report back the user id and the details that were sent for updating. I can't do that if the exception bubbles all the way up and is caught at the last minute.

My problem with what I'm doing is that I'm sure I'm violating a couple of principles I hear a lot: "throw early, catch late" and "only catch exceptions you can handle" and I'm also essentially hiding the unhandled exceptions from higher levels of the application. But I'm not sure how to achieve the same usefulness without using the approach I've used. In other words, I know my approach is wrong, but I can't quite hit on the correct approach. putin is a cunt fucked around with this message at 01:27 on Aug 13, 2015 |
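Since the original snippet didn't survive, here is a hedged reconstruction of the pattern described above. FriendlyException and the catch ordering come from the post; the UserService method, messages, and the omission of the app-specific ExceptionsICanHandle catch are my invention:

```csharp
using System;

class FriendlyException : Exception
{
    public FriendlyException(string friendlyMessage, Exception inner = null)
        : base(friendlyMessage, inner) { }
}

class UserService
{
    // Stand-in for an action method body wrapped in the described pattern.
    // A real version would have catch blocks for handleable exceptions first.
    public void UpdateDetails(int userId, Action doUpdate)
    {
        try
        {
            doUpdate();
        }
        catch (FriendlyException)
        {
            // Already packaged with a user-safe message lower down: rethrow
            // as-is so it isn't double-wrapped.
            throw;
        }
        catch (Exception ex)
        {
            // Wrap with the context known *here* (the user id), before the
            // exception bubbles up to the generic OnException filter.
            throw new FriendlyException(
                $"We couldn't update the details for user {userId}.", ex);
        }
    }
}

class Program
{
    static void Main()
    {
        try
        {
            new UserService().UpdateDetails(42,
                () => throw new InvalidOperationException("db error"));
        }
        catch (FriendlyException fe)
        {
            Console.WriteLine(fe.Message); // We couldn't update the details for user 42.
        }
    }
}
```

The top-level filter then only ever shows FriendlyException.Message to the user and logs InnerException, which preserves the "catch late" logging while keeping the contextual message.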
# ? Aug 13, 2015 01:22 |
|
ljw1004 posted:fixes to nuget 3 quote:"problem 3: it's very very slow." - there are a few things to do here. (1) Update the package source to V3. (2) If you have a local package source (directory) and it's slow, you can speed it up by putting it behind a server. (3) The NuGet guy wrote "Nuget3 is calling all sources in parallel rather than in order, that's a bug fix from NuGet2 that indeed makes things slower, but results in the right behavior." I don't understand this last comment but don't want to distract him for clarification any further since he should be heads-down implementing UpgradeAll... unfortunately, I'm guessing your coworker means that all package sources are contacted rather than checking each only if the previous ones failed to resolve a dependency. this means that every operation is now requiring a round trip to America rather than to my ssd or to our LAN.. I can see why that might be a bug fix but it's definitely going to be a permanent slowdown. previously, we'd sped things up by mirroring, in our corporate feed, packages from nuget.org, and it sounds like that won't work anymore. quote:But it's easiest just to update NuGet within VS via Tools > Extensions and Updates, which will also get you 3.1. quote:Edit: More notes from the guy in charge of NuGet, which I'm not expert enough in NuGet to understand: "Also note that with project.json and a better folder layout (similar to the packages folder) the update story becomes a lot better with star specifiers. Also update-package works from powershell exactly how update all/upgrade all worked from the UI. It's just a matter of adding a button for all packages (not so for updating individual packages)" to use project.json I'd have to build new-pcls, right? I'll look into whether those can target .net 4.5. if so, might be worthwhile.
|
# ? Aug 13, 2015 04:37 |
|
One thing that I noticed (and sent a frown about) is some bad behavior in the situation where I have a custom repository that is only sometimes accessible (i.e. when connected to VPN). I was not connected to VPN and tried to install Newtonsoft.Json to a project. Fairly simple operation, right? Well, NuGet failed to do it because it could not connect to my custom repository... Do I really need to be connected all the time to all repositories? I thought it couldn't be that silly. Indeed, I did not manage to reproduce this today. Perhaps it only does that if the project already includes packages from the custom repository? Even then it is not desirable when I am not actually installing anything new from the custom repository.
|
# ? Aug 13, 2015 06:16 |
|
Re: package managers, I found Paket (http://fsprojects.github.io/Paket/) to be very lean and fast. It probably doesn't support many of the advanced scenarios you're talking about here though.
|
# ? Aug 13, 2015 09:54 |
|
|
it's a new world out there. sadly, the only one of these which supports .NET 4.5 is the bolded one - this is an xproj rather than a csproj, a DNX project. no good for my purposes since some of the libraries I'd like to build are written in VB (which DNX does not support). so, no project.json yet. i was amused to see the slightly different schemas of these files- presumably nuget uses the common 'dependencies' and 'frameworks' keys. Class Library (Package) posted:
Class Library (Portable) posted:
Class Library (Universal Windows) posted:
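The quoted project.json contents above did not survive extraction. From memory, the VS2015 "Class Library (Package)" template's file looked approximately like the following; treat the exact keys and target monikers as an assumption, not a verbatim copy of the lost quotes:

```json
{
  "version": "1.0.0-*",
  "description": "",
  "dependencies": { },
  "frameworks": {
    "dnx451": { },
    "dnxcore50": { }
  }
}
```

The 'dependencies' and 'frameworks' keys mentioned in the post are the common part; the Portable and Universal Windows variants differed mainly in which framework monikers appeared under "frameworks".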
|
# ? Aug 13, 2015 11:38 |