Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

GrumpyDoctor posted:

How do GhostDoc and Sandcastle/SHFB compare?

They're different things entirely :confused: Sandcastle compiles the inline documentation into a set of web pages or whatever. As far as I can tell, GhostDoc is good for automatically writing a comment stating that GetButts() gets the butts. Did they add compilation or something?

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Mr. Crow posted:

Anyone have experience hosting external applications in WPF? In the simplest case it's pretty trivial (e.g. notepad) but if it's between two complex apps it quickly gets out of control.

Buzzwords: HwndHost, COM, OLE(?), Win32, routing message pumps

How much control do you have over the external process? If the answer is "none", I'd recommend building your program as a shell which draws itself around the other windows; otherwise it sounds like asking for trouble.

Essential posted:

What do you guys use for installing and updating distributed/commercial applications? Not huge-scale commercial, but installed across thousands of computers around the world and able to update to the latest version. Here's what I've used:

ClickOnce for installing/updating.

InstallShield for installing. When the app needs to update, it launches an update.exe & closes so update.exe can do its thing, then update.exe re-launches the app.

InstallShield for installing, then a plugin model. When the app detects an update it can download & overwrite the file without having the extra update.exe (and doesn't have to close & re-open).

All of those have their pros and cons, but I'm really interested in what you guys have done.

We distribute our products as MSI packages (I think we use WiX), but we have the luxury of an update model where sales emails the clients and tells them to grab the new MSI when they get around to it. I've toyed around with the idea of using MEF or child AppDomains to reload code without exiting the process, mostly for services, and it seemed promising.
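For reference, the child AppDomain thing looks roughly like this - a minimal sketch with made-up names (IPlugin, PluginHost, PluginRunner), not actual code from our product:

C# code:
using System;

// Contract shared between the host and the dynamically loaded assembly.
public interface IPlugin { void Run(); }

// Must derive from MarshalByRefObject so calls cross the AppDomain boundary.
public class PluginHost : MarshalByRefObject
{
    public void Run(string assemblyPath, string typeName)
    {
        // Loads the plugin assembly into *this* (the child) domain.
        var plugin = (IPlugin)Activator.CreateInstanceFrom(assemblyPath, typeName).Unwrap();
        plugin.Run();
    }
}

public static class PluginRunner
{
    public static void RunOnce(string assemblyPath, string typeName)
    {
        var domain = AppDomain.CreateDomain("PluginDomain");
        try
        {
            var host = (PluginHost)domain.CreateInstanceAndUnwrap(
                typeof(PluginHost).Assembly.FullName,
                typeof(PluginHost).FullName);
            host.Run(assemblyPath, typeName);
        }
        finally
        {
            // Unloading the domain releases the plugin assembly so a newer
            // copy can be picked up next time, without restarting the process.
            AppDomain.Unload(domain);
        }
    }
}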

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Ender.uNF posted:

To be fair to you, this is one of the vast number of pitfalls and absurdities everyone encounters when learning. Why would the default ToString() behavior of any object be to print its type name? When is that ever useful? (hint: never).

On the contrary, it's very useful for telling me "hey moron, you need to do something with this object!"

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Careful Drums posted:

I have a problem: the default bundling in System.Web.Optimization is not cutting it.

I need to be able to pull down css/js/images from our own hosted CDN, bundle it up, then serve that as one minified request. I guess with System.Web.Optimization you can only pull down individual CDN files.

Anyone have experience or recommendations on this?

The candidates so far are to

- use grunt and a poo poo ton of dependencies to host a static site with everything bundled already. I don't hate this but I'm hesitant to introduce node.js into our all-asp.net mvc workflow
- use this bundler library, which is basically a wrapper for what I previously described
- this library called squishit, which as far as I can tell looks okay. But I'm not so sure what the hell is going on with their AssetsController class in this example.

If you go the all-static route, Microsoft has a library/tool called the Ajax Minifier and a tool which builds on it called WebGrease which you might find useful.
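Something like this is the core of it if you do the bundling at build time with AjaxMin (the file paths here are invented, and a real setup would also run CSS through MinifyStyleSheet):

C# code:
using System.IO;
using System.Linq;
using Microsoft.Ajax.Utilities;

class BundleBuilder
{
    static void Main()
    {
        // Hypothetical files already pulled down from the CDN.
        var sources = new[] { "cdn/jquery.js", "cdn/app.js" }
            .Select(File.ReadAllText);

        // Concatenate, then minify into a single payload.
        var minifier = new Minifier();
        var bundled = minifier.MinifyJavaScript(string.Join(";\n", sources));

        // Serve this one file as the single minified request.
        File.WriteAllText("static/bundle.min.js", bundled);
    }
}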

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

spiderlemur posted:

Is there a good way to integrate certain CMS features into an existing site without having to full on use a CMS? I don't want the CMS taking over the site and disrupting what has already been written, I just need a small portion of that site to support say, creating and displaying articles, with the other site pages / part of the homepage undisturbed.

Are there libraries out there that make it easy to add these kinds of features into an existing site or is the best way just to write it all myself? I'm using MVC.

I think it's possible to convert an existing MVC site into an Orchard module, maybe you could look into that?

Or if that's too complicated, why not just make parts of the site dynamically generated, and only provide editors for those specific parts?

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Uziel posted:

OK I have a monolithic Webforms application that uses EXT.Net for the front end. I am looking to convert it to MVC but need a replacement for EXT.Net for as much as possible.

I may need to keep certain parts of it as we use the Ext Calendar for a scheduling system (I'll cross that bridge when I come to it), but for the configuration stuff I want to leverage MVC models/routers/controllers but also avoid the pitfalls of having spaghetti jquery everywhere. Is there a recommended javascript framework or library that I should use? I'm probably just going to default to using Bootstrap for the design.

Right now I'm in the middle of upgrading an application which uses Ext.Net 1.5, and I would advise biting the bullet and migrating to straight-up Ext. ExtJS 5 has lots of fun new MVVM features which make development a lot easier, and a lot of pieces and pages have been a one-to-one translation from markup to components, even jumping ahead two major versions of Ext. My application was all MVC from the start though, so getting into the mindset of having a separate front end was easier. You don't need to use Sencha Cmd; it's pretty easy to structure something in MVC which works with the loader. You can PM me if you have specific questions.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Uziel posted:

I had thought about it, but I'd never get approved for the ext js license unfortunately, plus we'd still likely need to use ext.net for the calendar portion.

There's also a GPL license, if you dig deep enough on Sencha's site. Also, are you referring to the big Google Calendar-style view? It's been a while, but I seem to recall that's actually a third-party product.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Uziel posted:

Oh. Does my code have to be open source in order to take advantage of it? It's for an internal tool, not software that is sold.

If you are using ext for the View, how did you get around duplicate models?

Yeah, it's a Google Calendar-style event scheduler that is from the Ext.NET team, and the primary reason we went with that over ext js:
http://examples1.ext.net/#/Calendar/Overview/Basic/

The GPL question depends on the type of product you're building. Internal-only tools should be in the clear, because the code is company IP and the only users who could request the source already work there anyway, but IANAL.

Having duplicate models is unfortunately a fact of life - I have database objects, then .NET model objects which are returned and consumed by web APIs (in some situations, you might think of them like the VM part of MVVM), and then Ext representations of those models. With reflection it would be possible to generate the JS from the .NET classes and save yourself from maintaining two copies; I think Ext.Net does that anyway. The view models you create for Ext are specific to components, though; the models are what will be sent to and received from the server.
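If you did want to go the reflection route, the core of it is pretty small - here's the kind of thing I mean (the type mapping and naming are simplified guesses on my part, not what Ext.Net actually emits):

C# code:
using System;
using System.Linq;

static class ExtModelGenerator
{
    // Emits an Ext.define() model declaration for a .NET class so the field
    // list only has to be maintained in one place.
    public static string Generate(Type type)
    {
        var fields = type.GetProperties()
            .Select(p => string.Format("        {{ name: '{0}', type: '{1}' }}",
                p.Name, MapType(p.PropertyType)));

        return "Ext.define('App.model." + type.Name + "', {\n" +
               "    extend: 'Ext.data.Model',\n" +
               "    fields: [\n" + string.Join(",\n", fields) + "\n    ]\n});";
    }

    // Crude .NET -> Ext field type mapping; extend as needed.
    static string MapType(Type t)
    {
        if (t == typeof(int) || t == typeof(long)) return "int";
        if (t == typeof(bool)) return "boolean";
        if (t == typeof(DateTime)) return "date";
        return "string";
    }
}
Run that against your API model classes and dump the output into a script you include on the page.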

That calendar looks like the one from http://ext.ensible.com.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

epalm posted:

When I suck in a 3rd party library via NuGet, and that library does some logging, do I generally want to see their logs in my logs? If the answer is yes, what if the 3rd party lib uses a different logging mechanism, like log4net? If I have my facts straight, DiagnosticsTraceAppender will just act as a pass-through to pipe log4net logs into System.Diagnostics logs, so everything ends up in the same place.

How is this usually handled?

Yes, that will insert other libraries' log messages into your trace stream. Doesn't log4net include a trace appender already though?

With log4net, one configuration applies to all logging instances in the entire AppDomain. (For example, as part of my own application's log configuration, I set the NHibernate loggers to only be included at the error level and above.)
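Concretely, the NHibernate bit looks something like this - shown programmatically for brevity, though I actually do it in the XML config:

C# code:
using log4net;
using log4net.Core;
using log4net.Repository.Hierarchy;

static class LoggingSetup
{
    public static void QuietNHibernate()
    {
        // Grab the underlying logger for the "NHibernate" namespace and raise
        // its threshold so only Error and above reach the appenders.
        var nhLogger = (Logger)LogManager.GetLogger("NHibernate").Logger;
        nhLogger.Level = Level.Error;
    }
}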

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Newf posted:

The views are alternative representations of math problems.

EG, your basic multiplication problem can be put to people several obvious ways:

The idea is to feed this to children, "do big data stuff" to gain information about the relative merits of different question/view schemes, feed a better refined diet of it to children, and so on.

edit: It's not necessarily the case that '100s of views' are required for a particular question at a particular time, but the idea is to be running generalized A/B/C... testing against the existing stockpile in order to find the most useful ones and toss the least useful.

You should just keep them on the file system. When you're returning a View action result you can specify the name of the view file. If you need more control, IIRC you'll need to implement a view engine to customize the path lookup logic.
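To sketch what I mean (the controller, model, and folder names here are made up):

C# code:
using System.Linq;
using System.Web.Mvc;

// Hypothetical model: each stored question knows which view file renders it.
public class Question
{
    public string ViewName { get; set; }
    public string Body { get; set; }
}

public class QuestionController : Controller
{
    public ActionResult Show(int id)
    {
        // However you actually look the question up; hard-coded for the sketch.
        var question = new Question { ViewName = "Multiplication_Grid", Body = "3 x 4" };

        // Overload of View() that takes the view name explicitly.
        return View(question.ViewName, question);
    }
}

// Only needed if you want the lookup to go beyond the standard Views folders.
public class QuestionViewEngine : RazorViewEngine
{
    public QuestionViewEngine()
    {
        // Standard locations plus one flat folder of question views.
        ViewLocationFormats = ViewLocationFormats
            .Concat(new[] { "~/QuestionViews/{0}.cshtml" })
            .ToArray();
    }
}
Register the custom engine in Application_Start with ViewEngines.Engines.Add(new QuestionViewEngine()) if you go that far.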

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy
Aren't those array functions evaluated lazily? I wonder if stepping through in the debugger forces an evaluation that you wouldn't otherwise see until later in normal execution.
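i.e. something like this - the Select doesn't run until something enumerates it, and inspecting the variable in the debugger counts as enumerating:

C# code:
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var source = new[] { 1, 2, 3 };

        var query = source.Select(x =>
        {
            Console.WriteLine("evaluating " + x);   // side effect to show when it runs
            return x * 10;
        });

        Console.WriteLine("query built, nothing evaluated yet");

        foreach (var item in query)                 // evaluation happens here
            Console.WriteLine(item);
    }
}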

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Che Delilas posted:

The full User entity contains fields like PasswordHash and SecurityToken, which I don't need to display to the client. If I return View(context.Users.ToList()); then those fields will be transmitted to the client even if the view doesn't make use of them, right? Can someone who knows how do anything evil with those values?

I know about one-way encryption and password hashing, so I don't need the 101 version, but security is such a complex topic and I've never taken the time to really dive into it, so I'm curious. Oh, and I know how to render the issue moot by using a ViewModel containing only the values the View cares about; it really is just an academic question at the moment.

No, the values only go as far as the view rendering code, and go away once that finishes. If you never use them in the view, they will never go to the client.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy
I'm using Autofac with an MVC project. I need to provide a value from HttpApplication.Request to the container - it needs to be used elsewhere in the hierarchy, possibly from components which don't know anything about ASP.NET. Something like:

code:
protected void Application_Start()
{
    // register a bunch of things including ButtModule
    builder.RegisterInstance(Request.UserHostAddress).Named<string>("RequestAddress");
    container = builder.Build();
}

// in a module far, far away
public class ButtModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.Register(c => new Butt(c.ResolveNamed<string>("RequestAddress"))).As<IButt>().InstancePerDependency();
    }
}
Unfortunately, the request object is not available in Application_Start - it throws an exception, "Request is not available in this context". I tried to be sneaky and register it with a lambda instead, but ran into the same problem. It seems like it would be bad to build the container in BeginRequest, but I can't think of anywhere else I could access the request data and provide it to the container before it's needed to resolve dependencies. Ideas?

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy
The problem is that ButtModule is dynamically loaded, so Application_Start will have no idea it even exists (usually via RegisterAssemblyModules on a dynamically loaded assembly). Likewise, ButtModule does not know that it's being used in ASP.NET (or at least I'd like to avoid explicitly referencing/checking it). Can Autofac inject into a module constructor? I could maybe find a way to pass those values as parameters.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Wardende posted:

If ButtModule doesn't know it's being used in ASP.NET, why does it need a user host address? Why shouldn't it reference System.Web if it deals in IP addresses? Seems natural to me...

In this case it needs a string identifying the origin of the request for auditing purposes. When used from ASP.NET I use the request IP address; from WCF I have to get the client IP address from the operation context; when used directly I would pass the string "local".

Destroyenator posted:

I think this does what you're looking for? http://stackoverflow.com/a/15542425/291137
You may have to add the Autofac.Mvc package if you haven't already.

I'll give it a shot at work tomorrow, thanks.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Wardende posted:

Perhaps you need an, uh, IAuditIdentifierProvider and three implementations (one which retrieves the IP address from System.Web, one which gets the client IP, and one which provides "local"), and add that as a dependency to the ButtModule instead of a string. Then you can register the correct implementation in each App_Start.

Yeah, this ended up being a lot simpler than screwing around with registering delayed values or whatever. Thanks, I'm still new to the idea of structuring an application properly for DI.
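For anyone searching later, what I ended up with is roughly this (names simplified from the real thing):

C# code:
using System.Web;

public interface IAuditIdentifierProvider
{
    string GetIdentifier();
}

// ASP.NET flavour - resolved while a request is in flight.
public class WebAuditIdentifierProvider : IAuditIdentifierProvider
{
    public string GetIdentifier()
    {
        return HttpContext.Current.Request.UserHostAddress;
    }
}

// Plain/local flavour for direct use.
public class LocalAuditIdentifierProvider : IAuditIdentifierProvider
{
    public string GetIdentifier()
    {
        return "local";
    }
}

// In the web project's container setup (the WCF host registers its own
// implementation instead):
//   builder.RegisterType<WebAuditIdentifierProvider>()
//          .As<IAuditIdentifierProvider>()
//          .InstancePerRequest();
//
// Butt then takes an IAuditIdentifierProvider dependency instead of a named
// string, and the module never has to know about System.Web.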

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

GrumpyDoctor posted:

Does anybody know why an application built against log4net 1.2.13 would think it needs log4net 1.2.11 as well and fail to run? It happens on some computers but not others.

Is one of its dependencies referencing 1.2.11? IIRC that was the version where log4net switched to a new code signing key, which broke things when some assemblies in an AppDomain referenced a copy signed with the old key and some referenced one signed with the new. As for it happening on some computers and not others, it could be that the old version is being picked up from the GAC on some of them.

e: I realize Munkeymon already asked this, but be sure to check third-party assemblies too.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

TheEffect posted:

So CopyDirectory doesn't do much really, like copying permissions over and things like that. I've found ways around most of these problems thanks to you guys but is there any way to copy the source folder's icon over? Or rather, whatever attribute handles the location of which icon to use for the directory?

I believe that's handled with desktop.ini inside the directory. I think it's hidden and possibly even system by default.
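Untested, but something along these lines is what I'd try (the ReadOnly bit is from memory - the shell only honours desktop.ini if the folder itself has the ReadOnly or System attribute set):

C# code:
using System.IO;

static class FolderIcon
{
    // Copies the source folder's desktop.ini (which holds the custom icon
    // settings) to the destination and restores the attributes Explorer expects.
    public static void CopyIcon(string sourceDir, string destDir)
    {
        var srcIni = Path.Combine(sourceDir, "desktop.ini");
        if (!File.Exists(srcIni))
            return;

        var destIni = Path.Combine(destDir, "desktop.ini");
        File.Copy(srcIni, destIni, true);   // assumes no attribute-protected copy already exists
        File.SetAttributes(destIni, FileAttributes.Hidden | FileAttributes.System);

        // Mark the folder so the shell actually reads desktop.ini.
        var dirInfo = new DirectoryInfo(destDir);
        dirInfo.Attributes |= FileAttributes.ReadOnly;
    }
}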

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy
This is the first time I've used async methods in a long while, and I don't know which of two alternative approaches is better. I have two operations to do: read a file from disk, and deserialize it from JSON. Reading the file can potentially be done asynchronously, but AFAIK deserialization with Newtonsoft.JSON is only synchronous.

C# code:
// Method 1
private Task<Butt> ReadButtAsync(string path, CancellationToken cancellationToken)
{
    return Task.Run(() =>
    {
        using(var reader = new StreamReader(path))
        using(var jsonReader = new JsonTextReader(reader))
        {
            return new JsonSerializer().Deserialize<Butt>(jsonReader);
        }
    }, cancellationToken);
}
C# code:
// Method 2
private async Task<Butt> ReadButtAsync(string path, CancellationToken cancellationToken)
{
    string content;
    using(var reader = new StreamReader(path))
        content = await reader.ReadToEndAsync();

    using(var jsonReader = new JsonTextReader(new StringReader(content)))
    {
        return new JsonSerializer().Deserialize<Butt>(jsonReader);
    }
}
My intuition from the last time I heavily used C# was that method #2 is preferred, because everything should be made async as far down as possible. But method #1 was what I first came up with, and it seems to work fine so far. Does #1 have potential for deadlock even if the synchronous I/O is wrapped in an asynchronous block? (Context here is that I'm building a web API on ASP.NET Core, if there are any considerations specific to that environment and/or Kestrel.)

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

hirvox posted:

#2 should perform better under higher load, because the thread can be released to perform other tasks while the stream is being read. #1 will tie up a thread until both the stream read and the deserialization are complete.

BTW, Newtonsoft.JSON used to have asynchronous methods, but they were wrappers similar to method 1. When designing a public API, the method shouldn't lie about being asynchronous. If it's doing a significant amount of synchronous processing, keeping the method signature synchronous will allow the caller to decide whether they'll want to use a separate thread or not.

raminasi posted:

hirvox covered why #2 is preferred, but I’ll add that the async deadlocking footgun that I think you’re referring to manifests when you try to jam asynchrony into a synchronous workflow by using Result or Wait, rather than the opposite, which is what you’re doing here.

Got it, thanks! I'll go with the second. And yeah, now that I think of it, I had my fear of deadlocks backwards.
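For anyone reading along later, the footgun raminasi means looks like this on classic ASP.NET (the ReadButtAsync below is a stand-in for my earlier method; ASP.NET Core has no request SynchronizationContext, so the same blocking call wouldn't deadlock there - it would just waste a thread):

C# code:
using System.Threading;
using System.Threading.Tasks;
using System.Web.Mvc;

public class Butt { }

public class ButtsController : Controller
{
    // Stand-in for the ReadButtAsync from my post above.
    private async Task<Butt> ReadButtAsync(string path, CancellationToken ct)
    {
        await Task.Delay(10, ct);
        return new Butt();
    }

    public ActionResult IndexBlocking()
    {
        // Bad: the await inside ReadButtAsync wants to resume on this request's
        // context, but .Result is blocking the thread that owns it - deadlock.
        var butt = ReadButtAsync("butt.json", CancellationToken.None).Result;
        return View(butt);
    }

    public async Task<ActionResult> Index()
    {
        // Fine: async all the way up, nothing ever blocks on the task.
        return View(await ReadButtAsync("butt.json", CancellationToken.None));
    }
}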

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

The NPC posted:

We have made an app that sends email notifications to users under certain criteria. E.g.: Equipment needs to be returned, certificates expiring. These are set up so if an action is required, and a user doesn't do $thing after $timeperiod, their manager gets CC'd, and eventually we open up a ticket (Service-Now).

Now Management is saying that people "don't read emails" (true), and want some other way to get a more immediate(?) or maybe consistent response. Has anyone had success doing something like this before?

Fake edit: The processes that don't escalate to creating a ticket at the end are the ones we are getting complaints about. I'll have to see if that is an option.

I think you answered your question at the end there. The next step after a ticket is tangible consequences for not taking the action or not following up on the ticket.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

fuf posted:

Yeah sorry I think it's because I'm imagining this halfway thing like the Plex and SABnzbd examples I gave. Like it's an executable that you run locally but you access the UI through your browser by going to localhost:4444 or whatever. I'm not sure whether that's still called a webapp?

AFAIK applications like these usually solve that problem either by having the user type in the path relative to where the server is running, or by providing a custom file browser component backed by an API which lists the server's file system. This is because they don't necessarily run on the same machine as the user's web browser.
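To make the second option concrete, the server-side piece can be as small as this Web API sketch (controller name and route are made up, and a real version would absolutely need to whitelist/sanitise the path):

C# code:
using System.Collections.Generic;
using System.IO;
using System.Web.Http;

public class BrowseController : ApiController
{
    // GET api/browse?path=C:\Media
    // Lists the entries of a directory on the machine the server runs on, so a
    // browser-side file picker can drill down even when the server is remote.
    public IEnumerable<string> Get(string path)
    {
        return Directory.EnumerateFileSystemEntries(path);
    }
}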
