100 degrees Calcium
Jan 23, 2011



I'm pretty envious. I write unit tests for my work, but I wouldn't say it ever really clicked for me. Like, the tests are there and they test what I need to test, but I wouldn't go as far as to say that my development is test-driven.


New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

aBagorn posted:

So when you guys (in the last thread, or maybe it was somewhere else) said that at some point unit testing would just "click" and all of a sudden completely make sense and you will never want to code a different way again...


You guys were right. Holy poo poo. TDD is like crack. Every time another test goes green I get a little buzz.


Thanks again goons :glomp:

Sounds like something I would say.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
Unfortunately it hasn't clicked for me yet either. What kind of project were you guys working on when it did? Also, can you post some (non-trivial) example tests from your code? I always get the feeling that I'm not doing it right.

Okita
Aug 31, 2004
King of toilets.
I think the idea is to take the requirements and start to build out the tests one by one. As each fails, you write a piece of code that will make it pass, and move on to the next. The idea is at the end you'll have written minimal code that passes all the tests and satisfies the requirements.
It's more useful in some domains than others.
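To make the loop concrete, here's a minimal sketch of one red-green iteration. The requirement ("a discount never exceeds 50%") and all the names are invented for illustration, in NUnit style to match examples later in the thread:

C# code:
```csharp
using System;
using NUnit.Framework;

// Step 2 (green): just enough code to make the test below pass.
public static class Pricing
{
    public static decimal ApplyDiscount(decimal price, decimal discount)
    {
        return price * (1m - Math.Min(discount, 0.50m));
    }
}

[TestFixture]
public sealed class PricingTests
{
    // Step 1 (red): this test is written first and fails until ApplyDiscount exists.
    [Test]
    public void ApplyDiscount_NeverExceedsFiftyPercent()
    {
        Assert.AreEqual(50m, Pricing.ApplyDiscount(100m, 0.80m));
    }
}

// Step 3: refactor with the green test as a safety net, then write the next failing test.
```

Each pass through red-green-refactor adds one requirement's worth of code and no more.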

Usually my web dev clients like to change requirements/business logic like the wind, sometimes with massive sweeping changes, so I don't wanna write/rewrite tests every time. Couple that with the fact that I've built sites like theirs a million times before and it can end up slowing me down rather than helping.
I will still write tests for particularly complicated or out-of-the-ordinary business logic, but not before I try to reason with them on how and why to simplify things if possible.

[rant]
One example: I have a client right now who has Excel spreadsheets that use formulas to calculate some things, and they asked me if I can hook up their .NET site to talk to the Excel spreadsheets. I told them it was possible with the Excel interop but that it would be a needlessly complicated way to solve the problem. They pushed back saying Excel is what their people know, but I'll either convince them to simplify or quote an exorbitant amount of hours for the work.

At some point you gotta put your foot down. No client is gonna convince me that a bunch of simple formulas in Excel sheets can't be extracted, simplified, and displayed on a web interface that allows them to tweak the formulas in a way that is easier than whatever they were doing before. Yeah, I might be kind of overstepping my role, but you know what, it makes my life easier and it makes their life easier (even though they might not realize it) to have those formulas stored in their DB and accessible via web interface rather than in random Excel files.
[/rant]

EssOEss
Oct 23, 2006
128-bit approved
My main problem for a long time was writing integration tests. Let's imagine I am exposing a web API that frobulates doodads. Do a POST request to http://example.com/Doodads/1/Frobulate and it does its thing and returns a result.

I was testing these features by doing just that - make an HTTP request to the local installation and check that the doodad got frobulated by checking the HTTP response and maybe also peeking into the database. This is a valid test to ensure that different parts of the application work well in integration but is a horrible way to actually test the functionality that is involved because there are many different parts of functionality involved, each of which could influence the application: the web client configuration, the web server configuration, filesystem security settings on any accessed files, the database server configuration, the database schema, the database data, the configuration files for the application, the parsing code of the configuration, the validation code of the configuration, the security mechanisms that authenticate a request, leftover data from me developing it for the last few days, leftover data from the previous 5 test cases executed as part of the same test run and then finally the actual logic represented by the written code.

That's a lot of dependencies to wrap into one pass/fail signal! I never really got the confidence to say that my code worked because I never knew whether any of those factors could be giving me false results. Indeed, there were many times when the same test failed either on a teammate's PC or on the build server.

I have recently corrected this deficiency. The above integration tests still exist but they only exist to verify that the different pieces work correctly together. Each piece is also tested independently of the whole chain, giving me the confidence that I need in order to rely on my tests as an indicator of quality rather than circumstance.

Of course, there were reasons it was not natural to write such tests before! Code must be written in a certain way to be tested; I would not exactly characterize it as writing code for testing, though that is how I have heard some call it. Rather, the code must be designed so that each component is capable of working in isolation. If your business logic executes SQL statements or references file paths, it is doomed to be untestable by anything other than integration tests. Instead, each external dependency must be hidden behind an interface - my test that verifies the business logic will provide its own test data through this interface, instead of having the database provide it as a response to SQL statements; my test implements an IProductsDataProvider that is used by the business logic to retrieve a ProductsDataXml string instead of the latter just reading an XML file.
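Here is a minimal sketch of that IProductsDataProvider idea; the ProductCounter class, the method names, and the XML shape are invented for illustration, but the structure is the point:

C# code:
```csharp
using System;

// The external dependency (a file, a database) is hidden behind an interface.
public interface IProductsDataProvider
{
    string GetProductsDataXml();
}

// The business logic sees only the interface; it has no idea where the XML comes from.
public sealed class ProductCounter
{
    private readonly IProductsDataProvider _provider;

    public ProductCounter(IProductsDataProvider provider)
    {
        _provider = provider;
    }

    public bool HasProducts()
    {
        return _provider.GetProductsDataXml().Contains("<product");
    }
}

// A test supplies its own data through the interface instead of touching real I/O.
public sealed class FakeProductsDataProvider : IProductsDataProvider
{
    public string GetProductsDataXml()
    {
        return "<products><product id=\"1\" /></products>";
    }
}
```

Production code passes in the real file-reading implementation; the test passes in the fake, so the business logic runs in complete isolation.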

In other words, isolation was the main innovation I needed to make tests really click for me. There are other areas that also need attention but this is by far the most important one for me and one that I often see others fail to properly care for.

Here is an example of one of my real tests that is representative of my projects. This is a piece of business logic under test; normally, it is a part of a large chain of code execution as described above but here I am just testing one method of one class, substituting the rest of the universe with placeholder logic and data used only for testing. At the start, you see setup and teardown methods that set up the simulation of the entire web service infrastructure using placeholders. This logic is shared by all the test cases that test the web service functionality. Test-specific placeholders and overrides are defined in each test separately. Note that the entire simulation is re-created for each test, to ensure tests are isolated from each other.

code:
[TestFixture]
public sealed class MicrosoftFrobulationManagerTests
{
	private IWingdingInfoSession _wingdingInfoSession;

	[SetUp]
	public void SetupWebServiceTest()
	{
		// The request and response themselves are not used by any of the code under test, at least for now.
		// HttpContext and related objects are mainly used to carry state through different parts of the application.
		HttpContext.Current = new HttpContext(new HttpRequest(null, "http://example.com", null), new HttpResponse(null));

		// This container is set as the per-request container.
		HttpContext.Current.Items[Constants.InjectionContainerKey] = InjectionContainer;

		// Some meaningless substitutes that remove functionality that is not cared about in most tests.
		var defaultLoggerFactory = Substitute.For<ILoggerFactory>();

		defaultLoggerFactory.GetCurrentClassLogger().Returns(Substitute.For<ILogger>());
		defaultLoggerFactory.GetLogger(null).ReturnsForAnyArgs(Substitute.For<ILogger>());

		InjectionContainer.RegisterInstance(defaultLoggerFactory, new ContainerControlledLifetimeManager());
		InjectionContainer.RegisterInstance(Constants.ReportingLoggerInjectionKey, Substitute.For<ILogger>(),
			new ContainerControlledLifetimeManager());

		// The standard server certificate.
		var certificateDataSource = Substitute.For<ICertificateDataSource>();
		certificateDataSource.CertificateData.Returns(Resources.ServerCertificate);
		InjectionContainer.RegisterInstance(certificateDataSource);

		// A real certificate parser.
		InjectionContainer.RegisterType<IWingdingCertificate,
			RealWingdingCertificate>(new ContainerControlledLifetimeManager());

	// Hardcoded wingding info.
		var wingdingInfoDataSource = Substitute.For<IWingdingInfoDataSource>();
		wingdingInfoDataSource.WingdingInfo.Returns(Resources.WingdingInfo);
		InjectionContainer.RegisterInstance(wingdingInfoDataSource);

		// A real session, though. Otherwise the Wingding Rapture SDK will not work!
		InjectionContainer.RegisterType<IWingdingInfoSession,
			RealWingdingInfoSession>(new ContainerControlledLifetimeManager());
		_wingdingInfoSession = InjectionContainer.Resolve<IWingdingInfoSession>();
		_wingdingInfoSession.Open();
	}

	[TearDown]
	public void CleanupWebServiceTest()
	{
		_wingdingInfoSession.Dispose();

		// Clean up any things that might still be using HttpContext.
		InjectionContainer.Dispose();

		HttpContext.Current = null;
	}
		
	[Test]
	public void HandleFrobulation_WithValidRequest_FrobulatesDoodad()
	{
		// Set up a dummy frobulator that provides fixed frobulation data for every request.
		var doodadFrobulator = Substitute.For<IFrobulationRequestHandler>();
		doodadFrobulator.WhenForAnyArgs(x => x.HandleFrobulationRequest(null, null)).Do(x =>
		{
			var frobulationResponse = x.Arg<IFrobulationResponse>();
			// Minimal set of data needed to frobulate a doodad.
			frobulationResponse.Foo = new byte[16];
			frobulationResponse.Bar = new Guid("AC78A390-1489-4E6E-B422-AB62C3610D4F");
		});

		InjectionContainer.RegisterInstance(doodadFrobulator);

		var frobulationManifest = new XmlDocument();
		frobulationManifest.LoadXml(Resources.ValidFrobulationManifest);

		// The part under test - the Microsoft frobulation manager.
		var frobulationManager = InjectionContainer.Resolve<MicrosoftFrobulationManager>();
		var result = frobulationManager.Frobulate(frobulationManifest);

		// We expect frobulation to return a valid result that includes a malfrobodon.
		Assert.IsNotNull(result);
		Assert.IsTrue(result.OuterXml.IndexOf("<malfrobodon>", StringComparison.InvariantCultureIgnoreCase) != -1);
	}
}

EssOEss fucked around with this message at 22:46 on Dec 15, 2014

Okita
Aug 31, 2004
King of toilets.

EssOEss posted:

My main problem for a long time was writing integration tests. Let's imagine I am exposing a web API that frobulates doodads. Do a POST request to http://example.com/Doodads/1/Frobulate and it does its thing and returns a result.

I was testing these features by doing just that - make an HTTP request to the local installation and check that the doodad got frobulated by checking the HTTP response and maybe also peeking into the database. This is a valid test to ensure that different parts of the application work well in integration but is a horrible way to actually test the functionality that is involved because there are many different parts of functionality involved, each of which could influence the application: the web client configuration, the web server configuration, filesystem security settings on any accessed files, the database server configuration, the database schema, the database data, the configuration files for the application, the parsing code of the configuration, the validation code of the configuration, the security mechanisms that authenticate a request, leftover data from me developing it for the last few days, leftover data from the previous 5 test cases executed as part of the same test run and then finally the actual logic represented by the written code.

That's a lot of dependencies to wrap into one pass/fail signal! I never really got the confidence to say that my code worked because I never knew whether any of those factors could be giving me false results. Indeed, there were many times when the same test failed either on a teammate's PC or on the build server.

Gonna play devil's advocate here:

I'm not sure what leftover data means exactly, but wouldn't that last part (the same test failing on a teammate's PC or on the build server) indicate a configuration or database issue? (by configuration I also mean security stuff)
I mean, if you look at it in terms of what changed between the code running on your machine vs. the code running on those machines, all signs should point to configuration/database issues.

A quick and dirty way to check code in isolation is to use the VS debugger. Step over the specific piece of code and change some values/parameters to see what happens when you plug in different things. You can either introduce some dummy/test parameters or you can change values in the debugger while stepping through code using the watch/autowatch windows.

To take this example without unit tests, if I had a specific well-defined XML format I was looking for, I would do a few debugger runs with different versions of the XML, some malformed, some correct, etc. and see what the piece of code in question comes up with. I would hit it with as many different XML inputs as I needed to feel comfortable that that piece of code works correctly.

Obviously this technique isn't as tangible, reusable, or visible as unit tests. But if I know what's going into a piece of code and what's supposed to come out and it looks good, I'll move on to other parts.

Jewel
May 2, 2009

TDD clicked for me when I wrote a proper vector function extension for SFML.Net, and seeing all the lights just work was really satisfying, and I knew exactly what went wrong because the tests were all simple enough. I imagine it gets a bit less satisfying in a huge architecture, but for personal projects it's good to know nothing is hosed up. I tend to work in games though, so I don't get to test things as often as I'd like; it usually comes down to visually debugging things, because there's no real way to test if physics was pushing you up a ledge properly.

aBagorn
Aug 26, 2004
Evil Sagan, I'll put together a couple examples tomorrow morning.

I started with really easy stuff, unit testing CRUD stuff (make sure GetAllFoos actually gets all the Foos) and then worked on actual business logic, like making sure the Foos are Frobbled and the like.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Not every project lends itself to TDD, especially if you're entering unknown territory. I was just doing something where I had never used any of the frameworks or libraries period, let alone in concert. Thus, my method of development was "make something just loving work", then "make something else just loving work". I didn't have any clue about what my end goal would be, how classes would be composed (or decomposed), or anything like that. TDD would have been a terrible choice for me -- I would have been writing tests for scenarios that might not even work or be valid, and I would have had to constantly be rewriting huge swaths of tests as I figured out how to plug all of the pieces together into some sort of coherent architecture, which would have killed my enthusiasm for learning any of it.

For example, I found out late in the game that SignalR doesn't play nice with async event handlers. If I'd done TDD on the object model, I would have written the whole thing in isolation, with a beautiful suite of tests that proved out all of this functionality, which would have had to be totally thrown out and rewritten when I went to hook it up to SignalR.

What I ended up with was, in my mind, a functional proof of concept. I figured out how to do all of the things I wanted to do and get all of the pieces to play nice together. Attempt #2 will be TDD from the ground up, including the AngularJS stuff. Angular is nice but its IOC implementation is kind of a mindfuck. I'm still trying to wrap my head around how to do IOC with AngularJS controllers and Jasmine. I think I need to start with a calculator backed by a web service or something silly like that.

New Yorp New Yorp fucked around with this message at 03:00 on Dec 16, 2014

100 degrees Calcium
Jan 23, 2011



aBagorn posted:

Evil Sagan, I'll put together a couple examples tomorrow morning.

I started with really easy stuff, unit testing CRUD stuff (make sure GetAllFoos actually gets all the Foos) and then worked on actual business logic, like making sure the Foos are Frobbled and the like.

Nice, thank you.

For my next project, I'm gonna try to break down the tasks into the smallest, most discrete units I can (like, connecting to a database to begin with), build tests for each one, and see how it goes just building the application to pass the tests.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
I really enjoy unit testing when what you're creating is inherently testable. For example, I wrote a dumbed-down BBCode parser for a Xamarin iPad app that first converted BBCode to an intermediate object representation, then from that to iPad specific code for rendering. This is an exceptionally easy thing to test and work with in a TDD manner, since it's solely inputs and outputs with no dependencies. It gets a lot grosser to test, in my mind, when you have logic that involves checking user permissions, pulling data from a database (especially with partially populated object graphs a la Entity Framework), and other things that Real™ LOB applications have to do.
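Something like this toy version (far simpler than a real BBCode parser, and it assumes well-formed input) shows why pure input/output code is so pleasant to test: every case is just a string in and nodes out, no mocks anywhere.

C# code:
```csharp
using System;
using System.Collections.Generic;

// Hypothetical intermediate representation: a flat list of text and bold runs.
public abstract class Node
{
    public string Text = "";
}

public sealed class TextNode : Node { }
public sealed class BoldNode : Node { }

public static class BbCodeParser
{
    public static List<Node> Parse(string input)
    {
        var nodes = new List<Node>();
        int i = 0;
        while (i < input.Length)
        {
            if (input.IndexOf("[b]", i, StringComparison.Ordinal) == i)
            {
                // Bold run: everything up to the matching close tag.
                int end = input.IndexOf("[/b]", i, StringComparison.Ordinal);
                nodes.Add(new BoldNode { Text = input.Substring(i + 3, end - i - 3) });
                i = end + 4;
            }
            else
            {
                // Plain text run: everything up to the next open tag (or the end).
                int next = input.IndexOf("[b]", i, StringComparison.Ordinal);
                if (next < 0) next = input.Length;
                nodes.Add(new TextNode { Text = input.Substring(i, next - i) });
                i = next;
            }
        }
        return nodes;
    }
}
```

A test is just Parse("hello [b]world[/b]") followed by assertions on the two resulting nodes; adding a new tag means adding a new input/output pair.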

Gul Banana
Nov 28, 2003

EssOEss posted:

Yes, an upgrade install of Windows will leave any existing Visual Studio installation in an operational state, barring exceptional circumstances.

cool, that should help me avoid having to dig up new MSDN licenses

Mr. Crow
May 22, 2008

Snap City mayor for life

Ithaqua posted:

Not every project lends itself to TDD, especially if you're entering unknown territory. I was just doing something where I had never used any of the frameworks or libraries period, let alone in concert. Thus, my method of development was "make something just loving work", then "make something else just loving work". I didn't have any clue about what my end goal would be, how classes would be composed (or decomposed), or anything like that. TDD would have been a terrible choice for me -- I would have been writing tests for scenarios that might not even work or be valid, and I would have had to constantly be rewriting huge swaths of tests as I figured out how to plug all of the pieces together into some sort of coherent architecture, which would have killed my enthusiasm for learning any of it.

For example, I found out late in the game that SignalR doesn't play nice with async event handlers. If I'd done TDD on the object model, I would have written the whole thing in isolation, with a beautiful suite of tests that proved out all of this functionality, which would have had to be totally thrown out and rewritten when I went to hook it up to SignalR.

What I ended up with was, in my mind, a functional proof of concept. I figured out how to do all of the things I wanted to do and get all of the pieces to play nice together. Attempt #2 will be TDD from the ground up, including the AngularJS stuff. Angular is nice but its IOC implementation is kind of a mindfuck. I'm still trying to wrap my head around how to do IOC with AngularJS controllers and Jasmine. I think I need to start with a calculator backed by a web service or something silly like that.

I find this to be the case more often than not with me. I love writing with unit testing in mind but it seems every time I tried to do TDD it was probably just a poor choice, either because I'm not sure where I'm going end up (new frameworks/features/etc.) or because it hasn't 'clicked' for me either.

aBagorn
Aug 26, 2004
Ok as promised. I'm using Microsoft Test and Moq. And like I said, this is where I started. It made doing the harder ones easier:

Here's a simple test asserting that calling the GetActiveFoos method will only return Foos who have isActive = true
C# code:
[TestMethod]
public void GetActiveFoos_ShouldNotReturnInactive()
{
    //setup
    var mockSet = SetupMockFooDbSet();
    // mockContext is a Mock<FooContext> field set up elsewhere in the fixture.
    mockContext.SetupGet(c => c.Foos).Returns(mockSet.Object);
    var service = new FooService(mockContext.Object);

    //act
    var foos = service.GetActiveFoos();

    //assert
    Assert.AreEqual(4, foos.Count);
}
The SetupMockFooDbSet takes a prefilled list of Foos (10 in total, 4 active) and sets up all the EF behind the scenes stuff to mock it out.
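For anyone wondering what a helper like SetupMockFooDbSet might look like: the usual Moq pattern for Entity Framework 6 (this is a sketch following the documented EF testing approach, with the Foo entity invented) is to back the mocked DbSet with an in-memory IQueryable:

C# code:
```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using Moq;

// Hypothetical entity matching the tests above.
public class Foo
{
    public bool IsActive { get; set; }
}

public static class MockHelpers
{
    public static Mock<DbSet<Foo>> SetupMockFooDbSet(IEnumerable<Foo> foos)
    {
        var data = foos.AsQueryable();
        var mockSet = new Mock<DbSet<Foo>>();

        // Wire the mock's IQueryable plumbing to the in-memory list so that
        // LINQ queries like context.Foos.Where(f => f.IsActive) just work.
        mockSet.As<IQueryable<Foo>>().Setup(m => m.Provider).Returns(data.Provider);
        mockSet.As<IQueryable<Foo>>().Setup(m => m.Expression).Returns(data.Expression);
        mockSet.As<IQueryable<Foo>>().Setup(m => m.ElementType).Returns(data.ElementType);
        mockSet.As<IQueryable<Foo>>().Setup(m => m.GetEnumerator()).Returns(() => data.GetEnumerator());

        return mockSet;
    }
}
```

With that in place, the service under test queries the mocked DbSet exactly as it would a real one, without ever touching a database.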


Here's a test asserting that an Add method will fail correctly if passed an empty string (the real method takes a JSON object)
C# code:
[TestMethod]
public void Add_ShouldThrowExceptionWhenNoData()
{
    //setup
    var service = new BuildingService(mockContext.Object);
    var expected = ResponseMessages.Exception.GetDescription();

    //act
    var actual = service.Add(string.Empty);

    //assert
    Assert.AreEqual(expected, actual);
}
Lastly, this is a test to make sure that the Foos are flattened with their relational objects before being sent up to the API, as well as the inverse, that a flattened object coming down should be turned into 3 distinct EF objects before being saved to the database.

C# code:
[TestMethod]
public void Flatten_ShouldCreateValidDTO()
{
    var service = new FooService(mockContext.Object);
    var dto = service.FlattenFooObjects(MockFoo());

    Assert.IsTrue(dto is FooDTO);
}

[TestMethod]
public void Unflatten_ShouldCreateDatabaseObjectsFromDTO()
{
    var service = new FooService(mockContext.Object);

    var tuple = service.UnflattenDTO(MockDTO());

    Assert.IsTrue(tuple.Item1 is Foo);
    Assert.IsTrue(tuple.Item2 is FooBazzRelationalObject);
    Assert.IsTrue(tuple.Item3 is FooBarRelationalObject);
}

wwb
Aug 17, 2004

We do almost all of our core prototyping in a TDD manner -- feedback loop is great, you get very workable examples out of it. I also really love it for exploring new frameworks and tools.

Haven't had a whole lot of success doing full out TDD on larger projects. The one that did have full out TDD became a pretty sad example of onion architecture gone wild to be honest.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

wwb posted:

We do almost all of our core prototyping in a TDD manner -- feedback loop is great, you get very workable examples out of it. I also really love it for exploring new frameworks and tools.

Haven't had a whole lot of success doing full out TDD on larger projects. The one that did have full out TDD became a pretty sad example of onion architecture gone wild to be honest.

Sounds like one of the three steps was skipped -- everyone just went red -> green -> red instead of red -> green -> refactor. I agree that TDD tends toward disjointed messes, that's why you have to be super careful to do the "refactor" part frequently and liberally.

Sagacity
May 2, 2003
Hopefully my epitaph will be funnier than my custom title.
Please note that unit testing also doesn't necessarily mean that you need to test ALL the details. I've seen people write tests on the level of

code:
  client.Name = "Test";
  Assert.Equals("Test", client.Name);
Not completely, of course, but you get the gist. It's perfectly fine to test a big chunk of your code (a unit, if you will) in one go so that it implicitly tests a lot of the details. The main thing is to avoid touching all sorts of dependencies outside of this unit, and that's why you typically isolate database and filesystem access if you're testing business logic, for instance.

Mr. Crow
May 22, 2008

Snap City mayor for life

Sagacity posted:

Please note that unit testing also doesn't necessarily mean that you need to test ALL the details. I've seen people write tests on the level of

code:
  client.Name = "Test";
  Assert.Equals("Test", client.Name);
Not completely, of course, but you get the gist. It's perfectly fine to test a big chunk of your code (a unit, if you will) in one go so that it implicitly tests a lot of the details. The main thing is to avoid touching all sorts of dependencies outside of this unit, and that's why you typically isolate database and filesystem access if you're testing business logic, for instance.

We literally have hundreds of those kinds of tests (exactly testing auto-props or similarly redundant tests); it drives me up a wall. I think I've generally convinced people they're a waste of time, but they still sneak in every now and then, on top of the huge backlog of old tests.

"Does .NET work? Yes? OK, good, moving on!"

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Mr. Crow posted:

We literally have hundreds of those kinds of tests (exactly testing auto-props or similarly redundant tests); it drives me up a wall. I think I've generally convinced people they're a waste of time, but they still sneak in every now and then, on top of the huge backlog of old tests.

"Does .NET work? Yes? OK, good, moving on!"

It's okay to test a property if there's a reasonable chance that you'll be expanding it in the future to have some sort of logic in the get/set methods. It can also help validate settability -- if a property should be settable and you have downstream code that relies on it, having a test to ensure it's settable may make sense. In general it's a waste, though.

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
With all of this unit testing talk: if you want a book on it, The Art of Unit Testing is excellent and I highly recommend it.

crashdome
Jun 28, 2011
I've done some unit testing, but I have discovered that most of my unit tests are for testing items I generally have no issues with. I guess it's good that I have these in case something does go wrong at some point, but I am weighing the cost of my time in building them. What I really need is a testing framework that is all integration testing. Most of my projects have complex business logic and rely on other devices' communication habits. Occasionally I unit test things like: are all properties being assigned to that struct in each of those methods? Yes? Good. Which is great. Never fails. Ever. No matter how many months I've been working this project.

Where 99% of my bugs are though... integration. Problems with I/O timing, spelling errors in strings I send over serial ports, Bit/Byte conversion on lengthy packets of data. Changed out my whole framework of serial port behavior because New Protocol(TM) (they said we'd never have to implement another new protocol ever again 6 months ago) and now I need to know if all the previous protocols still function over the communications framework. :suicide:

Unit Testing and TDD cover like 2% of my problems.

wwb
Aug 17, 2004

@Ithaqua -- more like the project was designed to be a lot bigger and more pluggable than it turned out to be. And there were lots of layers of pluggability needed to get that level of testability, which makes things hard to follow. That said, the app is running like a champ 5 years later, so that is all good.

@crashdome -- my most successful testing has got things to the point where we know our side of the app works. So when it comes to debugging those nasty integration issues we can focus a lot better as we know one side of the equation is likely fine. Especially with outside vendors involved who want to throw your code under the bus because their code is of course "perfect" and "battle tested". Another side effect is if you can get good at simulating infrastructure you can make your code behave predictably when it fails, including firing off "this is wrong and this is what you need to look at" messages in the log.
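A sketch of the "behave predictably when it fails" idea, with entirely invented names: the vendor endpoint sits behind an interface, so a test double can force each failure mode and the test can assert that the log points at the right side of the equation.

C# code:
```csharp
using System;
using System.Collections.Generic;

public interface IVendorGateway
{
    string Send(string packet);
}

// Simulated infrastructure: a gateway that always fails the way the vendor's does.
public sealed class FlakyVendorGateway : IVendorGateway
{
    public string Send(string packet) => throw new TimeoutException("simulated vendor timeout");
}

public sealed class Exporter
{
    private readonly IVendorGateway _gateway;
    private readonly Action<string> _log;

    public Exporter(IVendorGateway gateway, Action<string> log)
    {
        _gateway = gateway;
        _log = log;
    }

    public bool TryExport(string packet)
    {
        try
        {
            _gateway.Send(packet);
            return true;
        }
        catch (TimeoutException)
        {
            // Fail predictably, with a message that says which side to look at.
            _log("Vendor endpoint timed out; our payload was built and valid. Check vendor connectivity.");
            return false;
        }
    }
}
```

When the real vendor throws your code under the bus, you already have a test proving exactly how your side behaves under that failure.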

I second The Art of Unit Testing -- great book, and it helped me.

crashdome
Jun 28, 2011

wwb posted:

@crashdome -- my most successful testing has got things to the point where we know our side of the app works. So when it comes to debugging those nasty integration issues we can focus a lot better as we know one side of the equation is likely fine. Especially with outside vendors involved who want to throw your code under the bus because their code is of course "perfect" and "battle tested". Another side effect is if you can get good at simulating infrastructure you can make your code behave predictably when it fails, including firing off "this is wrong and this is what you need to look at" messages in the log.

Unfortunately, my side is the only side that changes. Most of the issues are engineers expecting me to know exactly what a product does and why even though I don't know what the hell the product is even for half the time. I do have a solid messaging framework for the operators. It helps when they call with an unexpected problem. It's the massive amounts of weird edge cases I have to deal with because a product behaves wildly different than another product for some ungodly reason I don't understand without an hour long explanation. I end up doing Thread.Sleeps and IF-THEN cases everywhere until I have time to come back and refactor after the procedure is proven to work.

I have started simulating infrastructure, but a full test suite would double my time. I've got a small simulation app to do manual tests though, and that cuts through half the bullshit.

Azubah
Jun 5, 2007

Has anyone had experience integrating SSRS reports into an MVC project? I can get the /ReportServer folder directories to show up in an iFrame and navigate to the reports, but you still have to provide login credentials and it isn't as pretty. It's also not what is expected.

According to the database guy here I should be able to use the ReportViewer control to do this, but so far I've only found that it renders one report if you've provided the report name in the specific directory. A lot of the tutorials I've found assume you need to build the report and use aspx pages.

What I'd like to do is render the directories as folders and drill down to the reports like Reporting Services's Folder.aspx page using credentials already provided by the system. Unlike that page though, we don't want the report builder or folder settings to appear, only the navigation.

epswing
Nov 4, 2003

Soiled Meat
Are MVC 4 SimpleMembership user accounts "good enough" to continue using for the next little while, or should I make it my business to aggressively update to MVC 5 Identity user accounts?

Knyteguy
Jul 6, 2005

YES to love
NO to shirts


Toilet Rascal
Anyone have experience deploying Windows Universal Apps for enterprise with InTune? Microsoft made it a complete loving mess to deploy these apps, to RT/tablet/desktop and Windows Phone 8.1 alike. Is there any way, for example, to deploy without having to use the store (we don't want to do the authentication process, preferably), or without System Center? We seem to be running into certificate problems, if anyone has a good resource on hand for that (we have a purchased certificate, but signing/building the app package is causing problems).

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Knyteguy posted:

Anyone have experience deploying Windows Universal Apps for enterprise with InTune? Microsoft made it a complete loving mess to deploy these apps, to RT/tablet/desktop and Windows Phone 8.1 alike. Is there any way, for example, to deploy without having to use the store (we don't want to do the authentication process, preferably), or without System Center? We seem to be running into certificate problems, if anyone has a good resource on hand for that (we have a purchased certificate, but signing/building the app package is causing problems).

Don't worry, Microsoft's entire strategy around software deployment is kind of schizophrenic right now. It's going to get better, but probably not in the next 3-6 months.

RICHUNCLEPENNYBAGS
Dec 21, 2010
Unit testing can be really handy but I don't believe in TDD. Also, I think it's a little bit like a church in that even its adherents don't actually always do it, but they all feel guilty about it and try to make other people feel guilty too.

Also proponents of TDD frequently make unjustifiable claims, like saying the tests "prove" their software is defect-free. No, they don't, because the number of possible states is mind-boggling and that assumes your tests are correct. Certainly they can be a useful heuristic.

On the topic, this article is interesting and I agree with it.

RICHUNCLEPENNYBAGS fucked around with this message at 02:15 on Dec 17, 2014

InfernoJack
Dec 13, 2014

RICHUNCLEPENNYBAGS posted:

Unit testing can be really handy but I don't believe in TDD. Also, I think it's a little bit like a church in that even its adherents don't actually always do it, but they all feel guilty about it and try to make other people feel guilty too.

Also proponents of TDD frequently make unjustifiable claims, like saying the tests "prove" their software is defect-free. No, they don't, because the number of possible states is mind-boggling and that assumes your tests are correct. Certainly they can be a useful heuristic.

On the topic, this article is interesting and I agree with it.

Similar things could be said about any borderline-religious movement.

TDD, Agile, OOP, Functional, they're all just tools. Folks gotta learn to take consultants with a grain of salt.

kingcrimbud
Mar 1, 2007
Oh, Great. Now what?

RICHUNCLEPENNYBAGS posted:

Unit testing can be really handy but I don't believe in TDD. Also, I think it's a little bit like a church in that even its adherents don't actually always do it, but they all feel guilty about it and try to make other people feel guilty too.

Also proponents of TDD frequently make unjustifiable claims, like saying the tests "prove" their software is defect-free. No, they don't, because the number of possible states is mind-boggling and that assumes your tests are correct. Certainly they can be a useful heuristic.

On the topic, this article is interesting and I agree with it.

Unit tests don't necessarily show that software is 'correct'. They simply show that it does something, so that when you make changes to your software you can quickly see how that something changed. The better your testing, the more accurately you can measure the potential delta.
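A toy example of that "pinning" idea, with made-up names (in a real project the test method would sit under NUnit or xUnit rather than being a bare static method):

```csharp
using System;

// Made-up price calculator; the test pins its current rounding behaviour so
// any change to the rule shows up as a failing test - the "delta" above.
public static class Pricing
{
    public static decimal WithTax(decimal net) => decimal.Round(net * 1.15m, 2);
}

public static class PricingTests
{
    // In a real suite this would be an [Test]/[Fact] method.
    public static void WithTax_AddsFifteenPercent()
    {
        if (Pricing.WithTax(10.00m) != 11.50m)
            throw new Exception("WithTax behaviour changed");
    }
}
```

Change the rate or the rounding and the test goes red, which is exactly the signal you want.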

Mr Shiny Pants
Nov 12, 2012

InfernoJack posted:

Similar things could be said about any borderline-religious movement.


I am finding out that there is a lot of this in programming. It might even be worse than the console flame wars.

Che Delilas
Nov 23, 2009
FREE TIBET WEED

gariig posted:

With all of this unit testing talk if you want a book on it The Art of Unit Testing is excellent and I highly recommend it.

I've been meaning to ask this: does this book go into a deep dive of the entire TDD process too, or is it just concerned with teaching you how to write good unit tests in a vacuum?

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.

Che Delilas posted:

I've been meaning to ask this: does this book go into a deep dive of the entire TDD process too, or is it just concerned with teaching you how to write good unit tests in a vacuum?

90% the latter. It's about the tests themselves.

e: On the topic, has anyone read Professional Test Driven Development with C#? It seems to be the dominant .NET-focused book on the topic on Amazon. I've read one other WROX 'Professional X' book (ASP.NET MVC 5) and thought it was pretty good.

Newf fucked around with this message at 12:43 on Dec 17, 2014

zerofunk
Apr 24, 2004
If I remember right, he specifically says in the introduction (or somewhere early on) that he's not really trying to cover TDD in the book. It's just focusing on good unit testing practices. Can't remember if it had recommendations for material that covers TDD further and I don't have my copy available right now.

Begby
Apr 7, 2005

Light saber? Check. Black boots? Check. Codpiece? Check. He's more machine than kid now.
Right now I have a Windows service coded in .NET that runs a set of scheduled jobs from a database. Every 5 minutes it checks the database for schedule changes, then it looks for jobs that are ready to run, and runs them.

Each job runs a specific module/plugin, and each of those has a PluginRunner, which is basically a class that implements the PluginRunner interface. The modules are separate .NET assemblies, the type name is stored as a record in a plugin table like this, and the service instantiates an instance of the correct module using that type:

"MyCompany.TheService.PluginAssemblyName.PluginRunner, MyCompany.TheService.PluginAssemblyName"

This all works great right now. The only problem is that all the plugins are part of one single huge project, which has created a few issues with being lazy about putting in stupid dependencies, but the biggest deal is that whenever we commit a change for a plugin, we have to take the service down completely and then have TeamCity push out the new changes.

Ideally I would like each module to be its own project with its own repository and unit tests, then just push out a single DLL to the live server without having to down it. Is there a strategy that I can use to make that work?

We just got a new continuous integration server set up, and we are going to migrate this all from SVN to Git. So I figure if I am going to do this, now would be a good time.
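For reference, the lookup-and-instantiate step can be sketched like this (interface and class names are hypothetical, not the actual code):

```csharp
using System;

// Hypothetical plugin contract; the real interface name may differ.
public interface IPluginRunner
{
    void Run();
}

public sealed class EchoRunner : IPluginRunner
{
    public void Run() => Console.WriteLine("echo runner ran");
}

public static class PluginLoader
{
    // typeName is the assembly-qualified name stored in the plugin table, e.g.
    // "MyCompany.TheService.PluginAssemblyName.PluginRunner, MyCompany.TheService.PluginAssemblyName"
    public static IPluginRunner Create(string typeName)
    {
        // Resolves the type from any assembly the service can load,
        // then builds an instance via its parameterless constructor.
        Type type = Type.GetType(typeName, throwOnError: true);
        return (IPluginRunner)Activator.CreateInstance(type);
    }
}
```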

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Are you using MEF? This is the classic use case for MEF.
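Roughly, with made-up type names (in the service you'd point a DirectoryCatalog at the plugin folder instead of passing a catalog in):

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.ComponentModel.Composition.Primitives;

// Hypothetical plugin contract; the Export attribute marks plugins for discovery.
public interface IPluginRunner { void Run(); }

[Export(typeof(IPluginRunner))]
public class SamplePlugin : IPluginRunner
{
    public void Run() => Console.WriteLine("sample plugin ran");
}

public class PluginHost
{
    // MEF fills this with every discovered IPluginRunner export.
    [ImportMany]
    public IEnumerable<IPluginRunner> Runners { get; set; }

    public void Compose(ComposablePartCatalog catalog)
    {
        // e.g. new DirectoryCatalog(@"C:\Service\Plugins") in the real service.
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }
}
```

Drop a new plugin DLL in the folder, recompose, and it shows up; no registry of type strings to maintain.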

Eggnogium
Jun 1, 2010

Never give an inch! Hnnnghhhhhh!

Ithaqua posted:

Are you using MEF? This is the classic use case for MEF.

True, but it still doesn't solve the specific issue Begby's asking about, I think. We use MEF for our plugin loading, but we still have to stop a service to deploy a new plugin version because the DLL is loaded in memory and the file can't be overwritten. I think the only way to fix this is to architect your service for redundancy across machines.

EssOEss
Oct 23, 2006
128-bit approved
You can unload assemblies by unloading the AppDomain that they belong to, though managing AppDomains can be a bit of a hassle. Is it really worth the convenience of swapping bits out at runtime? Possibly it is easier to just stop the service, update it and start it again. This, of course, assumes that the service code has been built with proper stopping logic in mind - not always the case.
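A rough sketch of that approach (.NET Framework only; the proxy type is made up):

```csharp
using System;

// A type must derive from MarshalByRefObject so calls cross the domain
// boundary via a proxy instead of loading the assembly into the caller.
public class PluginProxy : MarshalByRefObject
{
    public string Ping() => "alive";
}

public static class IsolatedRunner
{
    // Runs the plugin in its own AppDomain, then unloads the domain,
    // which releases the file lock on the plugin assembly.
    public static string PingInIsolation()
    {
        var setup = new AppDomainSetup
        {
            ApplicationBase = AppDomain.CurrentDomain.BaseDirectory,
            ShadowCopyFiles = "true" // lets the original DLL be overwritten
        };
        AppDomain domain = AppDomain.CreateDomain("PluginDomain", null, setup);
        try
        {
            var proxy = (PluginProxy)domain.CreateInstanceAndUnwrap(
                typeof(PluginProxy).Assembly.FullName,
                typeof(PluginProxy).FullName);
            return proxy.Ping();
        }
        finally
        {
            AppDomain.Unload(domain);
        }
    }
}
```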

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Azubah posted:

Has anyone had experience integrating SSRS reports into an MVC project? I can get the /ReportServer folder directories to show up in an iFrame and navigate to the reports, but you still have to provide log-in credentials and it isn't as pretty. It's also not what is expected.

According to the database guy here I should be able to use the ReportViewer control to do this, but so far I've only found that it renders one report if you've provided the report name in the specific directory. A lot of the tutorials I've found assume you need to build the report and use .aspx pages.

What I'd like to do is render the directories as folders and drill down to the reports like Reporting Services' Folder.aspx page, using credentials already provided by the system. Unlike that page, though, we don't want the report builder or folder settings to appear, only the navigation.

My only experience with SSRS is that the three times I've had to implement something with it, it wasn't capable of doing what we wanted it to do and we moved to other solutions. Sorry that doesn't help, and maybe it's gotten better (this was 2008 R2), but good god was it a tangle of permissions, lovely bridge code, and not-quite programming but not-quite scripting.


mortarr
Apr 28, 2005

frozen meat at high speed

Scaramouche posted:

My only experience with SSRS is that the three times I've had to implement something with it, it wasn't capable of doing what we wanted it to do and we moved to other solutions. Sorry that doesn't help, and maybe it's gotten better (this was 2008 R2), but good god was it a tangle of permissions, lovely bridge code, and not-quite programming but not-quite scripting.

Yeah, same here... I'm not at work so no code to hand, but the object model you're working with is pretty opaque. The best I could do was get the SSRS report to render to PDF in memory and then send that down to the client as a FileContentResult.

In fact, I remember I got so cross with the whole thing that I put the report up on the SQL server and just called that from my MVC project; it was too hard to get meaningful error messages when hosting the whole shebang inside my web app. I'm sure there were permissions errors where Kerberos/delegation wasn't working, but damned if I could find what aspect of the thing was bailing.
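From memory, the render-to-PDF bit looked roughly like this in an MVC controller (server URL and report path are placeholders, and it assumes the Microsoft.ReportViewer.WebForms assembly is referenced):

```csharp
using System;
using System.Web.Mvc;
using Microsoft.Reporting.WebForms;

public class ReportsController : Controller
{
    public ActionResult Pdf()
    {
        // Point at the report server and a report path - both made up here.
        var viewer = new ReportViewer { ProcessingMode = ProcessingMode.Remote };
        viewer.ServerReport.ReportServerUrl = new Uri("http://reportserver/ReportServer");
        viewer.ServerReport.ReportPath = "/Sales/MonthlySummary";

        string mimeType, encoding, extension;
        string[] streams;
        Warning[] warnings;

        // Render entirely server-side; the browser never sees the SSRS UI.
        byte[] bytes = viewer.ServerReport.Render(
            "PDF", null, out mimeType, out encoding, out extension,
            out streams, out warnings);

        return File(bytes, mimeType, "report.pdf");
    }
}
```

The nice part is the SSRS credentials stay server-side, so the user never gets the log-in prompt Azubah was fighting.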
