ljw1004
Jan 18, 2005

rum

Bognar posted:

void UpdateAsync(IEnumerable<Delta<T>> deltas);

Curious why it's not Task-returning? With this signature you're basically telling callers of this interface "you're not allowed to know when the long-running operation has finished and you're not allowed to catch any exceptions that arise from it"...
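A minimal sketch of what the Task-returning shape buys the caller. All of the type names here are made up to stand in for the original interface, which isn't shown in full:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical stand-ins for the types in the post above.
public class Delta<T>
{
    public T Value { get; private set; }
    public Delta(T value) { Value = value; }
}

public interface IDeltaSink<T>
{
    // Task-returning, so callers can await completion and observe exceptions.
    Task UpdateAsync(IEnumerable<Delta<T>> deltas);
}

public class CountingSink<T> : IDeltaSink<T>
{
    public int Processed { get; private set; }

    public async Task UpdateAsync(IEnumerable<Delta<T>> deltas)
    {
        foreach (var d in deltas)
        {
            await Task.Yield(); // stand-in for real async work
            Processed++;
        }
    }
}

public static class Caller
{
    public static async Task RunAsync(IDeltaSink<int> sink)
    {
        try
        {
            await sink.UpdateAsync(new[] { new Delta<int>(1), new Delta<int>(2) });
            // Only reached once the long-running operation has actually finished.
        }
        catch (Exception)
        {
            // With a void-returning UpdateAsync, this catch could never observe a failure
            // thrown from the async body.
            throw;
        }
    }
}
```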


Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
Oh, oops, all those are actually Async methods with Tasks and all that. I was just trying to do it from memory and screwed up.

RangerAce
Feb 25, 2014

RICHUNCLEPENNYBAGS posted:

Inheritance is "usually bad?" Like, not just in ORMs but in general design?

Yes! It is very bad, almost all the time.

This is one of the better write-ups on the subject: http://raganwald.com/2014/03/31/class-hierarchies-dont-do-that.html

It's mainly written for JavaScript devs, but the general principles are sound.

Ochowie
Nov 9, 2007

RangerAce posted:

Yes! It is very bad, almost all the time.

This is one of the better write-ups on the subject: http://raganwald.com/2014/03/31/class-hierarchies-dont-do-that.html

It's mainly written for JavaScript devs, but the general principles are sound.

From your Link posted:

In theory, JavaScript does not have classes. In practice, the following snippet of code is widely considered to be an example of a “class” in JavaScript:

I think you're underestimating the extent to which this is written for JavaScript devs...

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
I think it's a bit much to say that inheritance is almost always bad, but I would agree that deep inheritance hierarchies can create fragile code. You typically want to stick to providing implementation directly on top of abstract classes.

EDIT: ^^^ but yeah, there's a lot of clamor in that article about how bad JS inheritance in particular is (e.g. you really should just act like it doesn't exist).
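A sketch of the shallow shape being described: one abstract contract, one level of sealed implementations so the hierarchy can't grow deeper. Names are invented for illustration:

```csharp
using System;

public abstract class Shape
{
    public abstract double Area();
}

// sealed: no second level of derivation, so changes to Shape have a small blast radius.
public sealed class Rectangle : Shape
{
    private readonly double _w, _h;
    public Rectangle(double w, double h) { _w = w; _h = h; }
    public override double Area() { return _w * _h; }
}

public sealed class Circle : Shape
{
    private readonly double _r;
    public Circle(double r) { _r = r; }
    public override double Area() { return Math.PI * _r * _r; }
}
```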

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.
Here's a question to slot into inheritance/EF chat.

I've been working on something with MVC but I'm starting to doubt whether it's the tool for the job. The short of it is that I've got a slowly growing (as the system matures) number of hand-made classes, each of which has ~100 instances stored in the DB. Each class also has associated with it at least one, but potentially any number of, views. My first look tells me that it's going to be a pain to store the views themselves in the DB, since the framework wants to load them from Views/ClassName/ActionName (or whatever, but it wants to be loading physical .cshtml files).

In practice, I expect classes to be static, but the views (including the number of them assigned to each class) will be in flux in a partially automated, partially user-driven way. Moreover, the system itself needs to be 'aware' of the coming and going of the views. This is why I prefer to have them persisted as DB objects rather than as physical files.

Does anyone have experience with VirtualPathProvider to do this sort of thing? And on a related note, how inconvenient would it become to be editing razor views in VS in order to then persist them to the DB? Probably just a matter of setting up an alternate project to store all of that?

This is total greenfield and it's my first time looking at MVC, so it's possible that I'm missing a better route to take with MVC, and it's also possible for me to ditch and move to another framework if something is a more natural fit with my intention.

Would appreciate any feedback.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug
Yeah, when you have a really long inheritance chain, you end up being afraid to change anything far up the chain because of all the potential issues it can cause.

This is, of course, mitigated somewhat by having unit tests... but still, you'll end up in a situation where you're playing whack-a-mole with unit tests because making some pass starts making others fail.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

Newf posted:

Here's a question to slot into inheritance/EF chat.

I've been working on something with MVC but I'm starting to doubt whether it's the tool for the job. The short of it is that I've got a slowly growing (as the system matures) number of hand-made classes, each of which has ~100 instances stored in the DB. Each class also has associated with it at least one, but potentially any number of, views. My first look tells me that it's going to be a pain to store the views themselves in the DB, since the framework wants to load them from Views/ClassName/ActionName (or whatever, but it wants to be loading physical .cshtml files).

In practice, I expect classes to be static, but the views (including the number of them assigned to each class) will be in flux in a partially automated, partially user-driven way. Moreover, the system itself needs to be 'aware' of the coming and going of the views. This is why I prefer to have them persisted as DB objects rather than as physical files.

Does anyone have experience with VirtualPathProvider to do this sort of thing? And on a related note, how inconvenient would it become to be editing razor views in VS in order to then persist them to the DB? Probably just a matter of setting up an alternate project to store all of that?

This is total greenfield and it's my first time looking at MVC, so it's possible that I'm missing a better route to take with MVC, and it's also possible for me to ditch and move to another framework if something is a more natural fit with my intention.

Would appreciate any feedback.

It sounds like you're doing something really, really fishy. I think we may have an XY problem. What's the goal of the system that's requiring hundreds of views?

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.

Ithaqua posted:

It sounds like you're doing something really, really fishy. I think we may have an XY problem. What's the goal of the system that's requiring hundreds of views?

The views are alternative representations of math problems.

EG, your basic multiplication problem can be put to people several obvious ways:

quote:

4x5=__

quote:

How many dots are there?

....
....
....
....
....

quote:

There are 4 cookies in each box. [picture of five boxes]. How many cookies are there?

The idea is to feed this to children, "do big data stuff" to gain information about the relative merits of different question/view schemes, feed a better refined diet of it to children, and so on.


edit: It's not necessarily the case that '100s of views' are required for a particular question at a particular time, but the idea is to be running generalized A/B/C... testing against the existing stockpile in order to find the most useful ones and toss the least useful.

Newf fucked around with this message at 23:04 on Sep 25, 2014

RICHUNCLEPENNYBAGS
Dec 21, 2010

Newf posted:

Here's a question to slot into inheritance/EF chat.

I've been working on something with MVC but I'm starting to doubt whether it's the tool for the job. The short of it is that I've got a slowly growing (as the system matures) number of hand-made classes, each of which has ~100 instances stored in the DB. Each class also has associated with it at least one, but potentially any number of, views. My first look tells me that it's going to be a pain to store the views themselves in the DB, since the framework wants to load them from Views/ClassName/ActionName (or whatever, but it wants to be loading physical .cshtml files).

In practice, I expect classes to be static, but the views (including the number of them assigned to each class) will be in flux in a partially automated, partially user-driven way. Moreover, the system itself needs to be 'aware' of the coming and going of the views. This is why I prefer to have them persisted as DB objects rather than as physical files.

Does anyone have experience with VirtualPathProvider to do this sort of thing? And on a related note, how inconvenient would it become to be editing razor views in VS in order to then persist them to the DB? Probably just a matter of setting up an alternate project to store all of that?

This is total greenfield and it's my first time looking at MVC, so it's possible that I'm missing a better route to take with MVC, and it's also possible for me to ditch and move to another framework if something is a more natural fit with my intention.

Would appreciate any feedback.

You're not supposed to be storing view models in the database, which I think is probably the root of your problem. Store one representation and map it into your view models at runtime.

As far as inheritance, sure, very deep hierarchies can be bad, but I think it's insanely reductive and silly (and going against the principles of framework code and so on) to say inheritance should just be avoided altogether.
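A sketch of the "store one representation, map into view models at runtime" idea, using made-up types loosely based on the multiplication example upthread:

```csharp
using System;

// The single persisted representation.
public class Problem
{
    public int A { get; set; }
    public int B { get; set; }
}

// Two of potentially many view models derived from it at runtime.
public class EquationViewModel { public string Text { get; set; } }
public class DotGridViewModel  { public int Rows { get; set; } public int Columns { get; set; } }

public static class ProblemMapper
{
    public static EquationViewModel ToEquation(Problem p)
    {
        return new EquationViewModel { Text = string.Format("{0}x{1}=__", p.A, p.B) };
    }

    public static DotGridViewModel ToDotGrid(Problem p)
    {
        // One row of dots per unit of B, A dots per row.
        return new DotGridViewModel { Rows = p.B, Columns = p.A };
    }
}
```

Only `Problem` ever touches the database; the view models are cheap throwaway projections.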

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy

Newf posted:

The views are alternative representations of math problems.

EG, your basic multiplication problem can be put to people several obvious ways:




The idea is to feed this to children, "do big data stuff" to gain information about the relative merits of different question/view schemes, feed a better refined diet of it to children, and so on.


edit: It's not necessarily the case that '100s of views' are required for a particular question at a particular time, but the idea is to be running generalized A/B/C... testing against the existing stockpile in order to find the most useful ones and toss the least useful.

You should just keep them on the file system. When you're returning a View action result you can specify the name of the view file. If you need more control, IIRC you'll need to implement a view engine to customize the path lookup logic.
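A sketch of runtime view-name selection as a pure function; the names and the variant table are hypothetical. In the controller you'd then write something like `return View(ViewPicker.Pick(problemType, variant), model);`:

```csharp
using System;
using System.Collections.Generic;

public static class ViewPicker
{
    // Hypothetical mapping from problem type to its available .cshtml view names.
    private static readonly Dictionary<string, string[]> Variants =
        new Dictionary<string, string[]>
        {
            { "Multiplication", new[] { "Equation", "DotGrid", "CookieBoxes" } },
        };

    public static string Pick(string problemType, int variant)
    {
        string[] names;
        if (!Variants.TryGetValue(problemType, out names))
            throw new ArgumentException("Unknown problem type: " + problemType);
        // Cycle through the available views for this type.
        return names[variant % names.Length];
    }
}
```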

epswing
Nov 4, 2003

Soiled Meat
If my test project does a pretty good job of testing my services and visiting most code paths, but uses a database instead of mocking out repositories, does that make me a bad person?

For things that must be unique, like product names for example, I have to do stuff like product.Name = Util.RandomString(length: 20);

But the idea of creating like 40 mock repositories just sounds like a monumental amount of work.

RICHUNCLEPENNYBAGS
Dec 21, 2010

epalm posted:

If my test project does a pretty good job of testing my services and visiting most code paths, but uses a database instead of mocking out repositories, does that make me a bad person?

For things that must be unique, like product names for example, I have to do stuff like product.Name = Util.RandomString(length: 20);

But the idea of creating like 40 mock repositories just sounds like a monumental amount of work.

This is the problem generics are meant to solve, isn't it? I have IRepository<T> and then I use Autofixture to spit out objects. But I mean, hey, if it's working for you and the drawbacks aren't giving you trouble then no worries.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

epalm posted:

If my test project does a pretty good job of testing my services and visiting most code paths, but uses a database instead of mocking out repositories, does that make me a bad person?

For things that must be unique, like product names for example, I have to do stuff like product.Name = Util.RandomString(length: 20);

But the idea of creating like 40 mock repositories just sounds like a monumental amount of work.

Use a mocking framework.

SirViver
Oct 22, 2008

epalm posted:

If my test project does a pretty good job of testing my services and visiting most code paths, but uses a database instead of mocking out repositories, does that make me a bad person?
Nah, but it makes your tests integration tests instead of unit tests.

If you do intend those to be unit tests, mock your repository. Things unit tests should NOT include:
  • A test database, which makes the tests comparatively difficult to set up and maintain, and maybe even slow to run. The same goes for anything related to the test that is persisted outside of application memory.
  • Non-reproducible results, both from the test database containing previous test data (unless it is cleaned up properly or created from scratch in code on every test run) and from the use of random data in your test, which by chance may introduce random failures (e.g. generating an already-existing product name).
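Mocking the repository doesn't even have to mean 40 mocks or a framework; one generic in-memory fake can cover most of it. A sketch with made-up names:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical minimal repository contract.
public interface IRepository<T>
{
    void Add(T item);
    IEnumerable<T> GetAll();
}

// One reusable in-memory fake: no test database, fully deterministic.
public class InMemoryRepository<T> : IRepository<T>
{
    private readonly List<T> _items = new List<T>();
    public void Add(T item) { _items.Add(item); }
    public IEnumerable<T> GetAll() { return _items; }
}

public class Product { public string Name { get; set; } }

// The service under test only sees the interface.
public class ProductService
{
    private readonly IRepository<Product> _repo;
    public ProductService(IRepository<Product> repo) { _repo = repo; }

    public bool TryAdd(string name)
    {
        if (_repo.GetAll().Any(p => p.Name == name)) return false; // enforce uniqueness
        _repo.Add(new Product { Name = name });
        return true;
    }
}
```

Because the fake starts empty on every run, the uniqueness test needs no `Util.RandomString` at all.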

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

SirViver posted:

Nah, but it makes your tests integration tests instead of unit tests.

If you do intend those to be unit tests, mock your repository. Things unit tests should NOT include:
  • A test database, which makes the tests comparatively difficult to set up and maintain, and maybe even slow to run. The same goes for anything related to the test that is persisted outside of application memory.
  • Non-reproducible results, both from the test database containing previous test data (unless it is cleaned up properly or created from scratch in code on every test run) and from the use of random data in your test, which by chance may introduce random failures (e.g. generating an already-existing product name).

Yeah, there's no excuse not to mock your repositories, assuming you're already coding to an interface. VS2012 and later even include a lightweight stubbing framework built in, if you don't want to use something like Moq.

RICHUNCLEPENNYBAGS
Dec 21, 2010

Ithaqua posted:

Yeah, there's no excuse not to mock your repositories, assuming you're already coding to an interface. VS2012 and later even include a lightweight stubbing framework built in, if you don't want to use something like Moq.

Pretty big assumption if you ever look at the applications people churn out.

RangerAce
Feb 25, 2014

This is somewhat related to this Unit Test vs. Integration Test discussion.

I'm writing code that hits a 3rd-party API. This code will be used by a RESTful web service platform. My goal is to have an integration layer to the API that is more or less a facade. I have a method GetScheduledWidgets(DateTime beginDate, DateTime endDate) and it returns a List<ScheduledWidget> that will then be used by the service layer.

I'm not yet a convert to dependency injection (though I do use factory classes a lot for IoC) but I want to unit test this integration layer. I want to have the code that the integration layer uses to talk to the API just return some fake data, but I don't want to require an IHttpApiRequest or whatever to be passed in as a parameter, either. I know I could get around it with MS Fakes shims, but I REALLY want to avoid using those, if I can.

For some reason (probably because it's Fri. afternoon) I just can't wrap my head around this.

What would you guys do?

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

RangerAce posted:

This is somewhat related to this Unit Test vs. Integration Test discussion.

I'm writing code that hits a 3rd-party API. This code will be used by a RESTful web service platform. My goal is to have an integration layer to the API that is more or less a facade. I have a method GetScheduledWidgets(DateTime beginDate, DateTime endDate) and it returns a List<ScheduledWidget> that will then be used by the service layer.

I'm not yet a convert to dependency injection (though I do use factory classes a lot for IoC) but I want to unit test this integration layer. I want to have the code that the integration layer uses to talk to the API just return some fake data, but I don't want to require an IHttpApiRequest or whatever to be passed in as a parameter, either. I know I could get around it with MS Fakes shims, but I REALLY want to avoid using those, if I can.

For some reason (probably because it's Fri. afternoon) I just can't wrap my head around this.

What would you guys do?

I'd just use dependency injection, personally.

If you don't want to wire up DI by hand, look into an IoC container. I've always liked Ninject, although I still haven't had a compelling enough reason to start using it in anything so far.
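One lightweight way to get a test seam without a container or an extra interface: make the raw fetch a delegate that production code defaults and tests replace. All names and the wire format here are made up:

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;

public class ScheduledWidget { public DateTime When { get; set; } }

public class WidgetApiFacade
{
    private readonly Func<DateTime, DateTime, string> _fetch;

    // Production callers use the default fetch and never see the seam.
    public WidgetApiFacade() : this(FetchFromHttp) { }

    // Tests hand in a lambda returning canned data.
    public WidgetApiFacade(Func<DateTime, DateTime, string> fetch) { _fetch = fetch; }

    public List<ScheduledWidget> GetScheduledWidgets(DateTime beginDate, DateTime endDate)
    {
        // Assumed wire format: one "yyyy-MM-dd" date per line.
        string raw = _fetch(beginDate, endDate);
        var result = new List<ScheduledWidget>();
        foreach (var line in raw.Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries))
        {
            result.Add(new ScheduledWidget
            {
                When = DateTime.ParseExact(line, "yyyy-MM-dd", CultureInfo.InvariantCulture)
            });
        }
        return result;
    }

    private static string FetchFromHttp(DateTime begin, DateTime end)
    {
        throw new NotImplementedException("real HTTP call to the 3rd-party API goes here");
    }
}
```

The parsing/shaping logic gets unit tested; only `FetchFromHttp` needs an integration test.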

Jewel
May 2, 2009

Crosspostin' from the gamedev thread, since it's generic enough that it works in every scenario (though I can't think of much use outside games other than maybe UI if you hate your users).

There’s a big lack of good generic tweening libraries for .NET, and any that are available have a poor API, missing functionality, or are just... really bad, so I made Betwixt. I made it for games and that's what I'm currently using it for; it's entirely generic, so it works with any custom type you feed it.

I can’t show much since it’s a library meant to be used, and not a standalone application, but at the very least I made a small application to draw all of the built-in easing functions, all of which are completely optional as the API lets you easily and extensively create your own easing functions in pretty much any way you want.

I hope you give it a try if it interests you, or maybe just take a look through the source.

You can find the repository, with the source and downloads, on GitHub right over here: https://github.com/Jewelots/Betwixt



This is my first actual release of something like this so please let me know if I've done something horribly wrong with the code or the git or anything :ohdear:

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
If you plan on other people maybe actually using it, you should put it on NuGet as well.

Jewel
May 2, 2009

Bognar posted:

If you plan on other people maybe actually using it, you should put it on NuGet as well.

I've never used NuGet before but I think I did it..? https://www.nuget.org/packages/Betwixt/

Sorry if I did something horribly wrong with that, NuGet seems to have problems with providing both x86 and x64..? Should I have split them into separate packages?

EssOEss
Oct 23, 2006
128-bit approved

Newf posted:

The views are alternative representations of math problems.

EG, your basic multiplication problem can be put to people several obvious ways:




The idea is to feed this to children, "do big data stuff" to gain information about the relative merits of different question/view schemes, feed a better refined diet of it to children, and so on.


edit: It's not necessarily the case that '100s of views' are required for a particular question at a particular time, but the idea is to be running generalized A/B/C... testing against the existing stockpile in order to find the most useful ones and toss the least useful.

I recommend a change of viewpoint. What you are calling views here are not actually views - they are different ways to model some entities. Views are a purely UI layer thing. Think of the different representations as parts of the model and orient your views around showing these different representations.

EssOEss fucked around with this message at 10:19 on Sep 28, 2014

ljw1004
Jan 18, 2005

rum

Newf posted:

The idea is to feed this to children, "do big data stuff" to gain information about the relative merits of different question/view schemes, feed a better refined diet of it to children, and so on.

I developed a similar kind of app over the past four years. Rather than asking maths questions, my app asks the kind of questions a psychiatrist would ask to discover if someone has the early stages of dementia. Like yours, we have multiple forms of questions, and we used big data to figure out the optimum questions. Actually we dynamically select the optimum question to ask next based on the responses that the patient has given so far.

I started with a complex codebase+model like yours, but abandoned it pretty quick. It makes me think you're going down the wrong path too. Here's why.


Big data statisticians can't do magic where you just throw the full complexity of the data at them and they somehow discern the truth. Instead they invent a model in their head, e.g. "the person has an innate intelligence X and an affinity for pictures Y, and each question has a pictureness A (standard deviation A2) and a difficulty B (standard deviation B2), and the distribution of scores for this question will be X*B + Y^2*A with standard deviation A2+B2". They partition the data into a training set which they use to solve a best-fit for X, Y, A, A2, B, B2, and a test set to see whether these numbers explain the rest of it.

Ultimately my statistician coded in "R". The model that I used to represent the data was nothing like the model he wanted to use to represent the data. He was the authority. He had the expertise to know exactly what sort of data cleanliness he needed, what he needed to store about it, and so on. I did the initial rough draft of the question classifications, but he was the one who made the authoritative model, and he was the one who had to.

So ultimately I ended up storing every single test as just a Dictionary<string,string>, i.e. an untyped map from question-code to answer. That's the way he coded it in R anyway. He came up with the authoritative question codes, and the authoritative way to code answers (e.g. blank = not yet answered, Y/N = yes/no, 1/2/3/... = numerical answer, D = void/doesn't-apply, ...)

He came up with the ontology, e.g.
code:
A question is either a multiple-choice question,
or a yes/no question,
or a date question,
or an unbounded integral numerical question,
or ...
He also came up with the question definitions e.g.
code:
Question code=WASHING,
         type=MULTIPLECHOICE,
         options={1,"no impairment",
                  2,"can wash self with assistance",
                  3,"needs complete assistance"}

Question code=FEEDING,
         type=MULTIPLECHOICE,
         options={1,"no impairment",
                  2,"if carer prepares meals, can feed self",
                  3,"needs carer to put food into mouth"}
Now we could and did express the question definitions in an XML file. There wasn't much point expressing the ontology in a data-driven format because we had to write so much code for it. And no point figuring out inheritance hierarchies, because ultimately that's not what the data is about. He had to write code for intermediate calculations (e.g. "living-impairment = max(WASHING, COOKING, FEEDING)"). I had to write code because each item in the ontology required its own viewmodel.
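A sketch of what the untyped answer map plus one intermediate calculation looks like. The question codes and answer coding follow the post; the helper itself is invented:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Scoring
{
    // test: untyped map from question-code to answer.
    // "" = not yet answered; "1"/"2"/"3" = multiple-choice options.
    public static int LivingImpairment(Dictionary<string, string> test)
    {
        var codes = new[] { "WASHING", "COOKING", "FEEDING" };
        return codes
            .Select(c =>
            {
                string answer;
                return test.TryGetValue(c, out answer) && answer != ""
                    ? int.Parse(answer)
                    : 0; // unanswered questions don't contribute
            })
            .Max();
    }
}
```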


So my advice to you is this:

(1) Don't do any more coding until you've met with your data-scientist and he's done the first run through of data.

(2) Store your test in a Dictionary<string,string>. This will make it really easy to store and manipulate. Strong typing doesn't actually buy you anything in terms of bug-finding. Unit-tests on the data-analysis will do all that for you.

(3) Hand code your viewmodel and/or view for each different question type. You might use inheritance for your views, but it's not needed for your data or viewmodels.

(4) Remember localization!


Ultimately it just became a slog, having to code up fifteen different views. But I think that hand-crafted views for each question type are necessary to get a good user interface. I went through four complete rewrites of my views+viewmodels until I ended up with a way of doing it in XAML+code that felt clean.

ljw1004 fucked around with this message at 18:27 on Sep 28, 2014

Malcolm XML
Aug 8, 2009

I always knew it would end like this.


As someone who works in "big data" this is just standard statistics and experimental design, the kinds of stuff that has been used for decades and is still useful in 99.9% of cases. If your data fit into memory on a commodity machine (i.e. 1TB or less) then it isn't big, at least that's our criterion.

As ljw said, what you want is a data-driven system where you can take a question definition and project it through a view (e.g., for the ones you showed, maybe a MultipleChoiceView that takes Question and Answers properties, with a MultipleChoicePictureView that supports pictures as answers if needed, etc.)


epalm posted:

If my test project does a pretty good job of testing my services and visiting most code paths, but uses a database instead of mocking out repositories, does that make me a bad person?

For things that must be unique, like product names for example, I have to do stuff like product.Name = Util.RandomString(length: 20);

But the idea of creating like 40 mock repositories just sounds like a monumental amount of work.

Code to an interface, not an implementation, and allow a constructor overload that accepts that interface. Done. That's dependency injection.

I spent too drat long realizing that's all you ever need. Any containers or service locators or poo poo like that is just sugar to make it easy to wire up dependencies.
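The constructor-overload pattern in a nutshell, with made-up names: production code calls the parameterless constructor, tests inject a fake.

```csharp
using System;

public interface IClock { DateTime Now { get; } }

public class SystemClock : IClock
{
    public DateTime Now { get { return DateTime.Now; } }
}

// A fixed clock for tests; no mocking framework required.
public class FixedClock : IClock
{
    private readonly DateTime _t;
    public FixedClock(DateTime t) { _t = t; }
    public DateTime Now { get { return _t; } }
}

public class GreetingService
{
    private readonly IClock _clock;

    public GreetingService() : this(new SystemClock()) { } // production default
    public GreetingService(IClock clock) { _clock = clock; } // test seam

    public string Greet()
    {
        return _clock.Now.Hour < 12 ? "Good morning" : "Good afternoon";
    }
}
```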

Newf
Feb 14, 2006
I appreciate hacky sack on a much deeper level than you.

EssOEss posted:

I recommend a change of viewpoint. What you are calling views here are not actually views - they are different ways to model some entities. Views are a purely UI layer thing. Think of the different representations as parts of the model and orient your views around showing these different representations.

I think I understand your meaning here, but the point of having a single model displayed via different views is to help students discern the equivalence for themselves. A numerate person will instantly and thoughtlessly use multiplication in order to count the square tiles on a floor, but a young kid won't necessarily do that without first giving it some thought. The idea is to give them the experience to internalize the equivalence for themselves, rather than 'telling' them about it, and a way to measure for it is to measure their relative performance with "different views" of the same "model". It simplifies some of the logistics if there's a hard-wired relationship between these different views. Still considering options here though - more plumbing work is ongoing.

ljw1004 posted:

(1) Don't do any more coding until you've met with your data-scientist and he's done the first run through of data.

(2) Store your test in a Dictionary<string,string>. This will make it really easy to store and manipulate. Strong typing doesn't actually buy you anything in terms of bug-finding. Unit-tests on the data-analysis will do all that for you.

(3) Hand code your viewmodel and/or view for each different question type. You might use inheritance for your views, but it's not needed for your data or viewmodels.

(4) Remember localization!

1) This is a one-man show I'm afraid. On the plus side, it keeps me from designing a CMS / ViewModel that's incompatible with what the data-scientist wants the data to look like! I'm not actually learned in anything like 'data-science' (honestly I'm not even sure what that refers to) but I've done a bit of evolutionary programming and I've got a ton of experience working with low-numeracy students and I have a scheme in mind that I think can work. The short of it is that it's an evolutionary scheme where a bunch of software objects have 'opinions' on how different skills are related to one another, and are continually gambling on the performance of the users. The objects with better opinions about the relationships between skills will be better able to predict student performance, and be selected for. (They all have access to the students' prior performance records). I built a version of this scheme for arbitrary binary data for a course a year or so ago and it was extremely successful (if a little slow) at learning the relationships between the individual bits of a stream of bitstrings I threw at it :)

2-3) That's something to think about. I'm going to plow forward for the moment because I'm nearing the point of actually loosing my data scheme against the existing model, at which point I guess I'll step back and try to figure out whether the model will work (well) going forward.

4) You mean cultural / language wise? My intended audience for this is a few dozen kids who are part of an after-school program around the corner. I think that internationalization is maybe a ways off :)


Your app sounds neat. Thanks for all of the feedback everyone.

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction

Jewel posted:

I've never used NuGet before but I think I did it..? https://www.nuget.org/packages/Betwixt/

Sorry if I did something horribly wrong with that, NuGet seems to have problems with providing both x86 and x64..? Should I have split them into separate packages?

Something goes wrong when I try to install it into a new project:

code:
PM> Install-Package Betwixt
Installing 'Betwixt 1.2.0'.
Successfully installed 'Betwixt 1.2.0'.
Adding 'Betwixt 1.2.0' to App2.Windows.
Uninstalling 'Betwixt 1.2.0'.
Successfully uninstalled 'Betwixt 1.2.0'.
Install failed. Rolling back...
Install-Package : Failed to add reference to 'Betwixt'. Please make sure that it is in the Global Assembly Cache.
At line:1 char:1
+ Install-Package Betwixt
+ ~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Install-Package], InvalidOperationException
    + FullyQualifiedErrorId : NuGetCmdletUnhandledException,NuGet.PowerShell.Commands.InstallPackageCommand
Also, I recommend adding a note about nuget to the github readme.

Jewel
May 2, 2009

Factor Mystic posted:

Something goes wrong when I try to install it into a new project:

code:
PM> Install-Package Betwixt
Installing 'Betwixt 1.2.0'.
Successfully installed 'Betwixt 1.2.0'.
Adding 'Betwixt 1.2.0' to App2.Windows.
Uninstalling 'Betwixt 1.2.0'.
Successfully uninstalled 'Betwixt 1.2.0'.
Install failed. Rolling back...
Install-Package : Failed to add reference to 'Betwixt'. Please make sure that it is in the Global Assembly Cache.
At line:1 char:1
+ Install-Package Betwixt
+ ~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Install-Package], InvalidOperationException
    + FullyQualifiedErrorId : NuGetCmdletUnhandledException,NuGet.PowerShell.Commands.InstallPackageCommand
Also, I recommend adding a note about nuget to the github readme.

I have no idea what nuget wants from me, I've never personally used it and it's weird; can anyone help? :sigh:

Edit: I didn't realize I had License.txt in the lib folder accidentally, I only added it in 1.2 and I didn't see anyone complain before 1.2 but I don't know if that's because nobody told me it was wrong.

Try now..? It should be fixed in 1.3.3

Edit: I just split it up because nuget can't handle x64 and x86 assemblies, so now you can use "Install-Package betwixt_x64" if you're targeting x64. I should be done for good now, no more updates. Sorry to anyone who downloaded each iteration; there were about three of you, but I don't know if they were from here.

Jewel fucked around with this message at 01:31 on Sep 29, 2014

epswing
Nov 4, 2003

Soiled Meat

Malcolm XML posted:

Code to an interface, not an implementation, and allow a constructor overload that accepts that interface. Done. That's dependency injection.

I'm already doing this!

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction

Jewel posted:

I have no idea what nuget wants from me, I've never personally used it and it's weird; can anyone help? :sigh:

Edit: I didn't realize I had License.txt in the lib folder accidentally, I only added it in 1.2 and I didn't see anyone complain before 1.2 but I don't know if that's because nobody told me it was wrong.

Try now..? It should be fixed in 1.3.3

Edit: I just split it up because NuGet can't handle x64 and x86 assemblies, so now you can use "Install-Package betwixt_x64" if you're targeting x64. I should be done for good now, no more updates. Sorry to anyone who downloaded each iteration; there were about three of you, but I don't know if they were from here.

Cool, 1.3.3 installs.

Suggestion: your example in the GitHub readme should be "new Tweener&lt;float&gt;" (it's missing the type param). Also, does it make sense to add some additional overloads to Update for other numeric types (e.g. double)? Perhaps the time deltas you normally deal with are always floats and never doubles, in which case carry on. Adding a double overload would mean I can just pass it "0.1" and not "0.1f", which is pretty minor all things considered.
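That double overload could be as thin as a forwarding call. A sketch, with a stand-in Tweener so it compiles on its own (the real library's internals aren't shown here):

```csharp
using System;

// Stand-in for the library type, just so the sketch is self-contained.
public class Tweener<T>
{
    public float Elapsed { get; private set; }
    public void Update(float deltaTime) => Elapsed += deltaTime;
}

public static class TweenerExtensions
{
    // The suggested overload: forwards to the float-based Update so callers
    // can write tweener.Update(0.1) instead of tweener.Update(0.1f).
    public static void Update<T>(this Tweener<T> tweener, double deltaTime)
        => tweener.Update((float)deltaTime);
}
```

Overload resolution picks the extension method for a double literal because double doesn't implicitly narrow to float, so existing float callers are unaffected.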

I got this problem compiling a fresh project with your readme example:

quote:

There was a mismatch between the processor architecture of the project being built "MSIL" and the processor architecture of the reference "Betwixt", "x86".
This mismatch may cause runtime failures.
Please consider changing the targeted processor architecture of your project through the Configuration Manager so as to align the processor architectures between your project and references, or take a dependency on references with a processor architecture that matches the targeted processor architecture of your project.

I had been targeting AnyCPU, so I switched it to x86. Then it threw a FileNotFoundException:

quote:

System.IO.FileNotFoundException: Could not load file or assembly 'System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. The system cannot find the file specified.

File name: 'System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
at Betwixt.GenericMath.Subtract[T](T a, T b)
at Betwixt.Tweener`1.LerpFuncDefault(T start, T end, Single percent)
at Betwixt.Tweener`1.Calculate(T start, T end, Single percent, EaseFunc easeFunc, LerpFunc`1 lerpFunc)
at Betwixt.Tweener`1.Update(Single deltaTime)
at App2.MainPage..ctor()

Jewel
May 2, 2009

Factor Mystic posted:

Cool, 1.3.3 installs.

Suggestion: your example in the GitHub readme should be "new Tweener&lt;float&gt;" (it's missing the type param). Also, does it make sense to add some additional overloads to Update for other numeric types (e.g. double)? Perhaps the time deltas you normally deal with are always floats and never doubles, in which case carry on. Adding a double overload would mean I can just pass it "0.1" and not "0.1f", which is pretty minor all things considered.

I got this problem compiling a fresh project with your readme example:


I had been targeting AnyCPU, so I switched it to x86. Then it threw a FileNotFoundException:

I think that's saying it can't load .NET 3.5..? Maybe you're on 2.0 or something? I'm usually in the gamedev department, so I'm not used to these kinds of problems, because usually my code isn't shared :sweatdrop:

And hmm, I should fix those. It'd require recompiling the documentation too so hopefully people aren't sick of my constant updates..! (I sure am)

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction

Jewel posted:

I think that's saying it can't load .NET 3.5..? Maybe you're on 2.0 or something? I'm usually in the gamedev department, so I'm not used to these kinds of problems, because usually my code isn't shared :sweatdrop:

And hmm, I should fix those. It'd require recompiling the documentation too so hopefully people aren't sick of my constant updates..! (I sure am)

It was a Universal App. I tried Betwixt on a desktop .NET 4.5 app and it worked fine*, but that leads me to another question: Any reason this can't be a PCL?

* = the code from your readme "settles" on 10.00488. I would've expected that a tweener with an end value of "10" would not actually end with a value greater than 10. Maybe during an elastic bounce or something, but not after it's finished. Without a lot of gamedev experience, I can't tell if this is normal or not.

Jewel
May 2, 2009

Factor Mystic posted:

It was a Universal App. I tried Betwixt on a desktop .NET 4.5 app and it worked fine*, but that leads me to another question: Any reason this can't be a PCL?

* = the code from your readme "settles" on 10.00488. I would've expected that a tweener with an end value of "10" would not actually end with a value greater than 10. Maybe during an elastic bounce or something, but not after it's finished. Without a lot of gamedev experience, I can't tell if this is normal or not.

I set it to settle on the value it finishes on in the tween (i.e. the tween evaluated at 100% time). At first it was "settle on the end value", but that broke tweens that go towards the end value and then return to the start value (which there are totally valid cases for): they would snap to the end with the old method. So I went with having it end where it ends physically.
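In other words, the two strategies differ only in what the final sample is. A sketch with hypothetical names (not Betwixt's actual API):

```csharp
using System;

public static class SettleSketch
{
    // Old behaviour: snap straight to the requested end value.
    public static float SettleSnap(float start, float end) => end;

    // Current behaviour: evaluate the curve at t = 1, which respects eases
    // that deliberately overshoot or return to the start value.
    public static float SettleAtOne(float start, float end,
                                    Func<float, float> ease,
                                    Func<float, float, float, float> lerp)
        => lerp(start, end, ease(1f));
}
```

With a plain linear ease both strategies agree; with an ease whose curve doesn't end at exactly 1, only SettleAtOne reflects that.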

E: Also it can't be a PCL because I have no idea what they actually provide and don't provide. What would be the upsides/downsides? I might consider it.

Edit: vvv Yeah it does use reflection. The Generic Math needs it to support any type, I think; at least I'm moderately sure that's how the .NET Expression module works internally. I've been told this won't work on iOS, so yeah, I'm pretty sure it won't work as a PCL.
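The expression-based generic math being described looks roughly like this (a sketch of the technique, not Betwixt's actual code): an expression tree for (a, b) =&gt; a - b is built and compiled once per type T, and the compiled delegate is then as fast as a direct subtraction.

```csharp
using System;
using System.Linq.Expressions;

public static class GenericMath<T>
{
    // Built once per closed type T, then reused for every call.
    public static readonly Func<T, T, T> Subtract = BuildSubtract();

    private static Func<T, T, T> BuildSubtract()
    {
        ParameterExpression a = Expression.Parameter(typeof(T), "a");
        ParameterExpression b = Expression.Parameter(typeof(T), "b");
        // Expression.Subtract resolves the right operator for T at runtime,
        // which is why this needs reflection-like machinery under the hood.
        Expression body = Expression.Subtract(a, b);
        return Expression.Lambda<Func<T, T, T>>(body, a, b).Compile();
    }
}
```

The .Compile() call is what's reported to break on iOS, where JIT compilation isn't available to AOT-only runtimes.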

Also hm, I haven't actually used Any CPU much, that's a good point. I've always been cautious of it but you're saying it should work fine in this scenario?

Edit2: Okay, sorry, NuGet x64 people (and the x86 ones, for having so many updates): this is hopefully the final update I'll ever need to make to this. Grab the normal package now; it should contain the AnyCPU libs.

Jewel fucked around with this message at 05:08 on Sep 29, 2014

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Jewel posted:

E: Also it can't be a PCL because I have no idea what they actually provide and don't provide. What would be the upsides/downsides? I might consider it.

Does your code do anything with reflection, strange APIs from .NET 1.0, networking, threading, or inspecting expressions? No? It can be a PCL.

EDIT: Also, maybe I'm missing something, but why are you uploading x86 and x64 versions to NuGet instead of just targeting Any CPU?

Bognar fucked around with this message at 04:37 on Sep 29, 2014

Mr Shiny Pants
Nov 12, 2012

Bognar posted:


snipped the beginning out.
The big benefit of all this is when you add an entity to the model that should have its changes tracked. All you have to do is write one expression saying how to get from that item to the Company/Team/whatever and the changes are tracked automatically.

I was worried when building this, as I constantly am, about the performance implications of using a solution like this that does a significant amount of magically loading in extra entities. However, we have benchmarked it in multiple situations and only seen a 5-15% decrease in performance. I'm sure it could get significantly worse if you are joining on the entirety of your object model, but in the ways we use it I've seen no reason to expect that it will cripple the system.

Been on holiday; I just got back and saw this post, sorry if it's a bit late. If I've got it right, it almost seems like you've built an ORM (the defining of relationships and change tracking) on top of another ORM (EF). Why still use EF and not SQL directly if you do the change tracking yourself?

As for the item change tracking: Maybe something like event sourcing would have been a more natural fit? You write out all the events that happen to a model and store them in some list or table for later use.
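For what it's worth, the event-sourcing shape being suggested looks roughly like this (a generic sketch, not tied to their actual model):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal event-sourcing sketch: every change is appended as an immutable
// event, and current state is rebuilt by replaying the stream.
public record ChangeEvent(Guid EntityId, string Kind, DateTime At);

public class EventStore
{
    private readonly List<ChangeEvent> _events = new();

    public void Append(ChangeEvent e) => _events.Add(e);

    // Replay: an entity is "completed" if a Completed event was ever recorded.
    public bool IsCompleted(Guid entityId) =>
        _events.Any(e => e.EntityId == entityId && e.Kind == "Completed");

    public IEnumerable<ChangeEvent> History(Guid entityId) =>
        _events.Where(e => e.EntityId == entityId).OrderBy(e => e.At);
}
```

The list or table of events is the source of truth; current state is always derived from it.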

Mr Shiny Pants fucked around with this message at 13:40 on Sep 29, 2014

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
It's a bit excessive to say we've built an ORM.

The relationship definitions were originally designed as (and are still used as) a way to define how to get from one entity to another via traversing an arbitrary object graph. Most importantly, they live completely outside of the data layer - our data layer just uses them. However, because the relationships are implemented with Expressions, we are able to easily leverage EF's expression parsing to generate SQL. This would suck royal rear end if we were trying to do this manually.

The change tracking is not strictly per entity, like the functionality an ORM gives you. It tracks that changes made to a certain object are related in some way to other objects (e.g. a Bug object was marked as completed, that's related to a Bug, a Project, a Team, a User). Importantly, it doesn't have to explicitly track what changed, only that something changed. Adding in tracking of what changed, and then mapping that to SQL would, again, suck royal rear end.

The whole post was an explanation of how we leverage EF's expression parsing to let us do less work. Handling the SQL ourselves would just be more work.

Regarding event sourcing, we did discuss that option at the beginning of the project. I don't remember all the reasons we decided not to go with it, but one was that there are other significant parts of the application that do not require explicit change tracking. Using event sourcing for those portions would be overkill, but if we just used raw DB updates in those areas, we would have two separate models of interacting with the database.
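A sketch of what an expression-based relationship definition might look like (hypothetical types; none of this is their actual code). The point is that an Expression&lt;Func&lt;...&gt;&gt;, unlike a compiled delegate, stays inspectable, so an IQueryable provider like EF can translate the traversal into SQL joins:

```csharp
using System;
using System.Linq.Expressions;

// Hypothetical entities standing in for the real object model.
public class Team { public string Name { get; set; } }
public class Project { public Team Team { get; set; } }
public class Bug { public Project Project { get; set; } }

public static class Relationships
{
    // One expression describing how to walk from a Bug to its Team.
    // Handed to EF (e.g. via Select) this becomes SQL; calling .Compile()
    // gives the same traversal for in-memory objects.
    public static readonly Expression<Func<Bug, Team>> BugToTeam =
        bug => bug.Project.Team;
}
```

In a query this would be used like context.Bugs.Select(Relationships.BugToTeam); in memory, Relationships.BugToTeam.Compile()(someBug) walks the same path.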

Mr Shiny Pants
Nov 12, 2012
Cool, thanks for the explanation. I was just wondering :)

Mr Shiny Pants fucked around with this message at 16:59 on Sep 29, 2014

raminasi
Jan 25, 2005

a last drink with no ice
I've got a WPF slider that's bound to my viewmodel via a converter, and everything's working great, except that when the slider is set to zero, the converter sees null instead of a number. What the hell is that about?

Here's the XAML:
code:
<Slider Name="WwrNSlider"
    Height="23"
    Width="100"
    Value="{Binding Path=Settings, Converter={StaticResource WwrConverter}, ConverterParameter=WindowToWallRatioN, TargetNullValue=0.0}"
    IsEnabled="{Binding Path=AnySelection}"
    IsSnapToTickEnabled="True"
    Maximum="100"
    TickFrequency="10"
    TickPlacement="BottomRight" />
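For reference, a converter in that position implements IValueConverter along these lines (hypothetical body, not the real WwrConverter). One thing worth checking: TargetNullValue=0.0 makes WPF treat 0.0 and null as interchangeable at the target end of the binding, which may be exactly where the null at zero is coming from.

```csharp
using System;
using System.Globalization;
using System.Windows.Data;

// Sketch of a converter shaped like WwrConverter (hypothetical body).
public class WwrConverter : IValueConverter
{
    public object Convert(object value, Type targetType,
                          object parameter, CultureInfo culture)
    {
        // value is the Settings object; parameter names the ratio to read.
        if (value == null) return 0.0;
        var prop = value.GetType().GetProperty((string)parameter);
        return prop?.GetValue(value) ?? 0.0;
    }

    public object ConvertBack(object value, Type targetType,
                              object parameter, CultureInfo culture)
    {
        // A two-way slider binding calls this with the slider's double;
        // guard against null rather than assuming a number arrives.
        return value ?? 0.0;
    }
}
```

(Requires the PresentationFramework/WindowsBase assemblies, so it won't compile in a plain console project.)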

Che Delilas
Nov 23, 2009
FREE TIBET WEED
Set the "Minimum" property.

Edit: oh. you set it.
Double Edit: Goddammit I can't read.


raminasi
Jan 25, 2005

a last drink with no ice
No luck :(
