Sivart13
May 18, 2003
I have neglected to come up with a clever title

leper khan posted:

One of the problems with TDD is that neither of those benefits is necessarily realized by adopting it. The tests can have negative value, and the code doesn't actually need to be testable. These issues compound each other.
I like this article, which casts a skeptical eye on how TDD is hyped as a 'design technique'.

luchadornado
Oct 7, 2004

A boombox is not a toy!

awesomeolion posted:

Things like ease of use (e.g. for the caller of my code/API), the ability of other engineers to tell what the code is doing, maintainability, and the ability to reuse my code are all important. Also important is getting the people I work with to think that the code I write is well structured and architected, because they're the ones giving the feedback I need to improve, so I need to make them happy in a sense. I guess I could chase down more detailed feedback to figure out exactly what gave them a negative impression.

This is a deep pool with a lot of things that could potentially help; I can only offer what resonated with me:

- Distilling Domain-Driven Design down to the best parts. It's a big book and there's a lot of garbage out there, but if you grab a few things like event storming and minding your aggregates/domain boundaries, and avoid the low-ROI stuff, it can be a huge help

- "Design by guarantee", leaning on any guarantees you can get from the hardware, OS, runtime, etc. This could be letting your db make better use of os page cache, leveraging causality from Kafka, or just using the type system to provide guarantees so that its impossible for your code to represent a state that isn't representable in your domain: https://lexi-lambda.github.io/blog/2019/11/05/parse-don-t-validate/

- Grokking Simplicity. Even if you're not using Haskell or Clojure or something, separating your actions, calculations, and data in a functional way is a huge win. Once that feels good, expand the ideas to architecture (FCIS/Onion/Hexagonal/whatever): https://www.destroyallsoftware.com/screencasts/catalog/functional-core-imperative-shell

A lot of this comes down to reducing accidental complexity in our software, and it's hard to do a better job than Rich Hickey when talking about it: https://www.youtube.com/watch?v=LKtk3HCgTa8&t=2s
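
To make the parse-don't-validate idea concrete, here's a minimal Python sketch (the NonEmptyList type and both functions are hypothetical): parse raw input once at the boundary into a type that can't represent the invalid state, so downstream code gets a guarantee instead of re-checking.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonEmptyList:
    """A list guaranteed non-empty by construction."""
    head: int
    tail: tuple[int, ...]

def parse_non_empty(xs: list[int]) -> NonEmptyList:
    """Parse at the boundary: fail here, once, instead of validating everywhere."""
    if not xs:
        raise ValueError("expected at least one element")
    return NonEmptyList(head=xs[0], tail=tuple(xs[1:]))

def maximum(xs: NonEmptyList) -> int:
    # No empty-list check needed: the type guarantees an element exists.
    return max((xs.head, *xs.tail))
```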

StumblyWumbly
Sep 12, 2007

Batmanticore!

awesomeolion posted:

Anyone have tips for how to improve my architecture and design skills? I tend to jump in and try stuff rather than sitting back in my rocking chair and comparing every possible architectural option. So that's one issue, I think. But when I sit there brainstorming architecture options, it feels like it doesn't matter and I get sleepy.

I have been writing unit tests this year for the first time. It's helping me understand why protocols and abstractions are so important. So maybe I should learn more about TDD and try to commit more to that style?

I've read sections of Designing Data-Intensive Applications and it's not really relevant to me as an iOS engineer. A lot of suggestions from books like Clean Code are just kind of cringe and feel counterproductive... I don't want 70 two-line functions, thanks. Anyway, if there are books or methods for improving my software design skills, please let me know; preferably nothing by Robert Martin or backend/server focused.

Have you tried reading up on Design Patterns? I'm mostly in a different world from most software work, so I can't really follow a lot of the standard patterns, but I think it was helpful for getting a different view and changing my thinking.

LLSix
Jan 20, 2010

The real power behind countless overlords

TooMuchAbstraction posted:

The value of TDD is twofold. First is the tests themselves, of course, which help catch regressions before they hit your clients. But second is that they force you to write testable code. If your code is a mess of long-lived threads and classes that reach deep into each other's pockets, then you have a lot of work to do before you can start writing useful tests.

Ideally (for TDD), all of your code should be organized into relatively small units that each have an "API" encompassing the entirety of their dependencies. That is, each exposed function is stateless aside from the state its parameters provide. That lets you write unit tests by injecting mock dependencies.

I agree with all of this.

I've worked at... probably too many places, because I spent half my career as a contractor. All the places I've worked that rely on unit tests and enforce that they pass have had smooth releases. All the places that don't use unit tests have had rough releases and lots of overtime. At one company I was at briefly, they had a policy of requiring unit tests to be written, but management also had a habit of okaying PRs with failing unit tests. Every single release where the unit tests passed was smooth. Every single release with a PR that failed unit tests was rough. In one particularly memorable case, an every-other-month release was delayed more than 3 months by bugs from a change that commented out failing unit tests. I left for greener pastures shortly afterwards.
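
To make the quoted point about injected dependencies concrete, here's a minimal sketch using Python's standard unittest.mock (all names are hypothetical): because the mailer is injected rather than constructed inside the class, the test can swap in a mock and assert on behavior.

```python
from unittest.mock import Mock

class ReportService:
    """Depends on an injected mailer rather than constructing one itself."""
    def __init__(self, mailer):
        self.mailer = mailer

    def send_summary(self, user, total):
        if total > 0:
            self.mailer.send(user, f"Your total is {total}")

def test_send_summary_mails_positive_totals():
    mailer = Mock()  # stands in for the real dependency
    ReportService(mailer).send_summary("alice", 42)
    mailer.send.assert_called_once_with("alice", "Your total is 42")
```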

Bruegels Fuckbooks
Sep 14, 2004

Now, listen - I know the two of you are very different from each other in a lot of ways, but you have to understand that as far as Grandpa's concerned, you're both pieces of shit! Yeah. I can prove it mathematically.

gbut posted:

I've found that the whole idea of TDD does not really work in large orgs, especially if those orgs like to shuffle engineers around between teams every 6 months or so. It all ends up as a soup of low-hanging-fruit tests tightly coupled to the implementation, mocks that don't implement the service interfaces well, and business-crucial stuff that gets skipped because it's too hard to test. I work in the web space, so take this with a giant Himalayan salt rock.

TDD was popularized in the web space, and the big proponents seem to come from dynamically typed language backgrounds (e.g. Ruby/PHP). I think some of the core ideas are nice, but then you run into test suites that are mocks testing mocks while the service they're testing doesn't actually loving work, and you wonder what the point is.

Xarn
Jun 26, 2015

LLSix posted:

At one company I was at briefly, they had a policy of requiring unit tests to be written, but management also had a habit of okaying PRs with failing unit tests.

How does that even work? I don't think I ever worked anywhere where management had any idea what the PR gate status was. Sure, they know whether it's merged or not, and whether there are blockers or not, but whether it got merged was never up to them.

Xarn
Jun 26, 2015

Sivart13 posted:

I like this article, which casts a skeptical eye on how TDD is hyped as a 'design technique'.

Yeah, that mostly matches up with my own thoughts and experiences.

spiritual bypass
Feb 19, 2008

Grimey Drawer

Xarn posted:

How does that even work? I don't think I ever worked anywhere where management had any idea what the PR gate status was. Sure, they know whether it's merged or not, and whether there are blockers or not, but whether it got merged was never up to them.

I work at a place where technically skilled VPs (real ones, at a bigcorp) regularly intervene to solve technical problems. Seems like they do this because they don't know how to do their actual job, so it makes for a pleasant distraction.

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost
waynes deffo generally a good read. ive been trying to get him to find some peeps who moved from computer-touching to trad-engineering, the converse of his 'crossover project' (where he looked at peeps who moved from trad-engineering to computer-touching), to get the peeps who complain about computer-touching not being engineering to shut up

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
The people submitting the PR that is getting rejected complain to the reviewer's manager that the reviewer is gating them.

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


bob dobbs is dead posted:

waynes deffo generally a good read. ive been trying to get him to find some peeps who moved from computer-touching to trad-engineering, the converse of his 'crossover project' (where he looked at peeps who moved from trad-engineering to computer-touching), to get the peeps who complain about computer-touching not being engineering to shut up

That just can't be a really big population though.

Steve French
Sep 8, 2003

TDD is not the same thing as having [unit] tests. I don’t think anyone is arguing against having tests.

Hadlock
Nov 9, 2004

Tests are bad for job security, particularly for the QA department

StumblyWumbly
Sep 12, 2007

Batmanticore!
Focusing on unit tests to the point that you ignore top-level tests is bad.

thotsky
Jun 7, 2005

hot to trot

Steve French posted:

TDD is not the same thing as having [unit] tests. I don’t think anyone is arguing against having tests.

I would argue against having coverage-focused unit tests. Having some integration tests is nice, but it is possible to do that wrong as well.

bob dobbs is dead
Oct 8, 2017

I love peeps
Nap Ghost

ultrafilter posted:

That just can't be a really big population though.

he already looked for months for the original thing and found absolutely no-one

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
Coverage metrics are a really good way to identify the most poorly-written and hard-to-maintain parts of your codebase.

At least, they are as long as no-one decides to make "improving test coverage" a goal in and of itself. At that point Goodhart's Law applies, and it loses its value as a metric.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

StumblyWumbly posted:

Focusing on unit tests to the point that you ignore top-level tests is bad.

What is a "top-level" test?

StumblyWumbly
Sep 12, 2007

Batmanticore!

New Yorp New Yorp posted:

What is a "top-level" test?

Integration or End to End, depending on where you draw the line.

My company added a lot of SW tests pretty late, and the SW folks decided that the "pure" approach was to write complete tests for each function, starting with the ones that were easiest to test. The main SW took in files and processed them in a variety of ways, and I think we could have gotten more utility more quickly if we had just run some known files with known outputs through it, to make sure new features did not break the existing ones.
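
A minimal sketch of that known-input/known-output ("golden file") style in Python; the tests/golden directory and the process_file entry point are hypothetical stand-ins for the real pipeline:

```python
from pathlib import Path

# process_file is the hypothetical entry point of the file-processing pipeline.
from pipeline import process_file

GOLDEN_DIR = Path("tests/golden")

def test_known_inputs_produce_known_outputs():
    # Each .in file has a checked-in .out file capturing the blessed result.
    for input_path in sorted(GOLDEN_DIR.glob("*.in")):
        expected = input_path.with_suffix(".out").read_text()
        actual = process_file(input_path.read_text())
        assert actual == expected, f"{input_path.name} regressed"
```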

TooMuchAbstraction
Oct 14, 2012

I spent four years making
Waves of Steel
Hell yes I'm going to turn my avatar into an ad for it.
Fun Shoe
It's often the case that end-to-end tests are expensive to run or slow, while unit tests are cheap and fast. If your E2E tests are fast, then sure, just use those. You lose some specificity when a test fails, but odds are you can figure out what broke by looking at what the PR changed.

My mantra generally is "code that is not tested does not work." It's not 100% true, but it comes within spitting distance.

StumblyWumbly
Sep 12, 2007

Batmanticore!

TooMuchAbstraction posted:

It's often the case that end-to-end tests are expensive to run or slow, while unit tests are cheap and fast. If your E2E tests are fast, then sure, just use those. You lose some specificity when a test fails, but odds are you can figure out what broke by looking at what the PR changed.

My mantra generally is "code that is not tested does not work." It's not 100% true, but it comes within spitting distance.
Yeah, you're spot on. I'd argue that if you're coming into a situation with no tests, starting with the end-to-end tests is worth the implementation time, even if you get less resolution on what causes the errors. Unit testing still has value, but I get frustrated with folks who are caught up in some "unit testing purity" thing.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
An excessive focus on unit tests can also leave you with a pile of "change detector" tests that break whenever you make an intentional change to the code under test, creating busywork for everyone while providing no ability to detect actual unintended regressions.
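
A hedged illustration of the difference (all names hypothetical): the first test pins how the code works internally and breaks on any refactor; the second asserts only the observable result.

```python
from unittest.mock import Mock

def apply_discount(price, rules):
    # Hypothetical function under test: applies pricing rules in order.
    for rule in rules:
        price = rule.apply(price)
    return price

def test_change_detector_style():
    # Brittle: pins *how* the code works. Merging or reordering rules
    # breaks this test even when the final price is still correct.
    rule = Mock()
    rule.apply.return_value = 90
    assert apply_discount(100, [rule]) == 90
    rule.apply.assert_called_once_with(100)

def test_behavioral_style():
    # Robust: asserts *what* the code produces for a real rule.
    class TenPercentOff:
        def apply(self, price):
            return price * 0.9
    assert apply_discount(100, [TenPercentOff()]) == 90
```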

prom candy
Dec 16, 2005

Only I may dance
I've often found that focusing on writing really (unit) testable code results in code that's harder to reason about initially because of all the dependency injection and tiny testable units you're creating. Sometimes the clearest way to do something is with procedural programming and the best way to test it is with an integration test.

New Yorp New Yorp
Jul 18, 2003

Only in Kenya.
Pillbug

TooMuchAbstraction posted:

It's often the case that end-to-end tests are expensive to run or slow, while unit tests are cheap and fast. If your E2E tests are fast, then sure, just use those. You lose some specificity when a test fails, but odds are you can figure out what broke by looking at what the PR changed

Brittleness is the real problem. Even the best-written E2E tests are going to be in a constant state of brokenness, and it just gets amplified the more of them you have. They're great as long as you have a limited number that verify that a few critical paths don't explicitly blow up. But validating actual application logic is best done at a deeper level.

I'm in the "a single broken test is an immediate problem" camp, so the idea of having suites of brittle, flaky, slow E2E tests just makes me shudder. I've lived that before. No thanks.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

New Yorp New Yorp posted:

Even the best-written E2E tests are going to be in a constant state of brokenness,

Uh, what? We have end-to-end tests that almost never get broken because about the only thing that breaks them is if the user flow that they're testing is actually not working - in which case your PR is broken and should not be merged!

Love Stole the Day
Nov 4, 2012
Please give me free quality professional advice so I can be a baby about it and insult you
Hot take: instead of writing unit tests for each method of each class in the entire project... just write unit tests from the entry point of your application and cover everything by passing in all of the use cases for your entry point. The only things you should mock are your downstream calls.

That way, it works just like an integration test but it's cheap and fast.

Even if your component is 500K lines of code, that's why we have debuggers. If you really can't cover some code in this way then either you don't understand how your component works or that code can be safely removed.

Maybe the only exception to this could be the middleware classes that intercept incoming and outgoing requests to the downstream components, because you're mocking their responses.
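
A minimal sketch of that style (handle_request and the payments client are hypothetical): every use case goes through the one entry point, and the downstream dependency is the only mock.

```python
from unittest.mock import Mock

# Hypothetical application code: one entry point, one downstream dependency.
def handle_request(request, payments):
    if request["action"] == "charge":
        payments.charge(request["user"], request["amount"])
        return {"status": "charged"}
    return {"status": "ignored"}

def test_charge_use_case_through_the_entry_point():
    payments = Mock()  # the only mock: the downstream call
    resp = handle_request({"action": "charge", "user": "bob", "amount": 5}, payments)
    assert resp == {"status": "charged"}
    payments.charge.assert_called_once_with("bob", 5)

def test_unknown_action_through_the_entry_point():
    payments = Mock()
    assert handle_request({"action": "noop"}, payments) == {"status": "ignored"}
    payments.charge.assert_not_called()
```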

fight me irl

Bruegels Fuckbooks
Sep 14, 2004

Now, listen - I know the two of you are very different from each other in a lot of ways, but you have to understand that as far as Grandpa's concerned, you're both pieces of shit! Yeah. I can prove it mathematically.

Jabor posted:

Uh, what? We have end-to-end tests that almost never get broken because about the only thing that breaks them is if the user flow that they're testing is actually not working - in which case your PR is broken and should not be merged!

it depends on what you mean by "end-to-end" and how the tests are automated. qa managers can get real philosophical about how stuff is automated and demand that the automation work just as a user would (which somehow means locating buttons in the user interface, clicking them, and pressing keyboard keys, as if generating window messages through mouse/keyboard events were a more "natural" way of testing than exposing an automation api). this leads to extremely simplistic, brittle automation that is effectively a bunch of macros that break whenever your ui designers move buttons around, as they are wont to do.
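
For a web UI, that brittleness usually lives in the selectors. A hedged Selenium sketch (the page URL and data-testid hook are hypothetical, assuming the app exposes a stable automation attribute):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/signup")  # hypothetical page

# Brittle: this XPath encodes the page layout, so it breaks the moment
# a designer moves the button:
#   driver.find_element(By.XPATH, "/html/body/div[2]/div/form/div[3]/button")

# Sturdier: key off a stable automation hook the team controls.
driver.find_element(By.CSS_SELECTOR, "[data-testid='signup-submit']").click()
driver.quit()
```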

CPColin
Sep 9, 2003

Big ol' smile.
This gets especially obnoxious in a world where people think doing an A/B test between a button that says "Sign up now!" and "Sign up today!" is a worthwhile thing to do.

gbut
Mar 28, 2008

😤I put the UN🇺🇳 in 🎊FUN🎉


"but it improved the sign up by 1.7% [on that other page that's doesn't use that button]!"

Steve French
Sep 8, 2003

Love Stole the Day posted:

Hot take: instead of writing unit tests for each method of each class in the entire project... just write unit tests from the entry point of your application and cover everything by passing in all of the use cases for your entry point. The only things you should mock are your downstream calls.

That way, it works just like an integration test but it's cheap and fast.

Even if your component is 500K lines of code, that's why we have debuggers. If you really can't cover some code in this way then either you don't understand how your component works or that code can be safely removed.

First: tests from the entry point of your application are fundamentally not unit tests, so if we care about terminology then let’s not call them that.

Second: comprehensively testing all the important code in your application solely through the top-level entry point would require an absolutely absurd quantity and complexity of tests for any moderately complex application/product.

qsvui
Aug 23, 2003
some crazy thing

Hadlock posted:

Tests are bad for job security, particularly for the QA department

must be nice having a QA department

gbut
Mar 28, 2008

😤I put the UN🇺🇳 in 🎊FUN🎉


Capitalism is a race to the bottom, and you don't need quality there.

I've worked at only one place that had a QA team, and that team was dismantled shortly after I started. I've never worked anywhere with a dedicated QA engineering team.

thotsky
Jun 7, 2005

hot to trot
Happy to learn absolutely no progress has been made on testing since I got into the game 10 years ago.

Sivart13
May 18, 2003
I have neglected to come up with a clever title

thotsky posted:

Happy to learn absolutely no progress has been made on testing since I got into the game 10 years ago.
there's no such thing as linear progress in something so subjective, just many different orthodoxies pushing in every direction

barkbell
Apr 14, 2006

woof
the less code you write the fewer tests you need

CPColin
Sep 9, 2003

Big ol' smile.

gbut posted:

"but it improved the sign up by 1.7% [on that other page that's doesn't use that button]!"

And this, of course, is an improvement of 1.7% on the A page's 0.007% conversion rate, so it's obviously statistically relevant!

Hadlock
Nov 9, 2004

qsvui posted:

must be nice having a QA department

Right up until there's any kind of financial distress, then they all get laid off

csammis
Aug 26, 2003

Mental Institution

barkbell posted:

the less code you write the fewer tests you need

This goon gets it

gbut
Mar 28, 2008

😤I put the UN🇺🇳 in 🎊FUN🎉


Probably why TDD evangelists stop caring about tests when they move into management.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
A lot of the problem with all of these methodologies is that you can get them to work with reasonably experienced, motivated people who aren't bound to them as a metric, but they'll fail for unmotivated or inexperienced people--or they'll fail if they're made a metric in some way.

I have to deal with people in a hardware QA organization who write code, and testing that code is not only alien but anathema to them. These are people writing code to test things. You'd think somebody would have to come in and yell at them for trying too hard to test and cover everything, but instead every concept of testing might as well be written in a moon language.
