|
I think I need some reassurance that there are places that test their code outside of just poking around in the UI manually. I've been through a few employers now who gave great sounding answers to questions about their automated testing in interviews and then had zero tests in practice, at least on the stuff I've worked on. This then leads to confrontations with management about how I'm too cautious and should just take some risks, and the obvious response that with no safety net at all, I have no idea what I might be breaking outside of what I'm working on.
|
# ? Nov 4, 2018 14:14 |
|
|
|
Yes, there are places that do automated UI tests. The best line to use for management about this is "all code is tested eventually. The only question is whether it's in development, where it's quick and easy to fix and the customer never sees the problem, or in production, where the customer has found the problem for you." Volmarias fucked around with this message at 17:19 on Nov 4, 2018 |
# ? Nov 4, 2018 14:23 |
|
I've worked at a place that had a super-extensive suite of regression tests. I guess bad employers find it easy to lie about their lovely test suites and other bad practices.
|
# ? Nov 4, 2018 15:17 |
|
At my last job interview (after being burnt by agency work) I asked to see some of their tests. The dev who was interviewing me let me poke around on his laptop and was even apologetic, saying something like "We have good coverage for all the happy paths on the API so we know if something breaks there, but we don't have a lot of UI tests or tests for our error cases; we've been meaning to work on it". It was his attitude around testing that reassured me more than anything. It's enough (for me) that the test suite is configured well enough that it can be run easily in dev and a passing test suite is required for merging (or whatever the equivalent is). Asking generally about how code makes it to production is also a useful question that will yield ...let's say interesting results. There are no "rules" against asking to see code in interviews - some companies can be cagey about letting you see their stuff without an NDA, but fine, I'll sign it, I'm not trying to steal company secrets. If they don't let you see it at all, I figure it's garbage and act accordingly.
|
# ? Nov 4, 2018 17:02 |
|
The Leck posted:I think I need some reassurance that there are places that test their code outside of just poking around in the UI manually. I've been through a few employers now who gave great sounding answers to questions about their automated testing in interviews and then had zero tests in practice, at least on the stuff I've worked on. This then leads to confrontations with management about how I'm too cautious and should just take some risks, and the obvious response that with no safety net at all, I have no idea what I might be breaking outside of what I'm working on. While automated testing is a useful practice, it's not a silver bullet. Good automated testing requires designing your product to be testable, being able to run the tests, making the tests actually test something useful, and having the tests produce useful results. Not all projects are worth the effort it would take to automate the testing. The problem is that there is enough of a bias towards automation and emphasis on it as a best practice that no one will ever say "no, we don't generally write automated tests, you can do so if you want" in an interview for fear of seeming out of touch. It sucks. I work in an environment with automated tests written by both developers and QA engineers, and it's kind of a pain in the rear end to do anything without some computer yelling at you because you're doing the right thing now as opposed to the test that formalizes the wrong way of doing something. As for breaking poo poo - even with automated test suites, you're going to gently caress poo poo up. If there are tests already, I'll fix them up and add my new scenarios, and I'll shoot out code reviews if I'm working in a component I'm not familiar with or not sure about something, but I generally prefer to break poo poo quickly without agonizing about it. I mean, if they really cared about the code, it would've had tests anyway, right?
|
# ? Nov 4, 2018 17:11 |
|
Your org and code base is broken if you can't write tests without other things breaking. That's no excuse to write off tests. Sure, there's a trade off to writing tests, but unless you are a team of one working on disposable code the investment always pays off. Telling you poo poo is broken is just a first order effect. The second order effect of documenting some code for your team to build a shared understanding is arguably just as important.
|
# ? Nov 4, 2018 20:18 |
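To illustrate the "tests as documentation" point above: a minimal sketch in Python's stdlib unittest, where the test names themselves read as a spec a teammate can skim. The `normalize_username` function and all names here are invented for illustration.

```python
import unittest

def normalize_username(raw: str) -> str:
    """Lowercase, trim, and collapse internal whitespace."""
    return " ".join(raw.strip().lower().split())

class TestNormalizeUsername(unittest.TestCase):
    # Each test name states one rule; together they document the behavior.
    def test_trims_surrounding_whitespace(self):
        self.assertEqual(normalize_username("  alice  "), "alice")

    def test_lowercases(self):
        self.assertEqual(normalize_username("Alice"), "alice")

    def test_collapses_internal_whitespace(self):
        self.assertEqual(normalize_username("a   b"), "a b")

# Run with: python -m unittest <this_file>
```

Even if `normalize_username` never breaks, the test class is the shared understanding the post describes: it states the rules without anyone having to read the implementation.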
|
Browser automation for testing is well-supported. Something like Cypress or Capybara reduces a lot of the initial friction. Tests like this are slow, and if you're trying to bolt it onto an application that was already poorly designed you may run into difficulties, so you have to write them differently than you would write a unit test, but any app with a UI would benefit from having them.
|
# ? Nov 4, 2018 20:39 |
|
I've found browser automation testing to be very brittle and maintenance-heavy compared to testing at any lower level. Cypress has its place but I'd be inclined to keep usage of it to an absolute minimum. Bruegels Fuckbooks posted:it's kind of a pain in the rear end to do anything without some computer yelling at you because you're doing the right thing now as opposed to the test that formalizes the wrong way of doing something I'm okay with changing the test at the same time as the implementation if the test is obviously testing for the wrong behaviour. Of course exactly 50% of the time when I do that it turns out that some other component was relying on that wrong behaviour and the test itself was there for a good reason.
|
# ? Nov 4, 2018 22:14 |
|
The Leck posted:I think I need some reassurance that there are places that test their code outside of just poking around in the UI manually. I've been through a few employers now who gave great sounding answers to questions about their automated testing in interviews and then had zero tests in practice, at least on the stuff I've worked on. This then leads to confrontations with management about how I'm too cautious and should just take some risks, and the obvious response that with no safety net at all, I have no idea what I might be breaking outside of what I'm working on.
|
# ? Nov 5, 2018 01:08 |
|
2nd Rate Poster posted:Your org and code base is broken if you can't write tests without other things breaking. That's no excuse to write off tests. This. Plus testing your code forces you to write testable code, which is generally better than code that isn't testable.
|
# ? Nov 5, 2018 08:30 |
|
They removed developer tools from our browsers due to a security requirement. DoD jobs in a nutshell.
|
# ? Nov 5, 2018 17:35 |
|
We write UI-based tests for our regression tests because the only way to get some data is from our administrative console. The effort to just add some privileged endpoint didn't seem plausible for reasons that boggle the mind. In the same set of tests we have to have direct database access from our Jenkins nodes to confirm writes to our data warehouse after certain events happen (these tests also directly poll and are in big, gigantic loops instead of anything like promises). This is a great case for setting up one of the many REST API generators from schema options instead of wasting hours and tons of dollars setting up Jenkins nodes that start up Google Chrome and Webdriver tooling to scrape two fields. At a certain point it'd have been easier to just run httpclientrequest and scrape, but we have more ops engineers than app developers, so they win anytime they need help making anything easier for them.
|
# ? Nov 5, 2018 18:27 |
|
Rubellavator posted:They removed developer tools from our browsers due to a security requirement. DoD jobs in a nutshell. How do you get anything done? Or are you just not doing web development?
|
# ? Nov 5, 2018 19:57 |
|
Alert dialogs everywhere, I'm sure
|
# ? Nov 5, 2018 21:29 |
|
LLSix posted:How do you get anything done? Or are you just not doing web development? I'm one of a few in a team that get to develop on a closed network with physical boxes, everyone else is doing their work on VMs. It's basically a work stoppage for a lot of people and the government client is trying to get an exemption.
|
# ? Nov 5, 2018 21:49 |
|
Hey, head-in-sand (clarification: as in, if devs don't have the tools to see problems, they aren't problems) is a perfectly valid way to deal with security problems with software you're developing. It's right up there with knowing about a huge security flaw and ignoring it for years in favor of feature work.
|
# ? Nov 5, 2018 22:45 |
|
prisoner of waffles posted:Quick q: if a place does "agile" with sprints but has no sprint planning meetings or retrospectives, how bullshit does that make them? I hate the term retrospective, but they should be done to have the team self-correct anything that's bad with the current process. Agile in and of itself sucks. The processes, ceremonies, and its dogshit contemporaries like SAFe, etc. are just snake oil from management consultants that took a solid idea (agile software development) and commoditized it into mythology. Companies eat them up hoping that they are a silver bullet for all of their problems.
|
# ? Nov 8, 2018 15:37 |
|
Wow it's almost like there's no recipe that will make your business automatically successful
|
# ? Nov 8, 2018 15:46 |
|
Vulture Culture posted:Wow it's almost like there's no recipe that will make your business automatically successful You can’t sell that concept to failing managers though.
|
# ? Nov 8, 2018 15:58 |
|
geeves posted:I hate the term retrospective, but they should be done to have the team self-correct anything that's bad with the current process. Agile in and of itself is fine. The ceremony and processes and attempts to formalize Agile development into a recipe you can follow are what you don't like. Some companies need the ceremony to act as guardrails while they change their way of thinking about software development. Some companies think the ceremony is a magic incantation that you can invoke to be better at delivering working software. "Now we must perform the Rite of Retrospective!", but they either focus on the wrong things in the retrospective, or acknowledge, "X didn't go well" and then actively fight/ignore any further conversation about or attempts to improve X. [edit] I've seen companies go both ways with Agile. Some adopt the mindset and whatever parts of the ceremony helps them be successful, and they're successful. Some adopt the ceremony, retain their existing waterfall-y mindset, and fail. We've all seen the companies that say "We're doing Agile now. We've planned our sprints out for the next 3 years with milestone sprints every 3 months." New Yorp New Yorp fucked around with this message at 16:23 on Nov 8, 2018 |
# ? Nov 8, 2018 16:20 |
|
The biggest issue I've seen with retrospectives not going well is that companies have no plan and no expectations before they build something, so there's no yardstick to measure against. Without a clear definition of success or failure, there's no way to see how the team outperformed or underperformed expectations, and the retro is just an invitation for people to complain about things or advocate for their favorite cargo cult behaviors. If you want retros to go well, the team needs to agree upon and communicate why the thing is being built, and how they expect it to improve outcomes for the business. The loop needs to be closed and those gauges need to be measured after some agreed-upon length of time. The team needs to have opinions on whether what was delivered had the business impact it was supposed to have. Once you know the what, then there's a valuable why to be discussed. Technical people, like engineering team leads and devs-turned-scrum-masters, tend to be awful at the business side of things. One of the things I really like about OKRs is that they give everyone in the org a fallback set of metrics, so they can see if the needle is moving in the right direction without having to design the needle themselves. Vulture Culture fucked around with this message at 16:42 on Nov 8, 2018 |
# ? Nov 8, 2018 16:38 |
|
I couldn't stand the retros because there was one person on the team who was clearly loving everything up and massively slowing us down and no amount of Socratic suggestion got through to them and I didn't feel comfortable slamming my fists on the table and yelling, "Look, stop loving everything up!" Retros also go better when you actually set a sprint goal during planning, which we never did. At my brief stint at my previous job, our Scrum Master had us set the goal by asking us to pretend we're at the end of the sprint, the sprint went perfectly, and we're looking back on what happened that made it work so well. That clicked better in my brain than the other perspective of, "What do we want to accomplish in the next three weeks?"
|
# ? Nov 8, 2018 16:45 |
|
Vulture Culture posted:Wow it's almost like there's no recipe that will make your business automatically successful leper khan posted:You can’t sell that concept to failing managers though. When scrum masters, PMs and VPs are arguing about the number of points instead of the quality of value - I just tune out. I'm just sick of arguing points and t-shirt sizes and fisting of fives and burndowns and would just like to get back to actually enjoying development.
|
# ? Nov 8, 2018 16:48 |
|
As a general rule, keep upper management out of retros. If there's someone there who could punish you for commenting on the Emperor's nudity, there's no way your team can do something about the way the Imperial Dong's majestic flopping is distracting you from your work. If someone of excessive rank is present, it means that instead of a retro, the meeting you're actually having is for explaining the value of whatever measurement they're using as an approximation of your productivity. That doesn't need the whole team, and it doesn't really help anyone to do it, and if you call this activity "retro" then the nomenclature is probably preventing you from doing other activities that would be useful for course correction. Bongo Bill fucked around with this message at 18:29 on Nov 8, 2018 |
# ? Nov 8, 2018 18:21 |
|
geeves posted:When scrum masters, PMs and VPs are arguing about the number of points instead of the quality of value - I just tune out. The last two weeks I was looking for a new contracting gig (I found one), I was often asked for my hourly rate. "My base rate is X. If I need to get into my car to get to work (can't bike there), my rate is X+5. If I am in a lot of meetings, it is X+10. Both can apply. This will make me expensive for roles I don't really want, which is the whole point." It worked well, I am now going to work in a team that knows I don't like meetings but will work for the money they can budget for.
|
# ? Nov 8, 2018 18:53 |
|
The fundamental problem I have with an Agile process where I work is they decided that applying it to a bunch of electrical engineers turned them all into software developers.
|
# ? Nov 8, 2018 20:18 |
|
I say 'agile is a set of techniques we can choose from to make us more effective' and what gets heard is 'agile is a set of buzzwords we can use to tell our customers we are cheaper', and they're not quite the same thing
|
# ? Nov 8, 2018 20:30 |
|
Bongo Bill posted:As a general rule, keep upper management out of retros. If there's someone there who could punish you for commenting on the Emperor's nudity, there's no way your team can do something about the way the Imperial Dong's majestic flopping is distracting you from your work. When I have bothered to attend the retrospective, "what went well" has consisted of really inane stuff like "the 3 day weekend was really fun". The manager spent a lot of effort building elaborate rules for story points (fibonacci only), stories, etc. The rules constantly changed and were never written down until recently, so I was constantly getting chided for doing things "wrong". It's all so dumb.
|
# ? Nov 8, 2018 23:40 |
|
We had some TDD training. Here was someone’s takeaway:quote:Looking forward to team having tasks in rally against each which captures Test Case first and then actual development
|
# ? Nov 9, 2018 14:04 |
|
xiw posted:I say 'agile is a set of techniques we can choose from to make us more effective'and what gets heard is 'agile is a set of buzzwords we can use to tell our customers we are cheaper' and they're not quite the same thing People preaching agile made me understand dadaism. Paul Feyerabend posted:A Dadaist is utterly unimpressed by any serious enterprise and he smells a rat whenever people stop smiling and assume that attitude and those facial expressions which indicate that something important is to be said. A Dadaist is convinced that a worthwhile life will arise only when we start taking things lightly and when we remove from our speech the profound but putrid meanings it has accumulated over the centuries ("search for truth"; "defense of justice"; "passionate concern"; etc. etc.)
|
# ? Nov 9, 2018 14:22 |
|
smackfu posted:We had some TDD training. Here was someone’s takeaway: I don't even understand what they are saying.
|
# ? Nov 10, 2018 07:08 |
|
It's hard to parse, but I interpreted it as "the devs will just task up and race to make tests green instead of doing ~real~ development". Which is of course a super hot take because 1. Why would you create tasks against a user story which don't progress you towards delivering that user story? 2. There is no magical entity that is going to write the unit test that you are trying to make green; you -the dev- do it based on achieving the functionality you are trying to deliver. 3. The act of making those tests go green is the literal ~real~ development. Like, they're green so uh, congrats, you wrote the function?
|
# ? Nov 10, 2018 09:25 |
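The red/green cycle described above can be sketched in a few lines. This is a hedged illustration, not anyone's actual training material: the test class is what you would write first (red, since `fizzbuzz` doesn't exist yet), and the function below it is the "just enough to go green" step.

```python
import unittest

# Green step: the minimal implementation that satisfies the tests below.
def fizzbuzz(n: int) -> str:
    if n % 15 == 0:
        return "fizzbuzz"
    if n % 3 == 0:
        return "fizz"
    if n % 5 == 0:
        return "buzz"
    return str(n)

# Red step: in TDD these tests are written first and fail until
# fizzbuzz() above is implemented.
class TestFizzBuzz(unittest.TestCase):
    def test_multiples_of_three(self):
        self.assertEqual(fizzbuzz(9), "fizz")

    def test_multiples_of_five(self):
        self.assertEqual(fizzbuzz(10), "buzz")

    def test_multiples_of_both(self):
        self.assertEqual(fizzbuzz(15), "fizzbuzz")

    def test_everything_else(self):
        self.assertEqual(fizzbuzz(7), "7")

# Run with: python -m unittest <this_file>
```

Point 3 in the post above is exactly this: making `TestFizzBuzz` pass is not a detour from writing `fizzbuzz`, it is writing `fizzbuzz`.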
|
A team lead of 20 years' experience once asked me "But what are you even testing?" about automated tests in general. Turns out he didn't get the idea that you can write code that isn't tightly coupled to a GUI framework. The same job had been advertised and interviewed for unit testing experience; predictably enough, the code coverage was 0%.
|
# ? Nov 10, 2018 13:14 |
|
strange posted:A team lead of 20 years experience once asked me "But what are you even testing?" about automated tests in general. Turns out he didn't get the idea that you can write code that isn't tightly coupled to a GUI framework. I tried to explain unit testing to my lead. He couldn’t wrap his head around it and kept asking, “what’s the point of a test that does so little?” Kind of glad I got shuffled in a re-org (and now he has no reports). Naturally his prescribed pattern for all problems is singletons, and half the project is a big ball of mud as a result.
|
# ? Nov 10, 2018 13:50 |
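The decoupling the two posts above are getting at looks roughly like this. A minimal sketch, with a hypothetical `apply_discount` function standing in for whatever the business logic is: pull the calculation out of the GUI handler so it can be tested with no UI framework loaded at all.

```python
def apply_discount(price: float, percent: float) -> float:
    """Pure business logic: no widgets, no event loop, trivially testable."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# The GUI layer (whatever framework it is) just calls apply_discount()
# from its click handler. The handler stays too thin to need a test;
# the logic gets tested in milliseconds with plain asserts:
assert apply_discount(100.0, 10) == 90.0
assert apply_discount(19.99, 0) == 19.99
```

"What are you even testing?" has a concrete answer here: every branch of the pricing rule, without ever instantiating a window.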
|
Love to write unit tests for my dtos and enums because they aren't excluded from code coverage requirements
|
# ? Nov 10, 2018 23:01 |
|
Pedestrian Xing posted:Love to write unit tests for my dtos and enums because they aren't excluded from code coverage requirements Just because people are stupid does not make unit tests bad.
|
# ? Nov 11, 2018 06:32 |
|
Pedestrian Xing posted:Love to write unit tests for my dtos and enums because they aren't excluded from code coverage requirements I mean they should get exercised by other unit tests, otherwise they'd be dead code, right?
|
# ? Nov 12, 2018 18:21 |
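To make the point above concrete: a hypothetical DTO and enum (all names invented) need no dedicated tests, because any test of the logic that consumes them already executes their lines, and a line-based coverage tool counts that.

```python
import unittest
from dataclasses import dataclass
from enum import Enum

# No direct tests for Status or Account below; they get covered
# incidentally by the test of can_log_in(), which uses them.
class Status(Enum):
    ACTIVE = "active"
    SUSPENDED = "suspended"

@dataclass
class Account:
    name: str
    status: Status

def can_log_in(account: Account) -> bool:
    return account.status is Status.ACTIVE

class TestCanLogIn(unittest.TestCase):
    def test_active_account(self):
        self.assertTrue(can_log_in(Account("alice", Status.ACTIVE)))

    def test_suspended_account(self):
        self.assertFalse(can_log_in(Account("bob", Status.SUSPENDED)))

# Run with: python -m unittest <this_file>
```

If a DTO or enum still shows up as uncovered after the behavior tests run, that's the "dead code" signal: nothing in the tested system actually uses it.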
|
Shouldn't a good code coverage tool notice that those enums and DTOs are being tested without writing explicit tests around them? At the same time they should be pretty simple tests to write and can easily bump up your coverage without much headache. Macichne Leainig fucked around with this message at 18:39 on Nov 12, 2018 |
# ? Nov 12, 2018 18:36 |
|
I was kind of wondering if it was something like older Python where enums have to come in from a library if they aren't implemented by hand in some way. I can imagine that confusing a code coverage analyzer.
|
# ? Nov 12, 2018 19:24 |
|
|
|
I've said it before and I'll say it again: code coverage requirements are stupid. They don't improve quality, they're easy to game (at best) or act as an impediment to writing new code (at worst), and high code coverage doesn't mean that your code is well-tested. Code coverage in and of itself is fine. It tells you what isn't being tested. The delta of code coverage over a period of time can tell you about the evolution of your code quality -- going up is probably good, going down is probably bad.
|
# ? Nov 12, 2018 19:27 |