barkbell
Apr 14, 2006

woof
cloudflare pages is free with unlimited bandwidth and sites


Lumpy
Apr 26, 2002

La! La! La! Laaaa!



College Slice

uncle blog posted:

More redux questions:

In my React app, I have a page where the user can create a new thing. This is done by filling in a form and submitting, which dispatches a redux action to create the thing with an API. If an error occurs, a piece of state is set to true (createThingError).
When the user hits "Create" I want to either send them back to the previous page if there is no error, or have them stay here if there was one.

The problem is checking the piece of state in the submit handler function.

code:
const CreateThingPage = () => {
  const createThingError = useSelector((state: State) => state.things.createThingError);
  const dispatch = useDispatch();   // react-redux
  const history = useHistory();     // react-router

  const handleSubmitClick = () => {
    const newThing = formData; // form state (not shown)
    dispatch(createThing(newThing));

    if (!createThingError) {
      history.goBack();
    }
  };
};
The variable createThingError isn't updated automatically in this component, and I can't call useSelector again inside the submit handler, as hooks need to be called at the top level. So what's the right way to do the thing I'm trying to do?

Edit:
It seems the variable IS updated. But now I need a way to wait for the dispatched method to complete before checking the value of the variable. Tried putting an "await" before it, without much luck.

You can check the value of that piece of state in a useEffect hook in your component, rather than checking it in the submit handler:

code:

useEffect(() => {
  if (createThingError === true) { // or whatever test you need
    history.goBack();
  }
}, [createThingError]);

Assuming you go with that, your createThing action can also set a "thingIsBeingCreated" piece of state before it starts the request. Then your UI knows something is afoot and can show a spinner, disable the "SAVE" button or whatever. When the action comes back, unset it (rough sketch below).
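A rough sketch of that loading-flag pattern, assuming a redux-thunk style createThing; the action type names and the api helper are made up for illustration, not from the post:

JavaScript code:
// hypothetical thunk; action types and the api helper are illustrative
const createThing = (newThing) => async (dispatch) => {
  dispatch({ type: "THING_CREATE_START" });      // reducer sets thingIsBeingCreated = true
  try {
    await api.createThing(newThing);             // the actual request
    dispatch({ type: "THING_CREATE_SUCCESS" });  // reducer clears the flag and createThingError
  } catch (err) {
    dispatch({ type: "THING_CREATE_FAILURE" });  // reducer clears the flag, sets createThingError = true
  }
};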

worms butthole guy
Jan 29, 2021

by Fluffdaddy
I have a decently popular app on Vercel that gets decent traffic, and the free tier works. Maybe about 1,000 views a day.

fuf
Sep 12, 2004

haha
what's it called when you can do like div.hello then press tab to generate
code:
<div class="hello"></div>
?

And how do I get it working in Visual Studio? Visual Studio will generate div tags if you just write div and then tab, but it doesn't do classes and IDs with div.class and div#id.

e: it's called zen coding and there's an extension :)

fuf fucked around with this message at 15:28 on Oct 29, 2021

barkbell
Apr 14, 2006

woof
it's the Emmet plugin

kedo
Nov 27, 2007

LifeLynx posted:

A Digital Ocean droplet for Wordpress might be what I go with. It just has to run the CMS and hold the MySQL database. If I do anything larger scale than a simple business site that needs an event calendar + blog I'll try Kinsta though.

This is a good, cost-effective option, but be careful with Digital Ocean droplets – they're unmanaged, so you're responsible for setting up and maintaining firewalls, performing updates, provisioning SSL certs, etc. They're dirt cheap, but you end up paying with your time.

fuf
Sep 12, 2004

haha

barkbell posted:

it's the Emmet plugin

oh cool, thank you, this is even better

uncle blog
Nov 18, 2012

I have a bunch of JSON data that I want to put in a DB, and make an easy-to-use API that will be used by both web and mobile apps. The data describes a bunch of cards for a game (think Magic: The Gathering). Each card also has an associated image that I might also want stored in the same DB. The DB will be read-only, mostly used to pull a bunch of cards based on some filters.

What are some good options to look at for something like this?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Postgres can store both fairly easily. If you don't want to map each JSON field to a well-designed schema, just create a table with a column that has a JSON type and throw it in there. Postgres has a syntax for querying JSON fields.

The images can go in binary blob columns, but the mechanics of actually extracting the binary data from the SQL response are such a PITA (depending on what database client library you use) that I expect most people would store the image in the filesystem, and the DB would just store the image's filename/path.
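For reference, a minimal sketch of the JSON-column approach from Node with the pg client; the table and column names here are made up for illustration, and connection settings are assumed to come from the usual PG* environment variables.

JavaScript code:
// assumes: npm install pg
// and:     CREATE TABLE cards (id serial PRIMARY KEY, data jsonb NOT NULL, image_path text);
const { Pool } = require("pg");
const pool = new Pool(); // reads PGHOST / PGUSER / etc. from the environment

// pull cards by fields inside the JSON, e.g. all red cards costing 3 or less
async function findCards(color, maxCost) {
  const { rows } = await pool.query(
    `SELECT id, data, image_path
       FROM cards
      WHERE data->>'color' = $1
        AND (data->>'cost')::int <= $2`,
    [color, maxCost]
  );
  return rows; // each row: { id, data: <the card JSON>, image_path }
}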

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
You might consider NoSQL. Then you don't need to change the structure of the data, and most NoSQL databases support storing blobs like images much better than SQL does.

Personally I'd write a script that extracts the JSON, structures it, and inserts it into a SQL database, with images on the filesystem (rough sketch below). But it really depends on how much time you want to put in and what your requirements are.
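A rough sketch of that kind of one-off loader, assuming the cards live in a cards.json file and the images are already on disk; the file layout and column names are hypothetical.

JavaScript code:
// assumes Node 14+ and npm install pg; the cards.json shape is made up for illustration
const fs = require("fs/promises");
const { Pool } = require("pg");

async function load() {
  const pool = new Pool();
  const cards = JSON.parse(await fs.readFile("cards.json", "utf8"));

  for (const card of cards) {
    // store the card JSON in a jsonb column; keep the image on the filesystem
    // and record only its path in the DB
    await pool.query(
      "INSERT INTO cards (name, data, image_path) VALUES ($1, $2, $3)",
      [card.name, card, `images/${card.id}.png`]
    );
  }
  await pool.end();
}

load().catch(console.error);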

kedo
Nov 27, 2007

Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc).

Last time I did any major animation work I used https://greensock.com/, is it still a reasonable choice?

spacebard
Jan 1, 2007

Football~

kedo posted:

Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc).

Last time I did any major animation work I used https://greensock.com/, is it still a reasonable choice?

It probably depends on the commercial usage of the site and whether you're paying for it. GSAP is probably still the most reasonable choice if you're comfortable with it. @svgdotjs/svg.js is a much more low-level SVG manipulation library, but it's fast and MIT-licensed. It doesn't use Flash-era concepts like "tween", so there's definitely a conceptual jump in complexity.
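For reference, a rough sketch of the scroll-driven states with GSAP's ScrollTrigger plugin; the selectors and tween values here are just placeholders.

JavaScript code:
// assumes: npm install gsap; selectors and values are placeholders
import { gsap } from "gsap";
import { ScrollTrigger } from "gsap/ScrollTrigger";

gsap.registerPlugin(ScrollTrigger);

// scrub the illustration through its states as the user scrolls the section
gsap.timeline({
  scrollTrigger: {
    trigger: "#illustration-section",
    start: "top top",
    end: "bottom bottom",
    scrub: true, // tie timeline progress to scroll position
  },
})
  .to("#illustration", { scale: 1.5 })            // state 2: zoom in
  .to("#illustration path", { fill: "#e74c3c" })  // state 3: change colors
  .to("#illustration", { rotation: 15 });         // state 4: etc.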

commie kong
Mar 7, 2019

kedo posted:

Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc).

Last time I did any major animation work I used https://greensock.com/, is it still a reasonable choice?

If you're flexible on the file type, Lotties do exactly what you're describing: https://lottiefiles.com/
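A rough sketch of driving a Lottie from scroll position with the lottie-web player; the JSON path and the frame math are placeholders.

JavaScript code:
// assumes: npm install lottie-web; "illustration.json" is a placeholder export
import lottie from "lottie-web";

const anim = lottie.loadAnimation({
  container: document.getElementById("illustration"),
  renderer: "svg",
  loop: false,
  autoplay: false, // we drive the playhead manually from scroll
  path: "illustration.json",
});

window.addEventListener("scroll", () => {
  const progress = window.scrollY / (document.body.scrollHeight - window.innerHeight);
  anim.goToAndStop(progress * (anim.totalFrames - 1), true); // true = treat the value as a frame number
});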

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes
I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?

lunar detritus
May 6, 2009


nexus6 posted:

I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?

It makes dev environments reproducible. For a single developer its value is not really that apparent, but for a team (especially teams with lots of turnover and movement between projects) it's a godsend. It doesn't matter if you're running Windows or Mac, or what programming language you last used, or what Postgres version you have installed; it's all defined right there in the project and it (mostly) works like magic.

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes

lunar detritus posted:

It makes dev environments reproducible. For a single developer its value is not really that apparent, but for a team (especially teams with lots of turnover and movement between projects) it's a godsend. It doesn't matter if you're running Windows or Mac, or what programming language you last used, or what Postgres version you have installed; it's all defined right there in the project and it (mostly) works like magic.

Right, but my team are all on Macs, we all use PHP and MySQL so...

camoseven
Dec 30, 2005

RODOLPHONE RINGIN'

nexus6 posted:

Right, but my team are all on Macs, we all use PHP and MySQL so...

What if your versions of those dependencies diverge? What if someone on windows needs to run it? What if you want to deploy to AWS? What if after you deploy to AWS you wanna deploy it to Azure cause some of your customers won't do business with Amazon?

What if you want to run multiple instances of your apps on the same hardware? Yea you could do a bunch of different VMs but that sucks, why not containerize and run a bunch of instances of your app that all share the same basic resources!

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
What version of PHP? What version of MySQL? Are you all using the same? Are production and staging exactly the same? Are the same modules installed on each and configured identically? Because you can manage all that with a config mgmt system like Ansible / Chef / Salt / Puppet, and prevent version drift between systems, but those tools are not so suitable for developers. Containers make it easy for one person to make the image and for everyone else to trivially grab it and use it, and for exactly the same image to be used in both test environments and production.

It also solves problems where you want to have multiple versions running simultaneously, but don't want the installs to stomp on each other (e.g. when bleeding traffic from one version to another). And it makes it trivial to run on platforms like Kubernetes where you're not sure in advance what host you'll be running on, nor how many instances you'll need.

You can achieve the same isolation & portability effects with other tools such as baked VM images. But they may be larger (since they need to incorporate the OS), have longer spin up times, have limitations on (say) mounting shared filesystems simultaneously with other VMs, be harder to debug, etc.

CarForumPoster
Jun 26, 2013

⚡POWER⚡

nexus6 posted:

I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?

I use it to develop AWS Lambda functions on Windows that run a headless web browser. It would be nearly impossible to configure your own system to match the Lambda environment, including permissions, timeouts, etc. Docker makes that trivial.

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes
Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:
  • Sass to make writing CSS easier
  • Webpack for compiling and optimizing assets
  • Capistrano for deploying code for a git repository to servers
So I don't see how Docker fits into this, if at all. To be honest, it seems to me like Docker just does what virtual machines already do and is just an alternative.

I'm not saying it isn't useful - you're just not selling it to me.

Impotence
Nov 8, 2010
Lipstick Apathy

nexus6 posted:

Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:
  • Sass to make writing CSS easier
  • Webpack for compiling and optimizing assets
  • Capistrano for deploying code for a git repository to servers
So I don't see how Docker fits into this, if at all. To be honest, it seems to me like Docker just does what virtual machines already do and is just an alternative.

I'm not saying it isn't useful - you're just not selling it to me.

We build containers nearly immutably, read only, with everything mounted read only. Even if there is a vuln, it is impossible to traverse the filesystem or drop a web shell or persist anything onto the filesystem. Any writes are written to a separate database or S3 bucket.

The container takes seconds to build, and has much less stuff in it compared to a full VM. Even a pared-down VM is still running a ton more than my application in a container with something like distroless. If someone manages to pop a shell on it, there is no shell, and there are no local utils like curl or nc or wget for them to do stuff with.

Containers have much less overhead compared to a full-virt VM (xen hvm, vmware, qemu, etc) and give you "easy" segmentation. There are no "remnants" from deploying over and over to the same server, and there are no left-around artifacts afterward; just blow it away and try again.

Devs can't go "hey, I'll just enter this pod and edit one file and then forget about it"; they have to have it tracked in VCS.

A VM is usually at least a few GB given that a minimal debian install is still somewhere near the gig range. A container for static binaries is ~2MB as a base.

[you probably do not use serverless given the quote, but] I do not want to wait for a VM to boot and then shut down every single time I visit a web page. This is basically how you develop for serverless.

Impotence fucked around with this message at 11:00 on Nov 4, 2021

barkbell
Apr 14, 2006

woof

nexus6 posted:

Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:
  • Sass to make writing CSS easier
  • Webpack for compiling and optimizing assets
  • Capistrano for deploying code for a git repository to servers
So I don't see how Docker fits into this, if at all. To be honest, it seems to me like Docker just does what virtual machines already do and is just an alternative.

I'm not saying it isn't useful - you're just not selling it to me.

when you onboard a new dev, how do they run your app(s)?

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes
What do you mean? Checking code out from a repo?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender

nexus6 posted:

I'm not saying it isn't useful - you're just not selling it to me.
These are not hypothetical problems for others. If they're hypothetical to you, then either you don't have the same problems as other teams (in which case you are correct to ignore Docker), or you do have them but don't realize it yet.

Some non-hypothetical problems that my team had which Docker helped solve:
- developers wanted to experiment with new versions of PHP/Django/whatever, but didn't want to have separate dev envs
- "it fails in production, but it works on my machine" due to some small difference in installation procedure
- "You want to bump the version of $thing in the next release because the code relies on it? Ok put in a ticket with Ops and carefully coordinate with them to upgrade it before the next code deployment."
- "I want to debug a problem in production, but the version running there is much older than my dev sandbox, and it's a PITA to downgrade it."
- "I want to rollback a problematic version, now I have to meticulously de-install everything and re-install the old version"
- "I want to bleed traffic from V1 to V2 on the same host to reduce downtime, but it's hard to have two versions installed and running on the same host"
- "When I want to setup a new sandbox or prepare to do a build, I have to precisely follow a list of instructions to ensure all the dependencies and tools are installed correctly"

If those aren't a problem (or are only a mild irritation) then consider yourself lucky. I reckon they're very common problems in most teams.

Impotence
Nov 8, 2010
Lipstick Apathy

nexus6 posted:

What do you mean? Checking code out from a repo?

Check out code from the repo, install all the dependencies, install specific versions of Java, JavaScript, whatever else, maybe have to run a database server or something, vs "docker-compose up" and everyone has an identical runtime environment with all the same dependencies in one line, and your software and library versions are committed to the repo.

Data Graham
Dec 28, 2009

📈📊🍪😋



I mean it feels like you're coming at this from more of a front-end perspective than back-end, and sure docker is a lot less relevant in that case, I would think.

ynohtna
Feb 16, 2007

backwoods compatible
Illegal Hen
Being able to perfectly reproduce a specific, easily tagged and archived, environment is :kiss: for debugging and diagnostics.

kedo
Nov 27, 2007

nexus6 posted:

  • Sass to make writing CSS easier
  • Webpack for compiling and optimizing assets
  • Capistrano for deploying code for a git repository to servers

Real world example: your main dev is on vacation and you have someone acting as a backup dev in the interim. Something blows up on your app/site, and the backup dev needs to make some edits to Sass files and then compile everything and push it up to the server. Except your main dev has an old version of Node installed on their machine which works with whatever program they're using to compile Sass/run Webpack, but your backup dev has the newest version of Node installed and whenever they try to compile, things blow up with obscure, opaque error messages. They can't compile, which means they can't fix the error. Your main dev isn't answering their phone, and when they finally look at their email in between margaritas they say, "oh, I don't know what version of Node I'm using and I left my laptop at home so I can't check," so your only options are to wait, or to have your backup dev experiment and try to figure out exactly which version of Node will actually work, and in the meantime your app/site has been broken for hours or days because no one thought to put "Use Node version 10.16.3" in the repo's readme. And this all assumes that your backup dev is actually experienced and knowledgeable enough to know that the Node version is actually the problem, because there are probably a thousand dependencies that might be responsible for the error, and none of them have error messages that the human mind can parse.

Spoiler alert: this isn't fantasyland; this happened to me last week, and it could have been avoided with either A) good documentation, or B) something like Docker. I've been in this industry long enough to know that A cannot be relied upon, so B is a fantastic, better alternative that just eliminates this problem altogether.

e: If only one person on one computer will ever touch the code, you don't need Docker.

kedo fucked around with this message at 18:20 on Nov 4, 2021

Bruegels Fuckbooks
Sep 14, 2004

Now, listen - I know the two of you are very different from each other in a lot of ways, but you have to understand that as far as Grandpa's concerned, you're both pieces of shit! Yeah. I can prove it mathematically.

kedo posted:

Real world example: your main dev is on vacation and you have someone acting as a backup dev in the interim. Something blows up on your app/site, and the backup dev needs to make some edits to Sass files and then compile everything and push it up to the server. Except your main dev has an old version of Node installed on their machine which works with whatever program they're using to compile Sass/run Webpack, but your backup dev has the newest version of Node installed and whenever they try to compile, things blow up with obscure, opaque error messages. They can't compile, which means they can't fix the error. Your main dev isn't answering their phone, and when they finally look at their email in between margaritas they say, "oh, I don't know what version of Node I'm using and I left my laptop at home so I can't check," so your only options are to wait, or to have your backup dev experiment and try to figure out exactly which version of Node will actually work, and in the meantime your app/site has been broken for hours or days because no one thought to put "Use Node version 10.16.3" in the repo's readme. And this all assumes that your backup dev is actually experienced and knowledgeable enough to know that the Node version is actually the problem, because there are probably a thousand dependencies that might be responsible for the error, and none of them have error messages that the human mind can parse.

Spoiler alert: this isn't fantasyland; this happened to me last week, and it could have been avoided with either A) good documentation, or B) something like Docker. I've been in this industry long enough to know that A cannot be relied upon, so B is a fantastic, better alternative that just eliminates this problem altogether.

e: If only one person on one computer will ever touch the code, you don't need Docker.

i've had that exact problem with certain versions of node.js crashing when doing sass/webpack poo poo and it was infuriating.

barkbell
Apr 14, 2006

woof

nexus6 posted:

What do you mean? Checking code out from a repo?

that's usually a good first step, but usually there is more, right? are you just creating static html files or something?

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I think my frustration with docker comes in part from the immense amount of boilerplate and configuration that goes with it, which always inevitably stops working. I don't know how many times I've spent hours fighting with docker containers, only to in the end stop all of them, delete them, re-initialize them, and have everything working properly again.

Maybe it's still too young a development tool? Maybe the docker developers have focused too much on adding features (while still not being easily interchangeable with kubernetes)? I dunno. It was sold as, and written about like, some huge godsend, but I'm not having a good time with it either. There are a lot of alternatives popping up now, but everyone's still hooked on the first thing they ever came across.

Impotence
Nov 8, 2010
Lipstick Apathy

Nolgthorn posted:

I think my frustration with docker comes in part from the immense amount of boilerplate and configuration that goes with it, which always inevitably stops working. I don't know how many times I've spent hours fighting with docker containers, only to in the end stop all of them, delete them, re-initialize them, and have everything working properly again.

Maybe it's still too young a development tool? Maybe the docker developers have focused too much on adding features (while still not being easily interchangeable with kubernetes)? I dunno. It was sold as, and written about like, some huge godsend, but I'm not having a good time with it either. There are a lot of alternatives popping up now, but everyone's still hooked on the first thing they ever came across.

Can you explain this more? I don't have "much configuration" or "boilerplate" or hours of fighting it. Are you trying to use an ephemeral container as a persistent dumping ground, or "editing in container"? What is your use case for it and what are you trying to do with it? Docker is not a VM, if that's how you're treating it.

Splinter
Jul 4, 2003
Cowabunga!
I like Docker (or more generally, containers) even outside of a team environment (i.e. for personal projects), just because it lets me easily use tons of different technologies for different projects without actually having to have them installed locally. A year or two down the road, when some projects are long abandoned, I don't have a bunch of unused crap installed by my package manager that I'm not 100% sure which projects it was installed for, or whether anything I'm still working on depends on it.

It's also especially useful for projects that run on different versions of the same technology (e.g. something deployed on an old version that I don't want to have to update just to use a newer version with a new project). With local installs that gets messy, because then there's the question of which version runs when I type the command on the command line (i.e. which version is on my $PATH).

It's also just nice and convenient that I can spin up, say, a new postgres DB with a volume for a new project in just 2 or 3 commands with minimal configuration. And yes, most of these concerns could be handled by VMs instead, but the performance would be worse and I wouldn't want to have to set up and manage (and store) a new VM for every project, especially on my laptop.

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes

quote:

NPM issues

Oh yeah, been there. We enforce .nvmrc files in repos to stop that happening again.

Splinter posted:

I like Docker (or more generally, containers) even outside of a team environment (i.e. for personal projects), just because it lets me easily use tons of different technologies for different projects without actually having to have them installed locally. A year or two down the road, when some projects are long abandoned, I don't have a bunch of unused crap installed by my package manager that I'm not 100% sure which projects it was installed for, or whether anything I'm still working on depends on it.

It's also especially useful for projects that run on different versions of the same technology (e.g. something deployed on an old version that I don't want to have to update just to use a newer version with a new project). With local installs that gets messy, because then there's the question of which version runs when I type the command on the command line (i.e. which version is on my $PATH).

It's also just nice and convenient that I can spin up, say, a new postgres DB with a volume for a new project in just 2 or 3 commands with minimal configuration. And yes, most of these concerns could be handled by VMs instead, but the performance would be worse and I wouldn't want to have to set up and manage (and store) a new VM for every project, especially on my laptop.

How does this work then? A docker container is like a VM and self-contained but it also isn't because it can interact with things outside the container?

barkbell posted:

thats usually a good first step, but usually there is more right? are you just creating static html files or something?

Having MAMP installed is usually enough in like 95% of cases

nexus6 fucked around with this message at 22:44 on Nov 4, 2021

Impotence
Nov 8, 2010
Lipstick Apathy

nexus6 posted:


How does this work then? A docker container is like a VM and self-contained but it also isn't because it can interact with things outside the container?


It's self-contained, but you can volume-mount shared folders to persist state, or, in development, mount a local folder of your public_html into a PHP runtime container that is linked over a private network to a Postgres container.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

nexus6 posted:

Having MAMP installed is usually enough in like 95% of cases

Which versions?

If you want to upgrade any of those versions, do you make sure every developer is doing so in sync? And also in sync with what you're running in production? How?

nexus6
Sep 2, 2011

If only you could see what I've seen with your eyes

Jabor posted:

Which versions?

If you want to upgrade any of those versions, do you make sure every developer is doing so in sync? And also in sync with what you're running in production? How?

To be honest, it really hasn't been much of an issue. We use a recent version of MAMP because that's what we have licenses for, we have procedures set out for versions (and things like nvmrc for enforcing consistency) and exceptional circumstances and requirements are put in README files for onboarding new devs.

I get how in some circumstances docker might be useful, but for our team I don't see how getting everyone to install docker and learn how to use and set up containers solves any issue we don't have.

edit: btw isn't that exactly what Vagrant already does?

nexus6 fucked around with this message at 11:18 on Nov 5, 2021

Vincent Valentine
Feb 28, 2006

Murdertime

My boss and I were the only two people that used Docker enthusiastically, while everyone else begrudgingly went along with it. This meant that I had to spend, if I'm honest, a probably too-long period of time learning how it works because boy howdy it was complicated, at least for our use case.

But what we ended up with was a completely turn-key system. The entire dev setup process was simplified to checking out the git branch and typing "docker-compose up". It launched a local proxy, an orchestration service which connected to our microservices, started a sass and js watcher/compiler and ran tests automatically when their coverage area was changed, all by default. If you wanted to, there were great non-default options that allowed quite a bit of customization. You could set the entire system up locally(i.e. no internet connection necessary, but obviously data was dummy placeholder in this case), you could bypass the orchestration layer and connect directly to the microservices, you could pick and choose how the load balancer acted, tons of other stuff. It was nuts, and it was all simple poo poo like docker-compose up develop-local, it would just automatically reconfigure everything to work locally like magic.

Design and Product ended up actually doing simple pull requests, because setting up the system had been the entire barrier to entry stopping them from debugging things that bothered them.

Then both of us got laid off and outsourced. I'm genuinely curious as to whether or not it still works.

I can't imagine not wanting to use docker. Every single person on your team benefits.

CarForumPoster
Jun 26, 2013

⚡POWER⚡
If you're doing serverless stuff on AWS Lambda and dont wanna learn much about Docker, the AWS SAM CLI (which uses Docker) is pretty loving great. Even if you do know a bunch about Docker. Going from deploying Python apps with Zappa to SAM was night and day in terms of limitations, ability to test, and ability to diagnose.

Stuff I do on Lambda that would have taken WAYYYY longer without Docker-based local testing:

  • Scraping websites triggered via a REST API using headless Chrome.
  • Editing videos with moviepy/ffmpeg.
  • Downloading files locally, modifying them, and uploading them elsewhere.
  • Greatly, greatly exceeding the 250MB file size limit using lambda layers. (The SAM CLI basically handles making the extra layers.)

Also, the fact that I can wrap everything in a try/except with the except having a time.sleep(900) means I can keep a terminal open to the AWS Lambda-like Docker environment. When trying to diagnose file system permission issues, figure out errors with binaries, or track down system logs in ?? locations, being able to terminal into the environment while it is executing is a godsend.


EDIT: Oh and this worked very well even though the actual development machines were a combo of Windows and Mac

CarForumPoster fucked around with this message at 14:24 on Nov 5, 2021


Lumpy
Apr 26, 2002

La! La! La! Laaaa!



College Slice
I am going insane. Does anyone know / use sinon in a Node + TypeScript environment to mock fetch calls? This should be so easy, but every solution I have tried either doesn't work or TypeScript won't allow it.

Basically I have this:

JavaScript code:
class Lol {
  async doThing() {
    const someData = await fetch(url);
    const myJson = await someData.json();
    return doStuffWith(myJson);
  }
}
What I want to do:

JavaScript code:
import sinon, { SinonSandbox } from "sinon";
import fetchMock from "fetch-mock";

let sandbox: SinonSandbox;

describe("my thing", () => {

  beforeEach(() => {
    sandbox = sinon.createSandbox();
    fetchMock.mock("*", fakeData);
  });

  afterEach(() => {
    sandbox.restore();
    fetchMock.restore();
  });

  it("does a thing", async () => {
    const l = new Lol();
    expect(await l.doThing()).toBe(whatever);
  });
});
But the request is not mocked. Apparently that's because fetch is not global or something. Every other solution I have researched does not work, or causes TypeScript to fail the build. Anyone run into this before?
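For reference, a rough sketch of one alternative: stubbing the global fetch directly with sinon instead of fetch-mock, assuming Node 18+ where fetch and Response exist as globals (the fake payload here is made up).

JavaScript code:
// sketch only: stub globalThis.fetch so the class under test never hits the network
import sinon from "sinon";

const fakeJson = { hello: "world" }; // made-up payload for illustration

const fetchStub = sinon
  .stub(globalThis, "fetch")
  .resolves(new Response(JSON.stringify(fakeJson)));

// ... run the code under test, then:
fetchStub.restore();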
