|
Cloudflare Pages is free with unlimited bandwidth and sites
|
# ? Oct 29, 2021 14:40 |
|
|
|
uncle blog posted: More redux questions:

You can check the value of that piece of state in a useEffect hook in your component rather than how you are doing it: code:
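In a React component that check would live in a useEffect with the state value in its dependency array. As a framework-free sketch of the same idea (the store shape and the `modalOpen` flag are invented for illustration, not from the original post), reacting to a piece of state when it changes looks like:

```javascript
// Minimal hand-rolled store; Redux's subscribe/getState work the same way.
function createStore(initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    dispatch(partial) {
      state = { ...state, ...partial };
      listeners.forEach((fn) => fn(state));
    },
    subscribe(fn) { listeners.push(fn); },
  };
}

const store = createStore({ modalOpen: false });
const log = [];

// The "useEffect" equivalent: run a side effect whenever the flag flips on.
store.subscribe((s) => {
  if (s.modalOpen) log.push('modal opened');
});

store.dispatch({ modalOpen: true });
console.log(log); // ['modal opened']
```

In a component you'd instead write `useEffect(() => { if (modalOpen) { /* effect */ } }, [modalOpen])` with `modalOpen` coming from `useSelector`.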
|
# ? Oct 29, 2021 15:02 |
|
I have a decently popular app on Vercel that gets decent traffic and the free tier works. Maybe about 1000 views a day
|
# ? Oct 29, 2021 15:07 |
|
what's it called when you can do like div.hello then press tab to generate code: <div class="hello"></div>

And how do I get it working in Visual Studio? Visual Studio will generate div tags if you just write div and then tab, but it doesn't do classes and IDs with div.class and div#id

e: it's called zen coding and there's an extension

fuf fucked around with this message at 15:28 on Oct 29, 2021 |
# ? Oct 29, 2021 15:20 |
|
its the emmet plugin
|
# ? Oct 29, 2021 15:50 |
|
LifeLynx posted: A Digital Ocean droplet for Wordpress might be what I go with. It just has to run the CMS and hold the MySQL database. If I do anything larger scale than a simple business site that needs an event calendar + blog I'll try Kinsta though.

This is a good cost-effective option, but be careful with Digital Ocean droplets – they're unmanaged, so you're responsible for setting up and maintaining firewalls, performing updates, provisioning SSL certs, etc. They're dirt cheap, but you end up paying with your time.
|
# ? Oct 29, 2021 16:48 |
|
barkbell posted: its the emmet plugin

oh cool, thank you, this is even better
|
# ? Oct 29, 2021 17:11 |
|
I have a bunch of JSON data that I want to put in a DB, and make an easy-to-use API that will be used both by web and mobile apps. The data is about a bunch of cards for a game (think Magic: The Gathering). Each card also has an associated image that I might also want stored in the same DB. The DB will be read-only, mostly used to pull a bunch of cards based on some filters. What are some good options to look at for something like this?
|
# ? Nov 2, 2021 10:53 |
|
Postgres can store both fairly easily. If you don't want to map each JSON field to a well-designed schema, just create a table with a column that has a JSON type and throw it in there. Postgres has a syntax for querying JSON fields. The images can go in binary blob columns, but the mechanics of actually extracting the binary data from the SQL response are such a PITA (depending on what database client library you use) that I expect most people would store the image in the filesystem, and the DB would just store the image's filename/path.
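For the filter queries, Postgres's jsonb containment operator @> is the usual workhorse, e.g. SELECT * FROM cards WHERE data @> '{"color": "red"}'. A rough in-memory JavaScript equivalent of that check, with made-up card data, just to show what such a query matches (the real operator also handles nested objects and arrays; this sketch only compares top-level keys):

```javascript
// Flat-key version of jsonb containment: every key/value in the filter
// must appear with the same value in the document.
function contains(doc, filter) {
  return Object.entries(filter).every(([key, value]) => doc[key] === value);
}

// Illustrative card rows — stand-ins for the `data` jsonb column.
const cards = [
  { name: 'Ember Drake', color: 'red', type: 'creature', cost: 3 },
  { name: 'Tidal Wisp', color: 'blue', type: 'creature', cost: 1 },
  { name: 'Cinder Bolt', color: 'red', type: 'spell', cost: 2 },
];

const redCreatures = cards.filter((c) =>
  contains(c, { color: 'red', type: 'creature' }));
console.log(redCreatures.map((c) => c.name)); // ['Ember Drake']
```

In Postgres you'd also want a GIN index on the jsonb column so these containment filters stay fast as the card set grows.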
|
# ? Nov 2, 2021 14:47 |
|
You might consider NoSQL. Then you don't need to change the structure of the data, and most NoSQL stores support storing blobs like images much better than SQL does. Personally I'd write a script that extracts the JSON, structures it, and inserts it into a SQL database, with images on the filesystem. But it really depends how much time you want to put in and what your requirements are.
|
# ? Nov 2, 2021 15:16 |
|
Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc). Last time I did any major animation work I used https://greensock.com/, is it still a reasonable choice?
|
# ? Nov 2, 2021 16:17 |
|
kedo posted: Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc).

It probably depends on the commercial usage of the site and whether you're paying for it. It's probably still the most reasonable choice if you're comfortable with it. @svgdotjs/svg.js is a much more low-level SVG manipulation library, but it's fast and under the MIT license. It doesn't use the Flash-based concepts like "tween" so it definitely has a jump in complexity conceptually.
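GSAP also has a ScrollTrigger plugin for exactly this nowadays, but whichever library you pick, the core of a scroll-driven animation is mapping scroll progress onto a handful of states. A dependency-free sketch (the thresholds and state names are invented for illustration):

```javascript
// Clamp scroll position to [0, 1] progress, then bucket it into states.
function stateForScroll(scrollY, scrollableHeight) {
  const progress = Math.min(Math.max(scrollY / scrollableHeight, 0), 1);
  if (progress < 0.25) return 'initial';
  if (progress < 0.5) return 'zoomed';
  if (progress < 0.75) return 'recolored';
  return 'final';
}

// In the browser this would hang off a scroll listener, e.g.
//   window.addEventListener('scroll', () =>
//     illustration.setState(stateForScroll(window.scrollY, maxScroll)));
console.log(stateForScroll(0, 2000));    // 'initial'
console.log(stateForScroll(1200, 2000)); // 'recolored'
console.log(stateForScroll(2500, 2000)); // 'final' (past the end, clamped)
```

An animation library then just handles the tweening between those states instead of snapping.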
|
# ? Nov 2, 2021 21:09 |
|
kedo posted: Can anyone recommend a favorite JS/SVG animation framework (specifically anything that might include handy functions for scroll-based animation)? I have a project coming up where the client wants an illustration that starts off in one state, and then as the user scrolls through the page it animates and goes through three or four different animation states (zooming/changing colors/etc).

If you're flexible on the file type, Lotties do exactly what you're describing: https://lottiefiles.com/
|
# ? Nov 3, 2021 03:08 |
|
I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?
|
# ? Nov 3, 2021 22:48 |
nexus6 posted: I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?

It makes dev environments reproducible. For a single developer its value is not really that apparent, but for a team (especially teams with lots of turnover and movement between projects) it's a godsend. It doesn't matter if you're running Windows or Mac, or what programming language you last used, or what Postgres version you have installed; it's all defined right there in the project and it (mostly) works like magic.
|
|
# ? Nov 3, 2021 22:55 |
|
lunar detritus posted: It makes dev environments reproducible. For a single developer its value is not really that apparent, but for a team (especially teams with lots of turnover and movement between projects) it's a godsend. It doesn't matter if you're running Windows or Mac, or what programming language you last used, or what Postgres version you have installed; it's all defined right there in the project and it (mostly) works like magic.

Right, but my team are all on Macs, we all use PHP and use MySQL so...
|
# ? Nov 3, 2021 23:00 |
|
nexus6 posted: Right, but my team are all on Macs, we all use PHP and use MySQL so...

What if your versions of those dependencies diverge? What if someone on Windows needs to run it? What if you want to deploy to AWS? What if after you deploy to AWS you wanna deploy it to Azure cause some of your customers won't do business with Amazon? What if you want to run multiple instances of your apps on the same hardware? Yea, you could do a bunch of different VMs, but that sucks. Why not containerize and run a bunch of instances of your app that all share the same basic resources!
|
# ? Nov 3, 2021 23:54 |
|
What version of PHP? What version of MySQL? Are you all using the same? Is production and staging exactly the same? Are the same modules installed on each and configured identically?

Because you can manage all that with a config mgmt system like Ansible / Chef / Salt / Puppet, and prevent version drift between systems, but those tools are not so suitable for developers. Containers make it easy for one person to make the image and for everyone else to trivially grab it and use it, and for exactly the same image to be used in both test environments and production.

It also solves problems where you want to have multiple versions running simultaneously, but don't want the installs to stomp on each other (e.g. when bleeding traffic from one version to another). And it makes it trivial to run on platforms like Kubernetes where you're not sure in advance what host you'll be running on, nor how many instances you'll need.

You can achieve the same isolation & portability effects with other tools such as baked VM images. But they may be larger (since they need to incorporate the OS), have longer spin-up times, have limitations on (say) mounting shared filesystems simultaneously with other VMs, be harder to debug, etc.
|
# ? Nov 4, 2021 00:01 |
|
nexus6 posted: I'm still struggling to understand what the point of docker is and what problems it solves. It seems like it's becoming more and more popular but is there some fundamental issue with virtual machines I'm not aware of that docker fixes? If docker can only utilize the resources of my operating system, what is the point of a docker container running something that I can just install and run natively myself?

I use it to develop AWS Lambda functions on Windows that run a headless web browser. It would be impossible to configure your system to be like the Lambda environment including permissions, timeout, etc. Docker makes that trivial.
|
# ? Nov 4, 2021 03:41 |
|
Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:
I'm not saying it isn't useful - you're just not selling it to me.
|
# ? Nov 4, 2021 10:33 |
|
nexus6 posted: Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:

We build containers nearly immutably, read only, with everything mounted read only. Even if there is a vuln, it is impossible to traverse the filesystem or drop a web shell or persist anything onto the filesystem. Any writes are written to a separate database or S3 bucket.

The container takes seconds to build, and has much less stuff compared to a full VM. Even a pared-down VM is still running a ton more than my application in a container with something like distroless. If someone managed to pop a shell on it, there is no shell, and there are no local utils like curl or nc or wget for them to do stuff with.

Containers have much less overhead compared to a full virt VM (xen hvm, vmware, qemu, etc) and are "easy" segmentation. There are no "remnants" from deploying over and over to the same server, and there are no left-around artifacts afterward; just blow it away and try again. Devs cannot "hey i'll enter this pod and edit one file and then forget about it", they have to have it tracked in VCS.

A VM is usually at least a few GB, given that a minimal debian install is still somewhere near the gig range. A container for static binaries is ~2MB as a base.

[you probably do not use serverless given the quote, but] I do not want to wait for a VM to boot and then shut down every single time I visit a web page. This is basically how you develop for serverless.

Impotence fucked around with this message at 11:00 on Nov 4, 2021 |
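As a concrete sketch of that distroless pattern (a hypothetical Go service here; the distroless base image is a real Google-provided one, but the paths and module layout are invented): a multi-stage build compiles in a full toolchain image and ships only the static binary, so the final container has no shell or utilities at all.

```dockerfile
# Build stage: full toolchain, discarded after the build.
FROM golang:1.17 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /app ./cmd/server

# Final stage: distroless static base — no shell, no package manager.
FROM gcr.io/distroless/static
COPY --from=build /app /app
USER nonroot
ENTRYPOINT ["/app"]
```

If someone does get code execution in there, there is nothing to exec: no /bin/sh, no curl, nothing writable.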
# ? Nov 4, 2021 10:42 |
|
nexus6 posted: Ok, thanks all, but what I'm getting is hypothetical situations where you don't have a managed team or devops stuff. See, we use things like:

when you onboard a new dev, how do they run your app(s)?
|
# ? Nov 4, 2021 16:44 |
|
What do you mean? Checking code out from a repo?
|
# ? Nov 4, 2021 17:27 |
|
nexus6 posted: I'm not saying it isn't useful - you're just not selling it to me.

Some non-hypothetical problems that my team had which Docker helped solve:

- developers wanted to experiment with new versions of PHP/Django/whatever, but didn't want to have separate dev envs
- "it fails in production, but it works on my machine" due to some small difference in installation procedure
- "You want to bump the version of $thing in the next release because the code relies on it? Ok, put in a ticket with Ops and carefully coordinate with them to upgrade it before the next code deployment."
- "I want to debug a problem in production, but the version running there is much older than my dev sandbox, and it's a PITA to downgrade it."
- "I want to roll back a problematic version, now I have to meticulously de-install everything and re-install the old version"
- "I want to bleed traffic from V1 to V2 on the same host to reduce downtime, but it's hard to have two versions installed and running on the same host"
- "When I want to set up a new sandbox or prepare to do a build, I have to precisely follow a list of instructions to ensure all the dependencies and tools are installed correctly"

If those aren't a problem (or are only a mild irritation) then consider yourself lucky. I reckon they're very common problems in most teams.
|
# ? Nov 4, 2021 17:52 |
|
nexus6 posted: What do you mean? Checking code out from a repo?

check out code from repo, install all the dependencies, install specific versions of java, javascript, whatever else, maybe have to run a database server or something

vs

"docker-compose up" and everyone has identical runtime environments with all the same dependencies in one line, and your software and library versions are committed to the repo
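For the PHP/MySQL stack mentioned earlier in the thread, that one-liner experience comes from a checked-in compose file along these lines (image tags, ports, and paths are illustrative, not from any post):

```yaml
# docker-compose.yml — `docker-compose up` gives every dev the same stack.
version: "3.8"
services:
  app:
    image: php:8.0-apache        # pin the exact PHP everyone runs
    ports:
      - "8080:80"
    volumes:
      - ./src:/var/www/html      # live-edit code from the host
    depends_on:
      - db
  db:
    image: mysql:8.0             # pin MySQL too
    environment:
      MYSQL_ROOT_PASSWORD: devonly
      MYSQL_DATABASE: app
    volumes:
      - dbdata:/var/lib/mysql    # persist data across restarts
volumes:
  dbdata:
```

Bumping a version is then a one-line diff in the repo instead of an email asking everyone to upgrade.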
|
# ? Nov 4, 2021 17:53 |
I mean it feels like you're coming at this from more of a front-end perspective than back-end, and sure docker is a lot less relevant in that case, I would think.
|
|
# ? Nov 4, 2021 17:58 |
|
Being able to perfectly reproduce a specific, easily tagged and archived environment is huge for debugging and diagnostics.
|
# ? Nov 4, 2021 18:02 |
|
nexus6 posted:
Real world example: your main dev is on vacation and you have someone acting as a backup dev in the interim. Something blows up on your app/site, and the backup dev needs to make some edits to Sass files and then compile everything and push it up to the server. Except your main dev has an old version of Node installed on their machine which works with whatever program they're using to compile Sass/run Webpack, but your backup dev has the newest version of Node installed and whenever they try to compile, things blow up with obscure, opaque error messages. They can't compile, which means they can't fix the error.

Your main dev isn't answering their phone, and when they finally look at their email in between margaritas they say, "oh, I don't know what version of Node I'm using and I left my laptop at home so I can't check," so your only options are to wait, or to have your backup dev experiment and try to figure out exactly which version of Node will actually work. In the meantime your app/site has been broken for hours or days because no one thought to put "Use Node version 10.16.3" in the repo's readme. And this all assumes that your backup dev is actually experienced and knowledgeable enough to know that the Node version is actually the problem, because there are probably a thousand dependencies that might be responsible for the error, and none of them have error messages that the human mind can parse.

Spoiler alert: this isn't fantasyland, this happened to me last week, and it could have been avoided with either A) good documentation, or B) something like Docker. I've been in this industry long enough to know that A cannot be relied upon, so B is a fantastic, better alternative that just eliminates this problem altogether.

e: If only one person on one computer will ever touch the code, you don't need Docker.

kedo fucked around with this message at 18:20 on Nov 4, 2021 |
# ? Nov 4, 2021 18:16 |
|
kedo posted: Real world example: your main dev is on vacation and you have someone acting as a backup dev in the interim. Something blows up on your app/site, and the backup dev needs to make some edits to Sass files and then compile everything and push it up to the server. Except your main dev has an old version of Node installed on their machine which works with whatever program they're using to compile Sass/run Webpack, but your backup dev has the newest version of Node installed and whenever they try to compile, things blow up with obscure, opaque error messages. They can't compile, which means they can't fix the error. Your main dev isn't answering their phone, and when they finally look at their email in between margaritas they say, "oh, I don't know what version of Node I'm using and I left my laptop at home so I can't check," so your only options are to wait, or to have your backup dev experiment and try to figure out exactly which version of Node will actually work, and in the meantime your app/site has been broken for hours or days because no one thought to put "Use Node version 10.16.3" in the repo's readme. And this all assumes that your backup dev is actually experienced and knowledgeable enough to know that the Node version is actually the problem, because there are probably a thousand dependencies that might be responsible for the error, and none of them have error messages that the human mind can parse.

i've had that exact problem with certain versions of node.js crashing when doing sass/webpack poo poo and it was infuriating.
|
# ? Nov 4, 2021 19:00 |
|
nexus6 posted: What do you mean? Checking code out from a repo?

that's usually a good first step, but usually there is more, right? are you just creating static html files or something?
|
# ? Nov 4, 2021 19:42 |
|
I think my frustration with docker in part correlates with the immense amount of boilerplate and configuration that goes with it, which always inevitably stops working. I don't know how many times I've been frustrated fighting with docker containers for hours, only to in the end stop all of them, delete them, re-initialize them, and have everything working properly again.

Maybe it's still too young a development tool? Maybe the docker developers have focused too much on adding features (yet still not being easily interchangeable with kubernetes)? I dunno. It was sold as, and written about, like some huge godsend, but for me as well I'm not having a good time with it. There's a lot of alternatives now that have been popping up, but everyone's still hooked on the first thing they ever came across.
|
# ? Nov 4, 2021 19:56 |
|
Nolgthorn posted: I think my frustration with docker in part correlates with the immense amount of boilerplate and configuration that goes with it which always inevitably stops working. I don't know how many times I've been frustrated fighting with docker containers for hours. Only to in the end stop all of them, delete them, re-initialize them, and have everything working properly again.

Can you explain this more? I don't have much configuration or boilerplate, and I'm not fighting it for hours. Are you trying to use an ephemeral container as a persistent dumping ground or "editing in container"? What is your use case for it and what are you trying to do with it? Docker is not a VM, if that's how you're treating it.
|
# ? Nov 4, 2021 21:21 |
|
I like Docker (or more generally, containers) even outside of a team environment (ie for personal projects) just because it lets me easily use tons of different technologies for different projects without actually having to have them installed locally. Now a year or two down the road, when some projects are long abandoned, I don't have a bunch of unused crap installed by my package manager that I'm not 100% sure which projects those were installed for and if anything I'm still working on depends on it.

It's also especially useful for having projects that run on different versions of the same technology (e.g. I have something developed that is deployed on an old version that I don't want to have to update to use a newer version with a new project), which installed locally gets messy as then there's the question of which version is run when I type the command in the command line (ie which version is on my $PATH). It's also just nice and convenient that I can spin up say a new postgres DB w/ a volume for a new project in just 2 or 3 commands with minimal configuration.

And yes, most of these concerns could be handled by VMs instead, but the performance would be worse and I wouldn't want to have to setup and manage (and store) a new VM for every project, especially on my laptop.
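For the record, the "2 or 3 commands" Postgres setup looks roughly like this (container and volume names are invented; this needs a local Docker daemon and the real `docker volume`/`docker run` flags shown are standard):

```shell
# Named volume so the data outlives the container
docker volume create myproj-pgdata

# Throwaway Postgres pinned to a specific version, data on the volume
docker run -d --name myproj-pg \
  -e POSTGRES_PASSWORD=devonly \
  -v myproj-pgdata:/var/lib/postgresql/data \
  -p 5432:5432 \
  postgres:14
```

Deleting the project later is `docker rm -f myproj-pg` plus removing the volume; nothing is left on the host's package manager.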
|
# ? Nov 4, 2021 22:37 |
|
quote: NPM issues

Oh yeah, been there. We enforce .nvmrc files in repos to stop that happening again.

Splinter posted: I like Docker (or more generally, containers) even outside of a team environment (ie for personal projects) just because it lets me easily use tons of different technologies for different projects without actually having to have them installed locally. Now a year or two down the road, when some projects are long abandoned, I don't have a bunch of unused crap installed by my package manager that I'm not 100% sure which projects those were installed for and if anything I'm still working on depends on it. It's also especially useful for having projects that run on different versions of the same technology (e.g. I have something developed that is deployed on an old version that I don't want to have to update to use a newer version with a new project), which installed locally gets messy as then there's the question of which version is run when I type the command in the command line (ie which version is on my $PATH). It's also just nice and convenient that I can spin up say a new postgres DB w/ a volume for a new project in just 2 or 3 commands with minimal configuration. And yes, most of these concerns could be handled by VMs instead, but the performance would be worse and I wouldn't want to have to setup and manage (and store) a new VM for every project, especially on my laptop.

How does this work then? A docker container is like a VM and self-contained, but it also isn't because it can interact with things outside the container?

barkbell posted: thats usually a good first step, but usually there is more right? are you just creating static html files or something?

Having MAMP installed is usually enough in like 95% of cases

nexus6 fucked around with this message at 22:44 on Nov 4, 2021 |
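For reference, an .nvmrc is just a one-line file in the repo root containing a version (e.g. 16.13.0) that nvm use and many CI setups read. Pairing it with the engines field in package.json (version range here is only an example) lets npm warn when someone's Node doesn't match:

```json
{
  "engines": {
    "node": "16.x"
  }
}
```

Neither mechanism forces an install the way a container does, but both at least put the expected version in version control instead of in someone's head.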
# ? Nov 4, 2021 22:40 |
|
nexus6 posted:
It's self-contained, but you can volume mount shared folders to persist state, or in development, mount a local folder of your public_html into a PHP runtime container that is linked in a private network to a Postgres container.
|
# ? Nov 4, 2021 22:52 |
|
nexus6 posted: Having MAMP installed is usually enough in like 95% of cases

Which versions? If you want to upgrade any of those versions, do you make sure every developer is doing so in sync? And also in sync with what you're running in production? How?
|
# ? Nov 5, 2021 05:57 |
|
Jabor posted: Which versions?

To be honest, it really hasn't been much of an issue. We use a recent version of MAMP because that's what we have licenses for, we have procedures set out for versions (and things like nvmrc for enforcing consistency) and exceptional circumstances and requirements are put in README files for onboarding new devs.

I get how in some circumstances docker might be useful, but for our team I don't see how getting everyone to install docker and learn how to use and set up containers solves any issue we don't have.

edit: btw isn't that exactly what Vagrant already does?

nexus6 fucked around with this message at 11:18 on Nov 5, 2021 |
# ? Nov 5, 2021 11:14 |
|
My boss and I were the only two people that used Docker enthusiastically, while everyone else begrudgingly went along with it. This meant that I had to spend, if I'm honest, a probably too-long period of time learning how it works, because boy howdy it was complicated, at least for our use case.

But what we ended up with was a completely turn-key system. The entire dev setup process was simplified to checking out the git branch and typing "docker-compose up". It launched a local proxy, an orchestration service which connected to our microservices, started a sass and js watcher/compiler and ran tests automatically when their coverage area was changed, all by default. If you wanted to, there were great non-default options that allowed quite a bit of customization. You could set the entire system up locally (i.e. no internet connection necessary, but obviously data was dummy placeholder in this case), you could bypass the orchestration layer and connect directly to the microservices, you could pick and choose how the load balancer acted, tons of other stuff. It was nuts, and it was all simple poo poo like docker-compose up develop-local; it would just automatically reconfigure everything to work locally like magic.

Design and Product ended up actually doing simple pull requests because setting up the system was the entire barrier of entry stopping them from debugging things that bothered them. Then both of us got laid off and outsourced. I'm genuinely curious as to whether or not it still works.

I can't imagine not wanting to use docker. Every single person on your team benefits.
|
# ? Nov 5, 2021 13:31 |
|
If you're doing serverless stuff on AWS Lambda and don't wanna learn much about Docker, the AWS SAM CLI (which uses Docker) is pretty loving great. Even if you do know a bunch about Docker. Going from deploying Python apps with Zappa to SAM was night and day in terms of limitations, ability to test, and ability to diagnose.

Stuff I do on Lambda that would have taken WAYYYY longer without Docker-based testing locally:

- Scrape websites triggered via a REST API using headless Chrome.
- Edit videos with moviepy/ffmpeg.
- Download files locally, modify them, upload them elsewhere.
- Greatly exceed the 250MB file size limit using lambda-layers. (The SAM CLI handles making the extra layers, basically.)

Also, the fact that I can wrap everything in a try/except with the except having a time.sleep(900) means I can have a terminal to the AWS Lambda-like Docker environment. When trying to diagnose file system permissions issues, figure out errors with binaries, or get system logs in ?? locations, being able to terminal in to the environment while it is executing is a god-send.

EDIT: Oh and this worked very well even though the actual development machines were a combo of Windows and Mac

CarForumPoster fucked around with this message at 14:24 on Nov 5, 2021 |
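The SAM loop referred to above is roughly these commands (the function name is hypothetical; the subcommands are real SAM CLI ones, and the local ones require Docker to be running):

```shell
sam build                    # builds each function inside a Lambda-like container
sam local invoke ScrapeFn    # runs one function locally in that same container
sam local start-api          # serves the API Gateway routes on localhost
sam deploy --guided          # packages and deploys the stack to AWS
```

Because the local invoke runs in an image that mirrors the Lambda runtime, filesystem and binary problems show up on your machine instead of in CloudWatch.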
# ? Nov 5, 2021 14:22 |
|
|
|
I am going insane. Does anyone know / use sinon in a Node + TypeScript environment to mock fetch calls? This should be so easy, but every solution I have tried doesn't work or TypeScript won't allow it. Basically I have this: JavaScript code:
JavaScript code:
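With sinon the usual move is sinon.stub(globalThis, 'fetch').resolves(fakeResponse), with stub.restore() afterwards; in TypeScript the friction is usually getting the fake to satisfy the Response type (a cast like `as unknown as Response` on the fake is the common escape hatch). A dependency-free sketch of the same swap-and-restore idea (URL and payload invented):

```javascript
// Keep a handle to the real fetch so it can be restored afterwards
// (the moral equivalent of sinon's stub.restore()).
const realFetch = globalThis.fetch;

// Stub: resolves to a minimal Response-like object with just the bits used below.
globalThis.fetch = async () => ({
  ok: true,
  status: 200,
  json: async () => ({ cards: [{ name: 'Ember Drake' }] }), // canned payload
});

// The code under test — would normally hit the network.
async function loadCards() {
  const res = await globalThis.fetch('https://example.test/api/cards');
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return (await res.json()).cards;
}

loadCards().then((cards) => {
  console.log(cards.length); // 1 — served by the stub, no network involved
  globalThis.fetch = realFetch; // always restore the global when done
});
```

The restore step is the part people forget; a leaked fetch stub makes every later test fail in confusing ways.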
|
# ? Nov 5, 2021 16:09 |