camoseven
Dec 30, 2005

RODOLPHONE RINGIN'

Lumpy posted:

If you were using Typescript, you’d have caught that typo before you hit submit. :colbert:


worms butthole guy
Jan 29, 2021

by Fluffdaddy
would it? Still a string lol.

lazerwolf
Dec 22, 2009

Orange and Black
Coming soon TypoScript

spacebard
Jan 1, 2007

Football~

worms butthole guy posted:

would it? Still a string lol.

More like a union type of string | ShitPost

HaB
Jan 5, 2001

What are the odds?

fsif posted:

I'll be another voice in the room saying the answer probably has to be React.

I disagree with the framing that React and Vue are on equal footing re: "major player" status. The React pie is waaay larger and my guess is that any upstart frameworks (Svelte, Lit, Fresh, etc) are going to take proportionately larger chunks out of the Vue pie than the React pie.

This isn't a statement on the merits of the frameworks themselves; it's just about where the industry currently is. When I was looking for work a couple months ago, nearly everything required React and the few places that worked in Vue said they were okay with hiring React developers. I doubt that would hold the other way around.

For the record, I said they are both major players, not that they are on equal footing. React has the larger market share by a good margin. But the big 3 are still: React, Vue and Angular.

Angular has become a weird side thing now, since it's so different from Vue/React.

Spime Wrangler posted:

I've spent the last six years as a solo developer working in Vue + Typescript (never touched react) and was all excited about starting my next long-lived project in Vue3, likely bringing folks on in the next year.

Do you guys feel the long-term benefits of React are big enough and the transition straightforward enough to throw away the familiar tools and start migrating over?

Absolutely not. And Vue 3 is so good to work in. You’ll love it.

worms butthole guy posted:

I don'teqn to troll but TyoeScript seems like such a dumb thing. Either have a strongly typed language or not.

Possible spicy take on Typescript:

I feel like typescript makes maintaining your code base a LOT better, but is a colossal pain in the rear end when first developing something.

“Yes, typescript. It’s an any because I don’t know yet what shape it will wind up being later, now can you just compile so I can see if this is working?”

Maybe it’s a coding style thing, since I tend to jump in and futz about in a chaotic manner until I get something working, then I go back and clean it up. I feel like typescript is constantly getting in the way of that process.

I don’t love typescript until I am revisiting code I or someone else wrote months ago and nobody remembers that well. Typescript makes that far easier. It’s just during the initial development cycle I find it bothersome.
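A minimal sketch of that futz-then-tighten workflow (all names here are made up for illustration): start with `any` while the shape is still in flux, then pin it down in the cleanup pass.

```typescript
// Hypothetical example of the workflow described above.
// Early on, the shape isn't known yet, so `any` keeps the compiler quiet:
let draft: any = { name: "widget", count: 3 };

// Once the shape settles, the cleanup pass replaces `any` with a real type
// and the compiler starts earning its keep:
interface Widget {
  name: string;
  count: number;
}

function formatWidget(w: Widget): string {
  return `${w.name} x${w.count}`;
}

// `any` assigns freely into the typed value, so tightening up is painless:
const widget: Widget = draft;
console.log(formatWidget(widget)); // "widget x3"
```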

gbut
Mar 28, 2008

😤I put the UN🇺🇳 in 🎊FUN🎉


As someone working almost exclusively in TS these days, wrangling code made by juniors who don't really understand how helpful it can be is the hardest part of my day, but also the most fulfilling one when I manage to get my point across. I like teaching. If I were forced to do the same in JS, I'd probably just... quit?

camoseven
Dec 30, 2005

RODOLPHONE RINGIN'

HaB posted:

Possible spicy take on Typescript:

I feel like typescript makes maintaining your code base a LOT better, but is a colossal pain in the rear end when first developing something.

“Yes, typescript. It’s an any because I don’t know yet what shape it will wind up being later, now can you just compile so I can see if this is working?”

Maybe it’s a coding style thing, since I tend to jump in and futz about in a chaotic manner until I get something working, then I go back and clean it up. I feel like typescript is constantly getting in the way of that process.

I don’t love typescript until I am revisiting code I or someone else wrote months ago and nobody remembers that well. Typescript makes that far easier. It’s just during the initial development cycle I find it bothersome.

You can set up your local dev env to let you know about those warnings/errors but not block compilation. That's what we do at my work, exactly so that we can "futz about in a chaotic manner until I get something working, then I go back and clean it up"

Lumpy
Apr 26, 2002

La! La! La! Laaaa!



College Slice

camoseven posted:

You can set up your local dev env to let you know about those warnings/errors but not block compilation. That's what we do at my work, exactly so that we can "futz about in a chaotic manner until I get something working, then I go back and clean it up"

Bingo. Locally it will issue a warning, but let you do whatever. If you push, the build will fail.

TS is “great” in the sense that it puts a safety on the ultimate footgun, but yeah, it would be nicer to have a language that was strongly typed to begin with like Elm or something. But that’s not the world we live in so Typescript it is!

Armauk
Jun 23, 2021


Lumpy posted:

If you were using Typescript, you’d have caught that typo before you hit submit. :colbert:

The compiler would've caught the error and prevented the comment from being submitted ;)

prom candy
Dec 16, 2005

Only I may dance

HaB posted:

Possible spicy take on Typescript:

I actually like the process of defining my types up front, it helps me think about what I'm building. When I'm building stuff I really like thinking first in terms of "what public API does this component/module/class expose" and typescript helps me start there.

If you use esbuild your code will still compile even with type errors. I switched some of my projects to Vite and they're way faster, you just have to remember to do typechecking at some point
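A rough sketch of that "public API first" approach (the component and names are invented for illustration): write the type of the contract before any implementation exists.

```typescript
// Hypothetical example: define the component's public API up front,
// before writing the implementation.
interface PaginationProps {
  page: number;
  pageCount: number;
  onPageChange: (page: number) => void;
}

// The implementation then just has to satisfy the contract:
function clampPage(props: PaginationProps, requested: number): number {
  return Math.min(Math.max(requested, 1), props.pageCount);
}

const props: PaginationProps = {
  page: 2,
  pageCount: 5,
  onPageChange: () => {},
};
console.log(clampPage(props, 99)); // clamped to pageCount: 5
```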

smackfu
Jun 7, 2004

Dumb hooks question:

If I conditionally use a hook, React gets very upset with me.
If I conditionally render a component that uses a hook, React is just fine.
That seems like it should just be a refactoring.

Is this just an example of React internals leaking out?

prom candy
Dec 16, 2005

Only I may dance

smackfu posted:

Dumb hooks question:

If I conditionally use a hook, React gets very upset with me.
If I conditionally render a component that uses a hook, React is just fine.
That seems like it should just be a refactoring.

Is this just an example of React internals leaking out?

Yes, you can't use hooks conditionally (or in loops), so conditionally rendering the component is what people normally do. Or writing hooks that return functions instead of firing them. Or writing hooks that take a skip option.

zokie
Feb 13, 2006

Out of many, Sweden
React keeps track of which useState is which by knowing which component is being rendered and the order of the hook calls inside of the component.

dupersaurus
Aug 1, 2012

Futurism was an art movement where dudes were all 'CARS ARE COOL AND THE PAST IS FOR CHUMPS. LET'S DRAW SOME CARS.'

smackfu posted:

Dumb hooks question:

If I conditionally use a hook, React gets very upset with me.
If I conditionally render a component that uses a hook, React is just fine.
That seems like it should just be a refactoring.

Is this just an example of React internals leaking out?

Hooks are stored in a stack in the order they're called, so conditionally defining one is inherently breaking. But that stack is per-component instance, so whether the component is mounted or not doesn't have any bearing on any of the other stacks of hooks.
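A toy model of that per-component hook stack (this is not React's actual implementation, just an illustration of why call order matters): state lives in an array, and each hook call reads the next slot by index.

```typescript
// Toy model of React's hook storage: a per-component array plus a cursor.
const hookStates: unknown[] = [];
let cursor = 0;

function useStateToy<T>(initial: T): [T, (v: T) => void] {
  const index = cursor++;
  if (hookStates.length <= index) hookStates.push(initial);
  const set = (v: T) => { hookStates[index] = v; };
  return [hookStates[index] as T, set];
}

function render(showExtra: boolean): string {
  cursor = 0; // the cursor resets before each render
  const [name] = useStateToy("goon");
  // A conditional hook shifts every later hook's index between renders,
  // so later hooks read state from the wrong slot:
  if (showExtra) useStateToy(42);
  const [count] = useStateToy(0);
  return `${name}:${count}`;
}

render(true);               // slots: ["goon", 42, 0]
console.log(render(false)); // count now reads slot 1 -> "goon:42", not "goon:0"
```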

Splinter
Jul 4, 2003
Cowabunga!
I need to migrate a React/NextJS app's file upload feature from a simple multipart/form-data upload to our first-party API server, to a much more involved chunk-based file upload workflow with a 3rd party API.

The new workflow is something like this:
  • POST to create a Record
  • POST to attach a File to that Record (no actual file data at this time, just things like filename, size and content type)
  • In 32 MB chunks, POST raw bytes to the File created in the previous step
  • -- Content-Type application/octet-stream
  • -- Another header with base64 encoded JSON indicating which chunk this is (sequential index), block size and sha512 hash
  • -- This can result in a 400 response if a duplicate index is detected or hash fails
  • After all blocks have POSTed, another POST to mark the File as complete
  • After all Files for the Record are complete (though in my case it will only ever be 1 File per Record), another POST to mark the Record as complete
  • Alternatively, there's another POST route to instead cancel (rather than complete) the Record, for instance if earlier errors could be resolved by retries.

Currently, we're using Axios as our HTTP client, and also typically use React Query mutations even for requests like this for the success/loading/error states. Another potential wrinkle is users will want to be uploading many files at once (either drag-and-dropping from a folder or multi-select through OS file picker), rather than just one at a time.

I've been doing some research into handling chunk-based file uploads, so I have some idea where to start and how to handle this, but I've never done anything quite like this before, so I'm expecting some pitfalls and gotchas. Anyone have any advice, guidance, or recommended resources?
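For reference, the chunking step in that workflow could be sketched roughly like this (the endpoint and header names in the comment are placeholders, not the real 3rd party API):

```typescript
// Sketch of splitting a file's bytes into fixed-size chunks for upload.
const CHUNK_SIZE = 32 * 1024 * 1024; // 32 MB, per the workflow above

interface Chunk {
  index: number; // sequential chunk index
  bytes: Uint8Array;
}

function splitIntoChunks(data: Uint8Array, chunkSize = CHUNK_SIZE): Chunk[] {
  const chunks: Chunk[] = [];
  for (let offset = 0, index = 0; offset < data.length; offset += chunkSize, index++) {
    chunks.push({ index, bytes: data.slice(offset, offset + chunkSize) });
  }
  return chunks;
}

// Each chunk would then be POSTed with Content-Type: application/octet-stream
// and a base64-encoded JSON header carrying { index, blockSize, sha512 } --
// something like the following (header name is a guess, call not run here):
//
//   await fetch(`${fileUrl}/blocks`, {
//     method: "POST",
//     headers: {
//       "Content-Type": "application/octet-stream",
//       "X-Block-Meta": btoa(JSON.stringify(meta)),
//     },
//     body: chunk.bytes,
//   });
```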

Also, for existing requests to this same 3rd party API, we proxy everything through our API server (Spring based). This allows our API server to handle/manage the OAuth2 access token workflow and automatically inject the Bearer token into requests made to that API (rather easily via Spring Security magic). However, I'm thinking in this instance it might be better to hit the 3rd party API directly from the browser, as a) with all the requests involved, it seems more straightforward to manage without an additional intermediate server (especially in the case of errors), and b) it will save our API server a ton of bandwidth.

This will result in the browser app needing to deal with injecting the access token, but my plan is to add an API route that serves up the auth token, so the API server can still manage the OAuth workflow/token and our 3rd party API credentials (client secret based) won't have to be sent to the browser. Only users authenticated with our app (a B2B app, so employees only) would have the ability to GET the access token, so the potential for access token misuse is relatively small. Does this sound like a reasonable approach?

M31
Jun 12, 2012
First ask the 3rd party for a client implementation, because if they come up with something like this they should have the decency to provide an implementation as well.

Alternatively, keep the normal form upload and perform the splitting on the server (ULPT: and let a backend developer handle it)

If you are sending the token to the client, be sure that it's tied to the user and not a service account.

Splinter
Jul 4, 2003
Cowabunga!

M31 posted:

First ask the 3rd party for a client implementation, because if they come up with something like this they should have the decency to provide an implementation as well.

Alternatively, keep the normal form upload and perform the splitting on the server (ULPT: and let a backend developer handle it)

If you are sending the token to the client, be sure that it's tied to the user and not a service account.

Requesting a sample client implementation is a good idea. I'm not very confident this company will have this as they haven't been very responsive (integrating with us is kind of tangential to the product they provide), but I'm asking anyway.

Despite authing most of our users from the same identity provider as this 3rd party API (our client's GApps), their API only has the ability to accept a service account auth token. We do provide information about which user is performing actions via setting certain fields in our requests, but there's no ability for us to use a user token rather than one tied to our service account. Using user tokens is one of the first things I explicitly asked about, and I was told 'no' (and their API docs appear to confirm this).

One thought for how to approach this (sending a token to the browser) in a more secure manner is to configure 2 separate API clients/secrets/tokens for our app. We have the ability to configure what permissions each API client has (in terms of what routes it's able to use) from their system. 1 would only have the ability to perform actions directly related to this upload process, and that auth token would be provided to the browser for uploading file chunks only. The other would be more permissive and only used for requests made from our server. That way even if someone obtained the token provided to the browser for uploading, all they would be able to do with this token is upload something new (and these users are all generally trusted). There are possible CORS issues with this approach though which I'm going to test out.

The backend dev that would handle this will almost certainly be me...there are not that many resources available for this project. Handling this on the backend is certainly an option, but I'm aiming for the overall path of least resistance here. The big downside of handling the entire process via our backend is all that file bandwidth still has to go through our server. A ton of money that was budgeted for our servers is now being saved with the file hosting being handled by this 3rd party (and this was factored into our client's budgeting), so it would be ideal to also avoid all the extra bandwidth use. If we can't do that due to restrictions with the 3rd party's API, then that's not really on us at least. The switch to using this 3rd party for file storage was made to help our client with other aspects of their work not directly related to what our app provides (rather than by us or due to requirements related to our app), so there's at least some expectation/understanding from our client that this integration isn't going to be ideal/perfect.

zokie
Feb 13, 2006

Out of many, Sweden
Yeah, if they make such a convoluted API they should include a client. But they probably don't. Reading the whole description made me think state machine; maybe xstate or some other library can help you with the plumbing.

Chenghiz
Feb 14, 2007

WHITE WHALE
HOLY GRAIL

Splinter posted:

Despite authing most of our users from the same identity provider as this 3rd party API (our client's GApps ), their API only has the ability to accept a service account auth token. We do provide information about which user is performing actions via setting certain fields in our requests, but there's no ability for us to use a user token rather than one tied to our service account. Using user tokens is one of the first things I explicitly asked about, and was told 'no' (and their API docs appear to confirm this).

Then this uploading should be performed on your servers. Unless you’re directly responsible for the finances of your employer, your job is to advocate for the security of your systems and sending service account credentials to a browser client is wildly irresponsible.

Roadie
Jun 30, 2013

Chenghiz posted:

Then this uploading should be performed on your servers. Unless you’re directly responsible for the finances of your employer, your job is to advocate for the security of your systems and sending service account credentials to a browser client is wildly irresponsible.

Yeah, the only sane thing to do here is have your own backend that's a passthrough.

Cup Runneth Over
Aug 8, 2009

She said life's
Too short to worry
Life's too long to wait
It's too short
Not to love everybody
Life's too long to hate


Make your own API which accepts the user info and does the upload. Like a service onion.

Roadie posted:

Yeah, the only sane thing to do here is have your own backend that's a passthrough.

Yep, this.

Lumpy
Apr 26, 2002

La! La! La! Laaaa!



College Slice
Has anyone updated react-scripts from 4 to 5 in a Create React App? For fun, I just gave it a whirl following the docs and boy howdy it did not go well. It complains loudly that things require postcss 8, but react-scripts itself requires resolve-url-loader 4.0.0, which requires postcss 7.0.35. Other things were updated to postcss 8, and that is installed, but this seems odd.

prom candy
Dec 16, 2005

Only I may dance

Lumpy posted:

Has anyone updated react-scripts from 4 to 5 in a Create React App? For fun, I just gave it a whirl following the docs and boy howdy it did not go well. It complains loudly that things require postcss 8, but react-scripts itself requires resolve-url-loader 4.0.0, which requires postcss 7.0.35. Other things were updated to postcss 8, and that is installed, but this seems odd.

The correct upgrade path is react-scripts@4.x -> vite@3.x

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself
Once CRA started having opinions about what tsconfig.json should look like, going so far as overwriting changes you make to force you into its config values when you npm run start, that's when I swore it off forever.

biceps crimes
Apr 12, 2008


I'm working on a monolith that's growing quickly on the front end, we've gone from 0 to about 400k lines of typescript/tsx code over the past two years. Our monolith has also grown from having like 3 teams working in it to 12-15ish.

We've mitigated the size costs in a few different ways: using jest's --changedSince to run selective tests for PR builds, only linting changed files, and so on. Webpack builds were slowing down steadily, but I swapped babel-loader out for swc-loader earlier in the year and that sped things up like 5x.

I think the main pain we're feeling is that the team I'm on is responsible for dependency management, the component library, etc., and our latest component library major version migration took us a couple of months to complete because there's just so much typescript code. People on the team (including myself) were thinking that migrating would have been easier if we had everything sectioned off and organized in a monorepo-like way, but now that I'm looking into it more, it doesn't seem like that's actually the case. The NX authors recommend only one package.json file for the whole monorepo, and I may be mistaken, but having multiple versions of some dependencies throughout the monolith without it breaking seems like it might require some microfrontends approach?

Anyone dealing with something similar? Do I need to tell management that migrating a dependency for 400k lines of code just takes a while and they need to adjust their expectations? jscodeshift was a huge help, but most of the time we spent came from being unfamiliar with the other teams' code we were looking at, and of course just ironing out dreaded styling issues that couldn't be found programmatically, due to teams or legacy style sheets crapping on our components from the global namespace.

biceps crimes fucked around with this message at 01:54 on Aug 26, 2022

MrMoo
Sep 14, 2000

Monorepos are somewhat of another cargo cult. There are very few actual benefits and many, many disadvantages. What you are experiencing is advertised as a benefit, but it runs counter to all modern practice of modularity and slowly updating components one at a time.

https://youtu.be/VvcJGjjEyKo

Show me wrong, please!

prom candy
Dec 16, 2005

Only I may dance
I lost a morning this week trying to run a React 18 app in a monorepo alongside some React 17 apps. So now it's in its own repo, but I'm having to copy and paste a ton of code into it, so really, what are you gonna do?

If you have a monolith isn't that kinda already a monorepo? Usually moving to monorepo involves moving separate projects into one repo so they can share dependencies and configuration. This is why I'm starting a monorepo in the first place, we have a handful of front end apps that should be able to easily share components and utils and in separate repos they can't really.

teen phone cutie
Jun 18, 2012

last year i rewrote something awful from scratch because i hate myself

MrMoo posted:

Monorepos are somewhat of another cargo cult. There are very few actual benefits and many, many disadvantages. What you are experiencing is advertised as a benefit, but it runs counter to all modern practice of modularity and slowly updating components one at a time.

https://youtu.be/VvcJGjjEyKo

Show me wrong, please!

Monorepos were great for my previous teams of ~10 frontend developers. We had 2-3 services though: 1 JS SDK for interactions with our Python backend, 1 for the website client code, and 1 for our component library (this one I don't think should have been its own project, IMO).

It worked great, but I honestly believe it's because I learned from experience and knew what I was doing when I set them up. It sped up development time, and the hot-reloading between dependencies worked amazingly. For example, updating code in the JS SDK hot-reloaded the website.

That being said, there are very few people that I would trust to own a monorepo. You really have to know what you're doing.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
If your monolithic application can't handle different subcomponents using different versions of the same library as a dependency, then you're going to have to deal with the pain of updating that library everywhere in your monolith at the same time. There's no way around it, and it has absolutely nothing to do with the type of repository you're using. One big advantage of a monorepo here is that it's actually tractable to upgrade a dependency for all users simultaneously, if that turns out to be the best way to do it.

Why does upgrading your component library to a new major version require significant work? Do the library authors not believe in backwards compatibility?

Doom Mathematic
Sep 2, 2008
I say this every time but always remember that there's a middle ground between whatever you have currently and a monorepo. It might be that all you really need to do is merge two or three of your repos together because that makes sense, and the rest can stay separated.

HaB
Jan 5, 2001

What are the odds?

Doom Mathematic posted:

I say this every time but always remember that there's a middle ground between whatever you have currently and a monorepo. It might be that all you really need to do is merge two or three of your repos together because that makes sense, and the rest can stay separated.

Hey speaking of monorepos, I just started a new gig and pulled down the repo for the first time today. So uh....assuming you don't work at something like a game shop where you check in gigantic assets- what's the longest you've seen a repo take to pull down?

Before today, I would have said maybe a minute, tops. I have gigabit fiber and errrrything.

This one took NINE minutes.

I suppose it shouldn't be that alarming now that I have seen it, as the folders the frontends live in are 8 folders deep.

:stare:

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
The Google/Facebook monorepos are so large that they use sparse files and special filesystems to only pull down files that are actually used.

prom candy
Dec 16, 2005

Only I may dance

Doom Mathematic posted:

I say this every time but always remember that there's a middle ground between whatever you have currently and a monorepo. It might be that all you really need to do is merge two or three of your repos together because that makes sense, and the rest can stay separated.

This is the direction I'm moving in. Some of our stuff doesn't make sense separated anymore, but some of our stuff would make no sense together either.

Doom Mathematic
Sep 2, 2008

HaB posted:

Hey speaking of monorepos, I just started a new gig and pulled down the repo for the first time today. So uh....assuming you don't work at something like a game shop where you check in gigantic assets- what's the longest you've seen a repo take to pull down?

Before today, I would have said maybe a minute, tops. I have gigabit fiber and errrrything.

This one took NINE minutes.

I suppose it shouldn't be that alarming now that I have seen it, as the folders the frontends live in are 8 folders deep.

:stare:

We had an issue where a team, instead of uploading their package to our internal npm registry and then using it as a production dependency, like

JSON code:
"@our-co/package-1": "^1.2.3"

would put a git URL like

JSON code:
"@our-co/package-1": "git+ssh://git@ourinternal.github.com:our-co/package-1.git#master"

This meant that installing it pulled down around 50MB of live source files (most of which were useless scripts, documentation...) and 350MB of git history. At first we thought that npm install was just stalling, but in fact it ran to completion after about an hour.

frogbs
May 5, 2004
Well well well
Maybe this isn't the best thread for this, but i'll give it a shot.

I want to make a simple page that scrapes some publicly available data from a government site once per hour and then displays some of that data. Because i'm old and dumb my inclination would just be to write a cron job that scrapes the data, saves it locally, then one PHP file that parses and displays it.

What's a more modern/2022 way of doing this that isn't reliant on node/npm and all the package hell/layers of abstraction? I'm interested in learning something new, but am kind of sour on a lot of the JS ecosystem stuff. Any suggestions welcome!

prom candy
Dec 16, 2005

Only I may dance

frogbs posted:

Maybe this isn't the best thread for this, but i'll give it a shot.

I want to make a simple page that scrapes some publicly available data from a government site once per hour and then displays some of that data. Because i'm old and dumb my inclination would just be to write a cron job that scrapes the data, saves it locally, then one PHP file that parses and displays it.

What's a more modern/2022 way of doing this that isn't reliant on node/npm and all the package hell/layers of abstraction? I'm interested in learning something new, but am kind of sour on a lot of the JS ecosystem stuff. Any suggestions welcome!

Rust is pretty hot right now.

HaB
Jan 5, 2001

What are the odds?

frogbs posted:

Maybe this isn't the best thread for this, but i'll give it a shot.

I want to make a simple page that scrapes some publicly available data from a government site once per hour and then displays some of that data. Because i'm old and dumb my inclination would just be to write a cron job that scrapes the data, saves it locally, then one PHP file that parses and displays it.

What's a more modern/2022 way of doing this that isn't reliant on node/npm and all the package hell/layers of abstraction? I'm interested in learning something new, but am kind of sour on a lot of the JS ecosystem stuff. Any suggestions welcome!

I did something similar recently and for the scrape I just used jQuery.

Pull specific elements via judicious use of selectors, grab the innerText - then do with it what you want.

Mine was done in a node backend, but you could drop jquery in to just about anything, really. Could even do it as a cron job that calls a command line node script using jquery.

Can def be done without getting into the package ecosystem, as you don't really need anything other than jquery.

camoseven
Dec 30, 2005

RODOLPHONE RINGIN'
Please, please do not use jquery. Basically everything jquery does is now built into JS.

If you're going to scrape with Node try something like JSDOM
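A toy sketch of the extract step without any packages (hand-rolled string matching like this is fragile; JSDOM or cheerio is the robust route for real HTML):

```typescript
// Toy extractor: pull the text content of every <tag>...</tag> pair.
// Deliberately naive -- no attributes, no nesting -- just to show the idea.
function extractByTag(html: string, tag: string): string[] {
  const results: string[] = [];
  const open = `<${tag}>`;
  const close = `</${tag}>`;
  let from = 0;
  while (true) {
    const start = html.indexOf(open, from);
    if (start === -1) break;
    const end = html.indexOf(close, start);
    if (end === -1) break;
    results.push(html.slice(start + open.length, end).trim());
    from = end + close.length;
  }
  return results;
}

const sample = "<html><td>42</td><td>7</td></html>";
console.log(extractByTag(sample, "td")); // ["42", "7"]
```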

kedo
Nov 27, 2007

I used cheerio to handle some server-side DOM parsing for a node project recently and it did a great job. I realize you can do 99% of this stuff with vanilla JS these days, but the jQuery syntax and helpers make things a bit quicker.

e: I finished reading the full first question and saw that OP is trying to avoid npm/JS, so :shrug:


blunt
Jul 7, 2005

frogbs posted:

Maybe this isn't the best thread for this, but i'll give it a shot.

I want to make a simple page that scrapes some publicly available data from a government site once per hour and then displays some of that data. Because i'm old and dumb my inclination would just be to write a cron job that scrapes the data, saves it locally, then one PHP file that parses and displays it.

What's a more modern/2022 way of doing this that isn't reliant on node/npm and all the package hell/layers of abstraction? I'm interested in learning something new, but am kind of sour on a lot of the JS ecosystem stuff. Any suggestions welcome!

The once-hourly gathering/processing/saving of data sounds like a perfect use case for a lambda/serverless function. Then you can display the data with basic HTML and as much or as little JavaScript as you fancy. Host the whole thing for free on something like Netlify.
