Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I'm in node_modules heck

So, as with every company I've ever worked for, I was given a bug ticket. It says that puppeteer, which opens our HTML assets server side to take screenshots, is producing blurry images. I tracked it down to the version of Chrome running on our server needing an update.

Well, this means the version of puppeteer needs to be updated, since each version of puppeteer only supports one version of Chrome. But when I update puppeteer and Chrome, I now get the following error. Because why would it work? In what possible scenario could I ever expect updating a module to work? The whole point of node_modules is to break the second you have to touch anything in there.

code:
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './internal/node/install.js' is not defined by "exports"
Of course, "Package subpath is not defined by exports" — what the heck does that mean? Apparently there are several schools of thought about this error. One is that you should delete your node_modules and lock file, then install everything again. Another is to `npm update`, and another is to `npm audit fix`. None of these solved my problem, so maybe if I dig a little deeper I'll come up with a solution.

Thankfully I found the rogue devs who post their findings; usually this happens on mailing list archives. There was more than one, and I tried all of their suggestions, but one got me closer than the others. They say puppeteer doesn't throw this error as long as it's below a certain version. So I downgraded puppeteer, went through the effort of figuring out which version of Chrome that version supports, and found a copy of that too.

Now I get a little bit further, and it's a totally different error.

code:
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './internal/common/DeviceDescriptors.js' is not defined by "exports"
Oh great. I forgot why I love node_modules so much, and why maybe an entire industry shouldn't depend on a handful of dipsticks who nuked a bunch of servers a year or two ago with an `rm -rf` bug they included in a patch release.

I have spent an entire day on this now. It's pretty unbelievable. I just want the dang version of Chrome to update.


Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Sounds interesting, thanks

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

Nolgthorn posted:

ERR_PACKAGE_PATH_NOT_EXPORTED

The solution to this was to update node to a compatible version inside the docker instance. Now npm is telling me instead that dotenv is not installed. It certainly, absolutely is. I've deleted node_modules and the .lock file again, re-installed, force installed, cleared the cache, audit fixed, on and on and on.

Npm is impossibly bad.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Ever feel like "is this honestly what they pay me for?"

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I'm at a company that enforces strict versions in the package.json file, like most companies do these days. You can't rely on third party developers to use semver properly, which is why the lock file was introduced. But then when you look up just about any npm related issue, half the suggestions are to delete the lock file. So it's worthless. I can't imagine depending on it.

It's not the cause of this issue anyway.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Yes.

The problem was that I needed to separately run `npm install` inside each docker container that wants to use this specific library, after navigating to the directory where it is used, because of the way that library is triggered, which is inside a postinstall hook or something like that. I've forgotten the details, because when it comes to problems like this I hate them so much I erase the solution from my brain.

That way if they ever come up again I can re-experience solving the problem from scratch.

Separately updating node broke many libraries in node_modules and so on. And then every so often I'll be browsing the web and someone shows up and is like "I don't know why anyone would care what's in their node_modules directory that seems silly to me."

So anyway, all I wanted to do a few days ago was update chrome.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
It also speeds up development dramatically on teams. Because even if I don't know what your function does I can be confident it takes certain types and returns certain types.

Thank God

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

fuf posted:

It's annoying how you have to do things the wrong way first to figure out why the right way is the right way.

Wait till you do things the right way and figure out why they're also wrong. All solutions point back to vanilla js, so then you do that and realize why React is probably the right way. It's a never ending circle.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

go play outside Skyler posted:

Every time I try to start a project without react I end up with a million calls to document.createElement and sometimes I feel that all JS needs is a standardized simple template system.

But isn't that refreshing? I love it. It's like

JavaScript code:
function buildCatEle (name, age) {
    const ele = document.createElement('div');
    // etc...
    return ele;
}

// catsEle here is assumed to be an existing container element
const catEle = buildCatEle('Darby the Cat', 6);
catsEle.appendChild(catEle);
ah, peace and quiet.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

Obfuscation posted:

Now all you need is a state management system that knows which templates to re-render when your state changes and whoops, you have re-invented react

That's the big feature for sure. But sometimes I prefer to just

JavaScript code:
function updateCat (props) {
    const cat = cats.find(({ id }) => id === props.id);
    if (cat) {
        cat.elements.name.innerHTML = props.name;
    }
}
Or whatever.


In a lot of cases I don't need everything to react to events.

Nolgthorn fucked around with this message at 15:04 on Jan 31, 2023

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Does this count as jsx? I could do a full json version of it, which would be easier to read.

JavaScript code:
function ele (name, attrs, ...children) {
    const element = document.createElement(name);
    for (const [key, value] of Object.entries(attrs)) {
        element.setAttribute(key, value);
    }
    for (const child of children) {
        if (typeof child === 'string') {
            element.appendChild(document.createTextNode(child));
        } else {
            element.appendChild(child);
        }
    }
    return element;
}

ele('div', { class: 'cat' },
    ele('div', { class: 'name' }, 'I\'m a cat'),
    ele('div', { class: 'avatar' },
        ele('img', { src: 'https://blahblah' })
    )
);

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Simplest solution is just

JavaScript code:
const html = `
  <div class="cat">
    <div class="name">${name}</div>
    <div class="avatar"><img src="${src}" /></div>
  </div>
`;
What would be close enough to jsx ?

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
^ Yes, that's the thing. I was wondering what happened to it but couldn't remember what they were called.


The following

JavaScript code:
const element = document.createRange().createContextualFragment(html);
wherever.appendChild(element);
Continuing my previous post: this seems to be the most up to date way to work with DOM nodes from an HTML string using vanilla JavaScript. You can perform all the expected operations like `element.querySelector` and so on. It's more up to date than `DOMParser`, apparently.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

fuf posted:

option 3: ??
code:
class Category
   id: number
   name: string

class Tag
   id: number
   name: string

class CategoryTag
  categoryId: number
  tagId: number
To me this doesn't seem like a good solution, because you're still performing operations as expensive as they would be if you stuck a categoryId onto each Tag. If you don't need a many to many relationship and you don't need additional parameters on CategoryTag, then go with the simpler option.

You are probably finding tag references from the category more often, so I'd reference tags within the category.
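The simpler option sketched out — the interfaces and data here are invented for illustration:

```typescript
// Hypothetical shapes for "reference tags within the category":
interface Tag { id: number; name: string }
interface Category { id: number; name: string; tagIds: number[] }

const tags: Tag[] = [
    { id: 1, name: 'red' },
    { id: 2, name: 'blue' },
];
const category: Category = { id: 10, name: 'colors', tagIds: [1, 2] };

// Finding a category's tags is then a straightforward filter:
const categoryTags = tags.filter((t) => category.tagIds.includes(t.id));
```

No join table to maintain, at the cost of not supporting many to many.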

Nolgthorn fucked around with this message at 23:03 on Feb 8, 2023

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
You could try my framework; it does its best to let you use just native req/res node thingys.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
For ages I used || for defaults; going through old code I always change it to ??.
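The difference in one snippet (the config object is made up):

```typescript
const opts = { retries: 0, label: '' };

// `||` falls back on every falsy value, so 0 and '' get clobbered:
const retriesOr = opts.retries || 3;          // 3, probably not what you wanted
const labelOr = opts.label || 'default';      // 'default'

// `??` only falls back on null/undefined, which is usually what a default means:
const retriesNullish = opts.retries ?? 3;     // 0
const labelNullish = opts.label ?? 'default'; // ''
```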

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I like storing datetimes as integers in the database. For a while now I've been stripping the ms off the unix timestamp so that it fits, but this just pushes back the problem until unsigned integers can't store the number of seconds since the epoch either. So now, in order to continue feeding my delusion, I want to start stripping years off the integer too. So it'd be seconds since the epoch plus 50 years.

That ought to fit more numbers in there for a little bit longer.
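The hack sketched out, with invented helper names and a rough 50-year constant:

```typescript
// Roughly 50 years of seconds (365.25-day years) — an invented offset constant.
const FIFTY_YEARS_SECONDS = Math.floor(50 * 365.25 * 24 * 3600);

// Store: drop the milliseconds and subtract the offset.
function toStoredSeconds(ms: number): number {
    return Math.floor(ms / 1000) - FIFTY_YEARS_SECONDS;
}

// Read: add the offset back and restore milliseconds (sub-second precision is gone).
function fromStoredSeconds(stored: number): number {
    return (stored + FIFTY_YEARS_SECONDS) * 1000;
}
```

The round trip loses sub-second precision, which is the tradeoff already being made by stripping the ms.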

The suggestion would be to stop using an unsigned integer and use a bigint or something like that. Sure, but then what happens once I try to read the number in JavaScript? I don't want to deal with a BigInt in JavaScript.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
MySQL's Int, for example, stores up to 2,147,483,647, which as a unix timestamp runs out sometime in 2038. My app then becomes one of many that will break on that date, so I'm expected to store the unix timestamp as a BigInt instead. JavaScript can work with integers up to 9,007,199,254,740,991, which as a second count won't run out for millions of years, so that leaves a lot of room. I guess the difficulty is in converting the database's BigInt to a JavaScript number.

For some reason, if I use BigInt my ORM refuses to return anything but a JavaScript BigInt, which then requires conversion. It's Prisma, by the way, the best ORM of all time, except for this one dire flaw.
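The conversion itself is small if you add a guard — a sketch, function name invented:

```typescript
// Convert a bigint (e.g. from the database driver) to a plain number,
// refusing values that can't be represented exactly as a double.
function bigintToNumber(value: bigint): number {
    if (value > BigInt(Number.MAX_SAFE_INTEGER) || value < BigInt(-Number.MAX_SAFE_INTEGER)) {
        throw new RangeError(`${value} is outside the safe integer range`);
    }
    return Number(value);
}
```

Timestamps in seconds stay well inside the safe range for the foreseeable future, so the guard should never fire in practice.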

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Yes I know the correct answer is to use a datetime format. I just also don't like working with Date objects.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I have decided to use datetime, copilot convinced me in like 30 minutes.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I am argumentative and after years of experience I've found I prefer getting into confrontations with robots.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

Polio Vax Scene posted:

its probably been so widely used now that changing it in any way would cause chaos

Deprecate it; let's get this ball rolling

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I think it's odd that Temporal defaults to using timezones, specifically the system timezone, unless you clarify otherwise with the much longer "Plain" naming convention. It's like they're committed to that, even though the only time I can see timezones being useful is when we want to display a date to a user, and even then the system timezone isn't what we would use.

Seems like if you created a Temporal object using a string that contains a timezone, it should assume you want that date and time in UTC. Or it should assume that any datetime you provide without one is in UTC.

Nolgthorn fucked around with this message at 11:14 on Dec 12, 2023

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
When I go back to old codebases it isn't the coding practices that bother me, it's the fact that I still didn't appreciate typescript as much as I should have, and so didn't use it. I met one guy professionally who was against it. His reason was that dynamic typing is a feature, and that you're supposed to validate the type of an argument everywhere it's passed, for reasons beyond my understanding.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Sometimes I want to go back to a more innocent time, when I did my duty as a good programmer who listened to more experienced developers. But you can only have been secretly right, without telling anybody about it, so many times before you wanna explode.

Follow the advice given, but also make it clear that you're thinking something different anyway; just do it in a friendly, conversational way.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I don't mean to be demeaning in any way, but I pasted your exact post into chatgpt to see what it would say, and it pretty much nailed this nail as hard as a nail could possibly be nailed.

quote:

To convert the arrow function to a regular function declaration, you need to explicitly define the function and its parameters. In your case, you have a function that takes a metrics argument and returns another function that takes an input argument. Here's how you can rewrite it using a regular function declaration:

TypeScript code:
const publish = function(metrics) {
  return async function(input) {
    return metrics.someChainedFuncsThatPublishAMetric(input.exampleKey);
  };
};

// Called as
await publish(metrics)({ exampleKey: 'example value' });
In this example, the publish function takes a metrics parameter and returns an asynchronous function that takes an input parameter. The body of the function remains the same, with the call to metrics.someChainedFuncsThatPublishAMetric(input.exampleKey).

Keep in mind that using arrow functions in this context can often lead to more concise and readable code. Arrow functions have lexical scoping for this, which can be beneficial in certain situations. However, if you specifically need a regular function declaration, the example above should help you achieve that.

I guess what it's missing is that if all you're trying to do is be this concise, you could make it one function. But all you've really done there is reconfigure the way someChainedFuncsThatPublishAMetric is called, and rename it.

TypeScript code:
async function publish(metrics, input) {
  return metrics.someChainedFuncsThatPublishAMetric(input.exampleKey);
}

// Called as
await publish(metrics, { exampleKey: 'example value' });

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
We could also put some funky types on it. I prefer to return the value from async functions so it would look something like

TypeScript code:
function publish (metrics: TheThing) {
  return async function (input: ExampleObjType): Promise<MetricsLogger> {
    return await metrics.someChainedFuncsThatPublishAMetric(input.exampleKey);
  };
}

// Called as
await publish(metrics)({ exampleKey: 'example value' });
// Opposed to
await metrics.someChainedFuncsThatPublishAMetric('example value');

Nolgthorn fucked around with this message at 14:39 on Jan 23, 2024

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Oh I get it.

I misread the code. Dunno what you're working with, but assuming it's a callback because it's got a delay, you can make it async yourself.

TypeScript code:
async function getMetrics () {
  return new Promise((res) => metricScope(res));
}

const metrics = await getMetrics();

metrics.putDimensions({ Service: "Aggregator" });
metrics.putMetric("ProcessingLatency", 100, Unit.Milliseconds, StorageResolution.Standard);
metrics.putMetric("Memory.HeapUsed", 1600424.0, Unit.Bytes, StorageResolution.High);
metrics.setProperty("RequestId", "422b1569-16f6-4a03-b8f0-fe3fd9b100f8");
// ...
Or, inline even if you wanted to.

TypeScript code:
const metrics = await new Promise((res) => metricScope(res));
Apologies if I gummed up this thread with a bunch of nonsense.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

fuf posted:

My adventures in learning React state management continue, and I'm currently rebuilding my card game app in NextJS 14.

I like the idea of using routes to manage state, but uhhhh now my app router directory looks like this:


Is this kind of thing normal or have I gone completely insane? I probably don't need all of those pages and layouts, but I'm trying to come up with a consistent pattern so I can build out the UI as necessary.

I think I could use parallel routes and intercepting routes to make the hierarchy a little less deep, but I'm not sure I need them (yet) and they might just make things more complex.

The app is all about selecting cards for different slots, and then doing stuff with the array of currently selected cards. The routes above handle doing the selecting, but it looks like I'm still going to need to use something like Zustand to keep track of my currently selected cards array in a global store. Which is fine but it would be nice if I could do it all within NextJS.

The server-side rendering thing feels like a bit of a double edged sword. I really really love that you can just do something like this directly in a component:
code:
const categories = await prisma.category.findMany();
and cut out the whole middle layer of api calls.

But it also adds a lot of complexity, like apparently with Zustand you have to initialise the store both on the server and the client somehow.

Anyhoo any NextJS thoughts or tips very welcome.

I've been playing with this stuff too.

Regarding the router, I'm finding it to be pretty awesome for keeping pages organized and all the magic it does with layout is wicked. So I'm feeling drawn to defend it. Why do you need so many ids in your url? Assuming they're unique identifiers all you really need is the last one.

You could have `/category/categoryId` the same way you could have `/card/cardId`, unless I'm misunderstanding why your url is so complicated.

Also, I would never initialize a store on the server; that seems weird. That's your source of truth, and you should be updating a store on the client based off of that source of truth. I'm not sure managing state on your server is something you're supposed to do. On the other hand, writing SQL statements in a React component didn't used to be something we were supposed to do either.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
I don't have anything to contribute but I want to compliment your easy to read well organized code.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Writing my own SQL ORM, I have a schema like this:

JavaScript code:
export default createSchema({
    tables: [
        {
            name: 'Address',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'userId', type: 'uuid' },
                { name: 'name', type: 'string', nullable: true },
                { name: 'city', type: 'string' },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
                { type: 'unique', column: 'userId' },
            ],
            foreignKeys: [
                { table: 'User', id: 'id', column: 'userId', onUpdateDelete: 'cascade' },
            ],
        },
        {
            name: 'Pet',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'userId', type: 'uuid' },
                { name: 'name', type: 'string' },
                { name: 'type', type: 'enum', values: ['dog', 'cat', 'fish'] },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
            ],
            foreignKeys: [
                { table: 'User', id: 'id', column: 'userId', onUpdateDelete: 'cascade' },
            ],
        },
        {
            name: 'User',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'name', type: 'string' },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
            ],
        },
    ],
});
And that's great and I might have types therefore that look like this:

JavaScript code:
export interface DbAddress {
    id: string;
    userId: string;
    name: string | null;
    city: string;
    user?: DbUser;
}

export interface DbPet {
    id: string;
    userId: string;
    name: string;
    type: 'dog' | 'cat' | 'fish';
    user?: DbUser;
}

export interface DbUser {
    id: string;
    name: string;
    address?: DbAddress;
    pets?: DbPet[];
}
But the only way I have really figured out to get that schema to become these types is to programmatically generate a types.ts file. Which is OK, and I am doing that; I'm just working out the best solution. For one thing it means I'm generating a file inside the node_modules directory, which doesn't seem right. I think prisma does that, but just because prisma does it doesn't mean it's the best solution.

I've played around with putting the generated types in the project directory but I'm having a lot of problems when it comes to using those types inside my module, I'm effectively forced to use dynamic imports which are asynchronous.
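The codegen itself can be sketched as a function from tables to source text — the type map, shapes, and names here are all simplified assumptions, not the real library:

```typescript
// Hypothetical mapping from schema column types to TypeScript types.
const typeMap: Record<string, string> = {
    uuid: 'string',
    string: 'string',
    integer: 'number',
    boolean: 'boolean',
};

interface ColumnDef { name: string; type: string; nullable?: boolean }
interface TableDef { name: string; columns: ColumnDef[] }

// Emit the text of an interface per table, prefixed like the Db* types above.
function generateTypes(tables: TableDef[], prefix = 'Db'): string {
    return tables.map((table) => {
        const fields = table.columns
            .map((col) => {
                const tsType = typeMap[col.type] ?? 'unknown';
                return `    ${col.name}: ${tsType}${col.nullable ? ' | null' : ''};`;
            })
            .join('\n');
        return `export interface ${prefix}${table.name} {\n${fields}\n}`;
    }).join('\n\n');
}

const output = generateTypes([
    { name: 'User', columns: [
        { name: 'id', type: 'uuid' },
        { name: 'name', type: 'string' },
    ] },
]);
// output is the source text for an `export interface DbUser { ... }` declaration
```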

I asked copilot and it says I can do funny things with typescript like this:

JavaScript code:
// Utility function to dynamically create a type based on a JavaScript array of strings
function createDynamicType(keys: string[]): Record<string, string | number> {
    const result: Record<string, string | number> = {};
    keys.forEach(key => {
        result[key] = 'string'; // You can customize the type here based on your requirements
    });
    return result;
}

// Example usage with a JavaScript array of strings
const dynamicKeysArray = ['id', 'name', 'age'];
type DynamicObject = typeof createDynamicType(dynamicKeysArray);
// (note: `typeof` can't be applied to a call expression, so this line
// doesn't actually compile as written)

// Now DynamicObject is equivalent to { id: string, name: string, age: string }
const myData: DynamicObject = {
    id: '123',
    name: 'John',
    age: '25'
};
But I haven't worked out how to use this knowledge, or whether it's a sensible approach, or whether I can make it dynamically reference other dynamically generated types. What is best? I'm leaning towards my generate-a-types-file-in-node_modules approach.

Nolgthorn fucked around with this message at 02:28 on Mar 21, 2024

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
It would be really cool if I could figure out how not to. One of the problems with generating the types file is that my orm can only really support one database at a time without risking types mapping over one another.

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Having trouble getting anything else to work but I found a solution to the "only one database" problem. Have schema include a type prefix:

JavaScript code:
export default createSchema({
    typePrefix: 'Db',
    tables: [
        {
Then the generated code:

JavaScript code:
export type DbKey = 'Db';
export type DbTableKey = 'User' | 'Address' | 'Pet';

// MAPS
export type DbUtil = { [K in DbKey]: DbTableMap[K] };

interface DbTableMap {
    Db: { row: DbTableRow, options: DbTableOptions, where: DbTableWhere, select: DbTableSelect, include: DbTableInclude };
}
type DbTableRow = { [K in DbTableKey]: DbTableRowMap[K]; };
type DbTableOptions = { [K in DbTableKey]: DbTableOptionsMap[K]; };
type DbTableWhere = { [K in DbTableKey]: DbTableWhereMap[K]; };
type DbTableSelect = { [K in DbTableKey]: DbTableSelectMap[K]; };
type DbTableInclude = { [K in DbTableKey]: DbTableIncludeMap[K]; };

interface DbTableRowMap {
    User: DbUserRow;
    Address: DbAddressRow;
    Pet: DbPetRow;
}

etc
Then I can access the type using:

JavaScript code:
DbUtil['Db']['options']['User']
Or something like that.

It's sort of like I definitely have to use generics; I can't just look up types using strings in my code. Generics is how I did it originally, but I thought I'd try to find a way to remove the litany of generics everywhere in the codebase.

Nolgthorn fucked around with this message at 13:30 on Mar 21, 2024

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Perhaps in an effort to go crazy I have made a little bit of headway regarding inferred types.

JavaScript code:
const mySchema = {
    tables: [
        {
            name: 'Address',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'userId', type: 'uuid' },
                { name: 'name', type: 'string', nullable: true },
                { name: 'city', type: 'string' },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
                { type: 'unique', column: 'userId' },
            ],
            foreignKeys: [
                { table: 'User', id: 'id', column: 'userId', onUpdateDelete: 'cascade' },
            ],
        },
        {
            name: 'Pet',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'userId', type: 'uuid' },
                { name: 'name', type: 'string' },
                { name: 'type', type: 'enum', values: ['dog', 'cat', 'fish'] },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
            ],
            foreignKeys: [
                { table: 'User', id: 'id', column: 'userId', onUpdateDelete: 'cascade' },
            ],
        },
        {
            name: 'User',
            columns: [
                { name: 'id', type: 'uuid', auto: true },
                { name: 'name', type: 'string' },
            ],
            indexes: [
                { type: 'primary', column: 'id' },
            ],
        },
    ],
} as const;

type Schema = typeof mySchema;
type Table = Schema['tables'][number];
type Column = Table['columns'][number];

type TypeMapping = {
    boolean: boolean;
    date: Date;
    datetime: Date;
    enum: string;
    integer: number;
    string: string;
    text: string;
    time: Date;
    uuid: string;
};

type ColumnType<T extends Column> = T extends { type: 'enum', values: readonly string[] }
    ? T['values'][number]
    : TypeMapping[T['type']];

type NullableType<T extends Column> = T extends { nullable: true }
    ? ColumnType<T> | null
    : ColumnType<T>;

type TableType<T extends Table> = {
    [K in T['columns'][number]['name']]: NullableType<Extract<T['columns'][number], { name: K }>>
};

type User = TableType<Extract<Table, { name: 'User' }>>;
type Address = TableType<Extract<Table, { name: 'Address' }>>;
type Pet = TableType<Extract<Table, { name: 'Pet' }>>;

const address: Address = {
    id: 'a-uuid',
    userId: 'a-uuid',
    name: null,
    city: 'a-string',
};

const pet: Pet = {
    id: 'a-uuid',
    userId: 'a-uuid',
    name: 'a-string',
    type: 'dog',
};

const user: User = {
    id: 'a-uuid',
    name: 'a-string',
};
The trick is to use `as const` on the schema object. This allows all kinds of inference operations, because typescript then knows the data doesn't change: `T extends { name: 'userId' }` isn't going to suddenly have any of the possible keys from the other objects in the array, it's got exactly the keys that are on its object.

However this is about as far as I can go.

I am using `createSchema(mySchema)` to provide the user with types while building the object. There isn't any way, at least that I've found, for `createSchema` to accept an object of a specific type and also have that object be inferred as const.

JavaScript code:
export function createSchema (schema: TSchemaOptions) {
    return schema;
}
Like, you can only use `as const` where the object is defined. So, I dunno, it's a bit of an ugly user experience if I require the user to write `createSchema({} as const)`, and it also eliminates type checking, which is the entire point of the function. The way this function is written eliminates any possibility of the object being const, which is what it needs to be. Other libraries seem to have this figured out, but I haven't figured out how to read their codebases.

So that's a roadblock.
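For what it's worth, TypeScript 5.0 added `const` type parameters, which look like they might be how other libraries pull this off: the argument gets inferred as if the caller had written `as const`, while still being checked against the constraint. A sketch, with a deliberately simplified constraint:

```typescript
// `const T` asks the compiler to infer the literal types of the argument
// without the caller writing `as const` at the call site.
function createSchema<const T extends { tables: readonly { name: string }[] }>(schema: T): T {
    return schema;
}

const schema = createSchema({
    tables: [{ name: 'User' }, { name: 'Pet' }],
});
// typeof schema.tables[0].name is now the literal type 'User', not string
```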

The other big setback is that I still need my Address type to have a user parameter that maps to the User type, and vice versa. In code I'm figuring out what these associations are with a sweet javascript function that calculates all of them.

But I can't use that... because then I'm no longer in type world. I have to look at the Address table referencing the User table, and infer from that that the User type should have an address parameter.

It's almost like, instead of having the user define foreign keys, I should be doing that programmatically, and instead have them define what all the relationships between tables are. A massive change, largely just so that I can more easily negotiate types. Furthermore, I'm really not certain what happens when the Address object type references a User object type which references an Address object type.

Isn't that gonna spin my cpus into the stratosphere?

Nolgthorn fucked around with this message at 11:27 on Mar 26, 2024

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
Why did I think I shouldn't use sequelize? I thought a long time ago everyone switched to knex; then it was prisma, and now drizzle. Ultimately you're correct, we don't need it. But when I built my own testing framework, for example, I learned a heck of a lot. When I built my own web framework, that was the most useful thing ever; it's the best node web framework out there, in my opinion. At least when it comes to apis and simple sites: it can't do the fancy things next can do, but it's really easy to use, does everything I need, and it's the only one out there that does cors properly.

Ultimately I learned a lot, and this sql orm is the same. Although my sabbatical is ending, and I don't think I'm going to finish it in time. It's turned out to be about 10 times harder to make than a web framework.

Nolgthorn fucked around with this message at 11:08 on Mar 26, 2024


Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense
You have to do something like render it on the page with no height or width, measure it, then adjust the size using a percentage. There's no easy way to do it, because of different fonts and operating systems and browsers and so on.
