|
octan3 posted:I've been meaning to reply to your post since I saw it show up but haven't had the time to. Just letting you (and anyone else who cares haha) know that I'm working on this again. Part of what I'm doing before incorporating the floorplan module is decoupling a lot more code from the project. For example, the current generation method provided by the original designer had mesh placement within room generation, which is fair, but also hella static. Rooms themselves are also statically defined, which is no good. So the objective for the next couple of days is ripping out the room generator completely and slotting in the floorplan generator instead, which will not attempt to add meshes (that's a separate module). I've been looking at the workaround as well (assume rectangular and trim) and for now I will be going with it, with the added caveat of how rooms are designed. Currently the generator assumes one giant floor for each story in the building, which means you don't get individual floor materials. That's fine because essentially the rooms themselves are self-contained, sitting flush with the building frame. So each room has its own walls, floor and ceiling inside the story. Because when rooms are trimmed the walls will be removed and replaced with an inner duplicate of the outer wall, I can keep the window slicer function and replicate it inside. So you can have a triangular room flush with the outer wall with an appropriate window and awning etc. Currently I am absolutely assuming "modern" looking buildings, but realistically there should be no reason I can't adapt the code to allow changes to provide older (or futuristic, I guess) styles as well. But that's a later problem.
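The "assume rectangular and trim" step described above can be sketched in a few lines. This is an illustrative Python toy, not the actual project (which is in Unreal Engine); all names here are made up:

```python
# Hypothetical sketch of "assume rectangular and trim": clip each
# axis-aligned room rectangle against the building footprint and drop
# rooms that end up outside the frame entirely.

from dataclasses import dataclass

@dataclass
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def intersect(self, other: "Rect"):
        # Clip this room rectangle against the footprint rectangle.
        x0, y0 = max(self.x0, other.x0), max(self.y0, other.y0)
        x1, y1 = min(self.x1, other.x1), min(self.y1, other.y1)
        if x0 >= x1 or y0 >= y1:
            return None  # room lies entirely outside the frame
        return Rect(x0, y0, x1, y1)

def trim_rooms(footprint: Rect, rooms: list) -> list:
    """Trim each rectangular room to the building footprint; drop empties."""
    trimmed = [room.intersect(footprint) for room in rooms]
    return [r for r in trimmed if r is not None]
```

A trimmed room whose edge now coincides with the footprint is exactly the "inner duplicate of the outer wall" case, which is what lets the window slicer be reused on the inner copy.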
|
# ? Jan 10, 2022 09:12 |
|
Did you know GitHub Wikis aren't indexed in search engines? And weren't at all for at least 9 years, until a select few started getting indexed? It's a travesty. For the past year, I've made, maintained, and enhanced https://github-wiki-see.page to try to get un-indexed GitHub Wikis indexed. It's gotten GitHub to get off their asses and index some Wikis. There are still quite a few more that aren't indexed though, like all repos with < 800 stars, or wikis that are publicly editable. If you want more details about the situation, just visit the front page of the site. The screenshot attached? That's the real product: GitHub Wiki content showing up in a Google search result. The page itself is quite ugly, but it's easy to get to the original GitHub content from there. I kind of wanted to keep it Web 1.0 to make sure there's absolutely no confusing it with GitHub or any "professional" site. Also keeps the costs down. crazysim fucked around with this message at 18:33 on Mar 9, 2022 |
# ? Mar 9, 2022 18:28 |
|
Came back to my alphagram game I made for studying Scrabble and did a few improvements: You can now make guesses for all word lengths from 2 to 8 (the eight-letter words being solutions for playing through a tile already on the board). There's a button for going back to a previously viewed alphagram, which I added at my wife's request because she kept accidentally skipping past one that took her interest. The definition of the word appears after a successful guess (or if you give up and check the answers). There's also a new game mode, the nine-letter word challenge. The general idea is that there's a two-letter word on the board and you've got to figure out the possible words that can be made from your rack: There are also some study tools for nine-letter words, for learning how many nine-letter words can be made through two-letter words on the board. They can be sorted alphabetically, by the number of solutions, or by the likelihood of the two-letter word being on the board. There are some other study tools as well. I was curious to see what the most likely nine-letter words would be through two-letter words, so I ran a program to score all the nine-letter words on the likelihood of the two-letter word and the likelihood of having the remaining tiles on your rack. For those interested, apparently the most likely is TRIALOGUE (a conversation involving three people), playable through AL, LO, and GU, and the least likely that is still possible is WAILFULLY (in a sorrowful manner), which is only playable through AI. Don't really know how accurate it is as I don't have anything to compare it to. For your delectation, the 49 nine-letter words that it is not possible to play through a two-letter word on the board: My wife's addicted to playing this now so I'm going to call that a success.
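The core alphagram mechanic boils down to grouping a word list by its sorted letters, and "playing through" a two-letter word is a substring check. A minimal sketch (hypothetical names, not the actual game's code):

```python
# Toy version of the alphagram lookup described above: group a word list
# by sorted letters, validate guesses against the group, and find which
# two-letter words a longer word can be played "through".

from collections import defaultdict

def alphagram(word: str) -> str:
    """The alphagram is just the word's letters in sorted order."""
    return "".join(sorted(word.upper()))

def build_index(words):
    index = defaultdict(set)
    for w in words:
        index[alphagram(w)].add(w.upper())
    return index

def check_guess(index, prompt: str, guess: str) -> bool:
    # A guess is correct if it is a valid word with the prompted alphagram.
    return guess.upper() in index.get(alphagram(prompt), set())

def playable_through(word: str, two_letter_words) -> set:
    """Which two-letter words appear as a contiguous chunk of `word`?"""
    w = word.upper()
    return {t for t in two_letter_words if t in w}
```

For example, `playable_through("TRIALOGUE", {"AL", "LO", "GU", "AI"})` finds AL, LO and GU but not AI, matching the post's example.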
|
# ? Mar 13, 2022 22:12 |
|
as an attempt to never have to do a live coding interview ever again, I wanted to do a personal project that will allow me to say "no i will not do your algorithm interview but i have something that can prove i can do the job. hire me if you want or not" So i'm about halfway done with it. I made an entire forums software backend using SA as the reference, with some additional features like notifications/mentions, etc. Currently building out the front-end in https://remix.run/ teen phone cutie fucked around with this message at 19:40 on Jan 9, 2023 |
# ? Jul 13, 2022 22:26 |
|
teen phone cutie posted:as an attempt to never have to do a live coding interview ever again, I wanted to do a personal project that will allow me to say "no i will not do your algorithm interview but i have something that can prove i can do the job. hire me if you want or not" I don't think this will do the thing you want.
|
# ? Jul 13, 2022 23:03 |
|
Yeah my company uses karat and I hate it and wish I could skip it for some applicants, but I can’t cause we’d get sued probably. Their live coding is a waste of time for everyone pretty much. I hear it’s better for QA/infra/IT though.
|
# ? Jul 13, 2022 23:22 |
leper khan posted:I don't think this will do the thing you want. Personal projects can go a long way in the job hunt. I know it's different at every organization, but where I work they are certainly valued. Hell, I showed off https://awfulyearbook.com/ during my interview. It probably would have been good if there wasn't a user-purchased banner ad during my demo that just had gently caress YOU in big flashing letters.
|
|
# ? Jul 13, 2022 23:24 |
|
Yeah I agree but it sucks that I can’t use that sort of stuff in place of karat.
|
# ? Jul 13, 2022 23:29 |
|
Also I’ve been looking for some front end thing that I can wrap my head around so I can actually show off some of my stuff in a nice way, thanks for introducing me to remix
|
# ? Jul 13, 2022 23:35 |
|
leper khan posted:I don't think this will do the thing you want. well the point is that i don't want jobs that don't accept a full-stack personal project as a proof of my skill, so it actually works out
|
# ? Jul 14, 2022 02:37 |
|
dougdrums posted:Also I’ve been looking for some front end thing that I can wrap my head around so I can actually show of some of my stuff in a nice way, thanks for introducing me to remix I love remix. Sveltekit is also pretty good, but comes with less magic.
|
# ? Jul 14, 2022 14:48 |
|
Hello all. I've been continuing to work on little hardware-ish hobby projects here and there, but I don't have anything new to share on that front at the moment. What I do have is this: I'll be a speaker at RetroGameCon in Syracuse, NY on Sunday, 09 October 2022. My panel is called "Retro Reverse-Engineering 101", and I'll be talking about the software and hardware tools, books, and other stuff that you'll need to start exploring old console hardware and software. The event organizers told me to really simplify my material for a general audience, so I can't really dig into the nuts and bolts of the sort of projects that I do. But I can give people a "shopping list" of sorts: equipment to buy, software to download, etc. Basically just me nudging people in the right direction and giving them advice on how to avoid getting frustrated and to continue to make progress. I can also answer any nitty-gritty low-level questions that the audience might have, in case there's some engineering college student or something that has some oddball thing that they want more info on. So, if you are in the neighborhood and want to BS with me about anything that you're working on, or any of my projects, stop on by! Or maybe you just want to watch me talk for an hour about hardware/software hacking. That's cool, too. And with that, I'll leave you with a few things that I've worked on in the past that will probably come up during my talk: https://twitter.com/DrHendersa/status/1521306842030391297 https://twitter.com/DrHendersa/status/1366166743513759744 https://twitter.com/DrHendersa/status/1348463303060844544 https://twitter.com/DrHendersa/status/1304272213160460290
|
# ? Sep 19, 2022 03:25 |
|
Oooh nice. Is there any chance that your talk will be recorded?
|
# ? Sep 19, 2022 22:21 |
|
Cory Parsnipson posted:Oooh nice. Is there any chance that your talk will be recorded? RGC doesn't provide anyone to record the panel sessions. There isn't any rule against recording them, as long as you get the consent of everyone attending the panel. I'd have to supply my own camera and find someone to do the actual recording for me.
|
# ? Sep 20, 2022 12:02 |
|
hendersa posted:RGC doesn't provide anyone to record the panel sessions. There isn't any rule against recording them, as long as you get the consent of everyone attending the panel. I'd have to supply my own camera and find someone to do the actual recording for me. Oh I see. That sounds like a huge ordeal. Don't worry about it! I'm not in the area but that sounds like a very cool talk. Good luck!
|
# ? Sep 20, 2022 21:52 |
|
I posted ages ago about the city builder I was working on. I ended up going back and realising that a lot of what was being done (from the original author's code) was more about getting it done than efficiency. So I have started ripping out a tonne of stuff and rebuilding it to be more versatile. To start with, I am ignoring the plots and buildings, and working on the fundamental layout of the city itself. Things I have addressed: - Inability to modify the city in any reasonable way after creation - Requirement to have a completely flat city I basically took the generated data and attached custom nodes to beginning and end points, and removed any duplicates. The nodes themselves are movable, but more importantly I created a DataNode that pairs with a parent DataNode to send information. This is what gets generated with several hundred roads (my poor laptop taps out around 500+). You can see the node actors, and they are real-time movable: The DataNode essentially checks on construction (which in editor is when it is moved) that the transform data given to it by the generator is the same as what it is now. If it isn't, then it dispatches a call to the generator, which updates its data. It's handy, because I'm only updating nodes that are actually, like, updating. If you move a node vertically it lifts any streets it connects with, so you can now get vertical movement. It probably doesn't seem like much, but being able to modify the layout easily in real time helps me a lot, and the plan is to get the plot and building generators to modify the affected actors as well.
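The endpoint-deduplication step described above (attach shared nodes to road start/end points, remove duplicates) can be sketched like this. A toy Python version with made-up names, not the actual Unreal actor implementation:

```python
# Hypothetical sketch: merge coinciding road endpoints into shared nodes,
# so moving one node moves every street that connects to it. Endpoints are
# quantised by a tolerance so near-duplicates collapse into one node.

def build_nodes(roads, tol=1e-3):
    """roads: list of ((x, y), (x, y)) segments.
    Returns (nodes, edges), where each edge is a pair of node indices."""
    nodes = []   # unique endpoint positions
    index = {}   # quantised position -> node index

    def node_id(p):
        key = (round(p[0] / tol), round(p[1] / tol))  # merge near-duplicates
        if key not in index:
            index[key] = len(nodes)
            nodes.append(p)
        return index[key]

    edges = [(node_id(a), node_id(b)) for a, b in roads]
    return nodes, edges
```

With this representation, "lifting" a node vertically only needs to update the one shared position, and every connected edge follows, which is the behaviour the post describes.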
|
# ? Oct 17, 2022 10:27 |
|
i'd always wondered what the slot in shigesato itoi's bass tsuri was for
|
# ? Oct 21, 2022 16:43 |
|
syntaxfunction posted:I posted ages ago about the city builder I was working on. I ended going back and realising that a lot of what was being done (from the original author's code) was more for getting it done, rather than efficiency. So I have started ripping out a tonne of stuff and rebuilding it to be more versatile. Woo, thanks for the update! I'm still watching along while not spending any time on my own city generator. Did you end up sticking with UE4 for this one?
|
# ? Oct 24, 2022 20:29 |
|
octan3 posted:Woo, thanks for the update! I'm still watching along while not spending any time on my own city generator. I have actually migrated to UE5 for a very practical reason, which is Nanite. Basically the original project, developed for the master's thesis a cool dude made, needed a lot of work. One of the big issues for me is the use of procedural mesh components to produce the buildings. While they worked, they became limiting and were not a great way to produce a lot of detailed content. I moved for Nanite because, while I have retained a lot of the base algorithms from the project, I have essentially started redoing the implementation. The core of it produces streets and plots, as well as polygons for building bases within those plots, very quickly and realistically, so that has been kept. For the actual building generation I'm going in a different direction, because the end goal is to produce meshes generated statically for each building that can work with Nanite. The goal is that once the building is produced with as much detail as possible (every room enterable, with objects within the rooms), Nanite can help keep poly counts in control, hopefully allowing for even higher levels of content for the same cost. There's a lot of other little things I'm changing about the implementation but I plan to talk about them when they're actually getting done haha.
|
# ? Oct 26, 2022 03:28 |
|
I just picked up a project of mine I had put down for about a month. I'm working out some model structures, so it's a lot of staring into space today. After a while, and coming up with a few solutions, I opened my notes.txt file to check on some things. Lo and behold, I had already mapped out an answer to this problem. Only I don't remember doing it at all, and the solution is so much better than what I just came up with. Usually I can at least recall working on this in the past, but man, I just 100% don't remember coming up with the answer, and it's so much smarter than I'm capable of. Whoa.
|
# ? Oct 30, 2022 00:58 |
|
Early attempts at divining carbon content of soil from satellite remote sensing using ~mAtHS~ BASICALLY We are trying to get farmers to give a gently caress about climate change by creating processes for them to tap into that sweet sweet carbon credit eurobucks by sinking CO2 into the ground using various agricultural processes, whilst minimizing the cost of instrumenting giant farms by taking photos of poo poo from space. Next step, get out there with carbon probes and figure out if our machine learning models actually work. gently caress I love doing science. duck monster fucked around with this message at 12:52 on Dec 1, 2022 |
# ? Dec 1, 2022 12:48 |
|
I hate WebRTC, here are 16 devices streaming to a single desktop 🤷♀️ The small one is 4K, and the pillar and letterbox ones are 1920x1200, all others are 1080p. MrMoo fucked around with this message at 17:58 on Dec 23, 2022 |
# ? Dec 23, 2022 16:14 |
|
I really like the retro look of terminal graphics and I wanted to see whether blitting terminal codes was fast enough for real-time applications. So I wrote an interactive Mandelbrot fractal explorer: The main trick is using a box drawing character and separate foreground/background colours to get square pixels in the terminal. It's probably a well known trick, but I was quite pleased when I figured it out. Performance is surprisingly good even when you increase the "resolution" by decreasing the font size, although obviously not nearly as good as with proper pixel blitting. Still, I'll probably continue using this technique to run visualisations on machines I can only access via text mode SSH. Code is here: https://git.sr.ht/~athas/tui-mandelbrot
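The square-pixel trick described above is easy to demonstrate: the upper-half-block character '▀' with separate 24-bit foreground/background colours packs two vertical "pixels" into one terminal cell. A minimal Python sketch of the technique (not the code from the linked repo):

```python
# The half-block trick: foreground colour paints the top half of the cell,
# background colour paints the bottom half, so each character row shows
# two rows of square-ish pixels.

def cell(top_rgb, bottom_rgb):
    """Render two stacked pixels as one character using 24-bit ANSI SGR codes."""
    tr, tg, tb = top_rgb
    br, bg, bb = bottom_rgb
    return (f"\x1b[38;2;{tr};{tg};{tb}m"   # foreground = top pixel
            f"\x1b[48;2;{br};{bg};{bb}m"   # background = bottom pixel
            "\u2580\x1b[0m")               # '▀' then reset attributes

def blit(rows):
    """rows: 2D list of (r, g, b) tuples with an even number of rows."""
    lines = []
    for y in range(0, len(rows), 2):
        lines.append("".join(cell(t, b) for t, b in zip(rows[y], rows[y + 1])))
    return "\n".join(lines)

if __name__ == "__main__":
    # A 2x2 red/blue checker, printed as a single terminal line.
    red, blue = (255, 0, 0), (0, 0, 255)
    print(blit([[red, blue], [blue, red]]))
```

The 24-bit `38;2;r;g;b` / `48;2;r;g;b` sequences aren't supported by every terminal emulator, but far more of them than support sixel, which is presumably why the post prefers this approach.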
|
# ? Feb 9, 2023 10:26 |
|
Athas posted:I really like the retro look of terminal graphics and I wanted to see whether blitting terminal codes was fast enough for real-time applications. So I wrote an interactive Mandelbrot fractal explorer: Just use sixel
|
# ? Feb 9, 2023 15:09 |
|
leper khan posted:Just use sixel That doesn't work in many terminal emulators. If I wanted to use something only pseudo-standard and not widely supported, I'd just use Kitty's graphics protocol, as it's much better than sixel.
|
# ? Feb 9, 2023 18:53 |
|
Cory Parsnipson posted:so basically I'm just using this as an excuse to over engineer everything Sounds like how I do everything when I’m in project mode lol
|
# ? Feb 13, 2023 14:48 |
|
Athas posted:That doesn't work in many terminal emulators. If I wanted to use something only pseudo-standard and not widely supported, I'd just use Kitty's graphics protocol, as it's much better than sixel. Pseudo-standard! I wish more stuff supported vt300 instead of vt100, Sixel and ReGIS are great for TUI.
|
# ? Feb 13, 2023 19:07 |
|
I went crazy with multiple web technologies, an attempt at displaying synchronized content (images, video, 3d), across the interwebs. Kind of impressive how many web things can fail and fail well. WebRTC x Raft Consensus x WebSocket, and here showing Chrome x Edge x Firefox at 144fps with content sync at 10fps and renderer interpolation to full frame rate. https://www.youtube.com/watch?v=vvca0WYOCas Running Chrome x Chrome x Chrome is a lot better, 60fps normal iPhone cap: https://www.youtube.com/watch?v=shhIL3AtRL4 Then a 240fps iPhone cap: https://www.youtube.com/watch?v=YeXInmer-LA Which looks pretty much frame accurate on a 144Hz monitor MrMoo fucked around with this message at 02:55 on Feb 14, 2023 |
# ? Feb 13, 2023 23:49 |
|
That's pretty cool. I did some sports broadcast systems work in the past and the way we'd get frame accuracy on a wire is to have a separate coax input for "genlock," which kept all the slow-mo, switchers, outputs on the same frame lock. I'm puzzled at how near frame accurate sync could happen over the web, but it looks clean! Is there some master clock pulse being sent? Even so, latency variance must throw this off by many frames randomly? It looks like a really neat thing to work on. e: I looked up the Raft Consensus algo you mentioned and printed out the paper for some light reading in the am. former glory fucked around with this message at 05:03 on Feb 14, 2023 |
# ? Feb 14, 2023 05:01 |
|
Synchronisation always becomes a lot easier the more latency you're willing to accept. What you've achieved is impressive! Is the content live? How much are you buffering?
|
# ? Feb 14, 2023 11:52 |
|
I get reminded about genlock a lot, but this is the web and such integration is not available. Pretty much any broadcast quality video uses SDI, and that's just not available on most hardware either. Live video is another problem; with insufficient research or funding over at least 2 decades, I don't have answers for that. Currently "cheating" by using MPV with a low-latency configuration instead of the browser. The problem is generally that the clients have to follow the clock of the encoder, and skipping video to resync means audio jumping, which humans are very sensitive to. Thus one usually has a choice between no sync, some latency + audio, or no audio, low-latency video. The majority of video players follow the clock signal in the video feed, but that assumes that the video player's real-time clock is actually synchronised with the encoder clock. There will be drift, it will accrue, and it will be incredibly obvious. The design decision in this implementation is reducing the processing required in the renderer. One conventionally thinks of clocks, as above, and this is correct for a video stream; however, for a multi-media stream one has an abstraction, and the clock is actually feeding a scheduler instead of the renderer. Thus if the encoder operates at the scheduler level and forwards the commands it would send to the renderer, one yields less processing at the client, but a larger transit payload. Consider how FPS network games operate: Doom 3, APEX, Valorant, etc. The game state is independent of the renderer and runs at a separate frequency. The viewable game state is forwarded to each client on each step. The client then extends that state with player-side prediction to scale up to the local display clock rate. Thus the state engine is running at 10Hz, _broadcasting_ at low latency to each client with viewable state, then the client interpolates to the local display frequency, here 144Hz.
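The scheduler-vs-renderer split described above (snapshots at 10 Hz, rendering at display rate) comes down to interpolating between the two most recent state snapshots. A toy sketch of the client side, with hypothetical names, not the actual system:

```python
# Hypothetical client-side interpolation: the server broadcasts state
# snapshots at a low rate (e.g. 10 Hz); the renderer blends between the
# two most recent snapshots at its own display rate (e.g. 144 Hz).

def interpolate(snap_a, snap_b, t_render):
    """snap_*: (timestamp, value) pairs. Linear interpolation at t_render."""
    t0, v0 = snap_a
    t1, v1 = snap_b
    if t1 == t0:
        return v1
    alpha = (t_render - t0) / (t1 - t0)
    alpha = max(0.0, min(1.0, alpha))  # clamp: interpolate, never extrapolate
    return v0 + (v1 - v0) * alpha
```

Clamping alpha is the conservative choice here; a game-style client would instead extrapolate past the last snapshot (player-side prediction), trading occasional mispredictions for lower perceived latency.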
|
# ? Feb 14, 2023 12:34 |
I got bored and people were talking about KSP2's broken... everything, so I decided to write a little N-body physics sim. It's kind of a boring shot, but without motion they all kinda are. The red color is just the ones that are over an arbitrary limit I wanna use for star formation later.
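The core of a little N-body sim like this is one gravity step. A generic sketch (naive O(n²) pairwise forces with a softening term, not the poster's code):

```python
# Minimal 2D N-body step: semi-implicit Euler with softened Newtonian
# gravity. Softening avoids the force blowing up when bodies get close.

def step(pos, vel, mass, dt, g=1.0, soft=1e-3):
    """pos, vel: lists of [x, y]; mass: list of floats. Returns new state."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r2 = dx * dx + dy * dy + soft * soft
            inv_r3 = r2 ** -1.5
            acc[i][0] += g * mass[j] * dx * inv_r3
            acc[i][1] += g * mass[j] * dy * inv_r3
    # Semi-implicit (symplectic) Euler: update velocity first, then position.
    new_vel = [[vel[i][k] + acc[i][k] * dt for k in (0, 1)] for i in range(n)]
    new_pos = [[pos[i][k] + new_vel[i][k] * dt for k in (0, 1)] for i in range(n)]
    return new_pos, new_vel
```

For a few hundred bodies the O(n²) loop is fine; past that, a Barnes-Hut tree or grid binning is the usual next step.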
|
|
# ? Feb 27, 2023 03:06 |
|
Recently restored "Gargantuan Takeout Rocket", which was broken for 4 months. It allows "Liftoff from Google Takeout into Azure Storage, repeatedly, very fast, like 1GB/s+ or 10 minutes total per takeout fast". I'm one of those people with a TB to transfer when backing up, so getting it out quickly every two months is very useful. https://github.com/nelsonjchen/gargantuan-takeout-rocket It's a pretty fugly design, but in case the worst happens to my Google account or YouTube, at least I'll have a copy of everything.
|
# ? Feb 27, 2023 22:37 |