|
So you've got some big blob of json that represents a graph? Is it a tree structure? What are you trying to get from this data structure?
|
# ? Jul 13, 2018 13:55 |
|
Acidian posted:I have an actual proper PHP question this time. In general you can solve problems of this type using recursion. I'll give a javascript solution. code:
|
# ? Jul 13, 2018 15:18 |
|
I solved the problem by using the array_walk_recursive function, and retrieving the data I needed from each branch and endpoint, without needing to know how many arrays there are or how they are nested. If I want to check that the array structure is correctly formed, then I will have to solve this problem, but right now I have a solution that will work for now. Bruegels Fuckbooks posted:In general you can solve problems of this type using recursion. I'll give a javascript solution. I don't know any Javascript currently, but I think I understand your code and what you mean. I can easily count the number of arrays by using array_walk_recursive() and counting the number of 'id' fields. I am thinking that I could do a function using $array[$x][$y][$z]. Then it would iterate through $z until it is done, do a $y++ until all the $y's were done, then $x++ until all the array elements were done. I would have to assume there can be more than 3 dimensions, but if I code up to 5 dimensions then that should be pretty future-proof, I think. Right now it's a lot of work for very little gain, so I think I will have to come back to this issue at a later date when I want to improve my code. rt4 posted:So you've got some big blob of json that represents a graph? Is it a tree structure? What are you trying to get from this data structure? It's a tree structure, and it's catalogue data. Each branch has data values, and each endpoint has data values; for example, they include an id field and a label field. code:
|
# ? Jul 14, 2018 10:47 |
|
It's probably a sorted tree, so I'd expect you to want an in-order traversal. But you really need to wrap your head around the recursive approach. It's basically "Call the function that was called on me on my children first, then return my result with theirs."
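The recursive recipe above might look like this in PHP — a sketch assuming each node is an array with 'id', 'label', and an optional 'children' key, which is my guess at the catalogue shape, not the actual JSON:

```php
<?php
// Hypothetical sketch: recursively walk a nested array "tree" where each
// node has an 'id', a 'label', and an optional 'children' array.
function walkTree(array $node, int $depth = 0): array
{
    $rows = [[$node['id'], $node['label'], $depth]];
    foreach ($node['children'] ?? [] as $child) {
        // Recurse into each child, then append its rows after this node's row.
        $rows = array_merge($rows, walkTree($child, $depth + 1));
    }
    return $rows;
}

$tree = [
    'id' => 1, 'label' => 'root',
    'children' => [
        ['id' => 2, 'label' => 'leaf-a'],
        ['id' => 3, 'label' => 'branch', 'children' => [
            ['id' => 4, 'label' => 'leaf-b'],
        ]],
    ],
];

print_r(array_column(walkTree($tree), 0)); // ids in pre-order: 1, 2, 3, 4
```

Strictly speaking this visits each node before its children (pre-order); for a true in-order traversal of a sorted binary tree you'd emit the node between its left and right subtrees.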
|
# ? Jul 18, 2018 06:48 |
|
php:<?php function fart($array) { $c = 0; foreach ($array as $node) { if (is_array($node)) { $c += fart($node); } else { $c++; } } return $c; } Zamujasa fucked around with this message at 02:25 on Jul 20, 2018 |
# ? Jul 20, 2018 02:15 |
|
Just wanna say gently caress what PHP does to $_FILES when there are multiple uploads. I know it's been like that forever, and that's why it can't be fixed, but good god what a stupid, backwards, broken way to handle it.
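For reference, the quirk is that with `<input type="file" name="docs[]" multiple>`, PHP groups $_FILES['docs'] by attribute ($_FILES['docs']['name'][0], $_FILES['docs']['tmp_name'][0], ...) instead of giving you one array per file. The usual workaround is to pivot it back — a sketch, with the field name 'docs' as a made-up example:

```php
<?php
// Re-pivot PHP's multi-upload $_FILES layout
// (['name' => [...], 'tmp_name' => [...], ...]) into one array per file.
function normalizeFiles(array $field): array
{
    $files = [];
    foreach ($field['name'] as $i => $name) {
        $files[] = [
            'name'     => $name,
            'type'     => $field['type'][$i],
            'tmp_name' => $field['tmp_name'][$i],
            'error'    => $field['error'][$i],
            'size'     => $field['size'][$i],
        ];
    }
    return $files;
}

// e.g. $uploads = normalizeFiles($_FILES['docs'] ?? ['name' => []]);
```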
|
# ? Aug 16, 2018 23:49 |
|
I have a WordPress template PHP file that interprets JSON and used to spit out a nice tap list. With the latest PHP it looks like bones_main_nav() has been deprecated and I get the following error: code:
code:
|
# ? Aug 20, 2018 17:58 |
|
bones_main_nav isn't a PHP function, but a function from a WordPress theme or plugin. Going by the bones_ prefix, I'm assuming it's this WP boilerplate theme: https://themble.com/bones/ I'm guessing your problem is that your theme is not set to be a child theme of Bones, and therefore now can't find the function it expects. It looks like that theme no longer exists, but I did find a reference to the function on GitHub: https://github.com/DD9/boiler2/blob/a4b6cf0d5b02beb8d128f7bed159dca0bf7deee5/library/bones.php#L256
|
# ? Aug 20, 2018 18:09 |
|
Heskie posted:bones_main_nav isn't a PHP function, but a function from a WordPress theme or plugin. Is it possible to take bones.php and stick it in my theme to add those functions? It might be more of a WordPress-specific question...
|
# ? Aug 20, 2018 18:50 |
|
raej posted:Is it possible to take bones.php and stick it in my theme to add those functions? It might be more of a WordPress-specific question... Probably. So long as nothing in there is relying on something else that's missing. e: I don't endorse this fix though.
|
# ? Aug 20, 2018 19:12 |
|
I have gotten into a situation now where I plan to set up multiple cron jobs, but some of the scripts are potentially very time consuming, and my worry is that they might overlap. That is, a new script might start running before the last one has completed. There are also certain scripts I want to be sure are run in the proper sequence, to check that certain conditions are in place, or to put those conditions into place if they are not. One script I was running took 24 hours to complete, but I have some ideas on how to improve this, and it won't take 24 hours every time it's run, only the first time (populating tables over a REST API, which requires a response from the client for every line). In the case of the 24-hour queue script, which on most days might take 0-5 minutes, if the cron job is set up to run every 15 minutes and the update suddenly takes an hour, then I can't have the same script running 4 times. I have an idea, which I am pretty sure I got from this thread but I can't find where someone mentioned it, which is to set up a queue table in MySQL. I am thinking a cron job PHP script will run and start the PHP scripts that are listed in the SQL table one by one, sequentially, and the scripts themselves will mark themselves as ongoing when they start. This way another cron job script can be run to add jobs to the queue table, or check if all the necessary scripts are already in the table/queue, running or waiting to be run; if a script is already in the table, it will not be added again. I am also thinking that I will find a loop in each script (all the scripts have a loop somewhere in the code) which can send an update to the SQL table every time it loops, to show that it's actually running. If my server crashes, maybe because of a power outage, the job table would still be there with the queue in place, but no scripts would actually be running. 
So a cron job script would have to flush the "running" scripts from the queue, or maybe just empty the queue outright, and the way to figure out if a script is running or not is whether it is continuously sending some kind of keep-alive information to the table. The downside of this is that the scripts would potentially take longer to run if they continuously send database updates. It might be smart to set up a loop counter and only send updates to the database queue table every 10, 100 or 1000 iterations, just so the queue-flusher script knows that the script has sent a keep-alive in the last 10-30 minutes or so. Do you goons have any input on this? Is this a good idea, or are there better ways of doing this? I am switching over to production in a little over a week, so I don't have a whole lot of time if I need to learn something new.
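The every-Nth-iteration keep-alive idea can be factored into a small wrapper so each script doesn't carry its own counter logic — a sketch, where the callables and the job-queue wiring in the comment are assumptions, not an existing schema:

```php
<?php
// Run $work on each item and fire $beat (a keep-alive) every $every
// iterations. Returns the number of items processed.
function withHeartbeat(iterable $items, callable $work, callable $beat, int $every = 100): int
{
    $i = 0;
    foreach ($items as $item) {
        $work($item);
        if (++$i % $every === 0) {
            // e.g. UPDATE job_queue SET last_seen = NOW() WHERE id = :job
            $beat($i);
        }
    }
    return $i;
}
```

The queue-flusher cron can then treat any "running" row whose last_seen is older than, say, 30 minutes as dead and clear it.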
|
# ? Aug 21, 2018 23:24 |
|
Most of what you have said seems pretty reasonable. If you want a fairly simple way to keep script(s) from running at the same time, you can use a locking mechanism. For example https://symfony.com/doc/current/components/lock.html If you need finer-grained control over when scripts are run, you may want to look into supervisord instead of cron.
|
# ? Aug 21, 2018 23:35 |
|
What would cause a conversion error when using POST data in a SQL Server BETWEEN query? The format is yyyy-mm-dd. I am using SQL Server, and the jQuery date picker for the form. The query works if I use $_GET for DataTables, but when I use $_POST it throws an error. This is driving me crazy.
|
# ? Aug 22, 2018 01:32 |
|
joebuddah posted:What would cause a conversion error when using POST data in a SQL Server BETWEEN query? What specific error are you getting, for which specific data?
|
# ? Aug 22, 2018 02:39 |
|
bigmandan posted:Most of what you have said seems pretty reasonable. If you want a fairly simple way to keep script(s) from running at the same time, you can use a locking mechanism. For example I want to learn to use the Symfony framework, since the application I am using is written in Symfony and I want to do some back-end modifications later down the line. If what I say sounds reasonable, then I will just stick with that for now, and learn a better way when I start learning Symfony. I have "supervisor" installed on the server for the application, not sure if that does the same as "supervisord", but right now I don't think I need "fine-grained" control so I think it's ok. Thank you.
|
# ? Aug 22, 2018 11:40 |
|
Acidian posted:I want to learn to use the Symfony framework, since the application I am using is written in symfony and I want to do some back end modifications later down the line. If what I say sounds reasonable, then I will just stick with that for now, and learn a better way when I start learning Symfony. You can use the lock library without using the rest of the framework.
|
# ? Aug 22, 2018 14:28 |
|
You could also add flock to the front of your cron command to let the OS handle locking for you
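For the record, that looks like `flock -n /tmp/myjob.lock php myjob.php` in the crontab; -n makes the second run fail immediately instead of queueing. The same idea is available from inside PHP via flock() on a lock file — a sketch, with the lock path as an arbitrary choice:

```php
<?php
// Acquire an exclusive, non-blocking lock on a file; if another instance
// of the script already holds it, exit instead of running twice.
$lock = fopen('/tmp/myjob.lock', 'c'); // 'c' creates the file if it's missing
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // a previous run is still going; bail out quietly
}

// ... the long-running work goes here ...

flock($lock, LOCK_UN);
fclose($lock);
```

The lock is released automatically if the process dies, which is the main advantage over a "running" flag in a table.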
|
# ? Aug 22, 2018 15:26 |
|
joebuddah posted:What would cause a conversion error when using POST data in a SQL Server BETWEEN query? Not sure if it's true in your case, but I've found date pickers can be influenced by locale settings, so you need to be strict about parsing that info before the database acts on it. What I'd suggest is: - Capturing the date string in both cases and seeing how they differ (my guess is that if there is a difference, it's in the time value) - Forcing a CONVERT(x, datetime)/GetDate before trying to use it in a query This discussion goes over some of the tricks and further down the page lists some of the more obscure ints for GetDate: https://stackoverflow.com/questions/889629/how-to-get-a-date-in-yyyy-mm-dd-format-from-a-tsql-datetime-field EDIT: Whoops, thought this was the SQL thread; you were probably looking for a code solution
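On the strict-parsing point, here's a PHP-side sketch that rejects anything that isn't literally yyyy-mm-dd before it reaches the BETWEEN query. DateTime::createFromFormat alone is lenient (2018-02-31 silently overflows to March), so the round-trip check matters:

```php
<?php
// Return the validated yyyy-mm-dd string, or null if the input is not a
// real date in exactly that format.
function parseIsoDate(string $input): ?string
{
    $dt = DateTime::createFromFormat('!Y-m-d', $input); // '!' zeroes the time fields
    // createFromFormat is lenient, so round-trip the value to catch
    // overflowed days and other-format dates.
    if ($dt === false || $dt->format('Y-m-d') !== $input) {
        return null;
    }
    return $dt->format('Y-m-d');
}
```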
|
# ? Aug 22, 2018 17:02 |
|
This is my error message: code:
Thanks for the help. Edit: I still don't understand why it throws an error on one server but works on the other. Both of which run SQL Server. joebuddah fucked around with this message at 23:17 on Aug 22, 2018 |
# ? Aug 22, 2018 20:23 |
|
rt4 posted:You could also add flock to the front of your cron command to let the OS handle locking for you That seems really simple, will check that out, thanks!
|
# ? Aug 25, 2018 11:58 |
|
I am running into a weird memory issue with a script. It's a short looping script that uploads an image, so it continuously loads in an image file, converts it to base64 and saves it to a variable, uploads the image to a server and unsets the image file. Then the process repeats. Since this is in a while statement, and the variables are the same ones being used over and over, shouldn't the memory allocation (size and address) stay more or less the same, with some fluctuations of 1-2 MB depending on the image size? code:
|
# ? Sep 5, 2018 15:59 |
|
Acidian posted:I am running into a weird memory issue with a script. It's a short looping script that uploads an image, so it continuously loads in an image file, converts it to base64 and saves it to a variable, uploads the image to a server and unsets the image file. Then the process repeats. Since this is in a while statement, and the variables are the same ones being used over and over, shouldn't the memory allocation (size and address) stay more or less the same, with some fluctuations of 1-2 MB depending on the image size? I don't see anything erroneous that would cause memory issues. Without knowing what $ak_client, $mag_client, etc. are doing under the hood it's difficult to tell. You may want to do some profiling with xdebug and cachegrind. Depending on the version, I think profiling was removed then added back into xdebug at some point though.
|
# ? Sep 5, 2018 16:18 |
|
bigmandan posted:I don't see anything erroneous that would cause memory issues. Without knowing what $ak_client, $mag_client, etc. are doing under the hood it's difficult to tell. Ok, thanks, I will try running xdebug on it later tonight. I have also increased the script's memory limit from 128 MB to 2 GB, so we'll see how far that gets me. ak_client and mag_client are Guzzle clients, just sending a request and receiving data. Again, it's the same variables being called over and over; I can't see any situation where any new variables would be "piling up". I also uploaded 43000 products without images using the same class and functions.
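Before reaching for xdebug, a cheap first step is logging memory_get_usage() per iteration to see whether usage really creeps on every loop or jumps on one specific image — a generic sketch, with the work callable as a stand-in for the real upload:

```php
<?php
// Run $work on each item and record the change in real memory usage per
// iteration, so a steady leak vs. a one-off spike is easy to spot.
function measureIterations(iterable $items, callable $work): array
{
    $deltas = [];
    $last = memory_get_usage(true);
    foreach ($items as $item) {
        $work($item);
        gc_collect_cycles();             // collect any reference cycles first
        $now = memory_get_usage(true);
        $deltas[] = $now - $last;        // bytes gained (or freed) this pass
        $last = $now;
    }
    return $deltas;
}
```

If the deltas climb steadily even with gc_collect_cycles(), something (often a client's request/response history) is accumulating references across iterations.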
|
# ? Sep 5, 2018 16:36 |
|
Anyone written a GraphQL server in PHP and if so which implementation did you use? How was the experience? The webonyx graphql-php project seems to be the most popular one?
|
# ? Sep 9, 2018 19:27 |
|
bigmandan posted:I don't see anything erroneous that would cause memory issues. Without knowing what $ak_client, $mag_client, etc. are doing under the hood it's difficult to tell. PHP does not do a great job with memory in a loop like this. Try to minimize your calls to the database. Process the images, then make a single operation to the database with all the successful images. Log any errors from the images.
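A sketch of the single-operation idea — build one multi-row INSERT for all the successes instead of a query per image. The table and column names are placeholders, not from the actual project:

```php
<?php
// Insert many (sku, status) rows in one statement instead of one query
// per image. Assumes a hypothetical image_log(sku, status) table.
function batchInsert(PDO $db, array $rows): void
{
    if ($rows === []) {
        return;
    }
    // One "(?, ?)" group per row, comma separated.
    $place = rtrim(str_repeat('(?, ?),', count($rows)), ',');
    $stmt  = $db->prepare("INSERT INTO image_log (sku, status) VALUES $place");
    $flat  = [];
    foreach ($rows as [$sku, $status]) {
        $flat[] = $sku;
        $flat[] = $status;
    }
    $stmt->execute($flat);
}
```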
|
# ? Sep 9, 2018 23:44 |
|
my bony fealty posted:Anyone written a GraphQL server in PHP and if so which implementation did you use? How was the experience? I've been playing around with one in my limited dev time at work and I'm liking it so far. Using the webonyx bundle on top of Symfony 4.1 (using overblog/graphql-bundle) and they play really nicely together. I'm very new at the whole GraphQL thing though, and I haven't been doing anything particularly complicated. This is strictly a read-only application so I couldn't tell you how well it works with mutations, but I'm not at all sold on GraphQL being responsible for inserts/updates anyway.
|
# ? Sep 10, 2018 10:03 |
|
Trying to enable SSL on my webserver and I am losing my mind. Sorry to pester the PHP thread about this, but there is no specific webserver thread, I think. I am trying to use Let's Encrypt with certbot. It refuses to write the /.well-known/acme-challenge/ folder. I have tried disabling all sites, and just made a new config file that is super basic: Cloudflare is blocking me trying to add the following text, so I will have to add it as a photo. Both fail. I have been at this for 3 hours now trying different configurations. The example.com domain has SSL encryption with another provider, I do not know who, but I am just trying to add some SSL encryption to the subdomain at check.example.com. In all cases, it refuses to write the /.well-known/acme-challenge/ folder, no matter where I point the configuration to or how I chmod or chown the folders.
|
# ? Sep 11, 2018 16:48 |
|
Peggle Fever posted:PHP does not do a great job with memory in a loop like this. My first script did not use a database, but because uploading to the server takes a lot of time, 4-7 seconds per product depending on image size for request and response, I wanted to run one script that adds all the products to a database, which goes super fast. Then I have 5-10 scripts running separately using shell_exec() to upload the data from the database. This cut the upload time from 48 hours to under 12 hours for the whole database. Usually I will not be uploading the whole database, only the changes being made, but sometimes there might be many thousands of changes and I want the changes uploaded before people start work in the morning. Edit: If it's all about calls to the database, then maybe I could add 10 products (need to see product description lengths and the size of the text SQL field to calculate how many products I could safely add) to one row, then make 1 call, process 10 products and upload each, then make a new database call. Acidian fucked around with this message at 16:56 on Sep 11, 2018 |
# ? Sep 11, 2018 16:53 |
|
Run getenforce to see if SELinux is active. If it's running, that's why. You aren't a military contractor or handling medical records, you don't need it. It's an obtuse and obfuscated pile of poo poo and we totally had this exact issue yesterday because Linode's CentOS image now has it on by default.
|
# ? Sep 11, 2018 18:21 |
|
I'm about to work with an external API that requires OAuth2. In the past I've always just opted for using API keys since most third party APIs offer alternatives and I didn't at the time want to dive down the rabbit hole of OAuth2. Anyway, the API I'll be working with is DocuSign and they require OAuth2. My question is, how do you typically store the access tokens? Also, the DocuSign API documentation says that you can't refresh their tokens so you need to regenerate them when they expire but it appears to allow you to set the expiry time. What would be a good duration to expire these access tokens? I'm working with the Laravel framework if it matters.
|
# ? Sep 12, 2018 13:53 |
|
Depends on your level of giving a poo poo. At the lowest tier, you could commit them to your repo (dangerous). Next step up is to store them in an environment variable on the server. Above that, you could have a server running something like Vault that stores all your secrets and your various web applications connect to it every time they need a secret.
|
# ? Sep 12, 2018 15:29 |
|
Can someone give me a rundown on state machines and any reputable libraries? For something simple like a support ticket being opened, moving to a different status level, and closing, etc.
|
# ? Dec 24, 2018 02:12 |
|
It's more of a mental model than something you get from a library. The Wikipedia page is pretty good. Any textbook on data structures ought to cover it, too. In my own words, you'd think about how your program starts and finishes and every step in between. Draw a directed graph of all the different decisions that happen along the way. Basically all business needs can be represented by a state machine rather than any particular algorithm. Speaking of state machines and web dev, you should take a look at Elm as food for thought. It's a language/framework combo for building web frontends that forces you to explicitly enumerate the possible states of your program and how they flow. It feels tedious every time you start building something new, but the feeling of getting a program that comes out bug free on the first shot is amazing.
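As a concrete PHP sketch of that directed graph for the support-ticket example — the states and transitions here are illustrative, not from any library:

```php
<?php
// Minimal state machine: a set of states plus an allowed-transitions map.
// moveTo() refuses any edge that isn't in the graph.
class Ticket
{
    private const TRANSITIONS = [
        'open'        => ['in_progress'],
        'in_progress' => ['resolved', 'open'],
        'resolved'    => ['closed', 'open'],
        'closed'      => [],   // terminal state
    ];

    private $state;

    public function __construct(string $state = 'open')
    {
        $this->state = $state;
    }

    public function moveTo(string $next): void
    {
        if (!in_array($next, self::TRANSITIONS[$this->state], true)) {
            throw new DomainException("Cannot go from {$this->state} to {$next}");
        }
        $this->state = $next;
    }

    public function state(): string
    {
        return $this->state;
    }
}
```

A library buys you events, guards, and persistence on top of this; the core is just the transition table.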
|
# ? Dec 24, 2018 04:10 |
|
It sounds like the thing you're looking for can also be called "workflow," and it looks like there are plenty of library options. I've never used one in the past and instead just did the minimal necessary work manually. It's not too hard. If I was going to use a library, I'd start by looking at the Symfony Workflows library to see how well it might work for my use case. You should also google BPMN and then run away from it.
|
# ? Dec 24, 2018 07:47 |
|
I want to zip a folder full of images. Opening an archive, adding 1 image, closing the archive works. However, it is slow, it makes a new zip file for every image, and sometimes the close() function fails, and the temp file is left in the folder and the image is not added (I assume). Trying to open the archive before the foreach loop does not work. It doesn't even try to make the file, it seems, but the open() statement still returns TRUE. I really don't understand what is going on. This works: code:
This does not work: code:
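For comparison, a sketch of the open-once shape, wrapped in a function. One gotcha worth knowing: ZipArchive doesn't read the source files until close(), so all the writing happens in that one call, and a failure there loses every entry at once:

```php
<?php
// Zip every file in $dir into $zipPath, opening the archive once.
// Returns the number of files added.
function zipImages(string $dir, string $zipPath): int
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        throw new RuntimeException("Could not open $zipPath");
    }
    $files = glob($dir . '/*') ?: [];
    foreach ($files as $file) {
        $zip->addFile($file, basename($file)); // second arg = entry name inside the zip
    }
    if (!$zip->close()) { // the actual writing happens here, not in addFile()
        throw new RuntimeException('close() failed: ' . $zip->getStatusString());
    }
    return count($files);
}
```

Because the source files are only read at close(), deleting or moving them between addFile() and close() is one way to get the temp-file-left-behind failure described above.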
|
# ? Jan 20, 2019 23:41 |
|
Acidian posted:This does not work: This worked for me pretty much as-is (PHP 7.1). I plopped it into some php file, made sure $dir and $image_archive had __DIR__ concatenated, ran it, and I got a zip file with my files. Do you have any memory or file system restrictions? Or maybe the calling code is timing out or closing the file handle? Honestly no clue other than that why it wouldn't work since you've pretty much guaranteed the files exist.
|
# ? Jan 21, 2019 01:10 |
|
Ok, thank you for checking. I will just try and do some further testing and see, and check the php.ini file for any memory restrictions. To me it doesn't even seem to make the file, and when I tried making the file beforehand and just appending the images to the file, that did not work either.
|
# ? Jan 22, 2019 10:27 |
|
I'm a huge dork who is a fan of Star Trek and has a degree in linguistics. So, I built a file that outputs every possible syllable of Klingon. I am also a masochist who likes trying to write files in as few lines/functional blocks as possible. Does anyone know a way I could do this with less? I think I'm at the end of my creativity here. Basically, there's three arrays of characters, and the file puts together every logically possible combination of those characters. Then, it asks, "Is the substring "ow" or "uw" present?" If so, echo the empty string. If not, echo the combination you're currently on. This is for funsies, so please offer your most absurd suggestions. code:
Welp, I at least managed to use only one array: code:
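For anyone reading along without the code blocks, the one-array version of the idea looks roughly like this — the inventories here are tiny stand-ins, not the real Klingon onset/vowel/coda lists:

```php
<?php
// One array of three character inventories: onsets, vowels, codas.
// Build every combination, dropping the illegal *ow / *uw sequences.
$sets = [['b', 'tlh', ''], ['a', 'o', 'u'], ['w', 'ng', '']];

$syllables = [];
foreach ($sets[0] as $onset) {
    foreach ($sets[1] as $vowel) {
        foreach ($sets[2] as $coda) {
            $s = $onset . $vowel . $coda;
            if (strpos($s, 'ow') === false && strpos($s, 'uw') === false) {
                $syllables[] = $s;
            }
        }
    }
}

echo implode(' ', $syllables), PHP_EOL;
```

Filtering before collecting (rather than echoing an empty string for illegal combos, as the original does) keeps the output free of stray separators.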
Kraus fucked around with this message at 21:33 on Jan 27, 2019 |
# ? Jan 27, 2019 18:12 |
|
Kraus posted:I'm a huge dork who is a fan of Star Trek and has a degree in linguistics. So, I built a file that outputs every possible syllable of Klingon. I am also a masochist who likes trying to write files in as few lines/functional blocks as possible. Does anyone know a way I could do this with less? I think I'm at the end of my creativity here. Wow. Ok, so the code (at first glance) looks fine, but it's quarter to eleven on a Sunday night here and I just realised that this is way too much for me to deal with right now. Hope your code works. Dork.
|
# ? Jan 27, 2019 21:49 |
|
|
You could make it slightly shorter by using short array declarations
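i.e. the PHP 5.4+ bracket syntax, which produces identical arrays:

```php
<?php
// array() and the short [] literal (PHP 5.4+) build the same value.
$long  = array('a', 'b', 'c');
$short = ['a', 'b', 'c'];
var_dump($long === $short); // bool(true)
```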
|
# ? Jan 27, 2019 22:07 |