bigmandan
In Laravel 4 you could also do something like this:

php:
<?php
class UserController extends BaseController {
    public function showProfile($id)
    {
        $user = User::findOrFail($id);
        return View::make('user.profile', array('user' => $user));
    }
}
?>
The findOrFail method will throw a ModelNotFoundException if the record cannot be found, so there's no need to explicitly check $id for type. I think handling this exception as a 404 page makes sense, so in app/start/global.php you could add:

php:
<?php
App::error(function(Illuminate\Database\Eloquent\ModelNotFoundException $exception, $code)
{
    return Response::view('errors.404', array(), 404);
});
?>
and define the 'errors.404' template to show a 404 not found page.

If you are looking at Laravel 5 in the future, things are handled a bit differently.
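For reference, the Laravel 5 equivalent moves this mapping out of global.php and into the exception handler. A minimal sketch (assuming the stock 5.x app skeleton):

PHP code:
<?php

namespace App\Exceptions;

use Exception;
use Illuminate\Database\Eloquent\ModelNotFoundException;
use Illuminate\Foundation\Exceptions\Handler as ExceptionHandler;

class Handler extends ExceptionHandler
{
    public function render($request, Exception $exception)
    {
        // Same idea as the global.php handler above: map missing models to a 404.
        if ($exception instanceof ModelNotFoundException) {
            return response()->view('errors.404', [], 404);
        }

        return parent::render($request, $exception);
    }
}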

bigmandan fucked around with this message at 19:35 on May 13, 2015


bigmandan

revmoo posted:

Yes you definitely pointed me in the right direction, thanks.

It's so nice to work in Laravel again, I've been stuck in 'legacy' code for the past two years.

I just finished a 6 week stint of updating legacy (2002!) code, written by an ex-employee, to add a few "emergency" features. Pulling back the layers of mixed logic/presentation cthulhu spaghetti code was... unpleasant. After it was all said and done, I at least got the go-ahead to do a complete rewrite instead of supporting it for years to come.

bigmandan

v1nce posted:

...
• Committing lots of libraries can result in a bloated VCS, especially if you frequently update your libs. This isn't fun to pull down, or to walk history for.
...

This has burned me before. Keeping stuff separate is a good thing. We've been refactoring a lot of cruft where I'm at and one of the main goals is clean code and maintainability.

bigmandan
I've been working on a small problem and I have a solution that works for me, but I'm concerned about style and correctness. The problem appears to be similar to a 0-1 knapsack problem, only I have multiple knapsacks.

Is what I came up with decent, or am I a coding horror?


edit:

After giving it some thought, what I came up with was pretty terrible. I think I came up with a better solution. Maybe.

bigmandan fucked around with this message at 21:42 on Sep 3, 2015

bigmandan

v1nce posted:

I'm looking to make Logging easier for my team to use. We're on Symfony2 and passing the logger into everything with DI is a real thorn in everyone's rear end.

We're using a single Monolog logger as a service. I'm thinking of creating a static singleton which gets the Monolog service via DI during boot, and uses the PSR-3 interface, so I'm not expecting any changes.
It'll act like a simple facade for the Monolog class, like this:
code:
final class MyShittyLogger
{
    protected static $logger;

    public static function setLogger($logger)
    {
        self::$logger = $logger;
    }

    public static function log($level, $message, array $context = array())
    {
        return self::$logger->log($level, $message, $context);
    }

    // ...
}
Can anyone think of a reason I shouldn't implement this? I have a nagging feeling I shouldn't, but I can't come up with an actual reason not to.
Most people's complaints about static/singletons stem from issues with testing, but if we bootstrap the test framework to populate the logger with its own object, then I actually see this as a good thing. It'll let us see any alerts raised during the test execution, and we won't have to inject mocks for Monolog all over the drat place.

This may be one of those cases where "breaking the rules" (tight coupling, global state) would be beneficial to you and your team. Logging is one of those things that you need in enough places to warrant this kind of treatment. Definitely go over it with your team though.
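If you do go this route, the test-bootstrap idea is easy to sketch out. Something like this, where TestLogger is a made-up in-memory PSR-3 implementation (not part of Monolog):

PHP code:
<?php

use Psr\Log\AbstractLogger;

// Hypothetical collector: satisfies PSR-3 by funneling everything through log().
class TestLogger extends AbstractLogger
{
    public $records = array();

    public function log($level, $message, array $context = array())
    {
        $this->records[] = array($level, $message, $context);
    }
}

// In tests/bootstrap.php: point the facade at the collector instead of Monolog.
MyShittyLogger::setLogger(new TestLogger());
Then your tests can inspect $records to assert on anything logged during execution.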

bigmandan

revmoo posted:

Awesome thanks. Looks like upgrading won't be too terrible.

As an aside, can anyone think of the best method to have a PHP script communicate with a long running PHP daemon on the same machine? I need to send data to the daemon on a regular basis but I'm not sure the best way to go about it. I figure I can either write something like JSON to a flat file, or use memcache perhaps, mostly just wondering what the best performance would be. I think writing to a RDBMS would probably be the slowest so that's out.

Beanstalkd should do what you want. The script would push onto the queue and the daemon would read from it. From what I remember, pheanstalk is one of the better PHP libs for talking to beanstalkd. It might be a little overkill, but you can scale it out in the future if needed.
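Roughly what that looks like with pheanstalk (going from memory on the 3.x API, so double-check the docs):

PHP code:
<?php

use Pheanstalk\Pheanstalk;

$pheanstalk = new Pheanstalk('127.0.0.1');

// Producer script: push the data onto a tube.
$pheanstalk->useTube('updates')->put(json_encode($data));

// Daemon: block until a job arrives, process it, then delete it.
$job = $pheanstalk->watch('updates')->ignore('default')->reserve();
$payload = json_decode($job->getData(), true);
// ... do something with $payload ...
$pheanstalk->delete($job);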

bigmandan fucked around with this message at 21:02 on Nov 12, 2015

bigmandan

revmoo posted:

Hmm. I'm already using zmq, I guess I could just use that. I don't really think I need a "work queue" though, as I can process the data in question all at once (and can't really scale past that anyway). I'm mainly concerned with "what is the fastest way to send an entire block of data to the daemon and have it picked up by whatever polling mechanism the daemon uses." I can nuke the whole data set with each update, so I'm not sure a message queue is really the right tool.

I'm guessing a flat file in /tmp full of json is probably the best way to go about it.

If you can nuke the data set with each update then a flat file, like you said, would do it just fine. I assumed the "integrity" of the data being pushed was important.
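One gotcha with the flat file approach: write to a temp file and rename() it into place, so the daemon never reads a half-written file. Something like:

PHP code:
<?php

// rename() is atomic when source and destination are on the same filesystem,
// so the reader sees either the old data set or the new one, never a partial write.
$tmp  = '/tmp/dataset.json.tmp';
$dest = '/tmp/dataset.json';

file_put_contents($tmp, json_encode($dataSet));
rename($tmp, $dest);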

bigmandan

joebuddah posted:

I have a PHP form that has a text area. I want it to take the items in a list and insert each item as its own SQL entry. It's not throwing any errors, but it is also not sending to the database.
Ex:

Bob
Tom
Chick
John

code:
    error_reporting(E_ALL);
    ini_set('display_errors',1);
            $textarray = explode("\n",$textarea);
            if(isset($_POST['textarea'])){
        $textarea= $_POST['textarea'];
        $q = "INSERT INTO Owners(Names) VALUES(:textarea);";
        $query = $odb->prepare($q);
        $results = $query->execute(array(
        ":textarea" => $textarea
        ));
        }?>


Can you clarify what you are trying to accomplish? In your example do you want 4 insert queries (one for each "line")? If so:

PHP code:
<?php

/**
 * boilerplate, etc...
 */

$items = explode("\n", $_POST['textarea']);

foreach ($items as $item) {
    $sql = "INSERT INTO Owners(Names) VALUES(:item);";
    $query = $odb->prepare($sql);
    $results = $query->execute([":item" => $item]);
}
Also you may want to handle the result and any errors/exceptions that occur.

bigmandan

There Will Be Penalty posted:

The thing about prepared statements is you can do this:

PHP code:
$sql = "INSERT INTO Owners(Names) VALUES(:item);";
$query = $odb->prepare($sql);
foreach ($items as $item) {
    $results = $query->execute([":item" => $item]);
    /* ... */
}
It's kind of one of the purposes of prepared statements, really.

You are correct of course. Brain fart on my part.

bigmandan
One thing that can be annoying with Eloquent is that it does not "support" compound keys. I've only run into this issue a small handful of times, and ended up just writing raw queries for what I needed.
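For example, something like this (the compound key here is made up):

PHP code:
<?php

// Hypothetical compound key (order_id, line_no) that Eloquent can't model,
// so fall back to the DB facade and raw SQL with bindings.
$rows = DB::select(
    'SELECT * FROM order_items WHERE order_id = ? AND line_no = ?',
    [$orderId, $lineNo]
);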

bigmandan
Since an address only has one type, it would seem more correct to change the method name to be singular as it better represents what is actually happening.

bigmandan

Gozinbulx posted:

I have zero experience with PHP and am here just to shamelessly ask for a ready-made solution.

I want to set up a small local web server (running Nginx) that I can put some PHP scripts on so that I can do just the following:

I send a HTTP GET or POST request that includes some kind of identifying name and some kind of message. The server accepts and stores the message and the identifying name and returns an "OK 200" or something similar.

I send another HTTP GET or POST request asking to retrieve the message, asking for it by name. The server returns the message.


This seems super simple but I'll be damned if I know what to search for that is specific enough to actually get results relevant to what I'm asking.

Anyone know of some super simple solution to implement something like this? Or even just what I should be searching for?

Any help would be appreciated, thanks.

Here's a super low :effort: script that does this:

https://gist.github.com/dcabanaw/929c0bbf8c5dea56c41f29cba55bccd3

Runs fine with "php -S 127.0.0.1:8080" (I don't have access to an nginx server atm)

- make POSTs to the root to add/update
- (key=someKey value=someValue)
- make GET requests with http://127.0.0.1:8080?key=someKey

Like I said this is super low effort. No error checking, etc... Feel free to do whatever you want with it.

bigmandan

Experto Crede posted:

What is generally considered the best way to write a system that uses modules/plugins?

Basically, I'm planning to write a system that scrapes information from various sources and ideally want it to be modular. The basic idea is that it'll have a central core that'll take the request for information and store the results in the database. To get the data it needs, it'll use the modules in a specific folder, which return the data in a consistent format for the central core to process, store, display, etc.

But I want it to be reasonably automatic, so if I want to add a new source of information, I just write a new module that returns the data in a format the core can understand, and the next time it runs it'll also pull information from that new source.

I'm not sure what the best approach is for doing this though, so some input would be appreciated!

Sounds very similar to how Laravel's middleware system works. For a naive solution:

PHP code:
<?php

// do some work, etc...

// all modules would implement some interface...
$modules = [
	ModuleOne::class, ModuleTwo::class, ModuleThree::class
];

$results = [];

foreach ($modules as $mod)
{
	$m = new $mod();
	$results[] = $m->run($data_from_scraping);
}


// do something with the results
Then the only thing you would need to change when adding a new module would be to update the array of modules. Ideally this would be wrapped up in some sort of class and have some error handling.
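The "some interface" bit would just be a small contract like this (names made up):

PHP code:
<?php

// Hypothetical contract every entry in the $modules array implements.
interface ScraperModule
{
    /**
     * Take the raw scraped data and return it in the consistent
     * format the central core expects.
     */
    public function run($data);
}

class ModuleOne implements ScraperModule
{
    public function run($data)
    {
        // fetch/transform for this particular source, then...
        return ['source' => 'module_one', 'items' => []];
    }
}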

bigmandan

rt4 posted:

Am I the only one coming to the conclusion that frameworks aren't useful? It seems like Symfony or Laravel expect the programmer to use a whole bunch of code they don't understand to provide features they don't always need. I feel like I can get much better results by pulling together my own libraries for database interactions, routing, authentication, and whatever else. All the frameworks seem to give me is a clumsy installation process and the threat of being forced to migrate away if the framework ever dies.

It depends. If you have existing libraries for routing, authentication, etc. that are well tested, use those. If you're working on a team with individuals of varying skill levels, having a framework, its documentation, and a large collection of tutorials and videos is really useful. "Expecting the programmer to use stuff they don't understand" is a rather silly argument. Of course you're going to want to read the documentation or inspect the source and see how it all works, otherwise you're just going to make things more painful for yourself. If you already have a collection of libraries that work for you, it's probably going to seem like a waste of time learning a framework, but for someone new or someone unsatisfied with what they have, why re-invent the wheel?

I'm not sure about Symfony, but I'd hardly call Laravel's setup clumsy now. Maybe in the 3.x and early 4.x versions, but not now.

bigmandan

v1nce posted:

I have felt this way. I do not feel this way anymore. I likely would not dream of rolling my own framework for anything non-trivial providing one exists with a modicum of features that make reasonable sense at first glance. On any language.

Just because the framework is dead doesn't mean you have to migrate. Symfony, Zend and Laravel aren't going anywhere any time soon. Chances are that security concerns will be identified which may affect you, and you'll have to upgrade and move forward. This is true of PHP as a whole (get off 5.5, it's EOL!) and not just the frameworks. Web is always evolving, and you probably can't write code that'll never change.

Learning (googling, usually) is easier than discovering a problem and writing the framework bits to get rid of it. Often all the features you could ever want are already accounted for in some manner, and if they're not then it's totally possible to crowbar them in.
A mature framework or library is a lot more powerful than whatever you can hope to come up with inside your time allotment. Symfony itself excels at this, with almost all components being interchangeable (interfaces!), so if you need to make some magical extra functionality you can just replace/extend the base Symfony classes and tell the service container to use X instead of Y.

If you work for a company and it's affordable then the best head-start with the framework is to hire someone who knows it already, even just as a consultant. This is then your go-to guy for a few weeks, and you provide them every "I have X, I want to Y" problem, and they tell you how it should be done. Often finding the solution to a problem in a framework is as simple as knowing what term to Google. The truly unsupported stuff can usually be solved with some clever patterns.

Frameworks do evolve and sometimes their evolution is not backwards compatible (Symfony, and especially Angular if you work in the JS SPA space). This isn't the end of the world, but some adaptation is required, and it's often done to fix some very serious, very real problems. If at this time you choose to diverge from the original framework, that's fine too.

The double edged sword of frameworks is security; a vulnerability discovered in an open-source framework can be lethal if you don't stay up to date and are still rocking vulnerable libs (see Drupal, JWT). But there's no reason to believe your hand-rolled application isn't peppered with security problems you just haven't discovered (see TalkTalk, Tesco). Security-by-obscurity is only effective when the invisible security hole is small. Having the framework be open is a great way to get peer-review before the code ever makes it to production.

Your own framework implementations make sense because they fit you and you know how they work. Frameworks like Symfony are huge, and they're that way for some very good reasons which might not apply to you.
I've been working on Symfony for around 3 years and there are parts of the codebase I have never looked into but use on a daily basis. That's the sign of good code: interfaces that abstract the problem so that I never have to dig into the implementation, and updates never require the interface to change.

You're obviously already down the path of using your own framework, and that's OK too, providing you can justify the learning curve for every developer who will need to use it in the future. I wouldn't recommend changing to another framework from here because you already have established practices and knowledge, and you'd have to un-learn, learn, and migrate, which means being dead-in-the-water for likely a few months. There's nothing stopping you from using the contents of frameworks though; Symfony is built from the ground up as component-based, and lots of the things can just be plugged in to any framework (translator, console, event dispatcher, forms, routing, serializer, validator, etc). If you happen to find your hand-rolled code isn't cutting it, you can always shunt some of the responsibilities over to the framework libraries.

You explained this much better than I did/could!

One thing I noticed when using a framework is that it forced me to write my business logic in such a way that it's essentially its own library. This has worked out well, as some of our older projects needed the new code and it was pretty painless to backport it over.

bigmandan
There are some frameworks that register a handler for notices and convert them to either errors or exceptions. You can do this yourself too with:

PHP code:
<?php

$old_handler = set_error_handler("my_handler");

function my_handler($errno, $errstr, $errfile, $errline, array $errcontext = [])
{
	// deal with it here, throw an exception etc...
}
http://php.net/manual/en/function.set-error-handler.php

I find it pretty useful to do so as most php notices should be errors imo.
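The usual trick inside the handler is to rethrow as an ErrorException (this is the pattern from the PHP manual):

PHP code:
<?php

set_error_handler(function ($errno, $errstr, $errfile, $errline) {
    // Promote notices/warnings to exceptions you can catch, or let them crash loudly.
    throw new ErrorException($errstr, 0, $errno, $errfile, $errline);
});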

bigmandan
HTML Purifier is a decent library for sanitizing input and strips out most XSS content. Ideally you would sanitize the input with HTML Purifier, validate the sanitized input against some rules (valid ranges, format, etc..), and then store it in the database. Escaping output is ideal unless you are doing some sort of HTML based WYSIWYG editor.
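Basic usage is only a few lines with the default config (tune it per their docs):

PHP code:
<?php

require_once 'HTMLPurifier.auto.php'; // or the Composer autoloader

$config   = HTMLPurifier_Config::createDefault();
$purifier = new HTMLPurifier($config);

// $_POST['comment'] is just an example input field.
$clean = $purifier->purify($_POST['comment']);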

bigmandan

McGlockenshire posted:

If you are using your domain's email, use the SMTP server provided by the host of the (MX records for the) domain.

If you are sending as a gmail address, use Google's SMTP servers. Using your gmail address as the From address, but sending through your host's servers will significantly increase the likelihood of your mail being flagged as spam by the recipients.

Never use mail(). It's poo poo. You can't troubleshoot it, you can't debug it, you can't control it, and when things go wrong like this, you are completely up a creek. Instead, use Swiftmailer if you can, but it's fine to use PHPMailer if learning Swiftmailer seems overwhelming.

PHPMailer is pretty good and my go-to for one-off scripts, but it's definitely worth learning Swiftmailer's API if you use any of the frameworks that use it as their core mail lib (Laravel, Yii, etc.).
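For reference, a one-off PHPMailer script is about this much (namespaced 6.x API; the SMTP details are placeholders):

PHP code:
<?php

use PHPMailer\PHPMailer\PHPMailer;

require 'vendor/autoload.php';

$mail = new PHPMailer(true); // true = throw exceptions instead of returning false

$mail->isSMTP();
$mail->Host     = 'smtp.example.com'; // your host's SMTP server
$mail->SMTPAuth = true;
$mail->Username = 'user@example.com';
$mail->Password = 'secret';
$mail->Port     = 587;

$mail->setFrom('user@example.com', 'Sender Name');
$mail->addAddress('recipient@example.com');
$mail->Subject = 'Test';
$mail->Body    = 'Hello from PHPMailer';

$mail->send();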

bigmandan
If it was not for all the legacy applications and large number of internal libraries we've written, we'd probably switch to C# where I'm at.

bigmandan
You could check out Laravel Homestead. It is a pretty great development environment even if you are not using Laravel.

bigmandan

Acidian posted:

Alright, thanks guys!

I know IDEs and code editors are very subjective. I am currently using Dreamweaver, but mostly because it allows me to get the output immediately in a split screen after saving the file, rather than having to refresh my browser. At my level it doesn't really matter much, but any recommendations? Sublime and PHP Storm seem the most recommended?


It's definitely not been a decade since the last time I used Linux. I can't remember if I was running Debian or Ubuntu, but I was sure I had problems writing to my NTFS drives. That doesn't mean there wasn't a hack for it, but does Ubuntu now support it out of the box?

Both Sublime and PHP Storm are great. I use PHP Storm most of the time now as I like its debugger interface a lot better than the xdebug plugin for Sublime. But if I'm just writing something really quick I'll still use Sublime.

bigmandan

Acidian posted:

Wow, that's a lot of help, thanks! I will just stick with COUNT() for getting the table size then.

Ditching mysqli now and starting with PDO is a bit demoralizing, took me long enough getting that figured out. I'll see what I do, maybe I will wait until I know more about how classes work, although I could probably figure it out with a lot of help from Google.

I will have to read through the index post you linked, as I barely understand how indexing works (other than knowing I should be indexing my tables). I will be using a lot of MySQL moving forward, so the more I learn the better.

I do not know what SQL injection is yet. I assume it's similar to injecting JavaScript or HTML into a $_GET request, but the course I am on will cover it in the next chapter, along with how to defend against it.

You'll be thankful for switching over to PDO. Stick with it, and you'll get there. If you need some general tutorials check out https://laracasts.com/ . While it was initially focused on Laravel, it has expanded into a great PHP/webdev resource. There is a lesson on PDO too: https://laracasts.com/series/php-for-beginners/episodes/13
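To give you an idea of what you're heading toward, a bare-bones PDO query looks like this (DSN/credentials are placeholders):

PHP code:
<?php

$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION] // throw exceptions on errors
);

// Prepared statements do the escaping for you (this is the SQL injection defense).
$stmt = $pdo->prepare('SELECT * FROM users WHERE email = :email');
$stmt->execute([':email' => $email]);
$user = $stmt->fetch(PDO::FETCH_ASSOC);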

bigmandan

96 Port Hub posted:

You can pay someone for a certificate or you can use Let's Encrypt for free. I'd recommend LE unless you need something more exotic like a wildcard cert or extended validation.

Let's Encrypt can do wildcards now too. https://community.letsencrypt.org/t/acme-v2-and-wildcard-certificate-support-is-live/55579

bigmandan

itskage posted:

Anyone seem to have an issue with PHPUnit dying on a max time limit of 300 seconds? It seems to ignore php.ini settings and any set_time_limit/ini_set calls. Our CI is dying on it when running the full suite on new commits. Running just a few tests works fine, so it's like PHPUnit is enforcing that time limit for the whole suite, which isn't going to work for this large project.

E: I think I found it. Going to be really sad if it's what I think it is.

I'm curious... what do you think it is?

bigmandan

itskage posted:

Someone had a set_time_limit set in a function that iterates over a lot of records. For the purposes of the unit test it doesn't matter, but once set it would stick and PHPUnit would use that for the rest of the test.


How do people handle this? I don't see an issue using set_time_limit for things that will take a while, moving it beyond the 30 second default. I don't like the idea of configuring CI to be longer globally, because something hanging can be caught in CI before it hits production.

The best idea I can think of is to have each PHPUnit TestCase reset set_time_limit to the default during setup, so that any classes or functions that set a time limit won't impede the others.


Edit: For the record, we're adding tests to an existing 6 year old ball of mud project. It's a fun and interesting journey that's going about as well as you'd think something like that would go.

Changing the time limit in test setup may work, but it could cause issues down the road when other changes are made. To me it would make sense to have the function setting the time limit clean up after itself:

PHP code:
function aFunctionThatSetsTimeLimit()
{
	$originalLimit = (int) ini_get('max_execution_time');
	set_time_limit($someLimit);

	// code

	set_time_limit($originalLimit);

	return $whateverIfNeeded;
}
If used often enough the setting/resetting of the time limit could be wrapped up in some helper functions. Resetting the time limit may make some hosting providers angry though.
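A sketch of such a helper (needs PHP 5.5+ for finally):

PHP code:
<?php

// Run a callable under a temporary limit and always restore the original,
// even if the callable throws.
function withTimeLimit($seconds, callable $fn)
{
    $original = (int) ini_get('max_execution_time');
    set_time_limit($seconds);

    try {
        return $fn();
    } finally {
        set_time_limit($original);
    }
}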

Ideally long running tasks should be in some sort of queue system but doing that refactor may be out of scope for your project.

bigmandan

Cool Matty posted:

It's on a socket. I tried TCP for giggles but apparently that's busted in WSL, so they recommend socket anyway. Both nginx and php-fpm are running as my local user (confirmed via top).

After digging a bit more, I feel like it has something to do with closing the connection after a request. If I disable fastcgi_buffers in nginx, I am no longer able to load the page at all. If I turn it back on, I can get the page to load the first request, but not subsequent ones. I can only assume that means the socket is working (otherwise it'd never load). But I don't know why php-fpm would not be finishing a request, or screwing up the buffer, or whatever it's doing there.

I was having a similar problem and I eventually got it working in my dev env at home. I'll post my config once I get a chance to do so.

edit:

Running WSL with Ubuntu 18.04.

/home/bigmandan/projects is a symlink to /mnt/c/projects

The server config is NOT symlinked, just a normal file. Had to turn off fastcgi buffering.
/etc/nginx/sites-enabled/mysite:


code:
server {

        listen 80;
        server_name mysite.test;
        root /home/bigmandan/projects/mysite;

        index index.html index.htm index.php;

        charset utf-8;

        location / {
                try_files $uri $uri/ /index.php?$query_string;
        }


        location = /favicon.ico { access_log off; log_not_found off; }
        location = /robots.txt  { access_log off; log_not_found off; }

        location ~ \.php$ {
                include snippets/fastcgi-php.conf;
                fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
                fastcgi_buffering off;
        }

        location ~ /\.ht {
                deny all;
        }

        location ~ /.well-known {
                allow all;
        }
}
Changes to /etc/php/7.2/fpm/pool.d/www.conf
Had to change user/group AND listen.owner and listen.group

code:
user = bigmandan
group = bigmandan
listen.owner = bigmandan
listen.group = bigmandan
And finally the nginx config:

code:
user bigmandan;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
	worker_connections 768;
	# multi_accept on;
}

http {

	##
	# Basic Settings
	##

	sendfile on;
	tcp_nopush on;
	tcp_nodelay on;
	keepalive_timeout 65;
	types_hash_max_size 2048;	

	include /etc/nginx/mime.types;
	default_type application/octet-stream;

	##
	# SSL Settings
	##

	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
	ssl_prefer_server_ciphers on;

	##
	# Logging Settings
	##

	access_log /var/log/nginx/access.log;
	error_log /var/log/nginx/error.log;

	##
	# Gzip Settings
	##

	gzip on;

	##
	# Virtual Host Configs
	##

	include /etc/nginx/conf.d/*.conf;
	include /etc/nginx/sites-enabled/*;
}

bigmandan fucked around with this message at 01:21 on Jun 22, 2018

bigmandan

Acidian posted:

So regarding using REST: is there any common practice to rate limit REST requests in a script? I will set up the larger scripts (that is, scripts that make a lot of calls) to run at like 4 am on Sundays, but even so I worry that making too many requests on the REST API will reduce performance on the Magento server while the script is running. Also, I might have to run scripts during the day that might not make thousands of requests, but will still make requests until they're done. I can't really test this impact on performance in my own Magento test environment in a good way.

Rate limiting your requests will depend on what the endpoint limits are. Check their documentation to make sure. You could set up some sort of worker queue (beanstalkd, rabbitmq, etc...) to perform your requests. The worker would send up to the maximum requests in the allowed time period, then "sleep" until the next run.

Acidian posted:

From what I understand of the REST API, I need to send a PUT or POST request on one object at a time. This means that if I have 500 000 new products, then I need to make 500 000 individual calls on the API. Is there any way of adding more information per request, so that I can maybe send 50 objects in one request? The tutorial or documentation on the Magento site did not seem to indicate that you could.

This will really depend on what their endpoint accepts. I've used/written endpoints that accept a json array of objects. You'll have to dig around in their docs and see.

Acidian posted:

The GuzzleHttp documentation also shows how you can queue up several requests, and then send multiple requests concurrently, but is there any point to that? I would assume it's better to just work through them sequentially. I also don't understand the difference between a synchronous request and an asynchronous request.

If you have a lot of requests to make, sending multiple concurrent requests at a time could be more efficient than doing each one sequentially. Using Guzzle's async support would allow you to fire off a bunch of requests and have a handler do something with each response as it completes.
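Guzzle's Pool makes the concurrent version pretty compact. A rough sketch (the endpoint path is a placeholder, check Magento's docs for the real one):

PHP code:
<?php

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['base_uri' => 'https://magento.example/rest/']);

// Generate one request per product instead of building them all up front.
$requests = function (array $products) {
    foreach ($products as $product) {
        yield new Request(
            'POST',
            'V1/products', // placeholder endpoint
            ['Content-Type' => 'application/json'],
            json_encode($product)
        );
    }
};

$pool = new Pool($client, $requests($products), [
    'concurrency' => 5, // keep this under the API's rate limit
    'fulfilled' => function ($response, $index) {
        // mark product $index as synced
    },
    'rejected' => function ($reason, $index) {
        // log it and queue a retry
    },
]);

$pool->promise()->wait();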

bigmandan
Most of what you have said seems pretty reasonable. If you want a fairly simple way to keep script(s) from running at the same time, you can use a locking mechanism. For example:

https://symfony.com/doc/current/components/lock.html
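A minimal sketch with that component (the factory class is named Factory in older Symfony versions, LockFactory in newer ones):

PHP code:
<?php

use Symfony\Component\Lock\LockFactory;
use Symfony\Component\Lock\Store\FlockStore;

$store   = new FlockStore(sys_get_temp_dir());
$factory = new LockFactory($store);

$lock = $factory->createLock('nightly-import');

if (!$lock->acquire()) {
    exit(0); // a previous run is still going, bail out
}

try {
    // ... do the actual work ...
} finally {
    $lock->release();
}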

If you need finer-grained control over when scripts are run, you may want to look into supervisord instead of cron.

bigmandan

Acidian posted:

I want to learn to use the Symfony framework, since the application I am using is written in Symfony and I want to do some back-end modifications later down the line. If what I say sounds reasonable, then I will just stick with that for now, and learn a better way when I start learning Symfony.

I have "supervisor" installed on the server for the application, not sure if that does the same as "supervisord", but right now I don't think I need "fine grain" control so I think it's ok.

Thank you.

You can use the lock library without using the rest of the framework.

bigmandan

Acidian posted:

I am running into a weird memory issue with a script. It's a short looping script that uploads an image, so it continuously loads in an image file, converts it to base64 and saves it to a variable, uploads the image to a server and unsets the image file. Then the process repeats. Since this is in a while statement, and the variables are the same ones being used over and over, shouldn't the memory allocation (size and address) stay more or less the same, with some fluctuations of 1-2mb depending on the image size?

code:
    //Fetching 1 product in JSON form.
$product = $sql_client_products->get_products();

//Infinite loop insurance.
$i = 0;
    while($product && $i<10000){
        $product = json_decode($product['ProductJSON'], true);

        //Get file location and image file name.
        $file_loc = $product['media_gallery_entries'][0]['content']['base64_encoded_data'];
        $filen_name = $product['media_gallery_entries'][0]['content']['name'];

        //Fetch image from server.
        if($ak_client->get_image_from_url($filen_name, $file_loc)){
            $image_base64_encoded_data = base64_encode(file_get_contents($filen_name));

            //Remove the image location and instead insert the base64 image date
            $product['media_gallery_entries'][0]['content']['base64_encoded_data'] = $image_base64_encoded_data;
            unlink($filen_name);
        }
        //Uploading the product with image to database.
        if($mag_client->add_product($product)){
            //Deleting the product from database.
            $sql_client_products->del_product($product['sku']);
        }

        $date = new DateTime();
        echo $date->format('H:i:s') . ": " . ++$i . "sku: ". $product['sku'] . "\n";

	//Get new product, if table is empty, returns false.
        $product = $sql_client_products->get_products();
    }

I don't see anything erroneous that would cause memory issues. Without knowing what $ak_client, $mag_client, etc. are doing under the hood it's difficult to tell.

You may want to do some profiling with xdebug and cachegrind. Depending on the version I think profiling was removed then added back into xdebug at some point though.
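If your xdebug build does have the profiler, enabling it is just a php.ini change (these are the 2.x setting names; 3.x renamed them to xdebug.mode = profile and xdebug.output_dir):

code:
; php.ini, xdebug 2.x
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp
The resulting cachegrind.out.* files open in KCachegrind/QCacheGrind for analysis.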


bigmandan

Mr Crucial posted:

A newbie question: when should I stop developing frontends in PHP and switch over to something like Vue or React?

I’ve been creating a site using Laravel which has been great for getting all the backend functionality that I want, but for the front end I’ve been using basic views using Bootstrap with a dash of jquery thrown in when I need something a bit more dynamic (datatables, autocomplete lookups etc).

As I’m adding more functionality in I’m finding that the jquery libraries I’m using aren’t cutting it for me and I’m coming to the conclusion that it’s time to learn a front end framework. But it seems to me that React etc all expect to be handling all of the front end as a single page app and I can’t just replace my jquery stuff with equivalent React components and call it a day, or at least that wouldn’t be optimal. Nor do Bootstrap components appear to play nicely when used within React components, although another framework like Bulma looks like it might work a bit better.

What to do?

As mentioned already, Vue is very flexible with how much of the framework you use. You can start with just a few pages at a time and then gradually change into a SPA if you want to. You can also just use components where needed and leave the rest as is.
