|
Zorilla posted:You have this: I tried it both ways but it still happens
|
# ? Jun 2, 2008 03:37 |
|
I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed. I had hoped to be able to do customisable feeds (i.e. so the user can filter out particular contributors or topics and get a feed containing only that), but if I'm to generate flat files that will mean I'll need to do hundreds of them to cover every possibility. Is there some compelling reason why I can't just take a querystring (rewritten with mod_rewrite, of course) and spit the output directly to the user based upon the input parameters?
|
# ? Jun 2, 2008 05:40 |
|
clockworkjoe posted:I tried it both ways but it still happens I think I inappropriately put colons into single-line "if" statements while trying to add consistency to the code. It's my guess that the statement will go on until it reaches the next endif;. Get rid of the colons on those and try again. Zorilla fucked around with this message at 06:29 on Jun 2, 2008 |
# ? Jun 2, 2008 06:27 |
|
I think I got it. raillery.tv/blog Can someone confirm the site looks okay in firefox 3 beta?
|
# ? Jun 2, 2008 06:55 |
|
clockworkjoe posted:Can someone confirm the site looks okay in firefox 3 beta? Don't know about Firefox, but looks okay in Opera 9.27: http://www.halo-17.net/iomha/upload/080602_2_raillery.png
|
# ? Jun 2, 2008 07:45 |
|
Lankiveil posted:I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed. You should be fine. The most obvious reason off the top of my head is performance, but I wouldn't worry about it unless you anticipate enterprise-level traffic right off the bat.
|
# ? Jun 2, 2008 08:17 |
|
Lankiveil posted:I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed. This optimization works very well for blogs and the like, which are updated probably a couple of times a day at most, and have feeds that are not customized for any particular reader. But it breaks down when feeds need to be customized, because customization goes against the grain of cacheability: the more custom something is, the less effective caching techniques will be. The easiest way to implement a customized feed is just to generate it from scratch every time and return it, i.e. don't cache it to a flat file on disk. But this won't scale well. So you could cache each custom feed in a file, but then you'll need a file for each customized feed. This is more complex to manage, needs more disk space, and you'll need a cronjob or something to clean up stale feeds. Depending on the number of clients I was expecting to see, I might try caching the feeds with memcached or something. The way it'd work is that the PHP script would generate a unique key based on the custom feed parameters, and check to see if that feed was present in memcache. If it is, return it. If not, generate it, stick it in memcache, and then return it. This way you can cache a very large number of feeds, and not have to worry about cleanup of the flat files.
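A minimal sketch of the memcached approach described above, assuming the pecl Memcache extension; build_feed() is a hypothetical helper that runs the queries and returns the feed XML:

```php
<?php
// Hypothetical helper: queries the DB and returns the feed XML
// for the given filter parameters (implementation not shown).
function build_feed(array $params) { /* ... */ }

$memcache = new Memcache;
$memcache->connect('localhost', 11211);

// Derive a stable cache key from the custom feed parameters.
$params = array(
    'contributor' => isset($_GET['contributor']) ? $_GET['contributor'] : '',
    'topic'       => isset($_GET['topic'])       ? $_GET['topic']       : '',
);
ksort($params);
$key = 'feed_' . md5(serialize($params));

$xml = $memcache->get($key);
if ($xml === false) {
    $xml = build_feed($params);          // cache miss: generate from scratch
    $memcache->set($key, $xml, 0, 300);  // cache for 5 minutes
}

header('Content-Type: application/rss+xml');
echo $xml;
```

One gotcha: Memcache::get() returns false on a miss, so this pattern only works when a legitimate cached value can never be boolean false (not a problem for XML strings).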
|
# ? Jun 2, 2008 08:18 |
|
What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English.
|
# ? Jun 3, 2008 01:04 |
drcru posted:What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English. In the past I've used some sort of a Message Catalog table in the DB with fields id, language, message. Then create a way to access that, like: php:<? echo MessageCatalog::getMessage("hello", "es"); ?>
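A sketch of what that catalog might look like; the table layout and class name are made up for illustration, and it uses PDO for the lookup:

```php
<?php
// Illustrative schema for the Message Catalog table described above:
//
// CREATE TABLE message_catalog (
//     id       VARCHAR(64) NOT NULL,
//     language CHAR(2)     NOT NULL,
//     message  TEXT        NOT NULL,
//     PRIMARY KEY (id, language)
// );

class MessageCatalog {
    private static $db;

    public static function init(PDO $db) {
        self::$db = $db;
    }

    public static function getMessage($id, $language) {
        $stmt = self::$db->prepare(
            "SELECT message FROM message_catalog WHERE id = ? AND language = ?"
        );
        $stmt->execute(array($id, $language));
        $message = $stmt->fetchColumn();
        // Fall back to the message id so a missing translation
        // degrades gracefully instead of showing nothing.
        return $message !== false ? $message : $id;
    }
}
```

As the follow-up post notes, you'd want to layer a cache over this so every message doesn't cost a query.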
|
|
# ? Jun 3, 2008 01:10 |
|
fletcher posted:In the past I've used some sort of a Message Catalog table in the DB with fields id, language, message. Then create a way to access that, like: There is also zend framework, which you can use just parts of. http://framework.zend.com/manual/en/zend.locale.html
|
# ? Jun 3, 2008 02:53 |
|
drcru posted:What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English. Pick the language based on the HTTP request "Accept-Language" header, but always allow the user to override it by setting a cookie or something.
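A rough sketch of that approach; the cookie name and supported-language list are arbitrary, and q-values are ignored (browsers usually send the list in preference order anyway):

```php
<?php
// Pick a language from the Accept-Language header, letting a
// cookie override it, as suggested above.
function pick_language(array $supported, $default = 'en') {
    // Cookie override wins if it names a language we support.
    if (isset($_COOKIE['lang']) && in_array($_COOKIE['lang'], $supported)) {
        return $_COOKIE['lang'];
    }
    if (!isset($_SERVER['HTTP_ACCEPT_LANGUAGE'])) {
        return $default;
    }
    // Header looks like: "en-GB,en;q=0.8,fr;q=0.5"
    foreach (explode(',', $_SERVER['HTTP_ACCEPT_LANGUAGE']) as $part) {
        $code = strtolower(substr(trim($part), 0, 2)); // primary subtag only
        if (in_array($code, $supported)) {
            return $code;
        }
    }
    return $default;
}
```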
|
# ? Jun 3, 2008 03:25 |
|
I hope this hasn't been covered, but I'm going to take my chances since this seems sort of obscure. I have to integrate with a SOAP API that uses periods in its method calls. The syntax is Scope.Method (e.g. "Report.GetDetails"). I'd really like to use PHP5's SoapClient for this so I can use its elegant method-calling structure. When SoapClient parses a WSDL file, it turns the methods described in the WSDL into methods that are callable on the resulting client object. So if you need to access a SOAP method called "GetInfo" from a client object "$client", you would call "$client->GetInfo([params])". Since this particular service uses periods in all of its method names, I can't figure out how to call this service's methods, since calling $client->Report.GetDetails() isn't syntactically valid. The best thing I came up with is this: $client->__soapCall('Report.GetDetails', array('param1', 'param2' ... )); But that defeats the purpose of using the more elegant method-call support. It also reminds me of NuSoap, which is what I'm trying to get away from (although if I had to, I'd rather use SoapClient than NuSoap no matter what). Any ideas?
|
# ? Jun 5, 2008 04:24 |
|
fletcher posted:You will probably want to implement some sort of cache so you can store the messages in memory rather than querying the DB for each message every time you use it. In the past, I've created files that contain translation tables for each text element that I want translated: php:<? // /languages/en.php $lang = new stdClass(); $lang->login = "Login"; $lang->logout = "Logout"; $lang->welcome = "Welcome"; // ... // /languages/ga.php (loaded after the default, so it only overrides) $lang->login = "Logáil isteach"; $lang->logout = "Logáil amach"; $lang->welcome = "Fáilte!"; ?> php:<? // default language require_once("languages/en.php"); // user preference language require_once("languages/{$userprefs->lang}.php"); ?> I've never done this for a site with more than a couple of dozen text elements though, and it's probably horribly inefficient for a site that has thousands of them.
|
# ? Jun 5, 2008 11:43 |
|
mr. why posted:The best thing I came up with is this: Inherit SoapClient and wrap all the functions in it? That's the only thing you can really do.
|
# ? Jun 6, 2008 09:37 |
|
gibbed posted:Inherit SoapClient and wrap all the functions in it? That's the only thing you can really do. Good idea! Here's what I came up with: php:<? // in a subclass of SoapClient: public function __call($function_name, $arguments) { $function_name = str_replace('_', '.', $function_name); return parent::__call($function_name, $arguments); } ?> Thanks!
|
# ? Jun 6, 2008 17:11 |
|
So how do I disable PHP in one specific folder (for security reasons)?
|
# ? Jun 6, 2008 23:57 |
|
drcru posted:So how do I disable PHP in one specific folder (for security reasons)? In .htaccess: code:
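The code block above didn't survive the quote, but for mod_php something along these lines is the usual approach (which directives are allowed depends on how PHP and AllowOverride are configured, so treat this as a sketch):

```apache
# .htaccess in the folder where PHP should be disabled (mod_php only)
php_flag engine off

# Belt and braces: stop Apache treating .php files as scripts,
# so they are served as plain text instead of executed
RemoveHandler .php
AddType text/plain .php
```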
|
# ? Jun 7, 2008 01:02 |
|
Mine GO BOOM posted:In .htaccess: I'm more worried about being hit by PHP files disguised as images. Is there a way to get the engine to ignore a specific folder? I tried the php_admin directives in an .htaccess but it just gave internal server errors.
|
# ? Jun 7, 2008 06:19 |
|
drcru posted:I'm more worried about being hit by PHP files disguised as images. Is there a way to get the engine to ignore a specific folder? I tried the php_admin directives in an .htaccess but it just gave internal server errors. 'php_flag engine 0' should do it
|
# ? Jun 7, 2008 07:36 |
|
Bruno_me posted:'php_flag engine 0' should do it Doesn't seem to be working. Do I have to change any settings on the server?
|
# ? Jun 7, 2008 10:13 |
|
drcru posted:Doesn't seem to be working. Do I have to change any settings on the server? In Apache you need to have AllowOverride Options (or All).
|
# ? Jun 7, 2008 10:25 |
|
Lacithe posted:In Apache you need to have AllowOverride Options (or All). I'm so confused, it's already set to All. I'm running on mod_php.
|
# ? Jun 7, 2008 10:48 |
|
Are there any issues with using file_get_contents('php://input') to read a JSON string sent by MooTools Request.JSON? My PHP developers have said that it's a security risk, but I haven't been able to find anything by googling, and I'm not sure why MooTools would send its JSON strings in a raw POST if there were any major issues?
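For what it's worth, reading the raw body is no more dangerous than reading $_POST, provided the result is validated like any other user input. A minimal sketch (the field name is illustrative):

```php
<?php
// Read a raw JSON POST body (e.g. from MooTools Request.JSON) and
// treat it exactly like any other untrusted input.
$raw  = file_get_contents('php://input');
$data = json_decode($raw, true); // true = decode to associative arrays

if ($data === null) {
    header('HTTP/1.1 400 Bad Request');
    exit('Invalid JSON');
}

// Whitelist and cast fields before using them, same as with $_POST.
$comment = isset($data['comment']) ? (string) $data['comment'] : '';
```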
|
# ? Jun 7, 2008 19:28 |
|
SA has removed its ad monitoring functionality for banner ads so I now have no idea who clicked my ad or how many times they did -vs- impressions. Once I found this out, I quickly Google-hacked some php together to make a .txt file that recorded clicks before punting people back to my coupons thread. I now need to find a way to parse all this crap and toss it into some sort of database, preferably MySQL so I can keep utilizing the script in the future. I'd love the eventual ability to do sorting and reporting so I can see if people clicked it more than once, etc. What would be the best way to attack something like this? I have the ability to read and understand existing php but don't know enough to come up with anything on my own. Here's an example of two entries: code:
|
# ? Jun 9, 2008 11:41 |
ryanbruce posted:What would be the best way to attack something like this? Why bother putting it in a text file and then parsing that to build your database? Why not just insert it directly into the database? php:<? // do whatever you do to get the $version, $ip, and $referrer variables filled $database = new PDO('mysql:host=localhost;dbname=mysite', 'username', 'password'); $insert = $database->prepare("INSERT INTO ad_hits (version, ip, referrer) VALUES (:version, :ip, :referrer)"); $insert->bindParam(":version", $version); $insert->bindParam(":ip", $ip); $insert->bindParam(":referrer", $referrer); $insert->execute(); ?> fletcher fucked around with this message at 22:53 on Jun 9, 2008 |
|
# ? Jun 9, 2008 22:48 |
|
fletcher posted:Why bother putting it in a text file and then parsing that to build your database? Why not just insert it directly into the database? I had it dump to .txt because I was in a hurry and it was quick and dirty. Technically 1and1's TOS says I'm not allowed to use MySQL to log referrers, adclicks, chat, or anything else that is high processor usage. I'll probably go this way anyway since I'm hardly a big time website. I'll look into the php you pasted/etc and see if I can figure it out. Thanks for the lead! edit: I'd like to know why I didn't think to use CSV though.. Way late edit: If anyone ever makes the plaintext mistake that I made, here's how I fixed it: Opened the .txt file in Word, did a find/replace (i.e. Find [paragraph character]Referring URL: and Replace with a comma) for everything. End result: Nice clean .csv! ryanbruce fucked around with this message at 23:24 on Jun 10, 2008 |
# ? Jun 9, 2008 23:28 |
|
I'm trying to implement a simple URL hiding script that retrieves a zip file from the server filesystem. It's pretty basic and could surely be much better. Something like this... php:<? header("Content-type: application/zip"); header("Content-Disposition: attachment; filename=\"".$filename."\""); header("Content-Transfer-Encoding: binary"); header("Content-Length: " .(string)(filesize($file_path)) ); $file = @fopen($file_path,"rb"); if ($file) { while(!feof($file)) { print(fread($file, 1024*8)); flush(); } @fclose($file); } ?> It works for smaller files, but larger files (>50 megs) are always incomplete. Any idea why this is happening? Thanks!
|
# ? Jun 11, 2008 17:19 |
|
mcbuttbutt posted:It works for smaller files, but larger files (>50 megs) are always incomplete. Any idea why this is happening? Thanks! You're probably hitting the PHP 30 second execution time limit.
|
# ? Jun 11, 2008 18:43 |
|
minato posted:You're probably hitting the PHP 30 second execution time limit. set_time_limit(0). But watch out, since it will basically keep running (well, until you stop the browser). Just make sure you don't allow it to be backgrounded, or you'll end up with a runaway process. There are also some issues because fopen (the process) can be blocked. Can you do file_get_contents('file path here')? That pretty much takes everything you do and does it in one easy swoop.
|
# ? Jun 11, 2008 19:20 |
|
minato posted:You're probably hitting the PHP 30 second execution time limit. Nothing regarding that in the logs, plus there are some database updates that occur after the fread() that successfully execute. MrEnigma posted:set_time_limit(0) Yeah, forgot to mention already having set_time_limit(0). I tried using file_get_contents and saw this error -- PHP Fatal error: Allowed memory size of 12582912 bytes exhausted mcbuttbutt fucked around with this message at 19:31 on Jun 11, 2008 |
# ? Jun 11, 2008 19:25 |
|
Try adding "while(@ob_end_clean());" before you start outputting data.
|
# ? Jun 11, 2008 19:33 |
|
That didn't fix it either. Thanks for your help so far!
|
# ? Jun 11, 2008 19:45 |
|
mcbuttbutt posted:Nothing regarding that in the logs, plus there are some database updates that occur after the fread() that successfully execute. Ah, you're running out of memory. PHP has a memory_limit parameter in php.ini. You can also set it using ini_set('memory_limit', '64M'); in a script, or via .htaccess. From: http://drupal.org/node/29268 Granted it's for Drupal specifically, but it should work for you too. Edit: If you're on a shared host that might not work for you. And file_get_contents is probably more memory intensive, since it loads up the entire thing, while the other way you were doing it in chunks. MrEnigma fucked around with this message at 20:03 on Jun 11, 2008 |
# ? Jun 11, 2008 19:58 |
|
Yeah, using file_get_contents() I ran out of memory and got the message in the error log. However, reading it chunk by chunk still doesn't complete the download, and the code following it is still executed. Forgive the poor wording, but could it be possible that I'm losing the connection with the download once the PHP script stops executing? It's strange that I'm only getting half of the zip file and no errors.
|
# ? Jun 11, 2008 20:17 |
|
mcbuttbutt posted:Yeah, using file_get_contents() I ran out of memory and got the message in the error log. It's dying somehow; it might be because of something else, like a remote host dropping it, or simply PHP not being able to set a timeout on that process. You could try raising your level of error reporting... if you're on PHP 5, use error_reporting(E_ALL | E_STRICT);
|
# ? Jun 11, 2008 20:19 |
|
MrEnigma posted:error_reporting(E_ALL | E_STRICT); PHP Notice: ob_flush(): failed to flush buffer. No buffer to flush in path/to/download.php on line 133 edit: duh, fixed it with @ob_flush();. Still not working though. mcbuttbutt fucked around with this message at 20:36 on Jun 11, 2008 |
# ? Jun 11, 2008 20:33 |
|
Also take note that you'll likely need to raise all the following settings if you want a user to upload a really huge file: memory_limit post_max_size upload_max_filesize max_execution_time Of course, the memory limit will have to be slightly higher than the post size, which must be slightly higher than the max filesize.
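As an illustration, for uploads up to roughly 100 MB the php.ini values might look like this (the exact numbers are arbitrary; the ordering between them is what matters):

```ini
; illustrative values only
memory_limit        = 128M   ; a bit above post_max_size
post_max_size       = 110M   ; a bit above upload_max_filesize
upload_max_filesize = 100M
max_execution_time  = 300    ; seconds
```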
|
# ? Jun 11, 2008 22:59 |
|
I remember something a while ago about PHP corrupting the download. If possible, make it send a file that is just the alphabet repeated, and then diff the output to see the actual byte mark it fails. I'm pretty sure I remember it being at a static offset.
|
# ? Jun 11, 2008 23:36 |
|
This isn't always working like it should. I have a special link on my pages that will take you to loggingout.php. It uses a token to verify that you actually clicked the link (and didn't have anyone IM you a fake logout link). Stripping away the logic, what it really does is this, which should log you out and redirect you to the index:php:<? // loggingout.php setcookie('blargh','',time()-$tenyears,myCookiePath(),myCookieDomain()); header('Location: http://website.com/theindexpage.php'); exit; // stop the script so nothing runs after the redirect ?> What's the deal?
|
# ? Jun 12, 2008 04:07 |
|
I could have sworn I posted this here earlier this evening, but I think I'm crazy; apologies if I'm not crazy. I need a PHP class or set of UDFs that makes Markov word chains from a given source body of text. I've found a few that make Markov words from letter chains, and one that just plain doesn't work, but none that do word chains. The one that seems to be floating around the highest on Google is the one that doesn't seem to work at all- this one edit: added wikipedia link
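In case it helps anyone searching later, a word-level chain is only a few lines. A minimal order-1 sketch (the function name is made up): map each word to the list of words that follow it in the source, then do a random walk over that map.

```php
<?php
// Minimal word-level Markov chain (order 1): build a successor table,
// then walk it randomly to generate text.
function markov_generate($source, $length = 30) {
    $words = preg_split('/\s+/', trim($source));
    $chain = array();
    for ($i = 0; $i < count($words) - 1; $i++) {
        $chain[$words[$i]][] = $words[$i + 1];
    }

    $word = $words[array_rand($words)];  // random starting word
    $out = array($word);
    for ($i = 1; $i < $length; $i++) {
        if (empty($chain[$word])) {
            break;                        // dead end: word never followed
        }
        $word = $chain[$word][array_rand($chain[$word])];
        $out[] = $word;
    }
    return implode(' ', $out);
}
```

An order-2 version (keying on pairs of words) gives much more coherent output at the cost of a bigger table.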
|
# ? Jun 12, 2008 04:50 |