clockworkjoe
May 31, 2000

Rolled a 1 on the random encounter table, didn't you?

Zorilla posted:

You have this:

<?php endwhile; endif; ?>

...when you should have this (as shown in my example)...

<?php endwhile; ?>

I tried it both ways but it still happens

Lankiveil
Feb 23, 2001

Forums Minimalist
I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed.

I had hoped to be able to do customisable feeds (ie: so the user can filter out particular contributors or topics and get a feed containing only that), but if I'm to generate flat files that will mean I'll need to do hundreds of them to cover every possibility. Is there some compelling reason why I can't just take a querystring (rewritten with mod_rewrite, of course) and spit the output directly to the user based upon the input parameters?

Zorilla
Mar 23, 2005

GOING APE SPIT

clockworkjoe posted:

I tried it both ways but it still happens

I think I inappropriately put colons into single-line "if" statements while trying to add consistency to the code. It's my guess that the statement will go on until it reaches the next endif;. Get rid of the colons on those and try again.
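
To illustrate what I mean (made-up example, not your actual code):

php:
<?
$foo = true;

// With the colon, PHP switches to the alternative syntax, so everything
// down to the next endif; becomes part of the "if":
if ($foo): echo "bar";
// ...anything here gets swallowed by the if...
endif;

// Without the colon, only the single statement belongs to the "if":
if ($foo) echo "bar";
?>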

Zorilla fucked around with this message at 06:29 on Jun 2, 2008

clockworkjoe
May 31, 2000

Rolled a 1 on the random encounter table, didn't you?
I think I got it. raillery.tv/blog

Can someone confirm the site looks okay in firefox 3 beta?

Lankiveil
Feb 23, 2001

Forums Minimalist

clockworkjoe posted:

Can someone confirm the site looks okay in firefox 3 beta?

Don't know about Firefox, but looks okay in Opera 9.27:

http://www.halo-17.net/iomha/upload/080602_2_raillery.png

DaTroof
Nov 16, 2000

CC LIMERICK CONTEST GRAND CHAMPION
There once was a poster named Troof
Who was getting quite long in the toof

Lankiveil posted:

I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed.

I had hoped to be able to do customisable feeds (ie: so the user can filter out particular contributors or topics and get a feed containing only that), but if I'm to generate flat files that will mean I'll need to do hundreds of them to cover every possibility. Is there some compelling reason why I can't just take a querystring (rewritten with mod_rewrite, of course) and spit the output directly to the user based upon the input parameters?

You should be fine. The most obvious reason off the top of my head is performance, but I wouldn't worry about it unless you anticipate enterprise-level traffic right off the bat.

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender

Lankiveil posted:

I'm looking at implementing my own RSS feed generator, which will work off of a series of existing tables in my database. I had planned to simply write a PHP script that output the required XML straight to the client, but most of the tutorials that I've seen on the web do it by generating a flat text file at a regular interval and having that serve as the feed.

I had hoped to be able to do customisable feeds (ie: so the user can filter out particular contributors or topics and get a feed containing only that), but if I'm to generate flat files that will mean I'll need to do hundreds of them to cover every possibility. Is there some compelling reason why I can't just take a querystring (rewritten with mod_rewrite, of course) and spit the output directly to the user based upon the input parameters?
Unfortunately the RSS standard doesn't specify how often an RSS client should poll the server to retrieve the XML file. It might hit it every few hours, every 15 minutes, or at worst every minute or two. So the server needs to be able to handle this constant polling, multiplied by the expected number of RSS clients. If the cost of generating a feed is running a PHP script and querying the database, that can get expensive and inefficient when the number of clients is high and the content is infrequently updated. So a common optimization is to use a cronjob to write the feed to a flat file and serve that instead. That way it's a fixed cost for the server to generate it, and a much smaller cost to serve it. In addition, the client can use caching techniques like ETags to avoid retrieving the whole feed on every poll.

This optimization works very well for blogs and the like, which are updated probably a couple of times a day at most, and have feeds that are not customized for any particular reader. But it breaks down when feeds need to be customized, because customization goes against the grain of cacheability. The more customized something is, the less effective caching techniques will be.

The easiest way to implement a customized feed is just to generate it from scratch every time and return it, i.e. don't cache it to a flat file on the disc. But this won't scale well. So you could cache each custom feed in a file, but then that'll mean you need a file for each customized feed. This is more complex to manage, needs more disc space, and you'll need a cronjob or something to clean up stale feeds.

Depending on the number of clients I was expecting to see, I might try caching the feeds with memcached or something. The way it'd work is that the PHP script would generate a unique key based on the custom feed parameters, and check to see if that feed was present in memcache. If it is, return it. If not, generate it, stick it in memcache, and then return it. This way you can cache a very large number of feeds, and not have to worry about cleanup of the flat files.
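
Roughly what I have in mind, as a sketch (assumes the pecl Memcache extension, and build_feed_xml() is a hypothetical helper you'd write yourself to query the DB and return the XML):

php:
<?
// Sketch only: build_feed_xml() is a made-up function that queries the
// database and returns the feed as an XML string.
$memcache = new Memcache();
$memcache->connect('localhost', 11211);

// Unique key derived from the custom feed parameters
$key = 'feed_' . md5($_GET['contributor'] . '|' . $_GET['topic']);

$xml = $memcache->get($key);
if ($xml === false) {
    $xml = build_feed_xml($_GET['contributor'], $_GET['topic']);
    $memcache->set($key, $xml, 0, 300); // cache each custom feed for 5 minutes
}

header('Content-Type: application/rss+xml');
echo $xml;
?>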

Acer Pilot
Feb 17, 2007
put the 'the' in therapist

:dukedog:

What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

drcru posted:

What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English.

In the past I've used some sort of a Message Catalog table in the DB with fields id, language, message. Then create a way to access that, like:

php:
<?
echo MessageCatalog::getMessage("hello", "es"); ?>
code:
hola
You will probably want to implement some sort of cache so you can store the messages in memory rather than querying the DB for each message every time you use it.
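
A rough sketch of what that might look like (table and class names are just examples; assumes a PDO connection in $db):

php:
<?
// Sketch: message_catalog table has columns (id, language, message).
// Looked-up messages are kept in a static array so each one only hits
// the DB once per request.
class MessageCatalog
{
    private static $cache = array();

    public static function getMessage($id, $language)
    {
        global $db; // assumed PDO connection

        $key = $language . ':' . $id;
        if (!isset(self::$cache[$key])) {
            $stmt = $db->prepare(
                "SELECT message FROM message_catalog WHERE id = ? AND language = ?");
            $stmt->execute(array($id, $language));
            self::$cache[$key] = $stmt->fetchColumn();
        }
        return self::$cache[$key];
    }
}
?>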

MrEnigma
Aug 30, 2004

Moo!

fletcher posted:

In the past I've used some sort of a Message Catalog table in the DB with fields id, language, message. Then create a way to access that, like:

php:
<?
echo MessageCatalog::getMessage("hello", "es"); ?>
code:
hola
You will probably want to implement some sort of cache so you can store the messages in memory rather than querying the DB for each message every time you use it.

There's also the Zend Framework; you can use just parts of it.

http://framework.zend.com/manual/en/zend.locale.html

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender

drcru posted:

What's the best or easiest way to add language support? By language support I mean we can display an error message in English or maybe French rather than just English.
For static or printf()-style strings, use gettext. For user-editable stuff, you have to roll your own.

Pick the language based on the HTTP request "Accept-Language" header, but always allow the user to override it by setting a cookie or something.
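
A minimal sketch of the gettext side (the domain name and locale paths are just examples; assumes the compiled .mo files already exist under ./locale):

php:
<?
// Sketch: pick the locale (cookie override, falling back to a default),
// point gettext at the translation files, then translate strings with _().
$locale = isset($_COOKIE['lang']) ? $_COOKIE['lang'] : 'fr_FR';
putenv("LC_ALL=$locale");
setlocale(LC_ALL, $locale);
bindtextdomain('messages', './locale');
textdomain('messages');

echo _("Hello");                            // static string
printf(_("You have %d new messages"), 3);   // printf()-style string
?>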

a starchy tuber
Sep 9, 2002

hi yes I'm very normal
I hope this hasn't been covered, but I'm going to take my chances since this seems sort of obscure. I have to integrate with a SOAP API that uses periods in its method calls. The syntax is Scope.Method (e.g. "Report.GetDetails").

I'd really like to use PHP 5's SoapClient for this so I can use its elegant method-calling structure. When SoapClient parses a WSDL file, it turns the methods described in the WSDL into methods callable directly on the resulting client object. So if you need to access a SOAP method called "GetInfo" from a client object "$client", you would call "$client->GetInfo([params])".

Since this particular service uses periods in all of its method names, I can't figure out how to call this service's methods, since calling $client->Report.GetDetails() isn't syntactically valid.

The best thing I came up with is this:

$client->__soapCall('Report.GetDetails', array('param1', 'param2' ... ));

But that defeats the purpose of using the more-elegant callable method support. It also reminds me of NuSoap, which is what I'm trying to get away from (although if I had to, I'd rather use SoapClient than NuSoap no matter what).

Any ideas?

Lankiveil
Feb 23, 2001

Forums Minimalist

fletcher posted:

You will probably want to implement some sort of cache so you can store the messages in memory rather than querying the DB for each message every time you use it.

In the past, I've created files that contain translation tables for each text element that I want translated:

php:
<?
// /languages/en.php
$lang = new stdClass(); // so we're not relying on implicit object creation

$lang->login = "Login";
$lang->logout = "Logout";
$lang->welcome = "Welcome";
// ...

// /languages/ga.php
$lang->login = "Logáil isteach";
$lang->logout = "Logáil amach";
$lang->welcome = "Fáilte!";
?>
Then, in my standard header code, I do something along the lines of:

php:
<?
// default language
require_once("languages/en.php");

// user preference language
require_once("languages/$userprefs->lang.php");
?>
This not only allows the user to choose whatever language they wish, it also provides a fallback to English or another default language if the translation is not complete for the language they want.

I've never done this for a site with more than a couple of dozen text elements though, and it's probably horribly inefficient for a site that has thousands of them.

gibbed
Apr 10, 2006

mr. why posted:

The best thing I came up with is this:

$client->__soapCall('Report.GetDetails', array('param1', 'param2' ... ));

But that defeats the purpose of using the more-elegant callable method support. It also reminds me of NuSoap, which is what I'm trying to get away from (although if I had to, I'd rather use SoapClient than NuSoap no matter what).

Any ideas?
Inherit SoapClient and wrap all the functions in it? That's the only thing you can really do.

a starchy tuber
Sep 9, 2002

hi yes I'm very normal

gibbed posted:

Inherit SoapClient and wrap all the functions in it? That's the only thing you can really do.

Good idea!

Here's what I came up with:
php:
<?
function __call($function_name, $arguments) {
    // turn Report_GetDetails back into Report.GetDetails before the real call
    $function_name = str_replace('_', '.', $function_name);
    return parent::__call($function_name, $arguments);
}
?>
That function is defined in the extended SoapClient class. Now I can make a call like "$client->Report_GetDetails($params)". Works beautifully :)

Thanks!

Acer Pilot
Feb 17, 2007
put the 'the' in therapist

:dukedog:

So how do I disable PHP in one specific folder (for security reasons)?

Mine GO BOOM
Apr 18, 2002
If it isn't broken, fix it till it is.

drcru posted:

So how do I disable PHP in one specific folder (for security reasons)?

In .htaccess:
code:
AddType text/html .php

Acer Pilot
Feb 17, 2007
put the 'the' in therapist

:dukedog:

Mine GO BOOM posted:

In .htaccess:
code:
AddType text/html .php

I'm more worried about being hit by PHP files disguised as images. Is there a way to get the engine to ignore a specific folder? I tried the php_admin directives in an .htaccess but it just gave internal server errors.

Bruno_me
Dec 11, 2005

whoa

drcru posted:

I'm more worried about being hit by PHP files disguised as images. Is there a way to get the engine to ignore a specific folder? I tried the php_admin directives in an .htaccess but it just gave internal server errors.
'php_flag engine 0' should do it

Acer Pilot
Feb 17, 2007
put the 'the' in therapist

:dukedog:

Bruno_me posted:

'php_flag engine 0' should do it

Doesn't seem to be working. Do I have to change any settings on the server?

Lacc
Jul 12, 2004

Install fist, problem solved.

drcru posted:

Doesn't seem to be working. Do I have to change any settings on the server?

In Apache you need to have AllowOverride Options (or All).
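
Something along these lines (the paths are just an example):

code:
# httpd.conf -- the directory you want PHP disabled in
<Directory "/var/www/mysite/uploads">
    AllowOverride Options
</Directory>

# /var/www/mysite/uploads/.htaccess
php_flag engine off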

Acer Pilot
Feb 17, 2007
put the 'the' in therapist

:dukedog:

Lacithe posted:

In Apache you need to have AllowOverride Options (or All).

I'm so confused, it's already set to All. I'm running on mod_php. :confused:

mwarkentin
Oct 26, 2004
Are there any issues with using file_get_contents('php://input') to read a JSON string sent by MooTools' Request.JSON? My PHP developers have said that it's a security risk, but I haven't been able to find anything by googling, and I'm not sure why MooTools would send its JSON strings in a raw POST if there were any major issues.

ryanbruce
May 1, 2002

The "Dell Dude"
SA has removed its ad monitoring functionality for banner ads so I now have no idea who clicked my ad or how many times they did -vs- impressions. Once I found this out, I quickly Google-hacked some php together to make a .txt file that recorded clicks before punting people back to my coupons thread.

I now need to find a way to parse all this crap and toss it into some sort of database, preferably MySQL so I can keep utilizing the script in the future. I'd love the eventual ability to do sorting and reporting so I can see if people clicked it more than once, etc.

What would be the best way to attack something like this? I have the ability to read and understand existing php but don't know enough to come up with anything on my own. Here's an example of two entries:
code:
Ad version: Sa_ad_001
IP Address: 192.168.1.1
Date: Tue, 20 May 2008 16:39:38 -0400
Referring URL: http://forums.somethingawful.com/showthread.php?noseen=0&threadid=2212134&pagenumber=95
----
Ad version: Sa_ad_001
IP Address: 192.168.1.2
Date: Tue, 20 May 2008 16:51:48 -0400
Referring URL: http://forums.somethingawful.com/adlist.php
----

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

ryanbruce posted:

What would be the best way to attack something like this?

Why bother putting it in a text file and then parsing that to build your database? Why not just insert it directly into the database?

php:
<?
// do whatever you need to do to get the $version, $ip, and $referrer variables filled

$database = new PDO('mysql:host=localhost;dbname=mysite', 'username', 'password');
$insert = $database->prepare("INSERT INTO ad_hits (version, ip, referrer) VALUES (:version, :ip, :referrer)");
$insert->bindParam(":version", $version);
$insert->bindParam(":ip", $ip);
$insert->bindParam(":referrer", $referrer);
$insert->execute();

?>
Don't bother inserting the date; make MySQL do that automatically for you when you insert the row. If you want your existing logs to be put in your database, use fgets to read a line, and then some string functions to whittle the line down to exactly what you want. In the future, store something like this in a common format like CSV, so you can import it into ANY database with ZERO effort (or just go straight into the database).
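
For the existing log, something like this would work (a rough sketch that assumes the log format from your post and reuses the $database/$insert setup from the snippet above; I'm splitting on the separators rather than going line by line, and the filename is just an example):

php:
<?
// Sketch: split the log on the "----" separators, pull the fields out of
// each block, and re-run the prepared statement (the bindParam() calls
// above bind by reference, so reassigning the variables is enough).
$entries = explode("----", file_get_contents('adclicks.txt'));
foreach ($entries as $entry) {
    if (!preg_match('/Ad version: (.+)/', $entry, $m)) continue;
    $version = trim($m[1]);
    preg_match('/IP Address: (.+)/', $entry, $m);
    $ip = trim($m[1]);
    preg_match('/Referring URL: (.+)/', $entry, $m);
    $referrer = trim($m[1]);
    $insert->execute();
}
?>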

fletcher fucked around with this message at 22:53 on Jun 9, 2008

ryanbruce
May 1, 2002

The "Dell Dude"

fletcher posted:

Why bother putting it in a text file and then parsing that to build your database? Why not just insert it directly into the database?

I had it dump to .txt because I was in a hurry and it was quick and dirty. Technically 1and1's TOS says I'm not allowed to use MySQL to log referrers, adclicks, chat, or anything else that is high processor usage. I'll probably go this way anyways since I'm hardly a big time website.

I'll look into the php you pasted/etc and see if I can figure it out. Thanks for the lead!

edit: I'd like to know why I didn't think to use CSV though.. :downs:
Way late edit: If anyone ever makes the plaintext mistake that I made, here's how I fixed it: Opened the .txt file in Word, did a find/replace (ie Find [paragraph character]Referring URL: and Replace with a comma) for everything. End result: Nice clean .csv!

ryanbruce fucked around with this message at 23:24 on Jun 10, 2008

mcbuttbutt
Jul 1, 2004
I'm trying to implement a simple URL hiding script that retrieves a zip file from the server filesystem. It's pretty basic and could surely be much better. Something like this...
php:
<?
header("Content-type: application/zip");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " .(string)(filesize($file_path)) );

$file = @fopen($file_path,"rb");
if ($file) {
  while(!feof($file)) {
    print(fread($file, 1024*8));
    flush();
  }
  @fclose($file);
}

?>
It works for smaller files, but larger files (>50 megs) are always incomplete. Any idea why this is happening? Thanks!

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender

mcbuttbutt posted:

It works for smaller files, but larger files (>50 megs) are always incomplete. Any idea why this is happening? Thanks!
You're probably hitting the PHP 30 second execution time limit.

MrEnigma
Aug 30, 2004

Moo!

minato posted:

You're probably hitting the PHP 30 second execution time limit.

You can get around this in some cases by doing

set_time_limit(0)

But watch out, since the script will basically keep running until you stop the browser. Just make sure you don't allow it to be backgrounded, or you'll end up with a runaway process :) There can also be issues because fopen() can block.

Can you do file_get_contents('file path here')? That pretty much takes everything you're doing and does it in one swoop.

mcbuttbutt
Jul 1, 2004

minato posted:

You're probably hitting the PHP 30 second execution time limit.

Nothing regarding that in the logs, plus there are some database updates that occur after the fread() that successfully execute.

MrEnigma posted:

set_time_limit(0)

Can you do file_get_contents('file path here')? That pretty much takes everything you're doing and does it in one swoop.

Yeah, forgot to mention I already have set_time_limit(0). I tried using file_get_contents and saw this error -- PHP Fatal error: Allowed memory size of 12582912 bytes exhausted

mcbuttbutt fucked around with this message at 19:31 on Jun 11, 2008

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Try adding "while(@ob_end_clean());" before you start outputting data.

mcbuttbutt
Jul 1, 2004
That didn't fix it either. Thanks for your help so far!

MrEnigma
Aug 30, 2004

Moo!

mcbuttbutt posted:

Nothing regarding that in the logs, plus there are some database updates that occur after the fread() that successfully execute.


Yeah, forgot to mention I already have set_time_limit(0). I tried using file_get_contents and saw this error -- PHP Fatal error: Allowed memory size of 12582912 bytes exhausted

Ah, you're running out of memory.

PHP has a memory_limit parameter in php.ini

You can also set it at runtime using

ini_set('memory_limit', '64M');

in a script

or via .htaccess

From: http://drupal.org/node/29268

Granted it's for drupal specifically, but it should work for you also.
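
e.g. (mod_php only, and only if your host allows overrides):

code:
php_value memory_limit 64M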

Edit: If you're on a shared host, that might not work for you. And file_get_contents is probably more memory-intensive since it loads the entire file at once, while your original approach reads it in chunks.

MrEnigma fucked around with this message at 20:03 on Jun 11, 2008

mcbuttbutt
Jul 1, 2004
Yeah, using file_get_contents() I ran out of memory and got the message in the error log.

However, reading it chunk by chunk still doesn't complete the download, and the following code is executed. Forgive the poor wording, but could it be that I'm losing the connection for the download once the PHP script stops executing? It's strange that I'm only getting half of the zip file and no errors.

MrEnigma
Aug 30, 2004

Moo!

mcbuttbutt posted:

Yeah, using file_get_contents() I ran out of memory and got the message in the error log.

However, reading it chunk by chunk still doesn't complete the download, and the following code is executed. Forgive the poor wording, but could it be that I'm losing the connection for the download once the PHP script stops executing? It's strange that I'm only getting half of the zip file and no errors.

It's dying somehow. It might be because of something else, like a remote host dropping the connection, or simply PHP not being able to set a timeout on that process.

You could try raising your level of error reporting... if you're on PHP 5, use

error_reporting(E_ALL | E_STRICT);

mcbuttbutt
Jul 1, 2004

MrEnigma posted:

error_reporting(E_ALL | E_STRICT);
Did this and got an interesting message (path to the file edited)...

PHP Notice: ob_flush() [<a href='ref.outcontrol'>ref.outcontrol</a>]: failed to flush buffer. No buffer to flush. in path/to/download.php on line 133

edit: duh, fixed it with @ob_flush();. Still not working though.

mcbuttbutt fucked around with this message at 20:36 on Jun 11, 2008

Zorilla
Mar 23, 2005

GOING APE SPIT
Also take note that you'll likely need to raise all the following settings if you want a user to upload a really huge file:

memory_limit
post_max_size
upload_max_filesize
max_execution_time


Of course, the memory limit will have to be slightly higher than the post size, which must be slightly higher than the max filesize.
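
For example, in php.ini (the numbers are just an illustration of that ordering):

code:
upload_max_filesize = 100M
post_max_size       = 108M   ; a bit larger than upload_max_filesize
memory_limit        = 128M   ; a bit larger than post_max_size
max_execution_time  = 300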

Mine GO BOOM
Apr 18, 2002
If it isn't broken, fix it till it is.
I remember something a while ago about PHP corrupting the download. If possible, make it send a file that is just the alphabet repeated, and then diff the output to see the actual byte at which it fails. I'm pretty sure I remember it being at a static offset.

functional
Feb 12, 2008

This isn't always working like it should. I have a special link on my pages that will take you to loggingout.php. It uses a token to verify that you actually clicked the link (and didn't have anyone IM you a fake logout link). Stripping away the logic, what it really does is this, which should log you out and redirect you to the index:

php:
<?
// loggingout.php

setcookie('blargh', '', time() - $tenyears, myCookiePath(), myCookieDomain());
header('Location: http://website.com/theindexpage.php');
?>
Once every few times it takes me to the index page without erasing the cookie. I can tell because the index page does stuff that it would only do if you were logged in properly.

What's the deal?

Lamb-Blaster 4000
Sep 20, 2007

I could have sworn I posted this here earlier this evening, but I think I'm crazy, apologies if I'm not crazy.

I need a php class or set of UDFs that makes Markov word chains from a given source body of text. I've found a few that make Markov words from letter chains, and one that just plain doesn't work, but none that do word chains.

The one that seems to be floating around highest on Google is the one that doesn't seem to work at all - this one
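
To be clear, by word chains I mean roughly this (just a quick sketch of the idea, not what I'd actually want to use):

php:
<?
// Sketch of an order-1 word-level Markov chain: map each word to the words
// that follow it, then take a random walk through that map.
function markov_generate($source, $length = 50)
{
    $words = preg_split('/\s+/', trim($source));
    $chain = array();
    for ($i = 0; $i < count($words) - 1; $i++) {
        $chain[$words[$i]][] = $words[$i + 1];
    }

    $word = $words[array_rand($words)];
    $output = array($word);
    for ($i = 0; $i < $length; $i++) {
        if (empty($chain[$word])) break;
        $word = $chain[$word][array_rand($chain[$word])];
        $output[] = $word;
    }
    return implode(' ', $output);
}

echo markov_generate("the quick brown fox jumps over the lazy dog and the fox runs off");
?>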

edit- added wikipedia link
