Prevent race condition in PHP

Looking for some PHP tips here. I’ve got a click counter on the no ai webring page which writes a +1 to a JSON file every time someone clicks a next, previous or random link. I’m using flock() to stop simultaneous clicks from causing a race condition and corrupting the file, but it keeps happening anyway. I haven’t dug into it much, since the standard solution to this kind of thing is flock(), which I’m already using. I’ve moved the flock() locks and unlocks around a bit to see if it helps, but I keep getting race conditions. At this point I have an hourly backup cronjob set up just so I can restore when the code inevitably screws up, sometimes not for a week, sometimes three times in a day.

Here’s the code. Does anyone have any ideas about how to fix this, or how I’m doing it completely wrong, or anything?

<?php
if ((isset($_GET['prv']) || isset($_GET['nxt']) || isset($_GET['rnd'])) && (!preg_match('/bot|crawl|@|slurp|spider|google|archive|search|http/i', filter_var($_SERVER['HTTP_USER_AGENT'], FILTER_SANITIZE_STRING)))) {
    $logdate = date('Y-m-d', time());
    flock('log.json', LOCK_EX);
    $logdata = file_get_contents('log.json');
    $logdata = json_decode($logdata, true);
    $oldnum = $logdata[$logdate];
    $logdata[$logdate] = $oldnum + 1;
    $logdata = json_encode($logdata, JSON_PRETTY_PRINT);
    file_put_contents('log.json', $logdata);
    flock('log.json', LOCK_UN);
}
?>

flock() takes an open file resource, not a filename: PHP: flock - Manual
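For reference, a minimal sketch of the pattern the manual describes, assuming PHP 7+: lock the resource you got back from fopen(), and do all reading and writing through that same handle while you hold it. The log.json name and date key are from the post above; the bump_counter() helper is just for illustration.

```php
<?php
// Sketch only: flock() wants the resource returned by fopen(), not a
// filename. 'c+' opens for read/write, creates the file if it's missing,
// and does NOT truncate it, so the lock is taken before any data is lost.
function bump_counter(string $file, string $date): void
{
    $fp = fopen($file, 'c+');
    if ($fp === false) {
        return;
    }
    if (flock($fp, LOCK_EX)) {
        // Read through the locked handle (pointer starts at the beginning).
        $data = json_decode(stream_get_contents($fp), true) ?: [];
        $data[$date] = ($data[$date] ?? 0) + 1; // no notice on a new day
        // Rewrite in place through the same handle.
        ftruncate($fp, 0);
        rewind($fp);
        fwrite($fp, json_encode($data, JSON_PRETTY_PRINT));
        fflush($fp);
        flock($fp, LOCK_UN); // release before closing
    }
    fclose($fp);
}
```

Calling something like `bump_counter('log.json', date('Y-m-d'))` from the click handler replaces the whole read/modify/write dance, and no second process can get between the read and the write.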


Crap, is that what I’m doing wrong? Doing a little rewrite now, seeing what happens. Goes to show you just need someone else to look at your code occasionally. Thanks.


Also… writing to files is expensive compared to writing to memory. A long-running process would keep the count in memory and periodically flush it to disk. Since PHP is request-based, you should offload that to a database if you can.
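If a full database feels heavy for a webring counter, SQLite gets you most of the way there: the engine serializes writers itself, and an upsert makes the increment atomic, so there's no flock() to get wrong. A sketch assuming the pdo_sqlite extension and SQLite 3.24+ (for the ON CONFLICT syntax); the clicks.sqlite filename and the table/column names are mine, not from the thread.

```php
<?php
// Sketch assuming PDO with the sqlite driver. The database file name and
// schema are illustrative. SQLite handles locking internally, so the
// increment below is atomic even under concurrent requests.
function bump_click(PDO $db, string $day): void
{
    $db->exec('CREATE TABLE IF NOT EXISTS clicks (
        day  TEXT PRIMARY KEY,
        hits INTEGER NOT NULL DEFAULT 0
    )');
    // Insert a new row for the day, or atomically add 1 if it exists.
    $stmt = $db->prepare(
        'INSERT INTO clicks (day, hits) VALUES (:day, 1)
         ON CONFLICT(day) DO UPDATE SET hits = hits + 1'
    );
    $stmt->execute([':day' => $day]);
}
```

Usage would be something like `bump_click(new PDO('sqlite:clicks.sqlite'), date('Y-m-d'))` in the click handler; exporting back to JSON for the page is a simple SELECT.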


Will look into that, it makes sense. Time to Struggle With Code!

It’s been three days now without a hiccup, I think I’ve got it. Still doing the backups for the moment, I’ve been hurt before.

<?php
if ((isset($_GET['prv']) || isset($_GET['nxt']) || isset($_GET['rnd'])) && (!preg_match('/bot|crawl|@|slurp|spider|google|archive|search|http/i', filter_var($_SERVER['HTTP_USER_AGENT'], FILTER_SANITIZE_STRING)))) {
    $logdate = date('Y-m-d', time());
    $logfile = 'log.json';
    // Open for read/write, create if missing, don't truncate.
    $fp = fopen($logfile, 'c+');
    if (flock($fp, LOCK_EX)) { // lock the handle, not the filename
        $logdata = json_decode(file_get_contents($logfile), true);
        $logdata[$logdate]++;
        file_put_contents($logfile, json_encode($logdata, JSON_PRETTY_PRINT));
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}
?>