
Member Since 29 Dec 2005
Last Active Yesterday, 14:26

Topics I've Started

Replacement for .htaccess?

23 December 2016 - 14:40

I am trying to get a website (my own PHP code + canned applications such as osC) to do some Stupid Pet Tricks. I have discovered why .htaccess programming is referred to as "voodoo" even in its own documentation -- it's horribly structured, inconsistent, and works only half the time. Anyway, I'm quite frustrated with .htaccess and seek an alternative for an Apache-based site.


The chief sticking point with .htaccess has been SEF URIs (I want /path/module/key1/val1/key2/val2 to become /path/module.php?key1=val1&key2=val2). This is easy enough for a fixed number of parameters, but none of the suggestions I've tried for the generalized case (a variable number of parameters) works. .htaccess seems to go through only one pass, leaving me with /path/module/key2/val2?key1=val1 or some other wreckage like that. Maybe it's a limitation of my host; maybe I just haven't found the right incantations yet.
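For what it's worth, per-directory (.htaccess) rewrite rules are re-applied after each internal rewrite, so a rule that peels off one key/value pair per pass can handle a variable number of parameters. A sketch of that idea, assuming the modules live at /path/<name>.php (untested against any particular host's configuration):

```apache
RewriteEngine On

# Peel one key/value pair off the front of the fake path and move it
# into the query string (QSA appends to whatever is already there).
# Because .htaccess rules are re-applied after each internal rewrite,
# this repeats until no pairs remain.
RewriteRule ^path/([^/.]+)/([^/]+)/([^/]+)(/.*)?$ path/$1$4?$2=$3 [QSA]

# When only the module name is left, map it to the real script.
RewriteRule ^path/([^/.]+)/?$ path/$1.php [QSA,L]
```

One quirk: because QSA appends the older pairs after the newer ones, the parameters arrive in reverse order (key2=val2&key1=val1), which rarely matters since PHP reads them by name from $_GET.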


I was able to write a nice "redirector" PHP script that uses normal PHP code and $_SERVER variables to pick apart the URL and put it back together the way I want, and then use header("Location: xxxx") to go to the revised URL and status code. It's called from /.htaccess once per page. It works beautifully, except when POST data is involved. The only way I've found to deal with that is to set up a <form> with hidden data fields, and automatically submit it (as POST). It's a bit ugly (the form flashes up for a second or two), but works.

The last problem, which I have been unable to solve, is that something like a CAPTCHA doesn't work. Apparently, the "correct" answer changes between the first and second page calls, so it's never a match. Has anyone gotten around this? I want to avoid altering canned software if at all possible; otherwise I would serialize the POST data and pass it through a $_SESSION variable, or something. 403 and 404 error handling is a bit of a kludge, but that's secondary.
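One thing that may spare the hidden-form trick entirely: HTTP status 307 (Temporary Redirect) requires the browser to repeat the request to the new location with the same method and body, so POST data survives the redirect with no intermediate page. Since nothing re-renders between the original submission and the final handler, this may also cure the CAPTCHA mismatch. A minimal sketch; rewrite_uri() here is a stand-in for whatever URL-rewriting logic the redirector already has:

```php
<?php
// redirector.php -- hypothetical sketch. rewrite_uri() is assumed to
// be your existing logic that turns the requested URI into the real one.
$newUrl = rewrite_uri($_SERVER['REQUEST_URI']);

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // 307: the browser re-sends the POST body to $newUrl itself,
    // so no hidden <form> re-submission page is needed.
    header('Location: ' . $newUrl, true, 307);
} else {
    header('Location: ' . $newUrl, true, 302);
}
exit;
```

The third argument to header() sets the response status code, so no separate http_response_code() call is needed.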


So, can anyone offer pointers on either 1) foolproof cookbook ways to set up .htaccess URI rewriting to handle SEF variable-length false paths, 2) foolproof ways to pass POST data through a single PHP page, or 3) some other .htaccess replacement altogether? I know that in something like WordPress, if a request doesn't match a real file or directory, it gets passed to index.php, where PHP code can pick it apart and internally redirect to the right place. However, I'm trying to avoid making changes to canned software, and would like it to think it's running under normal circumstances.
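For reference, the WordPress-style fallback mentioned above is only a few lines of .htaccess (assuming a front controller at /index.php):

```apache
RewriteEngine On
# If the request doesn't name a real file or directory...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# ...hand it to the front controller, which can inspect REQUEST_URI.
RewriteRule . /index.php [L]
```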


Is there an open source replacement for .htaccess, with a procedural language instead of rewrite rules? .htaccess is associated with Apache -- how do Nginx and the various Windows servers handle these tasks? Do they just emulate .htaccess? I'm on a shared server, so it can't be something that the host needs to install (I'm sure they won't). Thanks much for any leads!
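For comparison, Nginx does not read per-directory files at all: rewriting lives in the server-level configuration, which on shared hosting only the host can edit. The usual counterpart to the front-controller fallback is the try_files directive, roughly like this (paths are illustrative):

```nginx
server {
    listen 80;
    root /var/www/example;   # hypothetical document root

    location / {
        # Serve the file or directory if it exists; otherwise hand
        # the request to the front controller, WordPress-style.
        try_files $uri $uri/ /index.php?$args;
    }
}
```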

Creating a file in PHP?

27 April 2016 - 23:33

I have a general question about servers and files. It is possible in PHP to write out a file on the server, say, temp.php created by index.php. If this file has a fixed name, is this file accessible to every user of this site (which I don't want)? That is, is there only one copy at a time, or does every user (potentially hundreds at any given moment) somehow get their own copy?


I'm toying with the idea of having a PHP file (both HTML and PHP code) written out by index.php and then run (PHP header Location call). If the name of the file is not somehow unique, I fear I will run into problems with conflicts and race conditions. For example, if User A runs index.php to create temp.php, and a fraction of a second later, User B does the same (with slightly different content), might User A end up running User B's temp.php? Or, User A ends up trying to run an empty temp.php, because User B has just truncated temp.php in preparation for rewriting it. Can these things really happen, or am I overthinking it? I saw this very thing happen with Simple Machines Forum's configuration file when it was rewritten on the fly to record error events (yes, a very stupid design).
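On the race-condition worry: with a single fixed filename there is exactly one copy on the server, shared by every user, and the interleavings described above really can happen. The usual defenses are an exclusive lock, or writing to a uniquely named temp file and rename()-ing it into place; rename() within one filesystem is atomic on POSIX systems, so a reader sees either the old file or the new one, never a truncated one. A sketch with hypothetical function names:

```php
<?php
// Atomic replace: build the new content in a uniquely named temp file,
// then rename() it over the shared name. No reader ever sees a
// half-written temp.php.
function write_shared_file(string $target, string $content): void
{
    $tmp = tempnam(dirname($target), 'wip_');  // unique name, same dir
    file_put_contents($tmp, $content, LOCK_EX);
    rename($tmp, $target);                     // atomic swap (POSIX)
}

// Alternative: serialize writers with an advisory lock.
function write_with_lock(string $target, string $content): void
{
    $fp = fopen($target, 'c');   // create if missing, don't truncate yet
    flock($fp, LOCK_EX);         // blocks until no other writer holds it
    ftruncate($fp, 0);
    fwrite($fp, $content);
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
}
```

Note that this only protects readers from torn writes; if different users need different *content*, a single shared filename can't work at all, and per-user files (or no intermediate file) are the answer.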


I suppose there's an alternative in that I can make a really ugly and complex index.php that ends up outputting only HTML (directly to the browser), without an intermediate PHP file. I guess this is sort of what a Content Management System like WordPress, Joomla, or Drupal does, but I really don't want to reinvent such a thing. Do osC's "hooks" work anything like this? This is something that is probably too much to put in a cookie or HTML local storage or in a database. It would be so nice to just write a PHP file and be able to run it (letting the server take care of code execution), if I can be sure that multiple users won't step on each other.


PHP gives the ability to write a file, and PHP normally runs on a server (with multiple active users at any one time). Can it really keep everyone's same-named files separate, or was it never intended to? I can't think of how it would do that. I could also give each new file a unique name (or a unique directory with mkdir), but that would be ugly looking, and there is the danger of cluttering up the server with thousands of no-longer-needed temporary files, which would need to be cleaned out regularly once they've been run. Someone must have solved this problem before! Suggestions or hints welcome.
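If per-user files do turn out to be the way to go, tempnam() both creates the file and guarantees the name is unique, so two simultaneous users can never collide, and cleanup can be as simple as sweeping anything older than a cutoff on each request (or from cron). A sketch with assumed directory and prefix names:

```php
<?php
// Hypothetical layout: per-user scratch files live in tmp/ under the
// site root and are deleted once they are older than an hour.
$dir = __DIR__ . '/tmp';

// tempnam() creates an empty file with a guaranteed-unique name.
$mine = tempnam($dir, 'u');
file_put_contents($mine, "<?php /* per-user content here */\n");

// Opportunistic cleanup: remove stale files on the way through.
foreach (glob($dir . '/u*') as $old) {
    if (filemtime($old) < time() - 3600) {
        @unlink($old);
    }
}
```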