PHP Membership Provider

I came across a post made a little while ago by Steven Benner on PHP’s notable lack of a consistent Membership Provider as in ASP.Net. All good points, but I think the hard bit isn’t so much session management since there are a million examples out there on how to provide a login interface.

The tough part seems to be a consistent method of accessing, creating and maintaining users in a database. Also a lot of those examples tend to be MySQL specific so if you need something easily customized for something else, like say PostgreSQL, you’re pretty much stuck customizing to no end, at which point you might as well write your own.

And that seems to be what an overwhelming majority of people are doing if they’re not getting an off-the-shelf CMS to start with. Of the few frameworks for membership management that are available out there, most are a bit over complicated and have licenses, fees and other strings attached.

So most people are stuck writing a user management system from scratch or getting a CMS off the shelf with its own unique (and usually incompatible) user management system.

I got home from work early on Friday and have this weekend off for a pretty good rest, so early this morning, I thought I’d write up a Membership Provider for PHP that anyone can use with no strings attached (and hopefully doesn’t suck too much). The idea is to have a simple drop-in user management interface.

Keep in mind that I haven’t actually tested any of this and it all may in fact blow up spectacularly (since I didn’t do any testing, everything is as-is and I’m not a professional PHP programmer). It’s a bit late for Steven’s post, but better late than never eh? ;)


I didn’t want to mimic everything in the Membership Provider for ASP.Net, since there are a lot of things about it that I don’t like, some features that could be improved and others that are completely missing.

Notably, I don’t see the need for password questions and answers, since resetting by email is often a better solution than lowering the security barrier by introducing (essentially) a weaker password in the form of an answer to a known question. People often forget the answers to these questions, or repeat the same questions and answers on different sites, which defeats the purpose of a strong password anyway.

I’d also like to have a way to find and delete users by either Id, Name or Email and easily lock/unlock, approve/unapprove users and provide strong passwords with Blowfish. For this last option, I downloaded the Portable PHP password hashing framework, which I think is the best implementation for PHP I’ve found so far.
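For the curious, here is a minimal sketch of what Blowfish hashing boils down to, using PHP’s own crypt() — this is the mechanism the portable framework wraps when CRYPT_BLOWFISH is available. The function names are my own, purely for illustration:

```php
<?php
// Sketch of Blowfish (bcrypt) hashing via crypt(). The salt must be 22
// characters from [./0-9A-Za-z], prefixed with '$2a$' and a two-digit cost.

function blowfishHash( $password, $cost = 8 ) {
	$chars	= './ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
	$salt	= '';

	for ( $i = 0; $i < 22; $i++ )
		$salt .= $chars[ mt_rand( 0, 63 ) ];

	return crypt( $password, sprintf( '$2a$%02d$', $cost ) . $salt );
}

function checkPassword( $password, $stored ) {
	// crypt() reuses the salt embedded in the stored hash
	return crypt( $password, $stored ) === $stored;
}

$hash = blowfishHash( 'correct horse' );
var_dump( checkPassword( 'correct horse', $hash ) );	// bool(true)
var_dump( checkPassword( 'wrong', $hash ) );		// bool(false)
```

The cost parameter controls how slow (and therefore how brute-force resistant) the hash is; 8 is a reasonable starting point.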

Now I need a table…

Although this example table is in MySQL, you could modify it for any other database. In the queries, I avoided MySQL-specific commands like sql_calc_found_rows, and because I’m using PDO, the code itself should work with no modifications on PostgreSQL and SQLite, provided the table structure is the same.

CREATE TABLE `users` (
  `userId` int(20) unsigned NOT NULL AUTO_INCREMENT,
  `username` varchar(50) NOT NULL,
  `hash` varchar(10) NOT NULL,
  `password` varchar(200) NOT NULL,
  `passwordSalt` varchar(100) NOT NULL,
  `email` varchar(100) NOT NULL,
  `avatar` varchar(200) DEFAULT NULL,
  `createdDate` datetime NOT NULL,
  `modifiedDate` datetime NOT NULL,
  `lastActivity` datetime NOT NULL,
  `bio` text DEFAULT NULL,
  `isApproved` tinyint(1) NOT NULL DEFAULT '1',
  `isLocked` tinyint(1) NOT NULL DEFAULT '0',

  PRIMARY KEY (`userId`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

Pretty typical for most websites. The “hash” there is just a unique key made by combining the username and email in a hash that will set apart users even with similar looking usernames. Think of it as a very primitive Tripcode shown alongside a username.
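As a sketch of how the table and PDO would come together (the function name and interface here are my own, not a finished design), creating a user is one parameterized INSERT, with the tripcode-style hash generated from the username and email. Because it sticks to named parameters and plain SQL, the same code should run against MySQL, PostgreSQL or SQLite:

```php
<?php
// Hypothetical helper: inserts a user row, generating the "hash" column
// from username + email. Assumes $db is a connected PDO instance and the
// users table above exists.

function createUser( PDO $db, $username, $email, $passwordHash, $salt ) {
	// 10-character tripcode-style identifier (fits the varchar(10) column)
	$trip = substr( hash( 'sha256', $username . $email ), 0, 10 );

	$sql = "INSERT INTO users
		( username, hash, password, passwordSalt, email,
			createdDate, modifiedDate, lastActivity )
		VALUES ( :username, :hash, :password, :salt, :email,
			:created, :modified, :activity )";

	$now  = date( 'Y-m-d H:i:s' );
	$stmt = $db->prepare( $sql );

	return $stmt->execute( array(
		':username'	=> $username,
		':hash'		=> $trip,
		':password'	=> $passwordHash,
		':salt'		=> $salt,
		':email'	=> $email,
		':created'	=> $now,
		':modified'	=> $now,
		':activity'	=> $now
	) );
}
```

Finding or deleting by Id, Name or Email would follow the same pattern: one prepared statement each, with the column swapped.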


If there’s one thing that makes configuring an ASP.Net web page consistent, it’s the web.config file. It’s only appropriate then to put all the configuration stuff into one file, say a .ini, like so :

;*** Website configuration file ***
; All comments start with a semicolon (;), and this file can be something other than config.ini
; as long as the new file name is specified in index.php .

;*** Globally accessible settings ***

dsn = "mysql:host=localhost;dbname=testdb"
username = "testuser"
password = "BKjSheQubKEufqHC"

usersTable = "users"
rolesTable = "roles"
tablePrefix = ""

;*** Membership specific settings ***

minRequiredPasswordLength = 5
maxInvalidPasswordAttempts = 3
passwordAttemptWindow = 10
userIsOnlineTimeWindow = 20
autoApproveUsers = "true"

Of course, for this example, we won’t be using all of that, but I just created a file called config.ini and put all that in there. Now how do we load all of this? First we need an index.php file.


<?php

ini_set( "display_errors", true );

// Set the application configuration file. You may store this outside the web root
// if your web host allows it.
define('INI', 'config.ini');


// *** End editing ***


define('BASE_PATH', dirname(__FILE__));

// Platform specific include path separator
if ( strtolower(substr(PHP_OS, 0, 3)) === 'win' )
	define('SEP', ';');
else
	define('SEP', ':');

// There must be an /app folder in root with all the class files
define('CLASS_DIR', BASE_PATH . '/app');

// Optionally there should also be a /mod folder in root where you can include modules/plugins in the future
define('MODULE_DIR', BASE_PATH . '/mod');

// Modify include path
set_include_path( CLASS_DIR . SEP . MODULE_DIR );



The downside to this approach is that all your file names must be lowercase, but it should be slightly faster than most other autoload implementations, and I’m reasonably confident it should run on a *nix platform with no hiccups as well as on Windows.
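The autoload itself is tiny; a minimal version matching that description (lowercase file names, resolved through the include path set earlier) might look something like this:

```php
<?php
// Minimal autoloader sketch -- class files are expected to be named after
// the lowercase class name somewhere on the include path,
// e.g. class Config lives in app/config.php
spl_autoload_register( function( $class ) {
	require( strtolower( $class ) . '.php' );
} );
```

Since the file name is derived directly from the class name with no mapping table, there is almost nothing for the autoloader itself to do.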

For now, that autoload would force the application to look in the /app folder where all the magic happens and optionally in /mod for any future modules. And for that magic we’ll create a base class that all “action” classes would inherit from. This is basically to provide dynamic class properties.


/**
 * Base class provides dynamic class property functionality
 * @package Base
 */
class Base {

	/** @var array Config properties storage */
	protected $props = array();

	/** Get class accessible settings (magic methods must be public) */
	public function __get( $key ) {
		return array_key_exists( $key, $this->props ) ? 
			$this->props[$key] : null;
	}

	/** Set class accessible settings */
	public function __set( $key, $value ) {
		$this->props[$key] = $value;
	}

	/** Checks whether accessible setting is available */
	public function __isset( $key ) {
		return isset( $this->props[$key] );
	}
}
Now we need a configuration class that will load the settings from the defined .ini file into itself and make the settings accessible to all classes calling it.


/**
 * Application configuration class.
 * All settings are imported from the .ini file defined in index.php
 * @package Config
 */
class Config extends Base {

	/** @var Config Class instance */
	private static $instance;

	private function __construct() {
		$this->props = parse_ini_file(INI, true);
	}

	public static function getInstance() {
		if ( !isset( self::$instance ) )
			self::$instance = new Config();

		return self::$instance;
	}
}
This is starting to get a bit long, so onward to the next page…


Font reference revamp

A while ago, I created a single small HTML page with a bunch of easy to read fonts for web publishing. This afternoon, I got some free time so I did a rewrite of the whole thing as a web application.

Font reference

The biggest problem with the old version is that every font was displayed all on one page, which would have made expanding the list more cumbersome. Also, a lot of people had started to hotlink to the old font images (not just these, but a lot of other images on my server, which was starting to affect my bandwidth), so I’ve instead added permalinks to each font and the letter index as an alternative.

There’s no server-side scripting and the whole thing is basically JavaScript courtesy of jQuery. You can take a look at the old version at the end of this post.

Source code of “fref.js” (please be advised that I may change the actual running version at any time and this is only current as of this post).

$(function() {
	var pIndex, pFont, fCanvas;
	var ta = [];


	function Init() {

		fCanvas = $("#fonts");

		// Titles list
		var t = $('<ul />');

		// Unique index
		for(var i = 0; i < fonts.length; i++) {
			var f = fonts[i];
			var ni = true;

			for(var j = 0; j < ta.length; j++) {
				if(ta[j] == f[0])
					ni = false;
			}

			if(ni)
				ta[ta.length] = f[0];
		}

		for(var i = 0; i < ta.length; i++) {
			t.append('<li><a href="#'+ ta[i] +'">'+ ta[i] +'</a></li>');
			var s = $('<div id="'+ ta[i] +'" />');
			var u = $('<ul />');

			for(var j = 0; j < fonts.length; j++) {
				if(fonts[j][0] == ta[i]) {
					var f = fonts[j];
					u.append(felement(f));
				}
			}

			s.append('<h2>“'+ ta[i] +'” Fonts '+
				'<span><a href="?i='+
				ta[i] +'">Permalink</a></span></h2>');
			s.append(u);
			fCanvas.append(s);
		}

		fCanvas.prepend(t);
		searchValidate();

		fCanvas.tabs({
			create: function(e, ui) {
				findFont(pIndex, pFont);
			}
		});
	}

	function searchValidate() {
		if(window.location.href.indexOf("?i=") > -1) {

			var pn = $.trim(window.location.href)
					.replace(/\s/g, "");

			var s = pn.indexOf("?i=") + 3;
			var e = pn.indexOf("&f=");

			if(e > s)
				pIndex = pn.substr(s, e - s);
			else
				pIndex = pn.substr(s);

			pIndex = (!pIndex)? "0-9" : pIndex;

			if((e > 0) && (e + 3 < pn.length))
				pFont = pn.substr(e + 3, pn.length);
		}

		if(ta.indexOf(pIndex) < 0)
			pIndex = ta[0];
	}

	function findFont(i, f) {
		fCanvas.tabs("option", "selected", ta.indexOf(i));
		var fnt = $('#' + f);

		if(fnt.length > 0) {
			$('html,body').animate({ scrollTop: fnt.offset().top - 10 }, 1000);
			fnt.effect("highlight", {}, 5000);
		}
	}

	function felement(f) {
		var a = f[2].replace(".png", ""); // Image without .png extension becomes font ID
		var st = '<li id="'+ a +'">' +
			'<ul><li>'+ f[1] +'</li>';

		// Additional attributes
		for(var i = 0; i < f[3].length; i++)
			st += '<li>'+ f[3][i] +'</li>';

		// Permalink
		st += '<li><a href="?i='+ f[0] +'&f='+ a +'">Permalink</a></li></ul>' +
			'<img src="fonts/'+ f[2] +'" alt="'+ a +'" /><hr /></li>';

		return st;
	}

	Init();
});
The data is stored in a single “data.js” file as an array. I may change this to JSON in the future.
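If that switch to JSON happens, the same rows would likely become objects. Here is a sketch of the shape (the field names are my own invention, not anything the live file uses) along with a converter from the current tuple format:

```javascript
// Hypothetical JSON-style shape for the same data, plus a converter from
// the current [index, title, image, attrs] tuples.
function toObjects(fonts) {
	return fonts.map(function(f) {
		return {
			index: f[0],	// letter index, e.g. "A"
			title: f[1],	// display name
			image: f[2],	// image file in /fonts
			attrs: f[3]	// description lines
		};
	});
}

var sample = [
	["A", "Arial", "arial.png", ["20pt", "Sans-serif"]]
];

console.log(toObjects(sample)[0].title);	// Arial
```

The named fields would make felement() and the index-building loop a little easier to read, at the cost of a slightly larger data file.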

// Convention : index, title, font image, description array (new item per line)
var fonts = [
	["A", "Am Sans light", "am-sans-light.png", ["20pt", "Sans-serif"]],
	["A", "Andalé Mono", "andale-mono.png", ["20pt", "Monospace", "Sans-serif"]],
	["A", "Arial", "arial.png", ["20pt", "Sans-serif"]],
	["A", "Avenir - Book <strong>Commercial</strong>", "avenir-book.png", ["20pt", "Sans-serif"]],
	["B", "Bitstream Vera Sans (Roman)", "bitstream-vera-sans-roman.png", ["20pt", "Roman", "Sans-serif"]],
	["B", "Bitstream Vera Sans Mono (Roman)", "bitstream-vera-sans-m-roman.png", ["20pt", "Roman", "Monospace", "Sans-serif"]],
	["B", "Book Antiqua", "book-antiqua.png", ["20pt"]],
	["B", "Bank Gothic Lt BT", "bank-gothic-light-bt.png", ["20pt", "Small caps", "Sans-serif"]],
	["C", "Calibri", "calibri.png", ["20pt", "Sans-serif"]],
	["C", "Candara", "candara.png", ["20pt", "Sans-serif"]],
	["C", "Century Gothic", "century-gothic.png", ["20pt", "Modern, Geometric, Bold", "Sans-serif"]],
	["C", "Consolas", "consolas.png", ["20pt", "Sans-serif"]],
	["C", "Courier Std", "courier-std.png", ["20pt", "Monospace"]],
	["D", "Dutch801 Rm BT <strong>Commercial</strong>", "dutch801-rm-bt.png", ["20pt"]],
	["E", "Eurostile", "consolas.png", ["20pt", "Modern", "Sans-serif"]],
	["F", "Florencesans", "florencesans.png", ["20pt", "Sans-serif"]],
	["F", "Franklin Gothic Book", "franklin-gothic-book.png", ["20pt", "Modern, Geometric", "Sans-serif"]],
	["G", "Georgia", "georgia.png", ["20pt"]],
	["G", "Gill Sans MT", "gill-sans-mt.png", ["20pt",  "Sans-serif"]],
	["G", "Gotham Light <strong>Commercial</strong>", "gotham-light.png", ["20pt", "Mdoern, Geometric", "Light", "Sans-serif"]],
	["H", "Humanist521 BT (Roman) <strong>Commercial</strong>", "humanist521-bt-roman.png", ["20pt", "Roman", "Sans-serif"]],
	["L", "Lucida Sans", "lucida-sans.png", ["20pt", "Sans-serif"]],
	["L", "Lucida Unicode", "lucida-unicode.png", ["20pt", "Unicode", "Sans-serif"]],
	["M", "Microsoft Sans Serif <strong>*</strong>", "microsoft-sans-serif.png", ["<strong>* Not the same as MS Sans Serif</strong>", "20pt", "Sans-Serif"]],
	["M", "Monaco", "monaco.png", ["20pt", "Modern, Condensed", "Sans-serif"]],
	["M", "MS Sans Serif <strong>*</strong>", "ms-sans-serif.png", ["<strong>* Not the same as Microsoft Sans Serif</strong>", "20pt", "Sans-serif"]],
	["M", "Myriad", "myriad.png", ["20pt", "Sans-serif"]],
	["T", "Tahoma", "tahoma.png", ["20pt", "Sans-serif"]],
	["T", "Trebuchet MS", "trebuchet-ms.png", ["20pt", "Sans-serif"]],
	["U", "Univers <strong>Commercial</strong>", "univers.png", ["20pt", "Sans-serif"]],
	["V", "Verdana", "verdana.png", ["20pt", "Sans-serif"]]

It still leaves a lot of work to be done; most obviously, the reference phrase “The quick brown fox jumps over the dirty dog” is wrong. It should be “The quick brown fox jumps over the lazy dog”. Also, the list is pretty small and limited to a handful of fonts I’ve used. I’m planning to add a full collection of possibly every font I’ve used in the past (perhaps even the stylish and not-so-easy-to-read variety).

Old font reference index

I’m thinking of writing a very basic discussion board in ASP.Net MVC and I may use the new design in that. The CSS is very minimal and most of the extra jQuery UI styles are from the customized Smoothness theme.

Dev Team : Need more monkeys

Sorry we’re out. Will trap recruit more at next Developer Conference.

– Human Resources

And they think I’m grouchy on weekends for no reason. Let’s bring in some perspective, shall we?

Needed :

  • User management
  • Article management
    • Comment management
  • Discussion board
    • Forum management
    • Topic/Post management
  • Document management
    • Granular privileges/Edit restrictions

Most mortals will look at this list and think “Portal”. Well, that’s essentially what it is with some minor tweaks. It was for an intranet and the deadline is a week.

Are we using something off the shelf? Nope.
Are we using something appropriate? Nope.

Are we going to waste time/money/people (literally) trying to build something from scratch that will try to do 10x more than that list?
You betcha!

Two programmers alone could have finished all of this in a day with time to spare for meals and coffee, but since we’re a “team”, I have to put up with the most inane BS ever to come out of an orifice… on a face. I feel lucky though; since I’m only handling data access, I won’t be dealing with the brunt of the feature flood, though I’m sure I’ll have to write some OR/Mapper because “off-the-shelf” is “untrustworthy”. Yay me!

I swear, the vast majority of projects that fail are directly the result of bosses and project managers not knowing what the devil it is they are doing. I propose mandatory Ritalin® prescriptions for these people so we don’t get interrupted every hour by “Ooh shiny new feature. We must implement it!!”

I’ve been trying to convince people to stay focused for most of this year with no effect.

The next time you’re given a list of requirements and a programmer and/or project manager stands up to insist on a new feature before the core is complete, put him in a straitjacket and throw him out the window.

I will guarantee the liability and injury costs will be nothing compared to how much money will be saved by not having him aboard.


Eksith Rodrigo isn’t actually a licensed lawyer, but he does know throwing someone out of a window in a straitjacket might constitute grounds for a lawsuit.

Reader discretion is advised.

Three praises of cloud computing

This was, once again, a long winded comment that I decided to turn into a post. I really need to work on my verbal diarrhea.

In response to Nicholas Sinlock‘s Three critiques of cloud computing post yesterday, here is the opposite point of view. (Please check out the rest of his posts. They’re well worth your time.)

First, let me address some of your critiques…

I can certainly understand your reservations as well as Stallman’s. While the criticisms are valid, they also side-step a bit of our own history re: computing. That is to say all computing endeavours started out with abysmal reliability, questionable control, and imaginary security.

These are not excuses for lackluster performance today, but by relative comparison with the platforms of concern, specifically Gmail as in Stallman’s example, they actually have a head start. However, since this is merely emerging technology, these concerns may yet be addressed to a degree computing experts will be satisfied with. Give it a couple of years ;)

Addressing Gmail :
I don’t use a web browser to access my accounts. Considering I have a dozen or so email addresses and I’ve subscribed to dozens more newsgroups, I use Mozilla Thunderbird, an open source mail client, instead.

And specifically addressing Google’s ownership of Gmail :
Hey, they’re no worse than Hotmail, Yahoo! or Hushmail. It’s rather unrealistic and quite silly to expect everyone to run their own mail daemon if that was Stallman’s expectation. In this case many of us have no choice but to use a hosted service anyway. Besides, privacy is always relative on the Net when it comes to email.

The trap that wasn’t

I don’t view this as just more proprietary creep (a la feature creep) as Stallman does. The platforms offered today, as is the case with many icons in prolific use, are only the tip of the iceberg.

We have alternatives to proprietary software in quantities (and qualities) never dreamed of before. In fact there are several that are only available as OSS. As far as idea bins go, CC software is right up there with the Operating System in terms of developer magnetism.

I can see OSS alternatives to today’s proprietary CC competitors in as soon as 3-5 years. In the same spirit as Wikipedia and the Web Archive, it is only a matter of time before free alternatives are brought to light, sponsored by good samaritans, if not idealistic ones. Where slogans like “knowledge should be free” reign all over these projects, it isn’t too much of a stretch to see even the likes of ibiblio getting into this.

I see the proprietary and paid CC providers as a major motivator for the creation of free alternatives and may even bring about the end of vendor lock-in, in regard to desktop publishing and office software.

This isn’t 1997

… When browsing the Internet was a matter of clicking on one of those icons found on the desktop of a Windows PC that came bundled with Prodigy, Earthlink, AOL, MSN etc…

Considering the nosedive in bandwidth prices these days for hosts as well as the proliferation of higher speed networks for homes, access and reliability are quite a bit higher than they were back when I was in high school.

We’re already depending on machines run by someone else in day-to-day life. So much so that we barely notice. Every time we book a flight, or buy something online, or check into a hospital, or do something as mundane as check our email, we’re relying on systems run by someone else.

The reason why this magnificent ballet works isn’t because it’s perfect, but because it’s continually being perfected. In other words, since the first Internet node came online, we’ve done nothing but perfect the system. It’s the procedure and path of perfection that makes it all work, not “Perfection” as a destination.

Instead of driving across the country, we’re booking flights. We’re taking the chores related to working out of our hands so we can actually focus on the work. E.g. looking for service stations, checking tyres, checking oil, checking maps etc. are all part of going to work if you live some distance from it. But by booking the flight, we can focus on the actual meeting.

All I’m doing in the cloud is just handing over the chores of work so I can focus on the work itself. The question, then, isn’t “do I hand over control?” It’s “do I need to control this aspect of it?”

Considering :
CC won’t be limited to the current providers.
And, not all CC providers will be run by incompetent nitwits.
And, not all CC servers will be located in the U.S. where court orders are optional for violations of privacy…

…I think it’s a reasonable choice given the alternatives. In fact, dare I say, it may be a safer alternative than storing all your content at home where it’s, potentially, easily available to anyone entering without your permission.

Redundancy is cheaper in bulk

Hardware and software failures are just a part of life for those of us who spend so much time on our computers. However, if I can have the option of datacenter calibre redundancy and storage capacity, I would certainly go for that.

Though the cost of computing hardware is always falling, it never seems to fall fast enough to keep up with my software. And I’m not even talking about games. OpenOffice, Gimp and Audacity in particular seem to balk at my 2 GHz, 2.25 GB RAM Aspire 9200. Of course, I was listening to music, surfing and had all of those applications open at the same time and had Photoshop running for some reason… But that’s beside the point. *cough*

If I could get a RAID 5 array and 7 – 10 drives, I certainly would.
If I could get failover in that array without lifting a finger, I certainly would.
If I could get all those drives in 1.5 TB increments, I certainly would.
If I could get unlimited replacements for them, I certainly would.
…But I can’t afford it.

But if I’m buying software and hardware for a company, then I certainly would be looking into a couple of RAID arrays. Maybe not in those exact specs, but it’s something to look into. I would probably look into two arrays in fact. One for storage, and another for applications.

As I mentioned above, I think it’s only a matter of time before open source alternatives are available for CC platforms. And I think one of the biggest sellers is in the office and school arena where there are potentially dozens to hundreds, if not thousands, of computers all using the same set of applications.

I think CC software will find quite a nice market in the office and in school. Not to run over the Internet, but over the local network. I hope there are developers already implementing this as it seems like the next logical step to bulky, locally installed, software.

Imagine, if you can install a CC bundle on a local server for easy access… Considering the majority of processing takes place at the terminal and only save/backup operations take place at the server, you’re actually reducing the vulnerability by splitting the work between your terminal and the server.

If for some reason your terminal were to fail, and you had your work saved locally, it’s possible that some to most of it will never be recovered. After all, how many offices or schools do you know where each and every computer saves to a raid array?

But if you had a central server for applications and another server for saving, then should either one fail, your options are far more varied. Your data will still be saved in the array and, failing that, in the backups. It’s a lot easier to back up one or two servers than an entire collection of computers. It’s also a lot easier to restore one or two servers than to restore an entire collection.

CC, as far as I’m concerned, is the next logical step after the diskless client. There are already implementations of “installed” software running on the server, but if the option of running the software locally via a browser is available, then I say go for it!

CC software has the potential to reduce costs by a huge margin. If only because we won’t have to rely so much on local storage for our files. Portability will also skyrocket, as a lot of energy and space is reserved for storage on mobile devices such as laptops. And with the advent of free alternatives, I’m sure, it will be well worth the wait.

Symfony : The ASP.Net of PHP

… Well, not quite, but I’m sure most ASP.Net programmers would refer to it in this fashion. If you’re coming from another web application framework, then this is well worth looking into.

This project came into focus for us when we had a client with a preexisting web application in need of a drastic rewrite. The catch was the time. We literally had 3 days!!

Well, consider that (as I’ve mentioned before) I’m not the greatest PHP programmer out there. In fact, before this, my most extensive work with PHP was a discussion board and FAQ comment system tied to a CRM. But, as they say in England, you’ve got to keep a stiff upper lip in these types of situations.

I would highly recommend downloading Uniform Server if you’re on a Windows machine. It’s, by far, the easiest method of creating a complete Apache/PHP/MySQL platform painlessly for Windows systems. Also of note is the nice admin panel, which, though rather sparse, gets the job done and is easy to use.

The first step is pretty basic. Get your server running and then navigate to W: in a command prompt (This is where the server will mount when running).

Note : The commands to type are in bold white.
w:\>cd \usr\local\php

Now we’re going to use Pear to access the Symfony repository.
w:\usr\local\php>pear channel-discover pear.symfony-project.com

If your system for some reason cannot identify PHP (some users noted this problem), then use the full path of PHP.
w:\usr\local\php>w:\usr\local\php\pear channel-discover pear.symfony-project.com

Once that’s done, we’re going to install it.
W:\usr\local\php>php -r "readfile('http://pear.php.net/go-pear');" > go-pear

Again, if it doesn’t recognize this, use the full path as in the above example.
W:\usr\local\php>w:\usr\local\php\php -r "readfile('http://pear.php.net/go-pear');" > go-pear

Now you will be prompted with a few installation options.

Note: All these are without quotes…

For : “If you wish to abort, press Control-C now, or press Enter to continue: ”
Press : “Enter”

For : “HTTP proxy (http://user:password@proxy.myhost.com:port), or Enter for none:”
Press : “Enter”

For : “1-8, ‘all’ or Enter to continue:” (This is regarding the executable directory)
Press : “Enter”

For : “Would you like to install these as well? [Y/n] :”
Type : “y”

For : “Would you like to alter php.ini ? [Y/n] :”
Type : “y”

Now as of this post, the version you have just installed is the 1.1.x branch. It’s the stable branch and works well.
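One step the prompts above don’t cover is pulling the framework itself from the channel registered earlier. Assuming the pear.symfony-project.com channel from before, that should just be (the version pulled will be whatever the current 1.1.x release is):

```shell
pear install symfony/symfony

# Confirm the install; this should report a 1.1.x version
symfony -V
```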

My suggestion afterwards is to visit the cookbook so you can quickly get in touch with the basics. Also take a peek at the video tutorial on the Admin Generator.  It isn’t very difficult, but be thorough when you go through this. Missed steps or oversights can cause some issues down the road, so double-check your work.

Aside from the command line fiddling and a few manual edits, Symfony can get a great deal accomplished in a very short period of time.

Using the Admin Generator, here is my experimental (and very minimal) MySQL database generated by the tool :

# This is a fix for InnoDB in MySQL >= 4.1.x
# It "suspends judgement" for fkey relationships until all tables are set.
SET FOREIGN_KEY_CHECKS = 0;

#-- articles

DROP TABLE IF EXISTS `articles`;

CREATE TABLE `articles`
(
`id` INTEGER NOT NULL AUTO_INCREMENT,
`title` VARCHAR(255) NOT NULL,
`description` VARCHAR(255),
`content` TEXT,
`abstract` TEXT,
`author_id` INTEGER NOT NULL,
`category_id` INTEGER NOT NULL,
`is_published` INTEGER,
`date_pub` DATETIME,
`date_created` DATETIME,
PRIMARY KEY (`id`),
INDEX `articles_FI_1` (`author_id`),
CONSTRAINT `articles_FK_1`
FOREIGN KEY (`author_id`)
REFERENCES `users` (`id`),
INDEX `articles_FI_2` (`category_id`),
CONSTRAINT `articles_FK_2`
FOREIGN KEY (`category_id`)
REFERENCES `categories` (`id`)
) ENGINE=InnoDB;

#-- users

DROP TABLE IF EXISTS `users`;

CREATE TABLE `users`
(
`id` INTEGER NOT NULL AUTO_INCREMENT,
`username` VARCHAR(50) NOT NULL,
`password` VARCHAR(50) NOT NULL,
`display_name` VARCHAR(255),
`first_name` VARCHAR(255),
`last_name` VARCHAR(255),
`email` VARCHAR(100) NOT NULL,
`avatar` VARCHAR(255),
`bio` TEXT,
PRIMARY KEY (`id`)
) ENGINE=InnoDB;

#-- categories

DROP TABLE IF EXISTS `categories`;

CREATE TABLE `categories`
(
`id` INTEGER NOT NULL AUTO_INCREMENT,
`title` VARCHAR(255) NOT NULL,
`abstract` TEXT,
`photo` VARCHAR(255),
`is_private` INTEGER,
PRIMARY KEY (`id`)
) ENGINE=InnoDB;

# This restores the fkey checks, after having unset them earlier
SET FOREIGN_KEY_CHECKS = 1;
Not the best example out there, but you can see what I was aiming for.

The great thing about this is that you can very quickly create the basic functionality of any web application, from CMS to CRM to blog, with plenty of flexibility and room to fix errors.
Highly recommended if this is the first time you’re dealing with a web framework. Also, a good way to improve your PHP.

This example was set up on a Vista PC.