30 Jan, 2003
MySQL LIMIT, Table Joins, and creating CSV files
Posted by: Jennifer In: Lessons learned
Last night in my freelance work, I hit the "A-HA" moment on three separate issues I'd been fighting with for a long time. Probably simple concepts to some people, but they were a bit of a struggle for me. I just wanted to make a note (partially to myself) about what I learned, so I don't forget it! And also, in case it helps someone else who has been fighting with the same concepts.
The freelance project is an event planner. There are three separate tables in the database. One that stores all the user's information (we'll call it the "user" table), one that stores all the different event info (i.e. name, time, location) (we'll call this table: "event"), and one that keeps track of the registrations (table "registrar").
1) TABLE JOIN
This is basically the concept of calling data from two (or more) separate tables within the same SELECT statement.
What I wanted to do was to create a page that showed a list of the users that had signed up for one particular event. (FYI, each user, event, and registration has an id – so there is a userid field in the user table, and an eventid field in the event table. The registrar even has a regid – but mostly that table is used to match users to events (by way of their respective id numbers). So one row (in the registrar) would have user #234 signed up for event #22.)
The problem: We can pass an "event id" to a page, and from that info, I wanted to display a list of users (their names, some of their info, etc) that had signed up for that particular event.
My original mistake: I was making TWO separate database calls (actually, now that I think about it – a WHOLE LOT MORE than two calls!!!). First I'd select a list of userids from registrar where the eventid in that row matched the one passed to the page. THEN, to get each user's name etc. from the user table, I'd loop through that list of userids and make a new call to the database each time, selecting the name, info, etc. from the user table where the userid matched the one in the currently fetched row from the registrar… talk about overhead!!!
Solution: TABLE JOIN! Here's basically what the ONE statement looked like that did ALL of the above:
SELECT user.name, user.email, user.address, user.userid, registrar.eventid, registrar.userid FROM user, registrar WHERE registrar.eventid = $eventid AND registrar.userid = user.userid
(some support info… this assumes the URL to this page was something like showevents.php?eventid=24, and that I'd already done $eventid = (int) $_REQUEST['eventid']; – casting it to an integer keeps somebody from injecting SQL into the query through the URL)
That's it!! Grabs everything I need in ONE statement!
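If you want to sanity-check the join idea outside the site, here's a little stand-alone sketch using Python's built-in sqlite3 module – the table and column names match the ones above, but the data (and the Python setup itself) are made up purely for illustration:

```python
# Demo of the one-statement join, using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE user (userid INTEGER, name TEXT, email TEXT, address TEXT)")
cur.execute("CREATE TABLE registrar (regid INTEGER, userid INTEGER, eventid INTEGER)")
cur.executemany("INSERT INTO user VALUES (?, ?, ?, ?)", [
    (234, "Alice", "alice@example.com", "123 Main St"),
    (235, "Bob", "bob@example.com", "456 Oak Ave"),
])
cur.executemany("INSERT INTO registrar VALUES (?, ?, ?)", [
    (1, 234, 22),   # user #234 signed up for event #22
    (2, 235, 24),   # user #235 signed up for event #24
])

eventid = 22  # in the real page this would come from the URL
cur.execute(
    "SELECT user.name, user.email FROM user, registrar "
    "WHERE registrar.eventid = ? AND registrar.userid = user.userid",
    (eventid,),
)
rows = cur.fetchall()
print(rows)  # only Alice signed up for event 22
```

One query, and every matched user comes back at once – no looping over userids firing a fresh query per user.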
2) LIMIT
Problem: What I wanted to do on one page was show a list of ALL users that have signed up for ANY event… ever. Over time, this could get to be one hell of a long list. If you show it all on one page, it could get very scary. (Maybe even crash your browser).
Solution: LIMIT!! Previously, I thought you could only use LIMIT in a select statement like this:
SELECT * from user LIMIT 20
Which would pull only the first 20 rows from the table. But what I wanted was to skip over the first "x" entries and grab the batch after them. Well, it turns out you can do that with LIMIT as well, like this:
SELECT * from user LIMIT 40,20
In that example, it will skip the first 40 rows, and grab the next 20. To get that to work with "next/previous" type links I did something like this:
the URL would look like: showallusers.php?group=20
if (isset($_REQUEST['group'])) {
    $group = (int) $_REQUEST['group']; // cast to an integer so junk in the URL can't break the query
} else {
    $group = 0;
}
I had previously done SELECT COUNT(*) AS count to get the total number of users, so $total = $getUsersRow['count']. (I'm leaving out some code here for space-saving sake… it's just a basic query…)
then this:
<? if ($group > 0) { ?>
<a href="showallusers.php?group=<? echo $group-20; ?>">previous</a>
<? } // end if show previous link ?>
<? if ($group + 20 < $total) { ?>
<a href="showallusers.php?group=<? echo $group+20; ?>">next</a>
<? } // end if show next link ?>
then above, where you do your select statement, put it together like this:
$getUsersQuery = sprintf("SELECT name, address, email FROM user LIMIT %d,20;", $group);
That will replace the first number with whatever we're skipping for that page… if it's "0" then it doesn't skip any, if it's 20, it skips the first 20, 40 – it skips 40, etc. (Using %d instead of %s forces that value to be an integer – one more guard against weird stuff in the URL.) And the links above only display when appropriate…
(I know that I should probably have extra code in there to make sure that "$group" is divisible by 20, so that someone doesn't enter a strange number in the url manually and get bad feedback… I'm just showing the basic concept here.)
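To convince myself the offset form of LIMIT really does what I think, here's a runnable sketch using Python's stdlib sqlite3 module – SQLite happens to accept the same LIMIT offset,count syntax as MySQL, and the 50 fake users are made up just for the demo:

```python
# Demo of LIMIT offset,count paging against an in-memory SQLite table.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE user (userid INTEGER, name TEXT)")
cur.executemany("INSERT INTO user VALUES (?, ?)",
                [(i, "user%d" % i) for i in range(1, 51)])  # 50 fake users

group = 40  # as if the URL were showallusers.php?group=40
cur.execute("SELECT name FROM user ORDER BY userid LIMIT %d,20" % group)
page = cur.fetchall()
print(len(page))  # 40 rows skipped, only 10 of the 50 remain
```

(The ORDER BY matters – paging without a fixed order can hand back rows in a different sequence on each query, so "page 3" might not mean the same rows twice.)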
3) CSV files
This one was so ridiculously simple, I'm almost ashamed to admit it took me as long as it did to figure it out. What I wanted to do was grab that long list of users from the db, write a simple CSV file, and download it. The trick here was that I didn't want to store that file on the server. I wanted it to go directly from the database call TO the download stream.
Go ahead and laugh at me (you probably are if you know how to do this) but I swear to you, I searched and searched for an example of how to do this, and all I could find were examples of how to download a file that already existed on the server. Not how to (sort of) write and download at the same time.
echo
That's it. You start the download stuff with all the header lines, etc… and then instead of readfile($filename); (where $filename was defined as a file on your server) you simply do echo $stringforCSVfile; (I'd already dumped all the data from the db into that one string, $stringforCSVfile – there was a trick I learned the other night about that one too, I'll post later tonight). So here's a snippet of that code:
$p = "yourCSVfile.txt";
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$p\"");
header("Content-Transfer-Encoding: binary");
if(strpos($_SERVER['HTTP_USER_AGENT'], 'MSIE')){
// IE cannot download from sessions without a cache
header('Cache-Control: public');
}
echo $stringforCSVfile;
exit;
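For what it's worth, the "dump all the data into one big string" step has a couple of gotchas – commas and quotes inside field values have to be quoted or the CSV comes out mangled. Here's that part sketched in Python with the stdlib csv module; the rows are made-up stand-ins for what the query would return:

```python
# Building the CSV payload as one in-memory string (no file on the server).
import csv
import io

rows = [
    ("Alice", "123 Main St", "alice@example.com"),
    ('Bob "Bobby" Smith', "456 Oak Ave", "bob@example.com"),  # quotes get escaped for us
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "address", "email"])  # header line
writer.writerows(rows)
csv_string = buf.getvalue()  # this is what gets sent straight down the download stream
print(csv_string)
```

In the PHP above, that csv_string plays the role of $stringforCSVfile – build it in memory, send the headers, echo it, exit.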
so simple – yet so elusive.