Posted by SC on 11/23/05 22:48
I'm developing a site that may eventually have a very large database of
users (hopefully, but who knows). It will be a community website where
users can search for each other (think Friendster, Classmates, every dating
site out there, etc.).
The queries will often have multiple joins, and some may include a few
fulltext searches. Each page of results will display a limited number of
rows (obviously). As I understand it, MySQL caches only exact queries, so
using a different LIMIT x,y on each query to get the results for a specific
page will cause MySQL to run the entire query again instead of pulling from
the cache. I will cap the results at 1000, but what's the best way to limit
the results per page?
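To illustrate the caching problem, with 20 results per page the first two
pages would need different query strings (the table and columns here are
made up):

    SELECT id, username FROM users ORDER BY username LIMIT 0, 20;   -- page 1
    SELECT id, username FROM users ORDER BY username LIMIT 20, 20;  -- page 2

Since the two strings differ, the page-2 query can't be served from the
cache entry created by page 1.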
I was thinking that I could run the SELECT with LIMIT 1000, then use
mysql_data_seek to jump to the starting row for the page and fetch just the
rows for that page. The exact same SELECT would be used for every page, so
the results would be served from the cache.
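Something like this is what I have in mind (just a rough sketch; the WHERE
clause, table and column names, and the page-number handling are all made
up, and it assumes an open mysql connection):

    $per_page = 20;                                      // results shown per page
    $page     = isset($_GET['page']) ? (int)$_GET['page'] : 1;

    // same exact SELECT for every page, so it should hit the query cache
    $result = mysql_query("SELECT id, username FROM users
                           WHERE age BETWEEN 25 AND 35
                           ORDER BY username LIMIT 1000");

    $total = mysql_num_rows($result);
    $start = ($page - 1) * $per_page;                    // first row of this page
    if ($start < $total) {
        if ($start > 0) {
            mysql_data_seek($result, $start);            // jump to the page's first row
        }
        for ($i = 0; $i < $per_page && ($row = mysql_fetch_assoc($result)); $i++) {
            // ... display $row ...
        }
    }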
Also, how should I save the search query? Users will be searching with any
number of search terms, way too many to pass around as GET variables. I
don't want to build a form with each search term as a hidden input (for
one, that's kind of lame). I was thinking of saving the search query to the
database; I guess I could implode the POST array into a string and save it
in the db.
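Roughly like this (just a sketch: the searches table and the search id
parameter are made up, and maybe serialize() instead of implode() so the
array keys aren't lost):

    // assumes an open mysql connection and a searches table (id, criteria)
    $criteria = mysql_real_escape_string(serialize($_POST));
    mysql_query("INSERT INTO searches (criteria) VALUES ('$criteria')");
    $search_id = mysql_insert_id();   // carry this id in the paging links

    // later, when the user clicks through to another page of results:
    $res   = mysql_query("SELECT criteria FROM searches WHERE id = " . (int)$_GET['search']);
    $row   = mysql_fetch_assoc($res);
    $terms = unserialize($row['criteria']);   // back to the original array, keys intact

Then the page links would only need to carry the search id, not all the
terms.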
Any suggestions that might work better for either of these issues? Any help
would be greatly appreciated.
Thanks!