Posted by flowover on 01/05/08 13:15
On Jan 5, 5:13 am, flowover <flowover...@gmail.com> wrote:
> On Jan 4, 6:46 pm, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>
> > Rowan wrote:
>
> > > What is the best approach to caching database results? For example, say I'm
> > > doing an update on several entries which I've loaded into an array. I
> > > want to allow the user to click through and update each array entry,
> > > then dump everything to the db once they are done...
>
> > Don't bother. It's normally cheaper to just keep track of the IDs and
> > fetch the results again.
>
> > You should be fetching them again before updating anyway, and verifying
> > the rows haven't changed (i.e. two people updating at the same time).
>
> > --
> > ==================
> > Remove the "x" from my email address
> > Jerry Stuckle
> > JDS Computer Training Corp.
> > jstuck...@attglobal.net
> > ==================
>
> If you're writing the site to scale, then yes, plan for multiple users
> being in there making changes at once. If the site is just an
> administration backend that you know isn't going to have more than one
> person making changes at a time, stuff that array in a session. This
> requires either a big ugly key in your URL or a cookie, but imo it's the
> best way to cache data between requests.
Note that the database result is a resource though, and you cannot
serialize a resource. You have to actually copy every row into an
array before you can store it. Jerry has a huge point when he says that
it's usually cheaper just to fetch it again.