Posted by "Christian Calloway" on 07/21/05 22:10
Still playing around... if I remove the gzuncompress call, it works
perfectly. Hmm. It looks something like this:
// $resultRecords is a 2-dimensional array of associative rows returned by
// the database query (I use the PEAR DB class)
foreach ($resultRecords as $record)
{
    // this statement causes problems when compress_content > 64kb
    $fileContent = gzuncompress($record["compress_content"]);

    // this statement works just dandy
    $fileContent = $record["compress_content"];
}
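
In case it helps, here is a minimal self-contained sketch of the retrieval
side (the DSN and the "files" table name are made up for illustration; the
compress_content column, PEAR DB, and gzuncompress are the real pieces):

<?php
require_once 'DB.php';

// placeholder DSN -- not my real credentials
$db = DB::connect('mysql://user:pass@localhost/mydb');
if (DB::isError($db)) {
    die($db->getMessage());
}

// fetch every row as an associative array
$resultRecords = $db->getAll('SELECT compress_content FROM files',
                             array(), DB_FETCHMODE_ASSOC);

foreach ($resultRecords as $record) {
    // gzuncompress() returns false on failure, so check before using it
    $fileContent = gzuncompress($record['compress_content']);
    if ($fileContent === false) {
        echo "could not uncompress this record\n";
        continue;
    }
    // ... keyword search over $fileContent goes here ...
}
?>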
""Christian Calloway"" <callowaylc@yahoo.com> wrote in message
news:21.BB.33635.913FFD24@pb1.pair.com...
> Heya guys,
>
> I've run into a problem that I just can't seem to get around. Basically I
> am storing file contents in a compressed format (gzcompress at level 9) in
> my database -- why? Basically to enable a keyword search of said content,
> with the keywords retrieved via an input source (it could be any source,
> it doesn't really matter). Anyway, everything works fine until the blob
> size reaches about 64kb, then my script (and box) just completely craps
> out -- the page eventually gets forwarded to a "page cannot be found"
> after a while (and this happens even if I have set max_execution_time to
> as little as 5 seconds). OK, so I theorized it might be a memory
> consumption problem, so I played around with the memory_limit directive,
> pushing it upwards of 100 MB, and I still get the problem. It's very
> weird: it works just fine if the blob size is below 64kb, but craps out
> when it is >=. Oh yeah, it's an XP box. Thanks in advance,
>
> Christian
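
(For reference, the write side described in the quoted post is roughly the
following -- a minimal sketch, again with a made-up "files" table and
placeholder DSN; the level-9 gzcompress and the PEAR DB calls are the parts
taken from the post:)

<?php
require_once 'DB.php';

$db = DB::connect('mysql://user:pass@localhost/mydb'); // placeholder DSN
if (DB::isError($db)) {
    die($db->getMessage());
}

// compress the file contents at level 9 before storing, as in the post
$raw    = file_get_contents('somefile.txt');
$packed = gzcompress($raw, 9);

// a PEAR DB placeholder handles quoting of the binary blob
$res = $db->query('INSERT INTO files (compress_content) VALUES (?)',
                  array($packed));
if (DB::isError($res)) {
    die($res->getMessage());
}
?>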