Posted by J.O. Aho on 01/12/07 18:15
Rik wrote:
> Geoff Berrow wrote:
>> Message-ID: <378bb$45a7bf44$8259c69c$19192@news1.tudelft.nl> from Rik
>> contained the following:
>>
>>>> I had a devil of a job recently trying to store an HTML file that
>>>> had loads of auto-generated JavaScript in it (a crossword puzzle).
>>>> My quick and dirty solution was to save it as a file and then
>>>> simply store a reference to it. This is fine if you don't want to
>>>> do any data processing on the content.
>>> If there are reasonably few HTML snippets/pages it could be OK. I
>>> wouldn't want to try it with 1000+ files though; the filesystem
>>> becomes a bottleneck.
>> I couldn't say. I always thought that's what the filesystem was good
>> at.
>
> Well, it's not really designed to hold 1000+ files in one directory.
> Split them up into subdirectories (for instance, by first character)
> and it'll be much faster.
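Something like this keeps any one directory small (a sketch only; the
html/ base directory and file naming are assumptions):

  $file = 'crossword_0042.html';
  $dir  = 'html/' . substr($file, 0, 1);   // shard on first character, e.g. html/c
  if (!is_dir($dir)) {
      mkdir($dir, 0755, true);             // create the shard directory once
  }
  file_put_contents($dir . '/' . $file, $html);
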
>
>>> Then again, simply throwing it through mysql_real_escape_string()
>>> _should_ have done the job without any hassle.
>> Yeah, that's what I did. But after a couple of hours messing about
>> with it (and a tight budget) you do what you have to do.
>
>
> Indeed, no use wasting hours on it of course. Although I'm interested
> in what kind of gibberish was causing you this headache.
1000 files are nothing; finding 10000 files takes no more than 0.02 - 0.04
seconds on a good file system. But of course, if you are using something
like a FAT file system, things will be painfully slow.
--
//Aho