Posted by Miguel Cruz on 07/27/06 15:16
b007uk@gmail.com wrote:
> I have to store over a million files, 10 - 15 kb each, in one folder.
> The files are created by my php script, sometimes the old files are
> deleted and new ones are written.
> So basically, on every connection my script reads/deletes/writes files
> from/to that folder.
> Right now I have only around 300,000 files in that folder, and it feels
> like it's getting slower for the script to work. It does work at the
> moment, but I am not sure what will happen when there are over a million
> files there...
> Are there any limits of files that can be stored in a folder?
No hard limit in practice (it depends on the filesystem, but in general, no).
However, on many filesystems, lookup time degrades badly once a single
directory holds that many files.
Instead, you can build a little hash structure; it's easy to do and will
give you a significant performance boost.
Let's say your files are all named with a sequence of 6 random letters
(like "rjudfx" and "qopmnu" and "zsijpa").
Make yourself 26 directories inside your one large directory: 'a', 'b',
'c', 'd', 'e', etc.
Then store each file in the directory named after its first letter: file
"rjudfx" would go inside 'r', and so on.
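Creating those 26 subdirectories is a one-off job. Here's a quick shell sketch; "files" is just a placeholder for whatever your big directory is actually called:

```shell
# Create one subdirectory per lowercase letter inside the data directory.
# "files" is a placeholder path -- substitute your own.
for letter in a b c d e f g h i j k l m n o p q r s t u v w x y z; do
    mkdir -p "files/$letter"
done
```

mkdir -p makes this safe to re-run: directories that already exist are left alone.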
You can make some quick, easy functions to add the directory prefix onto
the names when you are reading and writing them.
function hashname($filename)
{
    // Prepend the first letter as a directory name: "rjudfx" -> "r/rjudfx".
    // (Square brackets, not curly braces, for string offsets -- the brace
    // syntax has since been removed from PHP.)
    return $filename[0] . "/{$filename}";
}
Then, wherever you would call fopen($filename, $mode), call
fopen(hashname($filename), $mode) instead.
This way each directory holds only about 1/26th of the files, so searching
for and accessing a file is much faster.
miguel
--
Photos from 40 countries on 5 continents: http://travel.u.nu
Latest photos: Malaysia; Thailand; Singapore; Spain; Morocco
Airports of the world: http://airport.u.nu