Reply to Re: Data transfer problem - ideas/solutions wanted (please)



Posted by noone on 02/04/06 20:42

NC wrote:
> E.T. Grey wrote:
>
>>>>I have a (LARGE) set of historical data that I want to keep
>>>>on a central server, as several separate files.
>>>
>>>
>>>How large exactly?
>>
>>At last count, there are about 65,000 distinct files (and increasing)
>
> ...
>
>>Each file has the equivalent of approx 1M rows (yes - thats 1 million)
>
> ...
>
>>If you multiply the number of rows (on avg) by the number of files -
>>you can quickly see why using a db as a repository would be a
>>poor design choice.
>
>
> Sorry, I can't. 65 million records is a manageable database.

I agree... I have designed and deployed binary and ASCII data loads in
excess of 250 million records/day. Searching the data was a piece of
cake - if you know how to design the database correctly.

65M records is peanuts to a database - even MySQL. With proper indexing
you can do a direct-row lookup in 4-8 I/Os - not so with the path
you are currently trying to traverse... scanning flat files, you are
looking at up to 65M reads - and reads are very expensive!!
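To see where the "4-8 I/Os" figure comes from: a B-tree index lookup costs roughly one page read per tree level, and tree height grows only logarithmically with row count. Here is a minimal back-of-the-envelope sketch (the fanout of 200 keys per index page is an assumption, not a measured value; real fanout depends on key size and page size):

```python
import math

def btree_height(rows: int, fanout: int) -> int:
    """Approximate number of levels in a B-tree index.

    Each level costs roughly one page read, so this is also an
    estimate of the I/Os needed for one indexed point lookup.
    """
    return math.ceil(math.log(rows) / math.log(fanout))

# 65 million rows, assumed ~200 keys per index page:
levels = btree_height(65_000_000, 200)
print(levels)  # ~4 page reads for an indexed lookup

# versus a linear scan of flat files, which in the worst
# case touches every one of the 65M records:
print(65_000_000 // levels)  # how many times cheaper the index is
```

Even with a pessimistic fanout the height stays in single digits, which is why the indexed lookup beats the file-scan approach by many orders of magnitude.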

Use the proper tools/mechanisms for the job at hand...


Michael Austin
DBA
(stuff snipped)


