Reply to Re: Best text search for many Gigabyte Database?

Posted by Erwin Moller on 02/05/07 10:19

ngocviet wrote:

>
>> If you use MySQL, consider using FULLTEXT and MATCH ... AGAINST().
>> The bottom line stays: if you have 100 GB of data, some search time is
>> needed, but using FULLTEXT can speed that up considerably.
>> Check your MySQL documentation for details.
>
> I've tried FULLTEXT, but it takes too long: over 10 seconds on a
> 1.2-gigabyte table.
> If needed, I can convert the database to another structure.

Hi,

If you use MySQL's MyISAM engine with a FULLTEXT index, and query with
MATCH ... AGAINST rather than LIKE, you are already using one of the fastest
approaches a developer can set up (as far as I know).
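For concreteness, here is a minimal sketch of the FULLTEXT/MATCH AGAINST setup described above. The table and column names are hypothetical, since the original poster gives no schema; the point is only the FULLTEXT index plus the MATCH ... AGAINST query:

```sql
-- Hypothetical example table (MyISAM supports FULLTEXT indexes).
CREATE TABLE articles (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(200),
    body  TEXT,
    FULLTEXT KEY ft_title_body (title, body)
) ENGINE = MyISAM;

-- Natural-language full-text search: this uses the FULLTEXT index
-- instead of scanning every row, unlike LIKE '%database%'.
SELECT id, title,
       MATCH (title, body) AGAINST ('database') AS score
FROM   articles
WHERE  MATCH (title, body) AGAINST ('database')
ORDER  BY score DESC
LIMIT  10;
```

Note that the column list in MATCH() must exactly match the columns of one FULLTEXT index, and by default words shorter than the minimum indexed word length are ignored.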
I expect the only way to increase search speed for a 100 GB database filled
with text is to throw more/better hardware at it (more memory, faster disk
I/O, a faster CPU, etc.).
In general, when searching through a huge data structure, disk I/O is the
bottleneck, so faster disk I/O will help the most.
Maybe look into different RAID setups.
For example, if you have two hard disks delivering data at the same time
(using some striped RAID level), your query may run up to twice as fast.

Regards,
Erwin Moller


