Posted by Arjen on 01/29/07 17:04
Arjen schreef:
> Michael Fesser schreef:
>> .oO(Arjen)
>>
>>> I had to do the same thing for a project I'm working on. Worst case I
>>> had to join 10+ tables on the fly with more than 10k entries.
>>> Performance was terrible, so I join the data overnight and dig into the
>>> large pool of data with a simple select :-)
>>
>> 10k entries are _nothing_ for a database, which is able to handle
>> millions of records. Even a join over 10+ tables shouldn't be much of a
>> problem, except maybe on a really high-traffic site. Are you sure the
>> tables were properly indexed? That's one of the most important things.
>> If every constraint or join requires a full table scan because of
>> missing or improperly defined indexes, then of course the performance
>> will be very poor.
>>
>> Micha
>
> Yup, I'm sure. Indexes are OK (checked with EXPLAIN and common sense) and
> we have had quite a few specialists who have worked on it. The server is
> relatively new. But it's a busy website indeed. Too bad I can't show you,
> since the data is private. Anyway, it works like a charm now :-)
>
The mind is a funny thing :-) You tell me I'm wrong and still I insist
I'm right, even though you make more sense!
I checked the data: worst case I need to join 43 tables with 200k+
entries per table.
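
To illustrate what I mean by joining overnight, here is a minimal sketch
in MySQL-style SQL. The table and column names (dogs, breeds, owners) are
made up, since the real schema is private; the point is only that the
heavy join runs once per night (e.g. from a cron job) and the site then
queries a single flat, indexed table:

    -- Rebuild the pre-joined pool overnight.
    DROP TABLE IF EXISTS joined_pool;
    CREATE TABLE joined_pool AS
      SELECT d.id, d.name, b.breed_name, o.owner_name
      FROM dogs d
      JOIN breeds b ON b.id = d.breed_id
      JOIN owners o ON o.id = d.owner_id;

    -- Index the columns the site actually filters on.
    CREATE INDEX idx_pool_breed ON joined_pool (breed_name);

    -- During the day the site only runs simple selects:
    SELECT name, owner_name
    FROM joined_pool
    WHERE breed_name = 'Beagle';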
--
Arjen
http://www.hondenpage.com