Reply to Re: Data transfer problem - ideas/solutions wanted (please)

Posted by NC on 02/04/06 05:50

E.T. Grey wrote:
>
> I have a (LARGE) set of historical data that I want to keep
> on a central server, as several separate files.

How large exactly?

> I want a client process to be able to request the data in a
> specific file by specifying the file name, start date/time and
> end date/time.

The start/end date/time requirement is actually a rather fat hint that
you should consider using a database... Scanning large flat files for a
date range will eat up enormous amounts of disk and processor time,
while a database can answer the same query from an index.
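A minimal sketch of the idea, using SQLite from Python purely for illustration (the same applies to MySQL from PHP); the table and column names here are made up:

```python
import sqlite3

# In-memory database purely for illustration; a real setup would use a
# server-side RDBMS (e.g. MySQL) with an index on the timestamp column.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        series TEXT NOT NULL,   -- stands in for the per-file split
        ts     TEXT NOT NULL,   -- ISO-8601 date/time sorts correctly as text
        value  REAL NOT NULL
    )
""")
conn.execute("CREATE INDEX idx_series_ts ON readings (series, ts)")

conn.executemany(
    "INSERT INTO readings (series, ts, value) VALUES (?, ?, ?)",
    [
        ("fileA", "2006-02-01T00:00:00", 1.0),
        ("fileA", "2006-02-02T00:00:00", 2.0),
        ("fileA", "2006-02-03T00:00:00", 3.0),
        ("fileB", "2006-02-02T00:00:00", 9.0),
    ],
)

# The client's request (file name + start + end) becomes an indexed
# query instead of a scan through the whole file:
rows = conn.execute(
    "SELECT ts, value FROM readings "
    "WHERE series = ? AND ts BETWEEN ? AND ? ORDER BY ts",
    ("fileA", "2006-02-01T00:00:00", "2006-02-02T23:59:59"),
).fetchall()
print(rows)
```

The daily append then becomes a plain INSERT rather than a file-locking exercise.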

> New data will be appended to these files each day, by a
> (PHP) script.

Yet another reason to consider a database...

> What is the best (i.e. most efficient and fastest) way to transfer
> data from the server to clients?

Assuming you are using HTTP, compressed (gzip) CSV will probably be the
fastest.

> How can I ensure that the (binary?) data sent from the Unix server
> can be correctly interpreted at the client side?

Why should the data be binary? Compressed CSV is likely to be at least
as compact as binary data, plus CSV will be human-readable, which
should help during debugging.

> How can I prevent clients from directly accessing the files
> (to prevent malicious or accidental corruption of the data files)?

Import them into a database and lock the originals in a safe place.
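If the originals have to stay as flat files for a while, at least keep them outside the web root and serve them through a script that whitelists the requested file name, so clients never touch the files directly. A rough sketch (Python; the directory and pattern are assumptions for illustration):

```python
import os
import re

DATA_DIR = "/var/data/history"                   # outside the web root (assumption)
SAFE_NAME = re.compile(r"^[A-Za-z0-9_]+\.csv$")  # whitelist of allowed names

def resolve_request(name: str) -> str:
    """Map a client-supplied file name to a real path, rejecting
    anything that could escape the data directory."""
    if not SAFE_NAME.match(name):
        raise ValueError("illegal file name")
    path = os.path.normpath(os.path.join(DATA_DIR, name))
    # Belt and braces: the normalized path must still be inside DATA_DIR.
    if not path.startswith(DATA_DIR + os.sep):
        raise ValueError("path traversal attempt")
    return path

print(resolve_request("prices_2006.csv"))
try:
    resolve_request("../../etc/passwd")
except ValueError as e:
    print("rejected:", e)
```

The script can then open the resolved path itself and stream the requested date range back, with the files themselves readable only by the web server's user.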

Cheers,
NC


