Posted by ZeldorBlat on 01/26/08 00:19
On Jan 25, 5:48 pm, Martin <martinval...@comcast.net> wrote:
> On Fri, 25 Jan 2008 11:27:00 -0800 (PST), ZeldorBlat
>
>
>
> <zeldorb...@gmail.com> wrote:
> >On Jan 25, 11:25 am, Martin <martinval...@comcast.net> wrote:
> >> I have a series of php scripts that provide the user interface to an
> >> industrial automation program.
>
> >> One of the scripts serves out a page which displays some data, some
> >> of which is constantly changing. On that page, I have some AJAX code
> >> running that frequently requests fresh data and displays it
> >> (frequently means about once a second).
>
> >> When the php script that services the AJAX requests receives one, it
> >> passes a request of its own over to the main process (via a socket)
> >> which responds with some data which the script then sends out to the
> >> browser page.
>
> >> This all works fine but I'm concerned about the load this exchange of
> >> data places in the main process.
>
> >> What I'm wondering is: would it be possible that I could have my main
> >> process send its data (in real time) to "something" in the php system
> >> such that the php script could access this data much like it accesses
> >> its own global variables? IOW, the script would have immediate access
> >> to the values instead of having to request them from the main process.
>
> >> Please understand that all of this activity is running on one computer
> >> and this is all confined to an inTRAnet (there's no internet access
> >> involved). And, this one computer is entirely under my control. I am
> >> free to do what ever needs to be done with regards to php
> >> configuration, security settings, etc.
>
> >> Any recommendations?
>
> >> Thanks.
>
> >Sounds like a database would work just fine. The main process can
> >update values in the database and the PHP script can read them
> >whenever it needs to. Or perhaps I'm missing something larger here...
>
> Yes, I agree - a database probably would work OK. But even then,
> there would be a lot of disk activity. I'm thinking that if the data
> could be stored in memory somewhere, somehow, then the processing
> time would be minimized.
>
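[For reference, the in-memory arrangement Martin is asking about can be sketched without any special PHP extension: the main process publishes its current values to a small file on a RAM-backed filesystem, and the AJAX-servicing script reads that file instead of making a socket request. This is only a sketch under assumptions - on Linux you would point the path at /dev/shm so the file never touches disk (sys_get_temp_dir() is used here just so it runs anywhere), and the field names are invented for illustration.]

```php
<?php
// The main process and the AJAX-servicing script share one small file.
// On Linux, use '/dev/shm/automation_status.json' for a purely
// in-memory (tmpfs) file; sys_get_temp_dir() just makes the sketch portable.
$path = sys_get_temp_dir() . '/automation_status.json';

// --- main-process side: publish the latest readings ---
// Writing to a temp file and rename()-ing is atomic on the same
// filesystem, so a reader never sees a half-written file.
$data = ['temperature' => 72.4, 'pressure' => 101.3, 'updated' => time()];
$tmp  = $path . '.tmp';
file_put_contents($tmp, json_encode($data));
rename($tmp, $path);

// --- AJAX-script side: answer the request straight from the file,
// with no round-trip to the main process ---
$latest = json_decode(file_get_contents($path), true);
echo json_encode($latest), "\n";
```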
Key word here is "thinking." Do you know that there will be too much
disk activity? Have you used a profiler to test both methods to see
which one is faster? Have you tried using a database and know for a
fact that the performance is bad? If so, how much worse? Do you even
know that putting it all in memory will be better than what you're
doing now?
Premature optimization is a common pitfall. You can't optimize
something until you know what you need to optimize. Code things in a
clean and manageable way and then, if you run into problems, you can
optimize specific areas.
Google for "premature optimization" and you'll find a lot of info on
the subject.
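[A minimal sketch of the database approach suggested above, using SQLite through PDO (this assumes the pdo_sqlite extension): the main process upserts a status row, and the AJAX-servicing script selects it back. The `:memory:` DSN is used only to keep the sketch self-contained - it is private to one connection, so a real two-process setup would instead point at a shared database file (which could itself live on tmpfs, one reason to measure before assuming "too much disk activity"). Table and column names are invented for illustration.]

```php
<?php
// Shared status table; in a real setup the DSN would name a file
// reachable by both processes, e.g. 'sqlite:/dev/shm/status.db'.
// ':memory:' keeps this sketch runnable as a single script.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE status (name TEXT PRIMARY KEY, value REAL)');

// --- main-process side: write the latest readings ---
$put = $db->prepare('INSERT OR REPLACE INTO status (name, value) VALUES (?, ?)');
$put->execute(['temperature', 72.4]);
$put->execute(['pressure', 101.3]);

// --- AJAX-script side: read a value back for the response ---
$get = $db->prepare('SELECT value FROM status WHERE name = ?');
$get->execute(['temperature']);
echo 'temperature=', $get->fetchColumn(), "\n";
```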