#1
    No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Nov 2012
    Posts
    31
    Rep Power
    2

    AJAX without recreating objects


    This may be a simple question, but I can't find the answer for it. Even if someone could point me in the right direction, I'd appreciate it.

    I have an application (in the planning phase) that is running on a simple MVC framework (we wrote our own). We want to use AJAX for any and all possible actions.

    There are about 15 objects that are created on every page load (Users, Security, Routing, etc). The way the system works now, each of these objects is recreated for each AJAX call, because each call executes the script from beginning to end and has to initialize the environment.

    Is there a best-practice way around this? For instance, is there any way to pass objects through jQuery to the executing script so it can use the already-created objects? Or is there a caching solution that would be recommended?

    I'm thinking about performance. Is this just the way it's done? Am I sweating the small stuff?

    Thanks for the time,

    Michael
#2
    Come play with me!
    Devshed Supreme Being (6500+ posts)

    Join Date
    Mar 2007
    Location
    Washington, USA
    Posts
    13,756
    Rep Power
    9397
    Why would you want to avoid recreating them? AJAX requests need to run in the context of the user who is browsing, need "security" (whatever that means), and probably need to use routing to some degree...
#3
    No Profile Picture
    Dazed&Confused
    Devshed Novice (500 - 999 posts)

    Join Date
    Jun 2002
    Location
    Tempe, AZ
    Posts
    501
    Rep Power
    127
    I'd be interested if there's a go-to answer for this concept, myself...

    But I will say that ~15 objects isn't much to worry about, and you're probably sweating the small stuff. Optimization is always good to be mindful of, but unless each of those objects is doing an incredible amount of processing, you should be well within the threshold of concern.

    Are the AJAX calls taking a notable amount of time?

    If you have some calls that aren't contingent on authenticating the user, etc., you might be able to split that logic out into standalone scripts, such as ones for simple validation checks.
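
    As a minimal sketch of that idea (the file name and the validation check are hypothetical; the point is that the script loads none of the framework):

    Code:
    	<?php
    	// validate_email.php - hypothetical standalone endpoint. It needs no
    	// user, session, or routing objects, so it skips the bootstrap entirely.
    	header('Content-Type: application/json');

    	$email = isset($_GET['email']) ? trim($_GET['email']) : '';
    	$valid = filter_var($email, FILTER_VALIDATE_EMAIL) !== false;

    	echo json_encode(array('valid' => $valid));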

    Edit/Addendum:

    To add a frame of reference, I have AJAX that updates a part of my CMS software showing the most recent user posts. In order to figure that out it has to authenticate the user, load the groups they belong to, figure out what forums those groups have access to, then load forum and topic information to build the list.

    Add to that some common conventions I use, like handling object properties through small handler objects, and you're looking at easily a couple hundred objects being instantiated.

    The system is able to return a response in around 600 ms (0.6 seconds).

    Now, where that fits people's expectations of performance, I'm not sure, but I think that's generally considered acceptable for a heavily dynamic website. And that's running on a 7-8 year old server.

    So 15 objects... not much to worry about.

    Addendum #2:

    Another thing to consider: if you have multiple scheduled AJAX calls (i.e., ones that run every minute), try combining them into one multi-part call to the backend.

    Here's something I did this past weekend as a first step towards that end:

    Code:
    	// Registry mapping each URL to the callback that consumes its response.
    	var AJAXCallbacks = {};
    	function registerAJAXCallback(url,callback){ AJAXCallbacks[url] = callback; }
    	function unregisterAJAXCallback(url){ delete AJAXCallbacks[url]; }
    	// Fire every registered call; each response is routed to its own callback.
    	function executeAJAXCallbacks(){
    		$.each(AJAXCallbacks,function(url,callback){
    			$.ajax(url,{
    				xhrFields: { withCredentials: true }, // send cookies with the request
    				success: function(data){ callback(data); }
    			});
    		});
    	}
    	registerAJAXCallback('/themes/gs1/header_topics.php?topicLimit=7&forumId=333',function(data){ $('#dl_game').html(data); });
    	registerAJAXCallback('/themes/gs1/header_topics.php?topicLimit=7&notForumId=333',function(data){ $('#dl_forum').html(data); });
    	// Run once immediately, then every 60 seconds.
    	executeAJAXCallbacks();
    	window.setInterval(executeAJAXCallbacks,60000);
    So you register your AJAX calls with a central executor. That runs the calls and kicks the response to your callback function.

    The next step will be to combine the requests into one, send that super-request to the backend, get back a JSON array containing a subarray for each subrequest, then send those subarrays to their respective callback functions.

    Of course this only works for scheduled AJAX calls that happen at the same frequency. But it turns multiple PHP processes into one, reducing overall processing time.
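
    For what it's worth, the backend half of that super-request might look something like this sketch (the endpoint name and the render_topics() stand-in are assumptions, not actual code from my system):

    Code:
    	<?php
    	// batch.php - hypothetical combined endpoint. The client POSTs one JSON
    	// array of subrequests; a single PHP process handles them all and returns
    	// a JSON array of subresponses keyed the same way, ready for the client
    	// to hand each one to its registered callback.
    	header('Content-Type: application/json');

    	function render_topics(array $params) {
    		// stub standing in for whatever header_topics.php actually does
    		return 'topics for ' . http_build_query($params);
    	}

    	$subrequests = json_decode($_POST['requests'], true);
    	$responses = array();
    	foreach ($subrequests as $key => $params) {
    		$responses[$key] = render_topics($params);
    	}
    	echo json_encode($responses);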
    Last edited by dmittner; July 10th, 2013 at 04:49 PM.
#4
    No Profile Picture
    Lost in code
    Devshed Supreme Being (6500+ posts)

    Join Date
    Dec 2004
    Posts
    8,301
    Rep Power
    7170
    The closest you could come would be to serialize the objects at the end of your script and then unserialize them at the beginning. You can store the serialized objects somewhere like a database, shared memory, a file, the session, etc.
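
    A minimal sketch of the session flavour of that, assuming the objects survive serialization (the names are invented):

    Code:
    	<?php
    	session_start();

    	function build_environment() {
    		// stand-in for constructing Users, Security, Routing, etc.
    		return new stdClass();
    	}

    	// Restore the cached environment when we have one; otherwise do the
    	// full bootstrap once and cache the serialized copy for next time.
    	if (isset($_SESSION['env_cache'])) {
    		$env = unserialize($_SESSION['env_cache']);
    	} else {
    		$env = build_environment();
    		$_SESSION['env_cache'] = serialize($env);
    	}
    	// Caveat: resource handles (e.g. database connections) do not survive
    	// serialize(); such objects need __wakeup() logic or a fresh rebuild.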

    The C-level data structures that PHP uses internally to track objects always get torn down at the end of the script; there is no way to preserve them from one execution of a script to the next. Doing so would require a fundamental change to the way PHP works internally.

    The types of objects you're talking about do not sound like anything you would want to run through the client side (a cookie or JavaScript).
    PHP FAQ

    Originally Posted by Spad
    Ah USB, the only rectangular connector where you have to make 3 attempts before you get it the right way around
#5
    Mad Scientist
    Devshed Expert (3500 - 3999 posts)

    Join Date
    Oct 2007
    Location
    North Yorkshire, UK
    Posts
    3,660
    Rep Power
    4123
    The primary reason for this is that HTTP is stateless and consists of a single request and a single response.

    Every AJAX request is just another HTTP request, and once each request finishes, that's it.

    HTTP and PHP are not suited to two-way communication in this way.

    If you think each request is too expensive, and that it's the initialisation that's expensive, then try to do as much in a single request as possible. For example, can you push stuff into local storage? A cookie? Or a JSON object in a script tag on the page?
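
    The script-tag option, for instance, could be primed during the initial full page load; here's a sketch with invented names:

    Code:
    	<?php
    	// Emitted once while rendering the full page, so later AJAX code can
    	// read this data from the page instead of re-requesting it.
    	$bootstrap = array(
    		'userId' => 42,                               // illustrative values
    		'routes' => array('/api/topics', '/api/posts'),
    	);
    	?>
    	<script>
    	// Client code reads window.APP_BOOTSTRAP rather than asking the server.
    	window.APP_BOOTSTRAP = <?php echo json_encode($bootstrap); ?>;
    	</script>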

    Other things to consider are optimising your code so that initialisation is not expensive, making each request less expensive, or using a subset of packages just for your AJAX requests.

    The latter is what I do - I have a "packages" section which sorts out the HTML/templates/etc. and an "api" section which manages data; the api section can be accessed from packages or, via a separate URI, as a RESTful web service.
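
    In front-controller terms, that split might be sketched like this (the paths and file names are made up):

    Code:
    	<?php
    	// index.php - route API (AJAX) traffic through a lighter bootstrap than
    	// full page requests, so data calls skip the template machinery.
    	if (strpos($_SERVER['REQUEST_URI'], '/api/') === 0) {
    		require 'bootstrap_api.php';   // autoloader, security, routing only
    	} else {
    		require 'bootstrap_full.php';  // everything, including templates
    	}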

    I then have APC installed for opcode caching. Combined with the nginx web server, a request that does not use a database takes about 7 milliseconds and uses 0.5 - 1 MB RAM on the server (Firebug reports ~60 ms), whereas a database request takes about 30 milliseconds and uses 1 - 1.5 MB RAM (Firebug reports ~120 ms).

    My home-brew framework uses between 30 and 60 classes depending on the request (lazy loading/autoloading).

    I have chosen to minimise request and data sizes to produce small, fast, granular requests on a web server capable of handling huge numbers of concurrent connections, as the application I am building will be entirely AJAX-based, with the bulk of the layout work managed by JavaScript (hence that workload is distributed to the client side).

    A proposed workaround:

    This idea has just popped into my head, and it may be a bit silly, but it half gets around your initial request:

    1 - the page loads in the first HTTP request
    2 - open an AJAX HTTP connection, but do not wait for ready state 4; just continue to watch for content (see comet)
    3 - the initial AJAX request starts a comet process on the server, which loops for a long time, outputting content as necessary
    4 - subsequent AJAX requests go to a light script that just logs what the request is
    5 - the still-open connection (2 & 3) reads from the log written in (4), processes, and responds

    This sounds crazy to me as there's a lot going on, and a lot could fall apart.
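
    Hedging heavily, steps 3-5 might be roughed out like this (the file names are invented, and a real comet setup also needs locking, timeouts, and a server configured for long-lived connections):

    Code:
    	<?php
    	// comet.php - toy long-running responder for the idea above. It tails the
    	// log file that the light script from step 4 appends to, and pushes output
    	// down the still-open connection whenever a new request is logged.
    	set_time_limit(0);              // let the loop outlive the default limit
    	$log = fopen('/tmp/ajax_requests.log', 'r');
    	fseek($log, 0, SEEK_END);       // only react to requests logged from now on

    	while (!connection_aborted()) {
    		$line = fgets($log);
    		if ($line === false) {      // nothing new yet
    			sleep(1);
    			fseek($log, 0, SEEK_CUR); // clear EOF so later appends are seen
    			continue;
    		}
    		echo json_encode(array('handled' => trim($line))), "\n";
    		flush();                    // may also need ob_flush() if output buffering is on
    	}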

    Sencha's ExtJS framework has something called "Direct" which batches up multiple AJAX requests and sends them at once, but your controller(s) then need to be able to handle this. They claim no adverse effects for the end user.
    I said I didn't like ORM!!! <?php $this->model->update($this->request->resources[0])->set($this->request->getData())->getData('count'); ?>

    PDO vs mysql_* functions: Find a Migration Guide Here

    [ Xeneco - T'interweb Development ] - [ Are you a Help Vampire? ] - [ Read The manual! ] - [ W3 methods - GET, POST, etc ] - [ Web Design Hell ]
