    #1
  1. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17

    PHP speed => solution


    I'm following up on my previous issue; it deserved a new topic.


    I have about 60,000 rows in a MySQL table (about 30 columns),
    and I need to fetch all of them one by one.

    I noticed it takes about 2 seconds to fetch all the rows, which is a killer.

    Do you have any ideas how I could load all the data from one MySQL table into an array?
    I need every column of every row because I have to process them (a lot of ifs...).

    Before you ask about the query or indexes: no, there is no WHERE clause; I simply need all the data in one huge array. The question is how to get it into that array as fast as possible, and it has to be under 0.1 s. Is this a hardware question? I could buy the latest dual quad-core extreme, but I don't think that would fix the problem.

    I need a solution that keeps the data from the MySQL table up to date.


    Thanks very much for your input.
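
    For reference, a minimal sketch of what "fetch everything into one array" looks like with PDO (connection details and the table name `records` are assumptions for illustration, not from the post):

    php Code:
    <?php
    // Sketch: load an entire table into one PHP array with PDO.
    // Holding ~60,000 rows x 30 columns in memory needs a generous memory_limit.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $start = microtime(true);

    // fetchAll() builds the whole array in one call; looping fetch() row by row
    // returns the same data but adds PHP-level overhead per row.
    $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);

    printf("%d rows fetched in %.3f s\n", count($rows), microtime(true) - $start);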
  2. #2
  3. Headless Moderator
    Devshed Supreme Being (6500+ posts)

    Join Date
    Mar 2007
    Location
    Washington, USA
    Posts
    16,939
    Rep Power
    9647
    It seems your problem is not listening to what we're saying.

    You have 60,000 results with (apparently) 30 columns. That's 1.8 million cells of data. If each cell averages 5 bytes, that's about 8.6MB to process - and 5 bytes is undoubtedly an understatement.

    It won't go faster just because you want it to. Cut down on the number of rows, on the number of columns, and/or make sure you have indexes on the right fields if applicable. Beyond that there's not much you can do.


    Thread reported as a duplicate of this. There's no need to create another thread to answer the same question.

    Comments on this post

    • lnxgeek agrees
    Last edited by requinix; January 23rd, 2010 at 04:00 PM.
  4. #3
  5. No Profile Picture
    I AM A GOLDEN GOD
    Devshed God 1st Plane (5500 - 5999 posts)

    Join Date
    Apr 2003
    Location
    Camarillo, California
    Posts
    5,929
    Rep Power
    1170
    Aside from what requinix said, if you really want some help with your SQL statement, post your query, tables and keys in the MySQL forum. If we decide it's relevant to be in PHP, we'll move it back.
    "Seriously, we're not a search engine, we're actual people." ~ ManiacDan

    BookMooch.com : Give books away. Get books you want.
  6. #4
  7. Contributing User
    Devshed Supreme Being (6500+ posts)

    Join Date
    Jan 2005
    Location
    Internet
    Posts
    7,625
    Rep Power
    6089
    Can you cache the results for quicker pulling next time?
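
    A rough illustration of that idea, assuming a simple file-based cache (the path, TTL, and table name are made up):

    php Code:
    <?php
    // Sketch of a file cache: reuse the previous result set for $ttl seconds
    // instead of re-querying MySQL on every page load.
    $cacheFile = '/tmp/records.cache';   // assumed path
    $ttl       = 300;                    // assumed lifetime in seconds

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        $rows = unserialize(file_get_contents($cacheFile));
    } else {
        $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
        $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);
        file_put_contents($cacheFile, serialize($rows), LOCK_EX);
    }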
    Chat Server Project & Tutorial | WiFi-remote-control sailboat (building) | Joke Thread
    “Rational thinkers deplore the excesses of democracy; it abuses the individual and elevates the mob. The death of Socrates was its finest fruit.”
    Use XXX in a comment to flag something that is bogus but works. Use FIXME to flag something that is bogus and broken. Use TODO to leave yourself reminders. Calling a program finished before all these points are checked off is lazy.
    -Partial Credit: Sun

    If I ask you to redescribe your problem, it's because when you describe issues in detail, you often get a *click* and you suddenly know the solutions.
    Ches Koblents
  8. #5
  9. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Jan 2010
    Location
    London
    Posts
    42
    Rep Power
    11
    And even if you did manage to download the data quickly... rendering it in a browser will be a nightmare - it will bring the DOM to its knees.
  10. #6
  11. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17
    This is definitely not a question for MySQL
    if MySQL has no chance of delivering this data any faster.

    I have to go into more detail about what I need to do.


    I have 60k classifieds in the same category.
    There are brands, models, types, and about 20 more columns that the search can filter on. I definitely can't remove any column.


    After any kind of search I need to know, across the whole table and not just the current results,
    how many rows there are with brand 'bata', or with brand 'bata' and model '2010', and so on.
    The problem is that a single search produces about 1,000 or more such requests to MySQL.

    I'm thinking about a MySQL MEMORY table or the shared memory functions as a way to
    speed up fetching these counters.

    If you run a different search, I need all of these counters again for every brand, model, and so on.
    That's why I want the whole table in an array and to compute the counters from it.


    Maybe you'll tell me to cache the counters per search,
    but the problem is that I get about 50,000 different searches per hour, so a cache is not a solution.
    Last edited by miob; January 24th, 2010 at 03:46 AM.
  12. #7
  13. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17
    Originally Posted by lnxgeek
    Aside from what requinix said, if you really want some help with your SQL statement, post your query, tables and keys in the MySQL forum. If we decide it's relevant to be in PHP, we'll move it back.
    Yes, but the first request is the killer.

    I'm thinking about pre-loading the data into an in-memory array or something like that.
  14. #8
  15. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Jan 2010
    Location
    Leuven (Belgium)
    Posts
    168
    Rep Power
    273
    I'm at a loss as to why you would need all the data in an array... what's the use of the database then, anyway?

    Just get more servers and do some decent load balancing to make sure they don't crash under all the queries.
  16. #9
  17. Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Nov 2009
    Location
    Hobart, Tasmania
    Posts
    475
    Rep Power
    291
    Miob ... with all due respect, you've either confused yourself into a corner, or aren't articulating your problem adequately.

    It sounds like you need a full text search index across these "20" mystery fields and be done with it.

    As for "counters" ... there is absolutely NO WAY PHP is going to perform a count by looping over a result set faster than using SELECT COUNT(`foo`) .. in SQL. If you index the countable fields, you should be able to run dozens of derived counter queries with no significant performance hit. I'd be interested in your justification to the claim you need to run a thousand such queries. Why?

    You can set up all the servers you can muster, implement uber-caching using liquid lithium clouds on Jupiter ... that won't fix a bad design.
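
    A sketch of that indexed COUNT(*) approach (the table and column names follow the earlier posts; the index name and connection details are made up):

    php Code:
    <?php
    // Sketch: index the countable columns once, then let MySQL do the counting.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    // One-off: a composite index over the columns used in the counter's WHERE
    // clause lets MySQL resolve COUNT(*) without scanning the whole table.
    $pdo->exec('ALTER TABLE records ADD INDEX idx_brand_model (brand, model)');

    $stmt = $pdo->prepare('SELECT COUNT(*) FROM records WHERE brand = ? AND model = ?');
    $stmt->execute(array('bata', '2010'));
    echo $stmt->fetchColumn();   // number of matching rows, counted inside MySQL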
    Latery Matey
  18. #10
  19. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Jan 2010
    Posts
    11
    Rep Power
    0

    2 possible solutions


    I see 2 solutions:
    1. Use C to process the data.
    2. Dump the data to a file as a PHP array (yes, beginning with <? and ending with ?>, containing lines like $arr[] = array("field" => "value", ...)), regenerated by a cron job every 2-3-10-20 minutes.

    Usually such problems are caused by a weak data model, but that discussion belongs in an SQL thread.
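
    A minimal sketch of option 2 (the file paths, schedule, and table name are assumptions): a cron job writes the array file, and pages simply include it.

    php Code:
    <?php
    // dump_records.php - run from cron, e.g. */10 * * * * php dump_records.php
    // Writes the whole table into a PHP file that pages can simply include.
    $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
    $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);

    // var_export() emits valid PHP source for the array.
    $php = "<?php\n\$arr = " . var_export($rows, true) . ";\n";

    // Write to a temp file, then rename, so a page never includes a half-written file.
    $dst = '/var/www/cache/records_array.php';   // assumed path
    $tmp = $dst . '.tmp';
    file_put_contents($tmp, $php, LOCK_EX);
    rename($tmp, $dst);

    // A page would then do:  include '/var/www/cache/records_array.php';  and use $arr.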
  20. #11
  21. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17
    To l8rm8e: are you sure MySQL will always be faster than computing the counters in PHP? (I really wasn't, but I think I'm slowly coming around to agreeing with you.)

    You said dozens, but I know there are 1,300 requests to MySQL per page load; that's the issue.

    That's why I went the array route.

    This method was the first thing I thought about: "SELECT COUNT(`foo`) .. in SQL."


    I tried building a special indexed MEMORY table with only the columns needed for the counters. I got one counter in 0.002 sec; multiply that by 1,000 and you get 2 sec, which is quite good but not perfect, and maybe a killer for MySQL.

    I still think there must be a better option...



    I'll try to explain the "20" mystery fields:

    A sample query for a counter would be

    SELECT count(*) FROM `records` WHERE brand='skoda' and model='octavia' and modelyear>2000 and drivenkm<1000000 and price <10000 and fuel=1 and transmissiontype=0

    The request took 0.0252 sec.

    I need about 1,300 such queries.

    The craziest thing is that I can't cache these counters because, as I mentioned above, if you change just one thing in the search, you have to re-run all 1,300 queries to get correct counters for every field.
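
    For what it's worth, the "special indexed MEMORY table" experiment mentioned above might look roughly like this (the column list, types, and index choices are assumptions based on the sample query):

    php Code:
    <?php
    // Sketch: copy only the filterable columns into an indexed MEMORY table,
    // then run the counter queries against it instead of the full table.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    $pdo->exec('DROP TABLE IF EXISTS records_mem');
    $pdo->exec(
        'CREATE TABLE records_mem (
            id INT NOT NULL,
            brand VARCHAR(32), model VARCHAR(32), modelyear SMALLINT,
            drivenkm INT, price INT, fuel TINYINT, transmissiontype TINYINT,
            INDEX idx_brand_model (brand, model),
            INDEX idx_year_price (modelyear, price)
        ) ENGINE=MEMORY'
    );

    // Refresh from the real table (e.g. from cron); MEMORY tables are lost on restart.
    $pdo->exec('INSERT INTO records_mem SELECT id, brand, model, modelyear,
                drivenkm, price, fuel, transmissiontype FROM records');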
    Last edited by miob; January 24th, 2010 at 08:39 AM.
  22. #12
  23. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17
    Originally Posted by qforever
    I see 2 solutions:
    1. Use C to process the data.
    2. Dump the data to a file as a PHP array (yes, beginning with <? and ending with ?>, containing lines like $arr[] = array("field" => "value", ...)), regenerated by a cron job every 2-3-10-20 minutes.

    Usually such problems are caused by a weak data model, but that discussion belongs in an SQL thread.

    #1. Hmm, OK, but then I still need the data in PHP...
    #2. This is what I'm thinking about: store all the data in one huge array and just use a cron job to refresh it.
    That's why I was also thinking about the shared memory (shm_*) PHP functions; that idea would be even better because the data stays in memory the whole time and I can fetch whatever I want, without having to include this huge array on every visitor's page load.
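
    A rough sketch of that shared-memory idea using PHP's System V shm_* functions (requires the sysvshm extension; the key file, segment size, and variable key are all assumptions):

    php Code:
    <?php
    // Sketch: keep the big array in a System V shared memory segment so every
    // request can attach to it instead of including a huge array file.
    $key  = ftok(__FILE__, 'r');            // derive an IPC key (assumed scheme)
    $size = 64 * 1024 * 1024;               // 64 MB segment - must fit the array
    $shm  = shm_attach($key, $size, 0644);

    define('RECORDS_VAR', 1);               // numeric key for the stored variable

    if (!shm_has_var($shm, RECORDS_VAR)) {
        // First request (or after a refresh removed the variable): load from MySQL.
        $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
        $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);
        shm_put_var($shm, RECORDS_VAR, $rows);   // serialized into the segment
    }

    $records = shm_get_var($shm, RECORDS_VAR);   // later requests just read this
    shm_detach($shm);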
  24. #13
  25. Contributing User
    Devshed Frequenter (2500 - 2999 posts)

    Join Date
    Mar 2008
    Location
    North Carolina
    Posts
    2,670
    Rep Power
    2679
    Why the hell do you have 1300 requests to MySQL for ONE page load?

    Originally Posted by miob
    you said dozens, but i know there are 1300 requests to mysql per one page load, that's the issue
    You really need to work on reducing the amount of queries made.
  26. #14
  27. No Profile Picture
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Sep 2002
    Posts
    39
    Rep Power
    17
    Originally Posted by simshaun
    Why the hell do you have 1300 requests to MySQL for ONE page load?



    You really need to work on reducing the amount of queries made.

    When I show you what is in that search, you'll say "OK, hm, good one" - just don't ask me how to do it any other way ;-)
  28. #15
  29. Mad Scientist
    Devshed Expert (3500 - 3999 posts)

    Join Date
    Oct 2007
    Location
    North Yorkshire, UK
    Posts
    3,659
    Rep Power
    4128
    Originally Posted by miob
    just sample query for counter should be

    SELECT count(*) FROM `records` WHERE brand='skoda' and model='octavia' and modelyear>2000 and drivenkm<1000000 and price <10000 and fuel=1 and transmissiontype=0

    request took 0.0252 sec.

    i need such queries about 1300
    So you want to display 1300 counts on one page, each count made up from a set of parameters like this?

    I think you need to go and ask in the SQL forum. I'm guessing you would be looking for something like this

    sql Code:
    SELECT
    	something,
    	COUNT(ID) AS qty
    FROM
    	(SELECT
    		ID,
    		(CASE
    			WHEN (- - -SOME_CONDITION 1- - -) THEN 'SOME_VALUE 1'
    			WHEN (- - -SOME_CONDITION 2- - -) THEN 'SOME_VALUE 2'
    			.....
    			WHEN (- - -SOME_CONDITION N- - -) THEN 'SOME_VALUE N'
     
    		ELSE
    			'Not Defined'
    		END) AS something
    	FROM
    		TABLE
    	) a
    GROUP BY
    	something
    ORDER BY
    	something
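
    A simpler variant of the same idea, applied to the `records` table from the sample query (the fixed part of the visitor's search is assumed): one GROUP BY per facet returns all per-brand counts in a single query instead of one COUNT(*) per brand.

    php Code:
    <?php
    // Sketch: one GROUP BY query per facet instead of one COUNT(*) per value.
    // The non-brand filters (the visitor's current search) are assumed here.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    $sql = "SELECT brand, COUNT(*) AS qty
            FROM records
            WHERE modelyear > 2000 AND price < 10000 AND fuel = 1
            GROUP BY brand
            ORDER BY brand";

    $countsPerBrand = array();
    foreach ($pdo->query($sql) as $row) {
        $countsPerBrand[$row['brand']] = (int) $row['qty'];
    }
    // Repeat per facet (model, fuel, ...): a handful of queries instead of ~1300.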
    Last edited by Northie; January 25th, 2010 at 01:32 AM.
    I said I didn't like ORM!!! <?php $this->model->update($this->request->resources[0])->set($this->request->getData())->getData('count'); ?>

    PDO vs mysql_* functions: Find a Migration Guide Here

    [ Xeneco - T'interweb Development ] - [ Are you a Help Vampire? ] - [ Read The manual! ] - [ W3 methods - GET, POST, etc ] - [ Web Design Hell ]