#1
  1. Banned (not really)
    Devshed Supreme Being (6500+ posts)

    Join Date
    Dec 1999
    Location
    Brussels, Belgium
    Posts
    14,646
    Rep Power
    4492
    I've picked up around here that a good way to handle searches is to take all of your text, split it up by word, filter out the common ones, and then put each word into a database along with an id that references the record it belongs to. This should be a lot faster than using LIKE clauses, right?
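    The idea above (an inverted index) can be sketched like this. This is a minimal Python/SQLite illustration, not the poster's actual setup; the table names, the stopword list, and the tokenizing regex are all my own assumptions:

    ```python
    import re
    import sqlite3

    # Illustrative stopword list; a real one would be longer.
    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it"}

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, body TEXT)")
    conn.execute("CREATE TABLE word_index (word TEXT, article_id INTEGER)")
    conn.execute("CREATE INDEX idx_word ON word_index (word)")

    def index_article(article_id, body):
        # Split on non-letters, lowercase, drop stopwords and duplicates.
        words = {w for w in re.split(r"[^a-z]+", body.lower())
                 if w and w not in STOPWORDS}
        conn.executemany(
            "INSERT INTO word_index (word, article_id) VALUES (?, ?)",
            [(w, article_id) for w in words],
        )

    conn.execute("INSERT INTO articles VALUES (1, 'The quick brown fox')")
    index_article(1, "The quick brown fox")

    # Search is now an exact, indexed lookup instead of LIKE '%fox%'.
    hits = conn.execute(
        "SELECT article_id FROM word_index WHERE word = ?", ("fox",)
    ).fetchall()
    print(hits)  # [(1,)]
    ```

    The lookup hits the index on `word` directly, which is why it beats a LIKE scan over the full text.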

    I know a brute-force way to do it, just like I said above: do an insert for every word. It seems like this could take a while for a large paragraph (a lot of inserts).

    Should I run a cron job, so that it doesn't slow down the page for whoever submits the text?

    Just looking for comments or suggestions on how people are doing this. Thanks.

    ---John Holmes
    ---www.SepodatiCreations.com
  2. #2
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Aug 2000
    Location
    London/UK
    Posts
    91
    Rep Power
    15
    Hi

    It depends. If you've got the luxury of bucketloads of traffic, then you definitely don't want to strain your poor server with multiple inserts at busy times; but if you're getting minimal hits, then it won't *really* take that long.

    Assuming the former, you could move the data into a tmp table and, as you say, cron a script that'll migrate all the temp data into your live tables.
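    A minimal sketch of that staging pattern, again using SQLite to stand in for the live database (table names and the single-row example are illustrative assumptions):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE word_index_tmp (word TEXT, article_id INTEGER);
        CREATE TABLE word_index     (word TEXT, article_id INTEGER);
    """)

    # At request time: one cheap insert into the staging table,
    # so the user's page isn't held up by the full indexing work.
    conn.execute("INSERT INTO word_index_tmp VALUES ('fox', 1)")

    def migrate():
        # Run from cron at a quiet time: move staged rows into the
        # live index, then clear the staging table, in one transaction.
        with conn:
            conn.execute(
                "INSERT INTO word_index "
                "SELECT word, article_id FROM word_index_tmp"
            )
            conn.execute("DELETE FROM word_index_tmp")

    migrate()
    print(conn.execute("SELECT COUNT(*) FROM word_index").fetchone()[0])  # 1
    ```

    Wrapping the migrate-and-clear in one transaction means a crash mid-run can't lose staged rows.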

    regs

    Bealers


    ------------------
    http://back-end.org

