September 14th, 2000, 06:17 PM
I've picked up around here that a good way to handle searches is to take all of your text, split it into individual words, filter out the common ones, and then put each word into a database table along with an id that references the record it belonged to. This should be a lot faster than using LIKE clauses, right?
I know a brute-force way to do it, just like I said above: do an insert for every word. It seems like this could take a while for a large paragraph (a lot of inserts).
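For what it's worth, here's a minimal sketch of that indexing step (table and column names are made up, and sqlite3 stands in for whatever database you're on). The main point is to batch all the word rows into one call instead of issuing a separate insert per word:

```python
import re
import sqlite3

# Common words to skip when indexing; a real list would be longer.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is"}

def index_record(conn, record_id, text):
    """Split text into words, drop stop words, insert (word, id) pairs."""
    words = set(re.findall(r"[a-z0-9]+", text.lower()))  # unique words only
    words -= STOP_WORDS                                  # filter common ones
    # One batched insert is much cheaper than one statement per word.
    conn.executemany(
        "INSERT INTO search_index (word, record_id) VALUES (?, ?)",
        [(w, record_id) for w in sorted(words)],
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE search_index (word TEXT, record_id INTEGER)")
index_record(conn, 1, "The quick brown fox jumps over the lazy dog")
rows = conn.execute("SELECT word FROM search_index ORDER BY word").fetchall()
print([r[0] for r in rows])
```

Searching then becomes an indexed equality lookup on `word` instead of a full-table LIKE scan.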
Should I run it as a cron job, so that it doesn't slow down the page load in the browser?
Just looking for comments or suggestions on how people are doing this... Thanks.
September 14th, 2000, 06:26 PM
It depends. If you've got the luxury of bucket loads of traffic, then you definitely don't want to strain your poor server doing multiple inserts at busy times, but if you're getting minimal hits then it won't *really* take that long.
Assuming the former, you could move the data into a temp table and, as you say, cron a script that'll migrate all the temp data into your live tables.
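A rough sketch of that staging idea (names hypothetical, sqlite3 standing in for the real database): the page request only does cheap inserts into a temp table, and a script run from cron copies the staged rows into the live index during a quiet period.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Staging table written during page requests; live table used for searches.
conn.execute("CREATE TABLE search_index_tmp (word TEXT, record_id INTEGER)")
conn.execute("CREATE TABLE search_index (word TEXT, record_id INTEGER)")

# The page request just dumps rows into the staging table and returns.
conn.executemany(
    "INSERT INTO search_index_tmp (word, record_id) VALUES (?, ?)",
    [("fox", 1), ("dog", 1)],
)

def migrate(conn):
    """Run from cron: move staged rows into the live table, then clear."""
    conn.execute("INSERT INTO search_index SELECT * FROM search_index_tmp")
    conn.execute("DELETE FROM search_index_tmp")
    conn.commit()

migrate(conn)
print(conn.execute("SELECT COUNT(*) FROM search_index").fetchone()[0])
```

The trade-off is that new records don't show up in search results until the next cron run, which is usually fine for this kind of index.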