January 7th, 2011, 02:20 PM
I would like to discuss the performance of our Firebird database, accessed from a Delphi application. Every time I open the program, it shows an overview of all the orders in our customer table. Very useful when you want to check what happened in the last day or hours.
So my question:
query A: Select * from table;
query B: select * from table where column like 'searchvalue' limit 0,10;
Assume a large table (100,000 records).
If 10-50 people on a database perform queries like A, I easily run into performance problems.
- Can I solve/optimise this by using queries like B and asking for the search value as input?
- How can I provide a short overview of a large table (e.g. the last 30 records) that updates every 10 seconds without hurting performance?
- Which tools would you use? Which language?
- What will happen to performance when you make it web-based (leaving internet speed aside)?
- Do you know other ways to optimise the search / standard overview?
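Note: in Firebird the LIMIT clause from query B would actually be written with FIRST/SKIP (or the ROWS clause in 2.0+). A sketch, with placeholder table and column names:

```sql
/* first page of 10 matches; ORDERS and CUSTOMER_NAME are placeholders */
SELECT FIRST 10 SKIP 0 *
FROM ORDERS
WHERE CUSTOMER_NAME LIKE 'searchvalue%';
```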
I think our programmer made a serious design flaw. Who agrees?
Thanks for your attention, your reply will be appreciated.
And sorry for my bad English.
January 7th, 2011, 04:24 PM
Way too many variables in your situation to "know" the solution.
1. SELECT * FROM ... is never good if you do not want all the records.
If you only want the latest information, try adding a TIMESTAMP field (if you do not have one already), index it, and change the query to something like:
SELECT ONLY_THE_FIELDS_YOU_NEED
FROM YOUR_TABLE
WHERE TIMESTAMP_FIELD >= CURRENT_TIMESTAMP - 1 /* last 24 hours */
ORDER BY TIMESTAMP_FIELD DESC
(Written this way round, rather than comparing CURRENT_TIMESTAMP - TIMESTAMP_FIELD, the condition can use the index on TIMESTAMP_FIELD.)
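For example, as concrete Firebird SQL. This is only a sketch: ORDERS, ORDER_TS and the selected columns are placeholder names, not your actual schema.

```sql
/* index the timestamp column, descending so a
   "newest first" ORDER BY can walk it directly */
CREATE DESCENDING INDEX IDX_ORDERS_TS ON ORDERS (ORDER_TS);

/* last 24 hours, newest first, capped at 30 rows;
   CURRENT_TIMESTAMP - 1 is "now minus one day"
   in Firebird date arithmetic */
SELECT FIRST 30 ORDER_ID, CUSTOMER_ID, ORDER_TS
FROM ORDERS
WHERE ORDER_TS >= CURRENT_TIMESTAMP - 1
ORDER BY ORDER_TS DESC;
```

A query like this, re-run every 10 seconds, is far cheaper than SELECT * over the whole table, which should also cover your "last 30 records" refresh question.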
Other than that, you need to make sure that the bottleneck is not the server.
Can it handle 50 concurrent connections, how many processors do you have etc. etc.
What about your internal network?
If you want to add a WHERE clause on one of the columns, then make sure the column is indexed, that it has high cardinality, and that the query can actually use the index.
WHERE LAST_NAME LIKE 'Will%'
can use the index, BUT
WHERE LAST_NAME LIKE '%son'
cannot, because a leading wildcard forces a full table scan.
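You can verify whether a query uses an index by asking isql to display the plan (SET PLAN). A sketch with placeholder names:

```sql
SET PLAN ON;
SELECT LAST_NAME FROM CUSTOMERS WHERE LAST_NAME LIKE 'Will%';
/* the plan should name the index on LAST_NAME; the '%son'
   variant is expected to show NATURAL, i.e. a full scan */
```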
HTH - Clive
January 7th, 2011, 05:17 PM
Thank you Clivew.
About my internal network:
everything is 1000 Mbit (network cards, cables, switches).
About my server:
Quad core, 6 GB RAM, Debian 5, RAID 5 with 15,000 rpm disks, 1000 Mbit/s NIC.
The server varies between 10 and 60% CPU when everybody is working, but never touches the max.
I tried, but could not find a clear reference on the maximum number of concurrent connections.
How do I check this? Is it an SQL thing, a Linux config, or a Firebird parameter?
Does it matter that I run 2.0.4 SuperServer in this case?
Please help, thanks.
January 7th, 2011, 05:36 PM
The hardware and even the DBA side of things is not my area of expertise.
I have to research it myself when I run into an issue. I just know they can be factors.
My personal expertise (I hope!) is in SQL construction and optimization, as in the directions I suggested you investigate for the SQL itself.