August 25th, 2012, 09:32 AM
Managing big data properly with libpq and C++
I am Alin, a programmer using C++, libpq, PostgreSQL, and PL/SQL.
I have only two months of experience with PostgreSQL, but I am still learning.
I have been working on a GIS project for a company.
The database is big, with many tables; some are almost 4 GB. The tables contain data from all of Europe, which is why they are so large. My problem is how to manage these big tables properly.
I tuned the PostgreSQL server and indexed some tables,
but I am disappointed by the low performance;
the queries are obviously still slow.
I have a computer with a RAID HDD array and 8 GB of RAM.
I tried to read a table using a cursor and the FETCH command, but
it does not work well (it takes a very long time).
I would like some advice regarding this issue.
The data provider does not want to split the data per country, for example, which would make my life easier. I do not know why, but to me it seems a bad idea to keep data from all of Europe in single big tables.
Please tell me about some techniques to speed up processing this data.
August 26th, 2012, 05:27 PM
Why are you asking a database question in a C/C++ forum? Is it because you program in C/C++? If so, clearly you don't know anything about databases, because the language you use to interface with the database is totally irrelevant to the performance of the database.
August 27th, 2012, 01:55 AM
Yes, I use C++ and libpq.
Originally Posted by mitakeet
I did not know which section was more suitable for my thread.
I was in a hurry.
August 27th, 2012, 04:40 AM
How about a forum devoted to PostgreSQL?