[fpc-pascal] TSQLQuery and buffering.

noreply at z505.com
Wed Apr 12 15:21:45 CEST 2017


On 2017-03-24 14:42, Gary Doades wrote:
> Hi everyone,
> 
> Firstly, I realise this is an old subject, but I still can't find any
> easy answer.
> 
> Really, the question is simple: Is there ANY way of TSQLQuery NOT
> reading the entire result set into memory?

What about making multiple queries and using SQL's LIMIT?

Should you really be writing SQL queries that return millions of 
results?
Could that be a questionable design choice?

For example, in many of my web programs I only retrieve 10-100 rows at 
a time by using SQL's LIMIT feature...

Example (MySQL syntax: skip the first 15 rows, return the next 10):
SELECT * FROM Orders LIMIT 15, 10

SQLDB likely has a similar setting/feature so that you don't have to 
write actual SQL code to achieve it.
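
For instance, a single LIMITed fetch through sqldb might look roughly 
like the sketch below. This is minimal and untested; the connector 
unit, the connection details and the Orders table are assumptions for 
illustration. (TSQLQuery also has a PacketRecords property, inherited 
from its TBufDataset ancestor, that controls how many rows are fetched 
per round trip; note that rows already fetched stay buffered, so it 
doesn't cap total memory by itself.)

program LimitSketch;
{$mode objfpc}{$H+}
uses
  sqldb, mysql57conn; { pick the connector unit matching your MySQL client libs }
var
  Conn: TMySQL57Connection;
  Trans: TSQLTransaction;
  Query: TSQLQuery;
begin
  Conn := TMySQL57Connection.Create(nil);
  Trans := TSQLTransaction.Create(nil);
  Query := TSQLQuery.Create(nil);
  try
    Conn.HostName := 'localhost';  { assumed connection details }
    Conn.DatabaseName := 'mydb';
    Conn.UserName := 'user';
    Conn.Password := 'secret';
    Conn.Transaction := Trans;
    Conn.Open;
    Query.Database := Conn;
    Query.Transaction := Trans;
    { only rows 16..25 ever cross the wire }
    Query.SQL.Text := 'SELECT * FROM Orders LIMIT 15, 10';
    Query.Open;
    while not Query.EOF do
    begin
      { process one row here, e.g. Query.FieldByName('id').AsInteger }
      Query.Next;
    end;
    Query.Close;
  finally
    Query.Free;
    Trans.Free;
    Conn.Free;
  end;
end.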

And in your case you may really need to retrieve millions of results 
from the database; I don't know your use case. But generally, in my web 
programs where I have millions of rows, I use SQL to my advantage and 
its language features, such as LIMIT, instead of retrieving all of the 
results at once. Pulling everything back kind of defeats the purpose of 
an SQL database, since its job is to give you only the data you need. 
Are you using Pascal to do work that the database could already do for 
you?
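
For example, if the analysis boils down to counts, sums or averages, 
the database can compute those itself and hand back a single row 
instead of millions. A sketch, reusing the Query object from the 
snippet above; the column names are made up:

{ one row comes back instead of a million }
Query.SQL.Text := 'SELECT COUNT(*) AS n, AVG(total) AS avg_total FROM Orders';
Query.Open;
WriteLn('rows     : ', Query.FieldByName('n').AsLargeInt);
WriteLn('avg total: ', Query.FieldByName('avg_total').AsFloat:0:2);
Query.Close;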

I could be misunderstanding your situation, and again your use case 
could be different.

> 
> This is really killing me. I've got quite a lot of web code using
> sqldb and it works really well. The classes for connection,
> transaction, query etc. are perfect for general use and make it so
> easy to write database applications. I find it excellent that I only
> need to change the connection class in my code to use a different
> database.
> 
> For web code (mine included), fetching a few dozen, hundred or even a
> few thousand rows is no real problem. The sqldb classes are *VERY*
> fast and very easy to use.
> 
> The problem comes when I need to write some apps to either analyse or
> move a lot of data. Initially all I need to do is analyse some data by
> simply reading through rows, doing some calculations and other
> analysis. For even 1 million rows this is very fast (10 seconds using
> MySQL) so no major problems. However, it does use quite a lot of
> memory.
> 

If you need a temporary workaround, simply use SQL's LIMIT feature; 
then the data that comes back into memory on your end is only what you 
have limited the result set to. If you need to combine all of this data 
into one big data set for analysis, though, maybe SQL's LIMIT is the 
wrong tool for the job, I don't know.
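
If you do go that route, a chunked read could look something like the 
sketch below (same assumed setup as the earlier snippet; BatchSize and 
ProcessRow are hypothetical). Ordering by a unique key matters, 
otherwise pages can overlap or skip rows between queries. Because the 
query is closed after each batch, only one batch is buffered at a time.

{ needs SysUtils in the uses clause for Format }
procedure ReadInBatches(Query: TSQLQuery);
const
  BatchSize = 10000;        { assumed chunk size; tune to taste }
var
  Offset, RowsInBatch: Integer;
begin
  Offset := 0;
  repeat
    RowsInBatch := 0;
    { ORDER BY a unique key so successive pages line up }
    Query.SQL.Text := Format(
      'SELECT * FROM Orders ORDER BY id LIMIT %d OFFSET %d',
      [BatchSize, Offset]);
    Query.Open;
    while not Query.EOF do
    begin
      ProcessRow(Query);    { hypothetical per-row analysis routine }
      Inc(RowsInBatch);
      Query.Next;
    end;
    Query.Close;            { frees this batch's buffer }
    Inc(Offset, BatchSize);
  until RowsInBatch < BatchSize;
end;

One caveat: MySQL still walks the skipped rows for a large OFFSET, so 
for really big tables keyset paging (WHERE id > last_seen_id ORDER BY 
id LIMIT n) scales better.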

But it's a big warning sign if you have a program that retrieves 
millions of rows from a database when you only need to display 10 of 
them to the end user; in that case LIMIT is really useful. You, 
however, may be analyzing these millions of rows monolithically, so 
your case may be different.


