Andro12

Looking for Advice on Improving the Performance of Delphi Applications


Hello Everyone🤗,

 

I'm writing to ask for your professional opinion on a performance problem I'm having with a Delphi application I'm working on.

I'm working on a desktop application built with Delphi 10.4 Sydney, and I've found that it isn't performing up to standard, especially when it comes to data processing and presentation. The program is mainly used to manage large datasets and perform intricate computations, and although it does its job, I would like it to be faster and more responsive.

Here are some details about the program and the problems I'm having:

Data processing: The application handles datasets containing thousands of records, which I manipulate in memory using TClientDataSet. Operations on large datasets, such as sorting and filtering, are taking longer than I would expect.

 

Rendering performance: Several data grids and charts in the user interface (UI) must update dynamically as users interact with the program. The refresh rates are slower than intended, which hurts the whole user experience.

Resource utilisation: During intensive operations, I've seen the application's memory and CPU utilisation surge dramatically. This has made me suspect memory leaks and inefficient code paths.

 

I also followed this 👉 https://stackoverflow.com/questions/1115421/how-to-increase-the-startup-speed-of-the-delphi-app

I'm seeking advice on the best ways to optimise Delphi applications in these areas.

 

In particular, I would value suggestions regarding:

 

  • Techniques for speeding up dataset operations in Delphi.
  • Best practices for improving the rendering of charts and data grids.
  • Tools or techniques for profiling Delphi applications and pinpointing bottlenecks.

 

Thank you 🙏 in advance.


It's very difficult to advise without seeing how you've built the application.

If you don't put active SQL queries on data modules and don't auto-create all your forms, startup should be very fast.
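For illustration, a minimal sketch of creating a secondary form on demand instead of auto-creating it at startup (TReportForm and the button handler are hypothetical names):

// Remove the form from the auto-create list (Project > Options > Forms)
// and create it only when it is first needed.
procedure TMainForm.btnReportClick(Sender: TObject);
var
  Frm: TReportForm;
begin
  Frm := TReportForm.Create(Self);
  try
    Frm.ShowModal;  // the form exists only for the duration of its use
  finally
    Frm.Free;
  end;
end;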

Which database or data storage do you use? 

What is the speed and memory of your desktop machines and database server?

What about the connection speed to your database server?

How do you handle memory and calculations when loading data?

Are you looping through very large result sets?

Which components are lacking speed? Are they native or third-party?

Are you loading zillions of data records into grids?

 

Is it possible to create a small application with one example where you see the lack of speed?

"Large dataset activities like sorting and filtering are taking longer than expected for me."

 

I've lost track of a topic somewhere on this board about loading very large datasets as a competition. Does anyone remember it?

1 hour ago, Andro12 said:

Data processing: The application handles datasets containing thousands of records, which I manipulate in memory using TClientDataSet. Operations on large datasets, such as sorting and filtering, are taking longer than I would expect.

TClientDataSet is pretty fast. Faster than FireDAC.

You should reduce filtering and sorting if this is a bottleneck. Consider using CloneCursor if you need the same sorting or filtering many times, as in the sketch below.
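A minimal sketch of the CloneCursor approach, assuming an already-loaded TClientDataSet named cdsMain (all names here are hypothetical):

var
  cdsView: TClientDataSet;
begin
  cdsView := TClientDataSet.Create(nil);
  try
    // The clone shares cdsMain's in-memory data store instead of
    // copying or re-fetching the records.
    cdsView.CloneCursor(cdsMain, False);
    // Sorting and filtering the clone leaves cdsMain untouched.
    cdsView.IndexFieldNames := 'CustomerName';
    cdsView.Filter := 'Amount > 1000';
    cdsView.Filtered := True;
    // ... work with cdsView ...
  finally
    cdsView.Free;
  end;
end;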

 

But first: use a profiler!

 



As Cristian says, use a profiler!

I have been happy using the NexusQA profiler for many years. There are other profilers out there, some free, but my time is worth enough that spending a little money on a commercial product was well worth it to me. There are profiler discussions elsewhere on this forum. Avoid AQTime: it's a good company selling an archaic, inferior (and, in my experience, pretty much unsupported) Delphi profiler.

Also, evaluate your SQL. Look at the execution plan, consider stored procedures, etc.

But don't do anything until you know where your bottlenecks are, in code or SQL.

41 minutes ago, Tom F said:

Also, evaluate your SQL. Look at the execution plan, consider stored procedures, etc.

And look at any code in events that may be accessing the underlying data source or server. For example, there are events that fire as you scroll through the data shown in a DB grid. If your grid, dataset, or data source components have event handlers attached, those will slow down the grid.
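To illustrate, a sketch of an OnDataChange handler (dsOrders and UpdateTotalsPanel are hypothetical names); this event fires for every row change, including each scroll step in an attached grid:

procedure TMainForm.dsOrdersDataChange(Sender: TObject; Field: TField);
begin
  // Field = nil signals a record-level change, which includes scrolling,
  // so anything expensive here runs once per scrolled row.
  if Field = nil then
    UpdateTotalsPanel;
end;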


First advice: Measure. Without measurement, you are blind.

NexusQA has a method profiler that shows you which methods are called, how many times, and how much time is spent in each.

Once you have this data, you can identify bottlenecks.

Besides this, here is what usually increases performance a lot:

- sorting and filtering done on the server side rather than the client (see the sketch after this list)

- caching data

- avoiding disk access
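A rough sketch of the first point, assuming a SQL backend reached through a parameterized query component (the FireDAC TFDQuery and the table/field names are assumptions):

// Let the server do the heavy lifting instead of the client dataset.
FDQuery1.SQL.Text :=
  'SELECT Id, CustomerName, Amount FROM Orders ' +
  'WHERE Amount > :MinAmount ' +    // filter in the WHERE clause
  'ORDER BY CustomerName';          // sort on the server
FDQuery1.ParamByName('MinAmount').AsCurrency := 1000;
FDQuery1.Open;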

 


My 2c worth...

  • I'd consider using stored procs where possible, and doing the work server side as per the previous suggestions
  • charting-wise... load from the DB initially, and then send new data from a server/service (if possible). That's what I've had to do, using background threads; see the sketch below
  • does SQL Server suggest any indexes to add?
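A sketch of the background-thread approach from the second bullet (FetchChartPoints and UpdateChart are hypothetical helpers):

// Fetch off the UI thread, then hand the result back to the main
// thread, which is the only thread allowed to touch VCL controls.
TThread.CreateAnonymousThread(
  procedure
  var
    Points: TArray<Double>;
  begin
    Points := FetchChartPoints;  // slow DB/service call, off the UI thread
    TThread.Queue(nil,
      procedure
      begin
        UpdateChart(Points);     // runs on the main thread
      end);
  end).Start;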

5 hours ago, silvercoder79 said:

does SQL Server suggest any indexes to add?

Don't blindly accept index suggestions from SQL Server. They would help that one query, but possibly only that query. You also have to consider the write and maintenance cost each index adds to the database.


There are lots of ways to "slow your application". That's why it's important to have a code sample.

I will assume you're already using BeginUpdate/EndUpdate and/or DisableControls/EnableControls, as well as LockDrawing/UnLockDrawing; a sketch of the DisableControls pattern follows.
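For reference, the DisableControls/EnableControls pattern looks like this (cdsMain and the Amount field are hypothetical):

cdsMain.DisableControls;  // detach data-aware controls so they stop repainting
try
  cdsMain.First;
  while not cdsMain.Eof do
  begin
    Total := Total + cdsMain.FieldByName('Amount').AsCurrency;
    cdsMain.Next;
  end;
finally
  cdsMain.EnableControls;  // attached controls repaint once, here
end;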
 

That said, the best way to handle large datasets is to avoid working with them until the last minute, especially if there's any kind of visual binding, which will slow things down. Be sure to handle your data visualization carefully.
It should (almost always) be faster to order your records server side: construct your ORDER BY clause and feed the user only the amount of data required for his/her operation (see the paging sketch below).
The same goes for filtering. If you know the table you are querying, construct your WHERE clause carefully and make sure there's an index backing it up.
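A sketch of feeding the user only what's needed, assuming SQL Server 2012+ syntax and a hypothetical query component qryOrders:

qryOrders.SQL.Text :=
  'SELECT Id, CustomerName, Amount FROM Orders ' +
  'WHERE CustomerName LIKE :Prefix ' +             // indexable filter
  'ORDER BY CustomerName ' +
  'OFFSET :Offset ROWS FETCH NEXT 100 ROWS ONLY';  // one page at a time
qryOrders.ParamByName('Prefix').AsString := 'A%';
qryOrders.ParamByName('Offset').AsInteger := 0;    // first page
qryOrders.Open;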

Managing large amounts of data client side will have a considerable impact on memory. For example, if you need to traverse your dataset backwards and forwards, as most TDBGrid (and similar) controls require, you will use more memory than with a forward-only cursor.
If you need to order your dataset with several in-memory indexes, you will use more memory, and in-memory client datasets require more (or less) memory depending on the features you use, as in the sketch below.
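For example, each in-memory index on a TClientDataSet costs additional memory, so create only the ones you switch between often (field names assumed):

cdsMain.AddIndex('ByName', 'CustomerName', []);
cdsMain.AddIndex('ByAmountDesc', 'Amount', [], 'Amount');  // descending on Amount
cdsMain.IndexName := 'ByName';  // switch views without re-sorting the data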

 

And if things get really complex, you can consider using a local database.
 

 

