Maxidonkey

[Info & Feedback] DelphiGenAI v1.1.0 — Sharing Updates and an Educational Approach



Hi everyone,

I wanted to share the main updates in version 1.1.0 of DelphiGenAI (OpenAI wrapper for Delphi) here—mainly to offer what I can to the community, and maybe get some feedback if anyone feels like weighing in on my approach.

 

Main Updates
Compatibility with the Latest OpenAI Features (including Remote MCP & Code interpreter)
Ready-to-Use Templates

  •  Two archives (TestGenAI_VCL.zip and TestGenAI_FMX.zip) let you copy and paste any snippet from the documentation and test it out, with no complicated setup.

Variety of Code Snippets

  •  For each feature: synchronous, asynchronous (event/callback), and asynchronous with async/await.
  •  The idea is to allow everyone to compare the approaches and pick the one that fits them best.
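The difference between the three styles can be sketched as below. Note that the identifiers (`Client`, `Chat.Create`, `AsyncCreate`, `CreateAwait`) are illustrative placeholders, not DelphiGenAI's actual API — see the library's documentation for the real calls:

```pascal
// 1) Synchronous: blocks the calling thread until the answer arrives.
var Answer := Client.Chat.Create('Explain generics in Delphi');
Memo1.Lines.Add(Answer);

// 2) Asynchronous (event/callback): the UI stays responsive; the
//    callbacks fire when the request completes.
Client.Chat.AsyncCreate('Explain generics in Delphi',
  procedure(const Answer: string)
  begin
    Memo1.Lines.Add(Answer);
  end,
  procedure(const Error: string)
  begin
    ShowMessage(Error);
  end);

// 3) Async/await (promise) style: reads almost like sequential
//    code, but does not block the main thread.
Client.Chat.CreateAwait('Explain generics in Delphi')
  .&Then(
    procedure(const Answer: string)
    begin
      Memo1.Lines.Add(Answer);
    end);
```

The three variants do the same work; they differ only in who waits and where the result handling lives.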

Documentation That’s Directly Usable

  •  All markdown examples can be tested as-is within the supplied templates.

A Gradual Learning Path

  •  The aim: make it easier to learn about AI in Delphi, whatever your starting level, and to keep things as accessible as possible.

 

About This Approach

I’m not an experienced “Delphi old-timer” and I don’t know the community very well yet; that’s why I chose to offer several approaches (synchronous, asynchronous, and async/await) for each example.
But maybe that’s not really necessary—maybe some would prefer strictly asynchronous, or even just synchronous?
I’d be interested in your thoughts or experiences on this (no pressure—just curious, and trying to do what's most helpful).

 

Also Tried on Other Wrappers

I’ve used the same principle in a Delphi wrapper for DeepSeek. Depending on the feedback I get here, I’ll decide whether to take the same approach with a few other wrappers I’ve published on GitHub.

 

Thanks in advance for reading, and best wishes to all.

Edited by Maxidonkey
Hi. Two comments here:
 

1. Library description

Many libraries use big words to describe what they do, and in the end you don't get it unless you invest some time digging into the code or documentation.
So maybe you can "dumb down" the project description a bit. Something like: "if you use this library in your Delphi project, you will be able to xyz", where xyz is the most important feature of your library. It's like going to the store and seeing a can and instantly knowing there's tomato soup inside.

Then add the rest of the "complicated" description 🙂 

 

PS: I read the documentation and now I think I get it. But it took a while.

 

2. Asynchronous

In general, when I use the AI, I ask and then wait for the answer.
Is there something useful you can do in asynchronous mode while waiting for the answer from the AI? Maybe start a second request?
If not, then there is no point in using asynchronous mode.

 

Great library!
Thanks for sharing.


Thank you very much for your detailed feedback and suggestions.

 

  • Simplified description of the wrapper:

 

I understand your point about the clarity of the project description. It’s not easy to explain simply, especially since my main background is in mathematics, and both coding and documentation are not my core activities. The DelphiGenAI project provides a “wrapper” to simplify working with OpenAI’s APIs in Delphi—in other words, it lets you easily integrate features like ChatGPT or DALL-E into your Delphi projects without having to deal with the technical complexity of API calls yourself.
I’m definitely open to any ideas or help that could make this explanation even clearer or more engaging for the community.

 

  • Asynchronous: Basic and advanced use cases 

 

Regarding the asynchronous aspect, there are two main uses:

– The “simple” asynchronous mode is for launching a single task that doesn’t require immediate interaction or feedback from the user. This avoids blocking the application, but it’s somewhat limited in scope.
– For more advanced scenarios, the promises also provided in the library enable true orchestration: chaining multiple requests or operations smoothly and without blocking, as illustrated in the File2knowledge project, where several operations are chained together during file registration for RAG.
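To give an idea of what such orchestration looks like, here is a hedged sketch of a promise chain in the spirit of the File2knowledge registration flow. The names (`TPromise`, `FileStore.UploadAsync`, `VectorStore.AttachAsync`, `&Catch`) are placeholders for illustration only; the linked units below show the real code:

```pascal
// Each step starts only after the previous one has resolved,
// yet the UI thread is never blocked.
FileStore.UploadAsync('notes.pdf')
  .&Then(
    function(const FileId: string): TPromise<string>
    begin
      // Chain a second request using the first one's result.
      Result := VectorStore.AttachAsync(FileId);
    end)
  .&Then(
    procedure(const StoreId: string)
    begin
      StatusBar1.SimpleText := 'File indexed for RAG: ' + StoreId;
    end)
  .&Catch(
    procedure(E: Exception)
    begin
      // A single handler covers failures anywhere in the chain.
      ShowMessage('Registration failed: ' + E.Message);
    end);
```

This also answers the "can I start a second request while waiting?" question: with promises, follow-up requests can be sequenced, or fired in parallel, without tying up the main thread.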

 

Refer to these units: https://github.com/MaxiDonkey/file2knowledge/blob/main/providers/Provider.OpenAI.FileStore.pas

and https://github.com/MaxiDonkey/file2knowledge/blob/main/providers/Provider.OpenAI.VectorStore.pas

 

Thanks again for your feedback. Any help with wording or contributions to the clarity of the project are most welcome.

Edited by Maxidonkey
