Everything posted by Maxidonkey

  1. Maxidonkey

    Coding using AI

    Hey, Delphi’s built-ins already cover everything the AI tried to re-code. Here’s the absolute minimum (25 lines, no more), tested on Delphi 12.1:

      uses
        System.Net.URLClient, System.Net.HttpClient, System.Net.HttpClientComponent,
        System.JSON;

      procedure CallService;
      var
        Http: TNetHTTPClient;
        Response: IHTTPResponse;
        Payload: TJSONObject;
      begin
        Http := TNetHTTPClient.Create(nil);
        try
          // auth ➜ replace XXX
          Http.CustomHeaders['Authorization'] := 'Bearer XXX';
          // timeouts, proxy… if needed
          Response := Http.Get('https://api.your-service.com/v1/endpoint');
          if Response.StatusCode = 200 then
          begin
            Payload := TJSONObject.ParseJSONValue(Response.ContentAsString) as TJSONObject;
            try
              // process your data here
              Writeln(Payload.ToJSON);
            finally
              Payload.Free;
            end;
          end
          else
            raise Exception.CreateFmt('HTTP %d – %s', [Response.StatusCode, Response.StatusText]);
        finally
          Http.Free;
        end;
      end;

    Auth: shown with a simple Bearer token; swap in OAuth2, HMAC, etc., as needed.

    POST / PUT: pass the body as a UTF-8 stream and set the Content-Type header, e.g. Http.Post(URL, TStringStream.Create(JSON, TEncoding.UTF8), nil, [TNameValuePair.Create('Content-Type', 'application/json')]); (a fuller, compilable sketch follows at the end of this post).

    Skip TRESTClient if you’re only making a couple of calls; it just wraps TNetHTTPClient and adds another mapping layer.

    Why the AI went off the rails:
      • API hallucination – it mixes old FireMonkey REST.* units with the newer System.Net.* ones.
      • Copy-pasta from outdated posts – many snippets were written for Delphi XE7–10 Seattle; since 10.4 the method signatures changed.
      • No compile-fix loop – the model never actually builds the code, it just invents.

    Hope that saves you from wading through 300 generated lines. When the AI starts free-styling, fall back to Delphi’s native API: shorter and, most important, it compiles. Ping me if you get stuck on auth or JSON parsing.
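    For reference, here is the fuller POST sketch mentioned above. It is a minimal sketch, not production code: it sticks to the stock System.Net units, and the URL, Bearer token and JSON body are placeholders you would replace with your own.

      uses
        System.SysUtils, System.Classes, System.Net.URLClient,
        System.Net.HttpClient, System.Net.HttpClientComponent;

      procedure PostExample;
      var
        Http: TNetHTTPClient;
        Body: TStringStream;
        Response: IHTTPResponse;
      begin
        Http := TNetHTTPClient.Create(nil);
        Body := TStringStream.Create('{"name":"value"}', TEncoding.UTF8);  // placeholder payload
        try
          Response := Http.Post('https://api.your-service.com/v1/endpoint', Body, nil,
            [TNameValuePair.Create('Content-Type', 'application/json'),
             TNameValuePair.Create('Authorization', 'Bearer XXX')]);       // replace XXX
          if Response.StatusCode <> 200 then
            raise Exception.CreateFmt('HTTP %d – %s', [Response.StatusCode, Response.StatusText]);
          Writeln(Response.ContentAsString);
        finally
          Body.Free;
          Http.Free;
        end;
      end;

    The same call works for PUT by swapping Post for Put; the header parameter is just a TNetHeaders (an array of TNameValuePair), so anything else you need goes in there too.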
  2. Hello all, I recently released File2knowledge on GitHub: a true playground for Delphi developers who want to explore OpenAI via the v1/responses endpoint, perfect for agentic approaches. This project is primarily educational, and here’s why it deserves your attention:

      • Web/Edge UI ready to go: dive in within seconds, no hassle.
      • Session history: replay every experiment, debug model logic, and refine your prompts step by step.
      • Native async: everything’s built on Promises for clean, non-blocking workflows (see the sketch at the end of this post).
      • Mockable architecture: with IoC/DI you can swap out any internal service, simplify your tests, evolve the project, or even switch to FMX if you feel like it.

     Built-in Assistant
     File2knowledge goes beyond a simple wrapper: it uses file_search to dig through your source files and the docs of existing GenAI wrappers on GitHub. In practice, you can:

      • Understand a wrapper’s design in detail.
      • Integrate the code directly into your own projects.
      • Customize or fix the base files using vectorized versions of your own technical resources.

     Turnkey Client
     Delivered in VCL so you can jump in right away, with a fully modifiable foundation, and an FMX version is coming soon to cover even more platforms. Whenever OpenAI rolls out a new tool (with v1/responses as THE standard for the agentic revolution), I’ll integrate it ASAP so you can test it on the fly.

     This tool is designed as a hands-on educational lab, but it’s up to you to shape it however you like. I’m all ears for your feedback and suggestions!

     GitHub: https://github.com/MaxiDonkey/file2knowledge
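     To make the “Native async” point concrete, here is a tiny, generic sketch of the Promise idea using only the stock RTL (System.Threading plus TThread.Queue). This is NOT File2knowledge’s actual API; the names RunAsync, TResolve and TReject are made up for illustration, and the real library is richer than this. It only shows the non-blocking shape such a call takes.

       uses
         System.SysUtils, System.Classes, System.Threading;

       type
         TResolve = reference to procedure(const Value: string);
         TReject  = reference to procedure(const ErrorMsg: string);

       // Run Work on a background thread, then hand the outcome back to the
       // main thread through one of the two continuations.
       procedure RunAsync(const Work: TFunc<string>;
         const OnResolve: TResolve; const OnReject: TReject);
       begin
         TTask.Run(
           procedure
           var
             Value, ErrMsg: string;
           begin
             try
               Value := Work();  // the long-running call, e.g. an HTTP request
               TThread.Queue(nil, procedure begin OnResolve(Value); end);
             except
               on E: Exception do
               begin
                 ErrMsg := E.Message;
                 TThread.Queue(nil, procedure begin OnReject(ErrMsg); end);
               end;
             end;
           end);
       end;

     You call it with three anonymous methods (the work, the success handler, the failure handler) and the calling thread never blocks; chaining and typed results are layered on top of that same idea.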
  3. Hi everyone, (Oops, I made a mistake posting in the VCL section when it’s more appropriate here.)

     For more than six months, I’ve been developing turnkey solutions for the Delphi developer community in the form of API wrappers that harness the full range of models and features offered by leading providers (OpenAI, Deepseek, Anthropic, Gemini, MistralAI, Groq Cloud, Hugging Face, and Stability AI). Each README file is written as a tutorial to help users get started. Natively, these libraries support both synchronous and asynchronous workflows, and some even offer parallel execution to optimize performance.

     In addition, I’ve published two projects demonstrating the use of the Promise pattern to orchestrate asynchronous requests (notably with OpenAI, but also compatible with other wrappers), as well as a pipeline-based project that simplifies the processing chain for model calls.

     Coming soon, I will introduce a module dedicated to OpenAI’s new file_search feature (endpoint v1/responses), which, thanks to an expanded semantic surface, enables querying vector stores with unprecedented precision and temporarily enriches models without the need for fine-tuning. A raw-HTTP sketch of such a call appears at the end of this post.

     If you’re new to AI integration or looking to accelerate your development, feel free to check out my projects on GitHub. Additionally, all of these tools are available via GetIt. Your feedback is welcome and will be incredibly helpful in improving these tools.

     https://github.com/MaxiDonkey

     As a point of clarification regarding my background, I am a mathematician by training rather than a professional software developer. I returned to Delphi just over a year ago; if you’d like to understand my motivations, please consult this document. My core expertise lies in using formal proof tools such as Lean and Coq, rather than in Delphi-specific development.
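     Until that module ships, here is the raw-HTTP sketch mentioned above, built with the stock TNetHTTPClient. The payload shape (model, input, tools, vector_store_ids) follows OpenAI’s published Responses API documentation, but treat it as an assumption to verify against the current docs; the model name, vector store id and token are placeholders.

       uses
         System.SysUtils, System.Classes, System.JSON, System.Net.URLClient,
         System.Net.HttpClient, System.Net.HttpClientComponent;

       procedure AskWithFileSearch;
       var
         Http: TNetHTTPClient;
         Payload, Tool: TJSONObject;
         Ids: TJSONArray;
         Body: TStringStream;
         Response: IHTTPResponse;
       begin
         Http := TNetHTTPClient.Create(nil);
         Payload := TJSONObject.Create;
         try
           // tools: [{ "type": "file_search", "vector_store_ids": ["vs_XXX"] }]
           Ids := TJSONArray.Create;
           Ids.Add('vs_XXX');                                  // your vector store id
           Tool := TJSONObject.Create;
           Tool.AddPair('type', 'file_search');
           Tool.AddPair('vector_store_ids', Ids);
           Payload.AddPair('model', 'gpt-4.1-mini');           // placeholder model
           Payload.AddPair('input', 'What does unit Foo.pas do?');
           Payload.AddPair('tools', TJSONArray.Create(Tool));

           Body := TStringStream.Create(Payload.ToJSON, TEncoding.UTF8);
           try
             Response := Http.Post('https://api.openai.com/v1/responses', Body, nil,
               [TNameValuePair.Create('Content-Type', 'application/json'),
                TNameValuePair.Create('Authorization', 'Bearer XXX')]);   // replace XXX
             Writeln(Response.ContentAsString);                // raw JSON: answer plus citations
           finally
             Body.Free;
           end;
         finally
           Payload.Free;                                       // also frees Tool and Ids
           Http.Free;
         end;
       end;

     The wrappers hide all of this behind typed classes and asynchronous callbacks, so in practice you rarely build the JSON by hand; the sketch is only meant to show what actually travels over HTTP.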
  4. Maxidonkey

    Delphi and LLM

    Hi everyone,

    For more than six months, I’ve been developing turnkey solutions for the Delphi developer community in the form of API wrappers that harness the full range of models and features offered by leading providers (OpenAI, Deepseek, Anthropic, Gemini, MistralAI, Groq Cloud, Hugging Face, and Stability AI). Each of these libraries supports both synchronous and asynchronous workflows, and some even offer parallel execution to optimize performance.

    In addition, I’ve published two projects demonstrating the use of the Promise pattern to orchestrate asynchronous requests (notably with OpenAI, but also compatible with other wrappers), as well as a pipeline-based project that simplifies the processing chain for model calls.

    Coming soon, I will introduce a module dedicated to OpenAI’s new file_search feature (endpoint v1/responses), which, thanks to an expanded semantic surface, enables querying vector stores with unprecedented precision and temporarily enriches models without the need for fine-tuning.

    If you’re new to AI integration or looking to accelerate your development, feel free to check out my projects on GitHub. Please note that I only provide the entry point to my GitHub page, from which you can navigate to the repositories that interest you. Additionally, all of these tools are available via GetIt. If you’d like me to share direct links to each repository, I can do so in this thread. Your feedback is welcome and will be incredibly helpful in improving these tools.

    https://github.com/MaxiDonkey