-
Latest update: Added miniaudio.
-
Using the new Clang-based C++ Builder (Win64 modern platform), which can compile far more C/C++ code than the old toolchain could, it is now much more likely that your favorite C library will build. You compile the C code to .o object files, which Delphi can consume with {$L unit1.o}, for example. I have a unit called CPas.CRuntime that provides all the missing C runtime routines a static C library may reference, which allows Delphi to compile and use the library. I continue to update CPas.CRuntime as I encounter references that are needed; a fair bit is included already. “Now you can link in static C libs directly into Delphi” means you can take your favorite C lib, such as sqlite, and link it inside the EXE instead of calling it from a DLL. I have sqlite3 as an example in the repo. Not every library will be possible, but you should be able to do more with the new toolchain than you ever could before. This does require RAD Studio 12.1+.
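To make the pattern concrete, here is a minimal sketch (the object file name and the bare-bones declaration are illustrative only; the sqlite3 example in the CPas repo is the complete version):

program StaticSqliteDemo;

{$APPTYPE CONSOLE}

uses
  System.SysUtils,
  CPas.CRuntime;   // supplies the C runtime symbols the object file references

{$L sqlite3.o}     // object file produced by the Clang-based Win64 compiler (name assumed)

// the symbol is resolved from the linked object file at compile time, no DLL involved
function sqlite3_libversion: PAnsiChar; cdecl; external;

begin
  WriteLn('SQLite version: ', string(AnsiString(sqlite3_libversion)));
end.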
-
Ok, I've revived this project. Now you can link in static C libs directly into Delphi: tinyBigGAMES/CPas: Static C Libraries for Delphi (github.com)
-
Dllama - Local LLM inference Library
tinyBigGAMES replied to tinyBigGAMES's topic in I made this
It evolved into tinyBigGAMES/LMEngine: Local LLM Inference (github.com) for Vulkan-based GPUs and tinyBigGAMES/Infero: An easy to use, high performant CUDA powered LLM inference library. (github.com) for CUDA GPUs. Both are more stable and support using up to the full model context if you have the resources on your device. -
Dllama, a simple and easy to use library for doing local LLM inference directly from Delphi (or any language with bindings). It can load GGUF formatted LLMs into CPU or GPU memory. Uses a Vulkan back end for acceleration.

Simple Example

uses
  System.SysUtils,
  Dllama,
  Dllama.Ext;

var
  LResponse: string;
  LTokenInputSpeed: Single;
  LTokenOutputSpeed: Single;
  LInputTokens: Integer;
  LOutputTokens: Integer;
  LTotalTokens: Integer;
begin
  // init config
  Dllama_InitConfig('C:\LLM\gguf', -1, False, VK_ESCAPE);

  // add model
  Dllama_AddModel('Meta-Llama-3-8B-Instruct-Q6_K', 'llama3', 1024*8,
    '<|start_header_id|>%s %s<|end_header_id|>', '\n assistant:\n',
    ['<|eot_id|>', 'assistant']);

  // add messages
  Dllama_AddMessage(ROLE_SYSTEM, 'you are Dllama, a helpful AI assistant.');
  Dllama_AddMessage(ROLE_USER, 'who are you?');

  // display the user prompt
  Dllama_Console_PrintLn(Dllama_GetLastUserMessage(), [], DARKGREEN);

  // do inference
  if Dllama_Inference('llama3', LResponse) then
  begin
    // display usage
    Dllama_Console_PrintLn(CRLF, [], WHITE);
    Dllama_GetInferenceUsage(@LTokenInputSpeed, @LTokenOutputSpeed, @LInputTokens,
      @LOutputTokens, @LTotalTokens);
    Dllama_Console_PrintLn('Tokens :: Input: %d, Output: %d, Total: %d, Speed: %3.1f t/s',
      [LInputTokens, LOutputTokens, LTotalTokens, LTokenOutputSpeed], BRIGHTYELLOW);
  end
  else
  begin
    Dllama_Console_PrintLn('Error: %s', [Dllama_GetError()], RED);
  end;

  Dllama_UnloadModel();
end.
-
Hi, for now yes. I may revisit it in the near future.
-
Integrate with OpenAI's ChatGPT API seamlessly from Delphi.

Features
- Easily access the GPT API from a single class
- Supports both GPT3 and GPT4 models
- API key can be read from the ChatGPTApiKey environment variable if defined
- Automatically sanitizes input to minimize errors
- Ability to define proxy settings
- Adjust the personality of responses with a precision range of 0-1, from precise to creative
- Stream responses just like in the ChatGPT web interface

Usage
Get your API key: https://platform.openai.com/account/api-keys
Define the environment variable ChatGPTApiKey and assign your API key to it. You may have to reboot your machine for it to take effect.

// Basic example showing how to query ChatGPT
uses
  AskChatGPT;

var
  LChat: TAskChatGPT;
begin
  LChat := TAskChatGPT.Create;
  try
    // set chat params
    LChat.Model := GPT3;    // use the GPT3 model, or GPT4 for the GPT4 model
    LChat.Creative := 1;    // 0-1, 0 being most precise and 1 being most creative

    // ask question
    LChat.Question := 'What is Delphi?';

    // print question
    PrintLn('Q: %s', [LChat.Question]);

    // process and print response
    if LChat.Process then
      PrintLn('A: %s', [LChat.Response]);
  finally
    LChat.Free;
  end;
end;

Download: https://github.com/tinyBigGAMES/AskChatGPT
-
Delphi 11.3 unusable due to full-build-requiring onslaught of F2084 "Internal Compiler Errors" from minor source modifications
tinyBigGAMES replied to PaulM117's topic in Delphi IDE and APIs
1. If you have not done so already, make sure you have excluded all Delphi-related folders plus your project folders from real-time virus scanning.
2. ASLR is now enabled by default, so there may be places in your code that are suddenly invalid even though they worked perfectly before. For example, there may be places where you must use NativeInt/NativeUInt instead of a 32-bit integer to hold pointer-sized values. So, check for ASLR-related issues that may exist in your code base.

After doing the above, the constant AVs and crashes diminished for me.
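To illustrate point 2 (my own sketch, not from the thread): the classic failure is stashing a pointer in a 32-bit Integer, which silently truncates addresses once high-entropy ASLR starts handing out addresses above 4 GB on Win64; a pointer-sized NativeInt/NativeUInt is the fix.

uses
  System.SysUtils;

var
  P: Pointer;
  Addr: NativeUInt;  // pointer-sized on both Win32 and Win64
begin
  GetMem(P, 256);
  try
    // Old habit: Addr32 := Integer(P); -- truncates 64-bit addresses,
    // which high-entropy ASLR makes far more likely to bite
    Addr := NativeUInt(P);  // safe: always matches the pointer size
    Writeln('Allocated at $', IntToHex(Addr, SizeOf(Pointer) * 2));
  finally
    FreeMem(P);
  end;
end.
-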
Best Practice Question: Bidirectional EXE-to-EXE communication
tinyBigGAMES replied to Alexander Halser's topic in RTL and Delphi Object Pascal
Quick and dirty example using CreateFileMapping: IPCTest.mp4, IPCTest.zip -
Best Practice Question: Bidirectional EXE-to-EXE communication
tinyBigGAMES replied to Alexander Halser's topic in RTL and Delphi Object Pascal
lol, I will assume that you have never used CreateFileMapping? So, with just three calls and no other dependencies, you can set up IPC between EXEs. With just two calls you can allocate virtual memory into the address space of your EXE, or, with those same calls and different parameters, you can map a large file into your EXE's address space and access it as if it were contiguous memory. The power of CreateFileMapping at your disposal. Why does it need to be overly complicated?
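For anyone following along, here is a minimal sketch of what that looks like from Delphi (the mapping name, size and message are placeholders; a real setup would also add an event or mutex so the other EXE knows when data is ready):

uses
  System.SysUtils,
  System.AnsiStrings,
  Winapi.Windows;

const
  CMapName = 'Local\IPCDemoMap';  // both EXEs must use the same name
  CMapSize = 4096;

var
  LMap: THandle;
  LView: Pointer;
begin
  // 1) create (or open, if it already exists) a named, pagefile-backed mapping
  LMap := CreateFileMapping(INVALID_HANDLE_VALUE, nil, PAGE_READWRITE, 0, CMapSize, CMapName);
  if LMap = 0 then RaiseLastOSError;
  try
    // 2) map it into this process's address space
    LView := MapViewOfFile(LMap, FILE_MAP_ALL_ACCESS, 0, 0, CMapSize);
    if LView = nil then RaiseLastOSError;
    try
      // 3) read/write it like ordinary memory; the other EXE does the same
      //    calls (OpenFileMapping + MapViewOfFile) and sees the same bytes
      System.AnsiStrings.StrPLCopy(PAnsiChar(LView), 'hello from process A', CMapSize - 1);
    finally
      UnmapViewOfFile(LView);
    end;
  finally
    CloseHandle(LMap);
  end;
end.
-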
Best Practice Question: Bidirectional EXE-to-EXE communication
tinyBigGAMES replied to Alexander Halser's topic in RTL and Delphi Object Pascal
CreateFileMapping, shared memory between processes: Creating Named Shared Memory - Win32 apps | Microsoft Learn -
SDL3 for Pascal

If you want to get your hands dirty and use the new SDL3 directly, I've got you covered. 😎 Add SDL3 to your uses statement and it will be linked into your executable with direct access, no DLLs to maintain. You also get miniaudio (for audio), Nuklear (for GUI), pl_mpeg (for video), stb (for images & fonts) and more. I've added a contrib folder and am accepting PRs if you wish to contribute. To start the ball rolling, I added a ziparc archive utility for making standard password-protected zip archives, using only zlib/minizip from SDL3pas. Enjoy!

https://github.com/tinyBigGAMES/SDL3
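To give a feel for it, here is a minimal sketch (I'm assuming the unit exposes the raw SDL3 C API under its usual names, e.g. PSDL_Window and SDL_INIT_VIDEO; check the repo's examples for the exact binding details):

uses
  SDL3;

var
  LWindow: PSDL_Window;
begin
  // bring up the video subsystem
  SDL_Init(SDL_INIT_VIDEO);

  // SDL3 window creation takes title, width, height and flags
  LWindow := SDL_CreateWindow('SDL3 for Pascal', 640, 480, 0);

  // keep the window on screen briefly, then shut everything down
  SDL_Delay(3000);
  SDL_DestroyWindow(LWindow);
  SDL_Quit;
end.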
-
Hi, thanks. Oh, no worries, I welcome all feedback. I just committed a huge update to the repo today with all of that and more. There is a new unit, GamePascal.Framework, and a new class, TGame, that you can derive your games from, with all of what you suggested built in. TGame, TActor, TActorList and TActorScene together give you a dynamic and responsive OOP system to manage your game. All the examples have been updated to take advantage of this.
-
I'm working on this demo, which will be in the next update.
-
Also, you can always spin up a TCompiler instance (see GPCC if you want to play around with it now) and compile your source standalone via the built-in CaaS (compiler as a service). There will be more information and examples about CaaS in future releases. This allows for the move toward a standalone game engine/IDE environment in the future.