Kas Ob. 121 Posted March 13 24 minutes ago, Rollo62 said: what made you think C++ fails to be the best language. As I said, because it has failed so many times to guarantee better and safer coding practice, as we have read so many times. So is it the best? Yes, it is. Is it recommended for continued use in the most critical software? Here comes the answer: NO. So is there a better solution? Yes, and they are calling to skip it for something better and safer. Not, and again not, because it can't deliver, but because there is a problem with the human factor, and they have given up on fixing it. Don't take my opinion on the subject; think about this: can all the big companies be wrong when they call to drop the language and switch to something else? And please remember: whatever you might suggest or even imagine has already been done and tried with C/C++, and it still failed again and again. Rust doesn't have that much that is unique as a language, except key features which limit how the developer may design as he might imagine or wish, and which involve the compiler even more. For Delphi (though this might trigger many here), smart pointers would be the wrong direction and would not solve anything. Not just because Embarcadero might fail to deliver, or because it would be a waste of time and resources, but because with all the libraries in C/C++ it is still failing, so why repeat what has already been tested and expect a different result? Please don't take this as a debate about the subject for now; it is what one can deduce from what is happening now. Delphi would be many times safer if memory handling in code were removed or limited and given to the compiler, the same as Rust. If there is no shuffling, then we don't need smart pointers or stupid ones; it is that simple. Will this hinder our coding design? Maybe it will, but most likely it will only need more code, though I doubt it will be longer than it is now. In all cases it will ensure a better design and structure, a hardened and sound one.
Imagine one directive at the top of the Delphi unit file that makes all variables and fields initialized and managed by the compiler at compile time. The compiler would stop you until you fix them all, breaking zero backward (legacy) code and ensuring the future is brighter and better. The transition would be easier with compiler errors and warnings, and there would be no smart pointers with stupid code trying to recover a few milliseconds of performance when a different approach could be faster at the root. Share this post Link to post
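As an illustration of that idea, here is a sketch of what such a directive might look like. The directive name `{$MANAGEDVARS ON}` and its semantics are purely hypothetical; no such compiler feature exists today:

```pascal
unit SafeUnit;

{$MANAGEDVARS ON} // hypothetical directive: every local and field is
                  // zero-initialized and lifetime-checked by the compiler

interface

type
  TWorker = class
  private
    FName: string;   // managed types are already initialized today
    FCount: Integer; // under the directive: guaranteed zero, never garbage
  end;

implementation

procedure Demo;
var
  P: Pointer;
begin
  // WriteLn(NativeUInt(P)); // would become a compile-time error:
  //                         // "P used before being assigned"
end;

end.
```

Managed types (strings, interfaces, dynamic arrays) are already initialized by the current compiler; the hypothetical part is extending that guarantee to every variable and field, and having the compiler reject any read of a value before it is assigned.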
Rollo62 536 Posted March 22 (edited) More on memory safety, from Marco Cantu. Regarding marketing, what comes to my mind is: can the memory safety of RadStudio C++ safely be assumed to be better than standard C++, because of its close relation to its sibling Delphi within the same package? My marketing mind screams: of course RadStudio C++ is advanced, because the memory safety of RadStudio Delphi rubs off on RadStudio C++. P.S.: "rub-off" is a professional software development term that decision-makers in government and elsewhere can easily adopt and understand. Edited March 22 by Rollo62 Share this post Link to post
David Heffernan 2345 Posted March 22 2 hours ago, Rollo62 said: Can the memory safety of RadStudio C++ safely be assumed to be better than standard C++, because of its close relation to its sibling Delphi within the same package? No Share this post Link to post
Attila Kovacs 629 Posted March 23 As an additional note to this "safety" topic. Just yesterday, it happened to me that the type of a form event parameter was defined under the same name in another unit as well, and the IDE didn't alert me, nor did the form signal any issues during loading. Thus, the resulting code was completely wrong: the stack got shifted, and the event returned to a garbage address. Originally, the parameter type was a class, which was shadowed in the other unit by a record. It was really fun trying to find the problem. Share this post Link to post
David Schwartz 426 Posted April 9 (edited) On 3/12/2024 at 7:30 AM, Brandon Staggs said: Nevertheless, all of the non-trivial projects I work on make use of pointer math too. What you say about pointer math, speed, and what companies want, may be true in many cases, but extrapolating from that to your original claim about pointers (math or aliases, regardless) is quite a leap. They have not "pretty much disappeared from use." One example that comes to mind is a custom 8-bit bitmap backing we developed that would be excruciatingly slow were it not for basic, simple, tried-and-true pointer arithmetic. The reality is also that even libraries that don't offer access to these programming methods still make use of them internally. Somewhere, someone is doing the pointer math. I have not used any calculus since college. I've never had a job that required it. Still, I'd be remiss in asserting that calculus must be a dead part of math in the way that Latin is a dead language, as I'm confident there are plenty of projects whose programmers are implementing solutions to calculus problems on a daily basis. However, I seriously doubt they're using Delphi to do so. (cue up those here who want to tell me otherwise...) FWIW, I've known plenty of programmers who do nothing but write highly performant code that implements math functions I can't begin to explain, most of which employ calculus. And not one of them said they used Delphi. The most commonly used languages they mentioned are C, FORTRAN, and MATLAB. For those who've looked, Delphi adds far too much "compiler overhead" to the code it generates. Likewise, I'm confident that there are plenty of problems that can be written without pointers and bit-whacking, but if you want to drop down to that level, then the program will definitely run much faster. And if you go all the way down to Assembly language, they'll run even faster.
Does this mean that we should avoid adding features to the language that the vast majority of people who write Delphi code would find beneficial, simply because some tiny percentage of users aren't served by them because of hand-optimized code that uses pointers? I've probably seen millions of lines of Delphi code in my career, and aside from a few isolated routines here and there, there was no use of the ^ operator anywhere, indicating that very few people employ pointers simply because they run a little faster. Sure, pointers are heavily used in the RTL and common libs that I think SHOULD be highly optimized. But the people responsible for them are the library vendors, not end-users. This attitude is probably why there have been so few changes to the Delphi language over the years. The industry considers Delphi fairly "ancient" because it's lacking so many features that most contemporary programming languages have. Those features were added to help a majority of their users, not 100%. Nothing seems to get added to Delphi's language unless it benefits as close to 100% of users as possible. When Delphi was introduced, many of its features were state-of-the-art. Today, it's almost an anachronism. Nobody teaches Pascal any more, and features found in most languages used today are probably years away from showing up in Delphi, if they ever do. Edited April 9 by David Schwartz Share this post Link to post
David Heffernan 2345 Posted April 9 10 minutes ago, David Schwartz said: However, I seriously doubt they're using Delphi to do so. Yeah, you are wrong. Such people exist. I am one. 10 minutes ago, David Schwartz said: Likewise, I'm confident that there are plenty of problems that can be written without pointers and bit-whacking, but if you want to drop down to that level, then the program will definitely run much faster. Not necessarily. No reason why pointer arithmetic should be faster than, for example, plain indexing of arrays. 11 minutes ago, David Schwartz said: And if you go all the way down to Assembly language, they'll run even faster. Again, good compilers are often better than humans. 13 minutes ago, David Schwartz said: Sure, pointers are heavily used in the RTL and common libs that I think SHOULD be highly optimized. But the people responsible for them are the library vendors, not end-users. I don't think pointers are used in the RTL especially more than in other libraries, and I don't think pointers are used there for optimisation and performance. As far as this whole memory safety debate goes, you can't exclude the RTL from it. That code executes inside the programs that are subject to attack. 2 Share this post Link to post
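A minimal sketch of the indexing point (plain Delphi, function names mine): both routines below perform the same loads and additions, and with range checking disabled an optimizing compiler has no particular reason to emit slower code for the indexed form:

```pascal
function SumIndexed(const A: array of Integer): Integer;
var
  I: Integer;
begin
  Result := 0;
  for I := 0 to High(A) do
    Inc(Result, A[I]); // plain indexing: same memory accesses as below
end;

function SumPointer(const A: array of Integer): Integer;
var
  P: PInteger;
  I: Integer;
begin
  // assumes a non-empty array; @A[0] trips range checking otherwise
  Result := 0;
  P := @A[0];
  for I := 0 to High(A) do
  begin
    Inc(Result, P^); // manual pointer walk: same loads, same additions
    Inc(P);          // advances by SizeOf(Integer)
  end;
end;
```

The pointer version mainly adds risk: nothing stops `P` from walking past the end of the array, which is exactly the class of bug the memory safety debate is about.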
mitch.terpak 5 Posted April 9 (edited) 26 minutes ago, David Schwartz said: However, I seriously doubt they're using Delphi to do so. (cue up those here who want to tell me otherwise...) We do, but we have had to resort to a C++ DLL for some performance-critical parts. Even so, our Delphi code still outperformed the Intel Math Kernel Library. The main reason was just that the Linux-compiled code is unbearably slow. 26 minutes ago, David Schwartz said: And if you go all the way down to Assembly language, they'll run even faster. You're underestimating this, quite frankly: the amount of time it'd take you to correctly optimize Assembly code instead of relying on the compiler to do a decent job is insane. Once again, you're better off writing C++, since its compiler will do only a tiny bit worse than hand-optimized Assembly code, and better than the Delphi compiler. 26 minutes ago, David Schwartz said: This attitude is probably why there have been so few changes to the Delphi language over the years. The industry considers Delphi fairly "ancient" because it's lacking so many features that most contemporary programming languages have. Those features were added to help a majority of their users, not 100%. Nothing seems to get added to Delphi's language unless it benefits as close to 100% of users as possible. Agree Edited April 9 by mitch.terpak Share this post Link to post
David Schwartz 426 Posted April 9 (edited) Ok, all of these nuances aside, I for one would LOVE to see something added to the Delphi language that DOES make it more "memory safe": specifically, an option that lets you tell the compiler (simply in the declaration) that some var is to be treated as local to a block and freed at the end of the block, without requiring try...finally. C++ has done that forever, as have many other languages. Off-the-cuff, maybe something like this:

function get_users_name : string;
begin
  Result := '';
  local tmpForm := TUserNameForm.Create(...);
  // set some properties
  tmpForm.Caption := 'Enter name';
  . . .
  if tmpForm.ShowModal = mrOK then
    Result := tmpForm.aName;
end;

tmpForm will be automatically freed at the end of the block and the memory cleared. I'd like this because chasing down orphaned blocks and references to objects that got deleted early seems to take up more of my time than any other type of debugging issue. I want more help from the compiler and language for this common situation, using LESS CODE! Maybe we'll see something when Delphi turns 30 next February? The try...finally, the Free, and the clearing of memory are all managed implicitly by the 'local' variable designator (as an example). It only needs to work initially with normal TObject references (including lists) and arrays of anything (including objects). If you declare 'local' scalars, they'll be cleared at the end. For pointers, just come back later and use AI to figure out what's going on with them so the code can help deal with them. Edited April 9 by David Schwartz Share this post Link to post
Lars Fosdal 1792 Posted April 9 55 minutes ago, David Heffernan said: Again, good compilers are often better than humans. I wonder if anyone has tried to train an LLM on Assembly code generation to see if it could improve the current optimization patterns? Share this post Link to post
Lars Fosdal 1792 Posted April 9 32 minutes ago, David Schwartz said: The try...finally, Free, and clearing memory are all managed implicitly by the 'local' variable designator (as an example). Hasn't this already been demonstrated with "smart pointers"? 1 Share this post Link to post
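It has: the Delphi community has long demonstrated scope-bound cleanup with the interface-based guard, which rides on the compiler's interface reference counting. A minimal sketch (type and routine names mine; `uses System.Classes` assumed for the demo):

```pascal
type
  ISafeGuard = interface
  end;

  TSafeGuard = class(TInterfacedObject, ISafeGuard)
  private
    FObj: TObject;
  public
    constructor Create(AObj: TObject);
    destructor Destroy; override;
  end;

constructor TSafeGuard.Create(AObj: TObject);
begin
  inherited Create;
  FObj := AObj;
end;

destructor TSafeGuard.Destroy;
begin
  FObj.Free; // runs when the last interface reference goes out of scope
  inherited;
end;

procedure Demo;
var
  Guard: ISafeGuard;
  SL: TStringList;
begin
  SL := TStringList.Create;
  Guard := TSafeGuard.Create(SL); // no try..finally needed
  SL.Add('hello');
end; // Guard's refcount hits zero here and SL is freed
```

The compiler inserts the cleanup for you; the price is an extra heap allocation and refcounting per guarded object, which is part of the performance objection to smart pointers raised earlier in this thread.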
David Heffernan 2345 Posted April 9 3 minutes ago, Lars Fosdal said: I wonder if anyone has tried to train an LLM on Assembly code generation to see if it could improve the current optimization patterns? (PDF) Application of artificial intelligence in compiler design (researchgate.net) (ijresm.com) Share this post Link to post
David Heffernan 2345 Posted April 9 36 minutes ago, David Schwartz said: I'd like this because chasing down orphaned blocks and references to objects that got deleted early seem to take up more of my time than any other types of debugging issues. It's odd you say that, but I never have to debug issues like this. There are two simple patterns for lifetime and they are so ingrained, nobody ever makes a mistake in my team. 4 Share this post Link to post
Lars Fosdal 1792 Posted April 9 2 minutes ago, David Heffernan said: (PDF) Application of artificial intelligence in compiler design (researchgate.net) (ijresm.com) None of the techniques described in that paper appear to be related to the LLM generation of AIs. Share this post Link to post
mitch.terpak 5 Posted April 9 (edited) 12 minutes ago, Lars Fosdal said: I wonder if anyone has tried to train an LLM on Assembly code generation to see if it could improve the current optimization patterns? I once tested GPT-4 on some assembly code. It's actually quite good at explaining and improving Assembly code, but absolutely horrendous at writing it from scratch. Edited April 9 by mitch.terpak Share this post Link to post
David Heffernan 2345 Posted April 9 6 minutes ago, Lars Fosdal said: None of the techniques described in that paper appear to be related to the LLM generation of AIs. Keep searching then Share this post Link to post
David Schwartz 426 Posted April 9 (edited) 20 minutes ago, David Heffernan said: It's odd you say that, but I never have to debug issues like this. There are two simple patterns for lifetime and they are so ingrained, nobody ever makes a mistake in my team. Well, the two patterns I can think of are: (1) when you have everything contained within a single method; and (2) when you're working with an object that has an unpredictable lifetime and lots of methods are touching it. I encounter errors in (1) when the code is rather long and I declare an object but forget to add the Free to the Finally clause. For (2), I'd love to see what your approach is. I've not found an approach that is robust and works well everywhere, other than something that takes a lot of extra code to check things everywhere. Edited April 9 by David Schwartz Share this post Link to post
Lars Fosdal 1792 Posted April 9 Request/Release 29 minutes ago, mitch.terpak said: I once tested GPT-4 on some assembly code. It's actually quite good at explaining and improving Assembly code, but absolutely horrendous at writing it from scratch. Which is to be expected, I guess, since the training was not done with optimization in mind. Share this post Link to post
Anders Melander 1784 Posted April 9 42 minutes ago, Lars Fosdal said: I wonder if anyone has tried to train an LLM on Assembly code generation to see if it could improve the current optimization patterns? Do you really want your code to be generated based on fuzzy statistics? How do you even verify the correctness of the results? I'd like mine to be based on strict patterns and the known deterministic properties of those patterns. I think trying to solve these problems with AI is a bit like when companies move their stuff to the cloud: they don't understand how it works or know what is going on, but now it's somebody else's problem. 2 1 Share this post Link to post
Lars Fosdal 1792 Posted April 9 12 minutes ago, Anders Melander said: code to be generated based on fuzzy statistics That is not what I intended to say. I was wondering if someone had tried to apply an LLM to finding even better patterns for optimization than those that are currently implemented. Naturally, such improved patterns would be made into new deterministic rules in the compiler after being properly vetted. I agree that today's AI output has to be treated as indicative at best, and as bullshit at worst. 2 Share this post Link to post
Anders Melander 1784 Posted April 9 1 minute ago, Lars Fosdal said: That is not what I intended to say. I was wondering if someone had tried to apply LLM for finding even better patterns for optimization than those that are currently implemented. Ah, yes I see your point. Interesting. A bit labor intensive though, having to vet all the different solutions. Personally, I use another approach 🙂 2 Share this post Link to post
David Heffernan 2345 Posted April 9 50 minutes ago, David Schwartz said: I encounter errors in (1) when the code is rather long and I declare an object but forget to add the Free to the Finally clause. I don't really understand this. I always write the try/finally immediately after the construction, and always bang the Free in immediately. Then I fill out the body. It's just a habit that you form so that you don't make such mistakes. And honestly, this is never one that is hard to debug because you just have a leak. And presumably you use leak detection tools so that you'd always find them immediately. 52 minutes ago, David Schwartz said: For (2), I'd love to see what your approach is. I've not found an approach that is robust and works well everywhere, other than something that takes a lot of extra code to check things everywhere. I don't really understand this scenario either. If you have a reference to something that may or may not exist, you would test Assigned() before using it. And when you were finished, you'd set the reference back to nil once you'd destroyed it. The scenario that is tricky is when you have multiple references to an object. 6 Share this post Link to post
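The two ingrained patterns described above can be sketched like this (routine names mine; `uses System.Classes` assumed for the first):

```pascal
// Pattern 1: write the try..finally and the Free immediately after
// the construction, then fill out the body.
procedure ProcessFile(const FileName: string);
var
  Lines: TStringList;
begin
  Lines := TStringList.Create;
  try
    Lines.LoadFromFile(FileName);
    // ... work with Lines ...
  finally
    Lines.Free;
  end;
end;

// Pattern 2: a reference that may or may not exist is tested with
// Assigned before use, and set back to nil once destroyed.
procedure ReleaseWorker(var Worker: TObject);
begin
  if Assigned(Worker) then
  begin
    Worker.Free;
    Worker := nil; // never leave a dangling reference behind
  end;
end;
```

Pattern 2 is often condensed to `FreeAndNil(Worker)` from System.SysUtils, which performs the nil-then-free atomically from the caller's point of view.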
David Schwartz 426 Posted April 9 2 minutes ago, David Heffernan said: The scenario that is tricky is when you have multiple references to an object. That's what I'd like to hear more about. Share this post Link to post
David Heffernan 2345 Posted April 9 Just now, David Schwartz said: That's what I'd like to hear more about. The real trick there is not to do it. Avoid it at all costs. Share this post Link to post
David Schwartz 426 Posted April 9 12 minutes ago, David Heffernan said: The real trick there is not to do it. Avoid it at all costs. So what DO you do in a case where, say, you might use an object to collect an accumulation of data that is provided by multiple sources? It's often done for contextual state management over time... Share this post Link to post
David Heffernan 2345 Posted April 9 1 minute ago, David Schwartz said: So what DO you do in a case where, say, you might use an object to collect an accumulation of data that is provided by multiple sources? It's often done for contextual state management over time... That's kind of a vague specification. For instance, is the data pushed or pulled? I would imagine that makes a difference. Share this post Link to post