FreeAndNil() - The Great Delphi Developer Debate

36 minutes ago, Lajos Juhász said:

Here maybe you will think you have excellent code because there is FreeAndNil

:DDDDDDD

It's the same as x1.Free; in your example. Are you sure you understand pointers? Maybe Java would be a better choice?

2 minutes ago, Attila Kovacs said:

DDDDDDD

It's the same as x1.Free; in your example. Are you sure you understand pointers? Maybe Java would be a better choice?

In this case we were discussing object vs interfaces. If those were interface references it would be a bit different outcome.
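To sketch that different outcome (reusing the invented names from Lajos's example, with an IWorldSaver interface added for illustration): with interface references, reference counting keeps the instance alive as long as any reference remains, so there is no stale reference to begin with.

```pascal
type
  IWorldSaver = interface
    procedure SaveTheWorldAndMakeItABetterPlace;
  end;

  TMyClass = class(TInterfacedObject, IWorldSaver)
  public
    procedure SaveTheWorldAndMakeItABetterPlace;
  end;

procedure TMyClass.SaveTheWorldAndMakeItABetterPlace;
begin
  // ...
end;

procedure Demo;
var
  x1, x2: IWorldSaver;
begin
  x1 := TMyClass.Create;                 // reference count is 1
  x2 := x1;                              // reference count is 2
  x1 := nil;                             // count drops to 1; instance stays alive
  x2.SaveTheWorldAndMakeItABetterPlace;  // safe: x2 still holds a live reference
end;                                     // x2 released, count 0, instance destroyed
```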

1 hour ago, David Heffernan said:

My code uses FAN rather than free as a rule, and I don't recognise the problem you describe. That's my practical experience. 

I am not that smart... I need all the help I can get...

 

I cannot say for sure how it would work for my own code, which I know best, because I don't use FreeAndNil everywhere. But this has definitely been a problem for me when reading other people's code. Or maybe that is just because it was overall not the best code ever written...

2 hours ago, Lajos Juhász said:

For example, FreeAndNil will not save you here:

var
  x1, x2: TMyClass;
begin
  x1 := TMyClass.Create;
  .....
  x2 := x1;
  ......
  FreeAndNil(x1);

  x2.SaveTheWorldAndMakeItABetterPlace;
end;

Here maybe you will think you have excellent code because there is FreeAndNil, however that will not help and x2 is still a stale reference.

True. But then I've never made that claim. 

1 hour ago, Lajos Juhász said:

In this case we were discussing object vs interfaces. If those were interface references it would be a bit different outcome.

Thought so. GC warriors. Why not Java?

 

Anyway, if they were interfaces that would be a different program. I don't think you are familiar with pointers at all.

54 minutes ago, Dalija Prasnikar said:

I cannot say for sure how it would work for my own code, which I know best, because I don't use FreeAndNil everywhere. But this has definitely been a problem for me when reading other people's code. Or maybe that is just because it was overall not the best code ever written...

Your argument about intent is the closest that I have ever seen to a cogent argument on this subject. But I still don't buy it.

 

In your argument, if you see code that sets references to nil then that indicates this "optional lifetime" pattern, for the sake of inventing a name.  And your point is that if you set the reference to nil every time you destroy an instance, then you can't distinguish the optional lifetime pattern from the two more common lifetime patterns, let's call them "owned lifetime" where the object is destroyed in its owner's destructor, and "local variable lifetime" where the object is created and destroyed in a local method, and the reference is a local variable.

 

The thing is though, when you see FAN in owned lifetime and local variable lifetime, it's unmistakeable what the lifetime is.  You never think, oh, I wonder if this might be optional lifetime.  That's why I don't buy the argument, and why I've never experienced the issues you describe.

 

What I have experienced though is the pain of using a stale reference.  That pain is eased if the reference is nil.  And yes, we can use debug MM to write over memory when an object is destroyed, but I don't want that to happen in my release builds.  And sometimes bugs only emerge in release builds.
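The three lifetime patterns might be sketched like this (TChild, TOwner and the method names are invented for illustration):

```pascal
type
  TChild = class
  public
    procedure DoSomething;
  end;

  TOwner = class
  private
    FChild: TChild;     // "owned lifetime": lives exactly as long as the owner
    FOptional: TChild;  // "optional lifetime": may be nil at any point
  public
    constructor Create;
    destructor Destroy; override;
    procedure DropOptional;
    procedure UseOptional;
  end;

procedure TChild.DoSomething;
begin
  // ...
end;

constructor TOwner.Create;
begin
  inherited Create;
  FChild := TChild.Create;  // FOptional starts out nil
end;

destructor TOwner.Destroy;
begin
  FreeAndNil(FChild);       // nil-ing here is harmless; lifetime is unambiguous
  FreeAndNil(FOptional);
  inherited;
end;

procedure TOwner.DropOptional;
begin
  FreeAndNil(FOptional);    // here the nil is load-bearing: callers test Assigned
end;

procedure TOwner.UseOptional;
begin
  if Assigned(FOptional) then
    FOptional.DoSomething;
end;

// "local variable lifetime": created and destroyed within one routine
procedure LocalUse;
var
  C: TChild;
begin
  C := TChild.Create;
  try
    C.DoSomething;
  finally
    C.Free;  // or FreeAndNil(C); the variable dies with the scope either way
  end;
end;
```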

3 hours ago, David Heffernan said:

Wrong solution. Right solution is to learn what you don't know how to do. No shortcut. Certainly not by just saying "use interfaces". 

That's true, unfortunately there is no silver bullet in sight.

 

On the other hand, I am also constantly reworking my code to use interfaces over objects, so I would put my bet a little on the interface side too.

What I would say is that "interfaced" design leads to cleaner code in general, without too much headache over construction/destruction, wouldn't you agree?

Yes, you can misuse anything, but what I mean here is that interfaces may lead to a cleaner class design, like a kind of "pattern" too, at least from my experience.

 

12 minutes ago, David Heffernan said:

Your argument about intent is the closest that I have ever seen to a cogent argument on this subject. But I still don't buy it.

Fair enough.

12 minutes ago, David Heffernan said:

In your argument, if you see code that sets references to nil then that indicates this "optional lifetime" pattern, for the sake of inventing a name.  And your point is that if you set the reference to nil every time you destroy an instance, then you can't distinguish the optional lifetime pattern from the two more common lifetime patterns, let's call them "owned lifetime" where the object is destroyed in its owner's destructor, and "local variable lifetime" where the object is created and destroyed in a local method, and the reference is a local variable.

 

The thing is though, when you see FAN in owned lifetime and local variable lifetime, it's unmistakeable what the lifetime is.  You never think, oh, I wonder if this might be optional lifetime.  That's why I don't buy the argument, and why I've never experienced the issues you describe.

I am saying that in some situations lifetime is beyond obvious, no matter what you use, both in locally constructed instances and in instances with larger scope, regardless of whether their lifetime is dynamic or not.

 

But I have seen plenty of more complex code, where lifetime is not as clear cut and where there are thousands of lines of code involved, with multiple references (to different objects that are functionally intertwined). So FreeAndNil in a destructor can make the difference between intended behavior and bugs, and FreeAndNil might solve one bug only to make another one appear. Being able to categorize references based on intended behavior can make untangling such code an easier task.

12 minutes ago, David Heffernan said:

What I have experienced though is the pain of using a stale reference.  That pain is eased if the reference is nil.  And yes, we can use debug MM to write over memory when an object is destroyed, but I don't want that to happen in my release builds.  And sometimes bugs only emerge in release builds.

One thing is for sure: for you, I don't doubt that you know what particular code does. I may think that you are needlessly using FreeAndNil, but it will not make me think that your code is possibly bug-ridden because you have no idea what you are doing. Unfortunately, that is generally not true, and in plenty of code I have seen, using FreeAndNil in the "wrong" place was nothing compared to the other coding horrors and bugs it contained.


I can't see any valid reason not to use FreeAndNil() yet, just complaints about not being able to read other people's code from people who are well known for being ARC advocates.

17 minutes ago, Attila Kovacs said:

I can't see any valid reason not to use FreeAndNil() yet, just complaints about not being able to read other people's code from people who are well known for being ARC advocates.

I gave my reasons and arguments. I never said anyone has to agree with me.

 

One thing is for sure. With an ARC compiler that didn't have to care about compatibility with existing codebases, there would be no Free nor FreeAndNil.

 

13 hours ago, Dalija Prasnikar said:

With an ARC compiler that didn't have to care about compatibility with existing codebases, there would be no Free nor FreeAndNil.

Just .DisposeOf... and the recurring questions about why objects don't self-destruct (after you have intentionally or unintentionally made references that keep them alive).

21 minutes ago, Lars Fosdal said:

Just .DisposeOf... and the recurring questions about why objects don't self-destruct (after you have intentionally or unintentionally made references that keep them alive).

DisposeOf is needed solely for compatibility with existing code. Without the need to maintain compatibility, coding purely according to ARC rules, there would be no need for DisposeOf either.

15 hours ago, Rollo62 said:

What I would say is that "interfaced" design leads to cleaner code in general, without too much headache over construction/destruction, wouldn't you agree?

Yes, you can misuse anything, but what I mean here is that interfaces may lead to a cleaner class design, like a kind of "pattern" too, at least from my experience.

Interfaces give automatic memory management at the price of significantly more complicated code navigation. They (in the Delphi implementation) add more boilerplate for getter/setter methods. Moreover, to avoid circular ownership they require weak references, which only appeared in 10.x.
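A minimal sketch of the weak-reference part (IParent and the names are invented for illustration; the [weak] attribute arrived for all compilers around 10.1): a child holding a [weak] back-reference to its parent does not bump the parent's reference count, so the parent/child pair is not a leaked cycle.

```pascal
type
  IParent = interface
    procedure Ping;
  end;

  TChild = class(TInterfacedObject)
  private
    [weak] FParent: IParent;  // weak: stored without incrementing the ref count,
                              // so parent <-> child is not a leaked cycle
  public
    constructor Create(const AParent: IParent);
  end;

constructor TChild.Create(const AParent: IParent);
begin
  inherited Create;
  FParent := AParent;  // when the parent dies, FParent is automatically niled
end;
```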

13 minutes ago, Fr0sT.Brutal said:

... They (in the Delphi implementation) add more boilerplate for getter/setter methods. ...

Yes, but I'm looking at it from a user's perspective: they can make sense in many places to get a cleaner interface for the caller.

I try to keep all complexity behind the interface, so that the user doesn't have to care about it; of course this has its limitations.

There can be issues with more complex stuff, but as said, interfaces are no general problem solver either.

1 hour ago, Lars Fosdal said:

Anyways, ARC has gone the way of the Dodo. Free(AndNil) it is.

Yes, I know. The situation where we had two different memory models that required slightly different coding patterns was unbearable.

 

When people belittle ARC as a memory management model, they tend to miss the fact that the problem with full ARC memory management was not ARC as a memory management model, nor the compiler itself, but the existing code that was not written according to ARC rules. Hence the whole DisposeOf nightmare we had. Also, every memory management model has good sides and bad sides. Each model is better for some use cases and worse for others.

On 6/28/2022 at 9:27 AM, dummzeuch said:

Hm, wasn't the point of the first post in this thread that FreeAndNil isn't big enough of a subject to make a whole video about?

Two pages of discussion say otherwise.

Law of Triviality. Everybody's got an opinion about the bike shed, but do we really need a video and an endless thread about it?


Since this is such a sensitive debate, better set some ground rules:

  • no pulling hair
  • no poking the eye
  • no dropping hot coffee on the speaker's desktop
  • no high kick
  • no low kick
  • no wedgie

let's try to keep this debate civilized :classic_smile:

 

23 hours ago, Lars Fosdal said:

Anyways, ARC has gone the way of the Dodo. Free(AndNil) it is.

Wait, what? I thought y'all told me they put the memory management back in after taking it out (then doing the hokey pokey and dancing all about)?


They are debating the age rating for the video. Things went pretty wild yesterday....
"I want to Free this"... "you can't, the fabric of spacetime will collapse if you write code this way".."but, spacetime is thread-safe" .. "No!"

Well... I hope the link will be available soon. I want to see some parts in slow motion. :classic_cheerleader:

