Anders Melander

Posts posted by Anders Melander


  1. I'm not saying that the feature doesn't make sense.

    I'm saying that it doesn't make sense that these settings can be edited in more than one place, and that one of those places is the global Component, Install Packages dialog.

    After all, if I still get confused about this after using Delphi for 24 years, then either there's something wrong with the UI or I'm just slow.

     

    Change the global dialog so it doesn't edit the project package list and instead make it clear that it's editing the default options and there's no confusion.

     

    Even better (and I remember this was discussed ad nauseam some twenty years ago): if the package list had been opt-in instead of opt-out, many of these package juggling problems wouldn't exist.


  2. Just now, PeterBelow said:

    The package list is specific for the active project.

    I believe you, but that makes absolutely no sense from a usability perspective.

     

    You're saying that the Design packages list in project options is working on the same settings as the Design packages list in the Component, Install Packages dialog (which is where one used to install packages globally).

    [checking...] You're right. The dialog even says "Project options". Amazing! :classic_huh:


  3. 2 hours ago, Ian Branch said:

    ... save the project, the files save very quickly, it is the closing that takes around 13 minutes.

    Sorry. Didn't read your description properly. I thought it was the save that was slow.

     

    I've encountered projects that took a long time to close but I can't remember what I did to solve it.

     

    I don't know if Live Bindings can affect close performance like they do open performance. One of my clients has a project where one of the datamodules has almost 200 datasets and thousands of persisted fields. Before I disabled Live Bindings there, it took in the neighborhood of 15 minutes just to open that datamodule; now it takes less than a second.

     

    Anyway, you can easily get a call stack of the Delphi IDE with Process Explorer. Double click bds.exe, view threads, double click the thread with the highest CPU usage.

    That should give you (or us, if you post the call stack here) a clue about what it's doing.


  4. My bet would be cloud storage: Try disabling iTunes, GDrive, OneDrive, DropBox etc. if you have them.

    For some reason these don't show up in Process Explorer (probably because they work at too low a level), but in my experience they can completely kill system performance.

     

    Every so often members of my family will ask for a new computer because theirs has gotten so slow they can't use it. I've tried, in vain, to explain to them that hardware doesn't get slower with age (my main workstation is actually the oldest in the household) but they persist. So I just uninstall or disable the various cloud storage services they use and, just like that, their systems run like new again.


  5. 3 minutes ago, Joseph MItzen said:

    The United States isn't ...

    I'm pretty sure many of the countries on the receiving end of the US' forced friendship have a different view on that. Looks to me like the US is actually also guilty of most of the issues you listed. I don't mind the US looking after its own interests, just don't pretend it's doing anything but that.

    China is a complex issue. Sure it's bad in some areas, but it's slowly improving. It's a huge country and rapid change would cause the country to implode.


  6. @Dalija Prasnikar I agree with your characterization, but in some cases I find that composition is better suited than inheritance even though inheritance would be more natural. For example, if I were to create a list of TSomething it would be tempting to use TList<T> as a base class. The problem is that it might give me too much. If I only need an Add method, a Count and an array property, then composition is probably better (I guess one could call it inheritance through composition).

    I've seen too many examples of inheritance (not just from TList) where most of the inherited methods don't make any sense and would break the application if used.
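
    A minimal sketch of the composition approach (hypothetical names, assuming System.Generics.Collections; not code from the original post):

    ```delphi
    uses
      System.Generics.Collections;

    type
      TSomething = class(TObject); // placeholder for the actual item class

      // Composition: own a TList<T> instead of inheriting from it, so only
      // Add, Count and the default array property are exposed.
      TSomethingList = class
      private
        FItems: TList<TSomething>;
        function GetItem(Index: Integer): TSomething;
        function GetCount: Integer;
      public
        constructor Create;
        destructor Destroy; override;
        function Add(const Item: TSomething): Integer;
        property Count: Integer read GetCount;
        property Items[Index: Integer]: TSomething read GetItem; default;
      end;

    constructor TSomethingList.Create;
    begin
      inherited Create;
      FItems := TList<TSomething>.Create;
    end;

    destructor TSomethingList.Destroy;
    begin
      FItems.Free;
      inherited Destroy;
    end;

    function TSomethingList.Add(const Item: TSomething): Integer;
    begin
      Result := FItems.Add(Item);
    end;

    function TSomethingList.GetCount: Integer;
    begin
      Result := FItems.Count;
    end;

    function TSomethingList.GetItem(Index: Integer): TSomething;
    begin
      Result := FItems[Index];
    end;
    ```

    None of TList<T>'s other surface (Insert, Delete, Sort, Clear etc.) leaks out, so a caller can't break the list's invariants through inherited methods.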


  7. 22 minutes ago, Rudy Velthuis said:

    I don't understand. A breaking change like this (64 bit seek instead of 32 bit seek) would probably not even be noticed, or if it were, easily fixed. Most breaking changes, well, break things and thus stand out like a sore thumb. They are easy to find and easy to fix, IME.

    OK then, let me spell it out: In regular applications, or even simple libraries, individual breaking changes are probably relatively easy to locate and fix. The TStream change, for example, could almost have been made with search/replace, but I guess they thought that the backward compatibility solution was safe enough that they didn't need to make it breaking. In retrospect, although I never personally got bitten by it, they should have marked the old overload as deprecated to flag that people should fix their code or else. FWIW I don't think the TStream method is fragile. From what I can see it's very robust.

     

    For a large framework the situation can be different. A framework has an API and therefore a contract with the users of the framework. A breaking change in the RTL that lies below the framework can mean that the break propagates to the API of the framework. For example, I just yesterday finished updating my local copy of a framework that sits on top of DevExpress from a two year old version of DevExpress to the latest version. DevExpress changed the type of some properties from TBitmap to TdxSmartGlyph, and since the framework exposed those same properties as TBitmap it got stuck on that old version. If the framework had followed suit and also changed the type, then it again would have broken the code of hundreds of customers (the users of the framework) with no gain to them. The company with this particular framework is still stuck on that old version of DevExpress since they no longer have the in-house expertise to solve the problem, and you can bet they would have preferred a non-breaking change.

     

    Another example is systems with a plethora of applications and millions of LOC. A breaking change here can be hugely costly because of the accumulated time it takes to evaluate, fix and test each required change. In some cases the people that wrote the original code have moved on and nobody knows or understands what it does anymore. I see that daily (not the break, the lack of know-how).

     

    Anyway, I don't think there's much point in flogging this much more, so I'll let you have the last word - I know you want it 🙂


  8. 1 hour ago, Stefan Glienke said:

    "Backwards compatibility" is the ultimate excuse to pile up garbage in your backyard ...

     

    It is used or ignored whenever convenient - moving forward also includes getting a compile error in your face but with a clear guide at hand how to solve it.

    If you ever inherited from a TDataSet and used one of its methods that take TBookmark or TRecordBuffer arguments while writing code for different Delphi versions since 2010 or so, you know what I mean.

     

    But some developers seem to rather want to save an hour when moving their code to a new version and waste hours or days later hunting down a bug. 😉  

    Ehem... I think you should speak for yourself and not put these labels on the motives of other developers you know nothing about.

    Breaking backward compatibility is easy. Maintaining it is often very hard.

    I have seen many examples of projects that have stranded on old versions of 3rd party libraries because it was simply too costly to locate and fix breaking changes. That said, I naturally agree that one should strive to move code forward and eliminate dependencies on backward compatibility.


  9. 6 minutes ago, Dalija Prasnikar said:

    YES, backward compatibility matters. But in cases where backward compatibility causes more trouble down the road, then it is not worth the price.

     

    In this case, maintaining backward compatibility also opened TStream and descendant classes to subtle bugs when working with streams larger than 2GB. 

    Are you saying that it wasn't worth it back when the change was made or that it isn't worth it anymore (i.e. today)?

    IMO the change was definitely worth it at the time because it didn't break backward compatibility and AFAIR didn't introduce new problems. They could have marked the old methods deprecated at some point and eventually retired them completely.

    AFAIK the change didn't introduce any new problems in older TStream descendants with 2+GB files - it just made them possible going forward. If you have examples of bugs then I'd love to hear of them.


  10. 20 hours ago, Rudy Velthuis said:

    If I had been them, I would simply have replaced Seek to accept a 64 bit parameter and to return a 64 bit value. I would have removed the 32 bit version. Versions only using 32 bit can still use those arguments. In other words: I would not have overloaded, just replaced.

    You're missing the point. The purpose was to provide backward compatibility for existing descendants of TStream - not existing code using TStream.
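
    Roughly how that compatibility worked, as a simplified sketch (not the actual System.Classes code): the 64-bit Seek was added as an overload whose default implementation routes through the old 32-bit virtual method, so a legacy descendant that only overrides the 32-bit Seek still gets called:

    ```delphi
    uses
      System.Classes; // TSeekOrigin

    type
      TCompatStream = class
      public
        // Old 32-bit API: legacy descendants override this one.
        function Seek(Offset: Longint; Origin: Word): Longint; overload; virtual;
        // New 64-bit API, added as an overload instead of a replacement.
        function Seek(const Offset: Int64; Origin: TSeekOrigin): Int64; overload; virtual;
      end;

    function TCompatStream.Seek(Offset: Longint; Origin: Word): Longint;
    begin
      Result := 0; // a real stream positions itself here
    end;

    function TCompatStream.Seek(const Offset: Int64; Origin: TSeekOrigin): Int64;
    begin
      // The default 64-bit implementation falls back to the 32-bit virtual
      // method, so existing descendants keep working without source changes.
      Result := Seek(Longint(Offset), Ord(Origin));
    end;
    ```

    Replacing the 32-bit method outright, as Rudy suggests, would have silently orphaned every existing override.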


  11. 1 hour ago, Tommi Prami said:
    
    if Assigned(LDocument.documentElement) then
      LNodeList := LDocument.documentElement.getElementsByTagName('IBAN')
    else
      LNodeList := LDocument.getElementsByTagName('IBAN');

     

    Why do you have this test? I would think that the second variant (LDocument.getElementsByTagName) would be good enough...

     

    I also think it would be safer if you used XPath or specified the path in your search criteria:

    LDocument.getElementsByTagName('/BkToCstmrStmt/Stmt/Acct/Id/IBAN');
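
    With MSXML, the XPath variant could look something like this (a sketch; IXMLDOMDocument2 is assumed, and a namespaced document such as camt.053 may additionally need the SelectionNamespaces property set):

    ```delphi
    uses
      Winapi.msxml;

    var
      Document: IXMLDOMDocument2;
      NodeList: IXMLDOMNodeList;
    begin
      Document := CoDOMDocument60.Create;
      Document.async := False;
      if Document.load('statement.xml') then
      begin
        // Use real XPath rather than the legacy XSLPattern syntax.
        Document.setProperty('SelectionLanguage', 'XPath');
        NodeList := Document.selectNodes('/BkToCstmrStmt/Stmt/Acct/Id/IBAN');
      end;
    end;
    ```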

     


  12. On 3/9/2019 at 6:44 PM, Anders Melander said:

    Only because I've just uploaded a new version 🙂.

    A word of warning to those (I'm counting 50 since yesterday) that downloaded this version: If your resources contain bitmaps that were created by older versions of Delphi (or rather, by applications built with older versions of Delphi), then the resource editor might corrupt them on save.

     

    It appears that a bug was introduced in TBitmap between Delphi 2009 and 10.2.  Here's the short version:

    The format of a windows bitmap is basically 1) Header, 2) Color table, 3) Pixel data. For bitmaps with PixelFormat>pf8bit the color table is optional.

    The Header specifies the number of colors in the color table (the TBitmapInfoHeader.biClrUsed field).

     

    Older versions of Delphi sometimes saved bitmaps in pf24bit/pf32bit format with a color table and the corresponding value in the biClrUsed field. This was unnecessary but harmless and perfectly legal according to the bitmap specs.

    Here's an example of what such a bitmap might look like:

    [File header]

    [Bitmap header, biClrUsed=16, biBitCount=32]

    [Color table, 16 entries]

    [Pixel data]

     

    These bitmaps can be read by newer versions of Delphi, but when the bitmaps are written again they become corrupt. Delphi keeps the value in the biClrUsed field but fails to write the corresponding color table. The result is that the pixel data ends up at the wrong file offset.

    Here's an example of a corrupt bitmap:

    [File header]

    [Bitmap header, biClrUsed=16, biBitCount=32]

    [Pixel data]
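
    The offset arithmetic behind the corruption, as a sketch (the standard 14-byte file header and 40-byte BITMAPINFOHEADER are assumed):

    ```delphi
    uses
      Winapi.Windows;

    // Where a reader expects the pixel data to start, given the color table
    // size claimed in the bitmap header.
    function PixelDataOffset(ClrUsed: Cardinal): Cardinal;
    begin
      Result := SizeOf(TBitmapFileHeader) + SizeOf(TBitmapInfoHeader) +
        ClrUsed * SizeOf(TRGBQuad);
    end;

    // Correct file: PixelDataOffset(16) = 14 + 40 + 16*4 = 118.
    // Corrupt file: the header still claims 16 colors, but the pixel data was
    // actually written at PixelDataOffset(0) = 54, so readers are 64 bytes off.
    ```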

     

    The reason why this is a problem for the resource editor is that it is built with Delphi 10.2. I have a fix but I'm not yet ready to release a new version with it.

    Here's the fix btw:

    // Fix for bug in TBitmap.
    // Saving bitmap with PixelFormat>pf8bit with biClrUsed>0 fails to save the color table
    // leading to a corrupt bitmap.
    type
      TBitmapColorTableBugFixer = class helper for TBitmap
      type
        TBitmapImageCracker = class(TBitmapImage);
      public
        function FixColorTable: boolean;
      end;
    
    function TBitmapColorTableBugFixer.FixColorTable: boolean;
    begin
      if (TBitmapImageCracker(FImage).FDIB.dsBmih.biBitCount > 8) and
        (TBitmapImageCracker(FImage).FDIB.dsBmih.biClrUsed <> 0) then
      begin
        TBitmapImageCracker(FImage).FDIB.dsBmih.biClrUsed := 0;
        Result := True;
      end
      else
        Result := False;
    end;

     The problem appears to be the same one reported here: Setting TBitmap.PixelFormat can lead to later image corruption or EReadError
