David Schwartz

Members
  • Content Count

    1191
  • Joined

  • Last visited

  • Days Won

    24

Everything posted by David Schwartz

  1. David Schwartz

    Is it really good practice to create Forms only as needed? Always?

    Generally speaking, I don't auto-create any forms that aren't required to be around for the entire duration of the program's execution. So there's the Main form, any DataModules that are used by other forms (bad practice, but very common), and any other forms that are simply easier to leave around to hang onto values that would otherwise need to be saved somewhere else. Everything else is created as needed and freed when I'm done using it.

    As for your 50-tab notebook ... it must take forever to open that form in Delphi! At some point, I'd change that from a tabbed notebook to a panel where I load an embedded form (created on the fly) from a combobox selection.
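The embedded-form pattern mentioned above can be sketched roughly like this; a minimal sketch, assuming a hypothetical `PageClasses` array mapping combobox items to form classes, and a host panel named `Panel1`:

```delphi
procedure TMainForm.ComboBox1Change(Sender: TObject);
begin
  // Free any previously embedded form (FCurrent is a TForm field on TMainForm)
  FreeAndNil(FCurrent);

  // PageClasses is hypothetical: an array of TFormClass indexed by combo item
  FCurrent := PageClasses[ComboBox1.ItemIndex].Create(Self);
  FCurrent.BorderStyle := bsNone;   // drop the form chrome
  FCurrent.Parent := Panel1;        // embed it in the host panel
  FCurrent.Align := alClient;
  FCurrent.Show;
end;
```

The IDE never has to stream 50 pages of controls at design time; each page is an ordinary small form that loads only when selected.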
  2. David Schwartz

    Should my record be a class instead?

    It's unclear what your point is other than to formulate an excuse not to use classes. I mean, you're going through a lot of gyrations simply to avoid using classes. I think Uwe did you a disservice by simply wrapping your existing mess inside of a class. It would be far simpler to just convert it to a class directly so you could see the difference.

    Your PrepareGlobalRec method IS your constructor, so claiming you don't need "create" is baseless. constructor SomeName; <-- this is for doing exactly what PrepareXxxx is doing! The name is irrelevant. You'll probably also need a destructor to avoid memory leaks, although if the lifetime of these vars is the same as the application, they'll be freed when the program quits.

    I'm not sure why you think you need hundreds of global vars, other than that this is just how you've been doing things for years. This is how VB structures things. There are generally very few global vars needed in properly designed Delphi apps. What you'd probably want to do is put them inside of a DataModule or Form and access them from there. Do you NEED to? No, of course not. Nor do you need to use records or arrays for anything either -- just lots and lots of variables.

    Either you're going to use OOP principles to build your software or you're not. It's up to you. You don't need to ask anybody here for permission to build something using pre-OOP practices. Sure, you CAN ... but why?
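The record-to-class conversion being argued for looks roughly like this; a minimal sketch with hypothetical names (`TAppSettings` and its fields stand in for the poster's global record), where the constructor does what `PrepareGlobalRec` did:

```delphi
type
  TAppSettings = class
  private
    FNames: TStringList;
  public
    UserName: string;
    MaxItems: Integer;
    constructor Create;             // plays the role PrepareGlobalRec played
    destructor Destroy; override;
  end;

constructor TAppSettings.Create;
begin
  inherited Create;
  FNames := TStringList.Create;     // initialization formerly done in PrepareGlobalRec
  MaxItems := 100;
end;

destructor TAppSettings.Destroy;
begin
  FNames.Free;                      // the cleanup a bare record never guarantees
  inherited;
end;
```

Usage is the one-line difference: `Settings := TAppSettings.Create;` instead of calling a separate "prepare" routine on a global record.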
  3. David Schwartz

    General DB access question -- paging query results

    This has to do entirely with how CLOBs are stored vs. VARCHAR2s. The data is new -- it happens with five records created yesterday. And changing the architecture is not going to help, because if you need to store >4k characters then you need to use LOBs. Period. Apparently Oracle saves CLOBs and BLOBs in their own storage segments, out-of-line from the rest of the row, so it's way slower loading them. I don't know why other DBs don't seem to have this problem with huge text entities (like SQL Server's Memo fields).
  4. David Schwartz

    Include unused class and its RTTI in the executable

    Is there any guarantee that the compiler won't ever figure out the never-true condition and simply ignore the code in the block? If it's never executed then there can't be any side-effects, so why not just ignore it entirely? That would not solve the problem, would it?

    I don't like stuff that depends on subtle compiler optimizations that can occur silently; I'd never know they happened unless I happened to look at the generated code. It's even worse than that, because it would work fine in a Debug build that does no optimization, but then fail when the code is built as a Release and optimization is turned on.
  5. David Schwartz

    Include unused class and its RTTI in the executable

    Put this somewhere: TMyClass.Create().Free(); Or add a Register() method and call it in an initialization section somewhere.
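The Register() variant can be sketched like this; a minimal sketch, assuming the only goal is to reference the class so the linker keeps it (and its RTTI) in the executable:

```delphi
unit MyUnit;

interface

type
  TMyClass = class
  public
    class procedure Register;   // no-op whose only job is to touch the class
  end;

implementation

class procedure TMyClass.Register;
begin
  // Intentionally empty: the mere call below creates a reference to
  // TMyClass, so the smart linker cannot strip it from the binary.
end;

initialization
  TMyClass.Register;            // or: TMyClass.Create.Free;
end.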
  6. I can't recall ever seeing any discussions about unit testing of server-side assets, like views, stored procs, and functions. Does anybody do it?
  7. David Schwartz

    Delphi Rio IDE hangs opening 10.2.3 projects

    D10.2.3 does seem to be slower than 10.2.2 at opening projects with complex forms, appearing to "hang" for quite a while the first time any complex form is opened.
  8. I'm working with an Oracle DB and am using TOracleDataset from AllroundAutomation. The app is pre-processing some data and saving it to a couple of tables. One string field is mostly pretty small, usually <100 characters. But there are some big instances, and a handful of REALLY HUGE instances (8500 chars). The max varchar2 size is 4000 chars, so to hold the handful of records that are bigger, I replaced it with a CLOB field.

     The problem is, when the app starts up, it opens the tables using "select * from xyz". That works fine as long as there are no LOB fields -- each one takes 1-2 seconds to open. But adding a CLOB field suddenly causes it to take ~15-20 seconds to open, even if there's just one short line of data in each of the CLOB fields. So I'm trying to figure out some way to "lazy load" the data in the CLOB fields (or even most of the table).

     The nature of this problem is that most of the fields are static -- which is why they're filled from the initial preprocessing step. Even the initial data in this CLOB field comes from the pre-process step. But the user will need to modify it and then save it. (Ironically, it will usually be shrunken way down in the process.) This CLOB field is not read-only, but read-write. However, the data really isn't needed until the user scrolls to that exact record. Then it gets loaded almost instantly, even if it's huge.

     So I'm trying to figure out a way of minimizing the initial load time when I open the table so it doesn't actually load the CLOB data right then, but rather when the user accesses the record the first time. If it matters, none of the data needs to be pre-loaded. It can be loaded as the user scrolls through it. It can also be paged, but I don't know how to do that with typical DB-aware controls. (I'm using cxGrids as well as some separate edit controls, and I've heard cxGrids may have that ability.) I'm looking for suggestions on how to resolve this quickly -- as opposed to re-architecting the entire app.
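One common workaround (not what the poster ultimately did) is to leave the CLOB column out of the startup query and fetch it per-record on demand. A minimal sketch against a generic TDataSet-style query component -- table, column, and component names are hypothetical, and TOracleDataset's own parameter API (DeclareVariable/SetVariable) differs from the generic ParamByName shown here:

```delphi
// Startup: open everything EXCEPT the CLOB column -- fast, no LOB locators
qryList.SQL.Text := 'select id, name, status from xyz';
qryList.Open;

// On demand, when the user lands on a record:
qryClob.SQL.Text := 'select big_text from xyz where id = :id';
qryClob.ParamByName('id').AsInteger := qryList.FieldByName('id').AsInteger;
qryClob.Open;   // loads only this one record's CLOB
memoBig.Lines.Text := qryClob.FieldByName('big_text').AsString;
```

The trade-off is that the CLOB column is no longer in the main dataset, so the memo has to be wired up manually rather than through a DB-aware control bound to the grid's dataset.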
  9. David Schwartz

    log file policies

    I'm curious if anybody has any particular policies they follow when it comes to the use of log files. I've actually never run into any such policy anywhere I've ever worked. But we've run into a problem that is leaving us open to someone asking, quite legitimately, "What sort of policies do you guys follow when it comes to log files and data logging?"
  10. David Schwartz

    Forward declarations

    As I said earlier, if you have circular references, you need to rethink your approach. If a "person" can have an "address" as a property, and an "address" can have a "person" as a property, you've got a problem.

    If you think of this in normal "in real life" terms, sure, a "person" DOES have an "address". But more abstractly, you can say a "person" has a "residence address". And a "person" could have a "work address", and a "previous residence/work address", and so forth. Now, an "address" usually does not "have a person" -- it could have MANY such "persons" associated with it, each with a different role: owner, landlord, manager, tenant, maintenance people, etc. So if you look at this more realistically, your simplified explanations may be misleading you in your architecture.

    Also, you seem to be bumping up against the general notions of "inheritance" vs. "composition", or "Is-A" vs. "Has-A". As Einstein once said, you cannot solve a problem at the same level of logic that caused it.
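The one-directional design argued for here can be sketched in a single unit using a forward class declaration (the mechanism the thread title is about); type names are illustrative:

```delphi
type
  TAddress = class;               // forward declaration inside one type block

  TPerson = class
  public
    Name: string;
    ResidenceAddress: TAddress;   // person Has-A address
    WorkAddress: TAddress;        // and possibly several of them
  end;

  TAddress = class
  public
    Street: string;
    City: string;
    // Note: no back-pointer to a single "person". If the people attached
    // to an address matter, model that separately (e.g., a list of
    // person/role pairs owned by whatever manages both).
  end;
```

Because only TPerson references TAddress, there is no circular unit dependency to fight, and the forward declaration is only needed when both classes live in the same unit.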
  11. David Schwartz

    Delphi-Zeos-SQLite3-TDBEdit

    Your post is a bit vague, so I'll just post the obvious in case you need it: you need to hook the dataset (Zeos query or table) to a TDataSource, then hook the TDBEdit to the TDataSource.
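That wiring is normally done at design time in the Object Inspector, but as a sketch in code (component and field names are hypothetical):

```delphi
// TZQuery -> TDataSource -> TDBEdit
DataSource1.DataSet := ZQuery1;      // dataset feeds the datasource
DBEdit1.DataSource := DataSource1;   // edit control listens to the datasource
DBEdit1.DataField  := 'NAME';        // which column the edit shows/edits
ZQuery1.Open;                        // the edit populates once data arrives
```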
  12. David Schwartz

    any suggestions on how to "lazy load" a BLOB/CLOB from Oracle DB?

    I think you're still missing the problem.

    Varchar2 fields represent baseline performance. It takes ~2 seconds to load 100% of the tables into memory using "select * from Xxxx" on ALL FOUR TABLES. It's about 10 MB of data. With this approach, I'm only missing about 2% of the data, because of the small number of records that have >4000 characters in those fields.

    When I simply replace even one Varchar2 field in a table with a CLOB field, the time to run "select * from Xxxx" goes up by nearly 60 seconds PER TABLE! That's 0.5 sec to 60 secs, just from changing one field from Varchar2 to CLOB. We're talking maybe 100 KB more data in a few dozen records TOTAL. There is absolutely no reason that ten megabytes can be loaded in 2 seconds but an additional 100 KB takes FOUR MINUTES!

    I keep saying, the problem is NOT IN LOADING ALL OF THE RECORDS INTO MEMORY! Where is the overhead coming from in switching from a Varchar2 field to a CLOB field? THAT IS THE ISSUE!

    I've worked with applications that use SQL Server and MySQL with tables that have Memo fields (equivalent to Oracle's CLOBs) holding 100 KB chunks of text -- tens of thousands of records -- and running "select * from Xxxx" takes a few seconds to load 100% of them into memory. We're talking a couple orders of magnitude LARGER databases than what I'm working with here -- and all I ever hear about Oracle is how it's supposedly so much faster and more efficient for HUGE databases. I'm sorry, but it's TOTALLY FAILING here on a teeny weenie change that should have zero impact on performance.

    This is NOT related to "select * from Xxxx" loading the whole table into memory! Not when the entirety of all four tables is UNDER 10 MB! What accounts for the disparity between 2 sec and 4 minutes when the size of the data load is only 100 KB more? It's NOT "load time"! Not when I've got several GBytes of free memory available!
  13. David Schwartz

    log file policies

    Thanks Arnaud, this is one "level" of detail I'm curious about. Another thing I'd like to see is ... what kinds of stuff get logged? There are some devs who set up log files to only capture errors. Some capture more. Context is often a big clue to tracking down programming errors.

    Do you have a policy to make an effort to recover from exceptions, or do you just log them, cancel whatever was happening, and move on? When you log errors, do you log just the exception messages, or do you log contextual data as well? (I like to use MadExcept because it grabs a whole bunch of useful contextual info that's sometimes quite helpful.)

    I realize everybody is different, but what I'm asking is: as a general practice, what kinds of policies have you guys seen? (If you get sued and someone alleges your program was throwing errors left and right for months and months, but the log files show no record of activity that can be proven DID in fact occur ... what's your defense?)
  14. David Schwartz

    any suggestions on how to "lazy load" a BLOB/CLOB from Oracle DB?

    Ok, forget all this crap. You guys are so far off in the weeds that it's ridiculous. Thanks, but let's refocus. The root problem I'm trying to deal with is the performance hit that occurs when varchar2 fields are replaced with CLOB fields. That's all it is. Everything flows from that.

    ****************************************************

    Here's the deal... I have two pairs of tables that represent master-detail relationships. They're very similar, but not identical. The data I'm working with is such that I pre-process it ONCE to load up the master tables and also initialize a lot of the records in the detail tables. The two "master" tables have fewer than 500 records each, and the two "detail" tables won't ever have more than 1500 records. (Not "millions"!) The users will use the program to go through the data, making comments, setting other data flags, and doing things that even weak minds are far quicker at figuring out than code.

    As it happens, there's one field in all four tables that has a very wide range in terms of its length: 95% of the records are under 1k, and 4% are under 4k, all of which fit very happily inside of a single varchar2 field (limited to 4000 chars). But 1% are >4k and a dozen are >8k. This forces me to either use CLOBs or employ some crazy-ass paging using 3 varchar2 fields (which I haven't tried yet).

    The problem is that when I change this field in each table from varchar2 to a CLOB, the TOTAL TIME to simply call Open() on all four tables goes from about two (2) seconds to almost four (4) minutes -- one minute per table. We're talking about only a few hundred records! Given that 99% of the data in those fields is the exact same length (i.e., <4k) whether it's in a varchar2 or a CLOB, the performance impact clearly has something to do with how CLOBs are managed internally, not the tiny number (1%) of really long fields. (And it's NOT data transfer time to the client, because again, <1% of the data is >4k in length.)

    I'm taking this approach simply to allow me to use data-aware controls so I don't have to manually manage all of the loads and stores whenever the cursor changes. Maybe someone has a better idea. But please stop talking about nonsense inherited from the BDE and swapping queries at run-time. At its core, this is a very simple problem, and I'm just trying to find a simple solution so the program doesn't take so damn long to initialize (open the tables) when it starts up.
  15. David Schwartz

    Forward declarations

    If you have circular references, then you need to re-think your design.
  16. David Schwartz

    any suggestions on how to "lazy load" a BLOB/CLOB from Oracle DB?

    That doesn't solve the problem I'm having. The initial Open() on the table uses "select * from <table>", and the presence of the CLOB field has a ginormous performance impact when it runs the Open(). We're talking 1-2 seconds vs. nearly a full minute!

    A grid gets loaded with a few of the columns. When you click the grid, it selects the entire record. It's at THAT MOMENT that I want to load the CLOB data into a TDBMemo. It's like lazy-loading just that column, and then just the records as they're selected. The grid in this case is a DevEx cxGrid.
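The "load it at THAT MOMENT" behavior described above could be hooked to the cxGrid's focused-record event. A sketch under these assumptions: the grid view is a TcxGridDBTableView bound to a list query that excludes the CLOB, and `qryClob`/`memoBig`/`big_text` are hypothetical names (check the exact event signature against your DevExpress version):

```delphi
procedure TForm1.GridView1FocusedRecordChanged(Sender: TcxCustomGridTableView;
  APrevFocusedRecord, AFocusedRecord: TcxCustomGridRecord;
  ANewItemRecordFocusingChanged: Boolean);
begin
  if AFocusedRecord = nil then Exit;

  // Fetch only this record's CLOB, now that the user has selected it
  qryClob.Close;
  qryClob.ParamByName('id').AsInteger :=
    qryList.FieldByName('id').AsInteger;   // id of the focused row
  qryClob.Open;
  memoBig.Lines.Text := qryClob.FieldByName('big_text').AsString;
end;
```

Because the memo is filled manually here, it would be a plain TMemo rather than a TDBMemo bound to the main dataset.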
  17. David Schwartz

    any suggestions on how to "lazy load" a BLOB/CLOB from Oracle DB?

    Yup, that's mine.
  18. In that case, I wouldn't call it "paranoid code". There are people who are taught that you want to employ certain practices to make your code "bullet-proof" or at least "safe". Unfortunately, they either weren't taught, or didn't understand, the reasons WHY, or what's really going on.

     I'm working on this huge code base right now and it's got lots of try ... except clauses throughout it. Unfortunately, in many places, the except clause simply logs the exception's standard error message to a log file and then just goes wherever it goes. The log file is absolutely useless in this case, because there's insufficient information in the typical exception error message to allow you to figure anything out. So we've got lots of nice code designed to safely capture exceptions and log them, but the information being logged is useless. And even worse (IMHO) is that the code makes no effort to recover!

     Many if not most of these exceptions are the result of a try ... except block that I suspect was just wrapped around a huge block of existing code. No effort was made to narrow down the scope of the exceptions. So there are potentially dozens of exceptions that could occur in many different places. The info logged in the file is useless because there are no line numbers, no variable names, no context information at all. And in most cases in THIS particular bunch of code, the logic could easily be adapted to recover from the errors if it did any useful amount of testing and validating.

     I mention this b/c we recently had to deal with one of the apps that started generating a ton of errors. They were all silently logged in the log file, and nobody had a clue anything was awry until a customer called and said there was a problem with some of their billings. We started looking through the log files and saw that there was a GPF occurring regularly that had started one day a few weeks earlier. Nobody had a clue! And we stared at the errors in the log file scratching our heads, trying to figure out what in the heck might be wrong. All we could tell was there was a GPF occurring in one of the units.

     That's why I always lobby to include something like MadExcept with everything. This app hadn't been rebuilt in ages (until recently), so nobody even thought about configuring it in.
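The alternative being argued for -- a narrow try..except that logs context and makes a recovery decision -- looks roughly like this. A sketch with hypothetical names (`Log`, `ProcessInvoice`, `Inv`); the point is the shape, not the specific routine:

```delphi
// Wrap ONE operation, not a thousand lines, and log enough context
// to reconstruct what was being processed when it failed.
try
  ProcessInvoice(Inv);
except
  on E: Exception do
  begin
    Log(Format('ProcessInvoice failed: %s [%s], InvoiceID=%d, Customer=%s',
      [E.Message, E.ClassName, Inv.ID, Inv.CustomerName]));
    raise;  // or recover: skip this invoice and continue the batch
  end;
end;
```

With the scope this tight, the log entry identifies both the failing operation and its inputs, and the handler can make an informed choice between re-raising and recovering.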
  19. Does anybody happen to have an example of how to serialize/deserialize a TObjectDictionary with JSON? Most of the examples I've found are from several versions ago, and I've heard the JSON classes keep improving. There seem to be two or three different approaches, none of which strike me as very straightforward. The class I'm using in this case contains two strings and a TObjectList of simple objects that contain two integers (similar to TPoints).
  20. Just following up ... I ended up doing something fairly simple to de/serialize a few of my classes. Nothing special, and it turned out to be quite easy. But I've been looking over the Grijjy code and it looks like the Grijjy.ProtocolBuffers unit is perfect for what I need. I may go back and replace what I'd done. I could have also used BSON, because there's no need to be able to read the serialized data. Thanks for the suggestions. 🙂
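The "fairly simple" manual approach could look like this, using System.JSON directly. A sketch only -- `TDoc` and `TItem` are hypothetical stand-ins for the poster's class ("two strings and a TObjectList of objects holding two integers"):

```delphi
uses System.SysUtils, System.JSON;

// TItem: hypothetical object with two integers, like a TPoint
function ItemToJson(AItem: TItem): TJSONObject;
begin
  Result := TJSONObject.Create;
  Result.AddPair('X', TJSONNumber.Create(AItem.X));
  Result.AddPair('Y', TJSONNumber.Create(AItem.Y));
end;

function DocToJson(ADoc: TDoc): string;
var
  Root: TJSONObject;
  Arr: TJSONArray;
  Item: TItem;
begin
  Root := TJSONObject.Create;
  try
    Root.AddPair('Name', ADoc.Name);      // the two string fields
    Root.AddPair('Title', ADoc.Title);
    Arr := TJSONArray.Create;
    Root.AddPair('Items', Arr);           // Root now owns Arr
    for Item in ADoc.Items do             // the TObjectList of int pairs
      Arr.AddElement(ItemToJson(Item));
    Result := Root.ToJSON;
  finally
    Root.Free;
  end;
end;
```

Deserialization is the mirror image: parse with TJSONObject.ParseJSONValue, then walk the pairs and array back into objects.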
  21. David Schwartz

    Releasing memory devoted to arrays

    My experience with FastMM and the additional diagnostic tools it offers (you need to download the source and use it in your program) tells me that most of the time memory errors are like landmines planted way earlier, and when they blow up it can be quite random as to who or what triggers them.

    Here's an analogy: using static arrays is like building a house. One simply does not call a friend to increase or decrease the size of a room with a quick refresh. If something is blowing up in a room, it's because a previous tenant did something he should not have done and left a time-bomb -- not the current tenant. You can stare at the current tenant all you want; you're not likely to find the error. Something else is crapping over some memory unexpectedly. That's about all we know right now.
  22. David Schwartz

    Has the migration tool EVER worked for you?

    Not usually. It seems to miss some key things. I can see the differences it makes in the Registry, yet the IDE looks exactly the same after I run it as it did before. I end up spending more time trying to make it work and track down issues than it takes to just reorganize everything by hand.
  23. David Schwartz

    General DB access question -- paging query results

    I get your point, but I'm not sure I agree with your initial point. I've had this beef with Delphi for ages -- the entire IDE and even some language "features" (limitations, actually) are built around the same usage paradigm embraced with D2 for building Client/Server applications. When anybody mentions allowing things in Delphi like separately compiled classes (which C# supports), there's a lot of resistance from the peanut gallery saying, "Oh, this isn't pascalish!" and it never gets anywhere. The Delphi language and IDE have been calcified in this respect since they were born!

    Consequently, C# and Visual Studio have evolved to the point where you can build apps that use some kind of MVC-like architecture (among others) that is supported to some extent within the IDE itself. Trying to build anything other than the regular Client/Server style app in Delphi becomes quite a chore, and you're fighting the IDE at nearly every step. You're right in your characterization of Delphi vs. C# devs, but not for lack of innovative thinking. The simple approach in Delphi is like falling off a log, while a more innovative approach that's reasonable to take in C# is far too convoluted for most Delphi users to even consider, since it goes against everything the Delphi environment has been optimized for over time.

    In a world where Client/Server architectures are pretty much dead, the wizards in charge of Delphi's future continue to crank out release after release with features that make it quicker and easier to build ever more complex Client/Server apps. If you want to build a REST-based server in Delphi, they offer a solution by providing a pre-built server framework that can be extended by creating plug-in modules (DLLs, basically), but the IDE and components still have no knowledge of the dynamics inherent in such designs. And there's no particular benefit of one server over another -- the IDE sees them all as foreign objects that require the programmer to manipulate directly.

    I keep trying to avoid specific DACs in this discussion, in part because we're using ASTA components in our app, and over time the programmers have written code that's highly specific to ASTA's features. And because the Delphi IDE only really knows one paradigm (ie., Client/Server), the code is infused at every possible level with business logic and DB actions that have always been seen as perfectly legitimate for C/S apps, but they cannot survive a transformation to a more modern server-based architecture where all of the business logic and DB interactions happen at the other end of something like an HTTP connection (eg, REST requests). It's really easy to SAY, "Oh, just replace ASTA with ____", but that's a HUGE project in its own right, regardless of whatever _____ might be. There's still the matter of extricating all of the business logic and embedded SQL from the client code and moving it to the server side. (This also presumes that everybody sees these issues as a problem from the outset.)

    Yes, you're right. It's really seductively easy to drop a few components on a form in Delphi and have a simple client/server type app running in an hour or two. When we can build an MVC-like app that uses, say, a REST-based service, all within the Delphi IDE just as quickly, then we'll have made some real progress. Knowing how such a thing is structured and what's needed is one thing; having to build it within the Delphi IDE is like swimming upstream against a strong current -- something most of us prefer to avoid, and something very hard to justify from a complexity standpoint to Management. It's hard enough finding good Delphi devs. But having a crazy complex application architecture that nobody can figure out easily doesn't help. These things are fairly common in the C#/VS ecosystem, not in Delphi's world.
  24. David Schwartz

    Rio has a broken REST Library

    That'll teach you to avoid using a fresh-out-of-the-oven compiler release on a time-sensitive project! In this case, I'd have stuck with what I had the most confidence in. So your hackathon experience turned into a bug-a-thon for EMBT. 🙂
  25. David Schwartz

    Binding TObjectList to TListView

    Do you want to use the TListView just for viewing the contents of the TObjectList? Or do you want to treat it like a DB-aware component that lets you edit what's in the list as well? Those are going to be very different solutions. The former is rather simple; the latter can be quite complex depending on how you want to approach it.
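The "former is rather simple" case (view-only) can be sketched like this; `TMyItem`, `FList`, and the column layout are hypothetical:

```delphi
// View-only: rebuild the TListView from the object list on demand
procedure TForm1.RefreshView;
var
  Obj: TMyItem;
  LI: TListItem;
begin
  ListView1.Items.BeginUpdate;     // avoid flicker during the rebuild
  try
    ListView1.Items.Clear;
    for Obj in FList do            // FList: TObjectList<TMyItem>
    begin
      LI := ListView1.Items.Add;
      LI.Caption := Obj.Name;
      LI.SubItems.Add(IntToStr(Obj.Count));
      LI.Data := Obj;              // back-pointer for selection handling
    end;
  finally
    ListView1.Items.EndUpdate;
  end;
end;
```

The editable, DB-aware-style case is where it gets complex: every edit in the listview has to be pushed back into the list (and vice versa), which is what LiveBindings or a hand-rolled observer layer would handle.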