Everything posted by MichaelT

  1. MichaelT

    Print TDBGrid

    So you are looking for a kind of poor man's PrintDAT! to provide list reporting, list printing, or formatted list printing. The situation is similar to poor man's partitioning on Oracle: it works well, but it is work, and to a certain degree more work in both cases. Take the data from the dataset, use the TPrinter canvas to format the document, and print the result, for example to a PDF printer installed in the OS. I would not bother with 'arguments'/thoughts like 'What if there is no such printer installed?'. A computer cannot boot without an OS installed, so a program can fairly be expected not to print without an appropriate printer installed. - In the case of designing real reports and passing data from a Delphi data source, the story is a different one: FastReport integrates into Delphi, and not the other way around. - Indeed, I'm more infrastructure-focused rather than a fan of doing things application-centrically that can or should be done another way. An application should not handle the transfer of messages, for example; it should just send them. That only makes sense in the very beginning, or when no reliable (messaging) infrastructure can be found or set up. Users tend to get used to this situation. I mention messaging just because it's a prominent example, or the most prominent one.
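The 'poor man's PrintDAT!' idea above can be sketched roughly as follows. This is a minimal VCL sketch, not a finished report engine: it walks a TDataSet and prints one fixed-width line per record on the default printer's canvas. The font, the 20-character column width, and the helper name are my own assumptions.

```pascal
uses Vcl.Printers, Data.DB, System.SysUtils;

// Hypothetical helper: prints every record of a dataset, one line per
// record, on the default printer (which may be a PDF printer).
procedure PrintDataSet(DataSet: TDataSet);
var
  Y, LineHeight, F: Integer;
  Line: string;
begin
  Printer.BeginDoc;
  try
    Printer.Canvas.Font.Name := 'Courier New'; // monospaced for columns
    LineHeight := Printer.Canvas.TextHeight('Hg') + 4;
    Y := 0;
    DataSet.First;
    while not DataSet.Eof do
    begin
      Line := '';
      for F := 0 to DataSet.FieldCount - 1 do
        Line := Line + Format('%-20s', [DataSet.Fields[F].DisplayText]);
      if Y + LineHeight > Printer.PageHeight then
      begin
        Printer.NewPage; // simple page break handling
        Y := 0;
      end;
      Printer.Canvas.TextOut(0, Y, Line);
      Inc(Y, LineHeight);
      DataSet.Next;
    end;
  finally
    Printer.EndDoc;
  end;
end;
```

A real solution would add column headers, widths derived from TField.DisplayWidth, and margins, but the shape of the approach is the same.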
  2. MichaelT

    Does FireDAC allow more than one connection?

    Yes. FireDAC definitely does and did from the very beginning.
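For illustration, multiple connections are just multiple TFDConnection instances. A minimal sketch (the connection definition names 'MSSQL_Main' and 'SQLite_Local' are assumptions; use whatever is defined in your FDConnectionDefs.ini or set the driver parameters directly):

```pascal
uses FireDAC.Comp.Client;

var
  ConnA, ConnB: TFDConnection;
begin
  // Two independent connections, possibly to different database systems.
  ConnA := TFDConnection.Create(nil);
  ConnB := TFDConnection.Create(nil);
  try
    ConnA.ConnectionDefName := 'MSSQL_Main';   // assumed def name
    ConnB.ConnectionDefName := 'SQLite_Local'; // assumed def name
    ConnA.Connected := True;
    ConnB.Connected := True;
    // Each TFDQuery is attached to one of them via its Connection property.
  finally
    ConnB.Free;
    ConnA.Free;
  end;
end;
```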
  3. MichaelT

    Firebird 5.0 released

    Great!
  4. No idea. I am wondering about title="Printer-friendly preview' starting with a double quote and ending with a single quote. Maybe that's a result of providing the manual description. No idea what 'Remoting SDK' refers to, but in general 'Software caused connection abort' doesn't tell a lot, and the socket error is a generic one for any kind of failure or protocol error. I think you are on the right track. It's very likely a not well-formed template, or a problem with handling the template, in most cases. But that doesn't come as a surprise in your case.
  5. MichaelT

    FireDAC Alternative

    Why do you want to switch? As far as relational databases are concerned, I hardly see an advantage in switching at all. Just in case you build your own data access framework, as Arnaud mentioned, Zeos is one alternative, and indeed FireDAC was designed for this: the component layer is really 'just' put on top in order to make the layered library easier to use and to lower the learning curve. FireDAC, as far as the command is concerned, is/was more ADO.net-ish compared to other component sets. If I remember correctly, UniDAC was introduced pretty much around the time AnyDAC (the predecessor of FireDAC) showed up in public. The difference is in the nifty details. From a bird's-eye perspective both are more or less the same; under the hood they work completely differently. Another question is whether you need to be in the position to access different database systems from one application, since the database is already provided by the customer, or 'just' to be in the position to use a different database for 'the next application'. In the first case, focus on the unification layers, for example the unified SQL offered by the 'components' and the statements issued to the database. FireDAC in some cases emulates functionality missing in/not offered by the database system; I have no idea how far UniDAC goes beyond what's offered. The authors of FireDAC, e.g. in the case of Oracle, had a focus on doing things the way they are done on the Oracle database, which for example allowed 3000 users to access a database about 8 to 10 years ago. Working with UniDAC is easier, and those components perform very well too. Their advantage is being in the position to access many databases without the need to have a client library installed. As long as that is accepted by the customer, this direct mode works (very well already) today. There is no such thing as a dedicated support forum.
In general, what is written in places like this is brought to the attention of the authors, or you can file a request in the official Embarcadero Quality Portal. In general: if you are a hobbyist, for example, feel free to use ZEOS. This component set is easy to use and pretty straightforward, also for someone who doesn't care a lot about the specifics of the underlying database, and the ZEOS components are well layered too. If you have the choice, rely on one database ('forever') and use specific components. I like DOA for Oracle access, simply because that's really Oracle, just in Delphi, and not a TDataSet style put on top of the Oracle database. It's ORACLE-like when it comes to treating things the ORACLE way in the case of code blocks vs. procedures vs. queries, in the way things are presented at the database level - kinda 'SQL*Plus script' just coded in Delphi. Today few people care about such things.
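For the hobbyist case mentioned above, a minimal Zeos sketch might look like this. This is an assumption-laden example: the protocol string 'mysql', the host, database, and credentials are all placeholders, and unit names follow the classic Zeos 7/8 layout.

```pascal
uses ZConnection, ZDataset;

var
  Conn: TZConnection;
  Query: TZQuery;
begin
  Conn := TZConnection.Create(nil);
  Query := TZQuery.Create(nil);
  try
    // All of these values are placeholders for illustration.
    Conn.Protocol := 'mysql';
    Conn.HostName := 'localhost';
    Conn.Database := 'testdb';
    Conn.User     := 'dev';
    Conn.Password := 'secret';
    Conn.Connect;

    Query.Connection := Conn;
    Query.SQL.Text := 'SELECT * FROM customers'; // hypothetical table
    Query.Open;
    // ... iterate as with any TDataSet descendant ...
  finally
    Query.Free;
    Conn.Free;
  end;
end;
```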
  6. I usually don't use both, or better said, I never bothered with either of them. So please consider my 'suggestion'/guess carefully. If I recall correctly, midas.dll has to be registered using regsvr32 (from the SysWOW64 directory on 64-bit Windows), or you can use the MidasLib unit instead. Not totally sure, but that was the first suggestion found in a thread on Stack Overflow. The point with MidasLib seem(s|ed) to be a registry entry allowing the library to be located. Frankly, I'm just guessing.
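The MidasLib alternative mentioned above means linking the MIDAS code statically into the executable instead of deploying and registering midas.dll. A sketch of a project file (the form unit name is a hypothetical placeholder):

```pascal
program MyClientDataSetApp;

uses
  MidasLib,   // links the DataSnap/MIDAS code statically;
              // no midas.dll deployment or registration needed
  Vcl.Forms,
  MainFormUnit in 'MainFormUnit.pas' {MainForm}; // hypothetical unit

begin
  Application.Initialize;
  Application.CreateForm(TMainForm, MainForm);
  Application.Run;
end.
```

The key point is simply that MidasLib appears in the uses clause of the .dpr; nothing else in the project changes.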
  7. Benchmarks are benchmarks, and they usually compare the capability of a certain piece of hardware, or of a system as a whole, to solve a task. That's what they do, and speed-up is the point, because slower is easy to achieve in general. The granularity of data is growing, and considering time series has become more important than ever before, since hardware became fast enough. I also find it hard to think of an application for just sorting key figures/values in an array as an independent dimension not connected to descriptive values. Maybe my brain is too strongly tied to non-math database thinking, and I'm no technician or mathematician; those guys make use of such data structures. Sorting values should result in faster lookup times, acting as an index, and in extracting ranges in series of values, for example. No idea why someone would need that, but as mentioned shortly before... In general we are used to, me at least, knowing what we are looking for. If values are just retrieved, searching and creating ranges can help to find out what should be looked up. Integer (just normal) as well as floating-point values can act as describing dimensions too. Technical applications, or the analysis of production line data, or planning worked that way. In a business application almost everything that can be related to a surrogate id is more or less discrete, and searching starts with specifying describing values retrieved from the describing dimensions assumed to have an impact. To find the impacting/influencing dimensions, the measured values have to be analyzed first, values grouped, and so on. One of my former business partners simplified optimization by first removing the criteria that had no impact on the optimization of the product mix in production, instead of considering a fixed set.
Richard Werner (the German economist) used pretty much the same trick to remove the impact of the interest rate on the demand for credit in business, for example, unless the plug is pulled by the central bank. It's no surprise, because 'a line'/company providing goods in the classical Mid-European industry model used the interest rate, after approx. 35(40) to 50 years, to decide whether the line should keep producing and just repair the broken machines, or build a new plant (reproduction of the line). Since the mid-1980s, even in Austria, everything has been pushed in the direction of the 'Anglo-Saxon' industrial/business model. Those guys simply rebuilt the plant after 45 years and never really bothered with this situation. The myth of the impact of the interest rate, and the belief concerning investment decisions, is still alive and kept alive. The same myths exist in many production planning systems considering a wide variety of influential factors/criteria retrieved from the past, and thereby from so-called experience. That's why so many optimization models rely on, let's say, about 40 describing dimensions/criteria/influential factors taken from a pool of maybe 2000 potential influential factors, while in current production, in a steel plant's liquid phase, only 3 or 4 really drive the optimization software's decision and the result of the planning and optimization run. This requires the analysis of the values/key figures and the describing technical dimensions first, and the analysis should finish quickly, because today a production planning run happens in de facto real-time (about 0.5 sec or less), and finding the influential factors should also not take long. In the past, 20 years ago, such a run was allowed to take 20 minutes.
  8. One of the major reasons to use components from Chilkat, a few years ago, was SFTP and/or FTPS, in contrast to Indy. Without a really good reason, performance concerning floating-point operations for example, I'm no big fan of deploying dynamic link libraries instead of static linking. What's missing, Embarcadero has to provide, and improvements too. Where does deploying this and that in a different way lead? It causes headaches even in the mid term in practice. Headaches go away with a Delphi-only solution after a while, and that's when the problems of 'Wolpertinger'-style deployments start. The whole story looks different if you want one homogeneous component set serving different development technologies and/or programming languages. Even in this case, homebrew bindings, bindings provided by a community, and/or out-of-the-box ones come into play. Thinking of Java, with the possibility to 'just plug in' a different socket, would be an alternative (early days of SSL). After a while, what makes the difference today will be caught up with by a certain kind of 'out of the box' way of doing things.
  9. MichaelT

    Is it worth resubscribing now?

    I'd say, simply give it a try. I personally don't use Code Insight, or only in a very limited way. Code Completion is turned on, and auto-invoke is not used in general. Anything else is sugar for me. So what happened? I simply switched from classic Code Insight to the LSP-based one in 10.4.1, with the 32-bit target enabled on an open project, and methods added to a class and procedures/functions added to a module didn't show up at first. Closing the project helped, and cleaning the project directory too. Afterwards I switched to the 64-bit build target, no idea if that had an impact, and things started to work. Later I started to add one feature after the other. I was surprised that Error Insight, for example, works pretty smoothly, apart from a few minor glitches/optical irritations. The more stable the module's interface becomes, the better the experience turned out to be. Now it's possible to add many more features surrounding the core editing features, and those show speed improvements compared to classic Code Insight. Delphi has never been bug free, but the moment existing problems were fixed and others showed up in places 'just a few people needed', the IDE became very usable in the past too. The absence of methods, functions and procedures in Code Completion was something obvious, so I started to investigate deeper. Doing so makes people unhappy, and the focus turns to occurrences that later show up a lot less in practice or seemingly go away. I'm happy about things that work. After years I turned on code navigation, for example, and the help/information automatically generated from comments when the mouse moves over a symbol. That's very helpful. Honestly I cannot comment on anything else, but simply sitting down and writing a test application from scratch worked pretty smoothly for me in 11.x, starting with 11.2. 11.1 I simply missed.
In practice I don't use too many sophisticated coding features, because few IDEs offer so much with so many things also working in detail over time, once they become stable. I have no idea about the whole bunch of FMX-related stuff like Android, Mac, iOS and such things. At the moment I don't care about those. Since I'm a hobbyist, I mostly turned to Free Pascal/Lazarus on Linux in the first place, and if fpc offered code completion I'd not even start the GUI. Coming back to your question: don't wait with testing and hope that everything works great as announced in high-gloss brochures. Expectations raised this way hardly turn into reality, and that holds almost everywhere, not just at Embarcadero. My impression still is that RAD Studio in general has undergone major improvements. A working IDE is a matter of perception concerning the things relevant to the developer. An in-depth look shows that RAD Studio really works pretty well for what it does and to the extent it does its job. The question concerning Delphi has always been how much less would be more helpful. Imo a lot more beyond core editing will not help to make development better or the developer a better/more skilled one.
  10. MichaelT

    Is it worth resubscribing now?

    Thank you very much Uwe!
  11. MichaelT

    Is it worth resubscribing now?

    Give it a try first. If you are satisfied, the upgrade will pay off. I mostly use 10.3.2 (Rio) and 10.4.1 (Sydney) (which includes the LSP fixes from 10.4.2) in a certain advanced trial mode on my old laptop, which has no working battery and is in fact always plugged in, but runs Win 8.1 and a working Delphi installation (simply because of the Client/Server Add-On pack). I hardly use Code Insight and prefer the old, non-LSP-based one, but the LSP-based one works amazingly well as long as the Code Completion relying on it works. Sometimes cleaning the project directory helps. This laptop is equipped with a hard disk, and just a hard disk. 11.3 is installed on my new Win 10 laptop, which I rarely use, but it does its job so far. I don't have to maintain large code bases. Berlin is/was a great release indeed. In the worst case you could step back to 11.2, for example. The overall experience with RAD Studio 11.x, including 11.3, is satisfactory. The IDE seems to be very responsive, but I cannot offer the results of a direct comparison on the same computer; the new machine is more or less a gaming laptop from hell and dead fast in general. This makes a difference if I think of a full RAD Studio Enterprise installation, but just for Windows and Linux, using all the features made available. This works definitely smoothly, without delays. I think others can share their experience with Code Insight, and they are a more viable source. Again, in general I tend to think that 11.x is an improvement over Rio and Sydney; that's my overall impression.
  12. MichaelT

    FireDAC

    I think ADO should work. Apart from that specific requirement, SQL Server connectivity using ZEOSLIB (Zeos Database Objects) makes sense. At the top of the Zeoslib Portal you can find a links section with direct links to the various download sections. Zeoslib also works for me, but I'm a hobbyist. As far as I can see, SQL Server (native) does not seem to be specifically covered.
  13. Maybe you are remembering/talking about the RealThinClient Portal built on top of the RTC SDK (iirc, not totally sure about that), which is still available as is/was in the customer area of NexusDB. The release date indicates a version 3.72 made public on 09/24/2012. It's a Standard version, and I have no idea if all the source code is included, e.g. the gateway and so on. As far as remote control is concerned you are there, but for file transfer it's not required and HTTP is used directly. Coming back to the RealThinClient SDK, allow me to quote one answer found in the RTC Forum, taken from a thread named 'File transfer question'. Danijel replied: '... Just use a TRtcDataProvider component on the Server and TRtcDataRequest on the Client side. If you need examples, check BrowserUpload, ClientUpload and ServerLesson1 - 4 in the QuickStart folder (included in the RealThinClientSDK zip file).' Regards, Mike
  14. The TIOBE Index makes no sense (anymore). No idea if it ever did. After the (first) software crisis, a NATO conference in the late 1960s addressed this issue, and one of the results was to establish various programming languages, domain-driven ones especially. That's what all those rankings reflect nowadays, not more, not less. What can we say? What we see today has returned, in a more sophisticated fashion. Compilers were there in the beginning. Over the years, less restricted technologies (in the end) and programming languages were (re)introduced in order to get an idea of what people can/will do once compiler restrictions are gone. After a strong period of (W)intel dominance, the question arose (btw. an old one, even on the 'very first' IBM computers that made it into public): 'What will happen if new and/or more specialized processors show up?'. That was already pretty clear in the 1980s and 1990s, and no surprise at all if we think of home computing. One direction hinted at compilers and portability, and a second at virtual runtime environments, interpreted first and JIT-compiled & friends later. What we see is that no solution has been found until today. What we see is that the insanity ascribed to programmers moved away from the individual to technology engineers and language designers. The TIOBE Index could, to a certain degree, hint at what combination will be tried out next and by whom, or who has a fair chance to succeed. Programming languages and technologies are not solely aimed at software engineers. Since no best way to address the everlasting changes 'underneath the programming language' and/or execution environment was found, feel free to use whatever you like. Outside the traditional world of customer/supplier relations, freedom can be found at every corner today, and the question of whether 'your programming skills are up to date' cannot simply be answered by a ranking of programming languages.
  15. MichaelT

    Firedac MySql pessimistic lock

    TFDPhysMySQLCommandGenerator.GetPessimisticLock in FireDAC.Phys.MySQLMeta: it seems that LockMode is not considered; in the case of LockPoint = lpImmediate, FOR UPDATE is used, with LOCK IN SHARE MODE for lpDeferred, and the WHERE clause is taken into account. To me this code looks like it is part of preparing the SELECT SQL for the command. I have checked the FireDAC sources related to MySQL, and the only line found was this one, at line 679 in the code shipped with Delphi 10.4.1. I don't use MySQL or MariaDB very often, in fact only rarely; I simply use AUTOCOMMIT or distinct sets of records. Maybe I don't remember correctly, but this behavior was, and very likely still is, by design. I remember a discussion a long, long time ago about MyISAM (extensive locking in general) on one hand and InnoDB on the other. Heaven knows if that's still (a|the) reason today. It seems the decision made in those days was to offer AUTOCOMMIT on one hand and the compromise described above on the other. I simply do it that way. I'm no fan of locking in general, and indeed I prefer non-committed reads, because if my code is not dirty enough, at least the blocks read from the DB should be. Kidding. Usually read committed is enough for me.
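To make the two lock points concrete, a small sketch of how they are set on the query side. The table name and parameter are hypothetical; the comments state the observation from the post (lpImmediate reportedly yields '... FOR UPDATE', lpDeferred '... LOCK IN SHARE MODE' against MySQL), not a guarantee:

```pascal
uses FireDAC.Comp.Client, FireDAC.Stan.Option;

// Sketch: configure pessimistic locking on a TFDQuery whose
// Connection property is already assigned to a MySQL connection.
procedure OpenWithPessimisticLock(Query: TFDQuery);
begin
  Query.UpdateOptions.LockMode := lmPessimistic;
  // lpImmediate: lock when the rows are fetched (SELECT ... FOR UPDATE);
  // lpDeferred: lock later (SELECT ... LOCK IN SHARE MODE), per the
  // observation in FireDAC.Phys.MySQLMeta discussed above.
  Query.UpdateOptions.LockPoint := lpImmediate;
  Query.Open('SELECT * FROM orders WHERE id = :id', [42]); // hypothetical table
end;
```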
  16. MichaelT

    Comport Serial Port help

    Give the Read function/method a try. As far as I can see, the function in ComPort.dll offers passing a Pointer as a buffer and a count. I'm not sure if that's included in the component itself and exposed. I think this is not the first time I have read about this issue in a discussion.
  17. MichaelT

    Comport Serial Port help

    Winsoft's ComPort.dll indeed offers a function called ReadByte. I hope the same holds for the component.
  18. MichaelT

    What is RAD Server?

    You could regard RAD Server as an InterBase add-on making that database more complete and future-ready than ever before. Kidding. We have all been around in the Delphi forums for too long, so I couldn't resist. There is more, push notifications for example, but indeed closely related to technologies used in conjunction with, let's call it, Delphi. Once you come to the point where you think 'I'll take a) a web server, b) add some modules to encapsulate a framework's functionality into something similar to WebSnap modules, c) put my code into libraries loaded at runtime, for example into an executable or another library communicating/integrating with the web server, d) make use of FastCGI for example, e) ... whatever comes to mind', use something that works out of the box and handles a), b) and d) for you. Most of these 'technologies in a broader sense' ended up as cut-down versions of what was originally intended. In general, the EMBT eco-system needed its own thing, allowing you to focus on c). The opportunity arose with REST services.
  19. MichaelT

    TTask on Sydney 10.4.1

    I personally just see a few entries in the Structure pane while adding code to the var sections, for example, but those immediately disappear once the code is entered correctly. For a short moment the one you mentioned appears too, accompanied by many regarding TThread.Synchronize, for example.
  20. MichaelT

    TTask on Sydney 10.4.1

    Works in Embarcadero® Delphi 10.4, Version 27.0.38860.1461 (RAD Studio 10.4 Update 1). Not sure if this is 10.4.1.
  21. MichaelT

    Conflict with TestInsight

    Howdy Uwe, I had a similar issue; we had a short talk here. I thought GExperts was causing the issue with MMX Explorer not drawing the panel in the middle where users can add classes, for example. After disabling or hiding the Structure pane (provided by Delphi) it turned out that the Structure pane sometimes takes a while to analyze the file, and you or Gerrit stopped populating this panel. Not sure about the real reason. As far as I remember, it has something to do with the Structure pane starting to use multi-threading to analyze the files, and/or with it taking too long before it returned, in some cases without making use of threads. I think Gerrit once wrote in a thread in the forums something like: having both views populated wouldn't make a lot of sense if populating one already takes too long. Maybe there is a common cause for the panels not getting updated. I knew the blank TestInsight window already from the past, without having had MMX Explorer installed. I also don't think the problem lies in having both add-ons or add-ins combined. I have experienced no problem so far, even though I rarely make use of both, even 'standalone'. Michael
  22. MichaelT

    Conflict with TestInsight

    Quick shot: do you have the Structure pane enabled/visible, even if 'hidden' by/behind, for example, the Object Inspector tool window? If yes, does the behavior change if you hide it?
  23. MichaelT

    The future of Delphi

    Did you visit David I.? 😉 Connect all the 6 boxes, since everything is connected with everything else, and all problems are gone. I doubt that will work. pgAdmin works great in virtual boxes, especially because of the dashboard, and debugging against a run-time environment on the server works great, but only if just the screen is sent to the client. Forget it. I tend to agree that in both cases, Delphi and Python, a solid, well-maintained base of C bindings attracted people. Think of Project JEDI, but with a Windows origin; and the 'integrate everything into the desktop and the Explorer' strategy put another level of complexity on top, which led to anything but simplicity in the end. Apart from that, you are talking about something totally different from Delphi. I'm not sure a Delphi-like way is an answer to the underlying questions that would be accepted by a broader audience. Elevate Web Builder would be a first step in such a direction once debugging on the client side is possible, but even that is not really required at the moment. Without proprietary add-ons, integrated functionality is generally hard to achieve. It's 25 years too late for 4GL love. There is no such thing as an open, let's say, ABAP stack just for the Web. It sounds like you want something pretty similar to the SAP GUI, called the Delphi GUI. I worked on/with something called XSTP-GUI, which integrated Java widgets in the mid/late 1990s: an approach that worked like a charm on a Smalltalk system, put to the next level on Java, busted with flying colors. JavaScript is about portability, and from this perspective an integrated IDE like Delphi is a meta tool allowing you to build the environment you suggest. The more features you add to the very definition of an IDE (an advanced editor with a menu entry called Tools that supports invoking them in the context of the IDE), the more interfaces (common sense) you add to other disciplines of software engineering, who do not even care a tiny rabbit shit about you and your IDE.
The last revival in such a direction, which inevitably comes with your suggestions outside the scope of what an IDE is meant to be, I saw in the fashion of add-ons to the Eclipse IDE, which again failed to succeed even in the mid term, because of breaking changes in the Eclipse IDE itself. Do you really think that all the others have to put things in the right place at the right time just because you want to press a button and have everything work as you intend? 😉 Maybe it's the biggest tragedy for the Delphi super-hero that the world never worked this way, and rest assured it never will. The Delphi world is about succeeding in a dynamic environment where anarchy still matters and rules from time to time, or all the time, and not about a consensus on praising others for leaving things unchanged. I tend to agree that developing in an environment other than the target environment makes development pretty complicated. Going beyond an IDE-based approach leads pretty quickly to something beyond, a workbench for example, or many of the 1990s approaches that worked pretty well generating C code; and indeed they were abandoned for the wrong reason, 'C', which was heavily bashed in those days. All that came later used a virtual machine, especially because of being in the position to utilize dictionaries, in the case of the CRM systems (integrated development and execution environments), or exposed RTTI (JVM, .net runtime, ...). It's no surprise that those who never worked with e.g. Smalltalk before tried a revival, but on another technology called Java or .net, and wasted their time rebuilding all the crap people threw out of the windows when XP programming was introduced, including their managers and the bureaucrats. Kidding. Even if IDEs dominated the scene for a long time, their time has come and gone when it comes to software development. I see no reason why an integrated tool should give the answer. You should not assume that an application is what people want.
People got used to it. Except for the very early versions, Borland turned Delphi, said more precisely Turbo Pascal, into a Y2K child. I didn't have the impression that Y2K was a challenge for small shops in the first place. Everyone asked for business software on Windows, so it's no surprise that Delphi turned into what it is today. It's just not that bad. EMBT had enough to do with successfully putting the Wild West style mess we left behind from 1997 to 2005 into something somehow consistent, which no one knows what it's good for today. The 1990s were about making money and not software engineering, since it turned out soon that the whole bunch of software methodologies and other failed approaches allowed budgets to be killed. After those days were finally gone with the disappearance of Windows XP, the honest souls were left behind in what, without them, could be called a ghost town. But what those who disappeared behind the horizon after a tough ride left behind were the requirements for what is called FMX today, and EMBT had to live with that situation. In both cases the question is how far an IDE-based approach is the answer to both love in the very detail and a totally open approach attracting developers beyond what's already available. If you remember, the answer to performance was: a) buy a new Windows and a new computer, b) let the database do the calculation jobs or use extensive profiling, and c) use assembler.
  24. MichaelT

    Absolute Database question

    That's one just for the German government. The small-footprint, light-weight databases made/make lots of sense in environments where people manage(d) the infrastructure on their own. There is a use case in the steel works of my hometown. As a result of a problem in a plant, the whole cabling, including the parts related to the IT network, melted. For the multi-user application used by a small group of about five people from the team responsible for repairs, they just drove to the data-center, took the files, and put them on a local share on a server in an office within the plant. The inventory system was required because one or two hours later about 200 to 400 people started the repairs and needed lots of materials and so on. This application didn't rely on Absolute DB; they relied on the other solution whose name I've already forgotten, but you know which one I mean. In several scenarios some features worked pretty fast. 'A better in-memory table covering multi-user scenarios' is not a bad choice, but don't expect it to be a page-oriented DB server. You should know better. In general, for about 10 years now, the demand for those seemingly 'esoteric' solutions has been shrinking radically. Honestly, why not choose a different database from what everyone else uses? Hey, it's Delphi! Why bother with the 21st century, which doesn't look that great until today anyway, when you can still program as if both Kurt Cobain and Elvis were alive and kicking. Kidding. As long as no one else but you or your application is fiddling with the data, such solutions do their job pretty well. What I'm wondering is whether ARM processors would give such pluggable persistence-interface solutions a push. That's why I never really understood why EMB(T) didn't offer an ARM compiler outside iOS or Android, and a minimalist framework allowing people to just port non-visual stuff. I love such things. TurboDB, Apollo, VistaDB (.net only, I think, today), NexusDB, ElevateDB and so on are still great.
Also here, except for niche scenarios, people simply put their IT in a data-center and 'forget the crap'.