Everything posted by MichaelT

  1. MichaelT

    What is Shift+F2 supposed to do?

    Shift+F2 moves focus to the Object Inspector and puts the currently selected property into a certain kind of edit mode. I have no idea whether that feature is actually useful, required, welcome or widely used in practice, since I had never come across this keyboard shortcut before.
  2. Did you have IDE Fixpack 2007, or any other helpful extension/add-on, installed before that made the problems go away?
  3. MichaelT

    Anyone using Clever Components?

    The whole Microsoft eco-system gained its competitive advantage by off-shoring to Russia in the 1990s, at the time of Y2K (in comparison to IBM). It has always been about (big) enterprises and nothing else. Afterwards there were several economic development programs in the U.S. aimed at moving companies to the West, at least in a legal sense (subsidies for founding companies). It is more or less a copy of the British off-shoring idea (the deindustrialization of England in the 1970s). As far as business is concerned, nothing has changed with Russia so far. Think of the prime minister of Estonia: she touts extreme anti-Russian propaganda on the one hand, while her husband still does extensive business with Russian companies on the other. A second reason was the gain in reputation from setting up the company in the U.S., and within the U.S. in the state with the most restrictive terms and laws concerning business (SQL Detective & Co., for example). The one who set up the business side is a former Oracle marketing director from Austria and a 'friend', a big word, of mine.

    The economy of Russia, in contrast to the E.U., is far more liberal and younger, so it is no surprise that one-man shows and smaller companies have a fair chance to succeed. When we grew up in Austria during the so-called steel crisis of the mid 1980s, the situation was not as bad as in Russia in the 1990s, but from the perspective of material flow the Germans put their goods onto the shelves they set up here (Aldi/Hofer). At the peak of this evolution we had three kinds of shops for food: discounters, luxury stores and market stands in the backyards of the parts of the city populated by the Turkish community, offering the best fruits and vegetables one could grab. Pretty much the same happened to the vast majority of third-party vendors after 10 years as far as Delphi was concerned.

    Things don't work out this way; such transitions happen in a well-organized manner in order not to run into shortages. Local products disappear for about 10 to 15 years from the business-to-customer part of a classic industrial consumption society and move into B2B, especially when evolving from non-industrial lines to industrial ones (carpenter vs. …). Another trick in Austria was to introduce free trades and the matching licenses; everyone with money in a savings account could get one and access the product via B2B. Follow the model and introduce exceptions (to the rules) whenever applicable. In the meantime some carpenters turned their small companies into furniture factories (more automation, machines instead of heavy tools), and tourism as well as the business with Russia was the supporting driver of this evolution. What was needed was a higher price level. A third tendency was to cut off the growing dominance of shops from Russia (in the sense of protection), on the huge Java stacks for example, in the middle of the first decade after the millennium shift.
  4. That language 'feature' is a useful relic. A file in the 'C' or 'Pascal' world is a compilation unit, while a module is defined at a different level of abstraction. The keyword unit is about compilation units; Modula introduced modules. That was more or less the 'host world' & friends, with no compilers, or at least nothing we would call a compiler today. Those systems loaded a module into an execution context/process, and the code ran and/or was interpreted under the control of the executing system. A PC application, or a program on home computers, is controlled by a minimalistic runtime and uses the OS far more actively. Modules can have parameters; have a look at PL/SQL. If you think of MTS or COM and apartments, sloppily said, it's about bringing those old machines onto the Windows PC. There is a good reason for having interfaces and functions/procedures there, and not classes in general. This COM and COM+ world is a mixture of IBM host and DEC computers, I think (apartments and friends). In remote scenarios a parameterized module does its job pretty well (Oberon, the Modula successor). One way to mimic modules is exactly this kind of procedure. Object orientation also means modularization, and a class is usually responsible for it. In that context an object is a loaded module ready to execute, and the application already plays the role of a 'host' computer, with a GUI, in a certain kind of 'single user' mode (kind of like MS-DOS). With the help of threads or co-routines this single-user application turns into a multi-user application, just without a GUI on the server side. Another example would be package vs. package library: a library is simply a concatenation of compiled compilation units, whereas a package library goes beyond that. In Delphi you still have compilation units acting as a module, for example initialization and finalization, unit variables and so on, plus classes and procedures acting as a module, pretty similar to Pascal on 'host' computers (a small sketch follows below).

     On the other hand, people experimented with different ways of handling the visibility and accessibility of variables and recursion. Some problems in theoretical computer science can be solved more easily, or at least more readably, when a function call on the stack is read a different way: from left to right, right to left, or from the middle. In practice those cases are rare, but in the days of the upcoming general-purpose languages there were approaches to mimic the capabilities of domain languages focused on math problems and expressed in the U.S. style of function definitions (Haskell and other functional languages, from the syntax perspective). In Europe we use a different standard way of describing mathematical functions, with different notations. Agreed, I never saw these old systems myself; in the 1990s most of this was already pretty old stuff, just mentioned in our lecture notes at university by the people who built PL/I at IBM: Mr. Rechenberg (left-linear programming languages, LL(1), Pascal for example, where just one symbol is needed to decide what comes next) and his colleague at the university in Berlin, a guy from the Netherlands, who focused on LR, i.e. C. The difference was that the LL compilers were hand-crafted, while the LR languages were the ones built with Lex and Yacc. The difference is not the grammar alone but the way the compilers are built. Later C, and especially C++, syntax and compilers were harmonized towards a more left-linear appearance (a second pass no longer required, or not as essential as before; from necessity to feature).
I hope I scraped this very old knowledge together correctly. I never built a compiler; we built Pascal interpreters, or better said Modula interpreters aka Mini-Modula, later replaced by Mini-Java interpreters, with the focus on recursive descent parsers.
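To make the 'unit as a module' point above concrete, here is a minimal sketch; the unit name and routines are made up for illustration and not taken from any real code base:

unit CounterModule;

interface

// The unit behaves like a small module: it owns its state and exposes
// only the routines declared in this interface section.
procedure Bump;
function Current: Integer;

implementation

var
  FCount: Integer; // 'module variable', invisible outside this unit

procedure Bump;
begin
  Inc(FCount);
end;

function Current: Integer;
begin
  Result := FCount;
end;

initialization
  FCount := 0; // runs once when the 'module' is loaded/started
finalization
  // cleanup would go here; runs when the program shuts down
end.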
  5. MichaelT

    ActionList Editor: New Standard Action...

    On 1) LOL. I was not aware of this fact, since I don't use third-party controls that add actions. I see the advantage. Maybe they have/had no other alternative left, because caching means listening and updating. No doubt such a delay sucks, and EMB has to fix the issue urgently (without delay) and make the whole thing fast. Showing hourglasses is nice, but in that case I'd guess a speed-up is required, if it is possible to implement one. The situation is not just cumbersome for you and your daily work; nowadays such things are simply annoying. Putting the blame solely on EMB doesn't make sense, since those who add such actions have to take care that the Delphi IDE stays responsive. Whether the dialog broke afterwards, we'll see. Agreed. It has been a hot discussion these days, a storm in a teacup, because of the no-code touting of marketing and the promised speed-up in development. Karma is returning. In fact I just wanted to let you know, to lower expectations, about the outcome of the discussion these days. Question: does the problem show up in the Action Manager too? It looks like the same dialog is used.
  6. MichaelT

    ActionList Editor: New Standard Action...

    Cat lover? Feel free to cuddle a bunny of your choice, but in the end bunnies are bunnies with all their very specific lumps and 'd'umps 🐇 As far as I have read, you already recognized that the problem with the default actions in the RAD Studio IDE cannot be solved in Visual Studio, for example. So it's wise to file a QC report, or as Uwe calls it a QP report; please consider that a new portal has been set up by EMB, formerly known as EMBT, although for a while now the technology has gone astray to a certain, but in the end lesser, degree than expected. I didn't come across this issue in the past, and thank you for posting your experience. Since take it or leave it is the new normal almost everywhere, or better said take things the way they are and leave them that way, simply don't use the default actions as long as the fix is not considered and/or implemented. I have no idea whether it's possible to separate actions into separate BPLs. If I remember correctly, this topic, or a similar one related to actions, is not totally new. At least there was something in the past with 'too' many actions as well, and the outcome was not to use the default action list. I don't recall whether that topic came up at TMS. The result of the discussion simply was that people using component set XYZ should use the Action Manager provided, and in the case of the Action Manager (dialog) provided, not use the feature you are talking about.
  7. MichaelT

    Print TDBGrid

    So you are looking for a certain kind of poor man's PrintDAT! in order to provide list reporting, list printing or formatted list printing. The situation is similar to poor man's partitioning on Oracle: it works well, but it is work, and to a certain degree more of it, in both cases. Take the data from the dataset, use the TPrinter canvas to format the document, and print the result to a PDF printer installed in the OS, for example (a small sketch follows below). I'd not bother with 'arguments'/thoughts like 'What if there is no such printer installed'. A computer cannot boot without an OS installed, and in the same way a program cannot print without an appropriate printer installed. - In the case of designing real reports and passing data from a Delphi data source, the story is a different one. FastReport integrates into Delphi, not the other way around. - Indeed, I'm more infrastructure-focused rather than a fan of doing things application-centrically that can/should be done another way. An application should not handle the transfer of messages, for example; it should just send them. That only makes sense at the very beginning, or in case no reliable (messaging) infrastructure can be found or set up. Users tend to get used to this situation. I mentioned messaging just because it's a prominent example, or the most prominent one.
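    A minimal sketch of that idea, assuming a VCL application and whatever the default printer happens to be (a PDF printer works just as well); the routine name PrintDataSet, the fixed-pitch font and the crude column layout are placeholders of mine, not a finished report:

    uses Vcl.Printers, Data.DB;

    // Dump a dataset line by line onto the printer canvas.
    procedure PrintDataSet(DS: TDataSet; const Title: string);
    var
      Y, LineHeight, I: Integer;
      Line: string;
    begin
      Printer.BeginDoc;
      try
        Printer.Canvas.Font.Name := 'Courier New';
        Printer.Canvas.Font.Size := 9;
        LineHeight := Printer.Canvas.TextHeight('Hg') + 4;
        Y := 0;
        Printer.Canvas.TextOut(0, Y, Title);
        Inc(Y, 2 * LineHeight);
        DS.DisableControls;
        try
          DS.First;
          while not DS.Eof do
          begin
            Line := '';
            for I := 0 to DS.FieldCount - 1 do
              Line := Line + DS.Fields[I].DisplayText + '  ';
            Printer.Canvas.TextOut(0, Y, Line);
            Inc(Y, LineHeight);
            if Y > Printer.PageHeight - LineHeight then
            begin
              Printer.NewPage; // start a new page when the current one is full
              Y := 0;
            end;
            DS.Next;
          end;
        finally
          DS.EnableControls;
        end;
      finally
        Printer.EndDoc;
      end;
    end;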
  8. MichaelT

    Does FireDAC allow more than one connection?

    Yes. FireDAC definitely does and did from the very beginning.
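    Just to illustrate, a hedged sketch (the driver IDs exist in FireDAC, but the connection parameters below are made up): two independent TFDConnection instances, even against different database systems, can live side by side in one application:

    uses
      FireDAC.Comp.Client, FireDAC.Stan.Def,
      FireDAC.Phys.SQLite, FireDAC.Phys.FB;

    var
      ConnA, ConnB: TFDConnection;
    begin
      ConnA := TFDConnection.Create(nil);
      ConnB := TFDConnection.Create(nil);
      try
        // first connection: a local in-memory SQLite database
        ConnA.Params.DriverID := 'SQLite';
        ConnA.Params.Database := ':memory:';
        ConnA.Connected := True;

        // second, completely independent connection: a Firebird server
        ConnB.Params.DriverID := 'FB';
        ConnB.Params.Database := 'localhost:employee'; // placeholder
        ConnB.Params.UserName := 'SYSDBA';
        ConnB.Params.Password := 'masterkey';
        ConnB.Connected := True;
      finally
        ConnB.Free;
        ConnA.Free;
      end;
    end;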
  9. MichaelT

    Firebird 5.0 released

    Great!
  10. No idea. I'm wondering about the title="Printer-friendly preview' with a " at the beginning and a ' at the end. Maybe that's a result of how the manual description was provided. No idea what 'Remoting SDK' refers to, but in general 'Software caused connection abort' doesn't tell a lot, and the socket error is a general one for any kind of failure or protocol error. I think you are on the right track. In most cases it is about a badly formatted template, or a problem with handling the template. But that doesn't come as a surprise in your case.
  11. MichaelT

    FireDAC Alternative

    Why do you want to switch? As far as relational databases are concerned, I hardly see an advantage in switching at all. Just in case you build your own data access framework, as Arnaud mentioned, Zeos is one alternative; and indeed FireDAC was designed for this, and the component layer is really 'just' put on top in order to make the layered library easier to use, to lower the learning curve. FireDAC, as far as the command is concerned, is/was more ADO.NET-styled compared to other component sets. If I remember correctly, UniDAC was introduced pretty much around the time AnyDAC (the predecessor of FireDAC) showed up in public. The difference is in the nifty details: from the bird's-eye perspective both are more or less the same, but under the hood both work completely differently. Another question is whether you need to access different database systems from one application, because the database is already provided by the customer, or whether you 'just' want to be able to use a different database for 'the next application'. In the first case, focus on the unification layers, for example the unified SQL offered by the 'components' and the statements issued to the database. FireDAC in some cases emulates functionality missing in/not offered by the database system; no idea how far UniDAC goes beyond what's offered. The authors of FireDAC, e.g. in the case of Oracle, focused on doing things the way they are done on the Oracle database, which for example allowed 3000 users to access a database about 8 to 10 years ago. Working with UniDAC is easier, and those components perform very well too. Their advantage is being able to access many databases without the need to have a client library installed; as long as that is accepted by the customer, this direct mode works (very well already) today. There is no such thing as a dedicated support forum. In general, what is written in places like this is brought to the attention of the authors, or you file a request in the official Embarcadero Quality Portal. In general: if you are a hobbyist, for example, feel free to use ZEOS. This component set is easy to use and pretty straightforward, also for someone who doesn't care a lot about the specifics of the underlying database, and the ZEOS components are well layered too. If you have the choice, rely on one database ('forever') and use specific components. I like DOA for Oracle access simply because that's really Oracle, just in Delphi, and not a TDataSet style put on top of the Oracle database. It's Oracle-like when it comes to treating things the Oracle way in the case of code blocks vs. procedures vs. queries, the way things are presented at the database level - kind of a 'SQL*Plus script' just coded in Delphi. Today few people care about such things.
  12. I usually don't use both, or better said I never bothered with either of them, so please consider my 'suggestion'/guess carefully. If I recall correctly, midas.dll has to be registered with regsvr32 (SysWOW64 directory on 64-bit Windows), or you use the MidasLib unit instead. Not totally sure, but that is the first suggestion found in a thread on Stack Overflow. The point seem(s|ed) to be a registry entry allowing the library to be located. Frankly, I'm just guessing (a sketch of the MidasLib route follows below).
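      A minimal sketch of the MidasLib route, assuming a plain console program; the field name and value are placeholders. Pulling MidasLib into the uses clause links the MIDAS core statically, so midas.dll neither has to be deployed nor registered:

      program StandaloneClient;

      uses
        MidasLib,           // static alternative to a registered midas.dll
        Data.DB,
        Datasnap.DBClient;  // TClientDataSet

      var
        CDS: TClientDataSet;
      begin
        CDS := TClientDataSet.Create(nil);
        try
          CDS.FieldDefs.Add('Name', ftString, 20);
          CDS.CreateDataSet;          // would fail without midas.dll or MidasLib
          CDS.AppendRecord(['Ada']);
        finally
          CDS.Free;
        end;
      end.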
  13. Benchmarks are benchmarks; they usually compare the capability of a certain piece of hardware, or of a system as a whole, to solve a task. That's what they do, and speed-up is the point, because slower is easy to achieve in general. The granularity of data is growing, and considering time series is more important than ever before, since hardware became fast enough. I'm also having a hard time finding an application for just sorting key figures/values in an array as an independent dimension not connected to descriptive values. Maybe my brain is too strongly tied to non-math database thinking, and I'm no technician or mathematician; those guys make use of such data structures. Sorting values should result in faster lookup times, acting as an index, and in extracting ranges from series of values, for example (a small sketch follows below). No idea why someone would need that, but as mentioned shortly before ... In general we are used, me at least, to knowing what we are looking for. If values are just retrieved, searching and creating ranges can help to find out what should be looked up. Integer as well as floating-point values can act as describing dimensions too. In technical applications, or the analysis of production line data, monitoring and planning worked that way. In a business application almost everything that can be related to a surrogate id is more or less discrete, and searching starts with specifying describing values retrieved from describing dimensions assumed to have an impact. To find the impacting/influencing dimensions, the measured values have to be examined first, values grouped, and so on.

      One of my former business partners simplified optimization by first removing criteria that had no impact on the optimization of the product mix in production, instead of considering a fixed set. Richard Werner (the German economist) used pretty much the same trick to remove the impact of the interest rate on the demand for credit in business, for example, unless the plug is pulled by the central bank. It's no surprise, because 'a line'/company providing goods in the classical mid-European industry model used the interest rate after approx. 35(40) to 50 years to decide whether the line should keep producing and just repair the broken machines, or build a new plant (reproduction of the line). Since the mid 1980s, even in Austria, everything has been pushed in the direction of the 'Anglo-Saxon' industrial/business model; those guys simply rebuilt the plant after 45 years and never really bothered with this situation. The myth of the impact of the interest rate, and the belief concerning investment decisions, is still alive and kept alive. The same myths exist in many production planning systems considering a wide variety of influential factors/criteria retrieved from the past and therefore from so-called experience. That's why so many optimization models rely on, let's say, about 40 describing dimensions/criteria/influential factors taken from a pool of maybe 2000 potential influential factors, while in current production, in a steel plant's liquid phase, only 3 or 4 really drive the optimization software's decision and the result of the planning and optimization run. This requires analyzing the values/key figures and describing technical dimensions first, and the analysis should finish quickly, because today a production planning run happens in de facto real time (about 0.5 sec or less), and finding the influential factors should also not take long. 20 years ago such a run was allowed to take 20 minutes.
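      A small sketch of the 'sorted values act as an index' idea, using the RTL generics; the function name and semantics are made up for illustration:

      uses System.Generics.Collections;

      // Count the values falling into [Lo, Hi]: sort once (in place),
      // binary-search the lower bound, then scan forward until Hi is passed.
      function CountInRange(var Values: TArray<Double>; Lo, Hi: Double): Integer;
      var
        StartIdx, I: Integer;
      begin
        TArray.Sort<Double>(Values);
        // When Lo is absent, BinarySearch still reports the index of the
        // first element greater than Lo, which is exactly where to start.
        TArray.BinarySearch<Double>(Values, Lo, StartIdx);
        Result := 0;
        for I := StartIdx to High(Values) do
          if Values[I] <= Hi then
            Inc(Result)
          else
            Break;
      end;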
  14. One of the major reasons to use components from Chilkat, a few years ago, was SFTP and/or FTPS, in contrast to Indy. Without a really good reason, performance concerning floating-point operations for example, I'm no big fan of deploying dynamic link libraries instead of static linking. What's missing, Embarcadero has to provide, and improvements too. Where does deploying this and that a different way lead? It causes headaches even in the mid term in practice. Headaches go away with a Delphi-only solution after a while, and then the problems of 'Wolpertinger'-style deployments start. The whole story looks different if you want one homogeneous component set serving different development technologies and/or programming languages. Even in this case homebrew bindings, bindings provided by a community, and/or out-of-the-box ones come into play. Thinking of Java, with the possibility to 'just plug in' a different socket, would be an alternative (early days of SSL). After a while, what makes the difference today will be caught up by a certain kind of 'out of the box' way of doing things.
  15. MichaelT

    Is it worth resubscribing now?

    I'd say simply give it a try. I personally don't use Code Insight, or only in a very limited way. Code Completion is turned on, and auto-invoke is not used in general; anything else is sugar for me. So what happened? I simply switched from classic Code Insight to the LSP-based one in 10.4.1, with the 32-bit target enabled, on an open project, and methods added to a class and procedures/functions added to a unit didn't show up at first. Closing the project helped, and cleaning the project directory too. Afterwards I switched to the 64-bit build target, no idea whether that had an impact, and things started to work. Later I started to add one feature after the other. I was surprised that Error Insight, for example, works pretty smoothly, apart from a few minor glitches/optical irritations. The more stable a unit's interface section becomes, the better the experience turned out to be. Now it's possible to add many more features surrounding the core editing features, and those show speed improvements compared to classic Code Insight. Delphi has never been bug-free, but the moment existing problems were fixed and others showed up in places 'just a few people needed', the IDE became very usable in the past too. The absence of methods, functions and procedures in Code Completion is something that was obvious, so I started to investigate deeper. Doing so makes people unhappy, and the focus turns to occurrences that later show up a lot less in practice, or seemingly go away. I'm happy about things that work. After years I turned on code navigation, for example, and the help/information automatically generated from comments when the mouse is moved over a symbol. That's very helpful.

    Honestly, I cannot comment on anything else, but my simple sit-down-and-write-a-test-application-from-scratch worked pretty smoothly in 11.x, starting with 11.2; 11.1 I simply skipped. In practice I don't use too many sophisticated coding features, because few IDEs offer so much, with many things also working in detail over time once they become stable. I have no idea about the whole bunch of FMX-related stuff like Android, Mac, iOS and such things; at the moment I don't care about those. Since I'm a hobbyist, I mostly turned to Free Pascal/Lazarus on Linux in the first place, and if fp offered code completion I'd not even start the GUI. Coming back to your question: don't wait with testing, and don't hope that everything works as great as announced in glossy brochures. Expectations raised this way rarely turn into reality, in almost every place, not just at Embarcadero. My impression still is that RAD Studio in general has undergone major improvements. A working IDE is a matter of perception concerning the things relevant to the developer. An in-depth look shows that RAD Studio really works pretty well for what it does and to the extent it does its job. The question concerning Delphi has always been how much less would be more helpful. Imo a lot more beyond core editing will not help to make development better or the developer a better/more skilled one.
  16. MichaelT

    Is it worth resubscribing now?

    Thank you very much Uwe!
  17. MichaelT

    Is it worth resubscribing now?

    Give it a try first. If you are satisfied, the upgrade will pay off. I mostly use 10.3.2 (Rio) and 10.4.1 (Sydney) (which includes the LSP fixes from 10.4.2) in a certain advanced trial mode on my old laptop, which has no working battery at all and is in fact always plugged in, but runs Win 8.1 with a working Delphi installed (simply because of the Client/Server Add-On Pack). I rarely use Code Insight and prefer the old, non-LSP-based one, but the LSP-based one works amazingly well as long as the Code Completion relying on it works; sometimes cleaning the project directory helps. This laptop is equipped with a hard disk and just a hard disk. 11.3 is installed on my new Win 10 laptop, which I rarely use, but it does its job so far. I don't have to maintain large code bases. Berlin is/was a great release indeed. In the worst case you could step back to 11.2, for example. The overall experience with RAD Studio 11.x including 11.3 is satisfactory. The IDE seems to be very responsive, but I cannot offer the results of a direct comparison on the same computer, which in the case of the new machine is more or less a gaming laptop from hell and dead fast in general. That makes a/the difference if I think of the full RAD Studio Enterprise, installed just for Windows and Linux, using all the features made available; that works definitely smoothly, without delays. I think others can share their experience with Code Insight, and they are a more reliable source. Again, in general I tend to think that 11.x is an improvement over Rio and Sydney; that's my overall impression.
  18. MichaelT

    FireDAC

    I think ADO should work. Apart from that specific requirement, SQL Server connectivity using ZeosLib (Zeos Database Objects) makes sense. In the ZeosLib portal, at the top, you can find a links section with direct links to the various download sections. ZeosLib also works for me, but I'm a hobbyist. As far as I can see, SQL Server (native) does not seem to be covered specifically.
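    A hedged sketch only, from memory of the Zeos API (unit names, the Protocol string and all connection parameters below are assumptions and depend on the Zeos version; 'ado', 'OleDB' or 'mssql' via FreeTDS are the usual candidates for SQL Server):

    uses ZConnection, ZDataset;

    var
      Conn: TZConnection;
      Query: TZQuery;
    begin
      Conn := TZConnection.Create(nil);
      Query := TZQuery.Create(nil);
      try
        Conn.Protocol := 'ado';          // assumption, see note above
        Conn.HostName := 'localhost';
        Conn.Database := 'Northwind';
        Conn.User     := 'sa';
        Conn.Password := 'secret';
        Conn.Connect;

        Query.Connection := Conn;
        Query.SQL.Text := 'SELECT TOP 10 * FROM Customers';
        Query.Open;
      finally
        Query.Free;
        Conn.Free;
      end;
    end;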
  19. Maybe you are remembering/talking about the RealThinClient Portal built on top of the RTC SDK (iirc, not totally sure about that), which is still available as is/was in the customer area of NexusDB. The release date indicates a version 3.72 made public on 09/24/2012. It's a Standard version, and no idea whether all source code is included, e.g. the gateway and so on. As far as remote control is concerned you are there, but for file transfer it's not required and HTTP is used directly. Coming back to the RealThinClient SDK, allow me to quote one sentence found in the RTC forum, taken from a thread named 'File transfer question'. Danijel replied: ... Just use a TRtcDataProvider component on the Server and TRtcDataRequest on the Client side. If you need examples, check BrowserUpload, ClientUpload and ServerLesson1 - 4 in the QuickStart folder (included in the RealThinClientSDK zip file). Regards Mike
  20. The TIOBE index makes no sense (anymore); no idea if it ever did. After the (first) software crisis, a NATO conference in the late 1960s addressed the issue, and one of the results was to establish various programming languages, domain-driven ones especially. That's what all those rankings reflect nowadays, no more, no less. What can we say? What we see today has returned in a more sophisticated fashion. Compilers were only the beginning. Over the years, less restricted technologies (in the end) and programming languages were (re)introduced in order to get an idea of what people can/will do once compiler restrictions are gone. After a strong period of (W)intel dominance the question arose, by the way an old one even on the 'very first' IBM computers that made it into public: 'What will happen if new and/or more specialized processors show up?' That was already pretty clear in the 1980s and 1990s, and no surprise at all if we think of home computing. One direction hinted at compilers and portability, and a second at virtual runtime environments, interpreted first and JIT-compiled & friends later. What we see is that no solution has been found until today. What we also see is that the insanity once ascribed to programmers moved away from the individual to technology engineers and language designers. The TIOBE index, to a certain degree, could hint at which combination will be tried out next and by whom, or who has a fair chance to succeed. Programming languages and technologies are not aimed solely at software engineers. Since no best way to address the everlasting changes 'underneath the programming language' and/or execution environment has been found, feel free to use whatever you like. Outside the traditional world of customer/supplier relations, freedom can be found at every corner today, and the question whether 'your programming skills are up to date' cannot simply be answered by a ranking of programming languages.
  21. MichaelT

    Firedac MySql pessimistic lock

    TFDPhysMySQLCommandGenerator.GetPessimisticLock in FireDAC.Phys.MySQLMeta: it seems that LockMode is not considered; in the case of LockPoint = lpImmediate, FOR UPDATE is used, with lpDeferred it is LOCK IN SHARE MODE, and the WHERE clause is taken into account. To me this code looks like being part of preparing the SELECT SQL for the command. I have checked the FireDAC sources related to MySQL, and the only line found was this one, in line 679 of the code shipped with Delphi 10.4.1. I don't use MySQL or MariaDB very often, in fact rarely; I simply use AUTOCOMMIT or distinct sets of records. Maybe I don't remember correctly, but this behavior was, and very likely still is, by design. I remember a discussion a long, long time ago about MyISAM (extensive locking in general) on the one hand and InnoDB on the other. Heaven knows whether that's still (a|the) reason today. It seems the decision made back then was to offer AUTOCOMMIT on the one hand and the compromise described above on the other. I simply do it that way. I'm no fan of locking in general, and indeed I prefer non-committed reads, because if my code is not dirty enough, at least the blocks read from the DB should be. Kidding. Usually read committed is enough for me.
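    For context, a sketch of where those options live at the component level, written from memory of FireDAC.Stan.Option (the table and parameter are made up); with the MySQL driver the generated SELECT then reportedly ends in FOR UPDATE (lpImmediate) or LOCK IN SHARE MODE (lpDeferred):

    uses FireDAC.Comp.Client, FireDAC.Stan.Option;

    procedure ConfigurePessimisticLock(Q: TFDQuery);
    begin
      Q.UpdateOptions.LockMode  := lmPessimistic;
      Q.UpdateOptions.LockPoint := lpImmediate;  // lpDeferred -> LOCK IN SHARE MODE
      Q.UpdateOptions.LockWait  := True;
      Q.SQL.Text := 'SELECT * FROM orders WHERE id = :id';
      Q.ParamByName('id').AsInteger := 42;
      Q.Open; // the lock clause is appended while preparing the SELECT
    end;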
  22. MichaelT

    Comport Serial Port help

    Give the Read function/method a try. As far as I can see, the function in the ComPort.dll offers passing a pointer as a buffer together with a count. I'm not sure whether that is included in, and exposed by, the component itself. I think this is not the first time I have read about this issue in a discussion.
  23. MichaelT

    Comport Serial Port help

    Winsoft's ComPort.dll indeed offers a function called ReadByte. I hope the same holds for the component.
  24. MichaelT

    What is RAD Server?

    You could regard RAD Server as an InterBase add-on making that database more complete and future-ready than ever before. Kidding. We have all been around the Delphi forums for too long, so I couldn't resist. There is more: push notifications for example, but indeed closely related to technologies used in conjunction with, let's call it, Delphi. Once you come to the point where you think 'I take a) a web server, b) add some modules to encapsulate a framework's functionality into something similar to WebSnap modules, c) put my code into libraries loaded at runtime, for example into an executable or another library communicating/integrating with the web server, d) make use of FastCGI for example, e) ... whatever comes to mind', use something that works out of the box and handles a), b) and d) for you. Most of these 'technologies in a broader sense' ended up as a cut-down version of what was originally intended. In general the EMBT eco-system needed its own thing that allows you to focus on c). The opportunity arose with REST services (a rough sketch of such a resource module follows below).
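    Roughly the shape of such a resource module, written from memory of what the RAD Server (EMS) package wizard generates; the resource name 'hello' and the JSON payload are placeholders, and unit/type names should be double-checked against your Delphi version:

    unit HelloResource;

    interface

    uses
      System.SysUtils, System.Classes, System.JSON,
      EMS.Services, EMS.ResourceAPI, EMS.ResourceTypes;

    type
      [ResourceName('hello')]
      THelloResource = class
      published
        // answers GET /hello
        procedure Get(const AContext: TEndpointContext;
          const ARequest: TEndpointRequest; const AResponse: TEndpointResponse);
      end;

    implementation

    procedure THelloResource.Get(const AContext: TEndpointContext;
      const ARequest: TEndpointRequest; const AResponse: TEndpointResponse);
    begin
      AResponse.Body.SetValue(
        TJSONObject.Create(TJSONPair.Create('message', 'hello from RAD Server')),
        True); // the response takes ownership of the JSON object
    end;

    initialization
      RegisterResource(TypeInfo(THelloResource));
    end.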