Leaderboard
Popular Content
Showing content with the highest reputation on 08/31/20 in all areas
-
http://ollydbg.de/Paperbak/
-
Use the FDConnection BeforeConnect event:

procedure TdbModule.fdConnectionBeforeConnect(Sender: TObject);
{$IF DEFINED(MACOS)}
var
  resourcesPath : String;
  newPath : String;
  position : Integer;
{$ENDIF}
begin
  // Get the correct Win, OSX, iOS or Android DB path
  fdConnection.Params.Values['Database'] := 'mydb.s3db';
  {$IF DEFINED(iOS) or DEFINED(ANDROID)}
  fdConnection.Params.Values['ColumnMetadataSupported'] := 'False';
  fdConnection.Params.Values['Database'] := TPath.Combine(TPath.GetDocumentsPath, 'mydb.s3db');
  {$ELSEIF DEFINED(MACOS)}
  resourcesPath := ParamStr(0);
  position := Pos('/Contents', resourcesPath);
  newPath := ParamStr(0).SubString(0, position) + 'Contents/Resources/mydb.s3db';
  fdConnection.Params.Values['Database'] := newPath;
  {$ENDIF}
end;

Don't forget to add the DB to the Project Deployment window and set the Remote Path for each platform:

macOS: Contents\Resources\
iOS: StartUp\Documents\ (this one makes the DB show up in File Sharing under iTunes or your device's Finder - you can download, erase or add the file to the app)
iOS: Library/Application Support/[your app directory] for files you want accessible from your app but not from the user (not shared)
Android: .\assets\internal
-
XML used to be considered the universal data format. Now it is a bit passé, with JSON, YAML etc. being "in". I got involved in XML parsing, since SVG files are in XML format.

XML Delphi support

On the surface the XML support in Delphi is very good:

- TXMLDocument/IXMLDocument offering high-level support (Xml.xmldoc)
- Support for the standard DOM interfaces (Xml.XmlDom)
- Multiple implementations (MSXML, OmniXML, OpenXML and more)
- Ability to plug in your own implementation
- Multiple platform support

The most common way of accessing XML is through TXMLDocument/IXMLDocument. However there is a big catch: PERFORMANCE.

Say you want to use MSXML and you specify 'MSXML' as your DefaultDomVendor (or you simply include the implementation unit Xml.Win.msxmldom in your uses clause). You create an XML document and you access the top node:

var Doc: IXMLDocument := TXMLDocument.Create(nil);
var Node: IXMLNode := Doc.DocumentElement;

Node is an IXMLNode implemented by TXMLNode (a TInterfacedObject defined in Xml.XmlDoc). TXMLNode wraps an IDOMNode stored in a private field FDOMNode. IDOMNode is defined in Xml.Xmldom and is implemented by the used vendor - in this case by the class TMSDOMNode in Xml.Win.msxmldom. TMSDOMNode (also a TInterfacedObject) wraps an IXMLDOMNode stored in a private field FMSNode. IXMLDOMNode is defined in Winapi.msxml.

As a result, when you create any IXMLNode, a TXMLNode is created, which in turn creates a TMSDOMNode, which points to an IXMLDOMNode. Any call or property access on IXMLNode translates into a call on IDOMNode, which then calls IXMLDOMNode. The created TInterfacedObjects also need to be destroyed when you release your XML node. The same two-level indirection applies to all XML objects (attributes, children) and causes a huge degradation of performance.

Conclusion

If you care about speed, forget about TXMLDocument. You can access the vendor implementation or, even better in the case of MSXML, the Microsoft ActiveX objects directly:

uses WinAPI.msxml;

var XML: IXMLDOMDocument3 := CoDOMDocument60.Create;
XML.loadXML(XMLString);
var DocNode: IXMLDOMNode := XML.documentElement;

In SVG parsing and processing, accessing the ActiveX objects directly reduced processing time by more than 50%.

Additional tip

A common performance pitfall with MSXML is explained in http://www.gerixsoft.com/blog/delphi/msxml. The fastest way to iterate through ChildNodes is via getFirstChild/nextSibling, and through Attributes via nextNode.
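To make that last tip concrete, here is a minimal sketch (not from the original post) of the firstChild/nextSibling iteration pattern against the MSXML interfaces; the procedure name and the Writeln output are just for illustration:

uses
  Winapi.msxml;

procedure ListChildNames(const Parent: IXMLDOMNode);
var
  Child: IXMLDOMNode;
begin
  // Walking the sibling chain avoids repeated calls to childNodes.item[i],
  // which is the slow path described in the linked article.
  Child := Parent.firstChild;
  while Child <> nil do
  begin
    Writeln(Child.nodeName);
    Child := Child.nextSibling;
  end;
end;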
-
You should have all your work in revision control. Then when your drive fails you still have it. Not to mention all the other benefits.
-
It's a misconception that RAID is a means to better data security. RAID is a means to better performance or higher availability. If you buy cheap drives, or a cheap NAS with cheap drives in it, then you get what you pay for. No surprise there, really.

The HDDs in my system are 16 years old and some of them have been running almost continuously. They're Western Digital WD5000YS drives with an MTBF of 137 years. I have two drives mounted internally and five drives in a hot-swap rack on the front for use as backup storage. Of course I also have a couple of SSDs for the performance-critical stuff.

HDDs have better reliability than SSDs and they are better for long-term archival. While a brand new SSD might have better MTBF values than an HDD, this changes dramatically over time as the drive is used. After only a few years the SSD will have deteriorated by an order of magnitude and have a much worse MTBF than the HDD.

So you're happy to rely on the cheapest available devices to save you when your primary storage fails? Interesting. Personally I would prefer a reliable backup medium so that I could afford to use faster, but less reliable, primary storage devices.
-
Hi... Is this what you mean? ...the base folder for new projects? ...in German
-
SD cards are expected to keep data for about five years before they start deteriorating. They actually lose charge over time. http://www.datarecoveryspecialists.co.uk/blog/what-is-the-life-expectancy-of-an-sd-card Even CDs and DVDs may deteriorate within two to five years unless stored in a dark, dry, cool place. Blu-ray disks are supposed to be more robust. It is not hard to find reliable online storage. The real question is what value you put on your data, and if you are willing to invest that value.
-
Incredible! If it runs on Windows 2019 Core, I can finally fire up that 15-year-old leporello printer and put it to good use!
-
Sure. If you want reliable backup, use a medium that has been designed for it. Floppies, writable CDs and DVDs, cheap HDDs, SSDs and SD cards aren't reliable. I have TK-50 DLT tapes from the late eighties that can still be read (each tape contains a whopping 94 MB). The tape format is standard and the drive is SCSI, so it's no problem finding a way to read them - and the data is still there. Try that with any other storage type after thirty years (stone tablets and punch cards excepted).

I think it's pretty irresponsible to make a statement like that. You can't guarantee anything of the sort, and I'm pretty sure you will not take financial responsibility when your guarantee turns out to be false. The companies that make these devices don't even make such claims. It is known that SSDs and SD cards degrade over time; their good MTBF is only valid when they are new.
-
ANN : Continua CI Version 1.9.2 Beta - Continuous Integration Server
Lars Fosdal replied to Vincent Parrett's topic in Delphi Third-Party
I love this tool! There is not a thing you can't automate with it.
Tagged with: continuous integration, automation (and 1 more)
-
ANN : Continua CI Version 1.9.2 Beta - Continuous Integration Server
Vincent Parrett posted a topic in Delphi Third-Party
We are delighted to announce a new beta release of Continua CI. We have added the following new features:

- Export and Import: You can now export one or more configurations to a file and import them back from the file into Continua CI.
- Requeuing Stages: Requeue a failing stage without restarting the build.
- Multiple Daily Cleanup Rules: Each type of build by-product can now have a different shelf life.

https://www.finalbuilder.com/resources/blogs/introducing-continua-ci-version-192-beta

Continua CI is a low cost, easy to use Continuous Integration Server which includes first class support for Delphi (using FinalBuilder or MSBuild) and version control integration with Git, Mercurial, Subversion and more.

https://www.finalbuilder.com/resources/blogs/building-delphi-projects-with-continua-ci
Tagged with: continuous integration, automation (and 1 more)
-
HDDs have failed on us in larger numbers simply because SSDs are much, much younger than HDDs; but I have already experienced SSD issues too. I still have an SSD in my laptop and in my dev server (with the OS drives of the guests on it) and they are still error free. I love them too; we all love them. But their lifespan is shorter than an HDD's - especially if we never let that HDD stop 🙂

Edit: online backups are nice and should be safe enough. I just happen to be paranoid enough not to trust any big company.
-
Oh, and another reason why this is wrong is that it's better for HDDs to run continuously. It's the power cycles that kill them (thermal wear).
-
XML Parsing and Processing
Anders Melander replied to pyscripter's topic in RTL and Delphi Object Pascal
If you care about speed, use a SAX parser instead. The DOM model has different priorities.
-
Yes and yes. Since I have a dev environment at home, I just added 2 extra hard drives, mirrored (with a daily automatic health check and warning e-mails). I do my development on my PC, then sync it upstream to a Git repository stored on this mirror. I have a physical RAID controller with a battery backup (salvaged an HPE P400 with 2 failed battery packs - it turns out it uses 4 AAA rechargeable batteries inside, so the quick fix was a matter of a bit of soldering) and the server is on a high-capacity UPS. All this hassle for home; you are free to call me paranoid and I'm not going to argue 🙂

P.S.: I'm not considering any flash-based devices (including SSDs) suitable for backup. Their write cycles are limited and sometimes you only find out something failed when you are reading the information back - and that's too late if it's used for backup. The minimum is a mirrored HDD which is always on - so not a USB-attached drive which "I keep safe in the wardrobe and only use when necessary" (good for power consumption, bad for lifespan) - but the best is still a tape drive, IN MY OPINION. I have personal privacy and safety concerns when it comes to "cloud" data storage.
-
My solution to something like this was to create a frame which contains only one row. Add basic functionality, like a grip and the resize code, an initialization method, a validation method, etc. Create as many variants as needed as descendants of said frame to handle different input / visual needs. Then create one form with an alClient scrollbox and an alBottom panel with an OK and a Cancel button. Give an array of data as input to this form so it knows how many, and what kind of, frames it has to create in the ScrollBox. When the user clicks OK, you can check whether all data is valid (because the validator is in the parent frame) and deny exiting if not.

It ended up like this: I still need the resizing logic, though; it was a recent request 🙂
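A minimal sketch of that pattern, assuming a base frame class and a host form; all names (TRowFrame, TRowEditorForm, BuildRows, etc.) are illustrative rather than taken from the original post, and the .dfm wiring is omitted:

uses
  System.Classes, System.Generics.Collections, Vcl.Controls, Vcl.Forms, Vcl.ExtCtrls;

type
  // Base frame representing a single row; descendants override Initialize and
  // Validate to handle different input / visual needs.
  TRowFrame = class(TFrame)
  public
    procedure Initialize(const AData: string); virtual; abstract;
    function Validate: Boolean; virtual; abstract;
  end;

  TRowFrameClass = class of TRowFrame;

  // Host form: ScrollBox is alClient, ButtonPanel (OK / Cancel) is alBottom.
  TRowEditorForm = class(TForm)
    ScrollBox: TScrollBox;
    ButtonPanel: TPanel;
  private
    FRows: TList<TRowFrame>;   // created in the form's OnCreate
  public
    procedure BuildRows(AFrameClass: TRowFrameClass; const AData: array of string);
    function AllRowsValid: Boolean;
  end;

procedure TRowEditorForm.BuildRows(AFrameClass: TRowFrameClass; const AData: array of string);
var
  s: string;
  Row: TRowFrame;
begin
  for s in AData do
  begin
    Row := AFrameClass.Create(Self);
    Row.Name := '';            // avoid duplicate component names at runtime
    Row.Parent := ScrollBox;
    Row.Align := alTop;
    Row.Initialize(s);
    FRows.Add(Row);
  end;
end;

function TRowEditorForm.AllRowsValid: Boolean;
var
  Row: TRowFrame;
begin
  // Called from the OK button handler; the form refuses to close while invalid.
  Result := True;
  for Row in FRows do
    if not Row.Validate then
      Exit(False);
end;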
-
It's pretty undocumented, but kind of easy to understand. Feel free to modify it to your needs:

@ECHO OFF

REM Root folder containing the Git repositories to update
SET GITDIR=C:\LocalWork\_DelphiComponents
SET OLDDIR=%CD%

REM Run :CHECK for every subfolder of %GITDIR%
FOR /F "tokens=1 delims=" %%a IN ('DIR /B /A:D %GITDIR%') DO CALL :CHECK "%%a"
GOTO :END

:CHECK
CD /D %GITDIR%\%~1 > nul 2>&1
IF ERRORLEVEL 1 GOTO :eof
"C:\Program Files\Git\cmd\git.exe" fetch > nul 2>&1
IF ERRORLEVEL 1 GOTO :eof
ECHO/|SET /P=%~1...
"C:\Program Files\Git\cmd\git.exe" status > "%TEMP%\gitstatus.tmp" 2>&1
IF ERRORLEVEL 1 (ECHO querying status failed! & GOTO :DELEOF)
TYPE "%TEMP%\gitstatus.tmp" | FIND /I "is behind" > nul 2>&1
IF ERRORLEVEL 1 (ECHO up to date. & GOTO :DELEOF)
REM Stash local changes before pulling if the working tree is not clean
TYPE "%TEMP%\gitstatus.tmp" | FIND /I "nothing to commit, working tree clean" > nul 2>&1
IF ERRORLEVEL 1 (SET STASHED=1) ELSE (SET STASHED=0)
IF %STASHED%==0 GOTO :PULL
"C:\Program Files\Git\cmd\git.exe" stash > nul 2>&1
IF ERRORLEVEL 1 (ECHO could not stash changes! & GOTO :DELEOF)

:PULL
"C:\Program Files\Git\cmd\git.exe" pull --rebase > nul 2>&1
IF ERRORLEVEL 1 (ECHO could not download updates!) ELSE (ECHO update successful.)
IF %STASHED%==0 GOTO :DELEOF
"C:\Program Files\Git\cmd\git.exe" stash pop > nul 2>&1
IF ERRORLEVEL 1 ECHO could not restore changes!

:DELEOF
IF EXIST "%TEMP%\gitstatus.tmp" DEL "%TEMP%\gitstatus.tmp"
GOTO :eof

:END
CD /D %OLDDIR%
PAUSE

For me it now outputted... All you have to do is change the path to the folder where your Git repositories are. The script will check all folders within that root and, if a folder is a Git repository, do its work there. If you need help, feel free to ask.
-
Boolean short-circuit with function calls
Lars Fosdal replied to Mike Torrettinni's topic in Algorithms, Data Structures and Class Design
Ternary is the correct term for computer language operators with three operands. https://en.m.wikipedia.org/wiki/Ternary_operation
-
Boolean short-circuit with function calls
aehimself replied to Mike Torrettinni's topic in Algorithms, Data Structures and Class Design
It makes no sense, especially if a compiler directive instructs otherwise. My guess is that there is some badly written structure behind the scenes and it would be too much effort to "fix" it now. This is typically when we start to name our bugs, celebrate their birthday, call them a feature and see where they evolve to 🙂
-
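As an aside to the short-circuit thread above, here is a small console sketch (not from either reply) of Delphi's default behaviour: with {$B-}, which is the default, "and"/"or" stop evaluating as soon as the result is known, so the second function call is skipped. The function names are made up for illustration.

program ShortCircuitDemo;

{$APPTYPE CONSOLE}
{$B-} // short-circuit (partial) boolean evaluation - the Delphi default

function CheckA: Boolean;
begin
  Writeln('CheckA called');
  Result := False;
end;

function CheckB: Boolean;
begin
  Writeln('CheckB called');
  Result := True;
end;

begin
  // Because CheckA returns False, the whole "and" expression is already False,
  // so CheckB is never called. With {$B+} both functions would be called.
  if CheckA and CheckB then
    Writeln('both true')
  else
    Writeln('not both true');
  Readln;
end.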
Dude, you should probably start a blog instead