Leaderboard
Popular Content
Showing content with the highest reputation on 07/13/23 in Posts
-
Trap TFDConnection error on data module create etc?
Tom Chamberlain replied to Chris1701's topic in Databases
We prevent this issue with GExperts -> Set Component Properties, configured to set the Active and Connected properties to False on all DB components before compiling. It does this automatically so we don't have to remember to do it 🙂 One of the best features of GExperts for us. -
Warning: Windows Update KB5028166 breaks NT Domains
Fr0sT.Brutal replied to dummzeuch's topic in Tips / Blogs / Tutorials / Videos
Auto-update is evil, and auto-installing fresh MS updates before they've been tested for at least a month is like jumping off a roof hoping there will be a haystack below. -
Feature req: Compiler unit dependency graph / log with warnings about circularity
Bill Meyer replied to Lars Fosdal's topic in Delphi IDE and APIs
There can be exceptional cases which justify sparing use of circular dependencies. Donald Knuth makes the point with regard to sorting algorithms. @Stefan Glienke has also argued for such cases. My point, and what @Uwe Raabe was speaking very explicitly about, is large programs where circularity is commonplace and not at all exceptional. There is no rational case which can support that practice. In the majority of cases, circular references are only "needed" because of badly organized code modules. And I say that based on thousands of modules and millions of lines of code, over a period of 15+ years.
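For anyone unfamiliar with what is being criticised, here is a minimal sketch (unit and type names are hypothetical, not from the thread): two units whose interface sections use each other will not compile at all, so in practice the circularity is "made to work" by pushing one of the uses clauses into the implementation section, which hides the mutual coupling rather than removing it.

unit Customers;

interface

type
  TCustomer = class
    Name: string;
  end;

implementation

uses
  Orders; // reaches back into Orders: the circular reference only compiles
          // because this uses clause was moved out of the interface section

end.

unit Orders;

interface

uses
  Customers; // Orders legitimately depends on Customers ...

type
  TOrder = class
    Customer: TCustomer;
  end;

implementation

end.
-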
Did you read the page linked to? First paragraph states: [edit] I guess you don't know what the PDB file is for. The PDB file is used by the profiler to map the addresses in the application being profiled to source files, function names, and line numbers. Without that information, the profiler would only be able to show you raw addresses.
1. Download and install VTune.
2. Download map2pdb, extract the exe, and save it somewhere of your choice, for example c:\tools\map2pdb\map2pdb.exe.
3. Add a menu item in the Delphi IDE via Tools -> Configure Tools... The parameters in the above are: -debug -v -pause -bind:$EXENAME $PATH($EXENAME)$NAMEONLY($EXENAME).map -include:0001
4. Make sure the compiler is generating a full map file (Project Options -> Linking -> Map file = Detailed).
5. Compile your project.
6. Execute the map2pdb tool action.
7. Launch VTune and create a profiler project for your exe.
8. Profile the project in VTune.
9. Profit!
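For illustration only (the project file names here are hypothetical), with the IDE macros expanded the tool entry above ends up running something like:

c:\tools\map2pdb\map2pdb.exe -debug -v -pause -bind:MyProject.exe MyProject.map -include:0001

i.e. the .map produced by the linker is turned into a .pdb and associated with the exe, which is what lets VTune resolve addresses back to source lines.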
-
These two statements seem to be in conflict. :-) Mobile development is much more of a moving target than desktop. I don't think staying on old versions of your development system is a viable strategy for mobile applications.
-
Trap TFDConnection error on data module create etc?
Fr0sT.Brutal replied to Chris1701's topic in Databases
Yeah, field generation will be painful. I got rid of table fields at design time, but assigning grid columns manually is really annoying. Luckily it's not a frequent task. Anyway, an alternative could be some kind of pre-build/post-build processing that removes connection properties from the DFM and restores them afterwards.
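A minimal sketch of that pre-build idea (file handling, the exact property stripped, and text-format DFMs are all assumptions, not something from the thread): back up each DFM, drop the Connected = True lines before compiling, and let a post-build step copy the .bak files back.

program StripDfmConnected;

{$APPTYPE CONSOLE}

uses
  System.SysUtils, System.Classes, System.IOUtils;

var
  DfmFile: string;
  Lines: TStringList;
  i: Integer;
begin
  // intended to run as a pre-build event in the project's directory
  for DfmFile in TDirectory.GetFiles(GetCurrentDir, '*.dfm') do
  begin
    Lines := TStringList.Create;
    try
      Lines.LoadFromFile(DfmFile);
      TFile.Copy(DfmFile, DfmFile + '.bak', True); // original kept for the post-build restore
      for i := Lines.Count - 1 downto 0 do
        if Trim(Lines[i]) = 'Connected = True' then
          Lines.Delete(i);
      Lines.SaveToFile(DfmFile);
    finally
      Lines.Free;
    end;
  end;
end.
-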
FireDAC Create table with TFDTable class at SQL Postgres 14
weirdo12 replied to shalapai's topic in Databases
This works in RAD Studio 11.3 with PostgreSQL 14.8 running on Ubuntu. Without the SET search_path code, the table always gets added to public, even if SchemaName was set to dba or the TableName was qualified like dba.test_table.

std::unique_ptr<TFDTable> vTable(new TFDTable(0));
vTable->Connection = db_connection;
vTable->TableName = "test_table";
vTable->FieldDefs->Add("login", ftLargeint, 0, true);
vTable->FieldDefs->Add("ticket", ftLargeint, 0, true);
vTable->FieldDefs->Add("exec_time", ftDateTime, 0, true);
vTable->FieldDefs->Add("req_price", ftFloat, 0, true);

TFDIndex* idx = vTable->Indexes->Add();
idx->Name = System::Sysutils::Format("%s_pkey", String("test_table"));
idx->Fields = "login;ticket";
idx->Active = true;

db_connection->ExecSQL("SET search_path TO dba, public");
if (!vTable->Exists) {
    vTable->CreateTable(false);
}
db_connection->ExecSQL("SET search_path TO public, dba");
-
Trap TFDConnection error on data module create etc?
weirdo12 replied to Chris1701's topic in Databases
If both of those are False, the state of Connected will be ignored. You can test this by setting Connected to True, saving the file, closing it and reopening it: Connected will be False. Or create a new app. Drop a TFDConnection component on the main form. Set DriverName to SQLite. Add a BeforeConnect event handler. Put some dummy code in it so it doesn't get deleted because it's empty. Set a breakpoint on it. Set Connected to True. You should get prompted for a password. Click OK. Now run the app. The breakpoint won't be hit. Now check auRunTime. Run the app again. It will break at the BeforeConnect event handler.
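For instance, the dummy handler could be as simple as this (form and component names are just placeholders):

procedure TForm1.FDConnection1BeforeConnect(Sender: TObject);
begin
  // any statement will do; it keeps the IDE from deleting the "empty" handler
  // and gives the breakpoint a line to stop on
  Beep;
end;
-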
Trap TFDConnection error on data module create etc?
Sherlock replied to Chris1701's topic in Databases
Is there no "DesignConnection" property? I recall using DOA components from AllroundAutomation 20 years ago that had this nifty feature. -
Warning: Windows Update KB5028166 breaks NT Domains
DiGi replied to dummzeuch's topic in Tips / Blogs / Tutorials / Videos
Well, blocking old SAMBA... I have an old NAS/media center at home, and of course there have been no updates for it in ages. And I really don't want to waste money on a new one just because someone decided it is not secure enough anymore and I can no longer access it over the LAN. -
Luxembourg; open job position: Delphi software developer / customer support
Foersom replied to Foersom's topic in Job Opportunities / Coder for Hire
The job description is now published at the Microtis site. https://www.microtis.lu/en/job -
Trap TFDConnection error on data module create etc?
weirdo12 replied to Chris1701's topic in Databases
There is a TFDConnection property that you can set to tell it how to use/save the state of the Connected property - ConnectedStoredUsage.
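For example, to stop a stored Connected = True from opening the connection at design time while still letting it open when the app runs, the set could be reduced to auRunTime only (component name is just for illustration; the same thing can be done via the Object Inspector checkboxes):

FDConnection1.ConnectedStoredUsage := [auRunTime];
-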
You are storing custom allocated data in the TTreeNode.Data property. To free that memory, you can override the TreeView's virtual Delete() method, which is called for every TTreeNode object that gets destroyed at runtime, eg:

type
  TmyDBTreeView = class(TTreeView)
  protected
    procedure Delete(Node: TTreeNode); override;
  end;

procedure TmyDBTreeView.Delete(Node: TTreeNode);
begin
  inherited Delete(Node); // fires the OnDeletion event handler
  Dispose(PNodeRec(Node.Data));
  Node.Data := nil;
end;

However, a better way to handle this is to not use the TTreeNode.Data property at all. Leave that untouched so users of your component can use it for their own application data. Instead, you should create a custom descendant of TTreeNode to hold your component's extra data, and then you can override the TreeView's virtual CreateNode() method to return new instances of your custom Node class, eg:

type
  TmyDBTreeNode = class(TTreeNode)
  public
    IdNode: integer;
    IdParent: integer;
  end;

  TmyDBTreeView = class(TTreeView)
  protected
    function CreateNode: TTreeNode; override;
  end;

function TmyDBTreeView.CreateNode: TTreeNode;
begin
  Result := TmyDBTreeNode.Create(Items);
end;

Then, any place that you were accessing PNodeRec(Node.Data)^ before, you can instead access TmyDBTreeNode(Node), for example:

// if DataSource.DataSet.FieldByName(IDParentNode).AsInteger = PNodeRec(Node.Data)^.IdNode then
if DataSource.DataSet.FieldByName(IDParentNode).AsInteger = TmyDBTreeNode(Node).IdNode then

function TmyDBTreeView.GetNodeIdParent(Node: TTreeNode): integer;
begin
  //Result := PNodeRec(Node)^.IdParent;
  Result := TmyDBTreeNode(Node).IdParent;
end;

function TmyDBTreeView.GetNodeId(Node: TTreeNode): integer;
begin
  //Result := PNodeRec(Node)^.IdNode;
  Result := TmyDBTreeNode(Node).IdNode;
end;

And when adding new nodes to the TreeView, you don't need to use the Add...Object() methods anymore, you can just use the normal Add...() methods instead, eg:

//New(NRec);
//NRec^.IdNode := DataSource.DataSet.FieldByName(IDNode).AsInteger;
//NRec^.IdParent := DataSource.DataSet.FieldByName(IDParentNode).AsInteger;
//Node := Items.AddObject(nil, DataSource.DataSet.FieldByName(DataField).AsString, NRec);
Node := Items.Add(nil, DataSource.DataSet.FieldByName(DataField).AsString);
TmyDBTreeNode(Node).IdNode := DataSource.DataSet.FieldByName(IDNode).AsInteger;
TmyDBTreeNode(Node).IdParent := DataSource.DataSet.FieldByName(IDParentNode).AsInteger;

// New(DNode);
// DNode^.IdNode := DataSource.DataSet.FieldByName(FIDNode).AsInteger;
// DNode^.IdParent := DataSource.DataSet.FieldByName(FIDParentNode).AsInteger;
// NewNode := Items.AddChildObject(Node, Field.AsString, DNode);
NewNode := Items.AddChild(Node, Field.AsString);
TmyDBTreeNode(NewNode).IdNode := DataSource.DataSet.FieldByName(FIDNode).AsInteger;
TmyDBTreeNode(NewNode).IdParent := DataSource.DataSet.FieldByName(FIDParentNode).AsInteger;

---

That being said, I notice a number of other issues with your code that are unrelated to node management.

1. Why do you have a Sleep() loop inside of your TmyDBTreeView destructor? Not only is a loop redundant when you could simply calculate the wanted duration in a single call, but calling Sleep() during destruction really doesn't belong there to begin with.

2. Any use of Items.BeginUpdate()/EndUpdate() should be protected with a try..finally block.

3. This code is wrong:

procedure TmyDBTreeView.CMGetDataLink(var Message: TMessage);
begin
  Message.Result := Integer(FDataLink);
end;

If you ever try to use your component in 64bit, this will fail as it will truncate the returned pointer. Message.Result is a NativeInt, not an Integer, so cast accordingly.

4. Inside of TmyDBTreeView.SetDataSource(), since you are calling FreeNotification() on the new DataSource, you should call RemoveFreeNotification() on any DataSource that was previously assigned. Although, I think this is completely redundant since you are not actually storing a reference to the DataSource, you are just assigning it to your DataLink, so you could remove the use of FreeNotification() altogether and let the DataLink handle that for you. Also, SetDataSource() is assigning the new DataSource to your DataLink only if csLoading is enabled, which means it is being assigned only during DFM streaming. But if a user of your component decides to change the DataSource at runtime after the DFM is loaded, you are not updating your DataLink.

5. In TmyDBTreeView.DeleteNodeFromTreeOnly(), you are looping incorrectly. Since you are looping forwards, if a node is deleted then your loop counter will get out of sync and end up skipping the next node. To account for that, you need to either:

a. loop backwards instead of forwards, eg:

procedure TmyDBTreeView.DeleteNodeFromTreeOnly(id_node: integer);
var
  i: integer;
begin
  for i := Items.Count - 1 downto 0 do
  begin
    //if PNodeRec(Items[i])^.IdNode = id_node then Items[i].Delete;
    if TmyDBTreeNode(Items[i]).IdNode = id_node then Items[i].Delete;
  end;
end;

b. change the loop so your 'i' variable is incremented only when a node is NOT deleted, eg:

procedure TmyDBTreeView.DeleteNodeFromTreeOnly(id_node: integer);
var
  i, cnt: integer;
begin
  i := 0;
  cnt := Items.Count;
  while i < cnt do
  begin
    //if PNodeRec(Items[i])^.IdNode = id_node then ...
    if TmyDBTreeNode(Items[i]).IdNode = id_node then
    begin
      Items[i].Delete;
      Dec(cnt);
    end
    else
      Inc(i);
  end;
end;
-
Feature req: Compiler unit dependency graph / log with warnings about circularity
dummzeuch replied to Lars Fosdal's topic in Delphi IDE and APIs
I guess going back to the feature set of Turbo Pascal 3 would also improve the stability of the IDE and the compile speed [/sarcasm] -
Marketing. I don't think they originally intended it as such but since it at best doesn't hurt performance, that's what it became. Yes, it does. Maybe now would be a good time to read up on what virtual memory is. You don't have to allocate beyond physical memory before virtual memory comes into play. You just have to allocate beyond the process' working set. The working set is the part of your process' virtual memory that is backed by physical memory. The working set can grow and shrink depending on memory access and global resource pressure but there's always an upper and lower limit. Of course, it's a bit more complicated than that (there are more layers than what I have described) but that's the basic overview. In general, I would recommend that one doesn't try to outsmart things one doesn't fully understand. Be it threading, memory management, or women 🙂
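As an aside, a process can watch its own working set on Windows via GetProcessMemoryInfo; a minimal sketch in Delphi (error handling omitted, console output assumed):

uses
  System.SysUtils, Winapi.Windows, Winapi.PsAPI;

procedure ReportWorkingSet;
var
  Counters: TProcessMemoryCounters;
begin
  Counters.cb := SizeOf(Counters);
  // WorkingSetSize = the part of this process' virtual memory currently backed by physical memory
  if GetProcessMemoryInfo(GetCurrentProcess, @Counters, SizeOf(Counters)) then
    Writeln(Format('Working set: %d KB (peak: %d KB)',
      [Int64(Counters.WorkingSetSize div 1024), Int64(Counters.PeakWorkingSetSize div 1024)]));
end;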
-
unit Bitset;

interface

uses
  SysUtils, Math;

type
  TBitSet = record
  private
    FBitCount: Int64;
    FSets: array of set of 0..255;
    class function SetCount(BitCount: Int64): Int64; static;
    procedure MakeUnique;
    procedure GetSetIndexAndBitIndex(Bit: Int64; out SetIndex: Int64; out BitIndex: Integer);
    function GetIsEmpty: Boolean;
    procedure SetBitCount(Value: Int64);
    function GetSize: Int64;
  public
    class operator In(const Bit: Int64; const BitSet: TBitSet): Boolean;
    class operator Equal(const bs1, bs2: TBitSet): Boolean;
    class operator NotEqual(const bs1, bs2: TBitSet): Boolean;
    property BitCount: Int64 read FBitCount write SetBitCount;
    property Size: Int64 read GetSize;
    property IsEmpty: Boolean read GetIsEmpty;
    procedure Clear;
    procedure IncludeAll;
    procedure Include(const Bit: Int64);
    procedure Exclude(const Bit: Int64);
  end;

implementation

{ TBitSet }

procedure TBitSet.MakeUnique;
begin
  // this is used to implement copy-on-write so that the type behaves like a value
  SetLength(FSets, Length(FSets));
end;

procedure TBitSet.GetSetIndexAndBitIndex(Bit: Int64; out SetIndex: Int64; out BitIndex: Integer);
begin
  Assert(InRange(Bit, 0, FBitCount-1));
  SetIndex := Bit shr 8;   // shr 8 = div 256
  BitIndex := Bit and 255; // and 255 = mod 256
end;

function TBitSet.GetIsEmpty: Boolean;
var
  i: Int64;
begin
  for i := 0 to High(FSets) do
  begin
    if FSets[i]<>[] then
    begin
      Result := False;
      Exit;
    end;
  end;
  Result := True;
end;

procedure TBitSet.SetBitCount(Value: Int64);
var
  Bit, BitIndex: Integer;
  SetIndex: Int64;
begin
  if (Value<>FBitCount) or not Assigned(FSets) then
  begin
    Assert(Value>=0);
    FBitCount := Value;
    SetLength(FSets, SetCount(Value));
    if Value>0 then
    begin
      (* Ensure that unused bits are cleared, necessary given the CompareMem call in Equal.
         This also means that state does not persist when we decrease and then increase BitCount.
         For instance, consider this code:
           var
             bs: TBitSet;
           ...
           bs.BitCount := 2;
           bs.Include(1);
           bs.BitCount := 1;
           bs.BitCount := 2;
           Assert(not (1 in bs));
      *)
      GetSetIndexAndBitIndex(Value - 1, SetIndex, BitIndex);
      for Bit := BitIndex + 1 to 255 do
      begin
        System.Exclude(FSets[SetIndex], Bit);
      end;
    end;
  end;
end;

function TBitSet.GetSize: Int64;
begin
  Result := Length(FSets)*SizeOf(FSets[0]);
end;

class function TBitSet.SetCount(BitCount: Int64): Int64;
begin
  Result := (BitCount + 255) shr 8; // shr 8 = div 256
end;

class operator TBitSet.In(const Bit: Int64; const BitSet: TBitSet): Boolean;
var
  SetIndex: Int64;
  BitIndex: Integer;
begin
  BitSet.GetSetIndexAndBitIndex(Bit, SetIndex, BitIndex);
  Result := BitIndex in BitSet.FSets[SetIndex];
end;

class operator TBitSet.Equal(const bs1, bs2: TBitSet): Boolean;
begin
  Result := (bs1.FBitCount=bs2.FBitCount) and CompareMem(Pointer(bs1.FSets), Pointer(bs2.FSets), bs1.Size);
end;

class operator TBitSet.NotEqual(const bs1, bs2: TBitSet): Boolean;
begin
  Result := not (bs1=bs2);
end;

procedure TBitSet.Clear;
var
  i: Int64;
begin
  MakeUnique;
  for i := 0 to High(FSets) do
  begin
    FSets[i] := [];
  end;
end;

procedure TBitSet.IncludeAll;
var
  i: Int64;
begin
  for i := 0 to BitCount-1 do
  begin
    Include(i);
  end;
end;

procedure TBitSet.Include(const Bit: Int64);
var
  SetIndex: Int64;
  BitIndex: Integer;
begin
  MakeUnique;
  GetSetIndexAndBitIndex(Bit, SetIndex, BitIndex);
  System.Include(FSets[SetIndex], BitIndex);
end;

procedure TBitSet.Exclude(const Bit: Int64);
var
  SetIndex: Int64;
  BitIndex: Integer;
begin
  MakeUnique;
  GetSetIndexAndBitIndex(Bit, SetIndex, BitIndex);
  System.Exclude(FSets[SetIndex], BitIndex);
end;

end.
This is based on code of mine that is limited to an Integer bit count. I've not tested it extended to Int64, but I'm sure anyone who wanted to use random code like this would test it.
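A minimal usage sketch for the unit above (the bit count is an arbitrary example):

program BitsetDemo;

{$APPTYPE CONSOLE}

uses
  Bitset;

var
  bs: TBitSet;
begin
  bs.BitCount := 1000000; // allocates the underlying array of sets
  bs.Include(42);
  Assert(42 in bs);       // membership test via the In operator
  bs.Exclude(42);
  Assert(bs.IsEmpty);
end.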
-
How about the idea of storing this in a 2D image as a bitfield, maybe 17320x17320 pixels? Of course, storage might be more efficient, but access would be slower. Thinking further, such a bitfield could even involve the GPU...
-
Wow! If my math is correct, that needs more than 32 GB of memory. I have no idea what the purpose is, but perhaps there are other approaches to achieve the same result.