Everything posted by Stefan Glienke

  1. Stefan Glienke

    Profiler for Delphi

    I like SamplingProfiler but compared to the capabilities of VTune or μProf it's just a toy.
  2. Stefan Glienke

    Profiler for Delphi

    Does it help to put the pdb into one of the directories mentioned in 6. of this doc? https://software.intel.com/content/www/us/en/develop/documentation/vtune-help/top/set-up-project/search-directories/search-order.html
  3. Stefan Glienke

    Good Key/Hash for SQL string

    What's the problem with simply using a TDictionary<string, queryresult> where the key is the SQL statement? Even if there is a hash collision, the hash table handles it. I doubt that the extra comparisons of the colliding SQL strings would affect performance noticeably.
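
    A minimal sketch of that approach - TQueryResult, RunQuery and SQL are placeholders for whatever holds and produces the actual result:

    uses
      System.Generics.Collections;

    var
      Cache: TDictionary<string, TQueryResult>;
      Rows: TQueryResult;
    begin
      Cache := TDictionary<string, TQueryResult>.Create;
      try
        if not Cache.TryGetValue(SQL, Rows) then
        begin
          Rows := RunQuery(SQL);   // hypothetical function that really executes the statement
          Cache.Add(SQL, Rows);    // colliding hashes are resolved by comparing the full strings
        end;
        // work with Rows ...
      finally
        Cache.Free;
      end;
    end;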
  4. "Let's put dots into unit names, add some confusing matching logic for uses clauses that nobody understands and call that namespaces"
  5. Stefan Glienke

    Profiler for Delphi

    That's also something that went through my head when I was reading the LLVM documentation.
  6. If you look into the debugger you see what causes the breakpoint to stop. It looks to me like one of the many situations where the compiler produces debug symbols that are a little off - as you can see in the asm view, two different pieces of code are affected by the breakpoint. The thing is that every instruction the compiler generates - even the implicit ones for the looping - belongs to some line of code, and in this case it's the piece of code emitted for the outer loop. We had another situation where this happened just recently:
  7. As Remy rightly assumed this indeed was a bug - an untyped pointer was assignment compatible with a dynamic array - a very dangerous and long-lasting bug that was finally fixed in 10.2. And because the default for {$TYPEDADDRESS} is OFF, this code actually resulted in the pointer to the TRect variable being passed as a TBytes - and because the code inside that overload never does any range or bounds check, it happens to work if you pass the correct size of that variable.
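
    A minimal sketch of the pattern being described, for illustration only (the Load overload and all names are made up; TRect and TBytes are the RTL types):

    {$TYPEDADDRESS OFF} // the default - @R yields an untyped Pointer
    procedure Load(const Data: TBytes; Size: Integer);

    procedure Broken;
    var
      R: TRect;
    begin
      // pre-10.2 compilers accepted the untyped pointer where TBytes is expected;
      // it only "works" because Load never performs a range or bounds check
      Load(@R, SizeOf(R));
    end;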
  8. Stefan Glienke

    DEC 6.x issue

    Tbh you pretty much butchered the entire code for older versions. I just gave XE a try and it fails on numerous things: unsupported $IF/$ENDIF (it was $IF/$IFEND back then), scoped unit names, usage of intrinsic helpers, AtomicIncrement and more I guess (I stopped fixing the code at that point)
  9. It does matter - if your ID values, for example, are in a certain range, maybe even starting at 1, you could get O(1) access by simply using them as the index into the array itself. I don't know what that means - but talking about at most 10k records with strings up to 30 characters, looking up one of those is certainly within the ms if not ns range.
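
    A rough sketch of the array-index idea, assuming small positive IDs; TDataRec, MaxID, Source, SomeID and the variable names are all placeholders:

    var
      Items: TArray<TDataRec>;      // slot index = ID
      Rec, Found: TDataRec;
    begin
      SetLength(Items, MaxID + 1);  // assumes IDs start at 1 and never exceed MaxID
      for Rec in Source do          // one-time fill
        Items[Rec.ID] := Rec;
      // lookup is a single array access - no hashing, no string comparisons
      Found := Items[SomeID];
    end;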
  10. No surprise there - linear search is O(n), binary search is O(log n) and a hash table is O(1). What you see in the differences for certain numbers of records are the different constant factors - building a hash has a certain cost, which is why TDict is slower up to a certain point; this depends on the hash algorithm being used, indirections caused by extra function calls like IEqualityComparer.GetHashCode, and the memory layout of the hash table. You can even see that it's not strictly O(1) as it gets a little slower the more records you have, which I would guess is due to the memory layout within the hashtable: at a certain point memory access time kicks in because the data does not fit in the fast cache anymore (the access pattern of the benchmarking code matters there as well).

    Without specifying the exact use cases it's hard to give any suggestions - is the data a one-time fill and then only lookups, is data being changed afterwards (adding/removing records), is data within the records changing? What's the typical lookup pattern - is data being looked up randomly or in certain patterns (such as ID or Name sequences in a certain order)?

    If you aim for maximum performance a custom data structure and algorithm will always win over some standard algorithm by a few percent - the question is whether you want to do all that work, including risking bugs, for those negligible few percent, or use standard data structures and algorithms that are proven to work and easy to apply.
  11. That is because technically what the compiler generates for line 93 is this:

    xComparison :=
      function(const left, right: TDataLine): Integer
      begin
        Result := CompareIDs(left, right)
      end;

    When you put a breakpoint there, it gets hit on multiple lines: one time for the assignment to xComparison and all other times for the Result := CompareIDs
  12. The main reason is in the very first sentence of my previous post. Also, since you are so into benchmarking, I suggest you make yourself familiar with profilers so you don't have to guess or ask others what is taking time - you can see it for yourself.
  13. Which does absolutely nothing since like XE7 or so when they fixed the $RTTI switch to be local to the current unit.
  14. This cannot be the reason because XE already had extended RTTI. It's more likely that internal refactorings of the RTL and VCL, such as using generic lists instead of good old Classes.TList and Contnrs.TObjectList, contribute to the bloat. Of course with RTTI being enabled on those lists, all the typically inlined method calls stay in the binary. I wonder what difference {$WEAKLINKRTTI ON} would make.
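
    For reference, this is the directive combination usually quoted to cut RTTI from a unit - whether it actually helps here would have to be measured:

    {$WEAKLINKRTTI ON}
    {$RTTI EXPLICIT METHODS([]) PROPERTIES([]) FIELDS([])}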
  15. Stefan Glienke

    Can Delphi randomize string 'Delphi'?

    52^6 is bigger than 32bit, so of course a 32bit RNG might not yield it. In fact it's over 4 times more than 32bit, so only like every 4th possible 6-letter combination would ever be yielded. Bonus hint: try a lowercase d 😉
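
    For the record: 52^6 = 19,770,609,664 while 2^32 = 4,294,967,296, so the ratio is roughly 4.6 - which is where the "over 4 times" comes from.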
  16. Because an IComparer<T> created via TComparer<T>.Construct (a TDelegatedComparer<T>) suffers from this issue: https://www.idefixpack.de/blog/2016/05/whats-wrong-with-virtual-methods-called-through-an-interface/

    Furthermore, every call to GetName_TArrayBinarySearch constructs the comparer again. Eliminating that as well gives me a result of 159 vs 105, which can then be explained by the additional calls through IComparer<T> and the probably slightly less optimal register allocation in the actual method that performs the search, because that is the one with many more arguments:

    class function TArray.BinarySearch<T>(const Values: array of T; const Item: T;
      out FoundIndex: Integer; const Comparer: IComparer<T>;
      Index, Count: Integer): Boolean;

    FWIW your implementation is not exactly the same as the RTL one: you exit as soon as you find a match, while the RTL implementation keeps going because it returns the first index in case there are successive matching elements.
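
    A sketch of what eliminating the repeated construction could look like - GetName_TArrayBinarySearch, TDataLine and CompareIDs follow the names used in this thread, everything else (the Name field, the string result) is assumed:

    // uses System.Generics.Collections, System.Generics.Defaults
    var
      IDComparer: IComparer<TDataLine>;   // constructed once, reused by every search

    function GetName_TArrayBinarySearch(const Lines: TArray<TDataLine>;
      const Item: TDataLine): string;
    var
      idx: Integer;
    begin
      if TArray.BinarySearch<TDataLine>(Lines, Item, idx, IDComparer) then
        Result := Lines[idx].Name   // assumes TDataLine carries the name being looked up
      else
        Result := '';
    end;

    initialization
      IDComparer := TComparer<TDataLine>.Construct(CompareIDs);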
  17. If you mean accessing the TList<T> as an IList<T> without moving the items from one list to the other - you'll need to write an adapter that wraps the TList<T> into an IList<T>; the library does not contain one. If you mean to move the items, then you need to loop or use .ToArray on the TList, as the IList interface does not offer any overloads accepting a TList<T> or TEnumerable<T> from System.Generics.Collections, and that will not change.
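
    A minimal sketch of the second option, copying the items over - rtlList stands for the existing System.Generics.Collections list, the rest is Spring4D:

    uses
      System.Generics.Collections, Spring.Collections;

    var
      rtlList: TList<string>;
      list: IList<string>;
      s: string;
    begin
      list := TCollections.CreateList<string>;
      for s in rtlList do          // or: list.AddRange(rtlList.ToArray)
        list.Add(s);
    end;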
  18. Stefan Glienke

    TNothingable<T>

    Last time I checked this was a Delphi forum, so no clue if PHP can do this, and the API may very well not be 100% adhering to the spec - but an array with a null in JSON would be [null]. Anyway, none of that relates to your initial question imo - all mentioned cases can be handled with default data types; an explicit nullable type is there to add that additional state of nothing/null/nada to a value type.
  19. Stefan Glienke

    TNothingable<T>

    Which is correct, because {} represents an empty object, which is not the same as no object (null).
  20. Stefan Glienke

    TNothingable<T>

    Your JSON is wrong - an empty array is represented as [], not as null
  21. Stefan Glienke

    TNothingable<T>

    Nullable<T> you mean?
  22. Stefan Glienke

    Blogged : Advice for Delphi library authors

    I wonder when that will happen, given that the oldest mention of that version I remember was in 2010.
  23. Stefan Glienke

    Micro optimization: Split strings

    You must be using a version before 10.3 - this has been fixed: https://quality.embarcadero.com/browse/RSP-11302
  24. Stefan Glienke

    10.4.2 Released today - available to download

    I am also glad we could find a solution - fyi the actual fix was done a bit differently than suggested in the comments of that issue. Also thanks to @jbg, who gave some input on the subject, and to @Bruneau, with whom we worked to get this solved.
  25. Stefan Glienke

    Micro optimization: Split strings

    Some of your functions have a defect in that they return an empty array when no delimiter is found - they must return a one-element array containing the input string if they are to follow the RTL behavior. Also you can remove some unnecessary branching and make the code simpler:

    function CustomSplitWithPrecountByIndex2(const aString: string;
      const aDelimiter: Char): TArray<string>;
    var
      i, resultLen, resultIdx, tokenPos, inputLen, lastDelimiterPos: Integer;
    begin
      inputLen := aString.Length;
      lastDelimiterPos := 0;
      resultLen := 1;
      for i := 1 to inputLen do
        if aString[i] = aDelimiter then
        begin
          Inc(resultLen);
          lastDelimiterPos := i;
        end;
      SetLength(Result, resultLen);
      resultIdx := 0;
      tokenPos := 1;
      for i := 1 to lastDelimiterPos do
        if aString[i] = aDelimiter then
        begin
          SetString(Result[resultIdx], PChar(@aString[tokenPos]), i - tokenPos);
          tokenPos := i + 1;
          Inc(resultIdx);
        end;
      SetString(Result[resultIdx], PChar(@aString[tokenPos]), inputLen - lastDelimiterPos);
    end;
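
    And a quick, hypothetical sanity check of the RTL-compatible behavior described above:

    var
      Parts: TArray<string>;
    begin
      Parts := CustomSplitWithPrecountByIndex2('a;b;c', ';');
      Assert(Length(Parts) = 3);
      Parts := CustomSplitWithPrecountByIndex2('no delimiter here', ';');
      // like the RTL: no delimiter found -> one element containing the whole input
      Assert((Length(Parts) = 1) and (Parts[0] = 'no delimiter here'));
    end;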