
jobo

Members
  • Content Count

    8
  • Joined

  • Last visited

Community Reputation

1 Neutral

Technical Information

  • Delphi-Version
    Delphi 2 - 7
  1. The "returning" mentioned here is part of the SQL command; it returns the values of auto-generated IDs and other expressions. Maybe this is not available in Pervasive. Here is the Firebird documentation explaining it: http://www.firebirdsql.org/refdocs/langrefupd21-insert.html (a minimal example follows below). This is of course a solution coming from the database side; FireDAC also offers client-side solutions.
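    A minimal Firebird sketch, with table and column names invented for illustration:

        INSERT INTO customer (name, city)
        VALUES ('Smith', 'Berlin')
        RETURNING id;  -- the generated key comes back with the INSERT, no second query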
  2. jobo

    Best site/source for SQL Server questions?

    I'm not an MS SQL specialist; in fact, I don't like it, so I have no specific points here. From your description it sounds like everything is mixed up a little; I don't know whether that is just an impression from the description or whether it really describes the present situation. The problem I mentioned above occurs when selling limited real-world articles (not things like software license keys), and selling things which are already sold is really a problem. But this is different from optimizing picking paths, keeping pallets available, etc. So, without specific problems in question, I can just give some final hints:

    - clean up and separate issues (maybe even separate systems)
    - do application and processing optimization at every single step (having a few lazy spots is sufficient to break the system*); in fact, SQL tuning of single statements can speed things up by factors you can't afford in hardware

    Tuning could be achieved by:

    - denormalization for speed (space/precalculation is the price for speed; speed avoids growing locks)
    - normalization for speed (find old flaws which call for double work and endanger data accuracy within the time-critical process)
    - index optimization (mostly indexing the proper columns - see the sketch after this list)
    - using the most advanced features of the system (which could be an SQL feature like CTEs, or some really cool new index technology; I'm not up to date with SQL Server here)
    - avoiding local (client-side) processing in favour of stored procedures (the famous "if loop" and so on)
    - avoiding costly deduction of states; use modelled, explicit, simple states
    - considering additional technologies to notify clients of state changes (decreasing stock count, ...)
    - considering specialized use cases to ease the situation, including data model changes if necessary (see also: denormalization)

    * Finally, healing a few smaller flaws could add up to more than the sum of the single effects.
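    A minimal T-SQL sketch of the indexing and CTE points; all table and column names here are hypothetical, just to make the hints concrete:

        -- Index the columns the hot query actually filters on; INCLUDE carries
        -- the quantity so the query can be answered from the index alone.
        CREATE INDEX IX_Reservation_Article
            ON Reservation (ArticleId)
            INCLUDE (Quantity);

        -- CTE: compute reserved quantities once, instead of a subquery per row.
        WITH Reserved AS (
            SELECT ArticleId, SUM(Quantity) AS QtyReserved
            FROM Reservation
            GROUP BY ArticleId
        )
        SELECT s.ArticleId,
               s.QtyOnHand - COALESCE(r.QtyReserved, 0) AS QtyAvailable
        FROM Stock AS s
        LEFT JOIN Reserved AS r ON r.ArticleId = s.ArticleId;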
  3. jobo

    Best site/source for SQL Server questions?

    From my point of view, this is the best way to make use of the database's transaction control when you are on the client side. The server's transaction handling is the server-side implementation that guarantees a complete, error-free transaction, considering each and every rule known to the server. So-called client-side "transaction handling" tries to achieve the same without being the server. Of course you can do this, but it f..s up the server principle and comes at a price: it uses more resources, and by doing so it leads more quickly to the situations you'd like to avoid - timeouts, lock escalation (MSSQL), locked tables, deadlocks.

    So the first thing to do is to make the process as lean as possible. Prepare everything possible before the final transaction. Use highly specific stored procedures if possible (the RDBMS offers strong SP features), which could mean, for example, avoiding parameter-driven "smartness" in the SP and choosing the best possible SP (and processing path) from the client. If necessary, model the "processing path" more explicitly in your database, which could mean a table for "reservations", or distinguishing between a view mode (for the data to be processed) and an "I'm really buying now" mode. Leave the developer comfort zone and stop using client transactions; instead, make tiny, meaningful steps in the backend (transactions) and frontend, and try to handle the resource problems that come up gracefully. A sketch of such a tiny backend step follows below.
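    For illustration, such a lean final transaction could look like this in T-SQL; the schema and names are made up:

        BEGIN TRANSACTION;

        -- Decrement stock only if enough is left; no read-then-write gap.
        UPDATE Stock
        SET QtyOnHand = QtyOnHand - @Qty
        WHERE ArticleId = @ArticleId
          AND QtyOnHand >= @Qty;

        IF @@ROWCOUNT = 1
        BEGIN
            -- The "I'm really buying now" step: record the reservation.
            INSERT INTO Reservation (ArticleId, CustomerId, Qty, ReservedAt)
            VALUES (@ArticleId, @CustomerId, @Qty, SYSUTCDATETIME());
            COMMIT;
        END
        ELSE
            ROLLBACK;  -- not enough stock: fail fast instead of holding locks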
  4. jobo

    Best site/source for SQL Server questions?

    Some thoughts and questions on this:

    - the only requests like this one I know of come from MS SQL "users", so some systems seem to be picky about row-level locking, others not
    - the SQL handling problem here is bound very closely to the real-world problem of demanding a physically limited thing
    - row-level locking in MS SQL Server under load can cause "lock escalation", i.e. a much more general locking such as page locking
    - maybe one has to think about some denormalization, which of course has to be accompanied by adequate constraints, keys, triggers, etc.
    - handling the "grabbing" might be easier, quicker, and more robust if handled by a single, atomic grab stored procedure (considering the denormalization); see the sketch after this list
    - which version is in use?
    - could you elaborate on "...the actual grabbing is totally void of transactions...", please? Or describe the workflow a little?
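    A T-SQL sketch of such an atomic grab procedure; the names are hypothetical, and READPAST lets concurrent grabbers skip locked rows instead of queuing behind them:

        CREATE PROCEDURE dbo.GrabUnit
            @ArticleId  int,
            @CustomerId int
        AS
        BEGIN
            SET NOCOUNT ON;
            -- One statement, therefore atomic: claim a free unit and return its id.
            UPDATE TOP (1) u
            SET u.Status = 'grabbed',
                u.CustomerId = @CustomerId
            OUTPUT inserted.UnitId
            FROM dbo.Unit AS u WITH (UPDLOCK, READPAST)
            WHERE u.ArticleId = @ArticleId
              AND u.Status = 'free';
        END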
  5. jobo

    General DB access question -- paging query results

    Adding a paging mechanism by using restrictions in the SQL WHERE clause certainly doesn't require extra indexing. The base query itself may use indexes, if they are well designed and fitting. I think Lars was just surprised that components today really do that ~"offset trick", i.e. make use of more modern SQL features. A minimal example follows below.
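    For illustration, the "offset trick" in standard SQL (supported by e.g. SQL Server 2012+ and Firebird 3; table and column names invented):

        SELECT order_id, customer_name, order_date
        FROM orders
        ORDER BY order_date DESC, order_id  -- paging needs a stable sort order
        OFFSET 40 ROWS                      -- skip the first two pages of 20
        FETCH NEXT 20 ROWS ONLY;            -- ...and return page 3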
  6. OK, sorry, I read somewhere about a general discussion, and up to now nobody but you knew the record counts. Of course, a few thousand records can't in themselves be a problem. So I already asked the magic question, and one way or another you have to pick your poison: use some of the magic switches already mentioned, and do intelligent initialization instead of swallowing the whole database into memory. At the moment the processing seems to be the complete opposite of a "scaling" design - which would include a restriction right at the beginning. Does abstraction make problems simpler and lead to simple solutions? I don't think so: divide and conquer! Have a look at the details! All the best for 2019 to all forum visitors!
  7. Well, I'm afraid you basically have two big problems from a technical perspective. Taking your explanation from the other thread https://en.delphipraxis.net/topic/476-any-suggestions-on-how-to-lazy-load-a-blobclob-from-oracle-db/ into account, these problems result from an incomplete or imperfect "migration" from the BDE. Wherever a dataset is opened by "select * from anytable;", it misses the required precision in horizontal and vertical data limitation:

    a) "*" doesn't cope with specific field handling requirements like BLOB data
    b) a missing filter (WHERE clause) forces the components used to fetch all data from the remote server (ignoring bandwidth shared with others)

    Now the question is: how do you imagine a cure? (I'm cross-eyed on the other thread again.) Are you looking for a kind of magic switch? Are you doomed to an unintelligent solution (because of management, resources, ...)? When I wrote up my idea of a "workaround", I wasn't really aware that you are selecting thousands or millions of such BLOBed records unfiltered - which is, from my point of view, a no-go even without BLOB fields involved.

    Okay, the good news is: there are magic switches limiting the number of fetched records. The bad news: talking about big and complex queries, you will sooner or later face the problem that a big and complex query itself takes time to be evaluated, even on the server side. Any sort operation, for example, is where the trouble starts. And as explained in the other thread, the paging mechanism you are looking for is implemented with sort, limit, and offset clauses. So if you ask the server to run a complex query against millions of rows, also returning millions of rows without grace, then the magic switch asking for just a few of the records will not help in every situation.

    Let me put it this way: even a "modern" paging mechanism doesn't help if the data design doesn't scale. You will not find an app which offers high-speed data retrieval without some kind of windowing or boxing or filtering of the data prior to showing it. And a normal use case in fact offers a lot of such limits; it's "natural". A clerk works on one order after the other, not on thousands or millions. The same goes for the customer hotline: they ask for my number each time, and yes, after that we are talking about my order and everything I ordered, but this isn't about millions of order items. (A small sketch of such a "natural" filter follows below.)

    And I'm sorry, but your requirements (from the other thread) sound a little like "wash me, but don't get me wet". In any case, you can make use of a powerful database and the good components you seem to already have available to get these problems under control (even with an older version of Delphi). At least that is what I did when I was actively developing Delphi apps, starting with the BDE and moving to ADO/OLEDB and ODBC, using the mechanisms discussed here. I'm sure my suggestion would help, if you implement it.
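    To illustrate the "natural" limit: a clerk's query is windowed by the business itself, so it touches an index and a handful of rows instead of millions (schema invented):

        SELECT oi.item_id, oi.article_id, oi.quantity, oi.price
        FROM order_item oi
        WHERE oi.order_id = :order_id  -- the natural window: one order at a time
        ORDER BY oi.item_id;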
  8. Is this yours: http://forums.allroundautomations.com/list.html ? If so, it would have been nice to mention it here. Anyway, my idea is: create a view which returns all fields 1:1, except your CLOB column, which is truncated to just a few characters. (Maybe an individual truncation comment could be added as well, if truncation really happened.) Now change your app to show the view instead of the table, and add an event for editing & showing the complete CLOB. This is no paging, but it is quite a minimal change to the app; a sketch follows below. Of course it depends on the capabilities of your components - I never used them. But at least the tools from AA are well known to me, and they are quite fast. I guess they use their own data access components, and this workaround would be a quick fix.
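    A sketch of that view in Oracle SQL; the table and column names are guesses, and the 200-character preview length is arbitrary:

        CREATE OR REPLACE VIEW v_document AS
        SELECT d.id,
               d.title,
               -- show only the first 200 characters of the CLOB
               DBMS_LOB.SUBSTR(d.body, 200, 1) AS body_preview,
               -- flag so the app can offer "show complete CLOB" when needed
               CASE WHEN DBMS_LOB.GETLENGTH(d.body) > 200 THEN 1 ELSE 0 END AS is_truncated
        FROM document d;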