
Mark Williams

Members
Everything posted by Mark Williams

  1. Ok. Thanks. I'll have a look at that and report back if I can improve on what I have already got.
  2. Scanlines is fast enough for my purposes even when it scans every pixel. I would be reluctant to rely on random sampling.
  3. Thanks. I've sort of come up with something along those lines using Scanlines. It seems to be tolerably effective and sufficiently fast.
  4. I have written an ISAPI DLL (Delphi 10.3) for IIS to carry out various functions (download/upload of documents, querying tables etc). It was working fine until I started firing more than one request at a time, at which point the DLL simply crashed and I had to restart the WWW service. The DLL is being used to request smallish files (20-30KB) over a local network. These are requested one at a time. If I make these requests at the same time as one or two HEAD requests, it all goes kaput.

     I thought it might be the TWebRequestHandler's MaxConnections, so I set that to 200. It made no difference. After a bit of digging around I noticed that the IIS application pool I created for the DLL has a "Maximum Worker Processes" property, and that this was set to 1. I set it to 20 and now the DLL seems able to handle more than one concurrent request. However, I noticed from my DLL's log file that it was now creating a new instance of the DLL for every request.

     I had a dig around on the web and, from the information I found, I have come to the conclusion that Maximum Worker Processes is probably not what I need. It seems to be designed for long-running processes, which is not what my DLL is designed for. If not Maximum Worker Processes, and increasing MaxConnections doesn't work, how do I get my DLL to deal with concurrent requests?
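     The usual WebBroker answer to this is a thread pool inside a single worker process rather than multiple IIS worker processes: including Web.Win.ISAPIThreadPool in the project's uses clause makes the DLL service requests on pooled threads. A minimal sketch of such a .dpr (library and unit names are placeholders, not the poster's project):

```delphi
library MyIsapi; // placeholder name

uses
  Winapi.ActiveX,
  System.Win.ComObj,
  Web.WebBroker,
  Web.Win.ISAPIApp,
  Web.Win.ISAPIThreadPool, // pooled request threads: one process, many concurrent requests
  WebModuleUnit in 'WebModuleUnit.pas' {WebModule1: TWebModule};

{$R *.res}

exports
  GetExtensionVersion,
  HttpExtensionProc,
  TerminateExtension;

begin
  CoInitFlags := COINIT_MULTITHREADED;  // COM must be multithreaded for pooled requests
  Application.Initialize;
  Application.WebModuleClass := WebModuleClass;
  Application.MaxConnections := 200;    // default is 32; raise to match expected load
  Application.Run;
end.
```

     Each pooled request gets its own web module instance, so anything shared across requests (connection managers, log files) needs its own locking.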
  5. Mark Williams

    ISAPI DLL concurrent requests

    Solved at last. The DLL was hanging on the call to AddToLog in CloseServer, because the log thread was being terminated at the same time as the call. In the meantime, I had added a TFDMoniFlatFileClientLink component to provide a FireDAC trace to see what was happening with the FDManager, as I thought that was causing the problem. Once I'd fixed the problem with the thread log, I still had issues. By a process of starting from scratch (as you suggested) and adding units/components as I went along, I eventually managed to narrow it down to the flat-file component. Once I removed it, all was well. Must be a threading problem or some such. Many thanks for your help.
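    The race described here (AddToLog firing while the log thread is being torn down) can be closed with a guard flag checked under a lock. A minimal sketch, where TThreadFileLog stands in for the poster's logger and the Log method is an assumed name:

```delphi
uses
  System.SysUtils, System.SyncObjs;

type
  TThreadFileLog = class
  public
    procedure Log(const Msg: string); // real implementation writes via a worker thread
  end;

procedure TThreadFileLog.Log(const Msg: string);
begin
  // elided: queue Msg for the background writer
end;

var
  LogLock: TCriticalSection;      // guards both the flag and the logger
  LogClosed: Boolean = False;
  FThreadFileLog: TThreadFileLog;

procedure AddToLog(const Msg: string);
begin
  LogLock.Enter;
  try
    // Once shutdown has begun, silently drop new entries so CloseServer
    // can never race against a logger that is mid-destruction.
    if (not LogClosed) and Assigned(FThreadFileLog) then
      FThreadFileLog.Log(Msg);
  finally
    LogLock.Leave;
  end;
end;

procedure ShutdownLog;
begin
  LogLock.Enter;
  try
    LogClosed := True;
    FreeAndNil(FThreadFileLog);   // safe here: no AddToLog call can be in flight
  finally
    LogLock.Leave;
  end;
end;
```

    The point of taking the lock in both routines is that destruction and logging become mutually exclusive, which is exactly the property the original code lacked.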
  6. Mark Williams

    ISAPI DLL concurrent requests

    I've been through all my code. As far as I can see, I am destroying everything I create in an action event within the same action event. I have tried calling the DefaultHandlerAction and nothing else:

        procedure TWebModule1.WebModule1DefaultHandlerAction(Sender: TObject;
          Request: TWebRequest; Response: TWebResponse; var Handled: Boolean);
        begin
          Handled := True;
          try
            SetResponse(Response, 400, 'Bad Request',
              'Default Handler - if you are getting this message it either means ' +
              'that you are providing the wrong PathInfo or the action has been deleted.');
          except
          end;
        end;

    And SetResponse:

        procedure SetResponse(var Response: TWebResponse; code: Integer;
          statusText: string; debug: string = '');
        begin
          Response.StatusCode := code;
          Response.ReasonString := code.ToString + ' ' + statusText;
          if debug <> '' then
            AddToLog('Response set: ' + Response.StatusCode.ToString + ' ' + statusText +
              ' Debug Info: ' + debug, leMinorError)
          else
            AddToLog('Response set: ' + Response.StatusCode.ToString + ' ' + statusText,
              leDevelopment);
        end;

    It still hangs on shutdown. So I tried removing my calls to StartServer and CloseServer, and edited my DefaultHandlerAction so it didn't call SetResponse; it just sets the status code to 400. Still a huge delay on shutdown. My project file now looks like this:

        library MWCWebServer;

        uses
          Winapi.ActiveX,
          System.Win.ComObj,
          System.SysUtils,
          Web.WebBroker,
          Web.Win.ISAPIApp,
          Web.Win.ISAPIThreadPool,
          Winapi.Isapi2,
          Winapi.Windows,
          System.SyncObjs,
          System.Classes,
          WebModule in 'WebModule.pas' {WebModule1: TWebModule};

        {$R *.res}

        exports
          GetExtensionVersion,
          HttpExtensionProc,
          TerminateExtension;

        begin
          CoInitFlags := COINIT_MULTITHREADED;
          Application.Initialize;
          Application.WebModuleClass := WebModuleClass;
          Application.MaxConnections := 200; // MaxConnections = 32 by default
          IsMultiThread := True;
          Application.Run;
        end.

    I'm not creating anything over and above the web application itself and still it won't shut down properly.
  7. Mark Williams

    ISAPI DLL concurrent requests

    Yes. I'm sure that freeing of FDManager is not the issue. Yes. Will do.
  8. Mark Williams

    ISAPI DLL concurrent requests

    See post two up. I don't think the problem is anything to do with FDManager. The reason it was hanging at the point of freeing was the call to the thread log, and my freeing of the thread log before the threaded call had been made. My CloseServer now executes fully and appears to free all the resources it's requested to free. There has to be something else I am missing, something I am creating somewhere and not freeing. No idea what, though! Checking now.
  9. Mark Williams

    ISAPI DLL concurrent requests

    I terminate the WWW service via Services. I can see my log file is updated almost immediately with the message "Server Closed", and then the progress bar in Services takes well over a minute from that point to complete the shutdown.
  10. Mark Williams

    ISAPI DLL concurrent requests

    Further process of elimination: it was nothing to do with the call to free FDManager. It was the AddToLog calls that were causing the problem. These are threaded using FThreadFileLog, which I was freeing in the final part of CloseServer, before the threads called in CloseServer had terminated. I have changed these calls so that they are no longer threaded and, as a double precaution, added a check in ThreadFileLog's destructor to make sure no threads are running before it is shut down. My CloseServer procedure now frees all the resources as expected, but the application pool is still taking an age to close down. There must be some resource I am creating and not freeing. I will check all my code and, if I can pinpoint what it is, I will update.
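    The "make sure no threads are running before shutdown" check mentioned here can be done by counting in-flight writes and waiting for the count to drain in the destructor. A sketch under those assumptions (this is not the poster's TThreadFileLog, just one way to build the same guarantee):

```delphi
uses
  System.Classes, System.SysUtils, System.SyncObjs, System.IOUtils;

type
  TThreadFileLog = class
  private
    FFileName: string;
    FFileLock: TCriticalSection; // serialises file appends
    FPending: Integer;           // log writes still in flight
    FDrained: TEvent;            // signalled whenever FPending drops to zero
  public
    constructor Create(const AFileName: string);
    destructor Destroy; override;
    procedure Log(const Msg: string);
  end;

constructor TThreadFileLog.Create(const AFileName: string);
begin
  inherited Create;
  FFileName := AFileName;
  FFileLock := TCriticalSection.Create;
  // Manual-reset event, initially signalled: "nothing in flight".
  FDrained := TEvent.Create(nil, True, True, '');
end;

procedure TThreadFileLog.Log(const Msg: string);
begin
  TInterlocked.Increment(FPending);
  FDrained.ResetEvent;
  TThread.CreateAnonymousThread(
    procedure
    begin
      try
        FFileLock.Enter;
        try
          TFile.AppendAllText(FFileName, Msg + sLineBreak);
        finally
          FFileLock.Leave;
        end;
      finally
        if TInterlocked.Decrement(FPending) = 0 then
          FDrained.SetEvent;
      end;
    end).Start;
end;

destructor TThreadFileLog.Destroy;
begin
  // Wait for every queued write to land before tearing anything down,
  // but cap the wait so shutdown can never hang indefinitely.
  FDrained.WaitFor(10000);
  FDrained.Free;
  FFileLock.Free;
  inherited;
end;
```

    Because Log increments the counter and resets the event before the worker thread starts, the destructor cannot slip past a write that has been queued but not yet finished.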
  11. Mark Williams

    ISAPI DLL concurrent requests

    A good question! There is no need. Removed. But that's not the cause of the problem on shutdown. Any ideas why it would be hanging at this point?
  12. Mark Williams

    ISAPI DLL concurrent requests

    @Yaron It's taken me a while to revisit this and as usual it has not disappointed: days thrown into the abyss trying to change the structure of my ISAPI DLL! I have implemented points 1 to 4 mentioned by Yaron. Point 5 is causing me a problem: the closing down of the application pool. I get a long delay and my closing section does not fire. My dpr now looks like this:

        library MWCWebServer;

        uses
          Winapi.ActiveX,
          System.Win.ComObj,
          System.SysUtils,
          Web.WebBroker,
          Web.Win.ISAPIApp,
          Web.Win.ISAPIThreadPool,
          Winapi.Isapi2,
          Winapi.Windows,
          System.SyncObjs,
          System.Classes,
          WebModule in 'WebModule.pas' {WebModule1: TWebModule};

        {$R *.res}

        function GetExtensionVersion(var Ver: THSE_VERSION_INFO): BOOL; stdcall;
        begin
          Result := Web.Win.ISAPIApp.GetExtensionVersion(Ver);
          CriticalSection := TCriticalSection.Create;
          StartServer;
        end;

        {I have removed the call to TerminateVersion as it wasn't firing:
        function TerminateVersion(var Ver: THSE_VERSION_INFO): BOOL; stdcall;
        begin
          Result := Web.Win.ISAPIApp.GetExtensionVersion(Ver);
          CloseServer;
          CriticalSection.Free;
        end;}

        {I have added this procedure as shown in the linked post provided by Yaron}
        procedure DoTerminate;
        begin
          CloseServer;
          CriticalSection.Free;
        end;

        exports
          GetExtensionVersion,
          HttpExtensionProc,
          TerminateExtension;

        begin
          CoInitFlags := COINIT_MULTITHREADED;
          Application.Initialize;
          Application.WebModuleClass := WebModuleClass;
          Application.MaxConnections := 200; // MaxConnections = 32 by default
          IsMultiThread := True;
          TISAPIApplication(Application).OnTerminate := DoTerminate;
          {Application.CacheConnections := True;} // not necessary as cached by default
          Application.Run;
        end.
    My call to CloseServer now looks as follows:

        procedure CloseServer;
        begin
          CriticalSection.Enter;
          try
            if Assigned(MimeTable) then
              try MimeTable.Free; except end;
            FDManager.Close;
            try FDManager.Free; except end;
            try AddToLog('Connection manager deactivated', leMajorEvents); except end;
            try AddToLog('Server Closed', leMajorEvents); except end;
            if Assigned(FThreadFileLog) then
              try FThreadFileLog.Free; except end;
          finally
            CriticalSection.Leave;
          end;
        end;

    CloseServer gets called on shutdown as expected. However, it hangs at "try FDManager.Free; except end;". It appears to execute the line of code above (i.e. closing FDManager), but it does not appear to execute the call to free FDManager, nor does it get caught by the try..except, nor by the try..finally. It just hangs at the call to Free, and this is when the application pool stalls in its shutdown process. Does anyone have any ideas why?
  13. Mark Williams

    Using Params with recycled queries

    On initial setup I set up queries with SQL.Text and prepare them ready for use and reuse. Some of these queries use params. On first usage the queries work as expected. When reused, they are simply not fired at all. As an example:

        with DAC.FDQueryDocCats do
        begin
          ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
          Execute;
        end;

    This works as expected on first use. When it is run a second time, using the debugger I can see that the correct CASEID value is being passed in and that the query appears to execute; however, it doesn't. The query dataset doesn't update. I have enabled tracing and on checking the trace it is clear that the query does not run at all. I have tried EmptyDataSet and ClearDetails, but with the same effect. If, however, I do this:

        with DAC.FDQueryDocCats do
        begin
          SQL.Text := SQL.Text;
          ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
          Execute;
        end;

    all works well. However, I lose any advantage in having prepared my query in the first instance (although I have no idea just how great that advantage is). It would seem setting the SQL text clears data or resets flags that I am missing. I have Delphi Professional, not Enterprise, so I can't see the FireDAC source to see what it is doing so I can copy it.
  14. Mark Williams

    Using Params with recycled queries

    Sorry, it should have said Open, not Execute. I call the appropriate command in another script which handles various errors and posts details to a database. I didn't want to confuse by including all that code, but managed to confuse by not doing so! I was sure I had already tried Close and it caused an AV. However, it works! Thanks.
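    For later readers, the working pattern from this exchange appears to be simply closing the prepared query between runs. A sketch, reusing the names from the earlier post:

```delphi
// Reusing a prepared TFDQuery: Close it between runs so the next Open
// re-executes the statement instead of leaving the old cursor in place.
with DAC.FDQueryDocCats do
begin
  Close;                                           // discard the previous result set
  ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
  Open;                                            // re-runs the SELECT with the new param
end;
// Close does not unprepare the statement (changing SQL.Text does),
// so the Prepare cost should only be paid once.
```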
  15. Mark Williams

    Best way to refresh static data

    I'm trying to work out the best way to refresh data loaded from a local file with data from the server. I'm using FireDAC with PostgreSQL; however, the solution needs to be a general one, as it will eventually need to work for MS SQL, Oracle etc.

    I have a table (DOCS_TABLE) which can vary greatly in size. DOCS_TABLE has a timestamp field (LAST_UPDATED) which, as the name suggests, records the date on which the data in the record last changed. When users open the app, if they haven't queried DOCS_TABLE previously, it is loaded via the server using a fairly complicated WHERE statement (COMPLICATED_WHERE_STATEMENT) which involves a number of joins to establish which records from the table the user is permitted to access. When the user closes the app, the data from DOCS_TABLE is stored locally along with a timestamp recording the date and time the data was last refreshed (STORED_TIMESTAMP).

    Next time the app opens, it loads the data from the locally stored file. It then needs to ensure the user is working with up-to-date data. At the moment I am running a refresh query:

        SELECT [fields] FROM DOCS_TABLE
        WHERE LAST_UPDATED > [STORED_TIMESTAMP] AND [COMPLICATED_WHERE_STATEMENT]

    I use the resulting data from the refresh query to update the in-memory dataset holding DOCS_TABLE. This works, although it doesn't deal with records that were available at the time of the last local save and have since been deleted or had access denied. As such, within the app, I run a check to make sure the user still has access to any record before trying to do anything with it, but it's not a terribly elegant solution. It would be better if such items were removed soon after loading the locally saved data.

    I have some thoughts on how to deal with this, which are below. However, I am concerned I may be overcomplicating things and that there may be much simpler solutions to this problem.

    1. Load the data from the local file.
    2. Run a thread for the following:
       a. Run a query (ID_QUERY) to ascertain which rows are now available to the user:

              SELECT id FROM DOCS_TABLE WHERE [COMPLICATED_WHERE_STATEMENT]

       b. Check the locally saved data against the result of this query to see which rows are no longer available to the user, and remove them. Build a list of ids from the locally saved data (EXISTING_ID_ARRAY).
       c. Check the locally saved data against the results from ID_QUERY to see whether there are any new records to be added, and build a list of their ids (NEW_ID_ARRAY).
       d. Run the refresh query using the arrays:

              SELECT [fields] FROM DOCS_TABLE
              WHERE (id IN ([NEW_ID_ARRAY]))
                 OR (id IN ([EXISTING_ID_ARRAY]) AND LAST_UPDATED > [STORED_TIMESTAMP])

    Subject to my whole theory being cock-eyed, I am pretty sure NEW_ID_ARRAY is the way to go. The part that concerns me is EXISTING_ID_ARRAY. Whilst it will cut out the use of the COMPLICATED_WHERE_STATEMENT and enable the query to focus explicitly on a clearly identified group of records, I would think the size of the array could become a problem. Is there a law of diminishing returns with an IN clause? For example, if there were 1M records in the table and 20 items in the array, I suppose it must be the case that using EXISTING_ID_ARRAY will be quicker than using COMPLICATED_WHERE_STATEMENT. But what if the array contained 800K ids? I guess it has to be significantly less efficient to use EXISTING_ID_ARRAY and more efficient to use COMPLICATED_WHERE_STATEMENT.

    I appreciate that, without providing full details of the structure of DOCS_TABLE and the various joined tables, the data being retrieved from it and the full nature of the COMPLICATED_WHERE_STATEMENT, I may be asking for a comparison between apples and pears. What I am really interested in is whether my logic set out above is sound or idiotic, and any suggestions on how best to achieve what I am trying to achieve.
  16. Mark Williams

    Best way to refresh static data

    I'm not convinced I have such a major issue here, but only time will tell and I'll have to make a judgement call at that time. Give it a REST! Joking aside, I get the message. REST is the way to go. Although by the time I get round to looking at it, it will probably be out to pasture with the brontosauri.
  17. Mark Williams

    Best way to refresh static data

    I think there may be some confusion. DOCS_TABLE does not contain the actual documents; rather it contains only data relating to the documents (such as date, author, file type, when uploaded etc). I don't download all the documents in one hit, just the data relating to them. The documents are stored on the server as files and downloaded only when they are needed.

    I could (and did until recently) just load the data from the database on startup. However, this obviously gets progressively slower as the number of records increases. It also struck me as pointless downloading the data time after time when it had not changed or was little changed. So I thought it would be better to store the data locally. For a large dataset (including heavy-duty encryption of the local file) I get around a 20% time saving and a lot less traffic on the server. The actual documents, when downloaded, are cached (if that's what the user specifies) in a temp folder.

    It is all server side save for some caching. Whilst there isn't a web UI, it is not a path I want to go down. I have used them extensively in the past and I don't think it is appropriate for the current usage for various reasons. Quite happy with a desktop app using internet components and/or FireDAC. It works well and I am a long way down the road. That's as may be. However, I am 5 years down the road and the software works as I want it. I am thinking of changing the way I load and refresh data, not throwing out the baby with the bathwater!
  18. Mark Williams

    Best way to refresh static data

    I understand that, but I am keen to speed up the user experience at the client end and reduce traffic at the server end, and doing it by way of a local file does that. I will now demonstrate my ignorance of REST services: I don't understand how it will result in faster loading of the same data and less bandwidth.
  19. Mark Williams

    Best way to refresh static data

    Noted. I'm just working in PostgreSQL at the moment and trying to finish an app. It will be offered with PostgreSQL initially, and I intend to offer connectors for other databases when requested. If at that time I experience any issues, I may then opt for a REST server. Yes, it's there for PostgreSQL also: you use the PGAdvanced parameter of the connection, which gives you access to PostgreSQL's database connection control functions. They don't have to be. But with large amounts of data I find it is much faster to load the data from a local file and refresh it from the server in the background.
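    A sketch of the PGAdvanced mechanism mentioned here, which passes keyword=value pairs through to libpq. The database name, credentials and keepalive values are illustrative assumptions, not the poster's settings:

```delphi
uses
  FireDAC.Comp.Client;

procedure ConfigurePGConnection(Conn: TFDConnection);
begin
  Conn.Params.DriverID := 'PG';
  Conn.Params.Database := 'litigation_docs';  // hypothetical database name
  Conn.Params.UserName := 'app_user';         // hypothetical credentials
  // PGAdvanced forwards extra libpq connection keywords, e.g. TCP keepalive tuning:
  Conn.Params.Add('PGAdvanced=keepalives=1;keepalives_idle=30');
end;
```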
  20. Mark Williams

    Best way to refresh static data

    I've not really looked into the possibility of using REST. If I'm not mistaken, I would need the Enterprise edition of Delphi rather than the mere Professional to implement REST services. Not a deal breaker in itself, but I am not convinced that I need to go down that route (time is a major constraint for me at the moment). FireDAC seems to make it fairly simple to change horses between different databases. Isn't it then a question of ensuring your queries etc are compliant SQL and, if not, adapting them as necessary for each database you support?
  21. Mark Williams

    Best way to refresh static data

    I'm not sure how to measure how much time it takes to retrieve from the server versus how much to load locally. However, I know that loading from a static file is significantly faster than loading from the server, so I'm pretty sure the bottleneck lies with retrieval time. I'll have a look at compression; I haven't used it so far and didn't know FireDAC supported it, although my principal goal is to minimise resources server side. Noted. I will try to do it via temp tables.
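    Measuring the two paths is straightforward with TStopwatch. A sketch, where LoadFromServer and LoadFromLocalFile are hypothetical stand-ins for the poster's routines:

```delphi
uses
  System.SysUtils, System.Diagnostics;

procedure TimeLoads;
var
  SW: TStopwatch;
begin
  SW := TStopwatch.StartNew;
  LoadFromServer;              // hypothetical: full DOCS_TABLE query
  SW.Stop;
  Writeln(Format('Server load: %d ms', [SW.ElapsedMilliseconds]));

  SW := TStopwatch.StartNew;
  LoadFromLocalFile;           // hypothetical: decrypt and load the local snapshot
  SW.Stop;
  Writeln(Format('Local load: %d ms', [SW.ElapsedMilliseconds]));
end;
```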
  22. Mark Williams

    Best way to refresh static data

    It's a system for handling documents in litigation matters, so the number of documents can range from a few hundred to hundreds of thousands. Whilst a case is in its early stages, the documents will (depending on the nature of the case) be loaded in large tranches. When it gets closer to its conclusion, much fewer new documents are added to the case. I guess I could count how much new data there is and then decide whether to use the WHERE statement or pass in an array of ids. Not sure, but I will look into it. However, I need a solution that will work for PostgreSQL, Oracle and MS SQL at the very least, so it has to be compatible across the board, although I suppose I could write a different routine for each server if needs be. As for the IN clause, if it is particularly large relative to the total number of records, I could just load the lot via a thread and then dump the static data when I have the new data. If it's not too large relative to the total number of records, but still relatively large for an IN clause, I guess I could submit it in batches.
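    Submitting the IN clause in batches, as suggested at the end here, might look like the following sketch. BATCH_SIZE, the query text and the merge step are assumptions; the ids come from the locally saved data, so building the list by string concatenation is safe from injection:

```delphi
uses
  System.SysUtils, System.Math, FireDAC.Comp.Client;

const
  BATCH_SIZE = 1000; // keep each IN list modest; tune per database

procedure RefreshInBatches(Qry: TFDQuery; const Ids: TArray<Integer>;
  const StoredTimestamp: TDateTime);
var
  i, j: Integer;
  InList: string;
begin
  i := 0;
  while i < Length(Ids) do
  begin
    // Build a comma-separated id list for this batch only.
    InList := '';
    for j := i to Min(i + BATCH_SIZE - 1, High(Ids)) do
    begin
      if InList <> '' then
        InList := InList + ',';
      InList := InList + Ids[j].ToString;
    end;

    Qry.SQL.Text :=
      'SELECT * FROM DOCS_TABLE ' +
      'WHERE id IN (' + InList + ') AND LAST_UPDATED > :STAMP';
    Qry.ParamByName('STAMP').AsDateTime := StoredTimestamp;
    Qry.Open;
    // ... merge this batch into the in-memory dataset here ...
    Qry.Close;

    Inc(i, BATCH_SIZE);
  end;
end;
```

    The batch size that works best varies by server, which also keeps this portable across PostgreSQL, Oracle and MS SQL: only the constant needs tuning, not the shape of the query.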
  23. Mark Williams

    Best way to refresh static data

    Unfortunately, I need to be able to offer a range of databases.
  24. Mark Williams

    Thinfinity VirtualUI - cloud-based conversion

    Of course, although an OCX usually has a visual element to it rather than just being a library of procedures. I thought it might have a problem with OCXs because of the visual element. Always best to check, I feel.
  25. Mark Williams

    Best way to refresh static data

    It can be a pretty massive table, which can take a long while to load. It obviously loads much quicker from a local file (even though encrypted). I am keen to avoid passing unnecessary traffic to the server, hence the suggestion of a local file updated with only the data necessary. There could be just a handful of records to update; I really don't want to download the lot again just for that. I think this is a sensible way to approach it. I'm just not sure that the way I propose to handle it is the most sensible. But I am keen to avoid a full reload of the whole table.