Mark Williams
Everything posted by Mark Williams
-
The default MaxConnections for TISAPIApplication is 32. I have no idea how many people may be calling my DLL at any given time, and 32 seems to me too low a number, so I have set it to 200. I cannot find any advice on best policy for setting this value. The help file simply says to test the ActiveCount and InactiveCount and adjust accordingly, which is going to be a little difficult to test. I don't want exceptions flying off because MaxConnections is exceeded, and likewise I don't want to degrade performance. If, for example, I set MaxConnections to 1000, I assume that will not of itself degrade performance and that there will only be issues when actual connections get that high. If that is right, then it seems to me that slightly degraded performance is preferable to exceptions from exceeding max connections. Or am I missing something?
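For context, this is the part of a WebBroker ISAPI .dpr where the value gets set (a minimal sketch, assuming a standard WebBroker project; the 200 is just the figure discussed above, not a recommendation):

```pascal
// Sketch of the relevant section of an ISAPI project file.
// ActiveCount/InactiveCount can be logged at runtime from the
// web module to see how close actual usage gets to the cap.
begin
  CoInitFlags := COINIT_MULTITHREADED;
  Application.Initialize;
  Application.WebModuleClass := WebModuleClass;
  Application.MaxConnections := 200; // default is 32
  Application.Run;
end.
```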
-
Thanks. I'll look into this at some point. However, my ISAPI dll isn't really designed to do heavy lifting. Just user validation, configuration settings and downloading of smallish files. There is occasionally a need to upload/download largish files and this is something I am planning to move to FTP or maybe there is a better/faster solution?
-
I wouldn't have thought it was possible to do this, but when my kids sign in to my Netflix account from various locations I get an email from Netflix to let me know what country and region they have signed in from. Does anyone have any idea how Netflix gets this info?
-
Detect user location from ip address
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
Must be a European thing. As of last Friday it is now unlawful for me to get such jokes! -
Detect user location from ip address
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
I'm not sure what you're referring to. Can't see anything on the page/website which helps. -
Detect user location from ip address
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
I found an example of how to do it (although not tried it) here: https://www.example-code.com/delphiDll/geolocation_ip_address.asp -
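Beyond that Chilkat example, the same idea can be sketched with only stock Delphi units: query a public IP-geolocation HTTP service and parse the JSON. The service URL (ip-api.com) and its JSON field names are assumptions on my part, not something from that page, and a real app would need to check the service's terms and rate limits:

```pascal
uses
  System.SysUtils, System.JSON, System.Net.HttpClient;

// Hypothetical helper: look up country/region for an IP address via a
// public geolocation web service (ip-api.com is an assumed example).
function LookupCountry(const IP: string): string;
var
  Http: THTTPClient;
  Json: TJSONValue;
begin
  Result := '';
  Http := THTTPClient.Create;
  try
    Json := TJSONObject.ParseJSONValue(
      Http.Get('http://ip-api.com/json/' + IP).ContentAsString(TEncoding.UTF8));
    if Json <> nil then
    try
      Result := Json.GetValue<string>('country') + ' / ' +
                Json.GetValue<string>('regionName');
    finally
      Json.Free;
    end;
  finally
    Http.Free;
  end;
end;
```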
I'll give that a try and see how it pans out. I thought this was more or less what IIS does with the DLL.
-
Detect if image is color, greyscale or black white
Mark Williams posted a topic in Algorithms, Data Structures and Class Design
Modern scanners on auto color detection mode seem to be very efficient at detecting whether a document is a text document and whether to scan it as monochrome, even though the image contains some degree of color. It has to be more than a pixel-by-pixel analysis to gauge the extent of the color in an image. Does anyone have any idea how this is done, or even better can they point me to some example code? I have searched on Google and the only example I can find is written in Python, and I have no idea how to adapt it for Delphi (Python example). -
Detect if image is color, greyscale or black white
Mark Williams replied to Mark Williams's topic in Algorithms, Data Structures and Class Design
Ok. Thanks. I'll have a look at that and report back if I can improve on what I have already got. -
Detect if image is color, greyscale or black white
Mark Williams replied to Mark Williams's topic in Algorithms, Data Structures and Class Design
Scanlines is fast enough for my purposes even when it scans every pixel. I would be reluctant to rely on random sampling. -
Detect if image is color, greyscale or black white
Mark Williams replied to Mark Williams's topic in Algorithms, Data Structures and Class Design
Thanks. I've sort of come up with something along those lines using Scanlines. It seems to be tolerably effective and sufficiently fast. -
I have written an ISAPI DLL (Delphi 10.3) for IIS to carry out various functions (download/upload of documents, querying tables etc). It was working fine until I started firing more than one request at a time; then the DLL simply crashed and I had to restart the WWW service.

The DLL is being used to request smallish files (20-30KB) over a local network. These are being requested one at a time. If I make these requests at the same time as one or two HEAD requests, it all goes kaput. I thought it might be the TWebRequestHandler's MaxConnections, so I set that to 200. It made no difference.

After a bit of digging around I noticed that the IIS application pool I created for the DLL has a "Maximum Worker Processes" property, which was set to 1. I set this to 20, and now the DLL seems able to handle more than one concurrent request. However, I noticed from my DLL's log file that it was now creating a new instance of the DLL for every request. From what I have found on the web, I have come to the conclusion that Maximum Worker Processes is probably not what I need: it seems to be designed for long-running processes, which is not what my DLL is for. If not Maximum Worker Processes, and increasing MaxConnections doesn't work, how do I get my DLL to deal with concurrent requests?
-
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
Solved at last. The DLL was hanging on the call to AddToLog in CloseServer, because the log thread was being terminated at the same time as the call was made. In the meantime, I had added a TFDMoniFlatFileClientLink component to provide a FireDAC trace to see what was happening with the FDManager, as I thought that was causing the problem. Once I'd fixed the problem with the thread log, I still had issues. By a process of starting from scratch (as you suggested) and adding units/components as I went along, I eventually managed to narrow it down to the flat-file component. Once I removed it, all was well. Must be a threading problem or some such. Many thanks for your help -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
I've been through all my code. As far as I can see, I am destroying everything I create in an action event within the same action event. I have tried calling the DefaultHandlerAction and nothing else:

procedure TWebModule1.WebModule1DefaultHandlerAction(Sender: TObject;
  Request: TWebRequest; Response: TWebResponse; var Handled: Boolean);
begin
  Handled := True;
  try
    SetResponse(Response, 400, 'Bad Request',
      'Default Handler - if you are getting this message it either means ' +
      'that you are providing the wrong PathInfo or the action has been deleted.');
  except
  end;
end;

And SetResponse:

procedure SetResponse(var Response: TWebResponse; code: Integer;
  statusText: string; debug: string = '');
begin
  Response.StatusCode := code;
  Response.ReasonString := code.ToString + ' ' + statusText;
  if debug <> '' then
    AddToLog('Response set: ' + Response.StatusCode.ToString + ' ' + statusText +
      ' Debug Info: ' + debug, leMinorError)
  else
    AddToLog('Response set: ' + Response.StatusCode.ToString + ' ' + statusText,
      leDevelopment);
end;

It still hangs on shutdown. So I tried removing my calls to StartServer and CloseServer, and edited my DefaultHandlerAction so it didn't call SetResponse; it just sets the StatusCode to 400. Still a huge delay on shutdown. My project file now looks like this:

library MWCWebServer;

uses
  Winapi.ActiveX,
  System.Win.ComObj,
  System.SysUtils,
  Web.WebBroker,
  Web.Win.ISAPIApp,
  Web.Win.ISAPIThreadPool,
  Winapi.Isapi2,
  Winapi.Windows,
  System.SyncObjs,
  System.Classes,
  WebModule in 'WebModule.pas' {WebModule1: TWebModule};

{$R *.res}

exports
  GetExtensionVersion,
  HttpExtensionProc,
  TerminateExtension;

begin
  CoInitFlags := COINIT_MULTITHREADED;
  Application.Initialize;
  Application.WebModuleClass := WebModuleClass;
  Application.MaxConnections := 200; // MaxConnections = 32 by default
  IsMultiThread := True;
  Application.Run;
end.

I'm not creating anything over and above the web application itself, and still it won't shut down properly. -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
Yes. I'm sure that freeing of FDManager is not the issue. Yes. Will do. -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
See post two up. I don't think the problem is anything to do with FDManager. The reason it was hanging at the point of freeing was due to the call to the threadlog and my freeing of the threadlog before the threaded call had been made. My CloseServer now executes fully and appears to free all resources its requested to free. There has to be something else I am missing. Something I am creating somewhere and not freeing. No idea what though! Checking now. -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
Terminate WWW via services. I can see my logfile is updated almost immediately with message "Server Closed" and then the progress bar from Services takes well over a minute from this point to complete the shutdown. -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
Further process of elimination. It was nothing to do with the call to free FDManager. It was the AddToLog calls that were causing the problem. These are threaded using FThreadFileLog, which I was freeing in the final part of CloseServer before the threads queued in CloseServer had terminated. I have changed these calls so that they are no longer threaded, and as a double precaution I have added a check in FThreadFileLog's destructor to make sure no threads are running before it shuts down. My CloseServer procedure now frees all the resources as expected, but the application pool is still taking an age to close down. There must be some resource I am creating and not freeing. I will check all my code, and if I can pinpoint what it is I will update. -
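The double precaution mentioned above can be sketched like this (my actual FThreadFileLog internals aren't shown in this thread, so the counter and event names here are made up): count the in-flight log writes and make the destructor wait for the count to reach zero.

```pascal
uses
  System.SysUtils, System.Classes, System.SyncObjs;

type
  // Hypothetical shell of a threaded file log: FPending counts queued
  // writes, FIdle is signalled whenever the count drops back to zero.
  TThreadFileLog = class
  private
    FPending: Integer;
    FIdle: TEvent;
  public
    constructor Create;
    destructor Destroy; override;
    procedure Log(const Msg: string);
  end;

constructor TThreadFileLog.Create;
begin
  inherited Create;
  FIdle := TEvent.Create(nil, True, True, ''); // manual-reset, initially set
end;

procedure TThreadFileLog.Log(const Msg: string);
begin
  TInterlocked.Increment(FPending);
  FIdle.ResetEvent;
  TThread.CreateAnonymousThread(
    procedure
    begin
      try
        // ... append Msg to the log file here ...
      finally
        if TInterlocked.Decrement(FPending) = 0 then
          FIdle.SetEvent;
      end;
    end).Start;
end;

destructor TThreadFileLog.Destroy;
begin
  FIdle.WaitFor(5000); // don't free while log threads are still running
  FIdle.Free;
  inherited;
end;
```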
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
A good question! There is no need. Removed. But that's not the cause of the problem on shutdown. Any ideas why it would be hanging at this point? -
ISAPI DLL concurrent requests
Mark Williams replied to Mark Williams's topic in Network, Cloud and Web
@Yaron It's taken me a while to revisit this, and as usual it has not disappointed: days thrown into the abyss trying to change the structure of my ISAPI DLL! I have implemented points 1 to 4 mentioned by Yaron. Point 5 is causing me a problem: the closing down of the application pool. I get a long delay and my closing section does not fire. My dpr now looks like this:

library MWCWebServer;

uses
  Winapi.ActiveX,
  System.Win.ComObj,
  System.SysUtils,
  Web.WebBroker,
  Web.Win.ISAPIApp,
  Web.Win.ISAPIThreadPool,
  Winapi.Isapi2,
  Winapi.Windows,
  System.SyncObjs,
  System.Classes,
  WebModule in 'WebModule.pas' {WebModule1: TWebModule};

{$R *.res}

function GetExtensionVersion(var Ver: THSE_VERSION_INFO): BOOL; stdcall;
begin
  Result := Web.Win.ISAPIApp.GetExtensionVersion(Ver);
  CriticalSection := TCriticalSection.Create;
  StartServer;
end;

{I have removed the call to TerminateVersion as it wasn't firing:
function TerminateVersion(var Ver: THSE_VERSION_INFO): BOOL; stdcall;
begin
  Result := Web.Win.ISAPIApp.GetExtensionVersion(Ver);
  CloseServer;
  CriticalSection.Free;
end;}

{I have added this procedure as shown in the post Yaron linked:}
procedure DoTerminate;
begin
  CloseServer;
  CriticalSection.Free;
end;

exports
  GetExtensionVersion,
  HttpExtensionProc,
  TerminateExtension;

begin
  CoInitFlags := COINIT_MULTITHREADED;
  Application.Initialize;
  Application.WebModuleClass := WebModuleClass;
  Application.MaxConnections := 200; // MaxConnections = 32 by default
  IsMultiThread := True;
  TISAPIApplication(Application).OnTerminate := DoTerminate;
  {Application.CacheConnections := True;} // not necessary as cached by default
  Application.Run;
end.
My call to CloseServer now looks as follows:

procedure CloseServer;
begin
  CriticalSection.Enter;
  try
    if Assigned(MimeTable) then
      try MimeTable.Free; except end;
    FDManager.Close;
    try FDManager.Free; except end;
    try AddToLog('Connection manager deactivated', leMajorEvents); except end;
    try AddToLog('Server Closed', leMajorEvents); except end;
    if Assigned(FThreadFileLog) then
      try FThreadFileLog.Free; except end;
  finally
    CriticalSection.Leave;
  end;
end;

CloseServer gets called on shutdown as expected. However, it hangs at "try FDManager.Free; except end;". It appears to execute the line of code above (i.e. closing FDManager), but it does not appear to execute the call to free FDManager, nor does it get caught by the try..except or the try..finally. It just hangs at the call to Free, and this is when the application pool stalls in its shutdown process. Does anyone have any ideas why? -
On initial setup I set up queries with SQL.Text and prepare them ready for use and reuse. Some of these queries use params. On first usage the queries work as expected. When reused they are simply not fired at all. As an example:

with DAC.FDQueryDocCats do
begin
  ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
  Execute;
end;

works as expected on first use. When it is run a second time, using the debugger I can see that the correct CASEID value is being passed in and that the query appears to execute; however, it doesn't. The query dataset doesn't update. I have enabled tracing, and on checking the trace it is clear that the query does not run at all. I have tried EmptyDataSet and ClearDetails, but with the same effect. If, however, I do this:

with DAC.FDQueryDocCats do
begin
  SQL.Text := SQL.Text;
  ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
  Execute;
end;

all works well. However, I lose any advantage in having prepared my query in the first instance (although I have no idea just how great that advantage is). It would seem setting the SQL text clears data or resets flags that I am missing. I have Professional Delphi, not Enterprise, so I can't see the FireDAC code to work out what it is doing.
-
Sorry, it should have said Open, not Execute. I call the appropriate command in another script which handles various errors and posts details to a database. I didn't want to confuse by including all that code, but managed to confuse by not doing so! I was sure I had already tried Close and that it caused an AV. However, it works! Thanks.
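For anyone hitting the same thing, the pattern that worked here, sketched minimally (DAC.FDQueryDocCats and FProject are names from my code above): Close the dataset before re-binding the parameter and re-opening, so the prepared statement stays intact.

```pascal
// Re-running a prepared SELECT with a new parameter value:
// Close, rebind, Open. The prepared handle survives the Close.
with DAC.FDQueryDocCats do
begin
  Close;                                          // deactivate the dataset
  ParamByName('CASEID').AsInteger := FProject.CurrCaseID;
  Open;                                           // re-executes the prepared SELECT
end;
```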
-
I'm trying to work out the best way to refresh data loaded from a local file with data from the server. I'm using FireDAC with PostgreSQL; however, the solution needs to be a general one, as it will eventually need to work for SQL Server, Oracle etc.

I have a table (DOCS_TABLE) which can vary greatly in size. DOCS_TABLE has a timestamp field (LAST_UPDATED) which, as the name suggests, records when the data in the record last changed. When users open the app and haven't queried DOCS_TABLE previously, it is loaded via the server using a fairly complicated WHERE statement (COMPLICATED_WHERE_STATEMENT), which involves a number of joins to establish which records the user is permitted to access. When the user closes the app, the data from DOCS_TABLE is stored locally along with a timestamp recording when the data was last refreshed (STORED_TIMESTAMP).

Next time the app opens, it loads the data from the locally stored file. It then needs to ensure the user is working with up-to-date data. At the moment I run a refresh query:

SELECT [fields] FROM DOCS_TABLE
WHERE LAST_UPDATED > [STORED_TIMESTAMP] AND [COMPLICATED_WHERE_STATEMENT]

and use the resulting data to update the in-memory dataset holding DOCS_TABLE. This works, although it doesn't deal with records that were available at the time of last saving locally and have since been deleted or had access revoked. As such, within the app, I run a check to make sure the user still has access to any record before trying to do anything with it, but it's not a terribly elegant solution. It would be better if such items were removed soon after loading the locally saved data.

I have some thoughts on how to deal with this, which are below. However, I am concerned I may be overcomplicating things and that there may be much simpler solutions:

1. Load the data from the local file.
2. Run a thread for the following:
   - Run a query (ID_QUERY) to ascertain which rows are now available to the user: SELECT id FROM DOCS_TABLE WHERE [COMPLICATED_WHERE_STATEMENT]
   - Check the locally saved data against the result of this query, and remove any rows no longer available to the user. Build a list of ids from the locally saved data (EXISTING_ID_ARRAY).
   - Check the locally saved data against the results from ID_QUERY for new records to be added, and build a list of their ids (NEW_ID_ARRAY).
   - Run the refresh query using the arrays: SELECT [fields] FROM DOCS_TABLE WHERE (id IN ([NEW_ID_ARRAY])) OR (id IN ([EXISTING_ID_ARRAY]) AND LAST_UPDATED > [STORED_TIMESTAMP])

Subject to my whole theory being cock-eyed, I am pretty sure NEW_ID_ARRAY is the way to go. The part that concerns me is EXISTING_ID_ARRAY. Whilst it will cut out the use of the COMPLICATED_WHERE_STATEMENT and enable the query to focus explicitly on a clearly identified group of records, I would think the size of the array could become a problem. Is there a law of diminishing returns with an IN clause? For example, if there were 1M records in the table and 20 items in the array, I suppose using EXISTING_ID_ARRAY must be quicker than using COMPLICATED_WHERE_STATEMENT. But what if the array contained 800K ids? I guess it has to be significantly less efficient at that point, and more efficient to use COMPLICATED_WHERE_STATEMENT.

I appreciate that, without full details of the structure of DOCS_TABLE and the joined tables, the data being retrieved, and the full nature of the COMPLICATED_WHERE_STATEMENT, I may be asking for a comparison between apples and pears. What I am really interested in is whether my logic set out above is sound or idiotic, and any suggestions on how best to achieve what I am trying to achieve.
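The two-query plan above, sketched in FireDAC terms (everything here is illustrative: the Conn parameter, the RefreshDocs name, and the textual IN lists; for very large id sets a temporary key table joined to DOCS_TABLE would likely beat an IN clause):

```pascal
uses
  System.SysUtils, System.Classes, FireDAC.Comp.Client;

// Illustrative sketch: fetch the currently visible ids, diff them against
// the locally loaded ids, then pull only new or changed rows.
procedure RefreshDocs(Conn: TFDConnection; LocalIds: TStringList;
  const StoredTimestamp: string);
var
  Q: TFDQuery;
  ServerIds, NewIds: TStringList;
  i: Integer;
begin
  ServerIds := TStringList.Create;
  NewIds := TStringList.Create;
  Q := TFDQuery.Create(nil);
  try
    Q.Connection := Conn;

    // ID_QUERY: which rows can the user see right now?
    Q.SQL.Text := 'SELECT id FROM DOCS_TABLE WHERE /* COMPLICATED_WHERE_STATEMENT */ 1=1';
    Q.Open;
    while not Q.Eof do
    begin
      ServerIds.Add(Q.Fields[0].AsString);
      Q.Next;
    end;
    Q.Close;

    // Drop locally held rows the user has lost; collect genuinely new ids.
    for i := LocalIds.Count - 1 downto 0 do
      if ServerIds.IndexOf(LocalIds[i]) < 0 then
        LocalIds.Delete(i); // also delete the row from the in-memory dataset
    for i := 0 to ServerIds.Count - 1 do
      if LocalIds.IndexOf(ServerIds[i]) < 0 then
        NewIds.Add(ServerIds[i]);

    // Refresh query: new rows, plus existing rows changed since last save.
    if (NewIds.Count > 0) or (LocalIds.Count > 0) then
    begin
      Q.SQL.Text :=
        'SELECT * FROM DOCS_TABLE WHERE id IN (' + NewIds.CommaText + ') ' +
        'OR (id IN (' + LocalIds.CommaText + ') ' +
        'AND LAST_UPDATED > ' + QuotedStr(StoredTimestamp) + ')';
      Q.Open;
      // ... merge Q's rows into the in-memory dataset here ...
    end;
  finally
    Q.Free;
    NewIds.Free;
    ServerIds.Free;
  end;
end;
```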
-
I'm not convinced I have such a major issue here, but only time will tell and I'll have to make a judgement call at that time. Give it a REST! Joking aside, I get the message. REST is the way to go. Although by the time I get round to looking at it, it will probably be out to pasture with the brontosauri.
-
I think there may be some confusion. DOCS_TABLE does not contain the actual documents; rather, it contains only data relating to the documents (such as date, author, file type, when uploaded etc.). I don't download all the documents in one hit, just the data relating to them. The documents are stored on the server as files and downloaded only when they are needed.

I could (and did until recently) just load the data from the database on startup. However, this obviously gets progressively slower as the number of records increases. It also struck me as pointless downloading the data time after time where it had not changed or was little changed. So I thought it would be better to store the data locally. For a large dataset (including heavy-duty encryption of the local file) I get around a 20% time saving and a lot less traffic on the server. The actual documents, when downloaded, are cached (if that's what the user specifies) in a temp folder. It is all server side, save for some caching.

Whilst there isn't a web UI, that is not a path I want to go down. I have used them extensively in the past, and I don't think one is appropriate for the current usage for various reasons. I'm quite happy with a desktop app using internet components and/or FireDAC. It works well and I am a long way down the road.

That's as may be. However, I am 5 years down the road and the software works as I want it. I am thinking of changing the way I load and refresh data, not of throwing the baby out with the bathwater!