ioan Posted March 3, 2023
I have a situation where one server is responsible for hosting several thousand files, and multiple other servers require access to these files. These servers may have hundreds of concurrent threads accessing, adding, or deleting files on the shared drive. I have been using a Windows shared directory to facilitate this access, but I have experienced issues where the shared directory becomes unresponsive, possibly due to too many simultaneous connections. What is the best solution for this scenario? Thanks!
Brian Evans Posted March 3, 2023
Thousands of files should be nothing. Investigate the cause of the unresponsiveness. It would not hurt to read up on performance tuning SMB file servers (e.g. Performance Tuning for SMB File Servers | Microsoft Learn) as well as general server tuning. One common problem is a large number of files with similar names in a single directory while 8.3 name generation is still enabled; turning off the generation of 8.3 filenames can have a dramatic effect in those cases.
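To make the 8.3 point concrete, here is a minimal sketch (Python on the file server, run from an elevated prompt) that checks the current 8.3 name generation setting with fsutil and disables it for one volume. The drive letter is an assumption; point it at the volume that hosts the share.

```python
# Minimal sketch: check and disable 8.3 short-name generation on a volume.
# Assumptions: Windows file server, Python 3, elevated prompt.
# The volume letter below (D:) is only an example.
import subprocess

VOLUME = "D:"  # assumed data volume hosting the shared directory

def run(cmd):
    """Run a command and return its combined stdout/stderr as text."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return (result.stdout + result.stderr).strip()

# Show the current 8.3 name generation setting for the volume.
print(run(["fsutil", "8dot3name", "query", VOLUME]))

# Disable 8.3 name creation for this volume (1 = disabled).
# Note: this only affects files created from now on; existing short names
# remain until removed (fsutil 8dot3name strip can do that, but review its
# report first, since some legacy applications store short paths).
print(run(["fsutil", "8dot3name", "set", VOLUME, "1"]))
```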
FPiette Posted March 3, 2023
1 hour ago, ioan said: "What is the best solution for this scenario?"
A network share is the way to go; you simply have to design the file server correctly. Use a Windows SERVER operating system with a lot of RAM (at least 64 GB, preferably 128 GB; RAM is more important than CPU power in this role). Also make sure the network is not the bottleneck: use a 10 Gbit Ethernet network with all devices properly sized, and keep in mind that not all network interface cards are created equal, so pay attention to their performance. Use fast disks such as 15,000 rpm SAS drives or fast SSDs.
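One way to check whether the network is the bottleneck is to time a large sequential read from the share and compare the result with what the link should deliver (a single 10 Gbit stream works out to something on the order of 1 GB/s in practice). The sketch below is that kind of rough check; the UNC path is a placeholder and it assumes a test file of a few gigabytes already exists on the share.

```python
# Rough throughput check: time a large sequential read from the share.
# Assumptions: placeholder UNC path; a multi-GB test file already exists there.
# Use a file that was not just written by this client, otherwise the local
# cache will inflate the number.
import time

TEST_FILE = r"\\fileserver\share\throughput_test.bin"  # placeholder path
CHUNK = 4 * 1024 * 1024  # 4 MiB read chunks

total = 0
start = time.perf_counter()
with open(TEST_FILE, "rb", buffering=0) as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.perf_counter() - start

mib = total / (1024 * 1024)
print(f"Read {mib:.0f} MiB in {elapsed:.1f} s -> {mib / elapsed:.0f} MiB/s")
```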
Attila Kovacs Posted March 3, 2023
1 hour ago, ioan said: "I have experienced issues where the shared directory becomes unresponsive"
Perhaps the write cache is full. Turn it off manually, measure the write speed, and calculate whether the disks can keep up with your workload.
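A quick way to follow that suggestion is to measure the sustained write speed to the share directly, once with the cache on and once with it off, and compare the two numbers against your workload. The sketch below is a minimal version of such a measurement; the UNC path and test size are placeholders.

```python
# Minimal write-speed measurement against the share.
# The UNC path and test size are placeholders; adjust them for your setup.
import os
import time

TEST_FILE = r"\\fileserver\share\write_test.bin"  # placeholder path
CHUNK = 4 * 1024 * 1024          # 4 MiB per write
TOTAL = 2 * 1024 * 1024 * 1024   # 2 GiB total
buf = os.urandom(CHUNK)

start = time.perf_counter()
with open(TEST_FILE, "wb") as f:
    written = 0
    while written < TOTAL:
        f.write(buf)
        written += len(buf)
    f.flush()
    os.fsync(f.fileno())  # force the data out so caching does not skew the result
elapsed = time.perf_counter() - start

mib = written / (1024 * 1024)
print(f"Wrote {mib:.0f} MiB in {elapsed:.1f} s -> {mib / elapsed:.0f} MiB/s")
os.remove(TEST_FILE)  # clean up the test file
```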