So I have this setup:
- a Blazor web app project
- .NET 8
- hosted in Azure App Service on the Standard S1 tier
- a storage account with a file share that the app uses to store documents
What am I trying to do?
I am trying to upload files from the Blazor app to the file share. (I am migrating from Web Forms to Blazor, and the file share is already in place, so I cannot use something else.)
Problem and what I tried so far
When I upload files from the application hosted in Azure, the upload takes over 70 seconds for a 15 MB PDF. This needs to be faster, because users will have a problem with it. What could be the cause?
So far I have tried multiple solutions (with each of them the upload from my local machine is as fast as I would like, so I suppose the problem somehow comes from the Azure App Service).
For each of the solutions below I also tried different App Service plans: Standard S2 and all the Premium ones.
Solution 1: I mounted the file share in the App Service. The upload works, but it takes a very long time.
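For reference, the mounted-share variant is essentially a plain sequential stream copy to the mount path. Here is a simplified, self-contained sketch of what it does (the mount directory is an assumption stood in by a temp folder, and a `MemoryStream` stands in for the browser file stream):

```csharp
using System;
using System.Diagnostics;
using System.IO;

// Hypothetical mount path: in App Service this would be the Azure Files mount
// configured under "Path mappings"; a temp directory stands in for it here.
string mountDir = Path.Combine(Path.GetTempPath(), "mounted-share-sketch");
Directory.CreateDirectory(mountDir);
string targetPath = Path.Combine(mountDir, "upload.pdf");

// Stand-in for fileUpload.File.OpenReadStream(...): a 15 MB dummy payload.
var payload = new byte[15 * 1024 * 1024];
using var source = new MemoryStream(payload);

var sw = Stopwatch.StartNew();
using (var target = File.Create(targetPath))
{
    source.CopyTo(target, 81920); // the same sequential copy the mounted upload performs
}
sw.Stop();

Console.WriteLine($"{new FileInfo(targetPath).Length} bytes in {sw.ElapsedMilliseconds} ms");
```

Locally this copy completes almost instantly; only in App Service does it take over a minute.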
Solution 2: I tried using Azure.Storage.Files.Shares to upload directly to the file share.

```csharp
using var fileStream = fileUpload.File.OpenReadStream(long.MaxValue); // raise the max size to handle larger files if needed
long fileSize = fileStream.Length;
const int chunkSize = 4 * 1024 * 1024; // 4 MB per chunk

await fileClient.CreateAsync(fileSize);

byte[] buffer = new byte[chunkSize];
long offset = 0;

// Store the tasks for parallel execution
var uploadTasks = new List<Task>();

while (offset < fileSize)
{
    int bytesRead = await fileStream.ReadAsync(buffer, 0, chunkSize);
    if (bytesRead == 0)
    {
        break;
    }

    // Copy the data so each task owns its own buffer (avoids threading issues)
    var chunkData = new byte[bytesRead];
    Array.Copy(buffer, 0, chunkData, 0, bytesRead);
    long currentOffset = offset;

    // Create and store the upload task
    var task = Task.Run(async () =>
    {
        using (var memoryStream = new MemoryStream(chunkData))
        {
            await fileClient.UploadRangeAsync(new HttpRange(currentOffset, bytesRead), memoryStream);
        }
        //progressCallback(currentOffset + bytesRead, fileSize);
    });
    uploadTasks.Add(task);

    offset += bytesRead; // Move the offset forward
}

// Wait for all upload tasks to finish
await Task.WhenAll(uploadTasks);
```

Solution 3: I tried Microsoft.Azure.Storage.DataMovement, which I read is designed for blobs and large files. I managed to point it at my file share and upload the file, but I hit the same problem.
```csharp
CloudFile fileReference = uploadFileLocal.GetFileReference(fileUpload.File.Name.Trim());

var transferContext = new SingleTransferContext
{
    ProgressHandler = new Progress<TransferStatus>(progress =>
    {
        progressCallback(progress.BytesTransferred, fileUpload.File.Size);
    })
};

// Perform the upload using DMLib
// await TransferManager.UploadAsync(tempFilePath, fileReference, null, transferContext);
using (var stream = fileUpload.File.OpenReadStream(104857600))
{
    // Perform the upload using DMLib
    await TransferManager.UploadAsync(stream, fileReference, null, transferContext);
}
```

I also tried saving the file to disk first, and there I noticed that the write to disk took over 70 seconds, while the subsequent upload to the file share took about 2 seconds. In that case I passed the file path when uploading:

```csharp
await TransferManager.UploadAsync(tempFilePath, fileReference, null, transferContext);
```

Nothing is working. What can I do?
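For completeness, this is roughly how I separated the two legs of the disk-first variant when timing it. It is a simplified, self-contained sketch: a `MemoryStream` stands in for the browser file stream and temp files stand in for the server disk and the file share target, so only the structure of the measurement is shown, not the real Azure calls:

```csharp
using System;
using System.Diagnostics;
using System.IO;

var payload = new byte[15 * 1024 * 1024];   // stand-in for the 15 MB PDF
string tempPath = Path.GetTempFileName();   // server-side temp file
string sharePath = Path.GetTempFileName();  // stand-in for the file share target

// Leg 1: browser stream -> server disk (the copy that took 70+ seconds in App Service).
// A MemoryStream stands in for fileUpload.File.OpenReadStream(...).
using var browserStream = new MemoryStream(payload);
var leg1 = Stopwatch.StartNew();
using (var temp = File.Create(tempPath))
{
    browserStream.CopyTo(temp);
}
leg1.Stop();

// Leg 2: server disk -> file share (the copy that took ~2 seconds).
var leg2 = Stopwatch.StartNew();
File.Copy(tempPath, sharePath, overwrite: true);
leg2.Stop();

Console.WriteLine($"leg 1: {leg1.ElapsedMilliseconds} ms, leg 2: {leg2.ElapsedMilliseconds} ms");
```

Measured this way, it is leg 1 (getting the bytes from the browser onto the server) that accounts for almost all of the 70+ seconds.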