Channel: Active questions tagged blazor - Stack Overflow

Why can't I send an audio stream from JavaScript via SignalR to a .NET Hub?


I’m trying to send an audio stream captured in JavaScript from a browser tab to a .NET SignalR Hub. My goal is to stream audio to the server in chunks, in real time, and broadcast it to all connected clients.

Here’s the setup:

SignalR Hub: I have a hub class SoundHub, mapped at /soundHub, with the following method:

public class SoundHub : Hub
{
    public async Task SendAudioChunk(byte[] audioChunk)
    {
        await Clients.All.SendAsync("ReceiveAudio", audioChunk);
    }
}

JavaScript Client: I’m capturing the system audio using navigator.mediaDevices.getDisplayMedia and sending audio chunks through SignalR. Here’s the JavaScript code:

window.startScreenSharing = async function () {
    try {
        const connection = new signalR.HubConnectionBuilder()
            .withUrl("/soundHub")
            .withAutomaticReconnect()
            .build();
        await connection.start();
        console.log("SignalR connection established.");

        const stream = await navigator.mediaDevices.getDisplayMedia({
            video: true,
            audio: true
        });

        const audioContext = new AudioContext();
        const audioSource = audioContext.createMediaStreamSource(stream);
        const processor = audioContext.createScriptProcessor(2048, 1, 1);
        audioSource.connect(processor);
        processor.connect(audioContext.destination);

        processor.onaudioprocess = async (event) => {
            const inputBuffer = event.inputBuffer.getChannelData(0);
            const intBuffer = new Int16Array(inputBuffer.length);
            for (let i = 0; i < inputBuffer.length; i++) {
                intBuffer[i] = Math.max(-32768, Math.min(32767, inputBuffer[i] * 32767));
            }
            const audioChunk = new Uint8Array(intBuffer.buffer);
            try {
                if (connection.state === signalR.HubConnectionState.Connected) {
                    await connection.invoke("SendAudioChunk", audioChunk);
                    console.log(`Audio chunk sent: ${audioChunk.length} bytes`);
                } else {
                    console.warn("SignalR connection is not in 'Connected' state.");
                }
            } catch (err) {
                console.error("Error sending audio chunk:", err);
            }
        };
    } catch (err) {
        console.error("Error setting up screen sharing and audio streaming:", err);
    }
};
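To make sure the conversion step itself is correct, I also pulled the float-to-16-bit-PCM logic out of onaudioprocess into a standalone helper so it can be tested in isolation (same logic as the inline loop above, just illustrative):

```javascript
// Web Audio samples from getChannelData() are floats in [-1, 1].
// Each sample is scaled to the signed 16-bit range and clamped,
// and the raw bytes of the resulting Int16Array are what get sent.
function floatTo16BitPcm(floatSamples) {
    const intBuffer = new Int16Array(floatSamples.length);
    for (let i = 0; i < floatSamples.length; i++) {
        intBuffer[i] = Math.max(-32768, Math.min(32767, floatSamples[i] * 32767));
    }
    return new Uint8Array(intBuffer.buffer);
}
```

Tested in isolation, this produces the byte payload I expect (two bytes per sample), so the problem does not seem to be in the conversion itself.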

Issue: If I change the Hub method to accept string instead of byte[] and send text data, it works fine. However, when I send audio chunks (Uint8Array), the method in the Hub is never triggered. I’ve added logging inside the method, but it seems the method is not even reached.

Troubleshooting: I verified that the JavaScript is capturing the audio correctly and sending the chunks. The SignalR connection is established, and the state is Connected when invoking SendAudioChunk.

No exceptions are thrown, but I receive an error in the browser console:

Error sending audio chunk: Error: Failed to invoke 'SendAudioChunk' due to an error on the server.

Question:
Is there something wrong with the way I’m formatting the audio data (Uint8Array -> byte[])?
Do I need to preprocess the audio differently before sending it?
Why does sending text work, but sending audio chunks does not?
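While searching, I came across the suggestion (which I have not been able to verify for my setup) that SignalR's JSON hub protocol represents a C# byte[] parameter as a Base64 string, whereas JSON.stringify on a Uint8Array produces an object like {"0":12,"1":34,...} that the server cannot bind to byte[]. One proposed workaround is to Base64-encode the chunk on the client and accept a string on the hub. A sketch of that approach:

```javascript
// Hypothetical workaround: encode the PCM bytes as Base64 so the JSON
// hub protocol sends a plain string instead of a Uint8Array.
function uint8ToBase64(bytes) {
    let binary = "";
    for (let i = 0; i < bytes.length; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    // btoa is a browser global (also available in Node 16+);
    // Buffer.from(bytes).toString("base64") is the Node-idiomatic form.
    return btoa(binary);
}

// Client side would then call:
//   await connection.invoke("SendAudioChunk", uint8ToBase64(audioChunk));
// and the hub method would change to (C#):
//   public async Task SendAudioChunk(string base64Chunk)
//   {
//       var audioChunk = Convert.FromBase64String(base64Chunk);
//       ...
//   }
```

Is this serialization mismatch actually the cause of the error, or is something else going on?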

Additional Info: SignalR is working correctly with text data. I’m using a Blazor Web App and the latest SignalR client.
