Posted to issues@guacamole.apache.org by "Nick Couchman (JIRA)" <ji...@apache.org> on 2019/02/12 14:57:00 UTC

[jira] [Updated] (GUACAMOLE-732) navigator.mediaDevices.getUserMedia() returns a promise

     [ https://issues.apache.org/jira/browse/GUACAMOLE-732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nick Couchman updated GUACAMOLE-732:
------------------------------------
    Priority: Minor  (was: Major)

> navigator.mediaDevices.getUserMedia() returns a promise
> --------------------------------------------------------
>
>                 Key: GUACAMOLE-732
>                 URL: https://issues.apache.org/jira/browse/GUACAMOLE-732
>             Project: Guacamole
>          Issue Type: Bug
>          Components: guacamole-common-js
>    Affects Versions: 1.0.0
>            Reporter: Fabian Spieß
>            Priority: Minor
>              Labels: newbie, patch
>
> navigator.mediaDevices.getUserMedia(), called at line 426 of AudioRecorder.js, no longer works with success/error callbacks; it returns a Promise instead. See [here|https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia].
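> In the promise-based form documented there, the call takes only the constraints object and the result is consumed via .then()/.catch() (generic sketch, not Guacamole code):
> {code:javascript}
> // Generic shape of the promise-based API (illustration only)
> navigator.mediaDevices.getUserMedia({ 'audio' : true })
>     .then(function streamReceived(stream) {
>         // ... use the MediaStream ...
>     })
>     .catch(function streamDenied(err) {
>         // ... handle denial or a missing input device ...
>     });
> {code}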
> Version 1.0.0 of AudioRecorder.js contains:
>  
> {code:javascript}
> var beginAudioCapture = function beginAudioCapture() {
>
>     // Attempt to retrieve an audio input stream from the browser
>     navigator.mediaDevices.getUserMedia({ 'audio' : true }, function streamReceived(stream) {
>
>         // Create processing node which receives appropriately-sized audio buffers
>         processor = context.createScriptProcessor(BUFFER_SIZE, format.channels, format.channels);
>         processor.connect(context.destination);
>
>         // Send blobs when audio buffers are received
>         processor.onaudioprocess = function processAudio(e) {
>             writer.sendData(toSampleArray(e.inputBuffer).buffer);
>         };
>
>         // Connect processing node to user's audio input source
>         source = context.createMediaStreamSource(stream);
>         source.connect(processor);
>
>         // Save stream for later cleanup
>         mediaStream = stream;
>
>     }, function streamDenied() {
>
>         // Simply end stream if audio access is not allowed
>         writer.sendEnd();
>
>         // Notify of closure
>         if (recorder.onerror)
>             recorder.onerror();
>
>     });
>
> };
> {code}
>
> A possible fix would be:
>  
> {code:javascript}
> var beginAudioCapture = function beginAudioCapture() {
>
>     // Attempt to retrieve an audio input stream from the browser
>     navigator.mediaDevices.getUserMedia({ 'audio' : true })
>     .then(stream => {
>
>         // Create processing node which receives appropriately-sized audio buffers
>         processor = context.createScriptProcessor(BUFFER_SIZE, format.channels, format.channels);
>         processor.connect(context.destination);
>
>         // Send blobs when audio buffers are received
>         processor.onaudioprocess = function processAudio(e) {
>             writer.sendData(toSampleArray(e.inputBuffer).buffer);
>         };
>
>         // Connect processing node to user's audio input source
>         source = context.createMediaStreamSource(stream);
>         source.connect(processor);
>
>         // Save stream for later cleanup
>         mediaStream = stream;
>
>     })
>     .catch(err => {
>
>         // Simply end stream if audio access is not allowed
>         writer.sendEnd();
>
>         // Notify of closure
>         if (recorder.onerror)
>             recorder.onerror();
>
>     });
>
> };
> {code}
>  
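> As a side note, navigator.mediaDevices is only defined in secure contexts, so a defensive guard along the following lines could precede the call (illustrative sketch only, not part of the proposed patch):
> {code:javascript}
> // Illustrative guard at the top of beginAudioCapture(): end the stream
> // cleanly if the promise-based API is unavailable (e.g. insecure context)
> if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
>     writer.sendEnd();
>     if (recorder.onerror)
>         recorder.onerror();
>     return;
> }
> {code}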



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)