Closed
Bug 910959
Opened 11 years ago
Closed 9 years ago
Need to expose Audio capture latency API in nsAudioManager
Categories
(Core :: DOM: Device Interfaces, defect)
People
(Reporter: jesup, Assigned: mwu)
References
(Blocks 1 open bug)
Details
(Whiteboard: [getusermedia][webrtc] [FT: Media Recording, Sprint:4][ft:webrtc])
Attachments
(1 file, 1 obsolete file)
(deleted), patch
To resolve bug 899204, we need access to the expected capture latency for a given frequency/etc. (typically this is obtained from AudioRecord, which is not a public API at the C++ level, only the Java level). Michael Wu suggested we expose this via an API in nsAudioManager or someplace equivalent. Overall it should be pretty simple: just proxying the call to get the latency to Java, or querying the latencies for the different sampling frequencies/etc. ahead of time and caching the data (I'm not 100% sure that works in all cases; I don't know enough about the underlying model). The examples I've seen just calculate min_buffer_size_in_samples(freq) * 1000 / freq as the latency (in ms).
We're really hoping to get this into 26.
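For illustration, a minimal C++ sketch of that estimate (EstimateCaptureLatencyMs is a hypothetical helper; the minimum buffer size would have to come from the platform, e.g. AudioRecord's minimum buffer size for the given rate, and is not queried here):

#include <stdint.h>

// Hypothetical helper: latency_ms = min_buffer_size_in_samples(freq) * 1000 / freq.
// minBufferSamples must be supplied by the platform; it is not queried here.
static uint32_t EstimateCaptureLatencyMs(uint32_t minBufferSamples, uint32_t sampleRateHz)
{
  // e.g. 480 samples at 48000 Hz -> 10 ms
  return minBufferSamples * 1000 / sampleRateHz;
}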
Reporter
Updated•11 years ago
Whiteboard: [getusermedia][webrtc]
Assignee
Updated•11 years ago
blocking-b2g: --- → koi?
Updated•11 years ago
Whiteboard: [getusermedia][webrtc] → [getusermedia][webrtc] [FT: Media Recording, Sprint:4]
Comment 1•11 years ago
Is WebRTC already out of V1.2? If yes, it is not necessary for this to be koi+.
Comment 3•11 years ago
(In reply to Randell Jesup [:jesup] from comment #0)
> To resolve bug 899204, we need access to the expected capture latency for a
> given frequency/etc (typically this is obtained from AudioRecord, which is
> not a public API at the C++ level, only the Java level).
frameworks/base/include/media/AudioRecord.h
Please refer to the header above. There is an AudioRecord class at the C++ layer, so you can create an AudioRecord instance with your frequency, channel count, etc., and then call latency() to get the time in milliseconds, which includes the buffer latency as well as the HW driver's latency (see the sketch after the quote below).
Is this what you want?
> Michael Wu
> suggested we expose this via an api in nsAudioManager or someplace
> equivalent. Overall should be pretty simple; just proxying the call to get
> the latency to Java, or querying the latencies for different sampling
> frequencies/etc ahead of time and caching the data (not 100% sure that works
> in all cases; I don't know enough about the underlying model). The examples
> I've seen just calculate min_buffer_size_in_samples(freq)*1000/freq as the
> latency (in ms)
>
> We're really hoping to get it into 26
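For concreteness, a rough sketch of what comment 3 describes, assuming the ICS/JB-era C++ AudioRecord interface; the constructor arguments shown are illustrative only, and the exact signature differs between Android versions:

#include <media/AudioRecord.h>

static uint32_t QueryCaptureLatencyMs(uint32_t sampleRateHz)
{
  // Open a capture stream with the desired parameters (illustrative values).
  android::AudioRecord record(AUDIO_SOURCE_MIC,         // input source
                              sampleRateHz,             // sample rate in Hz
                              AUDIO_FORMAT_PCM_16_BIT,  // sample format
                              AUDIO_CHANNEL_IN_MONO);   // channel mask
  if (record.initCheck() != android::NO_ERROR) {
    return 0; // could not open the input; no latency information available
  }
  // Per comment 3, latency() reports the buffer latency plus the HW driver's
  // latency, in milliseconds.
  return record.latency();
}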
Assignee
Updated•11 years ago
Assignee: nobody → mwu
Assignee
Comment 4•11 years ago
How about something like this?
This gets you the latency reported by the audio HAL.
However, it works best on Jelly Bean-based devices; ICS is probably hopeless.
Attachment #804664 - Flags: feedback?(rjesup)
Reporter
Comment 5•11 years ago
Comment on attachment 804664 [details] [diff] [review]
Add GetOutputLatency
Review of attachment 804664 [details] [diff] [review]:
-----------------------------------------------------------------
f+; I assume the pref change is a typo
::: b2g/app/b2g.js
@@ +73,4 @@
> pref("mozilla.widget.use-buffer-pixmap", true);
> pref("mozilla.widget.disable-native-theme", true);
> pref("layout.reflow.synthMouseMove", false);
> +pref("layers.force-tiles", false);
?? Typo?
::: dom/system/gonk/AudioManager.cpp
@@ +671,5 @@
> +AudioManager::GetOutputLatency(uint32_t sampleRate, uint32_t format, uint32_t channelMask, uint32_t* latency)
> +{
> + status_t status;
> +#if ANDROID_VERSION >= 18
> + audio_io_handle_t output = AudioSystem::getOutput(AUDIO_STREAM_DEFAULT, sampleRate, format, channelMask, AUDIO_OUTPUT_FLAG_FAST);
split line
::: dom/system/gonk/nsIAudioManager.idl
@@ +54,5 @@
> void setAudioChannelVolume(in long channel, in long index);
> long getAudioChannelVolume(in long channel);
> long getMaxAudioChannelVolume(in long channel);
> +
> + unsigned long getOutputLatency(in unsigned long sampleRate, in unsigned long format, in unsigned long channelMask);
split line
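For reference, the two "split line" nits just ask for wrapping the over-long lines; a whitespace-only rewrap of the AudioManager.cpp call quoted above would be, e.g.:

audio_io_handle_t output =
  AudioSystem::getOutput(AUDIO_STREAM_DEFAULT, sampleRate, format,
                         channelMask, AUDIO_OUTPUT_FLAG_FAST);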
Attachment #804664 - Flags: feedback?(rjesup) → feedback+
Assignee
Comment 6•11 years ago
(In reply to Randell Jesup [:jesup] from comment #5)
> ::: b2g/app/b2g.js
> @@ +73,4 @@
> > pref("mozilla.widget.use-buffer-pixmap", true);
> > pref("mozilla.widget.disable-native-theme", true);
> > pref("layout.reflow.synthMouseMove", false);
> > +pref("layers.force-tiles", false);
>
> ?? Typo?
>
Oops. Forgot to strip that from the patch.
Assignee
Comment 7•11 years ago
Attachment #804664 - Attachment is obsolete: true
Assignee
Updated•11 years ago
Attachment #804697 - Flags: review?(mchen)
Comment 8•11 years ago
Comment on attachment 804697 [details] [diff] [review]
Add GetOutputLatency, v2
Review of attachment 804697 [details] [diff] [review]:
-----------------------------------------------------------------
::: dom/system/gonk/AudioManager.cpp
@@ +667,5 @@
> return NS_OK;
> }
>
> +nsresult
> +AudioManager::GetOutputLatency(uint32_t sampleRate, uint32_t format, uint32_t channelMask, uint32_t* latency)
1. Lines exceed 80 characters.
2. May I know why AUDIO_XXX_FAST is used here directly rather than the other flags?
As far as I know, FAST is used for the fast mixer and its latency differs from the normal mixer's (both live in AudioFlinger), and AUDIO_XXX_INDIRECT/NONE seems to be the normal case.
3. AudioManager is currently alive only in the B2G (parent) process, as a singleton service.
May I know who will use this function, and whether it will be used from a content process?
4. Since AudioManager is an XPCOM component, do we need to expose this function in the IDL?
::: dom/system/gonk/android_audio/AudioSystem.h
@@ +27,5 @@
> */
> typedef enum {
> AUDIO_POLICY_OUTPUT_FLAG_INDIRECT = 0x0,
> + AUDIO_POLICY_OUTPUT_FLAG_DIRECT = 0x1,
> + AUDIO_OUTPUT_FLAG_PRIMARY = 0x2, // this output is the primary output of
Could you help align these? I checked that the others are already aligned. Thanks.
@@ +644,4 @@
> static status_t getOutputSamplingRate(int* samplingRate, int stream = DEFAULT);
> static status_t getOutputFrameCount(int* frameCount, int stream = DEFAULT);
> static status_t getOutputLatency(uint32_t* latency, int stream = DEFAULT);
> + static status_t getSamplingRate(audio_io_handle_t output,
nit: indent
Attachment #804697 - Flags: review?(mchen)
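To sketch an answer to points 3 and 4, a hypothetical chrome-process (B2G parent process) caller of the proposed IDL method could look like the following; the contract ID string and the helper name are assumptions for illustration, not part of the patch:

#include "nsCOMPtr.h"
#include "nsIAudioManager.h"
#include "nsServiceManagerUtils.h"

static uint32_t GetExpectedOutputLatencyMs(uint32_t aSampleRate,
                                           uint32_t aFormat,
                                           uint32_t aChannelMask)
{
  // Assumed contract ID for the AudioManager service; only available in the
  // parent (chrome) process where AudioManager lives.
  nsCOMPtr<nsIAudioManager> audioManager =
    do_GetService("@mozilla.org/telephony/audiomanager;1");
  if (!audioManager) {
    return 0; // service unavailable, e.g. on a content process
  }
  uint32_t latencyMs = 0;
  // Matches the proposed IDL: getOutputLatency(sampleRate, format, channelMask).
  audioManager->GetOutputLatency(aSampleRate, aFormat, aChannelMask, &latencyMs);
  return latencyMs;
}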
Assignee
Comment 9•11 years ago
Thanks for the review.
I'd like to get a user of this new API before we proceed much further.
Assignee
Updated•11 years ago
Attachment #804697 - Flags: feedback?(rjesup)
Comment 10•11 years ago
(In reply to C.J. Ku[:CJKu] from comment #2)
> WebRTC::PeerConnection is in V1.3, but gUM is in 1.2 scope.
Is gUM enough to koi+ this?
Flags: needinfo?(cku)
Reporter
Comment 11•11 years ago
The latency values for input are interesting for latency monitoring but not critical for getUserMedia() alone until we move the AEC into gUM; it's needed for PeerConnection calls (for 28/1.3). The AEC will likely move in 28 as well.
Comment 12•11 years ago
(In reply to Randell Jesup [:jesup] from comment #11)
> The latency values for input are interesting for latency monitoring but not
> critical for getUserMedia() alone until we move the AEC into gUM; it's
> needed for PeerConnection calls (for 28/1.3). The AEC will likely move in
> 28 as well.
Thanks!
blocking-b2g: koi? → 1.3?
Comment 13•11 years ago
(In reply to Andrew Overholt [:overholt] from comment #10)
> (In reply to C.J. Ku[:CJKu] from comment #2)
> > WebRTC::PeerConnection is in V1.3, but gUM is in 1.2 scope.
>
> Is gUM enough to koi+ this?
Bug 853356 landed, which means the gUM path now works well on B2G.
Flags: needinfo?(cku)
Comment 14•11 years ago
Plussing it for v1.3 since it blocks the committed WebRTC audio P2P feature.
blocking-b2g: 1.3? → 1.3+
Whiteboard: [getusermedia][webrtc] [FT: Media Recording, Sprint:4] → [getusermedia][webrtc] [FT: Media Recording, Sprint:4][ft:webrtc]
Reporter
Comment 15•11 years ago
Since I moved bug 899204 to 1.3? (and recommended it not block 1.3), I'm bumping this one down too. It's a nice-to-have, and only if it works better than the (new) fixed latency estimate of 25ms. (Real latency is almost always longer than that, given the number of buffers used in the opensles code in webrtc.) As long as we don't run out of tail, this should work (see comments in bug 899204).
If we can get this and bug 899204 into 1.3, and they work better, great; but as long as the AEC processing works without it, this can wait.
blocking-b2g: 1.3+ → 1.3?
Reporter
Updated•9 years ago
Attachment #804697 - Flags: feedback?(rjesup)
Reporter
Comment 17•9 years ago
I think we no longer need this (especially given the AEC changes).
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → INCOMPLETE