Created: 5 years, 2 months ago by aluebs-webrtc
Modified: 5 years, 1 month ago
CC: webrtc-reviews_webrtc.org, peah-webrtc, Andrew MacDonald, tterriberry_mozilla.com, hlundin-webrtc, kwiberg-webrtc, the sun, bjornv1
Base URL: https://chromium.googlesource.com/external/webrtc.git@highfreq
Target Ref: refs/pending/heads/master
Project: webrtc
Visibility: Public.
Description:
Make the nonlinear beamformer steerable
Depends on this CL: https://codereview.webrtc.org/1395453004/
R=andrew@webrtc.org
Committed: https://crrev.com/cb3f9bd9c024f11e1ee060de23bf65c7a1f9f594
Cr-Commit-Position: refs/heads/master@{#10458}
Patch Set 1 #
Total comments: 38
Patch Set 2 : Rebasing #
Patch Set 3 : Add AimAt method to Beamformer #
Patch Set 4 : Add nonlinear_beamformer_unittest #
Total comments: 15
Patch Set 5 : Rebasing #
Patch Set 6 : Generalize interferer scenarios #
Total comments: 19
Patch Set 7 : Fix and test InitInterfAngles #
Patch Set 8 : Formatting #
Total comments: 19
Patch Set 9 : Rebasing #
Patch Set 10 : Use Maybe #
Patch Set 11 : Formatting #
Patch Set 12 : Fix windows compile error #
Patch Set 13 : Fix more windows compile errors #
Patch Set 14 : Fix yet another windows bug #
Patch Set 15 : More windows fixing fun #
Patch Set 16 : Windows fixing fun never ends #
Patch Set 17 : More windows fun #
Messages
Total messages: 32 (7 generated)
aluebs@webrtc.org changed reviewers: + andrew@webrtc.org
solenberg@google.com changed reviewers: + solenberg@google.com
https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/include/audio_processing.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:113: target_angle_radians(M_PI / 2.f) {} Use C++11 initialization?
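For concreteness, a minimal sketch of what C++11 in-class initialization could look like for this member; only target_angle_radians and its default come from the patch, the enclosing struct name and the second constructor are assumptions for illustration:

struct Beamforming {
  Beamforming() = default;
  explicit Beamforming(float target_angle_radians)
      : target_angle_radians(target_angle_radians) {}
  // C++11 non-static data member initializer instead of setting the default
  // in the constructor's initializer list.
  const float target_angle_radians = static_cast<float>(M_PI) / 2.f;
};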
peah@webrtc.org changed reviewers: + peah@webrtc.org
https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:48: void SteerBeam(float target_angle_radians); My gut feeling is that it would be simpler for an outside user to use degrees when steering the beam. At least I need to think a second time to get a feeling for what angle a certain value in radians corresponds to. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/test/audioproc_float.cc:116: M_PI * FLAGS_target_angle_degrees / 180.f)); Since the usage is done using degrees, it would make sense for the beamformer to accept the target angle in degrees instead of radians.
https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:195: float DotProduct(std::vector<float> a, std::vector<float> b) { const references. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:204: float NormalizedDotProduct(std::vector<float> a, std::vector<float> b) { const references. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:206: float norm_a = DotProduct(a, a); You're repeatedly computing this for first_pair_direction. Can we improve that? https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:218: std::vector<float> directiona; first_pair_direction https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:221: directiona.push_back(array_geometry[1].z() - array_geometry[0].z()); Write a helper for this, something like: std::vector<float> PairDirection(const Point& a, const Point& b) { std::vector<float> direction; direction.reserve(3); direction.push_back(...); ... return direction; } const auto first_pair_direction = PairDirection(...); ... Going further, is it really worth transforming these into vectors of floats? All it does is save you one line in DotProduct. Instead you could have PairDirection return a Point: Point PairDirection(...) { return {b.x() - a.x(), b.y() - a.y(), b.z() - a.z()}; } https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:224: directionb.push_back(array_geometry[i].x() - array_geometry[i - 1].x()); pair_direction https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:228: std::abs(NormalizedDotProduct(directiona, directionb)) > kMinDotProduct; Instead, short circuit: if (std::abs(NormalizedDotProduct(...) < kMinDotProduct) { return false; } } return true; https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:273: InitDifuseCovMats(); Diffuse https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:319: if (target_angle_radians_ - kAway >= 0.f) { These values depend on the array being parallel with the x-axis (so that the angle of reflection is the x-axis). What if that's not the case? https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:35: float target_angle_radians = M_PI / 2.f); Since we're exposing a setter, I think it's redundant to also take a target angle at construction time. WDYT? https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... 
webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:48: void SteerBeam(float target_angle_radians); This seems like a fairly generic beamformer task. We should probably add it to Beamformer. And since we have an existing method taking a SphericalPointf, this should as well for consistency. (For comparison, consider AimAt in MR) https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:144: // counterclockwise. This is really about our coordinate system, and specifically how we convert from Cartesian to spherical (since the microphone positions are Cartesian). I think this documentation belongs with the declaration of SphericalPoint. Just as a reference, see this for the definition of Carteisan mic position coordinates in Chromium: https://code.google.com/p/chromium/codesearch#chromium/src/media/audio/audio_... https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/include/audio_processing.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:124: const float target_angle_radians; We don't have any immediate use for exposing this in the AudioProcessing interface. Should we hold off? I suppose you want to be able to test it in audioproc_float, which is justification enough :)
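To make the nonlinear_beamformer.cc suggestions above concrete, a rough sketch of how they could fit together: a Point-based DotProduct taking const references, a PairDirection helper returning a Point, and an early return instead of accumulating into is_linear. Point's x()/y()/z() accessors and kMinDotProduct are taken from the patch; the three-argument Point constructor, the DCHECK spelling, and an included <cmath> are assumptions.

namespace {

float DotProduct(const Point& a, const Point& b) {
  return a.x() * b.x() + a.y() * b.y() + a.z() * b.z();
}

Point PairDirection(const Point& a, const Point& b) {
  return Point(b.x() - a.x(), b.y() - a.y(), b.z() - a.z());
}

bool IsGeometryLinear(const std::vector<Point>& array_geometry) {
  DCHECK_GT(array_geometry.size(), 1u);
  const Point first_pair_direction =
      PairDirection(array_geometry[0], array_geometry[1]);
  // Computed once, rather than inside every loop iteration.
  const float first_norm =
      std::sqrt(DotProduct(first_pair_direction, first_pair_direction));
  for (size_t i = 2u; i < array_geometry.size(); ++i) {
    const Point pair_direction =
        PairDirection(array_geometry[i - 1], array_geometry[i]);
    const float pair_norm =
        std::sqrt(DotProduct(pair_direction, pair_direction));
    // The pair is collinear with the first pair iff |cos(angle)| is close to
    // 1; bail out as soon as one pair is not.
    if (std::abs(DotProduct(first_pair_direction, pair_direction)) <
        kMinDotProduct * first_norm * pair_norm) {
      return false;
    }
  }
  return true;
}

}  // namespace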
One more general comment for this set of CLs: it would be nice to see some unit testing.
Addressed comments and added unittest. PTAL https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:35: float target_angle_radians = M_PI / 2.f); On 2015/10/14 22:12:31, Andrew MacDonald wrote: > Since we're exposing a setter, I think it's redundant to also take a target > angle at construction time. WDYT? Since it has a default value, it is not necessary to specify it. I was just feeling it would be inefficient to initialize all matrices twice, once with the default value and once with the real one, just because the user needs to set the angle with the setter. If you still think it is redundant, I am happy to remove, I don't have such a strong opinion. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:48: void SteerBeam(float target_angle_radians); On 2015/10/13 12:53:41, peah-webrtc wrote: > My gut feeling is that it would be simpler for an outside user to use degrees > when steering the beam. At least I need to think a second time to get a feeling > for what angle a certain value in radians corresponds to. That is true for human input, but this is probably be set automatically by some BemaformingManager which uses a Localizer (returns radians) to decide where to steer towards. Plus all trigonometric functions take radians by default. I would use as a standard to always use radians except for human user input, as I did for the test. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:48: void SteerBeam(float target_angle_radians); On 2015/10/14 22:12:31, Andrew MacDonald wrote: > This seems like a fairly generic beamformer task. We should probably add it to > Beamformer. And since we have an existing method taking a SphericalPointf, this > should as well for consistency. > > (For comparison, consider AimAt in MR) Agreed. Added. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:144: // counterclockwise. On 2015/10/14 22:12:31, Andrew MacDonald wrote: > This is really about our coordinate system, and specifically how we convert from > Cartesian to spherical (since the microphone positions are Cartesian). > > I think this documentation belongs with the declaration of SphericalPoint. > > Just as a reference, see this for the definition of Carteisan mic position > coordinates in Chromium: > https://code.google.com/p/chromium/codesearch#chromium/src/media/audio/audio_... I am not sure if we want to impose a specific convention to the general Point class, I don't have a strong opinion. Removed this comment and added documentation for CartesianPoint and SphericalPoint following the Chromium example you suggested. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/include/audio_processing.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:113: target_angle_radians(M_PI / 2.f) {} On 2015/10/13 07:20:21, solenberg wrote: > Use C++11 initialization? I am not sure what you are suggesting. 
What am I missing? https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:124: const float target_angle_radians; On 2015/10/14 22:12:31, Andrew MacDonald wrote: > We don't have any immediate use for exposing this in the AudioProcessing > interface. Should we hold off? > > I suppose you want to be able to test it in audioproc_float, which is > justification enough :) Yes, I only surfaced it here to test it in audioproc_float. But if you think it doesn't make sense I can hold off with that. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/test/audioproc_float.cc:116: M_PI * FLAGS_target_angle_degrees / 180.f)); On 2015/10/13 12:53:41, peah-webrtc wrote: > Since the usage is done using degrees, it would make sense for the beamformer to > accept the target angle in degrees instead of radians. I think only this user will use degrees, since it is a user input and, as you said, degrees are easier to think about. The localizers work with radians, all trigonometric functions in C++ do, so I don't find a strong point for changing the interface to degrees.
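For reference, a minimal sketch of the generic AimAt hook agreed on above, mirroring the existing SphericalPointf-based interface; the exact signature and the surrounding Beamformer declaration are assumptions:

template <typename T>
class Beamformer {
 public:
  virtual ~Beamformer() {}
  // ... existing processing methods ...
  // Aim the beam at the given direction: azimuth and elevation in radians,
  // radius in meters.
  virtual void AimAt(const SphericalPointf& spherical_point) = 0;
};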
https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:195: float DotProduct(std::vector<float> a, std::vector<float> b) { On 2015/10/14 22:12:30, Andrew MacDonald wrote: > const references. Looks like you missed the remainder of the comments in this file. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/include/audio_processing.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:124: const float target_angle_radians; On 2015/10/20 00:04:20, aluebs-webrtc wrote: > On 2015/10/14 22:12:31, Andrew MacDonald wrote: > > We don't have any immediate use for exposing this in the AudioProcessing > > interface. Should we hold off? > > > > I suppose you want to be able to test it in audioproc_float, which is > > justification enough :) > > Yes, I only surfaced it here to test it in audioproc_float. But if you think it > doesn't make sense I can hold off with that. I think it's probably worth it. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/array_util.h:50: // azimuth: Zero is to the right from the camera's perspective, with positive nit: to be consistent with above, no capitalization ("azimuth: zero...") https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/array_util.h:51: // angles in radians towards the front of the camera. with positive angles counter-clockwise. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:18: SphericalPointf Azimuth2Point(float azimuth_radians) { nit: AzimuthToPoint (just because I think that's more common). https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:23: const float kHalfBeamWidthRadians = static_cast<float>(M_PI) * 20.f / 180.f; If you want to use this (and I think that's fine), it should be exposed as a static in the interface, rather than copied here. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/test/audioproc_float.cc:113: SphericalPointf(M_PI * FLAGS_target_angle_degrees / 180.f, 0.f, 1.f))); Do we have a DegreesToRadians helper somewhere? If not, probably good to add to array_util.h. (Also add the inverse.)
No further comments from me, but I have not gone through the rest of the changes in detail. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:48: void SteerBeam(float target_angle_radians); On 2015/10/20 00:04:20, aluebs-webrtc wrote: > On 2015/10/13 12:53:41, peah-webrtc wrote: > > My gut feeling is that it would be simpler for an outside user to use degrees > > when steering the beam. At least I need to think a second time to get a > feeling > > for what angle a certain value in radians corresponds to. > > That is true for human input, but this is probably be set automatically by some > BemaformingManager which uses a Localizer (returns radians) to decide where to > steer towards. Plus all trigonometric functions take radians by default. I would > use as a standard to always use radians except for human user input, as I did > for the test. Acknowledged. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/test/audioproc_float.cc:116: M_PI * FLAGS_target_angle_degrees / 180.f)); On 2015/10/20 00:04:20, aluebs-webrtc wrote: > On 2015/10/13 12:53:41, peah-webrtc wrote: > > Since the usage is done using degrees, it would make sense for the beamformer > to > > accept the target angle in degrees instead of radians. > > I only think this user will user degrees, since it is a user input and, as you > said, degrees are easier to think about. The localizers work with radians, all > trigonometrics functions in C++ do, so I don't find a strong point for changing > the interface to degrees. Acknowledged.
https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:195: float DotProduct(std::vector<float> a, std::vector<float> b) { On 2015/10/14 22:12:30, Andrew MacDonald wrote: > const references. Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:195: float DotProduct(std::vector<float> a, std::vector<float> b) { On 2015/10/20 03:00:08, Andrew MacDonald wrote: > On 2015/10/14 22:12:30, Andrew MacDonald wrote: > > const references. > > Looks like you missed the remainder of the comments in this file. I totally missed them, sorry about that. Addressing them now. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:204: float NormalizedDotProduct(std::vector<float> a, std::vector<float> b) { On 2015/10/14 22:12:31, Andrew MacDonald wrote: > const references. Removed this function. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:206: float norm_a = DotProduct(a, a); On 2015/10/14 22:12:30, Andrew MacDonald wrote: > You're repeatedly computing this for first_pair_direction. Can we improve that? I removed the NormalizedDotProduct and did the normalization outside, so only calculate the first norm once. It is a little messier, but more efficient. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:218: std::vector<float> directiona; On 2015/10/14 22:12:31, Andrew MacDonald wrote: > first_pair_direction Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:221: directiona.push_back(array_geometry[1].z() - array_geometry[0].z()); On 2015/10/14 22:12:31, Andrew MacDonald wrote: > Write a helper for this, something like: > std::vector<float> PairDirection(const Point& a, const Point& b) { > std::vector<float> direction; > direction.reserve(3); > direction.push_back(...); > ... > return direction; > } > > const auto first_pair_direction = PairDirection(...); > ... > > > Going further, is it really worth transforming these into vectors of floats? All > it does is save you one line in DotProduct. Instead you could have PairDirection > return a Point: > > Point PairDirection(...) { > return {b.x() - a.x(), b.y() - a.y(), b.z() - a.z()}; > } I though about using Point, but I thought it was stretching the Point class, since they are directions and not Points, but I guess you are right. Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:224: directionb.push_back(array_geometry[i].x() - array_geometry[i - 1].x()); On 2015/10/14 22:12:31, Andrew MacDonald wrote: > pair_direction Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... 
webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:228: std::abs(NormalizedDotProduct(directiona, directionb)) > kMinDotProduct; On 2015/10/14 22:12:31, Andrew MacDonald wrote: > Instead, short circuit: > > if (std::abs(NormalizedDotProduct(...) < kMinDotProduct) { > return false; > } > } > return true; I believe that function with only one return statement are less bug prone and easier to understand, but I don't have a too strong opinion, so I changed it. Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:273: InitDifuseCovMats(); On 2015/10/14 22:12:31, Andrew MacDonald wrote: > Diffuse Done. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:319: if (target_angle_radians_ - kAway >= 0.f) { On 2015/10/14 22:12:30, Andrew MacDonald wrote: > These values depend on the array being parallel with the x-axis (so that the > angle of reflection is the x-axis). What if that's not the case? Great Point. Now I generalized it. https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... File webrtc/modules/audio_processing/include/audio_processing.h (right): https://codereview.webrtc.org/1394103003/diff/1/webrtc/modules/audio_processi... webrtc/modules/audio_processing/include/audio_processing.h:124: const float target_angle_radians; On 2015/10/20 03:00:08, Andrew MacDonald wrote: > On 2015/10/20 00:04:20, aluebs-webrtc wrote: > > On 2015/10/14 22:12:31, Andrew MacDonald wrote: > > > We don't have any immediate use for exposing this in the AudioProcessing > > > interface. Should we hold off? > > > > > > I suppose you want to be able to test it in audioproc_float, which is > > > justification enough :) > > > > Yes, I only surfaced it here to test it in audioproc_float. But if you think > it > > doesn't make sense I can hold off with that. > > I think it's probably worth it. Agreed. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/array_util.h:50: // azimuth: Zero is to the right from the camera's perspective, with positive On 2015/10/20 03:00:08, Andrew MacDonald wrote: > nit: to be consistent with above, no capitalization ("azimuth: zero...") Done. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/array_util.h:51: // angles in radians towards the front of the camera. On 2015/10/20 03:00:08, Andrew MacDonald wrote: > with positive angles counter-clockwise. My only concern with using "counter-clockwise" is that this depends if you are looking from above or below. But if you think it is clearer I am changing it. Done. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:18: SphericalPointf Azimuth2Point(float azimuth_radians) { On 2015/10/20 03:00:08, Andrew MacDonald wrote: > nit: AzimuthToPoint (just because I think that's more common). Done. 
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:23: const float kHalfBeamWidthRadians = static_cast<float>(M_PI) * 20.f / 180.f; On 2015/10/20 03:00:08, Andrew MacDonald wrote: > If you want to use this (and I think that's fine), it should be exposed as a > static in the interface, rather than copied here. The problem is that in WebRTC I can't use constexpr and I can't have a class float constant initialized in compile time. I think this is ok, since it would catch changes in kHalfBeamWidthRadians, but if you think it is not ok to repeat the constant, what do you prefer I do? Have a getter in NonlinearBeamformer? A public constant that is initialized in the constructor? I am happy to implement whatever you think is best, I don't have a strong opinion. https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/test/audioproc_float.cc:113: SphericalPointf(M_PI * FLAGS_target_angle_degrees / 180.f, 0.f, 1.f))); On 2015/10/20 03:00:08, Andrew MacDonald wrote: > Do we have a DegreesToRadians helper somewhere? If not, probably good to add to > array_util.h. > > (Also add the inverse.) I didn't find any helper functions, so I added them.
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:23: const float kHalfBeamWidthRadians = static_cast<float>(M_PI) * 20.f / 180.f; On 2015/10/21 01:41:40, aluebs-webrtc wrote: > On 2015/10/20 03:00:08, Andrew MacDonald wrote: > > If you want to use this (and I think that's fine), it should be exposed as a > > static in the interface, rather than copied here. > > The problem is that in WebRTC I can't use constexpr and I can't have a class > float constant initialized in compile time. I think this is ok, since it would > catch changes in kHalfBeamWidthRadians, but if you think it is not ok to repeat > the constant, what do you prefer I do? Have a getter in NonlinearBeamformer? A > public constant that is initialized in the constructor? I am happy to implement > whatever you think is best, I don't have a strong opinion. You can do it with: .h class NonlinearBeaformer { static const float kHalfBeamWidthRadians; ... }; .cc const int float NonlinearBeamformer::kHalfBeamWidthRadians = ...;
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:23: const float kHalfBeamWidthRadians = static_cast<float>(M_PI) * 20.f / 180.f; On 2015/10/21 02:10:32, Andrew MacDonald wrote: > You can do it with: > .h > class NonlinearBeaformer { > static const float kHalfBeamWidthRadians; > ... > }; > > .cc > const int float NonlinearBeamformer::kHalfBeamWidthRadians = ...; oops, just "const float" of course.
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/test/audioproc_float.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/test/audioproc_float.cc:113: SphericalPointf(M_PI * FLAGS_target_angle_degrees / 180.f, 0.f, 1.f))); On 2015/10/21 01:41:40, aluebs-webrtc wrote: > On 2015/10/20 03:00:08, Andrew MacDonald wrote: > > Do we have a DegreesToRadians helper somewhere? If not, probably good to add > to > > array_util.h. > > > > (Also add the inverse.) > > I didn't find any helper functions, so I added them. Nice. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:72: float DegreesToRadians(float angle_degrees); Probably reasonable to make these templates. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:189: bool IsGeometryLinear(std::vector<Point> array_geometry) { DCHECK_GT(array_geometry.size(), 1); https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:199: float norm_b = const https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:291: void NonlinearBeamformer::InitInterfAngles() { This looks good, but it's probably complicated enough to warrant a test. WDYT? https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:305: if (DotProduct(array_normal, target_direction) * Add some comments to each of these conditions, because it's harder to see what they mean now. Something like: "The target and clockwise interferer are in the same half-plane defined by the array." ... "Otherwise, the interferer will begin reflecting back at the target. Instead rotate it away 180 degrees." https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:42: TEST(NonlinearBeamformerTest, AimingModifiesBeam) { Can you add another test to verify non-x-axis-aligned arrays?
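A sketch of the templated conversion helpers being asked for; placement in array_util.h follows the thread, the exact signatures are assumptions:

template <typename T>
T DegreesToRadians(T angle_degrees) {
  return static_cast<T>(M_PI * angle_degrees / 180);
}

template <typename T>
T RadiansToDegrees(T angle_radians) {
  return static_cast<T>(180 * angle_radians / M_PI);
}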
Description was changed from
==========
Make the nonlinear beamformer steerable

Depends on this CL: https://codereview.webrtc.org/1395453004/
==========
to
==========
Make the nonlinear beamformer steerable

Depends on this CL: https://codereview.webrtc.org/1395453004/
==========
solenberg@webrtc.org changed reviewers: - solenberg@google.com
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/array_util.h:51: // angles in radians towards the front of the camera. On 2015/10/21 01:41:40, aluebs-webrtc wrote: > On 2015/10/20 03:00:08, Andrew MacDonald wrote: > > with positive angles counter-clockwise. > > My only concern with using "counter-clockwise" is that this depends if you are > looking from above or below. But if you think it is clearer I am changing it. > Done. Right. Another option is to define these in terms of the Cartesian coordinates rather than the camera. So: azimuth: the angle from the positive x-axis towards the positive y-axis in radians. elevation: the angle from the xy projection towards the positive z-axis in radians. radius: distance from the origin in meters. WDYT?
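To make the proposed convention concrete, the corresponding mapping to Cartesian coordinates would be the standard spherical-to-Cartesian one; the helper name and where it would live are hypothetical:

// azimuth: angle from the positive x-axis towards the positive y-axis.
// elevation: angle from the xy projection towards the positive z-axis.
// radius: distance from the origin in meters.
Point SphericalToCartesian(float azimuth, float elevation, float radius) {
  return Point(radius * std::cos(elevation) * std::cos(azimuth),
               radius * std::cos(elevation) * std::sin(azimuth),
               radius * std::sin(elevation));
}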
https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). CRITICAL ;-) Just noticed this because I'm looking into the code for avoiding the multi-channel splitting filter: you need to steer the delay-and-sum beam now. Adding fractional delay filters is going to be pretty annoying for what is a largely negligible effect with two mics. Do you just want to do integer delays?
https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). On 2015/10/22 01:43:32, Andrew MacDonald wrote: > CRITICAL ;-) > > Just noticed this because I'm looking into the code for avoiding the > multi-channel splitting filter: you need to steer the delay-and-sum beam now. > > Adding fractional delay filters is going to be pretty annoying for what is a > largely negligible effect with two mics. Do you just want to do integer delays? I think the Matlab code does the delay-and-sum filtering in the frequency domain? Perhaps we should follow that here.
https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). On 2015/10/22 01:48:55, Andrew MacDonald wrote: > On 2015/10/22 01:43:32, Andrew MacDonald wrote: > > CRITICAL ;-) > > > > Just noticed this because I'm looking into the code for avoiding the > > multi-channel splitting filter: you need to steer the delay-and-sum beam now. > > > > Adding fractional delay filters is going to be pretty annoying for what is a > > largely negligible effect with two mics. Do you just want to do integer > delays? > > I think the Matlab code does the delay-and-sum filtering in the frequency > domain? Perhaps we should follow that here. Ah, that doesn't make any sense. This is for the upper bands where we don't have a FD representation. Hmm. I'm looking at providing the full-band signal to avoid the multi-channel split though, so in the new typical case, it will work fine to do the delay-and-sum in FD.
https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/60001/webrtc/modules/audio_proc... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:23: const float kHalfBeamWidthRadians = static_cast<float>(M_PI) * 20.f / 180.f; On 2015/10/21 02:11:31, Andrew MacDonald wrote: > On 2015/10/21 02:10:32, Andrew MacDonald wrote: > > You can do it with: > > .h > > class NonlinearBeaformer { > > static const float kHalfBeamWidthRadians; > > ... > > }; > > > > .cc > > const int float NonlinearBeamformer::kHalfBeamWidthRadians = ...; > > oops, just "const float" of course. Done. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:72: float DegreesToRadians(float angle_degrees); On 2015/10/21 03:27:43, Andrew MacDonald wrote: > Probably reasonable to make these templates. Done. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:189: bool IsGeometryLinear(std::vector<Point> array_geometry) { On 2015/10/21 03:27:43, Andrew MacDonald wrote: > DCHECK_GT(array_geometry.size(), 1); Done. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:199: float norm_b = On 2015/10/21 03:27:43, Andrew MacDonald wrote: > const Done. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:291: void NonlinearBeamformer::InitInterfAngles() { On 2015/10/21 03:27:43, Andrew MacDonald wrote: > This looks good, but it's probably complicated enough to warrant a test. WDYT? Good point! :) And the great part about writing unittest, is that it forces you to think the contract of the method. So I realized that planar arrays can have ambiguity to when the normal is in the xy-plane. Added a bunch of methods to array_util and unit-tested them, modified this method accordingly and finally added a unit-test for it. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:305: if (DotProduct(array_normal, target_direction) * On 2015/10/21 03:27:43, Andrew MacDonald wrote: > Add some comments to each of these conditions, because it's harder to see what > they mean now. Something like: > > "The target and clockwise interferer are in the same half-plane defined by the > array." > ... > "Otherwise, the interferer will begin reflecting back at the target. Instead > rotate it away 180 degrees." Done. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). 
On 2015/10/22 01:53:03, Andrew MacDonald wrote: > On 2015/10/22 01:48:55, Andrew MacDonald wrote: > > On 2015/10/22 01:43:32, Andrew MacDonald wrote: > > > CRITICAL ;-) > > > > > > Just noticed this because I'm looking into the code for avoiding the > > > multi-channel splitting filter: you need to steer the delay-and-sum beam > now. > > > > > > Adding fractional delay filters is going to be pretty annoying for what is a > > > largely negligible effect with two mics. Do you just want to do integer > > delays? > > > > I think the Matlab code does the delay-and-sum filtering in the frequency > > domain? Perhaps we should follow that here. > > Ah, that doesn't make any sense. This is for the upper bands where we don't have > a FD representation. Hmm. > > I'm looking at providing the full-band signal to avoid the multi-channel split > though, so in the new typical case, it will work fine to do the delay-and-sum in > FD. As you said, this is only for the higher bands, so it will not be applicable when the full-band signal is used and we don't want to transform to the FD. I find even the time domain integer delay to be pretty annoying for a largely negligible effect, specially since we would need to take care of the cases when this delay changes dynamically. So I propose to use the first channel for the higher bands and apply the post-filter mask. WDYT? https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:42: TEST(NonlinearBeamformerTest, AimingModifiesBeam) { On 2015/10/21 03:27:43, Andrew MacDonald wrote: > Can you add another test to verify non-x-axis-aligned arrays? The array doesn't affect the steering. Just had to put some geometry for it to run.
https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). On 2015/10/27 18:08:15, aluebs-webrtc wrote: > As you said, this is only for the higher bands, so it will not be applicable > when the full-band signal is used and we don't want to transform to the FD. I > find even the time domain integer delay to be pretty annoying for a largely > negligible effect, specially since we would need to take care of the cases when > this delay changes dynamically. So I propose to use the first channel for the > higher bands and apply the post-filter mask. WDYT? This looks good, assuming the effect is as negligible as we've been assuming. Did you confirm with a recording? Since we're doing this, could you also drop a TODO here to look into allowing multiple output channels? We could apply the mask to every channel. https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer_unittest.cc:42: TEST(NonlinearBeamformerTest, AimingModifiesBeam) { On 2015/10/27 18:08:16, aluebs-webrtc wrote: > On 2015/10/21 03:27:43, Andrew MacDonald wrote: > > Can you add another test to verify non-x-axis-aligned arrays? > > The array doesn't affect the steering. Just had to put some geometry for it to > run. Got it. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:40: // Calculates the direction given a pair of Points. Be more specific: Calculates the direction from a to b. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:54: // normalizing. If it is not linear it returns a null Point. I would say the origin (or zero) rather than null? And below. But instead, this is a perfect opportunity to use kwiberg's new Maybe<T>: https://code.google.com/p/chromium/codesearch#chromium/src/third_party/webrtc... since the origin is a valid direction for a trivially true case (all points at the origin). And I would call it GetDirectionIfLinear. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:59: Point IsGeometryPlanar(const std::vector<Point>& array_geometry); GetNormalIfPlanar https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:62: // the normal is not in the xy-plane it returns a null Point. zero or origin https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:63: Point GetArrayNormal(const std::vector<Point>& array_geometry); GetArrayNormalIfExists https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... 
webrtc/modules/audio_processing/beamformer/array_util.h:65: Point AzimuthToPoint(float azimuth); Might be worth documenting that the point will be in the XY plane? Is that obvious? https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util_unittest.cc:20: void ExpectPointsEq(const Point& a, const Point& b) { You should be able to overload operator== for Point to use EXPECT_EQ directly rather than ExpectPointsEq below. See: https://codereview.chromium.org/1275783003/diff/510001/content/renderer/media... We could consider adding it to array_util.h. I don't think it would break ODR with the Chromium code because it's in an unnamed namespace. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util_unittest.cc:97: std::vector<Point> array_geometry; Just geometry or points is fine to save some characters. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:131: // The normal direction of the array in the xy-plane. It is set to null Point origin or zero
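For reference, a sketch of the declarations the Maybe<T> suggestion points at. The function names follow the comments above; applying Maybe to the planar/normal variants as well, and the rtc namespace spelling, are assumptions:

// Returns the direction of the array if the geometry is linear; otherwise an
// empty Maybe.
rtc::Maybe<Point> GetDirectionIfLinear(
    const std::vector<Point>& array_geometry);

// Returns the normal of the array if the geometry is planar; otherwise an
// empty Maybe.
rtc::Maybe<Point> GetNormalIfPlanar(
    const std::vector<Point>& array_geometry);

// Returns the normal of the array if it is unambiguous and lies in the
// xy-plane; otherwise an empty Maybe.
rtc::Maybe<Point> GetArrayNormalIfExists(
    const std::vector<Point>& array_geometry);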
https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc (right): https://codereview.webrtc.org/1394103003/diff/100001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.cc:419: // averaging). On 2015/10/28 01:57:56, Andrew MacDonald wrote: > On 2015/10/27 18:08:15, aluebs-webrtc wrote: > > As you said, this is only for the higher bands, so it will not be applicable > > when the full-band signal is used and we don't want to transform to the FD. I > > find even the time domain integer delay to be pretty annoying for a largely > > negligible effect, specially since we would need to take care of the cases > when > > this delay changes dynamically. So I propose to use the first channel for the > > higher bands and apply the post-filter mask. WDYT? > > This looks good, assuming the effect is as negligible as we've been assuming. > Did you confirm with a recording? > > Since we're doing this, could you also drop a TODO here to look into allowing > multiple output channels? We could apply the mask to every channel. I did confirm that I couldn't hear the difference. Now I checked in matlab to double confirm. For most blocks the SNR (between the signal and the error from not using the D&S for higher bands) is between 20dB and 35dB. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util.h (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:40: // Calculates the direction given a pair of Points. On 2015/10/28 01:57:56, Andrew MacDonald wrote: > Be more specific: Calculates the direction from a to b. Done. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:54: // normalizing. If it is not linear it returns a null Point. On 2015/10/28 01:57:56, Andrew MacDonald wrote: > I would say the origin (or zero) rather than null? And below. > > But instead, this is a perfect opportunity to use kwiberg's new Maybe<T>: > https://code.google.com/p/chromium/codesearch#chromium/src/third_party/webrtc... > > since the origin is a valid direction for a trivially true case (all points at > the origin). > > And I would call it GetDirectionIfLinear. Changed to use Maybe and renamed function. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:59: Point IsGeometryPlanar(const std::vector<Point>& array_geometry); On 2015/10/28 01:57:56, Andrew MacDonald wrote: > GetNormalIfPlanar Done. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:62: // the normal is not in the xy-plane it returns a null Point. On 2015/10/28 01:57:57, Andrew MacDonald wrote: > zero or origin No need since using Maybe. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util.h:63: Point GetArrayNormal(const std::vector<Point>& array_geometry); On 2015/10/28 01:57:56, Andrew MacDonald wrote: > GetArrayNormalIfExists Done. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... 
webrtc/modules/audio_processing/beamformer/array_util.h:65: Point AzimuthToPoint(float azimuth); On 2015/10/28 01:57:56, Andrew MacDonald wrote: > Might be worth documenting that the point will be in the XY plane? Is that > obvious? Good point. Added documentation. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util_unittest.cc:20: void ExpectPointsEq(const Point& a, const Point& b) { On 2015/10/28 01:57:57, Andrew MacDonald wrote: > You should be able to overload operator== for Point to use EXPECT_EQ directly > rather than ExpectPointsEq below. > > See: > https://codereview.chromium.org/1275783003/diff/510001/content/renderer/media... > > We could consider adding it to array_util.h. I don't think it would break ODR > with the Chromium code because it's in an unnamed namespace. I Added the operator here, since the one in Chromium is not in a unnamed namespace, but in the webrtc one. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util_unittest.cc:97: std::vector<Point> array_geometry; On 2015/10/28 01:57:57, Andrew MacDonald wrote: > Just geometry or points is fine to save some characters. Done. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/nonlinear_beamformer.h:131: // The normal direction of the array in the xy-plane. It is set to null Point On 2015/10/28 01:57:57, Andrew MacDonald wrote: > origin or zero Not necessary anymore, since using Maybe.
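A sketch of the equality overload discussed above, assuming exact float comparison is acceptable for the test (as in the linked Chromium helper):

// Lets EXPECT_EQ compare Points directly in the unittest.
bool operator==(const Point& lhs, const Point& rhs) {
  return lhs.x() == rhs.x() && lhs.y() == rhs.y() && lhs.z() == rhs.z();
}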
Nice, lgtm. https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... File webrtc/modules/audio_processing/beamformer/array_util_unittest.cc (right): https://codereview.webrtc.org/1394103003/diff/140001/webrtc/modules/audio_pro... webrtc/modules/audio_processing/beamformer/array_util_unittest.cc:20: void ExpectPointsEq(const Point& a, const Point& b) { On 2015/10/29 00:34:21, aluebs-webrtc wrote: > On 2015/10/28 01:57:57, Andrew MacDonald wrote: > > You should be able to overload operator== for Point to use EXPECT_EQ directly > > rather than ExpectPointsEq below. > > > > See: > > > https://codereview.chromium.org/1275783003/diff/510001/content/renderer/media... > > > > We could consider adding it to array_util.h. I don't think it would break ODR > > with the Chromium code because it's in an unnamed namespace. > > I Added the operator here, since the one in Chromium is not in a unnamed > namespace, but in the webrtc one. Acknowledged.
The CQ bit was checked by aluebs@webrtc.org
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/patch-status/1394103003/200001 View timeline at https://chromium-cq-status.appspot.com/patch-timeline/1394103003/200001
The CQ bit was unchecked by commit-bot@chromium.org
CLs for remote refs other than refs/pending/heads/master must contain NOTRY=true and NOPRESUBMIT=true in order for the CQ to process them
Message was sent while issue was closed.
Committed patchset #17 (id:320001) manually as cb3f9bd9c024f11e1ee060de23bf65c7a1f9f594 (presubmit successful).
Message was sent while issue was closed.
Patchset 17 (id:??) landed as https://crrev.com/cb3f9bd9c024f11e1ee060de23bf65c7a1f9f594 Cr-Commit-Position: refs/heads/master@{#10458} |