Created: 4 years, 6 months ago by noahric
Modified: 4 years, 5 months ago
CC: webrtc-reviews_webrtc.org, tterriberry_mozilla.com, zhengzhonghou_agora.io, video-team_agora.io, stefan-webrtc, mflodman, wuchengli
Base URL: https://chromium.googlesource.com/external/webrtc.git@master
Target Ref: refs/pending/heads/master
Project: webrtc
Visibility: Public.
Description
Add native handle support to SimulcastEncoderAdapter.
If all subencoders support textures, the adapter will claim support.
Texture frames will be passed on directly to subencoders, without any
attempt at scaling, and subencoders will be expected to sample/scale
correctly from source textures.
BUG=
NOTRY=true
Committed: https://crrev.com/fe3654d5dceb3d931cd773afae511e90180d39ba
Cr-Commit-Position: refs/heads/master@{#13365}
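The aggregation rule in the description (the adapter claims texture support only when every subencoder does) can be sketched as follows; `SubEncoder` here is a simplified stand-in, not the real webrtc::VideoEncoder interface:

```cpp
#include <vector>

// Simplified stand-in for a subencoder; not the real webrtc::VideoEncoder.
struct SubEncoder {
  bool supports_native_handle;
};

// The adapter advertises native-handle (texture) support only when every
// subencoder does, since texture frames are forwarded without conversion.
bool AdapterSupportsNativeHandle(const std::vector<SubEncoder>& streams) {
  if (streams.empty()) return false;
  for (const SubEncoder& s : streams) {
    if (!s.supports_native_handle) return false;
  }
  return true;
}
```

With this rule, a single non-texture subencoder forces the whole adapter to report no texture support, so callers fall back to I420 input for all layers.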
Patch Set 1 #
Patch Set 2 : Merged with ToT #
Total comments: 6
Patch Set 3 : Add a comment and remove some brackets. #
Patch Set 4 : git pull #
Messages
Total messages: 45 (18 generated)
noahric@chromium.org changed reviewers: + mflodman@webrtc.org, pbos@chromium.org
Let me know if this looks ok.
pbos@webrtc.org changed reviewers: + magjed@webrtc.org, pbos@webrtc.org
magjed@, PTAL. Specifically: do the native handles support scaling (applying a matrix) that should be applied here? I don't want us to break simulcast w/ MediaCodecVideoEncoder.
perkj@webrtc.org changed reviewers: + emircan@chromium.org, perkj@webrtc.org
emircan - is this CL currently a problem on Mac and Win for hw codecs?

https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
File webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc (right):
https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:299: input_image.video_frame_buffer()->native_handle()) {
The encoder does not know the size it should scale to. Scaling is simple in itself; we just need to create a new VideoFrameBuffer with the correct width and height. There is a plan to add a scale factor in the buffers to handle this (https://bugs.chromium.org/p/webrtc/issues/detail?id=5683), but that is not done. If this is not a problem today for hw encoders on Mac and Win, I guess we can add a TODO here and update bug 5683.
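The "create a new VideoFrameBuffer with the correct width and height" idea could look roughly like the sketch below. The class names are hypothetical stand-ins for the real WebRTC buffer interfaces; the point is that the wrapper shares the source texture handle but reports the target resolution, leaving the actual sampling to the consuming encoder:

```cpp
#include <memory>

// Simplified stand-in for webrtc::VideoFrameBuffer.
class FrameBuffer {
 public:
  virtual ~FrameBuffer() = default;
  virtual int width() const = 0;
  virtual int height() const = 0;
  virtual void* native_handle() const = 0;
};

// A concrete texture-backed buffer for illustration.
class TextureBuffer : public FrameBuffer {
 public:
  TextureBuffer(void* handle, int w, int h)
      : handle_(handle), width_(w), height_(h) {}
  int width() const override { return width_; }
  int height() const override { return height_; }
  void* native_handle() const override { return handle_; }

 private:
  void* handle_;
  int width_;
  int height_;
};

// Wraps an existing texture-backed buffer and reports a different target
// resolution. The underlying handle is shared; the consuming encoder is
// expected to sample/scale from the source texture to width()/height().
class ScaledHandleBuffer : public FrameBuffer {
 public:
  ScaledHandleBuffer(std::shared_ptr<FrameBuffer> source, int w, int h)
      : source_(std::move(source)), width_(w), height_(h) {}
  int width() const override { return width_; }
  int height() const override { return height_; }
  void* native_handle() const override { return source_->native_handle(); }

 private:
  std::shared_ptr<FrameBuffer> source_;
  int width_;
  int height_;
};
```

No pixel data is copied or scaled here; the wrapper only re-labels the resolution, which is why it is cheap enough to create per simulcast layer.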
On 2016/06/27 08:02:01, perkj_webrtc_ooo_june20 wrote: > emirican - is this cl a problem currently on Mac and Win for hw codecs ? > It isn't a problem. SupportsNativeHandle() currently returns false for both these platforms, so that section would not be triggered either. However, we have a use-case for sending IOSurface backed textures for Mac encoder, and we will add it if performance gain seems reasonable. About scaling, I don't think we can rely on encoder doing it as you said.
On 2016/06/27 18:53:26, emircan wrote:
> It isn't a problem. SupportsNativeHandle() currently returns false for both these platforms [...]
> About scaling, I don't think we can rely on encoder doing it as you said.

> The encoder does not know the size is should scale too.
The encoder is told the size in InitEncode, so it should know the size to scale/crop/whatever to.

> About scaling, I don't think we can rely on encoder doing it as you said.
Can you explain why?

In Android, the encoding is a render from a source texture to a surface. That render can certainly scale/crop/rotate/etc.

On iOS/Mac, VTCompressionSessionCreate takes a width/height. Are you saying the encoder ignores the configured resolution and uses the source resolution of frame buffers? I'm not sure that's correct; an earlier prototype of the WebRTC iOS H264 encoder used frame buffer input, and I don't remember whether the encoder respected the configured resolution. It's worth checking, if you have a setup where you can test it.

Though you're going to have a problem if that's the case anyway, since things like QP-based quality scaling want to change resolution. If they can't do it with source frame buffers on iOS, then you'll need to either reset the capturer to the desired size or reconfigure the encoder to take I420/NV21/whatever and do a full glReadPixels for every source frame, which will cancel out the savings you were getting.
noahric@chromium.org changed reviewers: + tkchin@chromium.org
+zeke: have you played around with the pixel buffer input mode for iOS hardware H264 encode? Do you remember if the encoder scales to the configured resolution or just resets itself to the pixel buffer resolution?
On 2016/06/27 19:26:02, noahric wrote:
> [full quote of the previous message trimmed]

And everywhere for iOS, s/frame buffer/pixel buffer/, I think. I always get the terminology confused :-/
On 2016/06/27 19:29:09, noahric wrote:
> [quoted thread on whether the iOS/Mac encoder scales pixel buffer input trimmed]
> And everywhere for iOS, s/frame buffer/pixel buffer/, I think. I always get the terminology confused :-/

We don't have a setup for testing pixel buffer input into the encoder currently. I don't know what happens if you give it a size different than what it's configured for.
On 2016/06/27 21:46:35, tkchin_webrtc wrote:
> We don't have a setup for testing pixel buffer input into the encoder currently.
> I don't know what happens if you give it a size different than what it's configured for.

So: do we need to prototype something on iOS to test this before you'd allow it to be submitted? Or are there also other cases where we expect the encoder is incapable of scaling from a source texture?

Alternatively, there are a few other options in that world:
1) Encoders that can't support directly scaling textures have to convert to I420, which is basically the same as claiming lack of texture support (call the convert-to-I420 method on the frame).
2) Change the SupportsNativeHandle method to return a struct with supported features, where one is something like supports_resampling_native_handle; if it's false, the frames will be converted in simulcast cases, or cases where the camera can't open at precisely the right size.
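Option 2 above (a capability struct instead of a plain bool) might look roughly like this. The names, including supports_resampling_native_handle, come from the proposal in this thread; this is not an existing WebRTC API:

```cpp
// Hypothetical capability struct; SupportsNativeHandle() in WebRTC actually
// returns a plain bool, so this is only a sketch of the proposal.
struct NativeHandleCapabilities {
  bool supports_native_handle = false;
  // Whether the encoder can itself resample a source texture to the
  // configured encode resolution. If false, the adapter would convert
  // (e.g. to I420) for simulcast or size-mismatch cases.
  bool supports_resampling_native_handle = false;
};

// Decide whether a texture frame can be forwarded as-is for a given stream.
bool CanForwardTexture(const NativeHandleCapabilities& caps,
                       bool needs_scaling) {
  if (!caps.supports_native_handle) return false;
  return !needs_scaling || caps.supports_resampling_native_handle;
}
```

The adapter would then convert only the layers that actually need scaling, instead of downgrading the whole pipeline to I420.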
lgtm

https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
File webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc (right):
https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:306: VideoFrame dst_frame;
So apparently no one uses simulcast with native textures today, so I think this is fine, but it might be good to add a comment that the native handles are not scaled.

There are two ways to handle this in the long run. Either the encoder must scale textures to the encoder-configured size, or, as bug webrtc:5683 suggests, a frame should contain a scale factor so that we can postpone all scaling until we know that the frame actually will be encoded; scaling should then probably occur in the encoder. So here a new VideoFrameBuffer with separate scale factors but with the same underlying data buffer could be created.

When it comes to quality scaling it gets a bit confusing. What does it mean that the quality scaler wants to downscale a layer due to bad quality? See https://cs.chromium.org/chromium/src/third_party/webrtc/api/java/jni/androidmediaencoder_jni.cc?rcl=0&l=700. In our Android encoder implementation the frame size decides how the encoder should be configured after quality scaling.
pbos@webrtc.org changed reviewers: + wuchengli@chromium.org
+wuchengli@, I believe RTCVideoEncoder on ChromeOS can use textures. So I think this might break simulcast on ChromeOS?
On 2016/06/28 14:12:05, pbos-webrtc wrote:
> +wuchengli@, I believe RTCVideoEncoder on ChromeOS can use textures. So I think
> this might break simulcast on ChromeOS?
RTCVideoEncoder does not handle textures; the native handle passed to RTCVideoEncoder has to be I420, and the default SupportsNativeHandle returns false. The RTCVideoEncoders are the subencoders, so streaminfo.encoder->SupportsNativeHandle() will return false and this should be fine?
> What does it mean that the quality scaler want to downscale a layer due to bad quality?
I've brought this up a couple of times in the past. The short answer is that the quality scaler can't live on individual encoders, because it makes no sense in simulcasting. E.g. if you need to scale down the HD layer, what you really want to do is probably stop sending the HD layer entirely. If you want to downscale the VGA layer, then I have no idea what you do :)

> but it might be good to add a comment that the native handles are not scaled.
They are scaled in the Android implementation, though it's kinda non-obvious. So:
1) MediaCodecVideoEncoder.width and .height are set to the encoder resolution in initEncode.
2) encodeTexture uses GlRectDrawer.drawOes, which samples a full source texture (FULL_RECTANGLE_BUFF) to the target width/height, so it will scale (and stretch) if they don't match. It ignores the frameWidth/frameHeight args, so maybe those were originally intended to do something about cropping the source?

You probably want some TODOs or bugs there about being more explicit about how you handle aspect ratio differences, since it'll just stretch/squash things today, and you may want something different. Ping me on chat if you want pointers to a different implementation that handles aspect ratios by cropping (for small differences) or letterboxing/pillarboxing (for large differences).

> Bug webrtc::5683 suggest that a frame should contain a scale factor
Why does the frame need a scale factor? If it's a GL texture and everyone using it (renderers and encoders) knows their own size, then they can do whatever scaling they need.

I'm asking because we have textures used throughout our app, but there's nothing like that (scale factor) that makes the trip through the capture pipeline. Rotation does, and additionally crop info does on the decode side (some Android hardware decoders supply crop info in the buffer info for decoded frames).
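The crop-for-small-differences approach mentioned above can be illustrated by computing a centered source crop whose aspect ratio matches the target; this is only an illustrative sketch, not the implementation being referenced:

```cpp
struct Rect {
  int x, y, w, h;
};

// Compute a centered crop of a (src_w, src_h) frame whose aspect ratio
// matches (dst_w, dst_h). Scaling that crop to the target then avoids
// stretching or squashing.
Rect CenteredAspectCrop(int src_w, int src_h, int dst_w, int dst_h) {
  // Compare src_w/src_h with dst_w/dst_h via cross-multiplication to stay
  // in integer arithmetic.
  if (src_w * dst_h > dst_w * src_h) {
    // Source is wider than the target aspect: crop the sides.
    int crop_w = src_h * dst_w / dst_h;
    return {(src_w - crop_w) / 2, 0, crop_w, src_h};
  }
  // Source is taller (or aspect already matches): crop top/bottom.
  int crop_h = src_w * dst_h / dst_w;
  return {0, (src_h - crop_h) / 2, src_w, crop_h};
}
```

For example, a 16:9 source cropped for a 4:3 target loses equal slices from the left and right; a matching aspect ratio yields the full frame unchanged.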
On 2016/06/28 17:32:40, noahric wrote:
> They are scaled in the android implementation, though it's kinda non-obvious. So:
> 1) MediaCodecVideoEncoder.width and .height are set to the encoder resolution in initEncode.
> 2) encodeTexture uses GlRectDrawer.drawOes, which samples a full source texture
> (FULL_RECTANGLE_BUFF) to the target width/height, so it will scale (and stretch)
> if they don't match.
It's true that MediaCodecVideoEncoder.encodeTexture will scale (and stretch) regardless of input resolution. However, MediaCodecVideoEncoder does not respect the values set in initEncode, because it will reconfigure to whatever the input resolution is (see here: https://cs.chromium.org/chromium/src/third_party/webrtc/api/java/jni/androidm...)

> It ignores frameWidth/frameHeight args, so maybe those were
> originally intended to do something about cropping the source?
The frameWidth/frameHeight args are not used for rectangular rendering, but they are added to the GL render interface so that downstream apps can do e.g. round rendering that depends on the frame resolution.

> You probably want some TODOs or bugs there about being more explicit about how
> you handle aspect ratio differences, since it'll just stretch/squash things
> today, and you may want something different.
I think there is an implicit assumption right now that the aspect ratios match. I420 buffers will also be stretched if the aspects don't match, with libyuv::I420Scale.

I think it's ok to land this CL even if the texture scaling is broken, because the code doesn't seem to work now anyway.

lgtm
On 2016/06/28 17:32:40, noahric wrote:
> [quoted discussion of Android texture scaling, quality scaling, and the scale-factor proposal trimmed]
> Rotation does, and additionally crop info does on the decode side (some android
> hardware decoders supply crop info in the buffer info for decoded frames).

There are two reasons we have talked about adding a scale factor to a video frame and doing the scaling lazily in the encoder.

Reason one - we drop frames in a couple of places before the encoder but after scaling. So we would not have to scale frames before they are dropped if it was done lazily.

Reason two - normally the spec says that we should encode and send at whatever resolution the media stream track provides. But there is a setting on the RTPSender object that allows for downscaling. Nothing has been implemented yet, though; another solution is to tell the encoder the target resolution separately, as you say. That would probably be even simpler in the end and align well with this CL.
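Reason one above (avoiding scaling work for frames that end up dropped) can be sketched as a frame wrapper that records the target size but defers the actual scale until pixel data is first requested. All names here are hypothetical; this is not the WebRTC API:

```cpp
#include <functional>
#include <memory>
#include <vector>

// Hypothetical pixel buffer: raw bytes plus a resolution.
struct Pixels {
  int width = 0;
  int height = 0;
  std::vector<unsigned char> data;
};

// Defers scaling until Materialize() is called; a frame dropped before
// reaching the encoder never pays the (expensive) scaling cost.
class LazyScaledFrame {
 public:
  LazyScaledFrame(std::shared_ptr<Pixels> source, int target_w, int target_h,
                  std::function<Pixels(const Pixels&, int, int)> scaler)
      : source_(std::move(source)),
        target_w_(target_w),
        target_h_(target_h),
        scaler_(std::move(scaler)) {}

  int width() const { return target_w_; }
  int height() const { return target_h_; }

  // Runs the scaler at most once, on first access.
  const Pixels& Materialize() {
    if (!scaled_) {
      scaled_ = std::make_shared<Pixels>(
          scaler_(*source_, target_w_, target_h_));
    }
    return *scaled_;
  }

 private:
  std::shared_ptr<Pixels> source_;
  int target_w_;
  int target_h_;
  std::function<Pixels(const Pixels&, int, int)> scaler_;
  std::shared_ptr<Pixels> scaled_;
};
```

Dropping a LazyScaledFrame before calling Materialize() costs only the wrapper itself, which is the point of doing the scaling lazily.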
Description was changed (text unchanged).
wuchengli@chromium.org changed reviewers: - tkchin@chromium.org, wuchengli@chromium.org
lgtm

https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
File webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc (right):
https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:306: VideoFrame dst_frame;
On 2016/06/28 14:09:00, perkj_webrtc_ooo_june20 wrote:
> [comment on unscaled native handles, the webrtc:5683 scale-factor idea, and quality scaling trimmed]
I think quality scaling should apply outside and just be fed into the encoder with a new size. I assume that this encoder would then feel free to drop layers.

https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:519: return false;
remove {}s, you're not in Kansas anymore, cowboy.
https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
File webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc (right):
https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:299: input_image.video_frame_buffer()->native_handle()) {
On 2016/06/27 08:02:01, perkj_webrtc_ooo_june20 wrote:
> [comment on the encoder not knowing the target scale size trimmed]
Added a TODO here with a reference to webrtc:5683. Let me know if that's ok (assuming I submit before you see it, let me know and I'll fix it in a follow-up CL).

https://codereview.webrtc.org/2099483002/diff/20001/webrtc/modules/video_codi...
webrtc/modules/video_coding/codecs/vp8/simulcast_encoder_adapter.cc:519: return false;
On 2016/06/29 15:48:27, pbos-webrtc wrote:
> remove {}s, you're not in Kansas anymore, cowboy.
Done :)
The CQ bit was checked by noahric@chromium.org
The patchset sent to the CQ was uploaded after l-g-t-m from magjed@webrtc.org, pbos@webrtc.org, perkj@webrtc.org Link to the patchset: https://codereview.webrtc.org/2099483002/#ps60001 (title: "git pull")
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.webrtc.org/...
The CQ bit was unchecked by commit-bot@chromium.org
Try jobs failed on following builders: android_dbg on master.tryserver.webrtc (JOB_FAILED, http://build.chromium.org/p/tryserver.webrtc/builders/android_dbg/builds/14708)
The CQ bit was checked by noahric@chromium.org
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.webrtc.org/...
The CQ bit was unchecked by commit-bot@chromium.org
Try jobs failed on following builders: android_dbg on master.tryserver.webrtc (JOB_FAILED, http://build.chromium.org/p/tryserver.webrtc/builders/android_dbg/builds/14716)
The CQ bit was checked by noahric@chromium.org
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.webrtc.org/...
The CQ bit was unchecked by commit-bot@chromium.org
Try jobs failed on following builders: android_dbg on master.tryserver.webrtc (JOB_FAILED, http://build.chromium.org/p/tryserver.webrtc/builders/android_dbg/builds/14718)
Description was changed: added "NOTRY=true".
The CQ bit was checked by pbos@webrtc.org
CQ is trying da patch. Follow status at https://chromium-cq-status.appspot.com/v2/patch-status/codereview.webrtc.org/...
Message was sent while issue was closed.
Description was changed (text unchanged).
Message was sent while issue was closed.
Committed patchset #4 (id:60001)
Message was sent while issue was closed.
CQ bit was unchecked.
Message was sent while issue was closed.
Description was changed: added the commit lines:
Committed: https://crrev.com/fe3654d5dceb3d931cd773afae511e90180d39ba
Cr-Commit-Position: refs/heads/master@{#13365}
Message was sent while issue was closed.
Patchset 4 (id:??) landed as https://crrev.com/fe3654d5dceb3d931cd773afae511e90180d39ba
Cr-Commit-Position: refs/heads/master@{#13365}