ios – ProcessSampleBuffer of Broadcast Upload Extension being throttled

I am working on a Xamarin.Forms iOS app that streams the user's screen over WebRTC using a Broadcast Upload Extension. It works well for the most part, but I am seeing unusual behaviour: when the method doesn't have much processing to do, ProcessSampleBuffer is called less frequently. Here is the code:

public override void ProcessSampleBuffer(CoreMedia.CMSampleBuffer sampleBuffer, RPSampleBufferType sampleBufferType)
    {

        switch (sampleBufferType)
        {
            case RPSampleBufferType.Video:
          
                    // Handle video sample buffer
                    var imageBuffer = sampleBuffer.GetImageBuffer() as CVPixelBuffer;


                    var orientation = GetOrientation(sampleBuffer);

                    var newOrientation = RTCVideoRotation.Rotation0;

                    if (orientation == InterfaceOrientationType.Portrait)
                    {
                        newOrientation = RTCVideoRotation.Rotation0;

                    }
                    else if (orientation == InterfaceOrientationType.LandscapeLeft)
                    {
                        newOrientation = RTCVideoRotation.Rotation180;

                    }
                    else if (orientation == InterfaceOrientationType.LandscapeRight)
                    {
                        newOrientation = RTCVideoRotation.Rotation270;

                    }


                    // UNCOMMENTING THE FOLLOWING 3 LINES MAKES THE METHOD FIRE MUCH MORE REGULARLY.
                    //  var ciImage = CIImage.FromImageBuffer(imageBuffer);
                    //  var image = new UIImage(ciImage, 1.0f, UIImageOrientation.Up);
                    //  var jpeg = image.AsJPEG(0.05f);


                    // Cast after multiplying so sub-second precision is not truncated.
                    long timeStampNs = (long)(sampleBuffer.PresentationTimeStamp.Seconds * 1000000000.0);


                    if (WebRtcService2._videoCapturer != null && WebRtcService2._videoCapturer.Delegate != null)
                    {
                        WebRtcService2._videoCapturer.Delegate.Capturer(WebRtcService2._videoCapturer, ConvertToRTCVideoFrame(imageBuffer, timeStampNs, newOrientation));
                    }

                break;
            case RPSampleBufferType.AudioApp:
                // Handle app audio sample buffer
                break;
            case RPSampleBufferType.AudioMic:
                // Handle microphone audio sample buffer
                break;
        }
    }
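For reference, ConvertToRTCVideoFrame is not shown above. A minimal sketch of what it might look like, assuming a Xamarin binding that mirrors the native WebRTC iOS framework (an RTCCVPixelBuffer wrapping the CVPixelBuffer, and an RTCVideoFrame constructor taking the buffer, rotation, and timestamp — the exact binding names may differ):

```csharp
// Hypothetical sketch — assumes the WebRTC binding exposes RTCCVPixelBuffer
// and RTCVideoFrame with the same shape as the native iOS framework.
RTCVideoFrame ConvertToRTCVideoFrame(CVPixelBuffer imageBuffer,
                                     long timeStampNs,
                                     RTCVideoRotation rotation)
{
    // Wrap the CVPixelBuffer without copying the pixel data.
    var rtcPixelBuffer = new RTCCVPixelBuffer(imageBuffer);

    // The frame carries the rotation so the remote side can render it upright.
    return new RTCVideoFrame(rtcPixelBuffer, rotation, timeStampNs);
}
```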

When the three commented-out lines above are included, converting each frame to a JPEG, the method is called much more often — several times per second. With the code as it is above, the method fires in short bursts of 10-20 frames every minute or so.

Why would this be? I don't want to include the JPEG code, since it impacts the performance (framerate) of the stream.

I expect a consistent framerate, with the method being called several times per second (when the screen has something to redraw, of course), without needing to include the following lines of code:

var ciImage = CIImage.FromImageBuffer(imageBuffer);
var image = new UIImage(ciImage, 1.0f, UIImageOrientation.Up);
var jpeg = image.AsJPEG(0.05f);
