**夕若** commented (11 Sept 2021, 13:48):

> I am trying to enable the screen sharing option. I have added the latest iOS SDK 3.5.0 (iOS SDK URL ref: ) to my project and checked for screen sharing, but I cannot see the screen share option. Do I need to do any additional code base settings? Am I missing anything else to enable the screen share option? Please check the screenshot attached.

**Reply:**

> Hello, I encountered the same problem as you. Note: I am checking on an iOS device: iPhone 11, iOS version 14.0.

**Reply:**

> Thanks a ton for your quick reply. I don't have any code snippets as such, just some concepts. On iOS, as far as I can tell, there are two ways to achieve this. Once the broadcast extension is added to the app, we get CMSampleBuffers; after that, two things can be done:
>
> 1. Process the CMSampleBuffer entirely within the extension. I couldn't really get too far using this method, but like saghul said, since the extension has memory limits it would probably not work as well as expected.
> 2. Pass the CMSampleBuffer to the app and then follow the same flow the camera feed does. There are a couple of ways I found we could do this:
>    - Use … to communicate small back-and-forth information if required (but it cannot be used to send a CMSampleBuffer, at least as far as I know).
>    - Use NSFileCoordinator and coordinate reads and writes to the same file from both the extension and the app. This could work, but it might lead to frame drops or lag, since the read/write has to happen at a pretty high speed of at least one read/write per 200-400 ms.
>    - Use Core Data (haven't gotten too much into this).
>
> On click of the button again, restore the original camera feed if the camera was on. Let me know if this just sounds stupid and I am completely on the wrong track. Please suggest if there are any changes to this flow as well. But I hope these concepts help someone looking into this.
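The NSFileCoordinator hand-off described above could be sketched roughly as follows. This is a minimal, untested sketch, not the project's actual implementation: the App Group identifier (`group.example.screenshare`), the file name, and the JPEG re-encoding step are all assumptions; a real pipeline would more likely use a shared ring buffer or socket to hit the 200-400 ms cadence without drops.

```swift
import ReplayKit
import UIKit

// Hypothetical App Group ID — must be configured on both the app and extension targets.
let appGroupID = "group.example.screenshare"

final class SampleHandler: RPBroadcastSampleHandler {
    private let coordinator = NSFileCoordinator()
    private lazy var frameURL: URL? = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: appGroupID)?
        .appendingPathComponent("latest-frame.jpg")

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        guard sampleBufferType == .video,
              let frameURL = frameURL,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Re-encode the frame small; the extension's memory budget is tight,
        // which is why processing everything in-extension tends not to work.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent),
              let jpeg = UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.5)
        else { return }

        // Coordinated write so the host app never observes a half-written file.
        coordinator.coordinate(writingItemAt: frameURL, options: .forReplacing,
                               error: nil) { url in
            try? jpeg.write(to: url, options: .atomic)
        }
    }
}
```

On the app side, the same file would be read with `coordinator.coordinate(readingItemAt:options:error:byAccessor:)` on a timer, decoded, and pushed into the same path the camera feed uses.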