Screen Share Functionality from Android to Browser Using WebRTC

I am working on an MDM project where I am required to implement a remote troubleshooting feature. The requirement is that my Android screen must be captured and displayed in a web browser in real time. I researched for over a month and found that WebRTC is my only option for real-time communication. WebRTC provides an Android library, but with almost no documentation. I found two similar projects on GitHub:

  1. Project 1
  2. Project 2

I have understood from my research that WebRTC helps establish a peer-to-peer connection after the two peers "somehow" exchange their session descriptions (SDP) with each other.

Since most devices on the internet are behind a NAT, we use STUN servers to discover our public IP addresses and ports.
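For context, this is roughly how I understand that part looks with the org.webrtc Android library. The class name and the observer parameter are just placeholders from my experiments; stun.l.google.com:19302 is a commonly used public STUN server:

```java
import android.content.Context;

import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

import java.util.ArrayList;
import java.util.List;

public class PeerConnectionSetup {

    // Creates a PeerConnection configured with a public STUN server so that
    // each peer can discover its public IP/port (gathered as ICE candidates).
    public static PeerConnection create(Context context,
                                        PeerConnection.Observer observer) {
        // One-time initialization of the native WebRTC library.
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(context)
                        .createInitializationOptions());

        PeerConnectionFactory factory = PeerConnectionFactory.builder()
                .createPeerConnectionFactory();

        List<PeerConnection.IceServer> iceServers = new ArrayList<>();
        iceServers.add(PeerConnection.IceServer
                .builder("stun:stun.l.google.com:19302")
                .createIceServer());

        PeerConnection.RTCConfiguration config =
                new PeerConnection.RTCConfiguration(iceServers);

        // The observer receives callbacks such as onIceCandidate and onAddStream.
        return factory.createPeerConnection(config, observer);
    }
}
```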

Peer A creates an SDP called the "offer", sets it as its local description, and sends it to Peer B. Peer B receives this offer and sets it as its remote description. Peer B then creates its own SDP called the "answer", sets it as its local description, and sends it to Peer A. Peer A sets this answer as its remote description. This exchange is called signaling.
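Put into code, this is the offer/answer flow I am sketching with the PeerConnection API. The SignalingChannel interface and the SimpleSdpObserver helper are my own placeholders, not part of the library:

```java
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;
import org.webrtc.SessionDescription;

public class SignalingFlow {

    // Placeholder for whatever transport carries the SDP text
    // (in my demo this will be a plain TCP socket).
    public interface SignalingChannel {
        void send(String sdp);
    }

    // No-op base observer so each step only overrides the callback it cares about.
    static class SimpleSdpObserver implements SdpObserver {
        @Override public void onCreateSuccess(SessionDescription sdp) {}
        @Override public void onSetSuccess() {}
        @Override public void onCreateFailure(String error) {}
        @Override public void onSetFailure(String error) {}
    }

    // Peer A: create the offer, set it as the local description,
    // then send the SDP text to Peer B over the signaling channel.
    public static void sendOffer(PeerConnection pc, SignalingChannel channel) {
        pc.createOffer(new SimpleSdpObserver() {
            @Override public void onCreateSuccess(SessionDescription offer) {
                pc.setLocalDescription(new SimpleSdpObserver(), offer);
                channel.send(offer.description);
            }
        }, new MediaConstraints());
    }

    // Peer B: apply the received offer as the remote description,
    // create the answer, set it locally, and send it back to Peer A.
    public static void onOfferReceived(PeerConnection pc, String offerSdp,
                                       SignalingChannel channel) {
        SessionDescription offer =
                new SessionDescription(SessionDescription.Type.OFFER, offerSdp);
        pc.setRemoteDescription(new SimpleSdpObserver() {
            @Override public void onSetSuccess() {
                pc.createAnswer(new SimpleSdpObserver() {
                    @Override public void onCreateSuccess(SessionDescription answer) {
                        pc.setLocalDescription(new SimpleSdpObserver(), answer);
                        channel.send(answer.description);
                    }
                }, new MediaConstraints());
            }
        }, offer);
    }

    // Peer A: apply the answer that came back from Peer B.
    public static void onAnswerReceived(PeerConnection pc, String answerSdp) {
        pc.setRemoteDescription(new SimpleSdpObserver(),
                new SessionDescription(SessionDescription.Type.ANSWER, answerSdp));
    }
}
```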

After signaling is complete, a P2P connection is somehow established and the two devices can then send audio, video and data streams to one another. (As far as I can tell, the ICE candidates gathered via STUN are also exchanged over the signaling channel, and WebRTC then runs connectivity checks to pick a working path.)
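For the screen itself, my plan is to capture it with MediaProjection and feed it into the PeerConnection as a video track via ScreenCapturerAndroid. A rough sketch of what I have in mind; the helper class, track id and capture dimensions are placeholders:

```java
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjection;

import org.webrtc.EglBase;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.ScreenCapturerAndroid;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

public class ScreenShareHelper {

    // permissionData is the Intent returned by the MediaProjection permission dialog
    // (startActivityForResult with MediaProjectionManager.createScreenCaptureIntent()).
    public static VideoTrack startScreenShare(Context context,
                                              PeerConnectionFactory factory,
                                              PeerConnection peerConnection,
                                              EglBase eglBase,
                                              Intent permissionData) {
        ScreenCapturerAndroid capturer = new ScreenCapturerAndroid(
                permissionData,
                new MediaProjection.Callback() {
                    @Override public void onStop() {
                        // The user revoked the screen-capture permission.
                    }
                });

        SurfaceTextureHelper textureHelper = SurfaceTextureHelper.create(
                "ScreenCaptureThread", eglBase.getEglBaseContext());

        // isScreencast = true tells WebRTC to tune encoding for screen content.
        VideoSource videoSource = factory.createVideoSource(capturer.isScreencast());
        capturer.initialize(textureHelper, context, videoSource.getCapturerObserver());

        // Width/height/fps are placeholders; they should match the device screen.
        capturer.startCapture(1280, 720, 30);

        VideoTrack screenTrack = factory.createVideoTrack("screen_track", videoSource);
        peerConnection.addTrack(screenTrack);
        return screenTrack;
    }
}
```

From what I have read, the track has to be added before creating the offer so the SDP actually advertises the video, and on newer Android versions the MediaProjection capture has to run inside a foreground service.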

I have all this theory, but I need some guidance to put it into practice.

I am trying to create two demo Android applications that perform the screen share between them. I have used TCP to send and receive the SDPs (offer and answer), and I hope that once this works I can implement similar screen-share-to-browser functionality.
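This is roughly the TCP signaling helper I am using in the two demo apps. The one-message-per-line framing and the Base64 encoding are just my choice, since the SDP text itself spans multiple lines:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TcpSignaling {

    private final Socket socket;
    private final BufferedReader reader;
    private final PrintWriter writer;

    public TcpSignaling(String host, int port) throws IOException {
        // Must run off the main thread on Android (NetworkOnMainThreadException otherwise).
        socket = new Socket(host, port);
        reader = new BufferedReader(
                new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
        writer = new PrintWriter(socket.getOutputStream(), true);
    }

    // SDP text is multi-line, so Base64-encode it to keep one message per line.
    // (java.util.Base64 needs API 26+; android.util.Base64 works on older versions.)
    public void sendSdp(String sdp) {
        writer.println(Base64.getEncoder()
                .encodeToString(sdp.getBytes(StandardCharsets.UTF_8)));
    }

    // Blocks until the remote peer sends its SDP (offer or answer).
    public String receiveSdp() throws IOException {
        String line = reader.readLine();
        return new String(Base64.getDecoder().decode(line), StandardCharsets.UTF_8);
    }

    public void close() throws IOException {
        socket.close();
    }
}
```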

I am a newbie in Android development and need some guidance and resources on how to achieve this. Thanks.


