This is the video that gave me the inspiration (my code replicates it exactly, so I didn't see the need to recreate the video); the only difference is that I used Linux throughout rather than Windows:
The code can be found on my GitHub.
The system consists of three main elements: a client which streams video, a client for viewing the video, and a server which routes the video between them via WebSockets and hosts the web pages.
The camera client uses getUserMedia to feed a stream from the device's camera into an HTML5 video tag displayed on the page. Capture happens by drawing frames onto a canvas, saving the canvas as a base64-encoded image, and transmitting it over a WebSocket to the server; repeated frames are captured using setInterval. I've tested this demo using the custom Opera Mobile for Android build with getUserMedia support on my HTC Desire Z (T-Mobile G2), where about 2 fps is achievable due to the amount of data being processed. On faster devices this should improve; I intend to produce a second version using a desktop client to test this.
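The capture loop can be sketched roughly as below. This is a minimal version, assuming a `<video id="camera">` and `<canvas id="frame">` element on the page and a WebSocket server at `ws://localhost:8080` (the element ids, URL, and interval are illustrative, not taken from the project):

```javascript
// Minimal sketch of the streaming client.
var video  = document.getElementById('camera');
var canvas = document.getElementById('frame');
var ctx    = canvas.getContext('2d');
var ws     = new WebSocket('ws://localhost:8080');

// Attach the camera stream to the video element.
navigator.getUserMedia({ video: true }, function (stream) {
  video.src = window.URL.createObjectURL(stream);
  video.play();
}, function (err) {
  console.log('getUserMedia failed: ' + err);
});

// Every 500 ms (~2 fps, about what the Desire Z managed), draw the
// current video frame to the canvas and send it as a base64 data URI.
setInterval(function () {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  if (ws.readyState === WebSocket.OPEN) {
    ws.send(canvas.toDataURL('image/jpeg'));
  }
}, 500);
```

Note that `canvas.toDataURL()` does the base64 encoding for us; the trade-off is that every frame carries the overhead of a full JPEG plus the ~33% base64 inflation, which is the main reason the frame rate is so low on the phone.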
The viewer is very simple: it receives frames over a WebSocket connection as base64-encoded images and displays them as standard images in HTML.
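Because each frame arrives as a complete data URI, the viewer only needs to assign it to an image's `src`. A sketch, again assuming an `<img id="viewer">` element and the same server URL:

```javascript
// Minimal sketch of the viewer client.
var img = document.getElementById('viewer');
var ws  = new WebSocket('ws://localhost:8080');

ws.onmessage = function (event) {
  // The payload is already a "data:image/jpeg;base64,..." URI,
  // so it can be assigned to the image directly.
  img.src = event.data;
};
```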
The server uses Node.js to receive frames via a WebSocket and rebroadcast them to all connected clients; it also hosts the static files for the two client pages.
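The relay logic amounts to "on message, forward to everyone". A sketch using the `ws` module (an assumption on my part; the project may use a different WebSocket library) with a plain `http` server for the static pages, and hypothetical file names:

```javascript
// Minimal sketch of the relay server.
var http = require('http');
var fs   = require('fs');
var WebSocketServer = require('ws').Server;

// Serve the static client pages from the current directory.
var server = http.createServer(function (req, res) {
  var file = (req.url === '/') ? '/camera.html' : req.url;
  fs.readFile(__dirname + file, function (err, data) {
    if (err) { res.writeHead(404); res.end(); return; }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(data);
  });
});
server.listen(8080);

// Rebroadcast every incoming frame to all connected clients.
var wss = new WebSocketServer({ server: server });
wss.on('connection', function (socket) {
  socket.on('message', function (frame) {
    wss.clients.forEach(function (client) {
      if (client.readyState === 1) {  // OPEN
        client.send(frame);
      }
    });
  });
});
```

Note that this echoes each frame back to the sender as well; filtering the originating socket out of the broadcast would save the camera client some bandwidth.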