As demand for internet media has increased, the core web infrastructure has evolved to support new use cases for some fairly old standards.
HTTP has been in use since the inception of the web around 1989. The core concept: a client sends a request and receives a response from a peer. Originally, the Content-Length header was used to bound the request and response bodies to a fixed size. It served a variety of purposes, and was also manipulated to perform denial-of-service (DoS) attacks. Today, byte-range requests against known offsets let a player stream partial slices of a video file, switching to segments with a lower bit-rate when bandwidth drops. Hence, Dynamic Adaptive Streaming over HTTP (DASH).
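As a rough illustration of how a segment's byte offsets translate into headers (these helpers are hypothetical, not code from the repository), a player requesting bytes 1000 through 4999 of an MP4 would produce:

```java
// Sketch: how a DASH player's byte-range request maps to the Content-Length
// of the server's 206 Partial Content response. Illustrative names only.
public class RangeMath {

    // The Range header a player would send for one segment, given the
    // segment's byte offsets (as indexed in the MPD / initial MP4).
    static String rangeHeader(long first, long last) {
        return "bytes=" + first + "-" + last;
    }

    // The Content-Length of the partial response is simply the size
    // of the requested slice (range bounds are inclusive).
    static long contentLength(long first, long last) {
        return last - first + 1;
    }

    public static void main(String[] args) {
        System.out.println("Range: " + rangeHeader(1000, 4999));
        System.out.println("Content-Length: " + contentLength(1000, 4999));
    }
}
```

The player repeats this per segment, choosing which representation's offsets to request based on measured throughput.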
You can check out the WildFly and castLabs code at my public GitHub repository: https://github.com/charlescva/mobile-dashjs
|Notice the Content-Length is determined by the offsets provided by the MPD and the initial MP4 containing the metadata for each stream.|
You can review the source, but the steps are as follows:
- Obtain a standard MP4 example video.
- Configure Apache to host the files in the directory dashencrypt is using. This path is currently hard-coded in VideoRegistration.java.
- Add a video using the "Add a Video" tab. The JAX-RS enabled VideoService.java will handle the request and dash the file for you.
- Upon success, you will see the entry for the video appear on the "Video Player" tab.
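The flow of the steps above can be sketched roughly as follows. To be clear, the class and method names here are hypothetical stand-ins, not the actual VideoService.java / VideoRegistration.java code:

```java
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.Map;

// Rough sketch of the "Add a Video" flow; all names are hypothetical.
public class VideoRegistrySketch {

    // Stand-in for the hard-coded directory Apache serves and
    // dashencrypt writes into.
    static final Path OUTPUT_DIR = Paths.get("/var/www/html/dash");

    // title -> path of the generated MPD manifest, backing the
    // "Video Player" tab's list of entries.
    final Map<String, String> videos = new LinkedHashMap<>();

    // Conceptually what the service does per request: run the dasher
    // over the source MP4, then register the resulting manifest.
    String addVideo(String title, Path sourceMp4) {
        Path mpd = OUTPUT_DIR.resolve(title).resolve("Manifest.mpd");
        // ... here dashencrypt would segment sourceMp4 into OUTPUT_DIR ...
        videos.put(title, mpd.toString());
        return mpd.toString();
    }

    public static void main(String[] args) {
        VideoRegistrySketch registry = new VideoRegistrySketch();
        System.out.println(registry.addVideo("demo", Paths.get("/tmp/demo.mp4")));
    }
}
```

The real service does the heavy lifting inside the dashing step; the sketch only shows the bookkeeping around it.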
|Observing the console: the logger outputs each step as it processes the request.|
I am still getting my feet wet myself, and came across a great article at the following site: https://arashafiei.wordpress.com/2012/11/13/quick-dash/.
I'll be working on integrating a "live" stream, in which an imaging device like /dev/video0 (webcam) will be used to generate the video segment data, while the MPD (manifest) and the initial MP4 file containing the Movie Box (moov) and/or Fragment Box (moof) are updated on the fly. Essentially, the goal is to enable "DASHing" of a live video feed.
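One possible route is to drive ffmpeg's v4l2 input and dash muxer from the Java side. This is an assumption on my part, not something implemented yet, and the exact ffmpeg flags should be treated as a starting point. The sketch below only assembles the command line; in practice it would be handed to a ProcessBuilder:

```java
import java.util.Arrays;
import java.util.List;

// Sketch: building a live-DASH capture command. Assumes an ffmpeg build
// with v4l2 input and dash muxer support; flags are a starting point only.
public class LiveDashSketch {

    static List<String> captureCommand(String device, String mpdPath) {
        return Arrays.asList(
            "ffmpeg",
            "-f", "v4l2", "-i", device,   // webcam input, e.g. /dev/video0
            "-c:v", "libx264",            // encode segments as H.264
            "-f", "dash", mpdPath         // dash muxer rewrites the MPD on the fly
        );
    }

    public static void main(String[] args) {
        // Would be launched with: new ProcessBuilder(cmd).inheritIO().start();
        List<String> cmd = captureCommand("/dev/video0", "live.mpd");
        System.out.println(String.join(" ", cmd));
    }
}
```

If that pans out, the WildFly side would only need to serve the manifest and segment files as they appear on disk, since the dash muxer keeps the MPD current.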