A protocol is a set of rules that governs how data is formatted, transmitted, and received between communicating systems. Protocols are more flexible than a single program's internal interfaces, since one well-defined specification can be implemented by completely unrelated software and still interoperate.
Streaming involves protocols at several different layers of the OSI Reference Model.
Real Time Messaging Protocol (RTMP) is a proprietary protocol used primarily by Flash, though some other software implements it as well. Adobe has released a specification for it, but the specification is incomplete in some important respects. RTMP is usually carried over TCP, though this is not a requirement, and it operates from the session layer up through the application layer. Its importance is a direct result of the ubiquity of Flash, and it will decline as the use of Flash does.
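To make the framing concrete: RTMP splits messages into chunks, each of which begins with a 1- to 3-byte "basic header" carrying a format code and a chunk stream ID. A minimal sketch of decoding that header, following Adobe's published chunk-stream layout (the function name is illustrative, not part of any library):

```python
def parse_basic_header(data: bytes):
    """Decode an RTMP chunk basic header.

    Returns (fmt, chunk_stream_id, header_length_in_bytes).
    Per the RTMP spec: csid 0 and 1 in the first byte signal the
    2-byte and 3-byte extended forms, respectively.
    """
    fmt = data[0] >> 6          # top 2 bits: chunk header format (0-3)
    csid = data[0] & 0x3F       # bottom 6 bits: chunk stream id
    if csid == 0:               # 2-byte form: ids 64-319
        return fmt, 64 + data[1], 2
    if csid == 1:               # 3-byte form: ids 64-65599
        return fmt, 64 + data[1] + data[2] * 256, 3
    return fmt, csid, 1         # 1-byte form: ids 2-63
```

For example, a first byte of `0x03` is the common 1-byte form with chunk stream ID 3, which Flash clients typically use for protocol control messages.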
Vidict can stream both live and on-demand content via RTMP, and a single RTMP stream can carry multiple channels of video, audio, and data.
HTTP Live Streaming – HLS
HTTP Live Streaming (HLS) works by delivering segmented video and audio to a player over plain HTTP. It breaks live streams and multimedia files into small files (chunks) and uses a generated manifest file to tell the player which chunks to request, and in what order.
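The manifest is an extended M3U playlist (a `.m3u8` file): tag lines starting with `#` describe the stream, and the remaining lines are the chunk URIs in playback order. A minimal sketch of pulling the chunk URIs out of a media playlist (the sample playlist and function name are illustrative):

```python
SAMPLE_PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:6.0,
segment2.ts
#EXT-X-ENDLIST
"""

def segment_uris(playlist_text: str) -> list[str]:
    """Return chunk URIs in playback order.

    In an M3U8 playlist, every non-blank line that does not start
    with '#' is a segment URI; '#'-prefixed lines are tags/comments.
    """
    return [line.strip()
            for line in playlist_text.splitlines()
            if line.strip() and not line.startswith("#")]
```

A player simply fetches these URIs in order (re-polling the playlist for live streams, where new segments keep being appended).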
Why use HLS?
HLS avoids the network problems sometimes associated with traditional streaming protocols, and offers additional benefits:
- DRM support
- Live and time-shifted playback
- Multi-language audio
- Adaptive bitrate (ABR) streaming
(ABR is built into the HTTP streaming protocol, and works by detecting a user’s bandwidth and CPU capacity in real time and adjusting the stream quality accordingly.)
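The ABR logic described above can be sketched as a simple selection rule: given the variants advertised in the master playlist and the bandwidth measured while downloading recent chunks, pick the highest-bitrate variant that still fits, with a safety margin. This is an illustrative sketch, not the algorithm any particular player uses; all names and the 0.8 margin are assumptions:

```python
def pick_variant(variants: list[dict], measured_kbps: float,
                 safety: float = 0.8) -> dict:
    """Choose the highest-bitrate variant that fits the measured bandwidth.

    `variants` are dicts with a 'bandwidth_kbps' key, as a player might
    build from a master playlist. `safety` leaves headroom so a small
    bandwidth dip does not immediately cause a stall.
    """
    usable = measured_kbps * safety
    fitting = [v for v in variants if v["bandwidth_kbps"] <= usable]
    if not fitting:
        # Nothing fits: fall back to the lowest-bitrate variant.
        return min(variants, key=lambda v: v["bandwidth_kbps"])
    return max(fitting, key=lambda v: v["bandwidth_kbps"])
```

A real player would re-run this choice continuously as throughput measurements change, which is what produces the seamless quality switching users see.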