HLS streaming sample


Streaming Examples

View various examples of .M3U8 files formatted to index streams and .ts media segment files on your Mac, iPhone, iPad, and Apple TV.

Advanced stream

View example (TS)
View example (fMP4)

  • TS stream compatible with macOS v10.7 or later, iOS 6 or later, and tvOS 9 or later
  • fMP4 stream compatible with macOS v10.12 or later, iOS 10 or later, and tvOS 10 or later
  • AVERAGE-BANDWIDTH, CODECS, RESOLUTION, FRAME-RATE attributes in variant definitions
  • Floating point durations as separate segment files
  • H.264 @ 30Hz and 60Hz
  • 16x9 aspect ratio
  • 8 video variants
    • Gear 1 - 480x270 @ 775 kbps
    • Gear 2 - 640x360 @ 1.2 Mbps
    • Gear 3 - 768x432 @ 1.5 Mbps
    • Gear 4 - 960x540 @ 2.5 Mbps
    • Gear 5 - 1280x720 @ 3.5 Mbps
    • Gear 6 - 1920x1080 @ 5 Mbps
    • Gear 7 - 1920x1080 @ 6.5 Mbps
    • Gear 8 - 1920x1080 @ 8 Mbps
  • I-Frame variants (fast-forward / rewind support)
  • 3 audio renditions
    • AAC-LC - 48 kHz stereo @ 161 kbps
    • AC-3 - 48 kHz 5.1 @ 384 kbps
    • EC-3 - 48 kHz 5.1 @ 192 kbps
  • 1 subtitle rendition (WebVTT)
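The variant attributes listed above live in the master playlist's #EXT-X-STREAM-INF tags. As an illustration (values and path are hypothetical, not taken from Apple's actual manifest), a 1080p entry could look like:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=6500000,BANDWIDTH=7000000,CODECS="avc1.640028,mp4a.40.2",RESOLUTION=1920x1080,FRAME-RATE=60.000
gear7/prog_index.m3u8
```

AVERAGE-BANDWIDTH and BANDWIDTH are in bits per second; the URI on the following line points to that variant's media playlist.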

Advanced stream (HEVC/H.264)

View preliminary example (fMP4)

  • Stream backwards compatible with macOS v10.7 or later, iOS 6 or later, and tvOS 9 or later
  • HEVC variants compatible with macOS v10.13 or later, iOS 11 or later, and tvOS 11 or later
  • Floating point durations as separate segment files
  • H.264 and HEVC @ 30Hz and 60Hz
  • 16x9 aspect ratio
  • 9 HEVC video variants
    • Gear 9 - 1920x1080 @ 5.8 Mbps
    • Gear 8 - 1920x1080 @ 4.5 Mbps
    • Gear 7 - 1920x1080 @ 3.2 Mbps
    • Gear 6 - 1280x720 @ 2.4 Mbps
    • Gear 5 - 960x540 @ 1.7 Mbps
    • Gear 4 - 768x432 @ 990 kbps
    • Gear 3 - 640x360 @ 660 kbps
    • Gear 2 - 480x270 @ 350 kbps
    • Gear 1 - 416x234 @ 145 kbps
  • 9 H.264 video variants
    • Gear 9 - 1920x1080 @ 7.8 Mbps
    • Gear 8 - 1920x1080 @ 6.0 Mbps
    • Gear 7 - 1920x1080 @ 4.5 Mbps
    • Gear 6 - 1280x720 @ 3.0 Mbps
    • Gear 5 - 960x540 @ 2.0 Mbps
    • Gear 4 - 768x432 @ 1.1 Mbps
    • Gear 3 - 640x360 @ 730 kbps
    • Gear 2 - 480x270 @ 365 kbps
    • Gear 1 - 416x234 @ 145 kbps
  • I-Frame variants (fast-forward / rewind support)
  • 3 audio renditions
    • AAC-LC - 48 kHz stereo @ 160 kbps
    • AC-3 - 48 kHz 5.1 @ 384 kbps
    • EC-3 - 48 kHz 5.1 @ 192 kbps
  • 1 subtitle rendition (WebVTT)

Basic stream

View example

  • Compatible with macOS v10.7 or later and iOS 4.3 or later
  • 4x3 aspect ratio
  • H.264 @ 30Hz
  • floating point durations as separate segment files
  • CODECS attribute in master playlist
  • 4 video variants
    • Gear 1 - 400x300 @ 232 kbps
    • Gear 2 - 640x480 @ 650 kbps
    • Gear 3 - 640x480 @ 1 Mbps
    • Gear 4 - 960x720 @ 2 Mbps
  • 1 audio-only variant
    • Gear 0 AAC - 22.05 kHz stereo @ 40 kbps

Basic stream

View example

Note: The primary audio in the stream should be used for any sync testing. The second alternate audio demonstrates the use of an alternate audio option, but was not designed as a true sync verification.

  • Compatible with macOS v10.7 or later and iOS 5 or later
  • 16x9 aspect ratio
  • H.264 @ 30Hz
  • single .ts file, with byte-ranges in the playlists
  • floating point durations
  • CODECS and RESOLUTION attributes in master playlist
  • I-Frames (fast-forward / rewind support)
  • closed captions
  • timed metadata (timecode every 5 seconds)
  • 5 video variants
    • Gear 1 - 416x234 @ 265 kbps
    • Gear 2 - 640x360 @ 580 kbps
    • Gear 3 - 960x540 @ 910 kbps
    • Gear 4 - 1280x720 @ 1 Mbps
    • Gear 5 - 1920x1080 @ 2 Mbps
  • 1 audio-only variant
    • Gear 0 - AAC - 22.05 kHz stereo @ 40 kbps
  • 1 alternate audio
    • alt audio - AAC - 22.05 kHz stereo @ 40 kbps
  • subtitles (WebVTT)
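Because this stream packs all segments into a single .ts file, its media playlists address sub-ranges of that file with EXT-X-BYTERANGE (which requires playlist version 4 or later). A schematic fragment with hypothetical byte offsets:

```
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-TARGETDURATION:10
#EXTINF:9.97667,
#EXT-X-BYTERANGE:326744@0
main.ts
#EXTINF:9.97667,
#EXT-X-BYTERANGE:326156@326744
main.ts
```

Each entry is length@offset, so every segment references the same file at a different position.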
Source: https://developer.apple.com/streaming/examples/

MPEG-DASH sample streams, plus m3u8 streams and HLS test streams

MPEG-DASH sample streams and HLS test streams (or m3u8 streams, as they are often called) are important tools that you should have available throughout your development process.

It’s good to have a variety of streams available when you are testing your adaptive streaming solution to ensure you are covering all aspects of your playback. We have collected the following list of publicly available and free MPEG-DASH and HLS examples, test streams and datasets to help you through your development process:

DASH Industry Forum: MPEG-DASH Test Vectors

These MPEG-DASH examples are provided by the DASH Industry Forum and its members to validate conformance to the DASH-IF’s DASH264 profile. They contain a broad variety of streams: SD-only and HD streams, multichannel audio extensions, negative test vectors, single- and multi-bitrate MPDs, multiple resolutions, multiple audio representations, timed text, multiple periods, encrypted streams including key rotation, dynamic segment offerings and MPD updates, trick modes, and more.

These streams will provide a great set of tools for thoroughly testing your player and environment.

DASH industry forum test streams

HLS .m3u8 streams for testing

HLS test streams (.m3u8 streams) are a little harder to come by due to the nature of the technology. HLS, also known as HTTP Live Streaming, is an HTTP-based protocol implemented by Apple. As it’s not an open standard like MPEG-DASH, it doesn’t have nearly as much community-generated content and resources. With that said, we have collected a small list of streams that we have tested, and prepared a few of our own with various encoding profiles and features, such as subtitles and multi-language options.

  • https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8
  • https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8
  • https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8
  • http://www.streambox.fr/playlists/test_001/stream.m3u8

Bitmovin’s Test Streams from the Most Popular Encoders and Streaming Servers

As part of our demo section, we provide a wide range of test streams from the most popular encoders and streaming servers on the market to ensure compatibility and integration. We also provide test live streams that showcase time-shift/DVR functionality, produced by our Bitmovin encoding service. You can also find High-Frame-Rate (HFR) content, as well as MPEG-CENC compliant Microsoft PlayReady, Apple FairPlay, Adobe Primetime, and Google Widevine content.

You can find the Bitmovin encoding and streaming server demonstration page here. You can also find our DRM test player useful when you start working on your protected content.

University Klagenfurt, ITEC: DASH Dataset

The ITEC DASH Dataset was the first MPEG-DASH dataset available and is consistently updated to track the current status of MPEG-DASH. Today, the ITEC research department has expanded into its own research wing, ATHENA Labs. The DASH dataset contains full-movie-length content in up to 15 different quality representations at resolutions up to 1080p. The content is available in segment lengths from 1 to 15 seconds, which, as discussed in our blog post on the optimal segment length, is an important parameter in adaptive streaming systems.

Telecom ParisTech, GPAC: UHD HEVC DASH Dataset

This is a great UHD HEVC dataset, which includes several sequences shot during the 4Ever project. The DASH sequences provide HEVC encoding ranging from 720p30 @ 2Mbps up to 2160p60 @ 20 Mbps, with one 1080p60 and one 2160p60 in 10 bits.

University Klagenfurt, ITEC: Distributed DASH Dataset

This dataset takes some of the content from the ITEC DASH Dataset and makes it available on different mirrors worldwide. Currently, 8 mirrors are active around the world, making it possible to test multi-origin/multi-CDN, failover, and multi-BaseURL scenarios.
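Multi-BaseURL failover in DASH works by listing several mirrors in the MPD; a player that cannot reach the first URL falls back to the next. A minimal sketch (hostnames are hypothetical, and the rest of the MPD is elided):

```
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <BaseURL>http://mirror-eu.example.com/dataset/</BaseURL>
  <BaseURL>http://mirror-us.example.com/dataset/</BaseURL>
  <BaseURL>http://mirror-asia.example.com/dataset/</BaseURL>
  <!-- Periods, AdaptationSets, and Representations follow -->
</MPD>
```

Relative segment URLs in the manifest are resolved against whichever BaseURL the player is currently using.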

YouTube, MPEG-DASH Sample Media

As previously mentioned, YouTube widely uses MPEG-DASH. The interesting thing here is that YouTube also uses WebM-based MPEG-DASH, so we can find such test streams on their page as well. They also provide DRM-protected content using MPEG-CENC and Widevine.

BBC R&D: MPEG-DASH Test Streams

The BBC is one of the first movers on the MPEG-DASH side and has a very active development team. It also provides some test streams on its R&D website. The BBC has been using MPEG-DASH in production for several years now.

Bento4 sample MPEG-DASH streams

The popular open-source packaging software Bento4 also provides some test streams produced with the software, using multiple qualities of the well-known Tears of Steel movie.

Microsoft Azure Test Streams

Microsoft and the Azure Media Services team are also very active in the DASH field, and support MPEG-DASH in their products such as Azure Media Services, IE11, etc. They also provide different test streams.


Telecom ParisTech, GPAC: DASH Dataset

This is a second MPEG-DASH dataset from the GPAC group at Telecom ParisTech. It includes MPEG-DASH test content using both the ISO Base Media File Format and MPEG2-TS containers. This content is really convenient for testing audio/video sync.

Bitmovin’s On-Demand and Live MPEG-DASH & HLS Demo Streams

With the Bitmovin cloud encoding platform, anyone can easily create MPEG-DASH content within a few minutes. You can try it out by registering on the website: you get 10 encodings / 2.5 GB of free encoding volume per month, which is more than enough for generating a range of test streams. You can also find many test streams on the website, e.g. High-Frame-Rate (HFR) content as well as test live streams.

We hope that this list provides you with a good overview of the available MPEG-DASH example test streams and datasets and helps you in your development and testing phase. Please feel free to send us additional MPEG-DASH test content sources. We are happy to extend the list and let it grow. Please also let us know if one of the sites is no longer online, so that we can remove it from the list.

Encode MPEG-DASH & HLS Content

Encode your content with the same technology as Netflix and YouTube, in a way that plays everywhere with low startup delay and no buffering. The Bitmovin cloud encoding service is not only the fastest in the industry, but it also has a full set of capabilities including DRM, multi-language support, ads, subtitles, 360° video, and much more. Sign up for a free trial and test it today.

This blog post was originally published in April 2015, and was checked and updated with more streams on 10 November 2016.

Thanks and best,
Stefan from the Bitmovin Team

Follow me on Twitter: @slederer
Follow bitmovin on Twitter: @bitmovin


Source: https://bitmovin.com/mpeg-dash-hls-examples-sample-streams/

In the past, Adobe’s Flash video technology was the main method of delivering video via the internet. In recent years, however, there’s been a major shift in the world of online video. Specifically, online video delivered by protocols like HLS streaming and played by HTML5 video players has increasingly replaced Adobe’s Flash protocol.

For both broadcasters and viewers, this is a largely positive change. Why? First, HTML5 and HLS are open specifications, so users can adapt them to their needs, and anyone can access them free of cost. Additionally, these newer HTML5 and HLS streaming protocols are safer, more reliable, and faster than earlier technologies.

For content producers, HTML5 and HLS live streaming technologies also bring some major advantages. However, there are also some notable disadvantages within this realm of content production. In particular, it can take considerable time and effort to replace legacy systems and technologies with new standards, which may not work the same across all streaming platforms. As with all technological innovations, growing pains are inevitable.

To get you up to speed on these changes, we’ve framed this article for both longtime professional broadcasters and newcomers to streaming media. Whether you do live event streaming or you want to stream live from your website, we’ve got you covered. Overall, our focus here is on HLS video streaming.

We’ll also discuss the role of HTML5 video streaming as it relates to HLS. In particular, we’ll cover basic streaming protocol definitions and other streaming protocols, and provide a detailed overview of the main topic of this post: what HLS streaming is and when you should use it.

Please note that we have updated this post to reflect the most current information about HLS streaming for 2021, including new protocols, M3U8, and HTML5 video players.

Table of Contents:

  • What is HLS Streaming?
  • A Basic Breakdown: How Does HLS Work?
  • Technical Overview of HLS Streaming
  • Key Benefits of HLS Streaming
  • Comparing HLS Streaming to Other Video Streaming Protocols
  • Advantages of HLS Over Other Protocols
  • Devices and Browsers that Support HLS
  • When to Use HLS
  • Building an RTMP to HLS Workflow
  • HTML5 Video Streaming with HLS
  • The Future of Live Streaming
  • Conclusion

What is HLS Streaming and How Does it Work?

HLS stands for HTTP Live Streaming. In short, HLS is a media streaming protocol for delivering visual and audio media to viewers over the internet.

Apple first launched the HTTP Live Streaming (HLS) protocol in the summer of 2009, timed to coincide with the debut of the iPhone 3GS. As you may recall, previous iPhone models had experienced many problems with streaming media online. These issues arose, at least in part, because those older devices often switched between Wi-Fi and mobile networks mid-stream.

Prior to the release of HLS, Apple used the QuickTime Streaming Server as its media streaming standard. Though robust, QuickTime used non-standard ports for data transfer, so firewalls often blocked its RTSP traffic.

Combined with slow average internet speeds, these limitations doomed QuickTime Streaming Server. As a result, this early experiment in video streaming technology never reached a wide audience.

That said, HTTP Live Streaming ultimately drew on the lessons learned from creating and rolling out the QuickTime service.

A Basic Breakdown: How Does HLS Work?

We’ve covered the matter-of-fact definition of HLS, but before we move on to an equally technical overview of how this protocol works, we’re going to take a moment to go back to basics.

As we’ve mentioned, HLS is an important protocol for live streaming. The live streaming process that is compatible with the greatest number of devices and browsers looks something like this:

  1. Capturing devices (cameras, microphones, etc.) capture the content.
  2. The content is sent from the capturing device to a live video encoder. 
  3. The encoder transmits the content to the video hosting platform via RTMP.
  4. The video hosting platform uses HLS ingest to transmit the content to an HTML5 video player.

This process requires two main software solutions: a live video HLS encoder and a powerful video hosting platform. If you choose to stream with HLS, you’ll want to make sure that both tools offer the protocols and features we mentioned.

HTML5 video players powered by HLS are great for reaching the largest audience since this duo is practically universal. Dacast is a feature-rich live video streaming solution that includes HLS streaming and a customizable, white-label HTML5 video player.

Want to see the HLS streaming protocol in action? Sign up for your free trial of Dacast to try out HLS ingest and other powerful streaming features.


Technical Overview of HLS Streaming

With that background in mind, how does HLS streaming technology work?

First, the HLS protocol chops up MP4 video content into short (10-second) chunks with the .ts file extension (MPEG2 Transport Stream). Next, an HTTP server stores those streams, and HTTP delivers these short clips to viewers on their devices. 

HLS will play video encoded with the H.264 or HEVC/H.265 codecs.

The HTTP server also creates an M3U8 playlist file (i.e., a manifest file) that serves as an index for the video chunks. That way, even if you choose to broadcast live using only a single quality option, the file will still exist.
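Concretely, a media playlist indexing those .ts chunks is just a text file of floating-point durations and segment URIs. A minimal hypothetical example (segment names and durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.00000,
segment0.ts
#EXTINF:10.00000,
segment1.ts
#EXTINF:9.50000,
segment2.ts
#EXT-X-ENDLIST
```

Each #EXTINF tag gives the duration of the segment named on the following line; #EXT-X-ENDLIST marks the playlist as complete (it is absent during a live broadcast).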

Now, let’s consider how playback quality works with HLS video streaming. With this protocol, a given user’s video player software (like an HTML5 video player) detects deteriorating or improving network conditions. 

If either occurs, the player software first reads the main index playlist and determines which quality video is ideal. Then the software reads the quality-specific index file to determine which chunk of video corresponds to the point at which the viewer is watching. If you’re streaming with Dacast, you can use your M3U8 online player to test your HLS stream. Though this may sound technically complex, the entire process is seamless for the user.
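The bandwidth-matching step described above can be sketched in a few lines: given the variants advertised in the master playlist, pick the highest-bitrate one that fits within the measured throughput. This is a simplified illustration, not any particular player's algorithm; the variant ladder and safety factor are hypothetical.

```python
def select_variant(variants, measured_bps, safety=0.8):
    """Pick the highest-bandwidth variant that fits within a safety
    margin of the measured network throughput."""
    # Variants that fit the available bandwidth, best first
    usable = [v for v in sorted(variants, key=lambda v: -v["bandwidth"])
              if v["bandwidth"] <= measured_bps * safety]
    # Fall back to the lowest-bitrate variant if nothing fits
    return usable[0] if usable else min(variants, key=lambda v: v["bandwidth"])

# Illustrative variant ladder (bits per second)
ladder = [
    {"uri": "gear1.m3u8", "bandwidth": 775_000},
    {"uri": "gear4.m3u8", "bandwidth": 2_500_000},
    {"uri": "gear6.m3u8", "bandwidth": 5_000_000},
]

print(select_variant(ladder, 4_000_000)["uri"])  # 4 Mbps link -> gear4.m3u8
```

A real player re-runs a decision like this continuously as it measures how fast each chunk downloads, which is what produces the seamless quality switching described above.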

Key Benefits of HLS Streaming

One key benefit of this protocol relates to its compatibility features. Unlike other streaming formats, HLS is compatible with a wide range of devices and firewalls. However, latency (or lag time) tends to be in the 15-30 second range with HLS live streams. 

This is certainly an important factor to keep in mind. Note that Dacast now offers an HLS direct low latency streaming feature, which works with any HLS-compatible encoder.

When it comes to quality, versatility makes HLS video streaming stand out from the pack. On the server-side, content creators often have the option to encode the same live stream at multiple quality settings. 

In turn, viewers can dynamically request the best option available, given their specific bandwidth at any given moment. In other words, from chunk to chunk the data quality can differ to fit different streaming device capabilities.

This is best explained with an example. Let’s say, in one moment, you’re sending a full high-definition video. Moments later, a mobile user encounters a “dead zone” in which their quality of service declines. With HLS streaming, this is not an issue. The player will detect this decline in bandwidth and instead deliver lower-quality movie chunks at this time.

HLS also supports closed captions embedded in the video stream. To learn more about the technical aspects of HLS, we recommend the extensive documentation and best practices provided by Apple.

Comparing HLS Streaming to Other Video Streaming Protocols

Several companies have developed a variety of streaming solutions through the use of media streaming protocols. Generally, each of these solutions has represented a new innovation in the field of video streaming. However, similar to the HD-DVD vs. Blu-Ray format wars, or the even older Betamax vs. VHS showdown, industry conflicts can arise.

HLS is currently the best option for streaming media protocols, but it wasn’t always that way—nor will it remain so forever. Let’s review several past and current streaming protocols to better understand the innovations that the HLS streaming protocol offers today.

1. Adobe HTTP Dynamic Streaming (HDS)

Known as Adobe’s next-gen streaming technology, HDS stands for HTTP Dynamic Streaming. This protocol was designed specifically for compatibility with Adobe’s Flash video browser plug-in. Therefore, the overall adoption of HDS is relatively small compared to HLS.

Here at Dacast, we use HDS to deliver some of our VOD (Video On Demand) content. For devices and browsers that do support Flash video, HDS can be a robust choice with lower latency. Like HLS, the HDS protocol splits media files into small chunks. HDS also provides advanced encryption and DRM features. Finally, it uses an advanced keyframe method to ensure that chunks align with one another.

2. Real-Time Messaging Protocol (RTMP)

Macromedia developed RTMP (Real-Time Messaging Protocol) in the mid-2000s. Designed for streaming both audio and video, many know this protocol simply as Flash. Macromedia later merged with Adobe, which now develops RTMP as a semi-open standard.

For much of the past decade, RTMP was the default video streaming method on the internet. But with the recent rise of HLS, we’ve seen a decline in the usage of RTMP. Even today, most streaming video hosting services work with RTMP encoders to ingest live streams via HLS. In other words, broadcasters deliver their streams to their chosen video platform in RTMP stream format. Then, the OVP usually delivers those streams to viewers via HLS, and that includes in-China video hosting, which Dacast now offers.

In recent years, even this legacy use of RTMP streams has begun to fade. More and more CDNs (Content Delivery Networks) are beginning to deprecate RTMP support.

3. Microsoft Smooth Streaming (MSS)

Next up is the streaming protocol MSS (Microsoft Smooth Streaming). As the name implies, it’s Microsoft’s version of a live streaming protocol. Smooth Streaming also uses the adaptive bitrate approach, delivering the best quality available at any given time.

First introduced in 2008, MSS was one of the first adaptive bitrate methods to hit the public realm. In fact, the MSS protocol helped to broadcast the 2008 Summer Olympics that year. The most widely-used MSS platform today is the Xbox One. However, MSS is one of the less popular streaming protocols available today. In almost all cases, HLS should be the default method over this lesser-used approach.

4. Dynamic Adaptive Streaming over HTTP (MPEG-DASH)

Lastly, the newest entry in the streaming protocol format wars is MPEG-DASH. DASH stands for Dynamic Adaptive Streaming over HTTP.

MPEG-DASH comes with several advantages. First of all, it is the first international standard streaming protocol based on HTTP. This feature has helped to quicken the process of widespread adoption. For now, MPEG-DASH is a relatively new protocol and isn’t widely used across the streaming industry. However, like the rest of the industry, we expect MPEG-DASH to become the de facto standard for streaming within a couple of years.

One major advantage of MPEG-DASH is that this protocol is “codec agnostic.” Simply put, this means that the video or media files sent via MPEG-DASH can utilize a variety of encoding formats. These encoding formats include widely supported standards like H.264 (as with HLS streaming protocol), as well as next-gen video formats like HEVC/H.265 and VP10. And like HLS, MPEG-DASH is an adaptive bitrate streaming video method.

For a more detailed comparison, you can also review this blog post on MPEG-DASH versus HLS streaming protocols.

5. Real-Time Streaming Protocol (RTSP)

Real-time streaming protocol, or RTSP for short, is a protocol that helps manage and control live stream content rather than actually transmitting the content. It is considered a “presentation layer protocol.”

It is a fairly old protocol, originally developed in the late 1990s in a collaboration between RealNetworks, Netscape, and Columbia University.

RTSP is known for having extremely low latency, which is certainly a plus.

Unfortunately, this protocol comes with a slew of limitations. For starters, when comparing RTSP vs RTMP, RTSP is neither as compatible nor as adaptable with modern video players and devices. Unlike RTMP, it is not compatible with streaming over HTTP in a web browser, nor is it easy to scale.

Advantages of HLS Video Streaming Over Other Protocols

In the first half of this article, we covered a major advantage of HLS over other protocols in terms of streaming video quality. In particular, broadcasters can deliver streams using the adaptive bitrate process supported by HLS. That way, each viewer can receive the best quality stream for their internet connection at any given moment.

This protocol includes a number of other key benefits, including embedded closed captions, synchronized playback of multiple streams, good advertising standards support, DRM support, and more.

The takeaway here for broadcasters? For now and at least the shorter-term future, HLS is the definitive default standard for live streaming content.

Devices and Browsers that Support HLS

The HLS streaming protocol is also widely supported across multiple devices and browsers. 

Originally limited to iOS devices like iPhones, iPads, and the iPod Touch, HLS is now supported by the following devices and browsers:

  • All Google Chrome browsers
  • Safari
  • Microsoft Edge
  • iOS devices
  • Android devices 
  • Linux devices
  • Microsoft devices
  • macOS platforms 

At this point, HLS is a nearly universal protocol.

When to Use HLS Streaming?

Currently, we recommend that broadcasters adopt the HLS streaming protocol all of the time. It is the most up-to-date and widely used protocol for media streaming. In this Video Streaming Latency Report, for example, 45% of broadcasters reported using HLS streaming.

RTMP came in second with 33% of broadcasters using that alternative. And MPEG-DASH trailed behind even further, used by only 7% of broadcasters.

1. Streaming to Mobile Devices

HLS is mandatory for streaming to mobile devices and tablets. Given that mobile devices made up over 48% of all internet traffic in 2020, HLS is essential for reaching these users.

2. Streaming with an HTML5 Video Player

Remember, native HTML5 video doesn’t support RTMP or HDS. Therefore, if you want to use a purely HTML5 video player, HLS is the only choice.

Along with reaching mobile devices, these considerations point towards HLS as the default standard. If you’re stuck using Flash technology for the moment, RTMP will be a better delivery method—but only if you have no other option.

3. Latency with HLS Streaming

HLS streaming does have one disadvantage, which we mentioned above. Namely, it has a relatively higher latency than some other protocols. This means that HLS streams are not quite as “live” as the term live streaming suggests. 

Generally, with HLS viewers can experience delays of up to 30 seconds (or more, in some cases). That said, this isn’t a problem for most broadcasters. The vast majority of live streams can handle that kind of delay without causing any sort of user dissatisfaction.

One protocol that works well to reduce latency with HLS video streaming is Low-Latency CMAF for DASH. This protocol works with the content delivery network and HTML5 video player to carry the weight where HLS streaming is lacking.

Building an RTMP to HLS Workflow

So we’ve covered what HLS is, how it works, and when to use it. We’ve also looked at alternative streaming protocols from the past and present. Now, let’s talk through how to build an RTMP Ingest to HLS workflow.

If you’re using a streaming service like Dacast, you need to build a workflow that begins with RTMP. This is much simpler than it sounds: you just need to configure your hardware or software encoder to deliver an RTMP stream to the Dacast servers. Most encoders default to RTMP, and quite a few only support that standard.

For Dacast users, our CDN partners then ingest the RTMP stream and automatically rebroadcast it via both HLS and RTMP. From there, viewers default to the best-supported method on their own devices.

Using HLS is relatively straightforward with a professional, full-service OVP. On Dacast, all live streams default to HLS delivery. On computers that support Flash, we do fall back on RTMP/Flash in order to reduce latency. However, HLS is supported automatically on every Dacast live stream and is used on almost all devices.

As we discussed above, HLS streaming is delivered through an M3U8 file. This file is a kind of playlist that contains references to the location of media files. On a local machine, these would consist of file paths. For live streaming on the internet, that M3U8 file would contain a URL (the one on which your stream is being delivered).
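Reading such a playlist programmatically is straightforward, since M3U8 is plain text: lines starting with # are tags, and everything else is a segment (or variant playlist) URI. A minimal, hypothetical parser sketch, ignoring all tags except the segment durations:

```python
def parse_media_playlist(text):
    """Extract (duration, uri) pairs from an HLS media playlist."""
    segments, duration = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:9.97667," -> 9.97667
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            # A non-tag line is the URI for the preceding #EXTINF
            segments.append((duration, line))
            duration = None
    return segments

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
seg0.ts
#EXTINF:9.5,
seg1.ts
#EXT-X-ENDLIST"""

print(parse_media_playlist(playlist))  # [(10.0, 'seg0.ts'), (9.5, 'seg1.ts')]
```

A production player would also handle master playlists, relative URL resolution, and the many other tags in the spec, but the underlying format really is this simple.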

Another relevant process to quickly note is transmuxing. Transmuxing is the process that repackages content files without distorting the content itself. This allows the content to flow more easily between software via the RTMP and HLS protocols.

HTML5 Video Streaming with HLS

As mentioned above, the HLS protocol has become the go-to approach for streaming content with HTML5 video players. If you’re not familiar with HTML5 video streaming, it’s one of the three main approaches to video streaming today.

With HTML5, the content-hosting website uses native HTTP to stream the media directly to viewers. Content tags (e.g., <video> tag) are included as part of the HTML code. 

As a result, the <video> tag creates a native HTML5 video player within your browser. These tags tell the browser what to do with the content delivered over HTTP (here, via HLS): render the video in the player, play the audio, and display any text tracks.
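In browsers with native HLS support (such as Safari), pointing a <video> tag at an M3U8 URL is enough; other browsers typically layer a JavaScript player library on top. A minimal hypothetical embed (the stream URL is a placeholder):

```
<video controls width="640">
  <source src="https://example.com/live/stream.m3u8" type="application/x-mpegURL">
  Your browser does not support HTML5 video.
</video>
```

The type attribute advertises the HLS MIME type so the browser can decide whether it can play the source natively.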

Like HLS, HTML5 is customizable for broadcasters and free for viewers. You can check out our related post on optimizing HTML5 video players with HLS to learn more.

We’ve also written extensively about the transition from Flash-based video (usually delivered via RTMP) to HTML5 video (usually delivered using HLS). Check out our “Flash is Dead” RTMP-focused blog post for more on that subject, including why it’s important to use an HTML5 video player.

If you’re streaming over Dacast, you’re already using a fully compatible HTML5 video player. Content delivered via Dacast defaults to HTML5 delivery. However, it will use Flash as a backup method if HTML5 is not supported on a given device or browser. This means that even older devices with Flash will have no problem playing your content through your Dacast account.

Of course, some broadcasters may prefer to use a custom video player. Luckily, it’s quite simple to embed your HLS stream within any video player. For example, if you’re using JW Player, simply insert the M3U8 reference URL into the code for your video player.

Another note about using HLS and an HTML5 video player with Dacast is that Dacast uses the THEOplayer. THEOplayer is a universal video player that can be embedded in websites, mobile apps, and pretty much any platform that you can think of.

As we’ve mentioned before, compatibility is key when choosing video players and protocols since you want to be able to reach the greatest number of people.

The Future of Live Streaming

Before wrapping things up, let’s recap our discussion of some of the advantages of the HLS streaming protocol. First, there is no special infrastructure required to deliver HLS content. In fact, any standard web server or CDN will function well. Additionally, firewalls are much less likely to block content using HLS.

In terms of technical functionality, HLS will play video encoded with the H.264 or HEVC/H.265 codecs. It then chops the video into 10-second segments. Remember, latency for delivery tends to be in the 30-second range. However, Dacast now has a solution for low-latency HLS live streaming that reduces latency to 10 seconds or less.
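The ~30-second figure follows directly from segment length: HLS players typically buffer around three target durations before starting playback, so 10-second segments put the delay near 30 seconds before encoding and delivery overhead. As a back-of-the-envelope check (the three-segment buffer is a rule of thumb, not a fixed constant):

```python
def approx_latency(segment_seconds, buffered_segments=3):
    """Rough HLS latency: segments buffered before playback begins."""
    return segment_seconds * buffered_segments

print(approx_latency(10))  # classic 10 s segments -> ~30 s
print(approx_latency(2))   # shorter segments -> ~6 s
```

This is also why low-latency approaches shrink or subdivide segments rather than changing the delivery model.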

The HLS protocol also includes several other built-in features. For example, HLS is an adaptive bitrate streaming protocol. This means that the client device and server dynamically detect the internet speed of the user, and then adjust video quality in response.

Other beneficial HLS features include support for embedded closed captions, synchronized playback of multiple streams, advertising standards (i.e., VPAID and VAST), DRM, and more.

While HLS is the current gold standard for live streaming, it won’t stay that way indefinitely. We expect MPEG-DASH to become increasingly popular in the coming years. 

As that shift takes place, we’ll see other changes as well, such as the transition from H.264 encoding to H.265/HEVC. This new compression standard produces much smaller file sizes, making 4K live streaming a real possibility.

However, that time isn’t here yet. For now, it’s important to stick with the established standards to reach as many users as possible on their devices. In other words, HLS is the streaming protocol of the present.


Today, HLS is widely supported, high-quality, and robust. All streamers should be familiar with the protocol, even if they don’t understand all of the technical details. This is true for all kinds of streaming, including live broadcasting over the Dacast live streaming platform.

Our goal in this article was to introduce you to the HLS protocol for streaming media. We’ve discussed what HLS is, how it works, and when to use it, as well as how it compares to other streaming protocols out there. After reading, we hope you have a solid foundation in HLS streaming technology and its future.

You can do your first HLS live stream today with our video streaming solution. If you’re ready to try it today, you can take advantage of our free 30-day trial. No credit card is required.


And for exclusive offers and regular live streaming tips, you’re also invited to join our LinkedIn group.

Finally, do you have further questions, thoughts, or feedback about this article? We’d love to hear from you in the comments below, and we will get back to you. Thanks for reading!

Please note that this post was originally written by Max Wilbert. It was revised in 2021 by Emily Krings to include the most up-to-date information. Emily is a strategic content writer and storyteller. She specializes in helping businesses create blog content that connects with their audience.


Max Wilbert

Max Wilbert is a passionate writer, live streaming practitioner, and has strong expertise in the video streaming industry.

Source: https://www.dacast.com/blog/hls-streaming-protocol/
What is HTTP Live Streaming (HLS)?

Introduction to HTTP Live Streaming: HLS on Android and More

Video streaming is an integral part of the modern internet experience. It’s everywhere: on mobile phones, desktop computers, TVs, and even wearables. It needs to work flawlessly on every device and network type, be it on slow mobile connections, WiFi, behind firewalls, etc. Apple’s HTTP Live Streaming (HLS) was created exactly with these challenges in mind.

Almost all modern devices come with hardware fast enough to play video, so network speed and reliability emerge as the biggest problems. Why is that? Until a few years ago, the canonical way of storing and publishing video was to use UDP-based protocols like RTP. This proved problematic in many ways, to list just a few:

  1. You need a server (daemon) service to stream content.
  2. Most firewalls are configured to allow only standard ports and network traffic types, such as HTTP, email, etc.
  3. If your audience is global, you need a copy of your streaming daemon service running in all major regions.

Of course, you may think all these problems are easy to solve. Just store video files (for example, MP4 files) on your HTTP server and use your favourite CDN service to serve them anywhere in the world.

Where Legacy Video Streaming Falls Short

This is far from the best solution, for a few reasons, efficiency being one of them. If you store original video files in full resolution, users in rural areas or parts of the world with poor connectivity will have a hard time enjoying them. Their video players will struggle to download enough data to play it back in real time.

Therefore, you need a special version of the file so that the amount of video downloaded is approximately the same as the amount that can be played. For example, if the video resolution and quality are such that in five seconds the player can download another five seconds of video, that’s optimal. However, if it takes five seconds to download just three seconds’ worth of video, the player will stop and wait for the next chunk of the stream to download.

On the other hand, reducing quality and resolution any further would only degrade the user experience on faster connections, as you’d be saving bandwidth unnecessarily. However, there is a third way.

Adaptive Bitrate Streaming

While you could upload different versions of the video for different users, you’d then need the ability to control their players and calculate which stream is best for their connection and device. Then, the player needs to switch between them (for example, when a user switches from 3G to WiFi). And even then, what if the network type changes? The player must then switch to a different video, but it must start playing not from the beginning, but somewhere in the middle of the video. So how do you calculate the byte range to request?

A cool thing would be if video players could detect changes in network type and available bandwidth, and then switch transparently between different streams (of the same video prepared for different speeds) until it finds the best one.

That’s exactly the problem adaptive bitrate streaming solves.
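As a sketch of the client-side decision (hypothetical names and variant list; real players also factor in buffer health and screen size), picking a variant from the measured throughput might look like:

```python
def pick_variant(measured_bps, variants):
    """Pick the variant with the highest bandwidth the measured
    throughput can sustain; fall back to the lowest-bitrate variant."""
    affordable = [v for v in variants if v["bandwidth"] <= measured_bps]
    if affordable:
        return max(affordable, key=lambda v: v["bandwidth"])
    return min(variants, key=lambda v: v["bandwidth"])

# Hypothetical variant list, as a master playlist would describe it.
variants = [
    {"uri": "480x270.m3u8", "bandwidth": 775_000},
    {"uri": "640x360.m3u8", "bandwidth": 1_200_000},
    {"uri": "1280x720.m3u8", "bandwidth": 3_500_000},
]
print(pick_variant(2_000_000, variants)["uri"])  # 640x360.m3u8
print(pick_variant(100_000, variants)["uri"])    # 480x270.m3u8
```

The player repeats this calculation as its bandwidth estimate changes, which is what produces the transparent switching described above.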

Note: This HLS tutorial will not cover encryption, synchronized playback, or IMSC1.

What is HLS?

HTTP Live Streaming is an adaptive bitrate streaming protocol introduced by Apple in 2009. It uses m3u8 files to describe media streams and HTTP for communication between the server and the client. It is the default media streaming protocol for all iOS devices, but it can also be used on Android and in web browsers.

HTTP Live Streaming cover illustration

The basic building blocks of an HLS stream are:

  1. M3U8 playlists
  2. Media files for various streams

M3U8 Playlists

Let’s start by answering a basic question: What are M3U8 files?

M3U (or M3U8) is a plain text file format originally created to organize collections of MP3 files. The format is extended for HLS, where it’s used to define media streams. In HLS there are two kinds of m3u8 files:

  • Media playlist: contains URLs of the files needed for streaming (i.e. chunks of the original video to be played).
  • Master playlist: contains URLs to media playlists which, in turn, contain variants of the same video prepared for different bandwidths.

A so-called M3U8 live stream URL is nothing more than a URL to an M3U8 file, such as: https://s3-us-west-2.amazonaws.com/hls-playground/hls.m3u8.

Sample M3U8 File for HLS Stream

An M3U8 file contains a list of URLs or local file paths, with some additional metadata. Metadata lines start with #.

This example illustrates what an M3U8 file for a simple HLS stream looks like:
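A representative media playlist (segment names hypothetical; the first duration taken from the chunk discussed below) might look like:

```m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:11
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.344822,
chunk-0.ts
#EXTINF:10.0,
chunk-1.ts
#EXTINF:8.52,
chunk-2.ts
#EXT-X-ENDLIST
```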

  • The first four lines are global (header) metadata for this M3U8 playlist.
  • The #EXT-X-VERSION tag is the version of the M3U8 format (it must be at least 3 if we want to use floating-point #EXTINF durations).
  • The #EXT-X-TARGETDURATION tag contains the maximum duration of each video “chunk”. Typically, this value is around 10 s.
  • The rest of the document contains pairs of lines such as:

An #EXTINF duration followed by a file reference. Each such pair is a video “chunk”; this one represents a chunk which is exactly 10.344822 seconds long. When a client video player needs to start a video from a certain point, it can easily calculate which file it needs to request by adding up the durations of the previously viewed chunks. The second line of the pair can be a local filename or a URL to that file.
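That “add up the durations” seek logic can be sketched as follows (durations taken from a hypothetical playlist):

```python
def segment_for(seek_seconds, durations):
    """Walk the #EXTINF durations; return (segment_index, offset_within)
    for the chunk that contains seek_seconds."""
    start = 0.0
    for i, duration in enumerate(durations):
        if seek_seconds < start + duration:
            return i, seek_seconds - start
        start += duration
    return len(durations) - 1, durations[-1]  # clamp past the end

# Per-segment durations from a hypothetical media playlist.
durations = [10.0, 10.0, 10.344822, 9.2]
print(segment_for(25.0, durations))  # (2, 5.0)
```

So to start playback 25 seconds in, the player requests the third segment and begins decoding 5 seconds into it.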

The M3U8 file, together with its .ts files, represents the simplest form of an HLS stream – a media playlist. You can open a simple example here.

Please keep in mind that not every browser can play HLS streams by default.

Master Playlist or Index M3U8 File

The previous M3U8 example points to a series of chunks. They are created from the original video file, which is resized, encoded, and split into chunks.

That means we still have the problem outlined in the introduction – what about clients on very slow (or unusually fast) networks? Or, clients on fast networks with very small screen sizes? It makes no sense to stream a file in maximum resolution if it can’t be shown in all its glory on your shiny new phone.

M3U8 in HLS

HLS solves this problem by introducing another “layer” of M3U8. This M3U8 file does not contain pointers to media files; instead, it contains pointers to other M3U8 files which, in turn, list video files prepared in advance for specific bitrates and resolutions.

Here is an example of such an M3U8 file:
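As an illustration (URIs and the additional bandwidths are hypothetical; the first variant matches the one discussed below):

```m3u8
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1296000,RESOLUTION=640x360
640x360_1200.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2164000,RESOLUTION=960x540
960x540_2000.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=764000,RESOLUTION=416x234
416x234_700.m3u8
```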

The video player will pick pairs of lines, each consisting of an #EXT-X-STREAM-INF tag followed by the URL of a variant playlist.

These are called variants of the same video, prepared for different network speeds and screen resolutions. The M3U8 file for the 640x360 variant contains the video chunks resized to 640x360 pixels and prepared for a bitrate of 1296 kbps. Note that the reported bitrate must take into account both the video and audio streams in the video.

The video player will usually start playing from the first stream variant (in the previous example this is 640x360_1200.m3u8). For that reason, you must take special care to decide which variant will be the first in the list. The order of the other variants isn’t important.

If the first .ts file takes too long to download (causing “buffering”, i.e. waiting for the next chunk), the video player will switch to a stream with a smaller bitrate. And, of course, if it loads fast enough, the player can switch to a better-quality variant, but only if that makes sense for the resolution of the display.

If the first stream in the index M3U8 list isn’t the best one, the client will need one or two cycles until it settles on the right variant.

So, now we have three layers of HLS:

  1. The index M3U8 file (the master playlist), containing pointers (URLs) to variants.
  2. Variant M3U8 files (media playlists) for different streams, for different screen sizes and network speeds. They contain pointers (URLs) to .ts files.
  3. .ts files (chunks), which are binary files with parts of the video.

You can watch an example index M3U8 file here (again, it depends on your browser/OS).

Sometimes, you know in advance that the client is on a slow or fast network. In that case you can help the client choose the right variant by providing an index M3U8 file with a different first variant. There are two ways of doing this.

  • The first is to have multiple index files prepared for different network types and prepare the client in advance to request the right one. The client has to check the network type and then request the corresponding index file.
  • You can also make sure the client sends the network type as part of the HTTP request (for example, whether it’s connected to WiFi, or mobile 2G/3G/…) and then generate the index M3U8 file dynamically for each request. Only the index M3U8 file needs a dynamic version; the individual streams (variant M3U8 files) can still be stored as static files.

Preparing Video Files for HLS

There are two important building blocks of Apple’s HTTP Live Streaming. One is the way video files are stored (to be served via HTTP later); the other is the M3U8 index file(s), which tell the player (the streaming client app) where to get each video file.

Let’s start with video files. The HLS protocol expects video files to be stored in smaller chunks of equal length, typically 10 seconds each. Originally, those chunks had to be stored as MPEG-2 TS files (.ts), encoded with the H.264 codec, with audio in MP3, HE-AAC, or AC-3.

HLS Video

That means a 30-second video will be split into three smaller files, each approximately 10 s long.

Note that the latest version of HLS also allows fragmented .mp4 files. Since this is still a new feature and some video players have yet to implement it, the examples in this article use .ts files.


Chunks must be encoded with a keyframe at the start of each file. Each video consists of frames. Frames are images, but video formats don’t store complete images, as that would take too much disk space; they encode just the difference from the previous frame. When you jump to a point in the middle of the video, the player needs a “starting point” from which to apply all those diffs in order to reconstruct the initial image, and then start playing the video.

That’s why chunks must have a keyframe at the start. Sometimes players need to start in the middle of a chunk. The player can always calculate the current image by applying all the “diffs” from the first keyframe, but if it starts 9 seconds in, it needs to compute 9 seconds of diffs. To make that computation faster, it is best to create keyframes every few seconds (ideally about every 3 seconds).

HLS Break Points

There are situations when you want multiple video clips played in succession. One way to do that is to merge the original video files, and then create the HLS streams with that file, but that’s problematic for multiple reasons. What if you want to show an ad before or after your video? Maybe you don’t want to do that for all users, and probably you want different ads for different users. And, of course, you don’t want to prepare HLS files with different ads in advance.

In order to fix that problem, there is the #EXT-X-DISCONTINUITY tag, which can be used in the m3u8 playlist. This line tells the video player to prepare in advance for the fact that, from this point on, the files may have been created with a different configuration (for example, the resolution may change). The player will need to recalculate everything, and possibly switch to another variant, so it needs to be prepared for such “discontinuity” points.
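The ad-insertion scenario can be sketched in a media playlist like this (filenames hypothetical):

```m3u8
#EXTINF:10.0,
movie-0041.ts
#EXT-X-DISCONTINUITY
#EXTINF:10.0,
ad-0001.ts
#EXTINF:10.0,
ad-0002.ts
#EXT-X-DISCONTINUITY
#EXTINF:10.0,
movie-0042.ts
```

Each #EXT-X-DISCONTINUITY marks a boundary where encoding parameters (resolution, framerate, timestamps) may change.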

Live Streaming With HLS

There are basically two kinds of video streaming. One is Video On Demand (VOD), for videos recorded in advance and streamed to users whenever they choose. The other is live streaming. Even though HLS stands for HTTP Live Streaming, everything explained so far has been centered around VOD, but there is a way to do live streaming with HLS, too.

This requires a few changes in your M3U8 files. First, there must be an #EXT-X-MEDIA-SEQUENCE tag in the variant M3U8 file. Then, the M3U8 file must not end with #EXT-X-ENDLIST (which otherwise must always be placed at the end).

While you record your stream, you will constantly get new segment files. You need to append them to the M3U8 playlist, and each time you add a new one, the counter in the #EXT-X-MEDIA-SEQUENCE tag must be increased by 1.

The video player will keep checking that counter. If it has changed since the last reload, the player knows there are new chunks to download and play. Make sure the M3U8 file is served with no-cache headers, because clients will keep reloading it while waiting for new chunks.
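That reload-and-diff loop can be sketched in Python (playlist contents hypothetical):

```python
def new_segments(playlist_text, last_seen_seq):
    """Parse a live media playlist; return (media_sequence, fresh_uris),
    where fresh_uris are segments the player has not fetched yet."""
    media_seq = 0
    uris = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            media_seq = int(line.split(":", 1)[1])
        elif line and not line.startswith("#"):
            uris.append(line)  # segment i has sequence number media_seq + i
    fresh = [u for i, u in enumerate(uris) if media_seq + i > last_seen_seq]
    return media_seq, fresh

live_playlist = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:10.0,
seg120.ts
#EXTINF:10.0,
seg121.ts
#EXTINF:10.0,
seg122.ts
"""
seq, fresh = new_segments(live_playlist, last_seen_seq=120)
print(seq, fresh)  # 120 ['seg121.ts', 'seg122.ts']
```

A real player would re-fetch the playlist every target duration and feed the fresh segment URIs to its download queue.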


Another interesting feature of HLS streams is that you can embed Web Video Text Track (VTT) files in them. VTT files can be used for various purposes. For example, a web HLS player can specify image snapshots for various parts of the video: when the user moves the mouse over the timeline area (below the video player), the player can show a snapshot from that position in the video.

Another obvious use for VTT files is subtitles. An HLS stream can specify multiple subtitle tracks for multiple languages: the master playlist declares each subtitle rendition, each rendition points to a subtitle media playlist, and that playlist in turn references the actual VTT files.
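As a sketch (group names, URIs, and bandwidth are hypothetical), a master playlist declaring two subtitle renditions could look like:

```m3u8
#EXTM3U
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,URI="subs_en.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="Deutsch",LANGUAGE="de",URI="subs_de.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=1296000,RESOLUTION=640x360,SUBTITLES="subs"
640x360_1200.m3u8
```

Each subs_*.m3u8 is itself a media playlist whose entries are .vtt files rather than .ts segments.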

In addition to VTT files, Apple recently announced that HLS will support IMSC1, a new subtitle format optimized for streaming delivery. Its most important advantage is that it can be styled using CSS.

HTTP Live Streaming Tools and Potential Issues

Apple introduced a number of useful HLS tools, which are described in greater detail in the official HLS guide.

  • For live streams, Apple provides a tool named mediastreamsegmenter to create segment files on the fly from an ongoing video stream.
  • Another important tool is mediastreamvalidator. It checks your M3U8 playlists, downloads the video files, and reports various problems, for example when the reported bitrate is not the same as the one calculated from the .ts files.
  • Of course, when you must encode/decode/mux/demux/chunk/strip/merge/join/… video/audio files, there is ffmpeg. Be ready to compile your own custom version(s) of ffmpeg for specific use cases.

One of the most frequent problems in video is audio synchronization. If the audio in some of your HLS streams is out of sync with the video (i.e. an actor opens their mouth, but the voice arrives a few milliseconds early or late), it is possible that the original video was filmed with a variable framerate. Make sure to convert it to a constant framerate.

If possible, it’s even better to make sure that your software is set to record video at a constant framerate.

HTTP Live Streaming Example

I prepared an HLS Android application which plays a predefined HLS stream using Google’s ExoPlayer. It shows a video and, below it, a list of HLS “events”: every .ts file downloaded, and each time the player decides to switch to a higher or lower bitrate stream.

Let’s go through the main parts of the viewer initialization. In the first step we’ll retrieve the device’s current connection type and use that information to decide which file to retrieve.

Note that this isn’t strictly necessary. The HLS player will always adjust to the right HLS variant after a few chunks, but it means that for the first 5-20 seconds the user might not be watching the ideal variant of the stream.

Remember, the first variant in the index M3U8 file is the one the viewer will start with. Since we’re on the client side and can detect the connection type, we can at least try to avoid the player’s initial switching between variants by requesting an index file prepared in advance for this connection type.

In the next step, we initialize and start our HLS player:

Then we prepare the player and feed it with the right m3u8 for this network connection type:

And here is the result:

HLS streaming example in Android

HLS Browser Compatibility, Future Developments

There is a requirement from Apple that video streaming apps on iOS must use HLS for videos longer than 10 minutes or larger than 5 MB. That in itself is a guarantee that HLS is here to stay. There were some worries about HLS versus MPEG-DASH and which one would win in the web browser arena. HLS isn’t implemented in all modern browsers (you probably noticed this if you clicked the previous m3u8 URL examples). On Android versions below 4.0, for example, it won’t work at all. From 4.1 to 4.4 it works only partially (for example, the audio is missing, or the video is missing but the audio works).

But this “battle” recently got simpler. Apple announced that the new HLS protocol will allow fragmented mp4 files (fMP4). Previously, if you wanted to support both HLS and MPEG-DASH, you had to encode your videos twice. Now you can reuse the same video files and repackage only the metadata files (.m3u8 for HLS and .mpd for MPEG-DASH).

Another recent announcement is support for the High Efficiency Video Codec (HEVC). If used, it must be packaged in fragmented mp4 files. That probably means the future of HLS is fMP4.

The current situation in the world of browsers is that only some implementations of the HTML5 video tag will play HLS out of the box. But there are open-source and commercial solutions that offer HLS compatibility. Most of them rely on a Flash fallback, but a few implementations are written entirely in JavaScript.

Wrapping Up

This article focuses specifically on HTTP Live Streaming, but conceptually it can also be read as an explanation of how Adaptive Bitrate Streaming (ABS) works. In conclusion, we can say HLS is a technology that solves numerous important problems in video streaming:

  • It simplifies the storage of video files
  • It works with standard CDNs
  • Client players handle different client bandwidths and switch between streams
  • Subtitles, encryption, synchronized playback, and other features not covered in this article

Regardless of whether you end up using HLS or MPEG-DASH, both protocols should offer similar functionalities and, with the introduction of fragmented mp4 (fMP4) in HLS, you can use the same video files. That means that in most cases you’ll need to understand the basics of both protocols. Luckily, they seem to be moving in the same direction, which should make them easier to master.

Understanding the basics

The M3U (or M3U8) is a plain text file format originally created to organize collections of MP3 files. The format is extended for HLS, where it’s used to define media streams.

HTTP Live Streaming (HLS) is an adaptive bitrate streaming protocol introduced by Apple in 2009. It uses m3u8 files to describe media streams and HTTP for communication between the server and the client. It is the default media streaming protocol for all iOS devices.

MPEG-DASH is a widely used streaming solution, built around HTTP just like Apple HLS. DASH stands for Dynamic Adaptive Streaming over HTTP.

In recent years, HLS support has been added to most browsers. However, differences persist. For example, Chrome and Firefox feature only partial support on desktop platforms.

Source: https://www.toptal.com/apple/introduction-to-http-live-streaming-hls

Sample hls streaming

Here is a list of free HLS m3u8 test URLs for testing OTT HLS (m3u8) Video Players (including Big Buck Bunny, Sintel, Tears of Steel, and m3u8 URLs from Akamai, Dolby, Azure, Unified Streaming).

HLS m3u8 URLs

When you are working with HLS (either writing a packager, or writing an HLS-compliant player, or you just want to see how HLS works), it is always convenient to have a few sample HLS m3u8 URLs to test against – right? I always find myself googling for such URLs, so I have made a list of them right here for you!

If you are interested in HLS (HTTP Live Streaming), then these articles will interest you –

  1. What is HLS or HTTP Live Streaming?
  2. How to package for HLS using FFmpeg?
  3. List of HLS or m3u8 video players
  4. HLS vs. MPEG-DASH

Here is the list of m3u8 URLs –

  1. Tears of Steel m3u8
  2. Big Buck Bunny VOD m3u8
  3. Sintel VOD m3u8
  4. Sample from Apple
  5. fMP4 m3u8
  6. Live Akamai m3u8
  7. Live Akamai m3u8
  8. Dolby VOD m3u8
     – this is served over HTTP, so it might result in media errors. To avoid this, reload your HLS player using HTTP (less secure, but, you can playback this stream!)
  9. Dolby Multichannel m3u8
     – this is served over HTTP, so it might result in media errors. To avoid this, reload your HLS player using HTTP (less secure, but, you can playback this stream!)
  10. Dolby Multilanguage m3u8
     – this is served over HTTP, so it might result in media errors. To avoid this, reload your HLS player using HTTP (less secure, but, you can playback this stream!)
  11. Dolby Vision Profile 5 – Dolby Atmos m3u8
  12. Azure HLSv4 m3u8
     .. again, served over HTTP.
  13. Azure HLSv4 m3u8
     served over HTTP.
  14. Azure 4K HLSv4 m3u8
     served over HTTP.

Credit to Apple, Akamai, Unified Streaming, Dolby, Azure for creating and hosting these HLS m3u8 URLs. If you have any more HLS m3u8 URLs that you can add to this list, please add it in the comments section, and I will add them to the list. Also, if any of the streams are not working, let me know. Thanks!


Krishna Rao Vijayanagar


I’m Dr. Krishna Rao Vijayanagar, and I have worked on Video Compression (AVC, HEVC, MultiView Plus Depth), ABR streaming, and Video Analytics (QoE, Content & Audience, and Ad) for several years.

I hope to use my experience and love for video streaming to bring you information and insights into the OTT universe.

Source: https://ottverse.com/free-hls-m3u8-test-urls/
Dual or more audio example in hls streaming

HTTP Live Streaming Examples

This repository contains a collection of samples showcasing HTTP Live Streaming.

What's HTTP Live Streaming?

HLS lets you deploy content using ordinary web servers and content delivery networks. HLS is designed for reliability and dynamically adapts to network conditions by optimizing playback for the available speed of wired and wireless connections.

Manifest – M3U8

Apple HTTP Live Streaming (HLS) uses an M3U8 playlist as its manifest. Typically, a variant of a stream is that stream at a specific quality, i.e. a specific bitrate and/or resolution. Variant playlists are structured so that there is one root M3U8 that references other M3U8s, each describing an individual variant (quality).

Samples Overview

It’s good to have a variety of streams available when testing your adaptive streaming solution, to ensure you are covering all aspects of playback. Below you will find a list of publicly available HLS examples, test streams, and datasets to help you through your development process.

This repository contains a few categories of samples:

Apiary API streaming list mock

A REST service that you can use whenever you need some fake data. It's great for faking a server, sharing code examples, testing your MVP.



If you have questions about any aspect of this collection, please feel free to open an issue.

Source: https://github.com/adimango/http-live-streaming-examples

Similar news:

HTTP Live Streaming

Client support (columns: Client, Platform, Live Streaming, DRM, As of Version, Editor):

• Safari (web browser) — macOS, iOS. Live streaming: Yes; DRM: Yes; 6.0+. Full HLS support. Editor: Apple.
• Microsoft Edge (web browser) — Windows 10. Native support on Edge Legacy (EdgeHTML, versions 12 to 18); support via Media Source Extensions on Edge Chromium, with no native support from version 79 to present.[30] Live streaming: Yes. Editor: Microsoft.
• Google Chrome / Chromium (web browser) — Windows, macOS, Linux, Android, iOS. OS-dependent native support on Android/iOS; support via Media Source Extensions on other OS. Editor: Google.
• Firefox (web browser) — Windows, macOS, Linux, Android, iOS. OS-dependent support on Android/iOS; Media Source Extensions on other OS. DRM: Yes; 50.0+ for Android[31] and 57.0 for others,[32] with 59.0 adding enhanced support for Android.[33] Editor: Mozilla.
• QuickTime Player (media player) — macOS. Live streaming: Yes; DRM: Yes; 10.0+. Full HLS support. Editor: Apple.
• iTunes (music player) — Windows, macOS. Live streaming: Yes; DRM: Yes; 10.1+.[34] Full HLS support. To play an HLS stream, go to File > Open Stream and replace "http://" with "itls://" (for video streams) or "itals://" (for audio streams) in the stream URL. Editor: Apple.
• StreamS HiFi Radio (radio player) — iOS, tvOS (iPhone, iPad, Apple TV). Plays internet radio streams; 100% compliant HLS audio: AAC-LC/HE-AAC/xHE-AAC in 2.0 stereo and 5.1–7.1 surround, as ES (elementary stream ADTS) or fMP4 (fragmented ISO MP4); displays synchronous realtime metadata and graphics. Editor: StreamS/Modulation Index LLC.
• VLC media player — Windows, macOS, Linux, Android, iOS, Windows Phone. Live streaming: Yes; DRM: unknown. VLC 2.x[35] has partial support up to HLS version 3 (otherwise loads as an M3U playlist, i.e. a sequence of individual chunks);[36] VLC 3.0 has full HLS support. Editor: VideoLAN.
• Media Player Classic Home Cinema (media player) — Windows. Live streaming: Yes; DRM: Yes. Editors: Gabest, Doom9 forum users.
• PotPlayer (media player) — Windows. Live streaming: Yes; DRM: Yes. Editor: Daum Communications.
• MPlayer / SMPlayer / mpv (media players) — Windows, macOS, Linux, BSD. Live streaming: Yes; DRM: Yes. Editor: Ricardo Villalba.
• GOM Player (media player) — Windows. Live streaming: Yes; DRM: Yes. Editor: Gretech.
• Cameleon (live video streaming software) — Windows, macOS. Live streaming: Yes; DRM: unknown. Editor: Yatko.
• Audacious (music player) — Windows, Linux. Live streaming: Yes; DRM: Yes. Editor: Audacious.
• Radio Tray (radio player) — Linux. Live streaming: Yes; DRM: Yes. Editor: Carlos Ribeiro.
• Kodi (home entertainment application) — Windows, macOS, Linux, Android, iOS. Live streaming: Yes; DRM: partial; 12.0 Alpha 5 and later (DRM support requires a monthly/nightly build). Editor: XBMC Foundation.
• MythTV (home entertainment application) — Windows, macOS, Linux, FreeBSD. Live streaming: Yes; DRM: Yes; 0.26. Editor: MythTV.
• JRiver Media Center (home entertainment application) — Windows, macOS. Live streaming: Yes; DRM: Yes. Editor: JRiver.
• XiiaLive (radio player) — Android, iOS. Live streaming: Yes; DRM: Yes; 3.0+. Plays internet radio streams (audio only). Editor: Visual Blasters LLC.
• TuneIn Radio (radio player) — Android, iOS. Live streaming: Yes; DRM: Yes; 3.3+. Plays internet radio streams (audio only). Editor: TuneIn.
• myTuner Radio (radio player) — Android, iOS, Windows Phone, Windows 8, macOS. Live streaming: Yes; DRM: Yes. Plays internet radio streams (audio only). Editor: AppGeneration Software.
• Internet Radio Player (radio player) — Android. Live streaming: Yes; DRM: Yes. Plays internet radio streams (audio only). Editor: MuserTech.
• GuguRadio (radio player) — iOS. Live streaming: Yes; DRM: Yes. Plays internet radio streams (audio only). Editor: Leon Fan.
• AIMP (media player) — Windows, Android. Live streaming: Yes; DRM: unknown; 4.10+ (build 1827). Plays internet radio streams (audio only). Editor: Artem Izmaylov.
• Mini Stream Player (media player) — Android. Live streaming: Yes; DRM: Yes. Editor: JogiApp.
• MX Player (media player) — Android. Live streaming: Yes; DRM: Yes. Editor: J2 Interactive.
• TV Streams (media player) — macOS, iOS, tvOS. Live streaming: Yes; DRM: Yes; v7.1. Editor: Tiago Martinho.
• HP TouchPad — WebOS. Live streaming: Yes; DRM: Yes; 3.0.5. Editor: HP.
• Amino x4x STB — Amino set-top boxes. Live streaming: Yes; DRM: Yes; 2.5.2 Aminet. Editor: Aminocom.com.
• Dune HD TV — Dune HD set-top boxes. Live streaming: Yes; DRM: Yes; TV Series. Editor: dunehd.com.
• CTU Systems Ltd Eludo Play Out System. Live streaming: Yes; DRM: Yes; TV Series. Editor: ctusystems.com.
• nangu.TV — Motorola set-top boxes. Live streaming: Yes; DRM: Yes; 2.0. Editor: nangu.TV.
• Roku Digital Video Player — Roku set-top boxes. Live streaming: Yes; DRM: Yes; Roku OS / SDK 2.6. Editor: Roku.
• Telebreeze Player — HTML, Android, iOS, Windows, macOS, Roku, MAG Infomir, Samsung Tizen, LG WebOS, Google Chromecast, tvOS, Amazon Fire TV, Android TV. Live streaming: Yes; DRM: Yes. Editor: Telebreeze.
• bitdash (SDK) — HTML5 or Flash, web and mobile. Live streaming: Yes; DRM: Yes; version 3.0+. Editor: bitmovin.
• 3ivx (SDK) — Windows 8, Windows Phone 8[37] and Xbox One.[38] Live streaming: Yes; DRM: Yes; 2.0. Editor: 3ivx.
• THEOplayer[39] — HTML5, SDK (Android, iOS, Android TV, tvOS, Chromecast, WebOS, FireTV, Tizen). Live streaming: Yes; DRM: Yes. Editor: THEO Technologies.
• Viblast Player (SDK) — HTML5, iOS, Android. Live streaming: Yes; DRM: partial. Editor: Viblast Ltd.
• Flowplayer (SDK) — Adobe Flash, iOS, Android, HTML5 (hlsjs plugin). Live streaming: Yes; DRM: Yes. The Flash HLS plugin is available from GitHub. Editor: Flowplayer Ltd.
• JW Player (SDK) — Adobe Flash, iOS, Android, HTML5. Live streaming: Yes; DRM: Yes. HLS is provided in all JW Player versions as of JW8 (latest). Editor: JW Player.
• Radiant Media Player (SDK) — Adobe Flash, HTML5. Live streaming: Yes; DRM: Yes; 1.5.0.[40] Editor: Radiant Media Player.
• Yospace (SDK) — Adobe Flash. Live streaming: Yes; DRM: Yes; 2.1. Editor: Yospace.
• Onlinelib (SDK) — Adobe Flash. Live streaming: Yes; DRM: Yes; 2.0. Editor: Onlinelib.de.
• VODOBOX HLS Player (online service) — Adobe Flash, HTML5, iOS, Android. Live streaming: Yes; DRM: Yes. Editor: Vodobox.
• NexPlayer (SDK) — HTML5 (MSE browsers), Android (mobile, TV, STB), iOS, Chromecast, Windows, Mac, Linux, Tizen, WebOS. Live streaming: Yes; DRM: Yes. Editor: NexStreaming.
• ffplay/avplay (multimedia framework). Live streaming: Yes; DRM: partial. Editor: FFmpeg/Libav.
• GPAC (multimedia framework). Live streaming: Yes; DRM: No; 0.5.0. Editor: Telecom ParisTech.
• QuickPlayer (SDK) — Android, iOS, Windows 7, 8, 8.1, and 10. Live streaming: Yes; DRM: Yes. Editor: Squadeo.
• hls.js (MSE) — MSE browsers. Live streaming: Yes; DRM: unknown. Editor: Dailymotion, open source.[41][42]
• hasplayer.js (MSE) — MSE browsers. Live streaming: Yes; DRM: unknown. Open source.[43]
• Hola Player (video player) — HTML5, Adobe Flash, iOS, Android. Live streaming: Yes; DRM: Yes; all versions. Editor: Hola Ltd, open source.[44]
• Shaka Player (SDK) — HTML5 (MSE browsers). Live streaming: coming soon; DRM: partial; 2.1. Open source.[45][46]
• Fluid Player (video player) — HTML5 (MSE browsers). Live streaming: Yes; DRM: Yes; 2.2.0+. Editor: Fluid Player OSS.[47][48][49]
• Video.js — MSE browsers, with a Flash fallback via the flashls source handler. Live streaming: Yes; DRM: Yes. Open source.

Source: https://en.wikipedia.org/wiki/HTTP_Live_Streaming
