
    Adaptive Video Streaming With Dash.js In React

    March 27, 2025

    I was recently tasked with creating video reels that needed to be played smoothly under a slow network or on low-end devices. I started with the native HTML5 <video> tag but quickly hit a wall — it just doesn’t cut it when connections are slow or devices are underpowered.

    After some research, I found that adaptive bitrate streaming was the solution I needed. But here’s the frustrating part: finding a comprehensive, beginner-friendly guide was surprisingly difficult. The resources on MDN and other websites were helpful but lacked the end-to-end tutorial I was looking for.

    That’s why I’m writing this article: to provide you with the step-by-step guide I wish I had found. I’ll bridge the gap between writing FFmpeg scripts, encoding video files, and implementing the DASH-compatible video player (Dash.js) with code examples you can follow.

    Going Beyond The Native HTML5 <video> Tag

    You might be wondering why you can’t simply rely on the HTML <video> element. There’s a good reason for that. Let’s compare how a native <video> element behaves with how adaptive video streaming works in browsers.

    Progressive Download

    With progressive downloading, your browser downloads the video file linearly from the server over HTTP and starts playback as soon as it has buffered enough data. This is the default behavior of the <video> element.

    <video src="rabbit320.mp4" />
    

    When you play the video, check your browser’s network tab, and you’ll see multiple requests with the 206 Partial Content status code.

    It uses HTTP 206 Range Requests to fetch the video file in chunks. The server sends specific byte ranges of the video to your browser. When you seek, the browser will make more range requests asking for new byte ranges (e.g., “Give me bytes 1,000,000–2,000,000”).

    In other words, it doesn’t fetch the entire file all at once. Instead, it delivers partial byte ranges from the single MP4 video file on demand. This is still considered a progressive download because only a single file is fetched over HTTP — there is no bandwidth or quality adaptation.
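
    To make this concrete, here is a small, illustrative sketch of the kind of request the browser issues under the hood (run in an async context). The URL and byte range are placeholders; the exact ranges you’ll see in your network tab depend on the file and the browser.

    // Illustrative only: fetch a specific byte range of the MP4, the way the browser
    // does during progressive download. The URL and range are hypothetical.
    const response = await fetch('https://example.com/rabbit320.mp4', {
      headers: { Range: 'bytes=1000000-2000000' },
    })

    console.log(response.status) // 206 if the server honors range requests, 200 otherwise
    console.log(response.headers.get('Content-Range')) // e.g. "bytes 1000000-2000000/<total size>"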

    If the server or browser doesn’t support range requests, the entire video file will be downloaded in a single request, returning a 200 OK status code. In that case, the video can only begin playing once the entire file has finished downloading.

    The problem? If you’re on a slow connection trying to watch high-resolution video, you’ll be waiting a long time before playback starts, and the quality never adapts to your available bandwidth.

    Adaptive Bitrate Streaming

    Instead of serving a single video file, adaptive bitrate (ABR) streaming splits the video into multiple segments at different bitrates and resolutions. During playback, the ABR algorithm automatically selects the highest-quality segment that can be downloaded in time for smooth playback, based on your network bandwidth and device capabilities, and it keeps adjusting throughout playback as conditions change.

    This magic happens through two key browser technologies:

    • Media Source Extensions (MSE)
      It allows a MediaSource object to be attached to the src attribute of <video> (as a blob URL), so the player can feed it video segments through SourceBuffer objects.
    <video src="blob:https://example.com/6e31fe2a-a0a8-43f9-b415-73dc02985892" />
    • Media Capabilities API
      It provides information on your device’s video decoding and encoding abilities, enabling ABR to make informed decisions about which resolution to deliver.

    Together, they enable the core functionality of ABR, serving video chunks optimized for your specific device limitations in real time.
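
    To make these two APIs a little more tangible, here is a minimal, illustrative sketch of both. Dash.js handles all of this for you internally; the segment URL and codec strings below are assumptions made for the sake of the example.

    // Minimal sketch of MSE + the Media Capabilities API (Dash.js does this internally).
    // Run in an async context; the segment URL is hypothetical.
    const video = document.querySelector('video')

    // Ask the device whether it can decode a given rendition smoothly.
    const decodingInfo = await navigator.mediaCapabilities.decodingInfo({
      type: 'media-source',
      video: {
        contentType: 'video/webm; codecs="vp9"',
        width: 576,
        height: 1024,
        bitrate: 1500000,
        framerate: 30,
      },
    })
    console.log(decodingInfo) // { supported, smooth, powerEfficient }

    // Attach a MediaSource to the <video> element; this is what produces the blob: URL above.
    const mediaSource = new MediaSource()
    video.src = URL.createObjectURL(mediaSource)

    mediaSource.addEventListener('sourceopen', async () => {
      const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9"')
      const segment = await fetch('https://example.com/segments/seg-001.webm') // hypothetical
      sourceBuffer.appendBuffer(await segment.arrayBuffer())
    })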

    Streaming Protocols: MPEG-DASH Vs. HLS

    As mentioned above, to stream media adaptively, a video is split into chunks at different quality levels across various time points. We need to facilitate the process of switching between these segments adaptively in real time. To achieve this, ABR streaming relies on specific protocols. The two most common ABR protocols are:

    • MPEG-DASH,
    • HTTP Live Streaming (HLS).

    Both of these protocols utilize HTTP to send video files. Hence, they are compatible with HTTP web servers.

    This article focuses on MPEG-DASH. However, it’s worth noting that DASH isn’t supported by Apple devices or browsers, as mentioned in Mux’s article.

    MPEG-DASH

    MPEG-DASH enables adaptive streaming through:

    • A Media Presentation Description (MPD) file
      This XML manifest file contains information on how to select and manage streams based on adaptive rules.
    • Segmented Media Files
      Video and audio files are divided into segments at different resolutions and durations using MPEG-DASH-compliant codecs and formats.

    On the client side, a DASH-compliant video player reads the MPD file and continuously monitors network bandwidth. Based on available bandwidth, the player selects the appropriate bitrate and requests the corresponding video chunk. This process repeats throughout playback, ensuring smooth, optimal quality.
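
    As a rough mental model (not the actual Dash.js algorithm, which is considerably more sophisticated), the selection step boils down to something like this:

    // Simplified, illustrative ABR decision: pick the highest bitrate that fits within a
    // safety margin of the measured throughput. This is not the real Dash.js rule set.
    function pickRepresentation(representations, measuredKbps, safetyFactor = 0.8) {
      const affordable = representations
        .filter((r) => r.bitrateKbps <= measuredKbps * safetyFactor)
        .sort((a, b) => b.bitrateKbps - a.bitrateKbps)

      // Fall back to the lowest-bitrate rendition if nothing fits.
      return affordable[0] ?? [...representations].sort((a, b) => a.bitrateKbps - b.bitrateKbps)[0]
    }

    // With the three renditions generated later in this article and ~1.5 Mbps of throughput:
    pickRepresentation(
      [
        { name: '576x1024', bitrateKbps: 1500 },
        { name: '480x854', bitrateKbps: 1000 },
        { name: '360x640', bitrateKbps: 750 },
      ],
      1500
    ) // → the 480x854 rendition (budget: 1500 × 0.8 = 1200 kbps)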

    Now that you understand the fundamentals, let’s build our adaptive video player!

    Steps To Build An Adaptive Bitrate Streaming Video Player

    Here’s the plan:

    1. Transcode the MP4 video into audio and video renditions at different resolutions and bitrates with FFmpeg.
    2. Generate an MPD file with FFmpeg.
    3. Serve the output files from the server.
    4. Build the DASH-compatible video player to play the video.

    Install FFmpeg

    For macOS users, install FFmpeg with Homebrew by running the following command in your terminal:

    brew install ffmpeg
    

    For other operating systems, please refer to FFmpeg’s documentation.

    Generate Audio Rendition

    Next, run the following script to extract the audio track and encode it in WebM format for DASH compatibility:

    ffmpeg -i "input_video.mp4" -vn -acodec libvorbis -ab 128k "audio.webm"
    
    • -i "input_video.mp4": Specifies the input video file.
    • -vn: Disables the video stream (audio-only output).
    • -acodec libvorbis: Uses the libvorbis codec to encode audio.
    • -ab 128k: Sets the audio bitrate to 128 kbps.
    • "audio.webm": Specifies the output audio file in WebM format.

    Generate Video Renditions

    Run this script to create three video renditions with varying resolutions and bitrates. The largest rendition should match the resolution of the input video. For example, if the input video is 576×1024 at 30 frames per second (fps), the script generates renditions optimized for vertical video playback.

    ffmpeg -i "input_video.mp4" -c:v libvpx-vp9 -keyint_min 150 -g 150 
    -tile-columns 4 -frame-parallel 1 -f webm 
    -an -vf scale=576:1024 -b:v 1500k "input_video_576x1024_1500k.webm" 
    -an -vf scale=480:854 -b:v 1000k "input_video_480x854_1000k.webm" 
    -an -vf scale=360:640 -b:v 750k "input_video_360x640_750k.webm"
    
    • -c:v libvpx-vp9: Uses libvpx-vp9, the VP9 video encoder for WebM.
    • -keyint_min 150 and -g 150: Set a 150-frame keyframe interval (approximately every 5 seconds at 30 fps). This allows bitrate switching every 5 seconds.
    • -tile-columns 4 and -frame-parallel 1: Optimize encoding performance through parallel processing.
    • -f webm: Specifies the output format as WebM.

    In each rendition:

    • -an: Excludes audio (video-only output).
    • -vf scale=576:1024: Scales the video to a resolution of 576×1024 pixels.
    • -b:v 1500k: Sets the video bitrate to 1500 kbps.

    WebM is chosen as the output format because its files are smaller and well-optimized, yet still widely compatible with most web browsers.
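
    If you want to sanity-check that your target browsers can actually play this combination of container and codecs, a quick check in the browser console is enough (the codec strings must match what you encoded):

    // Quick feature check for the WebM/VP9 video and Vorbis audio renditions produced above.
    console.log(MediaSource.isTypeSupported('video/webm; codecs="vp9"')) // video renditions
    console.log(MediaSource.isTypeSupported('audio/webm; codecs="vorbis"')) // audio rendition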

    Generate MPD Manifest File

    Combine the video renditions and audio track into a DASH-compliant MPD manifest file by running the following script:

    ffmpeg \
      -f webm_dash_manifest -i "input_video_576x1024_1500k.webm" \
      -f webm_dash_manifest -i "input_video_480x854_1000k.webm" \
      -f webm_dash_manifest -i "input_video_360x640_750k.webm" \
      -f webm_dash_manifest -i "audio.webm" \
      -c copy \
      -map 0 -map 1 -map 2 -map 3 \
      -f webm_dash_manifest \
      -adaptation_sets "id=0,streams=0,1,2 id=1,streams=3" \
      "input_video_manifest.mpd"
    
    • -f webm_dash_manifest -i "…": Specifies the inputs so that the DASH video player can switch between them dynamically based on network conditions.
    • -map 0 -map 1 -map 2 -map 3: Includes all video (0, 1, 2) and audio (3) in the final manifest.
    • -adaptation_sets: Groups streams into adaptation sets:
      • id=0,streams=0,1,2: Groups the video renditions into a single adaptation set.
      • id=1,streams=3: Assigns the audio track to a separate adaptation set.

    The resulting MPD file (input_video_manifest.mpd) describes the streams and enables adaptive bitrate streaming in MPEG-DASH.

    <?xml version="1.0" encoding="UTF-8"?>
    <MPD
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns="urn:mpeg:DASH:schema:MPD:2011"
      xsi:schemaLocation="urn:mpeg:DASH:schema:MPD:2011"
      type="static"
      mediaPresentationDuration="PT81.166S"
      minBufferTime="PT1S"
      profiles="urn:mpeg:dash:profile:webm-on-demand:2012">
    
      <Period id="0" start="PT0S" duration="PT81.166S">
        <AdaptationSet
          id="0"
          mimeType="video/webm"
          codecs="vp9"
          lang="eng"
          bitstreamSwitching="true"
          subsegmentAlignment="false"
          subsegmentStartsWithSAP="1">
    
          <Representation id="0" bandwidth="1647920" width="576" height="1024">
            <BaseURL>input_video_576x1024_1500k.webm</BaseURL>
            <SegmentBase indexRange="16931581-16931910">
              <Initialization range="0-645" />
            </SegmentBase>
          </Representation>
    
          <Representation id="1" bandwidth="1126977" width="480" height="854">
            <BaseURL>input_video_480x854_1000k.webm</BaseURL>
            <SegmentBase indexRange="11583599-11583986">
              <Initialization range="0-645" />
            </SegmentBase>
          </Representation>
    
          <Representation id="2" bandwidth="843267" width="360" height="640">
            <BaseURL>input_video_360x640_750k.webm</BaseURL>
            <SegmentBase indexRange="8668326-8668713">
              <Initialization range="0-645" />
            </SegmentBase>
          </Representation>
    
        </AdaptationSet>
    
        <AdaptationSet
          id="1"
          mimeType="audio/webm"
          codecs="vorbis"
          lang="eng"
          audioSamplingRate="44100"
          bitstreamSwitching="true"
          subsegmentAlignment="true"
          subsegmentStartsWithSAP="1">
    
          <Representation id="3" bandwidth="89219">
            <BaseURL>audio.webm</BaseURL>
            <SegmentBase indexRange="921727-922055">
              <Initialization range="0-4889" />
            </SegmentBase>
          </Representation>
    
        </AdaptationSet>
      </Period>
    </MPD>
    

    After completing these steps, you’ll have:

    1. Three video renditions (576x1024, 480x854, 360x640),
    2. One audio track, and
    3. An MPD manifest file.
    input_video.mp4
    audio.webm
    input_video_576x1024_1500k.webm
    input_video_480x854_1000k.webm
    input_video_360x640_750k.webm
    input_video_manifest.mpd
    

    The original video input_video.mp4 should also be kept to serve as a fallback video source later.

    Serve The Output Files

    These output files can now be uploaded to cloud storage (e.g., AWS S3 or Cloudflare R2) for playback. While they can be served directly from a local folder, I highly recommend storing them in cloud storage and leveraging a CDN to cache the assets for better performance. Both AWS and Cloudflare support HTTP range requests out of the box.
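
    If you’d like to test everything locally before uploading, a small static file server is enough. Here is a minimal sketch that assumes Express is installed; the folder name and port are arbitrary, and the CORS header is only needed if the React app runs on a different origin.

    // Minimal local server for the generated DASH assets (development only).
    import express from 'express'
    import path from 'node:path'

    const app = express()

    app.use(
      express.static(path.resolve('./dash-output'), { // hypothetical output folder
        setHeaders: (res, filePath) => {
          if (filePath.endsWith('.mpd')) res.setHeader('Content-Type', 'application/dash+xml')
          if (filePath.endsWith('.webm')) res.setHeader('Content-Type', 'video/webm')
          res.setHeader('Access-Control-Allow-Origin', '*') // let the React app fetch across origins
        },
      })
    )

    app.listen(8080, () => console.log('Serving DASH assets at http://localhost:8080'))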

    Building The DASH-Compatible Video Player In React

    There’s nothing like a real-world example to help understand how everything works. There are different ways we can implement a DASH-compatible video player, but I’ll focus on an approach using React.

    First, install the Dash.js npm package by running:

    npm i dashjs
    

    Next, create a component called <DashVideoPlayer /> and initialize the Dash MediaPlayer instance by pointing it to the MPD file when the component mounts.

    The ref callback runs when the component mounts. Within it, playerRef.current is set to the actual Dash MediaPlayer instance and bound to event listeners. We also include the original MP4 URL in the <source> element as a fallback for browsers that don’t support MPEG-DASH.

    If you’re using the Next.js App Router, remember to add the ‘use client’ directive at the top of the file, since the video player can only be initialized on the client side.

    Here is the full example:

    import dashjs from 'dashjs'
    import { useCallback, useRef } from 'react'
    
    export const DashVideoPlayer = () => {
      const playerRef = useRef()
    
      const callbackRef = useCallback((node) => {
        if (node !== null) {
          playerRef.current = dashjs.MediaPlayer().create()

          playerRef.current.initialize(node, "https://example.com/uri/to/input_video_manifest.mpd", false)

          playerRef.current.on('canPlay', () => {
            // the video is ready to play
          })

          playerRef.current.on('error', (e) => {
            // handle error
          })

          playerRef.current.on('playbackStarted', () => {
            // handle playback started
          })

          playerRef.current.on('playbackPaused', () => {
            // handle playback paused
          })

          playerRef.current.on('playbackWaiting', () => {
            // handle playback buffering
          })
        }
      }, [])

      return (
        <video ref={callbackRef} width={310} height={548} controls>
          <source src="https://example.com/uri/to/input_video.mp4" type="video/mp4" />
          Your browser does not support the video tag.
        </video>
      )
    }

    Result

    Observe how the video rendition changes when network throttling is adjusted from Fast 4G to 3G in Chrome DevTools. The player switches from 480p to 360p, showing how the experience adapts to the available bandwidth.
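
    If you’d rather confirm the switches programmatically than eyeball the network tab, you can listen for Dash.js quality-change events. This is a small sketch that reuses playerRef from the component above; the event constant exists in recent Dash.js releases, but its payload shape varies between versions, so log the whole event object and check the documentation for the version you install.

    // Optional: log rendition switches while throttling the network in DevTools.
    playerRef.current.on(dashjs.MediaPlayer.events.QUALITY_CHANGE_RENDERED, (e) => {
      console.log('Rendition switch:', e)
    })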

    Conclusion

    That’s it! We just implemented a working DASH-compatible video player in React that delivers video with adaptive bitrate streaming. Again, the benefits of this are rooted in performance. When we adopt ABR streaming, we’re requesting the video in smaller chunks, allowing for more immediate playback than we’d get if we needed to fully download the video file first. And we’ve done it in a way that supports multiple versions of the same video, allowing us to serve the best format for the user’s device.

    References

    • “HTTP Range Request And MP4 Video Play In Browser,” Zeng Xu
    • “Setting up adaptive streaming media sources,” Mozilla Developer Network (MDN)
    • “DASH Adaptive Streaming for HTML video,” Mozilla Developer Network (MDN)
