Rendering using SSR APIs
The NPM package @remotion/renderer
provides you with "Server-Side Rendering" APIs for rendering media programmatically.
These functions can be used in Node.js and Bun.
Rendering a video takes three steps:
1. Create a Remotion Bundle
2. Select the composition to render and calculate its metadata
3. Render the video, audio, still or image sequence.

Example script
Follow this commented example to see how to render a video:
render.mjs
```tsx
import { bundle } from "@remotion/bundler";
import { renderMedia, selectComposition } from "@remotion/renderer";
import path from "path";

// The composition you want to render
const compositionId = "HelloWorld";

// You only have to create a bundle once, and you may reuse it
// for multiple renders that you can parametrize using input props.
const bundleLocation = await bundle({
  entryPoint: path.resolve("./src/index.ts"),
  // If you have a Webpack override, make sure to add it here
  webpackOverride: (config) => config,
});

// Parametrize the video by passing props to your component.
const inputProps = {
  foo: "bar",
};

// Get the composition you want to render. Pass `inputProps` if you
// want to customize the duration or other metadata.
const composition = await selectComposition({
  serveUrl: bundleLocation,
  id: compositionId,
  inputProps,
});

// Render the video. Pass the same `inputProps` again
// if your video is parametrized with data.
await renderMedia({
  composition,
  serveUrl: bundleLocation,
  codec: "h264",
  outputLocation: `out/${compositionId}.mp4`,
  inputProps,
});

console.log("Render done!");
```
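You can run the script with Node.js (for example `node render.mjs`) or with Bun; once it finishes, the rendered file is written to `out/HelloWorld.mp4`.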
This flow is customizable. Click on one of the SSR APIs to read about its options:
- getCompositions() - Get a list of available compositions from a Remotion project.
- selectComposition() - Select a composition to render and calculate its metadata.
- renderMedia() - Render a video or audio.
- renderFrames() - Render an image sequence.
- renderStill() - Render a still image.
- stitchFramesToVideo() - Encode a video based on an image sequence.
- openBrowser() - Share a browser instance across function calls for improved performance (see the sketch after this list).
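For example, if you render several stills or videos in a row, you can open one browser with openBrowser() and pass it to the other APIs instead of letting each call launch its own. The following is a minimal sketch, not part of the example above: it assumes the same project layout (entry point `./src/index.ts`, composition "HelloWorld") and that your Remotion version accepts the `puppeteerInstance` option on these calls.

```tsx
// Sketch: share one browser instance across multiple SSR API calls.
// Assumes the same project layout as the example script above.
import { bundle } from "@remotion/bundler";
import { openBrowser, renderStill, selectComposition } from "@remotion/renderer";
import path from "path";

// Bundle once, exactly as in the example script.
const bundleLocation = await bundle({
  entryPoint: path.resolve("./src/index.ts"),
});

// Open a single headless browser and reuse it for every call below.
const browser = await openBrowser("chrome");

const composition = await selectComposition({
  serveUrl: bundleLocation,
  id: "HelloWorld",
  puppeteerInstance: browser,
});

// Render one frame as a still image, reusing the shared browser.
await renderStill({
  composition,
  serveUrl: bundleLocation,
  output: "out/HelloWorld.png",
  frame: 0,
  puppeteerInstance: browser,
});

// Close the shared instance when you are done - see the openBrowser()
// documentation for the recommended way in your Remotion version.
```

Reusing the instance avoids paying the browser startup cost on every call, which matters most when rendering many items in a loop.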
Linux Dependencies
If you are on Linux, Chrome Headless Shell requires some shared libraries to be installed. See Linux Dependencies.
SSR APIs in Next.js
If you are using Next.js, you will not be able to use @remotion/bundler
because of the limitations explained in "Can I render videos in Next.js?". Refer to that page for possible alternatives.
We recommend Lambda for use in Next.js.
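As a rough illustration, a render triggered from a Next.js App Router route with Remotion Lambda could look like the sketch below. It is not taken from this page: the region, function name, and serve URL are placeholders for values from your own Lambda deployment, and it reuses the "HelloWorld" composition from the example script.

```tsx
// app/api/render/route.ts - sketch only; replace the placeholders with
// values from your own Remotion Lambda deployment.
import { renderMediaOnLambda } from "@remotion/lambda/client";

export async function POST(request: Request) {
  // Input props sent by the client to parametrize the video.
  const inputProps = await request.json();

  const { renderId, bucketName } = await renderMediaOnLambda({
    region: "us-east-1", // placeholder: region your function is deployed in
    functionName: "remotion-render-example", // placeholder: your deployed function
    serveUrl: "https://example.com/remotion-site", // placeholder: your deployed site
    composition: "HelloWorld",
    codec: "h264",
    inputProps,
  });

  // Poll getRenderProgress() with renderId and bucketName to track the render.
  return Response.json({ renderId, bucketName });
}
```

This keeps the heavyweight rendering work out of the Next.js server process, which is why Lambda is the recommended path there.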