renderMediaOnVercel()
Available from v4.0.426
Experimental package: We reserve the right to make breaking changes in order to correct bad design decisions until this notice is gone.
Renders a video inside a Vercel Sandbox.
The rendered file stays inside the sandbox. Use uploadToVercelBlob() to upload it to Vercel Blob.
Example
route.ts
const {sandboxFilePath} = await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {title: 'Hello World'},
  onProgress: async (update) => {
    console.log(`Overall: ${Math.round(update.overallProgress * 100)}%`);
    if (update.stage === 'render-progress') {
      console.log(`Rendering: ${Math.round(update.progress.progress * 100)}%`);
    }
  },
});
Arguments
An object with the following properties:
sandbox
A Sandbox instance.
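For illustration, a sandbox can be created with the Vercel Sandbox SDK before calling renderMediaOnVercel(). The snippet below is a minimal sketch assuming the @vercel/sandbox package; the creation options are omitted because they depend on your Vercel Sandbox setup, so consult the Vercel Sandbox documentation.

import {Sandbox} from '@vercel/sandbox';

// Assumed: Sandbox.create() from @vercel/sandbox. Configure resources and
// timeouts according to the Vercel Sandbox documentation for your project.
const sandbox = await Sandbox.create();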
compositionId
The ID of the Remotion composition to render.
inputProps
Props to pass to the composition.
codec?
string Codec
Choose a suitable codec for your output media. Refer to the Encoding guide to find the best codec for your use case. Default: "h264".
outputFile?
The output file path inside the sandbox. Default: "/tmp/video.mp4".
crf?
No matter which codec you end up using, there's always a tradeoff between file size and video quality. You can control it by setting the CRF (Constant Rate Factor). The lower the number, the better the quality; the higher the number, the smaller the file, at the cost of quality.
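For example, building on the call from the example above, a render that trades a larger file for higher quality might look like this (the CRF value 18 is only illustrative):

const {sandboxFilePath} = await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  codec: 'h264',
  crf: 18, // lower value = better quality, larger file
});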
imageFormat?
The image format to use when rendering frames for a video. Must be one of "png", "jpeg", "none". Default: "jpeg". JPEG is faster, but does not support transparency.
pixelFormat?
Sets the pixel format in FFmpeg. See the FFmpeg docs for an explanation. Acceptable values: "yuv420p", "yuva420p", "yuv422p", "yuv444p", "yuv420p10le", "yuv422p10le", "yuv444p10le", "yuva444p10le".
envVariables?
Record<string, string>
An object containing environment variables to be injected in your project.
frameRange?
number | [number, number] | [number, null] FrameRange
Specify a single frame (passing a number) or a range of frames (passing a tuple [number, number]) to be rendered. By passing null (default) all frames of a composition get rendered.
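For instance, reusing the example composition from above, rendering only the first 30 frames could look like this:

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  frameRange: [0, 29], // frames 0 through 29 inclusive
  // frameRange: 50 would render only frame 50
});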
everyNthFrame?
This option may only be set when rendering GIFs. It determines how many frames are rendered, while the other ones get skipped in order to lower the FPS of the GIF. For example, if the fps is 30, and everyNthFrame is 2, the FPS of the GIF is 15.
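A hypothetical GIF render using this option, based on the example above (the output path is illustrative):

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  codec: 'gif',
  outputFile: '/tmp/animation.gif', // path inside the sandbox
  everyNthFrame: 2, // with a 30 fps composition, the resulting GIF plays at 15 fps
});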
proResProfile?
Set the ProRes profile. This option is only valid if the codec has been set to prores. Possible values: "4444-xq", "4444", "hq", "standard", "light", "proxy". Default: "hq". See here for an explanation of possible values.
chromiumOptions?
Allows you to set certain Chromium / Google Chrome flags. See: Chromium flags.
scale?
Scales the output dimensions by a factor. For example, a 1280x720px frame will become a 1920x1080px frame with a scale factor of 1.5. See Scaling for more details.
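As a sketch, upscaling the example composition by a factor of 1.5:

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  scale: 1.5, // e.g. a 1280x720 composition is rendered at 1920x1080
});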
preferLossless?
Uses a lossless audio codec, if one is available for the codec. If you set audioCodec, it takes priority over preferLossless.
enforceAudioTrack?
Render a silent audio track if there would be none otherwise.
disallowParallelEncoding?
Disallows the renderer from rendering frames and encoding at the same time. This makes the rendering process more memory-efficient, but possibly slower.
concurrency?
How many CPU threads to use. Minimum 1. The maximum is the number of threads you have (in Node.js: os.cpus().length). You can also provide a percentage value (e.g. 50%).
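Both forms can be sketched as follows; the values are illustrative and should be tuned to the resources available in your sandbox:

import os from 'node:os';

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  concurrency: Math.max(1, os.cpus().length - 1), // or a percentage string such as '50%'
});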
metadata?
An object containing metadata to be embedded in the video. See here for which metadata is accepted.
logLevel?
One of trace, verbose, info, warn, error. Determines how much info is being logged to the console. Default: info.
timeoutInMilliseconds?
A number describing how long the render may take to resolve all delayRender() calls before it times out. Default: 30000
videoBitrate?
Specify the target bitrate for the generated video. The syntax for FFmpeg's -b:v parameter should be used. FFmpeg may encode the video in a way that will not result in the exact video bitrate specified. Example values: 512K for 512 kbps, 1M for 1 Mbps.
audioBitrate?
Specify the target bitrate for the generated audio. The syntax for FFmpeg's -b:a parameter should be used. FFmpeg may encode the audio in a way that will not result in the exact audio bitrate specified. Example values: 512K for 512 kbps, 1M for 1 Mbps. Default: 320k.
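A sketch combining both bitrate options, using the example values mentioned above:

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  videoBitrate: '1M',   // target roughly 1 Mbps video
  audioBitrate: '512K', // target roughly 512 kbps audio
});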
audioCodec?
Set the format of the audio that is embedded in the video. Not all codec and audio codec combinations are supported and certain combinations require a certain file extension and container format. See the table in the docs to see possible combinations.
encodingMaxRate?
The value for the -maxrate flag of FFmpeg. Should be used in conjunction with the encoding buffer size flag.
encodingBufferSize?
The value for the -bufsize flag of FFmpeg. Should be used in conjunction with the encoding max rate flag.
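For example, a constrained-bitrate sketch might pair the two flags like this (the 2:1 buffer-to-maxrate ratio is a common FFmpeg convention, not a Remotion requirement):

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  videoBitrate: '1M',
  encodingMaxRate: '1M',    // cap the instantaneous bitrate
  encodingBufferSize: '2M', // rate-control buffer used to enforce the cap
});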
muted?
The audio of the video will be omitted.
numberOfGifLoops?
Allows you to set the number of loops as follows:
null (or omitting in the CLI) plays the GIF indefinitely.
0 disables looping.
1 loops the GIF once (plays twice in total).
2 loops the GIF twice (plays three times in total), and so on.
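Continuing the GIF sketch from everyNthFrame above:

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  codec: 'gif',
  outputFile: '/tmp/animation.gif',
  numberOfGifLoops: 2, // plays three times in total
});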
x264Preset?
Sets an x264 preset profile. Only applies to videos rendered with the h264 codec. Possible values: superfast, veryfast, faster, fast, medium, slow, slower, veryslow, placebo. Default: medium.
colorSpace?
Color space to use for the video. Acceptable values: "default" (default since 5.0), "bt601" (same as "default", since v4.0.424), "bt709" (since v4.0.28), "bt2020-ncl" (since v4.0.88), "bt2020-cl" (since v4.0.88). For best color accuracy, it is recommended to also use "png" as the image format to have accurate color transformations throughout. Only since v4.0.83 is color space conversion actually performed; previously, only the metadata of the video would be tagged.
jpegQuality?
Sets the quality of the generated JPEG images. Must be an integer between 0 and 100. Default: 80.
forSeamlessAacConcatenation?
If enabled, the audio is trimmed to the nearest AAC frame, which is required for seamless concatenation of AAC files. This is a requirement if you later want to combine multiple video snippets seamlessly. This option is used internally; there is currently no documentation on how to concatenate the audio chunks.
separateAudioTo?
If set, the audio will not be included in the main output but rendered as a separate file at the location you pass. It is recommended to use an absolute path. If a relative path is passed, it is relative to the Remotion Root.
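As an illustration (the file paths are hypothetical):

await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {},
  outputFile: '/tmp/video.mp4',      // video without audio
  separateAudioTo: '/tmp/audio.aac', // audio written to a separate file
});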
hardwareAcceleration?
One of "disable", "if-possible", or "required". Default: "disable". Encode using a hardware-accelerated encoder if available. If set to "required" and no hardware-accelerated encoder is available, the render will fail.
offthreadVideoCacheSizeInBytes?
From v4.0, Remotion has a cache for <OffthreadVideo> frames. The default is null, corresponding to half of the system memory available when the render starts. This option allows you to override the size of the cache. The higher it is, the faster the render will be, but the more memory will be used. The used value will be printed when running in verbose mode. Default: null.
mediaCacheSizeInBytes?
Specify the maximum size of the cache that <Video> and <Audio> from @remotion/media may use combined, in bytes. The default is half of the available system memory when the render starts.
offthreadVideoThreads?
The number of threads that <OffthreadVideo> can start to extract frames. The default is 2. Increase carefully, as too many threads may cause instability.
licenseKey?
License key for sending a usage event using @remotion/licensing.
onProgress?
function RenderMediaOnVercelProgress
A callback that receives progress updates. Every variant includes overallProgress (0–1).
const onProgress = async (update: RenderMediaOnVercelProgress) => {
  console.log(`Overall: ${Math.round(update.overallProgress * 100)}%`);
  if (update.stage === 'render-progress') {
    console.log(`Rendering: ${Math.round(update.progress.progress * 100)}%`);
  }
};
Return value
An object containing:
sandboxFilePath
The path to the rendered video inside the sandbox.
contentType
The MIME type of the rendered output (e.g. "video/mp4", "video/webm", "image/gif").
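For example, a route handler might use the return value to upload the rendered file afterwards. The call to uploadToVercelBlob() below is only a sketch; its argument shape is not documented in this section and is assumed here, so check the uploadToVercelBlob() reference for the real signature.

const {sandboxFilePath, contentType} = await renderMediaOnVercel({
  sandbox,
  compositionId: 'MyComp',
  inputProps: {title: 'Hello World'},
});

// Hypothetical argument shape; consult the uploadToVercelBlob() docs.
const blob = await uploadToVercelBlob({
  sandbox,
  sandboxFilePath,
  contentType,
});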