# &lt;HtmlInCanvas&gt;

Available from v4.0.455.
This component renders its children into a `<canvas>` using the browser's HTML-in-canvas API and lets you draw an effect using the Canvas 2D API, WebGL, or WebGPU.
HTML-in-canvas is only available in Chrome 149 and later, and only if the `chrome://flags/#canvas-draw-element` flag is enabled.
MyComp.tsx

```tsx
import {HtmlInCanvas} from 'remotion';

export const MyComp: React.FC = () => {
  return (
    <HtmlInCanvas width={1280} height={720}>
      <div style={{fontSize: 80}}>Hello</div>
    </HtmlInCanvas>
  );
};
```
## `HtmlInCanvas.isHtmlInCanvasSupported()`
Return value: `boolean`
If HTML-in-canvas is not available, the component throws a fatal error.
Use `HtmlInCanvas.isHtmlInCanvasSupported()` to check whether HTML-in-canvas is supported.
Buggy implementations, such as the one in Chrome 147, are not considered supported.
Check if HTML-in-canvas is supported

```tsx
import {HtmlInCanvas} from 'remotion';

if (HtmlInCanvas.isHtmlInCanvasSupported()) {
  console.log('HTML-in-canvas is supported');
} else {
  console.log('HTML-in-canvas is not supported');
}
```
## API
### `width`
Width of the canvas and the inner layout area, in pixels. Must be a positive integer.
### `height`
Height of the canvas and the inner layout area, in pixels. Must be a positive integer.
### `children`
Children to draw to the canvas.
Children will be wrapped in a `<div>` with the given `width` and `height`.
### `onPaint?`
Called when the children are updated and can be painted onto the canvas.
If this callback is omitted, the children are painted using a 2D context with no transform.
Simple example

```tsx
import {HtmlInCanvasOnPaint} from 'remotion';

const onPaint: HtmlInCanvasOnPaint = ({canvas, element, elementImage}) => {
  const ctx = canvas.getContext('2d');
  if (!ctx) {
    throw new Error('Failed to acquire 2D context');
  }

  ctx.reset();
  ctx.filter = 'blur(10px)';
  const transform = ctx.drawElementImage(elementImage, 0, 0);
  element.style.transform = transform.toString();
};
```
See the sections below for WebGL and WebGPU examples.
The callback receives:
#### `canvas`

An `OffscreenCanvas` with dimensions `width` × `height`.
You should paint to this canvas.
#### `element`

The inner `HTMLDivElement` that wraps the children.
You should apply the return value of `drawElementImage()` to this element's `style.transform` property.
#### `elementImage`

An `ElementImage` handle for the current capture.
You should paint this image to the canvas.
### `onInit?`
Runs once before the first paint. Use it to create GPU contexts or other resources tied to the `OffscreenCanvas`. It must return a cleanup function, or a Promise that resolves to one. The cleanup runs on unmount.
The argument object matches `onPaint` (`canvas`, `element`, `elementImage`). The `elementImage` passed here is only for initialization; a fresh capture is passed to `onPaint` on each frame.
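As a sketch of the lifecycle contract only (the parameter types below are simplified stand-ins, not Remotion's actual signatures), an `onInit` that allocates a resource and returns a matching cleanup could look like this:

```typescript
// Sketch of the onInit contract: allocate once, return a cleanup that frees
// exactly what was allocated. The types are simplified stand-ins for
// illustration, not Remotion's actual signatures.
type InitArgs = {canvas: {width: number; height: number}};
type Cleanup = () => void;

let scratch: Float32Array | null = null;

const onInit = ({canvas}: InitArgs): Cleanup => {
  // Allocate a scratch buffer sized to the canvas.
  scratch = new Float32Array(canvas.width * canvas.height);

  // The returned cleanup runs on unmount and must release what was created here.
  return () => {
    scratch = null;
  };
};
```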
### `durationInFrames?`

Inherited from `<Sequence>`.
### `from?`

Inherited from `<Sequence>`.
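For intuition: as on `<Sequence>`, `from` and `durationInFrames` define a visibility window from frame `from` up to, but not including, `from + durationInFrames`. A hypothetical helper (not part of Remotion) expressing that window:

```typescript
// Hypothetical helper, not part of Remotion: whether a component with the
// given `from` and `durationInFrames` is mounted at a frame, following
// <Sequence> semantics (inclusive start, exclusive end).
const isMountedAtFrame = (
  frame: number,
  from: number,
  durationInFrames: number,
): boolean => frame >= from && frame < from + durationInFrames;
```

For example, with `from={30}` and `durationInFrames={60}`, the component is mounted for frames 30 through 89.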
### `ref?`

You can attach a React ref to `<HtmlInCanvas />`.
It is attached to the layout `HTMLCanvasElement`, the canvas that hosts the laid-out subtree (`layoutSubtree`).
If you use TypeScript, type the ref as `HTMLCanvasElement`:
src/Example.tsx

```tsx
import React, {useRef} from 'react';
import {AbsoluteFill, HtmlInCanvas} from 'remotion';

export const Example: React.FC = () => {
  const canvasRef = useRef<HTMLCanvasElement>(null);

  return (
    <HtmlInCanvas ref={canvasRef} width={1280} height={720}>
      <AbsoluteFill style={{fontSize: 80}}>Hello</AbsoluteFill>
    </HtmlInCanvas>
  );
};
```
## `onPaint` examples
### 2D

Call `drawElementImage()` on the 2D context to draw the `elementImage` into the bitmap.
The HTML-in-canvas API recommends assigning the return value to `element.style.transform`, so that the original DOM element matches the drawn transform and text selection keeps working.
For 2D, you usually do not need `onInit`.
2D: animated blur + drawElementImage

```tsx
import React, {useCallback} from 'react';
import {
  AbsoluteFill,
  HtmlInCanvas,
  type HtmlInCanvasOnPaint,
  useCurrentFrame,
  useVideoConfig,
} from 'remotion';

const BLUR_MIN_PX = 4;
const BLUR_MAX_PX = 22;
const BLUR_CYCLES_PER_SECOND = 0.35;

export const HtmlInCanvas2DBlur: React.FC = () => {
  const frame = useCurrentFrame();
  const {width, height, fps} = useVideoConfig();

  const onPaint: HtmlInCanvasOnPaint = useCallback(
    ({canvas, element, elementImage}) => {
      const ctx = canvas.getContext('2d');
      if (!ctx) {
        throw new Error('Failed to acquire 2D context');
      }

      const t = (frame / fps) * Math.PI * 2 * BLUR_CYCLES_PER_SECOND;
      const blurPx =
        BLUR_MIN_PX + (BLUR_MAX_PX - BLUR_MIN_PX) * (0.5 + 0.5 * Math.sin(t));

      ctx.reset();
      ctx.filter = `blur(${blurPx}px)`;
      const transform = ctx.drawElementImage(elementImage, 0, 0);
      element.style.transform = transform.toString();
    },
    [frame, fps],
  );

  return (
    <HtmlInCanvas width={width} height={height} onPaint={onPaint}>
      <AbsoluteFill
        style={{
          justifyContent: 'center',
          alignItems: 'center',
          backgroundColor: '#1a1a2e',
          color: 'white',
          fontSize: 120,
          fontFamily: 'sans-serif',
        }}
      >
        <h1 style={{margin: 0}}>Hello</h1>
      </AbsoluteFill>
    </HtmlInCanvas>
  );
};
```
### WebGL

Do all setup, such as getting the WebGL context and compiling the shaders, in `onInit`.
`onInit` must return a cleanup function that destroys the resources created in `onInit`.
Use `texElementImage2D()` to turn the `elementImage` into a texture.
This example is long. Expand to see it:
WebGL2: minimal full component

```tsx
/**
 * Minimal WebGL2 + HtmlInCanvas sample (same code as /docs/remotion/html-in-canvas).
 * UV wave distortion in the fragment shader (not expressible as a static CSS filter).
 */
import React, {useCallback, useRef} from 'react';
import {
  AbsoluteFill,
  HtmlInCanvas,
  HtmlInCanvasOnInit,
  HtmlInCanvasOnPaint,
  useCurrentFrame,
  useVideoConfig,
} from 'remotion';

type GlState = {
  gl: WebGL2RenderingContext;
  program: WebGLProgram;
  uTex: WebGLUniformLocation | null;
  uTime: WebGLUniformLocation | null;
  texture: WebGLTexture;
  vao: WebGLVertexArrayObject;
};

const VS = `#version 300 es
in vec2 a_pos;
in vec2 a_uv;
out vec2 v_uv;
void main() {
  gl_Position = vec4(a_pos, 0.0, 1.0);
  v_uv = a_uv;
}`;

const FS = `#version 300 es
precision highp float;
uniform sampler2D u_tex;
uniform float u_time;
in vec2 v_uv;
out vec4 o;
void main() {
  vec2 uv = v_uv;
  uv.x += 0.045 * sin(v_uv.y * 32.0 + u_time * 5.0);
  uv.y += 0.038 * sin(v_uv.x * 26.0 + u_time * 4.0);
  o = texture(u_tex, uv);
}`;

function linkProgram(
  gl: WebGL2RenderingContext,
  vsSrc: string,
  fsSrc: string,
): WebGLProgram {
  const vert = gl.createShader(gl.VERTEX_SHADER)!;
  gl.shaderSource(vert, vsSrc);
  gl.compileShader(vert);
  const frag = gl.createShader(gl.FRAGMENT_SHADER)!;
  gl.shaderSource(frag, fsSrc);
  gl.compileShader(frag);
  const program = gl.createProgram()!;
  gl.attachShader(program, vert);
  gl.attachShader(program, frag);
  gl.linkProgram(program);
  gl.deleteShader(vert);
  gl.deleteShader(frag);
  return program;
}

const QUAD = new Float32Array([
  -1, -1, 0, 0, 1, -1, 1, 0, -1, 1, 0, 1, 1, -1, 1, 0, -1, 1, 0, 1, 1, 1, 1, 1,
]);

export const HtmlInCanvasDocsMinimalWebGL: React.FC = () => {
  const frame = useCurrentFrame();
  const {width, height, fps} = useVideoConfig();
  const gpuRef = useRef<GlState | null>(null);

  const onInit: HtmlInCanvasOnInit = useCallback(({canvas}) => {
    const gl = canvas.getContext('webgl2', {
      alpha: true,
      premultipliedAlpha: true,
      antialias: false,
    });
    if (!gl) {
      throw new Error('WebGL2 unavailable');
    }

    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);

    const program = linkProgram(gl, VS, FS);
    const uTex = gl.getUniformLocation(program, 'u_tex');
    const uTime = gl.getUniformLocation(program, 'u_time');

    const texture = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    const buffer = gl.createBuffer()!;
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, QUAD, gl.STATIC_DRAW);

    const vao = gl.createVertexArray()!;
    gl.bindVertexArray(vao);
    const locPos = gl.getAttribLocation(program, 'a_pos');
    const locUv = gl.getAttribLocation(program, 'a_uv');
    gl.enableVertexAttribArray(locPos);
    gl.vertexAttribPointer(locPos, 2, gl.FLOAT, false, 16, 0);
    gl.enableVertexAttribArray(locUv);
    gl.vertexAttribPointer(locUv, 2, gl.FLOAT, false, 16, 8);

    gpuRef.current = {gl, program, uTex, uTime, texture, vao};

    return () => {
      gl.deleteProgram(program);
      gl.deleteTexture(texture);
      gl.deleteVertexArray(vao);
      gl.deleteBuffer(buffer);
      gpuRef.current = null;
    };
  }, []);

  const onPaint: HtmlInCanvasOnPaint = useCallback(
    ({elementImage}) => {
      const gpu = gpuRef.current;
      if (!gpu) {
        return;
      }

      const {gl} = gpu;
      gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
      gl.useProgram(gpu.program);
      gl.activeTexture(gl.TEXTURE0);
      gl.bindTexture(gl.TEXTURE_2D, gpu.texture);
      gl.texElementImage2D(
        gl.TEXTURE_2D,
        0,
        gl.RGBA,
        gl.RGBA,
        gl.UNSIGNED_BYTE,
        elementImage,
      );
      if (gpu.uTex) {
        gl.uniform1i(gpu.uTex, 0);
      }
      if (gpu.uTime) {
        gl.uniform1f(gpu.uTime, frame / fps);
      }

      gl.bindVertexArray(gpu.vao);
      gl.drawArrays(gl.TRIANGLES, 0, 6);
    },
    [frame, fps],
  );

  return (
    <HtmlInCanvas width={width} height={height} onInit={onInit} onPaint={onPaint}>
      <AbsoluteFill
        style={{
          justifyContent: 'center',
          alignItems: 'center',
          color: 'white',
          fontSize: 120,
        }}
      >
        <h1>Hello</h1>
      </AbsoluteFill>
    </HtmlInCanvas>
  );
};
```
### WebGPU

Use `onInit` to get a WebGPU context, to request a GPU device, and to compile shaders.
`onInit` must return a cleanup function that destroys the resources created in `onInit`.
In `onPaint`, draw the `elementImage` to the canvas using `copyElementImageToTexture()`.
This example is long. Expand to see it:
compose-webgpu.tsx

```tsx
import React, {useCallback, useRef} from 'react';
import {
  AbsoluteFill,
  HtmlInCanvas,
  type HtmlInCanvasOnInit,
  type HtmlInCanvasOnPaint,
  useCurrentFrame,
  useVideoConfig,
} from 'remotion';

// Minimal WebGPU types — `@webgpu/types` is intentionally not a dependency,
// matching the convention in `packages/core/src/effects/gpu-device.ts`.
type Gpu = {
  requestAdapter(): Promise<GpuAdapter | null>;
  getPreferredCanvasFormat(): string;
};
type GpuAdapter = {requestDevice(): Promise<GpuDevice>};
type GpuTextureView = unknown;
type GpuTexture = {createView(): GpuTextureView; destroy(): void};
type GpuBuffer = {destroy(): void};
type GpuBindGroup = unknown;
type GpuPipeline = unknown;
type GpuSampler = unknown;
type GpuShaderModule = unknown;
type GpuDevice = {
  createShaderModule(d: {code: string}): GpuShaderModule;
  createRenderPipeline(d: unknown): GpuPipeline;
  createTexture(d: unknown): GpuTexture;
  createSampler(d?: unknown): GpuSampler;
  createBindGroup(d: unknown): GpuBindGroup;
  createBuffer(d: unknown): GpuBuffer;
  createCommandEncoder(): {
    beginRenderPass(d: unknown): {
      setPipeline(p: GpuPipeline): void;
      setBindGroup(i: number, b: GpuBindGroup): void;
      draw(n: number): void;
      end(): void;
    };
    finish(): unknown;
  };
  queue: {
    submit(c: unknown[]): void;
    writeBuffer(b: GpuBuffer, offset: number, data: BufferSource): void;
    copyElementImageToTexture(
      source: Element | ElementImage,
      width: number,
      height: number,
      destination: {texture: GpuTexture},
    ): void;
  };
};
type GpuCanvasContext = {
  configure(d: {
    device: GpuDevice;
    format: string;
    alphaMode: 'premultiplied' | 'opaque';
  }): void;
  getCurrentTexture(): GpuTexture;
};

const WGSL = /* wgsl */ `
struct VsOut {
  @builtin(position) pos: vec4f,
  @location(0) uv: vec2f,
};

@vertex
fn vs(@builtin(vertex_index) i: u32) -> VsOut {
  // Fullscreen triangle (slightly oversized — clipped to viewport).
  var p = array(vec2f(-1.0, -3.0), vec2f(-1.0, 1.0), vec2f(3.0, 1.0));
  var uv = array(vec2f(0.0, 2.0), vec2f(0.0, 0.0), vec2f(2.0, 0.0));
  var o: VsOut;
  o.pos = vec4f(p[i], 0.0, 1.0);
  o.uv = uv[i];
  return o;
}

struct U {
  time: f32,
  _pad: f32,
  resolution: vec2f,
};

@group(0) @binding(0) var samp: sampler;
@group(0) @binding(1) var tex: texture_2d<f32>;
@group(0) @binding(2) var<uniform> u: U;

@fragment
fn fs(in: VsOut) -> @location(0) vec4f {
  // Animate pixel cell size with a slow breathing motion.
  let cell = 6.0 + sin(u.time * 0.8) * 4.0;
  let snapped = floor(in.uv * u.resolution / cell) * cell / u.resolution;
  // Slight chromatic offset between channels — sampled from snapped centers.
  let off = vec2f(2.0, 0.0) / u.resolution;
  let r = textureSample(tex, samp, snapped + off).r;
  let g = textureSample(tex, samp, snapped).g;
  let b = textureSample(tex, samp, snapped - off).b;
  let a = textureSample(tex, samp, snapped).a;
  // Posterize to 5 levels per channel for a flatter, screenprint look.
  let levels = 5.0;
  let q = floor(vec3f(r, g, b) * levels) / (levels - 1.0);
  return vec4f(q, a);
}
`;

type GpuState = {
  device: GpuDevice;
  context: GpuCanvasContext;
  pipeline: GpuPipeline;
  sampler: GpuSampler;
  texture: GpuTexture;
  uniformBuffer: GpuBuffer;
  bindGroup: GpuBindGroup;
  width: number;
  height: number;
};

export const HtmlInCanvasComposeWebGPU: React.FC = () => {
  const frame = useCurrentFrame();
  const {width, height, fps} = useVideoConfig();
  const gpuRef = useRef<GpuState | null>(null);
  const time = frame / fps;

  const onInit: HtmlInCanvasOnInit = useCallback(async ({canvas}) => {
    if (typeof navigator === 'undefined' || !('gpu' in navigator)) {
      throw new Error('WebGPU is not available in this environment');
    }

    const gpu = (navigator as unknown as {gpu: Gpu}).gpu;
    const adapter = await gpu.requestAdapter();
    if (!adapter) {
      throw new Error('No WebGPU adapter available');
    }

    const device = await adapter.requestDevice();
    const context = (
      canvas as unknown as {
        getContext(id: 'webgpu'): GpuCanvasContext | null;
      }
    ).getContext('webgpu');
    if (!context) {
      throw new Error('WebGPU context unavailable on OffscreenCanvas');
    }

    // Use the device's preferred swap-chain format (typically `bgra8unorm`)
    // to avoid an extra format-conversion copy on present.
    const presentationFormat = gpu.getPreferredCanvasFormat();
    context.configure({
      device,
      format: presentationFormat,
      alphaMode: 'premultiplied',
    });

    const module = device.createShaderModule({code: WGSL});
    const pipeline = device.createRenderPipeline({
      layout: 'auto',
      vertex: {module, entryPoint: 'vs'},
      fragment: {
        module,
        entryPoint: 'fs',
        targets: [{format: presentationFormat}],
      },
      primitive: {topology: 'triangle-list'},
    });

    const TextureUsage = (
      globalThis as unknown as {
        GPUTextureUsage: {
          COPY_DST: number;
          TEXTURE_BINDING: number;
          RENDER_ATTACHMENT: number;
        };
      }
    ).GPUTextureUsage;
    const BufferUsage = (
      globalThis as unknown as {
        GPUBufferUsage: {UNIFORM: number; COPY_DST: number};
      }
    ).GPUBufferUsage;

    const texture = device.createTexture({
      size: {width: canvas.width, height: canvas.height},
      format: 'rgba8unorm',
      usage:
        TextureUsage.COPY_DST |
        TextureUsage.TEXTURE_BINDING |
        TextureUsage.RENDER_ATTACHMENT,
    });
    const sampler = device.createSampler({
      magFilter: 'linear',
      minFilter: 'linear',
      addressModeU: 'clamp-to-edge',
      addressModeV: 'clamp-to-edge',
    });

    // 16 bytes: time (f32), pad (f32), resolution (vec2f).
    const uniformBuffer = device.createBuffer({
      size: 16,
      usage: BufferUsage.UNIFORM | BufferUsage.COPY_DST,
    });
    const bindGroup = device.createBindGroup({
      layout: (
        pipeline as unknown as {
          getBindGroupLayout(i: number): unknown;
        }
      ).getBindGroupLayout(0),
      entries: [
        {binding: 0, resource: sampler},
        {binding: 1, resource: texture.createView()},
        {binding: 2, resource: {buffer: uniformBuffer}},
      ],
    });

    gpuRef.current = {
      device,
      context,
      pipeline,
      sampler,
      texture,
      uniformBuffer,
      bindGroup,
      width: canvas.width,
      height: canvas.height,
    };

    return () => {
      texture.destroy();
      uniformBuffer.destroy();
      gpuRef.current = null;
    };
  }, []);

  const onPaint: HtmlInCanvasOnPaint = useCallback(
    ({elementImage}) => {
      const gpu = gpuRef.current;
      if (!gpu) {
        return;
      }

      const {device, context, pipeline, texture, bindGroup, uniformBuffer} =
        gpu;

      device.queue.copyElementImageToTexture(
        elementImage,
        gpu.width,
        gpu.height,
        {texture},
      );

      const uniforms = new Float32Array([time, 0, gpu.width, gpu.height]);
      device.queue.writeBuffer(uniformBuffer, 0, uniforms);

      const encoder = device.createCommandEncoder();
      const view = context.getCurrentTexture().createView();
      const pass = encoder.beginRenderPass({
        colorAttachments: [
          {
            view,
            clearValue: {r: 0, g: 0, b: 0, a: 0},
            loadOp: 'clear',
            storeOp: 'store',
          },
        ],
      });
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.draw(3);
      pass.end();
      device.queue.submit([encoder.finish()]);
    },
    [time],
  );

  return (
    <AbsoluteFill
      style={{
        justifyContent: 'center',
        alignItems: 'center',
      }}
    >
      <HtmlInCanvas
        width={width}
        height={height}
        onInit={onInit}
        onPaint={onPaint}
      >
        <AbsoluteFill
          style={{
            backgroundColor: 'white',
            color: 'black',
            justifyContent: 'center',
            alignItems: 'center',
            fontSize: 120,
            fontFamily: 'sans-serif',
            fontWeight: 'bold',
          }}
        >
          <h1>Hello, World!</h1>
        </AbsoluteFill>
      </HtmlInCanvas>
    </AbsoluteFill>
  );
};
```
## Async callbacks

You can use `await` inside `onPaint`, and Remotion will keep the frame open via `delayRender()` until the promise settles.
This might be necessary if you are implementing a multi-pass effect using multiple contexts.
Async: createImageBitmap then drawImage

```tsx
import {HtmlInCanvasOnPaint} from 'remotion';

declare const width: number;
declare const height: number;

const onPaint: HtmlInCanvasOnPaint = async ({canvas, elementImage}) => {
  const ctx = canvas.getContext('2d');
  if (!ctx) {
    return;
  }

  ctx.reset();
  ctx.drawElementImage(elementImage, 0, 0);
  const bitmap = await createImageBitmap(canvas);
  try {
    ctx.reset();
    ctx.drawImage(bitmap, 0, 0, width, height);
  } finally {
    bitmap.close();
  }
};
```
## Compatibility

| Chrome | Firefox | Safari |
| --- | --- | --- |
| 149+ (flag required) | No | No |

Only works in Chrome 149 and later with the flag enabled, as mentioned at the top of the page.