WebGL (short for Web Graphics Library) is a JavaScript API for rendering interactive 2D and 3D graphics within any compatible web browser without the use of plug-ins. WebGL is fully integrated with other web standards, allowing GPU-accelerated usage of physics, image processing, and effects in the HTML canvas. WebGL elements can be mixed with other HTML elements and composited with other parts of the page or page background.
WebGL programs consist of control code written in JavaScript, and shader code written in OpenGL ES Shading Language (GLSL ES, sometimes referred to as ESSL), a language similar to C or C++. WebGL code is executed on a computer's GPU. WebGL is designed and maintained by the non-profit Khronos Group. On February 9, 2022, Khronos Group announced WebGL 2.0 support from all major browsers. WebGL 1.0
A shader is a computer program that calculates the appropriate levels of light, darkness, and color during the rendering of a 3D scene, a process known as shading. Shaders have evolved to perform a variety of specialized functions in computer graphics special effects and video post-processing, as well as general-purpose computing on graphics processing units. Traditional shaders calculate rendering effects on graphics hardware with
A WebGL-specific extension called glUtils.js. There are also some 2D libraries built atop WebGL, like Cocos2d-x or Pixi.js, which were implemented this way for performance reasons, in a move that parallels what happened with the Starling Framework over Stage3D in the Flash world. The WebGL-based 2D libraries fall back to HTML5 canvas when WebGL is not available. Removing the rendering bottleneck by giving almost direct access to
A better approximation of a curve. As of OpenGL 4.0 and Direct3D 11, a new shader class called a tessellation shader has been added. It adds two new shader stages to the traditional model: tessellation control shaders (also known as hull shaders) and tessellation evaluation shaders (also known as domain shaders), which together allow simpler meshes to be subdivided into finer meshes at run-time according to
A color value; more complex shaders with multiple inputs/outputs are also possible. Pixel shaders range from always outputting the same color, to applying a lighting value, to doing bump mapping, shadows, specular highlights, translucency and other phenomena. They can alter the depth of the fragment (for Z-buffering), or output more than one color if multiple render targets are active. In 3D graphics,
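The "applying a lighting value" case above can be sketched in plain JavaScript rather than a real GPU shading language. The function and variable names here (shadeFragment, normal, lightDir) are illustrative assumptions, not part of any actual API; the math is simple Lambertian diffuse shading.

```javascript
// Sketch of per-fragment Lambertian lighting in plain JavaScript.
// A real pixel shader would be written in GLSL/HLSL and run on the GPU;
// all names here are illustrative only.
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

function shadeFragment(baseColor, normal, lightDir) {
  // Lighting value: cosine of the angle between the surface normal and
  // the light direction, clamped so faces pointing away stay unlit.
  const intensity = Math.max(0, dot(normal, lightDir));
  return baseColor.map((c) => c * intensity); // scale each RGB channel
}
```

A fragment facing the light keeps its base color, while one facing away shades to black; a real shader would add ambient and specular terms on top of this.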
A final rendered image can be altered using algorithms defined in a shader, and can be modified by external variables or textures introduced by the computer program calling the shader. Shaders are used widely in cinema post-processing, computer-generated imagery, and video games to produce a range of effects. Beyond simple lighting models, more complex uses of shaders include: altering
A geometry shader if present, or the rasterizer. Vertex shaders can enable powerful control over the details of position, movement, lighting, and color in any scene involving 3D models. Geometry shaders were introduced in Direct3D 10 and OpenGL 3.2; they were formerly available in OpenGL 2.0+ through extensions. This type of shader can generate new graphics primitives, such as points, lines, and triangles, from those primitives that were sent to
A geometry shader include point sprite generation, geometry tessellation, shadow volume extrusion, and single-pass rendering to a cube map. A typical real-world example of the benefits of geometry shaders would be automatic mesh complexity modification: a series of line strips representing control points for a curve are passed to the geometry shader and, depending on the complexity required, the shader can automatically generate extra lines, each of which provides
A high degree of flexibility. Most shaders are coded for (and run on) a graphics processing unit (GPU), though this is not a strict requirement. Shading languages are used to program the GPU's rendering pipeline, which has mostly superseded the fixed-function pipeline of the past that allowed only common geometry-transforming and pixel-shading functions; with shaders, customized effects can be used. The position and color (hue, saturation, brightness, and contrast) of all pixels, vertices, and/or textures used to construct
A mathematical function. The function can be related to a variety of variables, most notably the distance from the viewing camera, to allow active level-of-detail scaling. This allows objects close to the camera to have fine detail, while more distant ones can have coarser meshes yet seem comparable in quality. It can also drastically reduce required mesh bandwidth by allowing meshes to be refined once inside
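The distance-based level-of-detail idea can be sketched as a small function, as a tessellation control stage might compute it. This is illustrative JavaScript, not GLSL, and the constants (a maximum level of 64, a falloff distance of 10 units) are arbitrary assumptions, not values from any real engine.

```javascript
// Sketch of distance-based level-of-detail selection: closer objects get
// more subdivisions, distant objects get coarser meshes. All constants
// are illustrative assumptions.
function tessellationLevel(distanceToCamera, maxLevel = 64, falloff = 10) {
  const level = Math.round(maxLevel / (1 + distanceToCamera / falloff));
  return Math.max(1, level); // never subdivide below the base mesh
}
```

An object at the camera would be tessellated at the full level, one at the falloff distance at half that, and very distant objects collapse to the base mesh.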
A per-vertex basis. Newer geometry shaders can generate new vertices from within the shader. Tessellation shaders are the newest 3D shaders; they act on batches of vertices all at once to add detail, such as subdividing a model into smaller groups of triangles or other primitives at runtime to improve things like curves and bumps, or to change other attributes. Vertex shaders are the most established and common kind of 3D shader and are run once for each vertex given to
a pixel shader alone cannot produce some kinds of complex effects because it operates only on a single fragment, without knowledge of a scene's geometry (i.e. vertex data). However, pixel shaders do have knowledge of the screen coordinate being drawn, and can sample the screen and nearby pixels if the contents of the entire screen are passed as a texture to the shader. This technique can enable a wide variety of two-dimensional postprocessing effects such as blur, or edge detection/enhancement for cartoon/cel shaders. Pixel shaders may also be applied in intermediate stages to any two-dimensional images (sprites or textures) in
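The screen-sampling technique described above can be sketched in plain JavaScript: a per-pixel edge-detection pass over a grayscale "texture" represented as a 2D array. A real pixel shader would sample neighbouring texels the same way on the GPU; the function name and the choice of a Laplacian kernel are illustrative assumptions.

```javascript
// Sketch of a screen-space postprocess: per-pixel edge detection over a
// grayscale image passed in as a "texture" (a 2D array of brightness
// values). Border pixels are left at zero for simplicity.
function edgeDetect(image) {
  const h = image.length, w = image[0].length;
  const out = Array.from({ length: h }, () => new Array(w).fill(0));
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      // Laplacian kernel: 4x the centre minus its four neighbours.
      const v = 4 * image[y][x]
        - image[y - 1][x] - image[y + 1][x]
        - image[y][x - 1] - image[y][x + 1];
      out[y][x] = Math.abs(v);
    }
  }
  return out;
}
```

Flat regions produce zero response while brightness steps produce large values, which is the basis of the cartoon/cel outlining mentioned above.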
a project called X3DOM to make X3D and VRML content run on WebGL. There has been an emergence of 2D and 3D game engines for WebGL, such as Unreal Engine 4 and Unity. The Stage3D/Flash-based Away3D high-level library also has a port to WebGL via TypeScript. A more lightweight utility library that provides just the vector and matrix math utilities for shaders is sylvester.js. It is sometimes used in conjunction with
a vertex, while pixel shaders describe the traits (color, z-depth and alpha value) of a pixel. A vertex shader is called for each vertex in a primitive (possibly after tessellation); thus one vertex goes in and one (updated) vertex comes out. Each vertex is then rendered as a series of pixels onto a surface (block of memory) that will eventually be sent to the screen. Shaders replace a section of the graphics hardware typically called
is a large pixel matrix or "frame buffer". There are three types of shaders in common use (pixel, vertex, and geometry shaders), with several more recently added. While older graphics cards utilize separate processing units for each shader type, newer cards feature unified shaders which are capable of executing any type of shader. This allows graphics cards to make more efficient use of processing power. 2D shaders act on digital images, also called textures in
is an open-source graphics engine which implements WebGL 1.0 (2.0, which closely conforms to ES 3.0) and the OpenGL ES 2.0 and 3.0 standards. It is the default backend for both Google Chrome and Mozilla Firefox on Windows platforms, and it works by translating WebGL and OpenGL calls to available platform-specific APIs. ANGLE currently provides access to OpenGL ES 2.0 and 3.0 by translating to desktop OpenGL, OpenGL ES, Direct3D 9, and Direct3D 11 APIs. "[Google] Chrome uses ANGLE for all graphics rendering on Windows, including
is based on OpenGL ES 2.0 and provides an API for 3D graphics. It uses the HTML5 canvas element and is accessed using Document Object Model (DOM) interfaces. WebGL 2.0 is based on OpenGL ES 3.0. It guarantees the availability of many optional extensions of WebGL 1.0 and exposes new APIs. Automatic memory management is provided implicitly by JavaScript. Like OpenGL ES 2.0, WebGL lacks
is the world's first GPU microarchitecture that supports mesh shading through the DirectX 12 Ultimate API, several months before the Ampere RTX 30 series was released. In 2020, AMD and Nvidia released the RDNA 2 and Ampere microarchitectures, which both support mesh shading through DirectX 12 Ultimate. These mesh shaders allow the GPU to handle more complex algorithms, offloading more work from the CPU to
the Metal framework. Modern video game development platforms such as Unity, Unreal Engine and Godot increasingly include node-based editors that can create shaders without the need for actual code; the user is instead presented with a directed graph of connected nodes that lets them route various textures, maps, and mathematical functions into output values like the diffuse color,
the fixed-function APIs introduced in OpenGL 1.0 and deprecated in OpenGL 3.0. This functionality, if required, has to be implemented by the developer using shader code and JavaScript. Shaders in WebGL are written in GLSL and passed to the WebGL API as text strings. The WebGL implementation compiles these strings to GPU code. This code is executed for each vertex sent through the API and for each pixel rasterized to
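As a concrete sketch of this flow, the following shows a trivial GLSL shader pair held as JavaScript strings, together with the usual compile-and-check pattern. The helper requires a real browser WebGLRenderingContext to actually run, and the variable names are illustrative assumptions rather than part of any particular library.

```javascript
// Minimal GLSL ES 1.00 shader sources held as JavaScript text strings,
// as WebGL expects them.
const vertexShaderSource = `
  attribute vec4 aPosition;
  void main() {
    gl_Position = aPosition;  // runs once per vertex sent through the API
  }
`;

const fragmentShaderSource = `
  precision mediump float;
  void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);  // runs once per rasterized pixel
  }
`;

// Typical compile-and-check pattern (needs a real WebGL context in a browser;
// `type` is gl.VERTEX_SHADER or gl.FRAGMENT_SHADER).
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

In a page, the two compiled shaders would then be attached to a program object with gl.attachShader and gl.linkProgram before drawing.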
the hue, saturation, brightness (HSL/HSV) or contrast of an image; producing blur, light bloom, volumetric lighting, normal mapping (for depth effects), bokeh, cel shading, posterization, bump mapping, distortion, chroma keying (for so-called "bluescreen/greenscreen" effects), and edge and motion detection, as well as psychedelic effects such as those seen in the demoscene. This use of
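One of the simplest effects in this list, a per-pixel contrast adjustment, can be sketched in plain JavaScript for illustration; a real implementation would be a pixel shader running on the GPU, and the function name here is a made-up example.

```javascript
// Sketch of a per-pixel contrast adjustment. `pixel` is an [r, g, b]
// triple with channels in [0, 1]; `amount` > 1 increases contrast.
function adjustContrast(pixel, amount) {
  // Scale each channel's distance from mid-grey (0.5), then clamp to [0, 1].
  return pixel.map((c) => Math.min(1, Math.max(0, (c - 0.5) * amount + 0.5)));
}
```

Mid-grey pixels are left unchanged, while channels above and below 0.5 are pushed toward white and black respectively.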
the pipeline, whereas vertex shaders always require a 3D scene. For instance, a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream after it has been rasterized. 3D shaders act on 3D models or other geometry but may also access the colors and textures used to draw the model or mesh. Vertex shaders are the oldest type of 3D shader, generally making modifications on
the Fixed Function Pipeline (FFP), so called because it performs lighting and texture mapping in a hard-coded manner. Shaders provide a programmable alternative to this hard-coded approach. The graphics pipeline uses a sequence of steps to transform three-dimensional (or two-dimensional) data into useful two-dimensional data for display. In general, this
the GPU has exposed performance limitations in the JavaScript implementations. Some were addressed by asm.js and WebAssembly (similarly, the introduction of Stage3D exposed performance problems within ActionScript, which were addressed by projects like CrossBridge). As with any other graphics API, creating content for WebGL scenes requires using a 3D content creation tool and exporting
the GPU, and in algorithm-intense rendering, increasing the frame rate or the number of triangles in a scene by an order of magnitude. Intel announced that Intel Arc Alchemist GPUs shipping in Q1 2022 would support mesh shaders. Ray tracing shaders are supported by Microsoft via DirectX Raytracing, by the Khronos Group via Vulkan, GLSL, and SPIR-V, and by Apple via Metal. Tensor shaders may be integrated in NPUs or GPUs. Tensor shaders are supported by Microsoft via DirectML, by the Khronos Group via OpenVX, by Apple via Core ML, by Google via TensorFlow, and by the Linux Foundation via ONNX. Compute shaders are not limited to graphics applications, but use
the WebGL API, which provides little on its own to quickly create desirable 3D graphics, motivated the creation of higher-level libraries that abstract common operations (e.g. loading scene graphs and 3D objects in certain formats; applying linear transformations to shaders or view frustums). Some such libraries were ported to JavaScript from other languages. Examples of libraries that provide high-level features include A-Frame (VR), BabylonJS, PlayCanvas, three.js, OSG.JS, Google's model-viewer, and CopperLicht. Web3D also made
the WebGL specification was released in March 2011. An early application of WebGL was Zygote Body. In November 2012, Autodesk announced that it had ported most of its applications to the cloud, running on local WebGL clients. These applications included Fusion 360 and AutoCAD 360. Development of the WebGL 2 specification started in 2013 and finished in January 2017. The specification is based on OpenGL ES 3.0. The first implementations shipped in Firefox 51, Chrome 56, and Opera 43. Almost Native Graphics Layer Engine (ANGLE)
the accelerated Canvas2D implementation and the Native Client sandbox environment." WebGL is widely supported by modern browsers. However, its availability depends on other factors too, such as whether the GPU supports it. The official WebGL website offers a simple test page. More detailed information (such as which renderer the browser uses and which extensions are available) can be found at third-party websites. The low-level nature of
the advent of WebGL, Paul Brunt was able to implement the new rendering technology quite easily, as Three.js was designed with the rendering code as a module rather than in the core itself. Branislav Uličný, an early contributor, started with Three.js in 2010 after having posted a number of WebGL demos on his own site. He wanted WebGL renderer capabilities in Three.js to exceed those of CanvasRenderer or SVGRenderer. His major contributions generally involve materials, shaders, and post-processing. Soon after
the beginning of the graphics pipeline. Geometry shader programs are executed after vertex shaders. They take as input a whole primitive, possibly with adjacency information. For example, when operating on triangles, the three vertices are the geometry shader's input. The shader can then emit zero or more primitives, which are rasterized and their fragments ultimately passed to a pixel shader. Typical uses of
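The one-primitive-in, several-primitives-out behaviour can be sketched in plain JavaScript: a function that receives one triangle and emits its three edges as line primitives, as a wireframe-style geometry stage might. The function name and data layout are illustrative assumptions, not any real shading-language API.

```javascript
// Sketch of a geometry-shader-like stage: given one input primitive
// (a triangle as three vertices), emit zero or more output primitives
// (here, its three edges as line segments).
function triangleToLines([a, b, c]) {
  return [[a, b], [b, c], [c, a]]; // three lines emitted from one triangle
}
```

A real geometry shader would instead call EmitVertex/EndPrimitive on the GPU, but the dataflow is the same.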
the browser without the effort required for a traditional standalone application or a plugin. Three.js was first released by Ricardo Cabello on GitHub in April 2010. The origins of the library can be traced back to his involvement with the demoscene in the early 2000s. The code was originally developed in the ActionScript language used by Adobe Flash, and was later ported to JavaScript in 2009. In Cabello's mind, there were two strong points that justified
the creation of graphics processing unit (GPU)-accelerated 3D animations using the JavaScript language as part of a website, without relying on proprietary browser plugins. This is possible due to the advent of WebGL, a low-level graphics API created specifically for the web. High-level libraries such as Three.js, GLGE, Scene.js, PhiloGL, and many more make it possible to author complex 3D computer animations for display in
the field of computer graphics. They modify attributes of pixels. 2D shaders may take part in rendering 3D geometry. Currently the only type of 2D shader is a pixel shader. Pixel shaders, also known as fragment shaders, compute color and other attributes of each "fragment": a unit of rendering work affecting at most a single output pixel. The simplest kinds of pixel shaders output one screen pixel as
the graphics processor. The purpose is to transform each vertex's 3D position in virtual space to the 2D coordinate at which it appears on the screen (as well as a depth value for the Z-buffer). Vertex shaders can manipulate properties such as position, color and texture coordinates, but cannot create new vertices. The output of the vertex shader goes to the next stage in the pipeline, which is either
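The core vertex-shader computation, transforming a 3D position by a combined model-view-projection matrix and then perspective-dividing, can be sketched in plain JavaScript. This is illustrative only (real GLSL uses built-in vec4/mat4 types, typically column-major); the row-major layout and function name here are assumptions.

```javascript
// Sketch of what a vertex shader computes: transform a 3D position by a
// 4x4 model-view-projection matrix (row-major, flat array of 16 numbers),
// then perspective-divide to get normalized device coordinates.
function transformVertex(mvp, [x, y, z]) {
  const v = [x, y, z, 1]; // homogeneous coordinates
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    for (let col = 0; col < 4; col++) {
      out[row] += mvp[row * 4 + col] * v[col]; // 4x4 matrix-vector multiply
    }
  }
  const w = out[3];
  // x/w and y/w land in [-1, 1] on screen; z/w is the depth for the Z-buffer.
  return { x: out[0] / w, y: out[1] / w, depth: out[2] / w };
}
```

With an identity matrix the position passes through unchanged; a real MVP matrix would encode the model, camera, and perspective transforms.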
the introduction of WebGL 1.0 in Firefox 4 in March 2011, Joshua Koo came on board. He built his first Three.js demo, for 3D text, in September 2011. His contributions frequently relate to geometry generation. Starting from version 118, Three.js uses WebGL 2.0 by default. The older version of the standard is still available via the WebGL1Renderer class. Three.js has over 1,700 contributors on GitHub. Three.js includes
the online WebGL-based editor Clara.io. Online platforms such as Sketchfab and Clara.io allow users to directly upload their 3D models and display them using a hosted WebGL viewer. Starting from Firefox version 27, Mozilla has given Firefox built-in WebGL tools that allow the editing of vertex and fragment shaders. A number of other debugging and profiling tools have also emerged.

Shader

In computer graphics,
the power of shaders. The first video card with a programmable pixel shader was the Nvidia GeForce 3 (NV20), released in 2001. Geometry shaders were introduced with Direct3D 10 and OpenGL 3.2. Eventually, graphics hardware evolved toward a unified shader model. Shaders are simple programs that describe the traits of either a vertex or a pixel. Vertex shaders describe the attributes (position, texture coordinates, colors, etc.) of
the same execution resources for GPGPU. They may be used in graphics pipelines, e.g. for additional stages in animation or lighting algorithms (e.g. tiled forward rendering). Some rendering APIs allow compute shaders to easily share data resources with the graphics pipeline. Shaders are written to apply transformations to a large set of elements at a time, for example, to each pixel in an area of
the scene to a format that is readable by the viewer or helper library. Desktop 3D authoring software such as Blender, Autodesk Maya or SimLab Composer can be used for this purpose. In particular, Blend4Web allows a WebGL scene to be authored entirely in Blender and exported to a browser with a single click, even as a standalone web page. There is also some WebGL-specific software, such as CopperCube and
the screen, or for every vertex of a model. This is well suited to parallel processing, and most modern GPUs have multiple shader pipelines to facilitate this, vastly improving computation throughput. A programming model with shaders is similar to a higher-order function for rendering, taking the shaders as arguments and providing a specific dataflow between intermediate results, enabling both data parallelism (across pixels, vertices, etc.) and pipeline parallelism (between stages); see also MapReduce. The language in which shaders are programmed depends on
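The higher-order-function analogy can be sketched directly in JavaScript: a renderer that takes a per-pixel shader function as an argument and applies it uniformly across every pixel. The names and the normalized (u, v) convention are illustrative assumptions.

```javascript
// Sketch of the "shader as higher-order function" idea: the renderer takes
// a per-element shader function and applies it to every pixel. Each
// invocation is independent, which is what lets a GPU run them in parallel.
function renderPixels(width, height, pixelShader) {
  const frame = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // Pass normalized coordinates in [0, 1), as shaders typically receive.
      frame.push(pixelShader(x / width, y / height));
    }
  }
  return frame;
}

// Example shader argument: a horizontal gradient from black to white.
const gradient = (u, v) => [u, u, u];
```

Swapping in a different shader function changes the whole image without touching the renderer, mirroring how GPU pipelines treat shaders as pluggable stages.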
the screen. WebGL evolved out of the Canvas 3D experiments started by Vladimir Vukićević at Mozilla. Vukićević first demonstrated a Canvas 3D prototype in 2006. By the end of 2007, both Mozilla and Opera had made their own separate implementations. In early 2009, the non-profit technology consortium Khronos Group started the WebGL Working Group, with initial participation from Apple, Google, Mozilla, Opera, and others. Version 1.0 of
the shader units instead of downsampling very complex ones from memory. Some algorithms can upsample any arbitrary mesh, while others allow for "hinting" in meshes to dictate the most characteristic vertices and edges. Circa 2017, the AMD Vega microarchitecture added support for a new shader stage, primitive shaders, somewhat akin to compute shaders with access to the data necessary to process geometry. Nvidia introduced mesh and task shaders with its Turing microarchitecture in 2018, which are also modelled after compute shaders. Nvidia Turing
the shift away from ActionScript: firstly, JavaScript provided greater platform independence; secondly, applications written in JavaScript would not need to be compiled by the developer beforehand, unlike Flash applications. Additional contributions by Cabello include API design, CanvasRenderer, SVGRenderer, and responsibility for merging the commits by the various contributors into the project. With
the specular color and intensity, roughness/metalness, height, normal, and so on. Automatic compilation then turns the graph into an actual, compiled shader.

Three.js

Three.js is a cross-browser JavaScript library and application programming interface (API) used to create and display animated 3D computer graphics in a web browser using WebGL. The source code is hosted in a repository on GitHub. Three.js allows
the target environment. The official OpenGL and OpenGL ES shading language is the OpenGL Shading Language, also known as GLSL, and the official Direct3D shading language is the High-Level Shader Language, also known as HLSL. Cg, a third-party shading language which outputs both OpenGL and Direct3D shaders, was developed by Nvidia; however, it has been deprecated since 2012. Apple released its own shading language, called Metal Shading Language, as part of
the term "shader" was introduced to the public by Pixar with version 3.0 of their RenderMan Interface Specification, originally published in May 1988. As graphics processing units evolved, major graphics software libraries such as OpenGL and Direct3D began to support shaders. The first shader-capable GPUs only supported pixel shading, but vertex shaders were quickly introduced once developers realized