The WebGL rendering pipeline is the series of steps that WebGL goes through to turn a scene into an image. The following describes, in a very rough way, the main steps in the pipeline.

Step 1: Gathering vertices and related data
For each triangle in the model, the world position of each vertex is computed, together with all related information such as the normal vector, UV coordinates, color, and associated textures.
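To make this concrete, here is a sketch of how per-vertex data might be gathered into a single interleaved buffer of the kind that gets uploaded to WebGL. The particular attribute layout (position, normal, UV) is an illustrative assumption, not something WebGL mandates:

```javascript
// Interleave per-vertex attributes into one Float32Array.
// Layout per vertex (assumed for illustration):
//   3 floats position, 3 floats normal, 2 floats UV = 8 floats.
const STRIDE = 8;

function interleave(positions, normals, uvs) {
  const count = positions.length / 3;
  const out = new Float32Array(count * STRIDE);
  for (let i = 0; i < count; i++) {
    out.set(positions.slice(i * 3, i * 3 + 3), i * STRIDE);     // position
    out.set(normals.slice(i * 3, i * 3 + 3), i * STRIDE + 3);   // normal
    out.set(uvs.slice(i * 2, i * 2 + 2), i * STRIDE + 6);       // UV
  }
  return out;
}
```

In real code a buffer like this would be handed to `gl.bufferData`, with `gl.vertexAttribPointer` describing the stride and offsets.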
Step 2: Vertex shading

Shaders are not built into WebGL. They must be provided either by you or by some middleware (like three.js) that you are using. At a minimum, a vertex shader converts each 3D point from world coordinates to 2D canvas coordinates plus a depth value. It may also produce additional data to be passed down the pipeline.
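The net effect of this stage can be sketched in plain JavaScript. Real vertex shaders are written in GLSL and output clip coordinates, with WebGL itself performing the perspective divide and viewport mapping afterward; the simple camera here (at the origin, looking down -z, focal length 1) is an assumption for illustration:

```javascript
// Conceptual vertex stage: world-space point -> canvas pixel
// coordinates plus a depth value.
function vertexStage(world, canvasW, canvasH) {
  const [x, y, z] = world;          // camera at origin, looking down -z
  const ndcX = x / -z;              // perspective divide (assumed focal length 1)
  const ndcY = y / -z;
  return {
    px: (ndcX * 0.5 + 0.5) * canvasW,        // map [-1,1] to pixels
    py: (1 - (ndcY * 0.5 + 0.5)) * canvasH,  // canvas y grows downward
    depth: -z,                               // kept for the depth test later
  };
}
```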
Step 3: Rasterization

For each triangle, the pixels it covers are determined, and the associated data (textures, etc.) is gathered.
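The core of this step can be sketched with edge functions: a pixel center is inside the triangle when all three edge functions agree in sign. Real rasterizers are far more careful (fill rules, subpixel precision), so treat this as an illustrative assumption of the idea:

```javascript
// Signed area test: which side of edge a->b does point p lie on?
function edge(a, b, p) {
  return (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0]);
}

// Return the [x, y] pixels whose centers fall inside the triangle
// v0, v1, v2 (all in canvas coordinates).
function rasterize(v0, v1, v2, width, height) {
  const pixels = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const p = [x + 0.5, y + 0.5];  // sample at the pixel center
      const w0 = edge(v1, v2, p);
      const w1 = edge(v2, v0, p);
      const w2 = edge(v0, v1, p);
      // Inside if all edge functions share a sign (either winding).
      if ((w0 >= 0 && w1 >= 0 && w2 >= 0) ||
          (w0 <= 0 && w1 <= 0 && w2 <= 0)) {
        pixels.push([x, y]);
      }
    }
  }
  return pixels;
}
```

The same edge-function weights, normalized, give the barycentric coordinates used to interpolate the per-vertex data across the triangle.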
Step 4: Pixel shading

The pixel shader (again, provided by you or middleware) takes the data relevant to the pixel's color and computes the actual color.
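As a sketch of the kind of computation a pixel shader performs, here is a simple Lambert diffuse term computed from an interpolated normal and a light direction. Real pixel shaders are written in GLSL, and this particular lighting model is an illustrative assumption:

```javascript
function dot3(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Compute a pixel's color from its (unit) normal, a (unit) direction
// toward the light, and a base RGB color in [0, 1].
function pixelShader(normal, lightDir, baseColor) {
  const intensity = Math.max(0, dot3(normal, lightDir)); // Lambert term
  return baseColor.map(c => c * intensity); // darker as the surface turns away
}
```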
Step 5: Writing to the framebuffer

The color is written to the appropriate location in the framebuffer, typically only if the pixel's depth value is lower than the depth of whatever color may have previously been written at that location. The framebuffer holds the color data that is eventually rendered to your screen.
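The depth-tested write can be sketched as follows. The flat-array framebuffer layout and the "less than" comparison (WebGL's default depth function) are assumptions for illustration:

```javascript
// A framebuffer paired with a depth buffer, every pixel initially
// "infinitely far away".
function makeBuffers(width, height) {
  return {
    width,
    color: new Array(width * height).fill(null),
    depth: new Float32Array(width * height).fill(Infinity),
  };
}

// Write a color only if this fragment is nearer than what is already
// stored; return whether the write happened.
function writePixel(buffers, x, y, depth, color) {
  const i = y * buffers.width + x;
  if (depth < buffers.depth[i]) {   // depth test: keep the nearest surface
    buffers.depth[i] = depth;
    buffers.color[i] = color;
    return true;
  }
  return false;                     // occluded; fragment discarded
}
```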