Commit 8dd472d

strands tut: minor readability tweaks
1 parent d02ac17 commit 8dd472d

File tree

1 file changed

+33
-30
lines changed


src/content/tutorials/en/intro-to-p5-strands.mdx

Lines changed: 33 additions & 30 deletions
@@ -22,7 +22,6 @@ authors:
 import EditableSketch from "../../../components/EditableSketch/index.astro";
 import Callout from "../../../components/Callout/index.astro";
 
-
 ## Version
 
 Attempted simplification (25/03/2026).
@@ -37,7 +36,7 @@ Before p5.js 2.0, you could already use [GLSL](https://beta.p5js.org/tutorials/i
 
 When you write a p5.js sketch, you are giving the CPU a sequence of instructions. When you add a shader - using p5.strands or GLSL - you are giving instructions to the GPU to run many times at once, simultaneously. For example, in a fragment shader, that means many calculations for each pixel.
 
-Drawing to the screen (rendering) can take advantage of parallel operations. Shaders make it possible to create visuals that would otherwise be too slow or difficult, like realistic lighting simulations, post processing effects, and rendering complex geometries. Learning shaders is valuable for anyone interested in graphics programming. This could be for game development, VFX for films, or for any kind of digital arts. It's also a fun and unique way to think using computers.
+Drawing to the screen (rendering) can take advantage of parallel operations. Shaders make it possible to create visuals that would otherwise be too slow or difficult, like realistic lighting simulations, post-processing effects, and rendering complex geometries. Learning shaders is valuable for anyone interested in graphics programming. This could be for game development, VFX for films, or for any kind of digital arts. It's also a fun and unique way to think using computers.
 
 To learn how **p5.strands** works, we'll create a 3D sketch using 4 different shaders, each illustrating different concepts. The code will be built incrementally throughout the tutorial, with the complete code available at the end.
 
@@ -77,15 +76,17 @@ A vertex shader and a fragment shader are both required to render anything in We
 }
 `} />
 
-Even without explicitly setting it, a shader calculates the lighting and shading of the fill, and another does the stroke. Uncommenting `shader(baseColorShader())` produces a different result, where the blue tint of the lighting does not affect the visible color of the sphere.
-
+Even without explicitly setting one, a shader is used to calculate the lighting and shading of the sphere's fill, and another handles the stroke.
+
 <Callout title="Tip">
-Try uncommenting the 2nd line in `draw` to use `baseColorShader()` instead. The blue tint of the light in `lightingSetup` is taken into account by the material shader.
+Try uncommenting the 2nd line in `draw` to use `baseColorShader()` instead.
 </ Callout>
 
-Alternatively, we can use the `baseColorShader` instead and the fill is now a solid white color, not affected by lighting. The filters available when you call `filter()` are also shaders which p5.js provides for you.
+When we instead use the `baseColorShader`, the sphere fill is now a solid white color, not affected by lighting.
 
-Part of the fun of WebGL is that you can write your own shaders, and it unlocks a lot of possibilities for creating which would be difficult to with the p2D renderer. They can also run a lot faster. In the next section, we will start the main part of the tutorial and learn how to use the new p5.js shading language.
+The filters available when you call `filter()` are also shaders which p5.js provides for you.
+
+Part of the fun of WebGL is that you can write your own shaders, and it unlocks a lot of possibilities for creating visuals which would be difficult to achieve with p5's default 2D renderer. They can also run a lot faster. In the next section, we will start the main part of the tutorial and learn how to use the new p5.js shading language.
 
 ## What is p5.strands?
 
@@ -95,7 +96,7 @@ In the rest of p5.js, we're accustomed to writing instructions that run sequenti
 
 Instead, shaders apply a set of instructions across all vertices or pixels simultaneously. Each vertex or pixel independently follows the same rules, but produces a unique result based on its position. Rather than explicitly drawing the circle, we instead ask each pixel individually, "Are you within a circle centered at `(10, 10)`? If so, change your color."
 
-The "strands" also refer to access points into the shader pipeline that let you modify specific aspects of rendering without building entire shaders from scratch. These access points allow you to modify specific aspects of the rendering process without needing to construct the entire shader from scratch.
+The "strands" also refer to access points into the shader pipeline that let you modify specific aspects of rendering without building the entire shader from scratch.
 
 To summarise, p5.strands is a JavaScript-based shading language which sits on top of GLSL. It allows you to write shaders in a regular `.js` file without writing GLSL in string literals. It also removes some setup code in the process, and integrates with the rest of p5.js, to bring learning and using shaders closer to the core of p5.js.
 
@@ -140,7 +141,7 @@ In this sketch, you should see a yellow sphere.
 2. Strands: A "strand" is a specific part of a shader you can modify. In our example, getFinalColor is a strand that lets you change an object's final color.
 3. Modifying shaders: The pattern works like this:
 - Choose a build function, e.g. `buildColorShader()`, `buildMaterialShader()`, `buildFilterShader()`, `buildStrokeShader()`
-- pass in your callback function
+- Pass in your callback function
 - Inside your callback, use the strand blocks such as `finalColor` to override different parts of the default shader's behavior.
 4. Vectors: Shaders work with vectors — collections of numbers that represent colors, positions, etc. In p5.strands, you can create vectors using array syntax `[x, y, z, w]` or the `vec4()` function.
 - For example, `[1, 1, 0, 1]` made a 4-element vector representing a color. The elements are Red, Green, Blue, and Alpha, all ranging between 0 and 1.
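Since these color components range from 0 to 1 rather than p5's familiar 0 to 255, a quick conversion sketch may help. This is ordinary JavaScript for illustration, not p5.strands code, and the helper name is ours:

```javascript
// Convert a familiar 0-255 RGBA color into the 0-1 range used by
// p5.strands vectors. `toStrandsColor` is a hypothetical helper name.
function toStrandsColor([r, g, b, a = 255]) {
  return [r / 255, g / 255, b / 255, a / 255];
}

// Opaque yellow [255, 255, 0, 255] becomes [1, 1, 0, 1].
console.log(toStrandsColor([255, 255, 0, 255]));
```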
@@ -149,17 +150,17 @@ In this sketch, you should see a yellow sphere.
 p5.strands provides the following builder functions. Click the links below for their reference which tells us what functions are available for overriding.
 - [`buildColorShader`](/reference/p5/buildColorShader): Builds the default shader type in WebGL mode.
 - [`buildMaterialShader`](/reference/p5/buildMaterialShader): Builds the type of shader automatically applied if you have any lights in the scene.
-- [`buildNormalShader`](/reference/p5/buildNormalShader): Builds a default normally applied by calling `normalMaterial()`. Often used in visually debugging geometry.
+- [`buildNormalShader`](/reference/p5/buildNormalShader): Builds a default shader normally applied by calling `normalMaterial()`. Often used in visually debugging geometry.
 - [`buildStrokeShader`](/reference/p5/buildStrokeShader): Builds a shader type used to shade the geometry of strokes in 3D modes.
 - [`buildFilterShader`](/reference/p5/buildFilterShader): Builds a shader type for post-processing such as those provided by p5.js, like `filter(BLUR)`.
 
 Now that we've built our first p5.strands modification, let's create a more complex scene! We'll use `buildColorShader` and `buildStrokeShader` to create 3D objects, then apply post-processing with `buildFilterShader` to enhance the final result.
 
 ## Building a scene
 ### Instancing particles
-Because the GPU excels at parallel computation, it can draw thousands or millions of particles simultaneously. We can do this with a technique called **GPU Instancing**.
+Because the GPU excels at parallel computation, it can draw thousands or even millions of particles simultaneously. We can do this with a technique called **GPU Instancing**.
 
-In GPU instancing, we ask the GPU to draw multiple copies of the same object, each with a unique ID (from 0 to n-1). We can then position each instance based on its ID. For example, placing objects at coordinates `[ID, 0, 0]` would create a line along the x-axis.
+In GPU instancing, we ask the GPU to draw multiple copies of the same object, each with a unique ID (from 0 to n-1). We can then position each instance based on its ID. For example, placing objects at coordinates `[ID, 0, 0]` would create a line of objects along the x-axis.
 
 In p5.js, instancing is available via optional parameters for `endShape` and `model`. It does also require a custom shader to work. In our case, let's use `model` and build a sphere shape.
 
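To build intuition for the ID-based placement described above, here is a CPU-side plain-JavaScript sketch. On the GPU the same rule runs once per instance in parallel; the function name here is ours:

```javascript
// Simulate GPU instancing placement on the CPU: each of n instances
// gets a unique ID from 0 to n-1, and the rule [ID, 0, 0] lays the
// copies out in a line along the x-axis.
function instancePositions(n) {
  const positions = [];
  for (let id = 0; id < n; id++) {
    positions.push([id, 0, 0]);
  }
  return positions;
}

console.log(instancePositions(3)); // [[0,0,0], [1,0,0], [2,0,0]]
```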
@@ -222,14 +223,14 @@ Let's start by offsetting instances along the x-axis using `baseColorShader()`:
 }
 `} />
 
-The worldInputs block contains data about the current vertex: `position`, `normal`, `texCoord`, and `color`. The "world" part indicates that this runs after JavaScript transformations like translate() or scale() have been applied.
+The `worldInputs` block contains data about the current vertex: `position`, `normal`, `texCoord`, and `color`. The "world" part indicates that this runs after JavaScript transformations like `translate()` or `scale()` have been applied.
 {/* TODO: have a pipeline diagram early that we repeatedly reference. */}
 
 <Callout title='World Space vs. Object Space'>
 Moving in world space is like moving relative to the entire scene, while object transformations are relative to the object's center.
 </ Callout>
 
-Now let's distribute our particles in a more interesting pattern - placing them randomly on a sphere:
+Now let's distribute our particles in a more interesting pattern: placing them randomly on a sphere:
 
 <EditableSketch code={`
 let instancingShader;
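Placing a point on a sphere from two angles uses the standard spherical-to-Cartesian mapping. The sketch's exact noise-based angles live in the code above; this plain-JavaScript version (names ours) just shows the mapping itself:

```javascript
// Standard spherical-to-Cartesian conversion: given a radius and two
// angles (theta from the pole, phi around the axis), produce an [x, y, z]
// point on the sphere's surface. In the shader, theta and phi would be
// derived per instance, e.g. from noise(ID).
function onSphere(radius, theta, phi) {
  return [
    radius * Math.sin(theta) * Math.cos(phi),
    radius * Math.sin(theta) * Math.sin(phi),
    radius * Math.cos(theta),
  ];
}

console.log(onSphere(1, 0, 0)); // the pole of a unit sphere: [0, 0, 1]
```

Every point this produces sits exactly `radius` away from the origin, which is what keeps the particles on the sphere's surface.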
@@ -277,10 +278,10 @@ Now let's distribute our particles in a more interesting pattern - placing them
 }
 `} />
 
-The noise() function is used to generate pseudorandom values based on the instance ID.
+The `noise()` function is used to generate pseudorandom values based on the instance ID.
 
 #### Adding movement to the particles
-Strands provides a function, `millis()`, that returns the number of milliseconds since a sketch started running. It will give the same value as the [normal p5.js millis() function](/reference/p5/millis/). We can use it to change each particle's position over time.
+Strands provides a function, `millis()`, that returns the number of milliseconds since a sketch started running. It will give the same value as the [standard p5.js `millis()` function](/reference/p5/millis/). We can use it to change each particle's position over time.
 
 <EditableSketch code={`
 let instancingShader;
@@ -343,7 +344,7 @@ By adding `millis() / 10000` to the phi angle and modifying the radius with `sin
 Try replacing `sin()` with `tan()`, `acosh()`, or combinations of functions. Small changes in shader code often create dramatically different visual effects, so experimentation goes a long way.
 </ Callout>
 
-#### standard p5.js variables available
+#### Standard p5.js variables available
 
 Some of the well-known p5.js global variables are made available to your shader when you use p5.strands.
 
@@ -359,7 +360,7 @@ Within a strands callback function, you can use any of these directly:
 * Time and frameCount: [deltaTime](/reference/p5/deltaTime),
 [frameCount](/reference/p5/frameCount).
 
-* Pointer (mouse, touch, etc): [mouseIsPressed](/reference/p5/mouseIsPressed),
+* Pointer (mouse, touch, etc.): [mouseIsPressed](/reference/p5/mouseIsPressed),
 [mouseX](/reference/p5/mouseX),
 [mouseY](/reference/p5/mouseY),
 [pmouseX](/reference/p5/pmouseX),
@@ -371,16 +372,18 @@ Within a strands callback function, you can use any of these directly:
 
 These are all numbers except `mouseIsPressed`, which is a boolean.
 
+These values are automatically passed from p5.js into your shader behind the scenes using "uniform variables" ("uniforms" for short). This is an information-passing mechanism you don't need to understand for now, but which we'll see again later.
+
 ### Fresnel effect
 If you've ever seen a material in a 3D render which appears to glow at the edges, or noticed how the light reflections appear to change on virtual water as you move your viewpoint, you were seeing the Fresnel effect. This effect changes how materials look when viewed at an angle.
 
 The Fresnel effect will be checking which parts of the shape are pointing away from the camera. For this reason, it is helpful for us to work in camera space. In camera space, also known as view space:
-- The camera is position at the origin `(0, 0, 0)`
+- The camera is positioned at the origin `(0, 0, 0)`
 - The camera looks along the negative Z-axis
 - All 3D positions are relative to the camera's perspective
 
 <Callout title='Camera Space'>
-This is another relative view of the virtual world, where the camera is positioned at `(0, 0)`, so everything else is position relative to it.
+Like Object Space and World Space, Camera Space is another relative view of the virtual world. In Camera Space, the current camera is positioned at `(0, 0, 0)`, so everything else is positioned relative to it.
 </ Callout>
 
 This perspective makes it easier to determine how surfaces appear to the viewer:
@@ -407,7 +410,7 @@ The line `let viewVector = normalize(-cameraInputs.position)` might seem counter
 1. In camera space, the camera is at `(0, 0, 0)`
 2. cameraInputs.position gives us the position of the current vertex being processed
 3. When we negate this (`-cameraInputs.position`), we get a vector pointing from the vertex toward the camera
-4. `normalize()` converts this to a unit vector (where every component is between 0 - 1), making it useful for direction calculations regardless of distance
+4. `normalize()` converts this to a "unit vector" (i.e., a vector with a length of 1), making it useful for direction calculations regardless of distance
 
 #### Calculating the Fresnel factor
 
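The steps above can be sketched in plain JavaScript. This runs per vertex on the GPU in the real shader; the function names here are ours, and the `pow(1 - dot(...), power)` form at the end is one common Fresnel approximation, not necessarily the tutorial's exact formula:

```javascript
// CPU-side sketch of the camera-space view vector and a common Fresnel term.
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const length = (v) => Math.sqrt(dot(v, v));

// Scale a vector to length 1 while keeping its direction.
function normalize(v) {
  const len = length(v);
  return v.map((c) => c / len);
}

// In camera space the camera sits at (0, 0, 0), so negating the vertex
// position yields a vector pointing from the vertex toward the camera.
function viewVector(vertexPosition) {
  return normalize(vertexPosition.map((c) => -c));
}

// A common Fresnel approximation: strongest where the surface normal is
// perpendicular to the view direction (dot product near 0).
function fresnelFactor(normal, view, power) {
  return Math.pow(1 - dot(normal, view), power);
}

const view = viewVector([0, 0, -5]); // a vertex 5 units in front of the camera
console.log(length(view)); // 1 (a unit vector, whatever the distance was)
```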
@@ -485,7 +488,7 @@ Notice the returned value above: `[col, 1]`. This is one of the ways in which ve
 
 
 <Callout title="Tip">
-Try using `mouseX` and `mouseY` for an interactive, color changing effect instead of the hardcoded pink. You will probably want to divide them down by `width` and `height` to get values in a 0 to 1 range.
+Try using `mouseX` and `mouseY` for an interactive, color-changing effect instead of the hardcoded pink. You will probably want to divide them down by their respective `width` and `height` to get values in a 0 to 1 range.
 </ Callout>
 
 #### Fine-tuning
@@ -504,12 +507,12 @@ Instead of leaving these fresnel variables hard-coded, try having them change ov
 </Callout>
 
 ## Post-processing
-Filter shaders are built in much the same way as any other shader in p5.strands. The only difference is that we are only concerned with the fragment shader in filters, the one which decides the color of what's on the screen. They work by taking a snapshot of the sketch every frame, and sending through a fragment shader to manipulate the color. There are effects which are only possible through post-processing in this way.
+Filter shaders are built in much the same way as any other shader in p5.strands. The only difference is that we are only concerned with the fragment shader in filters, the one which decides the color of what's on the screen. They work by taking a snapshot of the sketch every frame, and sending it through a fragment shader to manipulate the color. Some effects are only possible through post-processing in this way.
 
-For the filter shader built with `buildFilterShader()` there is only one hook block available, `filterColor`. This makes available these properties: `texCoord`, `canvasSize`, `texelSize`, and `canvasContent` - the snapshot of the sketch mentioned above, and its set() method allows us to set the color of the pixel being prepared.
+For the filter shader built with `buildFilterShader()` there is only one hook block available, `filterColor`. This makes available these properties: `texCoord`, `canvasSize`, `texelSize`, and `canvasContent` - the snapshot of the sketch mentioned above. Its `set()` method allows us to set the color of the pixel being prepared.
 
 ### Pixelating effect
-We can achieve a pixelation effect by sampling the color of the scene at fewer points than there are real pixels. If you are not familiar with texture coordinates, also known as UV coordinates, try setting `filterColor.set([filterColor.texCoord, 0, 1])` - this will expand to a 4-element vector, using the x and y of texCoord as red and green values, respectively, setting blue to 0, and setting alpha (opacity) to 1.
+We can achieve a pixelation effect by sampling the color of the scene at fewer points than there are real pixels. If you are not familiar with texture coordinates, also known as UV coordinates, try setting `filterColor.set([filterColor.texCoord, 0, 1])` - this will expand to a 4-element vector, using the x and y of the 2D `texCoord` as red and green values, respectively, setting blue to 0, and setting alpha (opacity) to 1.
 
 <EditableSketch code={`
 let pixelateShader;
@@ -534,9 +537,9 @@ We can achieve a pixelation effect by sampling the color of the scene at fewer p
 
 `}/>
 
-The top left corner of the screen is now black, as it has a value of `(0, 0)` in texture coordinates. Down in the bottom left, it's green, showing that the texture coordinates are `(0, 1)`, and there is red in the top right with a value of `(1, 0)`.
+The top-left corner of the screen is now black, as it has a value of `(0, 0)` in texture coordinates. Down in the bottom-left, it's green, showing that the texture coordinates are `(0, 1)`, and there is red in the top-right with a value of `(1, 0)`.
 
-With that in mind, let's manipulate the number of texture coordinates, so that more pixels will sample their color from the same place on the original texture. We do this proportionally to `filterColor.canvasSize` to get square pixels.
+With that in mind, let's manipulate the texture coordinates, so that more pixels will sample their color from the same place on the original texture. We do this proportionally to `filterColor.canvasSize` to get square pixels.
 
 <EditableSketch code={`
 let pixelateShader;
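The coordinate snapping this filter performs can be previewed in plain JavaScript. The function name and the example cell size are ours; the shader applies the same arithmetic per pixel:

```javascript
// Snap a 0-1 texture coordinate to a coarse grid so neighboring pixels
// sample the same spot on the snapshot, producing a pixelated look.
// Scaling by the canvas size (in pixels) before snapping keeps the
// resulting cells square.
function pixelateCoord(texCoord, canvasSize, cellSize) {
  return texCoord.map((t, i) => {
    const pixels = t * canvasSize[i];                         // 0-1 -> pixel units
    const snapped = Math.floor(pixels / cellSize) * cellSize; // snap to grid
    return snapped / canvasSize[i];                           // back to 0-1
  });
}

// Two nearby coordinates fall into the same 10px cell and now match:
console.log(pixelateCoord([0.51, 0.5], [400, 400], 10)); // [0.5, 0.5]
console.log(pixelateCoord([0.52, 0.5], [400, 400], 10)); // [0.5, 0.5]
```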
@@ -574,11 +577,11 @@ With that in mind, let's manipulate the number of texture coordinates, so that m
 This is the entire code for our pixelating shader. Put this filter at the bottom of our draw function to pixelate the entire scene by calling `filter(pixelShader)`. We also need to use `buildFilterShader(pixelateCallback)` to construct the shader as before.
 
 ### Bloom
-In postprocessing, bloom is an effect which makes the brightest parts of an image bleed out and cause the surrounding pixels to light up. This produces the sense that light is emitting from parts of the scene, but without any expensive lighting calculations.
+In post-processing, "bloom" is an effect which makes the brightest parts of an image bleed out and cause the surrounding pixels to light up. This produces the sense that light is emanating from parts of the scene, but without any expensive lighting calculations.
 
-Bloom works by taking a blurred version of the image and setting some threshold values. Anything above that threshold (i.e. the brightest parts of an image) will be added to the original, non blurred version of the image. This creates a glow around the original objects.
+Bloom works by taking a blurred version of the image and setting a brightness threshold. Anything above that threshold (i.e., the brightest parts of the image) will be added to the original, non-blurred version of the image. This creates a glow around the original objects.
 
-We could write the shader to produce the blurred image, but p5.js already has provided us with a `filter(BLUR)` which we can use. We will have to approach this effect slightly differently. Firstly, we will need to create a `p5.Framebuffer` object to capture the contents of the canvas before we blur it.
+We could write the shader to produce the blurred image, but p5.js already provides us with a `filter(BLUR)` which we can use. We will have to approach this effect slightly differently. Firstly, we will need to create a `p5.Framebuffer` object to capture the contents of the canvas before we blur it.
 
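The threshold-and-add recipe described above can be sketched per color channel in plain JavaScript. The function name and the simple additive clamp are ours, as an illustration of the idea rather than the tutorial's exact shader code:

```javascript
// Per-channel bloom composite on 0-1 color values: keep only the parts of
// the blurred image brighter than `threshold`, then add that excess back
// to the original so bright areas appear to glow.
function bloomChannel(original, blurred, threshold) {
  const excess = Math.max(blurred - threshold, 0); // only the brightest parts
  return Math.min(original + excess, 1);           // clamp to the valid range
}

console.log(bloomChannel(0.2, 0.1, 0.6)); // dim area: unchanged (0.2)
console.log(bloomChannel(0.9, 0.8, 0.6)); // bright area: clamps to 1 (glow)
```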
 ```js
 let originalImage;
