Last week, I introduced a particle effects demo named Trigonometric Particles. Today, I’m going to explain the mechanics behind making it work. But first…
What is a particle system?
I’m sure you can find many articles on the definition. A particle system is basically a collection of particles, controlled through some logic you define. Uh, so what’s a particle?
Think of smoke. Imagine it being composed of thousands of small bits of carbon matter. Which it is, in a manner of speaking. And there you have it: each small bit of carbon matter is a particle.
Particles and particle systems are commonly used to simulate smoke, fire, water and explosions. Anywhere there are large numbers of particles (for lack of a better word…) behaving in a certain way, you need a system to control them. You could manipulate them directly in the program, but then the logic wouldn’t be portable to other programs. “You need smoke in another program? Here, use this particle system class.”
The particle class I used in Trigonometric Particles has a 3D position with X, Y and Z coordinates (expressed as functions of time), a colour (red, green, blue), and a life counter (the length of its display or existence). Some particle systems also give their particles velocities, gravities and probably concepts I haven’t even thought of. Mine has few components, but the axis functions are complex.
What happens is, for every frame of animation, you update each particle with the correct calculations. Say, one particle was here, then at the next frame, it should be there, according to the simulation logic. And you iterate through the entire list of particles, updating each component as needed (such as decreasing the life counter).
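The demo itself isn’t written in Python, but the idea above can be sketched in a few lines. This is a minimal, hypothetical version: the class name, fields and the placeholder motion are my own illustration, not the demo’s actual code (which computes positions from trigonometric functions of time).

```python
class Particle:
    """A minimal particle: 3D position, colour and a life counter."""
    def __init__(self, x, y, z, r, g, b, life=1.0):
        self.x, self.y, self.z = x, y, z
        self.r, self.g, self.b = r, g, b
        self.life = life  # 1.0 = full strength, 0.0 = dead

def update_particles(particles, dt):
    """One frame of animation: move each particle and age it."""
    for p in particles:
        # Placeholder motion -- a real system would apply its own
        # simulation logic here (e.g. functions of time).
        p.x += 1.0 * dt
        p.life = max(0.0, p.life - 0.01)
```

Each call to `update_particles` is one frame: iterate the whole list, recompute each component, decrease the life counter.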
Usually, you don’t update the positional component directly. You update the velocity, and let the velocity influence the position instead. There could also be a global gravity (or local to the particle), which influences the velocity, which in turn influences the position.
So the order of updating probably goes like this: you update the position based on the current velocity. Then you update the current velocity based on the gravity component. Then you update the gravity component based on the logic you require. This happens in one frame of update. As you can imagine, that’s a lot of updates to stick your hand into in a program. It might be easier to pull the standard updating logic into a class.
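That update order can be sketched as follows. Again, this is an illustrative Python sketch, not the demo’s code; the initial velocity and gravity values are made up for the example, and here gravity stays constant (step 3 is where your own logic would change it).

```python
class MovingParticle:
    def __init__(self):
        self.x = self.y = 0.0
        self.vx, self.vy = 2.0, 3.0   # initial velocity: up and to the right
        self.gx, self.gy = 0.0, -9.8  # gravity pulls straight down

    def update(self, dt):
        # 1. Position changes according to the current velocity...
        self.x += self.vx * dt
        self.y += self.vy * dt
        # 2. ...then velocity changes according to gravity...
        self.vx += self.gx * dt
        self.vy += self.gy * dt
        # 3. ...then gravity itself could be updated (constant here).
```

Note that the position moves by the *old* velocity before gravity adjusts it, matching the ordering described above.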
I understand that this long chain of indirect updating creates a smoother particle simulation. Say you’re simulating a waterfall and drops of water are initially splashing to the right. As each water droplet is influenced by gravity (downwards), its velocity starts shifting downwards. It’s still moving fairly fast towards the right, but on each subsequent update it moves faster towards the ground, simulating the gravitational effect.
In my case, I’m directly manipulating the position because I don’t need that kind of smoothness. It’s taken care of by the axis functions. I’ll explain more in another article.
Oh, and the life counter? It usually starts at 1 and goes down to 0. It’s like an alpha setting, usually used when rendering the particle. So at 1, the particle is displayed at full strength. As the counter winds down to 0, the particle is rendered at greater and greater transparency, till you finally can’t see it. Depending on your logic, you might “revive” the particle by restoring it to full strength and starting over, or just let it “die”.
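The revive-or-die choice can be captured in a tiny helper. This is a sketch with a made-up function name and fade rate, just to show the shape of the logic:

```python
def step_life(life, fade_rate=0.02, revive=True):
    """Wind the life counter down one frame. When it hits zero,
    either revive the particle at full strength or let it die."""
    life -= fade_rate
    if life <= 0.0:
        return 1.0 if revive else 0.0
    return life
```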
Speaking of alpha values, here’s the bitmap I used for rendering my particles.
I blew it up from 32 by 32 pixels to 128 by 128 pixels so you can see it better. I think I used my Bryce renderer to generate this (yes, I know it’s weird using a full 3D graphics renderer to generate a small bitmap for a particle effects demo… but it’s the only tool I have to generate it! Anyway…). It starts as a white square, then I applied Gaussian filters to it so a circular white blob is in the center fading to black at the edges.
You can do this with Paint.NET as well (I didn’t have it back then). Version 3.36 allows you to use the Gradient tool. Just use the radial gradient.
In my demo, I mapped this bitmap as a texture to my particles. I tint the bitmap with the particle’s specified colour, and use the “height map” (greyscale) value of the bitmap as an alpha value. So if the particle is red, it’s rendered like the bitmap texture, but in red, fading gradually to transparency towards the edges.
Then I used the life counter to further augment the alpha value. So at full life, the centre of the bitmap is rendered at zero transparency (full opaqueness). As the life counter goes down, the bitmap is rendered at a transparency inversely proportional to the life counter. Just look at the point in my video where I said to fade the top of the tornado: this was what I was doing. Other times, I just rendered all particles at full strength. Just download my demo and play with it already!
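Per pixel, that combination boils down to one multiplication. The function below is my own illustrative sketch (not the demo’s rendering code): the greyscale texture value supplies the shape, the particle supplies the colour, and the life counter scales the final alpha.

```python
def shade(texture_value, r, g, b, life):
    """Combine a greyscale texture sample (0..1) with the particle's
    colour and life counter into an RGBA tuple. Bright texture centre
    -> opaque; dark edges -> transparent; low life -> everything fades."""
    return (r, g, b, texture_value * life)
```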
The bitmap texture is in black and white because its purpose is to give shape to the rendered output, which is a circular blob fading at the edges. Colour is provided by the particle’s properties. And it’s small in size, because my particles are small.
This also means that your particle system can use some other form of rendering the particles. For example, you could use a different texture, such as wispy puffs, which you could bunch together and animate and they could look like moving clouds.
Or you could render full 3D models using the positional information of the particles. Rendering texture-mapped quadrilaterals is faster than rendering 3D models, which matters because there are a lot of particles. But depending on your situation (maybe you need fewer particles) and the target computer’s speed, 3D models may be more suitable.
And that’s all I have to say for now. Want to ask about, or add to, my particular brand of particle systems (the one used in my demo), or particle systems in general? Post a comment below or email me.