Phaser World Issue 215

Oh boy, we have a good one for you this week! A new beta release of Phaser 4 that introduces some groundbreaking GPU tech for rendering millions of sprites, plus a comprehensive guide on Shader development, cool new games, and a massive team developer log.
🤯 1 Million+ Animated Sprites? Yes, please!
Here's something a little special - the final part of our new renderer just landed and Sprite GPU Layers are now available in Phaser v4 Beta 5. What can they do? For a start, they can easily handle rendering 1 million+ background sprites!
SpriteGPULayer is a new game object designed for ultra-high performance in background layers. Leverage the power of the GPU to render a hundred times faster than conventional sprites. And use tween-like animations to add life to "static" backgrounds. This allows for behaviors like animated characters, crowds of moving people, trees swaying in the wind, fading smoke rising from chimneys, raindrops falling, and many more.
We just published an article all about them, which includes 4 demos to play with.
📕 Phaser 4 Shader Guide now Available
Phaser 4 updates the WebGL rendering system to become more powerful and reliable than ever before. This guide explains the core concepts and how to use shaders in the new system.
We're excited to announce the release of our detailed guide on creating shaders in Phaser 4! This comprehensive resource walks you through everything you need to know about working with Phaser's new WebGL rendering system, "Phaser Beam".
Read more about the Shader Guide
🔠 Word Rummy
Word enthusiasts, get ready for a fresh take on letter-tile gaming! Stone Fruit Studios has just launched Word Rummy, an engaging new word game available exclusively as a Discord Activity. This clever mashup of beloved classics like Wordle, Scrabble, and Bananagrams brings strategic word-building directly to your Discord server.
🐍 Full Stack Snake Tutorial
In this episode of the 12-Minute AI Game Development Challenge, we’ll build a full-stack Snake game with Cursor IDE, Supabase, and Phaser.js — all in just 12 minutes and with just prompts (NO coding). Watch and learn how to use AI to build and develop games with Cursor.
🔨 Hammer Time
In a standout entry from the 16th Pirate Software Game Jam, developer drewhjava brings us Hammer Time, an intense survival game where you play as a trusty hammer that is both your lifeline and your greatest weapon. Created in just 16 days under the theme "You are the weapon," this game stands out among the jam's 1,772 submissions with its unique combat mechanics and strategic gameplay.
Phaser Studio Developer Logs
Things are getting hot in here! This is what the Phaser Studio team was up to last week…
👨‍💻 Rich - CTO & Founder
Part of my job includes talking to new potential customers. In addition to finding out about their companies, it also allows them to ask what features Phaser and Phaser Editor have. A recurring question, which was posed yet again today, is: How are you going to integrate AI into your products?
And it's a perfectly valid thing to ask of a software vendor now. Of course, there's no need to have it within Phaser itself. I'm not sure how that would even be possible anyway. But within Editor? That's a different matter. Developers are becoming so used to having Copilot and Cursor-like interfaces at their fingertips that it's becoming almost painful when they're not present, and purchase decisions are now based upon it. Code-level assistants are one thing. This would be relatively easy for us to add if we wanted to. Because Phaser has always been open-source, the models are full of a decade of learning material. They don't always appreciate the nuances of just how the API fully interacts - but, given enough prompting and patience, they can all muster up a decent quality Phaser game in next to no time.
That doesn't mean we should stop learning to code and let the tools take over, far from it. Yet it would be idiotic to bury my head in the sand and pretend this seismic shift isn't taking place because it is. And it's happening right now. So, what are we going to do about it? The short answer is: I'm not entirely sure yet, but it's something as a team we need to focus on very closely and carefully in the coming months. Right now, though, we've got the massive Phaser v4.0 release to focus on, the lesser, but no less important v3.88 release, and of course, Phaser Launcher and Phaser Editor updated to reflect all of this innovation.
Make no mistake, Phaser 4 is a superb release, and I cannot wait for it to be released this month. We've spent over a year working full-time on the brand-new renderer, and it shows. It's a giant leap forward in terms of performance and features - look at the Sprite GPU Layers we released in this issue for proof. Having that live, along with the other projects all coming to completion, will be truly exciting for us and the community.
In the meantime, I will keep talking to customers, listening to their needs and trying to find ways to make their lives easier. And if that means integrating AI into our products, so be it. It's not something I'm going to rush into. It's something that needs to be done right and done well. And that's going to take time. But it's also going to be worth it. Because the future of development will be very different from now. And we need to be ready for it. Thankfully, with all the tech we've been developing and releasing over the past year, we're in a perfect position to do just that.
👨‍💻 Francisco - Phaser Launcher
Welcome everyone to another week!
We're getting closer and closer to the first release of our Phaser Launcher. As you may know, "Phaser Launcher" is just our project name and not the final name of the software. Our team has been brainstorming potential names, and we've come up with some interesting ones:
Phaser Launcher
Phaser Editor Lite
Phaser Taco Blast Processing
Zero (this one is really solid!)
We haven't made a final decision yet, but we hope you'll like the name we will choose.
The Launcher is designed to be a tool that helps developers create games with Phaser, as well as an entry point for those who want to step into the world of game development. With that in mind, we've added the Phaser by Examples book to the launcher. We've even dedicated a section on the landing page to make it easier for you to access the book.
The Launcher features a Game Viewer, a component that allows you to preview your game’s progress. However, we encountered a small issue: to see any updates, you had to close and reopen the window manually.
After diving deep into how Tauri’s window management works, I discovered that I could inject code into the game server, allowing it to emit an event that requests an update for this component. Thanks to this, we now have a much faster update process—without needing to close the window!
Thanks for reading about the Launcher’s progress—stay tuned! 🚀
👨‍💻 Arian - Phaser Editor
Hello!
These have been weeks of hard work on the Box2D tools for Phaser Editor. Here is a summary of the most important features we have made progress on.
Body bounds
Since Box2D is not part of Phaser, I have had to implement some utilities to delimit the area that a body encompasses. This is used to determine when the user clicks on a body and also to highlight the object when it is selected.
New shapes
I have added two new shapes: Offset box and Circle.
Box2D glue-code generation
In a game, to create scenes with Box2D support, you have to write some code related to setting up the world, advancing the simulation, updating the sprites, displaying debug information, etc. All this code is now generated by the editor. You just have to activate the Generate Box2D World Code parameter. This means that the editor is now able to automatically generate all the code needed to get your Box2D scenes ready to run. Along with this parameter I have added others to control some optional aspects of the scene, such as implementing the "update" method and generating debug code.
World parameters
If you enable the code generation of the Box2D world, then the editor shows you the world parameters:
Auto import
All Box2D construction is done by calling the Box2D API functions. The editor will generate the "import" statements for all the functions used. For now, it looks for the "PhaserBox2D.js" file in the project and uses it as the source of the API.
Prefabs
I have started adapting the editor's prefab system to the Box2D particularities. In the next scene, all of the elements are made of prefabs that contain a Box2D body:
There is still a lot to do with the prefab system, since I need to implement a new way to configure nested prefabs by setting their parameters via constructor arguments instead of object properties (as it is now).
Finally, I leave you with a small video of a demo I'm making, inspired by Angry Birds. In the coming weeks we are going to release the editor with an early version of the Box2D tooling. It won't be feature-complete, but it will be powerful enough to make an Angry Birds-like game, all with in-scene visual tools.
👨‍💻 Can - Discord Activities
I wrapped up work on the guide and the template itself, providing a clear step-by-step approach. Dealing with all the details is not a straightforward experience, so we wanted to make the process as simple as possible. You have to go through steps like app creation, setting up your development environment, payout settings, dealing with proxies, verifying your app, creating your store, testing, debugging via the Discord app itself, bot settings, getting into production, and many more! If you get stuck on any part, or spot anything we can improve in the guide & repo, share it with us in #discord-activites on the Phaser Discord.
Now, I'm switching to Phaser Launcher to join forces with my great teammate, Francisco. I'll be focusing on the deployment part of the app for the time being.
Till the next one, keep sharpening your pixels!
Tales from the Pixel Mines
3rd February, 2025 - Ben Richards
We've released Phaser 4 Beta 5, and it's a big one. This release introduces SpriteGPULayer, a game object which uses WebGL to render lively backgrounds a hundred times faster than anything else in Phaser. I've had this in the works for some time, and it's great to finally have it out!
As if that wasn't enough, I released the first draft of the Beam Shader Guide, to help you write your own shaders and filters. It's available for logged-in Phaser members at https://phaser.io/tutorials/phaser-4-shader-guide.
I also made changes to internal texture orientation, and fixed lots of bugs.
SpriteGPULayer
SpriteGPULayer is a new game object designed for ultra-high performance in background layers. Leverage the power of the GPU to render a hundred times faster than conventional sprites. And use tween-like animations to add life to "static" backgrounds. This allows for behaviors like animated characters, crowds of moving people, trees swaying in the wind, fading smoke rising from chimneys, raindrops falling, and many more.
This means you can get over 1 million sprites running on modern devices. That's a lot! A 4K display only has 8.3M pixels, and a more common resolution of 1280x720 has fewer than 1M pixels - less than one pixel per sprite. That quickly dissolves into meaningless noise if you try to push the limits. I know, I've done it.
You can actually get more than a million sprites. The maximum buffer size seems to be around 8 million, but if you run out, just make a second SpriteGPULayer. High-end machines can handle it.
The SpriteGPULayer is designed for high efficiency backgrounds. The expected use case is for developers to create static background or foreground layers containing high detail and limited animation. This is common in video game worlds, so we think it's going to be very useful.
The "Big Forest" demo shows just how much you can do with this technology. It consists of 1.4 million sprites, set up with parallax and cyclic animation. Grass and trees sway, clouds drift by, grains of pollen glint in the light, and plenty more. But this is overkill: the scene is a thousand screens across. It would take you 60 minutes to scroll from one side of the game world to another. 1.4 million sprites is a lot, and every grain of pollen in the entire world is being recalculated every frame. It still runs smoothly.
(The Big Forest demo also uses basic procedural generation to place parallax ground layers, alter the weather, grow trees etc. if you like that sort of thing.)
My advice for SpriteGPULayer is to use it efficiently. Don't render a thousand screens of off-screen content - that wastes device power. Consider splitting the scene up into smaller regions which can render efficiently, and use multiple SpriteGPULayers to handle foreground, background, and far background parallax elements. Use it to shift computation off the CPU, to free up processing power for other tasks like physics and AI.
How It Works
SpriteGPULayer is designed to fix a major performance bottleneck: the CPU-GPU update cycle.
Most Phaser game objects are calculated in the CPU, packed into buffers, and sent to the GPU to render. The problem is, CPU computation is very limited in JavaScript. We're limited to a single thread, so we have to do every calculation one after the other. Then when we're done, we have to send the data to the GPU, a process which takes more time while we just wait. This all adds up to a lot of time per frame, limiting how much we can render.
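To make the bottleneck concrete, here is a minimal sketch of that conventional update cycle. This is illustrative toy code, not Phaser's actual renderer: every frame, every sprite is recomputed sequentially on one thread and repacked into a typed array before upload.

```javascript
// Toy model of the CPU-side per-frame work (not Phaser's real pipeline).
const FLOATS_PER_SPRITE = 4; // x, y, rotation, alpha in this toy example
const sprites = Array.from({ length: 1000 }, (_, i) => ({
  x: i, y: i * 2, rotation: 0, alpha: 1
}));
const buffer = new Float32Array(sprites.length * FLOATS_PER_SPRITE);

function packFrame(time) {
  for (let i = 0; i < sprites.length; i++) {
    // Every sprite is visited one after the other - no parallelism.
    const s = sprites[i];
    s.rotation = time * 0.001;
    const o = i * FLOATS_PER_SPRITE;
    buffer[o] = s.x;
    buffer[o + 1] = s.y;
    buffer[o + 2] = s.rotation;
    buffer[o + 3] = s.alpha;
  }
  // ...then the whole buffer is uploaded to the GPU while the CPU waits.
}

packFrame(16);
console.log(buffer[2]); // sprite 0's rotation, ~0.016
```

Multiply this loop by hundreds of thousands of sprites and the frame budget evaporates, which is exactly the cost SpriteGPULayer avoids.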
These problems have solutions. JavaScript has access to WebWorkers, and GPU updates are far more efficient with APIs like WebGPU. Unfortunately, WebWorkers have too much overhead for our current architecture, and WebGPU is not yet optimized enough to meet our performance needs.
What if we didn't have to use the CPU?
SpriteGPULayer packs data into buffers and sends them to the GPU at construction time. The data doesn't need to be updated. All the computations which are normally done on the CPU are instead handled by a vertex shader on the GPU. Because shaders are highly parallelized, all the work that takes a long time gets split up and run at the same time. This means we can run orders of magnitude more rendering work, and never run out of time.
You can update SpriteGPULayer data at runtime. It's just as efficient as ordinary Phaser objects. But because SpriteGPULayer can use so many more sprites, you may find that updates take so long the game stutters. The object has some optimizations for quickly updating contiguous regions of buffer memory, but even so, you should be cautious with updates to large layers.
Parallel GPU shader operations are so efficient that they will never be a bottleneck. The actual bottleneck is fill rate: how many pixels are drawn to the screen. The GPU has to handle every pixel on every sprite on the screen, even if it's behind another sprite. (Pixels not on the screen are not calculated.) Don't draw too many overlapping sprites, or performance will drop.
This all comes at a cost. If we don't update the data, we can't change it in response to input, physics, or other "dynamic" systems. A highly detailed background that doesn't move is pretty boring, even if it has parallax.
So I implemented an animation system. In fact, SpriteGPULayer supports two kinds of animation: frames and properties.
Frame animation is the ordinary kind of animation you see on sprites. A series of frames are displayed in sequence. SpriteGPULayer supports basic frame playback: it cycles frames at a fixed interval.
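Because the interval is fixed, the current frame can be derived from the clock alone, with no stored playback state. A conceptual sketch (a hypothetical helper, not the actual SpriteGPULayer API):

```javascript
// Stateless fixed-interval frame cycling (conceptual sketch).
function currentFrame(time, frameCount, msPerFrame) {
  return Math.floor(time / msPerFrame) % frameCount;
}

console.log(currentFrame(0, 4, 100));   // 0
console.log(currentFrame(250, 4, 100)); // 2
console.log(currentFrame(450, 4, 100)); // 0 - wrapped around
```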
Property animation is more like tweens. You set the base value of properties like x, y, alpha, or tintBlend. You can also set values like amplitude, duration, delay, and animation type. The SpriteGPULayer vertex shader uses these values to calculate the vertex positions at a given moment in time, without recording state or computing multiple steps. This is enough to support a great variety of lifelike animations.
I added animations which correspond to most tweens in Phaser. Because Elastic requires more parameters, it had to be left out. The Stepped tween supports extra parameters, but we skip them for the same reason. Also check out the unique animation type, Gravity, which simulates objects falling - perfect for effects like fountains, explosions, etc. You don't have to use Gravity just for the Y position; you could apply it to other properties to simulate some process which accelerates or decelerates over time.
There are a lot of technical details in how this fits together, but I don't want to be here all week, so I'll just skim the surface.
WebGL has some hard limits on how much data we can use per vertex. We assume that we have access to 16 vertex attributes. An attribute is a vector of 4 32-bit floating-point numbers, so in practice we can use 16x4=64 numbers to describe a sprite. As you saw above, we define a property with base, amplitude, duration, delay, animation type, and a 'yoyo' flag to control how animation repeats. That's 6 numbers per property, so we could have up to 10 properties in our program. We have (let's see...) 14.
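The budget arithmetic works out like this (a quick check of the figures above):

```javascript
// Attribute budget arithmetic from the text.
const attributes = 16;           // vertex attributes assumed available
const floatsPerAttribute = 4;    // each attribute is a vec4 of 32-bit floats
const budget = attributes * floatsPerAttribute; // 64 numbers per sprite

const numbersPerProperty = 6;    // base, amplitude, duration, delay, type, yoyo
const naiveLimit = Math.floor(budget / numbersPerProperty);

console.log(budget, naiveLimit); // 64 10 - yet the object exposes 14 properties
```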
Obviously we're doing something clever to exceed the hardware limitations.
Many properties aren't animated. For example, we have per-vertex color tint, just like regular Sprites; but we don't animate it. Animating 4 8-bit color channels inside a 32-bit value is pretty complicated! Instead we have a single animated tintBlend property which controls how much tint every vertex receives. This is enough to give decent control of colors.
And many properties are packed together, using the 32-bit number space to store more data than intended. For example, when you use the Gravity animation, we replace the animation amplitude with velocity and acceleration. How do we fit two numbers into a single number? Well, we make some precision trade-offs. We round down the velocity to an integer value, which is almost always good enough. Then we store acceleration as a fraction. There's a global gravity value for the whole layer, and this fraction multiplies that to get an individual gravity value for the sprite. Then we just add the integer and the fraction together to get a single number, which we store. On the GPU, we reverse the process to turn one number back into two. This might not be super efficient, but it's fast enough and gives us all the functionality we need! While there are limits on floating-point number precision, you've still got 23 bits to split between velocity and acceleration.
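The integer+fraction trick described above can be sketched in a few lines. The function names here are hypothetical and the real shader code differs, but the packing scheme is the one the text describes: velocity rounded down to an integer, acceleration stored as a fraction in [0, 1), and the two summed into one number:

```javascript
// Pack two values into one number: integer part = velocity,
// fractional part = acceleration (as a fraction of the layer's gravity).
function packVelocityAccel(velocity, accelFraction) {
  return Math.floor(velocity) + accelFraction;
}

// On the GPU, the process is reversed to recover both values.
function unpackVelocityAccel(packed) {
  const velocity = Math.floor(packed);
  return { velocity, accelFraction: packed - velocity };
}

const packed = packVelocityAccel(120.7, 0.25); // velocity loses its fraction
console.log(unpackVelocityAccel(packed));      // { velocity: 120, accelFraction: 0.25 }
```

The precision trade-off is visible in the example: the velocity's own fractional part (0.7) is discarded, which, as the text says, is almost always good enough.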
I haven't even mentioned how we use texture data to store frame animations. We really squeezed every drop of functionality out of this system. I'm pretty proud of the result, and I hope it helps you make better games.
Shader Guide
The Shader Guide draft tells you all about shaders and filters in Phaser 4. Log into the Phaser site and give it a look! This is a very powerful system for managing shader code. It also gives solid examples to get you writing your own shaders. We've already seen developers creating cool experiments, so I guess it works!
Texture Orientation
After examining various factors around external renderer compatibility, it became clear that we were handling textures wrong. GL texture coordinates start in the bottom-left, unlike the top-left of web standard image formats. We previously handled this with some options inside the render system, storing images in top-left format and doing a "final flip" to present the frame to the player. But when using the Extern object to add renders from ThreeJS and other engines, it was no longer quite good enough.
So I flipped texture representation. Now all our texture systems are bottom-left oriented.
This should not change anything for the user. Only if you're handling texture coordinates within the render system do you need to invert your Y axis.
However, for users of compressed textures such as PVR or KTX files, you must invert your texture coordinates for them to work properly in Phaser 4. This is a simple option in the software used to create these files. If you are using assets created for other systems, they are probably oriented this way already, because it is the default. We don't want you to have to create non-standard textures just for Phaser!
A summary of steps for creating high quality compressed textures for Phaser 4:
Generate a texture atlas in TexturePacker, and save it as PNG.
Use ImageMagick to lighten the image for hardware compression on the Web:
magick input.png -set colorspace RGB -colorspace sRGB output.png
Use TexturePacker or PVRTexTool to save the lightened image as the following texture formats, in a PVR or KTX container, ensuring that the Y axis is flipped:
ASTC sRGB UNorm 4x4 (or another sample level - don't select a "signed" variant!)
ETC2 sRGBA
PVRTC v1
S3TC DXT5 sRGB (also known as BC3)
Fixes and Tweaks
In Beta 5, I fixed DynamicTexture. It broke after we removed some theoretically dead code in Camera.
After the Beta 5 release, I added destroy handling for filters, including a setting to preserve filter controllers for reuse (not recommended, but possible if you know what you're doing). I also removed more dead code, this time from places where it was interfering with nested transforms, helping apply filters to objects inside Containers. The last thing I did was add filter support to Layer, which slipped through the cracks because it's the only game object that's not a GameObject. This is why code duplication is bad, kids! It's not just extra package size; it's a maintenance issue.
Next Week
I've got a pile of bugs and tweaks to work through to get Phaser 4 ready to roll. But we're close! We had an absolutely titanic week last week, and beta feedback continues to be excellent. Thanks to all the people who are helping us create the best Phaser there's ever been.
Share your content with 18,000+ readers
Have you created a game, tutorial, code snippet, video, or anything you feel our readers would like?
Please send it to us!
Until the next issue, happy coding!