[Pixel Post] Level‑Up Your Pixel Art: Walk‑Cycle Animation with Flux, IP‑Adapter V2 & ControlNet (2025 Guide)


TL;DR – Feed two or three clean reference sprites + a stick‑figure pose frame into Flux.1, crank the IP‑Adapter dial to ~0.8 and the ControlNet (pose) dial to ~1.1, iterate for eight frames, then stitch the PNGs into a sprite sheet. You just animated a character while your coffee was still hot.
Why Flux?
Native pixel‑art bias. Flux.1’s rectified‑flow backbone loves hard‑edged sprites; it doesn’t hallucinate watercolours the way SD‑XL sometimes does. (huggingface.co)
Speed. The Schnell variant gets a 64 × 64 frame out in ~0.6 s on a mid‑tier GPU.
Drop‑in adapters. Former DreamBooth chores are now one‑click IP‑Adapter loads – no fine‑tunes, no 10‑GB checkpoints. (huggingface.co)
Stuff You Need
| Role | 2025 Best Pick | Notes |
| --- | --- | --- |
| Base model | black‑forest‑labs/FLUX.1-dev or FLUX.1‑schnell | The heavy dev build looks nicest; Schnell is 2–3× faster. (huggingface.co) |
| Character reference glue | Flux IP‑Adapter V2 (XLabs‑AI/flux-ip-adapter-v2) | Multi‑image, multi‑style; the ComfyUI node ships in the repo. (huggingface.co) |
| Pose forcing | Flux ControlNet‑Union‑Pro 2.0 (pose mode) | Set mode="pose" and you get OpenPose fidelity without the SD baggage. (huggingface.co) |
| Face lock (optional) | PuLID‑FLUX v0.9.1 | Keeps the mug on‑model; lighter than InstantID. (huggingface.co) |
| Front‑end | ComfyUI 0.3.39+ with the comfyui‑ipadapter‑flux extension | The node bundle is literally one click in the manager. (runcomfy.com) |
Add Python ≥ 3.10 and a recent diffusers (≥ 0.30, the first release with Flux support; the Flux IP‑Adapter loader wants newer still) if you'd rather script it.
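If you do go the scripting route, a ten‑second smoke test confirms the stack actually loads. A minimal sketch using the ungated Schnell checkpoint – the prompt and output filename are just placeholders:

import torch
from diffusers import FluxPipeline

# FLUX.1-schnell is step-distilled: ~4 steps, guidance 0.0
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16).to("cuda")
img = pipe("32px pixel art knight, flat colours",
           num_inference_steps=4, guidance_scale=0.0).images[0]
img.save("smoke_test.png")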
Step 1 – Prep Your Reference Frames
Pick 2–3 clean sprites. Front idle, three‑quarter idle, maybe a jump frame. Transparent PNG, no background cruft.
Upscale 4× (nearest‑neighbour – see the snippet below). Keeps the pixel grid crisp while giving the adapter something to chew on.
Double‑check colour palette. IP‑Adapter loves to copy colours verbatim – good for consistency, embarrassing for stray UI magenta.
Tip: Flux works fine with 64 × 64 refs; no need for 512 monsters. The adapter will down‑sample anyway.
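That 4× upscale is a Pillow one‑liner; Image.NEAREST keeps every logical pixel a hard square (ref_idle.png stands in for whichever ref you picked):

from PIL import Image

# 4x nearest-neighbour upscale: no smoothing, the pixel grid stays crisp
ref = Image.open("ref_idle.png")
ref.resize((ref.width * 4, ref.height * 4), Image.NEAREST).save("ref_idle_4x.png")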
Step 2 – Get Your Walk‑Cycle Silhouettes
Grab an 8‑frame walk template (Aseprite → File ▸ Export ▸ PNG Sequence).
Set background to pure white, character to solid black – OpenPose reads edges, not art.
Export each frame as pose_00.png … pose_07.png.
If you're lazy, import your sprite into ase‑pose‑helper (free) and auto‑trace stick figures – or fall back on the Pillow hack below.
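No tracing tool handy? A crude but serviceable silhouette falls out of the alpha channel. A rough sketch – a proper stick‑figure trace reads better for pose detection, and the filenames are placeholders:

from PIL import Image

sprite = Image.open("ref_idle.png").convert("RGBA")
alpha = sprite.getchannel("A")
# Any opacity -> black (0), fully transparent -> white (255)
silhouette = alpha.point(lambda a: 0 if a > 0 else 255)
silhouette.save("silhouette.png")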
Step 3 – Install the Toybox
# 1. ComfyUI (if you don't already have it)
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI && python -m venv venv && . venv/bin/activate
pip install -r requirements.txt
# 2. Flux checkpoint into models/checkpoints
#    (gated repo – needs an HF token; see the Python alternative below)
wget -P models/checkpoints https://huggingface.co/black-forest-labs/FLUX.1-dev/resolve/main/flux1-dev.safetensors
# 3. Flux IP-Adapter nodes (or install "x-flux-comfyui" via ComfyUI-Manager)
git clone https://github.com/XLabs-AI/x-flux-comfyui.git custom_nodes/x-flux-comfyui
# 4. ControlNet-Union-Pro weights into models/controlnet
#    (check the repo page for the exact filename)
wget -P models/controlnet https://huggingface.co/Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro-2.0/resolve/main/diffusion_pytorch_model.safetensors
(Windows folk: drag‑and‑drop the .json node bundles into ComfyUI‑Manager, click Install, done.)
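If wget chokes on the gated FLUX.1-dev repo, huggingface_hub reuses your huggingface-cli login token. A sketch – double‑check the filename against what the repo currently ships:

from huggingface_hub import hf_hub_download

# Downloads straight into ComfyUI's checkpoint folder, auth handled for you
path = hf_hub_download(
    repo_id="black-forest-labs/FLUX.1-dev",
    filename="flux1-dev.safetensors",
    local_dir="models/checkpoints")
print("saved to", path)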
Step 4 – Build the Graph
Checkpoint Loader → flux1-dev.safetensors
IP‑Adapter‑Flux Node
  - Input images: ref_idle.png, ref_jump.png
  - Scale: 0.80
Flux Preprocess – DWPose
  - Image: pose_00.png
Flux ControlNet (Union‑Pro 2.0, mode=pose)
  - Weight: 1.10
  - Guidance End: 0.65
(Optional) PuLID‑FLUX
  - Weight: 0.75
K‑Sampler
  - Euler‑a, 24 steps, CFG 5.0
Save Image
Hit Generate.
Hand drift? Bump ControlNet to 1.25. Colours washed‑out? Nudge IP‑Adapter scale to 0.9.
Repeat for each pose_XX.png – or script the repetition, as sketched below.
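Prefer to stay in ComfyUI rather than jump ahead to Step 5? You can queue the same graph once per frame over ComfyUI's local HTTP API. A minimal sketch – walk_cycle_api.json is a hypothetical File ▸ Export (API) dump of the graph above, and the "12" node id is a placeholder you'd look up in your own export:

import json
import urllib.request

with open("walk_cycle_api.json") as f:  # exported via File ▸ Export (API)
    workflow = json.load(f)

for i in range(8):
    # Re-point the pose image loader at the next frame ("12" is a placeholder id)
    workflow["12"]["inputs"]["image"] = f"pose_{i:02}.png"
    req = urllib.request.Request(
        "http://127.0.0.1:8188/prompt",  # default ComfyUI port
        data=json.dumps({"prompt": workflow}).encode(),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)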
Step 5 – Automate with Diffusers (Loop‑All‑Frames)
import glob

import torch
from diffusers import FluxControlNetModel, FluxControlNetPipeline
from PIL import Image

# Pose ControlNet – no subfolder or mode flag; Union-Pro 2.0 ships as a single
# multi-task model. It was trained against FLUX.1-dev, so use the dev base
# (swap in FLUX.1-schnell if VRAM or speed bites – see Gotchas)
controlnet = FluxControlNetModel.from_pretrained(
    "Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro-2.0",
    torch_dtype=torch.bfloat16)
pipe = FluxControlNetPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", controlnet=controlnet,
    torch_dtype=torch.bfloat16).to("cuda")

# IP-Adapter (needs a recent diffusers build; note Flux has no classic
# negative prompt – recent releases expose one via true_cfg_scale instead)
pipe.load_ip_adapter(
    "XLabs-AI/flux-ip-adapter-v2", weight_name="ip_adapter.safetensors",
    image_encoder_pretrained_model_name_or_path="openai/clip-vit-large-patch14")
pipe.set_ip_adapter_scale(0.8)

ref_imgs = [Image.open(p) for p in ("ref_idle.png", "ref_jump.png")]
for pose_path in sorted(glob.glob("pose_*.png")):
    frame = pipe(
        prompt="32px pixel art character, clean outline, flat colours, no background",
        ip_adapter_image=ref_imgs,
        control_image=Image.open(pose_path),
        controlnet_conditioning_scale=1.1,
        num_inference_steps=20,
        guidance_scale=5.0,
    ).images[0]
    frame.save(pose_path.replace("pose_", "walk_"))
Eight files later and you have walk_00.png … walk_07.png.
Step 6 – Stitch a Sprite Sheet
from PIL import Image

# Load the eight generated frames and paste them side by side into one sheet
frames = [Image.open(f"walk_{i:02}.png") for i in range(8)]
w, h = frames[0].size
sheet = Image.new("RGBA", (w * 8, h))
for i, fr in enumerate(frames):
    sheet.paste(fr, (i * w, 0))
sheet.save("walk_cycle.png")
Drop walk_cycle.png into Aseprite, set Frame Duration = 100 ms, press Play. Boom – looping animation.
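No Aseprite on this machine? Pillow will write the loop straight to an animated GIF for a quick sanity check:

from PIL import Image

frames = [Image.open(f"walk_{i:02}.png").convert("RGB") for i in range(8)]
# duration is milliseconds per frame; loop=0 loops forever
frames[0].save("walk_preview.gif", save_all=True,
               append_images=frames[1:], duration=100, loop=0)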
Common Gotchas
| Symptom | Fix |
| --- | --- |
| Grid mis‑alignment | Resize outputs back to your native sprite size with nearest‑neighbour; if logical pixels are 10× physical, downsample by exactly 10× (see the sketch below the table). |
| Muddy details | Flux has a “Retro‑Pixel LoRA”; load it with pipe.load_lora_weights() at 0.3 strength for extra chunky charm. |
| Huge VRAM spike | Use Schnell + FP8 and cut steps to 12 – good enough for 32 × 32 sprites. |
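For that grid mis‑alignment row, the exact‑factor downsample looks like this (a sketch assuming a clean 10× ratio between render size and native sprite size):

from PIL import Image

FACTOR = 10  # logical-to-physical pixel ratio of your render

frame = Image.open("walk_00.png")
small = frame.resize((frame.width // FACTOR, frame.height // FACTOR), Image.NEAREST)
small.save("walk_00_native.png")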
Final Thoughts
What used to take a week of onion‑skinning in Aseprite now takes fifteen minutes and a GPU. Flux + IP‑Adapter V2 locks your character, Union‑Pro nails the pose, and a tiny Python loop spits out the frames. Spend the time you saved on coffee, new level designs, or – dare I say – shipping the game.
Go forth and animate. And remember: if your sprite can walk, your game can run.