💡 Next.js App Router’s Hidden Flows: Server Actions, Cookies & RSC Re-renders

Table of contents
- 🔍 Introduction
- 🍪 Behind the Scenes: How Cookie Updates Invalidate the Cache
- ⚡ Streaming Fresh UI: How RSC Payloads Update the Client
- 🧩 When "Seamless" Becomes "Puzzling": A Real-World Case Study
- 🔬 Debugging the "Hidden Flows": My Journey to Understanding
- ✅ The Robust Solution: Leaning into Next.js’s Design
- 📝 Key Takeaways for Next.js Developers
- 👋 Conclusion
- 📚 References

🔍 Introduction
Ever landed on a page in your Next.js app and wondered why the UI didn’t quite reflect your latest changes, without you clicking refresh? You’re not alone. While the App Router’s Server Actions and React Server Components (RSC) promise super-fast updates, subtle under-the-hood mechanisms, especially around caching and cookie updates, can occasionally lead to unexpected UI behavior.
In this walkthrough, we’ll:
- Peek Inside: What happens when you call `cookies().set()` or `cookies().delete()` in a Server Action.
- Stale Cache: How Next.js flags the Router Cache as stale and schedules a fresh RSC build.
- Automatic Updates: How the updated UI streams to your browser automatically, with no manual `revalidatePath()` or page reload required.
- Real-World Fixes: Dive into two real chat-app scenarios, explore the odd behavior we encountered, and share the simple fixes that made everything click.
By the end, you’ll know exactly how these hidden flows work and have clear, battle-tested patterns to keep your Next.js app lightning fast and perfectly in sync. Ready? Let’s go! 🚀
🍪 Behind the Scenes: How Cookie Updates Invalidate the Cache
Imagine a route that shows user-specific data (profile picture, name, etc.) based on an auth cookie. When you run a Server Action that writes or deletes that cookie, Next.js automatically triggers a cache reset:
- Cache Stale: Next.js marks the route’s cache as stale. Any previously cached Server Component output for that path is discarded.
- Fresh Build: Immediately after the Server Action finishes, Next.js builds a fresh RSC payload on the server.
- UI Update: Next.js streams the new build straight to the client as soon as it’s ready, updating your UI without waiting for another navigation or refresh.
Here’s an example Server Action (in `actions/auth.js`) that sets an authentication token:

```js
// In actions/auth.js
import { cookies } from 'next/headers';

export async function addAuthToken(token) {
  'use server';
  cookies().set('auth', token, { path: '/' });
}
```
Notice there’s no call to `revalidatePath()` or `router.refresh()`. Next.js handles cache invalidation and ensures your UI will reflect the updated cookie value on the next render.
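To see the other side of that flow, here is a minimal sketch of a Server Component that reads the cookie the action sets. The file path and the `getUserFromToken` helper are illustrative assumptions, not part of the original app:

```jsx
// In app/profile/page.js — illustrative sketch
import { cookies } from 'next/headers';
import { getUserFromToken } from '@/lib/auth'; // hypothetical helper

export default async function ProfilePage() {
  // Reading the cookie during render ties this route's output to the auth state.
  const token = cookies().get('auth')?.value;
  const user = token ? await getUserFromToken(token) : null;

  // After addAuthToken() runs in a Server Action, the Router Cache for this
  // path is stale, so the next streamed render sees the new cookie value.
  return <p>{user ? `Signed in as ${user.name}` : 'Signed out'}</p>;
}
```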
💡 Key takeaway: Updating cookies inside a Server Action is all you need to trigger a cache reset and guarantee fresh, up-to-date data in your Server Components.
⚡ Streaming Fresh UI: How RSC Payloads Update the Client
Once Next.js has invalidated the cache and built a new React Server Components (RSC) payload on the server, it doesn’t wait for you to navigate or refresh. Instead, it streams updated UI chunks straight to the browser:
- Action Completes: Server Action completes and invalidates the route cache.
- RSC Render: Next.js renders the new RSC tree on the server, serializing it into chunks.
- Streaming Begins: As each chunk is ready, Next.js begins streaming it down to the client.
- DOM Merge: React’s concurrent model merges those chunks into the DOM, updating only what changed.
- Snappy UI: The DOM updates happen incrementally, so your UI feels incredibly snappy.
This streaming model means your app can display fresh data almost instantly, without full page reloads or extra fetch calls. Users see updated content as soon as the server produces it.
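Server Actions reuse the same streaming machinery that powers `Suspense` on first load, which is the easiest place to observe chunked RSC payloads. Here is a minimal sketch with a simulated data source (the helper and markup are illustrative, not from the original app):

```jsx
// An illustrative Server Component page; the data source below is simulated.
import { Suspense } from 'react';

// Simulated slow data source standing in for a real fetch.
async function fetchMessages() {
  await new Promise((resolve) => setTimeout(resolve, 500));
  return [{ id: 1, text: 'Hello from the server' }];
}

async function Messages() {
  const messages = await fetchMessages();
  return (
    <ul>
      {messages.map((m) => (
        <li key={m.id}>{m.text}</li>
      ))}
    </ul>
  );
}

export default function Page() {
  // The fallback appears immediately; the Messages chunk streams in
  // as soon as the server finishes rendering it.
  return (
    <Suspense fallback={<p>Loading messages…</p>}>
      <Messages />
    </Suspense>
  );
}
```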
🚀 Pro tip: Streaming reduces time-to-interactive. Your page appears and updates faster, boosting perceived performance.
🧩 When "Seamless" Becomes "Puzzling": A Real-World Case Study
While the automatic cache invalidation and RSC re-rendering by Server Actions are incredibly powerful, their subtle timing and interaction with client-side navigation can lead to unexpected challenges if not fully understood. I discovered these complexities firsthand while building a dynamic chat application, which serves as a perfect illustration of these "hidden flows."
My application featured a chat page designed to handle dynamic conversations. The page used an optional catch-all route segment, `[[...slug]]`, meaning it could render initially without a specific `chatId` in the URL (e.g., `/chat`) and then seamlessly transition to `/chat/someChatId` once a conversation began.
The core message sending flow was designed to be efficient and interactive:
- User Lands: A user lands on the generic chat page (e.g., `/chat`, without a `/:chatId` in the URL initially).
- First Message: On sending the first message, we optimistically update the UI (with the sent message, to provide instant feedback) and call a Server Action.
- Server Action's Role: This Server Action first creates a unique `chatId`, processes the message on the server, generates updates for that message, and finally sends the updated message data (including the new `chatId`) back to the client.
- Client-Side Logic: Upon receiving the Server Action's response, my client-side code would (see the sketch just after this list):
  - Update the URL with a shallow client-side update via `window.history.pushState` (a common practice for changing the URL without a full page reload, which Next.js also supports), moving the user to the `/:chatId` page.
  - Remove the optimistic message and update the displayed messages with the actual, confirmed data from the Server Action's response.
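Here is a simplified sketch of that client-side handler. The `sendMessage` Server Action, its module path, and the message shape are illustrative assumptions rather than the app's real code:

```jsx
'use client';

import { useState } from 'react';
import { sendMessage } from '@/actions/chat'; // hypothetical Server Action

export default function ChatComposer() {
  const [messages, setMessages] = useState([]);
  const [text, setText] = useState('');

  async function handleSend() {
    // 1. Optimistic update for instant feedback.
    const optimistic = { id: 'optimistic', text, pending: true };
    setMessages((prev) => [...prev, optimistic]);
    setText('');

    // 2. Server Action creates the chatId and returns the confirmed message.
    const { chatId, message } = await sendMessage(text);

    // 3. Shallow URL update to /chat/{chatId} — the immediate call that
    //    Issue 1 below ends up wrapping in setTimeout(0).
    window.history.pushState({}, '', `/chat/${chatId}`);

    // 4. Swap the optimistic entry for the confirmed data.
    setMessages((prev) => prev.map((m) => (m.id === 'optimistic' ? message : m)));
  }

  return (
    <div>
      <input value={text} onChange={(e) => setText(e.target.value)} />
      <button onClick={handleSend}>Send</button>
    </div>
  );
}
```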
It all sounded logical on paper, leveraging Next.js's capabilities for a smooth user experience. However, this seemingly straightforward process led to two distinct, highly confusing issues, stemming directly from the core mechanisms discussed above.
The Puzzling Behaviors Observed
My journey into the nuances of Next.js’s App Router deepened significantly as I tried to reconcile client-side updates with server-side re-renders, especially when the Router Cache was involved.
Our Expectation: A seamless, shallow URL update without affecting the current component tree or triggering a full server re-render.
🐛 Issue 1: The `window.history.pushState` Re-render Mystery
When `window.history.pushState` was called immediately after the Server Action:

```js
window.history.pushState({}, '', `/chat/${chatId}`);
```

The server would re-render the entire page and send a new RSC payload back to the client. This completely negated the purpose of a shallow update and caused a noticeable flicker or overwrite.
Reason: Calling `pushState` immediately prompts the Router to start a new RSC render: Next.js treats the URL change as a signal to refetch that route, creating a race condition with the Server Action's completion.
Workaround: We found a peculiar workaround by wrapping the `window.history.pushState` call in a `setTimeout(0)`:

```js
setTimeout(() => {
  window.history.pushState({}, '', `/chat/${chatId}`);
}, 0);
```
With this in place, the shallow redirection worked perfectly. Critically, the `setTimeout(0)` stopped the unexpected server re-rendering: no new RSC payload came down, and the UI remained stable for this scenario. This points to a subtle race condition or timing conflict between how Next.js's Router processes client-side URL changes and its internal Server Action completion logic; the tiny delay lets the current event loop clear before the `pushState` executes, preventing the extra server round-trip.
👻 Issue 2: UI Reverting After Cookie Expiration / Cache Invalidation
In our setup, we use two cookies for session management: `__at` (access token) and `__rt` (refresh token). Our custom middleware is responsible for checking and refreshing these tokens.
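To make the setup concrete, here is a stripped-down sketch of middleware along those lines. The cookie names match the ones above, but `refreshSession`, the matcher, and the cookie options are illustrative assumptions:

```js
// In middleware.js — simplified sketch, not the app's real middleware
import { NextResponse } from 'next/server';
import { refreshSession } from '@/lib/auth'; // hypothetical helper

export async function middleware(request) {
  const accessToken = request.cookies.get('__at')?.value;
  const refreshToken = request.cookies.get('__rt')?.value;

  // If the access token is missing but a refresh token exists, refresh the session.
  if (!accessToken && refreshToken) {
    const { newAccessToken, newRefreshToken } = await refreshSession(refreshToken);

    const response = NextResponse.next();
    // Writing cookies on the response is what ends up invalidating the
    // Router Cache for the affected routes, as described earlier.
    response.cookies.set('__at', newAccessToken, { httpOnly: true, path: '/' });
    response.cookies.set('__rt', newRefreshToken, { httpOnly: true, path: '/' });
    return response;
  }

  return NextResponse.next();
}

export const config = { matcher: ['/chat/:path*'] };
```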
Imagine this sequence:
- User lands on `/chat` (no `chatId` in the URL yet).
- The session cookie (`__rt`) expires in the browser shortly after.
- User sends the first message (triggering the optimistic UI update + Server Action).
- Middleware intervenes: On the same HTTP request that runs our Server Action, our middleware sees the expired `__rt`, refreshes it, and, in doing so, invalidates the Router Cache.
- Server Action completes: The Server Action successfully creates a new `chatId` and processes the message on the server.
- As soon as the action completes:
  - Server-side Re-render: Next.js re-renders the original `/chat` route. Because the server hasn't yet registered the client's `pushState`, it streams an empty `initialMessages` array to the client.
  - Client-side Update: The Server Action's response arrives. The client then calls `window.history.pushState` (within the `setTimeout`) to update the URL to `/chat/{chatId}` and updates its UI with the new message data from the Server Action.
At this point, your UI, having already removed the optimistic message and applied the Server Action’s real response, would unexpectedly revert back to its initial page load UI (i.e., an empty chat list).
I quickly realized that in both these scenarios, the main culprit was the server re-rendering and sending a new RSC payload. If my local messages state was being updated based on the Server Action's response, and then a new, possibly conflicting, `initialMessages` array arrived from the server's re-render, the UI would inevitably jump back.
The Crucial Insight: This mysterious reversion was tied directly to Router Cache invalidation. When the cache was invalidated (e.g., `__rt` expired and was refreshed by middleware during the Server Action flow), the server triggered a full RSC re-render and streamed a new payload. Crucially, at that precise moment of the server-side re-render, the URL on the server's side still represented the page before our client-side `window.history.pushState` had run, so the server had no knowledge of the new URL. Since the `chatId` wasn't yet present in the URL (it was still the `/chat` route from the server's perspective during this re-render), the server-side component would fetch data for a non-existent `chatId`, resulting in an empty `initialMessages` array being streamed down. This empty array would then overwrite our client's updated state, causing the dreaded UI revert.
🔬 Debugging the "Hidden Flows": My Journey to Understanding
Faced with these inconsistent UI states, my debugging journey became a deep dive into the very core of Next.js App Router's lifecycle. My primary challenge was reconciling client-side state updates with potentially conflicting server-side RSC payloads.
One immediate thought was to control message updates via a `useEffect` hook on the client. However, this presented a dilemma: if `useEffect` kept re-syncing my `messages` state from the server's `initialMessages`, every new (and sometimes empty, as discovered in Issue 2) payload streamed down would stomp on the state I had just set from the Server Action's response. My goal became to prevent `useEffect` from re-syncing messages after they were initially loaded, to avoid this constant contention.
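A rough reconstruction of that first, naive sync is below; it faithfully applies whatever the server streams down, which is exactly the contention described above (rendering is omitted for brevity):

```jsx
'use client';

import { useState, useEffect } from 'react';

export default function ChatDisplay({ initialMessages }) {
  const [messages, setMessages] = useState(initialMessages);

  // Naive sync: every streamed initialMessages overwrites local state —
  // including the empty array from the early re-render in Issue 2.
  useEffect(() => {
    setMessages(initialMessages);
  }, [initialMessages]);

  return null; // rendering omitted for brevity
}
```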
But then, a new problem emerged: sometimes I did need the UI to refresh with fresh data (for instance, after an explicit `router.refresh()` call triggered by some other user action or data change). If my `useEffect` was removed or configured to run only once, how could I trigger updates when I did want them, without it fighting the server's `initialMessages`?
This conflict led me down an interesting path: trying to differentiate requests based on HTTP headers within my Server Components. My idea was to discern whether a server re-render was an intentional `router.refresh()` (triggered by my client code) or an automatic one (triggered by cache invalidation or the `window.history.pushState` quirks).
I explored the headers that Next.js sends with requests, specifically the `headers()` function available in Server Components:
```jsx
import { headers } from 'next/headers';
import ClientChatComponent from './ClientChatComponent';

async function MyServerComponent({ initialMessages }) {
  const headerStore = headers();

  const cacheControl = headerStore.get('cache-control');
  const nextAction = headerStore.get('next-action');
  const nextUrl = headerStore.get('next-url');

  // My attempt to identify router.refresh()
  const routerRefresh = !(cacheControl || nextAction || nextUrl);

  // Pass routerRefresh to the client component
  return (
    <ClientChatComponent
      initialMessages={initialMessages}
      routerRefreshDetected={routerRefresh}
    />
  );
}
```
My reasoning was:
- `cache-control` was typically present only on a full document request (like a hard refresh).
- `next-action` would be present for a Server Action call.
- `next-url` would indicate a soft client-side navigation.

If none of these headers were present, I theorized it must be an internal `router.refresh()`, and I could pass `routerRefresh` to the client to conditionally update messages.
However, I quickly realized this was a hack reliant on internal Next.js headers that are undocumented and could change in future versions. Relying on such internal behavior is a risky strategy for production applications. So, I opted out of this solution, seeking a more robust and idiomatic approach.
✅ The Robust Solution: Leaning into Next.js’s Design
After navigating through the complexities of cache invalidation, timing issues, and abandoned hacks, the solution emerged as surprisingly elegant and robust. It's a pattern that truly aligns with Next.js's design philosophy, particularly its approach to RSC payload streaming and data consistency.
The core of the problem, especially in "Issue 2" (UI reverting to empty messages), was the server sending an empty `initialMessages` array when the `chatId` was not yet available to it during an early, triggered re-render. My client-side state, however, already had the correct, newly created message. The goal was to prevent the client from prematurely adopting the server's incomplete view.
The robust solution involves a simple conditional check when updating the client-side messages array:
```jsx
// In your Client Component (e.g., ChatDisplay.tsx)
'use client';

import { useState, useEffect } from 'react';

export default function ChatDisplay({ initialMessages }) {
  const [messages, setMessages] = useState(initialMessages);

  useEffect(() => {
    // Only adopt the server's view when it is genuinely more complete
    // than what the client already has.
    if (initialMessages.length > messages.length) {
      setMessages(initialMessages);
    }
  }, [initialMessages, messages.length]);

  // Rest of your component logic for sending messages, etc.
  // When a message is sent via a Server Action, you'd typically
  // update messages optimistically, then with the Server Action's
  // confirmed response. The useEffect then handles any subsequent
  // server-pushed updates of initialMessages.

  // Minimal rendering for illustration; the message shape (id, text) is assumed.
  return (
    <ul>
      {messages.map((m) => (
        <li key={m.id}>{m.text}</li>
      ))}
    </ul>
  );
}
```
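For context, the `initialMessages` prop comes from a Server Component along these lines; the file path, the `getMessages` helper, and the param handling are illustrative assumptions:

```jsx
// In app/chat/[[...slug]]/page.js — illustrative sketch
import ChatDisplay from './ChatDisplay';
import { getMessages } from '@/lib/chat'; // hypothetical data helper

export default async function ChatPage({ params }) {
  // With an optional catch-all segment, params.slug is undefined on /chat
  // and ['someChatId'] on /chat/someChatId.
  const chatId = params.slug?.[0];

  // When the server re-renders plain /chat (as in Issue 2), this resolves
  // to an empty array — the length guard in ChatDisplay absorbs it.
  const initialMessages = chatId ? await getMessages(chatId) : [];

  return <ChatDisplay initialMessages={initialMessages} />;
}
```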
Why this solution is robust:
- Trusts the Server (Eventually): It acknowledges that the server will eventually send the correct, complete data once the route context (like `chatId`) is fully established and consistent.
- Prevents Premature Reverts: The `if (initialMessages.length > messages.length)` condition is critical. It prevents the client-side `messages` array from being overwritten by an older or empty `initialMessages` array that might come from an early, cache-invalidated server re-render (like when `chatId` was missing from the server's perception of the URL).
- Handles `[]` Gracefully: This specifically addresses the scenario where `initialMessages` might temporarily be `[]` because the server rendered without the full `chatId` context during a cookie-triggered invalidation. It only adopts the server's view if it's genuinely more comprehensive than what the client already has.
- No Fragile Hacks: This approach relies purely on React's `useEffect` and standard array comparisons, making it resilient to future Next.js internal changes. It avoids undocumented internal headers and complex timing tricks beyond the `setTimeout(0)` used for the specific shallow-routing edge case.
This solution allows the automatic RSC streaming to provide fresh data when available, but with a guard that ensures your UI doesn't jump backward due to temporary inconsistencies in the server's early renders.
📝 Key Takeaways for Next.js Developers
Navigating the intricacies of Next.js App Router, Server Actions, and the Router Cache can be challenging, but understanding their "hidden flows" is key to building truly robust and seamless applications.
Here are the critical takeaways from this deep dive:
- Automatic Cache Invalidation is Powerful: Remember that `cookies().set()` or `cookies().delete()` (and `revalidatePath()`, `revalidateTag()`) within Server Actions will automatically invalidate the Router Cache. This is a feature, not a bug, designed for data freshness.
- RSC Payloads Stream After Actions: The fresh RSC payload build and stream happen after the Server Action has completed its work. Understanding this timing is crucial for predicting UI behavior.
- Beware of Timing Nuances with `pushState`: If you're combining Server Actions with an immediate `window.history.pushState` for shallow routing, be aware of potential race conditions that can trigger unexpected server re-renders. A `setTimeout(0)` can be a pragmatic (though potentially temporary) workaround to yield to the event loop.
- Anticipate Server Re-renders: Always design your client-side components to gracefully handle server re-renders that might send initial props that temporarily don't match your client's current state. The server is the source of truth, but its truth depends on its perception of the route at the moment of render.
- Simple Guards Solve Complexities: A simple conditional check, like comparing array lengths for `initialMessages`, can prevent frustrating UI reverts and ensure data consistency without relying on fragile hacks. It lets the UI stay stable until a truly comprehensive server-side state is available.
- Test Your Cookie & Middleware Flows: Thoroughly test how your authentication and session management (especially cookie updates via middleware or Server Actions) interact with the Router Cache and RSC streaming, as this is a common source of unexpected UI behavior.
Mastering these distinctions is key to building truly dynamic and data-consistent App Router applications that delight users.
👋 Conclusion
The Next.js App Router offers unprecedented power and flexibility, but its sophistication lies in deeply interconnected systems like Server Actions, React Server Components, and the Router Cache. What might appear as baffling UI glitches are often logical consequences of these "hidden flows" interacting in unexpected ways.
By understanding how cache invalidation directly leads to fresh RSC payload builds and streams, and by carefully managing client-side state reconciliation, you can transform confusing notes into clear insights. My journey through these challenges reinforced the importance of thoroughly understanding the underlying framework mechanisms, rather than relying on surface-level assumptions or fragile workarounds.
I hope this deep dive into Next.js's hidden flows provides you with valuable knowledge to build more resilient and delightful user experiences. Have you encountered similar challenges with Next.js App Router? Share your experiences and solutions in the comments below!
📚 References
- Server and Client Components
- Next.js Updating Data: Server Functions
- Next.js Caching: Router Cache
- Next.js Caching: Cookies
- Next.js Caching: Router Cache Invalidation
Happy coding! 👩‍💻👨‍💻