How 1 LoC randomly broke our website for over a year

Alem Tuzlak
7 min read

Sometimes, it only takes one line of code to bring a production site to its knees. This is the story of how I unintentionally did exactly that, and how I became the prince of Bel-Air… or wait, that’s a different one!

The background

Like any good bug, this one spans multiple JS frameworks. It started back in Remix and carried over once we migrated to React Router v7. Because Remix was merged into React Router, I assumed the bug had simply been carried over, since the following kept happening:

  • We would deploy the website

  • After a random period of time on random machines (we have 6 in production across the globe) the website would completely crash

  • The error happened inside Remix/React Router itself, which threw an error saying the route the user was trying to access didn’t exist at all (it did)

  • The user would see an “Unexpected server error” message, because the crash happened inside react-router, so no ErrorBoundary was rendered

This sounds very weird, right? It did to me, especially considering that we had built the site with our @forge-42/base-stack template, which we have 4 instances of internally, also running in production, without the same issue. That meant the bug was specific to our company website, but we had no idea what it was.

Back when React Router v7 was released, I was excited to switch over in the hope that the bug had been resolved in the transition. I refactored our codebase, pushed it to production, and prayed! My prayers fell on deaf ears, though, as the problem came back after a month.

Another peculiar thing: I would get regular emails from Google saying their bots couldn’t crawl the website because it was down, or DMs on Twitter/BlueSky/X/YouTube telling me our website was down, followed by an embarrassing “Yeah, we know about it, but we have no idea what’s causing it and we’re actively trying to figure it out”.

The website itself is super simple: it only has the landing page, a newsletter signup form, and sitemaps. All of the code looked good, and the bug was not reproducible locally or in a local production build; it was random and unpredictable.

Here is what we considered:

  • Memory leak - because it would happen after a set period of time

  • Wrong dependencies - maybe we were using dev dependencies of some packages in production

  • Wrong NODE_ENV - maybe it was set to development

  • Faulty machine - it only happened on a specific machine in the US (at least at first)

And here are the answers to those things:

  • Memory leak - memory usage on the machines was stable right up until the crash, so it wasn’t the memory

  • Wrong dependencies - We saw on Sentry that the error stack was coming from `react-router/dist/development`, so we assumed that might be the reason, but after creating a “production reproduction” locally it turned out we were indeed using the right dependency.

  • Wrong NODE_ENV - This was easy, it was indeed set to “production”

  • Faulty machine - Also an easy one, started happening on other machines.

So what was going on? How was this happening randomly, with no way to reproduce it locally, and above all on a very simple website we’ve built a dozen times over?


The Problem

Remember when I said we had a sitemap?

Well, check this code out:

import { generateRemixSitemap } from "@forge42/seo-tools/remix/sitemap"
import type { LoaderFunctionArgs } from "react-router"

export const loader = async ({ request, params }: LoaderFunctionArgs) => {
    const domain = new URL(request.url).origin
    // @ts-expect-error - This import exists but is not picked up by the typescript compiler because it's a remix internal
    const { routes } = await import("virtual:react-router/server-build")

    // biome-ignore lint/performance/noDelete: We want to remove the wildcard route to stop it from being added to the sitemap
    delete routes["routes/$"]

    const sitemap = await generateRemixSitemap({
        domain,
        routes,
        ignore: ["/_status"],
        // Transforms the url before adding it to the sitemap
        urlTransformer: (url) => `${url}?lng=${params.lang}`,
        sitemapData: {
            lang: params.lang,
        },
    })

    return new Response(sitemap, {
        headers: {
            "Content-Type": "application/xml; charset=utf-8",
        },
    })
}

So this code right here is a route that handles sitemap generation. Whenever this route is hit, it gets all the routes in your project, deletes the catch-all route (you don’t want your 404 route in the sitemap), and then the generateRemixSitemap function generates a sitemap entry for every URL in your application and returns the Response as XML so crawlers can read it. Pretty simple, right?
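For context, the routes object you get from the server build is (roughly) a plain record keyed by route id. The exact fields don’t matter for this story, but an approximate shape, purely for illustration and not the exact react-router internals, looks something like this:

// Approximate shape of the server-build route manifest (illustrative only)
type RouteManifest = Record<
    string, // route id, e.g. "routes/_index" or "routes/$"
    {
        id: string
        parentId?: string
        path?: string // "$" marks the catch-all route used for 404s
        index?: boolean
        module: unknown // the route module itself (loader, action, default export, ...)
    }
>

So delete routes["routes/$"] removes the catch-all entry from that record before the sitemap is generated.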

So why would this cause any issues at all?

Right off the bat, it’s not the remix-specific import or the generateRemixSitemap function. They just expect the routes object from remix/react-router, and that’s all there is to it; they don’t touch any internals under the hood.

The biome-ignore comment told me that delete isn’t as performant as other approaches. I was in a hurry to release the website, as we were in a tight spot that day, so I thought, “I don’t care about performance in this case.” If only it had told me it could lead to bugs as well!

Also, no, it’s not the:

    // @ts-expect-error - This import exists but is not picked up by the typescript compiler because it's a remix internal
    const { routes } = await import("virtual:react-router/server-build")

either, as this indeed does exist and is imported correctly.

Well, what is it then??

If you haven’t figured it out yet, and it’s kind of obvious in hindsight (which I wish I’d had at the time of writing the actual code), it’s that the routes object is:

PASSED BY REFERENCE

Which means I was directly mutating the routes object when I called:

delete routes["routes/$"]

This would then delete the route internally, then somebody would come to the site, react-router would try to parse the existing routes, and then…


The website would die, and there would be no going back anymore. It was DEAD.
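To make the footgun concrete, here is a minimal, self-contained sketch with generic module and object names (not react-router’s actual internals). Every importer of a module-level object gets the same reference, so a delete in one place is visible everywhere:

// routes-manifest.ts (hypothetical shared module)
export const routes: Record<string, { path: string }> = {
    "routes/_index": { path: "/" },
    "routes/$": { path: "*" }, // the catch-all route
}

// sitemap.ts - "just cleaning up for the sitemap"
import { routes } from "./routes-manifest"
delete routes["routes/$"] // mutates the ONE shared object

// router.ts - later, on a completely unrelated request
import { routes } from "./routes-manifest"
console.log(routes["routes/$"]) // undefined - the catch-all is gone for every consumer

The same applies to a dynamic await import(...): the module is evaluated once and cached, so every caller shares the exact same routes object.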

So to sum it all up in a neatly wrapped list:

  • Web crawlers would try to crawl the site with the sitemap

  • The sitemap generator would delete the catch-all route

  • This would kill react-router

  • This would kill our website

  • This would kill my will to live by a certain percentage

  • This list is recursive.


The Result

  • Crawlers would die mid-crawling = SEO nightmare.

  • Downtime for users = even worse.

  • Confusion, debugging, log-diving… you get the idea.

The worst part of it all is that I wasn’t even trying to be smart. I wasn’t trying to make it readable or maintainable; I was just trying to ship our website. I simply forgot that I was working with a reference to the shared object rather than a copy, and nobody caught it in code review either, as it was such a minor detail of a bigger PR.

The Fix

The fix was simple in hindsight: don’t mutate shared route config.

Instead, my co-founder (who found the issue I originally created) cloned the routes object before modifying it:

const { routes } = await import("virtual:react-router/server-build")

// Clone the manifest so react-router's shared routes object is never mutated
const clonedRoutes = structuredClone(routes) // OR just filter out the catch-all ($) route via built-in JS methods
delete clonedRoutes["routes/$"]

Boom. No side effects, no global mutation, no death spiral. A two-line fix for an issue that had been plaguing us for a year. I wish all bug fixing was this easy!
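For completeness, here is the filter-based alternative hinted at in the comment above: instead of cloning and then deleting, build a brand-new object and never touch the original. A sketch, assuming the same "routes/$" route id as in the earlier snippet:

// @ts-expect-error - This import exists but is not picked up by the typescript compiler because it's a remix internal
const { routes } = await import("virtual:react-router/server-build")

// Build a fresh object that simply omits the catch-all route; `routes` itself is never modified
const sitemapRoutes = Object.fromEntries(
    Object.entries(routes).filter(([routeId]) => routeId !== "routes/$")
)

Either way, the important part is the same: the shared manifest that react-router keeps routing with stays untouched.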

The Lesson

This was a classic JavaScript footgun: mutating shared data without realizing how shared it actually is.

If you're building tools or utilities around your router config, remember:

  • Always clone what you don’t own.

  • Avoid side effects, especially in shared modules.

  • What seems like a “read-only” import may not be.
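One cheap guard for that last bullet, shown as a generic sketch rather than anything react-router gives you out of the box: freeze (or type as Readonly) anything you hand out as shared configuration, so an accidental mutation fails loudly at the call site instead of silently corrupting shared state.

// A hypothetical shared-config module, frozen at creation time
const sharedRoutes = Object.freeze({
    "routes/_index": { path: "/" },
    "routes/$": { path: "*" },
})

// Object.freeze types the value as Readonly<...>, so TypeScript already rejects plain
// mutation. The cast below only exists to demonstrate the runtime behaviour:
// ES modules run in strict mode, so deleting from a frozen object throws a TypeError.
try {
    delete (sharedRoutes as Record<string, unknown>)["routes/$"]
} catch (error) {
    console.error("Attempted to mutate frozen shared routes:", error)
}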

React Router doesn’t expect you to mutate the route tree, and as you’ve now seen, there’s a very good reason you shouldn’t.

Thank you

If you've reached the end you're a champ! I hope you liked my article.

If you liked the article, consider sharing it with others or tagging me on X with your thoughts; I’d love to hear them!

If you wish to support me, follow me on X here:

Twitter/X

or if you want to follow my work, you can do so on GitHub:

My GitHub

or check out our company GitHub, where I do all our OSS:

Forge 42

And you can also sign up for the newsletter to get notified whenever I publish something new!


Written by Alem Tuzlak

Making your Remix.run experience more enjoyable. Creator of Remix Forge, remix-hook-form, and remix-development-tools. OSS contributor and enthusiast.