C# Compilation and Execution Process Demystified


When compiling and running C# code, it can be hard to grasp what is going on under the hood. But having this knowledge can be very helpful when you need to debug and profile your applications. First, it is important to understand the basic concepts.
CIL
Common Intermediate Language (CIL), formerly called Microsoft Intermediate Language (MSIL) or simply Intermediate Language (IL), is a CPU-independent set of instructions that can be efficiently converted to native code. The C# compiler compiles source code into CIL instructions, which are then executed by a runtime environment that understands CIL, such as the Common Language Runtime.
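To get a feel for what CIL looks like, here is a trivial program together with a rough sketch of the IL the compiler emits for its Add method. The IL in the comment is approximate and will differ slightly between Debug and Release builds; you can inspect the exact output with a disassembler such as ildasm or ILSpy.
using System;

// Program.cs - compile this and open the resulting DLL in ildasm or ILSpy
// to see the actual CIL.
Console.WriteLine(Add(2, 3));
return;

// The compiler emits roughly the following CIL for this method:
//   ldarg.0   // push argument a onto the evaluation stack
//   ldarg.1   // push argument b
//   add       // add the two values on the stack
//   ret       // return the result
static int Add(int a, int b) => a + b;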
CLR
The Common Language Runtime (CLR) manages the execution of .NET code and provides services such as the Just-In-Time (JIT) compiler. In .NET, we are dealing with managed code, meaning that the execution of the code is managed by the CLR.
Assemblies
When the code is compiled, the compiler translates it into CIL, which is packaged into assemblies. An assembly can be either a Dynamic Link Library (DLL) or an Executable (EXE) file. Assemblies contain CIL code and metadata. Metadata consists of data tables that provide details about a managed module, including the following (a short reflection example follows the list):
Defined Elements: Types and their members within the module.
Referenced Elements: Imported types and their members.
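Reflection is one way to see this metadata at work: the calls below simply read the type and reference information stored in the assembly's metadata tables. A small sketch you can drop into any console app:
using System;
using System.Reflection;

// Read the metadata of the currently executing assembly.
Assembly assembly = Assembly.GetExecutingAssembly();
Console.WriteLine($"Assembly: {assembly.FullName}");

// Defined elements: the types this module declares.
Console.WriteLine("Defined types:");
foreach (Type type in assembly.GetTypes())
    Console.WriteLine($"  {type.FullName}");

// Referenced elements: the assemblies this module imports types from.
Console.WriteLine("Referenced assemblies:");
foreach (AssemblyName reference in assembly.GetReferencedAssemblies())
    Console.WriteLine($"  {reference.Name} {reference.Version}");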
Compilation Process
C# compilation is a two-step process:
Compiling: In the compiling phase, individual C# code files are compiled into intermediate files, such as temporary object files and metadata. These files are stored in the obj directory.
Linking: In the linking phase, these intermediate files are linked to create a single DLL or EXE, which is placed in the bin directory.
The bin and obj directories are further subdivided into Debug and Release directories, which correspond to the project's build configurations.
DLL Files
A Dynamic-Link Library (DLL) is a concept that was introduced in Microsoft Windows. DLLs are collections of code, data, and resources that can be used by multiple applications simultaneously. In C#, a DLL is a compiled assembly that contains IL code, created by compiling one or more C# source files. The DLL can then be referenced by other C# projects, allowing them to use the code and data it contains.
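Normally you reference a DLL at compile time through a project or package reference, but you can also load one at runtime and call into it via reflection. A minimal sketch, assuming a hypothetical MathLib.dll that contains a MathLib.Calculator class with a static Add(int, int) method:
using System;
using System.Reflection;

// MathLib.dll, MathLib.Calculator and Add are hypothetical names used
// only to illustrate loading and invoking code from a DLL at runtime.
Assembly mathLib = Assembly.LoadFrom("MathLib.dll");

Type? calculatorType = mathLib.GetType("MathLib.Calculator");
MethodInfo? addMethod = calculatorType?.GetMethod("Add");

object? result = addMethod?.Invoke(null, new object[] { 2, 3 });
Console.WriteLine(result); // prints 5, assuming Add returns the sum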
Just In Time (JIT) Compiler
When a .NET application runs, the CLR does not immediately convert all of the IL code into machine code. Instead, it uses the JIT compiler to compile IL to native machine code on demand. Because some code might never be called during execution, the JIT compiler converts IL only as it is needed and stores the resulting native code in memory so that it is available for subsequent calls within that process.
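You can observe this behavior indirectly: the very first call to a method pays the JIT compilation cost, while subsequent calls reuse the native code already stored in memory. A rough sketch (the exact numbers will vary, and tiered compilation may recompile hot methods later):
using System;
using System.Diagnostics;

// The first call to Work() includes the time needed to JIT-compile
// its IL to native code; the second call reuses the compiled code.
var sw = Stopwatch.StartNew();
Work();
sw.Stop();
Console.WriteLine($"First call:  {sw.Elapsed.TotalMilliseconds:F4} ms");

sw.Restart();
Work();
sw.Stop();
Console.WriteLine($"Second call: {sw.Elapsed.TotalMilliseconds:F4} ms");
return;

static long Work()
{
    long sum = 0;
    for (int i = 0; i < 1_000; i++) sum += i;
    return sum;
}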
Dynamic Linking in .NET
Linking is the process of connecting symbol references in a program (like functions or variables) to their definitions, often found in libraries. In static linking, this happens during the build process, producing a single self-contained executable that does not depend on external libraries at runtime. In dynamic linking, the linking is deferred until the program runs, which allows dynamic libraries to be updated without recompiling the entire program.
By default, both .NET Framework and modern .NET versions (like .NET Core and .NET 5+) use dynamic linking, meaning that external libraries (assemblies) are loaded at runtime. However, .NET Framework enforced strict version matching for referenced assemblies: the compiler embedded version information for each library, and at runtime the .NET Framework attempted to load those exact versions. If a version mismatch occurred, the runtime could fail to load the assembly. To mitigate these issues, Microsoft introduced binding redirects, allowing applications to specify alternative versions of assemblies to load. This approach was complex and didn't fully solve the problem of "DLL hell". As part of a broader redesign, Microsoft introduced .NET Core, which simplified dependency management, eliminated the need for binding redirects, and reduced versioning issues.
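Modern .NET also lets you take dynamic linking into your own hands and load an assembly explicitly at runtime through AssemblyLoadContext. A sketch, with a hypothetical plugin path:
using System;
using System.Reflection;
using System.Runtime.Loader;

// Load an assembly at runtime instead of referencing it at build time.
// The path below is a hypothetical example; LoadFromAssemblyPath
// requires an absolute path.
Assembly plugin = AssemblyLoadContext.Default
    .LoadFromAssemblyPath(@"C:\plugins\MyPlugin.dll");

Console.WriteLine($"Loaded {plugin.GetName().Name} {plugin.GetName().Version}");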
Example: Minimal API
Use the command line to create a new Minimal API project:
dotnet new webapi -o WebApi
Out of the box, you get the example Weather Forecast app bootstrapped. Point the terminal to the project directory and run the build command to compile the code:
dotnet build
Behind the scenes, this command uses MSBuild to build the project. It is equivalent to running:
dotnet msbuild -restore
The default configuration for dotnet build is Debug unless you explicitly specify otherwise. Hence, it will generate output in the bin/Debug/{target-framework}/ directory. The Debug and Release configurations are predefined by the .NET SDK, but they can be customized.
If you inspect the content of the bin directory, you can see these two files among others: WebApi.dll and WebApi.exe (on Windows). WebApi.dll is the actual IL code and is the result of compiling the C# code. It is cross-platform and can run anywhere the .NET runtime is installed. WebApi.exe is the native executable for the application. Running this EXE starts the app without needing to explicitly invoke dotnet. This file is built using the apphost executable from the .NET SDK. It acts as the main entry point to start code execution and is typically referred to as the "host". Its job is to:
Locate the .NET runtime
Load the app's DLL
Start the execution of the IL code
Now, let's run the application with:
dotnet run
What this command does is the following:
Builds the project (unless --no-build is specified).
Finds the output DLL, for example: bin/Debug/net8.0/WebApi.dll
Executes dotnet bin/Debug/net8.0/WebApi.dll
The dotnet executable starts a new child process, running the app's .dll inside the .NET runtime. This means that if you inspect the Task Manager, you should see a .NET Host or dotnet process running as a child process of the CLI from which you executed the application. This process is also a "host" that is responsible for starting the runtime (including components like the JIT and garbage collector) and invoking managed entry points.
On the other hand, if you run the WebApi.exe native executable, you will notice that the process in the Task Manager is running under the name WebApi.
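You can confirm from inside the application which host it ended up running under by asking for the current process. For example, a couple of lines you could add to Program.cs:
using System;
using System.Diagnostics;

// Prints "WebApi" when the app is started via the WebApi.exe apphost,
// and "dotnet" when it is started with `dotnet WebApi.dll`.
var currentProcess = Process.GetCurrentProcess();
Console.WriteLine($"Host process: {currentProcess.ProcessName} (PID {currentProcess.Id})");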
Profiling the Application
Modify Program.cs so that we have a few endpoints we can use to simulate different kinds of workloads:
using System.Diagnostics;
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();
// CPU-intensive endpoint
app.MapGet("/cpu", () =>
{
var sw = Stopwatch.StartNew();
var result = Fibonacci(40); // intentionally slow
sw.Stop();
return Results.Ok(new { result, timeMs = sw.ElapsedMilliseconds });
});
// Async I/O simulation
app.MapGet("/io", async () =>
{
await Task.Delay(2000); // Simulated I/O
return Results.Ok("I/O complete");
});
// Blocking thread
app.MapGet("/block", () =>
{
Thread.Sleep(2000); // Blocks thread
return Results.Ok("Blocked complete");
});
app.Run();
return;
// Helper method
static int Fibonacci(int n) =>
n <= 1 ? n : Fibonacci(n - 1) + Fibonacci(n - 2);
To profile the previously created project, we will use the dotnet-trace tool. Install it using:
dotnet tool install --global dotnet-trace
First, start the application with dotnet run. Then, use the dotnet-trace ps command to list all .NET processes currently running on the machine. It helps identify the process IDs (PIDs) of .NET applications that can be targeted for diagnostics. After running the command, you should see the WebApi process in the list. To collect the diagnostic trace, run:
dotnet-trace collect -n WebApi --format speedscope
Here, we are using the speedscope format, so we can visualize the trace file in Speedscope. Execute each of the different endpoints multiple times. Next, stop dotnet-trace (CTRL-C) and wait until it finishes writing the trace file. Head over to Speedscope and open the created trace file to analyze it.
Important Concepts for Trace Analysis
In .NET, the CLR plays an important role in managing the thread lifecycle. When a .NET application runs, the CLR creates a single foreground thread to execute the application code via the Main method. This thread is called the primary or main thread. Along with the main thread, a process can create one or more threads to execute code that is not executed by the main thread.
A worker thread is any thread other than the main thread that does work on behalf of the application that spawned it. This work could be anything, for example waiting for some I/O operation to complete.
The Thread Pool is the CLR's built-in thread management system. It is used by default for the following (a small sketch follows this list):
Task.Run() and async/await continuations
HTTP request processing (in ASP.NET Core)
Timers, background jobs, and Parallel loops
ThreadPool.QueueUserWorkItem
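A quick way to see Thread Pool threads in action is to queue some work and print the managed thread IDs it runs on; a minimal sketch:
using System;
using System.Threading;
using System.Threading.Tasks;

// Both calls below queue work onto Thread Pool threads.
ThreadPool.QueueUserWorkItem(_ =>
    Console.WriteLine($"QueueUserWorkItem ran on thread {Environment.CurrentManagedThreadId}"));

await Task.Run(() =>
    Console.WriteLine($"Task.Run ran on thread {Environment.CurrentManagedThreadId}"));

// Give the queued work item a moment to run before the process exits.
await Task.Delay(100);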
The request lifecycle in ASP.NET Core goes something like this:
Kestrel accepts the request (on an I/O thread, not a Thread Pool thread).
The request is dispatched to the Thread Pool.
The method executes on a Thread Pool thread.
If you await, the thread is released and a continuation is resumed later, again on a Thread Pool thread, as the sketch below shows.
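To see the last step for yourself, you could add a hypothetical endpoint to the Program.cs above that logs the managed thread ID before and after an await; the two IDs often differ, showing that the original thread was released during the asynchronous wait.
// Hypothetical endpoint, added next to the others in Program.cs.
app.MapGet("/threads", async () =>
{
    int before = Environment.CurrentManagedThreadId;
    await Task.Delay(500); // thread is released back to the pool here
    int after = Environment.CurrentManagedThreadId;
    return Results.Ok(new { before, after });
});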
A call stack is a data structure used by a program to keep track of which methods or functions are currently executing, and in what order they were called.
Managed code is code that runs under the supervision of the CLR, for example your C# methods.
Unmanaged code is native code that runs outside the CLR's scope, for example (a P/Invoke sketch follows this list):
OS-level APIs
Native libraries (e.g., libuv, OpenSSL, SQLite)
Low-level system calls (file I/O, sockets, locks)
Parts of the .NET Core runtime itself (like the GC and the JIT)
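Managed code usually reaches unmanaged code through layers of the runtime and base class library, but you can also cross the boundary directly with P/Invoke. A minimal, Windows-only sketch calling a real kernel32.dll API:
using System;
using System.Runtime.InteropServices;

// Calling an unmanaged Windows API from managed code via P/Invoke.
Console.WriteLine($"Native process ID: {NativeMethods.GetCurrentProcessId()}");

static class NativeMethods
{
    // GetCurrentProcessId is exported by kernel32.dll; this example
    // therefore only runs on Windows.
    [DllImport("kernel32.dll")]
    public static extern uint GetCurrentProcessId();
}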
Analyzing the Trace
Looking at the top menu, you will notice that every thread has its own tab, and it is possible to move between the threads using the dropdown.
One thing to keep in mind when looking at the graph is that time is relative for every thread: it starts from 0 for each thread and is not related to the overall execution time of the process.
There are three kinds of views:
Time Order: Call stacks are ordered in chronological order. The horizontal axis represents the "weight" of each stack (most commonly CPU time), and the vertical axis shows you the stack active at the time of the sample. If you click on one of the frames, you'll be able to see summary statistics about it.
Left Heavy: Identical stacks are grouped together, regardless of whether they were recorded sequentially. Then, the stacks are sorted so that the heaviest stack for each parent is on the left.
Sandwich: Table view in which you can find a list of all functions and their associated times. You can sort by self time or total time.
In Speedscope, stack refers to the chain of function calls currently active on a thread when the profiler took a sample. Each vertical "stack" of boxes is a snapshot of the call stack. Each box in the stack is a function or method that was executing or was on the call path when the sample was taken.
You can recognize the main thread if you find calls to the Main method.
If you want to confirm that the thread you are looking at is a Thread Pool thread, look for these clues:
System.Threading.ThreadPoolWorkQueue.Dispatch()
System.Threading.ThreadPoolWorkQueue.WorkerThreadStart()
Followed by your app logic (MoveNext, Controller, etc.)
In one of the started worker threads, you can see that Thread.Sleep was called for a duration of 2 seconds.