Composable Architecture.

Brian King
26 min read

TL;DR.

Composable Architecture combines APIs, microservices, headless, and the Cloud. For the "12 Startups" project, I will blend Composable Architecture with technologies like Next.js, Deno.js, Express.js, PostgreSQL, Redis, RedisGraph, and Neo4j. This architecture and these technologies will help me create scalable, high-performance applications that efficiently manage complex data relationships while delivering seamless user experiences.

An Introduction.

Composable Architecture describes how APIs, microservices, headless frontends, and the Cloud combine to form the foundation of modern, distributed, extensible applications.

The purpose of this post is to explain my reasons for adopting a Composable Architecture.

Beyond the comfortable theories of why Composable Architecture matters to modern software development, there are actual services that put these ideas into practice: Next.js (frontend), Deno.js (backend), Express.js (middleware), PostgreSQL (RDBMS), Redis (in-memory data storage), RedisGraph (graph data manager), and Neo4j (graph data storage). These powerful, flexible tools enable developers to create scalable, high-performance applications that can handle complex data relationships and deliver seamless user experiences.

Also, Composable Architecture is technology agnostic and unopinionated. For instance, I can swap out Next.js for Nuxt.js, or Deno.js for Node.js. As long as the components that make up my applications can communicate with each other, Composable Architecture doesn't care which technologies I use to develop my APIs and microservices.

Definition of Composable Architecture.

With Composable Architecture, the main focus is on ensuring that individual microservices are designed in such a way that they can be easily replaced, or updated, without affecting the overall functionality of the applications that use them.

Microservices.

Microservices are collections of small, autonomous services, each focused on its own, specific functionality. This approach to app creation enables developers to build, deploy, and scale individual components, leading to more resilient and flexible applications.

Benefits of Microservices.

Some of the key advantages of using microservices include:

  • Improved Scalability: Since each microservice can be scaled independently, it is easy to deploy extra microservice instances under heavy load, or spin down unneeded instances as demand drops, all without affecting the app as a whole.

  • Enhanced Flexibility: The autonomous nature of microservices allows developers to use different technologies, frameworks, or languages, choosing the most appropriate one for each service, which leads to an adaptable programming environment.

  • Faster Development and Deployment: By breaking down an application into smaller components, teams can work on different microservices simultaneously, accelerating the development process. Additionally, deploying updates or bug fixes to a specific microservice is quicker and less risky, as it does not require modifying the entire application.

  • Better Fault Isolation: With microservices, if one service should fail, it does not necessarily cause the entire application to fail. This results in improved fault isolation and increased application resilience.

  • Easier Maintenance: Microservices are easier to understand, manage, and maintain due to their focused scope and smaller codebase. This simplifies the process of updating, refactoring, or even replacing individual components as needed.

  • Reusability: Microservices decrease development time and effort by utilizing the same components across multiple applications. As long as consistency and standardization are observed, maintainability is improved since updates to a specific microservice will propagate throughout all the applications using that software module. Reusing existing services lowers development costs and reduces time to market.

  • Loose Coupling: Microservices enable independent development, real-time scaling, improved redundancies, reduced conflicts, and faster development cycles. Their independence facilitates continuous integration, and the delivery of frequent, low-disruption updates.

  • Unopinionated: Microservices promote technology agnosticism, allowing developers to select the ideal stack for each service, encouraging best practices, and preventing single-technology lock-in.

By leveraging microservices, an organization can create more robust, scalable, and flexible applications that are better suited to the ever-evolving demands of modern software development.

APIs.

APIs, or Application Programming Interfaces, are sets of rules and protocols that enable different microservices to communicate with each other. APIs are crucial to modern web development, as they allow developers to integrate various services and functionalities into their applications, creating a seamless user experience.

Benefits of APIs.

APIs are integral to the success of Composable Architecture for several reasons. Firstly, they provide a standardized way for microservices to interact with one another, ensuring compatibility and reducing the complexity of integration. Secondly, APIs enable developers to access and utilize third-party services, such as payment gateways, social media platforms, and data analytics tools, without having to build these functionalities from scratch. This not only saves time and resources, but also allows developers to focus on their core competencies while delivering more robust applications.

APIs in Composable Architecture promote scalability and maintainability. As the application grows, new microservices can be easily added, or existing ones can be updated, without disrupting the overall system. This modular approach also simplifies troubleshooting and debugging, as issues can be isolated and resolved at the microservice level.

APIs play a pivotal role in the implementation of Composable Architecture, as they facilitate seamless communication between software applications, while enabling the integration of diverse services and functionalities. By leveraging APIs, developers can create adaptable, scalable, and maintainable applications that cater to the evolving needs of businesses and end-users alike.

Types of APIs.

Application Programming Interfaces (APIs) consist of protocols, routines, and tools that facilitate communication and data sharing between different software applications, and the microservices on which they are built. APIs serve as a bridge between various software systems and modules, enabling them to work together seamlessly. There are several types of APIs, each catering to specific needs and use cases, such as:

  • Public APIs (external access, open to the public): These APIs are accessible to developers and users with minimal restrictions, but may require registration and guideline adherence.

  • Private APIs (internal access, closed to the public): These APIs are hidden from external users, come with enhanced security, and are only used within an organization's software applications.

  • Partner APIs (external access, closed to the public): These are similar to Private APIs, but are designed for a specific purpose or audience. Access is granted through a formal agreement between the microservice provider and the partner company. The partner will need to generate an API key on the microservice provider's website. API keys will allow partners to access the functionalities of specific microservices.

  • Composite APIs: These APIs allow developers to access multiple endpoints in a single call. This may include data streams, or service calls from other APIs, facilitating a more streamlined accessibility process.

  • RESTful APIs: Representational State Transfer (REST) is an architectural approach where APIs employ HTTP methods (GET, POST, PUT, DELETE) to execute operations. REST APIs are very popular because of their simplicity, scalability, and user-friendliness.

  • SOAP APIs: Simple Object Access Protocol (SOAP) is a protocol for exchanging structured information across web service implementations. Relying on XML-based messaging, SOAP APIs are typically more complex, and less flexible, than RESTful APIs.

  • GraphQL APIs: GraphQL is a query language, and runtime for APIs, where clients can request only the data they need. This makes GraphQL useful in situations where bandwidth is a critical factor, e.g. mobile applications. (A short sketch comparing the REST and GraphQL styles follows this list.)
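To make the contrast concrete, here is a small TypeScript sketch of the same lookup in both styles (the endpoint URLs and field names are hypothetical):

// RESTful style: the endpoint determines the shape of the response.
const restResponse = await fetch("https://api.example.com/dinosaurs/aardonyx");
const dinosaur = await restResponse.json(); // returns every field the server defines

// GraphQL style: one endpoint; the client names exactly the fields it needs.
const gqlResponse = await fetch("https://api.example.com/graphql", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    query: `{ dinosaur(name: "Aardonyx") { name description } }`,
  }),
});
const { data } = await gqlResponse.json(); // contains only name and description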

By understanding the various types of APIs, and their specific characteristics, developers can make informed decisions when selecting the most appropriate APIs for different parts of their applications, ultimately contributing to the creation of more reliable, high-performing, and efficient software solutions.

API Design.

To create robust, efficient, and user-friendly APIs, developers must adhere to a set of well-established design principles and best practices. These guidelines ensure that an API is easy to understand and use, and they contribute to the development of more reliable, high-performing, and scalable software solutions.

  • Consistency: An API should maintain a consistent design throughout, including consistent naming conventions, error handling, and response formats. This makes it easier for developers to understand and use an API effectively. (A small sketch showing consistency and versioning in practice follows this list.)

  • Simplicity: An API should be designed with simplicity in mind, ensuring that it is easy to learn and use. This can be achieved by minimizing the number of endpoints, using clear and concise naming, and providing comprehensive documentation.

  • Flexibility: An API should be flexible enough to accommodate future changes and enhancements without breaking existing functionality. This can be achieved by using versioning and modular design.

  • Security: An API should be designed with security in mind, ensuring that sensitive data is protected and access is controlled through proper authentication and authorization mechanisms.

  • Performance: An API should be optimized for performance, ensuring that it can handle high levels of traffic and provide fast response times. This can be achieved through efficient data handling, caching, and load balancing.

  • Scalability: An API should be designed to scale seamlessly as the number of users and requests increase. This can be achieved through horizontal scaling, load balancing, and efficient resource management.

  • Testability: An API should be designed in such a way that it is easy to test, ensuring that it functions correctly and meets the required performance standards. This can be achieved by providing test environments, test data, and automated testing tools.

  • Documentation: Comprehensive and up-to-date documentation is essential for any API, as it helps developers understand the API's functionality, endpoints, and usage. This includes providing clear examples, sample code, and detailed explanations of error messages.
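To illustrate a few of these principles, consistency, flexibility through versioning, and predictable error handling, here is a small, hypothetical Express.js sketch (the route names and error format are my own conventions, not a standard):

// @deno-types="npm:@types/express@4.17.15"
import express from "npm:express@4.18.2";

const app = express();

const dinosaurs = [
  { name: "Aardonyx", description: "An early stage in the evolution of sauropods." },
];

// Versioned route: a /v2 can be introduced later without breaking /v1 clients.
app.get("/v1/dinosaurs/:name", (req, res) => {
  const found = dinosaurs.find(
    (d) => d.name.toLowerCase() === req.params.name.toLowerCase(),
  );
  if (!found) {
    // Consistent error shape: every error returns { error: { code, message } }.
    res.status(404).send({ error: { code: "NOT_FOUND", message: "No such dinosaur." } });
    return;
  }
  res.send(found);
});

app.listen(8000);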

By adhering to these API design principles and best practices, developers can create APIs that are easy to use, and simple to understand.

However, a single app might also use multiple types of APIs. For instance, an app could use an API that references a relational database for CRUD content while, on the same page, using another API that references a GraphQL social media database. Developers are constantly pushing the boundaries.

Headless.

Headless refers to an architecture where the frontend and backend of an application are decoupled, allowing them to be developed, deployed, and scaled independently. In this approach, the frontend is responsible for rendering the user interface, while the backend provides data and functionality through APIs.
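As a minimal sketch of this split, a Next.js page might fetch its data from a separately deployed backend (the endpoint URL is hypothetical):

// pages/index.js: the frontend only renders; the data comes from the backend API.
export async function getServerSideProps() {
  const res = await fetch("http://localhost:8000/api"); // hypothetical backend endpoint
  const dinosaurs = await res.json();
  return { props: { dinosaurs } };
}

export default function HomePage({ dinosaurs }) {
  return (
    <ul>
      {dinosaurs.map((d) => (
        <li key={d.name}>{d.name}</li>
      ))}
    </ul>
  );
}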

Benefits of Headless.

Some of the key advantages of a headless approach include:

  • Independent Development: The decoupled frontend and backend can be developed, tested, and deployed separately, allowing for faster development cycles and reduced dependencies.

  • Independent Scaling: Both the frontend and backend can scale independently.

  • Technology Freedom: I can choose the best technologies, frameworks, or languages for each part of the application, leading to more adaptable solutions.

  • Easier Integration: Decoupling the frontend and backend makes it easier to integrate with external APIs and services, expanding the application's capabilities.

  • Better Performance: By separating the presentation layer from the business logic and data layers, I can optimize each part of the app, leading to faster load times and more responsive user interfaces.

  • Easier Maintenance: With a decoupled architecture, it's easier to update or refactor individual components without affecting the entire application, making maintenance more manageable.

Headless for Platform Specificity.

The greatest benefit of the headless architecture is that various frontends can be optimized for their specific platforms, for instance:

  • Chrome browser app (Google),

  • Firefox browser app (Mozilla),

  • macOS app (App Store),

  • iOS app (App Store),

  • Android app (Play Store),

  • Windows app (Microsoft Store), and

  • Debian-based OS app (APT and Snap).

Creating multiple platform-specific apps is practical when a single code base can be compiled for multiple platforms.

The Cloud.

I am sure there are use cases for serverless functions in the Cloud, but I haven't found a reason to switch from services like Linode, DigitalOcean, or Heroku. I was able to run AWS Lambda functions on Netlify easily enough but, without extensive experience with the technology, I am not qualified to offer an opinion.

Two Libraries.

Ultimately, I suspect there will be two libraries:

  • The Standard Library: Common microservice functionalities that are available to all of the apps (Join, Login, Search, etc) will be kept in the standard library.

  • The Component Library: Specialist WASM binaries that are used by specific apps will be kept in the component library.

Push Notifications.

Whenever updates, security patches, or new features are added to either of the libraries, a push notification is sent to all of the platform-specific apps that require the changes.

App Updates.

When a user receives a push notification within an app, whether it's an update, security patch, or new feature, a dialog box will allow the user to install the changes. Microservices and WASM binaries can be downloaded and installed locally, reducing the number of remote calls made by the app.

App Rebuilds.

Updates, security patches, or new features that are uploaded to either of the libraries will trigger a rebuild of all the relevant apps that require the changes. When the rebuilds are complete, the newly minted apps will be automatically uploaded to their respective store repositories.

App Internals.

Downloading and installing changes to platform-specific apps is possible because each app is nothing more than a stripped-down, optimized browser. This customized browser supports HTML5, CSS3, JavaScript, and WASM binaries. It also contains a mobile-optimized graph database called GQLite.
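In broad strokes, loading one of those WASM binaries from JavaScript looks like this (the module path and exported function are hypothetical):

// Fetch and instantiate a WebAssembly module, then call an exported function.
const { instance } = await WebAssembly.instantiateStreaming(
  fetch("/modules/search.wasm"), // a hypothetical component-library binary
);
console.log(instance.exports.add(2, 3)); // assumes the module exports add()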

A Composable Platform.

My use case, 12 Startups in 12 Months, has very specific requirements. Fortunately, I believe that adopting a Composable Architecture can meet some of these demands.

Planning a Composable Platform.

Throughout June 2023, I will publish my plans regarding:

  • Why a Composable Platform is the right solution for me,

  • How a Composable Platform works on a large project,

  • Which tools I'll use to build a Composable Platform,

  • How I'll design experiments to answer questions,

  • Why experiments are important to the process,

  • How different experiments work with each other,

  • What path I'll take during the build phase, and

  • When I expect specific deadlines to occur.

Composable Platform Experiments.

June 2023 to July 2023 will see me experiment, test, prototype, connect, integrate, and build different aspects of a Composable Platform.

I am targeting August 2023 as the launch month for my first MVP (minimum viable product).

For now, I have almost decided on the tools I'll use for the 12 Startups project.

Technologies.

Beyond the architectural philosophies and design principles are the technical implementations that run on the servers.

Next.js.

Next.js is a framework for building React applications that provides easy server-side rendering, static site generation, and automatic code splitting. It is built around the concept of pages: the framework expects a pages directory in the root of the application, where every .js, .jsx, .ts, or .tsx file is a React component that is automatically associated with a route based on its filename. The index.js file inside the pages directory is the default/root page, rendered when a user visits the root of the application.

Here is an example content of index.js:

// pages/index.js: rendered when a user visits the root route (/)
function HomePage() {
  return <div>Welcome to Next.js!</div>
}

export default HomePage

Additionally, Next.js comes with features such as fast refresh and built-in CSS support. Static assets, such as images and fonts, can be stored in the public directory and referenced within the code starting from the base URL (/).
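For example, an image saved as public/logo.png could be referenced from any component like this (the filename is illustrative):

function Header() {
  // Files in the public directory are served from the base URL (/).
  return <img src="/logo.png" alt="Site logo" />;
}

export default Header;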

To start a Next.js application, navigate to the application's root directory and run npm run dev, yarn dev, or pnpm dev to start the local development server. By default, the development server starts on http://localhost:3000, which you can visit to view your application.

For more information on what to do next, you can refer to the official Next.js documentation.

Installing Next.js on Ubuntu.

To install Next.js on Ubuntu, I need to have Node.js and npm installed on my system. I can use the following steps to install Node.js and npm on my Ubuntu system:

  • I update my Ubuntu system:
sudo apt clean && sudo apt update && sudo apt dist-upgrade -y && sudo apt autoremove -y
  • I install Node.js using the following command (note: the Node.js version in Ubuntu's default repository can be quite old; if Next.js complains about the Node.js version, a newer release can be installed via NodeSource or nvm):
sudo apt install nodejs
  • I verify that Node.js is installed correctly by checking its version:
node -v
  • I install npm using the following command:
sudo apt install npm
  • Once I have Node.js and npm installed, I can create a new Next.js project by running the following command:
npx create-next-app [project-name]
  • This will create a new Next.js project in a directory with the name I supplied. I can then navigate to this directory and start the development server using the following commands:
cd project-name
npm run dev

This will start a development server at http://localhost:3000, and I can open this URL in my web browser to see my Next.js application running.

Note that the above installation steps assume I have the required permissions to install software on my Ubuntu system. If I encounter any permission issues, I may need to run the commands with sudo or obtain the necessary permissions.

Deno.js.

Deno.js is an innovative, open-source, cross-platform JavaScript runtime environment that allows for the execution of JavaScript code outside the confines of a web browser. It is designed for building scalable, high-performance server-side applications with JavaScript and TypeScript, taking a language that was predominantly employed for client-side scripting well beyond its traditional scope.

Deno.js also integrates naturally with Composable Architecture. Its modular approach empowers developers to create flexible, scalable applications, and the fusion of the Composable Architecture philosophy with the Deno.js runtime results in a robust, efficient development ecosystem that can adapt to the ever-changing demands of the industry.

Installing Deno.js on Ubuntu.

1. I can install Deno on Ubuntu by running the following command in my terminal:

curl -fsSL https://deno.land/x/install/install.sh | sh

This command downloads and runs the Deno installation script, which will install Deno to the $HOME/.deno directory.

2. After installation, I add Deno to my system's PATH so I can run it from any location on my system. To do that, I run the following command:

echo 'export DENO_INSTALL="$HOME/.deno"' >> ~/.bashrc
echo 'export PATH="$DENO_INSTALL/bin:$PATH"' >> ~/.bashrc

This will add Deno to my system's PATH, and the changes will take effect the next time I open my terminal.

3. I run the following to confirm that Deno is installed:

deno --version

This should output the current version of Deno that I've installed on my system.

Express.js.

Express.js is a minimal and flexible web application framework that provides a robust set of features for building web applications and APIs. Although it was originally created for Node.js, it can be used from Deno.js through Deno's npm compatibility layer. It simplifies the process of creating server-side applications by offering a streamlined and modular architecture, making it a popular choice among developers.

One of the key aspects of Express.js is its ability to create and manage Deno.js middleware, which are essential components in the development process. Middleware functions have access to the request and response objects, as well as the next middleware function in the application's request-response cycle. These functions can execute any code, modify the request and response objects, or even end the request-response cycle.

In Express.js, building Deno.js middleware is achieved through a well-structured and modular architecture that promotes code reusability and maintainability. This architecture allows developers to break down their applications into smaller, more manageable components, which can then be easily combined to create a complete and functional application. Middleware functions can be organized into separate modules, and then imported and used within the main application file as needed.

To create a middleware function in Express.js for Deno.js applications, developers can define a function with three parameters: the request object (req), the response object (res), and the next function (next). The request object represents the incoming HTTP request, while the response object is used to send the HTTP response back to the client. The next function is a callback that, when invoked, passes control to the next middleware function in the stack.
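Here is a minimal sketch of that signature in practice, a request-logging middleware, written in the same Deno.js style as the server example later in this post:

// @deno-types="npm:@types/express@4.17.15"
import express from "npm:express@4.18.2";

const app = express();

// Middleware: log each request, then pass control to the next function in the stack.
app.use((req, res, next) => {
  console.log(`${req.method} ${req.url}`);
  next();
});

app.get("/", (req, res) => {
  res.send("Hello from behind the middleware!");
});

app.listen(8000);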

Express.js also provides a set of built-in middleware functions that can be easily integrated into Deno.js applications, such as serving static files, handling cookies, and parsing request bodies. Additionally, developers can take advantage of third-party middleware packages to extend the functionality of their applications even further.

Express.js enables the construction of Deno.js middleware through its streamlined and modular architecture, which simplifies the development process and allows for greater flexibility and customization in building web applications and APIs.

Installing Express.js on Ubuntu.

To install Express.js on Ubuntu, I follow these steps:

1. I open my terminal.

2. I navigate to my project directory.

3. I run the following command to scaffold a new Deno.js project:

deno init

This command creates starter files (such as main.ts and a test file) in my project directory.

4. There is no separate install step for the express package itself: the npm: specifier in the import statement (see step 6) tells Deno.js to download express and its dependencies automatically the first time the server runs.

5. I create the main.ts file (if deno init has not already created one):

touch main.ts

6. In main.ts, I create a simple server:

// @deno-types="npm:@types/express@4.17.15"
import express from "npm:express@4.18.2";

const app = express();

app.get("/", (req, res) => {
  res.send("Welcome to the Dinosaur API!");
});

app.listen(8000);

7. I run the main.ts server:

deno run -A main.ts

8. I point my browser to the following URL:

localhost:8000

I should see:

Welcome to the Dinosaur API!

The next step is to add some data. I'll use the Dinosaur data that I found in this article. I right-click this link and copy this file to my clipboard.

9. I create data.json:

touch data.json

10. I paste the dinosaur data into the data.json file.

11. I add this line at the top of the main.ts file to import the data.json data:

import data from "./data.json" assert { type: "json" };

Then, I can create the routes to access that data. To keep it simple, I'll just define GET handlers for /api/ and /api/:dinosaur.

12. I replace everything after the const app = express(); line in the main.ts file with the following:

app.get("/", (req, res) => {
  res.send("Welcome to the Dinosaur API!");
});

app.get("/api", (req, res) => {
  res.send(data);
});

app.get("/api/:dinosaur", (req, res) => {
  if (req?.params?.dinosaur) {
    const found = data.find(item => item.name.toLowerCase() === req.params.dinosaur.toLowerCase());
    if (found) {
      res.send(found)
    } else {
      res.send("No dinosaurs found.");
    }
  }
});

app.listen(8000);

13. I run the server:

deno run -A main.ts

14. I check out the results in a browser:

http://localhost:8000/api

I should see a list of dinosaurs:

[
  {
    "name": "Aardonyx",
    "description": "An early stage in the evolution of sauropods."
  },
  {
    "name": "Abelisaurus",
    "description": "\"Abel's lizard\" has been reconstructed from a single skull."
  },
  {
    "name": "Abrictosaurus",
    "description": "An early relative of Heterodontosaurus."
  },
...

15. I look for a specific result in a browser:

localhost:8000/api/aardonyx

I should see the result of a specific dinosaur:

{
  "name": "Aardonyx",
  "description": "An early stage in the evolution of sauropods."
}

Great! On my Ubuntu system, I now have a way to install Express.js and use it with Deno.js. I can now start building middleware for my apps.

Attribution:

https://deno.com/manual@v1.33.1/node/how_to_with_npm/express

PostgreSQL.

PostgreSQL is an open-source relational database management system (RDBMS) renowned for its powerful features and scalability. First released in 1989 as a research project at the University of California, Berkeley, it has since become widely adopted as a reliable and efficient database solution for both small and large-scale applications. PostgreSQL supports a wide range of operating systems, including Linux, macOS, and Windows. It utilizes SQL (Structured Query Language) as its primary query language and offers numerous advanced features, such as support for user-defined types, complex queries and transactions, and extensive data type support.

PostgreSQL is also recognized for its extensibility, which enables developers to augment the database system with additional features and functionality. It boasts a large and active community of developers who contribute to the ongoing development and enhancement of the software. In summary, PostgreSQL is a powerful and versatile RDBMS extensively used for a diverse array of applications, ranging from small web applications to large-scale enterprise solutions.

Installing PostgreSQL on Ubuntu.

I can install PostgreSQL on Ubuntu by following these steps:

1. I update my system's package index by running the following command in my terminal:

sudo apt update

2. I install PostgreSQL by running the following command:

sudo apt install postgresql postgresql-contrib

3. After installation, PostgreSQL should start automatically. I can check its status by running the following command:

systemctl status postgresql

If PostgreSQL is not running, I can start it with this command:

sudo systemctl start postgresql

4. By default, PostgreSQL allows local connections only, so I should be able to connect to the server immediately once it is running. I can connect to the server by running the following command:

sudo -u postgres psql

This will open up the PostgreSQL command prompt.

Now that PostgreSQL is installed on my Ubuntu system, I can start creating databases, roles, and tables.
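For example, from the psql prompt, a role, a database, and a table can be created like this (all names here are illustrative):

CREATE ROLE appuser WITH LOGIN PASSWORD 'change-me';
CREATE DATABASE appdb OWNER appuser;
\c appdb
CREATE TABLE dinosaurs (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL,
  description TEXT
);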

Note: if I plan to use PostgreSQL for development purposes, or in a non-production environment, I may also want to install pgAdmin, a popular GUI client for PostgreSQL. On recent Ubuntu releases, pgAdmin is not in the default repositories, so after adding the pgAdmin project's own APT repository (as described on the pgAdmin website), I can install it by running:

sudo apt install pgadmin4

Redis.

Redis is an advanced in-memory data structure store that offers exceptional performance and versatility, making it a popular choice for use in web applications and APIs. When combined with Express.js, a popular web application framework, Redis can significantly boost the development of powerful, scalable, and responsive server-side applications that are capable of efficiently managing real-time data processing requirements. To achieve this seamless integration, Redis communicates with Express.js middleware through a well-defined process.

First, a Redis client is set up within the Express.js application, which establishes a connection to the Redis server. This client is responsible for sending commands and receiving responses from the Redis server. The Express.js middleware then utilizes this client to perform various operations on the Redis data store, such as storing, retrieving, or manipulating data. These operations are executed through the use of Redis commands, which are sent by the client to the Redis server over a TCP connection.

Once the Redis server receives the commands, it processes them and returns the appropriate responses to the client. The Express.js middleware, in turn, interprets these responses and takes the necessary actions based on the application's requirements. This efficient communication between Redis and Express.js middleware enables developers to build highly performant, scalable, and responsive server-side applications that can effectively handle the demands of real-time data processing.
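A minimal sketch of that flow: an Express.js middleware that serves cached responses from Redis, assuming the ioredis package via Deno's npm compatibility (the cache key and 60-second TTL are arbitrary choices):

// @deno-types="npm:@types/express@4.17.15"
import express from "npm:express@4.18.2";
import Redis from "npm:ioredis@5.3.2";

const app = express();
const redis = new Redis(); // connects to localhost:6379 by default

// Middleware: serve a cached response when one exists, otherwise fall through.
app.use(async (req, res, next) => {
  const cached = await redis.get(req.url);
  if (cached) {
    res.send(JSON.parse(cached));
    return;
  }
  next();
});

app.get("/api/expensive", async (req, res) => {
  const result = { value: Math.random() }; // stand-in for an expensive operation
  await redis.set(req.url, JSON.stringify(result), "EX", 60); // cache for 60 seconds
  res.send(result);
});

app.listen(8000);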

Installing Redis on Ubuntu.

To install Redis on Ubuntu, I can follow these steps:

1. I update my system's package index by running the following command in my terminal:

sudo apt update

2. I install Redis by running the following command:

sudo apt install redis-server

3. After installation, Redis should start automatically. I can check its status by running the following command:

systemctl status redis

If it's not running, I can start it by running:

sudo systemctl start redis

4. I test that Redis is working properly by connecting to the Redis server using the Redis command-line interface. I can do this by running:

redis-cli

This will open up the Redis command prompt.

5. I can try the following commands to make sure everything is working:

set mykey "Hello World"
get mykey

If Redis is running properly, I should see the value "Hello World" printed in response to the second command.

Now that Redis is installed on my Ubuntu system, I can start using it in my applications.

RedisGraph.

RedisGraph operates as a powerful graph database module specifically designed for Redis, which is a widely-used open-source, in-memory data structure store. This innovative module allows developers to effortlessly store, manage, and analyze graph data by utilizing the versatile Cypher query language. As a result, RedisGraph offers a high-performance and highly efficient solution that is well-equipped to handle intricate data relationships and the demands of real-time data processing.

By employing a unique combination of adjacency matrices and the GraphBLAS library, RedisGraph can optimize its performance and minimize the memory footprint. This approach enables the module to execute complex graph operations and algorithms with remarkable speed and efficiency. Furthermore, RedisGraph's seamless integration with the Redis ecosystem allows developers to take full advantage of its powerful features, such as data persistence, replication, and clustering, thus ensuring a robust and scalable graph database solution.

Installing RedisGraph on Ubuntu.

To install RedisGraph on Ubuntu, I follow these steps:

1. I download the RedisGraph source code, from the official GitHub repository, using the following command:

wget https://github.com/RedisGraph/RedisGraph/archive/refs/tags/v2.10.10.tar.gz

Note that v2.10.10 in the above command refers to the 21st April 2023 maintenance release of RedisGraph. I usually update these types of commands, with the latest stable versions, at the time of installation.

2. I unpack the RedisGraph source code using the following command:

tar xzf v2.10.10.tar.gz

3. I move into the extracted directory:

cd RedisGraph-2.10.10/

4. I make RedisGraph using the following command (this assumes the usual build prerequisites, such as build-essential and cmake, are installed; the RedisGraph README lists the complete requirements):

make

This will create a dynamic module file named redisgraph.so in the src directory.

5. I copy the dynamic module file to the Redis modules directory (creating the directory first, if it doesn't exist) using the following commands:

sudo mkdir -p /usr/lib/redis/modules
sudo cp src/redisgraph.so /usr/lib/redis/modules/

6. Finally, I need to load the RedisGraph module into the Redis server by adding the following line to the Redis configuration file:

loadmodule /usr/lib/redis/modules/redisgraph.so

To do this, I open the Redis configuration file in a text editor using the following command:

sudo nano /etc/redis/redis.conf

I add the above loadmodule line at the end of the file, save the changes, and exit the Nano text editor.

7. I restart the Redis server to load the RedisGraph module using the following command:

sudo systemctl restart redis

I now have RedisGraph installed on my Ubuntu system.
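To verify the module, I can create and query a small graph with Cypher directly from redis-cli (the graph name and data are illustrative):

GRAPH.QUERY social "CREATE (:Person {name:'Alice'})-[:FOLLOWS]->(:Person {name:'Bob'})"
GRAPH.QUERY social "MATCH (a:Person)-[:FOLLOWS]->(b:Person) RETURN a.name, b.name"

The second query should return one row containing Alice and Bob.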

Neo4j.

Neo4j is a highly scalable, native graph database that is designed to store and manage connected data efficiently. It allows developers to model, store, and query data as nodes and relationships in a graph, making it an ideal choice for applications that require complex data analysis and real-time processing, such as social media posts and real-time messaging.

To set up Redis as a cache for Neo4j, I can use the RedisGraph module. This module provides an API that allows me to efficiently store Neo4j result sets in Redis.

Here are the steps to set up Redis as a cache for Neo4j:

1. Configure the size of RedisGraph's query cache by appending the CACHE_SIZE option to the loadmodule line in the Redis configuration file:

loadmodule /usr/lib/redis/modules/redisgraph.so CACHE_SIZE 25

This tells RedisGraph to cache up to 25 query execution plans per graph (CACHE_SIZE is one of RedisGraph's load-time configuration options).

2. Query Neo4j and cache the results in Redis using RedisGraph. Here is a minimal Python sketch (it assumes the neo4j and redisgraph packages are installed, a local Neo4j instance with placeholder credentials, and a Redis server with the RedisGraph module loaded):

import redis
from neo4j import GraphDatabase
from redisgraph import Graph

# Connect to Neo4j (credentials are placeholders) and to Redis
neo4j_driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
redis_conn = redis.Redis(host="localhost", port=6379)
redis_graph = Graph("neo4j_cache", redis_conn)

# Run a Neo4j query and cache the result in RedisGraph
with neo4j_driver.session() as session:
    node_count = session.run("MATCH (n) RETURN count(n) AS c").single()["c"]
    redis_graph.query(
        "MERGE (s:Stat {name: 'node_count'}) SET s.value = $count",
        {"count": node_count},
    )

# Later reads can be served from the in-memory cache instead of Neo4j
cached = redis_graph.query("MATCH (s:Stat {name: 'node_count'}) RETURN s.value")
print(cached.result_set[0][0])

In this example, I'm using the Neo4j Python driver to run a query against Neo4j, then writing the result into RedisGraph so that subsequent reads can be answered from Redis rather than by querying Neo4j again.

Installing Neo4j on Ubuntu.

To install Neo4j on Ubuntu, I follow these steps:

1. I import the Neo4j signing key:

wget -O - https://debian.neo4j.com/neotechnology.gpg.key | sudo apt-key add -

2. I add the Neo4j official repository to my system's package sources:

bash echo "deb https://debian.neo4j.com stable latest" | sudo tee -a /etc/apt/sources.list.d/neo4j.list

3. I update the package lists:

sudo apt update

4. I install Neo4j’s package:

sudo apt install neo4j

5. Once Neo4j is installed, I start the Neo4j service using the following command:

sudo systemctl start neo4j

6. I verify that Neo4j is running by checking its status:

sudo systemctl status neo4j

If everything is fine, I should see a message that says "active (running)."

7. I open a web browser and navigate to the following URL: http://localhost:7474

This will open the Neo4j Browser interface, where I can start exploring and interacting with my Neo4j graph database installation.

I have now installed Neo4j on my Ubuntu system.

The Results.

Composable Architecture offers a powerful and flexible approach to modern software development. By leveraging microservices, APIs, and cutting-edge technologies such as Next.js, Deno.js, Express.js, PostgreSQL, Redis, RedisGraph, and Neo4j, developers can create scalable, high-performance applications that efficiently manage complex data relationships and deliver seamless user experiences. As the demand for adaptable and resilient systems continues to grow, Composable Architecture and its associated technologies will play a crucial role in shaping the future of software development.

In Conclusion.

Looking ahead, the future of Composable Architecture and related technologies is promising. As the demand for scalable and flexible systems continues to grow, the need for efficient data management solutions becomes increasingly important. By leveraging technologies like Neo4j and RedisGraph, developers can create powerful, adaptable, and resilient solutions that can handle the ever-evolving challenges of storing instant messages and social media posts.

Until next time: Be safe, be kind, be awesome.


Written by

Brian King

Thank you for reading this post. My name is Brian and I'm a developer from New Zealand. I've been interested in computers since the early 1990s. My first language was QBASIC. (Things have changed since the days of MS-DOS.) I am the managing director of a one-man startup called Digital Core (NZ) Limited. I have accepted the "12 Startups in 12 Months" challenge so that DigitalCore will have income-generating products by April 2024. This blog will follow the "12 Startups" project during its design, development, and deployment; cover the Agile principles and the DevOps philosophy used by the project; and delve into the world of AI, machine learning, deep learning, prompt engineering, and large language models. I hope you enjoyed this post and, if you did, I encourage you to explore some others I've written. And remember: The best technologies bring people together.