API Gateway vs Load Balancer

Beng Chen

March 3, 2023

Technology

As Internet technology has developed, the volume of network requests has increased dramatically, placing greater pressure on servers. In early system architectures, a load balancer was typically used to distribute network traffic across multiple servers, thereby reducing the load on any single server.

Nowadays, however, many different types of backend services expose APIs to the outside world, and the number of APIs keeps growing. As a result, the limitations of architectures that rely mainly on load balancers have become evident: load balancers primarily operate at layer 4 and offer limited functionality at layer 7. This has led to the emergence of an infrastructure component called the API Gateway, which operates mainly at layer 7 and provides extensive extension capabilities.

In this article, we will introduce the distinct features of load balancers and API gateways and explore their differences, to help readers better understand the relationship between the two.

What Is a Load Balancer?


The primary role of a load balancer is to provide load-balancing functionality for multiple backend services, distributing traffic among them according to different load-balancing algorithms (a minimal sketch of two such algorithms follows the list below). Load balancers have a long history, which can be roughly divided into the following stages:

  • First stage: Load balancers in this stage are typically composed of hardware devices, which have high performance and high reliability but are inflexible and expensive.

  • Second stage: Load balancers began to be implemented in software, making them more flexible and scalable; because they are distributed as software, they are also less expensive. LVS (Linux Virtual Server) is an example of this type.

  • Third stage: With the rise of cloud computing technology, load balancers also began to have cloud versions. One significant advantage of such a load balancer is that it can help enterprises obtain high-performance load-balancing services at a lower cost. Another advantage is that it can leverage the scalability and elasticity of cloud computing to enhance overall availability. Examples of cloud load balancers include AWS's Classic Load Balancer, Application Load Balancer, and Network Load Balancer.
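
To make the idea of a load-balancing algorithm concrete, here is a minimal Python sketch of two common strategies, round robin and weighted random. The server addresses and weights are placeholders for illustration, not part of any real product's API.

```python
import itertools
import random

# Hypothetical backend pool; addresses and weights are placeholders.
SERVERS = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
WEIGHTS = [5, 3, 2]  # relative capacity of each server

# Round robin: hand requests to servers in a fixed rotation.
_rotation = itertools.cycle(SERVERS)

def round_robin():
    return next(_rotation)

# Weighted random: servers with higher weights receive proportionally more traffic.
def weighted_random():
    return random.choices(SERVERS, weights=WEIGHTS, k=1)[0]

if __name__ == "__main__":
    print([round_robin() for _ in range(6)])
    print([weighted_random() for _ in range(6)])
```

Real load balancers offer more algorithms (least connections, consistent hashing, and so on), but they all answer the same question shown here: given a request, which backend node should receive it?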

In addition to distributing traffic and improving network scalability, load balancers can also be used to enhance network security. For example, they can be used to isolate internal servers from the public internet, preventing malicious attacks and unauthorized access. A simple use case is an internal server containing sensitive information, which a load balancer can isolate within the internal network to protect its security effectively.

What Is an API Gateway?


In short, an API Gateway is an infrastructure component that manages and forwards API traffic, working primarily at layer 7. It provides extensibility that load balancers do not have, including authentication, observability, and custom plugins. Some of its key features are listed below; a minimal sketch of the authentication and rate-limiting ideas follows the list:

  • Rich routing strategies: As the API Gateway operates at layer 7, it can parse data at the HTTP/HTTPS layer. Hence, it can direct requests to various upstream servers based on conditions like the request path, domain, or header.

  • Authentication: An API Gateway supports multiple authentication methods at the API level, such as OAuth2 and JWT, to block unauthorized requests. Authentication is kept separate from business logic, so it is non-intrusive, or only minimally intrusive, to business code.

  • Rate limiting: Fine-grained rate limiting can be applied to different routing levels to prevent malicious attacks and avoid backend service overload.

  • Observability: Observability refers to the ability to observe a program's running status and resource utilization from outside the system. An API Gateway can ship logs to Kafka, Google Cloud Logging, Elasticsearch, etc., and export relevant metrics to Prometheus, Datadog, etc.

  • Extension: Because an API Gateway sits at the entry point of the system, it must adapt to the different application scenarios of different companies, such as different identity authentication schemes, canary releases, security policies, and log collection. It therefore has to let users choose from existing extensions or develop their own, so its extensibility is strong and the selection of available extensions is quite diverse. For example, Apache APISIX has 13 different authentication extensions, covering almost all common authentication requirements on the market.
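
To make the authentication and rate-limiting points above concrete, below is a minimal, illustrative Python sketch of how these checks can sit in front of upstream services as independent steps. It is not the implementation of any particular gateway; the API key, limits, and handler are invented for the example.

```python
import time

# Hypothetical consumers and limits; a real gateway loads these from its configuration.
API_KEYS = {"demo-key-123": "team-a"}
RATE_LIMIT = 5        # requests allowed per consumer...
WINDOW_SECONDS = 60   # ...within this time window
_request_log = {}     # consumer -> timestamps of recent requests

def authenticate(headers):
    """Map an API key header to a consumer (a stand-in for OAuth2/JWT validation)."""
    return API_KEYS.get(headers.get("X-API-Key", ""))

def within_rate_limit(consumer):
    """Fixed-window counter per consumer."""
    now = time.time()
    recent = [t for t in _request_log.get(consumer, []) if now - t < WINDOW_SECONDS]
    if len(recent) >= RATE_LIMIT:
        _request_log[consumer] = recent
        return False
    recent.append(now)
    _request_log[consumer] = recent
    return True

def handle(headers):
    """Run gateway-level checks, then (conceptually) forward to the upstream."""
    consumer = authenticate(headers)
    if consumer is None:
        return 401, "unauthorized"
    if not within_rate_limit(consumer):
        return 429, "too many requests"
    return 200, "forwarded to upstream"

print(handle({"X-API-Key": "demo-key-123"}))  # (200, 'forwarded to upstream')
print(handle({}))                              # (401, 'unauthorized')
```

The business services behind the gateway never see unauthenticated or rate-limited traffic, which is exactly the separation from business logic described above.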

Currently, there are many different API gateways available on the market, such as Apache APISIX, Kong, Tyk, Zuul, etc. Developers can select the most appropriate API Gateway according to their specific requirements.

Main Differences Between API Gateway and Load Balancer

Distinct Focus Areas


Although both API Gateway and load balancer support layer 4 and layer 7 proxies, API Gateway primarily focuses on layer 7 while load balancer focuses mainly on layer 4.

Operating at layer 4 gives the load balancer some notable advantages. First, it avoids the protocol-parsing overhead that an API Gateway incurs, resulting in higher throughput. Second, it supports transparent forwarding of the client IP address, whereas an API Gateway typically passes the client IP through HTTP headers (illustrated in the sketch below).
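
To illustrate the second point, here is a small Python sketch of the common convention by which a layer 7 proxy passes the client address to the upstream via the X-Forwarded-For header. The function is hypothetical; whether and how a real gateway sets this header depends on its configuration.

```python
def build_upstream_headers(client_ip, incoming_headers):
    """Append the client address to X-Forwarded-For, as many layer 7 proxies do.

    A layer 4 load balancer can preserve the source IP at the connection level,
    so no header rewriting of this kind is needed there.
    """
    headers = dict(incoming_headers)
    existing = headers.get("X-Forwarded-For")
    headers["X-Forwarded-For"] = f"{existing}, {client_ip}" if existing else client_ip
    return headers

# The upstream must read the header to learn the original client IP.
print(build_upstream_headers("203.0.113.7", {"Host": "api.example.com"}))
```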

Richness of Features


In general, load balancers have weaker HTTP (layer 7) processing capabilities and often lack features such as authentication, authorization, complex routing logic, and log collection. API gateways, by contrast, have stronger layer 7 protocol processing capabilities and can attach a wide range of feature extensions such as access control, logging, API management, and serverless computing.

Custom Development

In today's rapidly evolving technology market, many companies require the ability to support custom development. API gateways offer various custom development options, such as support for different programming languages and the ability to inject custom processing logic at different stages of traffic forwarding. Load balancers, on the other hand, offer little to no room for custom development.

Traffic Distribution Methods

Load balancers usually distribute traffic directly: based on an algorithm, requests are sent straight to specific backend server nodes, which means every service instance receiving the traffic must behave consistently, reducing flexibility. In contrast, API gateways distribute traffic along different dimensions such as URL path, domain, and header (see the sketch below), so the service instances receiving the traffic can differ, for example a private API or a gRPC API, making traffic distribution highly flexible.
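
The following Python sketch shows what rule-based distribution looks like when routes are matched on path prefix, host, and header before an upstream is chosen. The route table and upstream names are made up for illustration and do not reflect any particular gateway's configuration format.

```python
# Hypothetical route table: each rule matches on a different dimension.
ROUTES = [
    {"path_prefix": "/payments/", "upstream": "payments-grpc"},
    {"host": "admin.example.com", "upstream": "admin-api"},
    {"header": ("X-Canary", "true"), "upstream": "orders-canary"},
    {"path_prefix": "/", "upstream": "orders-stable"},  # default route
]

def pick_upstream(path, host, headers):
    """Return the upstream of the first rule whose conditions all match."""
    for rule in ROUTES:
        if "path_prefix" in rule and not path.startswith(rule["path_prefix"]):
            continue
        if "host" in rule and host != rule["host"]:
            continue
        if "header" in rule:
            name, value = rule["header"]
            if headers.get(name) != value:
                continue
        return rule["upstream"]
    return None

print(pick_upstream("/orders/42", "api.example.com", {"X-Canary": "true"}))  # orders-canary
print(pick_upstream("/payments/9", "api.example.com", {}))                   # payments-grpc
```

A layer 4 load balancer, by contrast, never sees the path, host, or headers; it only picks a node from a uniform pool.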

Use Cases

Microservices


For systems with a microservices architecture, an API Gateway is essential. First, it makes it easy to manage and route the various backend services. Second, an API Gateway can provide many advanced features such as authentication, authorization, rate limiting, forwarding, and logging. As a result, individual microservices no longer need to implement functions such as rate limiting and authentication themselves, which keeps each microservice focused on its own functionality and reduces development costs.

A microservices architecture involves many services, which makes layer 4 load balancers unsuitable for balancing load across multiple backend services; they are better suited to monolithic backends. Meanwhile, layer 7 load balancers usually cannot offer these advanced features, so their advantages over API gateways in a microservices setting are limited.

API Management & Deployment

An API Gateway is also highly suitable when a large number of APIs must be managed and deployed, because it has powerful API management capabilities. You can easily enable or disable a specific API, quickly modify its forwarding configuration, and add features such as rate limiting, authentication, and logging to it, all without restarting the API Gateway.

Taking Apache APISIX as an example: it is a top-level project of the Apache Software Foundation and one of the most active open-source gateway projects. As a dynamic, real-time, high-performance open-source API Gateway, Apache APISIX provides various traffic management functions such as load balancing, dynamic upstream, canary release, circuit breaking, identity authentication, and observability.
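
As a concrete illustration of the dynamic configuration described above, the sketch below uses Python's requests library to create a route on a local Apache APISIX instance through its Admin API, attaching the key-auth and limit-count plugins. The Admin API address, the default admin key, the route ID, and the plugin parameters are assumptions based on a default local APISIX 3.x installation; verify them against your own deployment and APISIX version.

```python
import requests

# Assumed defaults for a local APISIX 3.x install; adjust for your deployment.
ADMIN_API = "http://127.0.0.1:9180/apisix/admin"
ADMIN_KEY = "edd1c9f034335f136f87ad84b625c8f1"  # default demo key, replace in production

route = {
    "uri": "/api/orders/*",                      # hypothetical API path
    "upstream": {
        "type": "roundrobin",
        "nodes": {"127.0.0.1:8080": 1},          # hypothetical backend service
    },
    "plugins": {
        "key-auth": {},                          # require an API key from consumers
        "limit-count": {                         # allow 100 requests per 60 seconds
            "count": 100,
            "time_window": 60,
            "key": "remote_addr",
            "rejected_code": 429,
        },
    },
}

# Creating or updating the route takes effect immediately, without restarting the gateway.
resp = requests.put(
    f"{ADMIN_API}/routes/1",
    json=route,
    headers={"X-API-KEY": ADMIN_KEY},
    timeout=5,
)
print(resp.status_code, resp.text)
```

Disabling the API again, or changing its rate limit, is just another call on the same route, which is what makes managing a large number of APIs this way practical.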

(Screenshots: Apache APISIX Dashboard)

Traditional load balancers, on the other hand, are relatively weak at API management and lack such rich advanced features.

High-Performance Network Access

For scenarios that require very high traffic and extremely high stability at the network entry point, a layer 4 load balancer is clearly more suitable. It can distribute the original layer 4 traffic directly to each backend service without the overhead of repeatedly parsing application-layer protocols in the middle, so it can sustain higher throughput.

In contrast, an API Gateway operating at layer 7 as the unified entry point has a certain throughput ceiling, because it must parse the protocols it handles. Using an API Gateway at layer 4 for network access is not particularly advantageous either, since this layer is not the focus of the API Gateway; compared with the many years of technical accumulation that load balancers have at this layer, the advantage of API gateways is far less significant.

Summary

In general, an API Gateway and a load balancer are infrastructure components that solve different problems. An API Gateway is mainly used as a proxy for backend APIs, providing a single entry point for accessing different types of APIs along with independent functions such as rate limiting, authentication, and monitoring. A load balancer, on the other hand, is mainly used for layer 4 traffic distribution, spreading requests across multiple backend servers to balance the request load and improve overall system availability and fault tolerance.

Under a well-designed architecture, an API Gateway and a load balancer are generally used together: the load balancer serves as the network entry point for the entire system and distributes traffic to multiple API Gateway instances, and each API Gateway instance then routes, authenticates, and authorizes requests, making the entire network more robust, reliable, and scalable.

Tags:
API Gateway Concepts, Load Balancer