Performance is crucial for modern apps. To achieve high performance, an application has to utilize resources efficiently, which in turn saves infrastructure and operational costs. Performance also drastically affects user experience: faster applications result in happier users and higher engagement. API performance plays one of the key roles here: it is barely possible to achieve these benefits without decreasing API latency, boosting throughput, and improving responsiveness, especially when processing large volumes of data, where a small tweak can make a tremendous impact.
## Caching

### Use cases and advantages

To improve backend API performance with caching, the general approach is to cache the results of expensive or frequently executed queries to third-party APIs, as well as of time-consuming and complex database queries.
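With Spring's cache abstraction, this often boils down to annotating the expensive call. Below is a minimal sketch; `Product`, the catalog URL, and the `products` cache name are illustrative assumptions, and `@EnableCaching` plus a cache provider are expected to be configured:

```java
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class ProductService {

    private final RestTemplate restTemplate;

    public ProductService(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    // The result of this expensive third-party call is stored in the "products" cache;
    // repeated calls with the same id are served from the cache instead of the network.
    @Cacheable("products")
    public Product fetchProduct(String id) {
        return restTemplate.getForObject(
                "https://catalog.example.com/products/{id}", Product.class, id);
    }
}
```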
* **Hybrid cache** - a caching system that combines some of the benefits of in-memory and distributed caches: it still stores data in memory, but in addition it incorporates a distributed caching component that spans multiple nodes or servers.

### Pitfalls

* One of the most complex and challenging aspects of caching is deciding when and how to invalidate or update cached data. A wrong **cache invalidation** implementation may negate the advantages of caching or serve outdated data to users (a minimal invalidation sketch follows this list);
* **Cache consistency** - during temporary unavailability of the cache, some data updates happen in the source but not in the cache; after the cache becomes available again, the changes must be synchronized. Consistency across the nodes of a distributed cache is another complex topic of its own;
* **Cache poisoning** - caches are susceptible to security threats such as cache poisoning: attackers inject malicious data into the cache, compromising system integrity and exposing harmful or fraudulent data to users;
* For hybrid caches like Infinispan, upscaling or downscaling requires restarting all nodes: Infinispan always segments the data it stores in memory, and changing the number of segments requires a full cluster restart.
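For the invalidation pitfall, a common (though simplified) approach with Spring's cache abstraction is to update or evict the cached entry whenever the underlying data changes. The repository and names below are hypothetical:

```java
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.CachePut;
import org.springframework.stereotype.Service;

@Service
public class ProductCacheMaintenance {

    private final ProductRepository productRepository; // hypothetical data-access layer

    public ProductCacheMaintenance(ProductRepository productRepository) {
        this.productRepository = productRepository;
    }

    // Replace the cached entry with the freshly saved value.
    @CachePut(value = "products", key = "#product.id")
    public Product update(Product product) {
        return productRepository.save(product);
    }

    // Drop the cached entry so that the next read goes back to the source of truth.
    @CacheEvict(value = "products", key = "#id")
    public void delete(String id) {
        productRepository.deleteById(id);
    }
}
```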

## Asynchronous processing

### Use cases and advantages

In backend development, performance and responsiveness are intricately linked and both play a crucial role. Utilizing asynchronous processing for time-consuming or computation-intensive tasks benefits both. For tasks such as email sending, file processing, and logging, where an instant response is impossible or unnecessary, asynchronous processing provides an efficient way to handle work without delaying other critical operations or blocking the main execution flow. Consider using Java's `CompletableFuture` and Spring's `@Async` annotation for asynchronous processing.
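A minimal sketch of offloading email sending with `@Async`; it assumes `@EnableAsync` is configured, and the notification details are illustrative:

```java
import java.util.concurrent.CompletableFuture;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class NotificationService {

    // Runs on a thread from Spring's async executor, so the calling request
    // thread is not blocked while the email is being sent.
    @Async
    public CompletableFuture<Void> sendConfirmationEmail(String recipient) {
        // ... build and send the email here (details omitted) ...
        return CompletableFuture.completedFuture(null);
    }
}
```

The caller can ignore the returned future for fire-and-forget tasks, or compose it with other asynchronous steps.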
But `@Async` has its limitations; for large-scale applications with more complex requirements, message queues provide additional capabilities:
4. Throttling. You can control the rate at which messages are produced or consumed to prevent overload and ensure optimal resource utilization.
5. Partitioning. Message queues often provide features for partitioning data, which enables parallel processing.

### Pitfalls

* Error handling, especially the propagation of errors across asynchronous boundaries, can be really challenging. Improper error handling in asynchronous code can produce silent failures and lead to data corruption and unexpected or misleading behaviour. Consider using exception handlers in callbacks and events for error propagation (see the sketch after this list);
* Asynchronous programming requires careful management of resources, especially for long-running asynchronous processes. Pay special attention to proper cleanup of resources, possible resource leaks and excessive resource consumption;
* Debugging and testing may require specialized frameworks and techniques.
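For the error-handling point, a small sketch of attaching an exception handler directly to an asynchronous pipeline; the remote call, logging, and fallback value are illustrative:

```java
import java.util.concurrent.CompletableFuture;

public class PriceClient {

    public CompletableFuture<Double> fetchPriceWithFallback(String productId) {
        return CompletableFuture
                .supplyAsync(() -> fetchPriceFromRemoteService(productId)) // may fail
                .exceptionally(ex -> {
                    // Handle the failure inside the asynchronous boundary instead of
                    // letting it disappear silently; fall back to a default value.
                    System.err.println("Price lookup failed: " + ex.getMessage());
                    return 0.0;
                });
    }

    private double fetchPriceFromRemoteService(String productId) {
        // Hypothetical remote call that can throw.
        throw new IllegalStateException("pricing service unavailable");
    }
}
```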

## Parallel processing

### Use cases and advantages

During development of a service in a microservice architecture, the following situation may arise: the task of the service is to query several other services and aggregate the data fetched from them. The straightforward approach is to fetch the data by querying those services one by one, but since the calls are independent, they can be executed in parallel, significantly reducing the overall latency.
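A rough sketch of such parallel aggregation with `CompletableFuture`; the downstream clients (`UserClient`, `OrdersClient`) and the `Profile` aggregate are hypothetical:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class ProfileAggregator {

    private final UserClient userClient;     // hypothetical client for the user service
    private final OrdersClient ordersClient; // hypothetical client for the orders service

    public ProfileAggregator(UserClient userClient, OrdersClient ordersClient) {
        this.userClient = userClient;
        this.ordersClient = ordersClient;
    }

    public Profile loadProfile(String userId) {
        // Start both remote calls concurrently instead of one after another.
        CompletableFuture<User> user =
                CompletableFuture.supplyAsync(() -> userClient.getUser(userId));
        CompletableFuture<List<Order>> orders =
                CompletableFuture.supplyAsync(() -> ordersClient.getOrders(userId));

        // Total latency is roughly the slower of the two calls rather than their sum.
        return user.thenCombine(orders, Profile::new).join();
    }
}
```

For I/O-bound calls it is usually better to pass a dedicated executor to `supplyAsync` instead of relying on the common fork-join pool (see the pitfalls below).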
```java
// Sketch: each item is wrapped in a task and submitted to a thread pool; it assumes an
// injected ExecutorService `executor` and a hypothetical per-item handler processItem(...).
public void processItems(List<Item> items) {
    items.stream()
            .<Runnable>map(item -> () -> processItem(item))
            .forEach(executor::execute);
}
```

### Pitfalls

* Parallel processing introduces additional complexity to the software design and implementation. Coordinating multiple threads or processes, managing synchronization, and handling communication between parallel tasks can be challenging and error-prone;
* Parallel processing does not necessarily guarantee linear performance growth. You cannot simply assume that 100 similar tasks executed on 100 threads will run 10 times faster than the same 100 tasks executed on 10 threads;
* Debugging and testing may be really challenging, since traditional debugging techniques may not be sufficient for investigating concurrency-related issues;
* Parallel streams use the common fork-join thread pool. A custom thread pool is required for executing parallel streams with I/O operations, as the common fork-join pool is sized for CPU-bound tasks and may experience thread starvation when its threads are blocked by I/O (see the sketch after this list);
* Using parallel processing in the wrong context. For example, before a purchase the application should check that the user has sufficient funds. Applying parallel processing here may produce a situation where a user is able to buy goods with insufficient funds in their account.
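For the fork-join point, a commonly used workaround (relying on the fact that a parallel stream submitted from inside a `ForkJoinPool` task runs in that pool) is to execute the stream in a dedicated pool sized for blocking I/O; the pool size and the `fetch` call are illustrative:

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;

public class BulkFetcher {

    public List<String> fetchAll(List<String> urls) throws InterruptedException, ExecutionException {
        ForkJoinPool ioPool = new ForkJoinPool(32); // sized for blocking I/O, not for CPU cores
        try {
            // The parallel stream below runs on ioPool, so blocked threads
            // do not starve the common fork-join pool used by the rest of the app.
            return ioPool.submit(() ->
                    urls.parallelStream()
                        .map(this::fetch) // blocking remote call per element
                        .collect(Collectors.toList())
            ).get();
        } finally {
            ioPool.shutdown();
        }
    }

    private String fetch(String url) {
        // Hypothetical blocking call to a remote service.
        return "response from " + url;
    }
}
```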

## Payload reduction

To improve performance, consider using smaller payloads. Smaller payloads lead to faster transmission over the network, reduced latency, and improved responsiveness.

### Use cases & advantages

One of the ways to reduce response size is pagination. With pagination, an API responds with small chunks of the complete queried dataset, for example when a user navigates through long lists of content.
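A minimal sketch with Spring Data's `Pageable`; `UserRepository`, `User`, and the default page size are illustrative assumptions:

```java
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserController {

    private final UserRepository userRepository; // hypothetical Spring Data repository

    public UserController(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    // Returns one page of users instead of the whole dataset, keeping the payload small.
    @GetMapping("/users")
    public Page<User> listUsers(@RequestParam(defaultValue = "0") int page,
                                @RequestParam(defaultValue = "20") int size) {
        return userRepository.findAll(PageRequest.of(page, size));
    }
}
```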

### Pitfalls

* Improperly implemented pagination may result in issues with data consistency and integrity, particularly in data-intensive apps;
* Using compression for small responses or for data that is already compressed. While compression reduces the size of HTTP responses, it increases processing time, so in these cases compression may negatively affect performance;
* Using `PATCH` introduces additional complexity, especially in testing. Additionally, this HTTP method may not be supported by all servers or clients.

## Using a CircuitBreaker

### Use cases & advantages

The CircuitBreaker is a design pattern. Although its main purpose is to enhance the resilience of a system, implementing the pattern can also lead to performance improvements in certain scenarios.
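As an illustration, a circuit breaker can be placed around a remote call with a library such as Resilience4j; the thresholds, service name, and fallback value below are illustrative assumptions:

```java
import java.time.Duration;
import io.github.resilience4j.circuitbreaker.CallNotPermittedException;
import io.github.resilience4j.circuitbreaker.CircuitBreaker;
import io.github.resilience4j.circuitbreaker.CircuitBreakerConfig;
import io.github.resilience4j.circuitbreaker.CircuitBreakerRegistry;

public class InventoryClient {

    private final CircuitBreaker circuitBreaker;

    public InventoryClient() {
        CircuitBreakerConfig config = CircuitBreakerConfig.custom()
                .failureRateThreshold(50)                        // open after 50% of calls fail
                .waitDurationInOpenState(Duration.ofSeconds(30)) // stay open before probing again
                .slidingWindowSize(20)
                .build();
        this.circuitBreaker = CircuitBreakerRegistry.of(config).circuitBreaker("inventory");
    }

    public int availableStock(String sku) {
        try {
            // While the breaker is open, the remote call is skipped entirely,
            // which avoids waiting on timeouts against an unhealthy service.
            return circuitBreaker.executeSupplier(() -> callInventoryService(sku));
        } catch (CallNotPermittedException e) {
            return 0; // fallback while the downstream service is considered unhealthy
        }
    }

    private int callInventoryService(String sku) {
        // Hypothetical remote call.
        return 42;
    }
}
```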

### Pitfalls

* Tuning the circuit breaker's thresholds and timeouts is crucial, but it is quite a complex task. Without proper configuration, issues such as breaking too early (a false-negative scenario) are possible, leading to degradation of a healthy service;
* Due to its overhead, the implementation may negatively impact performance in high-throughput scenarios.

## Using a connection pool with `RestTemplate`

### Use cases & advantages

In the Spring Framework, a `RestTemplate` created with default settings uses `SimpleClientHttpRequestFactory` under the hood, which creates a new connection for each request. With this approach, operations like opening a socket and performing the handshake must be executed repeatedly for every connection.
```java
// Minimal reconstruction, assuming Apache HttpClient on the classpath: reuse pooled
// connections instead of opening a new one per request (pool sizes are illustrative).
@Bean
public RestTemplate pooledRestTemplate() {
    PoolingHttpClientConnectionManager connectionManager = new PoolingHttpClientConnectionManager();
    connectionManager.setMaxTotal(100);           // total connections across all routes
    connectionManager.setDefaultMaxPerRoute(20);  // connections per target host
    CloseableHttpClient httpClient = HttpClients.custom()
            .setConnectionManager(connectionManager)
            .build();
    return new RestTemplate(new HttpComponentsClientHttpRequestFactory(httpClient));
}
```

### Pitfalls

* Connections in the pool may become invalid if they are kept idle for too long or if the remote server terminates the connection unexpectedly. Using such a connection may result in errors or unexpected behaviour. Consider implementing a custom `Keep-Alive` strategy, which determines how long a connection may remain unused in the pool before it is closed (a minimal sketch follows this list);
* Improper configuration and connection management may lead to connection exhaustion or to the excessive use of resources for maintaining unnecessary connections.
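A sketch of such a strategy with Apache HttpClient 4.x; the 30-second idle limit is an illustrative choice, not a recommended value:

```java
import java.util.concurrent.TimeUnit;
import org.apache.http.conn.ConnectionKeepAliveStrategy;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class PooledClientFactory {

    public CloseableHttpClient keepAliveAwareClient() {
        // Treat pooled connections as reusable for at most 30 seconds.
        ConnectionKeepAliveStrategy keepAlive = (response, context) -> 30_000L;
        return HttpClients.custom()
                .setKeepAliveStrategy(keepAlive)
                .evictIdleConnections(30, TimeUnit.SECONDS) // proactively close idle connections
                .build();
    }
}
```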