Suppose we want to limit concurrent access to a resource to a specific number. For example, an SLA may permit only a certain number of concurrent requests to a remote service, or only so many concurrent writes to a database. The classes in the Java concurrent utilities (java.util.concurrent) provide a way to achieve this.
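Before working through the full solution, the core idea can be sketched with a plain java.util.concurrent.Semaphore. This is a minimal illustration, not the article's solution; the class name ThrottledClient and the permit count of 5 are assumptions for the example:

```java
import java.util.concurrent.Semaphore;

public class ThrottledClient {
    // At most 5 callers may be inside call() at any moment.
    private final Semaphore permits = new Semaphore(5);

    public void call(Runnable request) throws InterruptedException {
        permits.acquire();      // blocks while 5 requests are already in flight
        try {
            request.run();      // the actual call to the protected resource
        } finally {
            permits.release();  // always return the permit
        }
    }
}
```

The semaphore blocks the producer thread, though; the solution below avoids that by buffering tasks in a queue instead.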
Let us define a scenario to work out a solution.
A policy service needs to emit events to a dashboard service so that the dashboard can display them as they happen. However, the dashboard service accepts only 5 concurrent requests; exceeding that limit results in a penalty.
Below is a Spring command-line application that demonstrates the solution. The key participants are the ThreadPoolExecutor, BlockingQueue and ThrottledDashboardService instances.
- The Spring container instantiates a LinkedBlockingQueue (AppConfig.java line#44) and wires it into both the ThreadPoolExecutor (AppConfig.java line#31) and the ThrottledDashboardService (AppConfig.java line#58) instances.
- The events generated by the PolicyService are delegated to the ThrottledDashboardService.
- ThrottledDashboardService converts the method arguments into DashboardTask instances and pumps them into the queue.
- The 5 threads in the pool pick up the tasks and make a REST call to the remote server.
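Stripped of the Spring annotations, the wiring of the three participants amounts to the sketch below. The class name ThrottledPipeline is illustrative, and a counter stands in for the actual REST call; in the real application these objects are wired as beans in AppConfig.java:

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class ThrottledPipeline {
    // Shared buffer between the producer (PolicyService) and the worker threads.
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();
    // 5 core threads drain the queue; the pool size caps concurrent dashboard calls at 5.
    private final ThreadPoolExecutor executor =
            new ThreadPoolExecutor(5, 5, 0L, TimeUnit.MILLISECONDS, queue);
    private final AtomicInteger posted = new AtomicInteger(); // stands in for the REST call

    // Role of ThrottledDashboardService.emit: wrap the event in a task and enqueue it.
    public void emit(String event) {
        executor.execute(() -> postToDashboard(event)); // queued, picked up by a pool thread
    }

    private void postToDashboard(String event) {
        posted.incrementAndGet(); // placeholder for the HTTP POST to the dashboard
    }

    // Drain the queue and return how many events were posted.
    public int drain() throws InterruptedException {
        executor.shutdown();
        executor.awaitTermination(5, TimeUnit.SECONDS);
        return posted.get();
    }
}
```

Because the pool holds exactly 5 threads and all tasks flow through the shared queue, no more than 5 dashboard calls can ever be in flight at once.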
Benefits of this approach:
- Straightforward implementation.
- Async processing. The producer thread does not have to wait for the processing to be finished.
- Tolerant to traffic bursts. The queue buffers any sudden increase in traffic. (Task Throttling using Java Concurrent Utilities (Sync) discusses how throttling can be achieved synchronously.)
Drawbacks:
- The execution happens in a separate thread, so it is not straightforward to return the result of the execution to the caller; a callback is needed.
- If the producer runs in a database transaction, the asynchronous task executes outside the scope of that transaction.
- The solution works per JVM. If the application is scaled horizontally, each JVM gets to make maxPoolSize concurrent requests.
- Overall, the consumption rate must keep up with the production rate; otherwise the backlog in the queue will grow without bound.
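The first drawback above can be mitigated by submitting the task through CompletableFuture.supplyAsync on the same bounded pool, which hands the caller a future it can attach a callback to. The emit signature and the "ack:" response below are assumptions for illustration, not the article's code:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncEmit {
    // Same bounded pool of 5 workers, so the throttling guarantee is preserved.
    static final ExecutorService pool = Executors.newFixedThreadPool(5);

    // Returns a future so the caller can register a callback instead of blocking.
    static CompletableFuture<String> emit(String event) {
        return CompletableFuture.supplyAsync(
                () -> "ack:" + event,   // placeholder for the REST call's response
                pool);
    }
}
```

The caller can then chain `emit("e1").thenAccept(resp -> ...)` to consume the response whenever it arrives, without waiting on the pool thread.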
Note: As an alternative solution, if a middleware brokers the requests to the remote service, it can throttle the rate or concurrency itself.