The monolith database as a primary cause of slow delivery

Large monolithic applications are characterized by big bottlenecks that slow down the entire delivery process. Their internal dependencies add significant overhead to your deployment throughput. Cycle times grow longer because builds are slower and both the automated and manual tests have to validate the entire application, which creates a lot of friction in the pipeline. Dependencies also bring complexity, which increases development time; complex code bases offer more opportunities for bugs, which in turn produces a lot of waste. Finally, dependencies reduce the opportunities to parallelize work: to work as a group, developers create branches so they can work in isolation, but that in turn forces them to merge more often, which demands considerable effort.

For me, bottleneck number one is manual testing! Manual testing is a tedious process when test cycles are repeated frequently, so many teams adopt test automation. But before they can automate, they first need to create test data. Only when teams have adequate test data, and can create that data on demand, do they see better IT performance, lower change failure rates, and lower levels of deployment pain and rework. The problem with large monolithic applications is that they usually come with big databases, and the bigger the database, the more difficult it is to manage. When the test data is difficult to set up or reason about, it becomes difficult to do any meaningful test automation.
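To make "test data on demand" concrete, here is a minimal, hypothetical sketch (the schema, table, and function names are my own, not from any particular project) of a test that builds a small, isolated in-memory database per test instead of depending on a shared monolithic test database:

```python
import sqlite3
import unittest


def make_test_db():
    """Create a fresh in-memory database with only the schema and
    rows this test needs, on demand, instead of relying on a big
    shared test database that is hard to set up or reason about."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
    conn.executemany(
        "INSERT INTO orders (status) VALUES (?)",
        [("open",), ("shipped",)],
    )
    conn.commit()
    return conn


class OrderTest(unittest.TestCase):
    def test_counts_open_orders(self):
        conn = make_test_db()  # data created on demand, per test
        open_count = conn.execute(
            "SELECT COUNT(*) FROM orders WHERE status = 'open'"
        ).fetchone()[0]
        self.assertEqual(open_count, 1)
        conn.close()
```

Because each test constructs exactly the data it needs, tests stay independent of one another and can run in parallel, which is precisely what a large shared database makes difficult.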

In conclusion: if your database is a monolith, your delivery process will suffer from bottlenecks caused by a lack of test automation, and from communication and coordination overhead, even if you have a service-oriented or microservice architecture on top of it.
