Monolith API vs. two APIs with the same datastore
I’m working on a project where the starting point is an on-premises Windows app (third party) with its own datastore that will not always be reachable. That datastore will be the source of truth for the entire application.
Having said that, I’m building two front-end applications: a multiplatform app (Xamarin.Forms/MAUI) and a web SPA (Blazor).
For those client apps, I’ll build a REST API (ASP.NET 6) with its own datastore. The on-premises data (I have full access to the database) will be synced to the online datastore via an on-premises service that talks to the REST API and receives data via Service Bus (since it could be temporarily offline).
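For context, this is roughly how I imagine the on-premises sync service consuming Service Bus messages while it is reachable; the queue name, connection string, and handler body are just placeholders, not a final implementation:

```csharp
// Sketch of the on-premises sync service draining messages published by the REST API.
// Uses Azure.Messaging.ServiceBus; queue name and connection string are placeholders.
using Azure.Messaging.ServiceBus;

var client = new ServiceBusClient("<servicebus-connection-string>");
var processor = client.CreateProcessor("sync-to-onprem", new ServiceBusProcessorOptions
{
    // At-least-once processing: complete the message only after the local write succeeds.
    AutoCompleteMessages = false
});

processor.ProcessMessageAsync += async args =>
{
    var payload = args.Message.Body.ToString();
    // Apply the change to the on-premises datastore here (omitted).
    await args.CompleteMessageAsync(args.Message);
};

processor.ProcessErrorAsync += args =>
{
    Console.Error.WriteLine(args.Exception);
    return Task.CompletedTask;
};

await processor.StartProcessingAsync();
Console.ReadLine();                      // keep the console host alive while processing
await processor.StopProcessingAsync();
```

If the service is offline, messages simply accumulate in the queue until it reconnects, which is why I chose Service Bus for that leg.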
My first option is to build a monolithic API that serves both the on-premises sync service and the front ends. However, I have some concerns about scalability: the load from the on-premises service will be uniform, while I expect spikes from the front ends. So I was wondering if separating the APIs for the on-premises service and the front ends (while keeping the same datastore) makes sense.
Since I’m using EF Core (code first) as my ORM of choice, would it make sense to create a sort of DAL in a shared module referenced by both APIs?
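Roughly what I have in mind for that shared module (entity, context, and method names are just placeholders):

```csharp
// Shared.Data class library referenced by both API projects:
// one DbContext plus a registration helper, so the model lives in one place.
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

public class Order                      // illustrative entity only
{
    public int Id { get; set; }
    public string? Reference { get; set; }
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Order> Orders => Set<Order>();
}

public static class DataServiceCollectionExtensions
{
    // Each API calls this from its own Program.cs with its own connection string.
    public static IServiceCollection AddAppData(this IServiceCollection services,
                                                string connectionString)
        => services.AddDbContext<AppDbContext>(o => o.UseSqlServer(connectionString));
}
```

Each API would then just do `builder.Services.AddAppData(builder.Configuration.GetConnectionString("Main"));` in its Program.cs, and migrations would be owned by the shared project.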
Since the project is small, the whole microservices approach (i.e., duplicating the data) is out of the question, especially as the data would be exactly the same across the APIs.
Thanks.
