Django and microservices? Yeah, maybe somewhere in a parallel universe.
The only thing I can recommend is to build two identical services, say django_container_internal and django_container_production. That way you can release internal tools without stopping production.
If you want to prevent access to production functionality from the internal endpoints, you can deactivate the production URLs via environment variables. A Django project usually has a common config/urls.py that aggregates all URL endpoints and looks like this:
from django.conf.urls import include, url  # on Django >= 4.0 use path()/re_path() from django.urls

urlpatterns = [
    url(r'core/api/v1/', include('core.urls')),
    url(r'internal/api/v1/', include('internal_app_1.urls')),
    url(r'user/api/v1/', include('userapi_1.urls')),
    # ...
]
For example, you can add an IS_INTERNAL_TOOLS environment variable and update urls.py like this:
from os import environ

from django.conf.urls import include, url  # on Django >= 4.0 use path()/re_path() from django.urls

urlpatterns = [
    url(r'core/api/v1/', include('core.urls')),
    # ...
]

# Mount either the internal or the user-facing endpoints, depending on the container
if environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes'):
    urlpatterns.append(url(r'internal/api/v1/', include('internal_app_1.urls')))
else:
    urlpatterns.append(url(r'user/api/v1/', include('userapi_1.urls')))
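If you prefer to keep environment parsing out of urls.py, here is a minimal sketch of the same idea, assuming the usual config/settings.py layout: resolve the flag once in settings and read it from django.conf.settings in urls.py (the file paths and the IS_INTERNAL_TOOLS name are just the example from above, not anything Django requires).

# config/settings.py -- parse the flag once, next to the rest of the configuration (sketch)
from os import environ

IS_INTERNAL_TOOLS = environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes')

# config/urls.py -- pick the URL set based on the settings flag (sketch)
from django.conf import settings
from django.conf.urls import include, url

urlpatterns = [url(r'core/api/v1/', include('core.urls'))]
if settings.IS_INTERNAL_TOOLS:
    urlpatterns.append(url(r'internal/api/v1/', include('internal_app_1.urls')))
else:
    urlpatterns.append(url(r'user/api/v1/', include('userapi_1.urls')))

With that in place, django_container_internal and django_container_production can be built from the same image and differ only in the environment they are started with (e.g. IS_INTERNAL_TOOLS=true for the internal one).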
Pros:
- All models are accessible from both services (only one common DAO => no duplicated developer work to create models twice)
- Functionality is separated, so each service exposes only the features it needs
- Easy to implement
Cons:
- The whole source code is shipped in both containers even though half of it is never used
- If you use two separate databases for internal tools and the external API, you have to create all tables in both of them (but that doesn't look like your case)
- Because it is still a monolith, the internal and production parts depend heavily on the common core, and it is impossible to deploy an updated core separately
Source: stackexchange.com