9👍
Django, like many other web frameworks, is built around the concept of receiving an HTTP request from a web client, processing it, and sending back a response. Breaking down that flow (simplified for the sake of clarity):
- The remote client opens a TCP connection with your Django server.
- The client sends an HTTP request to the server, with a path, some headers and possibly a body.
- The server sends back an HTTP response.
- The connection is closed.
- The server goes back to waiting for a new connection.
A chat server, if it needs to be somewhat real-time, needs to be different: it needs to maintain many simultaneous open connections with connected clients, so that when new messages are published on a channel, the appropriate clients are notified accordingly.
A modern way of implementing that is with WebSockets. The communication between client and server still starts with an HTTP request, like the one described above, but the client sends a special Upgrade request, asking for the session to switch from the simple request/response paradigm to a persistent, “full-duplex” communication model, where both the client and the server can send messages at any time, in both directions.
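On the Django side, such a persistent endpoint is usually written as a consumer rather than a view. Here is a minimal sketch, assuming Django Channels (not mentioned in this answer, just one common option) and a hypothetical `ChatConsumer` that simply echoes messages back:

```python
# Minimal sketch assuming Django Channels is installed and routed.
# ChatConsumer is a hypothetical name used only for illustration.
import json

from channels.generic.websocket import WebsocketConsumer


class ChatConsumer(WebsocketConsumer):
    def connect(self):
        # Complete the WebSocket handshake; the connection now stays open.
        self.accept()

    def receive(self, text_data=None, bytes_data=None):
        # Called whenever the client sends a frame over the open connection.
        payload = json.loads(text_data)
        # Echo the message back; a real chat app would fan it out to a group.
        self.send(text_data=json.dumps({"message": payload["message"]}))

    def disconnect(self, close_code):
        # The persistent connection was closed by either side.
        pass
```

In practice the consumer would be registered in a Channels routing module and served by an ASGI server, since the plain WSGI request/response cycle described above cannot keep the connection open.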
The fact that connections with multiple simultaneous clients need to stay open means you can’t have a simple execution model where your server handles a single request at a time, which is usually what happens in so-called synchronous servers. Tornado and Twisted use a different networking model, based on asynchronous, event-driven I/O, so that many connections can be left open and serviced simultaneously by one server, which makes a chat service possible.
Synchronous approach nevertheless
Having said that, there are ways to implement a very simple, non-scalable chat service, at the cost of some noticeable latency:
- Clients perform `POST` requests to your server to send messages to channels.
- Clients perform periodic `GET` requests to the server to ask for any new messages on the channels they’re subscribed to. The rate at which they send these requests is basically the refresh rate of the chat app (see the sketch after this list).
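As a rough illustration of that polling approach, here is a minimal sketch of the two Django views, assuming a hypothetical `Message` model with `channel`, `text` and `created_at` fields (none of which appear in the answer above):

```python
# Minimal sketch of the polling approach described above.
# The Message model and its fields are hypothetical, for illustration only.
# CSRF handling and authentication are omitted for brevity.
import json

from django.http import JsonResponse
from django.views.decorators.http import require_GET, require_POST

from .models import Message  # assumed fields: channel, text, created_at


@require_POST
def post_message(request, channel):
    # A client sends a message to a channel with a plain POST request.
    body = json.loads(request.body)
    message = Message.objects.create(channel=channel, text=body["text"])
    return JsonResponse({"id": message.id})


@require_GET
def poll_messages(request, channel):
    # A client polls periodically, passing the id of the last message it saw.
    last_id = int(request.GET.get("after", 0))
    messages = (
        Message.objects.filter(channel=channel, id__gt=last_id)
        .order_by("id")
        .values("id", "text", "created_at")
    )
    return JsonResponse({"messages": list(messages)})
```

The client would simply call the poll view on a timer (say, every couple of seconds), and that interval is what determines the refresh rate of the chat.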
With this approach, your server will work significantly harder than if it had an asynchronous execution model maintaining persistent connections, but it will work.
2👍
If you’re going to make a chat app, you’ll want to use WebSockets. They’ll make getting updates to all clients participating in a conversation significantly easier, and they’ll give you real-time conversations within your app. Having said that, I’ve never seen WebSockets used within a synchronous framework.
Is it OK to build a Django-only, synchronous chat application? There are too many unanswered questions for a reasonable answer. How many people will use this chat app? How many people per conversation? How long will this app be around? If you’re looking to make something simple for you and a couple of friends, build with what you know. If you’re getting paid to make this app, use WebSockets and an asynchronous framework.
2👍
You certainly can develop a synchronous chat app; you don’t necessarily need to use an asynchronous framework. But it all comes down to what you want your app to do. How many people will use it? Will there be multiple users and multiple chats going on at the same time?