[Answered] How to define a view function in views.py in Django

1👍

First, I want you to rethink your design.

Views exist to respond to a user's request, not to crawl data. Implement the crawling in a separate function, independent of the view. Your view should only display the last entry in the database; it should NOT crawl the data.

Think about a scenario:
User1 sends a GET request, which crawls the data and saves it to the database at tick 00:01 (assuming your timed-execution problem is solved). The next crawl should happen at tick 00:06. But if User2 and User3 arrive in between, at ticks 00:02 and 00:03, their GET requests each add newly crawled data to the database. You expected 2 entries between 00:01 and 00:06, but because of User2 and User3 there are 4.

So do it like this instead; it is more appropriate.

1. Create a myfun.py in your application directory:

    from .bbc import bbc_crawler
    from .models import News

    def crawl_data():
        # bbc_crawler() is expected to return a dict mapping headline -> URL
        allnews = [bbc_crawler()]
        for news in allnews:
            for title, link in news.items():
                # 'source' was undefined in the original snippet;
                # use a literal (or pass it in) instead
                News.objects.create(title=title, url=link, source='BBC')
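The code above assumes bbc_crawler() returns a dict mapping each headline to its link. A hypothetical stub of that interface, useful for exercising crawl_data() without hitting the network:

```python
def bbc_crawler():
    # Hypothetical stub: the real crawler would fetch and parse bbc.com.
    # It must return a dict mapping headline -> article URL, which is the
    # shape crawl_data() iterates over with .items().
    return {
        "Example headline one": "https://www.bbc.com/news/example-1",
        "Example headline two": "https://www.bbc.com/news/example-2",
    }

# The same iteration pattern crawl_data() uses:
for title, link in bbc_crawler().items():
    print(title, "->", link)
```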

2. AFTER starting your web server, run crawling.py once, as a separate process:

 python crawling.py

Write crawling.py as follows:

    import time

    from myfun import crawl_data

    while True:
        crawl_data()
        time.sleep(300)  # wait 5 minutes between crawls
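The loop above can also be written as a small reusable helper. This is only a sketch (run_periodically is a name invented here), with a max_runs cap so the demonstration terminates instead of looping forever:

```python
import time

def run_periodically(fn, interval_seconds, max_runs=None):
    # Call fn, then sleep, repeating until max_runs is reached
    # (or forever when max_runs is None, as in crawling.py).
    runs = 0
    while max_runs is None or runs < max_runs:
        fn()
        runs += 1
        time.sleep(interval_seconds)

# Demo with a tiny interval; the real script would use fn=crawl_data
# and interval_seconds=300.
calls = []
run_periodically(lambda: calls.append(1), 0.01, max_runs=3)
print(len(calls))  # → 3
```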

In your view, just show the last entry in the database, no matter how many users request it:

    from django.shortcuts import render

    from .models import News

    def collect_data(request):
        lastentry = News.objects.last()
        allnews = lastentry.allnews  # fetch according to your model fields
        source = lastentry.source    # fetch according to your model fields
        return render(request, 'news/index.html', {'allnews': allnews, 'source': source})
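For the view to be reachable it still has to be wired into the URLconf. A minimal urls.py sketch for the app (the 'news/' path and the route name are assumptions, not from the original answer):

```python
# news/urls.py (hypothetical path and name)
from django.urls import path

from . import views

urlpatterns = [
    path('news/', views.collect_data, name='collect_data'),
]
```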

1👍

Views are for answering requests. If you need to do some crawling on an interval, you should configure a Celery task as suggested by @Rohit Jain, or, for trivial stuff, a management command called from cron or supervisor. Save the crawled data in the database and read it from the view.
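For the cron route mentioned above, assuming crawl_data() is wrapped in a custom management command named crawl_news (a hypothetical name, and the paths below are placeholders), a crontab entry could trigger it every five minutes:

```shell
# Hypothetical crontab line: run the crawl every 5 minutes.
# Virtualenv and project paths are placeholders for your own setup.
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py crawl_news
```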
