I don't know if it's going to work, but the concept is like this: you need to decide under which condition your function will stop the recursion and under which condition it will keep running.
You can modify it to your needs.
async init () {
  try {
    let result = await axios.get('/someurl')
    if (result.data.status == 200) {
      this.someData = result.data.someDataFromServer
      // keep running: execute the function recursively
      this.init()
    }
    // otherwise the recursion stops here
  } catch (error) {
    // what you are going to do when the recursion stops with an error
    console.log(error)
  }
}
I reached a compromise with my senior.
I tried both approaches and showed him the two different results.
A way)
Collect all the data from the DB via a Python mysqlclient connection.
FRONT-END
axios.get("url/")
  .then((res) => {
    if (res.data.status === 200) {
      data_table_HDR = res.data.HDR
      data_table_body = res.data.body
    }
  })
  .catch((err) => { console.log(err) })
BACKEND
connection = MySQLdb.connect(**connection_info)  # ID, PW, HOST, PORT, etc.
cursor = connection.cursor()
cursor.execute("select * from certain_table")  # pulls the entire table in one shot
data = cursor.fetchall()
resp_data = {HDR: data[0], body: data[1]}
response(data=resp_data, status=200)
With this A way, the browser screen went white with "No response" and did not work properly, so I had to kill the process via the Windows task manager.
The Python IDE I use, PyCharm, also stopped responding, so I had to kill it as well.
When I showed him this symptom, surprisingly his reaction was kind of happy.
He then suggested another way, which I'll call the "B way".
B way)
FRONT-END
let isInit = false

do_data_collect(user_input_integer) {
  if (isInit === false) {
    this.init(user_input_integer)
  } else {
    this.nextStep(user_input_integer)
  }
}
init(user_input_integer) {
  const limit_value = user_input_integer
  axios.get("url/" + limit_value)
    .then((res) => {
      if (res.data.status === 200) {
        data_table_HDR = res.data.HDR
        data_table_body = res.data.body
        isInit = true
      }
    })
    .catch((err) => { console.log(err) })
}
nextStep(user_input_integer) {
  const limit_value = user_input_integer
  // advance by one page per call
  // (the original limit_value + limit_value always gave the same offset)
  this.offset = (this.offset ?? 0) + limit_value
  axios.get("url", {
    headers: {
      token
    },
    params: {
      limit_value: limit_value,
      offset: this.offset
    }
  })
    .then((res) => {
      if (res.data.status === 200) {
        // JS arrays have no append(); concat the next chunk instead
        data_table_body = data_table_body.concat(res.data.body)
      }
    })
    .catch((err) => { console.log(err) })
}
BACKEND
limit_value = request.get("limit_value")
offset = request.get("offset")
connection = MySQLdb.connect(**connection_info)  # ID, PW, HOST, PORT, etc.
cursor = connection.cursor()
# use placeholders instead of an f-string to avoid SQL injection
cursor.execute("select * from certain_table limit %s offset %s", (limit_value, offset))
data = cursor.fetchall()
resp_data = {HDR: data[0], body: data[1]}
response(data=resp_data, status=200)
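The limit/offset paging itself can be checked without a database. Here is a minimal Python sketch (the in-memory table and the names `query_page`, `limit`, `offset` are my assumptions) that slices data the same way the LIMIT ... OFFSET query does:

```python
def query_page(table, limit, offset):
    # Mirrors "select * from certain_table limit {limit} offset {offset}".
    return table[offset:offset + limit]

table = [f"row{i}" for i in range(7)]
pages = []
offset = 0
limit = 3
while True:
    chunk = query_page(table, limit, offset)
    if not chunk:       # no more data: the ping-pong loop ends
        break
    pages.append(chunk)
    offset += limit     # advance the offset by one page per request
print(pages)
```

The key detail is that the offset must grow on every request; a fixed offset would fetch the same page forever.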
With this B way, it works like ping-pong:
the front-end requests data with a limit and an offset, and the back-end reads a file or fetches that slice from the DB.
The front-end receives the response and shows the result in a pop-up (dialog); when the user scrolls down, the API is called again, and this loops until there is no more data.
With this approach there were no browser or IDE freeze issues.
However, because data_table_body appends each next chunk, it ends up holding a huge amount of data and causes memory problems.
Consequently, showing all of a huge dataset to the end user is not a good idea.
Giving them the data as a file download would be better, I think.
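A minimal sketch of that file-download idea, assuming hypothetical `rows_in_chunks` and `export_csv` helpers: stream rows into a CSV in fixed-size chunks, so the whole table never sits in memory at once.

```python
import csv
import io

def rows_in_chunks(total, chunk_size):
    # Hypothetical stand-in for repeated fetchmany() calls on a DB cursor.
    for start in range(0, total, chunk_size):
        yield [(i, f"value{i}") for i in range(start, min(start + chunk_size, total))]

def export_csv(buffer, total=5, chunk_size=2):
    writer = csv.writer(buffer)
    writer.writerow(["id", "value"])    # header row (the HDR part)
    for chunk in rows_in_chunks(total, chunk_size):
        writer.writerows(chunk)         # write one chunk at a time

buf = io.StringIO()
export_csv(buf)
print(buf.getvalue().splitlines()[0])  # id,value
```

In a real back end, the buffer would be the HTTP response stream, so the user downloads the file while it is being produced.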
And am I alone in living this working drama?
Implement, test, and compare... every time... 🙁