ParamQuery grid support forum
General Category => Help for ParamQuery Pro => Topic started by: nzxgq0 on March 15, 2020, 04:11:10 pm
-
We have a case where a database view has 5 million records. We want to avoid retrieving all the records, as the request would time out and it would also impact server performance. With remote type and remote location we can implement everything nicely (of course, local sort/filter/paging performs better than remote) except export, which inevitably retrieves all the records in one service call.
Is there a good solution for that?
-
My idea is that the grid first sends a request to the remote server and gets a JSON string with a predefined number of records (such as 100K records) to fill the data model. When there is more data, a prompt dialog pops up and the user clicks it to retrieve the next batch, which is appended to the data model. This repeats until all the data is retrieved. If the user clicks cancel, the data model remains unchanged. The user can then sort, filter, and export locally.
So the question is:
Does pqGrid allow us to append data to the data model?
Thanks.
-
In other words, we want to divide one Ajax service call into multiple Ajax service calls (with manual confirmation) due to the large volume of records, so manipulating the data model becomes necessary.
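The batched-retrieval idea above can be sketched as plain JavaScript, independent of the grid. The `fetchPage` and `confirmNext` functions are placeholders for illustration (a real Ajax call and a confirmation dialog), not ParamQuery APIs:

```javascript
// Sketch: retrieve a large dataset in fixed-size batches, appending each
// batch to a local array until all records arrive or the user cancels.
// fetchPage(page, rpp) is assumed to resolve to { rows, total };
// confirmNext(loaded, total) is assumed to resolve to true/false.
async function loadInBatches(fetchPage, confirmNext, rpp) {
  const data = [];
  let page = 1;
  while (true) {
    const batch = await fetchPage(page, rpp); // e.g. one $.ajax call per batch
    data.push(...batch.rows);                 // append to the local model
    if (data.length >= batch.total) break;    // everything retrieved
    if (!(await confirmNext(data.length, batch.total))) break; // user cancelled
    page++;
  }
  return data;
}
```

After the loop completes, the accumulated array can be assigned to the grid's local data model so that sort, filter, and export all run locally.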
-
For large datasets, it's possible to divide one Ajax call into multiple calls and keep updating dataModel.data as more data arrives from the server.
The server-side script receives the current page and records per page from the grid and sends the requested data back.
postData: function () {
    // pqIS is the paging-state object used in the demos:
    // requestPage is the next page to fetch, rpp is records per page.
    return {
        pq_curpage: pqIS.requestPage,
        pq_rpp: pqIS.rpp
    };
},
These 2 examples use this methodology:
https://paramquery.com/pro/demos/infinite_scroll
https://paramquery.com/pro/demos/virtual_scroll
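On the server, those two parameters translate into a simple slice. A minimal sketch of that logic (the function name and response fields are assumptions for illustration, not a prescribed ParamQuery contract):

```javascript
// Sketch: server-side paging. The grid posts pq_curpage (1-based) and
// pq_rpp; the server returns only that slice, plus the total row count
// so the client knows when it has received everything.
function getPage(allRows, pq_curpage, pq_rpp) {
  const start = (pq_curpage - 1) * pq_rpp; // 1-based page -> 0-based offset
  return {
    totalRecords: allRows.length,
    data: allRows.slice(start, start + pq_rpp)
  };
}
```

In a real service the slice would be done in SQL (e.g. with OFFSET/LIMIT) rather than in memory, so only one page of the 5-million-row view is ever read per call.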
-
Thanks. That solves my problem.
By the way, I found that the local export functionality has its limits. For example, exporting 200K records as Excel works, but it fails when the record count reaches 300K. The upper limit is higher when exporting as HTML.