I am fetching data from document library which has more than 5000 items. While doing this, it takes lots of time. Is there any solution to reduce this time?

  • How many items are you fetching at a time? Use the PnP JS library, and add indexes to the columns you filter on in the API call. Commented May 15, 2023 at 11:21
  • Currently I'm fetching more than 4000 items, but over time it could be much more than that; I'm using spHttpClient to fetch the data. Commented May 15, 2023 at 11:36

1 Answer

Generic tips to improve performance:

  • Use the $filter parameter to retrieve only the items that match your criteria.
    Example: /_api/web/lists/getbytitle('Cars')/items?$filter=Manufacturer eq 'Honda'
  • Use the $select parameter to retrieve only the fields you are interested in - without it, you are requesting all fields of all items.
    Example: /_api/web/lists/getbytitle('Invoices')/items?$select=ID,Title,KV_InvoiceNumber
  • Use the $orderby parameter to sort items in the desired order. This can be useful in specific scenarios like "get the 10 most profitable projects" (combined with $top=10).
    Example: /_api/web/lists/getbytitle('Projects')/items?$orderby=KV_ProfitsEur

All of these can be chained together.
Example: /_api/web/lists/getbytitle('Projects')/items?$filter=KV_Type eq 2&$select=ID,Title&$orderby=ID
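To keep chained queries readable in SPFx code, you can build the URL from parts instead of concatenating one long string. A minimal sketch in TypeScript (the list title and KV_* field names come from the examples above; buildItemsQuery is a hypothetical helper, not part of any SharePoint API):

```typescript
// Sketch: assemble an OData query URL for a SharePoint list REST call.
// The resulting URL would be passed to spHttpClient.get() in a web part.

function buildItemsQuery(
  webUrl: string,
  listTitle: string,
  opts: { filter?: string; select?: string[]; orderby?: string }
): string {
  const params: string[] = [];
  if (opts.filter) params.push(`$filter=${opts.filter}`);
  if (opts.select && opts.select.length) params.push(`$select=${opts.select.join(",")}`);
  if (opts.orderby) params.push(`$orderby=${opts.orderby}`);
  const query = params.length ? `?${params.join("&")}` : "";
  return `${webUrl}/_api/web/lists/getbytitle('${listTitle}')/items${query}`;
}

// Reproduces the chained example above:
const url = buildItemsQuery("https://contoso.sharepoint.com/sites/dev", "Projects", {
  filter: "KV_Type eq 2",
  select: ["ID", "Title"],
  orderby: "ID",
});
// → .../items?$filter=KV_Type eq 2&$select=ID,Title&$orderby=ID
```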

If you have gotten this far and still return over 5000 items, the query cannot be optimized much further. Two more things to try:

  • try working with items in batches (i.e. pagination) - use ?$skiptoken=Paged=TRUE&$top=5000 and iterate through the results via the d.__next link in each JSON response. Good answer and example

  • index the columns you are filtering and sorting on ( Microsoft explanation and how-to guide ).
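The pagination step above can be sketched as a loop that follows d.__next until it is absent. This is a minimal TypeScript sketch, with the page fetcher injected as a callback so it can be tested without a live site; in SPFx that callback would wrap spHttpClient.get() and return the parsed JSON:

```typescript
// Shape of a verbose-OData paged response from SharePoint REST.
interface PagedResponse<T> {
  d: { results: T[]; __next?: string };
}

// Follow d.__next page by page and accumulate all items.
async function getAllItems<T>(
  firstUrl: string, // e.g. ".../items?$skiptoken=Paged=TRUE&$top=5000"
  fetchJson: (url: string) => Promise<PagedResponse<T>>
): Promise<T[]> {
  const items: T[] = [];
  let url: string | undefined = firstUrl;
  while (url) {
    const page = await fetchJson(url); // one round trip per page
    items.push(...page.d.results);
    url = page.d.__next; // undefined on the last page → loop ends
  }
  return items;
}
```

With $top=5000 each round trip returns one full page, so 20000 items cost four requests instead of one call that trips the list view threshold.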
