Reckon Accounts Hosted API Iterator Best Practices
Hi,
A month back I was asking for assistance in accessing the Reckon Accounts Hosted API. I have since created a Fabric pipeline extracting information from the various entities inside Reckon (Invoice, Accounts, etc.), and it worked fine with the dummy data provided. But since getting access to our live data, extraction has been inconsistent at best. For example, with Invoices I am using iterators to read large amounts of data, but the API sometimes returns "QBXML result not yet available. To retrieve result, try again later specifying the provided RequestId." I retry later (after about 3 minutes, by my count) and am then able to pull some data, but when continuing the iterator it stops responding entirely; the request just keeps loading in Postman.
I also wasn't able to stop an iterator I tried yesterday. Is that a factor? If so, how can I clean it up?
Any tips to make the API behave more consistently when looping through large data sets? I checked the count of our Invoices using the MetaData keyword and it returned 13,000.
Here are the request bodies that I have used.
Starting
{"FileName":"{{file_name}}","Operation":"<?xml version=\"1.0\"?><?qbxml version=\"6.1\"?><QBXML><QBXMLMsgsRq onError=\"stopOnError\"> <InvoiceQueryRq requestID=\"1\" iterator=\"Start\"> <MaxReturned>200</MaxReturned> </InvoiceQueryRq></QBXMLMsgsRq></QBXML>","UserName":"{{file_username}}","Password":"{{file_password}}"}
Continue
{"FileName":"{{file_name}}","Operation":"<?xml version=\"1.0\"?><?qbxml version=\"6.1\"?><QBXML><QBXMLMsgsRq onError=\"stopOnError\"> <InvoiceQueryRq requestID=\"1\" iterator=\"Continue\" iteratorID=\"{iterator_id}\"> <MaxReturned>200</MaxReturned> </InvoiceQueryRq></QBXMLMsgsRq></QBXML>","UserName":"{{file_username}}","Password":"{{file_password}}"}
I tried it with both v2 and v4.
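For reference, the Start/Continue pattern above can be sketched as small helper functions in Python. This is a sketch, not Reckon's official client code: the parsing assumes the response carries the standard qbXML `iteratorID` and `iteratorRemainingCount` attributes on `InvoiceQueryRs`; the HTTP call itself is left out.

```python
import xml.etree.ElementTree as ET

def build_operation(iterator_id=None, max_returned=200):
    """Build the qbXML Operation string for an InvoiceQueryRq iterator call."""
    if iterator_id is None:
        attrs = 'requestID="1" iterator="Start"'
    else:
        # Continue passes the ID via the iteratorID attribute,
        # not via a second iterator attribute.
        attrs = f'requestID="1" iterator="Continue" iteratorID="{iterator_id}"'
    return (
        '<?xml version="1.0"?><?qbxml version="6.1"?>'
        '<QBXML><QBXMLMsgsRq onError="stopOnError">'
        f'<InvoiceQueryRq {attrs}>'
        f'<MaxReturned>{max_returned}</MaxReturned>'
        '</InvoiceQueryRq></QBXMLMsgsRq></QBXML>'
    )

def parse_iterator_state(qbxml_response):
    """Extract (iteratorID, iteratorRemainingCount) from an InvoiceQueryRs."""
    rs = ET.fromstring(qbxml_response).find('.//InvoiceQueryRs')
    return rs.get('iteratorID'), int(rs.get('iteratorRemainingCount', '0'))
```

The loop then repeats Continue calls, feeding back the returned `iteratorID`, until `iteratorRemainingCount` reaches 0.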
Answers
-
-
Using an iterator is fine, but make sure you set a reasonable MaxReturned value and limit the fields you request to reduce the response size. That said, the iterator in Reckon Hosted can be unreliable at times, especially with transactions.
If you frequently need to retrieve the full list, it’s better to narrow the query by TxnDate, for example, fetching invoices weekly or fortnightly depending on your transaction volume. It’s not the most elegant approach, but it’s much more reliable and ensures you don’t miss any records.
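The date-window approach suggested above can be sketched as follows, using qbXML's `TxnDateRangeFilter` inside the `InvoiceQueryRq`. The fortnightly window length and dates are illustrative, and the element is assumed to behave as in standard qbXML:

```python
from datetime import date, timedelta

def date_windows(start, end, days=14):
    """Yield (from_date, to_date) pairs covering [start, end] in fixed windows."""
    cur = start
    while cur <= end:
        window_end = min(cur + timedelta(days=days - 1), end)
        yield cur, window_end
        cur = window_end + timedelta(days=1)

def build_windowed_query(from_date, to_date):
    """Build an InvoiceQueryRq restricted to a TxnDate range (no iterator)."""
    return (
        '<?xml version="1.0"?><?qbxml version="6.1"?>'
        '<QBXML><QBXMLMsgsRq onError="stopOnError">'
        '<InvoiceQueryRq requestID="1">'
        '<TxnDateRangeFilter>'
        f'<FromTxnDate>{from_date.isoformat()}</FromTxnDate>'
        f'<ToTxnDate>{to_date.isoformat()}</ToTxnDate>'
        '</TxnDateRangeFilter>'
        '</InvoiceQueryRq></QBXMLMsgsRq></QBXML>'
    )
```

Each window is issued as an independent request, so a failed window can be retried on its own without invalidating any iterator state.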
Phuong Do / Reckon Developer Partner
phuong@cactussoftware.com.au
https://www.youtube.com/watch?v=O61SfV2bte8
-
We're initially fetching all data in bulk; I'll try the weekly invoices approach and see if it gives a better result.
For the iterator approach, I can't seem to use the iterator to continue; it always says the iterator is not valid.
Here is the data returned by the initial iterator response, with the given iteratorID highlighted, and here is the body of the Continue operation.
I tried with and without the curly braces ({}). Am I missing something?
Will test the TxnDate approach for now and see if it yields better results.
-
Using the TxnDate is really better: no need to retry the API call, and it's more reliable overall, although the call still hangs from time to time. Is this expected?
I plan to fetch data 2 months at a time, about 300+ records per call, returning in 50 s to 1 m 20 s.
-
It really depends on how many records are returned in each response. I would recommend keeping it below 150 records for transactions, or limiting the fields that Reckon returns. Timeouts can occur for a range of reasons, so you will need to handle them in your code and retry the request if needed.
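A simple way to handle the intermittent timeouts and the "result not yet available" response described in this thread is a capped retry loop. A minimal sketch, where `fn` stands in for your actual HTTP poll (a hypothetical placeholder, not a Reckon API function):

```python
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=30, cap=180):
    """Call fn() until it returns a non-None result, sleeping between tries.

    The delay doubles on each attempt (30 s, 60 s, 120 s, ...) but is capped,
    which lines up with the roughly 3-minute wait observed above before
    results became available.
    """
    delay = base_delay
    for attempt in range(max_attempts):
        result = fn()
        if result is not None:
            return result
        time.sleep(delay)
        delay = min(delay * 2, cap)
    raise TimeoutError(f"no result after {max_attempts} attempts")
```

Pairing this with small date windows means each retry only re-requests one window's worth of records rather than restarting a whole iterator.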
Phuong Do / Reckon Developer Partner
phuong@cactussoftware.com.au
https://www.youtube.com/watch?v=O61SfV2bte8



