General "if you were me, what would you do?" type question.
I am integrating transactional data from external software into Dynamics GP (Great Plains) using SmartConnect.
The external system will create a CSV file and upload it to an SFTP site every two hours.
Current process - SmartConnect downloads the file and loads the data into a SQL staging table - a SQL stored procedure then runs to manipulate and validate the data.
SmartConnect then connects to the staging table and integrates the data into GP.
The external system is an AR module where our customers use a payment portal to pay invoices, apply credits to invoices, etc.
I am currently thinking of creating a historical staging table to store all the data we integrate from the external system into Great Plains, along with a SQL job that simply empties the staging table each night.
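For what it's worth, here is a minimal sketch of what that nightly job might run. All table and column names (ARStaging, ARStagingHistory, and their columns) are hypothetical placeholders - adjust to your actual staging schema:

```sql
-- Hypothetical schema: archive integrated rows, then clear them from staging.
BEGIN TRANSACTION;

-- Copy rows SmartConnect has already integrated into the history table
INSERT INTO dbo.ARStagingHistory (TransactionID, CustomerID, Amount, IntegratedGP, ArchivedDate)
SELECT TransactionID, CustomerID, Amount, IntegratedGP, GETDATE()
FROM dbo.ARStaging
WHERE IntegratedGP = 1;

-- Remove only the archived rows; unintegrated rows stay for retry/review
DELETE FROM dbo.ARStaging
WHERE IntegratedGP = 1;

COMMIT TRANSACTION;
```

One design point in that sketch: deleting only the IntegratedGP = 1 rows, rather than truncating the whole table, means anything that failed validation stays visible in staging instead of silently disappearing overnight.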
My question is - if you were responsible for this project's design, would you include a historical SQL table holding every transaction ever integrated, or would you just clear the staging table before each integration (or at end of day, or similar)?
We expect 80-100 credit card / eCheck-type transactions per day.
I appreciate any and all opinions - my current leaning is to include a historical table in the process.
Note - the staging table does have an integer column 'IntegratedGP' that indicates a transaction has been integrated, so theoretically the staging table could simply be allowed to grow and the IntegratedGP column used to flag historical records. I'm not certain whether that's a better alternative than a separate table.
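If you did go the single-table route, the pattern would look something like this (again with hypothetical names - ARStaging and its columns are placeholders). SmartConnect's data source would filter on the flag, and a filtered index would keep that lookup fast even as the table grows:

```sql
-- Single-table alternative: the SmartConnect data source reads only rows
-- not yet integrated; everything else stays in place as history.
SELECT TransactionID, CustomerID, Amount
FROM dbo.ARStaging
WHERE IntegratedGP = 0;

-- A filtered index on the pending rows keeps the integration query cheap
-- no matter how large the accumulated history gets.
CREATE NONCLUSTERED INDEX IX_ARStaging_Pending
ON dbo.ARStaging (TransactionID)
WHERE IntegratedGP = 0;
```

At 80-100 transactions a day the table would only grow by ~30k rows a year, so either approach performs fine; the separate history table mainly buys you a cleaner separation between "in flight" and "done".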