Let’s review the six major enhancements for advanced lease accounting…
- 1. Period end close capabilities. TRIRIGA 10.5.3 delivers “Advanced Lease Accounting Period Close,” which ensures that no additional journal entries can be posted to a closed period…
- 2. Enhanced segregation of duties. With IBM TRIRIGA 10.5.3, the roles and responsibilities of Lease Accountant and Lease Administrator have been clearly delineated…
- 3. Streamlined modification processing. This version introduces a revised process for recording lease modifications that supports FASB/IFRS compliance and creates a single process flow for all modifications…
- 4. Disclosure requirements templates (quantitative metrics). IBM TRIRIGA 10.5.3 comes pre-loaded with a variety of out-of-the-box (OOB) report templates that support FASB/IASB lease accounting compliance…
- 5. Journal Entry Configuration Framework. With capabilities to generate Journal Entries for ASC 840 and ASC 842 under US-GAAP, as well as for IAS 17 and IFRS 16, IBM TRIRIGA 10.5.3 practically transforms the Real Estate Manager module into a sub-ledger system for Real Estate and Asset Lease Accounting…
- 6. Simplified ERP Integration (Data Transfer Object). IBM TRIRIGA 10.5.3 simplifies the process of transferring the lease accounting data into your ERP system by creating a Data Transfer Object (DTO) file that loads into the ERP General Ledger…
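In essence, the DTO approach reduces ERP integration to exporting journal entries as a flat file the General Ledger can consume. A purely illustrative sketch of that idea in Python — the field names and CSV layout here are hypothetical, not the actual TRIRIGA DTO schema:

```python
import csv
import io

def write_gl_dto(journal_entries):
    """Serialize journal-entry dicts to a CSV-style flat file in memory.

    The column names are illustrative only, not the TRIRIGA DTO layout;
    a real export would use the fields your ERP's GL import expects."""
    fields = ["journal_id", "gl_account", "debit", "credit", "period"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    for entry in journal_entries:
        # Missing keys become empty cells, matching typical flat-file imports.
        writer.writerow({k: entry.get(k, "") for k in fields})
    return buf.getvalue()
```

The resulting string can be written to disk and handed to whatever batch loader the ERP side provides.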
Is there a maximum number of rows that should be uploaded with an integration object?
We are loading the records into an intermediate object. Once all intermediate records are created, a separate process validates the data and adds the live records (essentially like DTOs). The intermediate records use a simple null-to-triActive state transition and straight field mappings, with no validate or transform workflows, so in theory the performance impact is about as minimal as it can be.
Is there any guidance on the maximum row count before the platform starts to have issues? We are seeing that the IO can take several hours to dump the rows into the intermediate records when the upload file has 25,000 rows in it. Does this seem about right?
I would expect 25,000 rows in an upload file to take several hours. Each record still goes through some level of platform processing as it is loaded into TRIRIGA.
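Since each row incurs per-record platform processing, one common mitigation is to split a very large upload file into smaller batches and run the integration object against each in turn. A minimal sketch in Python — the chunk size of 5,000 is an arbitrary assumption for illustration, not a TRIRIGA recommendation:

```python
import csv

def split_upload_file(path, chunk_size=5000):
    """Split a large upload CSV into smaller files, repeating the header row
    in each chunk so every part is a valid upload file on its own.

    Returns the list of chunk file names written."""
    chunk_paths = []
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        rows = []
        for row in reader:
            rows.append(row)
            if len(rows) == chunk_size:
                chunk_paths.append(_write_chunk(path, len(chunk_paths) + 1, header, rows))
                rows = []
        if rows:  # final partial chunk
            chunk_paths.append(_write_chunk(path, len(chunk_paths) + 1, header, rows))
    return chunk_paths

def _write_chunk(path, idx, header, rows):
    out_path = f"{path}.part{idx}.csv"
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return out_path
```

Smaller batches also make it easier to isolate a bad row when a load fails partway through.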
We’re using an integration object to import data from a staging DB (populated by an external system) into a DTO that consists of six text fields. There should be no associations between the DTO and any other record, as the data in the text fields is used to identify records and build associations when the DTO is processed manually after the integration object has run…
However, when we run the integration object, using the Execute action on the form, we get the following line in the log for every record:
2016-08-18 11:18:29,912 WARN [com.tririga.ws.TririgaWSImpl](WFA:16722316 - 15534574 triExecute:378233139 IE=378233139) com.tririga.ws.errors.InvalidArgumentException: Invalid association: associated record id should be greater than 0, is -1
We are getting the same error for another integration object that provides similar functionality (loading Text/Number fields into a DTO). I have tried both clearing the map and rebuilding it manually, and using the ReMap action on the Integration Object form. Looking in server.log, the maps appear to be rebuilt successfully for all integration objects.
The only thing I can think of is that we are not using the Base Parent field in our map, so it could be trying to build an association to the base parent when it is not required. I have attached a screenshot of the map for one of our integration objects for reference. It is worth noting that the records are created successfully and the data is mapped as expected, aside from the “noise” in server.log.
I’m trying to import Working Hours and Holidays through an integration object, but it seems that the System module is not available in the Data Map mapping tab.
- For the File scheme: when System is selected in the Module drop-down list, the BO drop-down list is empty.
- For the File-to-DC scheme: The staging tables were created, and the Job Control BO was also created, but the System module wasn’t in the Module drop-down list.
Is this a bug? If so, do I have to use Data Integrator or a DTO, or insert the staging table and DC_JOB data using a SQL script?
This looks like a valid defect. Log a PMR with the following info, and we’ll be able to run with it.
I have a workflow that iterates through 30K records in the DTO and I get the error below. Has anyone seen this, and does it mean I need to import fewer records into the DTO to prevent the Oracle error?
2015-05-22 19:29:14,378 WARN [com.tririga.platform.workflow.runtime.WFProcessor](WebContainer : 4) Continue processing workflow after problem detected in TaskStep: Query(22) WFTID=26655548.5 TSID=196665 Label='Query Status CL' FormulaRecalc='Recalculate as Needed' EventAction='' StepInstance: WFIID=366663174642091 SID=196665 UserEvent=' ' SO=-1 Results=0 Sum=0 Status='FAILED'. Root cause: java.sql.SQLSyntaxErrorException: ORA-01795: maximum number of expressions in a list is 1000
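ORA-01795 is Oracle's hard cap of 1,000 expressions in a single IN (…) list, so any query built over all 30K record IDs at once will fail regardless of how the DTO was loaded. The usual workaround is to process the IDs in batches of at most 1,000 — in TRIRIGA terms, looping the workflow over smaller result sets. A minimal illustration of the batching logic in Python; the query-building helper is hypothetical and only shows the shape of the fix:

```python
def batched(ids, limit=1000):
    """Yield successive slices of at most `limit` ids (Oracle's IN-list cap)."""
    for start in range(0, len(ids), limit):
        yield ids[start:start + limit]

def build_in_clauses(column, ids, limit=1000):
    """Build one IN (...) predicate per batch so no single list exceeds the cap.

    Hypothetical helper for illustration; joining the predicates with OR
    keeps everything in one statement while staying under the limit."""
    return [f"{column} IN ({', '.join(str(i) for i in batch)})"
            for batch in batched(ids, limit)]
```

For example, `" OR ".join(build_in_clauses("spec_id", ids))` turns 30K IDs into 30 predicates of 1,000 IDs each instead of one oversized list.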