I am uploading a tab-delimited file and writing the data into a staging table using a TRIRIGA Integration Object. While uploading the file, I need to restrict the number of lines that can be loaded into the staging table. I am using DataConnect and a custom task to read the file from the binary field, but I am not able to read the file. How can we read the file from the integration object?
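For the line-limit part of this question, here is a minimal, self-contained sketch of the parsing logic. It assumes you have already obtained the file content as a character stream (a TRIRIGA custom task receives a JDBC `Connection`, so the binary field's content can typically be read as a stream from the appropriate table, which varies by environment and record). The class and method names below are illustrative, not TRIRIGA API:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class TabFileReader {

    /**
     * Reads tab-delimited rows from the given reader, refusing to accept
     * more than maxLines lines. Throws before the limit is exceeded, so
     * the caller can reject the upload without touching the staging table.
     */
    public static List<String[]> readRows(Reader source, int maxLines) throws IOException {
        List<String[]> rows = new ArrayList<>();
        BufferedReader in = new BufferedReader(source);
        String line;
        while ((line = in.readLine()) != null) {
            if (rows.size() >= maxLines) {
                throw new IOException("Upload exceeds the allowed limit of " + maxLines + " lines");
            }
            rows.add(line.split("\t", -1)); // -1 keeps trailing empty columns
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        // Example input: two tab-delimited lines, well under the limit
        List<String[]> rows = readRows(new StringReader("a\tb\nc\td\n"), 10);
        System.out.println(rows.size());
    }
}
```

Failing fast on the line count, before any row is written, keeps the staging table clean when an oversized file is rejected.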
So I just learned that I can’t use the Object Migration tool to migrate record data between two TRIRIGA environments. For example, I have two environments on different servers with the same application and platform versions. If I try to use OM to migrate the Record Data only, for instance, the Building Equipment records, not all of the associated records get migrated, and certain smart sections are not properly migrated either.
What are some other options that I could use to quickly migrate this data? I was thinking of the Data Integrator (DI) method, but that would be tedious because I have over 100,000 records.
Ideally, DI should be used for the initial load. If the data is available somewhere else, you can look into Integration Object or DataConnect. You can populate staging tables and then run the integration. In your workflow, you can have logic to create any dependent records (such as organizations or contacts) based on the staging table data.
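The staging-table approach described above can be sketched in plain JDBC. This is not TRIRIGA API: the table and column names you would actually use are the `S_` staging table and `DC_` control columns generated for your integration object, so everything below is a placeholder:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Collections;
import java.util.List;

public class StagingLoader {

    /** Builds a parameterized INSERT for a staging table and its column list. */
    public static String buildInsert(String table, List<String> columns) {
        String placeholders = String.join(",", Collections.nCopies(columns.size(), "?"));
        return "INSERT INTO " + table + " (" + String.join(", ", columns)
                + ") VALUES (" + placeholders + ")";
    }

    /** Batch-inserts rows; each row's values must match the column list in order. */
    public static void load(Connection con, String table, List<String> columns,
                            List<List<Object>> rows) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(buildInsert(table, columns))) {
            for (List<Object> row : rows) {
                for (int i = 0; i < row.size(); i++) {
                    ps.setObject(i + 1, row.get(i));
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```

Once the staging rows are in place, running the integration lets the asynchronous workflow pick them up, and that workflow is where the logic to create dependent records (organizations, contacts, and so on) belongs.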
We have already upgraded our platform to the 3.5.x level. We are currently in the process of upgrading our application from 10.3.2 to 10.5.2.
For the application upgrade, we have set up a staging environment with an initial install of 10.5.2, and we have configured all BOs, forms, and other objects to match our current customization. My question is: What if we import the IBM upgrade OM packages (sequentially from 10.4 to 10.5.2) into our current environment (which has all the customization)? It would definitely overwrite all the customization and configuration, but does it affect the record data as well (e.g., lease records)?
When it overwrites the customization at the BO and form level, would it corrupt the record data, since some of the custom fields on the records won’t exist at the BO level anymore? And what happens after we import all our customization back into the current environment from the staging environment?
The short answer is: You wouldn’t apply the IBM upgrade OM packages. Instead, you’d build OMs in your now customized 10.5.2 environment and then apply them to your current environment.
We are in the process of upgrading to TRIRIGA 10.5.3/3.5.3. We are also importing the triFoodServiceLineItem business object from the modified environment (10.5.0.1/3.5.3) to the new staging environment. But when we import the business objects from the modified environment to the staging environment, we see database errors. Can someone advise us on what the issue could be?
L_TRIFOODSERVICELINEITEM is the language table for the triFoodServiceLineItem business object. It sounds like the import is unable to create this table in the target, most likely because it already exists there. I suggest that in the target environment, you open triFoodServiceLineItem in the Data Modeler, revise the BO, and republish it. Hopefully, that will fix it.
[Admin: To see other related posts, use the Upgrade tag.]
What is the “Integration” check box for in the workflow Start task?
Assuming you are referring to the Start task of an asynchronous workflow, when this property is selected, the workflow is used to migrate data from staging tables into IBM TRIRIGA records. This type of workflow is used extensively in IBM TRIRIGA DataConnect.
Check out this IBM Knowledge Center topic about DataConnect that describes the “Integration” check box: Workflow task settings.
In my current project, there was a suggestion to extract (updated) data from TRIRIGA, with a high frequency, and import it into some kind of data warehouse (DW) or business intelligence (BI) solution. Then, from there, perform more advanced reporting and analytics. Have other TRIRIGA solutions implemented something similar? Are there any TRIRIGA best practices or recommendations for staging area, extract-transform-load (ETL), DW, or BI reporting solutions?
I have created an object migration (OM) package with its workflow. The execution works well in the Development environment, but after importing the OM package with all of the needed objects, the execution didn’t work in the Test environment. The object migration launches. The triIntegration workflow launches. The SQL request executes correctly in SQL Server. The connection in my object migration works.
But there is no row in the staging table S_CSTPHINTERMARCHECONTRAT. Also, I see in the logs:
Calling SQL: [INSERT INTO S_CSTPHINTERMARCHECONTRAT(DC_JOB_NUMBER, DC_CID, DC_SEQUENCE_ID, DC_STATE, DC_ACTION, DC_GUI_NAME, TRIIDTX, CSSTPHHPIDRATTTX, CSTPHRETIRETX) VALUES (?,?,?,?,?,?,?,?,?)] with params[402, 0, 1, 1, 4, cstPHInterMarcheContrat, 2013/M0166, 101GT, ]
I found the problem: the configuration of the integration object pointed at the Development environment and not the Test environment.
[Admin: To see other related posts, use the DataConnect tag.]