Does anyone have experience with project integration (data transfer) from an existing external project through any IBM TRIRIGA integration module? Our templates and mappings have been identified, but the question is how to reproduce the overall standard behavior that occurs when a project is created. This seems hard. For the project to be consistent in TRIRIGA, we think we need to be consistent with the following:
- The project and its direct associations with business objects in scope
- Associated purchase orders
- Budgets and their associated cost codes
- Financial data and their associated computations
Any help or shared experience would be great, even if the answer is that it's too hard.
Unless the user has detailed product knowledge, including a fundamental understanding of all of the application functionality, it would really be best to engage IBM Services to implement this integration. It's not impossible, but it would need a lot of care.
[Admin: To see other related posts, use the Integration tag.]
I’m using the TRIRIGA integration object (File method) to import data into the space BO. I created the Data Map properly, but my records are not importing because of the following error:
“Could not get recordId for smartSection[triCurrentSpaceClass] on row, column with value. Record was not saved.”
Even though I selected the Smart Section filter and mapped it to triNameTX, the integration object fails. Any thoughts?
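One thing worth checking: this error usually means the platform could not resolve a record ID from the value supplied for the smart section, typically because the value in the file does not exactly match the triNameTX of an existing record behind triCurrentSpaceClass. A quick database check can confirm whether a matching record exists. This is only a hedged sketch: the table name and sample value below are assumptions (TRIRIGA generally stores each BO in a T_<BO NAME> table), so verify them against your own schema.

```sql
-- Hedged sketch: confirm that a record exists whose triNameTX exactly matches the
-- value in the import file. T_TRISPACECLASSCURRENT is an assumed table name and
-- 'Office' is a sample value; adjust both for your environment.
SELECT spec_id, trinametx
FROM   t_trispaceclasscurrent
WHERE  trinametx = 'Office';
```

If no row comes back, correct the value in the file (or the smart section filter mapping) so that it matches an existing record exactly.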
[Admin: To see other related posts, use the Integration Object tag.]
Our customer has seen an issue when installing TRIRIGA 3.5.3 (Linux, Server build number: 276955) on an existing database (on 3.4.2 / 10.4.2). Everything goes well until the server is started. Generally, TRIRIGA runs a database upgrade on the first startup when a build number difference is detected.
In the OM log, we noticed that TRIRIGA tried to import the upgrade OM packages… The import process started with the triPlatformObjectLabelManager package, but it failed to import a navigation item that is newly created for the Object Label Manager. I haven't found any log that explains this failure. I checked the NAV_ITEM table, and this navigation item wasn't there before the upgrade process. All of the other packages are then stuck in a pending status, and nothing happens after "Creating package from Zip file". This behavior causes a lot of SQL update failures.
Meanwhile, on our Dev environment (Windows, Server build number: 279835), the upgrade went very well. You can see the difference in the logs; the OM log was set to the "Debug" level on both servers. Note that the build number is slightly different between these two environments. Have you seen this kind of issue? Where can I find more details about the navigation item import failure?
[Admin: This post is related to the 02.17.17 post and 05.19.16 post about inconsistent OM validation results. To see other related posts, use the Object Migration tag or Upgrade tag.]
We have upgraded the TRIRIGA platform to 18.104.22.168 and started upgrading the application from 10.2 to 10.5.2 in incremental order (10.3, 10.3.1, and so on up to 10.5.2). To minimize the outage and complexity during the production implementation, it has been suggested that we take a final OM package after completing the 10.5.2 deployment and reapply all of the customizations that might have been impacted by the upgrade. This final OM package would contain all of the changes from 10.2 to 10.5.2.
Our question is about the patch helpers: Can we run all of the patch helpers (from 10.3 to 10.5.2, in order) after importing the final OM package?
Also, we are running the Varchar-to-Numeric script before importing the application upgrade packages. The script takes a long time (almost a day in two test environments), but in another environment it has been running for more than two days and still has not completed. Is it normal for this script to run that long, or is it a sign of a problem? There are no differences between the environments.
I wouldn’t recommend doing the upgrade in one package. It usually ends up being quite large and will cause issues. The IBM-recommended way is to import each OM package in turn and then run its patch helpers. Once you have imported the OOB OM packages, you can have one OM package that contains your custom objects…
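On the long-running Varchar-to-Numeric script: as a rough sanity check (assuming an Oracle database; the view below is standard Oracle rather than anything TRIRIGA-specific), you can see whether the session running the script is still making progress or appears stuck:

```sql
-- Hedged sketch: list long-running operations that have not yet finished.
-- If SOFAR keeps increasing between runs, the script is still working; if nothing
-- moves for hours, look for blocking locks or open a PMR.
SELECT sid, serial#, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 1) AS pct_done,
       time_remaining
FROM   v$session_longops
WHERE  totalwork > 0
  AND  sofar < totalwork;
```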
[Admin: This post is related to the 10.25.17 post and 04.28.17 post about running “SetVarcharColsToNumeric” scripts. To see other related posts, use the Scripts tag.]
Is it possible to download the imported OM package from the front end or server?
Yes. Go to Tools > Object Migration. Locate the OM package, click the “Copy Package” icon link to the left of the OM name, then in the upper-right, click “Export”. Give it a name, click “Export”, then it will ask you to “Open” or “Save”. I would “Open” the ZIP to see where it was stored. There you have it!
[Admin: To see other related posts, use the Object Migration tag.]
I encountered a bug with classification (CL) field types. I have a classification BO with a publish name composed of (ID – Name). I have a number of records where the Name field is about 150 characters long, so the full publish name is more than 100 characters.
- First problem: I went to a form that references that classification via an associated CL field. After I selected the value, the displayed value was truncated to fewer than 100 characters. However, when I went to the Association tab of that form, the association was correct.
- Second problem: When I imported data via Data Integrator (DI), I made sure that the CL field contained the full path, which is more than 100 characters. DI gave no errors after the import. I opened the record to verify that the CL field was populated, but it had not been updated and was left null. I had to manually select the value in the CL field to create the correct association.
Question: How do I import data with CL fields that have more than 100 characters?
I am not sure how to import data with CL fields that are more than 100 characters, but if you feel you have encountered some bugs, please submit a PMR.
[Admin: To see other related posts, use the Classifications tag or Character tag.]
I ran a large data import into TRIRIGA, but there were not enough resources for it to complete, so I had to stop TRIRIGA and truncate all events to stop the import. Now the database log keeps growing and is crashing the database. Is there a way to clean up and make TRIRIGA stable again?
If this is Microsoft SQL Server, this may be related to the following SQL Server defect: SQL Server crashes when the log file of tempdb database is full in SQL Server 2012 or SQL Server 2014.
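Before shrinking anything, it can help to confirm how full the logs are and why they cannot be truncated. A minimal sketch, assuming SQL Server and a TRIRIGA database named TRIDATA (a placeholder name):

```sql
-- Hedged sketch: check log usage and the reason the log cannot be reused.
DBCC SQLPERF(LOGSPACE);                          -- log size and percent used per database

SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM   sys.databases
WHERE  name IN ('TRIDATA', 'tempdb');            -- 'TRIDATA' is a placeholder database name

-- If log_reuse_wait_desc shows LOG_BACKUP under the FULL recovery model, take a
-- transaction log backup before shrinking; under SIMPLE, a CHECKPOINT usually
-- lets the space be reused. Only then shrink, for example:
-- DBCC SHRINKFILE (N'TRIDATA_log', 10240);      -- target size in MB; the logical file name is an assumption
```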
[Admin: To see other related posts, use the SQL tag or “sql server” search phrase.]
We are seeing an Oracle error during OM package import related to the number of open cursors. We checked the documentation, but there is no recommendation for what the cursor limit should be for large OM package imports. We currently have open_cursors set to 500. Has anyone seen this issue or performed a successful upgrade with large OM imports? What open_cursors value are other clients using?
I checked with one of the senior TRIRIGA architects, and he said he would recommend setting open_cursors to at least 2000 for an OM package import. I hope this information helps!
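For reference, a minimal sketch of how to check the current limit and raise it, assuming an Oracle database (the value 2000 follows the recommendation above and requires DBA privileges):

```sql
-- Hedged sketch: check the configured open_cursors limit and current usage,
-- then raise the limit before a large OM package import.
SELECT value FROM v$parameter WHERE name = 'open_cursors';

SELECT MAX(s.value) AS highest_open_cursors_in_use
FROM   v$sesstat s
JOIN   v$statname n ON s.statistic# = n.statistic#
WHERE  n.name = 'opened cursors current';

ALTER SYSTEM SET open_cursors = 2000 SCOPE = BOTH;  -- SCOPE=BOTH assumes an SPFILE is in use
```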
[Admin: To see other related posts, use the Object Migration tag or Upgrade tag.]
We have already upgraded our platform to 22.214.171.124. We are currently in the process of upgrading our application from 10.3.2 to 10.5.2.
For the application upgrade, we have set up a staging environment with an initial install of 10.5.2, and we have configured all BOs, forms, and other objects to match our current customizations. My question is: What if we import the IBM upgrade OM packages (sequentially, from 10.4 to 10.5.2) into our current environment (which has all of the customizations)? It would definitely overwrite all of the customization and configuration, but does it affect the record data as well (e.g., lease records)?
When it overwrites the customizations at the BO and form level, would it corrupt the record data, since some of the custom fields on the records would no longer exist at the BO level? And what happens after we import all of our customizations back into the current environment from the staging environment?
The short answer is: You wouldn’t apply the IBM upgrade OM packages. Instead, you’d build OM packages in your now-customized 10.5.2 environment and then apply them to your current environment.
[Admin: To see other related posts, use the Object Migration tag or Upgrade tag.]