How do you import projects through IBM TRIRIGA integration?


Does anyone have experience with project integration (data transfer) from an existing external project through any IBM TRIRIGA integration module? Our templates and mapping have been identified, but the question is about reproducing the overall standard behavior when a project is created. This seems hard. For the project to be consistent in TRIRIGA, we think we need to be consistent with the following:

  • Project and direct associations with business objects in scope
  • Associated purchase orders
  • Budgets and their associated cost codes
  • Financial data and their associated computations

Any help and experience would be great, even if the answer is it’s too hard.

Without detailed knowledge on the user’s part, such as a fundamental understanding of all of the application functionality, it would really be best to engage IBM Services to implement this integration. It’s not impossible, but it would need a lot of care.

[Admin: To see other related posts, use the Integration tag.]


Why aren’t Group record changes copied through object migration?


The short answer is that IBM TRIRIGA sees this as an unsupported customization of the Group record. Let’s clarify this further. Even though technically, behind the scenes, Groups are record data, they are currently considered TRIRIGA platform-owned, and therefore controlled, BOs (business objects).

The platform controls exactly what Group data object migration (OM) can export and import. Thus, any fields added to the Group BO will not be recognized by OM when exporting or importing Group records. Modifications to any platform-owned and controlled BOs are not supported, and this applies to more than just the Group BO.

If the BO is a platform-controlled object and any changes are not supported, then why does the platform currently allow changes to it?

IBM TRIRIGA currently does not prevent users from modifying any BOs, even the ones that are specifically necessary for core platform functionality. The Group BO, Document BO, and triPlatformObjectLabelManager BO are just a few examples. Although the platform does nothing to prevent users from modifying these BOs, such modifications are not supported.

For these core platform BOs, the object migration tool is designed to pull exactly what it needs for the designed platform functionality when exporting or importing the record data tied to these BOs. In other words, any modifications will compromise TRIRIGA platform integrity, which is why modifying them is an unsupported action.

The wiki on Core objects in TRIRIGA Application Platform functionality details the core platform business objects that should not be modified. Meanwhile, for the expressed requirement to see Group modifications exported/imported with Group record data, a request for enhancement (RFE) was submitted and will be considered for a potential platform change in a future TRIRIGA release.

[Admin: This post is related to the 11.07.17 post about core objects you shouldn’t modify. To see other related posts, use the Groups tag or Object Migration tag.]


Why does the integration object fail with a smart section filter?


I’m using the TRIRIGA integration object (File method) to import data into the space BO. I created the Data Map properly, but my records are not importing because of the following error:

“Could not get recordId for smartSection[triCurrentSpaceClass] on row[1], column[6] with value[3]. Record was not saved.”

Even though I selected the Smart Section filter and mapped it to triNameTX, the integration object fails. Any thoughts?
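The error indicates that the value in column 6 (“3”) does not resolve to an existing triCurrentSpaceClass record via the mapped triNameTX field, so no recordId can be found and the row is discarded. One way to catch this before running the integration object is to pre-validate the file. This is a minimal sketch, assuming a comma-delimited file and a small hard-coded set of valid names; in practice the valid names would come from the existing classification records, and everything here besides the column index from the error message is illustrative:

```python
import csv

# Assumed set of valid triNameTX values; in practice, query the existing
# triCurrentSpaceClass records for these.
VALID_SPACE_CLASS_NAMES = {"Office", "Conference Room", "Storage"}
SMART_SECTION_COLUMN = 6  # 1-based column index, taken from the error message

def find_unresolvable_rows(path, valid_names, column):
    """Return (row_number, value) pairs whose smart-section value will not
    resolve to an existing record, i.e. the rows that would fail with a
    "Could not get recordId" error and not be saved."""
    bad = []
    with open(path, newline="") as f:
        for row_num, row in enumerate(csv.reader(f), start=1):
            value = row[column - 1].strip()
            if value not in valid_names:
                bad.append((row_num, value))
    return bad
```

Running this against the import file would surface row 1’s stray “3” (and any other non-matching values) before TRIRIGA rejects them one at a time.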

[Admin: To see other related posts, use the Integration Object tag.]


Having issues with OM packages & nav items during upgrade to 3.5.3


Our customer has seen an issue when installing TRIRIGA 3.5.3 (Linux, Server build number: 276955) on an existing database (on 3.4.2 / 10.4.2). Everything goes well until starting up the server. Generally, TRIRIGA will run a database upgrade on the first startup when a build number difference is detected.

In the OM log, we noticed that TRIRIGA tried to import the upgrade OM package… The import process started with the triPlatformObjectLabelManager package, but it failed to import a navigation item that was newly created for the Object Label Manager. I haven’t found any log that explains this failure. I’ve checked the NAV_ITEM table; this navigation item wasn’t there before the upgrade process. All of the other packages are then stuck in a pending status. Nothing happens after “Creating package from Zip file”. This behavior causes a lot of SQL update failures.

Meanwhile, on our Dev environment (Windows, Server build number: 279835), the upgrade went very well. You can find the difference in the logs. The OM log was set to the “Debug” level on both servers. Note that the build number is slightly different between these two environments. Have you seen this kind of issue? Where can I find more details about the navigation item import failure?

[Admin: This post is related to the 02.17.17 post and 05.19.16 post about inconsistent OM validation results. To see other related posts, use the Object Migration tag or Upgrade tag.]


Can you run all patch helpers (10.3 to 10.5.2) after final OM import?


We have upgraded the TRIRIGA platform to 3.5.2.3 and started upgrading the application from 10.2 to 10.5.2 in incremental order (10.3, 10.3.1, and so on, up to 10.5.2). To minimize the outage and complexity during the production implementation, it was suggested that we take a final OM package after completing the 10.5.2 deployment and apply all of the customizations that might have been impacted by the upgrade. This final OM package will contain all of the changes from 10.2 to 10.5.2.

Our question is on the patch helpers: Can we run all the patch helpers (from 10.3 to 10.5.2 in order) after importing the final OM package?

Also, we are running the Varchar-to-Numeric script before importing the application upgrade packages. This script takes a long time (almost a day in two test environments), but when we tried it in another environment, it ran for more than 2 days and still did not complete. Is it normal for this script to run like that? Or will it be an issue? There are no differences between the environments.

I wouldn’t recommend doing the upgrade in one package. Usually, it ends up being quite large, and that will cause issues. The IBM-recommended way is to import each OM package, then run the patch helpers. Once you have imported the OOB OM packages, you can have one OM package that contains your custom objects…

[Admin: This post is related to the 10.25.17 post and 04.28.17 post about running “SetVarcharColsToNumeric” scripts. To see other related posts, use the Scripts tag.]


Is there a way to download the imported OM package from server?


Is it possible to download the imported OM package from the front end or server?

Yes. Go to Tools > Object Migration. Locate the OM package, click the “Copy Package” icon link to the left of the OM name, then in the upper-right, click “Export”. Give it a name, click “Export”, and it will ask you to “Open” or “Save”. I would “Open” the ZIP to see where it was stored. There you have it!

[Admin: To see other related posts, use the Object Migration tag.]


How do you import CL fields that have more than 100 characters?


I encountered a bug with classification (CL) field types. I have a classification BO whose publish name is composed of (ID – Name). I have a number of records where the Name field is about 150 characters long, so the full length of the publish name is more than 100 characters.

  • Problem: I went to a form that references that classification via an associated CL field. After I selected the value, I noticed that the displayed value was truncated to less than 100 characters. However, when I went to the Association tab of that form, it had the correct association.
  • Second Problem: When I imported data via Data Integrator (DI), I made sure that the CL field had the full path, which is more than 100 characters. DI gave no errors after the import. I opened the record to verify that the CL field was populated, but it had not been updated and was left null. I had to manually select the value in the CL field to associate it correctly.

Question: How do I import data with CL fields that have more than 100 characters?

I am not sure how to import data with CL fields that are more than 100 characters, but if you feel you have encountered some bugs, please submit a PMR.
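Until such a bug is resolved, one stopgap is a pre-import check that flags the rows whose CL value exceeds the 100-character length described above, so those records can be associated manually instead of silently importing as null. This is a minimal sketch, assuming a tab-delimited DI upload file; the column name in the usage below and the file layout are assumptions for illustration:

```python
import csv

CL_PATH_LIMIT = 100  # truncation length observed in the post

def flag_long_cl_values(path, cl_column_name, limit=CL_PATH_LIMIT):
    """Return (line_number, length) pairs for rows whose CL publish-name
    path exceeds the limit and is therefore likely to import as null."""
    flagged = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f, delimiter="\t")  # assuming tab-delimited
        for line_num, row in enumerate(reader, start=2):  # header is line 1
            value = (row.get(cl_column_name) or "").strip()
            if len(value) > limit:
                flagged.append((line_num, len(value)))
    return flagged
```

For example, `flag_long_cl_values("upload.txt", "triSpaceClassCL")` (a hypothetical file and column name) would list the line numbers whose classification paths need manual association after the import.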

[Admin: To see other related posts, use the Classifications tag or Character tag.]

Continue reading