Can you run all patch helpers (10.3 to 10.5.2) after final OM import?


We have upgraded the TRIRIGA platform to 3.5.2.3 and started upgrading the application from 10.2 to 10.5.2 in incremental order (10.3, 10.3.1, and so on, up to 10.5.2). To minimize the outage and complexity during the production implementation, it has been suggested that we take a final OM package after completing the 10.5.2 deployment and then apply all the customizations that might have been impacted by the upgrade. This final OM package would contain all the changes from 10.2 to 10.5.2.

Our question is on the patch helpers: Can we run all the patch helpers (from 10.3 to 10.5.2 in order) after importing the final OM package?

Also, we are running the Varchar-to-Numeric script before importing the application upgrade packages. This script takes a long time (almost a day in two test environments), but when we tried it in another environment, it ran for more than two days and still didn’t complete. Is it normal for this script to run this long? Or will it be an issue? There are no differences between the environments.

I wouldn’t recommend doing the upgrade in one package. Usually, such a package ends up being quite large, which causes issues. The IBM-recommended way is to apply each OM package in sequence and then run its patch helpers. Once you have imported the out-of-the-box (OOB) OM packages, you can have one OM package that contains your custom objects…

[Admin: This post is related to the 10.25.17 post and 04.28.17 post about running “SetVarcharColsToNumeric” scripts. To see other related posts, use the Scripts tag.]



How do you migrate record data quickly between environments?


So I just learned that I can’t use the Object Migration (OM) tool to migrate record data between two TRIRIGA environments. For example, I have two environments on different servers, both on the same application and platform versions. If I try to use OM to migrate the record data only, for instance, the Building Equipment records, not all of the associated records get migrated, and certain smart sections are not properly migrated either.

What are some other options that I could use to quickly migrate this data? I was considering the Data Integrator (DI) method, but that would be tedious because I have over 100,000 records.

Ideally, DI should be used for the initial load. If the data is available somewhere else, you can look into Integration Object or DataConnect. You can populate staging tables and then run the integration. In your workflow, you can have logic to create any dependent records (such as organizations or contacts) based on the staging table data.
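The staging-table pattern described above can be illustrated with a minimal sketch. This is not TRIRIGA or DataConnect code: sqlite3 stands in for the staging schema, and all table and column names are hypothetical. The point is the workflow logic of creating dependent records (here, organizations) from staging rows before creating the main records.

```python
import sqlite3

# In-memory database stands in for the staging schema (hypothetical names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table, as if populated by an external load.
cur.execute("CREATE TABLE s_equipment (name TEXT, org_name TEXT)")
cur.executemany("INSERT INTO s_equipment VALUES (?, ?)",
                [("Chiller-01", "Facilities"),
                 ("AHU-02", "Facilities"),
                 ("Pump-03", "Operations")])

# Target tables the integration workflow would populate.
cur.execute("CREATE TABLE organization (name TEXT PRIMARY KEY)")
cur.execute("CREATE TABLE equipment (name TEXT, org_name TEXT)")

# Workflow logic: create each dependent organization record first
# (skipping ones that already exist), then create the equipment record.
for name, org in cur.execute("SELECT name, org_name FROM s_equipment").fetchall():
    cur.execute("INSERT OR IGNORE INTO organization VALUES (?)", (org,))
    cur.execute("INSERT INTO equipment VALUES (?, ?)", (name, org))

print(cur.execute("SELECT COUNT(*) FROM organization").fetchone()[0])  # → 2
print(cur.execute("SELECT COUNT(*) FROM equipment").fetchone()[0])     # → 3
```

In an actual DataConnect setup, the equivalent of this loop lives in a workflow triggered against the staging table, but the "dependents first, then main records" ordering is the same.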

[Admin: To see other related posts, use the Integration tag or DataConnect tag.]


What are the Oracle Database settings for 10.5.2 and 3.5.2?


I wanted to see if anyone has set up the following settings in an Oracle Database for TRIRIGA 10.5.2 and 3.5.2:

  • 1. NLS_LENGTH_SEMANTICS: Should this be set to CHAR? In our current production environment, it’s set to BYTE, but the TRIRIGA support documentation says that this can lead to data loss, so it recommends using CHAR.
  • 2. NLS_CHARACTERSET: This is set to WE8ISO8859P1 in our current production environment, but the support document says that it must be UTF-8 or UTF-16.
  • 3. Block size: This is set to 8k, but the documentation recommends using 16k.

For (1) and (2), if you never want to store multibyte characters, then what you have is fine. But if you do, then you must use what the support documentation suggests. Once your database has been created, changing these settings is difficult and time-consuming, and it must be done outside of TRIRIGA. As for (3), I would encourage you to use 16k, since it allows better throughput and paging, unless you have a strong reason to stay at 8k.
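The reason BYTE semantics can lose data is that a VARCHAR2(n BYTE) column holds n bytes, not n characters, while non-ASCII characters take 2 to 4 bytes each in UTF-8. A short Python sketch (the sample name is illustrative) shows the mismatch:

```python
# A 13-character name containing two non-ASCII characters (ü and ß).
name = "Müller-Straße"

chars = len(name)                      # character count
utf8_bytes = len(name.encode("utf-8"))  # byte count in UTF-8

print(chars, utf8_bytes)  # → 13 15

# With NLS_LENGTH_SEMANTICS=BYTE, a VARCHAR2(13) column would reject or
# truncate this value (15 bytes > 13); with CHAR semantics, 13 characters
# always fit regardless of how many bytes they encode to.
assert utf8_bytes > chars
```

This is also why the character set (NLS_CHARACTERSET) and the length semantics need to be decided together before the database is created.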

[Admin: This post is related to the 04.04.16 post about database character settings. NLS refers to National Language Support parameters in Oracle. To see other related posts, use the Multibyte tag, MBCS tag, or NLS tag.]


How do you import the upgrade OM packages to custom environments?


We have already upgraded our platform to 3.5.2.1. We are currently in the process of upgrading our application from 10.3.2 to 10.5.2.

For the application upgrade, we have set up a staging environment with an initial install of 10.5.2, and we have configured all BOs, forms, and other objects to match our current customizations. My question is: What happens if we import the IBM upgrade OM packages (sequentially, from 10.4 to 10.5.2) into our current environment (which has all the customizations)? It would definitely overwrite all the customization and configuration, but does it affect the record data as well (e.g. lease records)?

When it overwrites the customizations at the BO and form level, would it corrupt the record data, since some of the custom fields on the records would no longer exist at the BO level? And what happens after we import all our customizations back into the current environment from the staging environment?

The short answer is: You wouldn’t apply the IBM upgrade OM packages. Instead, you’d build OM packages in your now-customized 10.5.2 environment and then apply them to your current environment.

[Admin: To see other related posts, use the Object Migration tag or Upgrade tag.]


Why aren’t servers in the “Active Servers” table in Admin Console?


We are seeing servers dropping out of the “Active Servers” table in the Admin Console > Agent Manager page. We are running multiple environments, each with platform version 3.5.2.2 with two UI and two process servers, and have experienced this across multiple environments. Current observations:

  • Both UI and process servers can drop out, and it’s not the same server every time.
  • Servers can be accessed and logged into, even if they are gone from the “Active Servers” table.
  • Restarting a server makes it appear in the table again.

Our current approach is to monitor daily and when a server drops, take a look in the logs for that day. Any other ideas? What controls when servers are listed or not in the “Active Servers” table in the Admin Console > Agent Manager page?

[Admin: To see other related posts, use the Admin Console tag.]


Having an issue with the database after upgrade to 3.5.3


We are in the process of upgrading to TRIRIGA 10.5.3/3.5.3. We are also importing the triFoodServiceLineItem business object from the modified environment (10.5.0.1/3.5.3) to the new staging environment. But when we import the business objects from the modified environment to the staging environment, we see database errors. Can someone advise us on what the issue could be?

L_TRIFOODSERVICELINEITEM is the language table for the triFoodServiceLineItem business object. It sounds like the import is unable to create this table in the target, most likely because it already exists there. I suggest that in the target environment, you open triFoodServiceLineItem in the Data Modeler, revise the BO, and republish. Hopefully, that will fix it.

[Admin: To see other related posts, use the Upgrade tag.]


IV97475: Source user template replaces destination user template


After an OM import, the user template of the source environment replaces the user template of the same person record in the destination environment. In TRIRIGA 3.5.2, when a triPeople user template is migrated from one environment to another, if a user’s people record is associated with a template via the “Applied Template” association string in the source environment, the most recently applied template will be applied to the same user’s people record in the target environment.

For example, user James Sullivan has a Project Team Member template applied in the test environment. In the CERT environment, the same user has a Facilities Manager template applied. But when the Project Team Member template is migrated to the CERT environment, that template (instead of the Facilities Manager template) is applied to James Sullivan’s people record.

This is working as designed. The root of the issue is that when an OM package containing a people template is imported from the source environment into the target environment, and the published name of the user profile record is the same in both environments, the import will NOT replace the existing association, but will create additional associations from source to target. All of these associations can be seen on the Associations tab of the user record. However, the form will show the latest template that was applied.
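The append-not-replace behavior described above can be modeled in a few lines. This is a toy model, not platform code: associations are held as a list per person, and the "form" view is simply the most recently appended entry.

```python
# Toy model of the OM import behavior: importing a people template ADDS an
# "Applied Template" association to the matching person record rather than
# replacing the existing one.
person_associations = {"James Sullivan": ["Facilities Manager"]}  # target env

def import_template(person, template):
    """Simulate an OM import: append the association, don't replace."""
    person_associations.setdefault(person, []).append(template)

import_template("James Sullivan", "Project Team Member")

# Both associations survive (visible on the Associations tab)...
print(person_associations["James Sullivan"])
# → ['Facilities Manager', 'Project Team Member']

# ...but the form displays only the most recently applied template.
print(person_associations["James Sullivan"][-1])  # → Project Team Member
```

This is why the target environment appears to "lose" its template: the earlier association is still there, just no longer the latest.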

[Admin: To see other related posts, use the Templates tag.]
