Having issues with OM packages & nav items during upgrade to 3.5.3


Our customer has seen an issue when installing TRIRIGA 3.5.3 (Linux, Server build number: 276955) on an existing database (on 3.4.2 / 10.4.2). Everything goes well until the server is started up. Generally, TRIRIGA will run a database upgrade on the first startup when a build number difference is detected.

In the OM log, we noticed that TRIRIGA tried to import the upgrade OM package… The import process started with the triPlatformObjectLabelManager package, but it failed to import a navigation item that is newly created for the Object Label Manager. I haven’t found any log entry that explains this failure. I checked the NAV_ITEM table, and this navigation item wasn’t there before the upgrade process. All of the other packages are then stuck in a pending status, and nothing happens after “Creating package from Zip file”. This behavior causes a lot of SQL update failures.
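For reference, a minimal sketch of the kind of check run against the NAV_ITEM table; the NAME column and the label filter are assumptions and may differ across platform versions:

    -- Confirm whether the Object Label Manager navigation item exists (column name is an assumption).
    SELECT *
      FROM NAV_ITEM
     WHERE UPPER(NAME) LIKE '%OBJECT LABEL%';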

Meanwhile, on our Dev environment (Windows, Server build number: 279835), the upgrade went very well. You can find the difference in the logs. The OM log was set to the “Debug” level on both servers. Note that the build number is slightly different between these two environments. Have you seen this kind of issue? Where can I find more details about the navigation item import failure?

[Admin: This post is related to the 02.17.17 post and 05.19.16 post about inconsistent OM validation results. To see other related posts, use the Object Migration tag or Upgrade tag.]

Continue reading

Can you run all patch helpers (10.3 to 10.5.2) after final OM import?


We have upgraded the TRIRIGA platform to 3.5.2.3 and started upgrading the application from 10.2 to 10.5.2 in incremental order (10.3, 10.3.1, and so on, until 10.5.2). To minimize the outage and complexity during the production implementation, it was suggested that we take a final OM package after completing the 10.5.2 deployment, and then apply all the customizations which might have been impacted by the upgrade. This final OM package will contain all the changes from 10.2 to 10.5.2.

Our question is on the patch helpers: Can we run all the patch helpers (from 10.3 to 10.5.2 in order) after importing the final OM package?

Also, we are running the Varchar-to-Numeric script before importing the application upgrade packages. This script takes a long time (almost a day in two test environments), but in another environment, it has been running for more than 2 days and still hasn’t completed. Is it normal for this script to run this long? Or will it be an issue? There are no differences between the environments.
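As a sanity check while the script runs, standard Oracle views can show whether the underlying statements are still making progress. This is a generic monitoring sketch (run as a DBA user), not part of the TRIRIGA script itself:

    -- Long-running operations (e.g., full table scans) with an estimated time remaining.
    SELECT sid, opname, target, sofar, totalwork, time_remaining
      FROM v$session_longops
     WHERE time_remaining > 0;

    -- Active sessions and the SQL they are currently executing.
    SELECT s.sid, s.status, s.last_call_et AS seconds_in_current_call, q.sql_text
      FROM v$session s
      JOIN v$sql q ON q.sql_id = s.sql_id
     WHERE s.status = 'ACTIVE';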

I wouldn’t recommend doing the upgrade in one package. Usually, it ends up being quite large and it will cause issues. The IBM-recommended way is to perform each OM import, then run the patch helpers. Once you have applied the OOB OM packages, you can have one OM which has your custom objects…

[Admin: This post is related to the 10.25.17 post and 04.28.17 post about running “SetVarcharColsToNumeric” scripts. To see other related posts, use the Scripts tag.]

Continue reading

How do you migrate record data quickly between environments?


So I just learned that I can’t use the Object Migration tool to migrate record data between two TRIRIGA environments. For example, I have two environments on different servers, on the same application and platform version. If I try to use OM to migrate only the record data, for instance the Building Equipment records, not all of the associated records get migrated, and certain smart sections are not properly migrated either.

What are some other options that I could use to quickly migrate this data? I was thinking the Data Integrator (DI) method, but that would be tedious because I have over 100,000 records.

Ideally, DI should be used for the initial load. If the data is available somewhere else, you can look into Integration Object or DataConnect. You can populate staging tables and then run the integration. In your workflow, you can have logic to create any dependent records (such as organizations or contacts) based on the staging table data.
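Purely as an illustration of the staging-table pattern, here is a hedged sketch; the staging table S_TRIBUILDINGEQUIPMENT, the DC_JOB control table, and every column name below are assumptions based on typical DataConnect naming, so verify them against the DataConnect documentation for your version:

    -- Illustrative only: table and column names are assumptions, not confirmed for your version.
    INSERT INTO S_TRIBUILDINGEQUIPMENT (DC_JOB_NUMBER, DC_SEQUENCE_ID, DC_STATE, TRINAMETX)
    VALUES (1001, 1, 1, 'Chiller-0001');

    -- Register a job so the DataConnect agent processes the staged rows.
    INSERT INTO DC_JOB (JOB_NUMBER, JOB_NAME, STATE)
    VALUES (1001, 'Building Equipment initial load', 1);

    COMMIT;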

[Admin: To see other related posts, use the Integration tag or DataConnect tag.]

Continue reading

What are the Oracle Database settings for 10.5.2 and 3.5.2?


I wanted to see if anyone has set up the following settings in an Oracle Database for TRIRIGA 10.5.2 and 3.5.2:

  • 1. NLS_LENGTH_SEMANTICS: Should this be set to CHAR? In our current production environment, it’s set to BYTE, but the TRIRIGA support documentation says that this can lead to data loss, so they recommend using CHAR.
  • 2. NLS_CHARACTERSET: This is set to WE8ISO8859P1 in our current production environment, but the support document says that it must be UTF-8 or UTF-16.
  • 3. Block size: This is set to 8k, but the documentation recommends using 16k.

For (1) and (2), if you never want to store multibyte characters, then what you have is fine. But if you do, then you must use what the support documentation suggests. Once your database has been created, it is difficult and time-consuming to change these settings, and the change needs to be done outside of TRIRIGA. As for (3), I would encourage you to use 16k, since it will give you better throughput and paging, unless you have a strong reason to stay at 8k.
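For reference, the current values can be verified with standard Oracle data dictionary views before deciding whether a change is needed:

    -- Current character set and length semantics.
    SELECT parameter, value
      FROM nls_database_parameters
     WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_LENGTH_SEMANTICS');

    -- Default database block size (in bytes).
    SELECT value AS db_block_size
      FROM v$parameter
     WHERE name = 'db_block_size';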

[Admin: This post is related to the 04.04.16 post about database character settings. NLS refers to National Language Support parameters in Oracle. To see other related posts, use the Multibyte tag, MBCS tag, or NLS tag.]

Continue reading

How do you import the upgrade OM packages to custom environments?


We have already upgraded our platform to 3.5.2.1. We are currently in the process of upgrading our application from 10.3.2 to 10.5.2.

For the application upgrade, we have set up a staging environment with an initial install of 10.5.2, and we have configured all BOs, forms, and other objects to match our current customizations. My question is: What if we import the IBM upgrade OM packages (sequentially from 10.4 to 10.5.2) into our current environment (which has all the customizations)? It would definitely overwrite all the customization and configuration, but does it affect the record data as well (e.g. lease records)?

When it overwrites the customization at the BO and form level, would it corrupt the record data, since some of the custom fields on the records won’t exist at the BO level anymore? And what happens after we import all our customizations back into the current environment from the staging environment?

The short answer is: You wouldn’t apply the IBM upgrade OM packages. Instead, you’d build OMs in your now customized 10.5.2 environment and then apply them to your current environment.

[Admin: To see other related posts, use the Object Migration tag or Upgrade tag.]

Continue reading

Why aren’t servers in the “Active Servers” table in Admin Console?


We are seeing servers dropping out of the “Active Servers” table on the Admin Console > Agent Manager page. We are running multiple environments, each on platform version 3.5.2.2 with two UI servers and two process servers, and we have experienced this across multiple environments. Current observations:

  • Both UI and process servers can drop out, and it’s not the same server every time.
  • Servers can be accessed and logged into, even if they are gone from the “Active Servers” table.
  • Restarting a server makes it appear in the “Active Servers” table again.

Our current approach is to monitor daily and, when a server drops, take a look in the logs for that day. Any other ideas? What controls whether servers are listed in the “Active Servers” table on the Admin Console > Agent Manager page?

[Admin: To see other related posts, use the Admin Console tag.]

Continue reading

Having an issue with the database after upgrade to 3.5.3


We are in the process of upgrading to TRIRIGA 10.5.3/3.5.3. We are also importing the triFoodServiceLineItem business object from the modified environment (10.5.0.1/3.5.3) to the new staging environment. But when we import the business objects from the modified environment to the staging environment, we see database errors. Can someone advise us on what the issue could be?

L_TRIFOODSERVICELINEITEM is the language table for the triFoodServiceLineItem business object. For some reason, it sounds like the import is unable to create it in the target, most likely because it is already there. I suggest that in the target environment, you go to triFoodServiceLineItem in the Data Modeler, revise the BO, and republish it. Hopefully, that will fix it.
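If you want to confirm the “already there” theory before republishing, a quick check against the target schema can help (Oracle syntax shown; adjust for your database platform):

    -- Does the language table already exist in the target schema?
    SELECT table_name
      FROM user_tables
     WHERE table_name = 'L_TRIFOODSERVICELINEITEM';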

[Admin: To see other related posts, use the Upgrade tag.]

Continue reading

IV97475: Source user template replaces destination user template


After an OM import, the user template from the source environment replaces the user template in the destination environment for the same person record. In TRIRIGA 3.5.2, when a triPeople user template is migrated from one environment to another, if a user’s people record is associated with a template through the “Applied Template” association string in the source environment, the most-recently applied template will be applied to the same user’s people record in the target environment.

For example, user James Sullivan has a Project Team Member template applied in the test environment. In the CERT environment, user James Sullivan has a Facilities Manager template applied. But when the Project Team Member template is migrated to the CERT environment, that template (instead of the Facilities Manager template) is applied to James Sullivan’s people record.

This is working as designed. The root of the issue is that when an OM package that has a people template is imported from the source environment into the target environment, and when the published name of the user profile record is the same in both the source and target environments, the import will NOT replace the existing association, but will create additional associations from source to target. All of these associations can be seen in the Associations tab of the user record. However, the form will show the latest template that was applied.
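If you want to see the accumulated associations outside the form, a hedged query along these lines can help; the IBS_SPEC_ASSIGNMENTS table and its columns are assumptions based on common TRIRIGA schema naming, and the record ID is a placeholder:

    -- List template associations on a people record (replace 123456 with the record's SPEC_ID).
    SELECT ass_type, ass_spec_id
      FROM ibs_spec_assignments
     WHERE spec_id = 123456
       AND ass_type = 'Applied Template';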

[Admin: To see other related posts, use the Templates tag.]

Continue reading

How do you move classifications from one environment to another?


I recently set up a new environment in which I need to migrate the classifications (not just the record data) from the existing system. What is the fastest way to do this and ensure that the classifications are set up properly in the new system?

I migrated the BOs and forms. I checked the Include association for the BO to itself and to the Classification BO. The form has been added to the “Includes/Forms” tab of itself, as well as to the classification form. But I still don’t see this BO added under the classification hierarchy when I click “New” to create the root record.

[Admin: This post is related to the 03.29.17 post about creating a classification. To see other related posts, use the Classifications tag or Object Migration tag.]

Continue reading

UX: How do you export components from one environment to another?


I have a couple of software development life cycle (SDLC) questions about Perceptive apps using the TRIRIGA UX Framework:

  • How do you export changes to Perceptive app components from one environment to another, without exporting the whole application? Example components include web view files and data sources. Currently, if we only change and test a data source or web view in Dev, we cannot figure out how to export them from the Dev environment and import them into the system integration testing (SIT) environment.
  • How do you add an existing data source to an existing model? We only found an Add button to add a new data source. We would like to associate a pre-defined data source to a model.

TRIRIGA currently supports exporting only the whole app, so you cannot export just a data source. For web view files, you have the option to use the WebViewSync tool to pull the files from one environment and push them into another.

As for adding an existing data source: the data sources were designed to be part of a model, which is why there is no button to add an existing data source to a model. However, you can manually associate them by using the Association tab.

[Admin: To see other related posts, use the UX Framework tag.]

Continue reading