Can you resize the triStatusCL column size in T_TRIPEOPLE?


I’ve noticed that the triStatusCL column of the T_TRIPEOPLE table is defined as VARCHAR2(1000) by default. Can we resize it using SQL? What’s the impact? As I understand it, all of the classification-type fields are defined the same way, correct?

This is set by design and uses a thousand characters for compatibility reasons, although the size might change with future platform changes.
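
For reference, the resize itself would just be a one-line ALTER TABLE on Oracle. The statement below is only a syntax sketch with an arbitrary target size, since the 1000-character width is set by the platform and changing it directly is not supported:

-- Illustration only: the platform defines this column as VARCHAR2(1000) by design.
-- Oracle will also reject the change if any existing value is longer than the new size.
ALTER TABLE T_TRIPEOPLE MODIFY (TRISTATUSCL VARCHAR2(100));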

[Admin: To see other related posts, use the Character tag or Classifications tag.]

Continue reading

How do you clean up the SESSION_HISTORY table?


The SESSION_HISTORY table uses 500GB of disk space in our production environment. What is the best way to clean it up?

You would want to make appropriate backups and test this thoroughly, but this is something that can be done via the Platform Maintenance Scheduler (formerly Cleanup Agent) in the IBM TRIRIGA Admin Console. You can develop SQL to do the cleanups as they meet your business requirements, add that as a new cleanup command at the bottom section of the window, and then add a new Cleanup Schedule event in the top section of the window to have it run periodically.
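
As a rough sketch of what such a cleanup command could look like, assuming an Oracle database and a date-typed timestamp column (the CREATED_DATE name below is a placeholder to verify against your actual SESSION_HISTORY schema):

-- Sketch only: keep whatever retention window meets your business requirements,
-- and confirm the real timestamp column name before scheduling this.
DELETE FROM SESSION_HISTORY WHERE CREATED_DATE < SYSDATE - 90;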

This may be something that you want to reach out to a business partner for assistance with, depending on your skill and comfort level in implementing such a change. If so, you can search the IBM PartnerWorld portal.

[Admin: To see other related posts, use the Sessions tag or History tag.]

Continue reading

Why is a cost code path corrupted when applying a template?


We have an issue where, sometimes after applying a cost code template to a project, the hierarchy path is incomplete: it is missing the entire parent path and shows only the name. The issue is only visible in the app by viewing the System Path field inside the form or by using a SQL query, because the system path in the T_TRICOSTCODE table is correct, while the object path field in the IBS_SPEC table is the one that is incomplete. The issue has little consequence unless you are using the rollup fields, in which case the corrupt cost code path will cause a posted transaction to fail.

If you use cost codes heavily, you can try running the following SQL; if you get any results back, the issue is present at some level in your environment. The issue is not limited to Apply Template, as I have heard of others hitting it where their cost codes are created via an integration. This SQL is for Oracle and may need a tweak for SQL Server. If any customers can run it and report whether they have the issue, it may help us identify how it happens.

select tripathsy, triprojectnamesy, OBJECT_PATH from t_tricostcode T1, IBS_SPEC T2 where T1.spec_id in (select spec_id from ibs_spec where type_name = 'triCostCode' and object_path not like '%\Cost Code%') AND T1.spec_id = T2.spec_id and tripathsy like '%\Cost Code%'

[Admin: To see other related posts, use the Cost Code tag or Templates tag.]

Continue reading

Having issues with OM packages & nav items during upgrade to 3.5.3


Our customer has seen an issue when installing TRIRIGA 3.5.3 (Linux, Server build number: 276955) on an existing database (on 3.4.2 / 10.4.2). Everything goes well until the server starts up. Generally, TRIRIGA will run a database upgrade on the first startup when a build number difference is detected.

In the OM log, we notice that TRIRIGA tried to import the upgrade OM package… The import process started with the triPlatformObjectLabelManager package, but it failed to import a navigation item that is newly created for the Object Label Manager. I haven’t found any log that explains this failure. I’ve checked the NAV_ITEM table, and this navigation item wasn’t there before the upgrade process. All of the other packages are then stuck in a pending status, and nothing happens after “Creating package from Zip file”. This behavior causes a lot of SQL update failures.

Meanwhile, on our Dev environment (Windows, Server build number: 279835), the upgrade went very well. The difference is visible in the logs; the OM log was set to “Debug” level on both servers. Note that the build number is slightly different between these two environments. Have you seen this kind of issue? Where can I find more details about the navigation item import failure?

[Admin: This post is related to the 02.17.17 post and 05.19.16 post about inconsistent OM validation results. To see other related posts, use the Object Migration tag or Upgrade tag.]

Continue reading

How do you populate or update dates stored as numeric values?


I added a new date field, cstReminderDateDA, to a custom business object. For existing records, there is a requirement that cstReminderDateDA be set to the value of an existing date field (cstDueDateDA) minus 30 days, i.e. cstReminderDateDA = (cstDueDateDA - 30 days). This would be pretty straightforward, except that both cstReminderDateDA and cstDueDateDA are stored as numeric fields in our Microsoft SQL Server database. How do I populate cstReminderDateDA?

IBM TRIRIGA stores the date as epoch time. See the following wiki link explaining this. If you do some searching, you will find some functions available for SQL Server for date calculations.
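
As a rough sketch, assuming the dates are stored as epoch time in milliseconds and using a placeholder table name for the custom business object, the one-time update on SQL Server could look like this:

-- Sketch only: T_CSTMYOBJECT is a placeholder for the custom BO's table.
-- 30 days = 30 * 24 * 60 * 60 * 1000 milliseconds; CAST to BIGINT avoids integer overflow.
UPDATE T_CSTMYOBJECT
SET CSTREMINDERDATEDA = CSTDUEDATEDA - CAST(30 AS BIGINT) * 24 * 60 * 60 * 1000
WHERE CSTDUEDATEDA IS NOT NULL AND CSTDUEDATEDA <> 0;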

[Admin: To see other related posts, use the Epoch Time tag.]

Continue reading

How do you prevent SQL database log from crashing the database?


I ran a big data import in TRIRIGA, but there were not enough resources to proceed, so I had to stop TRIRIGA and truncate all events to stop the import. But now, the database log keeps growing and crashes the database. Is there a way to clean this up and make TRIRIGA stable?

If this is Microsoft SQL Server, this may be related to the following SQL Server defect: SQL Server crashes when the log file of tempdb database is full in SQL Server 2012 or SQL Server 2014.
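
If it is SQL Server, a quick way to see which log file is actually filling up before deciding on a cleanup or shrink strategy is:

-- Reports log size and percentage used for every database, including tempdb.
DBCC SQLPERF(LOGSPACE);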

[Admin: To see other related posts, use the SQL tag or “sql server” search phrase.]

Continue reading

Is there any tool which imports currency conversion rates?


We are looking for an integration solution that allows us to import currency conversion rates. It seems that this data is not based on a BO, so I think traditional integration tools such as DataConnect (DC), Data Integrator (DI), and Integration Object won’t work. Have you seen this kind of requirement? Are there any solutions other than using a SQL script? I’ve tried the SQL below, and it seems to be working.

INSERT INTO BUDGET_CURRENCY_CONVERSION (CONVERSION_GROUP, FROM_CURRENCY_CODE, TO_CURRENCY_CODE, CONVERSION_RATE, START_DATE, END_DATE, INSTANCE_ID) VALUES ('LIBA', 'Chinese Yuan', 'Euro', 0.13, {ts '2017-01-01 01:00:00'}, {ts '2017-12-31 01:00:00'}, BUDGET_CURRENCY_ID_SEQ.nextval)

BusinessConnect (a.k.a. Connector for Business Applications or CBA) is the best method to use for this. Here’s the PDF. The putCurrencyConversionRates method is what you would want to use.

[Admin: To see other related posts, use the Currency tag.]

Continue reading

Can you get the Cleanup Agent to remove record data in chunks?


In one of our environments, we have a large number of records that have been transitioned to the null state. When the Cleanup Agent runs, it runs out of DB2 transaction log space executing the following:

DELETE FROM IBS_SPEC_ASSIGNMENTS WHERE EXISTS (SELECT 'X' FROM IBS_SPEC_CA_DELETE WHERE IBS_SPEC_CA_DELETE.SPEC_ID=IBS_SPEC_ASSIGNMENTS.SPEC_ID)

For workflow instance saves, the Cleanup Agent now seems to remove the data in small chunks (of 1000 rows each). But for the record data cleanup, it still seems to (try to) remove all data in one huge SQL statement/transaction. Is it possible to get the Cleanup Agent to remove record data in chunks, like it does for workflow instance saves?

Otherwise, I’m thinking of writing a small util that would run the statement above, but in smaller chunks, since it seems we still have the list of record IDs that it tries to remove in the IBS_SPEC_CA_DELETE table. Any obvious issues with that?

As of 3.5.3, there have been no changes to delete the IBS_SPEC_ASSIGNMENTS data in chunks. This sounds like a good PMR. Deleting records directly from the database is rarely recommended, but in this circumstance, you might be okay. However, the IBS_SPEC_CA_DELETE table only exists while the Cleanup Agent is running; it is dropped when the agent goes back to sleep.
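
For what it’s worth, a chunked delete on DB2 can be sketched as below. This is only an illustration: it assumes it is run while the agent is still active (since IBS_SPEC_CA_DELETE is dropped afterward), and it would be repeated, with a commit between batches, until no rows are deleted.

-- Sketch only: remove 1,000 matching rows per batch so the DB2 transaction log is not exhausted.
DELETE FROM (
  SELECT 1 FROM IBS_SPEC_ASSIGNMENTS
  WHERE EXISTS (SELECT 'X' FROM IBS_SPEC_CA_DELETE WHERE IBS_SPEC_CA_DELETE.SPEC_ID = IBS_SPEC_ASSIGNMENTS.SPEC_ID)
  FETCH FIRST 1000 ROWS ONLY
);
COMMIT;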

[Admin: To see other related posts, use the Cleanup tag or SQL tag. As a side note, starting with version 3.4, the Cleanup Agent name was changed to Platform Maintenance Scheduler.]

Continue reading

How do you get high-availability with a database-as-a-service?


The most critical aspect of any production system is ensuring high availability in case of critical failures. The traditional approach was to operate a disaster recovery site and failover clusters. This requires 24×7 maintenance and monitoring and adds significant costs for the hardware, software, and human resources involved. Because of this, many companies opt out of high availability for their EAM system.

Thankfully, there is a new option with the database-as-a-service (DBaaS) offerings now provided by all major database vendors. Providing a high-availability system for 10,000 users can now be achieved by filling out a single web form. You are then provided with a database service that is available 24×7, with all maintenance, monitoring, and backup activities automated in the background.

We tested DBaaS offerings from several vendors, such as Amazon, Microsoft, and Oracle, and found that all of them were able to deliver on their promise. Although some products like IBM Maximo and IBM TRIRIGA are not officially certified for these platforms, we have successfully migrated several clients to a DBaaS database on Oracle and MS SQL Server…

[Admin: To see other related posts, use the High Availability tag or SaaS tag.]

Continue reading

Can anyone help with extracting data via Oracle SQL Developer?


Does anyone have any experience extracting TRIRIGA data directly from the database through Oracle SQL Developer? I am trying to extract via SQL, and combine into one table, the data from triSpace, triSpaceClassCurrent, and triSpaceStandardsSpec. The idea is to create a single table with all spaces, space classifications, and space standards.
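
As a starting point only, and with the caveat that the published column names depend on your BO definitions (the names below are placeholders to verify against your own T_TRISPACE schema), a query along these lines pulls the three pieces into one result set:

-- Sketch only: verify the actual column names for the classification and standard fields in your environment.
SELECT spc.SPEC_ID,
       spc.TRINAMETX AS space_name,
       spc.TRISPACECLASSCURRENTCL AS space_classification,
       spc.TRISPACESTANDARDSSPECTX AS space_standard
FROM T_TRISPACE spc;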

[Admin: To see other related posts, use the Oracle tag or SQL tag.]

Continue reading