When you run a query expecting JSON results via the integration object's external URL functionality, the modified role names are not reflected; instead, the original names are displayed in the query results. This happens for localized users only. Meanwhile, US English (US_en) users see the proper changes when refreshing the query results…
As I understand it, both the integration object and DataConnect allow you to import localized data (except business key fields, I think). In addition, we have another option: using the Globalization Manager to import translated data. I found it's pretty cool. We would only deal with the localized data, with less impact on the non-localized data. Before going forward with an option, I'd like to know which option you guys think is best?
Importing by using the Globalization Manager updates the L_ tables directly. If your data does not include localized values that need to be concatenated, for example in a formula, the Globalization Manager import is your best option.
However, if your data includes localized values that need to be concatenated through a formula, or if your data needs to be processed by workflow before it is added to the TRIRIGA tables, then you should use either the integration object or DataConnect.
[Admin: This post is related to the 03.02.16 post about best practices for integration optimization.]
Numeric fields containing decimal values cannot be loaded into TRIRIGA by using an integration object, regardless of whether the fields are out-of-the-box or custom. The problem occurs when the "Validate" check box is selected, causing the integration to fail for fields containing numeric data.
Why are your retired integration workflows being republished after restarting your server?
This is working as designed. The integration object is controlled by the platform, so any changes it finds will be added back upon an application server restart. You can find more information in the IBM Knowledge Center topic: Object upgrades. Note that if you add an integration object record called IGNORE_UPGRADE, then no updates will be made to the integration object, and future fixes that are added to the package will not be applied.
We’re using an integration object to import data from a staging DB (populated by an external system) into a DTO which consists of 6 text fields. There should be no associations between the DTO and any other record, as the data from the text fields is used to identify records and build associations when the DTO is processed manually after the integration object has processed…
However, when we run the integration object, using the Execute action on the form, we get the following line in the log for every record:
2016-08-18 11:18:29,912 WARN [com.tririga.ws.TririgaWSImpl](WFA:16722316 - 15534574 triExecute:378233139 IE=378233139) com.tririga.ws.errors.InvalidArgumentException: Invalid association: associated record id should be greater than 0, is -1
We are getting this same error for another integration object providing similar functionality (loading Text/Number fields into a DTO). I have tried clearing the map and rebuilding it manually, as well as using the ReMap action on the Integration Object form. Looking in server.log, it appears the maps are successfully rebuilt for all integration objects.
The only thing I can think of is that we are not using the Base Parent field in our map, so it could be trying to build an association to the base parent when it is not required? I have attached a screenshot of the map for one of our integration objects for reference. It is worth noting that the records are created successfully and the data is mapped as expected, aside from the "noise" in server.log.
You may find an issue with the setup of an integration object record when using a business object that exists with the same name in two different modules. For example, when setting up the Data Map, you can select the Module as triItem and the Business Object as triChecklistCategory. But then you don’t get any options in the Form drop-down (just a blank value), so you are unable to continue in setting up the Data Map. You may observe that there is a triChecklistCategory BO in both the triItem and Classification modules.
The issue is that integration object logic to display business objects with staging tables in the Data Map drop-down list did not take into consideration that the same-named business object can belong to multiple modules. This was reported in TRIRIGA 3.5.1/10.5.1.
As suggested, the issue was that the SQL retrieving the BO staging tables was not looking at the module. Thus, if two BOs with the same name belonging to different modules had staging tables, the integration object logic did not know which BO to use. The fix is to pass in the module name into the SQL. Moving forward, we resolved an integration object “File to DC” issue, where staging tables were not correctly loading in the Data Map tab, if the business object selected had the same name as another business object belonging to a different module.
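To illustrate the nature of the fix (this is not TRIRIGA's actual internal code or schema), here is a minimal Python sketch, with hypothetical table and column names, of a staging-table lookup that filters on the module name as well as the business object name:

```python
def staging_table_query(bo_name: str, module_name: str):
    """Build a parameterized lookup for a BO's staging table.

    Hypothetical schema for illustration only: filtering on the module
    name as well as the BO name prevents ambiguity when two BOs with
    the same name (e.g. triChecklistCategory in both triItem and
    Classification) each have staging tables.
    """
    sql = ("SELECT staging_table FROM bo_staging "
           "WHERE bo_name = ? AND module_name = ?")
    return sql, (bo_name, module_name)

# Without the module filter, this lookup could match two rows:
query, params = staging_table_query("triChecklistCategory", "triItem")
```

The point is simply that the original query keyed on the BO name alone, so adding the module name to the predicate makes the result unambiguous.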
The TRIRIGA integration object's XML post type requests do not correctly encode UTF-8 characters. As a result, double-byte characters, although they appear correctly in server.log when the integration object's "Debug" option is selected, are seen by the receiver as "??".
The issue was that XML post type requests were not correctly handling UTF-8 characters. The fix is to include UTF-8 content binding when the entity is being added to the request. Moving forward, we resolved an integration object HTTP post scheme issue, where multibyte characters were not being correctly handled in the outgoing HTTP request. Please note with this fix, if for some reason your integration object HTTP post scheme configuration leaves the Content-Type field blank, and the Content-Type is not defined in the Headers field, then the Content-Type of the HTTP request will default to: “Content-Type: text/plain; charset=UTF-8”. On prior releases, it defaulted to: “Content-Type: null”.
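The underlying encoding behavior can be demonstrated outside TRIRIGA. A minimal Python sketch (not the platform's actual HTTP code) of why the entity must be bound as UTF-8 and the charset declared in Content-Type:

```python
text = "日本語"  # double-byte characters in the outgoing payload

# Bug behavior: binding the entity with a charset that cannot represent
# the characters replaces each one with "?", which is what receivers saw.
garbled = text.encode("ascii", errors="replace")  # b'???'

# Fixed behavior: bind the entity as UTF-8 and declare the charset,
# matching the new default used when no Content-Type is configured.
body = text.encode("utf-8")
headers = {"Content-Type": "text/plain; charset=UTF-8"}

assert body.decode("utf-8") == text  # round-trips without loss
```

The same principle applies regardless of HTTP client: the byte encoding of the request body and the charset advertised in the Content-Type header must agree.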