Why does the Esri GIS map initially display at the lowest zoom level?


I have configured the Esri connector in our TRIRIGA instance using the following URL:

/html/en/default/rest/EsriJS?map=Default - Location - Associated to Current Record&level=5

We have found that the GIS map initially displays, but at the lowest zoom level. Essentially, the map is not responding to the level parameter. Does anyone have any experience with this?
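
One thing worth ruling out: the map name contains spaces and hyphens that should be percent-encoded in the query string. Here is a minimal sketch of building the encoded URL in Python (the host name is a placeholder):

```python
from urllib.parse import urlencode

# Placeholder host; substitute your own TRIRIGA server.
BASE = "https://tririga.example.com/html/en/default/rest/EsriJS"

# The map name contains spaces, so encode the query string rather
# than pasting it into the URL literally; otherwise some servers
# may mishandle the trailing level parameter.
params = {
    "map": "Default - Location - Associated to Current Record",
    "level": 5,
}
url = f"{BASE}?{urlencode(params)}"
print(url)
```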

[Admin: This post is related to the 02.04.15 post about GIS documentation, and the 01.21.15 post about GIS functionality.]

Continue reading

Having issues with DataConnect after moving from Oracle to DB2


We are encountering some issues running DataConnect on a DB2 database and wondered if anyone could point us in the right direction. We were previously running successfully on an Oracle database, but since moving to DB2, we have run into problems.

We found some notes that said we could resolve some of the issues by ensuring that numeric fields in the import file contained zeros instead of blanks, but we still get the same errors. The import of the data into the staging table is failing, and the TRIRIGA integration object is displaying errors. For some reason, it is not recognizing the data in the file…
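
To rule out the blank-numeric issue systematically, here is a minimal preprocessing sketch. It assumes a tab-delimited import file and a known set of numeric columns; the column names below are hypothetical, so adjust both to match your staging table:

```python
import csv

# Hypothetical numeric column names; replace with the numeric
# fields of your own staging table.
NUMERIC_COLUMNS = {"triQuantityNU", "triCapacityNU"}

def fill_blank_numerics(src_path: str, dst_path: str) -> None:
    """Copy a tab-delimited import file, writing "0" wherever a
    numeric column is blank (based on the notes suggesting DB2 is
    stricter about blanks than Oracle)."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src, delimiter="\t")
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames,
                                delimiter="\t")
        writer.writeheader()
        for row in reader:
            for col in NUMERIC_COLUMNS & set(row):
                if not (row[col] or "").strip():
                    row[col] = "0"
            writer.writerow(row)

fill_blank_numerics("import.txt", "import_fixed.txt")
```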

Continue reading

What is the max number of rows to upload with integration objects?


Is there a maximum number of rows that should be uploaded with an integration object?

We are loading the records into an intermediate object. Once all of the intermediate records are created, we have a separate process that validates the data and adds the live records (essentially like DTOs). The intermediate records use a simple null-to-triActive transition and straight field mappings, with no validate or transform workflows. So in theory, the performance impact is about as minimal as it can be.

Is there any guidance on what the maximum number of rows would be before we start to see the platform having issues? We are seeing that the integration object can take several hours to load the rows into the intermediate records if the upload file has 25,000 rows in it. Does this seem about right?

I would expect 25,000 rows in an upload file to take several hours. Each record still goes through some level of platform processing as it is loaded into TRIRIGA.
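
If a single large file proves unwieldy, one pragmatic workaround (a sketch, not documented platform guidance) is to split the upload into smaller files and run them sequentially, which also makes failures easier to isolate:

```python
import csv

# 5,000 rows per file is an arbitrary starting point for tuning,
# not a platform-documented limit.
CHUNK_SIZE = 5000

def write_part(src_path, part, header, rows):
    """Write one chunk, repeating the header row in each part."""
    with open(f"{src_path}.part{part:03d}", "w", newline="",
              encoding="utf-8") as dst:
        writer = csv.writer(dst, delimiter="\t")
        writer.writerow(header)
        writer.writerows(rows)

def split_upload(src_path: str) -> None:
    """Split a tab-delimited upload file into CHUNK_SIZE-row parts."""
    with open(src_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src, delimiter="\t")
        header = next(reader)
        chunk, part = [], 1
        for row in reader:
            chunk.append(row)
            if len(chunk) == CHUNK_SIZE:
                write_part(src_path, part, header, chunk)
                chunk, part = [], part + 1
        if chunk:
            write_part(src_path, part, header, chunk)

split_upload("upload.txt")
```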

Continue reading

IV93379: Record name change is not being reflected via web services


When a record name is changed in the record itself within TRIRIGA, the new name is reflected, but when querying the record via an integration URL for XML output (e.g. the f=xml URL parameter), the specId still refers to the record with the original name. This happens with localized users…

When a record name is changed, the change is not reflected when the record is requested via web services. The issue was that integration query URLs were not returning localized values when the response was in XML format. The fix adds two new URL parameters that can be used to localize XML results: f=xml-loc and f=pxml-loc…
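
For anyone applying the fix, here is a minimal sketch of requesting localized XML with one of the new parameters. The query URL and credentials are placeholders; only the f parameter values come from the APAR:

```python
import requests  # third-party HTTP library: pip install requests

# Placeholder: substitute the integration query URL from your
# own environment.
QUERY_URL = "https://tririga.example.com/your/integration/query/url"

resp = requests.get(
    QUERY_URL,
    params={"f": "xml-loc"},        # or "pxml-loc", the other new value
    auth=("username", "password"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
print(resp.text)  # XML results with localized values after the fix
```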

Continue reading

IV92257: JSON result of a query not reflecting the correct names


When you run a query expecting JSON results via the integration object external URL functionality, the modified names of the roles are not reflected. That is, the original name is displayed in the query result. This happens for localized users only; meanwhile, US English (US_en) users observe the proper changes when refreshing the query results…

Continue reading

What is the best practice for localized data loading?


As I understand it, both the integration object and DataConnect allow you to import localized data (except for business key fields, I think). In addition, we have the option of using the Globalization Manager to import translated data, which I found pretty useful: it deals only with the localized data, with less impact on the non-localized data. Before going forward with an option, I’d like to know which option has worked best for you.

Importing with the Globalization Manager updates the L_ tables directly. If your data does not include localized values that need to be concatenated, for example in a formula, the Globalization Manager import is your best option.

However, if your data includes localized values that need to be concatenated through a formula, or if your data needs to be processed by workflow before it is added to the TRIRIGA tables, then you should use either the integration object or DataConnect.

[Admin: This post is related to the 03.02.16 post about best practices for integration optimization.]

Continue reading

IV89067: Cannot upload decimal values using integration object


Numeric fields containing decimal values cannot be loaded into TRIRIGA with an integration object, regardless of whether the fields are out-of-the-box or custom. The failure occurs when the “Validate” check box is checked, causing the integration to fail for fields containing numeric data.

Continue reading