When trying to edit the mapping of a locator field within Data Modeler, clicking the spyglass next to the field does not open the mapping window.
[Admin: To see other related posts, use the Firefox tag.]
In TRIRIGA 10.5.1, if you navigate to Home > Projects, and have the “Projects – Projects Landing Page – Default” portal in place, the primary locations for any projects you have listed in the “My Active Projects” portal section are flagged in the “My Project Locations” portal section.
But in TRIRIGA 10.5.2, this does not happen: no project locations appear with a flag in the “My Project Locations” portal section. The problem appears to have been introduced between 10.5.1 and 10.5.2 by the removal of a reverse association filter.
The “Location – Navigation – GIS – Buildings, Structures, and Retail Locations – Project Manager Query” was configured with an incorrect forward association string that prevented capital project locations from displaying on a GIS map. Moving forward, the forward association string of the Advanced tab > Geography Module > triCity Business Object association filter has been updated from “Geography Belongs To” to “Geography Contains”.
[Admin: To see other related posts, use the GIS tag.]
The “Apply Record” and “Apply Template” methods use current timestamps, instead of source-record timestamps, when mapping to the created tasks.
We needed to modify the application to use a Query task to gather all associated tasks and task templates on the target record, and to call two workflows against each to force updates to the Planned Start and Planned End dates within the context of their associated calendars. Moving forward, the application now correctly applies the task calendar hour restrictions to the tasks and task templates when using the “Apply Template” and “Apply Record” functionality with capital projects.
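The calendar-hour restriction described above can be sketched roughly as follows. This is a minimal illustration under assumed working hours of 8:00–17:00; the function name and single-calendar model are hypothetical, not TRIRIGA's actual implementation:

```python
from datetime import datetime, time, timedelta

# Hypothetical working hours; TRIRIGA's real calendar model is richer
# (per-calendar hours, exceptions, time zones).
WORK_START = time(8, 0)
WORK_END = time(17, 0)

def clamp_to_calendar(ts: datetime) -> datetime:
    """Shift a planned date into working hours (illustrative only)."""
    if ts.time() < WORK_START:
        # Too early: move to the start of the same working day.
        return ts.replace(hour=WORK_START.hour, minute=0, second=0, microsecond=0)
    if ts.time() >= WORK_END:
        # Too late: roll forward to the start of the next working day.
        nxt = ts + timedelta(days=1)
        return nxt.replace(hour=WORK_START.hour, minute=0, second=0, microsecond=0)
    return ts
```

A Planned Start of 6:30 would be pushed to 8:00 that day, and one of 18:00 to 8:00 the following day, which is the kind of adjustment the fix forces on copied tasks.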
[Admin: To see other related posts, use the Templates tag or Calendar tag.]
We can’t export the graphic floor map to PDF from TRIRIGA. The system “hangs” and does not export. This happens only with some drawings, and only when using Layer 0 from the Xref file ( xrefdwg | 0 ). If all other layers are turned off and layer 0 from any Xref is displayed in the graphics section, even if it is empty, the export fails to complete.
The export graphic was throwing a malformed XML exception. The root cause was a “1 = 1” element that was pulled in from the layout of an attached Xref onto that Xref’s layer 0. It turns out that for any text element containing an equals sign, the process of sending the SVG from the client to the server (a Dojo API posting via a hidden input element) added extra double quotes, leaving the SVG XML malformed.
We resolved this by pre-processing the SVG sent to the server to remove the extra double quotes before handing it to the SVG converter. Moving forward, the export graphic successfully exports a graphics section that includes text containing an equals sign. Text that contains both double quotes and an equals sign will have the double quotes removed for technical reasons.
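The pre-processing step can be sketched as below, assuming the offending quotes sit inside SVG `<text>` elements; the function name and regex-based approach are illustrative, not the product code:

```python
import re

def strip_quotes_from_equals_text(svg: str) -> str:
    """Drop double quotes from <text> content that contains an equals
    sign, so the posted SVG stays well-formed. Illustrative sketch only;
    the product's actual pre-processing is not published."""
    def fix(match):
        opening, body, closing = match.group(1), match.group(2), match.group(3)
        if "=" in body:
            # Only text with an equals sign loses its double quotes,
            # matching the trade-off described above.
            body = body.replace('"', "")
        return opening + body + closing
    return re.sub(r"(<text[^>]*>)(.*?)(</text>)", fix, svg, flags=re.S)
```

With this, `<text>1 = "1"</text>` becomes `<text>1 = 1</text>`, while text without an equals sign keeps its quotes untouched.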
[Admin: To see other related posts, use the Xref tag or SVG tag.]
Is it possible to use a Modify Records task in the workflow to map or set the value of a locator field via an association?
I noticed in the database query tool that a locator field has two additional columns named TRIGEOGRAPHYCOSTINDEXT and TRIGEOGRAPHYCOSTINDEXTOBJID that are not visible in the Data Modeler. So when I use Modify Records task mapping, those two fields do not appear.
Yes, in the Modify task mapping, if you map from one field to a locator field, it will save the OBJID in the T_ table. The key is that the text must match exactly what the locator expects.
So, let’s say you have a Time Zone locator that points at the triTimeZone classification’s triNameTX field. Your source field must contain the entire name, which would look something like “(GMT +12) Wellington, Auckland [Pacific/Auckland]”. If the source field contained only “Auckland”, the mapping to the locator would not work, because the lookup is done via the Published Name, and “Auckland” does not match the actual Published Name…
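The exact-match behavior can be illustrated with a toy lookup; the OBJID value and function name here are hypothetical, and only the Published Name string comes from the example above:

```python
from typing import Optional

# Toy lookup table keyed by Published Name; the OBJID is made up.
TIME_ZONES = {
    "(GMT +12) Wellington, Auckland [Pacific/Auckland]": 1001,
}

def resolve_locator(source_value: str) -> Optional[int]:
    """Return the OBJID only on an exact Published Name match;
    partial values such as "Auckland" resolve to nothing."""
    return TIME_ZONES.get(source_value)
```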
[Admin: To see other related posts, use the Locator tag.]
When you have currency conversions for multiple time periods, and a lease that falls inside the first period, the conversion takes the last currency conversion, as shown below… The base amount should be “2000”, since the time period fell within the February exchange rate, but it appears to use the March exchange rate instead.
We needed to set the conversion group and exchange date in the BO mapping. Moving forward, the triCostItem > triContractCostBreakDownItems business object had its properties updated to set triConversionGroupLI as the conversion group and triExchangeDT as the exchange date. We are addressing this specific scenario only and are not making sweeping changes to all BOs in OOB applications.
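The intended behavior, selecting the rate whose period covers the exchange date rather than simply the last rate in the table, can be sketched as follows. The dates, rates, and 1000 base amount are hypothetical, chosen so the February conversion yields the 2000 mentioned above:

```python
from datetime import date

# Hypothetical monthly rate table, sorted by effective date.
RATES = [
    (date(2017, 2, 1), 2.0),  # effective February
    (date(2017, 3, 1), 1.5),  # effective March
]

def rate_for(exchange_date: date) -> float:
    """Pick the latest rate effective on or before the exchange date,
    instead of taking the last rate in the table unconditionally."""
    applicable = [rate for start, rate in RATES if start <= exchange_date]
    if not applicable:
        raise ValueError("no exchange rate effective on that date")
    return applicable[-1]
```

A cost item dated mid-February converts with the February rate (1000 × 2.0 = 2000), while a mid-March item picks up the March rate, which is the effect of mapping triExchangeDT as the exchange date.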
[Admin: To see other related posts, use the Currency tag.]
We are having some trouble understanding how the “Database” scheme is supposed to be correctly implemented through the TRIRIGA integration object. From what we saw, TRIRIGA is unable to interact with an external database unless it has four particular columns dedicated to the TRIRIGA integration process: IMD_STATUS, IMD_ID, IMD_MESSAGE, and TRIRIGA_RECORD_ID, on both the external source of data and the internal target for data in the TRIRIGA database.
We found it odd that, to interact with an external database, TRIRIGA forces the table to have four columns dedicated to itself and cannot simply send a SELECT statement and map the corresponding fields. Did we miss (or overdo) something that could avoid altering the source table? Or is this the common practice for interacting with an external table?
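For readers unfamiliar with the scheme, here is a sketch of what such a staging table looks like, using SQLite purely for illustration. The table and business-column names are made up; only the four integration columns come from the post, and the per-column comments are our reading of their purpose, not verified semantics:

```python
import sqlite3

# Staging table carrying the four integration bookkeeping columns the
# Database scheme expects, alongside hypothetical business columns.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE STAGING_LOCATIONS (
        LOCATION_NAME     TEXT,     -- business data (hypothetical)
        CITY              TEXT,     -- business data (hypothetical)
        IMD_STATUS        TEXT,     -- integration state flag
        IMD_ID            INTEGER,  -- integration row identifier
        IMD_MESSAGE       TEXT,     -- error text written back on failure
        TRIRIGA_RECORD_ID INTEGER   -- ID of the TRIRIGA record created
    )
""")
conn.execute(
    "INSERT INTO STAGING_LOCATIONS VALUES (?, ?, ?, ?, ?, ?)",
    ("HQ Tower", "Wellington", None, 1, None, None),
)
```

The integration process reads pending rows from such a table and writes status, messages, and the created record ID back into the IMD_* and TRIRIGA_RECORD_ID columns, which is why a plain SELECT against an unmodified source table is not enough.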
[Admin: This post is related to the 11.05.14 post about using an integration object with an inbound database scheme.]