What is the “Integration” check box for in the workflow Start task?


Assuming you are referring to the Start task of an asynchronous workflow, when this property is selected, the workflow is used to migrate data from staging tables into IBM TRIRIGA records. This type of workflow is used extensively in IBM TRIRIGA DataConnect.
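Conceptually, such a workflow picks up pending rows from a DC staging table and turns them into records. Below is a minimal sketch of the staging-table side using sqlite3; the table and column names are illustrative stand-ins modeled on DataConnect's S_* staging tables and DC_* control columns, not the product schema.

```python
import sqlite3

# Hypothetical staging table; names are illustrative, not the product schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE S_EXAMPLE (
        DC_JOB_NUMBER INTEGER,  -- DataConnect job that owns the row
        DC_STATE      INTEGER,  -- processing-state flag
        TRIIDTX       TEXT      -- payload field destined for the record
    )""")
conn.execute("INSERT INTO S_EXAMPLE VALUES (?, ?, ?)", (402, 1, "2013/M0166"))

# The Integration workflow conceptually selects the rows for its job
# and migrates the payload fields into TRIRIGA records.
rows = conn.execute(
    "SELECT TRIIDTX FROM S_EXAMPLE WHERE DC_JOB_NUMBER = ? AND DC_STATE = ?",
    (402, 1)).fetchall()
print(rows)  # [('2013/M0166',)]
```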

Check out this IBM Knowledge Center topic about DataConnect that describes the “Integration” check box: Workflow task settings.

[Admin: The same question is also posted in the main Application Platform forum. To see other related posts, use the Staging tag or DataConnect tag.]

Continue reading


IV96036: DataConnect issue with same BO name in different modules


When you have two business objects (BOs) with the same name in different modules, and you select the “Validate” check box on DataConnect (DC) runs, an SQL statement (built to retrieve a single row, the BO name) fails, since the logic does not include the module in the WHERE clause.

The DataConnect job issue was that the File-to-DC validation logic used SQL that assumed BO names are unique and did not account for same-named BOs in different modules. The fix adds a module name parameter to the logic, so that the SQL knows which module to retrieve the BO from. Moving forward, we resolved an integration object File-to-DC issue involving the “Validate” check box: when it was selected, the validation process would fail if the BO being validated had the same name as another BO in a different module.
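The shape of the problem and the fix can be sketched with sqlite3; the table, column, and BO names below are hypothetical, chosen only to illustrate why the module must be part of the WHERE clause.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE BO_DEF (BO_NAME TEXT, MODULE_NAME TEXT)")
# Two BOs with the same name in different (hypothetical) modules.
conn.executemany("INSERT INTO BO_DEF VALUES (?, ?)",
                 [("triContract", "Location"), ("triContract", "Contract")])

# Before the fix: the lookup assumed BO names are unique, so a query
# built to receive a single row gets two rows back.
before = conn.execute(
    "SELECT MODULE_NAME FROM BO_DEF WHERE BO_NAME = ?",
    ("triContract",)).fetchall()
assert len(before) == 2  # ambiguous result

# After the fix: the module name is included in the WHERE clause,
# so the SQL knows which module to retrieve the BO from.
after = conn.execute(
    "SELECT MODULE_NAME FROM BO_DEF WHERE BO_NAME = ? AND MODULE_NAME = ?",
    ("triContract", "Contract")).fetchall()
assert after == [("Contract",)]  # exactly one row
```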

[Admin: To see other related posts, use the DataConnect tag.]

Continue reading

Why is the DataConnect staging table empty after OM import?


I have created an object migration (OM) with its workflow. The execution works well in the Development environment. But after importing the OM package with all the needed objects, the execution didn’t work in the Test environment. The object migration launch works. The triIntegration workflow launch works. The execution of the request works in SQL Server. The connection in my object migration works.

But there is no row in the staging table S_CSTPHINTERMARCHECONTRAT. Also, I see in the logs:

Calling SQL: [INSERT INTO S_CSTPHINTERMARCHECONTRAT(DC_JOB_NUMBER, DC_CID, DC_SEQUENCE_ID, DC_STATE, DC_ACTION, DC_GUI_NAME, TRIIDTX, CSSTPHHPIDRATTTX, CSTPHRETIRETX) VALUES (?,?,?,?,?,?,?,?,?)] with params[402, 0, 1, 1, 4, cstPHInterMarcheContrat, 2013/M0166, 101GT, ]

I found the problem. The configuration of the integration object was for the Development environment and not for the Test environment.

[Admin: To see other related posts, use the DataConnect tag.]

Continue reading

Is there a maximum file size for importing data using DataConnect?


I have two questions regarding the use of DataConnect (DC) and was hoping someone may have the answers I’m looking for:

  1. Is there a recommended maximum file size for importing data into TRIRIGA using DataConnect via staging tables?
  2. We have a two-stage DataConnect process: we first create a record (using one integration object) and then associate locations to the created record (using a separate integration object). We split this into separate IO records because the parallel processing of DC was trying to create things in the wrong order. However, when our update record runs, it overwrites the values that are blank in the staging table, even though we are not directly mapping them in the workflow. Is there any way to avoid this?
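One conceptual workaround for the second question is to skip blank staging values when applying the update, so the second pass only touches fields it actually carries. This is a sketch of that idea, not platform behavior; the field names are hypothetical.

```python
def merge_update(existing, staged):
    """Apply a staging-table row to an existing record, skipping blanks.

    Workaround sketch (not DataConnect's built-in behavior): copy only
    fields whose staged value is non-empty, so blanks in the staging
    table do not wipe out values the first integration object set.
    """
    merged = dict(existing)
    for field, value in staged.items():
        if value not in ("", None):
            merged[field] = value
    return merged

# Hypothetical record and update rows for illustration.
record = {"triIdTX": "2013/M0166", "triLocationTX": "101GT"}
update = {"triIdTX": "2013/M0166", "triLocationTX": "", "triStatusCL": "Active"}
print(merge_update(record, update))
# {'triIdTX': '2013/M0166', 'triLocationTX': '101GT', 'triStatusCL': 'Active'}
```

The blank `triLocationTX` in the update is ignored, so the value set by the first integration object survives.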

Continue reading

Having issues with DataConnect after moving from Oracle to DB2


We are encountering some issues running DataConnect on a DB2 database and wondered if anyone could point us in the right direction. Previously, we ran successfully on an Oracle database, but since moving to DB2, we have run into problems.

We found some notes that said we could resolve some issues by ensuring that numeric fields in the import file contained zeros instead of blanks, but we still get the same issues. The import of the data to the staging table is failing, and the TRIRIGA integration object is displaying errors. For some reason, it is not recognizing the data in the file…
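The zero-fill workaround from those notes can be applied as a preprocessing step on the import file before it is loaded into the staging table. A minimal sketch, assuming a CSV import file and a hypothetical numeric column name:

```python
import csv
import io

NUMERIC_FIELDS = {"triQuantityNU"}  # hypothetical numeric column name

def zero_fill(infile, outfile, numeric_fields):
    """Replace blank numeric fields with 0 before loading into staging.

    Sketch of the workaround: insert zeros where numeric columns are
    blank, since DB2 may reject blanks that Oracle tolerated.
    """
    reader = csv.DictReader(infile)
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for field in numeric_fields:
            if row.get(field, "").strip() == "":
                row[field] = "0"
        writer.writerow(row)

# Illustrative import file with one blank numeric value.
src = io.StringIO("triIdTX,triQuantityNU\nA-1,\nA-2,15\n")
dst = io.StringIO()
zero_fill(src, dst, NUMERIC_FIELDS)
print(dst.getvalue())
```

The blank `triQuantityNU` on the first row is written out as `0`, while populated values pass through unchanged.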

Continue reading

IV92377: DC job workflows failing after system was converted


Our DataConnect job workflows have been failing after the system was converted from 3.5.1.2. In the logs, we see the following error message:

2016-11-10 13:42:26,482 ERROR
[com.tririga.platform.workflow.runtime.taskhandler.DCTaskHandler](WFA:0 - 27156646 SYSTEM DC PROCESS JOB:88095958 IE=88095958) com.tririga.platform.smartobject.dataaccess.SmartObjectNotFoundException: No Smart Object with id '0' exists...

Continue reading

What is the best practice for localized data loading?


As I understand it, both the integration object and DataConnect allow you to import localized data (except business key fields, I think). In addition, we have another option: using the Globalization Manager to import traditional data. I found it pretty cool. We would only deal with the localized data, with less impact on the non-localized data. Before going forward with an option, I’d like to know: which option do you consider best?

Importing by using the Globalization Manager updates the L_ tables directly. If your data does not include localized values that need to be concatenated, for example in a formula, the Globalization Manager import is your best option.

However, if your data includes localized values that need to be concatenated through a formula, or if your data needs to be processed by workflow before it is added to the TRIRIGA tables, then you should use either the integration object or DataConnect.

[Admin: This post is related to the 03.02.16 post about best practices for integration optimization.]

Continue reading