With technological advances arriving almost every minute, upgrading a business is essential to survive competition: delivering a customer experience beyond expectations while deploying fewer resources to derive value from any process or business.
Platform upgrades, software upgrades, security upgrades, architectural enhancements, and so on are required to ensure stability, agility, and efficiency.
Customers prefer to move from legacy systems to the cloud because of what it offers. From cost, monitoring, maintenance, and operations to ease of use and landscape, the cloud has transformed D&A businesses significantly over the last decade.
The move from Informatica PowerCenter (PWC) to Intelligent Data Management Cloud (IDMC) is seen as the need of the hour because of the enormous advantages it offers. Developers must understand both flavors to perform this code transition effectively.
This post explains the gaps between PWC and IDMC Cloud Data Integration (CDI) from three perspectives:
- Development
- Data
- Operations
Development
- Differences in native datatypes can be observed in IDMC when importing a Source, Target, or Lookup. Workaround as follows.
- If any inconsistency is observed in IDMC mappings with native datatype/precision/scale, edit the metadata to keep the DDL and the CDI mappings in sync.
- In CDI taskflows, workflow parameter values run into read and consumption issues. Workaround as follows.
- Create a dummy mapping task in which the list of parameters/variables is defined for further consumption by tasks within the taskflows (e.g., Command task, Email task).
- Make sure to limit the number of dummy mapping tasks during this process.
- Best practice is to create one dummy mapping task per folder to capture all the parameters/variables required for that entire folder.
- For variables whose values must persist into the next taskflow run, make sure the variable value is mapped to the dummy mapping task via an Assignment task. This dummy mapping task is used at the start and end of the taskflow so that the overall taskflow supports incremental data processing.
- All mapping tasks/sessions in IDMC are reusable and can be used in any taskflow. If some audit sessions are expected to run concurrently within other taskflows, ensure the property "Allow the mapping task to be executed simultaneously" is enabled.
- Sequence generator: data overlap issues in CDI. Workaround as follows.
- If a sequence generator is likely to be used in multiple sessions/workflows, make it a reusable/shared sequence.
- VSAM sources and the Normalizer are not available in CDI. Workaround as follows.
- Use the Sequential File connector type for mappings using mainframe VSAM sources/Normalizer.
- Sessions configured with STOP ON ERRORS > 0. Workaround as follows.
- Ensure the link condition for the next task is "PreviousTask.TaskStatus Starts With Any Of 1, 2" within CDI taskflows.
- Partitions are not supported for sources in query mode. Workaround as follows.
- Create multiple sessions and run them in parallel.
- Currently, parameterization of schema/table is not possible for mainframe DB2. Workaround as follows.
- Use an ODBC-type connection to access DB2 with schema/table parameterization.
- A mapping with a Lookup transformation used across two sessions cannot be overridden at the session or mapping-task level to enable or disable caching. Workaround as follows.
- Use two different mappings with Lookup transformations if one mapping/session must have the cache enabled and the other must have it disabled.
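The dummy-mapping-task approach above usually hinges on a CDI parameter file that defines the parameters/variables for the whole folder. A minimal sketch of such a file follows; the project, folder, task, and parameter names are all illustrative, and the section-header layout should be checked against your org's existing parameter files:

```
#USE_SECTIONS
[MyProject].[MyFolder].[mt_Dummy_Folder_Params]
$$SRC_SCHEMA=SALES
$$LOAD_DATE=2024-01-01
$$LAST_RUN_TS=1970-01-01 00:00:00
```

Tasks in the folder's taskflows can then consume these values, and the Assignment-task pattern above writes back variables (such as $$LAST_RUN_TS) that must persist between runs.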
Data
- IDMC output data contains additional double quotes. Workaround as follows.
- Session level: set the property __PMOV_FFW_ESCAPE_QUOTE=No
- Administrator settings level: set the property UseCustomSessionConfig=Yes
- IDMC output data contains additional scale digits for the Decimal datatype (e.g., 11.00). Workaround as follows.
- Use an IF-THEN-ELSE expression to remove the unwanted zeros in the data (output: 11.00 -> 11).
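The trailing-zero cleanup above can be written as a single output port in an Expression transformation. A sketch in the Informatica expression language, assuming a decimal input port named AMT (the port name is hypothetical) feeding a string output:

```
-- If the value has no real fractional part, emit it as an integer string
IIF( TRUNC(AMT) = AMT,
     TO_CHAR(TO_INTEGER(AMT)),  -- 11.00 -> '11'
     TO_CHAR(AMT) )             -- 11.25 -> '11.25'
```

Values with genuine decimal places pass through unchanged; only all-zero scale is stripped.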
Operations
- CDI doesn't store logs on the cloud beyond 1,000 mapping-task runs in 3 days (it does store logs on the Secure Agent). Workaround as follows.
- To retain cloud job-run stats, create audit tables and use the Data Marketplace utility to load the audit info (volume processed, start/end time, etc.) into the audit tables, scheduling this job at regular intervals (hourly or daily).
- Generic restartability issues occur during IDMC operations. Workaround as follows.
- Introduce a dummy Assignment task whenever the code contains a custom error-handling flow.
- The SKIP FAILED TASK and RESUME FROM NEXT TASK operations have issues in IDMC. Workaround as follows.
- Ensure every link condition has the additional condition "Mapping task.Fault.Detail.ErrorOutputDetail.TaskStatus=1" appended.
- In PWC, any task can be run from anywhere within a workflow; this is not possible in IDMC. Workaround as follows.
- A feature request is being worked on by GCS to update the software.
- The IDMC mapping-task configuration level cannot build log file names for concurrent runs because of parameter concatenation issues. Workaround as follows.
- Use a separate parameter within the parameter file so that mapping-task log file names are suffixed with the concurrent-run workflow instance name.
- IDMC doesn't honour the "Save session log for these runs" property set at the mapping-task level when the session log file name is parameterized. Workaround as follows.
- Copy the mapping-task log files on the Secure Agent server after the job run.
- If the session log directory path contains a slash when combined with parameters (e.g., $PMSessionLogDir/ABC under Session Log Directory Path), every run's log is appended to the same log file. Workaround as follows.
- Use a separate parameter within the parameter file for $PMSessionLogDir.
- In IDMC, the @numAppliedRows and @numAffectedRows features are not both available to capture source and target success rows for loading into the audit table. Workaround as follows.
- Use @numAppliedRows in place of @numAffectedRows.
- Concurrent runs cannot be started for taskflows from the CDI Data Integration UI. Workaround as follows.
- Use the Paramset utility to upload concurrent paramsets, and use the runAJobCli utility to run taskflows with multiple concurrent run instances from the command prompt.
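The "copy the logs after the run" workaround above can be scripted as a small post-run step on the Secure Agent host. A minimal sketch, assuming a POSIX shell; the directory paths and the helper function name are hypothetical, not part of IDMC:

```shell
#!/bin/sh
# archive_logs SRC DEST -- copy every *.log file from SRC into DEST,
# suffixing each copy with a timestamp so later runs don't overwrite it.
archive_logs() {
    src="$1"
    dest="$2"
    stamp=$(date +%Y%m%d_%H%M%S)
    mkdir -p "$dest"
    for f in "$src"/*.log; do
        [ -e "$f" ] || continue                      # no logs yet: nothing to do
        cp "$f" "$dest/$(basename "$f" .log)_${stamp}.log"
    done
}

# Example call (replace with your agent's real log and archive paths):
# archive_logs /opt/infaagent/apps/Data_Integration_Server/logs /opt/infaagent/log_archive
```

Running this via cron, or as a post-processing command right after the job, preserves session logs that would otherwise be lost when the "Save session log for these runs" property is not honoured.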
Conclusion
While performing PWC to IDMC conversions, the Development, Data, and Operations workarounds above will help avoid rework and save effort, thereby achieving customer satisfaction in delivery.