Scenario 1: How can we load the first and last records from a flat file source to a target?
Solution:
Create two pipelines in one mapping: the 1st pipeline captures the first record, and the 2nd captures the last record.
1st Pipeline:
src -> sq -> exp (a variable port with a numeric datatype that increments for every row, passed through an output port 'O_Test') -> filter (pass only if O_Test = 1) -> tgt
2nd pipeline:
src -> sq -> agg (no group-by port, so only the last record passes) -> tgt
In the session, enable the 'Append if Exists' option for the 2nd instance of the target, so it appends to the file the 1st pipeline wrote.
Scenario 2: How to find the nth row in a flat file? When the source is a table we can do top-N analysis using ROWNUM and other such functionality using ROWID, but how do we achieve the same when the source is a flat file?
Solution: In the Mapping Designer, go to Mappings -> Parameters and Variables.
Here we have two things: parameters (constant values passed into the mapping) and variables, which are dynamic and can be stored as metadata for future runs. For example, suppose you want to do an incremental load into table B from table A. You can define a mapping variable that holds the seqid from the source: before you write the data to the target, create an Expression transformation, take the seqid from the source as an input port, and create a variable Max_seqid as output, updating its value for each row. When the session finishes, Informatica saves the last seqid read, and you can use it in your Source Qualifier the next time you run the mapping. See the Informatica documentation for SETMAXVARIABLE and SETMINVARIABLE.
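A minimal sketch of that variable update, assuming $$Max_seqid is declared as a mapping variable with Max aggregation (port and variable names are illustrative):

v_set (variable) = SETMAXVARIABLE($$Max_seqid, seqid)

On the next run, the Source Qualifier can then filter with something like:

seqid > $$Max_seqid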
In this case, though, we just have to make use of a parameter to find the nth row.
Create a parameter (type: Parameter, not Variable) named Last_row_number and select integer or double as its datatype.
Now create a parameter file on the Unix box before you call the workflow, something like this:
echo '[<FOLDERNAME>.WF:<WorkflowName>.ST:<SessionName>]' > <workflowname>.par
count=`wc -l < filename`   # read via stdin so wc prints only the count, not the filename
echo "\$\$Last_row_number=$count" >> <workflowname>.par
Name the parameter file <workflowname>.par, copy the file's complete path, and update the "Parameter Filename" field under the Properties tab when you edit the workflow.
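The generated file would then look something like this (the folder, workflow, and session names here are hypothetical):

[MYFOLDER.WF:wf_load_file.ST:s_m_load_file]
$$Last_row_number=42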
You can then use this parameter anywhere in your mapping; just precede the name with $$.
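For example, a minimal sketch of picking out the nth (here, the last) row with a row counter in an Expression transformation followed by a Filter (port names are illustrative):

v_row_num (variable, integer) = v_row_num + 1
O_row_num (output,  integer)  = v_row_num

Filter condition: O_row_num = $$Last_row_number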
Scenario 3: How to create a flat file dynamically?
SRC FILE              TRGT 1                TRGT 2
----------------      ----------------      ----------------
Eid  Name  Sal        Eid  Name  Sal        Eid  Name  Sal
10   a     100        10   a     100        20   b     100
20   b     100        10   c     200        20   d     300
10   c     200
20   d     300
Solution:
1. Sort the data coming from the source based on EID.
2. Create a variable port in an Expression transformation that tracks the change in EID. E.g., in your case, once the data is sorted on EID it would look like:
EID  Name  SAL
10   a     100
10   c     200
20   b     100
20   d     300
Whenever there is a change in EID, the variable flags it:
variable1 = IIF(EID = PREV_EID, 0, 1)
(A full port sketch, including how PREV_EID is carried from row to row, follows this list.)
3. Add a Transaction Control transformation to the mapping with a matching condition:
IIF(variable1 = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)
This creates a new file whenever there is a change in the EID value.
4. Add a "filename" port
in the target and then pass on a value as per your requirement so that the
filenames get generated dynamically as per your requirement.
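A minimal sketch of the Expression ports behind steps 2-4 (names are illustrative; variable ports evaluate top to bottom, so the PREV_EID holder must be assigned after the comparison so it still contains the previous row's value):

v_change   (variable) = IIF(EID = v_PREV_EID, 0, 1)
v_PREV_EID (variable) = EID
o_change   (output)   = v_change
o_FileName (output)   = 'EID_' || TO_CHAR(EID) || '.dat'  -- hypothetical naming pattern

The Transaction Control condition then becomes IIF(o_change = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION), and o_FileName feeds the target's FileName port.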
Scenario 4: I have a source file containing
1|A, 1|B, 1|C, 1|D, 2|A, 2|B, 3|A, 3|B
and in the target I should get:
1|A+B+C+D
2|A+B
3|A+B
Solution:
Follow the logic given below in an Expression transformation and you will get your output. Please ensure that the ports listed below are variable ports (o_OUT being the output port) and that the incoming data is sorted by key, data.
VarPorts              | Assigned                                                        | Row1 | Row2 | Row3  | Row4
V_CURNT               | KEY                                                             | 1    | 1    | 1     | 2
V_CURNT_DATA          | DATA                                                            | a    | b    | c     | d
v_OUT (variable port) | IIF(ISNULL(v_PREV_DATA) OR LENGTH(v_PREV_DATA)=5, v_CURNT_DATA, | a    | a~b  | a~b~c | d
                      |   IIF(V_CURNT = V_PREV, V_PREV_DATA||'~'||V_CURNT_DATA, NULL))  |      |      |       |
o_OUT                 | v_OUT                                                           | a    | a~b  | a~b~c | d
V_PREV                | V_CURNT                                                         | null | 1    | 1     | 1
V_PREV_DATA           | v_OUT                                                           | null | a    | a~b   | a~b~c
After the Expression transformation, add an Aggregator transformation with 'key' as the group-by port; with no aggregate function applied, it returns the last record for each key, which holds the fully concatenated string.
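Note that the LENGTH(v_PREV_DATA)=5 test above resets the string only because 'a~b~c' happens to be 5 characters long in this sample; a more general reset can key off the change in V_CURNT, and the separator can be '+' to match the target layout. A minimal sketch under those assumptions (evaluation order matters):

V_CURNT      (variable) = KEY
V_CURNT_DATA (variable) = DATA
v_OUT        (variable) = IIF(ISNULL(V_PREV) OR V_CURNT != V_PREV, V_CURNT_DATA, V_PREV_DATA || '+' || V_CURNT_DATA)
o_OUT        (output)   = v_OUT
V_PREV       (variable) = V_CURNT
V_PREV_DATA  (variable) = v_OUT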