1. Define Informatica?
Informatica is a tool that supports all the steps of the Extraction, Transformation, and Load (ETL) process. Nowadays, Informatica is also used as an integration tool. It is easy to use and offers a simple visual interface, similar to forms in Visual Basic: you drag and drop different objects (known as transformations) and design a process flow for data extraction, transformation, and loading.
These process flow diagrams are known as mappings. Once a mapping is created, it can be scheduled to run as and when required. In the background, the Informatica server takes care of fetching data from the source, transforming it, and loading it into the target.
2. Explain the different kinds of loading in Informatica?
In general, Informatica provides two loading techniques: "Normal Loading" and "Bulk Loading." Normal Loading is a comparatively time-consuming process because records are loaded one after the other and a log entry can be written for every row loaded. Bulk Loading, by contrast, loads a large number of records into the target database at the same time, bypassing detailed logging. This saves a great deal of time in delivering data to the target.
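As a rough analogy (outside Informatica itself), the trade-off resembles inserting rows one at a time with a commit per row versus pushing a whole batch in a single call. The Python/SQLite sketch below only illustrates that trade-off; it is not how the Informatica server is implemented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
rows = [(i, f"row_{i}") for i in range(10_000)]

# "Normal"-style load: one row at a time, each committed (and loggable) individually.
def normal_load(rows):
    for row in rows:
        conn.execute("INSERT INTO target VALUES (?, ?)", row)
        conn.commit()          # per-row commit: easy recovery, but slow

# "Bulk"-style load: the whole batch is pushed with a single commit.
def bulk_load(rows):
    conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
    conn.commit()              # minimal logging, much faster
```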
3. What is Informatica PowerCenter?
Informatica PowerCenter is an ETL/data integration tool that has a wide range of applications. This tool allows users to connect and fetch data from different heterogeneous sources and subsequently process the same.
For example, users can connect to a SQL Server database or an Oracle database, or both, and also integrate the data from both these databases into a third system.
4. During the installation of PowerCenter, what are all the components that get installed?
While installing Informatica PowerCenter, the following components get installed:
•PowerCenter clients
•Integration services
•Repository service
•PowerCenter Domain
•Administration console for PowerCenter
5. Explain the concept of the aggregate cache in relation to the aggregator transformation?
The aggregate cache is where the aggregator stores data until the aggregate calculations are complete. When a session runs an aggregator transformation, the Informatica server creates index and data caches in memory to process the transformation. If the server requires additional space beyond the available memory, it stores the overflow values in cache files on disk.
6. Differentiate Static Cache And Dynamic Cache?
The major differences between a static cache and a dynamic cache are as follows:
•Static cache: The cache is built once when the lookup is first called and is not modified for the rest of the session run; the server only reads from it.
•Dynamic cache: The cache can change while the session runs; rows from the incoming source data can be inserted into, or updated in, the cache. This is typically used when the lookup table is also the target. Because of this extra work, a dynamic cache reduces performance compared with a static cache.
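A small Python sketch may help illustrate the behavioural difference; the dictionaries below merely stand in for the lookup cache and are not Informatica internals:

```python
# Hypothetical target (lookup) table and incoming source rows.
target = {101: "Alice", 102: "Bob"}               # customer_id -> name
source = [(103, "Carol"), (101, "Alice"), (103, "Carol")]

# Static cache: built once from the lookup table and never modified during the run,
# so the second occurrence of 103 is still reported as "not found".
static_cache = dict(target)
for cust_id, name in source:
    print("static :", cust_id, "found" if cust_id in static_cache else "not found")

# Dynamic cache: a miss is inserted into the cache on the fly, so the second
# occurrence of 103 is recognised and is not flagged for insert again.
dynamic_cache = dict(target)
for cust_id, name in source:
    if cust_id not in dynamic_cache:
        dynamic_cache[cust_id] = name
        print("dynamic:", cust_id, "insert into target")
    else:
        print("dynamic:", cust_id, "already present")
```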
7. Explain Where We Can Find The Throughput Option In Informatica?
The throughput option can be found in the Workflow Monitor in Informatica. To access it, open the Workflow Monitor, right-click on the session, and then open the run properties. The throughput option appears under the source/target statistics.
8. Explain What Is A Session Task?
A session task can simply be interpreted as a set of instructions that tells the PowerCenter server how and when to move data from the sources to the targets.
9. What are the actions to be performed to accomplish the session partition?
To accomplish session partitioning, you first configure the session to partition the source data and then install the Informatica server on a machine with multiple CPUs.
10. Explain about Surrogate Key?
A surrogate key can simply be explained as a substitute for the natural primary key in the database. It acts as a unique identifier for each row within a table, and it is always a numeric value, typically an integer.
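For illustration only, here is a hypothetical Python sketch of how a sequence-generator-style surrogate key could be assigned to incoming natural (business) keys; the key values and naming are assumptions, not Informatica code:

```python
from itertools import count

# Hypothetical natural keys arriving from the source.
source_keys = ["CUST-US-001", "CUST-DE-042", "CUST-US-001"]

next_key = count(start=1)        # behaves like a sequence generator
surrogate_map = {}               # natural key -> surrogate key

for nk in source_keys:
    if nk not in surrogate_map:
        surrogate_map[nk] = next(next_key)
    print(nk, "->", surrogate_map[nk])
# CUST-US-001 -> 1, CUST-DE-042 -> 2, CUST-US-001 -> 1 (same row, same surrogate key)
```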
11. Define Informatica transformation and state the different types of transformations?
A transformation in Informatica is a repository object that can generate, modify, or pass through data. Informatica supports several types of transformations, some of which are listed below:
•Aggregator transformation
•Expression transformation
•Filter transformation
•Joiner transformation
•Lookup transformation
•Normalizer transformation
•Rank transformation
•Router transformation.
12. Explain the concept of Informatica repository?
The repository is one of the central concepts of Informatica and sits at the heart of the Informatica suite. It is a set of metadata tables created within the repository database, which the Informatica applications and tools access. The Informatica client and server access the repository to save and retrieve metadata.
13. Tell us about the different types of data that pass between a stored procedure and the Informatica server?
Three types of data pass between the Informatica server and the stored procedure:
•Input / Output parameters
•Status code
•Return Values
14. Discuss the concept of source qualifier transformation?
A source qualifier transformation is created whenever a relational or flat-file source is added to a mapping. It represents the rows that the Informatica server reads from the source when the session runs.
15. Explain the concept of status code?
Error handling in Informatica is performed through the status code within the concerned session. The stored procedure issues a status code that notifies whether or not it completed successfully. This value is not visible to the user; it helps the Informatica server decide whether to keep the session running or to stop it.
16. Explain the different formats of Lookup Cache?
The different lookup caches are explained below:
•Informatica lookups can be classified as either cached or uncached (no cache).
•A cached lookup can further be classified as either static or dynamic.
•A static cache is not modified during the session run and remains constant.
•A dynamic cache can change during the session run, as records from the incoming source data are inserted into or updated in the cache. By default, the Informatica lookup cache is static.
•A lookup cache can also be divided into two further categories: persistent and non-persistent caches.
17. Differentiate between STOP and ABORT options in Workflow Monitor?
When the STOP command is executed on a currently running session task, the Integration Service stops reading data from the source. However, it continues processing, writing, and committing the data already read to the targets.
The ABORT command works like STOP but adds a timeout of 60 seconds: if the Integration Service cannot finish processing and committing the data within that time, it kills the DTM process and the session is terminated.
18. Discuss different categories of dimensions in Informatica?
Three categories of dimensions are commonly discussed in Informatica, as listed below:
•Junk Dimension
•Conformed Dimension
•Degenerate Dimension
19. What are Sessions and Batches?
A "Session" and a "Batch" can be defined as follows:
•Session: A session is a set of instructions that describes how and when the server should move data from the sources to the targets.
•Batch: A batch is a collection of sessions grouped together, covering one or more tasks.
20. What is a Mapplet?
A mapplet is a reusable set of transformations that you build in the Mapplet Designer and that can be used in multiple mappings.
21. Differentiate between an Active Transformation and a Passive Transformation in Informatica?
Active transformation: An active transformation is one that can change the number of rows that pass through the mapping.
Some of the Active transformations include:
•Sorter transformations
•Filter transformations
•Joiner transformations
•Rank transformations
•Router transformations and some other transformations as well.
Passive transformation: A passive transformation is one where the number of rows does not change after passing through the mapping. A short sketch contrasting the two follows the lists below.
Some of the Passive transformations are:
•Expression transformation
•Sequence Generator transformation
•Lookup transformation
•External procedure transformation
•Output transformation
•Input transformation and more.
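As referenced above, here is a minimal Python sketch (not Informatica code) contrasting a filter-style active step, which changes the row count, with an expression-style passive step, which does not; the column names and the 20% markup are hypothetical:

```python
rows = [{"id": 1, "amount": 250}, {"id": 2, "amount": 40}, {"id": 3, "amount": 900}]

# Filter-style (active): the number of output rows can differ from the input.
filtered = [r for r in rows if r["amount"] > 100]          # 3 rows in, 2 rows out

# Expression-style (passive): every input row produces exactly one output row.
with_tax = [{**r, "amount_with_tax": round(r["amount"] * 1.2, 2)} for r in rows]

print(len(rows), len(filtered), len(with_tax))              # 3 2 3
```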
22. Define a Parameter File, and how many values can be defined in a parameter file?
A parameter file is a file created in a text editor such as WordPad or Notepad.
The values that can be defined in a parameter file are listed below, followed by a brief example:
•Mapping parameters
•Mapping variables
•Session parameters
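A minimal sketch of what such a parameter file might look like is shown below; the folder, workflow, session, connection, and parameter names are hypothetical, and section headings typically take the form [folder.WF:workflow.ST:session]:

```
[Global]
$$Environment=DEV

[MyFolder.WF:wf_load_sales.ST:s_m_load_sales]
$$LoadDate=2024-01-01
$InputFile_Orders=/data/in/orders.csv
$DBConnection_Target=ORA_DWH_TGT
```

Here $$LoadDate is a mapping parameter/variable (mapping-level names start with $$), while $InputFile_Orders and $DBConnection_Target are session parameters; the section heading scopes the values to one session.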
23. Differentiate Between A Mapplet And A Rule?
A rule is simply a piece of logic that defines the conditions applied to the source data. A mapplet can be validated as a rule when it meets the following requirements:
•It contains Input and Output transformations.
•It does not contain any active transformations.
•It does not specify cardinality between input groups.
24. Differentiate between sessions and mappings?
•Session: A session is a complete set of instructions that states how and when to move data from the source to its respective targets.
•Mapping: A mapping is a set of source and target definitions linked by transformation objects that define the rules for data transformation.
25. What is a Data Warehouse?
A data warehouse is a collection of data that supports the management decision-making process. It is a subject-oriented, integrated, time-variant, and non-volatile collection of data.
26. How can you enhance the performance of Informatica Aggregator Transformation?
The performance of the Informatica aggregator can be improved dramatically by sorting the records before they are passed to the aggregator. The record set should be sorted on the columns used in the Group By operation.
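As a rough illustration of why sorted input helps, the Python sketch below contrasts aggregating unsorted rows (every group must be held in the cache until the last row arrives) with aggregating pre-sorted rows (each group can be emitted as soon as it is complete). It is only an analogy, not the aggregator's actual implementation:

```python
from itertools import groupby
from operator import itemgetter

rows = [("east", 10), ("west", 5), ("east", 7), ("west", 3)]

# Unsorted input: all groups must be cached until the last row has been read.
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0) + amount

# Sorted input: rows for one group arrive together, so each group can be
# aggregated and emitted immediately, keeping the cache small.
sorted_rows = sorted(rows, key=itemgetter(0))    # normally done upstream by a Sorter
streamed = {region: sum(a for _, a in grp)
            for region, grp in groupby(sorted_rows, key=itemgetter(0))}

print(totals == streamed)    # True: same result, far less cache needed on the sorted path
```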
27. What is the status code?
The status code is an error-handling mechanism for the Informatica server during a session. The stored procedure issues a status code stating whether or not it completed successfully. This value helps the Informatica server decide whether to continue running the session or to stop it.
28. Explain what exactly is DTM?
The term DTM stands for Data Transformation Manager. Once the Load Manager has performed its validations for the session, it creates the DTM process, which carries out the actual reading, transforming, and writing of the data.
29. State About The Reusable Transformation?
This is a broad concept which is extensively being widely used in mappings. By attaining a change in the reusable transformations the concerned effect will be nullified in the mappings. It is completely different from the other mappings where the transformations in mappings as they are used to store metadata.
30. Define Target Load Order?
Target load order is one of the key concepts of Informatica. It lets you define the priority, or order, in which data is loaded into the targets by the Informatica server. When a mapping contains multiple source qualifiers connected to multiple targets, you can define the order in which the data is loaded into each of those targets.
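As a simple illustration of the idea (not Informatica code), the hypothetical sketch below loads a parent dimension before the fact table that references it, which is the usual reason for defining a target load order:

```python
# Hypothetical load routines; in PowerCenter the order is set at the mapping level,
# here it is just an explicit priority list executed top to bottom.
def load_customers():
    print("loading dim_customer")

def load_orders():
    print("loading fact_orders (references dim_customer)")

target_load_order = [load_customers, load_orders]   # parents before children

for load in target_load_order:
    load()
```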