ETL Testing - Quiz (MCQ)
A)
Extract, Transform and Load
B)
Extract, Time and Load
C)
Extract, Transfer and Load
D)
Extract, Transform and Loss

Correct Answer :   Extract, Transform and Load


Explanation : ETL stands for Extract, Transform and Load. Extract is the process of reading data from a database. Transform converts the data into a format appropriate for reporting and analysis. Load is the process of writing the data into the target database.
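As a purely illustrative sketch of those three steps in Python (the CSV source, table name, and values are made up for the example):

```python
import csv, io, sqlite3

# Hypothetical in-memory "source system" holding raw CSV data.
SOURCE = io.StringIO("id,name,amount\n1, alice ,10.5\n2, BOB ,7\n")

def extract(src):
    # Extract: read the raw rows from the source.
    return list(csv.DictReader(src))

def transform(rows):
    # Transform: reshape for reporting -- trim/title-case names, cast amounts.
    return [(r["id"], r["name"].strip().title(), float(r["amount"])) for r in rows]

def load(rows, conn):
    # Load: write the transformed rows into the target database.
    conn.execute("CREATE TABLE sales (id TEXT, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE)), conn)
print(conn.execute("SELECT * FROM sales").fetchall())
```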

A)
Analyzing the collected information.
B)
Addition of new data to the database.
C)
Reading and collecting data from multiple sources.
D)
None of the above.

Correct Answer :   Reading and collecting data from multiple sources.

A)
IBM WebSphere DataStage
B)
Microsoft Dynamics AX
C)
Informatica PowerCenter
D)
SAP BusinessObjects Data Services (BODS)

Correct Answer :   Microsoft Dynamics AX

A)
Verify whether the data is transforming correctly according to business requirements
B)
Make sure that the ETL application reports invalid data and replaces it with default values
C)
Make sure that data loads within the expected time frame to improve scalability and performance
D)
All of the above

Correct Answer :   All of the above

A)
TEL
B)
ETL
C)
LET
D)
LTE

Correct Answer :   ETL


Explanation : Using an ETL tool, data is extracted from multiple data sources, transformed, and loaded into a data warehouse after joining fields, calculating, and removing incorrect data fields.

A)
1
B)
2
C)
3
D)
4

Correct Answer :   3


Explanation : The 3 types of ETL facts are : Additive Facts, Semi-additive Facts and Non-additive Facts. For example, a sales amount is additive (it can be summed across every dimension), an account balance is semi-additive (it can be summed across accounts but not across time), and a ratio is non-additive.

A)
Transformation
B)
Transfusion
C)
Information
D)
Transfiction

Correct Answer :   Transformation


Explanation : ETL testing ensures that data that has undergone business transformation has been loaded accurately from source to destination.

A)
Data Staging
B)
Workflow
C)
Bus Schema
D)
Schema Objects

Correct Answer :   Data Staging

A)
Source Bugs
B)
Calculation Bugs
C)
Load Condition Bugs
D)
All of the above

Correct Answer :   All of the above

A)
OLTP
B)
Cubes
C)
OLAP
D)
None of the above

Correct Answer :   Cubes


Explanation : Cubes are data processing units composed of fact tables and dimensions from the data warehouse. They provide multi-dimensional analysis.
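As a toy illustration of the idea, the sketch below aggregates a hypothetical fact measure along two dimensions, which is what a cube does on a much larger scale:

```python
from collections import defaultdict

# Hypothetical fact rows: (region, product, sales_amount).
facts = [("EU", "pens", 10), ("EU", "ink", 5), ("US", "pens", 7), ("US", "pens", 3)]

# Aggregate the sales measure along the (region, product) dimensions.
cube = defaultdict(float)
for region, product, amount in facts:
    cube[(region, product)] += amount

print(dict(cube))  # {('EU', 'pens'): 10.0, ('EU', 'ink'): 5.0, ('US', 'pens'): 10.0}
```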

A)
Two
B)
Three
C)
Four
D)
Five

Correct Answer :   Three


Explanation : To fetch data from one database and place it in another, ETL combines all three database functions into one tool.

A)
2
B)
3
C)
4
D)
5

Correct Answer :   2


Explanation : A transformation is a repository object that generates, modifies, or passes data. Transformations are of two types : Active and Passive.
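The usual distinction (e.g., in Informatica) is that an active transformation can change the number of rows that pass through it, while a passive transformation cannot. A minimal sketch with made-up rows:

```python
rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": -3}, {"id": 3, "amount": 20}]

def active_filter(rows):
    # Active: may change the row count (drops invalid rows).
    return [r for r in rows if r["amount"] > 0]

def passive_derive(rows):
    # Passive: row count is unchanged (only derives a new column).
    return [{**r, "amount_cents": r["amount"] * 100} for r in rows]

print(len(active_filter(rows)))   # 2 -- fewer rows out than in
print(len(passive_derive(rows)))  # 3 -- same number of rows
```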

A)
Update slowly changing dimension table
B)
Verify whether records already exist in the table
C)
Getting a related value from a table using a column value
D)
All of the above

Correct Answer :   All of the above

A)
Summarizing data
B)
Building dimensions
C)
Checking referential integrity
D)
Only extracting valid data

Correct Answer :   Checking referential integrity
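A referential-integrity check verifies that every foreign key in a table resolves to a row in the referenced table. A minimal sketch over hypothetical customer/order tables:

```python
# Hypothetical tables: orders reference customers by customer_id.
customers = [{"customer_id": 1}, {"customer_id": 2}]
orders = [{"order_id": 10, "customer_id": 1}, {"order_id": 11, "customer_id": 99}]

valid_ids = {c["customer_id"] for c in customers}
# Orphans: foreign keys that do not resolve to a parent row.
orphans = [o for o in orders if o["customer_id"] not in valid_ids]
print(orphans)  # [{'order_id': 11, 'customer_id': 99}]
```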

A)
One
B)
Two
C)
Three
D)
Four

Correct Answer :   One

A)
ODS
B)
OLAP
C)
OLTP
D)
None of the above

Correct Answer :   OLAP

A)
Session
B)
Mapplet
C)
Worklet
D)
Workflow

Correct Answer :   Mapplet


Explanation : A mapplet arranges or creates sets of transformations.

A)
Yes
B)
No
C)
Can not say
D)
--

Correct Answer :   Yes


Explanation : Yes, data purging is the process of deleting data from a data warehouse.

A)
Performance
B)
Task Capability
C)
Management and Administration
D)
Maintainability

Correct Answer :   Maintainability

A)
Loading
B)
Transforming
C)
Extracting
D)
None of the above

Correct Answer :   Extracting

A)
Loading
B)
Transforming
C)
Extracting
D)
None of the above

Correct Answer :   Transforming


Explanation : The process of transforming data involves converting it from one form to another.

A)
Loading
B)
Extracting
C)
Transforming
D)
None of the above

Correct Answer :   Loading


Explanation : Writing data into a database is called loading.

A)
One-to-one
B)
Many-to-Many
C)
One-to-many
D)
None of the above

Correct Answer :   One-to-many

A)
Simple to use
B)
It is almost free.
C)
It has a good user interface.
D)
It makes development easier, faster, and cheaper.

Correct Answer :   It makes development easier, faster, and cheaper.

A)
Rules
B)
Lookup tables
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation : In addition to using rules or lookup tables, the data can be combined with other data to undergo transformation.

A)
Data Set
B)
Data Warehouse
C)
Data Center
D)
Data Care Center

Correct Answer :   Data Warehouse


Explanation : ETL is often used to build a Data Warehouse.

A)
After
B)
Before
C)
While
D)
None of the above

Correct Answer :   Before


Explanation : Before the data is loaded, transformation logic is applied.

A)
Data warehouses are repositories where data is shared.
B)
In ETL, data is moved from a variety of sources into a data warehouse.
C)
In order to make critical business decisions, companies use ETL to analyze their business data.
D)
All of the above

Correct Answer :   All of the above


Explanation :

In today's environment, ETL is becoming more and more necessary for many reasons, including :
 
* In order to make critical business decisions, companies use ETL to analyze their business data.
* Data warehouses are repositories where data is shared.
* In ETL, data is moved from a variety of sources into a data warehouse.

A)
Data Mart
B)
Data Warehouse
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation : A data mart or data warehouse is loaded with data through the ETL process.

A)
Make sure the thresholds for the data are valid
B)
Record counts are checked before and after the transformation logic is applied.
C)
The intermediate table must be validated as the data flows from the staging area.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following operations are performed when applying transformation logic :
 
* Record counts are checked before and after the transformation logic is applied.
* The intermediate table must be validated as the data flows from the staging area.
* Make sure the thresholds for the data are valid

A)
The ETL process allows the source and target systems to compare sample data.
B)
Earlier, we defined ETL as a process of converting source data into target data and manipulating it.
C)
As part of the ETL process, complex transformations can be performed and additional storage space is required.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The things that are TRUE about ETL are :
 
* The ETL process allows the source and target systems to compare sample data.
* As part of the ETL process, complex transformations can be performed and additional storage space is required.
* Earlier, we defined ETL as a process of converting source data into target data and manipulating it.

A)
Joining
B)
Addressing
C)
Cleaning
D)
Filtering

Correct Answer :   Addressing


Explanation : Addressing is not a task involved in the ETL transformation process; joining, cleaning, and filtering are.

A)
Validating the values of columns in a table is the focus of database testing.
B)
Database testing is used to ensure the foreign key or primary key is maintained.
C)
During database testing, it is verified whether there is any missing data in a column.
D)
All of the above

Correct Answer :   All of the above


Explanation :

Operations that are performed in Database Testing :
 
* Validating the values of columns in a table is the focus of database testing.
* Database testing is used to ensure the foreign key or primary key is maintained.
* During database testing, it is verified whether there is any missing data in a column.

A)
Loaded
B)
Extracted
C)
Transformed
D)
None of the above

Correct Answer :   Loaded


Explanation : To facilitate business analysis, our data warehouse needs to be loaded regularly.

A)
Loading
B)
Extraction
C)
Transformation
D)
None of the above

Correct Answer :   Transformation


Explanation : To avoid affecting the source system's performance, the transformation occurs on the ETL server or staging area.

A)
Copy
B)
Original
C)
Artificial
D)
Corrupted

Correct Answer :   Original


Explanation : To maintain certification for products carrying the ETL Listed Mark, regular product and site inspections are conducted to ensure that the manufactured product still matches the original tested product.

A)
Data
B)
Stamp
C)
Listed
D)
None of the above

Correct Answer :   Listed


Explanation : A product with an ETL Listed Mark has been independently tested to meet the applicable standard.

A)
Trace logs are then stored in a log file in a compressed binary format.
B)
By default, trace providers generate trace logs in their trace session buffers, which are stored by the operating system.
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation :

The things TRUE about Trace Logs are :
 
* By default, trace providers generate trace logs in their trace session buffers, which are stored by the operating system.

* Trace logs are then stored in a log file in a compressed binary format.

A)
.etl
B)
.png
C)
.psd
D)
.pdf

Correct Answer :   .etl


Explanation : .etl files are also used by the Eclipse Open Development Platform.

A)
Studying
B)
Staging
C)
Staggering
D)
None of the above

Correct Answer :   Staging


Explanation : Before data is moved to the warehouse, the extracted data can be validated in the staging area.
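A minimal sketch of that validation step; the rows and the two rules shown (non-empty key, well-formed email) are invented for the example:

```python
staged = [{"id": "1", "email": "a@example.com"}, {"id": "", "email": "not-an-email"}]

def validate(row):
    # Illustrative staging checks: required key present, email well formed.
    return bool(row["id"]) and "@" in row["email"]

good = [r for r in staged if validate(r)]
rejected = [r for r in staged if not validate(r)]
print(len(good), len(rejected))  # 1 1 -- only validated rows move to the warehouse
```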

A)
FULL Extraction
B)
Partial Extraction - With Update Notification
C)
Partial Extraction - Without Update Notification
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are the methods to extract the data :
 
* FULL Extraction
* Partial Extraction - With Update Notification
* Partial Extraction - Without Update Notification

A)
Response
B)
Performance
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation : Whatever extraction method we use, the source system should not be affected in terms of performance or response time.

A)
Ensure that the data type is correct
B)
Check the source data against the record
C)
There will be a check to see if all the keys are there
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following validations are performed during extraction (see the sketch after this list) :
 
* Ensure that the data type is correct
* Check the source data against the record
* There will be a check to see if all the keys are there
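A minimal sketch of those three checks against a hypothetical extracted batch (the expected count and column names are made up):

```python
extracted = [{"id": 1, "qty": "5"}, {"id": None, "qty": "x"}]
source_count = 2  # record count reported by the source system

errors = []
if len(extracted) != source_count:      # check the extract against the source record count
    errors.append("row count mismatch")
for r in extracted:
    if r["id"] is None:                 # check that all the keys are there
        errors.append(f"missing key: {r}")
    if not str(r["qty"]).isdigit():     # check that the data type is correct
        errors.append(f"bad qty type: {r}")
print(errors)
```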

A)
Source
B)
Site
C)
Storage
D)
None of the above

Correct Answer :   Source

A)
Page rows
B)
Page faults
C)
Pagination
D)
Page Initials

Correct Answer :   Page faults


Explanation : ETL logs contain information about disk access, page faults, and the Microsoft operating system's performance, and they also record high-frequency events.

A)
Build predictive models
B)
Build data-driven web products
C)
Integrate data across applications
D)
All of the above

Correct Answer :   All of the above


Explanation : Using data pipelines, one can integrate data across applications, build data-driven web products, build predictive models, create real-time data streaming applications, conduct data mining, and build data-driven digital products.

A)
Filtering
B)
Conversion of character sets and encodings
C)
Checking the threshold and validity of data
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are the validation points during transformation (see the sketch after this list) :
 
* Filtering
* Conversion of character sets and encodings
* Checking the threshold and validity of data
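A minimal sketch of those validation points on made-up rows (the 0-120 age threshold is an assumed rule):

```python
raw = [{"name": b"caf\xe9", "age": 34}, {"name": b"bob", "age": 210}]

checked = []
for r in raw:
    name = r["name"].decode("latin-1")  # character-set conversion: Latin-1 bytes -> text
    if not (0 <= r["age"] <= 120):      # threshold and validity check on age
        continue                        # filtering: drop the out-of-range row
    checked.append({"name": name, "age": r["age"]})
print(checked)  # [{'name': 'café', 'age': 34}]
```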

A)
Industry-standard ETL tools are usually used to construct ETL pipelines that transform structured data.
B)
In addition to enterprise data warehouses, subject-specific data marts are also built using ETL pipelines.
C)
As a data migration solution, ETL pipelines are also used when replacing traditional applications with new ones.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The things TRUE about ETL Pipelines are :
 
* Industry-standard ETL tools are usually used to construct ETL pipelines that transform structured data.
* In addition to enterprise data warehouses, subject-specific data marts are also built using ETL pipelines.
* As a data migration solution, ETL pipelines are also used when replacing traditional applications with new ones.

A)
Resumed
B)
Monitored
C)
Canceled
D)
All of the above

Correct Answer :   All of the above


Explanation : Loads need to be monitored, resumed, and canceled according to server performance by the admin of the data warehouse.

A)
Initial Load
B)
Incremental Load
C)
Full Refresh
D)
None of the above

Correct Answer :   Full Refresh


Explanation : With a Full Refresh, all tables are erased and reloaded with new information.

A)
E-MPAC-TL
B)
E-MAP-TL
C)
E-PAC-TL
D)
E-MPAA-TL

Correct Answer :   E-MPAC-TL


Explanation : The term is now extended to E-MPAC-TL or Extract, Monitor, Profile, Analyze, Cleanse, Transform, and Load.

A)
Loading
B)
Extraction
C)
Transformation
D)
None of the above

Correct Answer :   Extraction


Explanation : During extraction, the main goal is to capture data as quickly as possible from a system while minimizing the inconvenience to the system.

A)
Exceeds
B)
Reduces
C)
Extends
D)
Manipulates

Correct Answer :   Reduces


Explanation : Compared to traditional ETL, ELT reduces the time it takes to develop sources and targets.

A)
Quality Attribution
B)
Quantity Assurance
C)
Quality Assurance
D)
Quantity Attribution

Correct Answer :   Quality Assurance


Explanation : Quality assurance is a process defined between stages according to the needs; it verifies the quality of the product.

A)
Continuous integration is supported.
B)
Test cases can be created automatically with QualiDI, and the automated data can be compared with the manual data.
C)
QualiDI manages complex BI testing cycles, eliminates human error, and manages data quality.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The features of QualiDI are :
 
* Continuous integration is supported.
* Test cases can be created automatically with QualiDI, and the automated data can be compared with the manual data.
* QualiDI manages complex BI testing cycles, eliminates human error, and manages data quality.

A)
Automated
B)
Non-automated
C)
Semi-automated
D)
None of the above

Correct Answer :   Automated


Explanation : ETL and end-to-end testing are offered by QualiDI's automated testing platform.

A)
We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
B)
iCEDQ compares the data in memory based on the unique columns in the database.
C)
As a result, it is possible to identify exactly which columns and rows contain data errors.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The features of iCEDQ are :
 
* We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
* iCEDQ compares the data in memory based on the unique columns in the database.
* As a result, it is possible to identify exactly which columns and rows contain data errors.

A)
Verified
B)
Validated
C)
Coordinated
D)
All of the above

Correct Answer :   All of the above


Explanation : Sources and systems are verified, validated, and coordinated by iCEDQ.

A)
Cleansing
B)
Data Profiling
C)
Data Analysis
D)
Source Analysis

Correct Answer :   Data Profiling


Explanation : In data profiling, analysis and validation of the data pattern and formats will be performed, as well as identification and validation of redundant data across data sources to determine the actual content, structure, and quality of the data.
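A minimal sketch of that kind of profiling on a hypothetical column; the L/N pattern notation is an arbitrary convention chosen for the example:

```python
def shape(v):
    # Reduce a value to its pattern: letters -> L, digits -> N, rest kept.
    return "".join("N" if c.isdigit() else "L" if c.isalpha() else c for c in v)

values = ["A-10", "A-11", None, "A-10", "B7"]
profile = {
    "rows": len(values),
    "nulls": sum(v is None for v in values),
    "distinct": len({v for v in values if v is not None}),
    "patterns": {shape(v) for v in values if v is not None},
}
print(profile)  # the two patterns {'L-NN', 'LN'} expose an inconsistent format
```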

A)
Data Analysis
B)
Data Profiling
C)
Cleansing
D)
Source Analysis

Correct Answer :   Cleansing


Explanation : In cleansing, the errors found can be fixed based on metadata and a set of predefined rules.

A)
Tools
B)
Metadata
C)
Technical Issues
D)
All of the above

Correct Answer :   All of the above


Explanation : An extended ETL concept, E-MPAC-TL is designed to meet the requirements while taking into account the realities of the systems, tools, metadata, technical issues, constraints, and most importantly, the data itself.

A)
It automates the manual process and schedules tests for a specific date and time.
B)
A big data testing and ETL testing tool, QuerySurge automates the testing process.
C)
Using this tool, you can create test scenarios and test suites along with configurable reports without knowing SQL.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are the features of QuerySurge :

* It automates the manual process and schedules tests for a specific date and time.
* A big data testing and ETL testing tool, QuerySurge automates the testing process.
* Using this tool, you can create test scenarios and test suites along with configurable reports without knowing SQL.

A)
Loaded
B)
Added
C)
Altered
D)
Deleted

Correct Answer :   Loaded


Explanation : An ETL test ensures that the data transformed and loaded from a source to a destination is accurate.

A)
Six
B)
Five
C)
Four
D)
Three

Correct Answer :   Five


Explanation : ETL testing is performed in five stages.

A)
Data platforms with high complexity and large volumes require RightData to work efficiently.
B)
An online tool for testing ETL/Data integration, RightData is available as a self-service program.
C)
Data can be validated and coordinated between datasets despite differences in data models or types of sources with RightData's interface.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are TRUE about RightData  :

* RightData is designed to work efficiently for data platforms with high complexity and large volumes.
* An online tool for testing ETL/Data integration, RightData is available as a self-service program.
* Data can be validated and coordinated between datasets despite differences in data models or types of sources with RightData's interface.

A)
Data
B)
Dataset
C)
Dimensions
D)
Deadlock

Correct Answer :   Dimensions


Explanation : Dimensions are the hierarchical groupings in the database into which facts and aggregate facts are organized.

A)
Access
B)
Data
C)
Integration
D)
Staging

Correct Answer :   Integration


Explanation : The integration layer transforms the data from the staging layer and stores it in a database.

A)
Staging layer
B)
Staging database
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation : Data extracted from different sources is stored in a staging database or staging layer.

A)
Data recovery
B)
Build reports
C)
Build and populate data
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are the stages of the ETL testing :
 
* Data recovery
* Build and populate data
* Build reports

A)
Metadata Testing
B)
New Data Warehouse Testing
C)
Application Upgrade
D)
Production Validation Testing

Correct Answer :   New Data Warehouse Testing


Explanation : Customer requirements and different sources of data are taken into account in New Data Warehouse Testing.

A)
QA Testers
B)
Business Analyst
C)
Infrastructure People
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following are the group that plays responsibility in testing New Data Warehouses :

* QA Testers
* Business Analyst
* Infrastructure People

A)
Kettle
B)
AWS Glue
C)
Clover ETL
D)
Jasper ETL

Correct Answer :   AWS Glue


Explanation : AWS Glue is a cloud-based ETL tool.

A)
Each module is tested by a Business Analyst.
B)
The test environment is set up by Business Analyst people.
C)
Requirements are gathered and documented by the business analyst.
D)
These plans and scripts are developed by Business Analysts and then executed by them.

Correct Answer :   Requirements are gathered and documented by the business analyst.

A)
Users
B)
Developers
C)
QA Testers
D)
Infrastructure People

Correct Answer :   QA Testers


Explanation : The QA tester develops test plans and scripts and executes these plans and scripts.

A)
Database Administrators
B)
Users
C)
Developers
D)
Infrastructure People

Correct Answer :   Database Administrators


Explanation : Performance and stress tests are conducted by Database Administrators.

A)
Calculating
B)
Changing Data
C)
String Manipulation
D)
All of the above

Correct Answer :   All of the above


Explanation : An ETL tool simplifies calculating, string manipulation, changing data, and integrating multiple data sets when complex rules and transformations are involved.

A)
Natural
B)
Built-in
C)
Artificial
D)
None of the above

Correct Answer :   Built-in


Explanation : Data engineers can develop a successful and well-instrumented system with ETL tools that have built-in error handling.

A)
Visual Flow
B)
Ease of Use
C)
Operational Resilience
D)
All of the above

Correct Answer :   All of the above


Explanation :

The benefits of ETL tools are :

* Visual Flow
* Ease of Use
* Operational Resilience

A)
User Analyst Testing
B)
User Attribute Testing
C)
User Acceptance Testing
D)
None of the above

Correct Answer :   User Acceptance Testing


Explanation : The full form of UAT is User Acceptance Testing.

A)
Metadata
B)
Data Accuracy
C)
Production Validation
D)
Source to Target

Correct Answer :   Production Validation


Explanation : Whenever data is moved into production systems, production validation tests are performed.

A)
Informatica Data Validation
B)
Irrelevant Data Validation
C)
Irrelevant Duration Validation
D)
Informatica Duration Validation

Correct Answer :   Informatica Data Validation


Explanation : To ensure that the data don't compromise production systems, Informatica Data Validation automates ETL testing and management.

A)
Metadata
B)
Source to target
C)
Data Accuracy
D)
Data Transformation

Correct Answer :   Source to target


Explanation : Source-to-target testing validates that the transformed data values match the expected data values.
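A minimal sketch of the idea, assuming a documented mapping rule that the source price string is cast to a number (all names and values are hypothetical):

```python
source = [{"id": 1, "price": "10.00"}, {"id": 2, "price": "5.50"}]
target = {1: 10.0, 2: 5.5}  # what the ETL job actually loaded

def expected(row):
    # Re-apply the documented transformation rule to the source value.
    return float(row["price"])

mismatches = [r["id"] for r in source if target.get(r["id"]) != expected(r)]
print(mismatches)  # [] -- every loaded value matches the expected value
```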

A)
Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
B)
A few ETL tools support BI tools and functionality such as transformation scheduling, monitoring, and version control.
C)
Multiple data structures and different platforms, such as mainframes, servers, and databases, can be collected, read, and migrated using ETL tools.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The requisites provided by ETL tools are :
 
* Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
* A few ETL tools support BI tools and functionality such as transformation scheduling, monitoring, and version control.
* Multiple data structures and different platforms, such as mainframes, servers, and databases, can be collected, read, and migrated using ETL tools.

A)
There will be a delay of months before any ETL testing can be done.
B)
On-demand or real-time access is not ideal when we need a fast response.
C)
ETL testing has the disadvantage of requiring us to be database analysts or developers with data-oriented experience.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The disadvantages of ETL Testing are :
 
* There will be a delay of months before any ETL testing can be done.
* On-demand or real-time access is not ideal when we need a fast response.
* ETL testing has the disadvantage of requiring us to be database analysts or developers with data-oriented experience.

A)
During ETL testing, data can be extracted from or received from any data source simultaneously.
B)
In ETL, heterogeneous data sources can be loaded into a single generalized (frequent)/different target simultaneously.
C)
It is possible to load different types of targets simultaneously using ETL.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The advantages of ETL Testing are :
 
* During ETL testing, data can be extracted from or received from any data source simultaneously.
* In ETL, heterogeneous data sources can be loaded into a single generalized (frequent)/different target simultaneously.
* It is possible to load different types of targets simultaneously using ETL.

A)
Data Quality
B)
Data Accuracy
C)
Application Upgrades
D)
Data Transformation

Correct Answer :   Application Upgrades


Explanation : Tests are automatically generated for Application Upgrades, which saves test developers' time.

A)
Varied
B)
Identical
C)
Similar
D)
Different

Correct Answer :   Identical


Explanation : When an application is upgraded, the extracted data from the old application is checked against the new application's data to ensure that they are identical.

A)
Metadata
B)
Data Quality
C)
Data Accuracy
D)
Data Transformation

Correct Answer :   Metadata


Explanation : As part of metadata testing, data types, data lengths, and indexes and constraints are checked.

A)
Count Check
B)
Data Type Check
C)
Remove Duplicate Data
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following operations are involved in verifying the table in the source system (see the sketch after this list) :
 
* Count Check
* Data Type Check
* Remove Duplicate Data
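A minimal sketch of those three checks on hypothetical rows:

```python
rows = [(1, "a"), (2, "b"), (2, "b"), (3, 7)]

count_ok = len(rows) == 4                              # count check against the expected total
deduped = list(dict.fromkeys(rows))                    # remove duplicate data, keeping order
type_ok = all(isinstance(v, str) for _, v in deduped)  # data type check on the second column
print(count_ok, len(deduped), type_ok)                 # True 3 False -- row (3, 7) fails
```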

A)
Data Loading
B)
Apply Transformation Logic
C)
Verify the table in the source system.
D)
All of the above

Correct Answer :   All of the above


Explanation :

ETL testers have the following responsibilities:
 
* Data Loading
* Apply Transformation Logic
* Verify the table in the source system.

A)
Accuracy
B)
Reference
C)
Syntax
D)
Transformation

Correct Answer :   Syntax


Explanation : Invalid characters, invalid character patterns, or improper upper- or lower-case order will result in dirty data being reported by syntax tests.
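A minimal sketch of a syntax test, assuming a made-up format rule for product codes:

```python
import re

# Assumed rule: two upper-case letters, a dash, then three digits.
PATTERN = re.compile(r"^[A-Z]{2}-\d{3}$")

codes = ["AB-123", "ab-123", "AB_123", "AB-12x"]
dirty = [c for c in codes if not PATTERN.match(c)]
print(dirty)  # wrong case, an invalid character, and a bad pattern are all reported
```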

A)
Data Sources
B)
Apply Transformation
C)
Load the data into the target table
D)
All of the above

Correct Answer :   All of the above


Explanation : It is the ETL tester's responsibility to validate the data sources, apply transformation logic, load the data into the target table, and extract the data from the target table.

A)
Report
B)
Incremental ETL
C)
Migration
D)
GUI/Navigation

Correct Answer :   Incremental ETL


Explanation : A data integrity test is conducted for incremental ETL testing when new data is added to old data.

A)
Still Working
B)
Crashed
C)
Initiated
D)
Deadlocked

Correct Answer :   Still Working


Explanation : After data has been inserted and updated during an incremental ETL process, incremental testing verifies the system is still working properly.
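A minimal sketch of that check, simulating an incremental load over hypothetical before/after contents:

```python
before = {1: "alice", 2: "bob"}    # target contents before the incremental run
delta = {2: "robert", 3: "carol"}  # rows inserted or updated by the increment

after = {**before, **delta}        # simulate the incremental load

# Every delta row must be present afterwards, and no old key may disappear.
assert all(after[k] == v for k, v in delta.items())
assert all(k in after for k in before)
print(after)  # {1: 'alice', 2: 'robert', 3: 'carol'}
```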

A)
Front-end
B)
Back-end
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Front-end


Explanation : Front-end reports are tested for navigation and GUI aspects by GUI/Navigation Testing.

A)
Hardware
B)
Help Source
C)
Version Control
D)
Load Condition

Correct Answer :   Version Control


Explanation : Version control bugs are usually found during regression testing and do not indicate which version they came from.

A)
Race Condition
B)
Load Condition
C)
Version Control
D)
Equivalence Class Partitioning

Correct Answer :   Equivalence Class Partitioning


Explanation : Valid or invalid types are produced by equivalence class partitioning bugs.

A)
Hardware
B)
Calculation
C)
Race Condition
D)
Load Condition

Correct Answer :   Calculation


Explanation : Mathematical errors show up in calculation bugs, and the results are usually inaccurate.

A)
Calculation
B)
Race Condition
C)
Load Condition
D)
Boundary value analysis

Correct Answer :   Boundary value analysis


Explanation : Boundary value analysis bugs check the minimum and maximum allowed values.
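A minimal sketch, assuming a hypothetical rule that accepts ages from 0 to 120 inclusive:

```python
def boundary_values(lo, hi):
    # Classic boundary picks: just below, at, and just above each limit.
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def accepts_age(age):  # hypothetical rule under test
    return 0 <= age <= 120

for v in boundary_values(0, 120):
    print(v, accepts_age(v))  # -1 and 121 must be rejected, the rest accepted
```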

A)
Input-output
B)
Calculation
C)
Load Condition
D)
Boundary value analysis

Correct Answer :   Input-output


Explanation : An input-output bug causes the application to accept invalid values and reject valid ones.

A)
GUI
B)
Report
C)
Migration
D)
Incremental ETL

Correct Answer :   Migration


Explanation : An existing data warehouse is used in Migration Testing, and ETL is used to process the data.

A)
Executing the validation test
B)
Design the validation tests
C)
Setting up the test environment
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following steps are included in Migration Testing : 

* Executing the validation test
* Design the validation tests
* Setting up the test environment

A)
Color
B)
Navigation
C)
Font Style
D)
All of the above

Correct Answer :   All of the above


Explanation : Color, font style, navigation, spelling check, and other issues related to the Graphical User Interface of an application are examples of User Interface bugs.

A)
Source-to-target mapping
B)
Analyzes the source data for errors
C)
The ability to understand and report on data
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following tasks are performed in ETL testing :
 
* Source-to-target mapping
* Analyzes the source data for errors
* The ability to understand and report on data

A)
Database, ELT
B)
Database, ETL
C)
ETL, Database
D)
ELT, Database

Correct Answer :   Database, ETL

A)
Mapping Sheet
B)
DB Schema of Target
C)
DB Schema of Source
D)
None of the above

Correct Answer :   Mapping Sheet


Explanation : A mapping sheet contains all the columns and their lookups in reference tables for both source and destination tables.

A)
ETL Mapping Sheets
B)
DB Schema of Source (Target)
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)


Explanation :

The document(s) that the ETL tester always uses during the testing process is/are :
 
* ETL Mapping Sheets
* DB Schema of Source (Target)

A)
It can compare the input with the target.
B)
It can compare the output with the target.
C)
It can't compare the output with the target.
D)
It can't compare the input with the target.

Correct Answer :   It can't compare the output with the target.


Explanation : A single SQL query cannot validate the transformed data because it cannot compare the output with the target.

A)
Comparing
B)
Examining
C)
Differentiating
D)
None of the above

Correct Answer :   Comparing


Explanation : Comparing the distinct values of critical data columns in the source and target systems is a good way to verify the integrity of critical data columns.
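A minimal sketch of that comparison on a hypothetical status column:

```python
source_status = ["NEW", "OPEN", "closed", "OPEN"]
target_status = ["NEW", "OPEN", "CLOSED"]

only_in_source = set(source_status) - set(target_status)
only_in_target = set(target_status) - set(source_status)
print(only_in_source, only_in_target)  # {'closed'} {'CLOSED'} -- a case-mapping defect
```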

A)
Database
B)
SQL
C)
Relational
D)
None of the above

Correct Answer :   SQL


Explanation : The data accuracy of both the source and the target can be checked with a set of SQL operators.
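A minimal sketch using SQLite's EXCEPT set operator (MINUS in some databases) to compare made-up source and target tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INT, amount REAL);
    CREATE TABLE tgt (id INT, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 5.5);
    INSERT INTO tgt VALUES (1, 10.0), (2, 9.9);
""")
# EXCEPT reports rows present in src but missing (or different) in tgt.
diff = conn.execute("SELECT * FROM src EXCEPT SELECT * FROM tgt").fetchall()
print(diff)  # [(2, 5.5)] -- row 2 was loaded inaccurately
```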

A)
Keys
B)
Joins
C)
Both (A) and (B)
D)
None of the above

Correct Answer :   Both (A) and (B)

A)
Valid
B)
Correct
C)
Accurate
D)
All of the above

Correct Answer :   All of the above


Explanation : In database testing, the focus is on ensuring that data is accurate, correct, and valid.

A)
Value Compromise
B)
Value Comparison
C)
Value Contraction
D)
Value Compression

Correct Answer :   Value Comparison


Explanation : A value comparison compares the data between a source system and a target system without transforming the data in either system.

A)
Testing the database verifies if the column has any missing data.
B)
During database testing, data values in columns are verified to ensure they are valid.
C)
Tests are conducted on databases to determine whether primary or foreign keys are maintained.
D)
All of the above

Correct Answer :   All of the above


Explanation :

The following operations are performed during database testing:

* Testing the database verifies if the column has any missing data.
* During database testing, data values in columns are verified to ensure they are valid.
* Tests are conducted on databases to determine whether primary or foreign keys are maintained.