Monday, October 24, 2011

OPTIM - FAQ's -- III

Optim can convert critical information such as Social Security numbers (TRANS SSN), credit card numbers (TRANS CCN), and e-mail IDs (TRANS EML). Several other functions are available in Optim's data privacy area that provide various masking capabilities. For example, some of the functions include:
LOOKUP, a function that uses a lookup table to determine the destination column value

HASH_LOOKUP, a function that determines the destination column value from a lookup table according to a value derived from a source column

RAND_LOOKUP, a random lookup function that selects a value for the destination column from a lookup table based on a random number as the subscript in the lookup table

TRANS SSN flags and their descriptions
Flags   Description
n          Generates a random area number and an appropriate group number and serial number.
r          Generates a random area number corresponding to the state of the source SSN and an appropriate group number and serial number.
v          Validates the source group number to ensure it has actually been issued for the source area number.
-           Generates an SSN with dashes separating the fields (for example, 123-45-6789). Requires a character-type destination column at least 11 characters long.
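As a rough illustration, a Column Map entry combining two of these flags might look like the sketch below. The argument form is an assumption (the exact TRANS SSN syntax can vary by Optim release, so confirm it against the product documentation), and CUST_SSN is a hypothetical source column name:

TRANS SSN('r-', CUST_SSN)

Here 'r' keeps the generated area number consistent with the state of the source SSN, and '-' formats the output with dashes, which requires a character-type destination column at least 11 characters long.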

The TRANS CCN function is used to generate valid and unique credit card numbers (CCNs). A CCN usually consists of a six-digit issuer identifier followed by a variable-length account number and a single check digit as the final number. By default, TRANS CCN algorithmically generates a consistently altered CCN, but it can also generate a random value for the CCN.

The execution steps followed in this section (used to mask credit card numbers) are similar to the steps followed during the execution of the SSN scenarios. The three scenarios described in this section differ from each other only with respect to the source file, control file, destination file, and flags used in the TRANS CCN function.

TRANS CCN flags and their descriptions

Flags   Description
n          Generates a random CCN, not based on a source value, that includes a valid issuer identifier associated with American Express, MasterCard, VISA, or Discover.
r          Generates a random CCN that includes the first four digits of the source issuer identifier.
6          Generates a random CCN that includes the first six digits of the source issuer identifier.
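As a hedged example (again, the argument form is an assumption and CUST_CCN is a hypothetical column name), a Column Map entry that keeps the full issuer identifier recognizable might look like:

TRANS CCN('6', CUST_CCN)

The '6' flag preserves the first six digits of the source issuer identifier while the rest of the number is altered, and the generated CCN still ends in a valid check digit.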

TRANS EML

The TRANS EML function is used to generate an e-mail address. An e-mail address has two parts—the user name and the domain name, separated by the at sign (@).

The TRANS EML function generates e-mail addresses based on either the destination data or a literal concatenated with a sequential number. The domain name can be formed using either the source data or randomly chosen from a long list of e-mail service providers. E-mail addresses can also be converted to upper or lower case.

The execution steps followed in this section (where the scenarios work on e-mail accounts) are similar to the steps followed during the execution of the SSN scenarios. Hence, only one usage example is given here; the rest can be executed the same way, changing only the flag values.

TRANS EML flags and their descriptions
Flags   Description
n          Ignores the source value and generates an e-mail address with a random domain name from a list of large e-mail service providers.
.           Separates the name1col and name2col values with a period (.).
-           Separates the name1col and name2col values with an underscore (_).
i           Uses only the first character of the name1col value.
l           Converts the e-mail address to lower case.
u          Converts the e-mail address to upper case.
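Combining a few of these flags, a sketch of a Column Map entry might look like the following; the argument form and the FIRST_NAME/LAST_NAME column names are assumptions for illustration only:

TRANS EML('.l', FIRST_NAME, LAST_NAME)

With these flags, the user name is built from the two name columns separated by a period and the whole address is converted to lower case (for example, john.smith@...); adding the 'n' flag would instead generate an address with a random domain name from the list of large e-mail service providers.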

Note: Some of the articles are taken from various websites/blogs.

Saturday, October 22, 2011

OPTIM - FAQ's -- II

We created a SQL Server DB Alias with a user that even has database owner privileges. Packages were created without any problem, but when we try to define an Access Definition in Optim and choose a table to work on, it displays "no data available", as if it doesn't see any tables.

Ans: Can you load/drop sample tables from the Optim Configuration module?

Step 1: Launch the Optim Configuration program
Step 2: Task --> Load/Drop Sample Tables
Step 3: Choose the respective Optim Directory and DB Alias, using the correct credentials
Step 4: In the Load/Drop Sample Tables window, make sure you correctly specify the schema name and tablespace info (as applicable)
Step 5: Press the "Proceed" button to load the sample tables, then close the Configuration module
Step 6: Now launch the Optim tool, try to create an Extract or Access Definition, and check whether the default qualifier works

Ans2: It might be that the metadata is not registering in Optim.

Remember to disconnect/reconnect to the Optim Directory after DB Alias creation or alternatively restart the Optim Workstation program.

Optim refreshes its metadata when it connects to the Optim Directory. So, if you have Optim open and define a DB Alias in the Optim Configuration program, you need to reconnect to see the tables.

Ans3: Indeed, it seems to be a metadata refresh problem. Quitting the Optim Configurator after DB Alias creation (to make sure the update is persisted) and then starting Optim does the trick.


How to verify the Optim version/fixpack/patches/releases on the Optim Server side (AIX).

I have not documented the changes we made to Optim (I mean, after applying fixpacks).

Now I want to verify that the Prod/Test/Dev environments are running with the same version/patch/fixpack.

Ans: You can check the rtbuild.h file under the $PSTHOME/bin folder (the HOME folder of the Optim installation) on the AIX side.
Trace log files under $PSTHOME/temp also give release and build number information.

On Windows, you can find the rtbuild.h file under the Optim folder (e.g., C:\Program Files\IBM Optim\rt\BIN). You can also check the "About Optim" option in the Help menu of the Optim tool on Windows for release and build information.

Ans2: Look in /optim/7.2.1/rt/bin:

rtbuild.h

#define RTBUILD_SEQUENCE 3125
#define RTBUILD_SEQUENCE_STR "3125"

RTBUILD_FP01.h

#define RTBUILD_SEQUENCE 3125
#define RTBUILD_SEQUENCE_STR "3125"
#define RTBUILD_FIXPACK "01"

In the above scenario, my Optim version is 7.2.1, Build 3125, Fixpack 01.
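To pull just these values without opening the files, a quick grep on the AIX side should do the trick (assuming the installation path shown above):

cd /optim/7.2.1/rt/bin
grep RTBUILD rtbuild.h RTBUILD_FP01.h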

Note: Some of the articles are taken from various websites/blogs.

OPTIM - FAQ's -- I

1. Does the TRANS SSN function generate unique values?

Ans: Yes, for every unique input value it generates a unique output value.


2. If the same SSN appears more than once when using the TRANS SSN function to mask data, will the occurrences be replaced by different SSNs, or by the same SSN for all?
If the same for all, does the TRANS SSN function create a lookup table?
If it creates a lookup table, where is that lookup stored?

Ans: For the same input value, TRANS SSN will generate the same masked value, unless random masking ('n') is specified. So if you have the same SSN multiple times, you will get the same output each time, and TRANS SSN will not create a lookup table.


3. If two different workstations use two different DB Aliases to connect to the same database (and mask the same table, which has SSNs), and we use the TRANS SSN function, will both create the same SSNs or different SSNs?

Ans: It will generate different SSNs, on a first-come-first-served basis: the first DB Alias will get to mask the data, and the second DB Alias will get the already-masked data of the first DB Alias.

            It is similar to two members updating the same table at the same time.


4. I have Table1 and Table2, both of which have SSNs. I masked Table1 first and created a masked table; a couple of days later, I masked Table2. If Table1 and Table2 have matching SSN rows, will the same masked SSN be created for both tables?

Ans: The 'r' flag means a random area number is generated, so you will not get a unique output for a unique input. But this should not cause rows with the same SSN in a table to be skipped.

Please answer a few questions:
1. Does your destination table column have a unique constraint set on it?
2. Are all the rows containing the same SSN skipped, or can at least one row be seen in the destination table?
3. What does the control file say for rows containing the same SSN?

Ans: The answers below are based on using TRANS SSN with the 'r' flag.

1. No unique indexes/constraints. Both source and destination columns are defined as CHAR(20).
2. All rows with the same SSN are skipped - I used a Compare Request and ran some queries against the database.
3. The control file says something like "No Dest".


Note: Some of the articles are taken from various websites/blogs.

Saturday, October 8, 2011

Optim - Overview -- IV

About Archive Files

An Archive File contains the selected, relationally intact data described in an Access Definition and the object definitions needed to re-create the database, if necessary.
The Archive File is stored locally on the client system or on a shared file server, or can be stored on secondary media such as a zip disk or a backup device. Archive Directory entries provide general information about each Archive File and any Archive Indexes for the file.
After you have extracted data and created an Archive File, you can browse the contents of the file to answer questions or satisfy a customer request. You can search Archive Directory entries, Archive Index Files, and Archive Files, using criteria for the files and data of interest. If necessary, you can restore selected data from the Archive File.

 

About Compare Requests

Use the Compare Process to compare sets of relational data from several databases and create a Compare File that allows you to browse the results.
A Compare Request identifies the sources of the data to compare and the parameters needed to run the comparison. When you create a Compare Request, you must specify the:
  • Compare File to store the compared rows of data.
  • Comparison mode: Single Table Compare Mode or Multiple Tables Compare Mode.
  • Run mode for extracting the source data from two or more tables.
  • Source tables for the data to compare. Define a Column Map or Table Map appropriate for the selected comparison mode.
  • Primary Key or Match Key to match rows from Source 1 with those in Source 2.

Table Maps and Column Maps in a Process Request

When you create a request to use Move (to convert, insert, or load data), Archive (to restore data using an Insert or Load Process), or Compare (to compare data), you can specify a Table Map and optional Column Maps to have more control over the data you want to process.

Convert, Insert, or Load Data

When you create a request to convert, insert, or load data, Table Maps provide a way to correlate tables from a Source File with the destination tables in the same or a different database. You can modify destination table names or exclude tables from a process.

Note: Some of the articles are taken from various websites/blogs.

Optim - Overview -- III

DB Alias: A DB Alias is a set of specifications used to identify, locate, and access a particular database.
The DB Alias serves as a high-level qualifier for names of database objects referenced, defined, or accessed using Optim, for example, dbalias.creatorid.tablename.
The specifications for defining a DB Alias are usually provided by the database administrator.

 

About DB Aliases

A DB Alias is a user-defined object associated with a database.
When you define a DB Alias, you provide parameters that Optim™ uses to communicate with that database. These parameters include the type and version of the database management system (DBMS) and any required connection specifications.
[Diagram: Production data is cloned and masked ("mask in place") into a Gold Master, which then feeds the Development, Integrated Test, QA, and Training environments.]

Data De-identification
Definition: Removing, masking, or transforming elements that could be used to identify an individual.
Also known as: data masking, depersonalization, desensitization, obfuscation, data scrubbing.
  • Technology that helps conceal real data: name, address, telephone, SSN/national identity number, credit card #, etc.
  • Scrambles data to create new, legible data.
  • Retains the data's properties, such as its width, type, and format.
  • Common data masking algorithms include random, substring, concatenation, and date aging.
  • Used in non-production environments as a "best practice" to protect sensitive data.
  • Masked data must be appropriate to the context: within the permissible range of values and application-aware.

Note: Some of the articles are taken from various websites/blogs.

QuickBase - Overview

QuickBase Applications

QuickBase's versatility sets it apart by giving you a single platform for improving many functions in your organization; the list of strong applications for QuickBase is long and growing. Applications include:

*                       Enterprise Project Management (PMO: Project Management Offices)

*                       Enterprise KPI (Key Performance Indicator) tracking and reporting

*                       Sales and Customer Management (CRM)

*                       Sarbanes-Oxley audit tracking

*                       Program and Affiliate Tracking

*                       Procurement and Marketing Workflow systems (Including RFQ and Contracts generation and tracking)

*                       Property, Asset, Personnel and other Resource Tracking and Assignment

*                       Capital Expenditure Budgeting and Tracking

*                       Online surveys with real-time results reporting


QuickBase applications solve critical business problems and help improve productivity and efficiency because they are tailored by the user to match the exact workflow and unique needs of their team - something complex point solutions or generic spreadsheets simply can't match.

  • Capture and modify data: Whatever kind of data you need to store--sales leads, catalog listings, project milestones, workflow checklists--you can use QuickBase's forms to record and organize that data so it makes sense to you.
  • Filter, sort, and group data: Easily find the records that match your criteria, and then sort those records into groups that make their relationships clear.
  • Display your data: QuickBase uses different views (Table, Grid Edit, Summary/Crosstab, Calendar, Chart, and Timeline) to display and summarize data. Switching between them is easy, like taking tasks listed in a table and displaying them as a timeline.
  • Create reports: Print out a hard copy, embed charts in the annual report, or email this month's sales numbers.


Note: Some of the articles are taken from various websites/blogs.

Thursday, October 6, 2011

Optim - Overview -- II

Table Map

A Table Map defines specifications for correlating source and destination tables of compatible data.
You can map tables that have different names, modify table names, exclude tables from a process, or include Column Maps for greater control over the data.
You can specify Table Maps in a request to compare, convert, insert, load, or restore data, or with the Create Utility.

Using the Command Line Interface

The Command Line Interface (PR0CMND.exe) allows you to automate Optim™ processing:
  • Run process requests to archive, convert, delete, extract, insert, load, or restore data.
  • Run multiple process requests in a scheduled job.
  • Browse and edit data in the Table Editor.

How to Run PR0CMND

  • You can run PR0CMND.EXE from the command line, a batch file, or another program.
  • You must be in the application software BIN directory or specify the path for this directory on the command line.

Syntax Guidelines

The typical command consists of PR0CMND followed by the command-line keywords and associated arguments. The following guidelines apply:
  • Command-line keywords can be specified in any order, separated by one or more spaces, but no commas.
  • Keywords can be specified in mixed case. (Most keywords are shown bold and uppercase in the help for emphasis only).
  • Command-line keywords and associated arguments must be separated with an equal sign (=) or a colon (:).
  • An override keyword must be separated from the associated argument with a blank space.
  • Keywords may be prefixed by a forward slash (/) or a dash (-). For example, all of the following are equal:
PST=MARKET
/PST=MARKET
-PST=MARKET

  • The first operation argument must be prefixed by a forward slash (/) or a dash (-), for example:
/X or -X
Browse
/R or -R
Run
/E or -E
Edit

  • An argument associated with a command-line keyword that includes spaces must be enclosed in single or double quotes.
  • When overrides are specified for a process request defined in a parameter file, the command-line keyword indicating that overrides are used (OV) must follow other command-line keywords and precede the override parameters. The first override keyword and associated argument must begin on the following line, and each additional override must be on a separate line. The override keyword END must follow the last override, and must also be on a separate line.
Note: You can also specify a variety of overrides for each process request or specify overrides that apply to browsing and editing data.
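Putting these guidelines together, a run might look like the sketch below. PR0CMND, /R, PST, OV, and END come from the guidelines above, while PARMFILE and EXTRACTFILE are hypothetical stand-ins for the actual parameter-file and override keywords in your release (check the command-line reference for the exact names):

PR0CMND /R PST=MARKET PARMFILE="C:\Optim Requests\nightly.txt" OV
EXTRACTFILE nightly.xf
END

Note how the PARMFILE argument is quoted because it contains a space, the override keyword is separated from its argument by a blank, and the override list is terminated by END on its own line -- each per the rules above.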

Ref: IBM Guide
Note: Some of the articles are taken from various websites/blogs.

Optim - Overview -- I

Access Definition

An Access Definition specifies the Start Table, related tables, relationships, and selection criteria to define the data you want to archive, extract, edit, or compare.
Note: You can save and reuse Access Definitions, which are stored in the Optim™ Directory.

Use Access Definitions to

  • Manage and maintain lists of tables and relationships that describe the data you want to archive, extract, or edit.
  • Specify the selection criteria you want to use to archive or extract data.
  • Specify variables and default values to use in your selection criteria.
  • Select columns and sort options to display for Point and Shoot.
  • Use Point and Shoot to narrow the focus for archiving or extracting data.
Because Access Definitions provide flexibility in managing your data, it is easy to:
  • Create new Access Definitions and modify or reuse existing definitions.
  • Share Access Definitions with other users.
  • Include a named or local (temporary) Access Definition with an Archive Request, Edit Definition, or Extract Request.

Column Map

A Column Map defines the specifications for mapping columns of compatible data between source and destination tables.
You can map columns that have different names, modify column data, or exclude columns from a process.
You can include one or more Column Maps in a Table Map when you create a request to compare, convert, insert, load, or restore data.
Note:
  • If you define a Column Map in a Convert, Insert, Load, or Restore Request, you can specify the source column value, an explicit column name, NULL, constant, literal, function, expression, special register, or exit routine.
  • If you define a Column Map in a Compare Request, you can map only explicit column names.

Column Map Procedure

A Column Map Procedure is a custom program that is referenced by a Column Map and is written and compiled using the Column Map Proc Editor.
These programs are used in a Convert, Insert, Load, or Restore Request for special processing and data manipulation that is beyond the scope of native Column Maps. You can use a Column Map Procedure by specifying the name of the procedure as the source column.

DB Alias

A DB Alias is a set of specifications used to identify, locate, and access a particular database.
The DB Alias serves as a high-level qualifier for names of database objects referenced, defined, or accessed using Optim™, for example, dbalias.creatorid.tablename.
The specifications for defining a DB Alias are usually provided by the database administrator.

About DB Aliases

A DB Alias is a user-defined object associated with a database.
When you define a DB Alias, you provide parameters that Optim™ uses to communicate with that database. These parameters include the type and version of the database management system (DBMS) and any required connection specifications.
Note: DB Aliases are defined using the Configuration Program and are stored in the Optim Directory. You must be authorized to define a DB Alias. See the Installation and Configuration Guide for additional information.
A DB Alias name serves as a high-level qualifier that allows you to access a specific database to perform requested functions. For example, in an Access Definition, you must qualify the name of a table with a DB Alias name. The referenced DB Alias supplies the parameters needed to connect to the database in which the table resides.

Databases

A DB Alias identifies a specific database and serves as a prefix in the fully qualified names of primary keys, database tables, and relationships. DB Aliases are essential elements in managing your databases. Keep in mind that each:
  • Database can have only one DB Alias.
  • DB Alias name must be unique.
Also, objects in the Optim Directory cannot have the same name as a DB Alias.

Ref: IBM Guide and Materials (www.ibm.com)

Note: Some of the articles are taken from various websites/blogs.

Tuesday, September 20, 2011

IBM - OPTIM - II

  • InfoSphere Data Architect: Simplifies data modeling and integration design, enabling architects to discover, model, relate, and standardize diverse and distributed data assets.
  • Optim Development Studio: Provides a complete development and testing environment for building database objects, queries, database logic, and Java™ pureQuery applications.
  • Optim pureQuery Runtime: Delivers a high-performance data access platform that provides the capability to visualize, optimize, and lock down SQL without modifying the application. Complements Optim Development Studio by enabling pureQuery support in production systems, including support for static execution of .NET and Java applications.
  • Optim Database Administrator: Manages complex structural changes while preserving data, privileges, dependent objects, and application bindings.
  • Optim High Performance Unload: Formerly known as DB2 High Performance Unload, provides a high-speed unload utility as an alternative to the export feature. Whether used alone or with Optim Database Administrator, this product significantly reduces the time required to migrate databases.
  • DB2 Merge Backup: Shortens recovery time on production servers, ensures full backups are available as needed, and enables organizations to reduce resources and outages for full backups.
  • DB2 Recovery Expert: Localizes recovery processes to impacted objects in order to reduce outages without having to resort to full database recovery.
  • Optim Performance Manager offerings: Provides a comprehensive and proactive performance-management solution to help you identify, diagnose, solve, and prevent database performance issues for applications that access DB2 databases on Linux®, UNIX®, or Windows®.
  • Optim Query Tuner offerings: Includes single-query tuning for both DB2® for z/OS® and DB2 for Linux, UNIX, and Windows. For DB2 for z/OS, you can extend the capability to entire SQL workloads using Optim Query Workload Tuner for DB2 for z/OS (formerly DB2 Optimization Expert for z/OS). These products provide a set of advisors and tools to help make it easier to tune SQL.
While there are more products that are part of the lifecycle management portfolio, including InfoSphere Optim Test Data Management solutions and InfoSphere Optim Data Growth solutions, this article focuses only on the listed products.


To learn more about IBM Integrated Data Management solutions by Optim, visit these links:
IBM documentation --  Tutorials for Optim Development
http://www.ibm.com/software/data/optim/
http://www.ibm.com/software/data/optim/development-studio/
http://www.ibm.com/developerworks/spaces/optim/
http://www.ibm.com/developerworks/data/products/devstudio/
http://www.ibm.com/developerworks/data/library/techarticle/dm-1105optimquerytuner2/index.html
http://publib.boulder.ibm.com/infocenter/idm/v2r2/index.jsp?topic=%2Fcom.ibm.datatools.dwb.tutorial.doc%2Ftopics%2Fdwb_introduction.html
http://www.ibm.com/developerworks/data/library/techarticle/dm-0909optimpackaging/index.html
http://www.ibm.com/developerworks/data/library/techarticle/dm-1107extendedinsight/index.html

Ref: www.ibm.com

Note: Some of the articles are taken from various websites/blogs.

IBM - OPTIM - I

IBM Optim Database Management Professional Edition
InfoSphere Optim is the family name of a portfolio of products that delivers on the IBM vision of managing data across your data’s lifecycle. This portfolio is designed to increase productivity for all roles throughout the data life cycle and to help reduce the complexity and cost of managing database applications. InfoSphere Optim offerings help you manage and optimize tasks in each phase of the application life cycle from a data management viewpoint. This article focuses on an overview of some of these offerings and how the products are packaged.
                   
Table 1. Information governance core disciplines: lifecycle management

Phases: Discover and define | Develop and test | Monitor and optimize | Consolidate and retire

Products: InfoSphere Discovery, IBM Business Glossary, InfoSphere Data Architect, Optim Development Studio, Optim Database Administrator, Optim Test Data Manager and Data Privacy, Optim Performance Manager, Optim Query Tuner, Optim pureQuery Runtime, DB2 Merge Backup, DB2 Recovery Expert, Optim Data Growth, Optim Application Consolidation solutions

The following list gives a high-level description of the products highlighted in Table 1.

Quicklinks for z/OS developers and database administrators

For many years, IBM has had a great set of products to manage DB2 for z/OS data and applications. Over time, more capabilities for z/OS will be delivered in these products, as well as in products in the Optim portfolio. Of the products described in this article, the following are currently enabled for z/OS databases:
Although not described in this article, the IBM Optim set of solutions also has great support for z/OS to manage data growth, to protect data privacy, and to streamline test data.

Ref: http://www.ibm.com/
Note: Some of the articles are taken from various websites/blogs.

Thursday, September 15, 2011

DS Related Useful Links - I

1. http://publib.boulder.ibm.com/infocenter/iisinfsv/v8r5/index.jsp?topic=/com.ibm.swg.im.iis.bpdir.usage.doc/topics/bpdir_con_gui_overview.html


2. For Orc-DWH :: http://oracle-datawarehousing.blogspot.com/2011/02/data-warehousing-objects.html

3. For Unix Related :: http://www.dartmouth.edu/~rc/classes/ksh/

4. For Datastage :: http://datastage-tutorials.blogspot.com/2009/09/1.html

5. For DSxchange Site   :: http://www.dsxchange.com/portal.php

6. For WS_Quality Stage :: http://it.toolbox.com/wiki/index.php/WebSphere_QualityStage

7. http://www.dstagebox.com/cat/real-time/

8. http://www.duke-consulting.com/DwNav.htm

9. http://spulagam.blogspot.com

10. http://www.robvonk.com/datastage/standard-functions/

11. http://evergreendatastage.blogspot.com/2011/02/datastage-faqs.html


DS - Unix Related - II

ps -ef | grep <process-name>

ps -ef | grep dsrpc

ps -ef | grep phantom

ps -ef | grep osh

ps -ef | grep dscs

ps -ef | grep dsapi

kill -9 <process-id>

Check the permissions:
1. Log on to the server as dsadm or root
2. cd $DSHOME/bin
3. ls -l | grep rws

The output should be six files owned by root with the setuid bit set -- especially the uv file.


Check to ensure that there was a clean restart of the DataStage Engine.
Here are the instructions for a clean restart of DataStage Engine:


1. Log on to server as dsadm or root
2. cd $DSHOME
3. source the environment:


. ./dsenv --> that’s dot space dot slash dsenv

4. Check for DataStage client connections
ps -ef | grep dsapi_slave
ps -ef | grep dscs

If there are any connections found, use kill -9 to remove them.

5. Check for jobs processes:
ps -ef | grep phantom

If there are any connections found, use kill -9 to remove them.

6. Check datastage shared memory segments:
ipcs -mop | grep ade

If there are any shared memory segments returned, remove them using the command:
ipcrm -m ID, where ID is the number from the second column of the ipcs -mop output

7. Check if there is any port hung using:
netstat -a | grep dsrpc

If you get any results and the dsrpc port is in WAIT state, you need to wait until it disappears (you can also reboot the machine).

8. At this point you can stop or start the DataStage Engine.
bin/uv -admin -stop
bin/uv -admin -start
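For convenience, the checks in steps 4 through 7 can be bundled into a single pass before stopping the engine. This is just a sketch of the same commands listed above, run as dsadm or root:

cd $DSHOME
. ./dsenv
# client connections and job processes
ps -ef | grep -E 'dsapi_slave|dscs|phantom' | grep -v grep
# DataStage shared memory segments
ipcs -mop | grep ade
# hung dsrpc ports
netstat -a | grep dsrpc

If all three checks come back empty, it is safe to stop and start the engine as in step 8.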

DS - Unix Related - I

In UNIX, the DataStage home directory location will always be specified in the ".dshome" file, which is present in the root directory. Before you can run your DataStage commands, you will have to run the following commands:
cd `cat /.dshome`
This will change the location to the home directory. By default this will be /opt/IBM/InformationServer/Server/DSEngine
. ./dsenv > /dev/null 2>&1
This will run the dsenv file, which contains all the environment variables. Without doing this, your UNIX commands won't run on the command prompt.
After you have done this, you can use any DataStage command for interacting with the server. The main command is the 'dsjob' command, which is used not only to run jobs but for a wide variety of tasks. Let's look at the various ways you can use the dsjob command.
To run a job
Using the dsjob command you can start, stop, reset or run the job in validation mode.
dsjob -run -mode VALIDATE project_name job_name
This command will actually run the job in validation mode. Similarly, you can use RESET or RESTART instead of VALIDATE, depending on what type of run you want. If you want a normal run, you will not need to specify the -mode keyword, as shown below:
dsjob -run project_name job_name|job_name.invocationid
Running with the invocation id means the job will be run with that specific invocation id.
Now if you have parameters or parameter set values to set, these can also be set as shown below:
dsjob -run -param variable_name="VALUE" -param psParameterSet="vsValueSet" project_name job_name
To stop a job
Stopping a job is fairly simple. You might not actually require it, but it's still worth a look. It acts the same way as stopping a running job from the DataStage Director.
dsjob -stop project_name job_name|job_name.invocationid
To list projects, jobs, stages in jobs, links in jobs, parameters in jobs and invocations of jobs
dsjob can very easily give you all the above based on the different keywords. It will be useful for you if you want to get a report of what’s being used in what project and things like that
The various commands are shown below
'dsjob -lprojects' will give you a list of all the projects on the server
'dsjob -ljobs project_name' will give you a list of jobs in a particular project
'dsjob -lstages project_name job_name' will give you a list of all the stages used in your job. Replacing -lstages with -llinks will give you a list of all the links in your job. Using -lparams will give you a list of all the parameters used in your job. Using -linvocations will give you a list of all the invocations of your multiple instance job.
To generate reports of a job
You can get the basic information of a job by using the 'jobinfo' option, as shown below:
dsjob -jobinfo project_name job_name
Running this command will give you a short report of your job, which includes the current status of the job, the name of any controlling job, the date and time when the job started, the wave number of the last or current run (an internal InfoSphere DataStage reference number), and the user status.
You can get a more detailed report using the below command
dsjob -report  project  job_name BASIC|DETAIL|XML
BASIC means that your report will contain very basic information like start/end time of the job, time elapsed and the current status of the job. DETAIL as the name indicates will give you a very detailed report on the job down to the stages and link level. XML would give you an XML report which is also a detailed report in an XML format.
To access logs
You can use the below command to get the latest 5 fatal errors from the log of the job that was just run:
dsjob -logsum -type FATAL -max 5 project_name job_name
You can get different types of information based on the keyword you specify for -type. A full list of allowable types is available in the help guide for reference.
There are a number of other options available to get different log information, which you can explore in more detail in the developer guide. With the DataStage commands you can administer jobs, run jobs, maintain jobs, handle errors, prepare meaningful job logs, and even prepare reports. The possibilities are endless. If you like to code, then you won't mind spending your time exploring the command-line options available.
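To tie these pieces together, here is a minimal ksh wrapper -- a sketch only, where MYPROJ and MYJOB are hypothetical project and job names -- that sources the environment, runs a job, and dumps the latest fatal log entries if the run does not come back clean:

#!/bin/ksh
# locate the engine home and source the environment, as described above
cd `cat /.dshome`
. ./dsenv > /dev/null 2>&1

# run the job; -jobstatus makes dsjob wait for completion and set an exit code from the job status
bin/dsjob -run -jobstatus MYPROJ MYJOB
rc=$?

if [ $rc -ne 0 ]; then
    # on a non-clean status, pull the latest 5 fatal messages from the job log
    bin/dsjob -logsum -type FATAL -max 5 MYPROJ MYJOB
fi

The exact exit-code mapping for -jobstatus is in the developer guide, so adjust the test to the statuses you care about.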


Note: Some of the articles are taken from various websites/blogs.

Monday, September 12, 2011

Subject: Read it carefully.... A Troubled User...


Dear Tech Support Team,

Last year I upgraded from Girlfriend 5.0 to Wife 1.0.

I soon noticed that the new program began unexpected child-processes that took up a lot of space and valuable resources. In addition, Wife 1.0 installed itself into all other programs and now monitors all other system activities.

Applications such as
BachelorNights 10.3, Cricket 5.0, BeerWithBuddies 7.5, and Outings 3.6 no longer run, crashing the system whenever selected. I can't seem to keep Wife 1.0 in the background while attempting to run my favorite applications.

I'm thinking about going back to Girlfriend 5.0, but the 'uninstall' doesn't work on Wife 1.0.
Please help!
Thanks, "A Troubled User "

REPLY:
Dear Troubled User:
This is a very common problem that people complain about.
Many people upgrade from Girlfriend 5.0 to Wife 1.0, thinking that it is just a Utilities and Entertainment program. Wife 1.0 is an OPERATING SYSTEM and is designed by its Creator to run EVERYTHING!!!

It is also impossible to delete Wife 1.0 and to return to Girlfriend 5.0.
It is impossible to uninstall, or purge the program files from the system once installed.
You cannot go back to Girlfriend 5.0 because Wife 1.0 is designed not to allow this. (Look in your Wife 1.0 Manual under Warnings: Alimony/Child Support.)

I recommend that you keep Wife1.0 and work on improving the environment.
I suggest installing the background application "Yes Dear" to alleviate software augmentation.

The best course of action is to enter the command
C:\APOLOGIZE because ultimately you will have to give the APOLOGIZE command before the system will return to normal anyway.

Wife 1.0 is a great program, but it tends to be very high maintenance. Wife 1.0 comes with several support programs, such as
Clean 2.5, Sweep 3.0, Cook 1.5, and DoLaundry 4.2. However, be very careful how you use these programs. Improper use will cause the system to launch the program NagNag 9.5.

Once this happens, the only way to improve the performance of Wife 1.0 is to purchase additional software. I recommend Sarees 2.1 and Jewellery 5.0.

STATUTORY WARNING: DO NOT, under any circumstances, install SecretaryWithShortSkirt 3.3. This application is not supported by Wife 1.0 and will cause irreversible damage to the operating system.

Best of luck, Tech Support


