From:                              route@monster.com

Sent:                               Friday, May 06, 2016 1:58 PM

To:                                   pkumar@altusmeus.com

Subject:                          Candidate for review

 

This resume has been forwarded to you at the request of Monster User xgenesisngnx02

Vidyasagar Reddy 

Last updated:  05/06/16

Job Title:  not specified

Company:  not specified

Rating:  not specified

Screening score:  not specified

Status:  not specified


Springfield, NJ  07081
US

Mobile: 8148816660   
sagaredw@gmail.com
Contact Preference:  Mobile Phone


RESUME

  

Resume Headline: ETL Datastage Lead


  

 

 

                                                     Vidyasagar Reddy B S

 

Email: sagaredw@gmail.com                                                                Mobile: +1-814 881 6660

 

 

·       10 years of experience developing, testing, and maintaining Data Warehouse applications using ETL tools: IBM InfoSphere Information Server, IBM InfoSphere DataStage (Parallel Extender, versions 11.x/9.x/8.x), IBM InfoSphere QualityStage, IBM Information Analyzer, Metadata Workbench, Business Glossary, metadata management, and IBM Information Governance Catalog

·       Experience writing SQL, views, PL/SQL stored procedures, functions, packages, triggers, cursors, and partitioned tables

·       Experience in the Banking, Insurance (health and life), Healthcare, and Automobile domains

·       Experience with IBM Information Analyzer, Business Glossary, FastTrack, metadata, and lineage

·       Experience with UNIX, Oracle, Microsoft SQL Server, DB2, Teradata, and IBM Netezza

·       Experience in production support

·       Project lead for multiple projects

·       Experience in address verification and standardization, writing rule sets using IBM InfoSphere QualityStage

·       Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key analysis, and cross-domain analysis using IBM Information Analyzer

·       Configured and coded data quality validation rules within IBM Information Analyzer

·       Scheduled jobs through Autosys JIL

·       Extensive experience extracting data from ERP systems such as SAP HANA and SAP BW

·       Experience writing rule sets and survivorship rules using IBM InfoSphere QualityStage

·       Identified duplicate records using the Match stage in IBM InfoSphere QualityStage

·       Selected the best record of a match group using the Survive stage in IBM InfoSphere QualityStage

·       Standardized address, name, and area data using the Standardize stage in IBM InfoSphere QualityStage

·       Experience in word investigation and character discrete investigation using the Investigate stage

·       Experience with the Investigate, Standardize, and Match stages

·       Performed Information Analyzer administrative tasks such as managing logs, schedules, active sessions, and security roles

·       Experience in data analysis and modeling

·       Developed new change-request processes for managing metadata; key contributor to the initial setup of the Data Governance Framework

·       Provided training, testing, and user support for new Data Governance initiatives

·       Experienced in creating data lineage and impact analysis reports through IBM Information Governance Catalog and IBM Metadata Asset Manager

·       Experience in data modeling, design, and architecture review

·       Experience in UNIX and AIX scripting, including Korn shell

·       Imported/exported projects along with rules and bindings from one environment to another using Information Analyzer

·       Experience in the Hadoop stack (Hive, Pig)

·       Experience in estimation, planning, forecasting, and tracking for projects

·       Coordinated with multiple development teams to track project progress

·       Experience in all phases of the Software Development Life Cycle (process consulting, architecture, design, development, testing, deployment, and support)

·       Extensive experience in design and architecture of ETL interfaces and data marts

·       Familiar with all aspects of technology projects: business requirements, technical architecture, design specifications, development, and deployment

·       Proficient in data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture

·       Experience with ETL and data warehouse tools such as DataStage and Informatica

·       Experience in Agile methodology

·       Experience with Toad

·       Excellent knowledge of Business Intelligence, analytics, and optimization

·       Proficient in data warehouse architecture and design: star schema, snowflake schema, fact and dimension tables, and physical and logical data modeling using Erwin and Designer 2000

·       Extensive experience in loading high-volume data and performance tuning

·       Excellent interpersonal communication skills and ability to work effectively in a team environment

 

·       IBM InfoSphere DataStage V8.5 certified

 

 

ETL

IBM Information Analyzer, IBM WebSphere DataStage 11.5/9.x/8.x/7.5x2, QualityStage, Business Glossary, FastTrack, Metadata Workbench, metadata management

Databases

Teradata V13/V14, Oracle 12c/11g/10g/9i, DB2, Microsoft SQL Server, IBM Netezza, MySQL

Reporting

Tableau

Programming Languages

SQL, PL/SQL, shell scripting (Korn shell), XML, UNIX/AIX scripting

Operating System

UNIX, Linux, Windows NT/XP/2000/2003 Server

Scheduling

Autosys, crontab, Tivoli

Connectivity

Citrix, PuTTY, Hummingbird

Data modeling

Erwin

Version control

PVCS, AccuRev, TortoiseSVN

Big data/Hadoop

Hive, Pig, Sqoop, NoSQL, HBase, HDFS

 

 

 

Citibank, New Jersey, USA                                                                Jan '14 – Present

Role: Senior ETL Lead

 

 

The Control Data Warehouse Reporting project mainly deals with IT risk management, i.e., the business risk associated with the use, ownership, operation, involvement, influence, and adoption of IT within an enterprise or organization.

We take data from various source systems (Enterprise Risk Management, Information Security, Reference Data, O&T Risk Management, CTI, Enterprise Supplier Risk Management, and Continuity of Business), perform data integration and enrichment, and load the data into the data warehouse. Once the data is loaded, users perform monitoring, surveillance, and analytics using the Business Objects reporting tool.

 

Responsibilities:

·       Worked on full software development life cycle activities, including analysis, design, development, testing, deployment to production, and support.

·       Performed data modeling with Erwin.

·       Strong experience with IBM DataStage (Parallel Extender) 11.x tools: Administrator, Console, Designer, Director, Orchestrate, and QualityStage.

·       Experience utilizing relational data modeling techniques and tools such as Erwin.

·       Developed detailed ETL design documents, ETL specifications, and data mappings based on requirements.

·       Experience in IBM Information Server installation and configuration.

·       Experience writing Oracle stored procedures.

·       Wrote UNIX shell scripts to automate daily batch runs, handling catch-up runs by passing the last run date and last pull date and using infrastructure ETL run tables.

·       Scheduled jobs through Autosys.

·       Experience in address verification and standardization, writing rule sets using IBM InfoSphere QualityStage.

·       Strong experience with DataStage tools: Administrator, Designer, Director, and QualityStage.

·       Performed installation and migration of IBM WebSphere DataStage / IBM InfoSphere DataStage Enterprise Edition (PX) on UNIX platforms, and tested the latest version of DataStage.

·       Involved in documentation for design review, code review, and production implementation.

·       Experience writing rule sets and survivorship rules using QualityStage.

·       Experience in data quality (DQ) pilot use cases and validation parameters.

·       Experience with the Investigate, Standardize, and Match stages.

·       Identified duplicate records using the Match stage in IBM InfoSphere QualityStage.

·       Selected the best record of a match group using the Survive stage in IBM InfoSphere QualityStage.

·       Standardized address, name, and area data using the Standardize stage in IBM InfoSphere QualityStage.

·       Experience in word investigation and character discrete investigation using the Investigate stage.

·       Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key analysis, and cross-domain analysis using IBM Information Analyzer.

·       Configured and coded data quality validation rules within IBM Information Analyzer.

·       Proficient in data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture.

·       Performed Information Analyzer administrative tasks such as managing logs, schedules, active sessions, and security roles.

·       Experience working with IBM Netezza.

·       Analyzed data using InfoSphere Information Governance Catalog.

·       Imported/exported projects along with rules and bindings from one environment to another using Information Analyzer.

·       Experience installing Information Server.

·       Experience defining and designing data quality (DQ) solutions.

·       Experience in UNIX and AIX scripting, including Korn shell.

·       Produced low-level design documents for mapping files from source to target and implementing business logic.

·       Conducted review meetings and explained the process for new implementations.

·       Coordinated with team members and administered all onsite and offshore work packages.

·       Prepared documentation, including requirement specifications.

·       Worked with developers to troubleshoot and resolve issues in job logic and performance.

·       Conducted weekly status meetings.

·       Maintained the data warehouse by loading dimensions and facts.

·       Developed parallel jobs using development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Filter, Sort, Merge, Funnel, Remove Duplicates).

·       Generated surrogate keys for dimension and fact tables for indexing and faster data access in the data warehouse.

·       Debugged, tested, and fixed transformation logic applied in parallel jobs.

·       Implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.

·       Strong understanding and implementation of data warehousing concepts: schemas, Slowly Changing Dimensions, facts, and dimensions.

·       Deployed partitioning methods such as Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and performance.

·       Developed UNIX shell scripts to automate file manipulation and data loading procedures.

·       Tuned transformations and jobs for performance enhancement.

·       Read from and wrote to XML, IBM MQ, flat files, and mainframe sources.
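The catch-up automation above can be sketched in shell. This is a minimal illustration with hypothetical names, not the production script: an ETL run table would supply the last successful run date, and the wrapper replays one logical day at a time until the batch is caught up. Assumes GNU `date` (`-d`).

```shell
# Day after a YYYY-MM-DD date (GNU date).
next_day() {
    date -d "$1 + 1 day" +%F
}

# Replay daily loads from the day after $1 through $2 inclusive,
# printing the processing date each daily load would receive.
catch_up() {
    d=$(next_day "$1")
    while [ "$(date -d "$d" +%s)" -le "$(date -d "$2" +%s)" ]; do
        echo "running daily load for $d"   # real job invocation goes here
        d=$(next_day "$d")
    done
}
```

For example, `catch_up 2016-05-01 2016-05-03` replays the loads for 2016-05-02 and 2016-05-03, so a batch that missed a day resumes from where the run table left off.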

 

 

Environment: IBM InfoSphere DataStage 11.5, Information Analyzer, QualityStage, Metadata Workbench, Business Glossary, metadata management, Information Governance Catalog, SQL, Business Objects XIR2, DB2, Teradata, IBM Netezza, Microsoft SQL Server, Oracle, Autosys, PVCS, SAP, PL/SQL, TOAD, UNIX (AIX), MS Word, Excel, IBM InfoSphere Master Data Management Enterprise Edition, XML

 

 

Ford Motor Company, Dearborn, MI, USA                                                    Apr '13 – Dec '13

Role: ETL Lead

 

GPARTS is the Global Price and Revenue Targeting System, which enables FCSD Finance Parts Pricing Analysts to set the prices for all parts sold by FCSD for Ford of Europe (FoE). GPARTS gathers data from various source systems into the EDW. The integrated data from the EDW is extracted to a landing zone on Facility C. The flat files are picked up by the vendor, Marketing Associates (MA). This data supports the Vendavo tool, a part pricing and optimization tool hosted by MA to generate part prices. The prices are sent back to GPARTS, which uses them to create price lists shared with dealers and other downstream systems.

Responsibilities:

·       Developed detailed ETL design documents, ETL specifications, and data mappings based on requirements.

·       Involved in migration and implementation of DataStage code from development to test and production environments.

·       Used the DataStage Director to validate, schedule, and run jobs.

·       Created DataStage jobs to import data from heterogeneous data sources such as Oracle 9i, text files, and SQL Server.

·       Experience writing rule sets and survivorship rules using QualityStage.

·       Experience with the Investigate, Standardize, and Match stages.

·       Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key analysis, and cross-domain analysis using IBM Information Analyzer.

·       Configured and coded data quality validation rules within IBM Information Analyzer.

·       Performed Information Analyzer administrative tasks such as managing logs, schedules, active sessions, and security roles.

·       Imported/exported projects along with rules and bindings from one environment to another using Information Analyzer.

·       Extensively worked on job sequences to control job-flow execution using activities and triggers (conditional and unconditional): Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler, and Execute Command.

·       Performed unit testing, system integration testing, and user acceptance testing.

·       Created a shell script to run DataStage jobs from UNIX and scheduled that script through the scheduling tool.

·       Tuned job performance by interpreting the performance statistics of the developed jobs.

·       Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data, error handling, and analysis.

·       Experience in IBM MDM.

·       Developed a test plan covering release scope, entrance and exit criteria, and overall test strategy; created detailed test cases and test sets and executed them manually.

·       Worked with FastLoad and MultiLoad for loading data into Teradata.

·       Provided production support and performed enhancements on multiple existing projects.

·       Automated DataStage job runs to handle multiple-day catch-up for any missed daily runs.

·       Wrote a UNIX shell script to automate daily batch runs, handling catch-up runs by passing the last run date and last pull date and using infrastructure ETL run tables.

·       Validated current production data, existing data, and the programming logic involved.

·       Responsible for debugging ticketing issues in the production phase.
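Running a DataStage job from a UNIX script, as in the bullets above, typically wraps the `dsjob` command-line client (`dsjob -run -jobstatus <project> <job>` runs a job and waits for its final status). The sketch below uses placeholder project and job names, and a `DRY_RUN=1` switch so it can be exercised without a DataStage server; treat it as an illustration, not the actual script.

```shell
# Hypothetical wrapper around the DataStage CLI. With DRY_RUN=1 it only
# prints the command it would run, so the sketch is testable offline.
run_ds_job() {
    project=$1
    job=$2
    cmd="dsjob -run -jobstatus $project $job"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"        # show what would be executed
    else
        $cmd               # exit status reflects the job's final status
    fi
}
```

A scheduler (Autosys, cron) would then call this wrapper per job, e.g. `run_ds_job GPARTS load_parts`, and alert on a nonzero exit status.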

 

 

Environment: IBM InfoSphere DataStage 8.x, Information Analyzer, IBM MDM, QualityStage, SQL, Business Objects XIR2, Autosys, DB2, IBM Netezza, SQL Server, Oracle, Teradata, PVCS, TOAD, UNIX (AIX), MS Word, Excel

 

Ford Motor Company, Dearborn, MI, USA                                                    Jan '11 – Mar '13

Role: Senior Specialty Developer

 

An Enterprise Data Warehouse (EDW) has been part of Ford's Architecture Strategy for Information since it was first published. The EDW is built on the standard analytic platform DBMS (Teradata).

The EDW is an upstream application for BI reporting applications. Today the EDW supplies analytic data to various projects inside Ford Motor Company. As EDW support lead, I ensured data was available to the downstream applications for reporting.

 

Responsibilities:

·       Used the Autosys job scheduler to schedule daily and weekly DataStage jobs.

·       Production support: identified issues, reviewed errors, approved resolutions, and advised of potential downstream impacts.

·       Debugged ETL jobs, checking the errors and warnings associated with each job run.

·       Worked on data analysis, data correction, and cleanup for warehouse tables.

·       Responsible for debugging ticketing issues in the production phase.

·       Password maintenance in the production environment.

·       Code migration in the production environment.

·       Worked on new generic UNIX scripts and existing scripts to automate warehouse functionality.

·       Monitored automobile environments and ran defined data quality checks.

·       Maintained relationships with IT and business stakeholders.

·       Provided first-line support for business users and analysts, including mailbox support, researching questions, and maintaining a web-based knowledge base.

 

Environment: IBM InfoSphere DataStage 8.x, Information Analyzer, QualityStage, Autosys, Teradata, DB2, Oracle, IBM Netezza, SQL Server, UNIX, PVCS

 

CIGNA Health Care, Bloomfield, CT                                                        Sep '06 – Dec '10

Role: Senior Specialty Developer

 

Cigna is a global health service company. The IBoR project profiles individual data and its usage throughout the organization, starting with the lines of business that are in scope for the 5010 program. Achieving a better understanding of the current state of individual information will help define the future vision and end state of IBoR.
IBOR (Individual Book of Records) is a program to harmonize, cleanse, and publish key information about individuals. The objective is to have information that is consistent, accurate, and trustworthy. Each line of business (health care, group insurance, etc.) has a unique enrollment process. There are at least 23 places where individual data is stored as a source of record (including Medicaid claims), and at least 24 different applications that have some role in acquiring individual data from outside CIGNA. IBOR uses the IBM Initiate tool, an MDM tool that acts as a GUI for individual data. As part of the IBOR systems integration work, data from the 23 sources is loaded into the Initiate repository. IBOR data will be used as part of the 5010 project.

 

Responsibilities:

·       Designed, developed, tested, and tuned DataStage job mappings; analyzed and modified existing ETL objects to incorporate new changes per project requirements.

·       Extensively developed DataStage parallel and sequence jobs.

·       Worked on performance tuning of ETL jobs using SQL queries, data connections, configuration files, parameter sets, and environment variables.

·       Used Autosys to schedule jobs and emailed the status of ETL jobs to the operations team daily.

·       Developed parallel jobs using Sequential File, Transformer, Lookup, Funnel, Filter, Remove Duplicates, Oracle, and Teradata stages.

·       Performed knowledge transfer (KT) to team members and led the project through all phases.

·       Used shared containers to reuse stages and links in parallel jobs.

·       Documented production issues and solutions, monitored alerts, and fixed Remedy tickets.

·       Coordinated with the onsite team during the project tenure.

·       Involved in post-production support.

Environment: DataStage 7.5, Autosys, Teradata, Oracle, UNIX, SQL Server, DB2

 

 

 


 

 



Experience


 

Job Title: ETL Datastage Lead

Company: Citibank N.A.

Experience: - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

ETL Datastage Lead

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Database Development/Administration

·         Software/Web Development

 

Target Locations:

Selected Locations:

US-CA-San Francisco
US-NJ-Northern

Relocate:

Yes

Willingness to travel:

Up to 50% travel

 

Languages:

English: Fluent