From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:02 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Rabindra Patro 

Last updated:  10/07/12

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Cherry Hill, NJ  08034
US


 

 

RESUME

  

Resume Headline: Sr Informatica Developer

Resume Value: dmp4q9bfz4qp92wf   

  

 

Rabindra K Patro

Phone: 609-332-8165

Email: rabikpat@gmail.com

Professional Summary
Over 8 years of Software Development experience with proficiency in Analysis, Design, Development, Implementation and Testing in Enterprise Data Warehousing, ETL & Database Development technologies.

·         Extensive experience in ETL (Extract Transform Load), Data Integration and Data Warehousing using Informatica Power Center & Oracle PL/SQL technologies.

·         Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).

·         Solid management skills; demonstrated proficiency in leading and mentoring individuals to maximize levels of productivity while forming cohesive team environments.

·         Solid experience in Data Warehousing concepts like OLAP, OLTP, Star Schema, Snowflake Schema, Fact Table, Dimension Table, Dimensional Data Modeling, etc.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility and also worked with XML Sources & Targets

·         Experience in identifying and resolving issues using session logs and workflow logs, and in using the e-mail task to capture issues via e-mail along with the session logs.

·         Experience in Relational Databases like Oracle 10g/9i/8i and various native tools such as SQL, PL/SQL, SQL*Plus in various platforms like Linux, UNIX, Windows etc.

·         Very good experience in UNIX Shell Scripting, writing SQL queries, understanding requirements, writing functional specs, writing test specs and plans.

·         Good knowledge of the bank conversion and Basel 2 compliance domains and of Data Migration.

·         Good knowledge of BusinessObjects Crystal Reports XI, including the creation of universes and classes.

·         Good experience creating macros in MS Excel to generate one-to-one mappings and insert scripts for loading the data into target tables.

·         Have strong analytical and communication skills and excellent Interpersonal Skills with the ability to work independently and with the Team.

·         Experience with the Informatica scheduler to schedule jobs, monitor other processes, and start jobs based on the success or failure of other processes.

Education & Certificates

·             Master's in Computer Applications from Andhra University, India

·             A Level (PGDCA) from DOCAECC University, India

·             Bachelor's Degree from Andhra University, India

 

Technical Skills

 

Data Warehousing

Informatica PowerCenter 8.x/7.x, TOS (Talend Open Studio), Mondrian, SQL*Plus, SQL*Loader, Query Analyzer, ETL, Metadata, OLAP, OLTP, Informatica Administration, Crystal Reports XI

Programming

SQL, PL/SQL, UNIX Shell Scripting

Tools

Toad, SQL Developer, Visio and Microsoft Office Technologies

Databases

Oracle 8i/9i/10g/11g, SQL Server 7.x/2000/2005

Operating Systems

Windows 9x/NT/2000/XP, HP-UNIX, SUSE Linux, Red Hat Linux

 

 

 

Sapient Corporation                                                                                                         Aug’07- Till date

 

Client: TD Bank, Mount Laurel, NJ                                                                                                                  Feb’ 12 – Till date.

Role: Senior ETL Developer/Off Shore Coordinator.

Project Name: TDAF (TD Auto Finance)

 

Basel/Lending Data Mart (BLDM) Enhancements is considered a ‘sub-project’ under the TDAF US Risk Management Integration Project umbrella. The key deliverables of the BLDM Enhancements sub-project are to establish on-going data feeds and provide a Historical Data Capture data set from the TD Auto Finance (TDAF) systems to the Basel/Lending Data Mart (BLDM), and then to the RRVA ELM (Expected Loss Methodology) environment, to support US Retail Basel reporting and analysis. The existing US Basel data fields will be used for incorporation of the TDAF data into the US Basel program. US Retail Risk Management (US RRM) reporting and analysis will use the TDAF data in the BLDM and ultimately have the necessary data elements in the Retail Risk Universe in order to perform the Monthly Delinquency (DLQ) Report and Quarterly Vintage Report. All of the current data elements and current calculations identified as part of the interim “OCC file” (provided from TDAF to Retail Risk) will be delivered into the BLDM. This will allow for the elimination of the interim OCC file that Retail Risk Management currently receives.

 

It is the intent of this effort to ensure that up to 60 months of loan performance data be retrieved and stored in a usable and meaningful way, so as to enable TD Bank to bring the TDAF portfolio into Basel compliance.

 

Responsibilities:

·         Currently the project follows an onshore/offshore model in which I play the role of ETL offshore coordinator, handling a team of 5 offshore resources from onshore.

·         In addition, also played the role of business systems analyst, gathering development requirements from the source system and designing the target table structures with data modeling.

·         As part of daily activities, I coordinate with the offshore team and provide a daily status of work to the client.

·         Prepared ETL low-level design and MLP (Mid-Level Plan) documents with information on the implementation of business logic and specifications of the job flow.

·         Extensively worked on Informatica PowerCenter and its various tools, such as Mapping Designer, Workflow Manager, etc.

·         Implemented looping mechanism through Informatica to handle historical data loads.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Worked extensively with Expression, Lookup, Joiner, Sorter, Aggregator, Update Strategy, Stored Procedure, Sequence Generator, Filter, Rank and Router transformations.

·         Designed reusable components such as transformations, mapplets and lookups, as well as reusable source and target objects (shared folders), for the ETL process.

·         Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

·         Worked on XML mappings, populating data from the XML source into the target table and then from the target system into the RRVA system, so that the reporting team can pull the data from the RRVA system and generate reports.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Involved in creating Specification Documents, Mapping documents, Data Flow Diagrams, ER Diagrams, Dependency documents etc.

·         Created and reviewed unit, system and integration test cases for all cycles.

·         Created the Migration document and applied labels to all the XML ETL objects (Source, Target, Mapplet, Mapping, Session, Worklets and Workflow).
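
The dynamic parameter files referenced above typically pair a workflow/session scope with the variable values for a given run. A minimal, purely illustrative sketch; the folder, workflow, session, connection, and parameter names here are hypothetical, not taken from the projects described:

```
[TDAF_Folder.WF:wf_load_bldm.ST:s_m_load_loans]
$$LOAD_DATE=2012-02-29
$$SOURCE_SYSTEM=TDAF
$DBConnection_Target=BLDM_ORA
$InputFile_Loans=/data/tdaf/in/loans_20120229.dat
```

Regenerating such a file before each run (for example from a scheduling script) is what gives the mappings their run-to-run flexibility without code changes.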

 

Environment: Informatica PowerCenter 8.6.1, XMLs, SQL Server 2005, Oracle 11g, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX.

 

 

 

Client : TD Bank, Mount Laurel, NJ                                                                                                                        Mar’11- Feb’ 12

Role: Senior ETL Developer/Designer/Business Analyst

Project Name: TSFG Data Archive

 

TSFG Bank and TD Bank have identified over 500 applications for Archive and decommissioning.  In accordance with legal and regulatory requirements, application data will be classified for retention in an archive system. Decommissioned applications will result in savings from reduced licensing and maintenance costs.

The objective of this project is to identify the applications that require archiving and to document, at a high level, the current state of those applications as well as the archiving requirements.

 

Responsibilities:

·         Involved in the design and implementation of different applications using various tools like Visio, Erwin and the SQL Developer tool.

·         Worked as a business systems analyst, gathering the requirements from the source systems for the end-report database.

·         Worked extensively on the design of different applications for Data Modeling and Data Mapping.

·         Created SSDS and SRS documents for the different archival applications.

·         Extensively worked on Informatica PowerCenter and its various tools, such as Mapping Designer, Workflow Manager, etc.

·         Implemented looping mechanism through Informatica to handle historical data loads.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Worked extensively with Expression, Lookup, Joiner, Sorter, Aggregator, Update Strategy, Stored Procedure, Sequence Generator, Filter, Rank and Router transformations.

·         Designed reusable components such as transformations, mapplets and lookups, as well as reusable source and target objects (shared folders), for the ETL process.

·         Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL low level design documents with information on implementation of business logic and specifications of the job flow.

·         Involved in creating Specification Documents, Mapping documents, Data Flow Diagrams, ER Diagrams, Dependency documents etc.

·         Created and reviewed unit, system and integration test cases for all cycles.

·         Worked on Crystal Reports to generate the slice-and-dice reports for the TSFG archive data.

·         Used Informatica to schedule the execution of the Informatica workflows, monitor other processes, and start jobs based on the success or failure of other processes.

·         Created Excel macros to generate the insert scripts and the one-to-one Informatica mappings required for the DART process.

·         Involved in generating the end reports, such as dashboards and slice-and-dice reports, for the business using Business Objects.

·         Also involved in the creation of universes and classes using the BusinessObjects Crystal Reports XI BI reporting tool.
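
The data-validation controls mentioned above boil down to reconciliation checks between what the source delivered and what the target loaded. A minimal sketch in shell; the file layout and calling convention are hypothetical, not taken from the actual project scripts:

```shell
#!/bin/sh
# Minimal reconciliation-style validation control: compare the record count
# of a delimited source extract against the row count reported by the load.
validate_counts() {
    src_file=$1    # delimited source extract file (hypothetical path)
    tgt_count=$2   # row count reported after the target load
    src_count=$(wc -l < "$src_file" | tr -d ' ')
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "PASS: $src_count rows reconciled"
        return 0
    else
        echo "FAIL: source=$src_count target=$tgt_count"
        return 1
    fi
}
```

A wrapper like this can run after each workflow and fail the job (non-zero exit) so a scheduler or e-mail task picks up the discrepancy.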

 

Environment: Informatica PowerCenter 8.6, SQL Server 2005, Oracle 10g, SQL, PL/SQL, Crystal Reports, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX, BusinessObjects Crystal Reports XI, MS Visio, Erwin.

 

 

Client: TD Bank, Falmouth, ME                                                                                                                         Feb’10- Mar’11

Role: Senior ETL Developer

Project Name: Data Migration and BASEL 2 compliance

 

TD Bank, America’s Most Convenient Bank®, is one of the 10 largest banks in the United States, and provides Customers with a full range of financial products and services at more than 1,250 convenient locations from Maine to Florida.

Served as an ETL Developer for ETL/Data Migration and Basel 2 projects, supporting the various development teams. Also worked on the production support ETL team, involved in fixing issues during the 24/7 support hours.

Responsibilities:

·         Working as a Senior ETL Developer on the Basel Lending Data Mart, a critical data warehouse of the bank fed by various compliance applications, and on the TPR and DDM Migration.

·         Converted a PL/SQL-based ETL environment consisting of close to 70 modules entirely into Informatica.

·         Developed mappings and workflows for loading data into the TPR (Time Persistent Repository) & DDM Databases and for creating and transmitting extracts for various business needs.

·         Performing the role of a Senior ETL Developer for a team of 20+ ETL Developers, handling user management, code migrations, code reviews, session monitoring, etc.

·         Performing the role of a subject matter expert for the ETL & Database environment for the Basel Lending DataMart.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL design specification documents with information on implementation of business logic and specifications of the job flow.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Worked with XML sources as well as midstream parsers for creating mappings based on the XML specifications.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

 

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX

 

Sapient Corporation

Client: TD Bank, Mount Laurel, NJ                                                                                                                      Jul’09- Feb’10

Role: Senior ETL Developer

Project Name: TD Data Archive

 

Commerce Bank and TD Banknorth have identified over 500 applications for Archive and decommissioning.  In accordance with legal and regulatory requirements, application data will be classified for retention in an archive system. Decommissioned applications will result in savings from reduced licensing and maintenance costs.

The objective of this project is to identify the applications that require archiving and to document, at a high level, the current state of those applications as well as the archiving requirements.

 

Responsibilities:

·         Extensively worked on Informatica PowerCenter and its various tools, such as Mapping Designer, Workflow Manager, etc.

·         Implemented looping mechanism through Informatica to handle historical data loads.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Worked extensively with Expression, Lookup, Joiner, Sorter, Aggregator, Update Strategy, Stored Procedure, Sequence Generator, Filter, Rank and Router transformations.

·         Designed reusable components such as transformations, mapplets and lookups, as well as reusable source and target objects (shared folders), for the ETL process.

·         Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL low level design documents with information on implementation of business logic and specifications of the job flow.

·         Involved in creating Specification Documents, Mapping documents, Data Flow Diagrams, ER Diagrams, Dependency documents etc.

·         Created and reviewed unit, system and integration test cases for all cycles.

·         Used AutoSys to schedule the Informatica workflows, as a few workflows are executed on a monthly schedule; also monitored other processes and started jobs based on the success or failure of other processes.

·         Created Excel macros to generate the one-to-one mappings and insert scripts for the target tables in the Oracle database.
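
The macro-generated insert scripts described above amount to turning each row of a delimited mapping file into an INSERT statement. The resume did this with Excel macros; the command-line sketch below is only an illustrative equivalent, and the table and column names are hypothetical:

```shell
#!/bin/sh
# Generate one INSERT statement per row of a pipe-delimited mapping file.
# ARCHIVE_MAP / SRC_COL / TGT_COL are placeholder names, not the real schema.
gen_inserts() {
    while IFS='|' read -r src_col tgt_col; do
        printf "INSERT INTO ARCHIVE_MAP (SRC_COL, TGT_COL) VALUES ('%s', '%s');\n" \
               "$src_col" "$tgt_col"
    done < "$1"
}
```

Redirecting the output to a .sql file gives a load script that can be run through SQL*Plus or TOAD against the target database.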

 

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX, AutoSys utility.

 

Sapient Corporation

Client: TD Bank, Mount Laurel, NJ                                                                                                              Sep’08 – Jul’09

Role: ETL Developer

Project Name: TD US Integration Conversion

 

TD US Integration Conversion is a data conversion project. The objective of this project is to convert, with minimal customer impact, the customer and enterprise information within the Commerce mainframe and distributed systems applications into the appropriate TD Banknorth mainframe and distributed systems applications, maintaining the required product set offerings and otherwise meeting the needs of the business lines.

 

Responsibilities:

·         Extensively worked on Informatica PowerCenter and its various tools, such as Mapping Designer, Workflow Manager, etc.

·         Implemented looping mechanism through Informatica to handle historical data loads.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Worked extensively with Expression, Lookup, Joiner, Sorter, Aggregator, Update Strategy, Stored Procedure, Sequence Generator, Filter, Rank and Router transformations.

·         Designed reusable components such as transformations, mapplets and lookups, as well as reusable source and target objects (shared folders), for the ETL process.

·         Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL low level design documents with information on implementation of business logic and specifications of the job flow.

·         Involved in creating Specification Documents, Mapping documents, Data Flow Diagrams, ER Diagrams, Dependency documents etc.

·         Created and reviewed unit, system and integration test cases for all cycles.

 

Environment: Informatica Power Center 8.6, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX

 

 

Sapient Corporation

Client: Swisscom, Switzerland/Germany                                                                                                 May’08- Sep’08

Role: ETL Developer

Project Name: Milano

 

Milano is the project name for the extension of the Bluewin TV technical data mart system. A first release of such a system was developed by Project42. Milano continues and expands the work done so far and adds more features, also non-TV related, in order to fulfill the varied palette of stakeholders' requirements.

 

The applied products are new for Swisscom, and they are also new for the manufacturers. The past has shown that there was no efficient tool to analyze the measures of the Bluewin TV platform. The aim of this project is to build up a technical data mart system.

 

Responsibilities:

·         Extensively worked on TOS (Talend Open Studio) and its various tools, such as the Designer, Workflow Manager, etc.

·         Worked on the Mondrian reporting tool to generate reports for the clients.

·         Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL low level design documents with information on implementation of business logic and specifications of the job flow.

·         Involved in creating Specification Documents, Mapping documents, Data Flow Diagrams, ER Diagrams, Dependency documents etc.

·         Created and reviewed unit, system and integration test cases for all cycles.

 

Environment: TOS (Talend Open Studio), Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, TOAD.

 

Sapient Corporation

Client: Credit Suisse, India                                                                                                                                     Sep’07- Apr’08

Role: ETL Developer

Project Name: Prime Brokerage

Credit Suisse has focused on Commodities as a potential growth opportunity. Since then there has been an increasing level of interest from Prime Services hedge fund clients in investing in this sector. Seeking to create a unique value proposition for its clients, the firm has formed a number of strategic partnerships which bring this market to its customers. The prime focus is to develop an end-to-end solution that incorporates the flow of data from trade capture of exchange and non-exchange done-with and done-away trades through data enrichment, resulting in static and interactive reports in Prime View.

Responsibilities:

·         Extensively worked on Informatica Power Center and the various tools such as Mapping Designer, Workflow Manager Etc.

·         Closely interacted with the business lines to gather requirements and created analysis/functional specifications.

·         Prepared ETL design specification documents with information on implementation of business logic and specifications of the job flow.

·         Accessed the source XMLs and used UNIX shell scripts to modify the source data and load it into the landing zone.

·         Developed mappings and workflows for loading data into the DDM Databases and for creating and transmitting extracts for various business needs.

·         Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.

·         Extensively used the Midstream Parser and Midstream Generator transformations to load the data into target tables and to generate XMLs according to the requests of the business.

·         Worked with XML sources as well as midstream parsers for creating mappings based on the XML specifications.

·         Extensively worked on Oracle PL/SQL procedures, functions, triggers & packages to support the development of ETL jobs and to maintain the integrity of the DataMart.

 

Environment: Informatica Power Center 8.1, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX, Altova XML.

 

 

CalTech InfoSystems                                                                                                                                 Jun’04 –Jul’07

                                         

Client: Dovebid, India                                                                                                                                                         Sep’06- May’07

Role: ETL Developer

Project Name: Financial Assets valuation

 

The objective of this project is the development of a Financial Assets Valuation data warehouse and reporting system to analyze the expenses incurred on legal services based on different regions/divisions. The data is obtained from disparate sources like flat files and Oracle tables. After extracting data from the different sources, it is loaded into a temporary database. Business rules are applied to this data with the help of transformations (provided by PowerCenter 7.1), and then the data is loaded into Data Marts.

 

The purpose of building Financial Assets valuation Data warehouse is to help business users in making informed decisions. The business revolves around giving different assets for lease, based on requests made for assets. These assets are given for lease based on lease values that are calculated automatically.

Responsibilities:

·         Developed end-to-end ETL processes for Financial Assets valuation Data warehouse.

·         Understanding of business requirement documents.

·         Analyzing Source System data.

·         Worked on Informatica Power Center 7.1 tool – Source Analyzer, Warehouse designer, Mapping Designer.

·         Designed Sources-to-Targets mappings from flat files to Oracle using Informatica PowerCenter 7.1.

·         Trapped incorrect data on staging servers and then loaded it into the Target Database.

·         Loaded the data into data marts using Informatica PowerCenter 7.1 on a weekly basis by applying the business rules.

·         Created Mappings to populate the data into dimensions and Fact tables.

·         Worked on different transformations: Source Qualifier, Expression, Lookup, Filter, Router, Update Strategy and Sequence Generator.

·         Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level.

·         Involved in the development of mappings and tuning them for optimum performance.

·         Created and ran sessions using Workflow Manager & Workflow Monitor to load the data into the Target Database.

·         Created Tasks, Workflows and Sessions to move the data at specific intervals on demand using Workflow Manager.

·         Created Workflows to run several sessions sequentially and concurrently.

 

Environment: Informatica Power Center 7.1, Oracle 9i, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX

 

 

CalTech InfoSystems

Client: Lloyds TSB Bank, India                                                                                                                                                        Aug’04- Jul’06

Role: ETL Developer

Project Name: Financial Management System

 

Lloyds TSB Bank, Dubai is a long-term finance group and one of the most successful groups. The client wanted a tool to analyze current business trends and make predictions about future business trends. This project was developed for the client to provide enterprise reports for internal and external use and to support queries for making intelligent banking decisions based on data available over a period of time.

 

Responsibilities:             

·         Used Source Analyzer and Warehouse designer to import the source and target database schemas and the Mapping Designer to create mappings.

·         Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level.

·         Created the warehouse Structure using Informatica Power Center Warehouse designer and using Informatica Designer designed Mappings that populated the Data into the Target.

·         Created Mappings to populate the data into dimensions and Fact tables.

·         Created transformations like Source Qualifier, Joiner, Expression, Lookup, Filter, Update Strategy and Sequence Generator.

·         Involved in the development of mappings and tuning them for optimum performance.

·         Ran sessions and workflows to load the data.

 

Environment: Informatica Power Center 7.1, Oracle 9i, SQL, PL/SQL, UNIX Shell Scripting, TOAD, HP-UX, SUSE-LINUX

 

 

 

 

 



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for my present employer only.

 

 

Target Job:

Target Job Title:

Sr Informatica Developer

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-NJ-Southern