From:                              route@monster.com

Sent:                               Friday, May 06, 2016 1:57 PM

To:                                   pkumar@altusmeus.com

Subject:                          Candidate for review

 

This resume has been forwarded to you at the request of Monster User xgenesisngnx02

Gopichand Ravilla 

Last updated:  05/04/16

Job Title:  not specified

Company:  not specified

Rating:  not specified

Screening score:  not specified

Status:  not specified


Bridgewater, NJ  08807
US

Mobile: 973-476-5878
gopichand.ravilla@gmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Gopichand Ravilla - ETL Architect / Developer / Data Modeler

Resume Value: 25pkxwfnb4kfz5pm   

  

 

 Bridgewater, NJ

Phone: 973-476-5878

Email: gopichand.ravilla@gmail.com

GOPICHAND RAVILLA

 

SUMMARY

·18 years of IT experience working in all phases of the system development life cycle, including business requirements gathering, business process modeling, interacting with business users, functional specification documentation, software architecture, data analysis, Java, data warehousing, data modeling, ETL (Informatica), and deployment.

·Experience in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.

·Expert knowledge in data migration / integration technologies (Informatica), successfully completing complex multi-terabyte EDW systems and data marts (Star Schema, Snowflake Schema) from OLTP and EDB systems.

·Experience in Relational and Dimensional Data modeling using ERWIN.

·Sound understanding of the Financial Industry, including Prime Brokerage Business.

·17 years of experience working with Relational Databases (Oracle, Informix, Informix XPS, Sybase, DB2, Teradata).

 

TECHNICAL SKILLS

 

SKILL                                                    YEARS OF EXPERIENCE

Data Warehousing                                         14
Informatica (9.1 / 8.6 / 5.1 / 4.7), Informatica MDM     11
Java, C, C++                                             4
Teradata                                                 7
Oracle                                                   5
DB2                                                      3
Modeling (Erwin)                                         2
Unix / Linux                                             16
Shell Scripting                                          5
SQL                                                      16
Python                                                   1/2
Business Objects                                         1/2
Trillium                                                 1/2
Greenplum 4.2                                            1/2
Hadoop / Pig / Sqoop / Python                            1/2
IBM BLU Columnar                                         1/2

 

 

PROFESSIONAL EXPERIENCE

 

Prudential Insurance, NJ                                                                                                                         Sep 2015 – Present

ETL Architect / Developer / Data Modeler

 

Integrated Data Warehouse:  Modeled the data warehouse integrating several financial source systems (QRM, Clarity, and Intex) and internal iRisk actuals for reporting, forecasting, and comparison against actuals.

 

Architected / Developed ETL processes to populate the data warehouse from several source systems.

 

Evaluated performance metrics comparing the IBM BLU columnar store vs. the row store.

 

Streamlined / Automated ETL processes using Linux scripting.

 

Environment: IBM BLU, Linux, Oracle 11g, PL/SQL, QRM, Clarity, Intex, Oracle SQL Data Modeler, Informatica 9.5

 

 

Citi, Warren, NJ                                                                                                                                   Mar 2013 – Sep 2014

ETL Data Integration Architect / Hadoop Developer

 

citiKYC: Architected a complex metadata-driven data migration / data integration process involving client data from about 74 countries across four global regions.

 

Designed and developed ETL components using Talend, Informatica and PL/SQL.

 

Guided and assisted data modeling efforts to accommodate the data from various sources in the citiKYC database.

 

Customer Master Data Using Informatica MDM: Produced customer master data using Informatica MDM, applying data quality procedures such as profiling, standardization, duplicate elimination, and consolidation.

 

Hadoop / Greenplum: Successfully completed a Hadoop POC project extending the company's data integration layer to include Hadoop as a source and Greenplum as a target, as well as a framework for distributed processing of large data sets.  The framework is built as a data layer where all company data is integrated (through Java, Python, Sqoop, and Pig code) for consumption by business users and application teams.

 

 

Environment: Oracle 11g, Linux, Talend 5.1, Informatica MDM, Hadoop, Pig, Python, Sqoop, Pivotal DCA / Greenplum 4.2

 

JP Morgan Chase, Iselin, NJ                                                                                                                     June 2012 – Mar 2013

Data Integration Architect / ETL Lead

 

Participated in requirements gathering, created functional specification documentation, and interacted with business analysts to fulfill real-time business data requirements.

 

Real-Time ODS: Architected a process to receive messages from an outside vendor's (LPS) MQ system and process them using the Informatica real-time web services hub via an in-house messaging system.  Led the team of developers programming the Informatica components to bring real-time data into the ODS system 24/7.

 

Integration of Batch / Real-Time Data: Designed a complex system for integrating real-time data with batch data while maintaining the latest near real-time data within a 30 ms interval in the ODS system.  Led the team of developers programming stored procedures for the data integration.

 

Environment: Oracle 11g, Linux, Informatica 9.1.

 

MetLife, Bridgewater, NJ                                                                                                                        Jan 2012 – June 2012

ETL Architect / Solution Architect

 

ETL Automation: Developed code in Java to generate ETL scripts bringing 700 tables from the mainframe.  The automation process generates load scripts for inserts/updates/deletes based on DDLs, and reduced the need for several programmers.

 

Morgan Stanley, New York                                                                                                                      May 2003 – Dec 2011

Data Warehousing Architect / Data Architect / ETL Manager

 

Responsibilities:

·      Daily routine included project initiation, requirements gathering, interacting with business users, writing functional specification documentation, data analysis, planning, data modeling, process architecture, writing detailed design documents, ETL development, system integration, and deployment.

·      Leading a team of 11 (Onshore/Offshore) Programmers / Analysts / Data Modelers.

·     Communicating with business users on project progress and on enhancements or changes to requirements; supporting ad-hoc user requests and report data enhancements.

 

Data Warehouse Core ETL Architecture: Designed, architected, modeled, and led the programming effort for core table processing in the data warehouse.  The architecture is a replica of the legacy systems and the basis for hundreds of systems, including the Integrated Data Warehouse (IDW).  The archiving methodology allows "as at the time of the transaction/dimension" values for reporting purposes.  A common component (UNIX scripting / generated database code) is implemented to handle four types of load.

 

Data Modeling Projects: Provided data models (star/snowflake schema) for several data marts like HELOC (Home Equity Line of Credit), Options P & L Analysis, Switch Reports, Compliance Surveillance for Equities, Fixed Income, Mutual Fund, Money Market (Trade Activity Reporting), and Anti-Money Laundering (AML).

 

Java XML Framework: Developed and deployed an automated XML framework in Java to create XML files from database tables and to parse XML files to load data into the database.

 

Type-Ahead Implementation: Designed and implemented a DWH component allowing FAs to see their clients / prospects just by typing predetermined criteria.

 

Real-Time Data Warehousing: Designed and implemented a near real-time data warehouse utilizing the CORE architecture, leading the way for several applications to take advantage of real-time data.

 

Smith Barney Data Conversion: The Smith Barney merger required 3X data to be maintained at the data warehouse level. Led the team to successfully test core data in mock 1, mock 2, and mock 3, and helped the application teams test out individual data marts.  Analyzed several technologies to keep Smith Barney historical data, played a key role in purchasing the Teradata appliance server, and led the team to load 7 years of Smith Barney historical data into the appliance server.

 

Data Quality: Led the Customer Data Optimization project for name and address cleansing and standardization for GWM clients using Trillium.  Developed a system to report suspicious SSNs using the high group numbers published by the Social Security Administration.

 

Vendor Selection: Participated in the Teradata vs. UDB selection process as an evaluation committee member and made recommendations.  Evaluated performance metrics, cost of conversion, and alternate loading methodologies like ELT.

 

Data Warehouse Conversion: Led the team of programmers converting all Informatica, UNIX, and SQL components from UDB to Teradata.  As lead / project manager, developed a strategy to identify critical components and timelines, communicated with other systems managers, and evaluated end results.

 

Branch Number Expansion:  Designed the methodology to convert all the Data Warehouse components to use expanded branch number with minimal operational involvement.  Led the team of offshore programmers for successful project implementation.

 

Business / Data Analysis:  Completed several data mart projects involving business / data analysis.  The projects include High Net Worth Client/FA Analysis, Client Gain / Loss Analysis, HELOC (Home Equity Line of Credit) for other uses, Options P & L Analysis, Switch Reports, and BASEL.

 

Compliance: Provided data feeds for Compliance Surveillance for Equities, Fixed Income, Mutual Fund, Money Market (Trade Activity Reporting for Actimize), and Anti-Money Laundering.

 

Environment: UDB 7.2/8.1 (Data warehouse (EDW), Data marts), Teradata 8.1 (EDW), Informatica PowerCenter 8.1, Windows 2000/NT, UNIX (Sun Solaris 2.6), Java, J2EE, Trillium 6.0.

 

Cendant Corporation, Parsippany, New Jersey                                                                                             Aug 2002 – May 2003

ETL Specialist

Data Warehousing Projects - Mega Loyalty Data mart (Star Schema) is used by end users to properly reward members and is built from EDW source. Marketing Data mart is for analyzing trends of loyalty members, consumers, and plans under various attributes and is built from EDW source. Operational Data mart (Snowflake Schema) is built from financial (FIS), reservation (CRS) OLTP systems and is used for analyzing reservation trends, and financial reports of various franchises. 

Environment: Oracle 9.2 (Data marts), Informix XPS 8.32 (EDW), Informix 7.3, Oracle 7.1, Flat Files (Sources), Informatica PowerCenter 5.1.2, Windows 2000/NT, J2EE, UNIX (Sun Solaris 2.8)

Responsibilities

·      Installation, configuration, and administration of Informatica PowerCenter 5.1.2 in development, QA, and production environments.  Mentored project members on Informatica, and enforced standards and best practices.

·      Prepared Technical, Detailed Design Documents for Mega Loyalty project.

·      Modeled the staging database for the Tibco feed, the Mega Loyalty Data mart (Star Schema), and the Operations (FIS) Data mart (Snowflake) using Erwin 4.0, while helping the architect design the Operations (CRS) Data mart.

·      Designed and developed complex mappings/mapplets to load the Operational, Marketing, and Trip Rewards Data marts from OLAP and EDW (terabyte warehousing system) while implementing logging statistics functionality in the mappings.  Designed, implemented, and enforced a common methodology for initial and incremental loads for team members to follow.

·      Advised team members on development methodologies to achieve the performance needed to load tables of 100 to 450 million rows from the EDW.

·      Created custom reports using PowerAnalyzer and helped users compare the results of the Data marts and existing OLAP systems.

 

Brown & Williamson Tobacco Corporation, Macon, Georgia                                                               Aug 2000 – Aug 2002
Sr. Programmer/Analyst

Data Warehousing Projects - RODs is an EDW system loaded from the Laboratory Analytics tool (SQL*LIMS) and is used as a Production Monitoring System.  FabOp (Fabrication Operational and Maintenance) Data mart (Star Schema)  is used by end users to monitor operational activities and trends in manufacturing defects and is built from OLTP systems.

Environment: SQL*LIMS, Oracle 8i (Data marts), Oracle 8i (Staging), Informix 7.3 (EDW), Informix 7.3, Oracle 7.3.2, Flat Files, Mainframe Data Files (Sources), Informatica PowerCenter 5.1, Brio 5/6.6, Windows 2000/NT, UNIX (HP-UX 10.2), UNIX (AIX).

Responsibilities

·     Developed new modules, reports, and bar code generation programs for SQL*LIMS users, with new methods and operations to enter and track tasks and results.  Supported the system's feeds from various testing instruments (gas chromatographs, mass spectrometers).

·      Prepared Functional and Detailed Design Documents for RODs.

·      Day-to-day interaction with R&D scientists on their reporting needs and on design/reporting changes to RODs.

·      Programming in ESQL/C to load complex reporting structures.

·      Designed part of RODs EDW data warehouse as per the new requirements and prepared technical documents for acceptance by Senior Managers (R&D).

·      Modeled staging database for fast loading from OLAP systems using Erwin 3.5 for FabOp.

·      Modeled FabOp (Star Schema) Maintenance Data mart using Erwin 3.5.

·      Designed and developed complex mappings/mapplets and procedures (PL/SQL) to load FabOp from OLTP systems.

·      Created custom reports using Brio 5/6.6 for FabOp users.

 

Verizon (GTE) Corporation, Dallas, Texas                                                                                                       Jan 2000 – July 2000

Sr. Programmer/Analyst
Data Warehousing Project (GTE.net Business Decision Support System) – BDSS is a marketing Data mart (Star Schema) and is loaded from both EDW and OLTP Systems.  The system is used for analyzing customer trends.

Environment: Oracle 8i (Data marts, staging), Oracle 7.3 (EDW), Sybase, Oracle 7.3, Informatica PowerMart 4.7, Windows 98, UNIX (Sun Solaris 2.4)

Responsibilities

·      Developed mappings, mapplets, reusable transformations, custom procedures, and packages (PL/SQL) to load into EDW and Data marts from various source systems.

·      Participated in designing standards for error handling.

·      Interacted with the QA group to fix bugs and implement design changes.

·      Assisted the architect in designing the staging database.

 
FedEx Corporation, Memphis, Tennessee                                                                                                        Oct 1998 – Jan 2000
Programmer/Analyst

Three-tier client-server system (Aircraft Maintenance Automated Dispatch Systems) – The project is implemented as a three-tier model and uses RMI (Remote Method Invocation) technology for communication.  The business logic and broker servers are implemented as remote objects.  All 100 handheld client units connect to the servers over an RF network; Sybase is used for persistence.  The system is used to track and assign aircraft maintenance activities.

Environment: Sybase, J2EE, RMI, Apache, Windows 98, UNIX (Sun Solaris 2.6)

Responsibilities

·      Developed complex custom GUI applications using Java 2, and designed/programmed broker servers (back end, Java 2).

·      Participated in requirements gathering from user group for extra features and modifications.

·      Tested the system for user acceptance and trained end users on the system.

·      Developed and deployed a small web project (Apache, HTML, and JavaScript) for internal users to generate reports from existing systems.

 

Sabre Technology Solutions, Inc, Dallas, Texas                                                                                              Mar 1998 – Oct 1998
Programmer/Analyst
Data Warehousing Project (ODAD – Phase 2) - ODAD is a marketing Data mart (Star Schema) for American Airlines and is loaded from mainframe flat files.  It is a terabyte Informix XPS 8.11 data mart with four years' worth of data.  It is used for analyzing user trends under various dimensions like promotions, flight leg usage, sources, destinations, vacation places, and times.

Environment: Informix XPS 8.11, Flat Files, UNIX (Sun Solaris 2.6)

Responsibilities

·      Programmed ESQL/C applications to achieve fast loading (12,000 rows/sec).  Programming involved data extraction, cleaning, and fast loading using the Informix parallel loader into a staging dimensional (Star Schema) data warehouse.

·      Built highly complex ESQL/C common libraries for use by application programs.  These libraries manage memory for aggregation, lookups (binary search of lookup data on the heap), and calculations on the heap for data from flat files and dimensional tables.  The high volume of data is handled on a Sun box with 25 GB of RAM.

·      Unit testing, Load testing, and Deployment into production.

 

Edward Jones & Company – St. Louis, Missouri                                                                                             Feb 1996 – Feb 1998

Programmer/Analyst

Three-tier client-server system (WordPower - Phase 2) - This is a thin-client project developed using Java (JDK 1.1.1, Oracle 7.3) and implemented as a three-tier model.  A multithreaded RPC server (contacting the Oracle database and the DB2 database on the mainframe) is implemented as the middle tier for branch tool services.  The system is used by Investment Representatives at company branches to write letters to prospective customers per company rules.

Environment: Oracle, J2SE, JDBC 1.22, UNIX (Sun Solaris 2.6 (SunOS 5.5.1))

Responsibilities

·      Participated in the design process of the project including user requirement gathering.

·      Designed the database to store letter templates.

·      Programmed Java objects to manipulate data in the database using JDBC 1.22 as the API and Oracle's JDBC OCI7 driver.  Programmed the branch interface used by the GUI to communicate with the RPC server.

Two-tier client-server system (WordPower - Phase I) - This simple project involves making calls to a remote mainframe computer using RPC (Remote Procedure Call).  Duties included programming in C, C++, and shell scripting.  Sun Solaris 2.4 (SunOS 5.4, UNIX) is the operating environment used for this project.

Three-tier client-server system (Tickler Project) - This complex project consists of relational database calls and manipulating data in an object-oriented environment.  The project used DIA (Database Interface Agent), which provided an object-oriented interface to relational and non-relational business data in the client/server environment.  The system is used by Investment Representatives for marketing purposes.

Environment: Oracle 7.3, UNIX (Sun Solaris 2.6 (SunOS 5.5.1))

Responsibilities

·      Duties included programming in Pro*C (Oracle 7.3.23), which is invoked by DIA, and programming in C++ using RogueWave libraries.

 

 

EDUCATION

M.S. in Industrial Engineering, Lamar University, Beaumont, TX, USA.

B.S. in Mechanical Engineering, Osmania University, Hyderabad, India.

 

Citizenship: US Citizen

 



Experience


 

Job Title

Company

Experience

ETL Data Integration Architect / Hadoop Developer

Prudential Life Insurance Company, Ltd., The

- Present

 

Additional Info


 

Desired Salary/Wage:

130,000.00 - 150,000.00 USD/yr

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Within 2 weeks

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

US citizen

 

 

Target Job:

Target Job Title:

ETL Architect / Developer / Data Modeler

Desired Job Type:

Temporary/Contract/Project

Desired Status:

Full-Time

 

Target Company:

Company Size:

Industry:

Banking
Financial Services

 

Target Locations:

Selected Locations:

US-NJ-Central
US-NY-New York City

Relocate:

Yes

Willingness to travel:

Up to 25% travel

 

Languages:

Languages

Proficiency Level

English

Fluent