From: route@monster.com

Sent: Monday, September 28, 2015 1:02 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Nishank Kudithi 

Last updated:  09/29/14

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Cincinnati, OH  45215
US

Mobile:   
Home:

nishankreddy17@gmail.com


 

 

RESUME

  

Resume Headline: Nishank Reddy (DataStage Developer)

Resume Value: 5vwdcc7nk32vtib8   

  

 

Nishank Reddy

nishankreddy17@gmail.com

518-354-1572

 

SUMMARY

§    7+ years of experience in IT as an application developer, with experience across the full software development life cycle, including requirements, analysis, design, development, and testing

§    6+ years of experience in data warehousing, ETL processes and methodology, and data mart design

§    6+ years of experience in ETL using IBM DataStage with databases such as Oracle and Teradata.

§    2+ years of experience with the Pentaho BI tool

§    1+ year of experience with the Talend ETL tool

§    1+ year of experience with Teradata.

§    Experience with Talend Open Studio 4.1.2/5.0.2 & Talend Integration Suite (Installation & Scheduling).

§    Worked on the DataStage client tools: DataStage Designer, DataStage Director, and DataStage Manager.

§    Experience with RDBMS concepts, E-R models, logical models, and dimensional models.

§    Experience in designing, compiling, testing, scheduling, and running DataStage jobs.

§    Developed Server jobs for extracting, transforming, integrating and loading data to targets.

§    Used various stages such as Aggregator, Merge, Hashed File, Transformer, Bulk Load, and ODBC.

§    Experience in design and implementation of star schema, snowflake schema, and multidimensional models.

§    Worked with various databases Oracle Exadata/11g/10g/9i/8i, DB2, Teradata, MS Access and SQL Server.

§    Expertise in data warehousing techniques for data cleansing and handling of Slowly Changing Dimensions.

§    Worked with Parallel Extender for parallel processing to improve performance of jobs.

§    Strong knowledge of programming languages like SQL, PL/SQL, BASIC, C, COBOL, VB.

§    Experienced in Tuning SQL Statements and Procedures for enhancing the load performance in various schemas across databases.

§    Successfully interacted with end users, developers, and top management.

§    Excellent analytical, problem solving, communication and interpersonal skills, with ability to interact with individuals at all levels.

 

TECHNICAL SKILLS

 

ETL Tools: IBM DataStage 8.7/8.1/7.1/7.0/6.0/5.x, Talend 4.1.2/5.0.2

BI Tools: Pentaho 4.5

Data Modeling: ERWIN 3.5/4.0/4.1.4

Databases: Oracle Exadata/11g/10g/9i/8i, SQL Server 7.0/6.5, MS Access, Teradata, IBM DB2, Netezza, Greenplum

Languages: C, SQL, PL/SQL, UNIX shell scripts, HTML, XML, Java

Web Tools: HTML, Java 2.0, JavaScript, JDBC, Servlets, Java Server Pages

GUI Tools: DB2 Connect, TOAD for Oracle, SQL Developer

Scripting: UNIX shell scripting, SQL, PL/SQL scripts, test scripts

Operating Systems: Windows 7/XP/2000/98/NT, HP-UX, Sun Solaris 7.x/8.0

 

 

EDUCATION:

B.Tech, Electrical and Electronics Engineering


PROFESSIONAL EXPERIENCE:

 

GE Aviation, Cincinnati, OH

Period: May 2014 --- Present

Role: Talend Developer

 

GE Aviation, a subsidiary of General Electric, is among the top aircraft engine suppliers and offers engines for the majority of commercial aircraft. It is part of the General Electric conglomerate, one of the world's largest corporations. The division formerly operated under the name General Electric Aircraft Engines.

 

I worked on the RDF Data Delivery project, which includes information regarding Full Flight Data. I interacted with business users to gather business requirements and designed user-friendly templates to communicate any further enhancements that needed to be implemented. I also designed and developed ETL processes using Talend Open Studio (Data Integration) and worked on Talend Integration Suite version 5.0.2.

 

 

Responsibilities:

 

§ Interacted with business users to gather business requirements and designed user-friendly templates to communicate any further enhancements that needed to be implemented.

§ Designed and developed ETL processes using Talend Open Studio (Data Integration) and worked on Talend Integration Suite version 5.0.2.

§ Worked on design documentation for the Talend jobs developed.

§ Involved in creating a new-generation data warehouse solution with Web Services as the reporting tool, Talend as the ETL tool, and a Teradata database.

§ Involved in Data Extraction for various Databases & Files using Talend Open Studio

§ Worked on Talend Open Studio with Java as Backend Language

§ Extensively used the tmap component, which performs lookup and joiner functions, along with tjava, treplace, tlogrow, and tlogback components in many of the jobs created.

§ Developed mappings from operational sources to operational target tables.

§ Developed common generic jobs that can be reused.

§ Developed UNIX shell scripts to automate and run the programs on a regular basis.

§ Involved in testing and maintenance of various application modules.

§ Modified existing PL/SQL packages to perform certain specialized functions and enhancements.
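As an illustration of the tmap-style lookup and join described above, a minimal hand-written equivalent is sketched below in Java, the language Talend generates code in. All class and field names are hypothetical; Talend's actual generated code is far more elaborate.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of a tmap-style lookup join: each main-flow row is
// enriched from a keyed lookup table, mirroring what a tMap component's
// generated code does. Field names here are hypothetical.
public class LookupJoin {

    public static List<String> join(List<String[]> mainRows,
                                    Map<String, String> lookup) {
        List<String> out = new ArrayList<>();
        for (String[] row : mainRows) {
            String key = row[0];
            // Inner-join semantics: rows with no lookup match are dropped,
            // as with a tMap lookup configured without a reject flow.
            String name = lookup.get(key);
            if (name != null) {
                out.add(key + "," + row[1] + "," + name);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> customers = new HashMap<>();
        customers.put("C1", "Acme");
        customers.put("C2", "Globex");

        List<String[]> orders = new ArrayList<>();
        orders.add(new String[]{"C1", "100.00"});
        orders.add(new String[]{"C3", "50.00"}); // no match, dropped

        for (String line : join(orders, customers)) {
            System.out.println(line); // prints "C1,100.00,Acme"
        }
    }
}
```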

 

 

Environment: Talend Integration Suite Version 5.0.2, Teradata, Greenplum, DB2, Oracle 11g, Tortoise SVN, UNIX

 

 

 

 

Wells Fargo, Charlotte, NC

Period: August 2013 – February 2014

Role: ETL Developer

 

 

 

Wells Fargo & Company is an American multinational banking and financial services holding company with operations around the world. Wells Fargo is the fourth largest bank in the U.S. by assets and the largest bank by market capitalization. Wells Fargo is the second largest bank in deposits, home mortgage servicing, and debit cards.

 

WLS Datamart is an infrastructure used to consolidate loan tracking information into a single repository from multiple applications and to provide operational-level reporting to various managers in WLS. These managers and business users are spread across all time zones of the Americas. The Datamart reports volume, SLA, and quality metrics along external or internal hierarchy lines to the loan centers. FRED is used as a front-end system for the Datamart to capture quality information, auditing, and other important user inputs for Datamart reporting.

 

 

Responsibilities:

 

§ Interacted with business users to gather business requirements and designed user-friendly templates to communicate any further enhancements that needed to be implemented.

§ Experience in implementing Type II changes in Slowly Changing Dimension Tables.

§ Developed mappings from operational sources to operational target tables.

§ Extracted data from several sources such as Oracle, sequential files, and XML files.

§ Developed ETL mappings for staging, dimension, fact, and data mart loads from source files in different formats.

§ Used Quality Stage for data cleansing to convert data into required format.

§ Involved in database development by creating Functions, Procedures, Triggers, and Packages using TOAD for accessing, inserting, modifying and deleting data in the database.

§ Designed and Developed PL/SQL Procedures, functions, and packages for creating Summary tables.

§ Participated in Performance Tuning of SQL queries using Explain Plan to improve the performance of the application.

§ Worked extensively on exception handling, using predefined exceptions such as INVALID_NUMBER and NO_DATA_FOUND as well as user-defined exceptions associated with Oracle error codes via PRAGMA EXCEPTION_INIT.

§ Used ref cursors and collections for accessing complex data resulting from joins across a large number of tables.

§ Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

§ Developed and implemented strategies for slowly changing dimensions, surrogate key generation, and incremental data loading.

§ Involved in designing lookup strategies using the Hash File stage for data extraction from the source systems.

§ Developed a parameter-driven ETL process to map source systems to the target data warehouse, with complete source system profiling in DataStage.

§ Used DataStage Director to schedule solution runs, test and debug components, and monitor the resulting executable versions.
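The Type II slowly-changing-dimension handling mentioned above can be sketched as follows. This is a minimal in-memory illustration with hypothetical column names, not the project's actual DataStage implementation: when a tracked attribute changes, the current row is end-dated and a new row is inserted, preserving history.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of an SCD Type II update. A high date ("9999-12-31")
// marks the currently active version of each natural key.
public class ScdType2 {

    public static class DimRow {
        public String naturalKey;
        public String attribute;
        public String effectiveFrom;
        public String effectiveTo;
        public DimRow(String k, String a, String from, String to) {
            naturalKey = k; attribute = a;
            effectiveFrom = from; effectiveTo = to;
        }
    }

    public static final String OPEN = "9999-12-31";

    // Apply one incoming record against the dimension, Type II style.
    public static void apply(List<DimRow> dim, String key, String newAttr,
                             String loadDate) {
        for (DimRow row : dim) {
            if (row.naturalKey.equals(key) && OPEN.equals(row.effectiveTo)) {
                if (row.attribute.equals(newAttr)) {
                    return;                    // no change: nothing to do
                }
                row.effectiveTo = loadDate;    // close out the old version
                break;
            }
        }
        dim.add(new DimRow(key, newAttr, loadDate, OPEN)); // new version
    }

    public static void main(String[] args) {
        List<DimRow> dim = new ArrayList<>();
        apply(dim, "CUST1", "Charlotte", "2013-09-01");
        apply(dim, "CUST1", "Raleigh",   "2013-10-15"); // attribute change
        for (DimRow r : dim) {
            System.out.println(r.naturalKey + " " + r.attribute + " "
                    + r.effectiveFrom + " -> " + r.effectiveTo);
        }
    }
}
```

A Type I change would instead overwrite the attribute in place; Type II trades storage for a full change history.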

 

Environment: Ascential DataStage 8.1, SQL Server, Oracle 11g, DB2, Erwin Data Modeler 4.1.4, Windows NT 4.0, UNIX

 

Vanguard, Charlotte, NC

Period: September 2012 – July 2013

Role: Sr. ETL Developer

 

Vanguard is an investment company offering mutual funds, ETFs, and annuity products. Its services for individuals include brokerage, retirement investing, advice, and college savings, along with a full range of investments and services for institutions and investment professionals.

 

I was involved in the core data warehouse team that manages the legacy data, the ETL application (DataStage), and the Hub and Mart data. Our system model is based on a hub-and-spoke architecture, where the entire data warehouse resides in the Hub and the data required for reporting resides in the retail and campaign mart databases.

 

 

 

Responsibilities:

 

§ Involved in migration of jobs from DataStage 7.5 to 8.7.

§ Developed ETL jobs for extracting data from different sources such as flat files, DB2, and Oracle tables, staging the data in Oracle/Exadata tables, and loading it into target Oracle/Exadata tables.

§ Developed mappings from operational sources to operational target tables.

§ Designed and developed ETL processes using DataStage 8.7.

§ Developed ETL mappings for staging, dimension, fact, and data mart loads from source files in different formats.

§ Designed and developed the sequencers, which call several jobs in parallel to load data into the corresponding tables.

§ Developed UNIX shell scripts to automate the data load processes to the target data warehouse.

§ Created and used DataStage Shared Containers and Local Containers.

§ Performed data manipulation using BASIC functions and DataStage transforms.

§ Developed sparse lookups for very large reference tables.

§ Familiar with many DataStage stages, such as CDC, Remove Duplicates, Surrogate Key Generator, Aggregator, Decode, Encode, FTP Enterprise, Plug-in, Modify, Join, Merge, and Lookup.

§ Scheduled and ran the DataStage jobs from the Control-M tool.

§ Familiar with different source stages such as Oracle Connector, Stored Procedure, and DB2 UDB.

§ Designed the Control-M jobs based on their dependencies.

§ Executed multiple cycles (DEV, INT, and SAT) through Control-M and validated the data across multiple environments during project migration.

§ Used Subversion to maintain versioning of our DataStage work.

§ Browsed mainframe tables using Rumba to check the source.

 

Environment: DataStage 8.7, Oracle 11g, Exadata, UNIX AIX, Sun Solaris, Control-M, Tortoise SVN, Rumba, Hummingbird FTP, SQL Developer

 

 

GSI Commerce, an eBay Inc. company, King of Prussia, PA

Period: October 2010 – August 2012

Role: Sr. ETL Developer

 

GSI Commerce provides e-commerce and multichannel solutions to power all aspects of online business and integrate with offline channels for more than 80 partners. GSI Commerce integrated business management and strategy, e-commerce technology, customer care and fulfillment, agency services, and international solutions to grow its partners' businesses in retail categories including apparel, accessories, and footwear; consumer electronics; home furnishings; appliances and tools; sporting goods and apparel; cosmetics and fragrances; personal care; music and video; jewelry; toys and video games; baby products; specialty foods; and pet supplies.

 

 

Responsibilities:

 

§ Developed ETL jobs for loading data into Oracle tables at each area.

§ Developed mappings from operational sources to operational target tables.

§ Extracted data from several sources such as Oracle, sequential files, and XML files.

§ Designed and developed ETL processes using Talend Open Studio (Data Integration) and worked on Talend Integration Suite version 4.2.3.

§ Developed ETL mappings for staging, dimension, fact, and data mart loads from source files in different formats.

§ Involved in data extraction from various databases and files using Talend Open Studio.

§ Worked on Talend Open Studio with Java as the backend language.

§ Extensively used the tmap component, which performs lookup and joiner functions, along with tjava, toracle, txml, tdelimited, tlogrow, and tlogback components in many of the jobs created.

§ Designed ETL with Hash File stages to load data into Oracle and Teradata tables.

§ Built transformations, including aggregation and summation constructs, from the operational data source to the data warehouse.

§ Designed and developed the sequencers, which call several jobs in parallel to load data into the corresponding tables.

§ Involved in creating a new-generation data warehouse solution with Pentaho as the reporting tool, Talend as the ETL tool, and a Netezza database.

§ Populated company and location cross-reference tables/hash files for data lookups and to update the company master table.

§ Designed and developed the master controls, which control the jobs at each level: DRA, TDA, and DW.

§ Developed job sequencer jobs and extensively used routine activities, job activities, and notifications for the batch jobs.

§ Developed job controls, which control the other jobs and send mail notifications about job status.

§ Implemented various dimensions, such as Patient, Visit, Admitting Doctor, Time Period, and Location, tied to a central fact table.

§ Developed UNIX shell scripts to automate the data load processes to the target data warehouse.

§ Created and used DataStage Shared Containers and Local Containers.

§ Defined reference lookups and aggregations.

 

Environment: Talend 4.1.2/5.0.2, Ascential DataStage 8.1, Pentaho 4.2, Oracle 11g, Teradata, Netezza, DB2, Erwin Data Modeler 4.1.4, Windows NT 4.0, UNIX

 

 

 

Standard Chartered Bank (SCB), India

Period: April 2008 – Dec 2009

Role: Data Warehouse Developer

 

SCB needed a solution to comply with AML laws under the regulatory accords of its operating countries. Transactional data from different source systems is processed and loaded into the database and target files. The files are saved in the Norkom file format and fed to the Alchemist system. If any discrepancies are found, alerts are generated. The business analyzes the reports generated for exceptions, rejections, and reconciliation.

Responsibilities:

 

§    Created the technical design documents for the jobs and sequencers.

§    Developed common sequencer jobs that can be reused.

§    Implemented Change Data Capture to load data concurrently into the target DB and flat files.

§    Developed jobs for SDMS reports used on a daily basis by the business.

§    Used various stages such as Join, Lookup, Change Capture, Filter, Funnel, Copy, Column Generator, Peek, Dataset, Sequential File, Pivot, DB2/UDB, Merge, Transformer, Aggregator, and Remove Duplicates.

§    Identified and resolved source file format and data quality issues in production loading.

§    Used configuration, problem, and change management tools such as ClearCase and Action Remedy.

§    Played a key role in resolving problem tickets raised by users.

§    Logged defects and analyzed root causes and data validations.
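The Change Data Capture step described above can be sketched as a snapshot diff: compare a "before" and "after" snapshot keyed on the business key, and emit insert/update/delete actions, much as the DataStage Change Capture stage does. This is a minimal illustration with hypothetical keys and values, not the project's actual implementation.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of change data capture via snapshot comparison.
public class ChangeCapture {

    public static List<String> diff(Map<String, String> before,
                                    Map<String, String> after) {
        List<String> changes = new ArrayList<>();
        // Keys present after: either new (insert) or modified (update).
        for (Map.Entry<String, String> e : after.entrySet()) {
            String old = before.get(e.getKey());
            if (old == null) {
                changes.add("INSERT " + e.getKey());
            } else if (!old.equals(e.getValue())) {
                changes.add("UPDATE " + e.getKey());
            }
        }
        // Keys that vanished from the after snapshot are deletes.
        for (String key : before.keySet()) {
            if (!after.containsKey(key)) {
                changes.add("DELETE " + key);
            }
        }
        return changes;
    }

    public static void main(String[] args) {
        Map<String, String> before = new HashMap<>();
        before.put("TXN1", "100");
        before.put("TXN2", "200");
        Map<String, String> after = new HashMap<>();
        after.put("TXN1", "150");   // changed amount
        after.put("TXN3", "300");   // new transaction
        for (String c : diff(before, after)) {
            System.out.println(c);  // UPDATE TXN1, INSERT TXN3, DELETE TXN2
        }
    }
}
```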

 

Environment: DataStage 7.5, DB2, Cognos, AIX, ILOG

 

 

 

 



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for my present employer only.

Active Security Clearance:

None

US Military Service:

Citizenship:

Other

 

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-OH-Cincinnati

Relocate:

Yes

Willingness to travel:

Up to 100%