From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:02 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Shaifali Singla 

Last updated:  01/13/15

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Walnut Creek, CA  94598
US


 

 

RESUME

  

Resume Headline: Shylaja Chirra


  

 

 

 

Ms. Shylaja Chirra

cshylu@gmail.com

Mobile: 908-405-7077

Informatica/Talend Developer

 

•       7+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter.

•       Sun Certified Java Programmer.

•       5+ years of experience using Informatica PowerCenter (7.1.3/8.6.1/9.0.1).

•       1+ year of experience with the reporting tool Cognos (8.4).

•       1+ year of experience with Talend Open Studio and Talend Integration Suite.

•       Knowledge of full life cycle development of Data Warehousing.

•       Extensive experience developing ETL programs supporting data extraction, transformation and loading using Informatica PowerCenter.

•       Experience with dimensional modeling using star schema and snowflake models.

•       Understand business rules based on high-level design specifications and implement the corresponding data transformation methodologies.

•       Created UNIX shell scripts to run Informatica workflows and control the ETL flow.

•       Developed OLAP applications using Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio and Analysis Studio) and extracted data from the enterprise data warehouse to support analytics and reporting for corporate business units.

•       Strong grasp of relational database design concepts.

•       Extensive Informatica performance-tuning experience covering source-level, target-level and map-level bottlenecks.

•       Excellent experience with Talend ETL, using features such as context variables, database components like tMSSqlInput and tOracleOutput, tMap, file components like tFileCopy, tFileCompare and tFileExist, ELT components, etc.

•       Strong business understanding of verticals such as Banking, Brokerage, Insurance, Mutual Funds and Pharmaceuticals.

•       Independently perform complex troubleshooting, root-cause analysis and solution development.

•       Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible in work schedules; good communication skills.

•       Team player, motivated, able to grasp things quickly, with analytical and problem-solving skills.

•       Strong technical, oral and written communication skills.

 

PROFESSIONAL EXPERIENCE:

 

Client: JPMorgan Chase & Co, Tampa, FL

Role: ETL Consultant                               Duration: May 2014 to Present

 

Description: JPMorgan Chase & Co. is a leading financial institution in the United States. Control Room Risk Assessment Data Analytics & Reporting (RADAR) is a project for risk analysis and reporting that protects the security and confidentiality of information, providing an organized, effective and compliant approach to responding to any potential breach of JPMorgan Chase information.

 

Data Privacy was one of the contexts on-boarded into the Control Room RADAR portal application. RADAR receives a manual file from the GLASS source system; using Informatica, the .xlsx files are extracted, transformed and loaded into the data warehouse. The data is then available for dashboard and reporting requirements in the RADAR application portal for end users.

 

Responsibilities:

 

•       Worked with Business Analysts to understand the business requirements.

•       Designed functional specification documentation, following the existing generic framework designs, for all on-boarding contexts in the RADAR application portal.

•       Studied the metadata-driven framework design and process workflows.

•       Involved in design and development of complex ETL mappings.

•       Created context-specific views on top of fact tables depending on reporting requirements.

•       Validated files uploaded through the portal and loaded into staging, dimension and fact tables.

•       Coordinated with the offshore team to develop the context-specific applications.

•       Verified unit test case scenarios, performance issues and defect fixes in the SIT and UAT environments.

•       Coordinated with the support team to deploy code to QA, UAT and Prod, and fixed issues during the deployment phase.

 

Environment: Informatica 9.5.1, Oracle 11g, UNIX, Autosys, Cognos, QlikView

 

 

Client: CenturyLink, Gardner, KS

Role: Informatica Developer          Duration: August 2012 to May 2014

 

Description:

CenturyLink, Inc. is a multinational communications company headquartered in Monroe, Louisiana. It provides communications and data services to residential, business, governmental and wholesale customers, and acquired Embarq and Qwest before doing business as CenturyLink. The Network Inventory Consolidation Program is the conversion project for the Qwest acquisition: the Informatica ETL tool pulls data from the source systems CIRAS and MetaSolv, transforms it, and loads it into TGT_FILE_TABLES, which have the structure of the TIF generated and loaded into the TIRKS target system.

 

Responsibilities:

•       Design, development and documentation of the ETL (Extract, Transform & Load) strategy to populate the Data Warehouse from the various source systems.

•       Worked on Informatica 9.0.1 client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

•       Involved in design and development of complex ETL mappings.

•       Implemented partitioning and bulk loads for loading large volumes of data.

•       Worked on dimensional modeling through its three phases: conceptual, logical and physical.

•       Based on the requirements, used transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in mappings.

•       Developed Mapplets, Worklets and reusable transformations for reusability.

•       Identified performance bottlenecks and tuned sources, targets, mappings, transformations and sessions to optimize session performance.

•       Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.

•       Monitored jobs in Workflow Monitor to resolve critical issues.

•       Experienced in release management and deployment of code.

•       Performance tuning via session partitions, dynamic cache memory and index cache.

•       Developed Informatica SCD Type-I and Type-II mappings. Used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy and Mapplets.

•       Implemented update strategies, incremental loads, change data capture and incremental aggregation.

•       Worked extensively with lookup caches: static, dynamic, persistent and shared.

•       Developed workflow tasks such as Email, Event Wait, Event Raise, Timer, Command and Decision.

•       Created stored procedures in PL/SQL.

•       Used the pmcmd command to start, stop and ping the server from UNIX, and created UNIX shell and Perl scripts to automate the process.

•       Created UNIX shell scripts called as pre-session and post-session commands.

•       Developed documentation for all routines (mappings, sessions and workflows).

•       Involved in design and development of business requirements in liaison with business users and technical teams, gathering requirement specification documents and identifying data sources and targets.

•       Analyzed application requirements, provided recommended designs, and studied the current system to understand the existing data structures.

•       Participated actively in user meetings and collected requirements from users.

•       Used Informatica PowerCenter for extraction, transformation and loading (ETL) of source data across heterogeneous database sources such as Oracle, SQL Server and flat files.

•       Designed and developed a number of complex mappings using transformations such as Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (connected and unconnected), Filter, Update Strategy, Stored Procedure and Sequence Generator, and used reusable transformations as well as mapplets.

•       Worked with Workflow Manager to create tasks such as Worklets, Sessions, Batches, Event Wait, e-mail notifications and Decision, and to schedule jobs.

•       Used Slowly Changing Dimensions Type II extensively in data mappings to load dimension tables in the data warehouse.

•       Administered the repository by creating folders and logins for group members and assigning the necessary privileges using Informatica Repository Manager.

•       Responsible for version control and upgrades.

•       Created partitions in mappings to improve the performance of Informatica sessions.

•       Performed extensive performance tuning, using the Debugger to determine bottlenecks at targets, sources, mappings, sessions or the system, which improved session performance.

•       Wrote UNIX shell scripts and used the pmcmd command-line utility to automate batches and sessions under the ESP Workload Manager.

•       Involved in production scheduling to set up job ordering and provided 24x7 production support.

•       Involved in troubleshooting and resuming failed jobs.
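The pmcmd-driven automation described above can be sketched as a small wrapper script. The integration service, domain, folder and workflow names below are hypothetical, and the pmcmd invocation is echoed rather than executed so the sketch runs without an Informatica installation:

```shell
#!/bin/sh
# Hedged sketch of a pmcmd workflow wrapper. IS_DEV, DOM_DEV, FOLDER_ETL
# and the workflow names are illustrative, not taken from the resume.
INT_SERVICE="IS_DEV"
DOMAIN="DOM_DEV"
FOLDER="FOLDER_ETL"

run_workflow() {
    wf="$1"
    # A real call would look like:
    #   pmcmd startworkflow -sv "$INT_SERVICE" -d "$DOMAIN" \
    #         -uv INFA_USER -pv INFA_PASS -f "$FOLDER" -wait "$wf"
    # Here the command is echoed so the sketch is runnable anywhere.
    echo "pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -f $FOLDER -wait $wf"
}

run_workflow wf_load_stage
run_workflow wf_load_dims
```

With -wait, pmcmd blocks until the workflow finishes and its exit status can gate the next step of the batch.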

Environment: Informatica PowerCenter 9.0.1/8.6.1, PuTTY, MS SQL Server 2005/2008, Oracle 10g/9i, SQL, PL/SQL, shell scripts, Windows XP, UNIX.

 

 

Client: Humana Inc, Louisville, KY                              Duration: October 2011 to July 2012

Role: ETL Informatica Developer

 

Description:

Humana Inc., headquartered in Louisville, Kentucky, is a leading health care company that offers a wide range of insurance products and health and wellness services. Humana provides Medicare Advantage plans and prescription drug coverage to more than 3.5 million members throughout the US.

The main objective of this project, the Shared Data Repository, is to capture new Vitality program customer data, policies, group policies, and HumanaOne and non-HumanaOne Medicare plans.

Data comes from various sources such as SQL Server and Mainframe, and is loaded into the EDW at different frequencies per the requirements. The entire ETL process consists of source systems, a staging area, the data warehouse and data marts.

 

Responsibilities:

•      Developed ETL programs using Informatica to implement the business requirements.

•      Communicated with business customers to discuss issues and requirements.

•      Created shell scripts to fine-tune the ETL flow of the Informatica workflows.

•      Used Informatica file-watch events to poll the FTP sites for the external mainframe files.

•      Provided production support to resolve ongoing issues and troubleshoot problems.

•      Performed performance tuning at the functional and map levels; used relational SQL wherever possible to minimize data transfer over the network.

•      Effectively used Informatica parameter files to define mapping variables, workflow variables, FTP connections and relational connections.

•       Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.

•       Worked effectively in an Informatica versioned environment and used deployment groups to migrate objects.

•       Used the Debugger to identify bugs in existing mappings by analyzing the data flow and evaluating transformations.

•       Worked effectively in an onsite/offshore work model.

•       Used pre- and post-session assignment variables to pass variable values from one session to another.

•       Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks, and used the Informatica scheduler to schedule jobs.

•       Reviewed and analyzed functional requirements and mapping documents; problem solving and troubleshooting.

•       Performed unit testing at various levels of the ETL and was actively involved in team code reviews.

•       Identified problems in existing production data and developed one-time scripts to correct them.

•       Fixed invalid mappings and troubleshot technical problems in the database.
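The file-watch polling mentioned above can be approximated in plain shell; the landing directory and file name used here are hypothetical, and in production Informatica's Event Wait task would do this natively:

```shell
#!/bin/sh
# Minimal file-watch sketch: poll a landing directory until an expected
# file arrives or the retry budget is exhausted. Names are illustrative.
wait_for_file() {
    dir="$1"; name="$2"; tries="$3"
    i=0
    while [ "$i" -lt "$tries" ]; do
        if [ -f "$dir/$name" ]; then
            echo "found"        # file has landed; downstream load can start
            return 0
        fi
        i=$((i + 1))
        sleep 1                  # wait before polling again
    done
    echo "timeout"               # file never arrived within the budget
    return 1
}
```

A workflow wrapper would call wait_for_file before kicking off the session that consumes the mainframe extract.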

 

Environment: Informatica 8.6.1, SQL Server 2008 R2, Linux

 

 

Client: Allstate, Chicago, IL                              Duration: August 2010 to Sep 2011

Role: ETL Talend Developer

Description:

Allstate is one of the fastest-growing auto/property/life insurance companies. It serves its customers by offering a range of innovative products to individual and group customers at more than 600 locations through its company-owned offices.

 

The primary objective of this project is to capture customer, policy, claim, agent, product and financial data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded into the data warehouse using Talend, and various reports were generated on a daily, weekly, monthly and yearly basis. These reports detail the various Allstate insurance products sold; they are used to identify agents for rewards, awards and performance, and to provide risk-analysis reports for business development managers.

 

Responsibilities:

•       Worked with the data mapping team to understand the source-to-target mapping rules.

•       Analyzed the requirements and framed the business logic for the ETL process using Talend.

•       Involved in the ETL design and its documentation.

•       Developed jobs in Talend Enterprise Edition from stage to source, intermediate, conversion and target.

•       Worked on Talend ETL to load data from various sources into Oracle DB, using tMap, tReplicate, tFilterRow, tSortRow, tWaitForFile and various other components.

•       Worked on Talend ETL features such as context variables, database components like tMSSqlInput and tOracleOutput, file components, ELT components, etc.

•       Followed the organization-defined naming conventions for the flat-file structures, Talend jobs and the daily batches executing the Talend jobs.

•       Worked on context variables and defined contexts for database connections and file paths to ease migration between project environments.

•       Implemented error handling in Talend to validate data integrity and data completeness for the data from the flat files.

•       Tuned sources, targets and jobs to improve performance.

•       Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput, tOracleOutput, etc.

•       Participated in weekly end-user meetings to discuss data quality and performance issues, ways to improve data accuracy, new requirements, etc.

•       Involved in migrating objects from DEV to QA, testing them and then promoting them to Production.

•       Responsible for developing, supporting and maintaining the ETL (Extract, Transform and Load) processes using Talend Integration Suite.

•       Involved in automating the FTP process in Talend and FTPing the files in UNIX.

•       Created the Talend Development Standards document, describing general guidelines for Talend developers, the naming conventions to use in transformations, and the development and production environment structures.

•       Extracted data from Oracle as one of the source databases.

•       Optimized the performance of the mappings through various tests on sources, targets and transformations.
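The per-environment context handling described above usually surfaces at run time as parameters passed to an exported Talend job's launcher script. The job name, context names and parameter values below are hypothetical, and the launcher call is echoed so the sketch stays runnable:

```shell
#!/bin/sh
# Sketch of launching an exported Talend job with context parameters.
# LoadCustomers, the context names and db_host values are illustrative.
launch_job() {
    ctx="$1"; db_host="$2"
    # An exported Talend job ships with a *_run.sh launcher; a real call
    # would look like:
    #   ./LoadCustomers_run.sh --context="$ctx" --context_param db_host="$db_host"
    echo "LoadCustomers_run.sh --context=$ctx --context_param db_host=$db_host"
}

launch_job DEV localhost
launch_job PROD dbprod.example.com
```

Keeping connection details in contexts means the same compiled job runs unchanged in DEV, QA and PROD; only the launcher arguments differ.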

 

Environment: Talend 5.1.1, Oracle 11g, UNIX

 

Client: Adobe Inc, San Jose, CA                                                        Duration: April 2010 to August 2010

Role: ETL Developer

Description:

This position required implementing a data warehouse for forecasting, marketing and sales performance reports. The data is obtained from relational tables and flat files. I was involved in cleansing and transforming the data in the staging area and then loading it into Oracle data marts. These data marts form an integrated data mine that feeds extensive reporting.

 

Responsibilities:

•       Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems into the target database.

•       Created mappings using the Designer, extracted data from various sources and transformed the data according to the requirements.

•       Involved in extracting data from flat files and relational databases into the staging area.

•       Migrated mappings, sessions and workflows from Development to Test and then to the UAT environment.

•       Developed Informatica mappings and reusable transformations to facilitate timely loading of star-schema data.

•       Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data-flow management into multiple targets using Routers.

•       Created sessions, extracted data from various sources, transformed the data according to the requirements and loaded it into the data warehouse.

•       Used transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica PowerCenter Designer.

•       Imported various heterogeneous files using the Informatica PowerCenter 8.x Source Analyzer.

•       Developed several reusable transformations and mapplets used in other mappings.

•       Prepared technical design documents and test cases.

•       Involved in unit testing and resolution of various bottlenecks encountered.

•       Implemented various performance-tuning techniques.

 

Environment: Informatica PowerCenter 8.1.1, Oracle 9i, Windows NT.

 

Client: Prudential Financial, Inc., Newark, NJ                    Duration: Oct 2009 to March 2010

Role: ETL Informatica Developer

Description:

Prudential Financial companies serve individual and institutional customers worldwide and include The Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These companies offer a variety of products and services, including mutual funds, annuities, real estate brokerage franchises, relocation services, and more. The role involved the development and implementation of goals, policies, priorities and procedures relating to financial management, budget and accounting, and analysis of monthly actual results versus plan and forecast.

 

Responsibilities:

 

•       Involved in design, development and maintenance of the database for the data warehouse project.

•       Involved in business user meetings to understand their requirements.

•       Designed, developed and supported the Extraction, Transformation and Load (ETL) process for data migration with Informatica 7.x.

•       Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

•       Created complex mappings involving Slowly Changing Dimensions, implementation of business logic, and capture of records deleted in the source systems.

•       Worked extensively with connected Lookup transformations using a dynamic cache.

•       Worked with complex mappings averaging 15 transformations.

•       Created and scheduled sessions and jobs to run on demand, on time or only once.

•       Monitored workflows and sessions using Workflow Monitor.

•       Performed unit testing, integration testing and system testing of Informatica mappings.

•       Coded PL/SQL scripts.

•       Wrote UNIX and Perl scripts for business needs.

•       Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process.

•       Created Universes and generated reports using the star schema.

 

Environment: Informatica PowerCenter 7.1.3, Oracle, UNIX

 

 

Client: Oakwood Healthcare System, Dearborn, MI                    Duration: Nov 2008 to Oct 2009

Role: Cognos Developer

Description:

The Oakwood Healthcare System serves 35 different communities in southeastern Michigan with over 40 primary and secondary care locations. Responsibilities include working with the clinical analytics team on the measurement of provider performance, quality improvement initiatives, and various ad-hoc requests. The reports are created, distributed and published using various Cognos BI tools like ReportNet, Impromptu, Power Play, IWR, and UpFront to the end-users. The application had OLAP features like Drill Down analysis, Multidimensional analysis, Prompts, Exception Highlighting and User Privileges.

 

Responsibilities:

 

•       Developed models in Framework Manager.

•       Published packages and managed the distribution and setup of the environment.

•       Used Query Studio for creating ad-hoc reports.

•       Created complex and multi-page reports using Report Studio.

•       Performed migration from Impromptu to ReportNet.

•       Used schedule management in Cognos Connection.

•       Produced bursting reports and multilingual reports using Report Studio.

•       Developed layouts, pages, object containers and packages using Report Studio.

•       Created reports using ReportNet with multiple charts and reports.

•       Responsible for assigning user sign-ons for new users.

•       Provided guidance to report creators on enhancement opportunities.

•       Created multidimensional cubes using PowerPlay and published them on the UpFront portal using PowerPlay Enterprise Server.

•       Developed PowerPlay cubes; used multiple queries, calculated measures, customized cube content and optimized cube creation time.

•       Fine-tuned the cubes and periodically checked database space and cube growth.

•       Responsible for creating new user groups and user classes using Access Manager.

 

Environment: Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio), SQL Server 2005.

 

TECHNICAL SKILLS

 

Operating Systems:       Windows, Linux, HP-UX

Software / Applications: MS XP, MS 2000, MS Word, MS Excel, MS Access, Outlook, PowerPoint

Database:                SQL Server 2008/2005/2000, Oracle 10g/9i/8i

ETL:                     Informatica PowerCenter 7.1.3/8.6.1/9.0.1, Informatica Power Exchange 8.6.1, Talend 5.1.1

Modeling:                Framework Manager, PowerPlay Transformer

OLAP/BI Tools:           Cognos 8 Series

Languages:               Java, HTML, XML, SQL, PL/SQL

Web/App Servers:         IBM WebSphere 4.x, Sun iPlanet Server 6.0, IIS, Tomcat

Tools:                   TOAD, Visio, Eclipse

 

 

Education:

Bachelor of Science (Osmania University)

Master Diploma in Computer Applications (Osmania University)

 

Certifications:

SCJP (Sun Certified Java Programmer)



Experience


 

Job Title:   ETL DEVELOPER

Company:     Cognizant Technology Solutions U.S. Corporation

Experience:  - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

ETL DEVELOPER

Desired Job Type:

Temporary/Contract/Project

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Systems Analysis - IT

 

Target Locations:

Selected Locations:

US-CA-Oakland/East Bay

Relocate:

No

Willingness to travel:

No Travel Required

 

Languages:

English - Fluent