From:                              route@monster.com

Sent:                               Saturday, May 07, 2016 5:10 AM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Cloud

 

This resume has been forwarded to you at the request of Monster User xapeix03

Peng Lim 

Last updated:  07/12/15

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received

804 Townsend Road
Dublin, OH  43016
US

Home: (614) 668-3982
pengylim@hotmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Peng_Lime_June2015

Resume Value: buw5kzb42ay6ty7k   

  

 

 

 

PENG YEW LIM

Senior Data Warehouse Architect

BMW Financial Services Spot Award holder

 

 

Address: Dublin, OH 43016-8421

Email: pengylim@hotmail.com

Tel (614) 668 3982 (Cell)

 

Employment Objective

A position that utilizes my skill sets to achieve the company’s business and technical goals.

 

Education

Master of Engineering (Electrical and computer systems) – RMIT University, Australia

 

Skills

·  Over twenty-four (24) years of IT experience.

·  Eighteen (18) years of solid data warehousing experience in banking, health care, manufacturing, retail, telecommunications, retirement and software services.

·  Implemented thirteen (13) data warehouses, one (1) Decision Support System (DSS), three (3) operational data stores (ODS) and one (1) hub.

·  Lead Data Warehouse Architect/Data Architect/Informatica Architect/Manager. Provided expertise on data warehouse, data mart and ODS architectures, data modeling, design, ETL coding, reporting and implementation.

·  Data Modeling Expert.

·  Mastered a set of strategies for delivering projects on time and on budget.

·  BI guru; a go-to person for getting your queries answered.

·  Provided expertise on high performance Informatica coding strategy and session tuning.

·  In-depth knowledge of data modeling, high-performance database and SQL tuning, Extract, Transform and Load (ETL), ROLAP, MOLAP and DSS, data manipulation, DBA duties, Erwin and Cognos.

·  Designed ETL do-and-don’t protocols, table relationship standards, coding standards, and data de-duplication and cleansing standards.

·  Provided mentoring and knowledge transfer to developers, guiding them through every aspect of the development phases.

·  Attention to detail. Able to handle multiple tasks concurrently.

·  Effective communicator. Excellent speaking and writing skills.

·  Awarded the BMW Financial Services Spot Award for designing, planning and executing the smoothest ETL process.

·  Databases: DB2 UDB, Oracle 11g, 10g, 9i, 8i & 7, Oracle Automatic Workload Repository (AWR), Microsoft SQL Server 2000, 2005, 2007, Sybase 11, Teradata 14.10.

·  Tools: Informatica 4.5, 5.0, 6.x, 7.1.2, 8.1.1, 8.6, 9.1, 9.5, Transact-SQL, Cognos ReportNet, Cognos Query Studio, Cognos Report Studio, Cognos Framework Manager, Business Objects, Business Objects Data Services, Autosys, Erwin, Perl, Vantive, WinCVS, Subversion, Bourne and Korn shell scripts, WinSCP, ftp, Test Qualifier, Big Data, Hadoop, Hive, Cloudera Impala, Hue, Sqoop

 

 

Project Experience

 

Client: Covance

Period: Oct 2014 – Present

Project: Data Warehouse Assessment and Remedy

Role: Data Warehouse Architect

Team Size: 14

Software: Erwin, BODS, BODI, Oracle 11g, Oracle SQL Developer, UNIX, Shell Script, Big Data, Hadoop, Hive, Cloudera Impala, Hue, Sqoop, PIG

 

Project Location: Indianapolis, IN

 

The data warehouse had been designed and developed since 2008 and faced challenges in production: data integrity problems, missing data, out-of-sync data, duplication and performance issues. My task was to assess the data warehouse, find the root causes, and design the architecture and code templates to fix them. I also tuned the systems where performance was poor.

 

Developed a best-practice ETL strategy so that inefficient processes could be replaced with best-practice, industry-accepted solutions.

 

Removed inefficient constructs such as triggers and cascading deletes and replaced them with efficient SQL.

 

Used Oracle Automatic Workload Repository (AWR) to check SQL run-time statistics.

Tuned SQL, cutting the loading time from 76 hours to 4 hours.

 

Replaced inefficient workflows with a series of jobs that encapsulate the workflows, so that jobs can be restarted from the point of failure.
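The restart-from-point-of-failure pattern described above can be sketched as a checkpoint file recording completed jobs, so a rerun resumes where the last run stopped. This is an illustrative sketch, not the actual implementation; the function and file names are hypothetical:

```python
import json
import os

def run_pipeline(steps, checkpoint_path):
    """Run named steps in order, skipping any recorded as complete.

    Illustrative sketch: each job writes its completion to a
    checkpoint file, so a rerun after a failure resumes from the
    failed step instead of restarting the whole workflow.
    """
    done = set()
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as fh:
            done = set(json.load(fh))
    executed = []
    for name, job in steps:
        if name in done:
            continue  # completed in a previous run; skip on restart
        job()  # may raise; earlier progress stays checkpointed
        done.add(name)
        executed.append(name)
        with open(checkpoint_path, "w") as fh:
            json.dump(sorted(done), fh)  # persist after every job
    return executed
```

A scheduler can then simply rerun the whole pipeline after a failure: completed jobs are skipped, so no loaded data needs to be truncated or rolled back.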

 

Enabled scheduling to start jobs in parallel.

 

Replaced some of the Oracle 11g stored procedures with efficient SQL.

 

Built an EDW layer based on Inmon's approach: third normal form, full history, an atomic layer, and a single version of the truth. Built the data mart layer based on Kimball's approach: star/snowflake schemas, denormalized for optimum query performance.
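As a minimal illustration of the mart layer described above, a Kimball-style star schema keeps one fact table keyed to surrounding dimension tables. The sketch below uses Python's sqlite3 purely for demonstration; all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Denormalized mart layer: one fact table surrounded by dimensions,
-- per the Kimball-style star schema described above (names invented).
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20141001, '2014-10-01');
INSERT INTO dim_product VALUES (1, 'widget');
INSERT INTO fact_sales VALUES (20141001, 1, 99.50);
""")

# A typical mart query joins the fact to its dimensions:
total = conn.execute("""
    SELECT d.cal_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.cal_date, p.name
""").fetchall()
```

The EDW layer stays in third normal form with full history; marts like this trade normalization for join simplicity and query speed.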

 

Developed many reusable shell scripts in Red Hat Linux.

 

Trained a team and mentored them on the right ways to build data warehouses.

 

Communicated effectively and conveyed messages clearly.

 

 

 

Client: Ameren

Period: Aug 2013 – Oct 2014

Project: Grid Smart Advanced Metering Infrastructure (AMI)

Role: Data Architect

Team Size: 14

Software: Erwin, Informatica 9.51, IDR 9.5, Teradata 14.1, Oracle, Oracle SQL Developer, UNIX, Shell Script, Teradata TPT Load Operators, Temporal

Project Location: St Louis, MO

 

I provided leadership and direction in setting up the new data warehouse, and set out best strategies for coding Informatica ETL mappings, performance tuning, IDR, IDQ, DVO, PDO and metadata management. Instead of a traditional Type 2 SCD, the project team implemented the advanced Teradata Temporal version of SCD, and I provided guidance on the Temporal coding.

 

Set up IDR mappings for initial and incremental loads. Turned on supplemental logging. Set up the IDR scheduler to run IDR jobs every 15 minutes.

 

Possess exceptional skill in solving complicated problems with simple, effective designs, a skill that takes years of experience to develop.

 

Possess the vision to set the right direction so that developers need not spend endless hours recoding and fixing bugs.

 

Effective in mentoring developers, architects and modelers so that the whole team has the skill sets to face challenges and complete tasks on time and on budget.

 

Communicated effectively and conveyed messages clearly.

 

I am also a detail-oriented, hands-on person who knows what it takes to get the job done.

 

 

Client: Genuine Parts Company (GPC)    

Period: May 2013 – June 2013

Project: Data warehouse and Cloud application health study

Role: Senior Data Warehouse and Cloud Application Architect and Informatica Administrator

Team Size: 40

Software: Erwin, Oracle Exadata, Informatica, Oracle SQL Developer, UNIX, Shell Script, Subversion, Informatica Administrator Console

Project Location: Atlanta, GA

 

I worked on the leadership team and provided leadership in managing both the data warehouse and the Cloud application. The Cloud application provided real-time message interchange among a series of central and store systems.

 

I successfully identified the weaknesses in the setup and put remedial solutions in place to address the issues.

 

 

Client: American Electric Power (AEP)

Period: Sep 2010 – Apr 2013

Project: Grid Smart, Electricity Generation Emission, Security Access, Electricity Distribution

Role: Senior Data Warehouse Architect

Team Size: 30

Software: Erwin, UML, DB2 UDB, Oracle, PL/SQL, MS SQL Server, T-SQL, Informatica, Advanced Query Tool, UNIX, Shell Script, Autosys, Business Objects, Informatica CDC, Informatica Web Services

Project Location: Columbus, OH

 

I worked as a senior data warehouse architect on the Electrical Grid Smart project, which deployed millions of Grid Smart meters. The meters can also be used for real-time electricity usage analysis and diagnosis of fault conditions, enabling corrective actions to be taken in real time via remote access.

 

My responsibility was to design an event-tracking analysis system in the data warehouse that enables users to analyze all fault events, so that corrective actions can be taken in real time or by dispatching service personnel. The net gain is reduced long-term maintenance cost and a healthier electricity supply.

 

My main function was to design the system and produce both logical and physical models. I also led a team to implement the solution, which involved coding complicated ETL, performance tuning and mentoring developers.

 

 

Client: Amtrak

Period: Apr 2009 – Sep 2010

Project: Engineering Asset Management

Role: Senior Data Warehouse Architect

Team Size: 6

Software: Erwin, Oracle, PL/SQL, MS SQL Server, T-SQL, Informatica, Toad, UNIX, Shell Script

Project Location: Philadelphia, PA

Amtrak provides passenger rail services nationwide, with both high-speed and regular train services running between Boston and Washington, DC.

 

My responsibility was to architect a historical asset repository to capture changes for all assets related to railway tracks. This included creating historical Oracle SDO Geometry data for tracks, stations, signals, transformers, etc. The SDO Geometry data is then displayed on Google Earth, where three- and four-dimensional topologies are drawn.

 

The source contains over three hundred tables, which are consolidated into about one hundred tables in the data mart. I designed the data mart model using Erwin and employed Informatica 8.6 to load data from source to target. Since historical data was required, I deployed the MD5 checksum generator function to detect incremental rows and used them to create Type 2 slowly changing dimension and fact rows in the target database tables. An active flag indicates the current active records, and all records, including historical ones, are bounded by their start and end dates.
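The MD5-based Type 2 technique described above can be sketched roughly as follows: hash the tracked attributes of each incoming row, and when the hash differs from the active record's stored checksum, expire that record and insert a new version. This is a simplified illustration, not the actual Informatica mapping; field names are hypothetical:

```python
import hashlib
from datetime import date

def row_checksum(row, columns):
    """MD5 over the tracked attribute values, analogous to an ETL
    checksum column used to detect changed source rows."""
    payload = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def apply_type2(current, incoming, key, tracked, today=None):
    """Close the active record and open a new version when the
    checksum differs; return the updated dimension rows."""
    today = today or date.today().isoformat()
    new_sum = row_checksum(incoming, tracked)
    out = []
    changed = True
    for rec in current:
        if rec["key"] == key and rec["active"]:
            if rec["checksum"] == new_sum:
                changed = False          # unchanged: keep row as-is
                out.append(rec)
            else:
                # Expire the old version by setting its end date.
                out.append(dict(rec, active=False, end_date=today))
        else:
            out.append(rec)
    if changed:
        out.append({"key": key, "checksum": new_sum, "active": True,
                    "start_date": today, "end_date": None,
                    **{c: incoming.get(c) for c in tracked}})
    return out
```

Comparing one checksum column instead of every tracked attribute keeps the change-detection lookup cheap when the source has hundreds of columns.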

 

 

Client: American Electric Power (AEP)

Period: May 2008 – Feb 2009

Project: MS Project Server and Project Portfolio Server

Role: Senior Data Warehouse Architect

Team Size: 10

Software: Erwin, UML, DB2 UDB, Oracle, PL/SQL, MS SQL Server, T-SQL, Informatica, Advanced Query Tool, PeopleSoft, UNIX, Shell Script, Autosys, Business Objects, Informatica CDC, Informatica Web Services

Project Location: Columbus, OH

 

AEP provides electricity for several Midwestern states and has over 19,000 employees.

 

My responsibility was to architect the reporting around PeopleSoft Enterprise Performance Management (EPM), Microsoft Project Portfolio Server, Project Server, workflow management, and PeopleSoft finance labor and material costs.

 

I created a new Portfolio and Project Server Management Data Mart to replace the old IT Project Management Data Mart. It sources its data from both the Portfolio and Project Servers and provides high-level project attributes as well as low-level task labor hours and costs. Apart from labor costs, PeopleSoft EPM provides material, software, hardware and professional services costs. This enables users to manage all company projects from a single point.

 

My duty was to provide the architecture methodology, a new incremental loading strategy based on the MD5 checksum generator, the ETL strategy, source data analysis, and ETL and report coding. Using Informatica 8.6, I coded incremental mappings to load the Portfolio and Project data mart.

 

My next assignment was to create a virtual enterprise data warehouse architecture where costs and revenues from several finance systems can be reported using a standard set of key performance indicators.

 

 

Client: Sterling Commerce, a subsidiary of AT&T

Period: Feb 2005 – May 2008

Project: Enterprise Data Warehouse

Role: Data Warehouse Architect

Team Size: 10

Software: Erwin, DB2 UDB, Oracle, PL/SQL, MS SQL Server, Sybase, Informatica, Informatica Administrator Console, Embarcadero, PeopleSoft EPM GL AP AR Sales Contract, PeopleSoft CRM, Salesforce.com, Vantive, UNIX, Shell Script, Cognos ReportNet, Cognos Query Studio, Cognos Report Studio, Cognos Framework Manager

Project Location: Dublin, OH

 

Sterling Commerce is the market leader in B2B software and services. Its customers are mainly Fortune 500 multinational corporations. The B2B software enables encrypted data transfer over secure channels.

 

The data warehouse consists of PeopleSoft finance AP, AR, GL, Journal, Billing, Contract, Sales revenue, customer support ticket data as well as software programmers work hours.

 

Led a team of data modelers and ETL and Cognos developers to build the data warehouse and data marts. Primarily, I was responsible for driving the team to deliver projects on time and on budget. My greatest strengths are setting the right direction for building data warehouses and data marts and solving complicated issues with simple solutions.

 

Architecture duty

 

·  Set up the overall system and data warehouse architecture framework.

·  Interview users on requirements and consolidate current and future needs.

·  Provide coding estimates to users.

·  Design the building phases.

·  Design data warehouse and data mart schemas using the best of breeds between Kimball and Inmon’s methodologies.

·  Ensure foreseeable future requirements can be met with minimal alterations.

·  Set up ETL strategy and standards.

·  Enable incremental loading.

·  Detect source delta rows using Change Data Capture (CDC), Checksum, comparing columns or database triggers.

·  Analyze source system and derive source system data models.

·  Ensure data from different source systems can be easily integrated at the target tables.

·  Design process flow from source systems to staging, data warehouse and data marts.

·  Define the most efficient place to process data, whether at the database layer or in the Informatica PowerCenter engine.

·  Determine a way to handle dirty data.

·  Provide Do and Don’t protocols.

·  Design and prototype coding templates and standards.

·  Transfer knowledge to developers and train them on the coding templates and business logic.

·  Assign work to developers according to their strengths and train them to attain the next level of skill sets.

 

ETL duty

·  Ensure the following ETL objectives are met

1.  Data integrity

2.  Ease of reporting

3.  High performance

4.  Efficient data loading

5.  Adhere to standard

6.  Ease of maintenance – Able to restart easily

7.  Robust mappings - Able to self-correct errors

8.  Ease of enhancement – Anyone can work on anyone else's mappings

9.  Stability

10.  Simplicity

11.  Ease of troubleshooting

12.  Data traceability

13.  Able to report on historical data

·  Solve day to day developers’ issues.

·  Review Erwin data models, coding and reports to see if they meet coding standards.

·  Check on Change Data Capture (CDC) and Slow Changing Dimension (SCD) and other mappings.

·  Test historical data marts. A historical data mart captures data with transaction timestamps, so it can be reported at any point in time in history, presenting different pictures at different points in time.

·  Perform incremental data loading: collect all changes for a table and push the delta rows to the data warehouse and data marts. This enables incremental loading of both dimension and fact tables, so the full history of the data can be maintained.

·  Maintain a data warehouse layer that processes all business rules centrally to achieve a single version of the truth. The data is then distributed to data marts as needed, giving conformed dimension tables across all data marts.

·  Tune SQL at source qualifiers and lookup transformations.

·  Ensure unique key conditions are met at every table, source qualifier, lookup and target transformation.

·  Ensure error handling is in place.

·  Work on the most complicated mappings and reports.

·  Perform SQL tuning using hash, merge or nested-loop joins.

·  Perform session tuning through process partitioning, pipe files, external loaders, and dropping and recreating indexes.

·  Bypass database transaction logs by using external loaders.

·  Upgraded Informatica from version 7.1.2 to 8.1.1.

·  Coded Unix Perl, Bourne and Korn shell scripts.

·  Queried Informatica's repository for the following metadata:

1.  Which mappings will be impacted by a proposed code change?

2.  Which SQ or Lookup transformation SQL overrides have a keyword embedded?

3.  Which mappings are saved by a specific date?

4.  Which mappings loaded zero rows and what errors did they produce?

5.  Which mappings are having a specific transformation?

6.  Which sessions use bulk or normal load?

7.  Which sequence counters do not have the recycle flag turned on?
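The historical, start/end-dated records described in the ETL objectives above support point-in-time reporting: querying the mart "as of" any date returns the picture it presented at that moment. A minimal sketch using Python's sqlite3, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id TEXT,
    city        TEXT,
    start_date  TEXT,  -- first day this version was effective
    end_date    TEXT   -- last day; '9999-12-31' marks the active row
);
-- Two Type 2 versions of the same customer (invented data):
INSERT INTO dim_customer VALUES
    ('C1', 'Dublin',   '2014-01-01', '2014-06-30'),
    ('C1', 'Columbus', '2014-07-01', '9999-12-31');
""")

def as_of(conn, as_of_date):
    """Return the dimension rows effective on the given date, i.e.
    the picture the mart presented at that point in history."""
    return conn.execute(
        "SELECT customer_id, city FROM dim_customer "
        "WHERE start_date <= ? AND end_date >= ?",
        (as_of_date, as_of_date)).fetchall()
```

Because every version carries its effective date range, the same fact data can be joined to the dimension as it looked on any reporting date.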

 

Report generation duty

·  Assign work.

·  Ensure business rules are met.

·  Ensure a consistent look and feel across all reports.

·  Create master, detail and drill-through reports.

·  Test drill-through reports to verify that the sums of detail records match the master records.

 

 

Client: BMW Financial Services

Period: Apr 2004 – Feb 2005

Project: Enterprise Data Warehouse

Role: Lead Data Warehouse Architect

Team Size: 10

Software: Erwin, MS SQL Server, T-SQL, Informatica, Toad, UNIX, Shell Script

Project Location: Dublin, OH

Award: BMW Financial Services Spot Award holder

 

·  Led a team of five to convert the Enterprise Data Warehouse (EDW) to a daily incremental loading process.

·  Expanded the data warehouse and merged data from six additional data sources.

·  Implemented five new data marts and six mini cubes.

·  Designed and implemented new ETL strategy, coding standards, codes reuse, unit and regression testing methodology.

·  Designed a high-performance ETL flow, tuned Transact-SQL statements and databases, and implemented an efficient multi-stage data cleansing strategy.

·  For ease of maintenance, implemented robust program restart strategies: programs may stop and restart safely without truncating or rolling back loaded data.

·  Implemented error recycling process.

·  Extensively queried Informatica metadata to examine coding bugs.

·  Mentored and trained ETL developers.

 

 

Client: Michigan State Office of Retirement Services

Period: Feb 2002 - Apr 2004.

Project: Data conversion project. 

Role: Operation Data Store (ODS Hub) lead architect

Team Size: 7

Software: Erwin, MS SQL Server, DTS, WinCVS

Project Location: Lansing, MI

 

·  Designed data model for the data hub using Erwin based on business requirement analysis. 

·  Designed and implemented ETL strategy, data hub interfaces, migration, integration testing, database sizing, audit trail, error handling and security.

·  Mentored team members on coding the ETL modules using SQL Server 2000.

·  Coded in Data Transformation Services (DTS).

 

 

Client: State Teachers Retirement Systems

Period: Sep 2001- Dec 2001 and also Apr 1999 - Jun 2000

Project: Data Warehouse

Role: Data Warehouse Guru

Team Size: 7

Software: Erwin, MS SQL Server, T-SQL, DB2 UDB, Informatica, Toad, UNIX, Shell Script, Perl, WinCVS

Project Location: Columbus, OH

 

·  Provided technical team leadership, mentoring team members in designing, coding and implementing a data warehouse using Informatica.

·  Designed data model in Erwin.

·  Mined Informatica's repository and performed SQL tuning, Shell and Perl script coding, database design, and data mart construction.

·  Designed do-and-don't ETL coding methodologies.

·  Implemented high performance loading and error recycling strategies.

 

 

Client: Submitorder.com

Period: June 2000 - Aug 2001

Project: Data Warehouse

Role: Data Warehouse Architect/DSS Application Architect

Team Size: 8

Software: Erwin, MS SQL Server, Oracle, Informatica, Toad, UNIX, Shell Script

Project Location: Dublin, OH

        

 

·  Designed DSS data model and implemented ETL.

·  Designed star schemas.

·  Coded in Informatica to extract XML data from job queues, parsed and populated tables for the Oracle 9i databases.

 

Client: Lucent Technologies

Period: Feb 2000 - Sep 2000

Project: Scalability and availability study

Role: Team Lead at Lucent Technologies

Team Size: 4

Software: Oracle

Project Location: Columbus, OH

 

·  Managed a team working on scalability and availability studies to scale the workload.

 

National Australia Bank.                                                         Aug 1998 - Jan 1999

Data Warehouse Senior Consultant

 

·  Performed data model design and ETL coding in Oracle 8i.

·  Populated data warehouse and data marts.

·  Generated high risk lending reports for auditing and monitoring purposes.

 

IBM-GSA TELSTRA ALLIANCE.                                                          Feb 1998 - Aug 1998

Senior Consultant

 

·  Designed and coded the import and export modules for long-distance carrier churn using Oracle Designer 2000 and an Oracle database.

 

Australian Hospital Care Group.                                            Jun 1996 - Jan 1998

Data Mart Senior Consultant

 

·  Implemented data marts consolidating financial data from all hospital branches to generate monthly KPI, profit-and-loss, and actual-vs-budget reports.

 

Security Mailing Services Pty Ltd.                                                     May 1993 - May 1996

Chief Analyst Programmer

 

·  Developed shell and Clipper scripts to cleanse and format data for the high volume Xerox printers.

 

Leigh Mardon Pty Ltd.                                                       Sep 1991 - May 1993

Programmer Analyst

 

·  Developed programs to format incoming bank data for check printing.

 

TRI-M Technologies(S) Pte Ltd.                                                          Aug 1988 - Jul 1991

System Engineer

 

·  Maintained ASK ManMan MRPII system.

·  Developed operation reports in Power House Quiz.

 

References

Available upon request

 

 

 

 

 



Additional Info


 

Current Career Level:

Manager (Manager/Supervisor of Staff)

Years of relevant work experience:

More than 15 Years

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

 

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-OH-Columbus/Zanesville

Relocate:

No

Willingness to travel:

Up to 25% travel

 

Languages:

Chinese - Mandarin: Beginner