From: route@monster.com

Sent: Monday, September 28, 2015 1:03 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Solomon Tu 

Last updated:  12/11/14

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Darnestown, MD  20874
US


 

 

RESUME

  

Resume Headline: Solomon Tu - Sr. Data Analyst

Resume Value: m98tpabppx4czppx   

  

 

Solomon

 

SUMMARY

·               Over 8 years of IT experience in Data/Business Analysis, ETL Development, Data Modeling, and Project Management.

·               Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, Metadata Management Services, and Configuration Management.

·               Over 2 years of experience in Data Modeling, with expertise in creating Star and Snowflake schemas, fact and dimension tables, and physical and logical data models using Erwin and Embarcadero.

·               Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.

·               Knowledge of Business Intelligence tools such as Business Objects, Cognos, and OBIEE.

·               Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad, and MultiLoad.

·               Strong experience in interacting with stakeholders/customers, gathering requirements through interviews, workshops, and existing system documentation or procedures, defining business processes, identifying and analyzing risks using appropriate templates and analysis tools.

·               Experience in various phases of Software Development life cycle (Analysis, Requirements gathering, Designing) with expertise in documenting various requirement specifications, functional specifications, Test Plans, Source to Target mappings, SQL Joins.

·               Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting a project’s exposure to the forces of change.

·               Experience in coding SQL/PL SQL using Procedures, Triggers and Packages.

·               Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies

·               Worked on different platforms such as Windows 95/98/NT, UNIX, Sun Solaris, AIX, and HP.

·               Experience working with different industries, including Financial, Media, Retail, and Banking.

·               Implemented optimization techniques for better performance on both the ETL side and the database side.

·               Experience in creating functional/technical specifications, data design documents based on the requirements

·               Excellent Communication, interpersonal, analytical skills and strong ability to perform in a team as well as individually.

 

EDUCATION

Bachelor’s (B. Tech.) degree in Computer Science, Dhaka, Bangladesh

TECHNICAL SKILLS

 

Languages

T-SQL, Pl/SQL, R, Python

Data Visualization

 

Tableau

Data Analysis

SAS, R

Modeling Technique

Predictive Modeling, ANOVA, Linear Regression, Logistic Regression, Cluster Analysis

Web Technologies

HTML, DHTML, JavaScript, XML

Databases

MS SQL Server (2012/2008/2005/2000/7.0), MS Access, Oracle 10g

 

Reporting

SQL Server Reporting Services (SSRS), Tableau

Process/Model Tools

MS Office, MS Project, Visio, Rational Rose, RequisitePro, ClearCase, ClearQuest, MS Excel, MS PowerPoint, MS Word

 

 

PROFESSIONAL EXPERIENCE

 

Farmers Insurance, Los Angeles, CA.                                                                              June’13 – Present

Sr. Data Analyst                                                                                         

 

Hero sales and service System  

Worked as a Sr. Data Analyst to create a sales and service data mart required for Business Intelligence and reporting purposes. The existing Hero system, which allows agents to enter and modify new sales/service data, is used as one of the major source systems.

 

Help-Point Claims Services

 

Worked as Sr. Data Analyst to perform data quality and integrity checks for the Claim Settlement & Help-Point Claim Service data warehouse. The Claim Settlement & Help-Point Claim Service e-business application offers claim services to insurance customers by giving them direct access to their claim details and other required information on the site, reducing calls to the Help-Point center and benefiting both the business and the customers. The application provides a self-service tool to track the claim process.

 

Responsibilities:

·   Work with users to identify the most appropriate source of record and profile the data required for sales and service.

·   Document the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.

·   Involved in defining the business/transformation rules applied for sales and service data.

·   Define the list codes and code conversions between the source systems and the data mart.

·   Worked with internal architects, assisting in the development of current and target state data architectures.

·   Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines

·   Involved in defining the source to target data mappings, business rules, business and  data definitions

·   Analyzing different source systems and interfaces that interact with the Data Distribution Environment (UNIX).

·   Executed marketing campaigns by collaborating with senior business partners across the enterprise. Extracted and analyzed billions of rows of data using Teradata SQL, SAS scripts, and Excel to solve complex business problems.

·   Develop FastLoad and MultiLoad scripts to load data from legacy systems and flat files through Informatica to the Teradata data warehouse.

·   Built VBA macros for recurring jobs that pull data for specific requirements.

·   Created daily, weekly, and monthly reports using Teradata, MS Excel, and UNIX.

·   Created tables and views on top of the Finance data mart/production databases per ad-hoc requests, using Teradata, BTEQ, and UNIX.

·   Created different types of tables (Set, Multiset, Derived, Volatile), macros, views, and procedures using SQL scripts.

·   Extensively worked on Unix Shell Scripts for automating the SQL scripts, checking daily logs, sending e-mails, purging the data, extracting the data from legacy systems, archiving data in flat files to Tapes, setting user access level, automating backup procedures, Scheduling the Jobs, checking Space usage and validating the data loads.

·   Creating and executing DATA Test Plans and data test strategies for the projects.

·   Worked on validating the performance of a project after the code optimization to make sure that it doesn’t miss any ETA on production.

·   Involved in compiling, analyzing, and formatting data from multiple sources to create reports.

·   Compared the output data of optimized code with the old version to make sure the functionality and the Business rules have not been affected by the code changes.

·   Creation of all test scripts using Teradata queries to ensure coding is accurate.

·   Create Teradata tables from Excel using macros or SAS for unit testing.

·   Analyzed existing data processes to make recommendations for improvements

·   Created automated scripts to monitor and validate data quality on a daily basis.

·   Worked with importing external data files and dealing with missing data and data anomalies in SAS data sets and raw data files.

·   Analyze data, writing ETL scripts, designing data structures for efficient report access and creating reports and data cubes.

·   Developed strategy for the data management and best practices.

·   Responsible for creating the complicated SQLs to verify the Business logic for the reports.

·   Maintain SAS Data by using existing codes and develop new codes if necessary.

·   Validation checks on the data warehouse for the Data integrity

·   Verify that Dimensions are working properly as their defined types.

·   Interacting with client to capture information on source data for preparing data scenarios for testing

·   Test the data in both testing and production environments to make sure the requirements have been met.

·   Responsible for working along with Privacy, compliance & legal team to confirm the Affiliate sharing strategies and metadata.
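
The daily data-quality validation scripts mentioned above can be sketched roughly as follows. This is a minimal illustration in Python with an in-memory SQLite database standing in for Teradata; the table, columns, and checks are hypothetical.

```python
import sqlite3

def validate_table(conn, table, key_col, not_null_cols):
    """Run basic data-quality checks: emptiness, duplicate keys, NULLs."""
    cur = conn.cursor()
    issues = []
    rows = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if rows == 0:
        issues.append(f"{table}: table is empty")
    # Duplicate-key check: count key values that occur more than once.
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1)").fetchone()[0]
    if dupes:
        issues.append(f"{table}: {dupes} duplicate {key_col} values")
    # NULL checks on columns that must always be populated.
    for col in not_null_cols:
        nulls = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        if nulls:
            issues.append(f"{table}.{col}: {nulls} NULLs")
    return issues

# Demo with an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO claims VALUES (1, 100, 250.0), (1, 101, NULL), (2, 102, 75.5);
""")
problems = validate_table(conn, "claims", "claim_id", ["customer_id", "amount"])
for p in problems:
    print(p)
```

In production the equivalent checks would run as scheduled BTEQ/SQL jobs against the warehouse, with failures e-mailed rather than printed.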

Environment: SQL/Server, Oracle 9i, MS-Office, Teradata, Informatica, ER Studio, XML, Business Objects

 

 

First Hawaiian Bank, Honolulu, Hawaii                                                                              Aug’12 - May’13                                                                                

Sr. Data Analyst                                                                                                                           

 

As a Sr. Data Analyst, I was involved in the Data Integration/Infrastructure project for Retail Banking. My role was to assess the impact on downstream data management environments of processing different banking products via different systems and processes. I was also involved in requirements gathering, data mapping, and documenting metadata.

 

Responsibilities:

·   Responsible for analyzing business requirements and developing Reports using PowerPoint, Excel to provide data analysis solutions to business clients

·   Worked with business analysts to provide business performance data and other data using Teradata, Oracle, SQL, BTEQ, and UNIX.

·   Created daily, weekly, and monthly reports using Teradata, MS Excel, and UNIX.

·   Wrote BTEQ and SQL scripts for large data pulls and ad-hoc reports for analysis.

·   Created tables, indexes, views, snapshots, and database links to see the information extracted from SQL files.

·   Used Teradata utilities like MultiLoad, Tpump, and Fast Load to load data into Teradata data warehouse from multiple sources.

·   Designed SQL queries on Credit Card Transaction Data Warehouse to generate usage reports.

·   Created summary and drill down reports for Credit Card Transactions to track usage patterns.

·   Work closely with Fraud team to address any data issues; consult with the team on data, reporting, and analysis needs

·   Develop SQL queries for new reports as needed; update existing SQL queries according to business requirements

·   Schedule weekly and monthly Fraud Monitoring reports using VBA and macros.

·   Wrote complex SQL statements using joins, subqueries, and correlated subqueries.

·   Automate SAS and SQL scripts with UNIX/Linux commands using PuTTY; transfer files to remote servers using WinSCP.

·   Create flat files from Teradata tables and vice versa using SAS, and FTP these files to the vendor for fulfillment.

·   Generated SAS data files, graphical reports and summary statistics, developed standard reports and document writing.

·   Write Teradata & UNIX Shell scripts for segmentation purpose for different campaigns.

·   Create permanent User tables in Teradata, load data into the tables from various sources, and grant access to members of the Fraud team

·   Utilize Business Objects (Crystal Reports) for preparing and designing reports

·   Extensively use Excel & VBA for data analysis on Fraud claims

·   Utilize SAS scripts to parse data and prepare datasets to load into Teradata tables

·   Generate reports summarizing Fraud & Bank Operations data

·   Analyze and evaluate Retail Operations processes; make recommendations to improve the processes.
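
A correlated subquery of the kind mentioned above can be illustrated with a small, self-contained example; SQLite stands in for the actual Teradata/Oracle environment, and the transaction table and the 2x-average threshold are purely illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (txn_id INTEGER, card_id INTEGER, amount REAL);
    INSERT INTO txn VALUES
        (1, 10, 20.0), (2, 10, 25.0), (3, 10, 400.0),
        (4, 11, 50.0), (5, 11, 60.0);
""")
# Correlated subquery: each transaction is compared against the average
# amount for the same card, flagging unusually large transactions.
flagged = conn.execute("""
    SELECT t.txn_id, t.card_id, t.amount
    FROM txn t
    WHERE t.amount > 2 * (SELECT AVG(t2.amount)
                          FROM txn t2
                          WHERE t2.card_id = t.card_id)
    ORDER BY t.txn_id
""").fetchall()
print(flagged)
```

Only the 400.00 transaction exceeds twice its own card's average, so it alone is flagged; the inner query is re-evaluated per outer row, which is what makes the subquery correlated.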

Environment: Teradata SQL Assistant, BTEQ, Oracle 9i, Teradata, Windows NT, SAS, Tableau, UNIX Shell scripts, SQL, Mainframes, MVS/TSO, PowerPoint, Excel.

 

 

TD Bank, Maine                                                                                                                  July’11 – June’12

Business/Data Analyst

 

As a Business/Data Analyst, I was involved in the Data Integration project for the Consumer Finance Group. My role was to assess the impact on downstream data management environments of processing different banking products via different systems and processes. I was also involved in requirements gathering, data mapping, and documenting metadata.

 

Responsibilities:

·   Work with the Project Management in the creation of project estimates.

·   Analysis of the data identifying source of data and data mappings of HCFG.

·   Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.

·   Interact with the SMEs to analyze data extracts from legacy systems (Mainframes and COBOL files) and determine each element’s source, format, and integrity within the system.

·   Transformation of requirements into data structures that can be used to efficiently store, manipulate, and retrieve information.

·   Collaborate with data modelers and ETL developers in creating the Data Functional Design documents.

·   Ensure that models conform to established best practices (including normalization rules) and accommodate change in a cost-effective and timely manner.

·   Enforce standards to ensure that data elements and attributes are properly named.

·   Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.

·   Support development teams creating applications against supported databases.

·   Provide 24 x 7 problem management support to the development team.

·   Document various Data Quality mapping document, audit and security compliance adherence.

·   Perform small enhancements (SOR element additions, data cleansing/data quality).

·   Create various Data Mapping Repository documents as part of Metadata services (EMR).

·   Provide inputs to development team in performing extraction, transformation and load for data marts and data warehouses.

·   Provide support in developing and maintaining ETL processes that extract data from multiple SORs residing on various technology platforms and transport the data to delivery points such as data marts or data warehouses.


Environment:  MS Excel, MS Access, Oracle 10g, UNIX, Windows XP, SQL, PL/SQL, Power Designer, Informatica, UNICA 6.4

 

 

Dish Network, Denver, CO.                                                                                                        Sep’09 - June’11

Data Analyst

BT-Consumer Credit project:

As part of the overall Billing Transformation project, a new Consumer Credit subject area is being built in the Data Warehouse. This subject area will cater to the credit qualification reporting needs of the CMO Credit team and will also be used for credit and fraud analysis done by the Data Warehouse.

The Consumer Credit subject area was built in the following steps:

1.  Internal and external data sources used for generating existing and future credit qualification reports were identified and analyzed.

2.  Database tables were designed to store the data brought in from these sources into the EDW.

3.  An ETL (Extract-Transform-Load) process was performed to populate these EDW tables with data from the current sources.

4.  EDW data marts were built for the Consumer Credit subject area to perform the data processing needed for reporting.

 

Responsibilities:

·   Performed data profiling in the source systems required for the Billing System.

·   Document the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.

·   Involved in defining the trumping rules applied by Master Data Repository

·   Define the list codes and code conversions between the source systems and MDR.

·   Worked with internal architects, assisting in the development of current and target state enterprise data architectures.

·   Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines

·   Involved in defining the source to target data mappings, business rules, business and  data definitions

·   Responsible for defining the key identifiers for each mapping/interface

·   Implementation of Metadata Repository, Maintaining Data Quality, Data Cleanup procedures, Transformations , Data Standards, Data Governance program, Scripts, Stored Procedures, triggers and execution of test plans

·   Performed data quality analysis in Talend Open Studio.

·   Responsible for defining the functional requirement documents for each source to target interface.

·   Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.

·   Reverse engineered all the source databases using Embarcadero.

·   Coordinate with business users to design new reporting needs in an appropriate, effective, and efficient way based on existing functionality.

·   Document data quality and traceability documents for each source interface

·   Designed and implemented  data integration modules for Extract/Transform/Load (ETL) functions

·   Involved in  data warehouse design

·   Experience with various ETL, data warehousing tools and concepts

·   Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.

·   Good experience with mainframe enterprise billing systems. Involved in defining the business/transformation rules applied for sales and service data.

·   Worked with internal architects, assisting in the development of current and target state data architectures.

·   Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines

·   Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing Dish enterprise data warehouse (EDW).

·   Evaluated the impact of low-quality and/or missing data on the performance of data warehouse clients.

·   Identified design flaws in the data warehouse
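
The column-level data profiling described above can be sketched in plain Python; the statistics computed (row count, NULLs, distinct values, most frequent value) are typical of what profiling tools such as Talend Open Studio report, and the sample billing records are hypothetical.

```python
from collections import Counter

def profile_column(rows, col):
    """Basic profiling stats for one column: counts, NULLs, distincts."""
    values = [r.get(col) for r in rows]
    non_null = [v for v in values if v is not None]
    freq = Counter(non_null)
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(freq),
        "top_value": freq.most_common(1)[0][0] if freq else None,
    }

# Hypothetical extract from a billing source system.
rows = [
    {"acct": "A1", "credit_class": "PRIME"},
    {"acct": "A2", "credit_class": "PRIME"},
    {"acct": "A3", "credit_class": None},
    {"acct": "A4", "credit_class": "SUBPRIME"},
]
stats = profile_column(rows, "credit_class")
print(stats)
```

Profiling every candidate source column this way is what surfaces the low-quality and missing data noted above before the ETL design is finalized.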

Environment: SQL/Server, Oracle 10g/11g, MS-Office, Embarcadero, Netezza, Enterprise Architect, Informatica, ER Studio, XML, OBIEE, Talend Open Studio

 

Huawei, Dhaka, Bangladesh                                                                                                     Sept’07-Aug’09                                                         

 

The Market Research Custom Financial System (MR) supports the business needs of the Market Research group by providing a financial system that addresses the need for managing short-term and long-term projects, collecting information about the various costs associated with a project, allowing for client invoicing, and providing a business summary on project proposals and forecasting the project spectrum. EPF provides timely, accurate, and comprehensive forecast data to the Executive Committee and to Business Unit Management. The data is updated monthly and includes monthly detail for a rolling 12 months. Pipeline data is maintained at an opportunity (bid) and Business Unit level. Departmental forecast expenses are also maintained monthly at an account level for a rolling 12 months.

 

Responsibilities:

·   Plan, design, and implement application database code objects, such as stored procedures and views.

·   Build and maintain SQL scripts, indexes, and complex queries for data analysis and extraction.

·   Created ad-hoc reports for the upper level management using Stored Procedures and MS SQL Server 2005 Reporting Services (SSRS) following the business requirements.

·   Created reports by extracting data from cubes and wrote MDX scripts.

·   Generated reports using SQL Server Reporting Services 2005/2008 from OLTP and OLAP data sources.

·   Provide database coding to support business applications using Sybase T-SQL.

·   Perform quality assurance and testing of SQL server environment.

·   Develop new processes to facilitate import and normalization, including data files for counterparties.

·   Work with business stakeholders, application developers, and production teams and across functional units to identify business needs and discuss solution options.

·   Ensure best practices are applied and integrity of data is maintained through security, documentation, and change management.
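
An ad-hoc summary of the kind described (forecast expenses aggregated monthly at an account level for the rolling 12 months) can be sketched in Python; in the actual system this aggregation lived in T-SQL stored procedures behind SSRS reports, and the account names here are hypothetical.

```python
from collections import defaultdict

def monthly_summary(records):
    """Aggregate forecast expenses per (account, month), as the ad-hoc
    reporting stored procedures would."""
    totals = defaultdict(float)
    for rec in records:
        totals[(rec["account"], rec["month"])] += rec["amount"]
    return dict(totals)

# Hypothetical departmental forecast expense rows.
records = [
    {"account": "R&D", "month": "2008-01", "amount": 1000.0},
    {"account": "R&D", "month": "2008-01", "amount": 500.0},
    {"account": "Sales", "month": "2008-01", "amount": 750.0},
]
report = monthly_summary(records)
print(report)
```

The same grouping expressed in SQL would be a GROUP BY over account and month, which is how the SSRS report datasets were fed.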

Environment: SQL Server 2005 Enterprise Edition, T-SQL, Enterprise manager, VBS.

 



Experience


 

Job Title

Company

Experience

Sr. Data Analyst

Axis Tech, Inc

- Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Sr. Data Analyst

Desired Job Type:

Employee

Desired Status:

Full-Time

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         General/Other: IT/Software Development

 

Target Locations:

Selected Locations:

US

Relocate:

Yes

Willingness to travel:

Up to 100%

 

Languages:

Languages

Proficiency Level

English

Fluent