From: route@monster.com
Sent: Friday, May 06, 2016 2:14 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Cloud
This resume has been forwarded to you at the request of Monster User xapeix03.
OBJECTIVE:
C2C only with bill/pay cycle on a bi-weekly basis (non 1099) | http://spider-schema.info | US Citizen | No military or Govt jobs please. http://markhargraves.com
EXPERIENCE:
11/2015 - 1/2016
HCL Technologies Ltd
Brought on as a BI Data Architect. Ended up working in a number of different roles while they tried to place me on a project.
Since all projects at HCL / Microsoft were based upon work at Microsoft, no work could be accomplished until my credentials were complete. The "V Dash" or v- account is required for access to any Microsoft systems for security reasons. My v- account was not completed until the end of this engagement, which prevented me from being assigned to any new projects. Having no past experience in Big Data, I was given a month to learn Azure and its many offerings (Hadoop, Pig, R, Spark, Azure Data Lake, Azure Data Warehouse, Azure SQL Database, HDInsight, Power BI, Data Migration Utility, Cosmos, Scope, Hortonworks, PolyBase, and more) using a personal account. I was to learn what I could until I was assigned a new project.

In the first month (November) I spent 30% of my time in meetings, either working with the sales team or listening in on prospective sales meetings. I edited several PowerPoint presentations to correct grammatical errors and provided a number of architectural diagrams for the meetings. 60% of my time was spent learning Big Data technologies. The second month, while I continued to learn Azure and some Cosmos / Scope, I facilitated several POC projects by reverse engineering OLTP databases, modeling them into dimensional databases, and then building SSAS cubes (SQL Server 2012) to facilitate data exploration, surface data quality issues, and so forth. The third month became very slow work-wise as HCL struggled to land any new projects. On January 13, 2016 I was given notice that I had until January 22, 2016 to find another job. There were no hard feelings, just no work.
7/2015 - 11/2015
Texas A&M University-Corpus Christi
Consulted as an EDW and Business Intelligence Solutions Architect.
Initiated this project as directed by performing an analysis of Blackboard and Banner EDW BI solutions. Completed this initial task by creating technical specifications and recommendations on which to use. Completed a series of technical documents to specify hardware and software requirements for the project. Additionally performed research on several different projects out of scope from my technical role.

Specified and configured several Microsoft Windows 2012 and 2008 VM servers for the BI project, including Development, Test, Reporting, Data Warehouse, and Analysis Services environments. Set up and installed the necessary OLE DB and data providers to access Banner (Ellucian) installed on an Oracle database server.

Performed a POC for SharePoint 2013. Led the BI team on this project with no supervision. Installed and configured SharePoint 2013 on a barebones Windows 2012 SP2 server. Created numerous demos for the organization to showcase each particular feature of SharePoint 2013, excluding the BI features.

Created a custom ETL application written in VB.Net using Visual Studio 2015 Pro to replace Microsoft SSIS, extracting data from any data source and placing it into the ODS (Offline Data Store). This application imported the DDL (table names, column names, data types, constraints, and primary keys) from numerous schemas running on an Oracle database server, then produced the DDL to create equivalent, compatible objects in MS SQL Server in an automated fashion, reducing the development time of creating tables in the ODS from weeks per schema to seconds. The application was capable of adding thousands of source tables and database connections in a matter of seconds instead of weeks. It not only created the objects needed for the ODS, but also ran that ETL nightly without issue. It encrypted the usernames and passwords of all data source connections, and included its own error logging and schema/table logging, which recorded the start and stop times of each table's ETL, total execution time, starting and ending row counts, and the merging of data into the ODS destination tables.
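The DDL-translation step described above (importing Oracle table metadata and emitting equivalent SQL Server DDL) can be sketched as follows. The type map, metadata shapes, and all names here are illustrative assumptions, not the original VB.Net implementation:

```python
# Sketch of the Oracle -> SQL Server DDL translation idea described above.
# The type map and table metadata shapes are illustrative assumptions.

# A minimal Oracle-to-SQL Server type mapping (assumed subset).
TYPE_MAP = {
    "VARCHAR2": lambda p, s: f"NVARCHAR({p})",
    "NUMBER":   lambda p, s: f"DECIMAL({p},{s})" if s else f"NUMERIC({p})",
    "DATE":     lambda p, s: "DATETIME2",
    "CLOB":     lambda p, s: "NVARCHAR(MAX)",
}

def build_create_table(schema, table, columns, pk_cols):
    """Emit SQL Server CREATE TABLE DDL from Oracle-style column metadata.

    columns: list of (name, oracle_type, precision, scale, nullable).
    """
    lines = []
    for name, otype, prec, scale, nullable in columns:
        sql_type = TYPE_MAP[otype](prec, scale)
        null_sql = "NULL" if nullable else "NOT NULL"
        lines.append(f"    [{name}] {sql_type} {null_sql}")
    if pk_cols:
        cols = ", ".join(f"[{c}]" for c in pk_cols)
        lines.append(f"    CONSTRAINT [PK_{table}] PRIMARY KEY ({cols})")
    body = ",\n".join(lines)
    return f"CREATE TABLE [{schema}].[{table}] (\n{body}\n);"

# Hypothetical source metadata, standing in for a query against Oracle's
# ALL_TAB_COLUMNS catalog view.
ddl = build_create_table(
    "ods", "STUDENT",
    [("STUDENT_ID", "NUMBER", 10, 0, False),
     ("LAST_NAME", "VARCHAR2", 60, None, False),
     ("ENROLLED_ON", "DATE", None, None, True)],
    ["STUDENT_ID"],
)
print(ddl)
```

Running the generator per schema is what turns weeks of hand-written table DDL into seconds of automated output.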
1/2015 - 3/2015
Business Intelligence Solutions Inc.
Remote
Consulted to determine if the Spider Schema was a valid data model for an existing ERP system.
Consulted to create a Sales Order Entry System (proof of concept) using Microsoft Visual Studio 2010, VB.Net, ADO.Net, and WPF. Gathered requirements from the CEO and Board of Directors. Took the existing ERP sales order data, with 10 years of history, and modeled it into the Spider Schema data model. Created a fully functional WPF application that used ADO.Net to pull and push data via disconnected DataSets. Tested performance on a live production server via a programmatic testing methodology. It was determined that the Spider Schema data model outperformed the existing ERP system, running 40% faster with a 45% savings in data storage. Microsoft Excel was connected via pivot tables directly to the database and was able to do full BI (slicing and dicing) of the data with no changes and no ETL. The reporting was up to the minute and accurate. Tabular and analytic SSAS cubes were created directly from the Spider Schema data model, again with no changes to the underlying data model. According to the CEO, they are likely to create a custom ERP system in the future with the Spider Schema data model once a budget can be established.
4/2014 - 12/2014
Flagstone Foods
Worked Remotely 100%
Completed a BI project sourcing data from CIMPRO through an aftermarket ODBC driver using Microsoft SSIS.
Gathered requirements, architected, and created the BI solution, entailing reverse engineering the source system (CIMPRO running on UNIX), ETL (moving data from the source system to the ODS, then to the Stage database via SQL scripts and SSIS packages), and the architecture and creation of the data warehouse and the MS Analysis Services and Tabular cubes. Created and scheduled SSIS packages to pull only the daily transactions from the CIMPRO system via a proprietary ODBC driver. There were many data type issues between the source system and SQL Server. The ODS (Offline Data Store) kept data synchronized via nightly and daily pulls. Created an ETL to transfer the ODS data into a Stage database, where the data was further cleaned and organized. Finally, created an ETL via SSIS to move data from Stage to the data warehouse database, which was modeled into the Spider Schema (http://www.spider-schema.info), which I invented and have been using for the past 8 years on various projects. Created numerous SSAS cubes to host data for Procurement, Production Batches, Sales Orders, Invoicing, GL, Bill of Materials, Forecasting, and Inventory Transactions. Worked directly with business users to train and mentor them in Power Pivot, Power View, Excel pivot tables, and SharePoint 2013. 100% of work was completed remotely.
1/2013 - 12/2014
AMPI
Worked Remotely 100%
Created a BI solution for a large milk dairy using the DSI ERP system.
Reverse engineered the DSI (Data Specialists, Inc.) dairy food ERP system into a robust Business Intelligence solution using the MS BI stack and the Spider Schema. Created a nightly ETL using SSIS to pull all data from the source system DSIV1R7 database running on MS SQL Server 2008 R2. The data was pulled into an MS SQL Server 2008 R2 ODS (Offline Data Store) to keep an exact copy of the source system's data. This data was updated nightly via a scheduled SSIS package; an additional SSIS package was created to pull data on demand. The ODS data was then pulled into a Stage database, where it was normalized and merged with data from the JD Edwards 9.0 One World application, and then moved over to the data warehouse where the reporting data was stored. This project started when the previous one finished, and was for the same client. The data enabled the company to have a complete financial picture, from the procurement, transportation, and testing of the milk, to the purchasing of materials used to create various dairy-based products manufactured in various plants, to the selling and distribution of those products. This required merging data from very different systems. When this project was finished, a custom Milk Pricing / Competitor Model BI data model was created using the same data, so that the organization could compare their pricing model to competitors in the same regions. The final solution offered a true Business Intelligence solution where the organization could slice and dice their milk dairy Hauler and Producer Payroll metrics by their own Company, Division, Zone, Producer, Hauler, Payroll Date, and Pricing Metrics, compare those to competitors, and view this data alongside their JD Edwards data. 100% of work was completed remotely.
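The nightly ODS synchronization pattern described above (pull source rows, then insert new keys and update changed ones) reduces to a keyed upsert. A minimal sketch; the row shapes and key name are illustrative assumptions, not DSI's actual schema:

```python
# Sketch of an ODS upsert ("merge") pass keyed on a primary key, as in the
# nightly synchronization described above. Row shapes are assumptions.

def merge_into_ods(ods, source_rows, key="id"):
    """Insert new rows and update changed ones; return (inserted, updated)."""
    inserted = updated = 0
    for row in source_rows:
        k = row[key]
        if k not in ods:
            ods[k] = dict(row)      # new key: insert
            inserted += 1
        elif ods[k] != row:
            ods[k] = dict(row)      # existing key, changed payload: update
            updated += 1
    return inserted, updated

# One nightly pass against a tiny in-memory "ODS".
ods = {1: {"id": 1, "qty": 5}}
stats = merge_into_ods(ods, [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}])
print(stats)  # -> (1, 1)
```

In the real solution this role is played by SSIS packages and T-SQL MERGE-style logic rather than application code.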
4/2011 - 10/2012
AMPI
Worked Remotely 100%
Developed BI solutions for JD Edwards One World applications.
Reverse engineered numerous modules within the JD Edwards (PeopleSoft) One World application, with its database running on MS SQL Server, including GL, Sales, Work Orders, Inventory Management (Cardex), Purchasing, AR, AP, Advanced Pricing, and Transportation. Created a robust data model using the Spider Schema to provide advanced BI functionality downstream to SharePoint, Excel, and MS SSAS cubes. Completed a custom ETL where only changed data in JDE was pulled and merged into an Offline Data Store (ODS), from which the Stage and Data Warehouse ETL pulled their data. Completed the entire project using the MS BI stack. 100% of work was completed remotely.
10/2010 - 3/2011
MMIC
Worked Remotely 50%
Reverse engineered Delphi Oasis (insurance application) into a new data warehouse / BI solution using MS SQL Server 2008 R2, SSAS, SSRS, SSIS, SharePoint, PerformancePoint Server, and Power Pivot.
Consulted to create a new BI solution in an organization that was using reports generated from SQL queries against the Delphi Oasis insurance software platform (data was stored in an Oracle database). Primarily responsible for the architecture and creation of the BI solution, entailing reverse engineering the source system, ETL (moving data from the source system to Stage and the DW), and the architecture and creation of the data warehouse and the MS Analysis Services cubes. The source system architecture supported what was referred to as "versioning", which stored the previous data of each element in the database. This historical data was stored in the same table as the current data, which added an extra level of complexity to the project. Although several SMEs for the Oasis database existed, their knowledge covered only the Financial module of the application, while the organization wanted a full BI solution encompassing their Claims, Underwriting, Risk Management, and Finance departments. Without much to go on, reverse engineered the source system's (Oasis) database and created an elegant, reliable ETL process to pull data nightly from the source system into what could be called an ODS (Offline Data Store). This data was then de-normalized into the Spider Schema in a data warehouse after going through a "Stage" database. Introduced an Agile approach to the project and facilitated a daily Scrum process, with eventual planning meetings and full organization participation. Mentored many people in the Agile development process. Gave classes on data warehousing techniques and architecture to in-house IT personnel. Created new reports in SharePoint 2010 and PerformancePoint Server using the new Power Pivot functionality. Duplicated many existing reports in Reporting Services to facilitate the change from legacy reporting. Cross-trained an FTE resource to replace my services over the last part of the project. Worked with SQL Server 2008 R2, SSAS, SSRS, SSIS, Oracle Developer, MS Office 2010, SharePoint 2010, PerformancePoint Server 2010, and Power Pivot galleries. Worked remotely about 80% of the time; the other 20% was on site.
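The "versioning" complication described above, with historical and current rows sharing one table, typically reduces during ETL to selecting the latest version per business key. A minimal sketch with assumed column names:

```python
# Sketch: pick the current row per business key from a table that stores
# every historical "version" alongside the current data, as described above.
# Column names (entity_id, version, status) are assumptions for illustration.

def current_rows(rows):
    """Return the highest-version row for each entity_id."""
    latest = {}
    for row in rows:
        key = row["entity_id"]
        if key not in latest or row["version"] > latest[key]["version"]:
            latest[key] = row
    return latest

rows = [
    {"entity_id": "CLM-1", "version": 1, "status": "open"},
    {"entity_id": "CLM-1", "version": 2, "status": "closed"},
    {"entity_id": "CLM-2", "version": 1, "status": "open"},
]
latest = current_rows(rows)
```

In the actual ETL this filter would be a window-function or correlated-subquery step in the nightly pull, not Python.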
6/2010 - 10/2010
Razorfish
Seattle, WA
Publicis Groupe, VivaKi, Razorfish, SMG, Amazon Web Services
Consulted as a team member to extract DoubleClick advertising data from Google for specific advertisers and create a BI solution for it. The project involved creating an ETL process to extract data from DoubleClick via web service calls using the DART API, and flat files via FTP. Our primary development environment was the AWS (Amazon Web Services) cloud. Configured our own workstations, servers, and environments using a built-in tool. Discovered data via web services and FTP, and ETLed that data using the Talend ETL tool. Created a Talend job to move data via FTP into a central repository, then unzip, parse, and import that data into a Staging database. Created a Staging database to store data extracted from the source temporarily until it could be processed and ETLed into the data mart. Created a robust data mart where data was stored in a highly relational star schema for consumption downstream by separate clients. This data mart stored data for all the clients, who were served up reports through client-specific data marts and cubes. Designed an SSAS cube to host the data for consumption via the in-house reporting tool called Insights on Demand. The cube design allowed for simple and easy hierarchy creation across dimensions without the need to create any new underlying data objects. Created numerous data dictionaries, source-to-target data mapping documents, logical data models, physical data models, and documentation on what was available from the source, Stage, and data mart. Participated in the architecture of the data mart and cube in an iterative fashion while delivering report-ready data via Edge or IOD (Insights on Demand) every sprint. Worked with a geographically split team that had never developed under Agile (although in house, Agile was the normal process). Went through the Storming to Norming phase in a pretty painful way; we were able to get to the Performing phase after two months. Exposed to: SQL Server 2008, SSIS 2008, Talend, MS Excel 2007, Visual Studio 2008, IBM Lotus Notes, Cisco WebEx.
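The Talend job described above (fetch archives over FTP, unzip, parse, load into staging) can be outlined as below. The FTP download is stubbed with an in-memory archive, and the file names and column layout are illustrative assumptions; only the unzip-and-parse stage is shown:

```python
# Sketch of the unzip-and-parse stage of the FTP ETL described above.
# The FTP download itself is stubbed (the archive is built in memory);
# file names and column layout are illustrative assumptions.
import csv
import io
import zipfile

def load_archive_into_staging(archive_bytes):
    """Unzip an archive of CSV files and return parsed staging rows."""
    staging = []
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        for name in zf.namelist():
            text = zf.read(name).decode("utf-8")
            for row in csv.DictReader(io.StringIO(text)):
                row["_source_file"] = name  # keep lineage for auditing
                staging.append(row)
    return staging

# Stand-in for the FTP download: build a small zip in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("clicks_20100601.csv", "ad_id,clicks\nA1,10\nA2,4\n")
rows = load_archive_into_staging(buf.getvalue())
```

Tagging each row with its source file mirrors the lineage tracking a staging database needs before data moves on to the data mart.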
1/2010 - 6/2010
CCLI
Contracted to migrate data from a heavily customized legacy CRM application (Pivotal) into Microsoft Dynamics CRM 4.0 via T-SQL (as opposed to web services). My day-to-day responsibilities included 100% of the ETL (Extract, Transform & Load) design and development. Each day the requirements changed based "upon newly discovered requirements" from the business. While these newly discovered requirements occurred, the architect for the CRM schema would make changes daily without any notice. The majority of the entities in CRM were heavily customized over time, as were the ones in the source system. The ETL processes ended up being fairly complicated, with almost all of the entities being related to each other. On numerous occasions the CRM schema would change to a point where data could not be ETLed regardless of method or process, as the architect of the customized CRM schema did not take database design considerations into account. In fact, when existing ETL processes failed due to these changes, they were specifically called out as ETL data bugs. Wrote numerous complex T-SQL queries to move 75GB of relational data into CRM 4.0 and created a package in SSIS that allowed for automation. This SSIS package was capable of uploading the CRM schema changes made by the CRM architect into the CRM server, publishing that schema, dropping and restoring the targeted database with all users and metadata predefined, and then moving data from the source system into this target database. Participated in daily Scrum meetings where Agile was preached but not practiced in its true nature, while extreme programming was the daily flavor. Almost no QA or testing existed during the life of the project. Exposed to Microsoft Dynamics CRM 4.0, Dynamics Great Plains, SQL Server 2008 R2, Visual Studio 2008 C#.Net, Office 2010, Windows Server 2008 R2, and Red Gate.
4/2009 - 7/2009
Microsoft
Redmond, WA
Contracted to "make sense" out of new/existing data that had never been analyzed before. Determined the quality and completeness of that data and provided in-depth reporting on it. Performed validation checks on metrics through known "truths", and reported on the discrepancies. Installed and configured SQL Server 2008, including SSRS, SSAS, SSIS, and other SQL-based services, in a clustered SQL Server environment using two nodes and failover services. Installed and configured MOSS, and created our team site to document and facilitate team collaboration. Once the initial project analysis was complete, the data was extracted via SSIS packages from their sources, which were then executed from a SQL Agent job. The completed solution was deployed in multiple environments (Dev, Test, UAT, and Prod). The ETL process included the moving of data, data cleansing, and the creation of multiple OLTP, OLAP, and MOLAP fact tables, common dimensions, and slowly changing dimensions. The last step in the SQL Agent job executed an SSIS package which updated the SSAS cube. The final solution included a single cube with multiple fact tables, shared dimensions, and custom hierarchies which spanned the common dimensions. The cube compared known and validated metrics to those in the target data source for data validation, accuracy, and completeness. Created numerous custom mapping tables which were used to map dimensional data from two completely different source systems; this had never been done before with any level of accuracy until this project. In the end, the new data was of tremendous value to MS, and I was informed that the new information not only saved MS a significant amount in lost revenue, but also changed future budget forecasts and marketing strategies, and facilitated restructuring of the way in which the data was acquired, managed, and later used and relied upon. Exposed to XML, Excel 2007, SQL Server 2008, SSAS 2008, SSIS 2008, SSRS 2008. Consulted remotely for the majority of the project. Consulted in the OEM finance group at MS; worked offsite for 50% of this project. Although an offer to extend for another 6 months was made, I decided to take a 6-month leave of absence to explore business opportunities in Asia.
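The custom mapping tables mentioned above conform dimension members from two unrelated source systems to one shared key. A minimal sketch; the systems, member codes, and catch-all convention are invented for illustration:

```python
# Sketch of conforming dimension members from two source systems via a
# hand-maintained mapping table, as described above. All names are invented.

# Mapping table: (system, native_code) -> conformed dimension key.
MAPPING = {
    ("sysA", "US-WEST"): "REGION_W",
    ("sysB", "WestRegion"): "REGION_W",
    ("sysA", "US-EAST"): "REGION_E",
}

def conform(system, native_code):
    """Translate a source-native code to the shared dimension key."""
    try:
        return MAPPING[(system, native_code)]
    except KeyError:
        # Unmapped members are routed to a catch-all for later review.
        return "UNMAPPED"

# Two differently coded members now roll up to the same dimension key.
same_region = conform("sysA", "US-WEST") == conform("sysB", "WestRegion")
```

Routing misses to a visible catch-all member, rather than dropping rows, is what lets the cube expose mapping gaps for review.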
5/2008 - 6/2009
Mutual of Enumclaw (Insurance)
Enumclaw, Washington
Contracted to maintain, add functionality to, and create new reports in MS SQL Server 2008 Reporting Services. In addition to reporting, an existing unfinished ETL process had to be re-architected, and in some cases the staging database also had to be re-architected. The customer wanted a true history of new, deleted, and updated records. Created a new process where the destination staging tables had triggers on them which moved these changes to "history tables". This staging database became an Operational Data Store (ODS) for most of the company's downstream reporting. Wrote numerous transactional stored procedures; removed numerous performance bottlenecks and design issues. Created numerous SSIS packages that included package configurations, SQL logging, and error handling. Created several SQL Agent jobs which executed the SSIS packages and also performed numerous data validation checks. Re-wrote numerous existing production reports in Reporting Services for performance reasons; the biggest gain was one report that previously took 15 hours to run and, after modification, ran in 20 seconds. Data sources came from Oracle 10g and 11i, IBM i-Series DB2, MS SQL Server 2000 and 2005, XML, CSV text files, and MS Excel spreadsheets. Since there was no database administrator, all DBA work was also included in the development lifecycle, including installing and configuring the Development, Testing, User Acceptance, and Production environments. Worked directly with the customer to gather requirements for reporting. In this role I acted as a PM, DBA, developer, architect, and business analyst, and mentored many junior-level developers. Custom-created an MS SharePoint portal that tracked the entire project lifecycle from gathering requirements, to document creation and tracking, to development stages, to testing, and finally sign-off and release into production. Exposed to: MS Office 2003 & 2007, MS SQL Server 2005, SSIS, SSRS, VB.Net, C#.Net, ASP.Net, i-Series, Oracle, VSS 2005, SSMS, and Visual Studio 2005. Performed a wide variety of DBA tasks such as code migration/deployment, database maintenance, replication, database backups and restores, SQL Agent job creation and scheduling, database creation, security models, data models, and data warehouse star schema architecture.
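The history-table pattern described above (triggers copying every change out of the staging tables) behaves roughly like the sketch below. In the actual project this was done with T-SQL triggers rather than application code, and the table and column names here are assumptions:

```python
# Sketch of the history-table idea described above: every insert, update,
# or delete on a staging table also writes an audit row to a history table.
# Done with T-SQL triggers in the real project; names are illustrative.
from datetime import datetime, timezone

staging = {}   # primary key -> current row
history = []   # append-only audit trail ("history table")

def apply_change(action, key, row=None):
    """Apply a change to staging and record before/after in history."""
    old = staging.get(key)
    if action in ("insert", "update"):
        staging[key] = row
    elif action == "delete":
        staging.pop(key, None)
    history.append({
        "key": key,
        "action": action,
        "old": old,                      # row image before the change
        "new": staging.get(key),         # row image after the change
        "changed_at": datetime.now(timezone.utc),
    })

apply_change("insert", 1, {"policy": "P-100", "premium": 500})
apply_change("update", 1, {"policy": "P-100", "premium": 525})
apply_change("delete", 1)
```

Keeping both the before and after images is what gives downstream reporting a true history of new, updated, and deleted records.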
2/2008 - 4/2008
Microsoft Corporation
Redmond, WA
Sub-contracted for 6 weeks to install, configure, and build new dashboard / scorecard reports in PerformancePoint Server 2007. Determined pre-install requirements; installed and configured PerformancePoint Server 2007 and all of its components, including Windows Server 2003 R2, IIS and ASP.Net, SQL Server 2005 Database Engine, SQL Server 2005 Analysis Services, SQL Server 2005 Reporting Services, SharePoint 2007, Office 2007, and PerformancePoint Server. Using an existing and evolving set of documentation (BRD/FRD), architected and developed a new SSAS reporting data mart. Created numerous new SSIS packages and T-SQL scripts to extract data from source systems and build out new highly relational fact, intermediate, and dimension tables. The main data source for the cubes was actually being completed while we were building our solution. Created new SSAS cubes with new dimensions, facts with measures, calculated measures, and KPIs. Once the data was validated against the source systems, the new dashboard and scorecard reports were created, documented, and taken through an informal UAT process. This entire project was done under an Extreme Programming SDLC methodology, where we were architecting a solution as requirements came in and oftentimes changed. Worked "side by side" with a junior SQL developer to "tag team" and complete many aspects of this project. This work was completed under extremely high-pressure timelines in a politically charged environment. Exposed to: MS Analysis Server 2005, SSIS, SQL Server 2005, MS Excel, PerformancePoint Server, SQL Server Management Studio, MDX, Business Intelligence Development Studio, Office 2007, Windows Vista, MS Windows Server 2003 R2, SharePoint Services 3.0 / MOSS 2007, and Visual Studio 2005.
11/2007 - 1/2008
Chesapeake Energy
Oklahoma City, OK
Consulted to audit, document, and fix a partially completed SSAS cube and its ETL from data sources, while adding new functionality. Resolved numerous ETL process and logic issues, updated and added several SSIS packages, wrote numerous T-SQL scripts to create fact, dimension, and intermediate tables, re-created new cubes, resolved numerous cube business logic issues, and provided solid documentation. Re-installed, configured, and administered the database server. Created maintenance plans and disaster recovery processes. Installed and deployed TFS (Team Foundation Server) with MOSS for project documentation. Worked alone without a project manager, business analyst, tester, or any direction from management for the duration of the project. Gave several presentations to the in-house data warehouse team on data mart logic regarding fact, dimension, and intermediate tables. Provided ongoing mentoring for in-house developers on the development of MS technologies for BI solutions. The completed cube was later used by the internal data warehouse team as a model for new OLAP projects. Consulted "one on one" with many managers from different groups within the organization. Exposed to: Toad for Oracle, i-Series tools for IBM DB2, Office 2007, SharePoint, VS 2005 & 2008, TFS 2005, SQL Server 2005 SSAS, SSIS, MS Excel, and Oracle 10g. Extracted data from sources including MS Excel, XML, SQL Server 2000, SQL Server 2005, Oracle 10g, IBM i-Series DB2, and text files (tab delimited), using SSIS packages in SSMS (SQL Server Management Studio) and Business Intelligence Studio 2005, with the project sourced in TFS (Team Foundation Server) 2005. All documentation was stored on a project SharePoint site.
7/2007 - 10/2007
Microsoft Corporation
Redmond, Washington
Consulted to create SSAS cubes based upon current business needs. Participated in a business logic upgrade to the underlying dataset. Created SSIS packages to ETL data from various data sources to the data mart. Re-wrote numerous stored procedures for business logic changes and performance reasons. Created several new MS Excel pivot table reports based on new and existing cubes hosted in SSAS. Re-architected all the currently used cubes to run and compile faster. Designed, created, and implemented several new SSAS databases, cubes, and related objects. Set up and configured MS Team Foundation Server / SharePoint, and got our team of 4 to use it as part of an SDLC, which the team did not have before. Provided ongoing maintenance to the TFS / SharePoint site as needed. Instructed team members on how to create issues, assign or create tasks, source/version code, and collaborate on the team portal. Exposed to: MS SQL Server 2005, SQL Server Analysis Services, SQL Server Integration Services, MS Excel 2007, MS Team Foundation Server, and MS tools (SQL Server Management Studio, Business Intelligence Development Studio, MS Office 2007, SharePoint). Worked off-site from home (telecommute) for 95% of this project.
1/2006 - 2/2007
Microsoft Corporation
Redmond, WA
Provided rock-solid numbers to the World Wide Enterprise Group (EPG). These numbers were used as a baseline by other groups under the "Finance Umbrella". The reports were published on a SharePoint portal site, and downloaded and used around the world by MS FTEs. Gathered requirements, architected the solution, and developed, tested, and published production code and reports for worldwide use. These reports provided accurate data where none existed. 99% of work was performed off-site (from home) through RAS, MS Office Communicator, MS Outlook, and conference calls for the entire project. Significant experience in MS SQL Server 2005 and SQL Server 2000 (DTS / SSIS, ETL) and MS Excel. Provided experience to "team members" who, I am told, will replace the current reporting mechanisms. Translated "Business Intelligence" into user-friendly reports that allowed users to drill down and see numbers they were never able to see before. Extensive experience in MS Excel, Excel pivot tables, Excel macros, Excel formulas, and ActiveX controls, including advanced VBA, SQL, MS SQL Server 2000 & 2005, ETL, OLTP, and SharePoint Portal Services. Provided team leadership to developers, testers, and project managers. Extensive database design and development under minimal supervision, with leadership and documentation. Many have requested that I describe what I did on a daily basis, since my job went beyond the normal, so here is that description: based upon previous and ongoing needs for new reports, I would gather requirements based upon what was requested vs. what was considered common knowledge. In some cases this need arose based upon what I determined was required vs. what I determined I was unclear about. This could be anything from how the data needed to be presented to what specific data to use. The ongoing weekly need was to produce reports for the EPG group. These reports required pulling data from numerous data sources (SQL Server 2000, SQL Server 2005, MS Excel spreadsheets) within MS, from their CRM (Siebel) to marketing data sources (MS Sales). Exposed to: Office 2007, Windows Vista, MS marketing and deployment goals, SQL Server 2005, SQL Server 2000, MS Excel 2007, Excel 2003, DTS / SSIS & ETL.
7/2005 - 12/2005
Bill & Melinda Gates Foundation
Seattle, WA
Developed and designed a new data mart project while cycling through numerous development lifecycles / builds using Microsoft-style coding, versioning, and development methodologies. Configured and scripted MS SQL Server replication services for a transactional database. Created numerous dynamic SQL scripts using SQL Server system tables to create transaction tables and triggers for all replicated tables in the staging database for updates, deletes, and inserts. Complex T-SQL was used within DTS packages, stored procedures, and functions to move data from the transaction tables to the data warehouse fact and dimension tables. DTS packages included transactional-level processing and logging, and used ActiveX (VBScript) to perform data transformations. Validation was performed through complex T-SQL statements comparing data in the source database to that in the data warehouse. Numerous technical specification documents were created to detail build deployment of code, technical document specifications, coding requirements, and code sharing and collaboration. Exposed to MS: Visio, Visual SourceSafe, SharePoint Portal Services, Project, Office, XML, XSLT, ASP.Net, ADO.Net, C#.Net, and SQL Server 2000.
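Generating per-table transaction triggers from catalog metadata, as described above, follows a simple template-expansion pattern. In the sketch below a hard-coded list stands in for a query against the SQL Server system tables, and the trigger template is an assumed simplification, not the original scripts:

```python
# Sketch of generating per-table trigger DDL from catalog metadata, as in
# the dynamic SQL scripts described above. The table list stands in for a
# system-table query; the template is an assumed simplification.

# Stand-in for a query against SQL Server system tables for user tables.
REPLICATED_TABLES = ["Donations", "Grants"]

TRIGGER_TEMPLATE = """CREATE TRIGGER trg_{table}_audit
ON {table}
AFTER INSERT, UPDATE, DELETE
AS
    INSERT INTO {table}_txn
    SELECT *, GETDATE() FROM inserted;"""

def generate_triggers(tables):
    """Expand the trigger template once per replicated table."""
    return {t: TRIGGER_TEMPLATE.format(table=t) for t in tables}

scripts = generate_triggers(REPLICATED_TABLES)
```

Driving the generation from the catalog means new replicated tables pick up their transaction tables and triggers without hand-written DDL.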
3/2005 - 7/2005
Microsoft (contract)
Redmond, WA
Microsoft: accounting and sales based enterprise-level reporting for Microsoft Corporation. Produced complex reports in MS Excel with pivot tables based upon SQL Server views, stored procedures, existing MS Excel spreadsheets, and newly created requirements. Worked directly with MS management and staff to analyze and create new and competitive reports based upon current market needs. Several existing reports were converted into the newer MS Reporting Services format to provide new insight into the competition and the current market. Numerous reports remained in MS Excel format due to MS Reporting Services limitations in performance and functionality on a worldwide basis. The newest reports were hosted in MS Reporting Services, with an OLTP database as the data source. Exposed to MS: SQL Server 2000, SQL Server 2005 (Yukon), Reporting Services, stored procedures, triggers, cursors, DTS, Visio, SharePoint Portal Server, MS Server 2003, ASP.Net, VB.Net, VBA 6.3, Office 2003 with a focus on Excel pivot tables, XML, XSLT, DTD, HTML, ADO.Net, Virtual PC, virtual images, PowerPoint, Scorecard Designer, Dashboard Designer, and numerous non-released beta technologies.
1/2005 - 3/2005
EED Inc. (Contract)
Kirkland, Washington
Research was first conducted on the feasibility, complexity, and flexibility of using MS Analysis Services, Reporting Services, and SQL Server 2005 for reporting needs within the organization. Although the "proof of concept" was originally completed on SQL Server 2005 Beta 2, a business decision was made to use the existing SQL Server 2000 platform. Installed, configured, and managed all aspects of MS Reporting Services. Created, configured, and managed a security model for Reporting Services in Active Directory. A number of reports were created, including sub-reports, drill-down, linked, snapshot, and parameterized reports. Reports were scheduled to be delivered by email and file share for different users. Dynamically executed reports were accessed via URLs, while subscriptions were published in varying formats to a separate IIS server for hosting or sent via email. Reports were archived, executed dynamically, scheduled, and exported in various formats with varying complexity. The administration and creation of reports were performed through VS 2003 and Business Intelligence Development Studio in SQL Server 2005 Beta 2.
6/2004 - 12/2004
AT&T / Cingular (Contract)
Redmond, WA 98052
An MS Access 97 application that performed tax calculation was upgraded to MS
Access 2002. DAO code was converted into ADO code, and MS Access tables were
moved to SQL Server 2000. MS Access functions, views, and database
functionality were migrated to SQL Server 2000 by converting objects to stored
procedures, views, functions, triggers, and DTS packages. Personally designed,
implemented, and deployed a full SDLC for this team. Analyzed, estimated
effort for, documented, designed, and implemented numerous enhancements to the
existing application. Performed RDBMS design, architecture, security,
administration, maintenance, and archiving of data on an MS SQL Server 2000
instance. Interfaced database technologies with Oracle 11i and MS SQL Server
using ODBC connectivity. Upgraded, developed, designed, debugged, compiled,
and documented VBA code, forms, reports, queries, and linked objects in MS
Access. Provided end-user support for a globally accessed MS Access
application running a SQL Server back end, which performed tax calculation
through a Citrix MetaFrame client. All processes met SOX (Sarbanes-Oxley)
compliance. Exposed to: VBA, MS Access 97, MS Access 2002, NT Server 4.0, MS
2000 Advanced Server, Citrix MetaFrame, SQL Server 7.0, SQL Server 2000, MS
Project, Visual SourceSafe, NetMeeting, Visio, DAO, ADO, T-SQL, stored
procedures, triggers, cursors, SQL, Remedy & BAER issue trackers, and MS
Office products.
1/2004 - 6/2004
Fidalgo Networking (FTE)
Mount Vernon, Washington
Existing customer data were analyzed, parsed, and exported from QuickBooks
Enterprise Edition into the Platypus Billing Software / SQL Server 2000
platform. Analyzed existing QuickBooks data and exported it using a
third-party ODBC driver into an MS Access 2003 project. Complex data
cleansing, parsing, removal, and concatenation were performed on the existing
data. VBA and complex views in MS Access were used to match the existing
QuickBooks data to data in MySQL, Oracle, and MS SQL Server databases.
Performed complex SQL statements to join and compare the parsed data. Reports
in MS Access, Excel, and Crystal Reports were created to display information
for final exportation. VBA in MS Access 2003 was then used to place the data
directly into MS Excel and MS Word documents for day-to-day business
operations. Created a front end in MS Access 2003 to access the SQL database
and provide additional functionality not supported in the Platypus software.
Version 5.0 of the Platypus software was tested while providing feedback to
Boardtown developers. Created DB-driven web pages using ASP.NET, ADO.NET, and
VB.NET. Exposed to: ADO, T-SQL, VBA 6.3, MS Access 2003, Excel Object Model,
Word Object Model, MS SQL Server 2000, MySQL, Oracle 11i, Platypus, and
QuickBooks Enterprise Edition.
1/1996 - 11/2003
Elite Computers & Networking (Owner / Operator)
Hamilton, Montana
Led teams of 5-10 people to complete numerous database-related applications
created primarily in MS Access with the RDBMS in MS SQL Server 2000. Gathered
requirements and worked directly with business owners. Provided project
planning in Visio and PowerPoint. Development included: creating GUI forms
with VB6 and MS Access, debugging and testing VB6 and VBA code, writing
queries, writing T-SQL statements and stored procedures, creating various web
sites to access databases, creating reports to print on a wide variety of
printers and pre-printed forms, database design, table creation, database
administration, database performance tuning, importing and exporting data from
legacy databases, creating scripts, and monitoring network loads. Exposed to
the following MS-related products: MS Access 97-2002, MS Enterprise Manager,
SQL Server 7 & 2000, ADO, VB6, VBA, C++, HTML, VBScript, T-SQL, SQL, triggers,
and stored procedures.
EDUCATION:
U of M, US
General studies: Psychology, Speech, Technical Writing, LAN/WAN Technologies.
College of Technology, US
Network Specialist - Received training on Novell NetWare 4.0, NT Server 4.0,
MS Access 97, VB6, SQL Server 7, Cisco IOS, HTML, ASP, ADO, PHP, C++, VBA,
and advanced networking / protocol classes.
SKILLS:
Accounting - Expert
MS Access - Expert
VBA - Expert
VB6 - Intermediate
ADO - Expert
HTML, XML - Intermediate
OLAP - Expert
SSAS (SQL Server Analysis Services) - Expert
SSIS (SQL Server Integration Services) - Expert
Data Warehouse Architect - Expert
SharePoint Portal Server - Intermediate
PerformancePoint Server 2007 - Beginner
IBM iSeries DB2 - Intermediate
Oracle 10g PL/SQL - Beginner
Toad for Oracle - Beginner
ERwin - Intermediate
SQL, T-SQL - Expert
VB.NET, ADO.NET, ASP.NET - Intermediate
MS Excel - Expert
Business Intelligence - Expert
LANGUAGES:
German - Beginner
Spanish - Intermediate
Vietnamese - Beginner
CAREER HIGHLIGHTS:
I have completed 10 years of intensive BI development using the Spider
Schema: http://spider-schema.info
REFERENCES:

Shelley Rose - CEO, Green Gardens
Phone: (206) 354-1185
Type: Professional

Krista & Ken Kanenwisher - CEO, First Montana Title
Phone: (406) 363-2661
Type: Professional

Surya Narayanan - Developer / Lead, Cingular Wireless
Phone: (425) 753-3684
Type: Professional

Tom Youngblom - IT and Applications Director, Associated Milk Producers Inc.
Phone: (507) 233-3654
Email: youngblomt@ampi.com
Type: Professional

Bryant Avey - Principal Strategy Architect, CEO, InterNuntius, Inc.
Phone: (612) 719-1174
Email: Bryant@InterNuntius.com
Type: Professional