
Work Summary:

 

A business, systems, and data analyst, developer, and programmer; during the pandemic and periods of limited staffing, assigned the role of lead, top-tier Excel/VBA trade-floor support person at Morgan Stanley.

 

Considerable RAD and traditional IT background and experience, deployed and embedded with the business... A liaison with stakeholders, PMO, and IT; also lead developer and programmer supporting large portfolios of sophisticated, complex Excel/VBA SQL applications that make use of ADO and ODBC connectivity with the major DBMSs (Oracle, SQL Server, Sybase, SAS (IOM), DB2, and MS Access), live data feeds from real-time market information providers (Bloomberg, Reuters), and also SharePoint, cloud, and ETL…

 

I am dependable, easy-going, able to multi-task, and solid under pressure… An Excel/VBA and formula expert... I assimilate fast.  Adept at stepping in, filling gaps, solving problems, fitting in...  Strong RCA (root-cause analysis)...  Comfortable working with all levels of management and the organization…  An affinity for detail.  Strong business acumen.  Solid written, speaking, PowerPoint presentation, and communication skills, including documentation and technical writing…

 

Company Overview:    

 

  • Abbott Labs – Chicago/NYC, Contract, Excel/VBA SQL, Oracle database, order fulfillment (WFH, Jan to present)
  • TIAA-CREF – NYC, Contract, Excel/VBA; annuities, investments, tax, cost basis, gain/loss reporting (July to Jan)
  • Walmart – Bentonville, remote; Excel/VBA and Word/VBA urgent mass mail during outbreaks of COVID-19 (May to July)
  • Morgan Stanley (MS) – NYC; Excel/VBA, R, C#, Python trading applications (Sep to Mar onsite, then WFH to May)
  • MFX/Fairfax / Ironshore P&C – NYC/NJ, Contract, Excel/VBA SQL; UW/risk/rating/pricing (May to Sep)
  • Merrill Lynch – Pennington, NJ, Contract, Excel/VBA SQL, Java; front-office and middle-office re-engineering (Dec to May)
  • TV Guide (ROVI) – Philadelphia, Contract; re-engineered Excel/VBA SQL ETL (June through December)
  • Bank of Tokyo-Mitsubishi Securities – NYC, Excel/VBA SQL; backfill, PNL (February through early June)
  • Teleflex Medical Instruments Int’l – Philadelphia, Excel/VBA; HR performance rating, salary/bonus planning (Sept to Feb)
  • EmblemHealth Insurance Companies – NYC, Excel/VBA SQL, SAS; Actuary Architect (Jan to Sep)
  • JP Morgan/Bear Stearns – NYC, Excel/VBA SQL; trading applications developer and DTCC clearing (Sep to Jan)
  • Citigroup – NYC; global credit-card portfolio risk mgmt backfill; Excel/VBA SQL, C, R, Python, Java (Jan to Sep)
  • Advent Capital Management Investment Advisors – Midtown Manhattan; Excel/VBA and SQL equities and options trading developer
  • Unilever National Starch – Bridgewater, NJ; Excel/VBA SQL, CICS, COBOL/VSAM/MVS/JCL developer, programmer, and support
  • Henkel, Ingredion – Bridgewater, NJ; Excel/VBA SQL, RPA/ML AI, ERP; demand planning, forecasting, and predictive analytics
  • Procter & Gamble, Oil of Olay – Ft. Washington, PA (formerly Richardson-Merrell / Vicks Chemical Co.), programmer
  • University of Pennsylvania (DRL/UCSC, S. 33rd St and Market St, Philadelphia), Math Dept (assistant to head Math Professor Charlotte Bergey; Fortran/C/PL/Java as student tutor and adviser)

 

Education:    

 

University of Pennsylvania.   Bachelor of Science, Mathematics (Quantitative Methods), Computer Science…

 

Application Experience: 

 

Extensive IT and technical background as analyst, programmer, and project lead, working exclusively with Excel/VBA and SQL technology platforms for many years. Applications include CRM, PNL (P&L), FP&A, S&OP, ERP, Forecasting and Predictive Analytics (my strengths), also Planning, Budget, Spend, and Runout... Expert skill with Excel/VBA, also intricate array formulas, PivotTables, PivotCharts, PowerPivots, Slicers, Sparklines, PowerQuery, Power BI/DAX, heatmaps, dashboards, scorecards, APIs, and custom UDF functions.  Heavy experience with shared Excel workbooks, and with workbooks whose cell formulas reference cells residing in other workbooks, connecting large networks of these workbooks to one another. Developed workbook applications that support the culling and filtering of reference data, sometimes many hundreds of thousands of rows, where seriously advanced performance and optimization techniques are essential in making this kind of workbook feasible, especially with the more sophisticated trading applications with live, frequent, real-time refresh rates, e.g., Bloomberg and Reuters, including numerous continuously refreshed URL data connections. These technical skills are current. With the most recent projects at Abbott, TIAA, and Morgan Stanley, I wrote thousands of lines of VBA, and I have maintained this pace of coding throughout my career, balancing project-lead responsibilities with many of these projects along the way...
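
A flavor of the custom UDF work mentioned above, as a minimal sketch (the function name and logic are hypothetical, for illustration only, not code from any employer project):

    ' Hypothetical custom worksheet function: weighted average that skips blanks.
    Public Function WAVG(vals As Range, wts As Range) As Variant
        Dim i As Long, s As Double, w As Double
        If vals.Cells.Count <> wts.Cells.Count Then
            WAVG = CVErr(xlErrRef)              ' ranges must be the same size
            Exit Function
        End If
        For i = 1 To vals.Cells.Count
            If Len(vals.Cells(i).Value) > 0 Then
                s = s + vals.Cells(i).Value * wts.Cells(i).Value
                w = w + wts.Cells(i).Value
            End If
        Next i
        If w = 0 Then WAVG = CVErr(xlErrDiv0) Else WAVG = s / w
    End Function

Entered on a worksheet like any built-in function, e.g., =WAVG(A2:A100, B2:B100).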

 

With Morgan Stanley and several major investment banks in NYC, deployed and embedded with Quant and support teams, sitting side-by-side with derivatives (CLO, CDO, CDS, MBS, ABS, LBS), Fixed Income (Rates), FX, and Equities traders and trading desks, wearing many hats as developer, programmer, and rapid-response support person for urgent or broken Excel/VBA trading apps, or modifications to live real-time pricing models, LIBOR, (re)building a yield curve, spreads, and RAD programming initiatives.  Considered a high-pressure role, with many high-pressure moments.  For me, a very fulfilling, rewarding part of my overall career and work experience.

 

ETL, EDI, and FTP data conversions, transformations, data mapping, and data migrations with CSV and TXT files during startup phases of new projects, or ongoing feeds to and from legacy platforms, enterprise databases, and the cloud, i.e., handshakes to and from other systems such as SAP.  ETL has always been an important part of every project throughout my career. Sometimes under-appreciated, these are critical, sometimes very complex processes that require sophisticated safeguards, control reports, exception detection, and sometimes suspense files that are recycled automatically with each next run to assure data quality and data integrity...
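
A minimal sketch of the suspense-file pattern just described (file paths, delimiter, and the five-field validation rule are all hypothetical):

    ' Hypothetical ETL guard: load a CSV, divert bad rows to a suspense file
    ' that the next run picks up again, and print a simple control report.
    Sub LoadWithSuspense()
        Dim fIn As Integer, fSusp As Integer
        Dim rec As String, parts() As String
        Dim nGood As Long, nBad As Long
        fIn = FreeFile: Open "C:\feeds\claims.csv" For Input As #fIn
        fSusp = FreeFile: Open "C:\feeds\claims_suspense.csv" For Output As #fSusp
        Do While Not EOF(fIn)
            Line Input #fIn, rec
            parts = Split(rec, ",")
            If UBound(parts) = 4 And IsNumeric(parts(3)) Then
                nGood = nGood + 1     ' a real job would stage the row to the database here
            Else
                Print #fSusp, rec     ' suspense row, recycled automatically next run
                nBad = nBad + 1
            End If
        Loop
        Close #fIn: Close #fSusp
        Debug.Print "Control report: " & nGood & " rows loaded, " & nBad & " suspended"
    End Sub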

 

Business Analyst:

 

Embedded with the business, liaising with stakeholders, PMO, and IT, and with use of SDLC (Waterfall) and ALM, authored many BRDs and made presentations to senior management.  Essential first steps: conferring with stakeholders, business areas, and project teams, building consensus, confirming that all of us are on the same page and that we fully understand what we are being asked to do... Authored many FRDs, DFDs, and Use-Case drawings, with use of Agile, Jira, Scrum boards, and Visio, wearing many hats, as JAD facilitator presenting the DFD, which is the deeper study, a detailed mapping, the business model, a forest view that identifies all operational areas, data sources, information, and dataflow to and from each bubble, one operational area to another; essentially this is the scope of the project... If it’s not on the drawing, it’s not in scope, but revisions do happen. Heavily annotated lines and arrows with supporting narrative describe each exchange of data and information from bubble to bubble, to and from proposed new and existing data store(s), one to another, and if there were gaps, missing pieces, we caught them here... My goal, the final presentation, is to gain a final consensus, a key milestone, and then shift to a higher gear, developing the SRD and making the transition from SDLC Waterfall to the more traditional IT Waterfall, where I switch to using JIRA and MS Project... Throughout my career I’ve gained the bandwidth and skillset to present to senior management, huddle at the whiteboard, and sit side-by-side with the project team in many roles as developer, designer, programmer, and everything in between.

 

HR/HCM Projects: Numerous projects as designer and developer assisting HR with the automation of the annual Budget, Staffing Plan, Salary Planning, employee review/rating, and the annual global employee census collection across Europe, Asia, and the Americas, a process that begins every September.  A template, essentially an Excel workbook, one for each profit/cost-center location, allowing for cross-border currency differences and transliterations. Also a salary increase and bonus funding pool attribution process using KPI metrics as predicates; examples would be regional profitability, budget versus spend, and fewer incident reports, among many other performance metrics... Developed payroll compliance reports with sensitivities to diversity and discrimination in its many forms…  See the Teleflex Int’l HR project details below…

 

Data Analyst:

 

Formal study of data normalization, data modeling, data science, and database design at the University of Pennsylvania (a math and computer-science major), and, as an IT DBA and lead DBA with Unilever, considerable experience designing relational databases and data warehouses of considerable scale and complexity.  Some examples of large database initiatives, data mining, data modeling, and analysis follow…

 

With ERP and related projects, built and performed studies with databases holding millions of rows of historical data, testing for trend, cycles, interval, amplitude, seasonality, a distinguishable signal.  Built many fully automated model-fitting (RPA/ML) and back-testing processes that determine which of many forecasting methodologies (triangles, time-series techniques, Bayesian linear techniques, and single, double, and triple exponential smoothing with optimal smoothing-constant sets) is best suited and most accurate, product by product, to produce SKU-specific forecasts that feed a production planning process for a multi-national company that manufactures millions of pounds of product each month.  Knowing which products to make, when to make them, in what quantities, and where to inventory them (warehouse distribution), in an industry that is always capacity constrained, has a direct impact on logistics costs and the company’s bottom line.  Having this experience, I found that the accuracy of a forecast isn’t just a function of the arithmetic one uses to produce it; it’s also a function of the preparation, organization, and quality of the data it feeds from (described more fully below).  I developed many fully automated data modeling and model-fitting techniques to identify which specific forecasting method to use for each of the varied products the company produces, SKU by SKU, where each SKU exhibits unique trend, cycles, and seasonality. Please see the detailed project descriptions below for the particulars of several large-scale Forecasting and RPA/ML AI applications...
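
As a toy illustration of the model-fitting idea (hypothetical, not production code: it back-tests single exponential smoothing over one SKU’s history and returns the smoothing constant with the lowest squared one-step-ahead error):

    ' Hypothetical back-test: choose the best smoothing constant for one SKU.
    Function BestAlpha(hist As Range) As Double
        Dim a As Double, alpha As Double, bestErr As Double
        Dim f As Double, e As Double, i As Long
        bestErr = 1E+308
        For a = 0.05 To 0.95 Step 0.05
            f = hist.Cells(1).Value: e = 0
            For i = 2 To hist.Cells.Count
                e = e + (hist.Cells(i).Value - f) ^ 2   ' one-step-ahead error
                f = a * hist.Cells(i).Value + (1 - a) * f
            Next i
            If e < bestErr Then bestErr = e: alpha = a
        Next a
        BestAlpha = alpha
    End Function

The real processes extended this idea across single, double, and triple smoothing and other methods, picking the winner per SKU.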

 

At EmblemHealth, a leading health insurance company in NYC, a finding from my studies of the millions of rows of claim history analyzed each month was that, in order to more accurately calculate future claim reserve projections, data must first be organized into homogeneous cohorts (projection cells), where each grouping of insureds, each cohort, shares a similar profile, the same demographics...  This is a disciplined data analysis process using methods such as chi-square, multiple regression, correlation coefficient (similar to a market beta), and covariance, the trend correlation of many datasets. With the use of Excel/VBA, it was now possible to demonstrate that when data is organized into these cohorts, with greater homogeneity and similar profiles, and with the use of more advanced predictive models (triangles) that better understand medical event costs, specifically post-medical-event additional costs (the considerable costs during the period of recovery, the runout, previously thought not predictable), these projections are greatly improved and more accurate, consistently producing reliable projections and a sufficient cash reserve number. This is a mandate under the New York State Insurance Commission’s stringent requirements (they impose heavy fines) to never be under-reserved, which was a major justification for this project.  Please find a detailed description below...

 

Mass mail-merge: labels, letters, email, invitations, attachments, notifications, forms, artifacts, contracts, and personalized document distribution.  Built numerous applications leveraging PDFs, PDF Toolkit, VBScript, Excel/VBA, Microsoft Word, Outlook HTML (OLE), and sophisticated bookmarking, using Word/VBA and sometimes VBScript, to produce massive volumes of labels and many thousands of important, time-sensitive documents, emails, invitations, and appointments with attachments, and did so in a wide range of industries, e.g., Cenlar Mortgage Servicing Company, NJ.
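
A minimal sketch of the Word/VBA bookmark technique (template path, bookmark names, and sheet layout are hypothetical):

    ' Hypothetical bookmark fill: one letter per row of a recipient sheet.
    Sub GenerateLetters()
        Dim wd As Object, doc As Object, r As Long
        Set wd = CreateObject("Word.Application")
        For r = 2 To Sheet1.Cells(Sheet1.Rows.Count, 1).End(xlUp).Row
            Set doc = wd.Documents.Add("C:\templates\letter.dotx")
            doc.Bookmarks("FullName").Range.Text = Sheet1.Cells(r, 1).Value
            doc.Bookmarks("Address").Range.Text = Sheet1.Cells(r, 2).Value
            doc.SaveAs2 "C:\out\letter_" & r & ".docx"
            doc.Close False
        Next r
        wd.Quit
    End Sub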

 

 

Innovation Projects:

 

Developed numerous Excel tools. One example: a tool that performs sophisticated workbook comparisons (comparing current with prior or other like instances of the workbook in question), an RCA tool used to isolate and debug a broken Excel workbook. It is a fully automated tool that scans all parameters and properties that live deeper inside these Excel workbooks, dives even deeper inside each worksheet, and goes way, way deep inside each cell, covering the hundreds of cell properties; examples would be conditional formats, hard-coded validation lists, or pointers to validation lists in other ranges of cells on the same or different worksheets, also cell ranges that point to live data connections, sparklines (and the cell range each feeds from), and then the range-names, which sometimes point to cell ranges living inside other workbooks (especially if the other workbook is accidentally moved to a different filepath), and the more familiar properties like locked, unlocked, color, font, font size, bold; it’s endless... It’s far more than just the simple comparison of each cell formula in workbook 1 to the corresponding cell in workbook 2, which typically is as far as most of these kinds of “compare tools” go... This tool also compares every property associated with external data connections, VBA query-table URLs, and ADO SQL database connections, also pivot-table properties and chart properties, i.e., a chart’s data-series-collection cell-range pointers. And it does all this with VBA (VBA code that looks at VBA code), looking for modifications, recent changes to VBA code, including VBA code that lives inside add-ins and APIs; when code changes are detected, no matter how subtle or small, they generally are what caused or contributed to a broken, misbehaving, and/or corrupted workbook... This process also looks for changes to the states of Names (the range-names under Formulas > Name Manager on the menu ribbon) and the cell address(es) they point to, including external references to cells residing in other workbooks, where for any number of reasons these connections/pointers become broken. It happens a lot...  Note that this is a far more comprehensive tool than existing tools of this nature, even those available from Microsoft, and I believe this tool may very well be the only one of its kind with this degree of complexity and capability. It received much praise and kudos from top senior management at Morgan Stanley...  When an important workbook all of a sudden stops working, it’s almost always the result of a change to it, sometimes inadvertent, or to something it feeds from upstream; the example is a trading tool on the trading floor, feeding from an external source, during a high-pressure trading moment... Diagnosing and solving these problems with the fastest speed possible is the highest priority... It was shared with me that Morgan Stanley senior management was appreciative, and that the person who developed this tool “must’ve really known their stuff” is how word of it got back to me; definitely a good moment for me...
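
A tiny fragment in the spirit of that tool (hypothetical; it diffs only cell formulas and range-name definitions between two open workbooks, assuming matching sheet names, where the real tool described above goes much deeper into properties, connections, and VBA):

    ' Hypothetical compare fragment: formulas and Name definitions only.
    Sub CompareBooks(wb1 As Workbook, wb2 As Workbook)
        Dim ws As Worksheet, c As Range, nm As Name
        For Each ws In wb1.Worksheets
            For Each c In ws.UsedRange
                If c.Formula <> wb2.Worksheets(ws.Name).Range(c.Address).Formula Then
                    Debug.Print ws.Name & "!" & c.Address & " formula differs"
                End If
            Next c
        Next ws
        For Each nm In wb1.Names
            On Error Resume Next          ' name may not exist in the other book
            If nm.RefersTo <> wb2.Names(nm.Name).RefersTo Then
                Debug.Print "Name '" & nm.Name & "' points elsewhere"
            End If
            On Error GoTo 0
        Next nm
    End Sub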

 

Also with Morgan Stanley, I developed an impact-anticipation tool used ahead of rollouts of updates (software refreshes) for existing add-ins, APIs, and common code, specifically with trade-related workbooks. This tool walks every filepath (every folder and subfolder) on all servers company-wide (it looks behind every pillar and post) and, with user-supplied search criteria, examines and jumps inside each and every Excel workbook as it traverses each filepath/folder, examining each and every worksheet, looking at every cell, its value, its formula, and every property deeper inside each cell...  Also every data connection, every add-in, every API (especially ones being obsoleted or replaced with a new API), also every validation rule, conditional format, and any property that might reference or utilize an API, add-in, or common code; and, as mentioned above, with VBA code it looks at each workbook’s VBA, every line of code, in every module, every event, every add-in. Doing this, it compiles a listing of every Excel workbook it finds with content matching criteria that indicate impact; its main purpose is to be proactive, to identify workbooks that will be impacted by a rollout or software refresh of this kind (it makes a list)... Its mission is to avert a breakage resulting from an impending change to a common service, add-in, or API that it and other workbooks may be using or feeding from upstream.  These are the Excel workbooks, especially trading tools, that tend to proliferate, get copied, and be freely distributed throughout the many trading-desk communities company-wide (world-wide), which makes having these kinds of tools essential, especially for larger global investment banks and multi-national companies. Sometimes, as a trade-floor support person asked to be in numerous places at the same time, a one-person army and highest-tier problem solver, it was essential for me to have or build tools like these... I developed many sophisticated tools like this throughout my career. My role normally is as a developer in its many forms, but just like a volunteer firefighter, I do receive the urgent calls as top-tier trading-floor and senior-management solution provider…  For me, this role is a perfect blend of developer and RAD-deployment Excel/VBA go-to special-projects and support person…
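
A hypothetical skeleton of that kind of scan (paths and criteria invented for illustration; it walks a folder tree, opens each workbook read-only, and flags any worksheet whose formulas reference the search criterion):

    ' Hypothetical impact scan over a folder tree of workbooks.
    Sub ScanFolder(path As String, crit As String)
        Dim fso As Object, fld As Object, f As Object, sf As Object
        Dim wb As Workbook, ws As Worksheet, hit As Range
        Set fso = CreateObject("Scripting.FileSystemObject")
        Set fld = fso.GetFolder(path)
        For Each f In fld.Files
            If LCase(fso.GetExtensionName(f.Path)) Like "xls*" Then
                Set wb = Workbooks.Open(f.Path, ReadOnly:=True)
                For Each ws In wb.Worksheets
                    Set hit = ws.Cells.Find(crit, LookIn:=xlFormulas)
                    If Not hit Is Nothing Then Debug.Print f.Path & " impacted"
                Next ws
                wb.Close False
            End If
        Next f
        For Each sf In fld.SubFolders          ' recurse into subfolders
            ScanFolder sf.Path, crit
        Next sf
    End Sub

The real tool also inspected data connections, add-ins, validation rules, and each workbook’s VBA, and compiled the results into a list rather than the Immediate window.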

 

My technical background and skillsets, the many acronyms of technology I work(ed) with:  Advanced Excel/VBA/SQL, Macros, Power BI, DAX, VBScript, VB, .bat, Add-ins/API, REST API, Word VBA, heavy SQL with ADO/ODBC database connectivity with MS Access, Salesforce, CRM, SAP/ABAP, Fieldglass, SQL Server, SSRS/SSIS, UI/UX, UDF, Dashboards, Tableau, Sybase, Mainframe, MVS, DB2, VSAM, CICS, COBOL, JCL, SPF, TSO, and SAS (with IOM), and most every DBMS...  SaaS, heavy JIRA, Scrum, Agile, ETL, EDI, HL7 data migration, FTP, cloud-based apps (Azure, AWS), RPA/ML AI and Blue Prism. Considerable use of Hyperion for rollups and consolidation (PNL/GL), QlikView, Adobe PDF Toolkit, Cognos, XML, SOAP/REST API, SharePoint, Java, R, Python, VSTO, Scripting Dictionary, .Net, VB, C#, MS Office, MS Project, Word, Outlook, PowerPoint, Visio, and an OOP programming style…  Note: Visual Basic, aka VBA and VBScript.

 

With recent projects, on any given day you can find me at my desk coding, in meetings with stakeholders, making a presentation, or sitting side-by-side with a jr. programmer debugging a program; that same day you might find me on a trading floor fixing a broken trading app…  Considered a seasoned professional, embedded with the business, a low-key, team-oriented individual who can step into a high-pressure role, technical and otherwise, on day one and hit the ground running... Please consider me for this opportunity... Thank you...

 

Company-Specific Project Descriptions:

 

Abbott Labs – Chicago/NYC, Contract, Excel/VBA SQL, Oracle DBMS, order fulfillment, Salesforce (WFH and onsite, Jan to present)

A backfill role as an Excel/VBA programmer.  Abbott, well into their project, determined they needed more experienced Excel developers to re-write their order entry, pricing, order fulfillment, supply-chain, shipping, and invoicing applications, and reached out to me.  Abbott initially thought Excel would be the fastest and easiest way to build this application, but realized well into the project that much of the complexity of building this kind of application would need to be written in VBA code. The UI/UX (user interface) aspects of the application were ideal for an Excel presentation layer, but there was no easy way to achieve data connectivity between the spreadsheets and Abbott’s massive enterprise database. I developed numerous APIs to address what was identified as system- and enterprise-wide reusable code. This was a high-priority application that needed to be brought online as quickly as possible to strengthen an aging order fulfillment system amid supply issues in the overall marketplace. Abbott, at this time, was under considerable pressure from much-publicized supply-chain issues. This was an ideal role for me as lead programmer and project lead. Excel and VBA with ODBC connectivity to an Oracle database are a strength, and this project needed to be up and running ASAP; it was clearly the top priority at Abbott...
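
A minimal sketch of the ADO/ODBC pattern used on projects like this (DSN, credentials, table, and fields are hypothetical, not Abbott’s actual schema):

    ' Hypothetical ADO pull from an ODBC data source into a worksheet.
    Sub PullOrders()
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "DSN=ORDERS_DB;UID=report_user;PWD=*****;"
        Set rs = CreateObject("ADODB.Recordset")
        rs.Open "SELECT order_id, sku, qty FROM open_orders", cn
        Sheet1.Range("A2").CopyFromRecordset rs    ' drop results onto the sheet
        rs.Close: cn.Close
    End Sub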

 

TIAA-CREF – Contract, NYC / Durham, NC (onsite/remote) (July to January)   Excel/VBA SQL technology platform...

Assigned a role as analyst, developer, and programmer embedded with Actuarial and Quant teams developing an Excel workbook Pension and Retirement Funds Management and Administration System. Its purpose is the managing of retirement, savings, and investments and related transaction activity; examples: derivatives, insurance, dividends, annuity surrender, premium payments, untaxed gains, taxable gains, loan withdrawals, taxable income, disbursements, distributions, cost-basis calculations and valuations, and a predictive cash-flow and runout projection process for millions of members (UFT Teachers Union), many billions of dollars...  Was asked to develop this sophisticated spreadsheet. Heavy emphasis on UI/UX features, thousands of lines of VBA code, and a fully automated refresh process with ADO/ODBC SQL connectivity to back-office Oracle databases.

 

Walmart (Bentonville, AR), Remote; Excel/VBA, Word/VBA during outbreaks of COVID-19 (May to July)

Large-scale office automation, urgent short-term projects with major retailer Walmart: Excel/VBA and extensive Word/VBA (a very rare skillset) and HTML with sophisticated labeling and bookmarking for mass mail-merge, sending many thousands of individualized postal letters and literature to customers, buyers, vendors, business associates, and employees, a real challenge given the considerable time constraints to make this happen. Also built a mechanism to send mass email via Outlook to thousands of employees and business associates, also buyers and vendors, intra-company, with individualized (personal) attachments, and Outlook meeting invitations with meeting agenda attachments and exhibits, working with the Walmart Bentonville, Arkansas IT location, one of the world’s largest IT departments; a high-pressure role working under considerable time constraints during COVID-19 regional peaks, spikes, and outbreaks that made mass tactical correspondence essential… I did this to assist Walmart IT, surprisingly less skilled in this area, to get this done… And I did this type of work many times at companies such as Cenlar, a mortgage servicing company located in Pennington, NJ, and on an ad-hoc basis at the many companies I worked with throughout my career.
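
A minimal sketch of that Outlook batch-send pattern (sheet layout and column meanings are hypothetical): one personalized message, with its own attachment, per row of a distribution sheet.

    ' Hypothetical Outlook loop: personalized mail with per-person attachment.
    Sub SendBatch()
        Dim ol As Object, mi As Object, r As Long
        Set ol = CreateObject("Outlook.Application")
        For r = 2 To Sheet1.Cells(Sheet1.Rows.Count, 1).End(xlUp).Row
            Set mi = ol.CreateItem(0)                    ' 0 = olMailItem
            mi.To = Sheet1.Cells(r, 1).Value
            mi.Subject = Sheet1.Cells(r, 2).Value
            mi.Body = Sheet1.Cells(r, 3).Value
            mi.Attachments.Add Sheet1.Cells(r, 4).Value  ' per-person file path
            mi.Send
        Next r
    End Sub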

 

Morgan Stanley – Midtown Manhattan, NY (September to March onsite, then WFH March to May)

Joined the STRATs and FX Quant teams as a developer/support person, initially asked to build a new class of trading, pricing, spread, and yield-curve applications with emphasis on data visualization, office automation, dashboards, heatmaps, and scorecards. When COVID spread through our Broadway building, Morgan Stanley invoked their continuity plan, where employees and consultants, all of us, were asked to transition and work from home. I was assigned the role of Morgan Stanley’s top-tier Excel/VBA go-to support person, embedded with trading desks, sitting side-by-side with traders; at that time, our top priority was simply to keep our technology platforms stable, up and running. A reputation for multi-tasking with trading desks, a go-to support person that could step up and do the ‘work of many’, my managing director shared with me during an annual review...

Pre-pandemic, some technical highlights include the creation of a new class of worksheet function (an add-in with thousands of lines of VBA) that produces sophisticated business graphics.  This was considered a very special capability and received lots of attention. Normally, a worksheet function works within the scope of the one cell it resides in; building a dashboard comprised of sophisticated charts and sparklines generated from a worksheet function was not generally thought doable.  MS’s Quant community, determined to have this capability, asked me to find a way to make it happen: an ability to rapidly build and distribute sophisticated charts with enhanced graphics from what would look to be a simple custom worksheet function (a UDF), a formula within the cell that serves as the top-left corner of the chart it just generated. We named it FastChart: =FastChart().  And as with all BI dashboard tools, Excel provides one primary and one secondary Y axis for its charts, but what I was being asked to do required the creation of a third Y axis; I refer to it as a tertiary Y.  And this required a modification, an enhancement, to the Microsoft Excel Chart object.  This was a high-visibility project, thought not possible to do, a request that came from senior management to make happen (see the last-page discussion of why there was so much interest in this dashboard).

Was given opportunities to develop numerous custom add-ins (APIs), powerful tools, many of them, that attach automatically to any workbook (using VBA code that writes VBA code, which it embeds inside the VBE VBProject property of the receiving workbook during the initial add-in or API open event).  These are tools that became popular throughout the Morgan Stanley Quant community, also senior management and the Derivatives and Options trading desks... Also asked to participate in many urgent special projects, including BI dashboards that feed from databases and data warehouses, local and global, via connections from within the VBA code, e.g., SAS IOM, ADO ODBC connectivity to Access, SQL Server, Oracle, DB2, Sybase, the cloud, pretty much all of Morgan Stanley’s DBMSs... Also Pioneer, Bloomberg, and Reuters add-ins that provide real-time/live ticker data... These kinds of projects and opportunities have come my way throughout my career, which probably explains why, to this day, I love working with Excel and VBA. Every project is a thrill.  Morgan Stanley’s PMO team believed I would be the right person for this role after what I understand was a lengthy candidate search. I wasn’t initially hired to be a technical IT person, but was recognized as having a rare combination of strong technical Excel/VBA skill and strong business acumen and experience, and met the criteria for the person Morgan Stanley was looking for.
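
The resume doesn’t show FastChart’s internals, but since a UDF cannot modify the sheet directly, one common workaround for this class of function is to queue the request and let Application.OnTime draw the chart once recalculation ends. A hypothetical sketch of that mechanism only (not the actual FastChart code, which also handled the tertiary Y axis):

    Private mPending As String                 ' module-level request queue

    Public Function FastChart(data As Range) As String
        mPending = data.Address(External:=True)
        Application.OnTime Now, "DrawPendingChart"    ' deferred until recalc ends
        FastChart = "chart @ " & Application.Caller.Address
    End Function

    Sub DrawPendingChart()
        Dim co As ChartObject
        Set co = ActiveSheet.ChartObjects.Add(100, 100, 300, 200)
        co.Chart.SetSourceData Application.Range(mPending)
    End Sub

On the sheet, =FastChart(A1:B10) would then return a tag in its cell while the deferred routine draws the chart.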

 

MFX Fairfax / Ironshore P&C – Morristown, NJ (May to Sep)

MFX Fairfax provides software and software services for the property and casualty insurance industry. In this role as business, systems, and data analyst and Excel/VBA SQL developer/programmer, heavy collaboration with Ironshore P&C, New York City. Working closely and embedded with Ironshore actuarial staff, I was tasked with the development and overhaul of a large, diverse set of casualty insurance risk/rating/pricing workbooks, preparing them for a process that persists key policy submission, underwriting, booking, rating, and pricing artifacts and supporting documents, essentially all relevant information, to a centralized enterprise database, itself in a state of development when I first came on board…  Extensive modifications were needed to the embedded rating and pricing functions in each of the workbooks. These are large, sophisticated workbooks that undergo considerable modification in response to steadily changing business needs.  Formats and layouts vary from insurance coverage to coverage, but much of the underlying key data is constant across all insurance products. The initial approach described to me during the interview was simply to map each cell address in a workbook to a specific database table and corresponding column fieldname.  I was asked during the interview how I might approach this project, mindful of an incomplete database design and workbooks that are always in perpetual states of change.  I cited some risks, starting with one example: something as simple as the insertion of a row or column in one of these workbooks, which is commonplace, could easily break this process.  This underscored the value of using range-names in this project (not necessarily for all Excel-based applications).  Incorporating range-names for key data, versus a mapping of rigid, non-scalable cell addresses, would eliminate ongoing (re)mapping (IT support), along with other, larger advantages: the persisting process could be accomplished by simply walking the (range) Names collection with VBA, wherein each range-name always accurately points to the actual cell to be persisted (even if the cell is moved elsewhere in the workbook). This could then feed, and indeed does feed, the SQL engine that updates a simplistic database table consisting of (policy-id, range-name, element-name, value), the mapping that serves as the linchpin, essentially a database staging area that sits between the workbook and what would ultimately become the Policy Object Database.  MFX liked this idea, believed I grasped the complexities of this task, hired me, and went with the concept I outlined, and this actually simplified the overall process.  It allowed the database team, the Ironshore actuaries, and my work to have a reduced dependency on one another, in that we could independently work toward the same goal without getting in one another’s way.   I developed a number of Excel-based automation tools to help build this system. Entrusted with this responsibility, MFX and their client Ironshore believed I had the intuition for how this system would need to be put together, an imperative for all of us in meeting the tight timelines and milestones along the way.
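
A minimal sketch of the walk-the-Names-collection idea (naming convention, staging table, and connection handling are hypothetical):

    ' Hypothetical persist loop: stage (policy, range-name, value) rows for
    ' the SQL engine. Assumes persisted names use a "p_" prefix, each refers
    ' to a single cell, and cn is an already-open ADODB.Connection.
    Sub PersistNames(wb As Workbook, policyId As String, cn As Object)
        Dim nm As Name, sql As String
        For Each nm In wb.Names
            If nm.Name Like "p_*" Then
                sql = "INSERT INTO staging (policy_id, range_name, value) " & _
                      "VALUES ('" & policyId & "','" & nm.Name & "','" & _
                      CStr(nm.RefersToRange.Value) & "')"
                cn.Execute sql
            End If
        Next nm
    End Sub

A production version would use parameterized ADO commands rather than concatenated SQL, but the design point is the same: the names, not cell addresses, are the contract between workbook and database.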

 

TV Guide (ROVI), Philadelphia, PA (suburb) (June through December)   Performed work for TV Guide (ROVI), a global RAD initiative to automate receipt of television schedules and programming information sent to ROVI from local broadcasting companies around the world in varied formats and languages.  Exclusive use of Excel/VBA and SQL.  All content is transformed into a common format (a massive ETL process) that formerly required a massive staff working manually; now fully automated, a huge initiative that feeds all of this content to TV Guide’s enterprise database, from which it is digitally re-distributed globally to every cable and set-top box in America, Europe, and Asia-Pacific, and someday, their goal, every cable and set-top box everywhere... This URL describes ROVI as the most important company no one has ever heard of: https://www.businessinsider.com/the-most-important-media-tech-company-you-dont-know-rovi--

 

  • Excel spreadsheets are the primary medium by which local broadcasters from around the world send TV programming and scheduling information to ROVI, where a staff of hundreds of people manually reformatted incoming program schedules and captioning information into a common format and then manually uploaded it into Rovi’s proprietary programming database.
  • My role was to study this process and develop a strategy to automate this costly manual process.
  • Asked to assume the role of Excel developer, I built the automation that now feeds all inbound Excel schedules, in varied formats, through a process that converts all input into a uniform format on an Excel/VBA platform, with an ADO/ODBC connection to Rovi’s proprietary Oracle database. Rovi does not require inbound content to be in a standard or even a clean format, as doing so would place a large burden on content providers (Rovi’s valued customers), and this was seen as the key challenge to overcome for this project to be successful...  This in itself was a significant technical challenge, one that I could not refuse, and it drew me to the project.  The project was successful, and it really did require extensive advanced VBA and worksheet-formula Excel skills to make it happen.

 

Bank of Tokyo-Mitsubishi Securities, NYC (February through early June)  As business, systems, and data analyst and lead developer.  Joined the team well after the project was underway.  Categorized as a RAD initiative; I helped the FP&A management team meet an aggressive target to go live with a new generation of financial reports targeted for March month-end.

  • Assisted the team in overcoming unforeseen technical issues. The project was at risk of missing a firm implementation date. PowerPivots, charts, and seemingly ordinary spreadsheets were not operable in a SharePoint environment. SharePoint and Excel Services, with their many features, had constraints with earlier versions of Excel that rendered the initial approach unworkable.  No experiment or proof of concept had been undertaken earlier in the project.  Compatibility issues went undiscovered until mid-January, just months before implementation, when the former consulting team began porting (sending) spreadsheets from client machines to the SharePoint environment…
  • Spreadsheets worked perfectly from the desktop client but would fail when attempting to load/run in the SharePoint environment. Widely understood but overlooked, as examples: SharePoint and Excel Services are not compatible with VBA code and validation lists.
  • Experienced with these matters, I was asked to help address and solve these issues quickly. What was originally accomplished with elaborate VBA, we were able to re-engineer and accomplish solely with worksheet formulae. Functionality such as an alphabetic sort (not for the faint of heart) done with Excel array formulas, and not VBA, is deceptively complex (see the formula sketch after this list).  A stakeholder requirement was that sorting needed to be fully automated, with no user intervention; this meant “do not force” the end-user (the senior executive/management team) to use the standard sort facility on the menu ribbon… To further sidestep the use of VBA, a firm stakeholder requirement/constraint regardless of which versions of Excel were in use, we ultimately made use of Excel functionality (the slicer), and this enabled us to sidestep primarily the VBA code that synchronized filtering of the clusters of pivots and charts, aka the dashboard. Oddly, the slicer was not known to the original team, heavily staffed by a consulting firm.
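
For flavor, the classic no-VBA alphabetic sort of a text list (a sketch assuming the list sits in A2:A10 with no duplicates; entered in B2 as an array formula with Ctrl+Shift+Enter in pre-dynamic-array Excel, then filled down):

    =INDEX($A$2:$A$10,
           MATCH(ROWS(B$2:B2),
                 COUNTIF($A$2:$A$10, "<=" & $A$2:$A$10), 0))

COUNTIF(list, "<=" & list) returns each item’s alphabetic rank, and MATCH then finds the item whose rank equals the current output row, so the column fills in sorted order with no user intervention.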

Validation-list issues were also not a problem with the newer version of Excel.  The only hitch I needed to overcome was to persuade the management team to do an emergency version upgrade of Excel, which was otherwise prohibitive this late in the project.  But doing this allowed us to successfully address all showstopping issues, which put us back on a glide path to produce the initial set of financial reports on time, as promised.  I developed several of the financial reports considered highest priority to meet this first deliverable target date.

 

Teleflex Medical Implant and Instrumentation Corp, Limerick, PA (Sept to Feb)  As business, systems, and data analyst, developer, and programmer.  With Excel/VBA, SQL, and an ODBC connection to a Microsoft Access DBMS via ADO, I was asked to develop an HRIS/HCM Human Resource total compensation management application, one of the more challenging ETL initiatives of my career. This is an employee review, headcount, compensation, and bonus planning tool. Teleflex was a multi-national company with thousands of employees during the timeframe of this project. The SAP HRIS module was in early development and not available for their annual review, compensation, and bonus planning process, so Teleflex went with an Excel/Access interim alternative.

   

  • This was a RAD project. Teleflex went with an Excel platform mostly because Excel was already installed on every desktop at every location globally, essentially already in place everywhere. Teleflex needed an application that would produce a salary-planning spreadsheet/template, for distribution and collection, containing employee census information, one spreadsheet for each and every location, each built to capture performance score, salary, bonus, and extra compensation data, covering every country and every employee.

 

  • The system was designed to combine employee census data into a central data warehouse, a significant challenge given the complexities of cross-border organizational relationships of executives, management, and staff, with a facility to capture each employee-to-manager (line/staff) relationship, and its inverse, from the bottom to the top of the hierarchical organization (org chart), all the way to the CEO and Chairman. This new collection process would become the single feed for all ad-hoc reporting requests and possibly the initial feed to an SAP HR module if and when SAP was eventually implemented… This was the company’s first complete database of all employees, world-wide… Seen as a noteworthy milestone for Teleflex.

 

  • Challenges included transliterations and normalizing of diacritics (glyphs, aka accent marks), where names are the only key in the absence of employee IDs. Salaries in multiple currencies were converted to USD and then back to each local currency.

 

  • Developed many custom worksheet functions identified as reusable and popular company-wide... An example: =findHier(), a custom worksheet function that looks at each employee’s next higher level of manager/supervisor, iteratively, to create threads that run from each employee to the CEO, in some cases spanning many levels of management…  When assembled, a VBA routine takes a semicolon-delimited thread (returned from the findHier function), runs it through the VBA Split function into an array, and, using shell commands and following the hierarchy, creates a file-server folder structure that precisely mirrors Teleflex’s org chart (see the sketch after this bullet).  The folder structure actually looks like an org chart. Each exec can drill down into their respective folder structure to see the detailed salary planning of their directors and managers. Security is built into the process, permitting a manager to see only his/her own complete organization.  In essence, the CEO can ultimately drill down from the very top of the folder structure, down any leg of folders, to view the very highest and lowest levels of salary-planning data.
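
A hypothetical companion routine to =findHier(), illustrating the Split-then-create-folders step (names and paths invented for illustration):

    ' Turn a semicolon-delimited management thread into a mirrored folder path.
    Sub BuildOrgFolders(thread As String, root As String)
        Dim levels() As String, path As String, i As Long
        levels = Split(thread, ";")        ' e.g. "CEO;VP Ops;Mgr East;J Smith"
        path = root
        For i = LBound(levels) To UBound(levels)
            path = path & "\" & levels(i)
            If Dir(path, vbDirectory) = "" Then MkDir path
        Next i
    End Sub

Calling BuildOrgFolders "CEO;VP Ops;Mgr East", "C:\SalaryPlan" would create the nested folders down that leg of the org chart.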

 

  • These spreadsheets, fairly sophisticated, i.e., the salary and bonus planning tool, incorporate heavy analytics that take into consideration issues of discrimination, bell curves, each business entity’s contribution to profit, and overall corporate profit pro-ration rules, with a dashboard cluster of business graphics and KPIs used by the Senior Executive Salary Planning Committee, with capabilities to adjust pro-ration and bonus incentive parameters, essentially a tool that enables the exec committee to dial in and fine-tune the settings, a cool tool to calibrate the parameters in order to conform with the budgeted bonus line item.  It’s like having a set of dials, and as you fine-tune the parameters, the set of pie charts, line graphs, and bar charts changes with each turn of a dial.  A special effect that we built unknowingly; it was a bit of a thrill to watch when this system was implemented. A really neat project for me.

 

  • Hierarchy threads, via the =findHier custom worksheet function, were integral to a company-wide initiative to re-rationalize the global organization... Another neat challenge, for me and the great staff I was working with, was building this facility to be flexible to organizational changes occurring while the process was being developed. The resulting database was to be used for the initial population, via ETL, of the SAP HR module then in early planning stages. It was said that the functionality we built into these spreadsheets, as short-term as this project was, would be a tough act to follow for the SAP HR module or whatever HR application Teleflex ultimately went with...

 

 

EmblemHealth Insurance Companies – Water St, Lower Manhattan, NY (Jan to Sep)  As business and data analyst / developer. Excel, VBA, ActiveX controls, ADO, SQL, Hyperion, MS Access, Oracle/SAS/IOM (SAS’s proprietary ODBC equivalent).

 

  • A lead developer, an Actuary Architect, working with Emblem’s senior actuary as co-leads reporting directly to the Chief Actuary, with a team made up of actuarial staff, working independently of IT but directly with SAS technical staff to address the numerous technical issues in what was, for Emblem, their first experience using the SAS environment.

 

  • As part of the actuarial team, initially assigned responsibility to re-engineer Emblem’s aging legacy data collection in preparation for a new risk, rating, pricing, underwriting, valuation, and reserve projection process… Emblem’s month-end manual preparation and consolidation of ETL data feeds (CSV files) and scores of manual FTP downloads, millions of rows of claims data from numerous aging legacy systems (many still residing on mainframes from earlier mergers and acquisitions), had grown unreliable.  Adding to the tedium was the recoding of claims data to a common coding scheme (HL7) and the culling of this data into separate cohorts ahead of the valuation process; doing this added complexity and considerable risk of human error.  These cohorts were found to have poor homogeneity.  This manual process, no matter how carefully it was done, was prone to data errors, and the staff supporting it worked under very difficult time constraints to prep all of this data by the 3rd workday (in many cases not achievable), ahead of Emblem’s monthly number-crunching process, which itself could take several days or more; the result, future claim reserve projections, is produced and then fed to Emblem’s accounting systems, also a manual process, and reserves are then set aside to meet NY State’s stringent reserve requirements…  The first task was to fully automate this, building sophisticated ETL for the process, where all claims data is now combined in a standardized format, all using a common coding scheme, with all data stored centrally on a SAS/Oracle database platform.

 

  • Once this was achieved, performed the rigorous data analysis task described earlier, producing new sets of cohorts with high levels of homogeneity using techniques such as multiple regression, chi-square, and correlation coefficient (not unlike a stock-market beta). My suggestion to Emblem to undertake this was that no matter how great our projection process was or wasn’t, the accuracy of these projections would not be a function of the arithmetic or our techniques, but more a function of the data they feed from.  To me this was a familiar situation, because in a different initiative at Imperial Chemical Company years earlier, where I led an initiative to build an ERP demand forecasting system, it was essential that we first smooth anomalies from the historical sales data. Without doing this, the noisy, bumpy data may be seen as legitimate signal, trip up the model-fitting process, and produce bad forecasts.  Rebuilding the cohorts with greater homogeneity gave us a significant advantage as the starting point for what have become Emblem’s best reserve projections ever, far exceeding Emblem stakeholder expectations.

 

  • As lead programmer, I wrote the code for what is now a fully automated application that feeds directly from the new SAS database using ADO and IOM (SAS’s proprietary ODBC equivalent). The SQL resides on worksheets and is ingested by VBA code, treated as parameters rather than hardcoded or embedded in VBA, making each SQL statement easier to modify and maintain; the VBA builds the connection string, issues the open from within its subroutines, and drops the results into the receiving worksheet with the CopyFromRecordset option (see the sketch after this bullet). Describing some of this app’s features, choosing one of its many pop-up menu capabilities: the owning actuary initiates retrieval of a specified cohort from SAS with detailed membership claims data and numerous components of trend, a decomposition of overall trend that identifies the smaller signals comprising it. Built-in facilities enable the actuary to calibrate and dial in an individual trend component’s signal strength, and this adds enormously to accuracy; the arithmetic product of the components of trend is recalculated to represent overall trend, including components that address Medicaid and Medicare membership.  In addition, there are many more pop-up menus with numerous methods that remove noise and outliers in claim data, using a triangle view of cohort data that feeds a process imputing either completion factors or development factors (the actuary may choose either method), where either demonstrates the rate of completion (medical cost obligations from the date of the medical event and succeeding costs post-trauma), along with every best-known actuarial full and partial credibility and blending technique, which, applied to one of numerous projection base methodologies, combine to almost always produce a sensible initial set of reserve projection numbers for next and future months... These reserve numbers are sent through an attribution process, built up from a PMPM (per member per month) using current membership numbers, to ultimately become an input (the future liability reserve projection) forwarded to Emblem’s financial accounting systems. The projection module has thousands of lines of VBA code, is easy to maintain, but has considerable sophistication.  I wrote this VBA code.
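
A minimal sketch of that worksheet-resident-SQL pattern (cell addresses, sheet names, and the connection-string location are hypothetical):

    ' Hypothetical: SQL kept on a worksheet, executed via ADO, results
    ' dropped with CopyFromRecordset, so queries are editable without
    ' touching the VBA code itself.
    Sub RunWorksheetSql()
        Dim cn As Object, rs As Object, sql As String
        sql = Worksheets("Queries").Range("B2").Value     ' the SQL-as-parameter cell
        Set cn = CreateObject("ADODB.Connection")
        cn.Open Worksheets("Queries").Range("B1").Value   ' connection-string cell
        Set rs = cn.Execute(sql)
        Worksheets("Data").Range("A2").CopyFromRecordset rs
        rs.Close: cn.Close
    End Sub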

 

  • Developed a Chief Actuary month-end dashboard that lists all cohorts (the number of cohorts can vary) and, for each cohort, its status (in-progress or complete), the owning actuary’s name, and the total reserve projection dollar amount. The dashboard gives the Chief Actuary the ability to drill into each cohort’s detail, the projection’s arithmetic triangle, including the owning actuary’s adjustments and the mandatory notation (documentation that explains the rationale for each adjustment), which is necessary at times to address an anomaly in the data... Each cohort feeds from its projection triangle. Each triangle is the product of a forecasting process, sophisticated algorithms, almost always unique to each cohort. The dashboard gives the Chief Actuary a full view of all cohorts and the ability to drill into each one: to view the system-generated projection, the triangle with its vast array of numbers, all owning-actuary adjustments, and the justification for each adjustment (its accompanying notation), to perform a review, a look-see at the owning actuary’s adjustments and reasons, and to add or modify the verbiage of an adjustment’s notations. The Chief Actuary, from the dashboard, has complete ability to override an adjustment or recommend adjustments to the owning actuary.

 

  • When all cohorts are checkmarked, approved by the Chief Actuary, this sends the signal for the month-end closing process to begin... As each cohort is approved, the presence of a checkmark freezes (locks) the cohort; once locked, no additional actuary modifications are possible.  With use of sophisticated ETL, the membership (the subscriber membership count) is the divisor and the total cohort projection is the dividend, where the result (the quotient) is a PMPM (per-member-per-month projected cost per member). Membership changes notably from month to month; PMPM is a common denominator. So the journal entry that represents a projected total cash reserve needs to scale precisely with the most current membership, and is then set aside as the projected claim expense for each cohort each month, something that wasn’t done all that well by the Emblem systems in place at that time. Upon doing this, A/P, G/L, and P&L (Hyperion) are updated to be the net of existing cash reserve account balances and adjusted accordingly… The entire risk, rating, pricing, and reserve projection process was written entirely on an Excel/VBA technology platform.  From inception, the anticipated complexity of building this application was considered outside the abilities of Emblem IT and EmblemHealth’s actuarial staff, largely the reason I was brought in to join the actuarial team. The broader EmblemHealth underwriting, actuarial, and overall operational communities were familiar with the Excel worksheet look-and-feel, but not so much with VBA and Excel’s deeper capabilities; all things considered, it made sense to them to use this technology platform, since every employee has Microsoft Office on their desktop. It was a great project for me. Many technical challenges, some of its features thought not doable, made this project that much more memorable and fulfilling for me…
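
A toy example of that PMPM arithmetic (numbers invented purely for illustration): if a cohort’s projected reserve is $1,200,000 against 40,000 members, the PMPM is 1,200,000 / 40,000 = $30; if membership then moves to 42,000, the reserve set aside scales to 30 × 42,000 = $1,260,000.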

 

 

JPM/Bear Stearns – Midtown Manhattan, NY (Sept to Jan)  As business, systems, and data analyst, developer.

Technologies: Bloomberg, Reuters, MS Access, Excel, VBA (macros), ADO, CDO, SQL, SQL Server, Sybase, Oracle, SharePoint, VB.Net.

 

  • Worked as a member of a select team, an internal SWAT/RAD team, in the final days of Bear Stearns, responding to urgent financial data needs, particularly during the period leading to and following the merger.

 

  • Developed and distributed surveillance and governance reports of investor unwinds, positions held in the weeks leading to the collapse of Bear Stearns and merger with JP Morgan. Made heavy use of reference-data and DTCC DerivServ during this timeframe.

 

  • Produced a significant number of senior management and operational reports to track the formal trade confirmation process with all counterparties, including novation re-assignments and transfers/terminations of selected positions, during the transfer of derivative portfolio holdings from Bear to JPM, with use of ETL to and from Scrittura, Calypso, other internal systems, and DTCC/DerivServ.

 

  • Developed numerous scorecards for senior management that tracked progress of the novation of Bear Stearns positions and portfolios to corresponding JP Morgan portfolios, a tedious process for JPM and Bear Stearns taking more than a year to complete.

  • Deployed directly with Front Office, Middle Office, Back Office, Trading, Marketing, Equities, and Derivatives teams.

 

  • Built many higher-functioning Excel dashboards and applications, many as RAD initiatives, some with a thousand lines and more of VBA code. Heavy use of WebQuery and real-time links to live data. Other VBA techniques make use of an OLE-embedded Outlook object, OLM, and the Outlook mail envelope to automate the sending of email with attachments from within Excel, using VBA with CDO and SMTP mail servers and Excel SharePoint services via hyperlinks embedded in email. We used this method to distribute reports globally in varied formats, including PDF, PPT, XLSX, XLSM, and DOCX files...

 

  • Built and maintained several trading blotter apps in addition to my primary responsibilities as a derivatives FO/MO developer. Considered the alpha Excel go-to support person company-wide at that time. Designated as top-tier support for all trading desks world-wide...

 

  • Developed office automation with what came to be known as the scheduler (an .xlsm workbook). Its need grew from a heavy workload where middle-office staff worked late into each evening preparing a growing stack (deck) of management reports and metrics (KPI/KRI) that quantify, identify, and prioritize outstanding unconfirmed trades in support of Bear Stearns' significant derivatives business. This scheduler, built with Excel/VBA, was the key component in the automation of the reports. It provided a facility to populate a to-do list of spreadsheet reports that are refreshed with automation (written with VBA) and distributed one or more times daily at varying intervals. Known to us as the to-do list, each row contains a spreadsheet’s file name, filepath, a start time, and the procedure (a subroutine, also with a path and filename) that performs the particular refresh at each designated start time (see the sketch after this bullet).  Alternatively, a spreadsheet refresh can be initiated/triggered when the file it feeds from is found to exist in what is generically referred to as an FTP drop-off folder.  A VBA CDO (email) routine is then triggered that sends reports/spreadsheets, in the form of a hyperlink or attachment, to the managing director/exec (the global community) earmarked to receive the report/spreadsheet.  This enabled us to bypass using a fileserver as the delivery mechanism, where a report earmarked for Singapore, as an example, sitting on a server in Delaware back then, would take forever to download for a recipient in other parts of the world…
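
A hypothetical skeleton of the to-do-list scheduler (sheet layout and procedure names invented; column 1 holds the workbook path, column 3 the start time):

    Sub ArmSchedule()
        Dim r As Long, ws As Worksheet
        Set ws = Worksheets("ToDo")                    ' one row per scheduled report
        For r = 2 To ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
            Application.OnTime ws.Cells(r, 3).Value, _
                "'RefreshReport """ & ws.Cells(r, 1).Value & """'"
        Next r
    End Sub

    Sub RefreshReport(fullPath As String)
        With Workbooks.Open(fullPath)
            .RefreshAll                                ' re-pull the data connections
            .Save
            .Close
        End With
    End Sub

The quoted-string form passed to OnTime is how VBA hands an argument to the deferred procedure; the real scheduler also handled distribution via CDO email after each refresh.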

 

  • Developed a derivatives workflow trade lifecycle management application (Excel/VBA/ActiveX) to support Bear Stearns' derivatives desks. A significant tool that drives much of the derivatives middle office by directing focus to T+/R+ problemed trades. This supported Credit, Rates, FX, Equity, Base and Precious Metals Commodities, and Energy derivatives, plus CDS, CDO, ABS, MBS, CLO, LBS, and Syndicated Loan products... The Excel application feeds, via SQL, from a data warehouse that in turn feeds from the DTCC Deriv/SERV post-trade matching service, at that time an advanced service. Integral to this project was the design and development of a Derivatives Data Warehouse.

 

  • I developed what was known as the master spreadsheet, a living document that receives updates of thousands of rows of reference data via this new data warehouse, which feeds directly from DTCC throughout each day via a VBA updating process triggered with a VBA OnTime timer event. Each refresh of the spreadsheet added new trades, assignments, amendments, partial and full terminations, and updated break information. Metrics derived from the spreadsheets help management teams quickly spot throughput problems and in many cases led to improvements in best-practice procedures. The master spreadsheet undergoes a process that redirects trades, via an elaborate data-mining process, to the appropriate analyst using a pre-determined mapping of counterparty(s) to responsible middle-office analyst(s)… Analysts resolve breaks, make notations (in what we called the diary), and indicate status on their respective spreadsheets, subsets of the master spreadsheet; this data, in turn, is uploaded back to the master spreadsheet (a shared workbook) and then updated to the data warehouse, from which a scorecard, the KPI/KRI process, feeds, producing very elaborate (red/yellow/green scorecard) heatmap reports and distributing them to the respective managing directors, along with metrics that indicate speed of throughput, highlighting, with use of an aging process, how long a trade has been sitting in a problem status.  These management reports did a great job of showing the most-aged problemed trades (stuck trades), blinking red, with all pertinent diary of actions and dialog with the counterparty resolver. Useful to the senior managing director during status meetings where management teams prioritize problem trades (breaks) stuck in the pipeline in order to meet T+/R+ throughput constraints, wherein the master spreadsheet contains the latest break information it obtains through real-time connectivity with the DTCC post-trade clearinghouse.

 

  • Worked with the On-Boarding Account Documentation team in RAD mode to meet an aggressive target date. The goal was to automate a heretofore manual, quite tedious process supporting the Know-Your-Customer (KYC) initiative: gathering the requisite legal documentation as part of what is known as a Customer Information Profile (CIP) for every client, in order to comply with Regulatory, a branch of risk management. A questionnaire was developed with Excel but looks nothing like Excel. Essentially this is a rules engine. Rules give the application an innate understanding of jurisdictions, client entity business types such as C corporations, S corporations, LLCs, and LLPs, and the financial products and Bear entities they can be traded with. It produces a listing of required legal documentation, which varies widely based on country, state, city, and province, wherein this documentation is considered a required prerequisite to trading and/or doing business with a new client.

 

  • Developed a series of compliance/surveillance Key Risk Indicator (KRI) reports for the risk management group. Their purpose is many-fold, but largely to identify anomalous or suspicious trading patterns and irregularities in economics, AML, fees, and/or pricing data that present as skew against like trades of exact or similar tenor occurring within the same contiguous window of time. These reports, developed specifically for the risk management and compliance teams and SOX Compliance, identify extraordinary trading patterns and trading-support activity, including middle-office confirmation and amendment processes.
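As one hedged illustration of the kind of skew test such a report can run, flagging a fee that deviates beyond a sigma tolerance from like trades of similar tenor (the function and argument names are hypothetical):

    ' Flag a trade whose fee sits more than k standard deviations from the
    ' mean fee of its peer group (trades of the same or similar tenor).
    Public Function IsFeeOutlier(ByVal fee As Double, peerFees As Range, _
                                 Optional ByVal k As Double = 3) As Boolean
        Dim mu As Double, sd As Double
        mu = Application.WorksheetFunction.Average(peerFees)
        sd = Application.WorksheetFunction.StDev(peerFees)
        If sd = 0 Then Exit Function            ' no variation, nothing to flag
        IsFeeOutlier = Abs(fee - mu) > k * sd
    End Function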

 

Citigroup, Midtown Manhattan, NYC  (Jan to Sept )  Sr. Business Analyst / Lead Developer (backfilled position to fill a staffing gap mid-project). Technologies: Excel, VBA (macros), ADO, CDO, Pivots, Dashboards, SQL, Access, SQL-Server, Oracle...

 

  • Worked with the Global Risk Management Group, reporting into the Global Chief Risk Officer.
  • Developed numerous sophisticated spreadsheets with heavy analytics to centralize risk oversight and management for Citi’s global portfolio of credit card products, at that time representing billions in total value and exposure. Heavy use of statistical techniques to produce tactical and strategic forecasts that feed from historical actuals.
  • Developed sophisticated spreadsheet templates to capture global data containing both actuals and guidance (forecasts), collected monthly from countries worldwide, and maintained the histories of both actuals and forecasts in Microsoft Access to measure forecast accuracy (see the sketch after this list).
  • Built regional dashboards with impressive business graphics showing degrees of risk: the interval and amplitude of payments, latency, and the percentage of cardholders at or near their limit, making minimum payments, or making late and missed payments, indicating degrees of risk and default. Also a dashboard showing global hotspots, with point-and-click drilldown…
  • Automated an existing process to build decks of pivot-oriented dashboards, distributed globally to local servers on every continent with each refresh. Before this, producing them manually required a large staff, sometimes one or more persons per region, all local to Citigroup’s Manhattan office. I was considered a top Excel resource at Citi.
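A minimal sketch of the accuracy check referenced above, comparing stored forecasts against actuals as mean absolute percentage error (a standard metric; the ranges and function name here are illustrative):

    ' MAPE over paired ranges of actuals and forecasts pulled from the Access
    ' history; rows with zero actuals are skipped to avoid division by zero.
    Public Function MAPE(actuals As Range, forecasts As Range) As Double
        Dim i As Long, n As Long, s As Double
        For i = 1 To actuals.Cells.Count
            If actuals.Cells(i).Value <> 0 Then
                s = s + Abs(actuals.Cells(i).Value - forecasts.Cells(i).Value) _
                        / Abs(actuals.Cells(i).Value)
                n = n + 1
            End If
        Next i
        If n > 0 Then MAPE = s / n
    End Function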

 

Ingredion, Imperial Chemical Company, ICI - Bridgewater, NJ  Sr. Business, Systems, Data Analyst / Designer / Developer.

Excel, VBA (macros), (Power) Pivots, SQL-Server, Sybase, Oracle, SAP, VB.Net...  As consultant and lead Excel/VBA developer, I was a go-to person for Excel and VBA assistance across all operational areas, and conducted numerous advanced Excel training classes and lunch-and-learns for employees. Developed many Excel/VBA/SQL applications that feed from Access, Oracle, Sybase, and SQL-Server using ADO/ODBC.

Developed a budget and performance tracking system using Excel, VBA, Access, and an ETL process with feeds to and from SAP: a joint project with HRIS and IT covering the complete global IT budgeting process, including salary planning, vendor management, equipment, software and other licensing, leasing, and departmental chargebacks, plus a system unto itself for the employee review process, employee census, and new-hire planning. Dozens of separate departmental budget-planning spreadsheet templates were built with interfaces to the G/L and COA; each worksheet template provides for FTEs, employee ratings, and bonus compensation rules, with multiple levels of pro-ration predicated in part on pre-defined KPI parameters, bonuses being determined by each operational area’s contribution to profit. Templates perform local-to-US currency conversion, and the spreadsheets reside on shared servers. Budget items roll up and consolidate to G/L accounts: each departmental budget spreadsheet is consolidated (rolled up) into its corresponding business entity’s spreadsheet, and all business-entity spreadsheets are then consolidated, a rollup, into a corporate spreadsheet (see the rollup sketch below). Monthly budget-versus-spend reports use a dashboard format with clickable graphics for drilldown to the underlying departmental data, with heavy emphasis on UI/UX. The objective is to identify overspend and underspend and, with simulation, project year-end impact; the dashboard is designed so that a click on a datapoint shows the underlying data, often answering what caused the anomaly, the skew of actual versus budget. Designed for the CIO, CTO, CFO, and CEO, it reinforces the value of a good UI and UX…
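A hedged sketch of the rollup step described above, adding each departmental workbook's summary range into the corporate consolidation sheet; the paths, sheet names, and ranges are placeholders, not the originals:

    ' Consolidate departmental budget workbooks into the corporate rollup:
    ' each G/L line in the corporate sheet accumulates the matching line
    ' from every departmental summary.
    Public Sub ConsolidateBudgets(deptPaths As Collection)
        Dim p As Variant, wb As Workbook, i As Long, target As Range
        Set target = ThisWorkbook.Worksheets("Corporate").Range("B2:B50")
        For Each p In deptPaths
            Set wb = Workbooks.Open(CStr(p), ReadOnly:=True)
            For i = 1 To target.Cells.Count
                target.Cells(i).Value = target.Cells(i).Value _
                    + wb.Worksheets("Summary").Range("B2:B50").Cells(i).Value
            Next i
            wb.Close SaveChanges:=False
        Next p
    End Sub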

 

Henkel, Unilever, National Starch and Chemical Company (NSC), Bridgewater, NJ

As Senior Analyst and ERP Architect / Developer, embedded with operational areas including the Supply and Demand Planning teams.

Developed an RPA early-warning capability (aka eWarning) using higher-functioning custom Excel VBA UDFs and SQL, with ADO/ODBC connectivity to Oracle to acquire historical sales data, inventory, and factory production schedules from a Data Warehouse, which performs ETL transfers of CSV files, i.e., feeds, from SAP into this new system. The app has an ability to detect breaks in ordering patterns; this capability was named “eWarning”. Its job was to alert the planning community and customer service (CSRs) to reach out to a customer (here, customer means big Fortune-class companies) when an order is not received when expected. In the ABC analysis of things, these are typically the ‘A’ customers, the small share of customers that account for the bulk of Unilever NSC’s business.

Almost always at the most inopportune times, a customer or its systems might fail to place orders in a timely manner for any number of reasons. Much of the time, the Unilever NSC CSR (the customer service rep) reaches out to the customer and gets the order. Outcomes are fed back to the demand and supply teams, and forecasts are revised as needed, e.g., if the customer chooses not to place an order. Anecdotal stories from CSRs recount how a customer had sometimes simply forgotten to submit an order and was very happy that NSC could know this... Or the customer had changed their production schedule and failed to communicate it to NSC. This was a real bonus for NSC. A customer that fails to place an order can shut down its own operations, especially one that operates with a just-in-time ordering strategy… Bad for them and bad for NSC…

Re-tooling an NSC factory to make unscheduled product is expensive: each category of product has a different composition, causing factory downtime and costly, time-consuming equipment retooling and reconfiguration to re-stage the factory and suppliers for a different category of product. It is not easy to change the production schedule without major disruption to NSC’s factory output, and this underscores the importance of an accurate forecast as a key to efficient use of manufacturing capacity, i.e., production planning and capacity planning. The early-warning logic makes heavy use of interval, amplitude, and order rate and frequency; a simplified sketch follows.
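A simplified sketch of that interval test, assuming a range of a customer's historical order dates; the tolerance and names are illustrative, and the production logic also weighed amplitude and order rate:

    ' Raise an alert when today is past the customer's last order date plus
    ' their average historical order interval plus a tolerance in days.
    Public Function OrderOverdue(orderDates As Range, _
                                 Optional ByVal toleranceDays As Long = 3) As Boolean
        Dim i As Long, n As Long, sumGap As Double, avgInterval As Double
        n = orderDates.Cells.Count
        If n < 2 Then Exit Function             ' not enough history to judge
        For i = 2 To n
            sumGap = sumGap + (orderDates.Cells(i).Value - orderDates.Cells(i - 1).Value)
        Next i
        avgInterval = sumGap / (n - 1)
        OrderOverdue = Date > orderDates.Cells(n).Value + avgInterval + toleranceDays
    End Function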

  

When I was asked to explain the initial concept and demonstrate eWarning’s value (value-added) to senior management, and to prove that it would work, I worked ‘round the clock to create what became known as the concept car, a working Excel application that far exceeded everyone’s expectations. What I had developed as the Unilever NSC lead architect initially had nothing to do with the overall efficiencies later realized across the many business areas that benefited from it: I built it to protect the forecast in the event that the assumptions behind producing the forecast failed to materialize. I invented eWarning to protect the forecasts because I had led the project that built the forecasting system; jokingly, I’d say my reputation was at stake. That was my sense of humor, but in truth, eWarning raises a “heads-up” that gives planners time to react and, hopefully, meaningfully re-use what would otherwise have been seen as unused or unspent manufacturing capacity… an unforced error avoided.

The eWarning app studies every customer’s ordering and shipment patterns (their typical lead time when placing orders), and in doing so it became a tactical but also strategically important operational tool and a competitive advantage for NSC… It warns planners and customer service with sufficient lead time and averts what are big logistics problems that had come to be accepted as the cost of doing business. This Excel workbook filled a gap that no one knew existed and solved a problem that was presumed not solvable. NSC IT was unable to make a business case to recreate this process on their conventional technology platforms; I recall IT saying that reproducing what eWarning (the Excel workbook) could do would be a multi-year effort for a sizable staff. IT, not normally a fan of Excel and not positioned to take this task on, did something unprecedented: it embraced this Excel app, and so began a growing respect throughout IT for Excel and VBA.

To predict the next order date, eWarning uses a form of rate-over-time to estimate and anticipate the next order-date datapoint, with algorithmic detection. It studies order interval and amplitude: when to expect orders, at what frequency, and in what amounts (the amplitude). We are talking about tank-truck and railcar-sized shipments of product, or barrels, or tote-bins; not getting it right is costly. But I built eWarning for selfish reasons, I would say... I was the architect of NSC’s forecasting system and had faith in it to always produce a sensible forecast, so I built eWarning to keep an eye on NSC customer buying patterns: if their ordering patterns changed, or something as simple as a forgotten order occurred, it would be costly for both NSC and the customer, possibly forcing a shutdown to reconfigure factories (costly, depending on the varying equipment needs of each production run). eWarning would in fact reliably raise the alert, giving the planner(s) sufficient time to adjust schedules with little to no disruption to factory output. I built eWarning to protect the forecasts. :)  SAP to this day cannot support these capabilities. Forecasting is the Demand Planning team’s primary planning tool.
It drives when to make product, how much to make, and in which regional warehouse or cross-dock to place product most efficiently for the final leg of its journey to the customer. Forecasting produces a forecast, checks inventory positions and where the inventory is located in proximity to the customer, and adjusts production output to be net of current inventory positions. Transportation and shipping costs are expensive, even more so now; not storing and warehousing product efficiently and logistically by geography means substantial added shipping costs. This was a great project. I was embedded with the Demand and Supply teams and accepted as one of them, part of their team. That meant the world to me…

I have an extensive background in Forecasting, Predictive Analytics, Marketing, Metrics, Inference Engines, and Econometric Modeling as a developer and programmer, all on Excel/VBA technology platforms and with use of RPA/ML and AI automation that drives model-fitting, i.e., the running of thousands of simulations in which the process, on its own, decides the best algorithm, choosing from Time-Series, Simple Regression, Multiple Regression, Bayesian Linear techniques, and Single, Double, and Triple Exponential Smoothing. By itself, the RPA drives a secondary process that determines the optimal alpha, beta, and gamma degrees of dampening, and sometimes it determines that “simple is better”, that a moving average, weighted moving average, or simple average is best; this is RPA’s way of punting when no algorithm, signal, or method exists within the data it is working with...

Using the aforementioned algorithms, it iteratively runs these simulations for each of the thousands of sets of historical data it works with, and chooses the winning model, the one algorithm that wins the contest among the thousand-plus simulations, a thousand-plus passes at each set of data. The RPA chooses the best algorithm as the one simulation that most closely mirrors the signature, the signal, of the most recent months of historical data, a proxy that represents the future and is essentially the baseline for the purpose of these simulations. It uses a process commonly known as back-testing: it looks back at years of historical data (the actuals), then feeds from the oldest of those years, in this case data coming from SAP via an ETL handshake. The RPA puts this data through its paces, the simulations, using every model and smoothing-constant combination, and using these accuracy metrics, MAPE, APE, MAD, StdDev (sigma tolerance), least-squares MSE (derived from a sum of squared errors), and a method I invented, Degrees of Accuracy, it chooses the one simulation that demonstrates the smallest variance: comparing its signal with the signal of the most recent months of historical data (the proxy representing the future), both curves, the proxy and the winning simulation, in their trend, slope, every bump, cycle, and seasonal feature, should appear to move in lockstep, collinear… And voila’, this is model-fitting. A minimal sketch follows below.

This process almost never gets fooled by noisy data, the errant bump (the anomaly) that inevitably exists in historical data, i.e., the one-off events… The project was deemed necessary because SAP forecasts were not sufficiently sophisticated to achieve an acceptable degree of accuracy… Each month the process evaluates itself (A-F) against the previous month(s), measuring its own accuracy; it produces a report for its administrator, a scorecard, and re-model-fits those data-sets where the RPA thinks it can do a better forecast going forward... I have built many variations of this methodology dating to my days at U of P, spending considerable time at the University City Science Center on Market St, on the West Philadelphia campus…
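A minimal sketch, under simplifying assumptions, of the back-testing contest for just one of those models: single exponential smoothing, scored one-step-ahead by MAPE over a holdout of recent months, with the best alpha kept. The production process ran the same kind of contest across all of the algorithms listed above; the names here are illustrative:

    ' Grid-search alpha for single exponential smoothing, scoring each
    ' candidate against the most recent holdout months (the proxy for the
    ' future) by mean absolute percentage error, and returning the winner.
    Public Function BestAlpha(history As Range, ByVal holdoutMonths As Long) As Double
        Dim n As Long, alpha As Double, bestErr As Double
        n = history.Cells.Count
        bestErr = 1E+308
        For alpha = 0.05 To 0.95 Step 0.05
            Dim level As Double, i As Long, apeSum As Double, cnt As Long
            level = history.Cells(1).Value
            apeSum = 0: cnt = 0
            For i = 2 To n
                If i > n - holdoutMonths And history.Cells(i).Value <> 0 Then
                    ' the one-step-ahead forecast is the current smoothed level
                    apeSum = apeSum + Abs(history.Cells(i).Value - level) _
                                      / Abs(history.Cells(i).Value)
                    cnt = cnt + 1
                End If
                level = alpha * history.Cells(i).Value + (1 - alpha) * level
            Next i
            If cnt > 0 And apeSum / cnt < bestErr Then
                bestErr = apeSum / cnt
                BestAlpha = alpha
            End If
        Next alpha
    End Function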

I developed several large-scale automatons... one of many projects working with RPA/ML and AI as the developer. This particular application distinguishes itself from other current-state RPA: a true automaton, a snap-on tool that attaches to existing workbooks, e.g., trading apps that feed from live data, local web-farms, Bloomberg, Reuters, and numerous URLs among the many realtime internet information providers, and does so exclusively on Excel/VBA SQL technology platforms. Several thousand lines of VBA code, designed to self-install and attach to existing Excel apps. Essentially an add-in, an xlam, that self-installs in a matter of seconds: it uses VBA code that writes VBA code, embedding it via the VBE inside the receiving workbook’s Activate event during the robot add-in’s (xlam) Open event.

The app self-studies live data using ANOVA and qualitative, quantitative, algorithmic, stochastic, and all best-known statistical methods. This add-in is a true robot that operates as an employee; essentially, with instantiation, it is replicable, scalable general staffing: quants, analysts, middle-office staff, where each instance’s role, as one example, is to monitor an Excel desktop watching live ticker data, with an ability to spot what it determines is an actionable event, e.g., a trading opportunity... The advantage versus a human is that this app never blinks, takes no bathroom breaks, a workaholic… It learns and studies on its own “what normal looks like”. It gains perspective, a context; essentially it studies every digital signature, every change in the data, gaining an innate understanding of normal and the ability to identify an anomaly: a datapoint or series that exceeds sigma tolerance, an outlier that deviates outside what it knows is normal variation, ergo a potential actionable event, i.e., a new-normal moment.

Equipped with this ability, it communicates the observed event instantly to SMEs, stakeholders, and actionaries (the decision makers), the traders, with relevant charts, essentially snapshots of cells (before/after), and artifacts that tell the story of what triggered the alert, via email. It does this with use of HTML and an Outlook object embedded in its VBA via OLE. Seriously advanced VBA that makes heavy use of Intersect() functions (just one example), including reaching inside the data-series collections, properties hidden deeper inside an Excel chart or graph object. These collections are pointers that tell the chart where, in which cells, to find the data it feeds on, which also helps the robot know which charts are relevant to the story. Snapshots of those charts, graphs, and dashboards (the artifacts) are sent either as attachments or embedded inside these heads-up email alerts, in fractions of a second, sending the email to the actionary, a trader, a managing director, and/or sending a pulse to a back-office mechanism to make the trade, or initiating the trade by itself; generally speaking it alerts the actionary, in this case a back-office trading app, or sends a text message or email to the responsible persons... A hedged sketch of the alert path follows.
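A hedged sketch of the alert path only: late-bound Outlook automation sending an HTML heads-up email with a snapshot of the relevant chart attached. The recipient, subject, and chart handling are placeholders, and the VBE code-injection half of the add-in is omitted:

    ' Export the relevant chart to a temp PNG and email it with the story
    ' of what triggered the alert, via the Outlook object model (OLE).
    Public Sub SendAlert(ByVal recipient As String, ByVal story As String, cht As Chart)
        Dim ol As Object, mail As Object, png As String
        png = Environ$("TEMP") & "\alert_chart.png"
        cht.Export Filename:=png, FilterName:="PNG"   ' snapshot artifact
        Set ol = CreateObject("Outlook.Application")
        Set mail = ol.CreateItem(0)                   ' 0 = olMailItem
        With mail
            .To = recipient
            .Subject = "Heads-up: actionable event detected"
            .HTMLBody = "<p>" & story & "</p>"
            .Attachments.Add png
            .Send
        End With
    End Sub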

 

Earlier Experience

University of Penna (DRL/UCSC, Market St, Phila), Math Dept: as a student, assistant to the head Math professor, C. Bergey; Fortran/C/PL/VB/Java student advisor, tutor, and mentor; onsite student support at the University City Science Center (Students’ Computer Center)…