11 Jan

My Reporting Structure At A Prior Company

For a recent job interview I put together a short 2-page summary of the things I’d done at Higgins. Mainly it is a data flow diagram showing where the data started and where it ended. I was involved in all of it (OK, not every single piece; I had an IT manager who liked command-line programming), but for the most part I did all of the design, programming, testing and implementation for our reporting.

Reporting (PDF)

What follows is an explanation that I sent to a hiring manager.

The main business ERP was Falcon, developed by a company called Activant (now Epicor). It was a lumber-based ERP (used by Higgins and Golden State) that was developed in and ran on Progress (the database and language are tied together in many ways). I don’t have the total database size, but it would have been several hundred gigabytes. This is where the majority of our data was stored. The Pella side of the business was much smaller and did most of its analysis in an Access database, which was merged with our data.

Data was pulled from the Falcon ERP in multiple ways and used in multiple systems as follows:

  • Ad-hoc reporting using Cyberquery – This was standard reporting (Excel, PDF, command line, automated emails, etc.).
  • Cyberquery hold files – These were flat-file data extracts that were very fast. We used these extensively.
  • Cyberquery data extracts – These were CSV files that were eventually imported into our intranet (Horde application framework – LAMP based) for reporting, into SugarCRM (used by purchasing), into SharePoint (where KPIs were displayed), and into web-based reporting for our online customers.
  • We did occasionally use an ODBC connection to access the data directly from PHP, but it was awkward, so in general we stuck to exporting it into MySQL.

For the most part we moved the data around using UNIX/Linux commands (rsync, cat, mysqlimport). Cyberquery also let you create macros that ran AIX commands.
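As a rough sketch of that extract-and-load flow (the host, paths, file names and the `reporting`/`sales` database and table are all hypothetical stand-ins, and the actual mysqlimport call is left commented out so the sketch runs anywhere):

```shell
#!/bin/sh
# Sketch of the flow described above: Cyberquery wrote CSV extracts on the
# AIX box, rsync copied them over, cat merged them, and mysqlimport loaded
# the merged file into MySQL for the LAMP intranet. All names here are
# made up for illustration.
set -e

EXTRACT_DIR=/tmp/cq_extracts    # where the pulled CSV extracts land
mkdir -p "$EXTRACT_DIR"

# 1. Pull the extracts from the ERP host. In production this was roughly:
#      rsync -av erp-host:/cq/extracts/ "$EXTRACT_DIR/"
#    Here we just create two sample per-warehouse files instead.
printf '1001,GSF,2500.00\n' > "$EXTRACT_DIR/sales_west.csv"
printf '1002,HIG,1800.00\n' > "$EXTRACT_DIR/sales_east.csv"

# 2. Merge the per-warehouse files into one load file named after the table
#    (mysqlimport derives the target table name from the file name).
cat "$EXTRACT_DIR"/sales_*.csv > "$EXTRACT_DIR/sales.csv"

# 3. Load into MySQL. Commented out here since it needs a live server:
#      mysqlimport --local --fields-terminated-by=',' reporting "$EXTRACT_DIR/sales.csv"

wc -l "$EXTRACT_DIR/sales.csv"    # sanity check on the merged row count
```

Scheduling each step (cron on the Linux side, or a Cyberquery macro running AIX commands on the ERP side) would refresh the MySQL copy on whatever cadence the reports needed.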

This data wound up in two main interfaces: our company websites and our intranet, which ran on the Horde application framework. The following are all LAMP based except for some minor reporting done using SQL Server and ASP.

  • Our company websites (www.goldenstateflooring.com and www.higginshardwoods.com), where customers could see their statements, invoice history and payment history, and view product information on our blowout SKUs. Everything on those sites was designed and built by me except the appearance.
  • One-Stop Reporting (see the attachment for more information). The concept behind One-Stop was to take multiple reports from our ERP (which often had very poor indexing) and combine them into a single LAMP-based interface, improving efficiency and speed (in some cases reducing a report from hours to seconds). It was focused on four specific areas: purchasing, sales, credit and customer data. For instance, One-Stop Sales let you select from 15+ parameters (inside rep, outside rep, warehouse, SKU) and run 6 different reports built around some of the same factors, so it was possible to see sales by rep by warehouse, sales by rep for a given SKU, sales by inside rep for a SKU, etc. Data was updated once a day.
  • There was also customized sales reporting for the outside sales team, including automated emails, that let a salesperson view all of their sales, customer and credit information. Management could access it to see a salesperson’s performance.
  • We also exported data to SharePoint and SugarCRM. The latter was used by purchasing, and we exported PO data to it; the former displayed some simple KPIs for individual warehouses.

This is a bit long, but it does clarify the process we were using.