11 Jan

Golden State Flooring

Note: The site I built and describe here is currently dead. The company has gone into bankruptcy and I have no association with the current group calling themselves Golden State Flooring. Their site is not the site that I worked on for years. The best I can offer as far as viewing it goes is the Internet Archive. Unfortunately, only about a third of the pages work.

If I can get a copy of the original site, I’ll post it here.

Golden State Flooring is a website that served flooring dealers and contractors throughout California. I was responsible for, well, everything on it except its current design. I did the original designs and, I’ll admit, design is not my strong suit.

Specifically, the site provided customer credit information, pricing, promotions and other items for customers. The back-end allowed an account to be limited in its access to certain product lines, or, for instance, a credit person to see a customer’s aging and purchase information without a salesperson being able to see it. The site and the database running it also tied into the Higgins Hardwoods site. In other words, a user could log into either site and see mostly, but not always, the same information. Both companies used the same ERP, and shared some customers, so it made sense to tie the two together in that way. Today, those sites would be built around Joomla or Drupal, and my life would have been a bit easier in that regard. But I started working on these in 2003, built the back end somewhere around that time, and, well, those two products weren’t as mature then as they are today.

One question I always get asked is: did you do e-commerce? The answer is a resounding no. And there are a couple of solid reasons for that. First, we didn’t sell directly to the public. We sold to flooring dealers and contractors or cabinet shops (Higgins). So giving the appearance that we sold online, especially in the flooring game, was problematic.

The second reason we didn’t sell online is that our data was a mess. The ERP did a very poor job with product attributes. Basically, it didn’t have any, so we didn’t have any. Flooring, which has at least a dozen potential attributes (species, width, length, warranty, color, stain, etc.), had all of that information stored in a SKU description field, where Red Oak might be “RO”, “Red Ok”, “Red Oak”, “Rd Oak” and so forth. As such, every SKU would have had to be recreated with a cross-reference outside the ERP, and the will to do such a thing was lacking.
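To give a sense of what that cross-reference would have involved, here’s a minimal Python sketch that maps the free-text species fragments from the SKU description field to a canonical value. The variant spellings are the ones mentioned above; the alias table and function name are invented for illustration and aren’t from the real system.

```python
# Hypothetical cross-reference table: free-text fragments from the SKU
# description field mapped to a canonical species. In reality, every SKU
# would have needed an entry like this, maintained outside the ERP.
SPECIES_ALIASES = {
    "ro": "Red Oak",
    "red ok": "Red Oak",
    "red oak": "Red Oak",
    "rd oak": "Red Oak",
}

def canonical_species(description: str):
    """Return the canonical species for a description fragment, if known."""
    return SPECIES_ALIASES.get(description.strip().lower())

for raw in ("RO", "Red Ok", "Rd Oak", "Hickory"):
    print(raw, "->", canonical_species(raw))
```

Multiply that by a dozen attributes and thousands of SKUs and you can see why the will was lacking.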

We did come close, building an Overstock section that allowed a visitor to request more information on a specific SKU. It took months to build and, as near as I can tell, is now broken.

Finally, the sites were LAMP based and supported around 2,500 registered users. That number, while small by many standards, represented significant market penetration in the flooring dealer and contractor business. I’m amazed at how well we did, most of which is owed to a very determined vice president who pushed the sales team to sign people up.

11 Jan

My Reporting Structure At A Prior Company

For a recent job interview I put together a short 2-page summary of the things I’d done at Higgins. Mainly this is a data flow diagram showing where the data started and ended. I was involved in all of it (ok, not every single piece, I had an IT manager who liked command line programming) but for the most part I did all of the design, programming, testing and implementation for our reporting.

Reporting (PDF)

What follows is an explanation that I sent to a hiring manager.

The main business ERP was Falcon, developed by a company called Activant (now Epicor). It was a lumber-based ERP (used by Higgins and Golden State) that was developed in and ran on Progress (the database and language are tied together in many ways). I don’t have the total database size, but it would have been several hundred gigabytes. This is where the majority of our data was stored. The Pella side of the business was much smaller and did most of its analysis in an Access database. This was merged with our data.

Data was pulled from the Falcon ERP in multiple ways and used in multiple systems as follows:

  • Ad-hoc reporting using Cyberquery – This was standard reporting (Excel, PDF, command line, automated emails, etc.).
  • Cyberquery hold files – These were flat-file based data extracts that were very fast. We used these extensively.
  • Cyberquery data extracts – These were CSV files that were eventually imported into our intranet (Horde application framework – LAMP based) for reporting, CRM (purchasing used SugarCRM), SharePoint (KPIs displayed there) and web-based reporting for our online customers.
  • We did occasionally use an ODBC connection to access this data directly from PHP, but it was awkward, so in general we stuck to exporting it into MySQL.
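To make the extract-and-import step concrete, here’s a minimal Python sketch of that flow. We actually loaded the CSVs into MySQL (via mysqlimport); SQLite stands in here so the example is self-contained, and the column names and figures are invented.

```python
import csv
import io
import sqlite3

# A stand-in for a nightly Cyberquery CSV extract (normally a file on disk).
extract = io.StringIO(
    "invoice,customer,amount\n"
    "1001,ACME FLOORS,250.00\n"
    "1002,BAYSIDE CABINETS,975.50\n"
)

# Load the extract into a SQL table, as we did for intranet reporting.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE invoices (invoice TEXT, customer TEXT, amount REAL)")
rows = [(r["invoice"], r["customer"], float(r["amount"]))
        for r in csv.DictReader(extract)]
db.executemany("INSERT INTO invoices VALUES (?, ?, ?)", rows)

# Once in SQL, the web-based reports were just queries.
total = db.execute("SELECT SUM(amount) FROM invoices").fetchone()[0]
print(total)  # 1225.5
```

The real pipeline did the same thing at a larger scale: flat files out of the ERP at night, SQL tables on the LAMP side by morning.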

We moved the data around, for the most part, using UNIX/Linux commands (rsync, cat, mysqlimport). Cyberquery also allowed you to create macros which ran AIX commands.

This data wound up in two main interfaces (company websites and our intranet which ran on the Horde application framework). The following are all LAMP based except for some minor reporting done using SQL Server and ASP.

  • Our company websites (www.goldenstateflooring.com and www.higginshardwoods.com), where customers could see their statements, invoice history and payment history, and view product information on our blowout SKUs. Everything on those sites was designed and built by me except the appearance.
  • One-Stop Reporting (see the attachment for more information). The concept behind One-Stop was to take multiple reports from our ERP (which often had very poor indexing) and combine them into a single interface (LAMP based), improving efficiency and speed (in some cases reducing a report from hours to seconds). It was focused on four specific areas: purchasing, sales, credit and customer data. For instance, One-Stop Sales allowed you to select from 15+ parameters (inside rep, outside rep, warehouse, SKU) and run 6 different reports based around some of the same factors. So it was possible to see sales by rep by warehouse, sales by rep for a given SKU, sales by inside rep for a SKU, etc. Data was updated once per day.
  • There was also customized sales reporting for the outside sales team, including automated emails, that allowed a salesperson to view all of his sales, customer and credit information. Management could access it to see a salesperson’s performance.
  • We also did some work exporting data to SharePoint and SugarCRM. The latter was used by purchasing, and we exported PO data to it; the former held some simple KPIs for individual warehouses.
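The parameter-driven idea behind One-Stop can be sketched in a few lines. This is a hypothetical reconstruction, assuming a pre-aggregated daily sales table; the table, column names and figures are invented, and the real system was PHP against MySQL.

```python
import sqlite3

# Invented stand-in for the pre-aggregated table refreshed once per day.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE daily_sales (rep TEXT, warehouse TEXT, sku TEXT, amount REAL)")
db.executemany(
    "INSERT INTO daily_sales VALUES (?, ?, ?, ?)",
    [
        ("Smith", "Sacramento", "RO-3-NAT", 1200.0),
        ("Smith", "Fresno", "WO-5-NAT", 800.0),
        ("Jones", "Sacramento", "RO-3-NAT", 450.0),
    ],
)

def sales_report(rep=None, warehouse=None, sku=None):
    """Total sales filtered by whichever parameters the user selected."""
    clauses, params = [], []
    for col, val in (("rep", rep), ("warehouse", warehouse), ("sku", sku)):
        if val is not None:
            clauses.append(f"{col} = ?")
            params.append(val)
    where = " WHERE " + " AND ".join(clauses) if clauses else ""
    sql = "SELECT COALESCE(SUM(amount), 0) FROM daily_sales" + where
    return db.execute(sql, params).fetchone()[0]

print(sales_report(rep="Smith"))                             # 2000.0
print(sales_report(warehouse="Sacramento", sku="RO-3-NAT"))  # 1650.0
```

Because every report hit one indexed summary table instead of the ERP’s raw files, a query like this came back in seconds rather than hours.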

This is a bit long but it does clarify the process we were using.

10 Jan

The Statements Don’t Add Up

There’s nothing like a bit of craziness to start off a website with, especially in the context of a perfectly serious business environment.

A few years ago our flooring VP handed me a statement for one of our largest customers, along with several bad words. I didn’t really have the time, as I was supposed to be working on websites, but I decided to take a look and see if I could make sense of it. The statement was 10 pages long and very convoluted, and the VP was right: it would be hard for a customer to reconcile. Actually, it would be impossible to reconcile, because the detail didn’t add up to the totals.

Well, as it turns out, I was apparently the first person ever to add one of these things up (not strictly true, as I’ll explain in a moment, but close). No one had any idea of the true depth of the problem, and no one wanted to deal with it, as I learned while trying to bring it to the attention of the people who logically should have been concerned that, when it came to our statements, 1 + 1 might equal anything. So, let’s go through this with a bit more detail, and watch carefully, because there are a lot of moving parts.

First, the totals on a customer statement (aging buckets and grand total) were taken not from the detail on the statement but from the customer master file (cust table).

Second, the totals in the customer master file were regenerated every time an aging was run, as of whatever date the aging was run for. So, if an aging was run for a date two years back (likely what happened in the example I looked at), guess what the customer master file buckets were updated to? Yup, two years ago!

By itself, this would be enough to make you wonder how much we paid for that system, but it gets better. We had a long history of decentralized statement distribution. In other words, each individual location mailed out its own statements, or didn’t. So what would happen is that at month-end we’d run a company-wide aging, which set the buckets, and the credit managers would come in on the 1st of the month, run statements, and mail them out. Had that been all that happened, we’d have been fine, but it was never that clean. Credit managers, before sending out statements, often ran an aging for customers during the day (especially on the 1st of the month), which reset the customer master file to the date they ran the aging for. Then, after all of that, they would run the statements and happily mail them off and, well, most of them would add up.
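To see why the as-of date mattered so much, here’s a small Python sketch that computes aging buckets the way the system did conceptually: relative to the date the aging is run for, with the results overwriting the customer master file. The invoice data and bucket boundaries are invented.

```python
from datetime import date

# Invented open items for one customer: (invoice date, amount).
open_invoices = [
    (date(2010, 11, 15), 500.0),
    (date(2010, 12, 20), 300.0),
    (date(2011, 1, 5), 200.0),
]

def aging_buckets(as_of):
    """Bucket open invoices into current / 31-60 / 61+ days as of a date."""
    buckets = {"current": 0.0, "31-60": 0.0, "61+": 0.0}
    for inv_date, amount in open_invoices:
        days = (as_of - inv_date).days
        if days <= 30:
            buckets["current"] += amount
        elif days <= 60:
            buckets["31-60"] += amount
        else:
            buckets["61+"] += amount
    return buckets

# Run the aging for two different dates and the buckets disagree -- and
# whichever run happened last is what the statement totals were copied from.
print(aging_buckets(date(2011, 1, 10)))
print(aging_buckets(date(2011, 3, 1)))
```

Same invoices, different as-of dates, different totals: that’s the whole bug in miniature.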

Really, the only logical response to all of this is “this is insane”. And, I would agree.

Anyway, we eventually fixed all of it, and for me it was a blessing, as it got me into the BI side of the house. I’ve included a list showing all of the problems and the solutions we implemented. I honestly doubt you will ever run into something this bad, but if you do, just know that you aren’t alone.

Problems:

  1. Statement totals came from the customer master file, not the statement detail.
  2. An aging wrote totals to the customer master file.
  3. An aging, especially company-wide, took a very long time to run (a full aging could take a couple of hours).
  4. Statements were hard to read regardless.

Solutions:

  1. Run an aging during month-end (we’d always done this).
  2. Print statements to document imaging immediately after step 1. This prevented anything else from changing them.
  3. We changed the format of the statements (our vendor had more than one) to one that made more sense for our customers.
  4. Created a Cyberquery report (based on data pulled from the aging during the night) that gave our credit managers a daily aging total within our ERP. This ran in a few seconds.
  5. Pulled this data into our intranet and built several web-based tools to view it and other data.
  6. Pushed this data to our websites so our customers could view both their credit history and an open item statement. This data included their current balance, historical transaction data and line item detail.
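For completeness, the check that exposed the first problem is trivial to express: total the statement detail and compare it to the figure copied from the customer master file. The numbers here are invented.

```python
# Open items as printed on the statement (invented figures).
statement_detail = [1200.00, 450.75, 89.50]

# Stale bucket total copied from the customer master file by an old aging.
customer_master_total = 1500.00

detail_total = round(sum(statement_detail), 2)
if detail_total != customer_master_total:
    print(f"Statement does not reconcile: detail {detail_total} "
          f"vs master {customer_master_total}")
```

Nothing more sophisticated than that, and apparently nobody had ever run it.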