Excel is a much-maligned product, but it has stayed on top since it took over the lead position in business spreadsheets from Lotus 1-2-3 sometime in the early 1990s. The main reason often suggested for its dominance is the power of the Visual Basic for Applications (VBA) scripting language for creating macros. The ability to extend Excel with macros turns it from a spreadsheet into a fully fledged application development platform.
I recall investigating alternatives to Excel and the wider office suite at a previous organisation I worked for. OpenOffice was the front-runner at the time; I suppose Google Docs would be considered now, but this was a few years back. The justification for switching was the hundreds of thousands of dollars that would have been saved in license costs. The downfall of the plan was mainly centered on file compatibility issues (which were minor and had work-arounds), but the clincher was the macros. OpenOffice had its own implementation of Basic, but the object model was totally different. There was a migration tool to convert Excel macros into OpenOffice macros, but on most of the workbooks we tested it didn’t do the job properly. There were so many business-critical macro-enabled workbooks across the organisation that it was deemed too hard to make the switch. The very reason for Excel’s dominance was what then embedded it into the organisation: we were addicted to it and couldn’t kick it.
The scourge of Excel
The bane of the IT department is the abundance of uncontrolled Excel workbooks that seem to multiply across the organisation, get out of sync with current data, and lead to siloed reporting across different corporate divisions. At another organisation I worked for, we (in IT) created a macro-enabled workbook for a specific auditing requirement, which was then handed to an administrator to use and manage. Shortly afterwards, a number of support calls came in about the workbook “suddenly not working”. After investigating we found the macros were gone. After even more investigation we found that the administrator was copying the workbook from the network location to their laptop, taking it home as a “backup”, and then restoring it to the server the following day. Aside from the fact that the network drive was already being backed up to tape, as well as distributed to other server locations using DFS, this was not a very smart thing to do: in the save-as the user was performing, the workbook was being saved in the non-macro-enabled .xlsx format, which silently strips out the macros.
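Under the hood, a macro-enabled workbook (.xlsm) is just a ZIP archive with an extra part, xl/vbaProject.bin, that holds the VBA project; the plain .xlsx format has no such part, so a save-as into it drops the macros. As a rough illustration (the file paths here are hypothetical), a short Python sketch can check whether a workbook still carries its macros:

```python
import zipfile

def has_macros(workbook_path: str) -> bool:
    """Return True if this Office Open XML workbook contains a VBA project.

    Both .xlsx and .xlsm files are ZIP archives; the macros live in the
    xl/vbaProject.bin part, which only macro-enabled formats include.
    """
    with zipfile.ZipFile(workbook_path) as archive:
        return "xl/vbaProject.bin" in archive.namelist()

# Hypothetical usage: audit a shared drive for workbooks that have
# lost (or never had) their VBA project.
# for wb in Path(r"\\server\share").rglob("*.xls[xm]"):
#     print(wb, has_macros(wb))
```

A scan along these lines could have flagged the administrator’s stripped copy before the support calls started.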
But whose fault is this? There is an argument that advanced IT training is now a necessity for all information workers (meaning anyone who uses a computer as part of their job). On the other hand, we all know that most people aren’t nerdy IT types, preferring to have real conversations with people rather than speaking in 1s and 0s 😉 . So I don’t blame that person for their well-meaning behaviour, but it’s just one example of the problem with Excel.
Balance between flexibility and control
On the one hand, IT wants to control applications for reasons of security, backup, consistency of service, integrity of data, and ultimately the effectiveness and efficiency of business processes. On the other hand, the operations side of the business is frustrated by IT being a hurdle and a bottleneck to getting new applications quickly. There is a lack of understanding on both sides, too. When the business requests new functions from IT and delivery takes a long time for necessary security and administrative reasons, the business is frustrated by the wait. They don’t see these due-diligence tasks as necessary, because they are neither part of their responsibility nor something they are assessed on. The business is mostly assessed on performance and profit generation, where there is no incentive for security, stability and consistency. The IT perspective is that the business is gung-ho, can’t make up its mind what it really wants, and is generally lazy and irresponsible, preferring a slap-dash quick solution to a robust one that won’t require endless support calls to fix. Scarily, these quick-and-dirty applications are sometimes used for critical business decisions, or even to support government policy and ideology, despite being built on flawed logic that never goes through the testing and validation processes that are standard in a proper software development life-cycle (SDLC).
The business resorts to the self-service option because they can’t wait, and then you get a proliferation of Excel workbooks because that’s the tool they have and the skill set is fairly widely distributed amongst staff. Operations staff end up copying and pasting data between applications, letting data get out of date, and generally coming up with very inefficient processes, simply because they are time-poor and under pressure for immediate results. There’s no time to spend optimising a process that might end up saving years of person-hours. Fire-fighting is the norm, not prevention and proactive solutions.
Addressing the problem
Microsoft are well aware of these problems and have proposed various solutions for tackling them. Excel Services for SharePoint is one such example: it enables users to create powerful spreadsheets but then have them hosted centrally on the intranet web portal. This would certainly have avoided the problem with the user taking their own “backup” mentioned earlier, and it gives IT visibility over the application surface area, but it doesn’t fix the robustness, data-integrity and logic-flaw problems.
PowerPivot and other database connections in Excel workbooks are another measure to keep data in sync and up to date, but I find the refresh options lacking, particularly in PowerPivot. It requires the user to first switch to the PowerPivot window, refresh it, then switch back to Excel and refresh that. Worse, neither step can be automated on load, so the user has an annoying process to perform every time they work with that workbook. I’ve written short, to-the-point instructions on how to do this and given training, but later found users not doing both steps of the refresh process. This is frustrating for me, but I don’t blame the users; I blame the process. Add-ins like PowerPivot are extremely powerful, but if they fall outside the object model accessible to VBA then they become relegated to power users only. By that I mean that if many of the functions can’t be automated and guaranteed to run even when the user forgets, then the application can’t be guaranteed to be used appropriately, and the currency and reliability of the data can’t be guaranteed either.
I don’t have a ready solution for this problem but I know that it has to benefit both camps. It has to allow for quick ad-hoc work by end users, but also enable a level of standardization for IT. Hopefully future versions will do that, assuming Excel doesn’t lose its crown to a contender.