May the source be with you, but remember the KISS principle ;-)


Excel 2003 Macro Programming


Excel is not only a commodity spreadsheet; it is also a powerful development platform. You can develop custom VBA applications in which the user never sees the underlying application (Excel). The best version was probably Excel 2003, and many users prefer it to the latest versions.

Excel can help solve simple problems more quickly, and with a better visual feedback loop, than a standalone procedural language like R or an expensive application such as Matlab or Mathematica. Since Excel has such advanced computational capabilities and is ubiquitous in offices and homes, it makes sense to learn what the application can do.

While Microsoft markets this product as user-oriented, in reality it has "professional" capabilities: it is suitable for solving a wide range of numerical problems and can perform fairly complex financial simulations, including simple statistical analysis tasks such as correlating data.

You can also perform curve fitting and regression, solve equations, perform integration and differentiation, and solve both ordinary and partial differential equations. Some linear programming optimization tasks also can be done with Excel.

There are very few high-quality books written about Excel. Most books, including a dozen from O'Reilly, are weak. Moreover, there is also a tremendous amount of low-quality junk books, some of which are rated pretty highly on Amazon (the lemmings effect ;-)

Spreadsheet applications were the killer application that drove the personal computer market. In the early PC days, Lotus 1-2-3 (along with WordPerfect) dominated the productivity software business.

It's probably a safe bet that there's more data stored in Excel spreadsheets than in all the world's databases.

Excel's Data menu offers basic database features, such as sorting, filtering, validation, and data-entry forms. You can quickly import and export data in a variety of formats, including those of database management applications such as Access. The need for macro programming in VBA becomes apparent as soon as your needs for entering, manipulating, and reporting data grow beyond the spreadsheet's basic row-column metaphor.

In cases where Excel's power is not enough, moving to a relational database management system (RDBMS) such as Access might be appropriate. The latter makes creating fairly complex databases with multiple linked tables quick and easy. It also contains a Query by Example wizard (called the Query Design view). Access 2003 and 2007 have a collection of wizards to lead you step-by-step through each process involved in developing and using a production-grade database application.

The first step in Excel macro programming is the creation of user-defined functions (UDFs). Excel makes creating UDFs really easy, and such functions can be a valuable addition to the set of standard functions. See VBA User-defined functions (UDF).

Here is the code to create our user-defined Commission() function:

Function Commission(Fee)
    ' Return 10% of the fee up to 1000, 5% otherwise
    If Fee <= 1000 Then Commission = Fee * 0.1 Else Commission = Fee * 0.05
End Function
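Once the function is saved in a standard VBA module, it can be called from a worksheet cell like any built-in function. Given the code above, for example:

```excel
=Commission(800)     results in 80   (10% of 800)
=Commission(1500)    results in 75   (5% of 1500)
```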

You can store UDFs in several places, and where you store them affects your ability to share them.

There are a lot of useful but little-known tips for Excel usage. See Excel Tips. Here is one:

Ctrl+` in Excel switches between Normal view and Formula view.

See also Automation and programmability - Excel - Microsoft Office. It contains several very useful excerpts from books and articles written by Microsoft staffers.


[Oct 20, 2010] Microsoft Excel Data Analysis and Business Modeling

John Matlock: Non-Traditional Uses of Excel, May 24, 2005
This book is aimed at a level slightly higher than the total beginner. On page one, for instance, he assumes that you understand what the statement SUM(A5000:A5049) means. Then he points out that writing SUM(USSales) would make it easier to understand. He then describes how to name a range of cells -- not exactly what I'd call advanced, but certainly above novice.

The first third, or so, of the book is on these kinds of slightly advanced procedures that you could get out of several Excel books. After that he kind of turns the approach around to things like "Estimating a Demand Curve." This is looking for what you want to do rather than just looking at a technique within Excel.

Many of his chapters have fairly sophisticated subjects, such as "Incorporating Qualitative Factors into Multiple Regression." In this chapter he looks at multiple factors that might go into predicting things like sales or the Presidential race - with fairly surprising results.

In a couple of chapters he has a bit of fun, i.e. Chapter 73 is "Picking Your Fantasy Football Team."

Excel is quite a powerful package. It can be used for a lot of things beyond just filling out an expense account. In this book a lot of non-traditional uses are described in a very light but informative manner. It is a very well done book.

[Mar 3, 2010] Advanced Excel for Scientific Data Analysis


Learn to turn Excel into something that rivals Mathematica using VBA, brains, and a heaping helping of fortitude. (Jack Herrington's review.)

[Nov 4, 2006] Excel 2003 Formulas by John Walkenbach

Very weak and expensive cookbook, November 4, 2006
Reviewer: Nikolai N Bezroukov
This is an eclectic collection of various recipes for Excel. It is a strange "lemmings" effect that the book has such a high rating on Amazon.

The book is essentially a badly written cookbook, as it does not provide the underlying mechanics and key ideas behind the Excel formulas. Chapters are more or less disconnected and most of them can be read in any order.

At the end the reader is left with very few good findings, which probably are not worth the price of the book.

The page count is very deceptive -- a considerable part of the book is fluff: brainless reproduction of basic things that one can find in the help and that are not worth even one dollar. Many examples are very trivial and not worth reading.

Tricks like Appendix B are simply disgusting -- the author just copied a listing of functions that has no practical value whatsoever, to inflate the page count.

Explanations are mostly extremely fuzzy. The author has a real talent for making simple things complex and complex things impossible. Also, this is just a "do as I say" type of cookbook: the author never tries to explain the concepts that are used (the use of absolute addressing vs. relative, the syntax intricacies of the second argument of COUNTIF and similar functions, etc.).

The book also suffers from frequent references to previous versions of Excel, which only distract the reader. One can assume that if the reader really wants to use one of the previous versions of Excel, he can buy the previous edition of the book and save money.

In the few places where things become more interesting, they are also incomplete and/or incorrect (creation of your own VBA functions and collections of functions, usage of array functions like FREQUENCY, etc.).

My impression is that the author is just a book writer and does not have rich real-world experience with Excel -- the experience that is necessary to distinguish between important and trivial things.

As a result he cannot provide the reader any help in grasping the underlying architectural ideas, which are often quite non-trivial (Excel is an extremely powerful analytical tool disguised as a commodity spreadsheet).


Old News ;-)

Professional Excel Development: The Definitive Guide to Developing Applications Using Microsoft Excel and VBA, by Stephen Bullen

XL-Dennis on March 6, 2005
The book targets power users and professional developers, and if You consider Yourself to be in this group then this book is for You.

It's a well written book, with a high technical level and with a good structure.

But as with every book, it has its strong parts as well as weak parts. Weak in the sense that some chapters are overviews only and do not give any depth on the subjects they cover.

The chapters that cover best practices are all excellent. Here the authors share all their experience and knowledge by discussing application structures, notation conventions, advanced techniques for userforms and toolbars, error handling, debugging, and many other things. The chapters also cover well how to create and use add-ins and so-called user-defined functions (UDFs).

The chapters that deal with VB 6.0 & Excel are all excellent too. A welcome contribution is that this book leverages many of the good techniques we use in MS VB 6.0 into Excel, especially when it comes to userforms.

The book covers in a nice way how to create & use classes, create & use ActiveX DLL and front-loaders for Excel in MS VB 6.0.

A whole chapter is devoted to Visual Studio Tools for Office (VSTO), but it also points out all the present shortcomings of VSTO.

As with every book nowadays, this book also includes a chapter about XML, which is well written and with a high technical level. However, the chapter is rather short, which tends to compress the content.

If the above is what You're looking for then this book is a must have.

The following chapters provide overviews of the subjects:

Worksheet Design

Programming with Database

Data Manipulation techniques

Windows API

Creating XLLs with C#

If You're looking to get more than an introduction or a compressed picture of these subjects, then there exist other books that will give You more.

The book uses an example, PETRAS Timesheet, to exemplify the subject covered in each chapter. Some readers will appreciate it very much while others (like me) will skip it. Anyway, the PETRAS Timesheet is also well worked out, like the rest of the book.

In order to work through both the PETRAS Timesheet case and all the examples the book gives, You need to have access to the following software:

MS Office 2003 Professional (minimum requirement for the VSTO-chapter)

MS Visual Basic 6.0 (no longer available for sale from Microsoft)

Visual Studio.NET (C#)

VSTO (a separate tool that requires Visual Studio.NET)

Kind regards,

Dennis Wallentin aka XL-Dennis

[Jun 14, 2015] Technical Analysis in Excel: SMA, EMA, Bollinger Bands

This article series describes how traders can use Excel to apply technical analysis (TA) to historical market data. It will include computation of some of the most popular technical analysis indicators and implementation of a trading-strategy backtesting spreadsheet (in Part III). Backtesting will involve generation of buy and sell signals based on TA indicators and computation of strategy P&L. We’d like to point out upfront that all computations in these articles will be performed using standard Excel functions available in Excel 2011 and later. We will not be using any VBA/custom Excel macros. This is done on purpose, to keep the spreadsheets simple and the functionality understandable to non-programmers.

In the first part of this article series we will create an Excel spreadsheet in which we will use formulas to calculate some common technical analysis indicators, such as: Simple Moving Average, Bollinger Bands, and Exponential Moving Average. We’ll explain the formulas and include step-by-step instructions below. In addition, we are providing a spreadsheet we’ve created by following the steps listed in this article, so that you can use it for your own market data analysis or as a basis for building your own spreadsheets.

Sample Excel File

Excel file (download) containing formulas for calculation of simple moving average, Bollinger Bands, and exponential moving average as described in this post.

Data File

For this example we’ve got a CSV file with 6 months of hourly SPY data, covering Sep 3, 2013 – Feb 28, 2014. SPY is an ETF tracking the S&P 500 index. We have nearly 2000 data points in this file. The file contains OHLC price columns, volume, and a timestamp column. Disclaimer: this file has been generated using IB Data Downloader.

Data file: historical_data_SPY_1hour_20140301 (text file – to download – right-click and select “Save Linked File As…”)

Simple Moving Average Basic Calculation

Simple Moving Average (SMA) is simply the average price over the last N bars. Let’s calculate the SMA for the close prices from our sample data file.

We’ll be calculating a 20-day moving average based on the SPY close price (column D). Let’s add the column header “SMA-20” in column G, and type the following formula in cell G21 (since row 21 is the first one that has enough data to calculate a 20-day SMA):
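The formula image from the original article did not survive here. Based on the surrounding description (the average of the last 20 close prices in column D, ending at the current row), the formula in G21 is presumably:

```excel
=AVERAGE(D2:D21)
```

Rows 2–21 hold the first 20 bars of data, since row 1 contains the column headers.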



After hitting Return to save the formula, you should see the value ‘164.57’, or close to that, in cell G21. In order to calculate SMA-20 for all of the remaining cells below, just select cell G21, move the cursor over the small square in the lower-right corner of that cell, and double-click it. You should now see values in column G calculated for the remainder of the SPY prices.


Generalizing SMA Calculation

Now we have calculated 20-day simple moving average values in column G. That's great, but what if we want to calculate a 50-day or 200-day SMA now? Updating formula values every time you want to change the SMA range is pretty tedious and error-prone. Let’s make our calculation more generic by adding a “length” parameter. We can start off by storing the SMA period parameter in a separate cell so that we can reference it in our formula.

Here are the steps we followed to implement a generic SMA calculation in our spreadsheet:

  1. Let’s start off by creating a little table on the side where we can store some input parameter values for our indicators.
  2. In cell O1 let’s type “Variable Name”, in cell P1 let’s type “Value”.
  3. In cell O2 let’s type name of our variable: “PERIOD”.
  4. In cell P2 we specify value of the “PERIOD” variable which we’ll be using to specify period length for our generalized SMA calculation. Changing this variable will trigger recalculation of SMA with the current period value. Let’s use value 14 for now.
  5. Let’s type column header value “SMA” in cell H1; column H will contain values for our generic SMA indicator.
  6. In cell H2 enter this formula:
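The formula image did not survive here. Reconstructed from the dissection that follows (PERIOD stored in $P$2, Close prices 4 columns to the left of column H), the formula in H2 is presumably:

```excel
=AVERAGE(OFFSET(H2,1-$P$2,-4,$P$2,1))
```

OFFSET here returns the $P$2-row slice of column D ending at the current row, so the same formula can be copied down the whole column.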



    Let’s dissect this formula. We are now using the value of our PERIOD variable from cell P2. We had to add $ in front of the column letter and row number to freeze the reference to cell P2 as we copy the SMA formula to other cells in column H. We’ve also replaced the absolute reference to the Close price range with the OFFSET Excel function. OFFSET returns a range of cells at a given offset, in rows and columns, from a given “reference” cell. The first parameter is the reference cell (in our case H2 itself); the second is an expression calculating the first row of the range based on the value of the length parameter ($P$2); the third parameter is the column offset to the Close column (-4), where a negative value is an offset to the left of the reference cell and a positive one an offset to the right; the fourth parameter ($P$2) is the height of the range, i.e. the number of rows to average; and the last parameter, with value 1, is the width of the range returned by OFFSET, which in our case is just one column: D (CLOSE).

  7. Save the formula in cell H2 and expand it to the rest of the cells in column H by double-clicking the little square in the lower-right corner of the cell, or by dragging the formula down.
Removing Formula Errors

Now, you will notice that the first several rows in the column have the error value #REF!. This happens because there are not enough rows in our data set to calculate the SMA value, and the range returned by the OFFSET function goes over the edge of the worksheet for those rows. There are a number of techniques for hiding error values in Excel. Some of them involve formulas which return blank or zero values if a cell value contains an error. While this is a perfectly valid technique, it complicates cell formulas and makes them hard to read. Instead, we’ll use conditional formatting to simply hide error values by changing their font color to white. To change the cell’s font color to white and use no error highlighting, follow these instructions:

  1. Select columns H-N
  2. In Excel: Home -> Conditional Formatting -> Highlight Cell Rules -> More Rules.
  3. In the “New Formatting Rule” dialog select “Errors” and in “Format with…” select “Custom format”, then set Fill color to white and font color to white as well.



Bollinger Bands Introduction

Bollinger Bands is a simple but useful indicator providing valuable information on the historical price volatility of a financial instrument, as well as the current price deviation from a moving average. When price moves become more volatile, the bands widen; in periods of relative calm, they come closer together. The position of the current price relative to the bands can also be used to estimate whether the market is overbought or oversold. If the current price is close to or has crossed the upper band, the price is considered to be in overbought territory, while if it is close to or has crossed the lower band, the underlying market is considered oversold.

Basic Calculation

The Bollinger Bands indicator can be calculated using either a simple or an exponential moving average as the basis. Bollinger Bands consist of three data series: a moving average (simple or exponential) and two standard deviation (boundary) lines, one above and one below the moving average, usually at 2 standard deviations from it. An exponential moving average (covered below) gives more weight to the more recent price action, while a simple moving average provides a more stable and less jittery indicator. There are a total of 2 input parameters: 1) the moving average period (number of bars), and 2) the number of standard deviations for the upper and lower bands. In this example we’ll use the simple moving average we already calculated in column H (see instructions in the section above). All that remains is to add columns for the upper and lower bands.

  1. We are still using the 14-day moving average period value. The first row that has enough data for the 14-day SMA is row 15 (since row 1 is used for the column header). The upper band will be in column I, so in cell I15 we type the following formula:
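The formula image did not survive here. Based on the explanation that follows (the SMA in H15 plus two standard deviations of the last 14 close prices), the formula in I15 is presumably:

```excel
=H15+2*STDEV(D2:D15)
```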


    In this formula we are simply adding two standard deviations of the Close prices from cells D2:D15 to the SMA value.


  2. And for the lower band, in column J we enter the following:
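The formula image did not survive here. Mirroring the upper-band formula with a minus sign, the lower-band formula in J15 is presumably:

```excel
=H15-2*STDEV(D2:D15)
```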



    Here the only difference from the previous formula is that we are subtracting two standard deviations from the SMA. The Excel function STDEV() calculates the standard deviation for a series of values. In this case we are multiplying the value by 2 to get 2 standard deviations, and adding/subtracting the result from the moving average to generate the upper/lower band values.

  3. To expand the formulas, just roll over and double-click the small square in the lower-right corner of the cell to replicate the formula for the rest of the data range.

Generalized Bollinger Band Computation

Now, let's generalize the Bollinger Band formulas so that we don’t have to update them every time we want to calculate Bollinger Bands for a different number of standard deviations from the MA, or whenever we change the moving average length.

  1. Let’s add another parameter to our generic variables table on the right of the spreadsheet. Let’s type “Std devs:” in cell O3, and 2.0 in P3.
  2. Next, let’s add the following formula in I15:
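The formula image did not survive here. Reconstructed from the description that follows ($P$3 standard deviations, a PERIOD-based row offset, and a column offset of -5 from column I to the Close column D), the formula in I15 is presumably:

```excel
=H15+$P$3*STDEV(OFFSET(I15,1-$P$2,-5,$P$2,1))
```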


    In this formula we’ve replaced 2 with $P$3, which points to our variable in cell P3 containing the number of standard deviations for the bands, and we calculate the offset based on the PERIOD variable in cell P2.


  3. Similarly in cell J15 let’s now enter this formula:
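The formula image did not survive here. Based on the description that follows (minus instead of plus, and a column offset of -6 from column J to column D), the formula in J15 is presumably:

```excel
=H15-$P$3*STDEV(OFFSET(J15,1-$P$2,-6,$P$2,1))
```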


    The only difference from the formula in the previous step is that we’ve replaced the + after H15 with - (minus), to subtract the number of standard deviations from the SMA, and we had to change the offset to the price column: notice -6 instead of -5 in the “cols” parameter of the OFFSET function, to refer to column D (CLOSE).

  4. Don’t forget to copy new formulas in cells I15 and J15 to the rest of the respective column cells.

    Technical Analysis in Excel - Bollinger Bands

    You can now change values of “PERIOD” and “Std devs” variables in cells P2 & P3, and have SMA and Bollinger Band values automatically recalculated.

[May 29, 2015] Goldman goes astray on Grand Theft Auto and productivity by Matthew C Klein

May 27, 2015 | FT Alphaville

By contrast, consider Microsoft Excel, one of the greatest computer programmes ever created, and among the most integral to global business. It would be ludicrous to think that increasing the number of cells in a worksheet increases the value of new versions by a comparable amount. In fact, many of the most active Excel users prefer the 2003 version, presumably because the genuine improvements are outweighed by various hidden costs.

Similarly, many companies still prefer to use Microsoft’s Windows 7 because the newer versions have features that aren’t particularly useful for workers sitting at desks.

[Dec 27, 2013] There was a problem sending the command to the program error in Excel

Microsoft Excel 2007
  1. Click the Microsoft Office Button, and then click Excel Options.
  2. Click Advanced, and then clear the Ignore other applications that use Dynamic Data Exchange (DDE) check box in the General area.
  3. Click OK.

[May 1, 2013] With Great Power . . . by James Kwak

May 1, 2013 | 13 Comments

By James Kwak

A friend brought to my attention another example of how Excel may actually be a precursor of Skynet, after the London Whale trade and the Reinhart-Rogoff controversy. This comes to us in a research note from several years ago by several bioinformatics researchers titled “Mistaken Identifiers: Gene name errors can be introduced inadvertently when using Excel in bioinformatics.” The problem is that various genes have names like “DEC1” or identifiers like “2310009E13.” When you import those text strings into Excel, by default the former is converted into a date and the latter into scientific notation (2.310009 x 10^13). Since dates in Excel are really numbers behind the scenes (counting from January 1, 1900), those text identifiers have been irretrievably converted into numbers.

This problem is related to what makes Excel so popular: it’s powerful, intuitive, and easy to use. In this case, it is guessing at what you really mean when you give it data in a certain format, and most of the time it’s right—which saves you the trouble of manually parsing text strings and converting them into dates (which you can do using various Excel functions, if you know how). But the price of that convenience is that it also makes it very easy to make mistakes, if you don’t know what you’re doing or you’re not extremely careful.

There are workarounds to this problem, but as of 2004, it had infected several public databases. As the authors write, “There is no way to know how many times and in how many laboratories the default date and floating point conversions to non-gene names have adversely affected an experiment or caused genes to ‘disappear’ from view.”

[Feb 9, 2013] The Importance of Excel By James Kwak

February 9, 2013 | | 35 Comments

I spent the past two days at a financial regulation conference in Washington (where I saw more BlackBerries than I have seen in years—can’t lawyers and lobbyists afford decent phones?). In his remarks on the final panel, Frank Partnoy mentioned something I missed when it came out a few weeks ago: the role of Microsoft Excel in the “London Whale” trading debacle.

The issue is described in the appendix to JPMorgan’s internal investigative task force’s report. To summarize: JPMorgan’s Chief Investment Office needed a new value-at-risk (VaR) model for the synthetic credit portfolio (the one that blew up) and assigned a quantitative whiz (“a London-based quantitative expert, mathematician and model developer” who previously worked at a company that built analytical models) to create it. The new model “operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another.” The internal Model Review Group identified this problem as well as a few others, but approved the model, while saying that it should be automated and another significant flaw should be fixed.* After the London Whale trade blew up, the Model Review Group discovered that the model had not been automated and found several other errors. Most spectacularly,

“After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”

I write periodically about the perils of bad software in the business world in general and the financial industry in particular, by which I usually mean back-end enterprise software that is poorly designed, insufficiently tested, and dangerously error-prone. But this is something different.

Microsoft Excel is one of the greatest, most powerful, most important software applications of all time.** Many in the industry will no doubt object. But it provides enormous capacity to do quantitative analysis, letting you do anything from statistical analyses of databases with hundreds of thousands of records to complex estimation tools with user-friendly front ends. And unlike traditional statistical programs, it provides an intuitive interface that lets you see what happens to the data as you manipulate them.

As a consequence, Excel is everywhere you look in the business world—especially in areas where people are adding up numbers a lot, like marketing, business development, sales, and, yes, finance. For all the talk about end-to-end financial suites like SAP, Oracle, and Peoplesoft, at the end of the day people do financial analysis by extracting data from those back-end systems and shoving it around in Excel spreadsheets. I have seen internal accountants calculate revenue from deals in Excel. I have a probably untestable hypothesis that, were you to come up with some measure of units of software output, Excel would be the most-used program in the business world.

But while Excel the program is reasonably robust, the spreadsheets that people create with Excel are incredibly fragile. There is no way to trace where your data come from, there’s no audit trail (so you can overtype numbers and not know it), and there’s no easy way to test spreadsheets, for starters. The biggest problem is that anyone can create Excel spreadsheets—badly. Because it’s so easy to use, the creation of even important spreadsheets is not restricted to people who understand programming and do it in a methodical, well-documented way.***

This is why the JPMorgan VaR model is the rule, not the exception: manual data entry, manual copy-and-paste, and formula errors. This is another important reason why you should pause whenever you hear that banks’ quantitative experts are smarter than Einstein, or that sophisticated risk management technology can protect banks from blowing up. At the end of the day, it’s all software. While all software breaks occasionally, Excel spreadsheets break all the time. But they don’t tell you when they break: they just give you the wrong number.

There’s another factor at work here. What if the error had gone the wrong way, and the model had incorrectly doubled its estimate of volatility? Then VaR would have been higher, the CIO wouldn’t have been allowed to place such large bets, and the quants would have inspected the model to see what was going on. That kind of error would have been caught. Errors that lower VaR, allowing traders to increase their bets, are the ones that slip through the cracks. That one-sided incentive structure means that we should expect VaR to be systematically underestimated—but since we don’t know the frequency or the size of the errors, we have no idea of how much.

Is this any way to run a bank—let alone a global financial system?

* The flaw was that illiquid tranches were given the same price from day to day rather than being priced based on similar, more liquid tranches, which lowered estimates of volatility (since prices were remaining the same artificially).

** But, like many other Microsoft products, it was not particularly innovative: it was a rip-off of Lotus 1-2-3, which was a major improvement on VisiCalc.

*** PowerPoint has an oft-noted, parallel problem: It’s so easy to use that people with no sense of narrative, visual design, or proportion are out there creating presentations and inflicting them on all of us.

Update 2/10: There is an interesting follow-on discussion that includes a lot of highly-informed technical people, including some who work in finance, over at Hacker News.

35 Responses to The Importance of Excel

  1. Sash | February 9, 2013 at 3:43 pm |

    I think there is a more fundamental problem: the model distorts the nature of the reality. Back in the late 80s I spent a summer in one of NY’s largest real estate firms. At the time spreadsheets were relatively new.

    My boss observed that the spreadsheet made people think that real estate was a bond and not real estate.

  2. cubicamente | February 9, 2013 at 4:11 pm |

    In my opinion, if a ‘model developer’ leaves his users the possibility to copy and paste or something like that, he did a bad job. In my work, I use Excel spreadsheets every day, but when I start to develop a project, I have in mind a system with an engine (the spreadsheet with formulas and macros) and a user interface, with icons that prevent users from doing anything not provided for. So the core of the spreadsheet can work without modifications, and the results can be more reliable.

  3. Vic Volpe | February 9, 2013 at 6:06 pm |

    Do you suppose they use Excel because they don’t know how to do statistical analysis in a database program?

  4. BillK | February 9, 2013 at 6:07 pm |

    Points out the fact that spreadsheet models should be reviewed and checked by other experts BEFORE they go into production systems.

  5. David | February 9, 2013 at 7:34 pm |

    I am a professional software engineer, and what I design and build works because I know what I’m doing. I have no sympathy for any of this.

    Programming IS an extreme sport. Leave it to the professionals, and do NOT attempt it yourself at home.

  6. Jason | February 9, 2013 at 7:38 pm |

    ” Errors that lower VaR, allowing traders to increase their bets, are the ones that slip through the cracks. That one-sided incentive structure means that we should expect VaR to be systematically underestimated . . ..”

    This. A thousand times, this.

    They’d probably use Ouija boards if they justified high-risk/high-reward strategies. ;)
    After all, what happened to ratings agencies that dared to call crap “crap?”

  7. azlib | February 9, 2013 at 11:55 pm |

    As a software engineer, I find what you describe all too common. Excel is very powerful because it is relatively easy to use, but its ease of use means anybody can write a program using the tool, and such programs can be very difficult to understand or test.

    What amazes me is that the model was not tested more thoroughly, considering the risks if there were systematic errors in the calculations.

  8. colby | February 10, 2013 at 12:13 am |

    cubicamente – I agree, I follow the same logic. The users are dumb, so prevent them from doing anything other than what you build the tool to do.

    However, to James’ point, anyone can create a spreadsheet, and there is no formal user acceptance testing, unit testing, stress testing, etc. that occurs with the normal SDLC. I have been stating for many years at my current employer that we should not be running our business on an Excel spreadsheet, even when we run SAP. There is far too much downside risk and limited upside from the productivity. Additionally, I think it makes people dumber and less likely to critically think through situations.

  9. aronjohnson | February 10, 2013 at 12:48 am |

    The mistake made in the London Whale case could have been easily caught using the Formulas/Trace Precedents and Trace Dependents tools. These tools are invaluable in debugging complex spreadsheets.

  10. osamakhn | February 10, 2013 at 1:22 am

    I guess this is where Python and its libraries come in. You can build reliable models using code. Maybe it's time the curricula of MBA/accounting/finance programs were also curated to add courses in Python and financial modelling, instead of just the fin engineers and quants doing the modelling.

  11. Robert Klemme | February 10, 2013 at 7:42 am

    “The biggest problem is that anyone can create Excel spreadsheets—badly. Because it’s so easy to use, the creation of even important spreadsheets is not restricted to people who understand programming and do it in a methodical, well-documented way.”

    Unfortunately this is not only true for Excel and PowerPoint but for other parts of common office suites as well. One of my pet peeves is text documents where empty paragraphs are used to set paragraphs visually apart. Another is the abundance of ad hoc formatting, which increases the size of documents and makes them look odd.

  12. Bench Warmer | February 10, 2013 at 7:46 am

    I just like the old copy and paste format.

  13. Jim Dikelow | February 10, 2013 at 8:09 am

    There is a rich literature (accessible through Google Scholar) on the deficiencies of Excel for statistical calculation and the unwillingness or inability of Microsoft to fix them. Much better is the open source (and free) Excel clone, Gnumeric.

  14. Moses Herzog | February 10, 2013 at 8:23 am

    Well, I think it’s safe to say I’ve been one of the more loyal and LONGTIME readers of this site. I joined in on the dialogue here after Simon’s first appearance on Bill Moyers. I wanna say that was 2008, but if it wasn’t 2008 it was 2009. I gladly purchased both books, and I enjoyed them immensely and learned a lot from them (I learned more from “13 Bankers” than from “Burning” but I walked away with a better understanding and knowledge base from both).

    But I think this will probably be my last post here. The “sweet spot” from Mike Konczal’s old site (Rortybomb) really left after he went to the Roosevelt Institute. Konczal’s writing drastically changed after that. And I’m afraid that time has come to this site as well. Too much time between posts, too many tease links to other sites where, after years of loyalty to the two hosts, we get the crumbs from the paid sites tossed at us for increased link hits.

    And frankly, too many phrases like this one I don’t remember seeing in this site’s early days: “where I saw more BlackBerries than I have seen in years—can’t lawyers and lobbyists afford decent phones?”
    I suppose people would say I am being “hypersensitive” to be upset at such a small passing remark. But class snobbery, and judging people on socio-economic issues was kind of what I had thought this site was supposed to be fighting against, not encouraging. I don’t think James Kwak would have written that in the first 6 months—1yr+ of when this site was at its best. So to me, it’s like picking up a glass of what you thought was your favorite brand of soda pop, but when you ran it on your tongue you found they had substituted the sugar with high fructose corn syrup, and it just wasn’t the same.

    When I think of Konczal repeating liberal talking points VERBATIM (not because they are the best, but as if he phoned the ACLU and Berkley Liberal Arts Dept. before every post to see if it was kosher) and James Kwak saying things like “where I saw more BlackBerries than I have seen in years—can’t lawyers and lobbyists afford decent phones?” I think I am tasting the Dr. Pepper with high fructose corn syrup and not the Dr Pepper with sugar (“Rortybomb” and James Kwak circa 2009—20??) that I liked so much.

    Warm Regards,
    Resentful and Bitter Zeitgeist crouching under the keyboard.

  15. Marc Heller | February 10, 2013 at 10:28 am

    Dear Moses -
    I’ve never commented on this blog before, but I always read what others have to say. I hope you reconsider your decision not to continue making your ideas known on this blog. I, for one, look forward to what you have to say.

  16. Dan Palanza | February 10, 2013 at 10:54 am

    Let me add to Moses Herzog’s critique: Your topic today, whether you know it or not, is a bookkeeping problem/solution pattern. I discovered, 35 years ago, when I bought my first microprocessor-driven computer, that it is impossible to create a proper double-entry book-of-accounts using any spreadsheet, no matter how it is designed.

    There are four fundamental variables in a proper book-of-accounts: debtor-value, creditor-rights, cash, and capital. Pacioli called the bookkeeper’s attention to these four variables in 1494 A.D. The four are isomorphically related to one another within the contract that the bookkeeping sets out to resolve; read as “sets out to balance.”

    Until these bookkeeping fundamentals are understood by the software community, and the accounting software designers know how to create the four-way isomorphic balance by using a serious programming language, the banks will continue to steal from the monetary system.

    Right now, James, your above essay has no right to take up the bandwidth that you are wasting by your not knowing where first base is located in the playing-field of software-driven accounting.

  17. Colin | February 10, 2013 at 11:30 am

    The FT Alphaville team have a very good overview of this —

  18. MacCruiskeen | February 10, 2013 at 12:17 pm

    “PowerPoint has an oft-noted, parallel problem”

    Actually, the real problem with PP is that it, at root, has no actual value at all. Anything that can be communicated with PP can be communicated better with some other media: verbally, paper, PDF, whatever. Heck, most presentations and meetings are really unnecessary.

  19. Cornbread | February 10, 2013 at 12:56 pm

    I don’t use Excel except when I am required to. There are several reasons for this. First off, its ability to do complex mathematics is incredibly poor. If you’re not dealing with something that was linear in the first place, Excel is not the best program for it. If you’ve got phenomena operating in opposing directions, it’s not terrible if you can linearize them and run in very small steps. But if you’ve got coupled equations that aren’t necessarily positively or negatively reinforcing, Excel should be last on the list.

    Second problem is documentation. Excel spreadsheets are notorious for being poorly documented. Shoot, it’s difficult to see who came up with it in the first place, then you’ve got to comb through the attached VBA and the cell formulas to find it.

    Third, mobility of data. I’ve seen many instances where Excel doesn’t correctly or consistently pull from a database or update the database correctly. It was certainly a matter of poor coding, but Excel doesn’t seem to consistently write to databases from machine to machine.

    What Excel is pretty good for is dumbed-down visualization, i.e., when you’re presenting technical stuff but need to make it ultra-pedestrian for the business school guys, nothing’s better than Excel’s cartoonish graphics.

  20. Austin | February 10, 2013 at 1:06 pm

    any “quant” worth their weight should be intimately familiar with SAS and SQL and deliver their results via Excel so the “business” can lipstick that pig.

  21. Oregano | February 10, 2013 at 1:24 pm

    @Moses, I have to agree with you. Baselinescenario was created to discuss the reasons behind and way forward from the Crash of 2008-2009 and had its greatest relevancy then. Now, it’s mostly a site waiting and hoping for something positive to happen in a political environment that has regrouped and moved on in the same general direction it was heading before the crash. Simon and James have spoken their piece, written their books, and are moving on as well. There needs to be a change in national and Beltway politics that can reestablish a nexus of pragmatic governance for this site to regain its vitality. I’m hoping that my child will see that day. I suspect I won’t.

  22. aronjohnson | February 10, 2013 at 1:35 pm

    I totally disagree with this disgusting comment!!! Excel has several powerful tools, although its Solver function could be improved for nonlinear responses and coupled models with over 4 degrees of freedom (a convergence issue). For after-tax cash flow analysis (in Engineering Economics class), I once programmed in Excel the entire IRS depreciation schedule with just two inputs and a lot of conditional logic. (I found an error in the published tables, which they fixed.)

    It does not do PDEs, which economists seem to love and practical engineers regard as oversimplifications of reality, since most of them ignore friction (economic parallel??). It doesn’t do computational fluid dynamics either, but we have good programs like Fluent which do. In addition to the wonderful flow visualizations, they provide the data we need in spreadsheet format for further analysis.

    I like Excel because I can check the individual calculations which are hidden by most programs.

  23. Keith Stephenson | February 10, 2013 at 2:28 pm

    By the way, from what I have read, a program that Excel basically copied, Lotus 1-2-3, was the ‘killer app’ of personal computers and forever changed accountants’ jobs for the better. The PC revolution in the early 80′s took off from there. In that way the spreadsheet is the most important application of all time.

  24. Anonyomouse | February 10, 2013 at 6:02 pm

    I sympathize with Moses’s concerns. Statistics and numbers are manipulated to favor the ruling class’s agenda, and it has been proved that politicians, at this point in time, can no longer distinguish right from wrong, or right from the left, or anything other than the current status quo. We really do have a dysfunctional Congress that takes money candidates over quality ones. And we have a Wall Street that is in denial to its dying breath.
    I myself don’t know how it can sustain itself, and had I the power, I would expose them all.
    It’s really only a matter of time as to when, and how, the reset equation is used, so the strong CAN carry on. Until then we must put up with disinformation politics, and economics, because it has become the norm.

  25. Jack | February 10, 2013 at 6:26 pm

    Blaming Excel is like blaming guns for gun crime. There are many technical ways to potentially overcome the ‘copy and pasting’ problem. For example using templates, locked cells/sheets, Excel Services, document-level customizations etc.

    The problem described is NOT a technical one. It’s social or organizational. What is it about the banks and those they employ that allows casual and unprofessional ‘hacks’ to be built for mission-critical activity? Why don’t they know their tools well? Where is the training? Where is choosing the right tool for the job? Where is the technical oversight and leadership?
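On the “locked cells/sheets” point above: the same protection can also be applied programmatically, outside Excel’s own UI. Here is a minimal sketch using the third-party openpyxl Python library (the filename, password, and cell layout are illustrative assumptions, not anything from the original posts): unlock only the input cell, then enable sheet protection so the formula cell becomes read-only.

```python
from openpyxl import Workbook
from openpyxl.styles import Protection

wb = Workbook()
ws = wb.active
ws["A1"] = "Monthly rate"   # label
ws["A2"] = 0.03             # input cell the user may edit
ws["B2"] = "=A2*12"         # formula cell to protect

# Cells are locked by default, but the lock only takes effect
# once sheet protection is switched on.
ws["A2"].protection = Protection(locked=False)
ws.protection.sheet = True
ws.protection.password = "changeme"   # illustrative password

wb.save("locked_model.xlsx")          # illustrative filename
```

With protection on, Excel rejects edits to B2 while still allowing the user to type into A2, which is exactly the “template with locked cells” discipline the comment advocates.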

  26. Worthy LaFollette | February 10, 2013 at 7:32 pm

    Excel, Powerpoint and Access all suffer from similar issues – they are tools which are overused by amateurs. What is described above for Excel is exactly the same sort of critique Tufte has of Powerpoint – namely it creates a false sense of competence and is overused in places it should not be in order to satisfy the gods of “convenience”.

  27. Derek Burden | Joe Smith | February 10, 2013 at 8:16 pm

    I think some lose sight of the original facts. The bank contracted out the building of a model to an “expert” who made a programming error. I agree with the original post that it is too easy to make such errors in Excel. Excel should probably not be used for production calculations, but when it is, the model should be properly tested and documented. I wonder what the state of the art is for standards and tools to document and test spreadsheets. (VisiCalc was the original killer app – Lotus 1-2-3 was a clone written for the IBM PC.)

  28. kylehreed | February 10, 2013 at 8:29 pm

    I just discounted everything you wrote because of your hysterical and gratuitous Microsoft bashing. By the way, Lotus had an innovative command system but still essentially ripped off VisiCalc and Multiplan. Excel was the first with a GUI.

  29. ShamelessBill | February 10, 2013 at 9:26 pm

    @David (professional software engineer): Your point concerning the importance of experience, training, and competence is well taken. Having said that,…
    With all due respect for your doubtless considerable expertise in building software that does the work it is designed to do, please bear in mind this well established and humbling fact: Every human being is subject to marvelously seductive illusions of attention, memory, confidence and knowledge.
    Awareness of their existence and power immunizes none of us against these illusions. Self-deception afflicts each of us, bar none.

  30. Celer | February 10, 2013 at 10:05 pm

    “After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”

    “As the modeler had intended”….what?

    The spreadsheet didn’t do anything; the modeler screwed up. Excel is singularly unimaginative and uncreative and suffers from being able to do only one linear thing at a time, which undermines supposedly nice features like Real Time Data links etc. Oy, a+b vs (a+b)/2… or such.
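The error quoted above is easy to reproduce. A minimal Python sketch (the rates are made up for illustration; the model’s actual inputs are not public) shows why dividing a difference by the sum instead of the average mutes every relative change, and hence the measured volatility, by exactly a factor of two:

```python
# Hypothetical rates; the real model's inputs are not public.
old_rate, new_rate = 0.030, 0.034

# What the modeler intended: change relative to the average rate.
intended = (new_rate - old_rate) / ((new_rate + old_rate) / 2)

# What the spreadsheet did: change relative to the sum of the rates.
buggy = (new_rate - old_rate) / (new_rate + old_rate)

# The buggy figure is exactly half the intended one, for any pair of rates.
assert abs(buggy - intended / 2) < 1e-12
```

Because the factor-of-two error is systematic rather than random, re-running the model could never reveal it; only an independent check of the formula itself could.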

  31. Celer | February 10, 2013 at 10:18 pm

    Believing in Models in the first place is a far bigger error perhaps than misconstructing them…a model allows humans to withdraw from responsibility for their actions. Models are only models. Reality is far more complex than a model and the assumption a model captures reality in a suitably complete manner to be the sole basis of decision making is stupid in itself.

    In any case the dialogue set out above gets sucked into the bs of technical matters when in fact there were some very obvious ‘non-technical’ matters that should have constrained the Whale or led to his constraint by a supervisor (not a committee with no apparent singular person held accountable) like an overall position limit for example which requires no VAR to ascertain and could easily have been summed up on one or two sheets of an Excel workbook.

    And so through technical arguments, critiques of spreadsheets, committees
    we lose track of the fact that the Whale was out of control and no one is ultimately held responsible in a significant way. Controlling unbridled corporate banking greed would go much further than quibbling about models to prevent trading abuses – a class and moral approach to controlling criminal behavior.

[Oct 20, 2010] Recurring formula for new records

Excel Help Forum

...have an Excel table. Each row is a separate record. The user types in new records at the end of the table. One of the columns in the table contains a formula. Ideally the formula should apply to each cell in the entire column. Since I don't know how long the table will end up being, I simply ask that the user drag the formula down from the cell above. I do not want the user to accidentally change the formula, though. At first I thought a Custom validation with a secret word that the user is unlikely to type in would keep the user from changing the formula already in the cell. However, I can't stop the user from deleting the formula, which doesn't help. I then thought maybe locking (protecting with password) the column would do the trick. However, now the user can't drag down the formula from the cell above. I would therefore like help with one of 2 options:

1- have the formula apply to the cell automatically as a new record is created (ideal)
2- find a way to protect the cell so that the user can copy the formula down, but not change or delete it

Any brain waves?

I did find the following code online, which appears to go some ways to answer my first option, though I can't figure out why it only works if the formula is in column A (i.e. I have formulas in columns A, B, K, L, and AF to DL).


Option Explicit

Private Sub Worksheet_Change(ByVal Target As Range)
    Dim r As Range, c As Range, t As Range
    On Error GoTo TheEnd
    'Titles usually occupy the first 7 rows; row 8 holds the first formulas.
    'Column E holds the first user-typed entry, so watch E9 downward.
    If Not Intersect(Target, [E9:E65536]) Is Nothing Then
        MsgBox Intersect(Target, [E9:E65536]).Address & " was changed."
        Application.EnableEvents = False
        Application.Calculation = xlCalculationManual
        Application.ScreenUpdating = False
        For Each t In Target
            Set r = Union(Range("A" & t.Row), Range("C" & t.Row, Cells(t.Row, 19)))
            If VarType(t.Value) = vbEmpty Then
                'r.ClearContents  'Clear the whole row when the E cell is emptied.
                For Each c In r   'Clear only formulas when the E cell is emptied.
                    If c.HasFormula Then c.Clear
                Next c
            Else
                For Each c In r   'Copy each formula down from the row above.
                    If c.Offset(-1, 0).HasFormula Then c.Offset(-1, 0).Copy c
                Next c
            End If
        Next t
    End If
TheEnd:
    Application.EnableEvents = True
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub

Last edited by m.cain; 02-21-2008 at 07:02 PM. Reason: found possible code, though not sure how to use

ATM • Spreadsheet Files

Here we present a huge collection of spreadsheet files which originally appeared on a CD which was sent, free, to members of ATM. We have now negotiated permission to make them available here.

But, if you’re not a member, please consider joining ATM. Not only will you be supporting the further development of resources such as these but you will also receive the other benefits of belonging to one of the leading professional associations for mathematics educationalists.

ATM • Problem Solving with Interactive Spreadsheets

ATM’s ICT publication is a CD of 16 programs especially designed to be used with any interactive whiteboard or projector. The programs provide a variety of teaching opportunities for KS2, KS3 and KS4 classrooms and because they are so flexible, almost all can be used at all these key stages.

<< Samples from Problem Solving with Interactive Spreadsheets

The CD associated with this book contains sixteen interactive spreadsheets which have been developed to pose interesting and challenging problems. The tasks, many of which are well known, have been transferred to this format to help teachers offer learning opportunities to pupils that help develop their skills of mathematical reasoning.

The files will work with Excel and most other spreadsheet applications.

The problems require little prior knowledge in terms of explicit mathematical content. The aim is to provide problems that are challenging but accessible to students in the upper primary and secondary age ranges. Most of the problems include a range of extension problems to challenge even the most able.

The files are interactive so that as learners respond to the tasks and questions posed the next tasks appear on the screen automatically. The teacher’s guide offers suggestions for further extension and hints concerning solutions and proofs.

[Nov 5, 2006]

Today’s Tip

*Applying Colors to Maximum/Minimum Values in a List*

*To apply colors to maximum and/or minimum values:*


1. Select a cell in the region, and press Ctrl+Shift+* (in Excel 2003, press 
this or Ctrl+A) to select the Current Region.
2. From the Format menu, select Conditional Formatting.
3. In Condition 1, select Formula Is, and type =MAX($F:$F)=$F1.
4. Click Format, select the Font tab, select a color, and then click OK.
5. In Condition 2, select Formula Is, and type =MIN($F:$F)=$F1.
6. Repeat step 4, select a different color than you selected for Condition 1, 
and then click OK.

Be sure to distinguish between absolute reference and relative reference when 
entering the formulas.



[Oct 24, 2006] Microsoft Office Assistance You asked for it Crabby's top 5 Excel tips and tricks

Double-click your way to fast formatting

By default, when you copy text from a different program (such as Microsoft Word, Microsoft PowerPoint®, or even Microsoft Internet Explorer) and paste it into an Excel cell, the original formatting comes along for the ride. Sometimes this leaves you with a tiny little cell with GIGANTIC FORMATTING that you need to fix by hand.

The Paste Options button pops up, and you can use it to keep the original formatting or to match the destination formatting (that is, the formatting you've chosen for your cell). However, if you're like me (or want to be like me — aren't you sweet), that is just too many steps when you already know you want to match the formatting of the cell, especially if you frequently copy from another program and want to keep your cell's formatting:

  1. Select the text you want to pop into your Excel worksheet.
  2. Press CTRL+C.
  3. Switch back to Excel.
    Tip Hey! Now's a good time to use that shortcut I told you about in the first tip!
  4. Double-click in the cell, and press CTRL+V.

It's like magic in a click (or two).

And so you have it. Five decent little tips to boost your skill level and keep your work humming along. So start the music!

"We are what we repeatedly do. Excellence, then, is not an act, but a habit." — Aristotle

[Oct 24, 2006] 4 quick tips for using Excel

Creating a drop-down menu

Lynda Morris, president of NicLyn Consulting Corp, an Internet-based computer and network service company, often finds herself entering the same information or formulas in different parts of a spreadsheet. Instead of typing data repeatedly, she creates drop-down menus that save typing time. To create a drop-down menu:

Open a worksheet and label your menu (Days for example) in the top cell of a column.
Enter data (days of the week, in this case) in the cells below the label (such as A2 to A8).
Highlight the list, including the label.
Click the Insert menu, then click Name, and then Define.
In the box under Names in workbook you will see the name of the range (Days). In the Refers to box, it should list "=Sheet1!" followed by the range (=Sheet1!$A$1:$A$8).
Click Add and then OK.
Select any cell or range of cells in which you want to enter data.
Click the Data menu and then Validation.
From the Allow drop-down menu, select List. In Source, type "=" and the name of your range (=Days). Make sure that the In-cell dropdown box is checked.
Click one of the cells you highlighted to see your drop-down menu. Choose the item you want to enter in the active cell. Enter data from your list, in any order, in each of the cells you selected.
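The same drop-down can be produced programmatically. A minimal sketch using the third-party openpyxl Python library (the filename and the C2:C10 target range are illustrative assumptions; openpyxl's default sheet name is "Sheet" rather than "Sheet1", so the validation formula references the source range directly instead of a defined name):

```python
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = Workbook()
ws = wb.active
ws["A1"] = "Days"                    # the label, as in step 1 above
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
for i, day in enumerate(days, start=2):
    ws.cell(row=i, column=1, value=day)   # A2:A8, as in step 2

# List-type validation whose source is the A2:A8 range.
dv = DataValidation(type="list", formula1="=$A$2:$A$8", allow_blank=True)
ws.add_data_validation(dv)
dv.add("C2:C10")                     # the cells that get the drop-down

wb.save("dropdown.xlsx")             # illustrative filename
```

In Excel 2003 itself, the Insert > Name > Define and Data > Validation steps above achieve the same result without any code.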

[May 22, 2006] Project details for Spreadsheet::WriteExcel 2.17 by John McNamara

About: Spreadsheet::WriteExcel is a Perl module which can be used to create native Excel binary files. Formatted text and numbers can be written to multiple worksheets in a workbook. Formulas and functions are also supported. It is 100% Perl and doesn't require any Windows libraries or a copy of Excel. It will also work on the majority of Unix and Macintosh platforms. Generated files are compatible with Excel 97, 2000, 2002, and 2003, and with OpenOffice and Gnumeric. An older version also supports Excel 5/95.

Changes: This release includes minor bugfixes and enhancements.

[Feb 16, 2006] Write VBA with the Macro Recorder in Word and Excel Take advantage of autogenerated code to speed up your coding efforts.

[Jan 13, 2004] Excel 2003 Power Programming with VBA

by John Walkenbach (Paperback - Jan 13, 2004)

Buy a book to match your background and needs, February 11, 2004
Needing to finish my first Excel add-in, and frustrated by the incompleteness and obscurity of MS's help system, I picked up this book after reading warm recommendations from readers of earlier versions.

If you have never programmed Excel before, but have programmed a tiny bit in some other language, and do not have great ambitions for software development, this might be a fine text. It is quite readable and full of useful information. Walkenbach introduces VBA quickly, which is great, but so quickly he forgets to say what most of the language constructs do. His approach to teaching the Excel object model is to provide several fairly well written examples of little macros and utilities, each one with a clear English explanation. Unfortunately, if the technique you need does not appear in any of these examples, you are out of luck, because his explanations are neither extensive, detailed, nor thorough enough to impart a good understanding of what is going on. This, coupled with Excel's erratic behavior (mis-type a property name and watch your user form mysteriously disappear, for instance), makes it very difficult to become independently productive without spilling a lot of sweat and tears.

The book's strengths include the numerous and well-organized examples provided on the companion CD; the occasional sidebar that offers first-hand knowledge of bugs, inconsistencies, and strange design; fairly broad, if incomplete, coverage of the major aspects of Excel VBA programming; and very clear indications of differences among various Excel versions (97, 2000, 2003 mainly). Walkenbach is obviously an expert and has been so for a long time.

The weaknesses become apparent in contrasting this book with, say, Roman's text (O'Reilly). Where Walkenbach gives a macro to display all the icons associated with the several thousand Excel 'FaceId's, Roman publishes the complete table as an appendix. Where Walkenbach loosely skims over the properties of many key objects, such as ranges and charts, Roman takes the time to provide a terse but useful description of nearly every property, as well as a very illuminating diagram of the object hierarchy. Where Walkenbach completely omits to describe how VBA works, Roman actually offers a deeper explanation (showing how object references are arranged in memory, for instance, and describing exactly how a loop is executed). Boring stuff for some, maybe, but a huge time saver for those who appreciate that the details matter. For someone who either has a lot of programming experience, or who plans to develop more than toy utilities or one-off apps in Excel VBA, Roman's approach is much more useful than Walkenbach's.

If Walkenbach is appropriate for your background and ambitions, then you will probably agree it is a four- or five-star effort. Otherwise, you will likely be somewhat disappointed and, like me, will quickly find yourself looking for another book.

Excel 2003 Formulas

by John Walkenbach

Walkenbach and Excel...a great combination!, November 24, 2003
Reviewer: William A. Huber (Rosemont, PA USA)
Excel 2003 Formulas provides some amazing ways to really master Excel. Walkenbach's formulas are the best! He covers operators, naming techniques, debugging, auditing, developing custom VBA functions, array formulas, imported 1-2-3 files, etc. Oh, and the CD that comes with this one is loaded too!

Keep it within arm's reach, August 19, 2005
Reviewer: Mark Jeffrey Lipin (Arizona)
If you only get 1 book on Excel, buy this book! Its simplicity of use and easy reading style make it a perfect companion for a basic to an experienced Excel user (or abuser).

Excel Scientific and Engineering Cookbook (O'Reilly), by David Bourg

Great showcase of Excel's mathematical capabilities, February 6, 2006
Reviewer: calvinnme "Texan refugee" (Fredericksburg, Va)
It is often either inconvenient or financially unfeasible to solve every mathematical problem with a standalone procedural language or an expensive application such as Matlab or Mathematica. Since Excel has such advanced computational capabilities built into it and is ubiquitous in offices and homes, it would make sense to mine this application for numerical problem solving techniques. Bourg does a masterful job of presenting the tools available in Excel and showing the reader how to put them to work to solve real world engineering and mathematical problems. First, he spends a couple of chapters briefly going over Excel and its language VBA (Visual Basic for Applications). Next he goes over some intermediate level tasks such as collecting and cleaning up data, charting including 3D charts, and statistical analysis tasks such as correlating data and generating random numbers. Next Bourg moves into purely mathematical tasks such as working with discrete Fourier transforms, manipulating matrices and vectors, as well as working with basic mathematical functions. Once you have learned to use these mathematical tools in Excel, Bourg uses them to show you how to perform curve fitting and regression, solve equations, perform integration and differentiation, and solve both ordinary and partial differential equations. He even spends time on performing multivariable calculus and the finite element method in Excel. Chapter 13, on optimization, was my favorite chapter. Here, Bourg is actually getting into a little algorithm analysis and design via linear programming and genetic algorithms. This chapter showed me some truly innovative uses of Excel. The final chapter deviates from the scientific flavor of the book and concentrates on more traditional financial applications.
The writing is very accessible, the examples are clear and very creative, and the author does a tremendous job of capturing the range of Excel's mathematical capabilities. Do not expect mathematical theory in this book. Bourg assumes you already know how to set up a mathematical problem and that you just need a computational platform and tools with which to perform your work. It is nice to know I am not necessarily held hostage by MathWorks (maker of Matlab) every time I need to solve a mathematical problem of any level of sophistication. Highly recommended. The table of contents is as follows:
1. Using Excel
2. Getting Acquainted with Visual Basic for Applications
3. Collecting and Cleaning Up Data
4. Charting
5. Statistical Analysis
6. Time Series Analysis
7. Mathematical Functions
8. Curve Fitting and Regression
9. Solving Equations
10. Numerical Integration and Differentiation
11. Solving Ordinary Differential Equations
12. Solving Partial Differential Equations
13. Performing Optimization Analyses in Excel
14. Introduction to Financial Calculations

Analyzing Business Data with Excel

by Gerald Knight

Recommended Links

oreilly/Excel Annoyances

Free Excel Training. Basic-Level 1 Free Excel Training

  1. Excel Lesson 1: The Fundamentals of Excel

  2. Excel Lesson 2: How to Start Excel, Creating and Saving Workbooks

  3. Excel Lesson 3: Toolbars and Task Panes in Excel

  4. Excel Lesson 4: Insert/Delete/Name/Move Worksheets

  5. Excel Lesson 5: Cells, Moving Around Workbooks, Exiting

  6. Excel Lesson 6: Basic Copy/Cut & Paste

  7. Excel Lesson 7: Quick Copying Using the Fill Handle

  8. Excel Lesson 8: Pasting Using Paste Special

  9. Excel Lesson 9: Inserting Rows, Columns and Cells

  10. Excel Lesson 10: Changing the Standard Default Options in Excel

  11. Excel Lesson 11: Using the Undo and Redo Feature

  12. Excel Lesson 12: Using the Format Painter to Copy a Cell's Format

  13. Excel Lesson 13: Using Dates and Times in Excel

  14. Excel Lesson 14: Working with Custom Formats in Excel

  15. Excel Lesson 15: The Basics of Excel Formulas

  16. Excel Lesson 16: Excel Cell References. Relative and Absolute Cell References

  17. Excel Lesson 17: Avoid Typing Whenever Possible. Use Your Mouse Pointer

  18. Excel Lesson 18: Excel Arguments and Syntax in Formulas

  19. Excel Lesson 19: Excel AutoSum Function/Formula

  20. Excel Lesson 20: Excel Auto Calculate. Calculate Without Formulas

  21. Excel Lesson 21: How to Use the Insert Function Feature to Create Formulas

  22. Excel Lesson 22: Using Some of Excel's most Common Functions

  23. Excel Lesson 23: Using Named Ranges in Excel as an Alternative to Cell References

  24. Excel Lesson 24: Using Constants to Name Values and the Paste Name Dialog

  25. Excel Lesson 25: Excel Calculations/How Excel Calculates/The Order of Calculations

  26. Excel Lesson 26: Excel Cell Comments

  27. Excel Lesson 27: Excel Find & Replace

  28. Excel Lesson 28: Different Methods of Clearing Cell Contents

  29. Excel Lesson 29: Effective Printing in Excel 1

  30. Excel Lesson 30: Effective Printing in Excel 2

  31. Excel Lesson 31: Sorting Data in Excel

  32. Excel Lesson 32: Hide/Show Rows/Columns in Excel

  33. Excel Lesson 33: Excel AutoFormats

  34. Excel Lesson 34: Creating a Basic Spreadsheet

  35. Excel Lesson 35: Charting the Basic Spreadsheet

  36. Excel Lesson 36: Excel Worksheet Protection

  37. Excel Lesson 37: Excel IF Formula/Function

  38. Excel Lesson 38: Volatile Functions-Now & Today

Microsoft Excel Tips from Excel Tip .com - Excel Tutorial - Free Excel Help

Excel VBA Examples

EXCEL and EXCEL/VBA examples

VBA Programming in Excel

Excel Viewer 2003

Recommended Articles

Integrating Excel and Access
Excel: The Missing Manual
Excel 2003 Personal Trainer
Excel Scientific and Engineering Cookbook
Excel 2000 in a Nutshell, First Edition
Excel 2003 Programming: A Developer's Notebook
Excel Object Model
Writing Excel Macros with VBA, Second Edition


FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.

ABUSE: IPs or network segments from which we detect a stream of probes might be blocked for no less than 90 days. Multiple types of probes increase this period.




Copyright © 1996-2015 by Dr. Nikolai Bezroukov. This site was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable JavaScript for this site. This site is perfectly usable without JavaScript.

Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.


This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...

You can use PayPal to make a contribution, supporting development of this site and speeding up access. In case the site is down, there are currently two functional mirrors: (the fastest) and


The statements, views, and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP, or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

Last modified: November 21, 2015