Quantitative Consulting for Business
In today's business world, quantitative methods are needed everywhere. The ability not only to organize your data efficiently but also to interpret it meaningfully, and to act on it, is an important competitive advantage in the marketplace.
The new website is under construction and our new presentation will be available soon: [not yet]
Archive of the old Website (2006-2008)
Technical presentation (2008)
We present a concept for truly adaptive discrete choice designs. Based on the Bayesian paradigm, our algorithm uses sequential Monte Carlo methods to update the posterior distribution after each new answer and to generate new product comparisons based on a variety of possible target measures. We provide results comparing different adaptive and fixed-design strategies from a simulation study performed in R. Our algorithm outperforms classical adaptive methods based on utility balance. Our method is consistently grounded in discrete choice theory and should therefore also lead to more valid results. [pdf presentation - long! (5MB)]
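The core idea of a sequential Monte Carlo update can be sketched in a few lines. The following is a minimal illustration, not the algorithm from the presentation: it assumes a binary logit choice model with a two-attribute part-worth vector, reweights a particle cloud after each answer (a full SMC would also resample and rejuvenate, which is omitted here), and uses posterior-predictive entropy as one illustrative target measure for picking the next comparison. All numbers and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Particle approximation of the posterior over a 2-attribute part-worth
# vector beta, starting from a standard normal prior.
n_particles = 2000
particles = rng.normal(0.0, 1.0, size=(n_particles, 2))
weights = np.full(n_particles, 1.0 / n_particles)

def p_choose_a(beta, x_a, x_b):
    """Binary logit probability of choosing product A over product B."""
    return 1.0 / (1.0 + np.exp(-(beta @ (x_a - x_b))))

def bayes_update(weights, particles, x_a, x_b, chose_a):
    """Reweight every particle by the likelihood of the observed answer."""
    p = p_choose_a(particles, x_a, x_b)
    lik = p if chose_a else 1.0 - p
    w = weights * lik
    return w / w.sum()

def next_question(weights, particles, candidates):
    """Pick the comparison whose predicted answer is most uncertain
    (posterior-predictive entropy; just one of many possible targets)."""
    def entropy(pair):
        x_a, x_b = pair
        p = float(weights @ p_choose_a(particles, x_a, x_b))
        return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))
    return max(candidates, key=entropy)

# Simulate an interview against a hypothetical "true" respondent.
true_beta = np.array([1.0, -0.5])
candidates = [(rng.uniform(0, 1, 2), rng.uniform(0, 1, 2)) for _ in range(20)]
for _ in range(15):
    x_a, x_b = next_question(weights, particles, candidates)
    chose_a = rng.random() < p_choose_a(true_beta, x_a, x_b)
    weights = bayes_update(weights, particles, x_a, x_b, chose_a)

posterior_mean = weights @ particles
```

After each answer the particle weights concentrate, so the effective sample size drops below the particle count; in practice this is the trigger for a resampling step.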
Short PDF articles (2006-2008)
Bundles are everywhere. While one is usually inclined to think of the more notorious examples of software bundles or copiers and paper, essentially every product is a bundle of features and services. The art of bundling lies in knowing which parts to sell together and which to sell separately. And be careful: unbundling (aka "add-on pricing") projects are just as common as bundling projects. [pdf presentation]
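Why selling parts together can pay off is easiest to see in the textbook case of negatively correlated reservation prices. The figures below are purely hypothetical: two customers, two products, and a brute-force search over candidate prices.

```python
from itertools import product

# Hypothetical reservation prices: two customers whose valuations for
# products A and B are negatively correlated.
valuations = {"customer1": {"A": 8.0, "B": 2.0},
              "customer2": {"A": 2.0, "B": 8.0}}

def revenue_separate(price_a, price_b):
    """Revenue when A and B are priced and sold separately."""
    rev = 0.0
    for v in valuations.values():
        if v["A"] >= price_a:
            rev += price_a
        if v["B"] >= price_b:
            rev += price_b
    return rev

def revenue_bundle(price):
    """Revenue when A and B are sold only together at one bundle price."""
    return sum(price for v in valuations.values() if v["A"] + v["B"] >= price)

# Only prices at observed valuation levels can be optimal here.
prices = [2.0, 8.0]
best_separate = max(revenue_separate(pa, pb) for pa, pb in product(prices, prices))
best_bundle = max(revenue_bundle(p) for p in [4.0, 10.0, 16.0])
# best_separate == 16.0, best_bundle == 20.0: the bundle extracts more surplus
```

Separate pricing leaves each customer's low-valued product unsold; a bundle priced at the (identical) total valuation of 10 sells to both customers.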
Many industries (telcos, banks, utilities) work with two-part tariffs, consisting of a fixed access fee and a usage component. While this allows them to target specific customer segments (think of prepaid tariffs vs. flat rates in mobile telephony), it also makes assessing the effects of tariff changes very complex. [pdf presentation]
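The basic arithmetic of a two-part tariff is simple; the complexity comes from heterogeneous usage across segments. A minimal sketch with hypothetical plan parameters:

```python
def tariff_cost(usage, access_fee, unit_price, included_units=0.0):
    """Monthly bill under a two-part tariff: fixed access fee plus a
    usage charge for every unit beyond the included allowance."""
    return access_fee + unit_price * max(usage - included_units, 0.0)

# Hypothetical plans: prepaid (no fee, high unit price) vs. a contract
# plan (fixed fee, low unit price).
prepaid = dict(access_fee=0.0, unit_price=0.25)
contract = dict(access_fee=10.0, unit_price=0.05)

def break_even(fee1, p1, fee2, p2):
    """Usage level at which two linear tariffs cost the same."""
    return (fee2 - fee1) / (p1 - p2)

# Break-even is at about 50 units: below that prepaid is cheaper,
# above it the contract plan wins.
crossover = break_even(prepaid["access_fee"], prepaid["unit_price"],
                       contract["access_fee"], contract["unit_price"])
```

Whether a tariff change gains or loses revenue then depends on where each customer segment sits relative to such crossover points, which is exactly what makes the assessment hard.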
The usefulness of transaction-based benchmarking systems is often hampered by a lack of comparability. No two transactions are the same: specific conditions for the product, the client or the market seem to apply every time you try to make a comparison. Our method consists of constructing fair reference lines that take into account all major influences. While our main examples are pricing-related, the same methods can be applied to performance, quality or other measures. [pdf presentation]
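One simple way to construct such a reference line is an ordinary least squares fit of the benchmarked quantity on the major influences; a transaction is then judged by its deviation from the line rather than by its raw value. The data below is simulated for illustration and the driver names are hypothetical.

```python
import numpy as np

# Hypothetical transaction data: realized price depends on deal volume
# and a client-size score, plus noise.
rng = np.random.default_rng(1)
volume = rng.uniform(10, 100, size=200)
client_size = rng.uniform(0, 1, size=200)
price = 50.0 - 0.2 * volume + 5.0 * client_size + rng.normal(0, 1.0, 200)

# Fair reference line: least squares on the major influences.
X = np.column_stack([np.ones_like(volume), volume, client_size])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
reference = X @ coef

# A transaction's deviation from its reference line is the comparable
# quantity; the raw prices are not.
deviation = price - reference
```

In a real project the driver set, functional form and robustness of the fit are where the actual work lies; the sketch only shows the mechanics.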
Good reports are an essential ingredient of any efficient management organization. Many good front ends for data warehouses exist, permitting flexible setup and distribution of reports. However, identifying the areas that need special attention, choosing the right indicators to track and defining warning thresholds all require massive analytical input. Read our [pdf presentation] to learn more.
Short Online articles (2006-2008)
There's no denying it. As much as quantitative consultants might wish that dedicated statistics tools like R, S-PLUS or SAS, or business intelligence tools like BO and Cognos, played a more prominent role, the program that is virtually ubiquitous when it comes to analyzing business data is Excel. Its large installed base as part of the Office package, its ease of use and its flexibility make it the most popular analysis tool by far.
The advantages of Excel are at the same time its largest drawback. Excel does not prescribe how to store or format data, it does not distinguish between data management and data presentation, and in general it invites users to be repetitive and uneconomical with their data.
If project data sources are given to me in Excel, I usually budget several extra days of work to bring the data into a usable format. Even if you are not working with a consultant: the necessity to share data and to update it regularly, the fact that the data may be useful in answering new questions not thought of before, and not least the fact that we all belong to a rather forgetful species, place a number of restrictions on the way Excel should be used.
The following list of recommendations is mainly concerned with how you as the user should use the program: be specific, be precise, be comprehensive and be systematic; and if you know BO or Access, use Excel the way you use BO or Access. Some of the recommendations concern Excel's built-in "intelligence". Here the message is: do not trust it.
One remark before we start: many people use Excel to "build models" by chaining formulas, putting different parts of their models on different pages, etc. While I believe that most of this modelling would be better done in a statistical programming language like R, the following "commandments" apply only in part to that case.
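The "use Excel like a database" advice boils down to one flat table with one observation per row. The toy data below illustrates the difference between a presentation layout (months spread across columns) and a data layout; product names and figures are invented.

```python
# A typical Excel layout: months spread across columns. This is a
# presentation format, not a data format.
wide = [
    {"product": "A", "Jan": 10, "Feb": 12},
    {"product": "B", "Jan": 7,  "Feb": 9},
]

# The database-style layout: one flat table, one observation per row,
# every field explicit. New months add rows, not columns.
long_rows = [
    {"product": r["product"], "month": m, "units": r[m]}
    for r in wide
    for m in ("Jan", "Feb")
]
# long_rows[0] == {"product": "A", "month": "Jan", "units": 10}
```

The long layout is the one that survives sharing, regular updates and new questions: any tool (BO, Access, R, or a pivot table) can aggregate it without manual restructuring.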
So, now that you have put your data into one nice unformatted flat table, you can start analyzing it with the pivot table function. Have a look at the [xls-example].
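What a pivot table does is easy to state precisely: group the flat table by a row key and a column key and aggregate the values. A minimal sketch with invented figures:

```python
from collections import defaultdict

# Flat data, one observation per row: (product, month, units).
rows = [
    ("A", "Jan", 10), ("A", "Feb", 12),
    ("B", "Jan", 7),  ("B", "Feb", 9),
    ("A", "Jan", 3),
]

# Emulate a pivot table: sum of units by product (rows) x month (columns).
pivot = defaultdict(float)
for product, month, units in rows:
    pivot[(product, month)] += units

# pivot[("A", "Jan")] == 13.0  (the two "A"/"Jan" rows are aggregated)
```

Note that the aggregation only works because every row carries its full keys; merged cells, subtotal rows or months-as-columns would break it.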
Dr. Boris Vaillant