A similar procedure is often useful when you are developing a data-intensive system. If you are uncertain of your ability to handle the required volume of data, construct a prototype to test it. I found this invaluable when asked to investigate whether some of the early PC networks could work fast enough to replace the predominant minicomputer-based systems for equity trades. We thought they could, but some members of the client's team were highly skeptical, especially the managing director (MD). The solution was to build a prototype with a server that generated several hundred thousand changing numbers per minute (mimicking the stock market in perpetual heavy trading) and a workstation that displayed them in the standard dealer's grid colors. We were then able to demonstrate that the PC network could cope with the heaviest data load 14 times faster than the older, more arcane minicomputers.
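The core of such a throughput prototype can be surprisingly small. The sketch below, in Python, illustrates the idea under stated assumptions: a generator stands in for the server feed, producing random price changes across a set of symbols (the symbol names, counts, and rates here are invented for illustration, not taken from the original project), while the consumer simply records each update into a dictionary that stands in for the dealer's display grid and measures the sustained rate.

```python
import random
import time

def generate_updates(n_symbols, count):
    """Yield `count` random (symbol, price) changes across `n_symbols`
    symbols, mimicking a market in perpetual heavy trading."""
    symbols = [f"SYM{i:03d}" for i in range(n_symbols)]
    prices = {s: 100.0 for s in symbols}
    for _ in range(count):
        s = random.choice(symbols)
        # Random walk of up to +/-0.2% per tick (illustrative figure).
        prices[s] *= 1 + random.uniform(-0.002, 0.002)
        yield s, prices[s]

def measure_throughput(n_symbols=200, count=300_000):
    """Consume the update stream and return updates handled per minute."""
    grid = {}  # stands in for the workstation's display grid
    start = time.perf_counter()
    for symbol, price in generate_updates(n_symbols, count):
        grid[symbol] = price  # a real prototype would repaint the cell
    elapsed = time.perf_counter() - start
    return count / elapsed * 60

if __name__ == "__main__":
    rate = measure_throughput()
    print(f"Sustained {rate:,.0f} updates/minute")
```

In a real test the consumer would include the display work (repainting grid cells in the dealer's colors), since rendering, not raw data handling, is often the bottleneck; the point is that a few dozen lines are enough to put a hard number in front of a skeptical managing director.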