I mentioned earlier that there are three levels of MAST users. The first and simplest level is to drop data items (and geographies, to limit the area tabulated) into the shopping cart. It’s easy to use and fairly powerful, but it doesn’t come close to my claim that (almost) any question derivable from the data can be answered. While you can do a great deal at this level, you are limited to the variables created by the Census Bureau, plus the few I have created in anticipation of people needing them (e.g., pincp_vwa). If you need a different variable, you’ll have to move up.
The second level is script-writing, which lets you create variables I could never have anticipated. It is extremely powerful. Unfortunately, I don’t currently offer the user any error-checking or debugging, so I have to admit that it doesn’t work very well via the website. I still have it as an option on the site, but I’m not encouraging people to use it. (As it stands, the user writes the script and sends it, I debug the script [assuming there are errors] and return the debugged version, and they resubmit it. I end up charging $25 to debug their script, which isn’t good for me or for them.)
The third level is the one that takes us to the point where (almost) any question imaginable about the data can be answered. At the third level, which is actually the easiest of all to use, the user simply tells me what they want. I then create the necessary variables, some via the scripting language and some otherwise, run the tabulation, and send it to them. At all levels, the tabulations are somewhat dynamic: after the first tabulation, the user learns something about MAST’s capabilities and the data, I learn more about the user’s needs and the data, and we decide to do another tabulation. After a few iterations, the user has what they want.