(same as previous post - this time with "[sbml-discuss]" in title ... sorry).
I was given an action item at the last SBML workshop to clarify the meaning of the "fast" attribute for reactions within SBML. In response, I have both a concise definition (below) and a supplemental guide that discusses some of the implementation issues for continuous systems (see attached).
---- A concise definition that could be included in spec ----
The set of reactions that have the "fast" attribute set to "true" defines those reactions whose time scales are sufficiently fast, relative to the remaining reactions, that together they form a subsystem that is well described by a pseudo steady state approximation. Under this approximation, any initial condition or perturbation away from the pseudo steady state is assumed to relax infinitely fast. It is important to note that the correctness of this approximation requires a significant separation of time scales.
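To make the definition concrete, here is a minimal sketch (not from the spec text, and the reaction and rate names are hypothetical) of what the PSSA means for a fast reversible isomerization A <-> B: the differential equations for the fast pair are replaced by the algebraic constraint kf*A = kr*B, applied to the conserved total T = A + B.

```python
# Hypothetical illustration of the pseudo steady state approximation (PSSA)
# for a fast reversible reaction A <-> B with rate constants kf and kr.
# Under the PSSA, kf*A = kr*B holds at all times, so the fast pair is
# reduced to an algebraic partition of the conserved total T = A + B.

def pssa_partition(total, kf, kr):
    """Partition total = A + B assuming A <-> B is at fast equilibrium."""
    keq = kf / kr              # equilibrium constant of the fast reaction
    a = total / (1.0 + keq)    # from A + keq*A = total
    b = total - a
    return a, b

a, b = pssa_partition(10.0, kf=100.0, kr=50.0)
# detailed balance of the fast reaction holds: kf*a == kr*b
```

The slow reactions then see only T; whenever T changes, A and B are re-partitioned instantaneously.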
----- Some Discussion -----
I didn't include (in the attachment) any discussion of automated time-scale analysis algorithms that could determine which reactions should be "fast". The "fast" attribute is a modeling assumption to be encoded in a model, rather than a numerical technique used to solve the "full" problem. Although the Virtual Cell implements the PSSA somewhat differently (as a time-splitting method that is very useful for solving PDEs; without splitting, some inconvenient spatial operators can appear), the paper that I provided describes an approach to formulating these systems as a traditional DAE (although the derivation is not rigorous).
On the one hand, just like many other aspects of SBML models, consistency of modeling assumptions (e.g. proper use of HMM kinetics) is not yet within the scope of the SBML language itself. On the other hand, when introducing a "new" feature (although "fast" is not technically new to SBML), a "best practices" guide should be appropriately informative.
In one class of applications, the modeler explicitly introduces one or more reactions that are assumed to be in "fast equilibrium", where the actual time scale is assumed to be much faster than the other dynamics but need not be exactly known (e.g. only Kd's are known, rather than both kforward and kreverse).
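For this class, only the dissociation constant is needed. As a hedged sketch (my own illustration, not part of the proposal; the function name is hypothetical), the bound complex for fast binding A + B <-> C follows from Kd = [A][B]/[C] together with the two conservation relations:

```python
import math

def bound_complex(a_total, b_total, kd):
    """Equilibrium complex concentration for fast binding A + B <-> C,
    given only Kd. With At = A + C and Bt = B + C, Kd = A*B/C becomes
    C^2 - (At + Bt + Kd)*C + At*Bt = 0; the smaller root is physical
    (the larger root would exceed min(At, Bt))."""
    s = a_total + b_total + kd
    return (s - math.sqrt(s * s - 4.0 * a_total * b_total)) / 2.0

c = bound_complex(a_total=1.0, b_total=1.0, kd=0.1)
a = 1.0 - c   # free A
b = 1.0 - c   # free B
# by construction, a*b/c recovers kd
```

Note that neither kforward nor kreverse appears; the "fast" flag is precisely what licenses discarding their individual values.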
In another class of applications, the time scales of all processes are well characterized, but some are inconveniently fast and very well separated from the set of slower dynamics in the model. In this class, the PSSA can be used to reduce the order of the system (at minimum, to eliminate one parameter, kforward or kreverse) and to allow an efficient solution in which any initial fast transients are not resolved (like a fast boundary layer) but the subsequent dynamics are faithfully reproduced.
So, when is "fast" fast enough? When does a modeling assumption become sufficiently justified (or wholly inappropriate)? There are techniques for determining time scales based on a local linearization along a trajectory (evaluating Jacobians along the solution) that I have seen referenced but have not investigated myself. There are other techniques from combustion (lots of fast intermediates), and other more heuristic methods of time-scale-based model reduction that we have kicked around. A problem is that for nonlinear systems, the time scales are time- and state-dependent. For automatically analyzing time scales, maybe others in the SBML community have more actual experience (I could look into this, but other obligations are calling me).
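The Jacobian-based idea can be sketched in a few lines (my own toy example, not a recommendation from the attachment; the Jacobian entries are made up). The reciprocals of the Jacobian eigenvalue magnitudes estimate the local time scales, and a large fast/slow ratio is what would justify the "fast" flag at that point of the trajectory:

```python
import math

def timescales_2x2(j11, j12, j21, j22):
    """Local time scales from the eigenvalues of a 2x2 Jacobian,
    assuming a real spectrum (typical for the stiff, stable cases
    of interest). Returns (tau_fast, tau_slow) with tau = 1/|lambda|."""
    tr = j11 + j22
    det = j11 * j22 - j12 * j21
    disc = math.sqrt(tr * tr - 4.0 * det)   # real when tr^2 >= 4*det
    lam1 = (tr + disc) / 2.0
    lam2 = (tr - disc) / 2.0
    taus = sorted(1.0 / abs(lam) for lam in (lam1, lam2))
    return taus[0], taus[1]

# Hypothetical Jacobian with one fast mode (~ -100) and one slow mode (~ -1),
# as might be evaluated at one point along a solution trajectory.
tau_fast, tau_slow = timescales_2x2(-100.0, 1.0, 1.0, -1.0)
ratio = tau_slow / tau_fast   # a large separation supports the PSSA
```

Because the Jacobian changes along the trajectory of a nonlinear system, this check would have to be repeated at many states, which is exactly why the analysis is nontrivial to automate.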
The best answer is that this option should not be used unless the modeler is aware of the implications.
Software Lead - Virtual Cell Project (http://vcell.org)
Richard D. Berlin Center for Cell Analysis and Modeling
University of Connecticut Health Center
263 Farmington Ave, MC-1507
Farmington, CT 06030