Data flow
Dataflow is the fundamental model by which the LabVIEW execution system runs code written in G (the graphical language used in LabVIEW). The basic philosophy is that the flow of data through the nodes of a program determines the order in which its functions execute. LabVIEW nodes (functions, structures, and subVIs) take inputs, process data, and produce outputs. Whenever all of a node's inputs contain valid data, that node is scheduled for execution by the LabVIEW execution engine, and it runs as soon as a processor becomes available (subject to the number of CPUs and the completion of previously scheduled nodes).
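Since G is graphical, the scheduling rule above cannot be shown as text, but the idea can be sketched in Python: a node becomes runnable only once every input holds valid data, and ready nodes are drawn from a queue as execution capacity allows. This is a hypothetical illustration of the concept, not LabVIEW's actual engine; all names are invented.

```python
from collections import deque

class Node:
    """A dataflow node: runs its function once all inputs hold valid data."""
    def __init__(self, name, func, num_inputs):
        self.name = name
        self.func = func
        self.inputs = [None] * num_inputs
        self.received = 0
        self.consumers = []  # (downstream node, input index) pairs fed by our output

    def wire(self, consumer, index):
        """Connect this node's output to one input of a downstream node."""
        self.consumers.append((consumer, index))

    def receive(self, index, value):
        """Accept a value on one input; report whether all inputs are now valid."""
        self.inputs[index] = value
        self.received += 1
        return self.received == len(self.inputs)

def run(sources):
    """Execute nodes in dataflow order, starting from nodes with no inputs."""
    ready = deque(sources)
    results = {}
    while ready:
        node = ready.popleft()
        out = node.func(*node.inputs)
        results[node.name] = out
        # Propagate the output; schedule any consumer whose inputs are complete.
        for consumer, idx in node.consumers:
            if consumer.receive(idx, out):
                ready.append(consumer)
    return results

# Textual equivalent of a tiny diagram: add = 2 + 3, then mul = add * 10.
two = Node("two", lambda: 2, 0)
three = Node("three", lambda: 3, 0)
add = Node("add", lambda a, b: a + b, 2)
mul = Node("mul", lambda a: a * 10, 1)
two.wire(add, 0)
three.wire(add, 1)
add.wire(mul, 0)
results = run([two, three])
print(results)  # {'two': 2, 'three': 3, 'add': 5, 'mul': 50}
```

Note that `mul` is never scheduled until `add` has produced a value, exactly as the data dependency dictates; no explicit ordering of statements is needed.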
By chaining together nodes through common inputs and outputs, the programmer arranges functions in the order in which the data should be manipulated. Often a single value, such as an error cluster or a refnum, is wired as a common "thread" through several VIs to force the order of operations.
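The error-cluster pattern can be sketched in Python: each function takes and returns an error record, so wiring that record through otherwise independent operations fixes their execution order (and lets downstream operations skip work once an error has occurred). This is a hypothetical illustration under invented names, not an actual LabVIEW API.

```python
from dataclasses import dataclass

@dataclass
class Error:
    """Mimics a LabVIEW error cluster: status flag, code, and source string."""
    status: bool = False
    code: int = 0
    source: str = ""

trace = []  # records the actual order in which the operations run

def init_hardware(error):
    if error.status:
        return error  # standard pattern: pass an incoming error through untouched
    trace.append("init")
    return error

def take_reading(error):
    if error.status:
        return error
    trace.append("read")
    return error

def shutdown(error):
    if error.status:
        return error
    trace.append("shutdown")
    return error

# The error "wire" threads through all three calls, so they cannot be
# reordered even though none consumes the others' data outputs.
err = shutdown(take_reading(init_hardware(Error())))
print(trace)  # ['init', 'read', 'shutdown']
```

In G, the same effect comes from wiring the error-out terminal of one VI to the error-in terminal of the next.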
An example of how dataflow works, and of how LabVIEW differs from textual languages, is given in [1], a PDF titled LabVIEW and G as a Computing Language Course.
External links
- Dataflow (Wikipedia Definition)
- Dataflow language (Wikipedia Definition)