Data flow
Dataflow is the fundamental concept underlying how the LabVIEW execution system runs code written in G (the graphical language used in LabVIEW). The basic principle is that the flow of data through the nodes of a program determines the order in which its functions execute. LabVIEW nodes (functions, structures, and subVIs) take inputs, process data, and produce outputs. Whenever all of a node's inputs contain valid data, the node is scheduled for execution by the LabVIEW execution engine, and it runs as soon as a processor becomes available (subject to the number of CPUs and the completion of previously scheduled nodes).
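To make this scheduling rule concrete, the following is a minimal sketch in Python, not a description of LabVIEW's actual engine: the Node class and run function are hypothetical, and the only point illustrated is that a node fires once all of its inputs hold valid values.

```python
from collections import deque

# Simplified dataflow model: a node is ready to run as soon as every
# one of its named inputs holds a value (the "valid data" rule above).

class Node:
    def __init__(self, name, func, inputs):
        self.name = name        # label for the node's output "wire"
        self.func = func        # operation applied to the input values
        self.inputs = inputs    # names of the wires feeding this node

def run(nodes, sources):
    values = dict(sources)      # wire values that are already valid
    pending = deque(nodes)      # nodes that have not yet executed
    order = []
    while pending:
        node = pending.popleft()
        if all(i in values for i in node.inputs):
            # All inputs valid: the node fires and its output becomes valid.
            values[node.name] = node.func(*(values[i] for i in node.inputs))
            order.append(node.name)
        else:
            pending.append(node)  # wait until upstream nodes have run
    return values, order

# Two chained nodes, as on a block diagram: "sum" must fire before "product".
nodes = [
    Node("product", lambda s, c: s * c, ["sum", "c"]),
    Node("sum", lambda a, b: a + b, ["a", "b"]),
]
values, order = run(nodes, {"a": 1, "b": 2, "c": 3})
print(order)              # ['sum', 'product']
print(values["product"])  # 9
```

Note that "product" appears before "sum" in the node list, yet it still executes second, because its input wire only becomes valid after "sum" fires.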
After a node executes, the values on its output wires become valid and "flow" to the nodes wired to those outputs as inputs. By chaining together nodes with compatible inputs and outputs, the programmer controls the order in which the nodes execute. Often a single value, such as an error cluster or a refnum, is wired as a common "thread" through several VIs to force the order of operations, whether or not each node actually needs to modify the data passing through it.
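As a rough textual analogy of this "threading" pattern, the sketch below passes a stand-in for the error cluster through a chain of functions. The function names (open_file, write_log, close_file) and the error dictionary are hypothetical placeholders; the point is that because each call consumes the previous call's output, the data dependency alone fixes the order of execution.

```python
# A stand-in for the error cluster: status flag, error code, and source.
def open_file(error):
    if not error["status"]:
        print("open")           # runs only if no upstream error occurred
    return error

def write_log(error):
    if not error["status"]:
        print("write")
    return error

def close_file(error):
    if not error["status"]:
        print("close")
    return error

# Threading the same value through every call forces the order
# open -> write -> close, much as wiring the error cluster through
# a chain of subVIs fixes their order on a block diagram.
error = {"status": False, "code": 0, "source": ""}
error = close_file(write_log(open_file(error)))
```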
An example of how dataflow works and how LabVIEW differs from textual languages is given in [1], a PDF titled LabVIEW and G as a Computing Language Course.
External links
- Dataflow (Wikipedia Definition)
- Dataflow language (Wikipedia Definition)