GLA Summit 2022/Deep Learning and GPU Acceleration with LabVIEW

Deep Learning and GPU Acceleration with LabVIEW by Ish Gevorgyan and Alik Sargsyan

This presentation covers the use of GPUs to accelerate deep learning applications in LabVIEW. It introduces the Deep Learning Toolkit for LabVIEW (DeepLTK) and CuLab (GPU Toolkit for LabVIEW), and shows how to develop, train, and deploy custom deep learning models and how to accelerate computationally intensive LabVIEW code on GPUs.

Presentation Links

See Also

External Links