
Tether Data launches QVAC Fabric LLM for edge inference


Tether Data releases QVAC Fabric LLM, enabling LLM inference and fine-tuning on consumer hardware.

Tether Data has introduced QVAC Fabric LLM, a new inference runtime and fine-tuning framework that allows large language models to run, and be personalized, directly on everyday devices, including consumer GPUs, laptops, and smartphones. The framework supports on-device training and inference even on mobile GPUs such as Qualcomm Adreno and ARM Mali, which Tether describes as the first production-ready framework for modern LLM training on smartphone-class hardware. Because models can be fine-tuned locally without relying on cloud providers, users and enterprises retain privacy and control over their data.

For businesses, QVAC Fabric LLM offers a secure way to deploy and customize AI models in-house, meeting strict privacy and regulatory requirements. The framework supports heterogeneous hardware, making it easier for organizations to leverage their existing device fleets for AI workloads. Tether Data’s CEO, Paolo Ardoino, emphasized that this release advances decentralized, privacy-preserving, and ubiquitous AI, moving away from centralized cloud platforms and enabling a new generation of resilient, on-device AI applications.

Source: Tether Data

