schema: 1

bugzilla:
  product: Core
  component: "Machine Learning: On Device"

origin:
  name: wllama
  description: WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
  url: https://github.com/ngxson/wllama
  release: 2.3.1
  revision: e4bd5e79b316d0b40c71322b426116ec43f8b93e
  license: MIT
  license-file: LICENSE
  notes: See README.md for more details.