on-device-inference
Found 2 articles with this tag
llama.cpp · Android
On-Device AI Model Deployment in Practice, Part 5 (Loading LLMs on Android)
Step-by-step JNI bridge implementation for running quantized LLMs on Android using llama.cpp.
Apr 14 · 3 min
CoreML · Apple-Intelligence
Apple's On-Device AI Moat: What It Means for Edge Builders
Apple's privacy-first, on-device AI stack may become the default for builders who need inference without cloud costs.
Apr 13 · 4 min