#on-device-inference

from App Developer Magazine
9 months ago

OpenAI open-weight models released for optimized laptop performance

OpenAI has released two open-weight language models designed to operate efficiently on laptops and personal computers. These models are intended to provide advanced reasoning capabilities while allowing developers greater flexibility through local deployment and fine-tuning. Unlike proprietary models, open-weight models provide public access to trained parameters, enabling developers to adapt the models for specific tasks without access to the original training datasets. This approach improves control over AI applications and supports secure, local usage in environments with sensitive data.
Artificial intelligence
from InfoQ
4 months ago

Google Brings Gemini Nano to ML Kit with New On-Device GenAI APIs

The new GenAI APIs run Gemini Nano inference on-device in Android applications, so user data stays on the device and there are no per-request cloud inference costs.
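
To give a feel for what on-device inference with these APIs looks like, here is a minimal Kotlin sketch modeled on the ML Kit GenAI Summarization flow described in Google's beta documentation. The class and method names (SummarizerOptions, Summarization.getClient, checkFeatureStatus, runInference) follow that documentation as an assumption and may change while the APIs remain in beta.

```kotlin
// Sketch of on-device summarization with the ML Kit GenAI APIs (beta).
// Package paths and signatures follow Google's published sample and are
// assumptions here; treat this as illustrative, not canonical.
import android.content.Context
import com.google.mlkit.genai.common.DownloadCallback
import com.google.mlkit.genai.common.FeatureStatus
import com.google.mlkit.genai.common.GenAiException
import com.google.mlkit.genai.summarization.Summarization
import com.google.mlkit.genai.summarization.SummarizationRequest
import com.google.mlkit.genai.summarization.Summarizer
import com.google.mlkit.genai.summarization.SummarizerOptions
import kotlinx.coroutines.guava.await

suspend fun summarizeOnDevice(context: Context, article: String) {
    // Configure the task: input type, output format, and language.
    val options = SummarizerOptions.builder(context)
        .setInputType(SummarizerOptions.InputType.ARTICLE)
        .setOutputType(SummarizerOptions.OutputType.ONE_BULLET)
        .setLanguage(SummarizerOptions.Language.ENGLISH)
        .build()
    val summarizer: Summarizer = Summarization.getClient(options)

    // Gemini Nano may need to be downloaded once; after that, all
    // inference runs locally, so the article text never leaves the device.
    when (summarizer.checkFeatureStatus().await()) {
        FeatureStatus.AVAILABLE,
        FeatureStatus.DOWNLOADING -> runRequest(summarizer, article)
        FeatureStatus.DOWNLOADABLE -> summarizer.downloadFeature(object : DownloadCallback {
            override fun onDownloadStarted(bytesToDownload: Long) {}
            override fun onDownloadProgress(totalBytesDownloaded: Long) {}
            override fun onDownloadFailed(e: GenAiException) { /* surface error to UI */ }
            override fun onDownloadCompleted() { runRequest(summarizer, article) }
        })
        else -> { /* FeatureStatus.UNAVAILABLE: device not supported */ }
    }
}

private fun runRequest(summarizer: Summarizer, article: String) {
    val request = SummarizationRequest.builder(article).build()
    // Streaming callback: partial summary text arrives as it is generated on-device.
    summarizer.runInference(request) { newText ->
        // Append newText to the UI.
    }
}
```

The feature-status check is what makes local deployment practical: the model is fetched once through the system rather than bundled into every app, and subsequent requests are served entirely on-device.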