Latest Updates
- Inferencer v1.6.1 with Multi-Windows and Distributed Compute for OpenAI
This update introduces support for multi-window sessions, distributed compute compatibility for OpenAI-compatible APIs, code block copying, and more.
- Inferencer v1.6 with Distributed Compute
With distributed compute, you can pool the memory of two Macs together for inferencing larger models.
- Inferencer v1.5.3 with Xcode Intelligence
Support for agentic code writing and interaction with Xcode Intelligence.
- Inferencer v1.5 with Private Server
Private inference serving over the network and internet.
- Inferencer v1.4 with Shortcuts.app integration
Generation queueing, Shortcuts.app integration, and faster startup times.
- Inferencer v1.3 with token entropy and memory offloading
Token entropy inspector improvements and a new memory offloading feature to allow for inferencing models larger than available memory.
- Inferencer v1.2 with token inspection and memory savings
Version 1.2 is now available for download on the Mac App Store. This update introduces major memory savings and improvements to the token inspector.
- Inferencer v1.1 with markdown rendering
Version 1.1 is now available for download on the Mac App Store. This update introduces markdown rendering support, allowing messages to display tables, headings, and mathematical notation with proper formatting.
- Inferencer for macOS v1.0 now available
Version 1.0 is now available for download on the Mac App Store. It’s built with native code, and in my testing it’s the fastest inferencer available on Mac, even with the added support for token inspection.
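Because distributed compute speaks an OpenAI-compatible API, any standard client can talk to it. Here is a minimal sketch using only the Python standard library; the host, port, and model name are assumptions for illustration — check the app for the actual serving address.

```python
import json
import urllib.request

# Hypothetical local endpoint; the actual host/port shown in the app may differ.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="local-model"):
    """Build a standard OpenAI-style chat completion payload."""
    return {
        "model": model,  # placeholder name; use the model loaded in the app
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(prompt, base_url=BASE_URL):
    """POST the payload to an OpenAI-compatible /chat/completions route."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: first choice's message content
    return body["choices"][0]["message"]["content"]
```

Because the wire format follows the OpenAI spec, official OpenAI SDKs pointed at the same base URL should work equally well.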
Subscribe for updates
With more features coming soon, subscribe to be the first to know.