Claris FileMaker and Private AI Part 2: DeepSeek R1
Previously, we discussed how you can run your own private AI service powered by Ollama using the llama3.2 model on commodity Apple hardware. However, many other publicly available models work with Ollama, including Google's Gemma, Alibaba Cloud's Qwen, Mistral's Devstral, Microsoft's Phi and, perhaps the most prominent to the general public, DeepSeek's R1.
Why DeepSeek R1?
As a new and relatively unknown Chinese startup, DeepSeek garnered significant media attention with the release of the R1 model in January 2025. It had previously been assumed that the large American mega-cap tech companies' heavy spending on cutting-edge infrastructure built around Nvidia's most advanced chips would provide a significant technological 'moat' against other entrants into the AI market.
DeepSeek's principal innovation was to build a model capable of simulating how humans reason through problems for a fraction of the cost of comparable American models (~$6m versus over $100m for OpenAI's GPT-4). Even more significantly, DeepSeek uses less memory than rival models, so running it is less costly for users.
Running DeepSeek's R1 Locally
Ollama works with multiple models, and once it is set up it is trivial to swap between them, allowing developers to evaluate performance and accuracy and determine which model is the best fit for their application.
1. Choosing the right model
- deepseek-r1:1.5b - 1.1GB size; should run with reasonable performance on an 8GB M series Apple processor (or better).
- deepseek-r1:8b - 5.2GB size; should run with reasonable performance on a 16GB M series Apple processor (or better).
- deepseek-r1:32b - 30GB size; will likely require a 32GB M series Apple processor (or better).
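As a rough sketch, the sizing guidance above can be expressed as a small helper. The function name and the hard 8GB minimum are our own assumptions, not part of Ollama:

```python
def pick_deepseek_tag(ram_gb: int) -> str:
    """Map installed RAM (GB) to the largest deepseek-r1 tag from the list above."""
    if ram_gb >= 32:
        return "deepseek-r1:32b"   # 30GB download; needs a 32GB machine or better
    if ram_gb >= 16:
        return "deepseek-r1:8b"    # 5.2GB download
    if ram_gb >= 8:
        return "deepseek-r1:1.5b"  # 1.1GB download
    raise ValueError("around 8GB of RAM is a practical minimum for local inference")

print(pick_deepseek_tag(24))  # a 24GB M3 MacBook lands on deepseek-r1:8b
```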
In some basic real-world tests on a 24GB M3 MacBook, a query against deepseek-r1:1.5b takes around 9 seconds to execute, while the same query against deepseek-r1:8b takes around a minute.
The models can all be downloaded from Ollama.com and installed in the same manner as Llama.
i.e. Download and install Ollama, and then open Terminal and run:
ollama run deepseek-r1:1.5b
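Once the model is running, Ollama also exposes a local HTTP API (on port 11434 by default) that FileMaker, or any other client, can call. The following Python sketch builds a non-streaming request to the /api/generate endpoint; the helper names and prompt are illustrative:

```python
import json

# Ollama's local API endpoint (default port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> str:
    """Return the JSON body for a non-streaming /api/generate call."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON reply, not chunks
    })

def extract_response(body: str) -> str:
    """Pull the generated text out of Ollama's JSON reply."""
    return json.loads(body)["response"]

if __name__ == "__main__":
    payload = build_generate_request("deepseek-r1:1.5b", "Why is the sky blue?")
    print(payload)
    # To actually send it (requires Ollama running locally):
    # import urllib.request
    # req = urllib.request.Request(OLLAMA_URL, payload.encode(),
    #                              {"Content-Type": "application/json"})
    # print(extract_response(urllib.request.urlopen(req).read().decode()))
```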
DeepSeek R1 vs Llama vs other models
So why use DeepSeek R1 instead of Llama? Given the seemingly magical responses that LLMs can generate, it is easy to forget that they are black-box solutions: they can hallucinate, and they are all built on different training data sets which may carry different inherent biases. Equally importantly, they have not all been optimised for exactly the same tasks. It is therefore important to evaluate different real-world applications using your own data to determine which is the best fit. With this in mind, some general observations can be made:
DeepSeek Strengths
✅ Pros:
• MIT License: True open-source — even commercial use is permitted.
• Strong coding performance, especially for Python, JavaScript, Go, etc.
• Excellent at structured generation (JSON, YAML, config files).
• Well-optimised for multi-turn reasoning, chaining logic across instructions.
❌ Cons:
• Less fluent in creative or open-ended prose than LLaMA 3 or Mistral.
• Slightly heavier memory footprint at the same parameter size (esp. compared to Mistral).
• Not trained with as much Reinforcement Learning from Human Feedback (RLHF) tuning for alignment, so you may need to fine-tune for sensitive apps.
Hosting Options: On-Premise or Private Cloud
If your Claris FileMaker Server runs on an Apple Silicon Mac or a compatible GPU-based machine, you can host Ollama to run DeepSeek R1 on the same system. Alternatively, we can help you migrate to a private cloud cluster for high-performance, scalable AI—without sacrificing data privacy or control.
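When both run on the same system, a FileMaker solution can call the local Ollama API directly with the Insert from URL script step. The fragment below is a sketch only; the field, variable and prompt contents are hypothetical:

```
# Hypothetical FileMaker script fragment; names are illustrative
Set Variable [ $json ; Value:
    JSONSetElement ( "{}" ;
        [ "model" ; "deepseek-r1:1.5b" ; JSONString ] ;
        [ "prompt" ; "Summarise: " & Notes::Body ; JSONString ] ;
        [ "stream" ; False ; JSONBoolean ]
    ) ]
Insert from URL [ Target: $result ; "http://localhost:11434/api/generate" ;
    cURL options: "-X POST -H \"Content-Type: application/json\" -d @$json" ]
Set Variable [ $answer ; Value: JSONGetElement ( $result ; "response" ) ]
```

Because the request never leaves the machine, record data stays entirely within your own infrastructure.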
What’s Next?
👉 Explore real-world use cases and benefits in our detailed post:
👉 Supercharge FileMaker with Private AI: Integrating Local LLMs like Llama 3
Ready to Bring DeepSeek-R1 AI Into Your Business?
At DataTherapy, we specialise in integrating private AI with Claris FileMaker. Whether you’re just exploring or ready to go all-in on secure, AI-enhanced automation—we can help.
✅ Certified Claris FileMaker Developers
✅ UK-based team of full-time professionals
✅ Platinum Claris Partner
✅ Experts in secure, on-premise and private cloud AI deployment
📞 Contact us today for a free consultation and discover how local AI can transform your business—without ever sending your data to the public cloud.

