Private AI on Oracle OCI: How to Stop Sending Company Data to Public LLMs

There is something that bothers me every time I see a company adopting AI tools without thinking much about what happens to their data. You open ChatGPT, you paste an internal document, you ask for a summary or a rewrite, and that's it: the data has left the building. Where it goes, who reads it, whether it is used to train future models; these are questions most people don't ask until it becomes a problem. I work a lot…


How to Connect OpenClaw to Oracle OCI LLM Models

This guide explains how to connect the Large Language Models hosted and ready to use on the Oracle Cloud Infrastructure (OCI) Generative AI service to OpenClaw, using a self-hosted LiteLLM instance as a proxy layer, with no need to deploy a dedicated cluster. Oracle provides a range of ready-to-use models on demand.

Logical flow: OpenClaw → LiteLLM → Oracle OCI Generative AI. OpenClaw never talks directly to OCI: it speaks the OpenAI-compatible API to LiteLLM, and LiteLLM translates and forwards requests to Oracle's Generative AI…
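To make the flow concrete, here is a minimal sketch of the OpenAI-compatible request a client such as OpenClaw would send to the LiteLLM proxy. The base URL, model alias, and API key below are illustrative placeholders, not values from the guide; LiteLLM is assumed to map the alias to an OCI Generative AI model in its own config.

```python
import json
import urllib.request

# Assumed local LiteLLM proxy endpoint and a model alias that LiteLLM
# maps to an OCI Generative AI model -- both are illustrative placeholders.
LITELLM_BASE_URL = "http://localhost:4000"
MODEL_ALIAS = "oci-chat-model"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build the OpenAI-compatible chat completion request that a client
    like OpenClaw would POST to the LiteLLM proxy. OCI credentials live
    on the LiteLLM side; the client only ever sees this endpoint."""
    payload = {
        "model": MODEL_ALIAS,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{LITELLM_BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Key checked by LiteLLM itself; it never reaches Oracle.
            "Authorization": "Bearer sk-litellm-placeholder",
        },
        method="POST",
    )

req = build_chat_request("Summarize this internal document.")
print(req.full_url)  # http://localhost:4000/v1/chat/completions
```

Because the request is plain OpenAI wire format, swapping the backing model is a LiteLLM config change; the client code above never needs to know about OCI.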


Ceph is amazing. Just don’t ask it to be Lustre

(And Yes, It Can Still Power AI and SAN Workloads) Ceph is one of the most powerful open-source storage platforms available today. It offers object, block, and file storage in a single distributed system, with high availability, strong durability, and the ability to scale on standard hardware. That alone makes Ceph exceptional. But here is the uncomfortable truth: Ceph can scale massively and still be the wrong storage for a specific workload. Understanding why is the difference between a great architecture…
