How to Connect OpenClaw to Oracle OCI LLM Models

This guide explains how to connect OpenClaw to Large Language Models hosted and ready to use on the Oracle Cloud Infrastructure Generative AI service, using a self-hosted LiteLLM instance as a proxy layer, without deploying a dedicated cluster. Oracle provides a range of ready-to-use, on-demand models. The logical flow is: OpenClaw → LiteLLM → Oracle OCI Generative AI. OpenClaw never talks directly to OCI: it speaks OpenAI-compatible APIs to LiteLLM, and LiteLLM translates and forwards requests to Oracle’s Generative AI…
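The flow above can be sketched as a minimal LiteLLM proxy config. This is only an illustration: the model id, region, and OCI credential field names are assumptions; check LiteLLM's OCI provider documentation for the exact parameters your LiteLLM version expects.

```yaml
# config.yaml — minimal LiteLLM proxy sketch (assumed field names)
model_list:
  - model_name: oci-command-r          # alias OpenClaw will request via the OpenAI API
    litellm_params:
      model: oci/cohere.command-r-plus # hypothetical OCI GenAI model id
      oci_region: eu-frankfurt-1       # your OCI region
      oci_compartment_id: os.environ/OCI_COMPARTMENT_ID
```

With the proxy running (`litellm --config config.yaml`, default port 4000), OpenClaw is then pointed at the proxy's OpenAI-compatible endpoint (e.g. `http://localhost:4000/v1`) with `oci-command-r` as the model name.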


Ceph is amazing. Just don’t ask it to be Lustre

(And Yes, It Can Still Power AI and SAN Workloads) Ceph is one of the most powerful open-source storage platforms available today. It offers object, block, and file storage in a single distributed system, with high availability, strong durability, and the ability to scale on standard hardware. That alone makes Ceph exceptional. But here is the uncomfortable truth: Ceph can scale massively and still be the wrong storage for a specific workload. Understanding why is the difference between a great architecture…


Lustre Storage Service on Oracle OCI: A Game-Changer for High-Performance Data Management

I’m excited to share something really cool with you all! Oracle has just released a new Lustre storage service on their Cloud Infrastructure (OCI), and I think it’s going to change the way we handle large-scale data storage and processing. Whether you’re in HPC or AI/ML development, this new service has some serious potential. What’s Lustre Storage? Lustre is a parallel file system designed for high-performance workloads. It’s perfect for managing and processing massive amounts of data—think huge simulations, big…
