OCP Articles: Machine Learning & Analytics
- Running Analytics on OpenShift with Daikon (11/05/2016),
- Running Spark Jobs On OpenShift (20/01/2017),
- Intro to Machine Learning using Tensorflow – Part 1 (25/01/2017),
- Jupyter on OpenShift: Using OpenShift for Data Analytics (11/04/2017),
- Jupyter on OpenShift Part 2: Using Jupyter Project Images (12/04/2017),
- Jupyter on OpenShift Part 3: Creating a S2I Builder Image (14/04/2017),
- Jupyter on OpenShift Part 4: Adding a Persistent Workspace (17/04/2017),
- Jupyter on OpenShift Part 5: Ad-hoc Package Installation (19/04/2017),
- Jupyter on OpenShift Part 6: Running as an Assigned User ID (21/04/2017),
- Jupyter on OpenShift Part 7: Adding the Image to the Catalog (24/04/2017),
- Machine Learning on OpenShift and Kubernetes (20/12/2017),
- Deploy Deep Learning Model on OpenShift/Kubernetes (24/10/2018),
- Getting Started with Machine Learning (14/12/2018),
- Why Data Scientists Love Kubernetes (18/12/2018),
- Machine Learning as a Service (30/01/2019),
- Kubeflow on OpenShift (12/03/2019),
- Deploying a scalable, shared Data Science platform at Université Laval (04/09/2019),
- Kubeflow + OpenShift Container Platform + Dell EMC Hardware: A Complete Machine Learning Stack (12/09/2019),
- Training and Deploying Machine Learning Models with Containers and MiniShift (09/2019),
- Deep Learning Performance on Red Hat OpenShift 3.11 (04/10/2019),
- Building an open ML platform with Red Hat OpenShift and Open Data Hub Project (15/10/2019),
- Microbenchmarks for AI applications using Red Hat OpenShift on PSI in project Thoth (28/10/2019),
- AI/ML pipelines using Open Data Hub and Kubeflow on Red Hat OpenShift (16/12/2019),
- Automated data pipeline using Ceph notifications and KNative Serving (09/01/2020),
- Installing Kubeflow v0.7 on OpenShift 4.2 (10/02/2020),
- Open Data Hub v0.5.1 released (26/02/2020),
- CodeReady Containers with GPU for Data Science (08/03/2020),
- OpenShift 4.3: Retest Static IP configuration on vSphere (16/03/2020),
- Run an automated ML pipeline with Ceph Bucket Notifications, TensorFlow and Flask using Openshift (05/04/2020),
- Open Data Hub 0.6 brings component updates and Kubeflow architecture (07/05/2020),
- Open Data Hub 0.6.1: Bug fix release to smooth out redesign regressions (02/06/2020),
- OpenShift for Machine Learning / Deep Learning (part 1) (17/07/2020),
- OpenShift for Machine Learning / Deep Learning (part 2) (19/07/2020),
- Open Data Hub and Kubeflow installation customization (23/07/2020),
- From notebooks to pipelines: Using Open Data Hub and Kubeflow on OpenShift (29/07/2020),
- Validating Distributed Multi-Node Autonomous Vehicle AI Training with NVIDIA DGX Systems on OpenShift with DXC Robotic Drive (29/07/2020),
- Open Data Hub 0.7 adds support for Kubeflow 1.0 (13/08/2020),
- Kubeflow 1.0 monitoring and enhanced JupyterHub builds in Open Data Hub 0.8 (18/09/2020),
- Run your Spark data processing workloads using OpenDataHub, OCS, and an external Ceph cluster (22/09/2020),
- AI/ML Industrial Edge Solution Blueprint v1.0 (11/11/2020),
- Configure Code Ready Workspace for Developing Machine Learning Workflow (02/12/2020),
- Serving Machine Learning Models on OpenShift: Part 1 (15/12/2020),
- Machine Learning Model Monitoring on OpenShift Kubernetes (21/12/2020),
- Run Your Business Intelligence Using Presto & Superset, Backed By OpenDataHub and OCS Openshift Operators (22/12/2020),
- Business Centric AI/ML With Kubernetes – Part 2: Data Preparation (11/02/2021),
- Business Centric AI/ML With Kubernetes – Part 3: GPU Acceleration (18/02/2021),
- Business Centric AI/ML With Kubernetes – Part 4: ML/OPs (25/02/2021),
- Introducing Red Hat OpenShift Data Science (27/04/2021),
- Operationalizing Kubeflow in OpenShift (25/05/2021),
- How to install Kubeflow 1.2 on Red Hat OpenShift (28/05/2021),
- Applying machine learning to GitOps (29/06/2021),
- Inference Scaling with OpenVINO™ Model Server in Kubernetes and OpenShift Clusters (05/07/2021),
- Open Data Hub 1.1.0 provides new JupyterHub capabilities and more (26/07/2021),
- Getting Started running Spark workloads on OpenShift (16/09/2021),
- Building machine learning models in the cloud (16/11/2021),
- Build and deploy an object detection model using OpenShift Data Science (22/11/2021),
- Access more data from your Jupyter notebook (22/11/2021),
- Enterprise MLOps Reference Design (02/12/2021),
- More machine learning with OpenShift Data Science (23/12/2021),
- How to build and operate cloud-native AI with Open Data Hub (11/01/2022),
- Image Classification machine learning in RHODS Sandbox under 5 minutes (26/01/2022),
- Learn how to build, train, and run a PyTorch model (23/03/2022),
- Configure CodeReady Containers for AI/ML development (06/04/2022),
- How to use Kubeflow and the MPI Operator on OpenShift (12/04/2022),
- How to install an open source tool for creating machine learning pipelines (05/05/2022),
- Your Guide to the Red Hat Data Science Model Lifecycle (09/05/2022),
- Open source edge detection with OpenCV and Pachyderm (01/06/2022),
- A Guide to OpenShift Machine Pool Design Strategies for Red Hat OpenShift Data Science (08/06/2022),
- Red Hat Openshift Database Access (RHODA) Integration with Jupyter Notebook (23/06/2022),
- Develop faster, operate smart: A Kubernetes-native guide to AI application development (22/08/2022),
- The Future of AI, Security, and the Edge (08/09/2022),
- Boost OpenShift Data Science with the Intel AI Analytics Toolkit (21/09/2022),
- Perform inference using Intel OpenVINO Model Server on OpenShift (30/09/2022),
- IBM announces Embeddable AI (28/10/2022),
- A Guide to GPU-enhanced, Text-to-Speech Model Training with Red Hat OpenShift Data Science and Coqui TTS (07/11/2022),
- Why GPUs are essential for AI and high-performance computing (21/11/2022),
- Red Hat Adds New Collaboration and MLOps Capabilities in Red Hat OpenShift Data Science (29/11/2022),
- How I’ve Used ChatGPT To Create A Whole GitOps Demo Using 3 Simple Questions (11/12/2022),
- A Guide to Scaling OpenShift Data Science to Hundreds of Users and Notebooks (13/12/2022),
- Getting started with data science and machine learning: what architects need to know (11/01/2023),
- Expanding OpenShift Data Science Support for On-Premise Deployments (19/01/2023),
- Scaling Model Serving with Red Hat OpenShift Data Science (20/02/2023),
- Serving and Monitoring Models across Clusters (23/02/2023),
- How to Resolve Jupyter Notebook Spawning Issues in Red Hat OpenShift Data Science, couldn’t parse image reference “:py3.8-v1” (07/04/2023),
- Data scientists’ dream team: Red Hat and Intel AI and Machine Learning (27/04/2023),
- AI, ML, ChatOps, and MLOps: a Real Understanding of Artificial Intelligence (10/05/2023),
- Deploying OpenAI’s Large Language Models on OpenShift (12/05/2023),
- Leveraging OpenAI’s LLM on OpenShift for Technical Q&A: A Case Study with Red Hat Documentation (14/05/2023),
- Demo: 3 Exciting Ways to Leverage Locally Served Large Language Models (15/05/2023),
- Integrating OpenAI with OpenShift: Managing Your Cluster via ChatOps (16/05/2023),
- AI/ML Models Batch Training at Scale with Open Data Hub (15/05/2023),
- Exploring AI/ML Tools on OpenShift (11/05/2023),
- Leveraging ODF for AI Workloads on OpenShift (15/05/2023),
- Scaling AI Workloads on OpenShift: Techniques and Best Practices (17/05/2023),
- Deploying and Running Argo Workflows for AI/ML on OpenShift (17/05/2023),
- Leveraging Tekton for AI Workloads on OpenShift (17/05/2023),
- Securing AI Workloads on OpenShift with Native Tooling (17/05/2023),
- Solving the Access Issue for the `kubeadmin` User in Red Hat OpenShift Data Science Project. Unable to access Settings page (18/05/2023),
- How to use OpenShift Data Science for fraud detection (22/05/2023),
- Fine-tune large language models using OpenShift Data Science (19/06/2023),
- Fine-tuning and serving foundation models (20/06/2023),
- Run machine learning workflows with Red Hat OpenShift Pipelines (24/06/2023),
- Voice Cloning and TTS with IMS-Toucan and Red Hat OpenShift Data Science (28/07/2023),
- Deploying AI/ML Models in Kubernetes using Seldon Core, Istio and MetalLB (15/08/2023),
- Generative AI on Red Hat OpenShift 4 (18/08/2023),
- Accelerating Data Science workflows with Red Hat Data Science (28/08/2023),
- Installing OpenShift Data Science (ODS) in Self-Managed Environment (28/08/2023),
- Fine-tuning and Serving an open source foundation model with Red Hat OpenShift AI (18/09/2023),
- Journey into the Future with OpenShift Data Science (02/10/2023),
- Open Source MLOps with ZenML and OpenShift (04/10/2023),
- A guide to ClearML on OpenShift (05/10/2023),
- Unleashing the Full Potential of RHODS for Model Fine-tuning and Inferencing (15/10/2023),
- Deploying an AI ChatBot in Azure Red Hat OpenShift fully integrated with Azure OpenAI (31/10/2023),
- A personal AI assistant for developers that doesn’t phone home (06/11/2023),
- Quantifying performance of Red Hat OpenShift for Machine Learning (ML) Training on Supermicro A+ Servers with MLPerf Training v3.1 (23/11/2023),
- Supercharging chaos testing using AI (08/01/2024),
- Multilingual semantic similarity search with Elasticsearch (10/01/2024),
- How to install ClearML Enterprise on Red Hat OpenShift (11/01/2024),
- Evaluating LLM inference performance on Red Hat OpenShift AI (16/01/2024),
- Continuous performance and scale validation of Red Hat OpenShift AI model-serving stack (17/01/2024),
- Unlocking the power of generative AI with Cloudera Data Platform and Red Hat OpenShift (18/01/2024),
- Implement MLOps with Kubeflow Pipelines (25/01/2024),
- MLRun Community Edition on Red Hat OpenShift (01/02/2024),
- How to integrate Quarkus applications with OpenShift AI (06/02/2024),
- Stop paying for your code copilots, build your own with Openshift and Ollama (12/02/2024),
- Let's Talk about Inferencing and Monitoring Open Source LLMs on Red Hat OpenShift AI — Episode 1 (24/02/2024),
- GitOps in Action: Building and Deploying AI Chatbots on Azure Red Hat OpenShift with Azure OpenAI (07/03/2024),
- Empower conversational AI at scale with KServe (15/03/2024),
- Persisting Python Packages in OpenShift AI Workbench with Custom PIP_TARGET and PYTHONPATH (18/04/2024),
- Accelerating generative AI adoption: Red Hat OpenShift AI achieves impressive results in MLPerf inference benchmarks with vLLM runtime (24/04/2024),
- Create a Red Hat OpenShift AI environment (26/04/2024),
- Integrating External Applications with JupyterHub / Red Hat OpenShift AI Using jupyter-server-proxy (26/04/2024),
- Red Hat OpenShift AI installation and setup (01/05/2024),
- Model training in Red Hat OpenShift AI (02/05/2024),
- Prepare and label custom datasets with Label Studio (02/05/2024),
- Deploy computer vision applications at the edge with MicroShift (03/05/2024),
- Run Openshift pod as sudo user on Openshift 4 (06/05/2024),
- Running Watsonx on ROSA with an integrated application pipeline for Generative AI driven app modernization (15/05/2024),
- Implement AI-driven edge to core data pipelines (24/05/2024),
- Quick Testing of Data Science Applications on Red Hat OpenShift AI without creating container image (06/06/2024),
- How to integrate and use RStudio Server on OpenShift AI (06/06/2024),
- Sustainability in the Age of Local LLMs: Who’s Watching the Electricity Bill? (10/06/2024),
- Copilot OpenShift Operations using Ansible Lightspeed (23/06/2024),
- How to install KServe using Open Data Hub (27/06/2024),
- Manage deep learning models with OpenVINO Model Server (03/07/2024),
- Elevating business processes with OpenShift AI and agentic workflows (09/07/2024),
- Protecting your models made easy with Authorino (22/07/2024),
- Try OpenShift AI and integrate with Apache Camel (22/07/2024),
- Creating an AI powered service for detecting fraudulent card transactions (29/07/2024),
- TrustyAI Detoxify: Guardrailing LLMs during training (01/08/2024),
- Episode-XXIII TrueAI4Telco (02/08/2024),
- Getting-to-grips with NVIDIA GPUs and OpenShift (05/08/2024),
- Red Hat OpenShift AI and machine learning operations (06/08/2024),
- Agentic API introspection and execution with OpenShift AI and Llama 3.1 (08/08/2024),
- Supercharge your Red Hat OpenShift local environment with Red Hat OpenShift Lightspeed (19/08/2024),
- Making LLMs environmentally (and budget) friendly (21/08/2024),
- Infusing AI into applications using IBM watsonx.ai with Red Hat OpenShift AI (22/08/2024),
- Use Stable Diffusion to create images on Red Hat OpenShift AI on a ROSA cluster with GPU enabled (04/09/2024),
- Simple LLM Co-Pilot for OpenShift Development (07/09/2024),
- Unlocking Data Insights : Leveraging RAG to Integrate Data Sources for AI Solutions using Red Hat OpenShift AI (18/09/2024),
- Integrating LangChain with Ollama’s llama2 Model on Red Hat OpenShift AI: Guide to Hosting and Serving Custom AI Models (18/09/2024),
- How to serve embeddings models on OpenShift AI (25/09/2024),
- How to fine-tune Llama 3.1 with Ray on OpenShift AI (30/09/2024),
- Tutorial: Tool up your LLM with Apache Camel on OpenShift (04/10/2024),
- Deploy a coding copilot model with OpenShift AI (07/10/2024),
- Network observability: Optimized anomaly detection with AI/ML (08/11/2024),
- Red Hat OpenShift Lightspeed is now available as a technology preview (11/11/2024),
- An Introduction to TrustyAI (12/11/2024),
- Creating cost effective specialized AI solutions with LoRA adapters on Red Hat OpenShift AI (12/11/2024),
- How to template AI software in Red Hat Developer Hub (12/11/2024),
- LLMs and Red Hat Developer Hub: How to catalog AI assets (12/11/2024),
- Deliver generative AI at scale with NVIDIA NIM on OpenShift AI (12/11/2024),
- Taming the Wild West of Research Computing: How Policies Saved Us a Thousand Headaches (29/11/2024),
- Achieve better large language model inference with fewer GPUs (03/12/2024),
- Build and deploy a ModelCar container in OpenShift AI (30/01/2025),
- Overview: RoCE multi-node AI training on Red Hat OpenShift (30/01/2025),
- Accelerating AI Workflows with OpenShift Lightspeed Powered-by Local | Air-gapped vLLM Served by Openshift AI (18/02/2025),
- How to secure Azure credentials for OpenShift LightSpeed (24/02/2025),
- Deployment-ready reasoning with quantized DeepSeek-R1 models (03/03/2025),
- Managing Models with integrated Model Registry in OpenShift AI (09/03/2025),
- Accelerating AI value with Model-as-a-Service (19/03/2025),
- How we optimized vLLM for DeepSeek-R1 (19/03/2025),
- Experiencing DeepSeek R1 on OpenShift (20/03/2025),
- Scalable and cost-effective fine-tuning for LLMs (26/03/2025),
- How to fine-tune LLMs with Kubeflow Training Operator (26/03/2025),
- Supercharge Your AI with OpenShift AI and Redis: Unleash speed and scalability (04/04/2025),
- Llama 4 Herd is here and already works with Red Hat OpenShift AI (07/04/2025),
- How building workbenches accelerates AI/ML development (10/04/2025),
- How Developer Hub and OpenShift AI work with OpenShift (14/04/2025),
- Fine-tune LLMs with Kubeflow Trainer on OpenShift AI (22/04/2025),
- Optimizing Users Workload Resources With Openshift AI Hardware Profiles (24/04/2025),
- Performance boosts in vLLM 0.8.1: Switching to the V1 engine (28/04/2025),
- How to run performance and scale validation for OpenShift AI (30/04/2025),
- How to set up NVIDIA NIM on Red Hat OpenShift AI (08/05/2025),
- How to use pipelines for AI/ML automation at the edge (12/05/2025),
- OpenShift AI with vLLM and Spring AI (12/05/2025),
- Implementing TrustyAI Guardrails in OpenShift AI (14/05/2025),
- Geospatial Prithvi Models on OpenShift (18/05/2025),
- Optimize LLMs with LLM Compressor in Red Hat OpenShift AI (20/05/2025),
- llm-d: Kubernetes-native distributed inferencing (20/05/2025),
- Introducing Red Hat AI Inference Server: High-performance, optimized LLM serving anywhere (20/05/2025),
- No S3? No Problem: OpenShift AI Model Serving Made Easy! (26/05/2025),
- Serving vLLM and Granite Models on ARM with Red Hat OpenShift AI (05/06/2025),
- Enhancing OpenShift AI Connections Security: Secret Management with HashiCorp Vault (05/06/2025),
- Optimize model serving at the edge with RawDeployment mode (09/06/2025),
- AI at scale, without the price tag: Why enterprises are turning to Models-as-a-Service (10/06/2025),
- How to run AI models in cloud development environments (13/06/2025),
- How to run vLLM on CPUs with OpenShift for GPU-free inference (17/06/2025),
- OpenShift Lightspeed: Assessing AI for OpenShift operations (18/06/2025),
- Assessing AI for OpenShift operations: Advanced configurations (18/06/2025),
- Why Models-as-a-Service architecture is ideal for AI models (30/06/2025),
- How to build a Model-as-a-Service platform (07/07/2025),
- Red Hat OpenShift AI Intro/Workbenches (09/07/2025),
- Red Hat OpenShift AI Intro/Pipelines (21/07/2025),
- Red Hat OpenShift AI Intro / Data Science and MLOps Fundamentals (22/07/2025),
- ReAct vs. naive prompt chaining on Llama Stack (22/07/2025),
- Deploy ChatQnA on OpenShift with AMD Instinct (23/07/2025),
- Openshift Lightspeed CLI Client (23/07/2025),
- Submit remote RayJobs to a Ray cluster with the CodeFlare SDK (24/07/2025),
- Automating quantization pipelines for large language models (27/07/2025),
- From raw data to model serving with OpenShift AI (29/07/2025),
- Batch inference on OpenShift AI with Ray Data, vLLM, and CodeFlare (07/08/2025),
- Running GPT-OSS on OpenShift AI with a Custom Serving Runtime (08/08/2025),
- Bring your own knowledge to OpenShift Lightspeed (29/08/2025),
- What’s new with data science pipelines in Red Hat OpenShift AI (02/09/2025),
- Dell AMD Server on OCP 4.16 Air-Gapped (09/09/2025),
- AI search with style: Fashion on OpenShift AI with EDB (10/09/2025),
- How to deploy language models with Red Hat OpenShift AI (10/09/2025),
- Benchmarking with GuideLLM in air-gapped OpenShift clusters (15/09/2025),
- From Hugging Face to OpenShift: Serving Granite7B Models (19/09/2025),
- How to set up KServe autoscaling for vLLM with KEDA (23/09/2025),
- Run DialoGPT-small on OpenShift AI for internal model testing (25/09/2025),
- Kubernetes MCP server: AI-powered cluster management (25/09/2025),
- vLLM or llama.cpp: Choosing the right LLM inference engine for your use case (30/09/2025),
- How to deploy MCP servers on OpenShift using ToolHive (01/10/2025),
- Autoscaling vLLM with OpenShift AI (02/10/2025),
- DeepSeek-V3.2-Exp on vLLM, Day 0: Sparse Attention for long-context inference, ready for experimentation today with Red Hat AI (03/10/2025),
- Optimize and deploy LLMs for production with OpenShift AI (06/10/2025),
- Master KV cache aware routing with llm-d for efficient AI inference (07/10/2025),
- One model is not enough, too many models is hard: Technical deep dive (08/10/2025),
- Episode XXX: The Best Choice for AI Inference -> vLLM (09/10/2025),
- A Hidden Gem for OpenShift AI: GUI Tools for Your ODF S3 Buckets (15/10/2025),
- From tokens to caches: How llm-d improves LLM observability in Red Hat OpenShift AI 3.0 (22/10/2025),
- Enhancing AI inference security with confidential computing: A path to private data inference with proprietary LLMs (23/10/2025),
- Preparing for Red Hat OpenShift AI 3.0: Understanding Key Deprecations and Migration Paths (26/10/2025),
- Why vLLM is the best choice for AI inference today (30/10/2025),
- Running Open-Source GPT-OSS-20B Models on Red Hat OpenShift AI (02/11/2025),
- Deploy an LLM inference service on OpenShift AI (03/11/2025),
- 3 MCP servers you should be using (safely) (04/11/2025),
- Use OpenShift Lightspeed with locally served LLMs to drive security-focused, cost-efficient enterprise solutions for Red Hat products (05/11/2025),
- Bring Your Own Knowledge (BYOK) in OpenShift Lightspeed: Empower Your AI Assistant with Custom Context (10/11/2025),
- KServe joins CNCF as an incubating project (11/11/2025),
- [Experimental] OpenShift AI — vLLM hosting multiple models with Nvidia L4 time slicing (12/11/2025),
- Reduce LLM benchmarking costs with oversaturation detection (18/11/2025),
- Speculators: Standardized, production-ready speculative decoding (19/11/2025),
- Defining success: Evaluation metrics and data augmentation for oversaturation detection (20/11/2025),
- Introduction to distributed inference with llm-d (21/11/2025).