Solution Overview

This solution helps you quickly set up the DeepSeek-R1 distilled models on a Huawei Cloud Flexus X instance (Elastic Cloud Server (ECS)). DeepSeek-R1 is a high-performance AI model focused on mathematical, coding, and natural-language reasoning tasks. By deploying a DeepSeek-R1 distilled model on the cloud server through Ollama, you can quickly create a personal AI assistant, ideal for the following applications:

1. Natural Language Processing (NLP): Understands and generates natural language text, suitable for tasks such as dialogue, translation, and summarization.

2. Text Generation: Produces coherent and logically sound text, applicable to content creation and story writing.

3. Question Answering System: Answers user queries, ideal for customer service and knowledge base searches.

4. Sentiment Analysis: Analyzes the emotional tone of text, useful for market research and public opinion monitoring.

5. Text Classification: Categorizes text, applicable to spam filtering and news categorization.

6. Information Extraction: Extracts key information from text, suitable for data mining and knowledge graph construction.
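Once a distilled model is running under Ollama on the ECS, it can be queried over Ollama's local HTTP API. The sketch below is a minimal example, assuming Ollama's default port 11434 and the `deepseek-r1:7b` model tag (substitute whichever distilled variant you pulled):

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False returns one complete JSON object instead of a stream of chunks
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return the reply text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # "deepseek-r1:7b" is assumed here; use the tag of the model you deployed
    print(ask("deepseek-r1:7b", "Summarize the uses of a distilled reasoning model."))
```

The same endpoint backs all of the tasks listed above (dialogue, summarization, classification, and so on); only the prompt changes.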

Solution Architecture


Building a DeepSeek Inference System

Version: 1.0.0

Last Updated: February 2025

Built By: Huawei Cloud

Time Required for Deployment: About 10 minutes

Time Required for Uninstallation: About 5 minutes

Solution Description


  • This solution will create a Flexus X instance (Elastic Cloud Server (ECS)) to set up the DeepSeek-R1 distilled models.

  • This solution will create one Elastic IP (EIP) for internal and external communication.

  • This solution will create a security group and configure security group rules to protect the cloud server.

