# syncfusion-blazor-smartcomponents-with-deepseek-ai

This repository demonstrates a sample of DeepSeek AI integration with Syncfusion Blazor SmartComponents.

# How to Run the Project

This project demonstrates how integrating a **locally hosted DeepSeek AI model with Syncfusion Blazor SmartComponents** brings AI-powered automation to text-based applications, enhancing sentence completion, content pasting, and real-time suggestions.

---

## Prerequisites

Before you begin, ensure you have the following:

1. [**Visual Studio 2022 with the ASP.NET and web development workload**](https://visualstudio.microsoft.com/vs/) installed.
2. **[.NET 8.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) or [later](https://dotnet.microsoft.com/en-us/download)**.
3. **DeepSeek model** installed and running locally (see the steps below).

---

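Before continuing, you can quickly confirm the toolchain from a terminal (a minimal check, assuming `dotnet` and `ollama` are already on your PATH):

```sh
# Verify the .NET SDK (should print 8.0.x or later)
dotnet --version

# Verify Ollama is installed
ollama --version
```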
## Installing and Setting Up the DeepSeek Model

To integrate DeepSeek with Syncfusion SmartComponents, you must first install and run the model locally on your system. Follow these steps to get started:

### Step 1: Install Ollama

Download and install Ollama from [Ollama's official website](https://ollama.com) based on your operating system.

### Step 2: Install the DeepSeek Model

After installing Ollama, open a command prompt or terminal and run the following command to install and start the DeepSeek model:

```sh
ollama run <model-name>
```

Replace `<model-name>` with the specific DeepSeek model you want to use. For example:

```sh
ollama run deepseek-r1:7b
```

Once the model is running, it is hosted locally and can be used within Syncfusion SmartComponents for AI-powered text assistance.
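To confirm Ollama is actually serving the model, you can list the locally available models (a quick check, assuming Ollama's default port of 11434):

```sh
# List models known to the local Ollama server
ollama list

# Or query Ollama's REST API directly
curl http://localhost:11434/api/tags
```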

## Steps to Run the Project

### 1. Clone the Repository

Clone the repository to your local machine:

```bash
git clone <repository-url>
cd <repository-folder>
```

### 2. Install Required Dependencies

Make sure all required NuGet packages are installed. Run the following command if necessary:

```bash
dotnet restore
```

---

### 3. Configure the DeepSeek AI Model in Program.cs

To configure the AI service, add the following settings to the `~/Program.cs` file in your Blazor application:

```csharp
builder.Services.AddSyncfusionSmartComponents()
    .ConfigureCredentials(new AIServiceCredentials
    {
        SelfHosted = true,
        Endpoint = new Uri("Your local endpoint"),
        DeploymentName = "your model name"
    })
    .InjectOpenAIInference();
```

- Set `SelfHosted` to `true`.
- Provide the `Endpoint` URL where the model is hosted (e.g., `http://localhost:11434`).

## Setting the Deployment Name

With the DeepSeek model installed and running (see the setup steps above), set the `DeploymentName` to match the installed model, such as:

- `deepseek-r1:1.5b`
- `deepseek-r1:7b`
- `mistral:7b`

This setup enables Syncfusion SmartComponents to leverage AI-powered text assistance using the locally hosted DeepSeek model.

---

### 4. Build and Run the Project

Run the following command to build and start the application:

```bash
dotnet run
```

---

### 5. Access the Application

Once the application starts, open your browser and navigate to the URL shown in the console output.
