
Commit b0305ef

docs: update docs for sql, create, links, examples (#1571)

* sql, new create, fixed broken links and examples
* docs: improve explanation of the semantic layer

Co-authored-by: Gabriele Venturi <lele.venturi@gmail.com>

1 parent f667367 · commit b0305ef

17 files changed (+488 / -760 lines)

docs/mint.json

Lines changed: 2 additions & 2 deletions

@@ -58,8 +58,8 @@
      "version": "v3"
    },
    {
-     "group": "Data",
-     "pages": ["v3/data-layer", "v3/semantic-layer", "v3/data-ingestion", "v3/transformations", "v3/dataframes"],
+     "group": "Data layer",
+     "pages": ["v3/semantic-layer", "v3/semantic-layer/new", "v3/semantic-layer/views", "v3/data-ingestion", "v3/transformations"],
      "version": "v3"
    },
    {
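The nav entries removed here (`v3/data-layer`, `v3/dataframes`) pointed at pages this commit deletes. A sanity check for that class of breakage, cross-checking a nav group's `pages` against the pages that actually exist, can be sketched in plain Python (a hypothetical helper, not part of Mintlify or this repo; the `existing_docs` set is a stand-in for a filesystem scan):

```python
import json

# Nav fragment in the shape mint.json uses (group/pages/version).
nav_group = json.loads("""
{
  "group": "Data layer",
  "pages": ["v3/semantic-layer", "v3/semantic-layer/new", "v3/semantic-layer/views",
            "v3/data-ingestion", "v3/transformations"],
  "version": "v3"
}
""")

# Stand-in for the docs tree; a real check would glob docs/**/*.mdx instead.
existing_docs = {
    "v3/semantic-layer", "v3/semantic-layer/new", "v3/semantic-layer/views",
    "v3/data-ingestion", "v3/transformations",
}

# Any nav entry with no backing page is a broken link.
missing = [p for p in nav_group["pages"] if p not in existing_docs]
print(missing)  # [] -- every nav entry resolves to a page
```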

docs/v3/ai-dashboards.mdx

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ description: 'Turn your dataframes into collaborative AI dashboards'
Release v3 is currently in beta. This documentation reflects the features and functionality in progress and may change before the final release.
</Note>

-PandaAI provides a [data platform](https://app.pandabi.ai) that maximizes the power of your [semantic dataframes](/v3/dataframes).
+PandaAI provides a [data platform](https://app.pandabi.ai) that maximizes the power of your [semantic dataframes](/v3/semantic-layer).
With a single line of code, you can turn your dataframes into auto-updating AI dashboards - no UI development needed.
Each dashboard comes with a pre-generated set of insights and a conversational agent that helps you and your team explore the data through natural language.

docs/v3/chat-and-output.mdx

Lines changed: 1 addition & 1 deletion

@@ -108,7 +108,7 @@ You can inspect the code that was generated to produce the result:

```python
response = df.chat("Calculate the correlation between age and salary")
-print(response.last_code_generated)
+print(response.last_code_executed)
# Output: df['age'].corr(df['salary'])
```
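For context on what the inspected code actually computes: `df['age'].corr(df['salary'])` is a Pearson correlation. The same statistic worked out by hand in plain Python (a reference sketch, not PandaAI's or pandas' implementation; the sample data is invented):

```python
import math

def pearson(xs, ys):
    # Pearson r: covariance of x and y divided by the product of their
    # standard deviations (denominators cancel, so plain sums suffice).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear toy data: correlation is 1.0.
ages = [25, 30, 35, 40]
salaries = [50_000, 60_000, 70_000, 80_000]
print(round(pearson(ages, salaries), 6))  # 1.0
```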

docs/v3/cli.mdx

Lines changed: 5 additions & 1 deletion

@@ -1,8 +1,12 @@
---
-title: "Command Line Interface"
+title: "Command line interface"
description: "Learn how to use PandaAI's command-line interface"
---

+<Note title="Beta Notice">
+PandaAI 3.0 is currently in beta. This documentation reflects the latest features and functionality, which may evolve before the final release.
+</Note>
+
PandaAI comes with a command-line interface (CLI) that helps you manage your datasets and authentication.

## Authentication

docs/v3/data-ingestion.mdx

Lines changed: 38 additions & 173 deletions
@@ -31,7 +31,6 @@ file = pai.read_csv("data.csv")
# Use the semantic layer on CSV
df = pai.create(
    path="company/sales-data",
-   name="sales_data",
    df = file,
    description="Sales data from our retail stores",
    columns={
@@ -50,182 +49,48 @@ response = df.chat("Which product has the highest sales?")

## How to work with SQL in PandaAI?

-PandaAI provides a sql extension for you to work with SQL, PostgreSQL, MySQL, SQLite databases.
+PandaAI provides a sql extension for you to work with SQL, PostgreSQL, MySQL, and CockroachDB databases.
To make the library lightweight and easy to use, the basic installation of the library does not include this extension.
-It can be easily installed using either `poetry` or `pip`.
+It can be easily installed using pip with the specific database you want to use:

```bash
-poetry add pandasai-sql
+pip install pandasai-sql[postgres]
+pip install pandasai-sql[mysql]
+pip install pandasai-sql[cockroachdb]
```

-```bash
-pip install pandasai-sql
-```
-
-Once you have installed the extension, you can use it to connect to SQL databases.
-
-### PostgreSQL
-
-```yaml
-name: sales_data
-
-source:
-  type: postgres
-  connection:
-    host: db.example.com
-    port: 5432
-    database: analytics
-    user: ${DB_USER}
-    password: ${DB_PASSWORD}
-  table: sales_data
-
-destination:
-  type: local
-  format: parquet
-  path: company/sales-data
-
-columns:
-  - name: transaction_id
-    type: string
-    description: Unique identifier for each sale
-  - name: sale_date
-    type: datetime
-    description: Date and time of the sale
-  - name: product_id
-    type: string
-    description: Product identifier
-  - name: quantity
-    type: integer
-    description: Number of units sold
-  - name: price
-    type: float
-    description: Price per unit
-
-transformations:
-  - type: convert_timezone
-    params:
-      column: sale_date
-      from: UTC
-      to: America/New_York
-  - type: calculate
-    params:
-      column: total_amount
-      formula: quantity * price
-
-update_frequency: daily
+Once you have installed the extension, you can use the [semantic data layer](/v3/semantic-layer#for-sql-databases-using-the-create-method) and perform [data transformations](/docs/v3/transformations).

-order_by:
-  - sale_date DESC
-
-limit: 100000
-```
-
-### MySQL
-
-```yaml
-name: customer_data
-
-source:
-  type: mysql
-  connection:
-    host: db.example.com
-    port: 3306
-    database: analytics
-    user: ${DB_USER}
-    password: ${DB_PASSWORD}
-  table: customers
-
-destination:
-  type: local
-  format: parquet
-  path: company/customer-data
-
-columns:
-  - name: customer_id
-    type: string
-    description: Unique identifier for each customer
-  - name: name
-    type: string
-    description: Customer's full name
-  - name: email
-    type: string
-    description: Customer's email address
-  - name: join_date
-    type: datetime
-    description: Date when customer joined
-  - name: total_purchases
-    type: integer
-    description: Total number of purchases made
-
-transformations:
-  - type: anonymize
-    params:
-      column: email
-  - type: split
-    params:
-      column: name
-      into: [first_name, last_name]
-      separator: " "
-
-update_frequency: daily
-
-order_by:
-  - join_date DESC
-
-limit: 100000
-```
-
-### SQLite
-
-```yaml
-name: inventory_data
-
-source:
-  type: sqlite
-  connection:
-    database: path/to/database.db
-  table: inventory
-
-destination:
-  type: local
-  format: parquet
-  path: company/inventory-data
-
-columns:
-  - name: product_id
-    type: string
-    description: Unique identifier for each product
-  - name: product_name
-    type: string
-    description: Name of the product
-  - name: category
-    type: string
-    description: Product category
-  - name: stock_level
-    type: integer
-    description: Current quantity in stock
-  - name: last_updated
-    type: datetime
-    description: Last inventory update timestamp
-
-transformations:
-  - type: categorize
-    params:
-      column: stock_level
-      bins: [0, 10, 50, 100, 500]
-      labels: ["Critical", "Low", "Medium", "High"]
-  - type: convert_timezone
-    params:
-      column: last_updated
-      from: UTC
-      to: America/Los_Angeles
-
-update_frequency: hourly
-
-order_by:
-  - last_updated DESC
-
-limit: 50000
+```python
+sql_table = pai.create(
+    path="example/mysql-dataset",
+    description="Heart disease dataset from MySQL database",
+    source={
+        "type": "mysql",
+        "connection": {
+            "host": "database.example.com",
+            "port": 3306,
+            "user": "${DB_USER}",
+            "password": "${DB_PASSWORD}",
+            "database": "medical_data"
+        },
+        "table": "heart_data",
+        "columns": [
+            {"name": "Age", "type": "integer", "description": "Age of the patient in years"},
+            {"name": "Sex", "type": "string", "description": "Gender of the patient (M = male, F = female)"},
+            {"name": "ChestPainType", "type": "string", "description": "Type of chest pain (ATA, NAP, ASY, TA)"},
+            {"name": "RestingBP", "type": "integer", "description": "Resting blood pressure in mm Hg"},
+            {"name": "Cholesterol", "type": "integer", "description": "Serum cholesterol in mg/dl"},
+            {"name": "FastingBS", "type": "integer", "description": "Fasting blood sugar > 120 mg/dl (1 = true, 0 = false)"},
+            {"name": "RestingECG", "type": "string", "description": "Resting electrocardiogram results (Normal, ST, LVH)"},
+            {"name": "MaxHR", "type": "integer", "description": "Maximum heart rate achieved"},
+            {"name": "ExerciseAngina", "type": "string", "description": "Exercise-induced angina (Y = yes, N = no)"},
+            {"name": "Oldpeak", "type": "float", "description": "ST depression induced by exercise relative to rest"},
+            {"name": "ST_Slope", "type": "string", "description": "Slope of the peak exercise ST segment (Up, Flat, Down)"},
+            {"name": "HeartDisease", "type": "integer", "description": "Heart disease diagnosis (1 = present, 0 = absent)"}
+        ]
+    }
+)
```

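The `${DB_USER}` / `${DB_PASSWORD}` placeholders in the connection block above are meant to be resolved from environment variables rather than hard-coded. As an illustration of that pattern only (plain Python with `string.Template`, not PandaAI's actual loader, whose internals the diff doesn't show; the values are invented):

```python
import os
from string import Template

def expand_env(value: str) -> str:
    # Substitute ${VAR} placeholders from the environment;
    # Template.substitute raises KeyError if a variable is unset.
    return Template(value).substitute(os.environ)

os.environ["DB_USER"] = "analyst"      # normally exported outside the program
os.environ["DB_PASSWORD"] = "s3cret"

connection = {
    "host": "database.example.com",
    "port": 3306,
    "user": expand_env("${DB_USER}"),
    "password": expand_env("${DB_PASSWORD}"),
    "database": "medical_data",
}
print(connection["user"])  # analyst
```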
## How to work with Enterprise Cloud Data in PandaAI?
@@ -590,8 +455,8 @@ limit: 100000
  </tr>
  <tr>
    <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}>pandasai_sql</td>
-   <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}><code>poetry add pandasai-sql</code></td>
-   <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}><code>pip install pandasai-sql</code></td>
+   <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}><code>poetry add pandasai-sql[postgres]</code></td>
+   <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}><code>pip install pandasai-sql[postgres]</code></td>
    <td style={{ border: '1px solid #ccc', padding: '8px 16px' }}>No</td>
  </tr>
  <tr>

docs/v3/data-layer.mdx

Lines changed: 0 additions & 20 deletions
This file was deleted.

docs/v3/dataframes.mdx

Lines changed: 0 additions & 44 deletions
This file was deleted.
