improve: handles decoding error #6

Merged
merged 1 commit into from
Oct 28, 2024
52 changes: 16 additions & 36 deletions README.md
@@ -57,7 +57,8 @@ for model in models {
}
```

> Note: The Anthropic models are hardcoded. They do not require an API call to retrieve.
> [!NOTE]
> The Anthropic models are hardcoded. They do not require an API call to retrieve.

### Retrieving Cohere Models

@@ -79,7 +80,8 @@ for model in models {
}
```

> Note: The Google models are hardcoded. They do not require an API call to retrieve.
> [!NOTE]
> The Google models are hardcoded. They do not require an API call to retrieve.

### Retrieving Ollama Models

@@ -98,10 +100,8 @@ do {
### Retrieving OpenAI Models

```swift
let apiKey = "your-openai-api-key"

do {
let models = try await retriever.openAI(apiKey: apiKey)
let models = try await retriever.openAI(apiKey: "your-openai-api-key")

for model in models {
print("Model ID: \(model.id), Name: \(model.name)")
@@ -116,11 +116,10 @@ do {
The `openAI(apiKey:endpoint:headers:)` method can also be used with OpenAI-compatible APIs by specifying a custom endpoint:

```swift
let apiKey = "your-api-key"
let customEndpoint = URL(string: "https://api.your-openai-compatible-service.com/v1/models")!

do {
let models = try await modelRetriever.openAI(apiKey: apiKey, endpoint: customEndpoint)
let models = try await modelRetriever.openAI(apiKey: "your-api-key", endpoint: customEndpoint)

for model in models {
print("Model ID: \(model.id), Name: \(model.name)")
@@ -132,48 +131,29 @@ do {

### Error Handling

The package uses `AIModelRetrieverError` to represent specific errors that may occur. You can catch and handle these errors as follows:
The package provides structured error handling through the `AIModelRetrieverError` enum. This enum contains several cases that represent different types of errors you might encounter:

```swift
let apiKey = "your-openai-api-key"

do {
let models = try await modelRetriever.openai(apiKey: apiKey)
// Process models
let models = try await modelRetriever.openAI(apiKey: "your-api-key")
} catch let error as AIModelRetrieverError {
switch error {
case .badServerResponse:
print("Received an invalid response from the server")
case .serverError(let statusCode, let errorMessage):
print("Server error (status \(statusCode)): \(errorMessage ?? "No error message provided")")
}
} catch {
print("An unexpected error occurred: \(error)")
}
```

### Error Handling

The package provides structured error handling through the `AIModelRetrieverError` enum. This enum contains several cases that represent different types of errors you might encounter:

```swift
do {
    let models = try await modelRetriever.openAI(apiKey: "your-api-key")
} catch let error as AIModelRetrieverError {
switch error {
case .serverError(let message):
// Handle server-side errors (e.g., invalid API key, rate limits)
print("Server Error: \(message)")
case .networkError(let error):
// Handle network-related errors (e.g., no internet connection)
print("Network Error: \(error.localizedDescription)")
case .badServerResponse:
// Handle invalid server responses
print("Invalid response received from server")
case .decodingError(let error):
// Handle errors that occur when the response cannot be decoded
print("Decoding Error: \(error)")
case .cancelled:
// Handle cancelled requests
print("Request cancelled")
// Handle requests that are cancelled
print("Request was cancelled")
}
} catch {
// Handle any other errors
print("An unexpected error occurred: \(error)")
}
```
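Because transient failures surface as a distinct `networkError` case, callers can retry them selectively while letting everything else fail fast. A sketch of such a wrapper — `AIModel` and `AIModelRetrieverError` are the package's types, but the `retrieveWithRetry` helper, attempt count, and backoff are illustrative, not part of the package:

```swift
import Foundation

// Retries only transient network failures; any other error is rethrown immediately.
// The attempt count and backoff values are illustrative defaults.
func retrieveWithRetry(
    attempts: Int = 3,
    _ operation: () async throws -> [AIModel]
) async throws -> [AIModel] {
    for attempt in 1...attempts {
        do {
            return try await operation()
        } catch let error as AIModelRetrieverError {
            // Only .networkError is retried, and only while attempts remain.
            guard case .networkError = error, attempt < attempts else { throw error }
            // Linear backoff between attempts: 0.5s, 1.0s, ...
            try await Task.sleep(nanoseconds: UInt64(attempt) * 500_000_000)
        }
    }
    fatalError("unreachable: the loop either returns or throws")
}

// Usage:
// let models = try await retrieveWithRetry { try await modelRetriever.openAI(apiKey: "your-api-key") }
```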

29 changes: 13 additions & 16 deletions Sources/AIModelRetriever/AIModelRetriever.swift
@@ -24,21 +24,18 @@ public struct AIModelRetriever: Sendable {
}

guard let httpResponse = response as? HTTPURLResponse, 200...299 ~= httpResponse.statusCode else {
throw AIModelRetrieverError.badServerResponse
throw AIModelRetrieverError.serverError(response.description)
}

let models = try JSONDecoder().decode(T.self, from: data)

return models
return try JSONDecoder().decode(T.self, from: data)
} catch is CancellationError {
throw AIModelRetrieverError.cancelled
} catch let error as URLError where error.code == .cancelled {
throw AIModelRetrieverError.cancelled
} catch let error as DecodingError {
throw AIModelRetrieverError.decodingError(error)
} catch let error as AIModelRetrieverError {
throw error
} catch let error as URLError {
switch error.code {
case .cancelled:
throw AIModelRetrieverError.cancelled
default:
throw AIModelRetrieverError.networkError(error)
}
} catch {
throw AIModelRetrieverError.networkError(error)
}
@@ -78,11 +75,11 @@ public extension AIModelRetriever {
/// Retrieves a list of AI models from Cohere.
///
/// - Parameters:
/// - apiKey: The API key for authenticating with the API.
/// - apiKey: The API key that authenticates with the API.
///
/// - Returns: An array of ``AIModel`` that represents Cohere's available models.
///
/// - Throws: An error if the network request fails or if the response cannot be decoded.
/// - Throws: An error that occurs if the request is cancelled, if the network request fails, if the server returns an error, or if the response cannot be decoded.
func cohere(apiKey: String) async throws -> [AIModel] {
guard let defaultEndpoint = URL(string: "https://api.cohere.com/v1/models?page_size=1000") else { return [] }

@@ -138,7 +135,7 @@ public extension AIModelRetriever {
///
/// - Returns: An array of ``AIModel`` that represents Ollama's available models.
///
/// - Throws: An error if the network request fails or if the response cannot be decoded.
/// - Throws: An error that occurs if the request is cancelled, if the network request fails, if the server returns an error, or if the response cannot be decoded.
func ollama(endpoint: URL? = nil, headers: [String: String]? = nil) async throws -> [AIModel] {
guard let defaultEndpoint = URL(string: "http://localhost:11434/api/tags") else { return [] }

@@ -173,13 +170,13 @@ public extension AIModelRetriever {
/// Retrieves a list of AI models from OpenAI or OpenAI-compatible APIs.
///
/// - Parameters:
/// - apiKey: The API key for authenticating with the API.
/// - apiKey: The API key that authenticates with the API.
/// - endpoint: The URL endpoint for the API. If not provided, it defaults to "https://api.openai.com/v1/models".
/// - headers: Optional dictionary of additional HTTP headers to include in the request.
///
/// - Returns: An array of ``AIModel`` that represents the available models from the specified API.
///
/// - Throws: An error if the network request fails or if the response cannot be decoded.
/// - Throws: An error that occurs if the request is cancelled, if the network request fails, if the server returns an error, or if the response cannot be decoded.
func openAI(apiKey: String, endpoint: URL? = nil, headers: [String: String]? = nil) async throws -> [AIModel] {
guard let defaultEndpoint = URL(string: "https://api.openai.com/v1/models") else { return [] }

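The catch chain in the request helper above is order-sensitive: `CancellationError`, cancelled `URLError`s, and `DecodingError` must be matched before the generic clauses, or every failure would collapse into `networkError`. A self-contained sketch of the same mapping — `RetrieverError` here is a stand-in for the package's enum, not its actual declaration:

```swift
import Foundation

enum RetrieverError: Error {
    case serverError(String)
    case networkError(Error)
    case decodingError(Error)
    case cancelled
}

// Maps a thrown error into the structured enum, most specific patterns first.
func mapToRetrieverError(_ error: Error) -> RetrieverError {
    switch error {
    case is CancellationError:
        return .cancelled
    case let urlError as URLError where urlError.code == .cancelled:
        // Task cancellation can also surface as URLError(.cancelled).
        return .cancelled
    case let decodingError as DecodingError:
        return .decodingError(decodingError)
    case let alreadyMapped as RetrieverError:
        return alreadyMapped
    default:
        return .networkError(error)
    }
}
```

Reordering these patterns would silently change behavior: a `DecodingError` matched after the default clause, for example, would be reported as a network failure.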
10 changes: 5 additions & 5 deletions Sources/AIModelRetriever/AIModelRetrieverError.swift
@@ -19,21 +19,21 @@ public enum AIModelRetrieverError: Error, Sendable {
/// - Parameter error: The underlying network error.
case networkError(Error)

/// A case that represents an invalid server response.
case badServerResponse
/// A case that represents a decoding error.
case decodingError(Error)

/// A case that represents a cancelled request.
case cancelled

/// A localized message that describes the error.
public var errorDescription: String? {
switch self {
case .serverError(let error):
return error
case .networkError(let error):
return error.localizedDescription
case .badServerResponse:
return "Invalid response received from server"
case .decodingError(let error):
return error.localizedDescription
case .cancelled:
return "Request was cancelled"
}
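Because every case supplies an `errorDescription`, callers can log or display a single user-facing string without switching over the enum. A minimal sketch — the `userFacingMessage` helper is illustrative, and whether the enum formally conforms to `LocalizedError` is an assumption based on the property shown in the diff:

```swift
// Resolves any thrown error to a user-facing message.
// `AIModelRetrieverError` is the package's enum; the helper itself is hypothetical.
func userFacingMessage(for error: Error) -> String {
    if let retrieverError = error as? AIModelRetrieverError {
        return retrieverError.errorDescription ?? "An unknown error occurred"
    }
    return error.localizedDescription
}
```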
30 changes: 15 additions & 15 deletions Sources/AIModelRetriever/Documentation.docc/Documentation.md
@@ -28,7 +28,7 @@ for model in models {
}
```

> Note: The Anthropic models are hardcoded. They do not require an API call to retrieve.
> The Anthropic models are hardcoded. They do not require an API call to retrieve.

### Retrieving Cohere Models

@@ -50,7 +50,7 @@ for model in models {
}
```

> Note: The Google models are hardcoded. They do not require an API call to retrieve.
> The Google models are hardcoded. They do not require an API call to retrieve.

### Retrieving Ollama Models

@@ -69,10 +69,8 @@ do {
### Retrieving OpenAI Models

```swift
let apiKey = "your-openai-api-key"

do {
let models = try await retriever.openAI(apiKey: apiKey)
let models = try await retriever.openAI(apiKey: "your-openai-api-key")

for model in models {
print("Model ID: \(model.id), Name: \(model.name)")
@@ -87,11 +85,10 @@ do {
The `openAI(apiKey:endpoint:headers:)` method can also be used with OpenAI-compatible APIs by specifying a custom endpoint:

```swift
let apiKey = "your-api-key"
let customEndpoint = URL(string: "https://api.your-openai-compatible-service.com/v1/models")!

do {
let models = try await modelRetriever.openAI(apiKey: apiKey, endpoint: customEndpoint)
let models = try await modelRetriever.openAI(apiKey: "your-api-key", endpoint: customEndpoint)

for model in models {
print("Model ID: \(model.id), Name: \(model.name)")
@@ -103,25 +100,28 @@ do {

### Error Handling

``AIModelRetrieverError`` provides structured error handling through the ``AIModelRetrieverError`` enum. This enum contains three cases that represent different types of errors you might encounter:
The package provides structured error handling through the ``AIModelRetrieverError`` enum. This enum contains several cases that represent different types of errors you might encounter:

```swift
do {
let models = try await modelRetriever.openai(apiKey: "your-api-key")
} catch let error as LLMChatOpenAIError {
let models = try await modelRetriever.openAI(apiKey: "your-api-key")
} catch let error as AIModelRetrieverError {
switch error {
case .serverError(let message):
// Handle server-side errors (e.g., invalid API key, rate limits)
print("Server Error: \(message)")
case .networkError(let error):
// Handle network-related errors (e.g., no internet connection)
print("Network Error: \(error.localizedDescription)")
case .badServerResponse:
// Handle invalid server responses
print("Invalid response received from server")
case .decodingError(let error):
// Handle errors that occur when the response cannot be decoded
print("Decoding Error: \(error)")
case .cancelled:
// Handle cancelled requests
print("Request cancelled")
// Handle requests that are cancelled
print("Request was cancelled")
}
} catch {
// Handle any other errors
print("An unexpected error occurred: \(error)")
}
```