Support for Low-Cost API Models in Too Code VSCode Extension - GPTAPI.US #735
benzntech started this conversation in Feature Requests
Description
I propose adding API support for low-cost AI models such as ChatGPT, Claude, DeepSeek, and Gemini to the Too Code VSCode extension. This would let users access cost-effective AI services for code generation, coding assistance, and real-time interactions.
Key Features Requested

API Integration for Various AI Models

Online Recharge & Chat

Shortened Model Naming Support
- `Claude-3-5-Sonnet` can be used as `c-3-5-sonnet` on some platforms.
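As a rough illustration of the shortened-naming idea, here is a minimal TypeScript sketch that resolves user-typed aliases to full model IDs; the alias table and names are illustrative assumptions, not an official mapping.

```typescript
// Hypothetical alias table: short names a user might type, mapped to full model IDs.
const MODEL_ALIASES: Record<string, string> = {
  "c-3-5-sonnet": "claude-3-5-sonnet",
  "4o": "gpt-4o",
  "ds-v3": "deepseek-v3",
};

// Resolve a user-supplied model name to its full ID, falling back to the input
// so fully qualified names keep working unchanged.
function resolveModelName(name: string): string {
  return MODEL_ALIASES[name.toLowerCase()] ?? name;
}

console.log(resolveModelName("c-3-5-sonnet")); // "claude-3-5-sonnet"
console.log(resolveModelName("gpt-4o"));       // "gpt-4o" (unchanged)
```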
Model-Specific Integration Requirements
- `Claude Opus` and `Sonnet` versions support image transfer (Base64/URL).
- Gemini models may appear under platform-specific names (e.g., `api-gemini-2.0-flash-exp` in NexChat).
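As a small sketch of the Base64 image transfer mentioned above, the snippet below wraps a local image in a Claude-style Base64 image block; the exact block shape should be verified against the provider's current docs, so treat the structure as an assumption.

```typescript
import { readFileSync } from "node:fs";

// Read a local image and wrap it in a Base64 image block roughly matching the
// Anthropic Messages API shape (verify against current provider docs).
function imageBlock(path: string, mediaType = "image/png") {
  const data = readFileSync(path).toString("base64");
  return { type: "image", source: { type: "base64", media_type: mediaType, data } };
}

// Example: pair the image with a text question in a single user message.
const userMessage = {
  role: "user",
  content: [
    imageBlock("./screenshot.png"),
    { type: "text", text: "What does this error dialog mean?" },
  ],
};
```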
Custom API Configuration
- Replace `api.openai.com` with `api.gptapi.us`.
- Replace `api.anthropic.com` with `api.gptapi.us`.
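As a rough sketch of the base-URL swap described above, the snippet below sends an OpenAI-style chat completion request through api.gptapi.us; the /v1/chat/completions path, the payload shape, and the GPTAPI_KEY environment variable are assumptions rather than confirmed provider details.

```typescript
// Minimal sketch: call an OpenAI-compatible endpoint with the base URL swapped
// from api.openai.com to api.gptapi.us. Adjust path/payload if the provider differs.
const BASE_URL = "https://api.gptapi.us/v1"; // instead of https://api.openai.com/v1
const API_KEY = process.env.GPTAPI_KEY ?? ""; // hypothetical env var name

async function chat(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```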
Pricing & Cost Consideration
Given the fluctuating nature of AI model pricing, a built-in cost estimator could be beneficial. Users could view token consumption rates before executing large queries.
- GPT-4o: $0.0025 / 1K tokens (input), $0.01 / 1K tokens (output)
- Claude 3.5 Sonnet: $0.003 / 1K tokens (input), $0.015 / 1K tokens (output)
- DeepSeek-v3: $0.00014 / 1K tokens (input), $0.00028 / 1K tokens (output)
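To illustrate the built-in cost estimator mentioned above, here is a minimal sketch that hard-codes the rates quoted in this section; in practice pricing fluctuates and would need to be configurable or fetched at runtime.

```typescript
// Per-1K-token rates (USD) taken from the figures quoted above; illustrative only.
const PRICING: Record<string, { input: number; output: number }> = {
  "gpt-4o":            { input: 0.0025,  output: 0.01 },
  "claude-3-5-sonnet": { input: 0.003,   output: 0.015 },
  "deepseek-v3":       { input: 0.00014, output: 0.00028 },
};

// Estimate the cost of a request before sending it, so the user can confirm.
function estimateCostUSD(model: string, inputTokens: number, outputTokens: number): number {
  const rate = PRICING[model];
  if (!rate) throw new Error(`No pricing data for model: ${model}`);
  return (inputTokens / 1000) * rate.input + (outputTokens / 1000) * rate.output;
}

// Example: a 2,000-token prompt expecting ~500 tokens of output on GPT-4o.
console.log(estimateCostUSD("gpt-4o", 2000, 500).toFixed(4)); // "0.0100"
```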
Potential Benefits of This Feature

Conclusion
By integrating API support for low-cost AI models, the Too Code VSCode extension would become a more powerful and flexible tool for developers, offering seamless AI assistance through cost-effective options while maintaining high performance.
Request Priority: High
Please consider adding this feature to enhance AI accessibility in VSCode. Looking forward to feedback from the community!