๐Ÿฅ LLM Performance Comparison in a Healthcare App

๐Ÿ“Œ Project Overview

This project compares the performance of various Large Language Models (LLMs) integrated into a healthcare-focused app. It applies software engineering principles to evaluate LLMs across multiple metrics and platforms.


🎯 Application Use Case

The app allows users to input healthcare-related questions, which are processed by both cloud-based and local LLMs, enabling a detailed performance comparison.
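
To make that concrete, here is a minimal TypeScript sketch (using Ky, the HTTP client listed under Technologies) of sending the same question to several models and timing each answer. The route, port, model identifiers, and payload shape are illustrative assumptions, not the project's actual API contract.

import ky from "ky";

// Illustrative model identifiers; the app exposes its own list in the dropdown.
type ModelId = "openai" | "gemini" | "claude" | "grok" | "biogpt" | "llama-3.2-1b";

// Send one healthcare question to each selected model and record how long each reply took.
export async function compareModels(question: string, models: ModelId[]) {
  return Promise.all(
    models.map(async (model) => {
      const started = Date.now();
      // Hypothetical route on the backend that proxies to the chosen cloud or local LLM.
      const { answer } = await ky
        .post("http://<your-IP>:3000/api/chat", { json: { question, model } })
        .json<{ answer: string }>();
      return { model, answer, responseTimeMs: Date.now() - started };
    }),
  );
}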


🚀 Objectives

1️⃣ Develop the App

  • Build an interactive, secure app for healthcare-related chat with LLMs.
  • Include cloud-based APIs (e.g., OpenAI, Gemini, Claude, Grok) and local models.

2️⃣ Integrate Multiple LLMs

  • Compare models from:
    • Cloud-based: OpenAI (ChatGPT), Gemini, Claude, Grok.
    • Local: BioGPT, Llama 3.2 (1B).

3️⃣ Test Across Devices & Conditions

  • Evaluate performance on mobile, laptop, cloud VMs, and edge devices.
  • Simulate varied network conditions to test reliability and latency.

4️⃣ Measure Key Metrics

  • ⏱ Response Time
  • 🎯 Accuracy & Relevance
  • 🧠 Resource Usage (CPU, RAM, GPU)
  • 🚧 Latency & Delay Analysis (see the measurement sketch after these objectives)

5️⃣ Visualize Data

  • Collect and analyze results using charts, graphs, and tables.
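
As a rough illustration of objective 4, the sketch below (an assumption, not the project's benchmarking code) shows how response time and memory use could be captured on the Node.js side around a single model call; accuracy and relevance scoring would need a separate evaluation step.

import { performance } from "node:perf_hooks";

interface Metrics {
  model: string;
  responseTimeMs: number; // wall-clock latency of the call
  rssMb: number;          // resident memory after the call, as a rough RAM proxy
}

// callModel is a placeholder for whichever SDK or local-model invocation is being measured.
export async function measureCall(
  model: string,
  callModel: (prompt: string) => Promise<string>,
  prompt: string,
): Promise<Metrics & { answer: string }> {
  const start = performance.now();
  const answer = await callModel(prompt);
  const responseTimeMs = performance.now() - start;
  const rssMb = process.memoryUsage().rss / (1024 * 1024);
  return { model, responseTimeMs, rssMb, answer };
}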

🛠️ Technologies Used

🔹 Primary Language

  • TypeScript

🔹 IDE

  • Visual Studio Code (VSCode)

🔹 Frontend

  • React Native – UI development
  • Zustand – State management
  • Ky – HTTP requests
  • Gluestack (gluestack-ui) – UI component library
  • Firebase Auth – Authentication
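
A minimal sketch of how these pieces typically fit together, assuming a Zustand store that sends the user's message to the Express backend via Ky; the store shape, route, and port are illustrative, not the project's actual code.

import ky from "ky";
import { create } from "zustand";

interface ChatState {
  messages: { role: "user" | "assistant"; text: string }[];
  loading: boolean;
  send: (text: string) => Promise<void>;
}

export const useChatStore = create<ChatState>((set, get) => ({
  messages: [],
  loading: false,
  // Append the user's message, call the backend, then append the model's reply.
  send: async (text) => {
    set({ messages: [...get().messages, { role: "user", text }], loading: true });
    const { reply } = await ky
      .post("http://<your-IP>:3000/api/chat", { json: { text } }) // hypothetical route
      .json<{ reply: string }>();
    set({ messages: [...get().messages, { role: "assistant", text: reply }], loading: false });
  },
}));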

🔹 Backend

  • Express.js – API and service handling
  • Mongoose (ODM) – MongoDB interaction
  • Axios – Internal and external HTTP requests
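
For illustration only (the schema, route, and connection string are assumptions), a backend route of this shape would persist chat messages with Mongoose:

import express from "express";
import mongoose from "mongoose";

// Minimal message schema; the real project defines its own models.
const Message = mongoose.model(
  "Message",
  new mongoose.Schema({ sessionId: String, role: String, text: String }, { timestamps: true }),
);

const app = express();
app.use(express.json());

// Store one chat message and echo back the saved document.
app.post("/api/messages", async (req, res) => {
  const saved = await Message.create(req.body);
  res.status(201).json(saved);
});

async function main() {
  await mongoose.connect(process.env.MONGODB_URI ?? "mongodb://localhost:27017/healthcare_llm");
  app.listen(3000, () => console.log("backend listening on :3000"));
}

main();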

🔹 Database

  • MongoDB

🔹 Microservices

  • Express.js – Microservice framework
  • LLM SDKs/APIs – Official packages from OpenAI, Anthropic, Google, etc.
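
As a sketch of the pattern (the route, port, and model name are assumptions), each provider can be wrapped behind a small Express endpoint; the example below uses the official OpenAI Node SDK, and the other SDKs follow the same shape.

import express from "express";
import OpenAI from "openai";

const app = express();
app.use(express.json());

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Forward the user's question to the cloud model and return its answer.
app.post("/openai/chat", async (req, res) => {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model choice
    messages: [{ role: "user", content: req.body.question }],
  });
  res.json({ answer: completion.choices[0].message.content ?? "" });
});

app.listen(4000, () => console.log("models microservice on :4000"));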

📲 How to Use the App

  1. Sign in with your credentials via Firebase.
  2. Fill in patient details (Name, Age, Height, Weight, Symptoms).
  3. Tap Start Chatting.
  4. Choose your preferred LLM from the dropdown.
  5. Begin the conversation and compare results across models.

๐Ÿ” If a signed-out user tries to access any protected page (like the form or chat), the app redirects them to the Sign-In screen.


🧑‍💻 User Features

  • 🧾 Fetch & resume previous chat sessions.
  • 🧠 Compare different model responses on identical queries.
  • 📈 View performance stats and model efficiency insights.
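
For example, resuming previous sessions could look like the following sketch; the endpoint and response shape are hypothetical.

import ky from "ky";

interface ChatSession {
  id: string;
  model: string;
  startedAt: string;
  messages: { role: "user" | "assistant"; text: string }[];
}

// Fetch the signed-in user's past sessions so a conversation can be resumed.
export async function fetchSessions(userId: string): Promise<ChatSession[]> {
  return ky.get(`http://<your-IP>:3000/api/sessions/${userId}`).json<ChatSession[]>();
}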

📊 Expected Outcomes

  • In-depth analysis of cloud vs. local LLMs in a real-world app.
  • Software engineering insights into LLM integration.
  • Visualization dashboards to present the performance metrics.

🧱 Architecture Diagrams

✅ Updated versions of the following diagrams are required:

  • Class Diagrams (Class & Class2)
  • Activity Diagram
  • State Diagram
  • Network Architecture Diagram
  • Sequence Diagram (with correct tools and tech stack)

๐Ÿ” Auth Flow (Activity/State Diagram)

  1. User opens the app โ†’ lands on homepage.
  2. Clicks button to go to form.
  3. If not signed in โ†’ auto redirected to Sign-In page.

🔮 Future Enhancements

  • Expand into other industries such as education, finance, and legal services.
  • Add support for new LLMs as they are released.
  • Enhance the benchmarking engine for deeper analysis and automation.

📚 License

This project is licensed under the MIT License.

👥 Contributors

  • Adheil Gupta (23BDS002)
  • Arnav Gupta (23BDS009)
  • Atharva Agrawal (23BDS010)
  • SuryaNarayan Rao (23BDS025)

🚀 Project Setup

Follow these steps to set up and run the project locally:


🖥️ Clone the Repository

git clone <repo-url>
cd <repo-directory>

๐ŸŒ Start Frontend

cd frontend
  1. Replace <your-IP> in the project files with the IP address of the backend server.
  2. Open the firebaseConfig.js, GoogleService-Info.plist, google-services.json file and populate it with your Firebase project configuration.
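
For reference, firebaseConfig.js typically has the shape below, with the values copied from the Firebase console (Project settings → Your apps); whether the file also initializes the app is an assumption about this project.

import { initializeApp } from "firebase/app";

// Placeholder values only; paste the keys from your own Firebase project.
const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "your-project.firebaseapp.com",
  projectId: "your-project",
  storageBucket: "your-project.appspot.com",
  messagingSenderId: "YOUR_SENDER_ID",
  appId: "YOUR_APP_ID",
};

export const app = initializeApp(firebaseConfig);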

📦 Install Dependencies

npx install-expo-modules@latest
npm install

▶️ Start the Development Server

npx expo start

🧠 Start Microservice (Models)

cd models
  1. Create a .env file.
  2. Follow the format provided in .env.example.
  3. Populate the keys using your model files and credentials.

📦 Install Dependencies

npm install

▶️ Start the Development Server

npm start

🛠️ Start Backend

cd backend
  1. Create a .env file.
  2. Follow the format provided in .env.example.
  3. Populate the keys using your own configuration values.

📦 Install Dependencies

npm install

▶️ Start the Development Server

npm start
