diff --git a/.gitignore b/.gitignore index 0deeac16..a3894856 100644 --- a/.gitignore +++ b/.gitignore @@ -1,6 +1,9 @@ # Recorded data *.csv +/logs/ +logging.txt + # Byte-compiled / optimized / DLL files __pycache__/ *.py[cod] diff --git a/README.md b/README.md index 14a1a475..a149400f 100644 --- a/README.md +++ b/README.md @@ -1,168 +1,94 @@ # Chords - Python -Chords Python script is designed to interface with an Arduino-based bio-potential amplifier, read data from it, optionally log this data to CSV or stream it via the Lab Streaming Layer (LSL), and visualize it through a graphical user interface (GUI) with live plotting. - -> [!NOTE] -> Flash Arduino code to your hardware from [Chords Arduino Firmware](https://github.com/upsidedownlabs/Chords-Arduino-Firmware) to use this python tool. - -## Features - -- **Automatic Arduino Detection:** Automatically detects connected Arduino devices via serial ports. -- **Data Reading:** Read data packets from the Arduino's serial port. -- **CSV Logging:** Optionally logs data to a CSV file. -- **LSL Streaming:** Optionally streams data to an LSL outlet for integration with other software. -- **Verbose Output:** Provides detailed statistics and error reporting, including sampling rate and drift. -- **GUI:** Live plotting of six channels using a PyQt-based GUI. -- **Invert:** Optionally Invert the signal before streaming LSL and logging -- **Timer:** Record data for a set time period in seconds. - -## Requirements - -- Python -- `pyserial` library (for serial communication) -- `pylsl` library (for LSL streaming) -- `argparse`, `time`, `csv`, `datetime` (standard libraries) -- `pyqtgraph` library (for GUI) -- `PyQt5` library -- `numpy` library - -## Installation - -1. Ensure you have latest version of Python installed. -2. 
Create Virtual Environment +Chords-Python is a collection of tools designed to interface with microcontroller development boards running [Chords Arduino Firmware](https://github.com/upsidedownlabs/Chords-Arduino-Firmware). Use Upside Down Labs bio-potential amplifiers to read data, visualize it, record data in CSV files, and stream it via Lab Streaming Layer. + +> [!NOTE] +> **Firmware Required:** +> - For Arduino: [Chords Arduino Firmware](https://github.com/upsidedownlabs/Chords-Arduino-Firmware) + +## Features +- **Multiple Protocols**: Supports `Wi-Fi`, `Bluetooth`, and `Serial` communication. +- **LSL Data Streaming**: Once the LSL stream starts, any PC on the same Wi-Fi network can access the data using tools like the BrainVision LSL Viewer. +- **CSV Logging**: Save raw data together with the sample counter. +- **GUI**: Live plotting for all channels. +- **Applications**: EEG/ECG/EMG/EOG-based games and utilities (e.g., Tug of War, Keystroke Emulator). + +## Installation +1. **Python**: Ensure the latest version of Python is installed. +2. **Virtual Environment**: ```bash - python -m venv venv - ``` - + python -m venv venv + source venv/bin/activate # Linux/macOS + .\venv\Scripts\activate # Windows + ``` +3. **Dependencies**: ```bash - .\venv\Scripts\activate - ``` - -> [!IMPORTANT] -> You may get an execution policy error if scripts are restricted. To fix it, run: - -> ```bash -> Set-ExecutionPolicy Unrestricted -Scope Process -> ``` - -3. Install the required Python libraries needed to run the python script: ```bash - pip install -r chords_requirements.txt - ``` - -4. Install the required Python libraries needed to run the applications: - ```bash - pip install -r app_requirements.txt - ``` - -## Usage - -To use the script, run it from the command line with various options: ```bash - python chords.py [options] - ``` ### Options - -- `-p`, `--port` : Specify the serial port to use (e.g., COM5, /dev/ttyUSB0). -- `-b`, `--baudrate` : Set the baud rate for serial communication.
By default the script will first attempt to use 230400, and if that fails, it will automatically fallback to 115200. -- `--csv`: Enable CSV logging. Data will be saved to a timestamped file. -- `--lsl`: Enable LSL streaming. Sends data to an LSL outlet. -- `-v`, `--verbose`: Enable verbose output with detailed statistics and error reporting. -- `-t` : Enable the timer to run program for a set time in seconds. - -### Example: - ```bash - python chords.py --lsl -v --csv -t 60 - ``` -- This command executes the Python script `chords.py`, initiates the LSL stream, enables verbose output, activates CSV logging, and sets a timer for 60 seconds: - -### Data Logging - -- **CSV Output**: The script saves the processed data in a CSV file with a timestamped name. - - The CSV file contains the following columns: - - `Counter`: The sample counter from the Arduino. - - `Channel1` to `Channel6`: The data values from each channel. - -- **Log Intervals**: The script logs data counts every second and provides a summary every 10 minutes, including the sampling rate and drift in seconds per hour. - -## Applications -Open another terminal and run an application. Ensure the LSL Stream is running first. - -### Installation -Before running any application, install all dependencies with the following command: - -```bash -pip install -r app_requirements.txt -``` - -### Available Applications + pip install -r requirements.txt + ``` -#### ECG with Heart Rate +> [!IMPORTANT] +> On Windows, if scripts are blocked, run: +> ```powershell +> Set-ExecutionPolicy Unrestricted -Scope Process +> ``` -- `python heartbeat_ecg.py`:Enable a GUI with real-time ECG and heart rate. - -#### EMG with Envelope - -- `python emgenvelope.py`: Enable a GUI with real-time EMG & its Envelope. - -#### EOG with Blinks - -- `python eog.py`: Enable a GUI with real-time EOG that detects blinks and mark them with red dots. 
- -#### EEG with FFT - -- `python ffteeg.py`: Enable a GUI with real-time EEG data with its FFT and band powers. - -#### EEG Tug of War Game - -- `python game.py`: Enable a GUI to play tug of war game using EEG Signal. - -#### EEG Beetle Game - -- `python beetle.py`: Enable a GUI for Beetle Game using EEG signal. - -#### GUI - -- `python gui.py`: Enable the real-time data plotting GUI. - -#### EOG Keystroke Emulator - -- `python keystroke.py`: On running, a pop-up opens for connecting, and on pressing Start, blinks are detected to simulate spacebar key presses. - -#### CSV Plotter - -- `python csv_plotter.py`: On running, a pop-up window opens with option to load a file, select a channel to plot, and then plot the data. - -## Running All Applications Together in a Web-Interface - -To run all applications simultaneously, execute: - -```bash -python app.py -``` - -> [!NOTE] -> Before running, make sure to install all dependencies by running the command: +## Usage +Run the script and access the web interface: ```bash -pip install -r app_requirements.txt -``` - -This will launch a Web interface. Use the interface to control the applications: - -1. Click the `Start LSL Stream` button to initiate the LSL stream or `Start NPG Stream` button to initiate the NPG stream. -2. Then, click on any application button to run the desired module. -Important: Keep the `python app.py` script running in the background while using any application. - -### Available Applications -- `ECG with Heart Rate`: Analyze ECG data and extract heartbeat metrics. -- `EMG with Envelope`: Real-time EMG monitor with filtering and RMS envelope. -- `EOG with Blinks`: Real-time EOG monitoring with blink detection. -- `EEG with FFT`: Real-time EEG analysis with FFT and brainwave power calculation. -- `EEG Tug of War`: A 2-player game where brain activity determines the winner in a battle of focus. -- `EEG Beetle Game`: Use your concentration to control a beetle's movement in this brain-powered challenge. 
-- `GUI of Channels`: Launch the GUI for real time signal visualization. -- `EOG Keystroke Emulator`: GUI for EOG-based blink detection triggering a keystroke. -- `CSV Plotter`: Plot data from a CSV file. +python app.py +``` +**Web Interface Preview**: +![Web Interface Screenshot](./media/Interface.png) + +![Web Interface Screenshot](./media/Webinterface.png) + +### Key Options: + +- **LSL Streaming**: Choose a protocol (`Wi-Fi`, `Bluetooth`, `Serial`). +- **CSV Logging**: Data is saved as `ChordsPy_{timestamp}.csv`. +- **Applications**: Multiple applications can be launched from the interface simultaneously (e.g., `EEG Tug of War`). + +## Connection Guide + +#### Wi-Fi Connection + 1. Upload the NPG-Lite Wi-Fi code to your device. + 2. Connect to the device's Wi-Fi network. + 3. Click the **Wi-Fi** button in the interface, then select **CONNECT**. + 4. Once connected, the button will change to **Disconnect**, and a pop-up will confirm: *"Connected via Wifi!"* + +#### Bluetooth Connection + 1. Ensure Bluetooth is turned ON on your system. + 2. Upload the Bluetooth code to your device. + 3. Click the **Bluetooth** button to scan for available devices. + 4. Select your device from the list and click **Connect**. + 5. Once connected, the button will change to **Disconnect**, and a pop-up will confirm: *"Connected via Bluetooth!"* + +#### Serial Connection + 1. Ensure Bluetooth is OFF and the device is connected via USB. + 2. Upload the required code to your hardware. + 3. Click the **Serial** button, then select **Connect**. + 4. Once connected, the button will change to **Disconnect**, and a pop-up will confirm: *"Connected via Serial!"* + +## CSV Logging +To save sensor data for future analysis, follow these steps: +1. **Start Data Streaming** – Begin streaming data via **Wi-Fi, Bluetooth, or Serial**. +2. **Start Recording** – Click the **Start Recording** button (it will change to **Stop Recording**). +3. 
**File Saved Automatically** – The data is saved as `ChordsPy_{timestamp}.csv` in your default folder. + +**Visualizing CSV Data** – You can plot the recorded data using the **CSV Plotter** tool. + +## Applications +| Application | Description | +|----------------------------|------------------------------------------------------------------| +| **ECG with Heart Rate** | Real-time ECG with BPM calculation. | +| **EMG with Envelope** | Real-time EMG visualization with its envelope. | +| **EOG with Blinks** | Real-time EOG visualization with blinks marked as red dots. | +| **EEG with FFT** | Real-time EEG visualization with FFT and brain-power bands. | +| **EEG Tug of War Game** | Two-player EEG-based game. | +| **EEG Beetle Game** | Focus-based game driven by real-time EEG. | +| **EOG Keystroke Emulator** | Blink detection triggers spacebar presses. | +| **GUI** | Visualize raw data in real time. | +| **CSV Plotter** | Tool to plot recorded CSV files. | ## Troubleshooting @@ -170,6 +96,14 @@ Important: Keep the `python app.py` script running in the background while using - **CSV File Not Created:** Ensure you have write permissions in the directory where the script is run. - **LSL Stream Issues:** Ensure that the `pylsl` library is properly installed and configured. Additionally, confirm that Bluetooth is turned off. +## How to Contribute + +You can add your own application to this repo: + +- Add a button in `apps.yaml` to link your application. +- Include your script as a `.py` file with LSL data-reception code. +(Pull requests welcome!) + ## Contributors We are thankful to our awesome contributors, the list below is alphabetically sorted.
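The CSV files described above record the device's packet `Counter` alongside the channel data, and the connection code in this PR uses the same rolling counter to detect dropped samples. A minimal sketch of that arithmetic, simplified from the counter-based handler in `connection.py` — the function name `missed_samples` is illustrative only, not part of the codebase, and an 8-bit counter (wrapping 255 → 0) is assumed:

```python
def missed_samples(last_counter: int, current_counter: int, modulus: int = 256) -> int:
    """Illustrative helper (not in the repo): how many packets were lost
    between two successive counter readings, given a counter that wraps
    at `modulus`. Assumes it is called once per newly received packet."""
    if last_counter < 0:  # no previous packet seen yet
        return 0
    expected = (last_counter + 1) % modulus
    # Modular distance from the counter we expected to the one we got;
    # 0 means no packets were dropped.
    return (current_counter - expected) % modulus
```

For example, a jump from counter 250 to counter 2 means 7 packets were lost (251–255 and 0–1), while a normal 255 → 0 rollover reports no loss.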
diff --git a/app.py b/app.py index eff7cf8f..29be055b 100644 --- a/app.py +++ b/app.py @@ -8,6 +8,7 @@ import queue import yaml from pathlib import Path +import os console_queue = queue.Queue() app = Flask(__name__) @@ -20,6 +21,22 @@ stream_active = False running_apps = {} # Dictionary to track running apps +@app.route('/log_error', methods=['POST']) +def log_error(): + try: + error_data = request.get_json() + if not error_data or 'error' not in error_data or 'log_error' in str(error_data): + return jsonify({'status': 'error', 'message': 'Invalid data'}), 400 + + os.makedirs('logs', exist_ok=True) + + with open('logs/logging.txt', 'a') as f: + f.write(error_data['error']) + + return jsonify({'status': 'success'}) + except Exception as e: + return jsonify({'status': 'error', 'message': 'Logging failed'}), 500 + def run_async(coro): def wrapper(*args, **kwargs): loop = asyncio.new_event_loop() @@ -63,9 +80,8 @@ async def scan_ble_devices(): @app.route('/check_stream') def check_stream(): - if connection_manager and connection_manager.stream_active: - return jsonify({'connected': True}) - return jsonify({'connected': False}) + is_connected = connection_manager.stream_active if hasattr(connection_manager, 'stream_active') else False + return jsonify({'connected': is_connected}) @app.route('/check_connection') def check_connection(): diff --git a/chords_ble.py b/chords_ble.py index cbf2a6ed..2267c259 100644 --- a/chords_ble.py +++ b/chords_ble.py @@ -13,7 +13,8 @@ class Chords_BLE: CONTROL_CHAR_UUID = "0000ff01-0000-1000-8000-00805f9b34fb" # Packet parameters - SINGLE_SAMPLE_LEN = 7 # (1 Counter + 3 Channels * 2 bytes) + NUM_CHANNELS = 3 + SINGLE_SAMPLE_LEN = (NUM_CHANNELS * 2) + 1 # (1 Counter + Num_Channels * 2 bytes) BLOCK_COUNT = 10 NEW_PACKET_LEN = SINGLE_SAMPLE_LEN * BLOCK_COUNT @@ -72,10 +73,8 @@ def process_sample(self, sample_data: bytearray): if self.start_time is None: self.start_time = time.time() - channels = [ - int.from_bytes(sample_data[1:3], 
byteorder='big', signed=True), - int.from_bytes(sample_data[3:5], byteorder='big', signed=True), - int.from_bytes(sample_data[5:7], byteorder='big', signed=True)] + channels = [int.from_bytes(sample_data[i:i+2], byteorder='big', signed=True) + for i in range(1, len(sample_data), 2)] self.samples_received += 1 diff --git a/chords_wifi.py b/chords_wifi.py index fc82e8ef..33b91f1b 100644 --- a/chords_wifi.py +++ b/chords_wifi.py @@ -5,7 +5,7 @@ from scipy.signal import butter, filtfilt class Chords_WIFI: - def __init__(self, stream_name='NPG', channels=3, sampling_rate=250, block_size=13, timeout_sec=1): + def __init__(self, stream_name='NPG', channels=3, sampling_rate=500, block_size=13, timeout_sec=1): self.stream_name = stream_name self.channels = channels self.sampling_rate = sampling_rate diff --git a/connection.py b/connection.py index 590ad1d1..af525de3 100644 --- a/connection.py +++ b/connection.py @@ -4,11 +4,14 @@ from pylsl import StreamInfo, StreamOutlet import argparse import time -import sys import asyncio import csv from datetime import datetime import threading +from collections import deque +from pylsl import local_clock +from pylsl import StreamInlet, resolve_stream +import numpy as np class Connection: def __init__(self): @@ -21,14 +24,24 @@ def __init__(self): self.stream_format = "float32" self.stream_id = "UDL" self.last_sample = None - self.ble_samples_received = 0 - self.ble_start_time = time.time() + self.samples_received = 0 + self.start_time = time.time() self.csv_file = None self.csv_writer = None self.sample_counter = 0 self.num_channels = 0 + self.sampling_rate = 0 self.stream_active = False self.recording_active = False + self.usb_thread = None + self.ble_thread = None + self.wifi_thread = None + self.running = False + self.sample_count = 0 + self.rate_window = deque(maxlen=10) + self.last_timestamp = time.perf_counter() + self.rate_update_interval = 0.5 + self.ble_samples_received = 0 async def get_ble_device(self): devices = await 
Chords_BLE.scan_devices() @@ -56,6 +69,7 @@ def setup_lsl(self, num_channels, sampling_rate): print(f"LSL stream started: {num_channels} channels at {sampling_rate}Hz") self.stream_active = True self.num_channels = num_channels + self.sampling_rate = sampling_rate def start_csv_recording(self, filename=None): if self.recording_active: @@ -108,10 +122,293 @@ def log_to_csv(self, sample_data): print(f"Error writing to CSV: {str(e)}") self.stop_csv_recording() + def update_sample_rate(self): + now = time.perf_counter() + elapsed = now - self.last_timestamp + self.sample_count += 1 + + if elapsed >= self.rate_update_interval: + current_rate = self.sample_count / elapsed + self.rate_window.append(current_rate) + + # Print average rate + avg_rate = sum(self.rate_window) / len(self.rate_window) + print(f"\rCurrent sampling rate: {avg_rate:.2f} Hz", end="", flush=True) + + self.sample_count = 0 + self.last_timestamp = now + + def lsl_rate_checker(self, duration=2.0): + try: + streams = resolve_stream('type', self.stream_type) + if not streams: + print("No LSL stream found to verify.") + return + + inlet = StreamInlet(streams[0]) + timestamps = [] + start_time = time.time() + + while time.time() - start_time < duration: + sample, ts = inlet.pull_sample(timeout=1.0) + if ts: + timestamps.append(ts) + + if len(timestamps) > 10: + diffs = np.diff(timestamps) + filtered_diffs = [d for d in diffs if d > 0] + if filtered_diffs: + estimated_rate = 1 / np.mean(filtered_diffs) + else: + print("\nAll timestamps had zero difference.") + else: + print("\nNot enough timestamps collected to estimate rate.") + except Exception as e: + print(f"Error in LSL rate check: {str(e)}") + + def counter_based_data_handler(self): + last_counter = -1 + dropped_samples = 0 + total_samples = 0 + last_print_time = time.time() + + while self.running and self.connection: + try: + raw_sample = self.connection.get_latest_sample() + if not raw_sample: + continue + + current_counter = raw_sample[2] + 
channel_data = raw_sample[3:] + + # Handle counter rollover (0-255) + if last_counter != -1: + expected_counter = (last_counter + 1) % 256 + + if current_counter != expected_counter: + if current_counter > last_counter: + missed = current_counter - last_counter - 1 + else: + missed = (256 - last_counter - 1) + current_counter + + dropped_samples += missed + print(f"\nWarning: {missed} samples dropped. Counter jump: {last_counter} -> {current_counter}") + + # Only process if this is a new sample + if current_counter != last_counter: + total_samples += 1 + timestamp = local_clock() + + if self.lsl_connection: + self.lsl_connection.push_sample(channel_data, timestamp=timestamp) + + if self.recording_active: + self.log_to_csv(channel_data) + + last_counter = current_counter + + self.update_sample_rate() + + # Print stats every 5 seconds + if time.time() - last_print_time > 5: + drop_rate = (dropped_samples / total_samples) * 100 if total_samples > 0 else 0 + print(f"\nStats - Processed: {total_samples}, Dropped: {dropped_samples} ({drop_rate:.2f}%)") + last_print_time = time.time() + + except Exception as e: + print(f"\nCounter-based handler error: {str(e)}") + print(f"Last counter: {last_counter}, Current counter: {current_counter}") + break + + def hybrid_data_handler(self): + last_counter = -1 + target_interval = 1.0 / 500.0 + last_timestamp = local_clock() + dropped_samples = 0 + total_samples = 0 + last_print_time = time.time() + + while self.running and self.connection: + try: + raw_sample = self.connection.get_latest_sample() + if not raw_sample: + continue + + current_counter = raw_sample[2] + channel_data = raw_sample[3:] + + if current_counter == last_counter: + continue + + current_time = local_clock() + + counter_diff = (current_counter - last_counter) % 256 + if counter_diff == 0: + counter_diff = 256 + + # Check for missed samples + if last_counter != -1 and counter_diff > 1: + dropped_samples += (counter_diff - 1) + print(f"\nWarning: {counter_diff - 1} 
samples dropped. Counter jump: {last_counter} -> {current_counter}") + print(f"Current timestamp: {current_time}") + print(f"Sample data: {channel_data}") + + time_per_sample = target_interval + + for i in range(counter_diff): + sample_timestamp = last_timestamp + (i + 1) * time_per_sample + + # Check if we're falling behind + if local_clock() > sample_timestamp + time_per_sample * 2: + print(f"\nWarning: Falling behind by {local_clock() - sample_timestamp:.4f}s, skipping samples") + break + + if self.lsl_connection: + self.lsl_connection.push_sample(channel_data, timestamp=sample_timestamp) + + if self.recording_active: + self.log_to_csv(channel_data) + + total_samples += 1 + + last_counter = current_counter + last_timestamp = current_time + + if time.time() - last_print_time > 5: + drop_rate = (dropped_samples / total_samples) * 100 if total_samples > 0 else 0 + print(f"\nStats - Processed: {total_samples}, Dropped: {dropped_samples} ({drop_rate:.2f}%)") + last_print_time = time.time() + + except Exception as e: + print(f"\nHybrid handler error: {str(e)}") + print(f"Last counter: {last_counter}, Current counter: {current_counter}") + break + + def ble_data_handler(self): + TARGET_SAMPLE_RATE = 500.0 + SAMPLE_INTERVAL = 1.0 / TARGET_SAMPLE_RATE + next_sample_time = local_clock() + + while self.running and self.ble_connection: + try: + if hasattr(self.ble_connection, 'data_available') and self.ble_connection.data_available: + current_time = local_clock() + + if current_time >= next_sample_time: + sample = self.ble_connection.get_latest_sample() + if sample: + channel_data = sample[:self.num_channels] + + # Calculate precise timestamp + sample_time = next_sample_time + next_sample_time += SAMPLE_INTERVAL + + # If we're falling behind, skip samples to catch up + if current_time > next_sample_time + SAMPLE_INTERVAL: + next_sample_time = current_time + SAMPLE_INTERVAL + + if self.lsl_connection: + self.lsl_connection.push_sample(channel_data, timestamp=sample_time) + + 
self.update_sample_rate() + + if self.recording_active: + self.log_to_csv(channel_data) + except Exception as e: + print(f"BLE data handler error: {str(e)}") + break + + def wifi_data_handler(self): + TARGET_SAMPLE_RATE = 500.0 + SAMPLE_INTERVAL = 1.0 / TARGET_SAMPLE_RATE + next_sample_time = local_clock() + + while self.running and self.wifi_connection: + try: + if hasattr(self.wifi_connection, 'data_available') and self.wifi_connection.data_available: + current_time = local_clock() + + if current_time >= next_sample_time: + sample = self.wifi_connection.get_latest_sample() + if sample: + channel_data = sample[:self.num_channels] + + # Calculate precise timestamp + sample_time = next_sample_time + next_sample_time += SAMPLE_INTERVAL + + # If we're falling behind, skip samples to catch up + if current_time > next_sample_time + SAMPLE_INTERVAL: + next_sample_time = current_time + SAMPLE_INTERVAL + + if self.lsl_connection: + self.lsl_connection.push_sample(channel_data, timestamp=sample_time) + + self.update_sample_rate() + + if self.recording_active: + self.log_to_csv(channel_data) + except Exception as e: + print(f"WiFi data handler error: {str(e)}") + break + + def usb_data_handler(self): + TARGET_SAMPLE_RATE = 500.0 + SAMPLE_INTERVAL = 1.0 / TARGET_SAMPLE_RATE + next_sample_time = local_clock() + + while self.running and self.usb_connection: + try: + if hasattr(self.usb_connection, 'ser') and self.usb_connection.ser.is_open: + self.usb_connection.read_data() + + if hasattr(self.usb_connection, 'data'): + current_time = local_clock() + + if current_time >= next_sample_time: + sample = self.usb_connection.data[:, -1] + channel_data = sample.tolist() + + # Calculate precise timestamp + sample_time = next_sample_time + next_sample_time += SAMPLE_INTERVAL + + if current_time > next_sample_time + SAMPLE_INTERVAL: + next_sample_time = current_time + SAMPLE_INTERVAL + + if self.lsl_connection: + self.lsl_connection.push_sample(channel_data, timestamp=sample_time) + + 
self.update_sample_rate() + + if self.recording_active: + self.log_to_csv(channel_data) + except Exception as e: + print(f"\nUSB data handler error: {str(e)}") + break + + def connect_usb_with_counter(self): + self.usb_connection = Chords_USB() + if not self.usb_connection.detect_hardware(): + return False + + self.num_channels = self.usb_connection.num_channels + self.sampling_rate = self.usb_connection.supported_boards[self.usb_connection.board]["sampling_rate"] + + self.setup_lsl(self.num_channels, self.sampling_rate) + self.usb_connection.send_command('START') + + self.running = True + self.usb_thread = threading.Thread(target=self.counter_based_data_handler) + self.usb_thread.daemon = True + self.usb_thread.start() + + return True + def connect_ble(self, device_address=None): self.ble_connection = Chords_BLE() original_notification_handler = self.ble_connection.notification_handler - + def notification_handler(sender, data): if len(data) == self.ble_connection.NEW_PACKET_LEN: if not self.lsl_connection: @@ -122,11 +419,8 @@ def notification_handler(sender, data): for i in range(0, self.ble_connection.NEW_PACKET_LEN, self.ble_connection.SINGLE_SAMPLE_LEN): sample_data = data[i:i+self.ble_connection.SINGLE_SAMPLE_LEN] if len(sample_data) == self.ble_connection.SINGLE_SAMPLE_LEN: - channels = [ - int.from_bytes(sample_data[1:3], byteorder='big', signed=True), - int.from_bytes(sample_data[3:5], byteorder='big', signed=True), - int.from_bytes(sample_data[5:7], byteorder='big', signed=True) - ] + channels = [int.from_bytes(sample_data[i:i+2], byteorder='big', signed=True) + for i in range(1, len(sample_data), 2)] self.last_sample = channels self.ble_samples_received += 1 @@ -136,7 +430,7 @@ def notification_handler(sender, data): self.log_to_csv(channels) self.ble_connection.notification_handler = notification_handler - + try: if device_address: print(f"Connecting to BLE device: {device_address}") @@ -144,46 +438,20 @@ def notification_handler(sender, data): else: 
selected_device = asyncio.run(self.get_ble_device()) if not selected_device: - return + return False print(f"Connecting to BLE device: {selected_device.name}") self.ble_connection.connect(selected_device.address) - + print("BLE connection established. Waiting for data...") return True except Exception as e: print(f"BLE connection failed: {str(e)}") return False - def connect_usb(self): - self.usb_connection = Chords_USB() - if self.usb_connection.detect_hardware(): - self.num_channels = self.usb_connection.num_channels - sampling_rate = self.usb_connection.supported_boards[self.usb_connection.board]["sampling_rate"] - - self.setup_lsl(self.num_channels, sampling_rate) - - original_read_data = self.usb_connection.read_data - def wrapped_read_data(): - original_read_data() - if hasattr(self.usb_connection, 'data') and self.lsl_connection: - sample = self.usb_connection.data[:, -1] - self.lsl_connection.push_sample(sample) - if self.recording_active: - self.log_to_csv(sample.tolist()) - - self.usb_connection.read_data = wrapped_read_data - - # Start streaming in a separate thread - self.usb_thread = threading.Thread(target=self.usb_connection.start_streaming) - self.usb_thread.daemon = True - self.usb_thread.start() - return True - return False - def connect_wifi(self): self.wifi_connection = Chords_WIFI() self.wifi_connection.connect() - + self.num_channels = self.wifi_connection.channels sampling_rate = self.wifi_connection.sampling_rate @@ -218,19 +486,55 @@ def connect_wifi(self): finally: self.stop_csv_recording() + def connect_usb(self): + self.usb_connection = Chords_USB() + if not self.usb_connection.detect_hardware(): + return False + + self.num_channels = self.usb_connection.num_channels + self.sampling_rate = self.usb_connection.supported_boards[self.usb_connection.board]["sampling_rate"] + + self.setup_lsl(self.num_channels, self.sampling_rate) + + # Start the USB streaming command + self.usb_connection.send_command('START') + + # Start the data handler 
thread + self.running = True + self.usb_thread = threading.Thread(target=self.usb_data_handler) + self.usb_thread.daemon = True + self.usb_thread.start() + + threading.Thread(target=self.lsl_rate_checker, daemon=True).start() + return True + def cleanup(self): + self.running = False self.stop_csv_recording() if self.lsl_connection: self.lsl_connection = None - print("LSL stream stopped") + self.stream_active = False + print("\nLSL stream stopped") + + threads = [] + if self.usb_thread and self.usb_thread.is_alive(): + threads.append(self.usb_thread) + if self.ble_thread and self.ble_thread.is_alive(): + threads.append(self.ble_thread) + if self.wifi_thread and self.wifi_thread.is_alive(): + threads.append(self.wifi_thread) + + for t in threads: + t.join(timeout=1) + # Clean up connections if self.usb_connection: try: - self.usb_connection.cleanup() + if hasattr(self.usb_connection, 'ser') and self.usb_connection.ser.is_open: + self.usb_connection.send_command('STOP') + self.usb_connection.ser.close() print("USB connection closed") - if hasattr(self, 'usb_thread') and self.usb_thread.is_alive(): - self.usb_thread.join(timeout=1) # Wait for thread to finish except Exception as e: print(f"Error closing USB connection: {str(e)}") finally: @@ -238,7 +542,7 @@ def cleanup(self): if self.ble_connection: try: - self.ble_connection.disconnect() + self.ble_connection.stop() print("BLE connection closed") except Exception as e: print(f"Error closing BLE connection: {str(e)}") @@ -247,7 +551,7 @@ def cleanup(self): if self.wifi_connection: try: - self.wifi_connection.disconnect() + self.wifi_connection.cleanup() print("WiFi connection closed") except Exception as e: print(f"Error closing WiFi connection: {str(e)}") @@ -270,15 +574,23 @@ def main(): try: if args.protocol == 'usb': - manager.connect_usb() + if manager.connect_usb(): + while manager.running: + time.sleep(1) elif args.protocol == 'wifi': - manager.connect_wifi() + if manager.connect_wifi(): + while 
manager.running: + time.sleep(1) elif args.protocol == 'ble': - manager.connect_ble(args.ble_address) + if manager.connect_ble(args.ble_address): + while manager.running: + time.sleep(1) except KeyboardInterrupt: print("\nCleanup Completed.") except Exception as e: - print(f"Error: {str(e)}") + print(f"\nError: {str(e)}") + finally: + manager.cleanup() if __name__ == '__main__': main() \ No newline at end of file diff --git a/ffteeg.py b/ffteeg.py index 501661b9..a46991a4 100644 --- a/ffteeg.py +++ b/ffteeg.py @@ -1,170 +1,423 @@ import numpy as np from collections import deque -from PyQt5.QtWidgets import QApplication, QVBoxLayout, QHBoxLayout, QMainWindow, QWidget +from PyQt5.QtWidgets import (QApplication, QVBoxLayout, QHBoxLayout, QMainWindow, QWidget, QGridLayout, QScrollArea, QPushButton, QDialog, QCheckBox, QLabel, QComboBox, QFrame, QSizePolicy) from PyQt5.QtCore import Qt from pyqtgraph import PlotWidget import pyqtgraph as pg import pylsl import sys from scipy.signal import butter, iirnotch, lfilter, lfilter_zi -from scipy.fft import fft -import math import time +# Constants +FFT_WINDOW_SIZE = 512 # Data points we are using for fft analysis +# On increase this value - Frequency analysis becomes more accurate but updates slower +# On decrease this value - Updates faster but frequency details become less precise + +SMOOTHING_WINDOW_SIZE = 10 # How many FFT results we average to make the display smoother +# On increase this value - Graph looks smoother but reacts slower to changes +# On decrease this value - Reacts faster but graph looks more jumpy + +DISPLAY_DURATION = 4 # How many seconds of EEG data to show at once (in seconds) + +class DataProcessor: + def __init__(self, num_channels, sampling_rate): + self.num_channels = num_channels + self.sampling_rate = sampling_rate + + # Filters - 1. 
A notch filter to remove electrical interference (50Hz noise) and A bandpass filter (0.5-45Hz) + self.b_notch, self.a_notch = iirnotch(50, 30, self.sampling_rate) + self.b_band, self.a_band = butter(4, [0.5 / (self.sampling_rate / 2), 45.0 / (self.sampling_rate / 2)], btype='band') + self.zi_notch = [lfilter_zi(self.b_notch, self.a_notch) * 0 for _ in range(num_channels)] + self.zi_band = [lfilter_zi(self.b_band, self.a_band) * 0 for _ in range(num_channels)] + + # Circular buffers to store the last few seconds of EEG data + self.eeg_data = [np.zeros(DISPLAY_DURATION * sampling_rate) for _ in range(num_channels)] + self.current_indices = [0 for _ in range(num_channels)] # Pointers to know where to put new data + self.moving_windows = [deque(maxlen=FFT_WINDOW_SIZE) for _ in range(num_channels)] # 3. Moving windows for FFT calculation + + def process_sample(self, sample): + filtered_data = [] + for ch in range(self.num_channels): + raw_point = sample[ch] # Get the raw EEG value + + # Apply filters + notch_filtered, self.zi_notch[ch] = lfilter(self.b_notch, self.a_notch, [raw_point], zi=self.zi_notch[ch]) + band_filtered, self.zi_band[ch] = lfilter(self.b_band, self.a_band, notch_filtered, zi=self.zi_band[ch]) + band_filtered = band_filtered[-1] # Get the final filtered value + + # Update EEG data buffer + self.eeg_data[ch][self.current_indices[ch]] = band_filtered + self.current_indices[ch] = (self.current_indices[ch] + 1) % len(self.eeg_data[ch]) + + # Update moving window for FFT + self.moving_windows[ch].append(band_filtered) + filtered_data.append(band_filtered) + + return filtered_data + + def get_display_data(self, channel): + idx = self.current_indices[channel] + return np.concatenate([self.eeg_data[channel][idx:], self.eeg_data[channel][:idx]]) + +class FFTAnalyzer: + def __init__(self, num_channels, sampling_rate): + self.num_channels = num_channels + self.sampling_rate = sampling_rate + + # Calculate all the frequency bins + self.freqs = 
np.fft.rfftfreq(FFT_WINDOW_SIZE, d=1.0/self.sampling_rate) + self.freq_resolution = self.sampling_rate / FFT_WINDOW_SIZE + self.fft_window = np.hanning(FFT_WINDOW_SIZE) # Create a window function to make the FFT more accurate + self.window_correction = np.sum(self.fft_window) # For amplitude scaling + + # Smoothing buffers + self.smoothing_buffers = [deque(maxlen=SMOOTHING_WINDOW_SIZE) for _ in range(num_channels)] + + print(f"[FFT Setup] Sampling Rate: {self.sampling_rate} Hz") + print(f"[FFT Setup] Freq Resolution: {self.freq_resolution:.2f} Hz/bin") + print(f"[FFT Setup] FFT Window Size: {FFT_WINDOW_SIZE} samples") + + def compute_fft(self, channel, time_data): + if len(time_data) < FFT_WINDOW_SIZE: + return None, None + + # Extract the most recent EEG Data (FFT_WINDOW_SIZE samples) + signal_chunk = np.array(time_data[-FFT_WINDOW_SIZE:], dtype=np.float64) + windowed_signal = signal_chunk * self.fft_window + fft_result = np.fft.rfft(windowed_signal) + fft_magnitude = np.abs(fft_result[1:]) * (2.0 / self.window_correction) # Skip first value + adjusted_freqs = self.freqs[1:] # Skip DC frequency (0 Hz) + + # DEBUG: Print detected peak frequency + if channel == 0: + start_idx = int(2.0 * len(fft_magnitude) / (self.sampling_rate / 2)) + sorted_indices = np.argsort(fft_magnitude[start_idx:])[::-1] + start_idx + peak1_idx = sorted_indices[0] + peak1_freq = adjusted_freqs[peak1_idx] + print(f"Peak Frequency: {peak1_freq:.2f} Hz") + + # Update smoothing buffer + self.smoothing_buffers[channel].append(fft_magnitude) + + # Return smoothed FFT + smoothed_fft = np.mean(self.smoothing_buffers[channel], axis=0) if self.smoothing_buffers[channel] else fft_magnitude + return adjusted_freqs, smoothed_fft + + def calculate_band_power(self, fft_magnitudes, freq_range): + low, high = freq_range + mask = (self.freqs[1:] >= low) & (self.freqs[1:] <= high) # Find which frequencies are in our desired range + return np.sum(fft_magnitudes[mask] ** 2) # Sum up the power in this range + + 
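The `calculate_band_power` step above sums squared, amplitude-scaled rFFT magnitudes over a frequency mask. A standalone sketch of the same idea, assuming a 512 Hz sampling rate and the 512-sample Hanning window used by the analyzer (both are assumptions here; the real values come from the LSL stream and the constants above):

```python
import numpy as np

FS = 512   # assumed sampling rate (Hz)
N = 512    # FFT window size, as in FFT_WINDOW_SIZE above

def band_power(signal, low, high, fs=FS):
    """Sum of squared, amplitude-scaled rFFT magnitudes in [low, high] Hz."""
    window = np.hanning(len(signal))
    spectrum = np.fft.rfft(signal * window)
    # Scale by the window sum and drop the DC bin, as the analyzer does
    mags = np.abs(spectrum[1:]) * (2.0 / window.sum())
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)[1:]
    mask = (freqs >= low) & (freqs <= high)
    return np.sum(mags[mask] ** 2)

t = np.arange(N) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)   # 10 Hz test tone
# A 10 Hz tone should put nearly all its power in the alpha band (8-12 Hz)
print(band_power(alpha_wave, 8, 12), band_power(alpha_wave, 12, 30))
```

With a 512-point window at 512 Hz the bins land exactly on 1 Hz multiples, so the 10 Hz tone falls on a single bin and the alpha-band power dominates.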
def compute_band_powers(self, channel, time_data): + freqs, fft_mag = self.compute_fft(channel, time_data) + if fft_mag is None: + return None + + # Compute band powers (absolute) + delta = self.calculate_band_power(fft_mag, (0.5, 4)) + theta = self.calculate_band_power(fft_mag, (4, 8)) + alpha = self.calculate_band_power(fft_mag, (8, 12)) + beta = self.calculate_band_power(fft_mag, (12, 30)) + gamma = self.calculate_band_power(fft_mag, (30, 45)) + + total_power = delta + theta + alpha + beta + gamma + + # Return relative powers + return {'delta': delta / total_power,'theta': theta / total_power,'alpha': alpha / total_power,'beta': beta / total_power,'gamma': gamma / total_power} + +class SettingBox(QDialog): + def __init__(self, num_channels, selected_eeg, selected_bp, parent=None): + super().__init__(parent) + self.setWindowTitle("Channel Selection Settings") + self.setGeometry(200, 200, 400, 400) + + self.layout = QVBoxLayout() + + # EEG Channel Selection + self.eeg_label = QLabel("Select EEG Channels to Display:") + self.layout.addWidget(self.eeg_label) + + self.eeg_checkboxes = [] + for i in range(num_channels): + cb = QCheckBox(f"Channel {i+1}") + cb.setChecked(i in selected_eeg) + self.eeg_checkboxes.append(cb) + self.layout.addWidget(cb) + + # Brainpower Channel Selection + self.bp_label = QLabel("\nSelect Brainpower Channel:") + self.layout.addWidget(self.bp_label) + + self.bp_combobox = QComboBox() + for i in range(num_channels): + self.bp_combobox.addItem(f"Channel {i+1}") + self.bp_combobox.setCurrentIndex(selected_bp) + self.layout.addWidget(self.bp_combobox) + + # OK Button + self.ok_button = QPushButton("OK") + self.ok_button.clicked.connect(self.validate_and_accept) + self.layout.addWidget(self.ok_button) + + self.setLayout(self.layout) + + def validate_and_accept(self): + # Ensure at least one EEG channel is selected + eeg_selected = any(cb.isChecked() for cb in self.eeg_checkboxes) + + if not eeg_selected: + self.eeg_checkboxes[0].setChecked(True) 
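`DataProcessor` earlier in this hunk carries per-channel IIR filter state across chunks via `lfilter_zi`, so filtering sample-by-sample gives the same result as filtering the whole signal at once. A single-channel sketch of that pattern (the 512 Hz rate is an assumption; the real rate comes from the stream):

```python
import numpy as np
from scipy.signal import butter, iirnotch, lfilter, lfilter_zi

FS = 512  # assumed sampling rate (Hz)

# Same design as in DataProcessor: 50 Hz notch (Q=30) and a 0.5-45 Hz band-pass
b_notch, a_notch = iirnotch(50, 30, FS)
b_band, a_band = butter(4, [0.5 / (FS / 2), 45.0 / (FS / 2)], btype='band')

zi_notch = lfilter_zi(b_notch, a_notch) * 0   # start from zero state
zi_band = lfilter_zi(b_band, a_band) * 0

def process_sample(raw):
    """Filter one sample, carrying IIR state between calls."""
    global zi_notch, zi_band
    notched, zi_notch = lfilter(b_notch, a_notch, [raw], zi=zi_notch)
    banded, zi_band = lfilter(b_band, a_band, notched, zi=zi_band)
    return banded[-1]

x = np.sin(2 * np.pi * 10 * np.arange(FS) / FS)   # 1 s of a 10 Hz signal
streamed = np.array([process_sample(v) for v in x])
```

Because the state vectors are threaded through every call, `streamed` is numerically identical to filtering `x` in one shot with the same zero initial conditions, which is what makes chunked LSL input safe to filter incrementally.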
+ + self.accept() + class EEGMonitor(QMainWindow): def __init__(self): super().__init__() self.setWindowTitle("Real-Time EEG Monitor with FFT and Brainwave Power") self.setGeometry(100, 100, 1200, 800) - - self.stream_active = True # Flag to check if the stream is active - self.last_data_time = None # Variable to store the last data time - - # Main layout split into two halves: top for EEG, bottom for FFT and Brainwaves - self.central_widget = QWidget() - self.main_layout = QVBoxLayout(self.central_widget) - - # First half for EEG signal plot - self.eeg_plot_widget = PlotWidget(self) - self.eeg_plot_widget.setBackground('w') - self.eeg_plot_widget.showGrid(x=True, y=True) - self.eeg_plot_widget.setLabel('bottom', 'EEG Plot') - self.eeg_plot_widget.setYRange(-5000, 5000, padding=0) - self.eeg_plot_widget.setXRange(0, 4, padding=0) - self.eeg_plot_widget.setMouseEnabled(x=False, y=True) # Disable zoom - self.main_layout.addWidget(self.eeg_plot_widget) - - # Second half for FFT and Brainwave Power, aligned horizontally - self.bottom_layout = QHBoxLayout() - - # FFT Plot (left side of the second half) - self.fft_plot = PlotWidget(self) - self.fft_plot.setBackground('w') - self.fft_plot.showGrid(x=True, y=True) - self.fft_plot.setLabel('bottom', 'FFT') - # self.fft_plot.setYRange(0, 500, padding=0) - self.fft_plot.setXRange(0, 50, padding=0) # Set x-axis to 0 to 50 Hz - # self.fft_plot.setMouseEnabled(x=False, y=False) # Disable zoom - self.fft_plot.setAutoVisible(y=True) # Allow y-axis to autoscale - self.bottom_layout.addWidget(self.fft_plot) - - # Bar graph for brainwave power bands (right side of the second half) - self.bar_chart_widget = pg.PlotWidget(self) - self.bar_chart_widget.setBackground('w') - self.bar_chart_widget.setLabel('bottom', 'Brainpower Bands') - self.bar_chart_widget.setXRange(-0.5, 4.5) - self.bar_chart_widget.setMouseEnabled(x=False, y=False) # Disable zoom - # Add brainwave power bars - self.brainwave_bars = pg.BarGraphItem(x=[0, 1, 2, 3, 4], 
height=[0, 0, 0, 0, 0], width=0.5, brush='g') - self.bar_chart_widget.addItem(self.brainwave_bars) - # Set x-ticks for brainwave types - self.bar_chart_widget.getAxis('bottom').setTicks([[(0, 'Delta'), (1, 'Theta'), (2, 'Alpha'), (3, 'Beta'), (4, 'Gamma')]]) - self.bottom_layout.addWidget(self.bar_chart_widget) - - # Add the bottom layout to the main layout - self.main_layout.addLayout(self.bottom_layout) - self.setCentralWidget(self.central_widget) - - # Set up LSL stream inlet + + # Initialize LSL stream + self.inlet = self.connect_to_lsl() + if not self.inlet: + sys.exit(0) + + self.stream_info = self.inlet.info() + self.sampling_rate = int(self.stream_info.nominal_srate()) + self.num_channels = self.stream_info.channel_count() + + # Data processing components + self.data_processor = DataProcessor(self.num_channels, self.sampling_rate) + self.fft_analyzer = FFTAnalyzer(self.num_channels, self.sampling_rate) + + self.selected_eeg_channels = list(range(self.num_channels)) + self.selected_bp_channel = 0 + self.last_data_time = None + self.stream_active = True + + self.colors = ['#FF0054', '#00FF8C', '#AA42FF', '#00FF47', '#FF8C19', '#FF00FF', '#00FFFF', '#FFFF00'] + + self.init_ui() + + # Start update timer + self.timer = pg.QtCore.QTimer() + self.timer.timeout.connect(self.update_plot) + self.timer.start(20) + + def connect_to_lsl(self): print("Searching for available LSL streams...") available_streams = pylsl.resolve_streams() if not available_streams: print("No LSL streams found! 
Exiting...") - sys.exit(0) + return None - self.inlet = None + inlet = None for stream in available_streams: try: - self.inlet = pylsl.StreamInlet(stream) + inlet = pylsl.StreamInlet(stream) print(f"Connected to LSL stream: {stream.name()}") + print(f"Sampling rate: {inlet.info().nominal_srate()} Hz") + print(f"Number of channels: {inlet.info().channel_count()}") break except Exception as e: print(f"Failed to connect to {stream.name()}: {e}") - if self.inlet is None: - print("Unable to connect to any LSL stream! Exiting...") - sys.exit(0) - - # Sampling rate - self.sampling_rate = int(self.inlet.info().nominal_srate()) - print(f"Sampling rate: {self.sampling_rate} Hz") - - # Data and Buffers - self.eeg_data = deque(maxlen=500) # Initialize moving window with 500 samples - self.moving_window = deque(maxlen=500) # 500 samples for FFT and power calculation (sliding window) - - self.b_notch, self.a_notch = iirnotch(50, 30, self.sampling_rate) - self.b_band, self.a_band = butter(4, [0.5 / (self.sampling_rate / 2), 48.0 / (self.sampling_rate / 2)], btype='band') + if inlet is None: + print("Unable to connect to any LSL stream!") + + return inlet + + def init_ui(self): + self.central_widget = QWidget() + self.main_layout = QHBoxLayout(self.central_widget) - self.zi_notch = lfilter_zi(self.b_notch, self.a_notch) * 0 - self.zi_band = lfilter_zi(self.b_band, self.a_band) * 0 + # Left side: EEG plots with settings button + self.left_container = QWidget() + self.left_layout = QVBoxLayout(self.left_container) + + # Scroll area for EEG channels + self.eeg_scroll = QScrollArea() + self.eeg_scroll.setWidgetResizable(True) + self.eeg_container = QWidget() + self.eeg_layout = QVBoxLayout(self.eeg_container) + self.eeg_layout.setSpacing(0) # Remove spacing between plots + self.eeg_scroll.setWidget(self.eeg_container) + self.left_layout.addWidget(self.eeg_scroll) + + # Add a frame for the settings button in bottom right + self.settings_frame = QFrame() + 
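`connect_to_lsl` above resolves all available streams and keeps the first `StreamInlet` that opens, reporting each failure as it goes. The same try-each-candidate pattern, abstracted away from `pylsl` (the resolver and inlet factory are injected here so the sketch stays self-contained and testable):

```python
def connect_first(resolve, open_inlet):
    """Return an inlet for the first resolvable stream that opens, else None."""
    streams = resolve()
    if not streams:
        print("No streams found!")
        return None
    for stream in streams:
        try:
            inlet = open_inlet(stream)
            print(f"Connected to stream: {stream}")
            return inlet
        except Exception as e:
            print(f"Failed to connect to {stream}: {e}")
    print("Unable to connect to any stream!")
    return None

# Hypothetical stand-ins for pylsl.resolve_streams / pylsl.StreamInlet
def flaky_open(name):
    if name == "bad":
        raise RuntimeError("refused")
    return f"inlet:{name}"

result = connect_first(lambda: ["bad", "good"], flaky_open)
```

In the real code the resolver is `pylsl.resolve_streams` and the factory is `pylsl.StreamInlet`; the point of the loop is that one unreachable stream does not abort the whole connection attempt.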
self.settings_frame.setStyleSheet("QFrame { background-color: rgba(50, 50, 50, 150); border: 1px solid #888; border-radius: 5px; }") + self.settings_frame_layout = QHBoxLayout(self.settings_frame) + self.settings_frame_layout.setContentsMargins(5, 5, 5, 5) + + # Settings button with improved styling + self.settings_button = QPushButton("⚙️ Settings") + self.settings_button.setStyleSheet(""" + QPushButton { + background-color: #2c3e50; + color: #ecf0f1; + border: none; + border-radius: 3px; + padding: 4px 8px; + font-size: 16px; + } + QPushButton:hover { + background-color: #34495e; + } + """) - # Timer for updating the plot - self.timer = pg.QtCore.QTimer() - self.timer.timeout.connect(self.update_plot) - self.timer.start(20) + self.settings_button.clicked.connect(self.show_settings) + self.settings_frame_layout.addWidget(self.settings_button, alignment=Qt.AlignRight) + + self.left_layout.addWidget(self.settings_frame, alignment=Qt.AlignRight) + self.main_layout.addWidget(self.left_container, stretch=1) - self.eeg_curve = self.eeg_plot_widget.plot(pen=pg.mkPen('b', width=1)) - self.fft_curve = self.fft_plot.plot(pen=pg.mkPen('r', width=1)) # FFT Colour is red + # Right side: FFT and Brainpower + self.right_container = QWidget() + self.right_layout = QVBoxLayout(self.right_container) + + # FFT Plot + self.fft_plot = PlotWidget() + self.fft_plot.setBackground('black') + self.fft_plot.showGrid(x=True, y=True, alpha=0.3) + self.fft_plot.setLabel('bottom', 'Frequency (Hz)') + self.fft_plot.setXRange(0, 50, padding=0) + self.fft_plot.setYRange(0, 10000) + self.right_layout.addWidget(self.fft_plot, stretch=1) + + # Brainpower Plot + self.bar_chart_widget = pg.PlotWidget() + self.bar_chart_widget.setBackground('black') + self.bar_chart_widget.showGrid(x=True, y=True, alpha=0.3) + self.bar_chart_widget.setLabel('bottom', 'Brainpower Bands') + self.bar_chart_widget.setLabel('left', 'Relative Power') + self.bar_chart_widget.setXRange(-0.5, 4.5) + 
self.bar_chart_widget.setYRange(0, 1) + self.bar_chart_widget.setMouseEnabled(x=False, y=False) + self.brainwave_bars = pg.BarGraphItem(x=[0, 1, 2, 3, 4], height=[0, 0, 0, 0, 0], width=0.5, brushes=[pg.mkBrush(color) for color in self.colors[:5]]) + self.bar_chart_widget.addItem(self.brainwave_bars) + self.bar_chart_widget.getAxis('bottom').setTicks([[(0, 'Delta'), (1, 'Theta'), (2, 'Alpha'), (3, 'Beta'), (4, 'Gamma')]]) + self.right_layout.addWidget(self.bar_chart_widget, stretch=1) + + self.main_layout.addWidget(self.right_container, stretch=1) + self.setCentralWidget(self.central_widget) + # Initialize plots + self.eeg_plots = [] + self.eeg_curves = [] + self.fft_curves = [] + self.init_plots() + + def init_plots(self): + # Clear existing plots + for i in reversed(range(self.eeg_layout.count())): + widget = self.eeg_layout.itemAt(i).widget() + if widget is not None: + widget.setParent(None) + + self.eeg_plots = [] + self.eeg_curves = [] + self.fft_plot.clear() + self.fft_curves = [] + + # Create EEG plots for all channels + for ch in range(self.num_channels): + plot = PlotWidget() + plot.setBackground('black') + plot.showGrid(x=True, y=True, alpha=0.3) + plot.setLabel('left', f'Ch {ch+1}', color='white') + plot.getAxis('left').setTextPen('white') + plot.getAxis('bottom').setTextPen('white') + plot.setYRange(-5000, 5000, padding=0) + plot.setXRange(0, DISPLAY_DURATION, padding=0) + plot.setMouseEnabled(x=False, y=True) + plot.setSizePolicy(QSizePolicy.Expanding, QSizePolicy.Expanding) + + color = self.colors[ch % len(self.colors)] + pen = pg.mkPen(color=color, width=2) + curve = plot.plot(pen=pen) + + self.eeg_layout.addWidget(plot) + self.eeg_plots.append(plot) + self.eeg_curves.append((ch, curve)) + + plot.setVisible(False) # Initially hide all plots + + # Create FFT curves for all channels + for ch in range(self.num_channels): + color = self.colors[ch % len(self.colors)] + pen = pg.mkPen(color=color, width=2) + self.fft_curves.append((ch, 
self.fft_plot.plot(pen=pen))) + + self.update_plot_visibility() + + def update_plot_visibility(self): + for idx, (ch, curve) in enumerate(self.eeg_curves): + visible = ch in self.selected_eeg_channels + self.eeg_plots[idx].setVisible(visible) + if visible: + self.eeg_layout.setStretch(self.eeg_layout.indexOf(self.eeg_plots[idx]), 1) + + # Update FFT curve visibility + for ch, curve in self.fft_curves: + curve.setVisible(ch in self.selected_eeg_channels) + + def show_settings(self): + dialog = SettingBox(self.num_channels, self.selected_eeg_channels, self.selected_bp_channel, self) + if dialog.exec_(): + new_eeg_selection = [i for i, cb in enumerate(dialog.eeg_checkboxes) if cb.isChecked()] + new_bp_channel = dialog.bp_combobox.currentIndex() + + if (set(new_eeg_selection) != set(self.selected_eeg_channels) or + (new_bp_channel != self.selected_bp_channel)): + self.selected_eeg_channels = new_eeg_selection + self.selected_bp_channel = new_bp_channel + self.update_plot_visibility() + def update_plot(self): - samples, _ = self.inlet.pull_chunk(timeout=0.0) + samples, _ = self.inlet.pull_chunk(timeout=0.0, max_samples=50) if samples: - self.last_data_time = time.time() # Store the last data time + self.last_data_time = time.time() + for sample in samples: - raw_point = sample[0] - - notch_filtered, self.zi_notch = lfilter(self.b_notch, self.a_notch, [raw_point], zi=self.zi_notch) - band_filtered, self.zi_band = lfilter(self.b_band, self.a_band, notch_filtered, zi=self.zi_band) - band_filtered = band_filtered[-1] # Get the current filtered point - - # Update EEG data buffer - self.eeg_data.append(band_filtered) - - if len(self.moving_window) < 500: - self.moving_window.append(band_filtered) - else: - self.process_fft_and_brainpower() - - self.moving_window = deque(list(self.moving_window)[50:] + [band_filtered], maxlen=500) - - plot_data = np.array(self.eeg_data) - time_axis = np.linspace(0, 4, len(plot_data)) - self.eeg_curve.setData(time_axis, plot_data) - + 
self.data_processor.process_sample(sample) + + self.update_eeg_plots() + self.update_fft_plots() + self.update_brainpower_plot() else: if self.last_data_time and (time.time() - self.last_data_time) > 2: self.stream_active = False print("LSL stream disconnected!") self.timer.stop() self.close() + + def update_eeg_plots(self): + time_axis = np.linspace(0, DISPLAY_DURATION, len(self.data_processor.eeg_data[0])) + for ch, curve in self.eeg_curves: + if ch in self.selected_eeg_channels: + display_data = self.data_processor.get_display_data(ch) + curve.setData(time_axis, display_data) + + def update_fft_plots(self): + for ch, curve in self.fft_curves: + if ch in self.selected_eeg_channels: + time_data = list(self.data_processor.moving_windows[ch]) + freqs, fft_result = self.fft_analyzer.compute_fft(ch, time_data) + if fft_result is not None and freqs is not None: + curve.setData(freqs, fft_result) + + def update_brainpower_plot(self): + ch = self.selected_bp_channel + time_data = list(self.data_processor.moving_windows[ch]) + band_powers = self.fft_analyzer.compute_band_powers(ch, time_data) + + if band_powers is not None: + relative_powers = [band_powers['delta'], band_powers['theta'], band_powers['alpha'], band_powers['beta'], band_powers['gamma']] + self.brainwave_bars.setOpts(height=relative_powers) - def process_fft_and_brainpower(self): - window = np.hanning(len(self.moving_window)) - buffer_windowed = np.array(self.moving_window) * window - fft_result = np.abs(np.fft.rfft(buffer_windowed)) - fft_result /= len(buffer_windowed) - freqs = np.fft.rfftfreq(len(buffer_windowed), 1 / self.sampling_rate) - self.fft_curve.setData(freqs, fft_result) - - brainwave_power = self.calculate_brainwave_power(fft_result, freqs) - self.brainwave_bars.setOpts(height=brainwave_power) - - def calculate_brainwave_power(self, fft_data, freqs): - delta_power = math.sqrt(np.sum(((fft_data[(freqs >= 0.5) & (freqs <= 4)])**2)/4)) - theta_power = math.sqrt(np.sum(((fft_data[(freqs >= 4) & 
(freqs <= 8)])**2)/5)) - alpha_power = math.sqrt(np.sum(((fft_data[(freqs >= 8) & (freqs <= 13)])**2)/6)) - beta_power = math.sqrt(np.sum(((fft_data[(freqs >= 13) & (freqs <=30)])**2)/18)) - gamma_power = math.sqrt(np.sum(((fft_data[(freqs >= 30) & (freqs <= 45)])**2)/16)) - print("Delta Power", delta_power) - print("Theta Power", theta_power) - print("Alpha Power", alpha_power) - print("Beta Power", beta_power) - print("Gamma Power", gamma_power) - return [delta_power, theta_power, alpha_power, beta_power, gamma_power] - if __name__ == "__main__": app = QApplication(sys.argv) window = EEGMonitor() diff --git a/media/Interface.png b/media/Interface.png new file mode 100644 index 00000000..77056e8c Binary files /dev/null and b/media/Interface.png differ diff --git a/media/Webinterface.png b/media/Webinterface.png new file mode 100644 index 00000000..8257d65d Binary files /dev/null and b/media/Webinterface.png differ diff --git a/static/script.js b/static/script.js index 8ab3e95c..ed1684e4 100644 --- a/static/script.js +++ b/static/script.js @@ -1,3 +1,27 @@ +let isLogging = false; + +function logError(error) { + if (isLogging) return; // Prevent recursion + + isLogging = true; + try { + const errorMessage = `[${getTimestamp()}] ${error.stack || error.message || error}\n`; + + // Fallback to console if fetch fails + fetch('/log_error', { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ error: errorMessage }) + }).catch(() => { + console.error('Failed to log error:', errorMessage); + }); + + console.error(errorMessage); + } finally { + isLogging = false; + } +} + async function loadApps() { try { const appGrid = document.getElementById('app-grid'); @@ -21,7 +45,7 @@ async function loadApps() { return config.apps; } catch (error) { - console.error('Error loading apps config:', error); + logError('Error loading apps config:', error); // Show error state to user const appGrid = document.getElementById('app-grid'); @@ -48,8 
+72,9 @@ async function initializeApplication() { const apps = await loadApps(); renderApps(apps); setupCategoryFilter(apps); + startAppStatusChecker(); } catch (error) { - console.error('Application initialization failed:', error); + logError('Application initialization failed:', error); } } @@ -71,7 +96,8 @@ function renderApps(apps) { apps.forEach(app => { const card = document.createElement('div'); - card.className = `group bg-gradient-to-b from-white to-gray-50 dark:from-gray-700 dark:to-gray-800 rounded-xl shadow border hover:shadow-lg transition-all duration-300 dark:border-gray-700 cursor-pointer overflow-hidden`; + card.className = `group bg-gradient-to-b from-white to-gray-50 dark:from-gray-700 dark:to-gray-800 rounded-xl shadow border hover:shadow-lg transition-all duration-300 dark:border-gray-700 overflow-hidden cursor-pointer`; + card.id = `card-${app.script}`; card.innerHTML = `
@@ -79,55 +105,112 @@ function renderApps(apps) {
-

${app.title}

+
+

${app.title}

+ +

${app.description}

`; + updateAppStatus(app.script); card.addEventListener('click', async () => { - if (!isConnected) { - showAlert('Please connect to a device first using USB, WiFi or Bluetooth'); - return; - } - - // Add loading state to the clicked card - const originalContent = card.innerHTML; - card.innerHTML = ` -
- - Launching ${app.title}... -
- `; - - try { - const response = await fetch(`/check_app_status/${app.script}`); - - if (!response.ok) { - throw new Error('Failed to check app status'); - } - - const data = await response.json(); - - if (data.status === 'running') { - showAlert(`${app.title} is already running!`); - card.innerHTML = originalContent; - return; - } - - await launchApplication(app.script); - card.innerHTML = originalContent; - } catch (error) { - console.error('Error launching app:', error); - showAlert(`Failed to launch ${app.title}: ${error.message}`); - card.innerHTML = originalContent; - } + await handleAppClick(app, card); }); appGrid.appendChild(card); }); } +async function handleAppClick(app, card) { + const statusElement = document.getElementById(`status-${app.script}`); + if (statusElement && !statusElement.classList.contains('hidden')) { + return; + } + + if (!isConnected) { + showAlert('Please connect to a device first using USB, WiFi or Bluetooth'); + return; + } + + const originalContent = card.innerHTML; // Add loading state to the clicked card + card.innerHTML = ` +
+ + Launching ${app.title}... +
+ `; + + try { + const response = await fetch(`/check_app_status/${app.script}`); + + if (!response.ok) { + throw new Error('Failed to check app status'); + } + + const data = await response.json(); + + await launchApplication(app.script); + card.innerHTML = originalContent; + updateAppStatus(app.script); // Update status after launch + } catch (error) { + logError('Error launching app:', error); + showAlert(`Failed to launch ${app.title}: ${error.message}`); + card.innerHTML = originalContent; + } +} + +async function updateAppStatus(appName) { + try { + const response = await fetch(`/check_app_status/${appName}`); + if (!response.ok) return; + + const data = await response.json(); + const statusElement = document.getElementById(`status-${appName}`); + const cardElement = document.getElementById(`card-${appName}`); + + if (statusElement && cardElement) { + if (data.status === 'running') { + statusElement.classList.remove('hidden'); + cardElement.classList.add('cursor-not-allowed'); + cardElement.classList.remove('cursor-pointer'); + cardElement.classList.remove('hover:shadow-lg'); + cardElement.classList.add('opacity-60'); + } else { + statusElement.classList.add('hidden'); + cardElement.style.pointerEvents = 'auto'; + cardElement.classList.remove('cursor-not-allowed'); + cardElement.classList.add('cursor-pointer'); + cardElement.classList.add('hover:shadow-lg'); + cardElement.classList.remove('opacity-60'); + } + } + } catch (error) { + logError('Error checking app status:', error); + } +} + +// Periodically check all app statuses +function startAppStatusChecker() { + checkAllAppStatuses(); + setInterval(checkAllAppStatuses, 200); +} + +// Check status of all apps +function checkAllAppStatuses() { + const appGrid = document.getElementById('app-grid'); + if (!appGrid) return; + + const apps = appGrid.querySelectorAll('[id^="status-"]'); + apps.forEach(statusElement => { + const appName = statusElement.id.replace('status-', ''); + updateAppStatus(appName); + }); +} 
+ // Set up category filter with fixed options function setupCategoryFilter(apps) { const categorySelect = document.querySelector('select'); @@ -179,6 +262,7 @@ function filterAppsByCategory(category, allApps) { appGrid.style.opacity = '0'; setTimeout(() => { appGrid.style.opacity = '1'; + checkAllAppStatuses(); }, 10); }, 300); } @@ -230,6 +314,26 @@ let isRecording = false; let eventSource = null; let isScanning = false; +// Function to update the filename timestamp periodically +function startTimestampUpdater() { + updateFilenameTimestamp(); + setInterval(updateFilenameTimestamp, 1000); +} + +// Update the filename timestamp in the input field +function updateFilenameTimestamp() { + // Only update if recording is stop + if (!isRecording) { + const defaultName = `ChordsPy_${getTimestamp()}`; + filenameInput.placeholder = defaultName; + + // If the input is empty or has the default pattern, update the value too + if (!filenameInput.value || filenameInput.value.startsWith('ChordsPy_')) { + filenameInput.value = defaultName; + } + } +} + // Function to generate timestamp for filename function getTimestamp() { const now = new Date(); @@ -272,6 +376,10 @@ function initializeFilename() { const defaultName = `ChordsPy_${getTimestamp()}`; filenameInput.value = defaultName; filenameInput.placeholder = defaultName; + filenameInput.disabled = false; // Ensure input is enabled initially + filenameInput.classList.remove('bg-gray-100', 'dark:bg-gray-700', 'cursor-not-allowed'); + filenameInput.classList.add('dark:bg-gray-800'); + startTimestampUpdater(); } // Sanitize filename input - replace spaces and dots with underscores @@ -348,7 +456,7 @@ function scanBleDevices() { }) .catch(error => { isScanning = false; - console.error('BLE scan error:', error); + logError('BLE scan error:', error); bleDevicesList.innerHTML = `
Error scanning for devices. Please try again. @@ -429,8 +537,6 @@ connectBtn.addEventListener('click', async () => { postData.device_address = selectedBleDevice.address; } - // showStatus('Connecting...', 'fa-spinner fa-spin', 'text-blue-500'); - const response = await fetch('/connect', { method: 'POST', headers: { 'Content-Type': 'application/json' }, @@ -450,7 +556,7 @@ connectBtn.addEventListener('click', async () => { throw new Error(data.message || 'Connection failed'); } } catch (error) { - console.error('Connection error:', error); + logError('Connection error:', error); showStatus(`Connection failed: ${error.message}`, 'fa-times-circle', 'text-red-500'); // Return to connect state connectingBtn.classList.add('hidden'); @@ -463,7 +569,7 @@ connectBtn.addEventListener('click', async () => { // Poll connection status async function pollConnectionStatus() { let attempts = 0; - const maxAttempts = 5; // 5 seconds timeout + const maxAttempts = 15; // 15 seconds timeout const checkStatus = async () => { attempts++; @@ -491,7 +597,7 @@ async function pollConnectionStatus() { }); } } catch (error) { - console.error('Connection polling error:', error); + logError('Connection polling error:', error); showStatus(`Connection failed: Try again`, 'fa-times-circle', 'text-red-500'); // Return to connect state connectingBtn.classList.add('hidden'); @@ -526,7 +632,7 @@ function handleConnectionSuccess() { btn.disabled = true; }); - showStatus(`Connected via ${selectedProtocol.toUpperCase()}`, 'fa-check-circle', 'text-green-500'); + showStatus(`Connected via ${selectedProtocol.toUpperCase()}`, 'fa-check-circle'); // Start console updates startConsoleUpdates(); @@ -558,7 +664,6 @@ disconnectBtn.addEventListener('click', async () => { // Show connecting state during disconnection disconnectBtn.classList.add('hidden'); disconnectingBtn.classList.remove('hidden'); - // showStatus('Disconnecting...', 'fa-spinner fa-spin', 'text-blue-500'); const response = await 
fetch('/disconnect', { method: 'POST' }); const data = await response.json(); @@ -568,6 +673,7 @@ disconnectBtn.addEventListener('click', async () => { // Return to connect state disconnectingBtn.classList.add('hidden'); connectBtn.classList.remove('hidden'); + showStatus('Disconnected!', 'fa-times-circle', 'text-red-500'); // Reset all protocol buttons connectionBtns.forEach(btn => { @@ -604,7 +710,7 @@ disconnectBtn.addEventListener('click', async () => { } } } catch (error) { - console.error('Disconnection error:', error); + logError('Disconnection error:', error); // Return to disconnect state if disconnection failed disconnectingBtn.classList.add('hidden'); disconnectBtn.classList.remove('hidden'); @@ -625,7 +731,7 @@ function startConsoleUpdates() { }; eventSource.onerror = function() { - console.error('EventSource failed'); + logError('EventSource failed'); if (eventSource) { eventSource.close(); eventSource = null; @@ -662,10 +768,17 @@ function toggleRecording() { recordBtn.classList.remove('bg-gray-500'); recordBtn.classList.add('bg-red-500', 'hover:bg-red-600'); recordingStatus.classList.add('hidden'); + + // Enable filename input + filenameInput.disabled = false; + filenameInput.classList.remove('bg-gray-100', 'dark:bg-gray-700', 'cursor-not-allowed'); + filenameInput.classList.add('dark:bg-gray-800'); + updateFilenameTimestamp() + showStatus('Recording stopped', 'fa-stop-circle', 'text-red-500'); } }) .catch(error => { - console.error('Error stopping recording:', error); + logError('Error stopping recording:', error); }); } else { // Start recording - send the filename (or null for default) @@ -682,10 +795,16 @@ function toggleRecording() { recordBtn.classList.remove('bg-red-500', 'hover:bg-red-600'); recordBtn.classList.add('bg-gray-500'); recordingStatus.classList.remove('hidden'); + + // Disable filename input + filenameInput.disabled = true; + filenameInput.classList.add('bg-gray-100', 'dark:bg-gray-700', 'cursor-not-allowed'); + 
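`updateFilenameTimestamp` in this change regenerates a `ChordsPy_<timestamp>` default whenever recording is stopped, so every new recording gets a fresh, collision-free name. The same default sketched in Python; the exact `YYYYMMDD_HHMMSS` layout is an assumption, since the JS `getTimestamp` body is not fully shown here:

```python
from datetime import datetime
import re

def default_filename(now=None):
    """Default recording name: ChordsPy_ plus a second-resolution timestamp."""
    now = now or datetime.now()
    return f"ChordsPy_{now.strftime('%Y%m%d_%H%M%S')}"

name = default_filename(datetime(2024, 1, 2, 3, 4, 5))
print(name)  # ChordsPy_20240102_030405
```

Second-resolution timestamps are enough here because the UI refreshes the placeholder once per second and only while recording is stopped.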
filenameInput.classList.remove('dark:bg-gray-800'); + showStatus('Recording started', 'fa-record-vinyl', 'text-green-500'); } }) .catch(error => { - console.error('Error starting recording:', error); + logError('Error starting recording:', error); showAlert('Failed to start recording: ' + error.message); }); } @@ -698,11 +817,11 @@ initializeFilename(); // Set default filename with timestamp function showStatus(text, icon, colorClass) { const statusDiv = document.getElementById('connection-status'); statusText.textContent = text; - statusIcon.innerHTML = ``; + statusIcon.innerHTML = ``; statusDiv.classList.remove('hidden'); setTimeout(() => { statusDiv.classList.add('hidden'); - }, 2000); + }, 3000); } function showAlert(message) { @@ -716,6 +835,7 @@ function checkStreamStatus() { if (data.connected) { // If connected, update the frontend if (!isConnected) { + handleConnectionSuccess(); isConnected = true; connectBtn.classList.add('hidden'); connectingBtn.classList.add('hidden'); @@ -727,11 +847,13 @@ function checkStreamStatus() { } else { // If not connected, update the frontend if (isConnected) { + handleDisconnection(); isConnected = false; disconnectBtn.classList.add('hidden'); disconnectingBtn.classList.add('hidden'); connectingBtn.classList.add('hidden'); connectBtn.classList.remove('hidden'); + showStatus('Disconnected!', 'fa-times-circle', 'text-red-500'); // Re-enable protocol buttons setProtocolButtonsDisabled(false); @@ -743,6 +865,12 @@ function checkStreamStatus() { recordBtn.classList.remove('bg-gray-500'); recordBtn.classList.add('bg-red-500', 'hover:bg-red-600'); recordingStatus.classList.add('hidden'); + + // Enable filename input if recording was stopped due to disconnection + filenameInput.disabled = false; + filenameInput.classList.remove('bg-gray-100', 'dark:bg-gray-700', 'cursor-not-allowed'); + filenameInput.classList.add('dark:bg-gray-800'); + showStatus('Recording stopped (connection lost)', 'fa-stop-circle', 'text-red-500'); } // Stop 
console updates @@ -754,10 +882,42 @@ function checkStreamStatus() { } }) .catch(error => { - console.error('Error fetching stream status:', error); + logError('Error fetching stream status:', error); }); } +function handleDisconnection() { + isConnected = false; + disconnectBtn.classList.add('hidden'); + disconnectingBtn.classList.add('hidden'); + connectingBtn.classList.add('hidden'); + connectBtn.classList.remove('hidden'); + showStatus('Stream disconnected!', 'fa-times-circle', 'text-red-500'); + + // Reset protocol buttons + connectionBtns.forEach(btn => { + btn.disabled = false; + btn.classList.remove('bg-cyan-600', 'dark:bg-cyan-700', 'cursor-default'); + btn.classList.add('hover:bg-cyan-500', 'hover:text-white'); + }); + + // Handle recording state + if (isRecording) { + isRecording = false; + recordBtn.innerHTML = 'Start Recording'; + recordBtn.classList.remove('bg-gray-500'); + recordBtn.classList.add('bg-red-500', 'hover:bg-red-600'); + recordingStatus.classList.add('hidden'); + filenameInput.disabled = false; + showStatus('Recording stopped (stream lost)', 'fa-stop-circle', 'text-red-500'); + } + + if (eventSource) { + eventSource.close(); + eventSource = null; + } +} + // Call the checkStreamStatus function every 1 second setInterval(checkStreamStatus, 1000); @@ -767,6 +927,9 @@ checkStreamStatus(); // Initialize the app when DOM is loaded document.addEventListener('DOMContentLoaded', () => { initializeApplication(); +window.onerror = function(message, source, lineno, colno, error) { + logError(error || message); + return true; }; document.getElementById('github-btn').addEventListener('click', () => { window.open('https://github.com/upsidedownlabs/Chords-Python', '_blank'); diff --git a/templates/index.html b/templates/index.html index b65eb453..eaf842a1 100644 --- a/templates/index.html +++ b/templates/index.html @@ -74,9 +74,9 @@
-