references and potential issues in the future
if hidden files (such as .DS_Store) are present in a data directory, some utils.read functions may fail; examples of how to handle this are in utils.read_ExperimentEvents and read_OnixAnalogFrameCount (see also the reader sketch after these notes)
all data readers in harp/utils.py should be multi-file aware (either through Andrew's method using load_2 or explicitly in the read_xxx functions used by Hilde)
harp.REFERENCE_EPOCH is 1904-01-01T00:00:00; Harp timestamps are counted from this epoch (see the conversion sketch after these notes)
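
A minimal sketch of the reader behaviour the first two notes ask for (skip hidden files, concatenate multi-file recordings), assuming csv chunks with identical columns; the name read_session_csvs is illustrative and not part of the actual utils API:

```python
from pathlib import Path

import pandas as pd


def read_session_csvs(data_dir, pattern="*.csv"):
    """Read every matching csv under data_dir into one DataFrame.

    Hidden files such as .DS_Store are skipped, and all chunk files are
    concatenated in name order, so the same reader works whether a session
    was written as one file or split across several.
    """
    files = sorted(
        p for p in Path(data_dir).glob(pattern)
        if not p.name.startswith(".")  # ignore .DS_Store and other hidden files
    )
    if not files:
        raise FileNotFoundError(f"no files matching {pattern!r} in {data_dir}")
    return pd.concat((pd.read_csv(p) for p in files), ignore_index=True)
```

The same glob-and-filter step should slot into the existing read_* functions without changing their signatures.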
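
For the epoch note, a small sketch of converting a Harp timestamp (seconds since that epoch) into an absolute datetime; the helper name is hypothetical:

```python
from datetime import datetime, timedelta, timezone

HARP_EPOCH = datetime(1904, 1, 1, tzinfo=timezone.utc)  # value of harp.REFERENCE_EPOCH


def harp_seconds_to_datetime(seconds: float) -> datetime:
    """Convert a Harp timestamp (seconds since 1904-01-01T00:00:00 UTC) to a datetime."""
    return HARP_EPOCH + timedelta(seconds=seconds)


# timestamps recorded in 2025 are on the order of 3.8e9 seconds
print(harp_seconds_to_datetime(3.83e9))  # -> a date in 2025
```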
2025 April 9 -> going forward
Test notebooks 1 and 2 (now Sandbox 2), make papermill versions of them
go through experiment pipeline with Nora
discuss with Nora whether Cohort 0 analysis is needed (quite some coding for synchronisation; see the clock-mapping sketch below this list) - Cohort0 synchronisation #67
-- may be worth doing a whole V1 cohort instead
-- it also has "regular mismatch", which may be worth analysing
-- GRAB wakeup analysed and good
-- Visual SF tuning good
visual MM day 4 data issues (Cohort 1 Visual mismatch day 4 data issues #63)
FIXME load SLEAP data from multiple csv files (currently requires manual merging into a single csv file; see the merge sketch below this list) and add to figure 1
FIXME in photometry preprocess.py
create papermill batch processing for photometry, kept separate from the Bonsai preprocessing (see the batch-run sketch below this list)
Ede to set up the ONIX computer backup scheduler (same as on Hugh's setup): a weekly backup of C and D to NAS/backup and a daily incremental backup of DT to data/ONIX. Also decide how to handle the photometry data (a similar setup, but the folder structure may be tricky)
discuss further analysis requirements (Save high-level results csv after analysis #68) and SLEAP processing (SLEAP processing - observations and further steps #66)
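
On the Cohort 0 synchronisation point, most of the coding is typically mapping one acquisition clock onto another through shared sync events; a hypothetical sketch, not Cohort 0's actual pipeline, assuming paired timestamps of the same sync pulses on both clocks:

```python
import numpy as np


def map_clock(times_a, sync_a, sync_b):
    """Map timestamps from clock A onto clock B.

    sync_a and sync_b are the times of the same physical sync events as seen
    by each clock; a least-squares line absorbs both the constant offset and
    any slow clock drift between the two systems.
    """
    slope, offset = np.polyfit(sync_a, sync_b, 1)
    return slope * np.asarray(times_a) + offset
```

A linear fit covers offset plus constant drift; anything nonlinear would need piecewise interpolation between sync pulses (e.g. np.interp) instead.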
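
For the SLEAP FIXME, a sketch of what the automatic merge could look like, assuming the per-video csv exports share the same columns; the source_file column and the merge_sleap_csvs name are illustrative:

```python
from pathlib import Path

import pandas as pd


def merge_sleap_csvs(sleap_dir, pattern="*.csv"):
    """Concatenate SLEAP csv exports from one directory into a single DataFrame."""
    frames = []
    for path in sorted(Path(sleap_dir).glob(pattern)):
        if path.name.startswith("."):  # skip .DS_Store and other hidden files
            continue
        df = pd.read_csv(path)
        df["source_file"] = path.name  # keep provenance for debugging figure 1
        frames.append(df)
    if not frames:
        raise FileNotFoundError(f"no SLEAP csv files found in {sleap_dir}")
    return pd.concat(frames, ignore_index=True)
```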
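
For the papermill batch item, a sketch of a runner kept separate from the Bonsai preprocessing; the template notebook name and the session_path parameter are placeholders, not the actual repository layout:

```python
from pathlib import Path

import papermill as pm

TEMPLATE = "photometry_preprocess_template.ipynb"  # placeholder notebook name


def run_photometry_batch(session_dirs, out_dir="papermill_outputs"):
    """Execute the photometry notebook once per session directory."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    for session in session_dirs:
        out_nb = Path(out_dir) / f"{Path(session).name}.ipynb"
        pm.execute_notebook(
            TEMPLATE,
            str(out_nb),
            parameters={"session_path": str(session)},  # placeholder parameter name
        )
```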