Tutorial on binder #111

@zain-sohail

Description

Binder has a strict memory limit of 2 GB. Even after saving the data into parquet files, the filling method uses much more memory than that, which crashes the kernel. Any suggestions? One possibility:

  • save the already-filled dataframe in the GitLab repo and load that on Binder

Metadata

Labels

distribution (Software packaging and distribution)
