
Substantial slowdown while downloading thousands of files #43

@coezbek

Description


I am using WSL2 to run gphotos-cdp and have noticed that performance drops noticeably as more files accumulate in the download folder. From my measurements, each download takes roughly 1 s longer for every 1,000 files already downloaded.

I have traced it to a readDir call in the download function, which scans the entire download folder on every download and therefore takes longer and longer once thousands of entries have piled up there.
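For reference, here is a standalone sketch of how one might measure that scan cost directly. This is not the gphotos-cdp code; taking the folder path from a command-line argument is an assumption for the example. os.ReadDir reads and sorts every entry, so timing a single call over a folder with thousands of already-downloaded items makes the growth visible:

```go
// Standalone sketch: time one os.ReadDir call over the download folder to
// see how the per-scan cost grows with the number of entries already there.
package main

import (
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	if len(os.Args) < 2 {
		log.Fatal("usage: readdir-timer <download-folder>")
	}
	dlDir := os.Args[1]

	start := time.Now()
	entries, err := os.ReadDir(dlDir) // reads and sorts every directory entry
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("ReadDir over %d entries took %v\n", len(entries), time.Since(start))
}
```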

As a quick fix I have modified the moveDownload function to move the downloaded files/folders into a subdirectory called 'results' (see the newDir := line):

```go
func (s *Session) moveDownload(ctx context.Context, dlFile, location string) (string, error) {
	log.Printf("Move Download start")
	parts := strings.Split(location, "/")
	if len(parts) < 5 {
		return "", fmt.Errorf("not enough slash separated parts in location %v: %d", location, len(parts))
	}
	// Put each item under an extra "results" subdirectory so the download
	// folder itself stays small and cheap for readDir to scan.
	newDir := filepath.Join(s.dlDir, "results", parts[4])
	if err := os.MkdirAll(newDir, 0700); err != nil {
		return "", err
	}
	// ... remainder of the function (moving dlFile into newDir) unchanged.
}
```

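With that change, finished items accumulate under results/ while the top level of the download folder stays small, so the per-download readDir scan stays cheap no matter how many items have already been downloaded. Here is a quick standalone way to check the resulting layout after a run; the folder path as a command-line argument is an assumption, and only the "results" name comes from the snippet above:

```go
// Standalone sketch: count top-level entries vs. finished items under
// results/ after a run, to confirm the scanned folder stays small.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	if len(os.Args) < 2 {
		log.Fatal("usage: layout-check <download-folder>")
	}
	dlDir := os.Args[1]

	top, err := os.ReadDir(dlDir)
	if err != nil {
		log.Fatal(err)
	}
	results, err := os.ReadDir(filepath.Join(dlDir, "results"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("top-level entries: %d, items under results/: %d\n", len(top), len(results))
}
```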