
Extrema, flattened iterator and too much memory allocation #34385

Closed
@jakubwro

Description


Hi,

The issue I am reporting is related to this Discourse conversation: https://discourse.julialang.org/t/concatenating-iterables-without-allocating-memory/33282
I also commented some details on another issue: #31442 (comment)

I encountered it when using the extrema function with a flattened iterator, but here I am providing a more distilled example.

First of all, I need two big arrays that we are going to concatenate with a lazy iterator.

const signal1 = rand(10000000)
const signal2 = rand(10000000)
const flat = Iterators.flatten((signal1, signal2))
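
For reference, the non-distilled reproduction is simply calling extrema on this lazy iterator (timings for that call are not included here, only the distilled functions below):

@time extrema(flat)  # the original call where the excessive allocations showed up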

Now I am going to define two functions that iterate over this flattened data. They are based on the extrema implementation.

function works_ok(itr)
    y = iterate(itr)
    (v, s) = y
    y = iterate(itr, s)
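    # destructure the current result at the top of the loop, advance the iterator at the bottom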
    while y !== nothing
        (v, s) = y
        y = iterate(itr, s)
    end

    return v
end

function gives_strange_result(itr)
    y = iterate(itr)
    (v, s) = y
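    # advance the iterator first, then check for nothing, then destructure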
    while y !== nothing
        y = iterate(itr, s)
        y === nothing && break
        (v, s) = y
    end

    return v
end

The second one has strange timings and memory consumption.

julia> @time works_ok(flat)
  0.024651 seconds (7 allocations: 240 bytes)
0.9342115147070622

julia> @time gives_strange_result(flat)
  0.252765 seconds (20.00 M allocations: 610.352 MiB, 22.57% gc time)
0.9342115147070622
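
One way to investigate where the extra allocations come from (a suggested check, not part of the original measurements) is to compare what the compiler infers for the two loops, for example with @code_warntype:

using InteractiveUtils

@code_warntype works_ok(flat)
@code_warntype gives_strange_result(flat)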

There is no memory leak; I tried running it in a loop 1000 times.
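
For reference, a minimal sketch of that check (assuming it simply calls the function repeatedly and watches whether memory keeps growing):

# repeated calls: per-call allocations stay the same and memory is reclaimed by the GC
for _ in 1:1000
    gives_strange_result(flat)
end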

Metadata

Labels: fold (sum, maximum, reduce, foldl, etc.)
