WIP: leverage Base reduction functions #7


Closed · wants to merge 4 commits

Conversation

simonbyrne (Member):

This is something I have been meaning to do for a while. The basic idea is to define a KBN (Kahan–Babuška–Neumaier) addition operator and define all the reductions in terms of it.

The diff (excerpt):

```julia
import Base.TwicePrecision


function plus_kbn(x::T, y::T) where {T}
```
Review comment (Member):
Should these be restricted to `T<:Real`? Also, do they necessarily have to be the same type? Using `+` as we do here should take care of any promotion that needs to happen.





Review comment (Member):
This is a lot of blank lines. Could you reduce it to just one, maybe two?

simonbyrne (Member Author):
No. I need those for caching :trollface:
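For readers following along, here is a self-contained sketch of where the diff is headed. This is my reconstruction under assumptions, not the PR's actual code: a two-sum style `plus_kbn` that carries the rounding error in `Base.TwicePrecision`, plus a `sum_kbn` built on a left fold.

```julia
# Sketch only: the method bodies here are assumptions about the
# approach, not the PR's actual diff.
import Base: TwicePrecision

# Error-free two-sum: the returned pair satisfies hi + lo == x + y exactly.
function plus_kbn(x::T, y::T) where {T<:AbstractFloat}
    hi = x + y
    lo = abs(x) > abs(y) ? (x - hi) + y : (y - hi) + x
    TwicePrecision{T}(hi, lo)
end

# KBN update step: fold a new term into a running compensated sum,
# accumulating the rounding error in the `lo` slot.
function plus_kbn(x::TwicePrecision{T}, y::T) where {T<:AbstractFloat}
    hi = x.hi + y
    lo = abs(x.hi) > abs(y) ? (x.hi - hi) + y : (y - hi) + x.hi
    TwicePrecision{T}(hi, x.lo + lo)
end

# foldl (rather than reduce) guarantees strict left-to-right association,
# so only the TwicePrecision/Float64 method is ever called.
sum_kbn(xs::AbstractVector{T}) where {T<:AbstractFloat} =
    (s = foldl(plus_kbn, xs; init=TwicePrecision{T}(zero(T), zero(T))); s.hi + s.lo)
```

For example, `sum_kbn([1.0, 1e100, 1.0, -1e100])` gives `2.0`, where a naive left-to-right accumulation gives `0.0`.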

@ararslan (Member):

This looks really cool! Is there any noticeable change in performance?

@simonbyrne (Member Author) commented Dec 12, 2017:

AHHHHH!!!!!

This doesn't work on 0.7 because `r_promote` was removed (JuliaLang/julia#22825). The recommendation seems to be to combine the promotion into the mapping function, but that would be pretty inefficient in this case, as it would end up calling `plus_kbn(::TwicePrecision{N}, ::TwicePrecision{N})` instead of `plus_kbn(::TwicePrecision{N}, ::Float64)`.

simonbyrne added a commit to JuliaLang/julia that referenced this pull request Dec 12, 2017
Since the demise of `r_promote` in #22825, there is now a type-instability in `mapreduce` if the operator does not give an element of the same type as the input. This arose during my implementation of Kahan summation using a reduction operator, see: JuliaMath/KahanSummation.jl#7

This adds a `mapreduce_single` function which defines what the result should be in these cases.
simonbyrne added a commit to JuliaLang/julia that referenced this pull request Dec 14, 2017 (same commit message as above)
simonbyrne added a commit to JuliaLang/julia that referenced this pull request Jan 3, 2018 (same commit message as above)
JeffBezanson pushed a commit to JuliaLang/julia that referenced this pull request Jan 4, 2018 (same commit message as above)
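A note for later readers: the hook added by the commits above was merged, and exists in released Julia as `Base.reduce_first` / `Base.mapreduce_first`. A minimal sketch (with a hypothetical `plus_kbn`, not the PR's code) of how defining it removes the type instability:

```julia
import Base: TwicePrecision

# Hypothetical KBN update step: fold one Float64 term into a
# TwicePrecision accumulator (hi = running sum, lo = compensation).
function plus_kbn(acc::TwicePrecision{Float64}, y::Float64)
    hi = acc.hi + y
    lo = abs(acc.hi) > abs(y) ? (acc.hi - hi) + y : (y - hi) + acc.hi
    TwicePrecision{Float64}(hi, acc.lo + lo)
end

# Lift the first element into the accumulator type, so every subsequent
# call is plus_kbn(::TwicePrecision, ::Float64) -- no instability, and no
# per-element promotion in the mapping function.
Base.reduce_first(::typeof(plus_kbn), x::Float64) = TwicePrecision{Float64}(x, 0.0)

# Note: a complete version would also define
# plus_kbn(::TwicePrecision, ::TwicePrecision) so that reduce's pairwise
# block merges (arrays beyond the pairwise block size) can combine accumulators.
sum_kbn(xs) = (s = reduce(plus_kbn, xs); s.hi + s.lo)
```

With this hook in place, `reduce` never calls the operator on two raw `Float64`s, and single-element inputs are handled by `reduce_first` as well.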
@mgr327 commented Jul 4, 2019:

Would it be possible to re-activate/implement that pull request?

@JeffreySarnoff (Member) commented Jun 7, 2022:

We do not care about support for Julia 0.7. If you can get this to work as intended on Julia 1.6+, we are open to including it.

@JeffreySarnoff JeffreySarnoff deleted the sb/reduce branch June 7, 2022 13:04
@asinghvi17 commented May 10, 2025:

I just looked at this code again, and it seems to work fine on v1.11.

```julia
# this implementation
julia> @be sum_kbn($(rand(1000)))
Benchmark: 3348 samples with 16 evaluations
 min    1.677 μs
 median 1.734 μs
 mean   1.738 μs
 max    7.698 μs

# the implementation on master
julia> @be KahanSummation.sum_kbn($(rand(1000)))
Benchmark: 3336 samples with 16 evaluations
 min    1.682 μs
 median 1.698 μs
 mean   1.740 μs
 max    4.336 μs

# compared to current Base pairwise summation, slower of course
julia> @be sum($(rand(1000)))
Benchmark: 3198 samples with 290 evaluations
 min    98.131 ns
 median 98.707 ns
 mean   100.700 ns
 max    268.390 ns
```

I attached a timer output, and it never calls the method for two `TwicePrecision`s. (Annotations mine.)

```julia
julia> to
──────────────────────────────────────────────────────────────────────────────────────
                                                Time                  Allocations
                                        ───────────────────    ───────────────────────
Tot / % measured:                          2.13s /   0.8%         3.68MiB /   0.0%

Section                              ncalls     time    %tot     avg     alloc    %tot      avg
──────────────────────────────────────────────────────────────────────────────────────
+(::Float64, ::TwicePrecision{N})     1.72M   17.9ms   99.9%  10.4ns     0.00B      -%    0.00B
+(::Float64, ::Float64)               1.72k   17.4μs    0.1%  10.1ns     0.00B      -%    0.00B
──────────────────────────────────────────────────────────────────────────────────────
```

But this seems mergeable as is.

cc @simonbyrne @JeffreySarnoff
