@@ -8,6 +8,8 @@ Return the loss corresponding to mean absolute error:
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: mae
+
 julia> y_model = [1.1, 1.9, 3.1];
 
 julia> mae(y_model, 1:3)
@@ -31,6 +33,8 @@ See also: [`mae`](@ref), [`msle`](@ref), [`crossentropy`](@ref).
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: mse
+
 julia> y_model = [1.1, 1.9, 3.1];
 
 julia> y_true = 1:3;
@@ -57,6 +61,8 @@ Penalizes an under-estimation more than an over-estimation.
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: msle
+
 julia> msle(Float32[1.1, 2.2, 3.3], 1:3)
 0.009084041f0
 
@@ -113,6 +119,8 @@ of label smoothing to binary distributions encoded in a single number.
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
 2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  ⋅ ⋅ ⋅ 1 ⋅ 1
@@ -179,6 +187,8 @@ See also: [`logitcrossentropy`](@ref), [`binarycrossentropy`](@ref), [`logitbina
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: label_smoothing, crossentropy
+
 julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
 3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  1 ⋅ ⋅ ⋅ 1
@@ -232,6 +242,8 @@ See also: [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref), [`lab
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: crossentropy, logitcrossentropy
+
 julia> y_label = onehotbatch(collect("abcabaa"), 'a':'c')
 3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
  1 ⋅ ⋅ 1 ⋅ 1 1
@@ -273,7 +285,10 @@ computing the loss.
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
 
 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, crossentropy
+
 julia> y_bin = Bool[1,0,1]
 3-element Vector{Bool}:
  1
@@ -314,7 +329,10 @@ Mathematically equivalent to
 See also: [`crossentropy`](@ref), [`logitcrossentropy`](@ref).
 
 # Examples
+
 ```jldoctest
+julia> using Flux.Losses: binarycrossentropy, logitbinarycrossentropy
+
 julia> y_bin = Bool[1,0,1];
 
 julia> y_model = Float32[2, -1, pi]
@@ -348,6 +366,8 @@ from the other. It is always non-negative, and zero only when both the distribut
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: kldivergence
+
 julia> p1 = [1 0; 0 1]
 2×2 Matrix{Int64}:
  1 0
@@ -467,6 +487,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`binarycrossentropy`](@
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: binary_focal_loss
+
 julia> y = [0 1 0
  1 0 1]
 2×3 Matrix{Int64}:
@@ -509,6 +531,8 @@ For `γ == 0`, the loss is mathematically equivalent to [`crossentropy`](@ref).
 # Examples
 
 ```jldoctest
+julia> using Flux.Losses: focal_loss
+
 julia> y = [1 0 0 0 1
  0 1 0 1 0
  0 0 1 0 0]
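Taken together, these hunks make each `jldoctest` block self-contained: the examples previously assumed names like `mae` and `mse` were already in scope, so a doctest could fail when run in isolation. The explicit `using Flux.Losses: ...` line at the top of each block fixes that. A minimal sketch of the first updated doctest run on its own (assuming Flux is installed; the printed value is shown approximately here, up to floating-point rounding):

```julia
# Self-contained run of the mae doctest after this change.
using Flux.Losses: mae

y_model = [1.1, 1.9, 3.1]

# Each prediction is off by 0.1 from the targets 1:3,
# so the mean absolute error is ≈ 0.1.
mae(y_model, 1:3)
```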