#124 May 2026

124. Iterator::cycle — Round-Robin Without the Modulo Math

Round-robin assignment usually shows up as things[i % workers.len()] — fine until the index gets clever, the slice gets reordered, or the source isn’t even indexable. Iterator::cycle turns any Clone iterator into an infinite one, and the modulo dance disappears.

The textbook version: distribute jobs across a fixed pool of workers. Indexing works, but you’re carrying the index and the modulo around just to walk in a circle:

let workers = ["alice", "bob", "carol"];
let jobs = ["build", "test", "lint", "deploy", "notify"];

// Index-and-modulo: works, but the math is doing the iterating for you.
let mut assigned: Vec<(&str, &str)> = Vec::new();
for (i, job) in jobs.iter().copied().enumerate() {
    assigned.push((job, workers[i % workers.len()]));
}

assert_eq!(
    assigned,
    vec![
        ("build",  "alice"),
        ("test",   "bob"),
        ("lint",   "carol"),
        ("deploy", "alice"),
        ("notify", "bob"),
    ],
);

Swap the modulo for cycle and the loop tells you exactly what it’s doing — pull the next worker, forever:

let workers = ["alice", "bob", "carol"];
let jobs = ["build", "test", "lint", "deploy", "notify"];

let assigned: Vec<(&str, &str)> = jobs
    .iter()
    .copied()
    .zip(workers.iter().copied().cycle())
    .collect();

assert_eq!(
    assigned,
    vec![
        ("build",  "alice"),
        ("test",   "bob"),
        ("lint",   "carol"),
        ("deploy", "alice"),
        ("notify", "bob"),
    ],
);

The trick is zip — zipping a finite iterator with an infinite one stops as soon as the finite side runs out, so you never have to bound cycle yourself. No off-by-one, no bookkeeping for “did I already use this worker?”.

It also composes with take when you want a fixed-length output and the source is the short one:

let pattern = [1, 2, 3];

let twelve: Vec<i32> = pattern.iter().copied().cycle().take(12).collect();

assert_eq!(twelve, vec![1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]);

A handy companion is enumerate — when you want the round-robin and the original index together:

let colors = ["red", "green", "blue"];
let rows = ["one", "two", "three", "four", "five"];

let painted: Vec<(usize, &str, &str)> = rows
    .iter()
    .copied()
    .zip(colors.iter().copied().cycle())
    .enumerate()
    .map(|(i, (row, color))| (i, row, color))
    .collect();

assert_eq!(
    painted,
    vec![
        (0, "one",   "red"),
        (1, "two",   "green"),
        (2, "three", "blue"),
        (3, "four",  "red"),
        (4, "five",  "green"),
    ],
);

Two things worth knowing. cycle requires the underlying iterator to be Clone — it remembers the original and starts over each time it’s exhausted. An empty source doesn’t hang or panic: the cycled iterator is simply empty too, so .next() returns None every time and a zip against it produces nothing. And it’s lazy: nothing repeats until the consumer pulls another item, so pairing it with a finite iterator costs nothing extra.

Stable since Rust 1.0 — one of those iterator adapters that makes the modulo operator feel like the wrong tool the moment you remember it exists.

#123 May 2026

123. BTreeMap::pop_first — A Sorted Map That Doubles as a Priority Queue

BinaryHeap only goes one way — biggest first. When you want to pull the smallest or the largest from the same collection, reach for BTreeMap and let pop_first / pop_last do the work.

The classic shape: a queue of jobs keyed by priority where you sometimes need the most-urgent job and sometimes the least-urgent one. With BinaryHeap you’d pick a direction and stick with it (or wrap things in Reverse to flip it). With BTreeMap you get both ends for free, because the keys are already sorted:

use std::collections::BTreeMap;

let mut jobs: BTreeMap<u32, &str> = BTreeMap::new();
jobs.insert(5, "rebuild index");
jobs.insert(1, "send heartbeat");
jobs.insert(9, "page oncall");
jobs.insert(3, "rotate logs");

// Smallest key first — drain by priority.
assert_eq!(jobs.pop_first(), Some((1, "send heartbeat")));
assert_eq!(jobs.pop_first(), Some((3, "rotate logs")));

// Or grab the most urgent from the other end.
assert_eq!(jobs.pop_last(), Some((9, "page oncall")));

// Empty? You get None — same shape as Vec::pop.
let mut empty: BTreeMap<u32, &str> = BTreeMap::new();
assert_eq!(empty.pop_first(), None);

Both methods return Option<(K, V)> and remove the entry from the map. No second lookup, no .remove(key) follow-up after .first_key_value().

Where this really earns its keep is the “drain-in-order” loop — the kind of thing you’d otherwise write with a heap plus a sidecar map:

use std::collections::BTreeMap;

let mut tasks: BTreeMap<u32, String> = BTreeMap::new();
tasks.insert(20, "compact".into());
tasks.insert(10, "vacuum".into());
tasks.insert(30, "snapshot".into());

let mut order = Vec::new();
while let Some((priority, name)) = tasks.pop_first() {
    order.push((priority, name));
}

assert_eq!(
    order,
    vec![
        (10, "vacuum".into()),
        (20, "compact".into()),
        (30, "snapshot".into()),
    ],
);

Same loop, swap pop_first for pop_last and you drain in reverse order — no Reverse wrapper, no second collection.

BTreeSet got the same pair (pop_first / pop_last) at the same time, so a sorted set behaves like a deque you can pop from either end:

use std::collections::BTreeSet;

let mut ids: BTreeSet<u32> = BTreeSet::from([7, 2, 9, 4]);
assert_eq!(ids.pop_first(), Some(2));
assert_eq!(ids.pop_last(),  Some(9));
assert_eq!(ids.len(), 2);

A few things worth knowing. BTreeMap insertion is O(log n) — heavier than a BinaryHeap push, which amortises to O(1). If you genuinely only ever pop from one side and throughput matters, a heap still wins. The moment you need ordered iteration, range queries, or popping from both ends, BTreeMap is the better fit and pop_first / pop_last make that fit feel native.

Stable since Rust 1.66 — and one of those methods that quietly replaces a fistful of match arms once you remember it exists.

#122 May 2026

122. Option::filter — Keep Some Only When the Value Passes

You’ve got an Option<T>, but you only want to keep the Some if the value passes a test. The match-with-guard version works — Option::filter says the same thing in one call.

The shape that keeps showing up: parse something into an Option, then validate it. The naive version stacks an if on top of the unwrap:

fn parse_port(raw: Option<&str>) -> Option<u16> {
    let s = raw?;
    let n: u16 = s.parse().ok()?;
    if n > 0 { Some(n) } else { None }
}

assert_eq!(parse_port(Some("8080")), Some(8080));
assert_eq!(parse_port(Some("0")),    None);
assert_eq!(parse_port(Some("nope")), None);
assert_eq!(parse_port(None),         None);

Option::filter collapses that trailing if into the chain:

fn parse_port(raw: Option<&str>) -> Option<u16> {
    raw.and_then(|s| s.parse().ok())
       .filter(|&n| n > 0)
}

assert_eq!(parse_port(Some("8080")), Some(8080));
assert_eq!(parse_port(Some("0")),    None);

Same story for the match-with-guard pattern — when the predicate is the only thing the arm checks, filter reads better:

let name: Option<&str> = Some("");

// Before: pattern match with a guard
let valid = match name {
    Some(s) if !s.is_empty() => Some(s),
    _ => None,
};
assert_eq!(valid, None);

// After: just say what you mean
let valid = name.filter(|s| !s.is_empty());
assert_eq!(valid, None);

Two things worth knowing. First, the closure receives &T, not T — same as Iterator::filter. So for Option<i32> you write |&n| n > 0 or |n| *n > 0. For Option<String> auto-deref makes |s| !s.is_empty() just work.

Second, filter only keeps or drops — it never transforms. If the predicate returns true you get the original Some back, untouched. To transform, chain .map() after:

let trimmed = Some("  hello  ")
    .map(str::trim)
    .filter(|s| !s.is_empty());
assert_eq!(trimmed, Some("hello"));

Stable since Rust 1.27, and the kind of method that quietly disappears the boilerplate once you know it’s there.

#121 May 2026

121. rem_euclid — The Modulo That Doesn't Go Negative

-1 % 7 in Rust is -1, not 6. That’s a math gotcha lurking in every wraparound index, every piece of clock arithmetic, every “what day of the week” calculation. rem_euclid is the modulo you actually wanted.

Rust’s % operator follows the same rule as C: the sign of the result matches the sign of the dividend. Useful sometimes, surprising the rest of the time:

assert_eq!(-1_i32 % 7, -1);
assert_eq!(-8_i32 % 7, -1);
assert_eq!( 8_i32 % 7,  1);

Try indexing a circular buffer with that and you get a panic the first time you step backwards across zero. The fix is rem_euclid, which always returns a value in [0, |divisor|):

assert_eq!((-1_i32).rem_euclid(7), 6);
assert_eq!((-8_i32).rem_euclid(7), 6);
assert_eq!(( 8_i32).rem_euclid(7), 1);

A real-world shape — wrap an index around a slice in either direction, no if ladder, no manual + len trick:

let days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"];

fn day_after(today: i32, delta: i32) -> i32 {
    (today + delta).rem_euclid(7)
}

assert_eq!(days[day_after(0, -1) as usize], "Sun"); // Mon - 1 = Sun
assert_eq!(days[day_after(2,  4) as usize], "Sun"); // Wed + 4 = Sun
assert_eq!(days[day_after(0, -8) as usize], "Sun"); // wraps past week boundary

div_euclid is the partner that pairs with it: a.div_euclid(b) * b + a.rem_euclid(b) == a always holds, with a remainder that’s never negative. Plain / and % satisfy the same identity, but their remainder takes the dividend’s sign — the exact surprise we started with.

let a = -7_i32;
let b =  3_i32;
assert_eq!(a.div_euclid(b) * b + a.rem_euclid(b), a);

Both are available on every signed integer type (the floats have them too), and the integer versions are const. The rule of thumb: if your code can ever see a negative operand and you want the mathematician’s modulo — not the hardware’s — reach for rem_euclid.

#120 May 2026

120. OnceLock — Lazy Statics That Initialize on Your Schedule

LazyLock runs its initializer the first time anyone touches the value — fine when the inputs are baked in at compile time, useless when you only learn them at runtime. OnceLock is the same idea, but you decide when (and with what data) initialization happens.

The classic case: you want a global that’s expensive to build, and the data only exists after main starts — CLI args, env vars, a parsed config file. LazyLock can only reach those by baking the lookup into its closure, which hard-codes one initialization path for every consumer, tests included.

OnceLock solves it by separating creation from initialization:

use std::sync::OnceLock;

static CONFIG: OnceLock<String> = OnceLock::new();

fn main() {
    // Initialize from real runtime data, exactly once.
    let cfg = std::env::var("APP_CONFIG").unwrap_or_else(|_| "default".into());
    CONFIG.set(cfg).unwrap();

    assert_eq!(config(), "default"); // assumes APP_CONFIG is unset
}

fn config() -> &'static str {
    CONFIG.get().expect("config not initialized")
}

set returns Err if the cell was already filled — you get to decide whether that’s a panic, a log line, or a no-op.

For the read-mostly path, get_or_init combines “is it set?” and “set it” into a single thread-safe call. Concurrent callers race; the winner runs the closure, everyone else waits and reads the result:

use std::sync::OnceLock;

static GREETING: OnceLock<String> = OnceLock::new();

fn greeting() -> &'static str {
    GREETING.get_or_init(|| format!("hello, {}", "world"))
}

assert_eq!(greeting(), "hello, world");
assert_eq!(greeting(), "hello, world"); // cached, closure does not run again

When to reach for which: pick LazyLock when the initializer is self-contained and you’re happy with it firing on first touch. Pick OnceLock when you need to feed in runtime data — or when you want the option to ask “has this been set yet?” before triggering the work.

#119 May 2026

119. iter::from_fn — Build a Custom Iterator Without the Struct

Need a custom iterator that carries a bit of state? Skip the struct plus impl Iterator boilerplate. iter::from_fn turns any closure that returns Option<T> into a real iterator.

The problem

You want a custom iterator with some captured state — a counter, a parser cursor, a lazy generator. The textbook approach is a struct plus a manual Iterator impl:

struct Counter { n: u32, max: u32 }

impl Iterator for Counter {
    type Item = u32;
    fn next(&mut self) -> Option<u32> {
        if self.n < self.max {
            self.n += 1;
            Some(self.n)
        } else {
            None
        }
    }
}

let counter = Counter { n: 0, max: 5 };
let v: Vec<u32> = counter.collect();
assert_eq!(v, vec![1, 2, 3, 4, 5]);

Three lines of logic, ten lines of scaffolding.

The fix

std::iter::from_fn takes a FnMut() -> Option<T> and returns an iterator. Yield Some(x) for each item, None to stop. The closure’s captured variables become your iterator state:

use std::iter;

let mut n = 0;
let v: Vec<u32> = iter::from_fn(|| {
    n += 1;
    (n <= 5).then_some(n)
}).collect();
assert_eq!(v, vec![1, 2, 3, 4, 5]);

The closure captures n mutably and updates it on every call. No struct, no impl, no type to name.

Where it shines: tiny tokenizers

from_fn really earns its keep when paired with Peekable::next_if. Pull characters until a condition fails and you have a one-liner tokenizer:

use std::iter;

let mut chars = "123abc".chars().peekable();
let digits: String =
    iter::from_fn(|| chars.next_if(|c| c.is_ascii_digit())).collect();
assert_eq!(digits, "123");

// chars still has "abc" left to consume
let rest: String = chars.collect();
assert_eq!(rest, "abc");

Reach for from_fn whenever the iterator’s state would just be a couple of local variables. Reach for a manual impl Iterator when the iterator is a public type, needs to implement other traits, or you want a size hint and specialization.

#118 May 2026

118. [T; N]::map — Transform an Array Without Allocating a Vec

[1, 2, 3].iter().map(|n| n * 2).collect::<Vec<_>>() works, but you’ve thrown the length away in the type and paid for a heap allocation. Arrays have their own map — same shape in, same shape out, no Vec in sight.

The reflex for transforming an array is the iterator chain:

let nums = [1, 2, 3, 4];
let doubled: Vec<i32> = nums.iter().map(|n| n * 2).collect();
assert_eq!(doubled, vec![2, 4, 6, 8]);

That gives you a Vec<i32>. The compiler no longer knows the length, and you paid a heap allocation along the way. If you want the array shape back, you’re stuck with try_into and an unwrap you don’t want.

[T; N]::map skips all of it. The output is [U; N] — same N, brand-new element type:

let nums = [1, 2, 3, 4];
let doubled: [i32; 4] = nums.map(|n| n * 2);
assert_eq!(doubled, [2, 4, 6, 8]);

No heap, no length erased, no try_into. Just an array on the stack with a different element type.

It takes each element by value, so it works fine with non-Copy types — no clone dance:

let names = [String::from("a"), String::from("bb"), String::from("ccc")];
let lens: [usize; 3] = names.map(|s| s.len());
assert_eq!(lens, [1, 2, 3]);

The closure consumes the String, the array is moved, and you get a fresh [usize; 3] back. Compare to the iterator version, which would need .into_iter() plus a try_into to recover the array type.

It’s also a clean way to build initialized arrays from one you already have — RGB to RGBA, raw bytes to parsed records, anything fixed-width:

let rgb: [u8; 3] = [200, 100, 50];
let rgba: [u8; 4] = {
    let [r, g, b] = rgb.map(|c| c.saturating_add(5));
    [r, g, b, 255]
};
assert_eq!(rgba, [205, 105, 55, 255]);

When you genuinely want a Vec, .iter().map().collect() still wins. But when the length is part of the design — config slots, fixed-N pipelines, embedded buffers, no_std code — [T; N]::map keeps that fact in the type system instead of throwing it away.

#117 May 2026

117. Iterator::step_by — Every Nth Element Without filter + enumerate

Want every 3rd value from a series? The reflex is enumerate().filter(|(i, _)| i % 3 == 0) — three combinators, one modulo, and you’ve thrown away the indices anyway. step_by(3) does the same thing in one call.

The classic shape: keep every Nth item, drop the rest. Most people reach for enumerate plus a modulo filter:

let xs = [10, 20, 30, 40, 50, 60, 70];

let evens_by_index: Vec<_> = xs
    .iter()
    .enumerate()
    .filter(|(i, _)| i % 2 == 0)
    .map(|(_, x)| *x)
    .collect();

assert_eq!(evens_by_index, [10, 30, 50, 70]);

That works, but you’re indexing just to throw the index away, and the filter inspects every element even though the stride is known up front.

Iterator::step_by(n) yields the first item, then skips n - 1 elements before each subsequent one. Same result, no bookkeeping:

let xs = [10, 20, 30, 40, 50, 60, 70];

let stepped: Vec<_> = xs.iter().step_by(2).copied().collect();

assert_eq!(stepped, [10, 30, 50, 70]);

The first element is always included — step_by(n) starts at index 0, then jumps. If you want to skip the first one, chain with skip:

let xs = [10, 20, 30, 40, 50, 60, 70];

let from_second: Vec<_> = xs.iter().skip(1).step_by(2).copied().collect();

assert_eq!(from_second, [20, 40, 60]);

It composes nicely with ranges, which is where it really shines — multiples, downsampling, every-other-frame logic without writing the loop yourself:

// All multiples of 5 up to 30 (inclusive)
let multiples: Vec<i32> = (0..=30).step_by(5).collect();
assert_eq!(multiples, [0, 5, 10, 15, 20, 25, 30]);

// Downsample a buffer to one in four
let signal: Vec<f32> = (0..16).map(|i| i as f32).collect();
let downsampled: Vec<f32> = signal.iter().step_by(4).copied().collect();
assert_eq!(downsampled, [0.0, 4.0, 8.0, 12.0]);

One footgun: step_by(0) panics. The step has to be at least 1, which makes sense — you can’t “advance by zero” and make progress — but it’s a runtime panic, not a compile error, so don’t pass a step you computed at runtime without checking.

// This would panic: (0..10).step_by(0)
fn safe_step(xs: &[i32], n: usize) -> Vec<i32> {
    if n == 0 { return Vec::new(); }
    xs.iter().step_by(n).copied().collect()
}

assert_eq!(safe_step(&[1, 2, 3, 4], 0), Vec::<i32>::new());
assert_eq!(safe_step(&[1, 2, 3, 4], 2), vec![1, 3]);

Reach for step_by whenever you’d otherwise write enumerate().filter(|(i, _)| i % n == 0) — same behavior, half the code, and the iterator can actually skip elements instead of inspecting every one.

#116 May 2026

116. Path::file_prefix — Get the Real Stem of archive.tar.gz

Path::file_stem strips the last extension, so archive.tar.gz comes back as archive.tar. That’s almost never what you want for double-extension files. file_prefix strips from the first dot instead — archive, finally.

The classic confusion. You ask for the “stem” of a tarball and get something with .tar still glued on:

use std::path::Path;

let p = Path::new("backups/archive.tar.gz");

assert_eq!(p.file_stem(),   Some("archive.tar".as_ref()));
assert_eq!(p.extension(),   Some("gz".as_ref()));

file_stem takes the file name and drops everything from the last . onwards. For a single extension that’s fine. For .tar.gz, .min.js, .d.ts, .spec.ts, you end up doing the second strip yourself:

use std::path::Path;

fn real_stem_old(p: &Path) -> Option<&str> {
    let stem = p.file_stem()?.to_str()?;
    Some(stem.split('.').next().unwrap_or(stem))
}

assert_eq!(real_stem_old(Path::new("archive.tar.gz")), Some("archive"));
assert_eq!(real_stem_old(Path::new("bundle.min.js")),  Some("bundle"));

Works, but you’ve left OsStr land just to do a string split, and you’ve quietly made the function lossy on non-UTF-8 paths.

Rust 1.91 stabilised Path::file_prefix. It returns the file name up to the first . — staying in OsStr the whole time:

use std::path::Path;

assert_eq!(Path::new("archive.tar.gz").file_prefix(), Some("archive".as_ref()));
assert_eq!(Path::new("bundle.min.js").file_prefix(),  Some("bundle".as_ref()));
assert_eq!(Path::new("notes.md").file_prefix(),       Some("notes".as_ref()));
assert_eq!(Path::new("README").file_prefix(),         Some("README".as_ref()));

Leading dots on dotfiles are kept — exactly like file_stem already does — so you don’t accidentally turn .bashrc into an empty string:

use std::path::Path;

assert_eq!(Path::new(".bashrc").file_prefix(),     Some(".bashrc".as_ref()));
assert_eq!(Path::new(".config.toml").file_prefix(), Some(".config".as_ref()));

Pair it with file_stem when you want both halves of a multi-extension name in one place:

use std::path::Path;

let p = Path::new("logs/app.2026-05-03.log.gz");
let prefix = p.file_prefix().and_then(|s| s.to_str()).unwrap_or("");
let stem   = p.file_stem().and_then(|s| s.to_str()).unwrap_or("");

assert_eq!(prefix, "app");                     // the real name
assert_eq!(stem,   "app.2026-05-03.log");      // everything except the final ext

Reach for file_prefix whenever a filename has more than one dot and you want the part a human would call “the name”.

#115 May 2026

115. Vec::resize_with — Grow a Vec With a Closure, Not a Clone

Vec::resize makes every new slot a clone of the same value. When you need fresh values per slot — counters, allocations, defaults — resize_with calls a closure for each new element instead.

Vec::resize(n, value) is fine when the filler is cheap and identical, but it has two annoyances. It needs T: Clone, and every new slot is a clone of the same value. Whether that matters depends entirely on what Clone does:

// Every slot is a clone of the same Vec::new() — each clone is its own allocation.
let mut grid: Vec<Vec<u8>> = Vec::new();
grid.resize(3, Vec::new());
grid[0].push(42);
assert_eq!(grid[1], vec![]); // fine — Vec::new() clones to a new empty Vec

That one happens to be safe because Vec::clone actually allocates. But the moment your T is Rc<RefCell<…>>, every slot points at the same cell. And if T isn’t Clone at all, you can’t call resize in the first place.

resize_with takes a closure and calls it once per new slot:

let mut counter = 0;
let mut v = vec![10, 20];
v.resize_with(5, || {
    counter += 1;
    counter
});
assert_eq!(v, vec![10, 20, 1, 2, 3]);

The closure can capture mutable state, so each call is fresh. Generating IDs, pulling from an RNG, allocating independent buffers — all easy:

let mut next_id = 100;
let mut buffers: Vec<(usize, Vec<u8>)> = Vec::new();
buffers.resize_with(3, || {
    let id = next_id;
    next_id += 1;
    (id, Vec::with_capacity(1024))
});
assert_eq!(buffers[0].0, 100);
assert_eq!(buffers[2].0, 102);

For non-Clone types, Default::default is the usual filler:

#[derive(Default, Debug, PartialEq)]
struct Slot {
    open: bool,
    payload: Vec<u8>,
}

let mut slots: Vec<Slot> = Vec::new();
slots.resize_with(2, Default::default);
assert_eq!(slots, vec![Slot::default(), Slot::default()]);

Shrinking still works, and the closure is never called when the new length is smaller:

let mut v = vec![1, 2, 3, 4, 5];
v.resize_with(2, || unreachable!());
assert_eq!(v, vec![1, 2]);

Reach for resize_with whenever the filler isn’t a single static value — and especially when T doesn’t (or shouldn’t) implement Clone.