// alloc/boxed.rs
//! The `Box<T>` type for heap allocation.
//!
//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
//! heap allocation in Rust. Boxes provide ownership for this allocation, and
//! drop their contents when they go out of scope. Boxes also ensure that they
//! never allocate more than `isize::MAX` bytes.
//!
//! # Examples
//!
//! Move a value from the stack to the heap by creating a [`Box`]:
//!
//! ```
//! let val: u8 = 5;
//! let boxed: Box<u8> = Box::new(val);
//! ```
//!
//! Move a value from a [`Box`] back to the stack by [dereferencing]:
//!
//! ```
//! let boxed: Box<u8> = Box::new(5);
//! let val: u8 = *boxed;
//! ```
//!
//! Creating a recursive data structure:
//!
//! ```
//! # #[allow(dead_code)]
//! #[derive(Debug)]
//! enum List<T> {
//!     Cons(T, Box<List<T>>),
//!     Nil,
//! }
//!
//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
//! println!("{list:?}");
//! ```
//!
//! This will print `Cons(1, Cons(2, Nil))`.
//!
//! Recursive structures must be boxed, because if the definition of `Cons`
//! looked like this:
//!
//! ```compile_fail,E0072
//! # enum List<T> {
//! Cons(T, List<T>),
//! # }
//! ```
//!
//! It wouldn't work. This is because the size of a `List` depends on how many
//! elements are in the list, and so we don't know how much memory to allocate
//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
//! big `Cons` needs to be.
//!
//! # Memory layout
//!
//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
//! [`Layout::for_value(&*value)`].
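//!
//! For example, here is a sketch of the round trip described above, freeing the memory obtained
//! from [`Box::<T>::into_raw`] by hand with the global allocator:
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//!
//! let b = Box::new(42u32);
//! let ptr: *mut u32 = Box::into_raw(b);
//! unsafe {
//!     let layout = Layout::for_value(&*ptr);
//!     ptr.drop_in_place(); // a no-op for `u32`, but required in general
//!     dealloc(ptr.cast::<u8>(), layout);
//! }
//! ```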
//!
//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
//! recommended way to build a Box to a ZST if `Box::new` cannot be used is to use
//! [`ptr::NonNull::dangling`].
//!
//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
//!
//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
//! as a single pointer and is also ABI-compatible with C pointers
//! (i.e. the C type `T*`). This means that if you have extern "C"
//! Rust functions that will be called from C, you can define those
//! Rust functions using `Box<T>` types, and use `T*` as the corresponding
//! type on the C side. As an example, consider this C header which
//! declares functions that create and destroy some kind of `Foo`
//! value:
//!
//! ```c
//! /* C header */
//!
//! /* Returns ownership to the caller */
//! struct Foo* foo_new(void);
//!
//! /* Takes ownership from the caller; no-op when invoked with null */
//! void foo_delete(struct Foo*);
//! ```
//!
//! These two functions might be implemented in Rust as follows. Here, the
//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
//! the ownership constraints. Note also that the nullable argument to
//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
//! cannot be null.
//!
//! ```
//! #[repr(C)]
//! pub struct Foo;
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_new() -> Box<Foo> {
//!     Box::new(Foo)
//! }
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
//! ```
//!
//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
//! and expect things to work. `Box<T>` values will always be fully aligned,
//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
//! free the value with the global allocator. In general, the best practice
//! is to only use `Box<T>` for pointers that originated from the global
//! allocator.
//!
//! **Important.** At least at present, you should avoid using
//! `Box<T>` types for functions that are defined in C but invoked
//! from Rust. In those cases, you should directly mirror the C types
//! as closely as possible. Using types like `Box<T>` where the C
//! definition is just using `T*` can lead to undefined behavior, as
//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
//!
//! # Considerations for unsafe code
//!
//! **Warning: This section is not normative and is subject to change, possibly
//! being relaxed in the future! It is a simplified summary of the rules
//! currently implemented in the compiler.**
//!
//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
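//!
//! For example, here is a sketch of how these rules play out in practice (again, not normative):
//!
//! ```
//! let mut b = Box::new(5);
//! let p: *mut i32 = &mut *b;
//! // OK: the raw pointer is used before the box is touched again.
//! unsafe { *p = 6 };
//! assert_eq!(*b, 6);
//! // After `b` is mutated through, moved, or borrowed as `&mut i32`,
//! // `p` must not be used again.
//! ```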
//!
//! # Editions
//!
//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
//!
//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
//! 2024:
//!
//! ```rust,edition2021
//! // Rust 2015, 2018, and 2021:
//!
//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
//!
//! // This creates a slice iterator, producing references to each value.
//! for item in boxed_slice.into_iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
//! for item in boxed_slice.iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
//!     let (i, x): (usize, i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//! ```
//!
//! Similar to the array implementation, this may be modified in the future to remove this override,
//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
//! compatibility with future versions of the compiler.
//!
//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
//! [dereferencing]: core::ops::Deref
//! [`Box::<T>::from_raw(value)`]: Box::from_raw
//! [`Global`]: crate::alloc::Global
//! [`Layout`]: crate::alloc::Layout
//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
//! [valid]: ptr#safety

#![stable(feature = "rust1", since = "1.0.0")]

use core::borrow::{Borrow, BorrowMut};
#[cfg(not(no_global_oom_handling))]
use core::clone::CloneToUninit;
use core::cmp::Ordering;
use core::error::{self, Error};
use core::fmt;
use core::future::Future;
use core::hash::{Hash, Hasher};
use core::marker::{Tuple, Unsize};
use core::mem::{self, SizedTypeProperties};
use core::ops::{
    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
    DerefPure, DispatchFromDyn, LegacyReceiver,
};
use core::pin::{Pin, PinCoerceUnsized};
use core::ptr::{self, NonNull, Unique};
use core::task::{Context, Poll};

#[cfg(not(no_global_oom_handling))]
use crate::alloc::handle_alloc_error;
use crate::alloc::{AllocError, Allocator, Global, Layout};
use crate::raw_vec::RawVec;
#[cfg(not(no_global_oom_handling))]
use crate::str::from_boxed_utf8_unchecked;

/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
mod convert;
/// Iterator related impls for `Box<_>`.
mod iter;
/// [`ThinBox`] implementation.
mod thin;

#[unstable(feature = "thin_box", issue = "92791")]
pub use thin::ThinBox;

/// A pointer type that uniquely owns a heap allocation of type `T`.
///
/// See the [module-level documentation](../../std/boxed/index.html) for more.
#[lang = "owned_box"]
#[fundamental]
#[stable(feature = "rust1", since = "1.0.0")]
#[rustc_insignificant_dtor]
#[doc(search_unbox)]
// The declaration of the `Box` struct must be kept in sync with the
// compiler or ICEs will happen.
pub struct Box<
    T: ?Sized,
    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
>(Unique<T>, A);

/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
///
/// This is the surface syntax for `box <expr>` expressions.
#[rustc_intrinsic]
#[unstable(feature = "liballoc_internals", issue = "none")]
pub fn box_new<T>(x: T) -> Box<T>;

impl<T> Box<T> {
    /// Allocates memory on the heap and then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// let five = Box::new(5);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[inline(always)]
    #[stable(feature = "rust1", since = "1.0.0")]
    #[must_use]
    #[rustc_diagnostic_item = "box_new"]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new(x: T) -> Self {
        return box_new(x);
    }

    /// Constructs a new box with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    #[inline]
    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
        Self::new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let zero = Box::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[inline]
    #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
    #[must_use]
    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
        Self::new_zeroed_in(Global)
    }

    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
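    ///
    /// # Examples
    ///
    /// A minimal sketch of pinning a value directly on the heap:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```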
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "pin", since = "1.33.0")]
    #[must_use]
    #[inline(always)]
    pub fn pin(x: T) -> Pin<Box<T>> {
        Box::new(x).into()
    }

    /// Allocates memory on the heap then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let five = Box::try_new(5)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new(x: T) -> Result<Self, AllocError> {
        Self::try_new_in(x, Global)
    }

    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut five = Box::<u32>::try_new_uninit()?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents on the heap, with the memory
    /// being filled with `0` bytes, returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let zero = Box::<u32>::try_new_zeroed()?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_zeroed_in(Global)
    }
}

impl<T, A: Allocator> Box<T, A> {
    /// Allocates memory in the given allocator then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::new_in(5, System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn new_in(x: T, alloc: A) -> Self
    where
        A: Allocator,
    {
        let mut boxed = Self::new_uninit_in(alloc);
        boxed.write(x);
        unsafe { boxed.assume_init() }
    }

    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::try_new_in(5, System)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
    where
        A: Allocator,
    {
        let mut boxed = Self::try_new_uninit_in(alloc)?;
        boxed.write(x);
        unsafe { Ok(boxed.assume_init()) }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::new_uninit_in(System);
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_uninit_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::new_zeroed_in(System);
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_zeroed_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
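    ///
    /// # Examples
    ///
    /// A minimal sketch (requires the unstable `allocator_api` feature):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let pinned = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```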
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline(always)]
    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
    where
        A: 'static + Allocator,
    {
        Self::into_pin(Self::new_in(x, alloc))
    }

    /// Converts a `Box<T>` into a `Box<[T]>`.
    ///
    /// This conversion does not allocate on the heap and happens in place.
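    ///
    /// # Examples
    ///
    /// A minimal sketch (requires the unstable `box_into_boxed_slice` feature):
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let b = Box::new(5);
    /// let slice: Box<[i32]> = Box::into_boxed_slice(b);
    /// assert_eq!(*slice, [5]);
    /// ```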
    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
    }

    /// Consumes the `Box`, returning the wrapped value.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_into_inner)]
    ///
    /// let c = Box::new(5);
    ///
    /// assert_eq!(Box::into_inner(c), 5);
    /// ```
    #[unstable(feature = "box_into_inner", issue = "80437")]
    #[inline]
    pub fn into_inner(boxed: Self) -> T {
        *boxed
    }

    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
    /// to the uninitialized memory where the wrapped value used to live.
    ///
    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
    /// boxed values.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_take)]
    ///
    /// let c = Box::new(5);
    ///
    /// // take the value out of the box
    /// let (value, uninit) = Box::take(c);
    /// assert_eq!(value, 5);
    ///
    /// // reuse the box for a second value
    /// let c = Box::write(uninit, 6);
    /// assert_eq!(*c, 6);
    /// ```
    #[unstable(feature = "box_take", issue = "147212")]
    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
        unsafe {
            let (raw, alloc) = Box::into_raw_with_allocator(boxed);
            let value = raw.read();
            let uninit = Box::from_raw_in(raw.cast::<mem::MaybeUninit<T>>(), alloc);
            (value, uninit)
        }
    }
}

impl<T> Box<[T]> {
    /// Constructs a new boxed slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity(len).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let values = Box::<[u32]>::new_zeroed_slice(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
    #[must_use]
    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
    /// the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            Global.allocate(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
    }

    /// Constructs a new boxed slice with uninitialized contents, with the memory
    /// being filled with `0` bytes. Returns an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            Global.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
    }

    /// Converts the boxed slice into a boxed array.
    ///
    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
    ///
    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
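    ///
    /// # Examples
    ///
    /// A minimal sketch (requires the unstable `slice_as_array` feature):
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let slice: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    /// let array: Box<[i32; 3]> = slice.into_array().expect("length matches");
    /// assert_eq!(*array, [1, 2, 3]);
    /// ```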
    #[unstable(feature = "slice_as_array", issue = "133508")]
    #[inline]
    #[must_use]
    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
        if self.len() == N {
            let ptr = Self::into_raw(self) as *mut [T; N];

            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
            let me = unsafe { Box::from_raw(ptr) };
            Some(me)
        } else {
            None
        }
    }
}

impl<T, A: Allocator> Box<[T], A> {
    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
    /// with the memory being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
    /// the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit_slice_in(
        len: usize,
        alloc: A,
    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
    }

    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
    /// being filled with `0` bytes. Returns an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0]);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed_slice_in(
        len: usize,
        alloc: A,
    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
        let ptr = if T::IS_ZST || len == 0 {
            NonNull::dangling()
        } else {
            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
                Ok(l) => l,
                Err(_) => return Err(AllocError),
            };
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
    }
}

impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
    /// Converts to `Box<T, A>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five: Box<u32> = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Box<T, A> {
        let (raw, alloc) = Box::into_raw_with_allocator(self);
        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
    }

    /// Writes the value and converts to `Box<T, A>`.
    ///
    /// This method converts the box similarly to [`Box::assume_init`] but
    /// writes `value` into it before the conversion, thus guaranteeing safety.
    /// In some scenarios, use of this method may improve performance because
    /// the compiler may be able to optimize away the copy from the stack.
    ///
    /// # Examples
    ///
    /// ```
    /// let big_box = Box::<[usize; 1024]>::new_uninit();
    ///
    /// let mut array = [0; 1024];
    /// for (i, place) in array.iter_mut().enumerate() {
    ///     *place = i;
    /// }
    ///
    /// // The optimizer may be able to elide this copy, so the previous code writes
    /// // to the heap directly.
    /// let big_box = Box::write(big_box, array);
    ///
    /// for (i, x) in big_box.iter().enumerate() {
    ///     assert_eq!(*x, i);
    /// }
    /// ```
    #[stable(feature = "box_uninit_write", since = "1.87.0")]
    #[inline]
    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
        unsafe {
            (*boxed).write(value);
            boxed.assume_init()
        }
    }
}

impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
    /// Converts to `Box<[T], A>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the values
    /// really are in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(self);
        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
    }
}

impl<T: ?Sized> Box<T> {
    /// Constructs a box from a raw pointer.
    ///
    /// After calling this function, the raw pointer is owned by the
    /// resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same raw pointer.
    ///
    /// The raw pointer must point to a block of memory allocated by the global allocator.
    ///
    /// The safety conditions are described in the [memory layout] section.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a raw pointer
    /// using [`Box::into_raw`]:
    /// ```
    /// let x = Box::new(5);
    /// let ptr = Box::into_raw(x);
    /// let x = unsafe { Box::from_raw(ptr) };
    /// ```
    /// Manually create a `Box` from scratch by using the global allocator:
    /// ```
    /// use std::alloc::{alloc, Layout};
    ///
    /// unsafe {
    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `ptr`, though for this
    ///     // simple example `*ptr = 5` would have worked as well.
    ///     ptr.write(5);
    ///     let x = Box::from_raw(ptr);
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[stable(feature = "box_raw", since = "1.4.0")]
    #[inline]
    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
    pub unsafe fn from_raw(raw: *mut T) -> Self {
        unsafe { Self::from_raw_in(raw, Global) }
    }

    /// Constructs a box from a `NonNull` pointer.
    ///
    /// After calling this function, the `NonNull` pointer is owned by
    /// the resulting `Box`. Specifically, the `Box` destructor will call
    /// the destructor of `T` and free the allocated memory. For this
    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
    ///
    /// # Safety
    ///
    /// This function is unsafe because improper use may lead to
    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same `NonNull` pointer.
    ///
    /// The non-null pointer must point to a block of memory allocated by the global allocator.
    ///
    /// The safety conditions are described in the [memory layout] section.
    ///
    /// # Examples
    ///
    /// Recreate a `Box` which was previously converted to a `NonNull`
    /// pointer using [`Box::into_non_null`]:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(5);
    /// let non_null = Box::into_non_null(x);
    /// let x = unsafe { Box::from_non_null(non_null) };
    /// ```
    /// Manually create a `Box` from scratch by using the global allocator:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// use std::alloc::{alloc, Layout};
    /// use std::ptr::NonNull;
    ///
    /// unsafe {
    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
    ///         .expect("allocation failed");
    ///     // In general .write is required to avoid attempting to destruct
    ///     // the (uninitialized) previous contents of `non_null`.
    ///     non_null.write(5);
    ///     let x = Box::from_non_null(non_null);
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
        unsafe { Self::from_raw(ptr.as_ptr()) }
    }

    /// Consumes the `Box`, returning a wrapped raw pointer.
    ///
    /// The pointer will be properly aligned and non-null.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the raw pointer back into a `Box` with the
    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
    /// the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
    /// for automatic cleanup:
    /// ```
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// let x = unsafe { Box::from_raw(ptr) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// use std::alloc::{dealloc, Layout};
    /// use std::ptr;
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// unsafe {
    ///     ptr::drop_in_place(ptr);
    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
    /// }
    /// ```
    /// Note: This is equivalent to the following:
    /// ```
    /// let x = Box::new(String::from("Hello"));
    /// let ptr = Box::into_raw(x);
    /// unsafe {
    ///     drop(Box::from_raw(ptr));
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[stable(feature = "box_raw", since = "1.4.0")]
    #[inline]
    pub fn into_raw(b: Self) -> *mut T {
        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
        let mut b = mem::ManuallyDrop::new(b);
        // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
        // operation for its alias tracking.
        &raw mut **b
    }

    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
    ///
    /// The pointer will be properly aligned.
    ///
    /// After calling this function, the caller is responsible for the
    /// memory previously managed by the `Box`. In particular, the
    /// caller should properly destroy `T` and release the memory, taking
    /// into account the [memory layout] used by `Box`. The easiest way to
    /// do this is to convert the `NonNull` pointer back into a `Box` with the
    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
    /// perform the cleanup.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
    /// This is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
    /// for automatic cleanup:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// let x = unsafe { Box::from_non_null(non_null) };
    /// ```
    /// Manual cleanup by explicitly running the destructor and deallocating
    /// the memory:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// use std::alloc::{dealloc, Layout};
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// unsafe {
    ///     non_null.drop_in_place();
    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
    /// }
    /// ```
    /// Note: This is equivalent to the following:
    /// ```
    /// #![feature(box_vec_non_null)]
    ///
    /// let x = Box::new(String::from("Hello"));
    /// let non_null = Box::into_non_null(x);
    /// unsafe {
    ///     drop(Box::from_non_null(non_null));
    /// }
    /// ```
    ///
    /// [memory layout]: self#memory-layout
    #[must_use = "losing the pointer will leak memory"]
    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
    #[inline]
    pub fn into_non_null(b: Self) -> NonNull<T> {
        // SAFETY: `Box` is guaranteed to be non-null.
        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
    }
}

impl<T: ?Sized, A: Allocator> Box<T, A> {
1241 /// Constructs a box from a raw pointer in the given allocator.
1242 ///
1243 /// After calling this function, the raw pointer is owned by the
1244 /// resulting `Box`. Specifically, the `Box` destructor will call
1245 /// the destructor of `T` and free the allocated memory. For this
1246 /// to be safe, the memory must have been allocated in accordance
1247 /// with the [memory layout] used by `Box` .
1248 ///
1249 /// # Safety
1250 ///
1251 /// This function is unsafe because improper use may lead to
1252 /// memory problems. For example, a double-free may occur if the
1253 /// function is called twice on the same raw pointer.
1254 ///
1255 /// The raw pointer must point to a block of memory allocated by `alloc`.
1256 ///
1257 /// # Examples
1258 ///
1259 /// Recreate a `Box` which was previously converted to a raw pointer
1260 /// using [`Box::into_raw_with_allocator`]:
1261 /// ```
1262 /// #![feature(allocator_api)]
1263 ///
1264 /// use std::alloc::System;
1265 ///
1266 /// let x = Box::new_in(5, System);
1267 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1268 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1269 /// ```
1270 /// Manually create a `Box` from scratch by using the system allocator:
1271 /// ```
1272 /// #![feature(allocator_api, slice_ptr_get)]
1273 ///
1274 /// use std::alloc::{Allocator, Layout, System};
1275 ///
1276 /// unsafe {
1277 /// let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1278 /// // In general .write is required to avoid attempting to destruct
1279 /// // the (uninitialized) previous contents of `ptr`, though for this
1280 /// // simple example `*ptr = 5` would have worked as well.
1281 /// ptr.write(5);
1282 /// let x = Box::from_raw_in(ptr, System);
1283 /// }
1284 /// # Ok::<(), std::alloc::AllocError>(())
1285 /// ```
1286 ///
1287 /// [memory layout]: self#memory-layout
1288 #[unstable(feature = "allocator_api", issue = "32838")]
1289 #[inline]
1290 pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1291 Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1292 }
1293
1294 /// Constructs a box from a `NonNull` pointer in the given allocator.
1295 ///
1296 /// After calling this function, the `NonNull` pointer is owned by
1297 /// the resulting `Box`. Specifically, the `Box` destructor will call
1298 /// the destructor of `T` and free the allocated memory. For this
1299 /// to be safe, the memory must have been allocated in accordance
1300 /// with the [memory layout] used by `Box` .
1301 ///
1302 /// # Safety
1303 ///
1304 /// This function is unsafe because improper use may lead to
1305 /// memory problems. For example, a double-free may occur if the
1306 /// function is called twice on the same raw pointer.
1307 ///
1308 /// The non-null pointer must point to a block of memory allocated by `alloc`.
1309 ///
1310 /// # Examples
1311 ///
1312 /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1313 /// using [`Box::into_non_null_with_allocator`]:
1314 /// ```
1315 /// #![feature(allocator_api, box_vec_non_null)]
1316 ///
1317 /// use std::alloc::System;
1318 ///
1319 /// let x = Box::new_in(5, System);
1320 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1321 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1322 /// ```
1323 /// Manually create a `Box` from scratch by using the system allocator:
1324 /// ```
1325 /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1326 ///
1327 /// use std::alloc::{Allocator, Layout, System};
1328 ///
1329 /// unsafe {
1330 /// let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1331 /// // In general .write is required to avoid attempting to destruct
1332 /// // the (uninitialized) previous contents of `non_null`.
1333 /// non_null.write(5);
1334 /// let x = Box::from_non_null_in(non_null, System);
1335 /// }
1336 /// # Ok::<(), std::alloc::AllocError>(())
1337 /// ```
1338 ///
1339 /// [memory layout]: self#memory-layout
1340 #[unstable(feature = "allocator_api", issue = "32838")]
1341 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1342 #[inline]
1343 pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1344 // SAFETY: guaranteed by the caller.
1345 unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1346 }
1347
1348 /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1349 ///
1350 /// The pointer will be properly aligned and non-null.
1351 ///
1352 /// After calling this function, the caller is responsible for the
1353 /// memory previously managed by the `Box`. In particular, the
1354 /// caller should properly destroy `T` and release the memory, taking
1355 /// into account the [memory layout] used by `Box`. The easiest way to
1356 /// do this is to convert the raw pointer back into a `Box` with the
1357 /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1358 /// the cleanup.
1359 ///
1360 /// Note: this is an associated function, which means that you have
1361 /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1362 /// is so that there is no conflict with a method on the inner type.
1363 ///
1364 /// # Examples
1365 /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1366 /// for automatic cleanup:
1367 /// ```
1368 /// #![feature(allocator_api)]
1369 ///
1370 /// use std::alloc::System;
1371 ///
1372 /// let x = Box::new_in(String::from("Hello"), System);
1373 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1374 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1375 /// ```
1376 /// Manual cleanup by explicitly running the destructor and deallocating
1377 /// the memory:
1378 /// ```
1379 /// #![feature(allocator_api)]
1380 ///
1381 /// use std::alloc::{Allocator, Layout, System};
1382 /// use std::ptr::{self, NonNull};
1383 ///
1384 /// let x = Box::new_in(String::from("Hello"), System);
1385 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1386 /// unsafe {
1387 /// ptr::drop_in_place(ptr);
1388 /// let non_null = NonNull::new_unchecked(ptr);
1389 /// alloc.deallocate(non_null.cast(), Layout::new::<String>());
1390 /// }
1391 /// ```
1392 ///
1393 /// [memory layout]: self#memory-layout
1394 #[must_use = "losing the pointer will leak memory"]
1395 #[unstable(feature = "allocator_api", issue = "32838")]
1396 #[inline]
1397 pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1398 let mut b = mem::ManuallyDrop::new(b);
1399 // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1400 // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1401 // want *no* aliasing requirements here!
1402 // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1403 // works around that.
1404 let ptr = &raw mut **b;
1405 let alloc = unsafe { ptr::read(&b.1) };
1406 (ptr, alloc)
1407 }
1408
1409 /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1410 ///
1411 /// The pointer will be properly aligned.
1412 ///
1413 /// After calling this function, the caller is responsible for the
1414 /// memory previously managed by the `Box`. In particular, the
1415 /// caller should properly destroy `T` and release the memory, taking
1416 /// into account the [memory layout] used by `Box`. The easiest way to
1417 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1418 /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1419 /// perform the cleanup.
1420 ///
1421 /// Note: this is an associated function, which means that you have
1422 /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1423 /// `b.into_non_null_with_allocator()`. This is so that there is no
1424 /// conflict with a method on the inner type.
1425 ///
1426 /// # Examples
1427 /// Converting the `NonNull` pointer back into a `Box` with
1428 /// [`Box::from_non_null_in`] for automatic cleanup:
1429 /// ```
1430 /// #![feature(allocator_api, box_vec_non_null)]
1431 ///
1432 /// use std::alloc::System;
1433 ///
1434 /// let x = Box::new_in(String::from("Hello"), System);
1435 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1436 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1437 /// ```
1438 /// Manual cleanup by explicitly running the destructor and deallocating
1439 /// the memory:
1440 /// ```
1441 /// #![feature(allocator_api, box_vec_non_null)]
1442 ///
1443 /// use std::alloc::{Allocator, Layout, System};
1444 ///
1445 /// let x = Box::new_in(String::from("Hello"), System);
1446 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1447 /// unsafe {
1448 /// non_null.drop_in_place();
1449 /// alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1450 /// }
1451 /// ```
1452 ///
1453 /// [memory layout]: self#memory-layout
1454 #[must_use = "losing the pointer will leak memory"]
1455 #[unstable(feature = "allocator_api", issue = "32838")]
1456 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1457 #[inline]
1458 pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1459 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1460 // SAFETY: `Box` is guaranteed to be non-null.
1461 unsafe { (NonNull::new_unchecked(ptr), alloc) }
1462 }
1463
1464 #[unstable(
1465 feature = "ptr_internals",
1466 issue = "none",
1467 reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1468 )]
1469 #[inline]
1470 #[doc(hidden)]
1471 pub fn into_unique(b: Self) -> (Unique<T>, A) {
1472 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1473 unsafe { (Unique::from(&mut *ptr), alloc) }
1474 }
1475
1476 /// Returns a raw mutable pointer to the `Box`'s contents.
1477 ///
1478 /// The caller must ensure that the `Box` outlives the pointer this
1479 /// function returns, or else it will end up dangling.
1480 ///
1481 /// This method guarantees that for the purpose of the aliasing model, this method
1482 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1483 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1484 /// Note that calling other methods that materialize references to the memory
1485 /// may still invalidate this pointer.
1486 /// See the example below for how this guarantee can be used.
1487 ///
1488 /// # Examples
1489 ///
1490 /// Due to the aliasing guarantee, the following code is legal:
1491 ///
1492 /// ```rust
1493 /// #![feature(box_as_ptr)]
1494 ///
1495 /// unsafe {
1496 /// let mut b = Box::new(0);
1497 /// let ptr1 = Box::as_mut_ptr(&mut b);
1498 /// ptr1.write(1);
1499 /// let ptr2 = Box::as_mut_ptr(&mut b);
1500 /// ptr2.write(2);
1501 /// // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1502 /// ptr1.write(3);
1503 /// }
1504 /// ```
1505 ///
1506 /// [`as_mut_ptr`]: Self::as_mut_ptr
1507 /// [`as_ptr`]: Self::as_ptr
1508 #[unstable(feature = "box_as_ptr", issue = "129090")]
1509 #[rustc_never_returns_null_ptr]
1510 #[rustc_as_ptr]
1511 #[inline]
1512 pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1513 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1514 // any references.
1515 &raw mut **b
1516 }
1517
1518 /// Returns a raw pointer to the `Box`'s contents.
1519 ///
1520 /// The caller must ensure that the `Box` outlives the pointer this
1521 /// function returns, or else it will end up dangling.
1522 ///
1523 /// The caller must also ensure that the memory the pointer (non-transitively) points to
1524 /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1525 /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1526 ///
1527 /// This method guarantees that for the purpose of the aliasing model, this method
1528 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1529 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1530 /// Note that calling other methods that materialize mutable references to the memory,
1531 /// as well as writing to this memory, may still invalidate this pointer.
1532 /// See the example below for how this guarantee can be used.
1533 ///
1534 /// # Examples
1535 ///
1536 /// Due to the aliasing guarantee, the following code is legal:
1537 ///
1538 /// ```rust
1539 /// #![feature(box_as_ptr)]
1540 ///
1541 /// unsafe {
1542 /// let mut v = Box::new(0);
1543 /// let ptr1 = Box::as_ptr(&v);
1544 /// let ptr2 = Box::as_mut_ptr(&mut v);
1545 /// let _val = ptr2.read();
1546 /// // No write to this memory has happened yet, so `ptr1` is still valid.
1547 /// let _val = ptr1.read();
1548 /// // However, once we do a write...
1549 /// ptr2.write(1);
1550 /// // ... `ptr1` is no longer valid.
1551 /// // This would be UB: let _val = ptr1.read();
1552 /// }
1553 /// ```
1554 ///
1555 /// [`as_mut_ptr`]: Self::as_mut_ptr
1556 /// [`as_ptr`]: Self::as_ptr
1557 #[unstable(feature = "box_as_ptr", issue = "129090")]
1558 #[rustc_never_returns_null_ptr]
1559 #[rustc_as_ptr]
1560 #[inline]
1561 pub fn as_ptr(b: &Self) -> *const T {
1562 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1563 // any references.
1564 &raw const **b
1565 }
1566
1567 /// Returns a reference to the underlying allocator.
1568 ///
1569 /// Note: this is an associated function, which means that you have
1570 /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1571 /// is so that there is no conflict with a method on the inner type.
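///
/// # Examples
///
/// A small illustration, relying on the unstable `allocator_api` feature and
/// using the default `Global` allocator in place of a custom one:
///
/// ```
/// #![feature(allocator_api)]
///
/// use std::alloc::Global;
///
/// let b = Box::new_in(5, Global);
/// let _alloc: &Global = Box::allocator(&b);
/// ```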
1572 #[unstable(feature = "allocator_api", issue = "32838")]
1573 #[inline]
1574 pub fn allocator(b: &Self) -> &A {
1575 &b.1
1576 }
1577
1578 /// Consumes and leaks the `Box`, returning a mutable reference,
1579 /// `&'a mut T`.
1580 ///
1581 /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1582 /// has only static references, or none at all, then this may be chosen to be
1583 /// `'static`.
1584 ///
1585 /// This function is mainly useful for data that lives for the remainder of
1586 /// the program's life. Dropping the returned reference will cause a memory
1587 /// leak. If this is not acceptable, the reference should first be wrapped
1588 /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1589 /// then be dropped which will properly destroy `T` and release the
1590 /// allocated memory.
1591 ///
1592 /// Note: this is an associated function, which means that you have
1593 /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1594 /// is so that there is no conflict with a method on the inner type.
1595 ///
1596 /// # Examples
1597 ///
1598 /// Simple usage:
1599 ///
1600 /// ```
1601 /// let x = Box::new(41);
1602 /// let static_ref: &'static mut usize = Box::leak(x);
1603 /// *static_ref += 1;
1604 /// assert_eq!(*static_ref, 42);
1605 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1606 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1607 /// # drop(unsafe { Box::from_raw(static_ref) });
1608 /// ```
1609 ///
1610 /// Unsized data:
1611 ///
1612 /// ```
1613 /// let x = vec![1, 2, 3].into_boxed_slice();
1614 /// let static_ref = Box::leak(x);
1615 /// static_ref[0] = 4;
1616 /// assert_eq!(*static_ref, [4, 2, 3]);
1617 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1618 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1619 /// # drop(unsafe { Box::from_raw(static_ref) });
1620 /// ```
1621 #[stable(feature = "box_leak", since = "1.26.0")]
1622 #[inline]
1623 pub fn leak<'a>(b: Self) -> &'a mut T
1624 where
1625 A: 'a,
1626 {
1627 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1628 mem::forget(alloc);
1629 unsafe { &mut *ptr }
1630 }
1631
1632 /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1633 /// `*boxed` will be pinned in memory and unable to be moved.
1634 ///
1635 /// This conversion does not allocate on the heap and happens in place.
1636 ///
1637 /// This is also available via [`From`].
1638 ///
1639 /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1640 /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1641 /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1642 /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
1643 ///
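/// # Examples
///
/// A short illustration of pinning a box that already exists:
///
/// ```
/// use std::pin::Pin;
///
/// let boxed: Box<u8> = Box::new(5);
/// let pinned: Pin<Box<u8>> = Box::into_pin(boxed);
/// assert_eq!(*pinned, 5);
/// ```
///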
1644 /// # Notes
1645 ///
1646 /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1647 /// as it'll introduce an ambiguity when calling `Pin::from`.
1648 /// A demonstration of such a poor impl is shown below.
1649 ///
1650 /// ```compile_fail
1651 /// # use std::pin::Pin;
1652 /// struct Foo; // A type defined in this crate.
1653 /// impl From<Box<()>> for Pin<Foo> {
1654 /// fn from(_: Box<()>) -> Pin<Foo> {
1655 /// Pin::new(Foo)
1656 /// }
1657 /// }
1658 ///
1659 /// let foo = Box::new(());
1660 /// let bar = Pin::from(foo);
1661 /// ```
1662 #[stable(feature = "box_into_pin", since = "1.63.0")]
1663 pub fn into_pin(boxed: Self) -> Pin<Self>
1664 where
1665 A: 'static,
1666 {
1667 // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1668 // when `T: !Unpin`, so it's safe to pin it directly without any
1669 // additional requirements.
1670 unsafe { Pin::new_unchecked(boxed) }
1671 }
1672}
1673
1674#[stable(feature = "rust1", since = "1.0.0")]
1675unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1676 #[inline]
1677 fn drop(&mut self) {
1678 // the T in the Box is dropped by the compiler before the destructor is run
1679
1680 let ptr = self.0;
1681
1682 unsafe {
1683 let layout = Layout::for_value_raw(ptr.as_ptr());
1684 if layout.size() != 0 {
1685 self.1.deallocate(From::from(ptr.cast()), layout);
1686 }
1687 }
1688 }
1689}
1690
1691#[cfg(not(no_global_oom_handling))]
1692#[stable(feature = "rust1", since = "1.0.0")]
1693impl<T: Default> Default for Box<T> {
1694 /// Creates a `Box<T>` with the `Default` value for `T`.
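///
/// # Examples
///
/// For illustration, with an integer type whose default value is zero:
///
/// ```
/// let b: Box<i32> = Box::default();
/// assert_eq!(*b, 0);
/// ```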
1695 #[inline]
1696 fn default() -> Self {
1697 let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1698 unsafe {
1699 // SAFETY: `x` is valid for writing and has the same layout as `T`.
1700 // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1701 // does not have a destructor.
1702 //
1703 // We use `ptr::write` as `MaybeUninit::write` creates
1704 // extra stack copies of `T` in debug mode.
1705 //
1706 // See https://github.com/rust-lang/rust/issues/136043 for more context.
1707 ptr::write(&raw mut *x as *mut T, T::default());
1708 // SAFETY: `x` was just initialized above.
1709 x.assume_init()
1710 }
1711 }
1712}
1713
1714#[cfg(not(no_global_oom_handling))]
1715#[stable(feature = "rust1", since = "1.0.0")]
1716impl<T> Default for Box<[T]> {
1717 /// Creates an empty `[T]` inside a `Box`.
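///
/// # Examples
///
/// For illustration, the resulting slice is empty and no allocation is performed:
///
/// ```
/// let slice: Box<[i32]> = Box::default();
/// assert!(slice.is_empty());
/// ```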
1718 #[inline]
1719 fn default() -> Self {
1720 let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1721 Box(ptr, Global)
1722 }
1723}
1724
1725#[cfg(not(no_global_oom_handling))]
1726#[stable(feature = "default_box_extra", since = "1.17.0")]
1727impl Default for Box<str> {
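/// Creates an empty `str` inside a `Box`.
///
/// # Examples
///
/// For illustration, the resulting string slice is empty:
///
/// ```
/// let s: Box<str> = Box::default();
/// assert!(s.is_empty());
/// ```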
1728 #[inline]
1729 fn default() -> Self {
1730 // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1731 let ptr: Unique<str> = unsafe {
1732 let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1733 Unique::new_unchecked(bytes.as_ptr() as *mut str)
1734 };
1735 Box(ptr, Global)
1736 }
1737}
1738
1739#[cfg(not(no_global_oom_handling))]
1740#[stable(feature = "pin_default_impls", since = "1.91.0")]
1741impl<T> Default for Pin<Box<T>>
1742where
1743 T: ?Sized,
1744 Box<T>: Default,
1745{
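/// Creates a pinned `Box` holding the `Default` value for `T`.
///
/// # Examples
///
/// For illustration, with `u32`, whose default value is zero:
///
/// ```
/// use std::pin::Pin;
///
/// let pinned: Pin<Box<u32>> = Default::default();
/// assert_eq!(*pinned, 0);
/// ```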
1746 #[inline]
1747 fn default() -> Self {
1748 Box::into_pin(Box::<T>::default())
1749 }
1750}
1751
1752#[cfg(not(no_global_oom_handling))]
1753#[stable(feature = "rust1", since = "1.0.0")]
1754impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1755 /// Returns a new box with a `clone()` of this box's contents.
1756 ///
1757 /// # Examples
1758 ///
1759 /// ```
1760 /// let x = Box::new(5);
1761 /// let y = x.clone();
1762 ///
1763 /// // The value is the same
1764 /// assert_eq!(x, y);
1765 ///
1766 /// // But they are unique objects
1767 /// assert_ne!(&*x as *const i32, &*y as *const i32);
1768 /// ```
1769 #[inline]
1770 fn clone(&self) -> Self {
1771 // Pre-allocate memory to allow writing the cloned value directly.
1772 let mut boxed = Self::new_uninit_in(self.1.clone());
1773 unsafe {
1774 (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1775 boxed.assume_init()
1776 }
1777 }
1778
1779 /// Copies `source`'s contents into `self` without creating a new allocation.
1780 ///
1781 /// # Examples
1782 ///
1783 /// ```
1784 /// let x = Box::new(5);
1785 /// let mut y = Box::new(10);
1786 /// let yp: *const i32 = &*y;
1787 ///
1788 /// y.clone_from(&x);
1789 ///
1790 /// // The value is the same
1791 /// assert_eq!(x, y);
1792 ///
1793 /// // And no allocation occurred
1794 /// assert_eq!(yp, &*y);
1795 /// ```
1796 #[inline]
1797 fn clone_from(&mut self, source: &Self) {
1798 (**self).clone_from(&(**source));
1799 }
1800}
1801
1802#[cfg(not(no_global_oom_handling))]
1803#[stable(feature = "box_slice_clone", since = "1.3.0")]
1804impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
1805 fn clone(&self) -> Self {
1806 let alloc = Box::allocator(self).clone();
1807 self.to_vec_in(alloc).into_boxed_slice()
1808 }
1809
1810 /// Copies `source`'s contents into `self` without creating a new allocation,
1811 /// so long as the two are of the same length.
1812 ///
1813 /// # Examples
1814 ///
1815 /// ```
1816 /// let x = Box::new([5, 6, 7]);
1817 /// let mut y = Box::new([8, 9, 10]);
1818 /// let yp: *const [i32] = &*y;
1819 ///
1820 /// y.clone_from(&x);
1821 ///
1822 /// // The value is the same
1823 /// assert_eq!(x, y);
1824 ///
1825 /// // And no allocation occurred
1826 /// assert_eq!(yp, &*y);
1827 /// ```
1828 fn clone_from(&mut self, source: &Self) {
1829 if self.len() == source.len() {
1830 self.clone_from_slice(&source);
1831 } else {
1832 *self = source.clone();
1833 }
1834 }
1835}
1836
1837#[cfg(not(no_global_oom_handling))]
1838#[stable(feature = "box_slice_clone", since = "1.3.0")]
1839impl Clone for Box<str> {
1840 fn clone(&self) -> Self {
1841 // this makes a copy of the data
1842 let buf: Box<[u8]> = self.as_bytes().into();
1843 unsafe { from_boxed_utf8_unchecked(buf) }
1844 }
1845}
1846
1847#[stable(feature = "rust1", since = "1.0.0")]
1848impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1849 #[inline]
1850 fn eq(&self, other: &Self) -> bool {
1851 PartialEq::eq(&**self, &**other)
1852 }
1853 #[inline]
1854 fn ne(&self, other: &Self) -> bool {
1855 PartialEq::ne(&**self, &**other)
1856 }
1857}
1858
1859#[stable(feature = "rust1", since = "1.0.0")]
1860impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1861 #[inline]
1862 fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1863 PartialOrd::partial_cmp(&**self, &**other)
1864 }
1865 #[inline]
1866 fn lt(&self, other: &Self) -> bool {
1867 PartialOrd::lt(&**self, &**other)
1868 }
1869 #[inline]
1870 fn le(&self, other: &Self) -> bool {
1871 PartialOrd::le(&**self, &**other)
1872 }
1873 #[inline]
1874 fn ge(&self, other: &Self) -> bool {
1875 PartialOrd::ge(&**self, &**other)
1876 }
1877 #[inline]
1878 fn gt(&self, other: &Self) -> bool {
1879 PartialOrd::gt(&**self, &**other)
1880 }
1881}
1882
1883#[stable(feature = "rust1", since = "1.0.0")]
1884impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1885 #[inline]
1886 fn cmp(&self, other: &Self) -> Ordering {
1887 Ord::cmp(&**self, &**other)
1888 }
1889}
1890
1891#[stable(feature = "rust1", since = "1.0.0")]
1892impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1893
1894#[stable(feature = "rust1", since = "1.0.0")]
1895impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1896 fn hash<H: Hasher>(&self, state: &mut H) {
1897 (**self).hash(state);
1898 }
1899}
1900
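/// A boxed [`Hasher`] forwards every hashing method to the hasher it contains.
///
/// For illustration, `DefaultHasher` stands in here for any concrete hasher:
///
/// ```
/// use std::hash::{DefaultHasher, Hasher};
///
/// let mut hasher: Box<dyn Hasher> = Box::new(DefaultHasher::new());
/// hasher.write_u32(1979);
/// let _hash = hasher.finish();
/// ```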
1901#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1902impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1903 fn finish(&self) -> u64 {
1904 (**self).finish()
1905 }
1906 fn write(&mut self, bytes: &[u8]) {
1907 (**self).write(bytes)
1908 }
1909 fn write_u8(&mut self, i: u8) {
1910 (**self).write_u8(i)
1911 }
1912 fn write_u16(&mut self, i: u16) {
1913 (**self).write_u16(i)
1914 }
1915 fn write_u32(&mut self, i: u32) {
1916 (**self).write_u32(i)
1917 }
1918 fn write_u64(&mut self, i: u64) {
1919 (**self).write_u64(i)
1920 }
1921 fn write_u128(&mut self, i: u128) {
1922 (**self).write_u128(i)
1923 }
1924 fn write_usize(&mut self, i: usize) {
1925 (**self).write_usize(i)
1926 }
1927 fn write_i8(&mut self, i: i8) {
1928 (**self).write_i8(i)
1929 }
1930 fn write_i16(&mut self, i: i16) {
1931 (**self).write_i16(i)
1932 }
1933 fn write_i32(&mut self, i: i32) {
1934 (**self).write_i32(i)
1935 }
1936 fn write_i64(&mut self, i: i64) {
1937 (**self).write_i64(i)
1938 }
1939 fn write_i128(&mut self, i: i128) {
1940 (**self).write_i128(i)
1941 }
1942 fn write_isize(&mut self, i: isize) {
1943 (**self).write_isize(i)
1944 }
1945 fn write_length_prefix(&mut self, len: usize) {
1946 (**self).write_length_prefix(len)
1947 }
1948 fn write_str(&mut self, s: &str) {
1949 (**self).write_str(s)
1950 }
1951}
1952
1953#[stable(feature = "rust1", since = "1.0.0")]
1954impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1955 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1956 fmt::Display::fmt(&**self, f)
1957 }
1958}
1959
1960#[stable(feature = "rust1", since = "1.0.0")]
1961impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1962 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1963 fmt::Debug::fmt(&**self, f)
1964 }
1965}
1966
1967#[stable(feature = "rust1", since = "1.0.0")]
1968impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
1969 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1970 // It's not possible to extract the inner `Unique` directly from the `Box`;
1971 // instead we cast it to a `*const` which aliases the `Unique`.
1972 let ptr: *const T = &**self;
1973 fmt::Pointer::fmt(&ptr, f)
1974 }
1975}
1976
1977#[stable(feature = "rust1", since = "1.0.0")]
1978impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1979 type Target = T;
1980
1981 fn deref(&self) -> &T {
1982 &**self
1983 }
1984}
1985
1986#[stable(feature = "rust1", since = "1.0.0")]
1987impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1988 fn deref_mut(&mut self) -> &mut T {
1989 &mut **self
1990 }
1991}
1992
1993#[unstable(feature = "deref_pure_trait", issue = "87121")]
1994unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1995
1996#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1997impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1998
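/// A boxed callable forwards calls to the closure or function it contains, so
/// boxed closures such as `Box<dyn Fn(i32) -> i32>` can be invoked directly.
///
/// For illustration:
///
/// ```
/// let add_one: Box<dyn Fn(i32) -> i32> = Box::new(|x| x + 1);
/// assert_eq!(add_one(1), 2);
///
/// let consume: Box<dyn FnOnce() -> String> = Box::new(|| String::from("done"));
/// assert_eq!(consume(), "done");
/// ```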
1999#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2000impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
2001 type Output = <F as FnOnce<Args>>::Output;
2002
2003 extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
2004 <F as FnOnce<Args>>::call_once(*self, args)
2005 }
2006}
2007
2008#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2009impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
2010 extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
2011 <F as FnMut<Args>>::call_mut(self, args)
2012 }
2013}
2014
2015#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2016impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
2017 extern "rust-call" fn call(&self, args: Args) -> Self::Output {
2018 <F as Fn<Args>>::call(self, args)
2019 }
2020}
2021
2022#[stable(feature = "async_closure", since = "1.85.0")]
2023impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
2024 type Output = F::Output;
2025 type CallOnceFuture = F::CallOnceFuture;
2026
2027 extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
2028 F::async_call_once(*self, args)
2029 }
2030}
2031
2032#[stable(feature = "async_closure", since = "1.85.0")]
2033impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2034 type CallRefFuture<'a>
2035 = F::CallRefFuture<'a>
2036 where
2037 Self: 'a;
2038
2039 extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2040 F::async_call_mut(self, args)
2041 }
2042}
2043
2044#[stable(feature = "async_closure", since = "1.85.0")]
2045impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2046 extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2047 F::async_call(self, args)
2048 }
2049}
2050
2051#[unstable(feature = "coerce_unsized", issue = "18598")]
2052impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2053
2054#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2055unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2056
2057// It is quite crucial that we only allow the `Global` allocator here.
2058// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2059// would need a lot of codegen and interpreter adjustments.
2060#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2061impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2062
2063#[stable(feature = "box_borrow", since = "1.1.0")]
2064impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2065 fn borrow(&self) -> &T {
2066 &**self
2067 }
2068}
2069
2070#[stable(feature = "box_borrow", since = "1.1.0")]
2071impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2072 fn borrow_mut(&mut self) -> &mut T {
2073 &mut **self
2074 }
2075}
2076
2077#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2078impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2079 fn as_ref(&self) -> &T {
2080 &**self
2081 }
2082}
2083
2084#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2085impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2086 fn as_mut(&mut self) -> &mut T {
2087 &mut **self
2088 }
2089}
2090
2091/* Nota bene
2092 *
2093 * We could have chosen not to add this impl, and instead have written a
2094 * function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2095 * because Box<T> implements Unpin even when T does not, as a result of
2096 * this impl.
2097 *
2098 * We chose this API instead of the alternative for a few reasons:
2099 * - Logically, it is helpful to understand pinning in regard to the
2100 * memory region being pointed to. For this reason none of the
2101 * standard library pointer types support projecting through a pin
2102 * (Box<T> is the only pointer type in std for which this would be
2103 * safe).
2104 * - It is in practice very useful to have Box<T> be unconditionally
2105 * Unpin because of trait objects, for which the structural auto
2106 * trait functionality does not apply (e.g., Box<dyn Foo> would
2107 * otherwise not be Unpin).
2108 *
2109 * Another type with the same semantics as Box but only a conditional
2110 * implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2111 * could have a method to project a Pin<T> from it.
2112 */
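/// `Box<T, A>` is `Unpin` even when `T` is not, because moving the box only
/// moves the pointer; the value it points to stays at the same address.
///
/// For illustration:
///
/// ```
/// use std::marker::PhantomPinned;
///
/// fn assert_unpin<T: Unpin>() {}
///
/// // `PhantomPinned` is `!Unpin`, yet boxing it produces an `Unpin` type.
/// assert_unpin::<Box<PhantomPinned>>();
/// ```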
2113#[stable(feature = "pin", since = "1.33.0")]
2114impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2115
2116#[unstable(feature = "coroutine_trait", issue = "43122")]
2117impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2118 type Yield = G::Yield;
2119 type Return = G::Return;
2120
2121 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2122 G::resume(Pin::new(&mut *self), arg)
2123 }
2124}
2125
2126#[unstable(feature = "coroutine_trait", issue = "43122")]
2127impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2128where
2129 A: 'static,
2130{
2131 type Yield = G::Yield;
2132 type Return = G::Return;
2133
2134 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2135 G::resume((*self).as_mut(), arg)
2136 }
2137}
2138
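/// A boxed future that is [`Unpin`] is itself a future; polling it polls the
/// future it contains.
///
/// For illustration, `std::future::ready` provides a simple `Unpin` future:
///
/// ```
/// use std::future::{ready, Future};
///
/// fn assert_future<F: Future<Output = u8>>(f: F) -> F { f }
///
/// // `Ready<u8>` is `Future + Unpin`, so the boxed version is a future as well.
/// let _fut = assert_future(Box::new(ready(3u8)));
/// ```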
2139#[stable(feature = "futures_api", since = "1.36.0")]
2140impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2141 type Output = F::Output;
2142
2143 fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2144 F::poll(Pin::new(&mut *self), cx)
2145 }
2146}
2147
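/// A boxed error forwards [`Error`]'s methods to the error it contains.
///
/// For illustration, `MyError` below is a made-up error type that keeps the
/// default `source` behavior:
///
/// ```
/// use std::error::Error;
/// use std::fmt;
///
/// #[derive(Debug)]
/// struct MyError;
///
/// impl fmt::Display for MyError {
///     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
///         write!(f, "my error")
///     }
/// }
///
/// impl Error for MyError {}
///
/// let boxed: Box<MyError> = Box::new(MyError);
/// // `source` is forwarded to `MyError`, whose default implementation returns `None`.
/// assert!(boxed.source().is_none());
/// ```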
2148#[stable(feature = "box_error", since = "1.8.0")]
2149impl<E: Error> Error for Box<E> {
2150 #[allow(deprecated)]
2151 fn cause(&self) -> Option<&dyn Error> {
2152 Error::cause(&**self)
2153 }
2154
2155 fn source(&self) -> Option<&(dyn Error + 'static)> {
2156 Error::source(&**self)
2157 }
2158
2159 fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2160 Error::provide(&**self, request);
2161 }
2162}