Destructor Tombstones #5318
Conversation
I've pushed an additional commit to skip the test under …
☠️ 🦇 CasSTLvania, symphony of the tombstone ☠️ 🦇
Thanks for this PR! Left some comments, mostly nits and some questions. Very close to approval.
Appreciate the follow ups and incorporating the detailed comments, it helps me a lot and I think it makes the code more accessible. LGTM
🪦 Overview
Resolves #5315. See that issue for the design and rationale.
This feature is currently disabled by default. When disabled, it is completely preprocessed away and has no effect whatsoever.
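Assuming the control macro behaves like the STL's other `_MSVC_STL_` configuration switches, opting in would be a project-wide decision made before any standard header is included; a hedged sketch of that assumption:

```cpp
// Assumption: defining _MSVC_STL_DESTRUCTOR_TOMBSTONES as 1 before including any
// standard header (or passing /D_MSVC_STL_DESTRUCTOR_TOMBSTONES=1) opts in.
#define _MSVC_STL_DESTRUCTOR_TOMBSTONES 1

#include <vector>

int main() {
    std::vector<int> v{1, 2, 3};
} // ~vector runs here; with tombstones enabled, it overwrites its internal
  // pointers with the bogus tombstone value as it dies.
```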
💀 Commits
* `~_Vector_val()` for `vector<T>` demonstrates the most complete pattern (see the sketch after this list):
  + We guard everything with `#if _MSVC_STL_DESTRUCTOR_TOMBSTONES`, so it completely vanishes when disabled.
  + We guard against fancy pointers.
  + We guard against constant evaluation.
  + We use a named `_Tombstone` because `_MSVC_STL_UINTPTR_TOMBSTONE_VALUE` might expand to a function call.
* `~_Vb_val()` for `vector<bool>` already exists.
  + `_Vectype _Myvec` provides the storage, which we just tombstoned.
  + Zeroing out `_Mysize` ensures that `vector<bool>::operator[]` hardening will detect that `_Off < this->_Mysize` is false.
  + We don't need to `reinterpret_cast` here, so we don't need to guard against constant evaluation.
  + The general pattern is that we use the tombstone value for pointers, and default construction values (typically zero) for other members. This results in the safest state of "I look like I contain nothing after default construction, except that my pointers are radioactive".
* Note that different containers use different pointer types with `is_pointer_v` and `reinterpret_cast`. For example, `_Deque_val` uses `_Mapptr` while `_List_val` uses `_Nodeptr`.
* `~_Hash()` for the unordered associative containers:
  + Doesn't need to deal with `_Mylist _List`, which is a normal `list`.
  + Doesn't need to deal with `_Hash_vec<_Aliter> _Vec`. It's not a normal `vector`, but it stores a `_Vector_val` which we've tombstoned.
  + Setting `_Mask` and `_Maxidx` back to their default construction values doesn't provide the same benefit as zeroing out size fields, but it's consistent with the pattern and seems worth doing.
* `basic_string`. This is special because of the Small String Optimization. In small mode, the string would be usable, so we need large mode for the tombstone to have any effect. I'm setting `_Mysize` to zero, to work with hardened preconditions as usual. For the capacity `_Myres`, I'm using the first value that `basic_string` would use when entering large mode, as it has a "roundup mask". `_Large_mode_engaged()` returns `_Myres > _Small_string_capacity`, so `_Small_string_capacity + 1` would be sufficient, but it seems safer to use the value that `basic_string` would ordinarily use, as this is less likely to confuse `basic_string` logic now and in the future. Only the tombstone pointer needs to be impossible.
* `unique_ptr` (scalar and array).
* `shared_ptr` and `weak_ptr`.
* `function`. The Small Functor Optimization is much easier to deal with than `basic_string`. `_Local()` returns `_Getimpl() == static_cast<const void*>(&_Mystorage)`, so the tombstone pointer will be detected as large.
* `basic_regex`.
* `valarray`. Comment that `_Tidy_deallocate()` already sets `_Mysize = 0;`.
* `any`. With a comment explaining why we're using `_Small` mode.
* `optional`. This makes the destructor behave like `reset()`. Setting it to empty will work with precondition hardening to prevent access to the object. We should not attempt to scribble over `T`'s bytes. This doesn't apply to trivially destructible `optional`s (which are lower risk anyways, so no big loss).
* `variant`. As with `optional`, the destructor leaves the `variant` valueless.
* `move_only_function`. Everything goes through its pseudo-vtable.
* `exception_ptr`. `__ExceptionPtrDestroy()` calls `~shared_ptr<const _EXCEPTION_RECORD>()`, but it's separately compiled.
* `polymorphic_allocator`.
* `VSO_0000000_wcfb01_idempotent_container_destructors`.
* Add an `#error` for the future when we enable destructor tombstones by default. Update the comments to clarify how we feel about this usage. Add missing `alignas` to the buffers - we were being very bad kitties. 😼
* Add the new test, `GH_005315_destructor_tombstones`.
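To make the `~_Vector_val()` pattern concrete, here is a minimal sketch of a tombstoning destructor for a vector-like class, following the guards described in the first bullet above. It is illustrative only, not the actual `<vector>` code: the class, its member names, and the fallback tombstone value are invented for the example.

```cpp
#include <cstdint>
#include <memory>
#include <type_traits>

// Illustrative fallbacks so the sketch is self-contained; the real macros are
// configured centrally, and the real default tombstone value may differ.
#ifndef _MSVC_STL_DESTRUCTOR_TOMBSTONES
#define _MSVC_STL_DESTRUCTOR_TOMBSTONES 1
#endif
#ifndef _MSVC_STL_UINTPTR_TOMBSTONE_VALUE
#define _MSVC_STL_UINTPTR_TOMBSTONE_VALUE 0xDEADBEEFDEADBEEFull
#endif

template <class T, class Alloc = std::allocator<T>>
struct Toy_vector_val { // stand-in for a _Vector_val-like base class
    using pointer = typename std::allocator_traits<Alloc>::pointer;

    pointer myfirst = nullptr; // start of the allocation
    pointer mylast  = nullptr; // end of the constructed elements
    pointer myend   = nullptr; // end of the allocation

    ~Toy_vector_val() {
        // (A real destructor destroys elements and deallocates first; omitted here.)
#if _MSVC_STL_DESTRUCTOR_TOMBSTONES // guard everything: vanishes completely when disabled
        if constexpr (std::is_pointer_v<pointer>) { // guard against fancy pointers
            if (!std::is_constant_evaluated()) { // guard against constant evaluation
                // Named variable, because the macro might expand to a function call.
                const std::uintptr_t tombstone = _MSVC_STL_UINTPTR_TOMBSTONE_VALUE;
                myfirst = reinterpret_cast<pointer>(tombstone);
                mylast  = reinterpret_cast<pointer>(tombstone);
                myend   = reinterpret_cast<pointer>(tombstone);
            }
        }
#endif
    }
};
```

The same shape repeats across the other containers; only the member names and pointer types differ (e.g. `_Mapptr` for `_Deque_val`, `_Nodeptr` for `_List_val`), and size-like members are reset to their default construction values rather than tombstoned.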
For each type, the new test constructs and destroys an object (`optional` and `variant` are special because we simply set them to empty/valueless, so they have to start in a value-holding state), then calls a simple preconditionless operation, chosen to have the highest chance of succeeding without tombstones (if in unchecked release mode), but to encounter the tombstone if enabled. Again, this test is always built with tombstones enabled, so it should be perfectly reliable.
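As a hypothetical illustration of the shape of one such test case (the buffer/placement-new scaffolding and the choice of `begin()` as the poked operation are assumptions, not the actual test code):

```cpp
#include <list>
#include <memory>
#include <new>

// Deliberately commits use-after-destruction so a tombstone (if enabled) can catch it.
void poke_destroyed_list() {
    alignas(std::list<int>) unsigned char buffer[sizeof(std::list<int>)];
    auto* ptr = ::new (static_cast<void*>(buffer)) std::list<int>{1, 2, 3};
    std::destroy_at(ptr); // run the destructor; with tombstones, internal pointers become radioactive

    // Undefined behavior on purpose. Without tombstones (in unchecked release mode)
    // this has a good chance of appearing to work; with tombstones, chasing the
    // bogus pointer dies immediately, and ASan reports the bogus address.
    (void) ptr->begin();
}
```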
For `list`, `set`, and `unordered_set`, not even ASan can detect the UB without tombstones (which is ideal for the test, as the comment explains). ASan with tombstones prints nice messages about using the bogus address.
We test more than one character type for `basic_string` because of the capacity calculation. We don't need to test all character types here. `basic_string` is special and allows its null terminator to be accessed by `operator[]` and even overwritten as long as it's with an identical null character value, so `str[0] = '\0';` is actually valid on any modifiable `string` (just not a destroyed one!).
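A small illustration of that null terminator rule (this is standard `basic_string` behavior, independent of this PR):

```cpp
#include <string>

void null_terminator_rules() {
    std::string str = "cat";
    char c = str[str.size()]; // OK: operator[](size()) refers to the null terminator
    str[str.size()] = '\0';   // OK: it may be overwritten, but only with an identical null character
    str[0] = '\0';            // OK on any modifiable string (just not a destroyed one!)

    std::string empty;
    empty[0] = '\0';          // still OK: position 0 == size(), i.e. the null terminator
    (void) c;
}
```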
There's no `constexpr` coverage, since tombstones are guarded against constant evaluation. The new test also doesn't attempt fancy pointer coverage (unlike `VSO_0000000_wcfb01_idempotent_container_destructors`, which does provide good fancy pointer coverage).