
[AHM] Vesting #575

Merged — 9 commits merged into dev-asset-hub-migration from oty-vesting on Feb 11, 2025
Conversation

ggwpez (Member) commented Feb 3, 2025

Merging into the AHM working branch. Depends on #579

Pallet Vesting

Pallet Vesting has one storage map that holds the vesting schedules and one storage value that tracks the
current version of the pallet. The version can be migrated easily, but the schedules are a bit more difficult.

Storage: Vesting

The vesting schedules are already measured in relay blocks, as can be seen
here.
This means that we can simply integrate the existing schedules. The only possible issue arises when
there are lots of pre-existing schedules: the maximum number of schedules is 28, both on the Relay chain and on AH.
We cannot use the merge functionality of the vesting pallet, since it can be used as an attack
vector: anyone could send 28 vested transfers with a very long unlock duration and a low amount to force
all other schedules to adopt this long unlock period. This would reduce the amount unlocked per block,
which is bad (see the worked example below).
For now, we write all colliding AH schedules into a storage item for manual inspection later.
It could still happen that non-malicious users end up with more than 28 schedules, but this is
unlikely, as nobody has used vested transfers on AH yet.
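
To make the merge concern concrete, here is a small worked example. The numbers are invented and the merge arithmetic is only an approximation of pallet-vesting's behaviour (the combined amount unlocks over the longer remaining duration), not taken verbatim from the pallet:

```rust
// Illustration only: invented numbers; the merge is approximated as "combined
// amount unlocks over the longer of the two remaining durations".
fn per_block(locked: u128, duration_in_blocks: u128) -> u128 {
    locked / duration_in_blocks
}

fn main() {
    // Victim schedule: 1_000_000 units unlocking over 100_000 blocks.
    let before = per_block(1_000_000, 100_000);
    assert_eq!(before, 10); // 10 units unlock per block

    // Attacker sends a vested transfer of 1 unit over 10_000_000 blocks.
    // A forced merge would unlock the combined amount over the longer duration.
    let after = per_block(1_000_000 + 1, 10_000_000);
    assert_eq!(after, 0); // integer division: effectively nothing unlocks per block

    println!("per-block unlock: before = {before}, after forced merge = {after}");
}
```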

Q: Maybe we should disable vested transfers with the next runtime upgrade on AH.

Storage: StorageVersion

The vesting pallet does not use the proper FRAME version tracking; instead, it tracks its version in
its own StorageVersion value. It does this incorrectly, with Asset Hub reporting version 0
instead of 1. We ignore the reported value and correct it by writing 1 to the storage (sketched below).
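
A minimal sketch of that correction, assuming the target is the pallet's manually tracked Vesting::StorageVersion value; which key the migration actually writes is not spelled out in this PR:

```rust
// Sketch only. Assumes the correction targets the pallet's manually tracked
// `Vesting::StorageVersion` value, whose `Releases::V1` variant SCALE-encodes
// as the single byte 1.
use frame_support::storage::unhashed;
use sp_io::hashing::twox_128;

fn force_vesting_version_to_v1() {
    // twox128("Vesting") ++ twox128("StorageVersion") is the standard key
    // derivation for a plain storage value of the Vesting pallet.
    let key = [twox_128(b"Vesting"), twox_128(b"StorageVersion")].concat();
    unhashed::put_raw(&key, &[1u8]);
}
```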

User Impact

This affects users that have vesting schedules on the Relay chain or on Asset Hub. There is a
risk that the total number of schedules exceeds 28, which means they would no longer fit into
storage.

In that case we prioritize the schedules from AH and pause and stash all schedules that do not fit
within the bound of 28 (see the sketch below).
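
A rough sketch of that prioritization logic, with invented names and simplified types rather than the actual rc-migrator/ah-migrator code:

```rust
// Sketch only: invented names and simplified types, not the PR's actual code.
const MAX_VESTING_SCHEDULES: usize = 28;

#[derive(Clone, Debug)]
struct VestingInfo {
    locked: u128,
    per_block: u128,
    starting_block: u32, // already denominated in relay blocks on both chains
}

/// Combine pre-existing AH schedules with incoming Relay chain schedules.
/// AH schedules are kept first; Relay chain schedules are appended until the
/// 28-slot bound is reached, and everything beyond that is returned separately
/// so it can be stashed for manual inspection.
fn integrate_schedules(
    ah: Vec<VestingInfo>,
    rc: Vec<VestingInfo>,
) -> (Vec<VestingInfo>, Vec<VestingInfo>) {
    let mut kept = ah;
    let mut stashed = Vec::new();
    for schedule in rc {
        if kept.len() < MAX_VESTING_SCHEDULES {
            kept.push(schedule);
        } else {
            stashed.push(schedule);
        }
    }
    (kept, stashed)
}
```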

  • Does not require a CHANGELOG entry

Signed-off-by: Oliver Tale-Yazdi <oliver.tale-yazdi@parity.io>
Comment on lines 72 to 74
// TODO what do? should we create a storage item and insert the truncated ones?
// Nobody seems to use the Vesting pallet on AH yet, but we cannot be sure that there
// won't be a truncation.
ggwpez (Member Author):

@muharem, any better idea? I think if we increase the max vesting schedules in AH then it can still happen.

Contributor:

Yes, or we could use the same storage but derive a new account ID from an actual account ID; both ways are exploitable though. I would just add a storage item, since it's a low-effort thing.

ggwpez (Member Author) commented Feb 4, 2025

Yeah, this is an issue and an easy attack vector.

ggwpez mentioned this pull request on Feb 3, 2025
Review thread on pallets/rc-migrator/src/vesting.rs (outdated, resolved)
Review thread on pallets/ah-migrator/src/vesting.rs (resolved)
Signed-off-by: Oliver Tale-Yazdi <oliver.tale-yazdi@parity.io>
bkchr (Contributor) commented Feb 7, 2025

> The vesting pallet does not use the proper FRAME version tracking; instead, it tracks its version in
> its own StorageVersion value. It does this incorrectly, with Asset Hub reporting version 0
> instead of 1. We ignore the reported value and correct it by writing 1 to the storage.

What do you mean by this?

bkchr added a commit that referenced this pull request Feb 7, 2025
Context: #575

Nobody has used vested transfers on Polkadot or Kusama Asset Hub so far, and disabling
them until the migration avoids insertion conflicts.

---------

Signed-off-by: Oliver Tale-Yazdi <oliver.tale-yazdi@parity.io>
Co-authored-by: Bastian Köcher <info@kchr.de>
Co-authored-by: fellowship-merge-bot[bot] <151052383+fellowship-merge-bot[bot]@users.noreply.github.com>
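
Disabling vested transfers, as that commit describes, could for instance be done with a runtime call filter. The following is only a sketch; the struct name and the exact set of filtered calls are assumptions, not necessarily what the commit implements:

```rust
// Sketch only: assumes it sits inside the Asset Hub runtime where `RuntimeCall`
// and `pallet_vesting` are in scope; the filter name and covered call variants
// are illustrative.
use frame_support::traits::Contains;

pub struct NoVestedTransfers;
impl Contains<RuntimeCall> for NoVestedTransfers {
    fn contains(call: &RuntimeCall) -> bool {
        // Reject vested transfers; allow everything else.
        !matches!(
            call,
            RuntimeCall::Vesting(pallet_vesting::Call::vested_transfer { .. })
                | RuntimeCall::Vesting(pallet_vesting::Call::force_vested_transfer { .. })
        )
    }
}
```

Such a filter would typically be combined with the runtime's existing frame_system::Config::BaseCallFilter, e.g. via frame_support::traits::InsideBoth.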
ggwpez (Member Author) commented Feb 11, 2025

@bkchr The pallet is doing some manual tracking here: https://github.com/paritytech/polkadot-sdk/blob/3846691350abd2c39f557500fe1f115cd6676d16/substrate/frame/vesting/src/lib.rs#L216-L217

This results in two versions:
[Screenshot: the two reported storage versions for the Vesting pallet]
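
For context, a hedged sketch of where the two values live; the key derivation follows the standard FRAME hashing scheme and is simplified, not taken from the migration code:

```rust
// Sketch: the two storage locations that end up holding a "version" for the
// Vesting pallet, which is why two different values get reported.
use sp_io::hashing::twox_128;

fn vesting_version_keys() -> (Vec<u8>, Vec<u8>) {
    let prefix = twox_128(b"Vesting");
    // 1. FRAME's canonical per-pallet version, stored under the well-known
    //    `:__STORAGE_VERSION__:` postfix key.
    let frame_key = [prefix, twox_128(b":__STORAGE_VERSION__:")].concat();
    // 2. The pallet's own manually tracked `StorageVersion` value (a `Releases`
    //    enum), which is the one reporting 0 on Asset Hub.
    let manual_key = [prefix, twox_128(b"StorageVersion")].concat();
    (frame_key, manual_key)
}
```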

ggwpez marked this pull request as ready for review February 11, 2025 12:13
bkchr (Contributor) commented Feb 11, 2025

🙄

Please open an issue to fix this upstream.

Signed-off-by: Oliver Tale-Yazdi <oliver.tale-yazdi@parity.io>
ggwpez merged commit daf6723 into dev-asset-hub-migration on Feb 11, 2025
19 of 44 checks passed
ggwpez deleted the oty-vesting branch February 11, 2025 12:33