Supply limiter
The ERC20ElasticSupply contract is an extension of the ERC20 standard that allows the creation of elastic supply tokens that can be minted and burned by other contracts. As a security measure, the contract includes a mint limiter whose parameters are set in the constructor when the token contract is created.
Both the GEX token and all Geminon stablecoins include this security mechanism. It is designed for cases in which permission to mint Geminon tokens is granted to contracts outside the protocol, in particular to cross-chain bridges operated by third parties: in the event of a bridge hack, the potential damage to the protocol is bounded.
The algorithm, developed by Geminon, is based on adaptive exponential smoothing whose parameters, α and β, vary depending on the interval elapsed since the last minting or burning of the token.
The described algorithm is designed to approximate the value of a 24-hour moving average computed over a discrete time series of variable frequency, using a minimal amount of computation.
In a traditional computer application, if we wanted to monitor the accumulated value of a discrete variable over a certain time interval, we would create a table that stores each value of the variable together with its timestamp, and use that table to compute the average of all values that fall within the desired time window, for example the last 24 hours.
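Off-chain, that table-based approach is straightforward. A minimal Python sketch of it (the class and method names are illustrative, not part of the contract):

```python
from collections import deque

WINDOW = 24 * 3600  # 24-hour window, in seconds

class WindowedAverage:
    """Naive moving average: store every (timestamp, value) pair."""

    def __init__(self):
        self.table = deque()  # grows with the number of operations

    def record(self, timestamp, value):
        self.table.append((timestamp, value))

    def average(self, now):
        # Drop entries older than the window, then average the rest.
        while self.table and self.table[0][0] < now - WINDOW:
            self.table.popleft()
        if not self.table:
            return 0.0
        return sum(v for _, v in self.table) / len(self.table)

w = WindowedAverage()
w.record(0, 10)
w.record(3600, 20)
w.record(90000, 30)
avg = w.average(90000)  # the t=0 entry has aged out of the 24h window
```

Note that both the storage in `record` and the iteration in `average` grow with the number of operations in the period, which is precisely what makes this design unworkable on-chain.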
In a blockchain application, however, such a basic calculation would be cost-prohibitive. First, storing data on the blockchain, especially a congested one like Ethereum, is extremely expensive. Second, each transaction would need to iterate over the stored lists of values and timestamps, so the complexity of the calculation, and with it the cost of the transaction, would grow with the number of data points stored in the period.
This approach would not only be disproportionately expensive but also dangerous: given a high enough trading volume, the network's gas limit per block could be reached, temporarily blocking the token's minting functions because transactions that exceed that gas limit revert. A malicious actor could exploit this vulnerability to launch denial-of-service attacks against the protocol.
The autoregressive algorithm designed by Geminon closely approximates the explicitly calculated moving average curve.
In a simple simulation, entering constant-amount trades at constant intervals to produce a ramp in the moving-average value, the approximator follows the ramp and converges to the new level without needing to store the series values.
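The exact weight functions used by Geminon are not reproduced here, but the behaviour can be illustrated with a simple exponentially decayed accumulator, a common choice for irregularly sampled series (the decay rule below is an assumption for illustration, not the contract's formula):

```python
import math

PERIOD = 86400.0  # 24-hour reference period, in seconds

def update(acc, volume, dt, tau=PERIOD):
    """One hypothetical smoothing step: decay the accumulator according
    to the elapsed time dt, then add the new trade volume. When dt == 0
    (same block) the decay factor is 1 and the volume is simply added."""
    return math.exp(-dt / tau) * acc + volume

# Ramp test: constant trades at constant intervals. The accumulator
# climbs the ramp and settles at a steady level close to the true
# 24-hour accumulated volume, while storing only a single number.
acc = 0.0
for _ in range(500):                 # 500 hourly trades of 10 tokens
    acc = update(acc, 10.0, 3600.0)

steady = 10.0 / (1.0 - math.exp(-3600.0 / PERIOD))  # analytic limit
true_24h_volume = 24 * 10.0  # trades actually inside a 24h window
```

Whatever the exact weights, the key property is the one the simulations show: the state is a single accumulator, so the gas cost of an update is constant regardless of trading activity.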
In a much more difficult test, a series of random values is entered at random time intervals. As can be seen in the figure, the algorithm designed by Geminon follows the real value of the moving average at all times despite the strong randomness of the signal, without needing to store the previous values.
Here Δt is the time interval in seconds elapsed since the last operation and v is the volume (number of tokens) to be minted or burned; in the latter case v takes a negative value and the value of the accumulator is reduced.
Note that time is a discrete variable defined by the timestamp of the block in which the current transaction is mined, so transactions mined within the same block have Δt = 0. In these cases the weights α and β take the value 1 and the transaction's token volume is added directly to the volume accumulated up to that moment.
If we were dealing with a uniform time series, with values separated at regular intervals in time, an N-period moving average could be easily approximated by an equivalent exponential moving average with parameter α = 2/(N+1). Since in our case there is no possibility of imputing the missing values to transform the irregular series into a regular one, the only option is to introduce a variable that compensates for the irregularity in the frequency domain, adapting the intensity of the smoothing applied to each new data point depending on how far it is from the basic frequency defined by the period of the moving average being approximated (86400 seconds in our case).
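For the uniform case, the equivalence can be checked numerically. The sketch below compares an N-period simple moving average against an exponential moving average with α = 2/(N+1), the standard rule of thumb for matching the two (this is a generic illustration, not code from the protocol):

```python
N = 24                 # samples per period in the uniform series
alpha = 2.0 / (N + 1)  # equivalent EMA smoothing parameter

# A ramp that then flattens out, sampled at regular intervals.
series = [float(x) for x in range(1, 101)] + [100.0] * 100

ema = 0.0
window = []
for x in series:
    ema = alpha * x + (1.0 - alpha) * ema  # O(1) state
    window.append(x)                       # O(N) state
    if len(window) > N:
        window.pop(0)
sma = sum(window) / len(window)
```

By the end of the series both trackers sit at the plateau level, but the EMA needs only one stored number, which is what makes the exponential form attractive on-chain.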