Volume 28, Issue 1 (9-2023)
Andishe 2023, 28(1): 63-73
Determining the variance boundaries of single-mode distributions using power entropy
Manije Sanei tabas * , Mohammadhosein Dehghan , Fatemeh Ashtab
Abstract:
Variance and entropy are distinct metrics commonly used to quantify the uncertainty of a random variable. Variance measures how far a random variable spreads around its expectation, while entropy measures uncertainty from an information-theoretic viewpoint; in other words, it measures the average amount of information carried by a random variable.
For both the uniform and the normal distribution, the variance is a constant multiple of the power entropy. Finding such a monotone relationship between variance and entropy for a larger class of distributions is important and useful in signal processing, machine learning, information theory, probability, and statistics. For example, it can be used to reduce estimator error and to choose a strategy that yields, on average, the greatest (or nearly the greatest) reduction in the entropy of the distribution of a target location; the effectiveness of this approach has been tested in simulations with mining assay models. In this article, an upper bound on the variance of unimodal distributions whose tails are heavier than exponential tails is established with the help of power entropy.
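The claim about uniform and normal distributions can be checked numerically. The sketch below assumes the standard entropy-power definition N(X) = e^{2h(X)}/(2πe), where h(X) is the differential entropy; for a normal distribution N(X) equals the variance exactly (ratio 1), while for a uniform distribution the ratio Var(X)/N(X) is the constant πe/6. This is an illustrative sketch, not code from the article.

```python
import math

def entropy_power(h):
    """Entropy power N(X) = exp(2*h(X)) / (2*pi*e), given differential entropy h."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Normal(0, sigma^2): h(X) = 0.5 * ln(2*pi*e*sigma^2), Var(X) = sigma^2.
sigma2 = 4.0
h_normal = 0.5 * math.log(2 * math.pi * math.e * sigma2)
ratio_normal = sigma2 / entropy_power(h_normal)   # exactly 1

# Uniform(0, a): h(X) = ln(a), Var(X) = a^2 / 12.
a = 3.0
h_uniform = math.log(a)
ratio_uniform = (a ** 2 / 12) / entropy_power(h_uniform)  # pi*e/6, independent of a

print(ratio_normal, ratio_uniform)
```

In both cases the ratio of variance to entropy power is a constant independent of the scale parameter, which is the monotone relationship the abstract refers to.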
Keywords: Power entropy, variance bounds, unimodal distributions, Lipschitz continuity.
Full-Text [PDF 247 kb]
Type of Study: Research | Subject: Special
Received: 2023/10/5 | Accepted: 2023/09/22 | Published: 2024/03/15
Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.