Proceedings of the 12th Scientific Conference - Information Technology and Its Applications (CITA 2023)



the best offspring produced by different training objectives (i.e., mutations). In this way, it contributes to the progress and success of GANs. Experiments on several datasets demonstrate the advantages of integrating different adversarial training objectives and show that E-GAN achieves improved performance for image generation.

Dealing with missing data is a common issue in empirical research. Data scientists encounter various types of missing data, such as Missing Completely At Random (MCAR), Missing At Random (MAR), and Missing Not At Random (MNAR), which are classified according to the mechanism that produces the missingness. MCAR occurs when the missingness is unrelated to the hypothetical (missing) value, the values of other variables, or the observed records. In MAR, on the other hand, the missingness is unrelated to the specific missing values, but may depend on a subset of the observed data. Lastly, MNAR is the most general type of missing data: it occurs when the missingness depends on both the hypothetical values and the values of specific variables.
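The three mechanisms can be illustrated with a small simulation. The sketch below (illustrative only; the thresholds and column choices are our own assumptions, not part of the paper) masks one column of a synthetic dataset under each mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))  # two fully observed columns

# MCAR: missingness is independent of all values (fixed 20% rate).
mcar_mask = rng.random(X.shape[0]) < 0.2

# MAR: missingness in column 1 depends only on the *observed* column 0.
mar_mask = rng.random(X.shape[0]) < 0.4 * (X[:, 0] > 0)

# MNAR: missingness in column 1 depends on the value that goes missing itself.
mnar_mask = rng.random(X.shape[0]) < 0.4 * (X[:, 1] > 0)

# Apply, e.g., the MAR mechanism to obtain an incomplete dataset.
X_mar = X.copy()
X_mar[mar_mask, 1] = np.nan
```

Note that MAR can be handled by conditioning on observed covariates, whereas MNAR generally cannot be detected from the observed data alone.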



                     3     The proposed method


As stated before, an evolutionary step in EGAN can exploit the advantages and suppress the weaknesses of different metrics. Meanwhile, using a GAN for imputing missing values achieves high quality in comparison with state-of-the-art imputation methods. Therefore, in this paper, we propose an imputation method that integrates a GAN with evolutionary computation, called EGAIN (Evolutionary Generative Adversarial Imputation Network). Similar to EGAN, in our EGAIN model, a population of $\mu$ generators $\{G_1, \dots, G_\mu\}$ evolves in a given environment, the discriminator $D$. Each evolutionary step consists of three sub-stages:

- Variation: applying one step of an optimization algorithm with $m$ different objective functions to each generator $G_i$ to produce $m$ child generators $\{G_{i,1}, \dots, G_{i,m}\}$.
- Evaluation: for each child generator, its performance is evaluated by a fitness function consisting of two components: a quality score and a diversity score.
- Selection: the $\mu$ generators with the highest fitness scores are kept and evolve to the next iteration.

The discriminator is updated after each evolutionary step to continually provide adaptive losses that drive the population of generators to evolve toward better solutions. Next, our network and the evolutionary step are presented in detail.
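The variation-evaluation-selection loop above can be sketched in miniature. In this toy sketch, each "generator" is reduced to a parameter vector, the mutation objectives are stand-ins for the different adversarial losses, and the quality and diversity terms of the fitness function are illustrative placeholders, not the paper's actual definitions:

```python
import numpy as np

rng = np.random.default_rng(1)

MU = 2     # population size: parents kept after each step
N_OBJ = 3  # number of training objectives (mutations)

def variation(parent):
    """Produce N_OBJ children, one per 'objective' (here: random perturbations)."""
    return [parent + rng.normal(scale=0.1, size=parent.shape) for _ in range(N_OBJ)]

def fitness(child):
    """Assumed fitness = quality score + diversity score (both toy terms)."""
    quality = -np.sum(child ** 2)   # placeholder quality term
    diversity = np.std(child)       # placeholder diversity term
    return quality + diversity

population = [rng.normal(size=4) for _ in range(MU)]

# One evolutionary step: variation -> evaluation -> selection.
children = [c for parent in population for c in variation(parent)]
children.sort(key=fitness, reverse=True)
population = children[:MU]          # keep the MU fittest generators
```

In the full model, `variation` corresponds to one optimizer step under each adversarial objective, and the discriminator update after selection keeps the fitness landscape adaptive.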



                     3.1   Generator

Suppose that $X = (X_1, \dots, X_d)$ is a data vector with missing values in a $d$-dimensional space, and $M = (M_1, \dots, M_d) \in \{0, 1\}^d$ is a mask vector indicating which components of $X$ are observed, that is:

$$M_i = \begin{cases} 1, & \text{if } X_i \text{ is observed} \\ 0, & \text{otherwise} \end{cases}$$

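In practice, the mask can be derived directly from the incomplete data matrix, assuming missing entries are encoded as `NaN` (an assumption of this sketch, not stated in the text):

```python
import numpy as np

# Incomplete data matrix: NaN marks a missing entry.
X = np.array([[1.0, np.nan, 3.0],
              [np.nan, 5.0, 6.0]])

# Mask per the definition above: 1 where X is observed, 0 where missing.
M = (~np.isnan(X)).astype(float)
```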
                     CITA 2023                                                   ISBN: 978-604-80-8083-9