Table 2 Hyperparameter Summary

From: A novel generative adversarial networks modelling for the class imbalance problem in high dimensional omics data

| id | func_optim | beta | batch_size | iter_critic | dropout_prob | gen_structure | critic_structure | max_epochs_1 | max_loss_1 | n_epochs_2 | lr_1 | lr_2 | rate_save | diff_epochs | max_loss_2 | min_loss_2 | instab_constant |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| lipid_pretraining | SGD | 13 | 30 | 4 | 0.5 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 1000 | 0.0005, 0.0001, 0.001 | 1e-05, 5e-05, 0.0001 | 20 | 500 | 15 | -10 | 3 |
| lipid_retraining | SGD | 13 | 10 | 2 | 0.7 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 5000 | 0.0005, 0.0001 | 1e-05, 5e-05 | 10 | 500 | 15 | -15 | 3 |
| micro_pretraining | SGD | 13 | 30 | 4 | 0.5 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 1000 | 0.0005, 0.0001, 0.001 | 1e-05, 5e-05, 0.0001 | 20 | 500 | 15 | -10 | 3 |
| micro_retraining | SGD | 13 | 10 | 2 | 0.7 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 5000 | 0.0005, 0.0001 | 1e-05, 5e-05 | 10 | 500 | 15 | -15 | 3 |
| sim_pretraining | SGD | 13 | 30 | 4 | 0.5 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 1000 | 0.0005, 0.0001, 0.001 | 1e-05, 5e-05, 0.0001 | 20 | 500 | 15 | -10 | 3 |
| sim_retraining | SGD | 13 | 10 | 2 | 0.7 | 50, 100, 200 | 200, 100, 50 | 500 | 3 | 5000 | 0.0005, 0.0001 | 1e-05, 5e-05 | 10 | 500 | 15 | -15 | 3 |

  1. Hyperparameter values used across the experiments for each dataset; pre-training and re-training values are shown for each experiment. iter_critic defines the number of critic training iterations performed before one training cycle of the generator. n_epochs_2 defines the number of epochs trained using the determined learning rates. instab_constant was used to define the effect of instability when determining the most effective learning rates; max_loss_1 and max_epochs_1 were used to define this in the first training iterations, and max_loss_2 and min_loss_2 in the second. diff_epochs defines the maximum number of epochs to train when determining this. Abbreviations: stochastic gradient descent (SGD), metabolomics study (lipid), microarray study (micro), simulated microarray study (sim), optimising function (func_optim), instability (instab), probability (prob), iterations (iter), learning rate (lr).
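As a minimal sketch of the schedule the caption describes (not the authors' code), the snippet below shows how iter_critic governs alternation between critic and generator updates in a WGAN-style loop: the critic is updated iter_critic times before each single generator training cycle. The function name `training_schedule` is a hypothetical placeholder.

```python
def training_schedule(n_generator_cycles: int, iter_critic: int) -> list[str]:
    """Return the sequence of update steps: `iter_critic` critic updates
    precede each single generator update, repeated for every cycle."""
    steps = []
    for _ in range(n_generator_cycles):
        steps.extend(["critic"] * iter_critic)  # train critic iter_critic times
        steps.append("generator")               # then one generator cycle
    return steps

# With the pre-training setting iter_critic = 4 (see table), each cycle is
# four critic updates followed by one generator update.
schedule = training_schedule(n_generator_cycles=2, iter_critic=4)
print(schedule)
# → ['critic', 'critic', 'critic', 'critic', 'generator',
#    'critic', 'critic', 'critic', 'critic', 'generator']
```

For the re-training runs, iter_critic drops to 2, so the critic receives half as many updates per generator cycle.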