[2024-06-19 09:33:02,673] INFO: Will use torch.nn.parallel.DistributedDataParallel() and 8 gpus
[2024-06-19 09:33:02,811] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:02,812] INFO: NVIDIA GeForce GTX 1080 Ti
[2024-06-19 09:33:10,247] INFO: using dtype=torch.float32
[2024-06-19 09:33:11,181] INFO: using attention_type=math
[2024-06-19 09:33:11,193] INFO: using attention_type=math
[2024-06-19 09:33:11,204] INFO: using attention_type=math
[2024-06-19 09:33:11,215] INFO: using attention_type=math
[2024-06-19 09:33:11,226] INFO: using attention_type=math
[2024-06-19 09:33:11,238] INFO: using attention_type=math
[2024-06-19 09:33:15,300] INFO: DistributedDataParallel(
  (module): MLPF(
    (nn0_id): Sequential(
      (0): Linear(in_features=17, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=512, bias=True)
    )
    (nn0_reg): Sequential(
      (0): Linear(in_features=17, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=512, bias=True)
    )
    (conv_id): ModuleList(
      (0-2): 3 x SelfAttentionLayer(
        (mha): MultiheadAttention(
          (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
        )
        (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (seq): Sequential(
          (0): Linear(in_features=512, out_features=512, bias=True)
          (1): ReLU()
          (2): Linear(in_features=512, out_features=512, bias=True)
          (3): ReLU()
        )
        (dropout): Dropout(p=0.0, inplace=False)
      )
    )
    (conv_reg): ModuleList(
      (0-2): 3 x SelfAttentionLayer(
        (mha): MultiheadAttention(
          (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
        )
        (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (seq): Sequential(
          (0): Linear(in_features=512, out_features=512, bias=True)
          (1): ReLU()
          (2): Linear(in_features=512, out_features=512, bias=True)
          (3): ReLU()
        )
        (dropout): Dropout(p=0.0, inplace=False)
      )
    )
    (nn_id): Sequential(
      (0): Linear(in_features=529, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=6, bias=True)
    )
    (nn_pt): RegressionOutput(
      (nn): Sequential(
        (0): Linear(in_features=535, out_features=512, bias=True)
        (1): ReLU()
        (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (3): Dropout(p=0.0, inplace=False)
        (4): Linear(in_features=512, out_features=2, bias=True)
      )
    )
    (nn_eta): RegressionOutput(
      (nn): Sequential(
        (0): Linear(in_features=535, out_features=512, bias=True)
        (1): ReLU()
        (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (3): Dropout(p=0.0, inplace=False)
        (4): Linear(in_features=512, out_features=2, bias=True)
      )
    )
    (nn_sin_phi): RegressionOutput(
      (nn): Sequential(
        (0): Linear(in_features=535, out_features=512, bias=True)
        (1): ReLU()
        (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (3): Dropout(p=0.0, inplace=False)
        (4): Linear(in_features=512, out_features=2, bias=True)
      )
    )
    (nn_cos_phi): RegressionOutput(
      (nn): Sequential(
        (0): Linear(in_features=535, out_features=512, bias=True)
        (1): ReLU()
        (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (3): Dropout(p=0.0, inplace=False)
        (4): Linear(in_features=512, out_features=2, bias=True)
      )
    )
    (nn_energy): RegressionOutput(
      (nn): Sequential(
        (0): Linear(in_features=535, out_features=512, bias=True)
        (1): ReLU()
        (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
        (3): Dropout(p=0.0, inplace=False)
        (4): Linear(in_features=512, out_features=2, bias=True)
      )
    )
  )
)
[2024-06-19 09:33:15,301] INFO: Trainable parameters: 11671568
[2024-06-19 09:33:15,301] INFO: Non-trainable parameters: 0
[2024-06-19 09:33:15,301] INFO: Total parameters: 11671568
[2024-06-19 09:33:15,306] INFO: Modules Trainable parameters Non-trainable parameters
module.nn0_id.0.weight 8704 0
module.nn0_id.0.bias 512 0
module.nn0_id.2.weight 512 0
module.nn0_id.2.bias 512 0
module.nn0_id.4.weight 262144 0
module.nn0_id.4.bias 512 0
module.nn0_reg.0.weight 8704 0
module.nn0_reg.0.bias 512 0
module.nn0_reg.2.weight 512 0
module.nn0_reg.2.bias 512 0
module.nn0_reg.4.weight 262144 0
module.nn0_reg.4.bias 512 0
module.conv_id.0.mha.in_proj_weight 786432 0
module.conv_id.0.mha.in_proj_bias 1536 0
module.conv_id.0.mha.out_proj.weight 262144 0
module.conv_id.0.mha.out_proj.bias 512 0
module.conv_id.0.norm0.weight 512 0
module.conv_id.0.norm0.bias 512 0
module.conv_id.0.norm1.weight 512 0
module.conv_id.0.norm1.bias 512 0
module.conv_id.0.seq.0.weight 262144 0
module.conv_id.0.seq.0.bias 512 0
module.conv_id.0.seq.2.weight 262144 0
module.conv_id.0.seq.2.bias 512 0
module.conv_id.1.mha.in_proj_weight 786432 0
module.conv_id.1.mha.in_proj_bias 1536 0
module.conv_id.1.mha.out_proj.weight 262144 0
module.conv_id.1.mha.out_proj.bias 512 0
module.conv_id.1.norm0.weight 512 0
module.conv_id.1.norm0.bias 512 0
module.conv_id.1.norm1.weight 512 0
module.conv_id.1.norm1.bias 512 0
module.conv_id.1.seq.0.weight 262144 0
module.conv_id.1.seq.0.bias 512 0
module.conv_id.1.seq.2.weight 262144 0
module.conv_id.1.seq.2.bias 512 0
module.conv_id.2.mha.in_proj_weight 786432 0
module.conv_id.2.mha.in_proj_bias 1536 0
module.conv_id.2.mha.out_proj.weight 262144 0
module.conv_id.2.mha.out_proj.bias 512 0
module.conv_id.2.norm0.weight 512 0
module.conv_id.2.norm0.bias 512 0
module.conv_id.2.norm1.weight 512 0
module.conv_id.2.norm1.bias 512 0
module.conv_id.2.seq.0.weight 262144 0
module.conv_id.2.seq.0.bias 512 0
module.conv_id.2.seq.2.weight 262144 0
module.conv_id.2.seq.2.bias 512 0
module.conv_reg.0.mha.in_proj_weight 786432 0
module.conv_reg.0.mha.in_proj_bias 1536 0
module.conv_reg.0.mha.out_proj.weight 262144 0
module.conv_reg.0.mha.out_proj.bias 512 0
module.conv_reg.0.norm0.weight 512 0
module.conv_reg.0.norm0.bias 512 0
module.conv_reg.0.norm1.weight 512 0
module.conv_reg.0.norm1.bias 512 0
module.conv_reg.0.seq.0.weight 262144 0
module.conv_reg.0.seq.0.bias 512 0
module.conv_reg.0.seq.2.weight 262144 0
module.conv_reg.0.seq.2.bias 512 0
module.conv_reg.1.mha.in_proj_weight 786432 0
module.conv_reg.1.mha.in_proj_bias 1536 0
module.conv_reg.1.mha.out_proj.weight 262144 0
module.conv_reg.1.mha.out_proj.bias 512 0
module.conv_reg.1.norm0.weight 512 0
module.conv_reg.1.norm0.bias 512 0
module.conv_reg.1.norm1.weight 512 0
module.conv_reg.1.norm1.bias 512 0
module.conv_reg.1.seq.0.weight 262144 0
module.conv_reg.1.seq.0.bias 512 0
module.conv_reg.1.seq.2.weight 262144 0
module.conv_reg.1.seq.2.bias 512 0
module.conv_reg.2.mha.in_proj_weight 786432 0
module.conv_reg.2.mha.in_proj_bias 1536 0
module.conv_reg.2.mha.out_proj.weight 262144 0
module.conv_reg.2.mha.out_proj.bias 512 0
module.conv_reg.2.norm0.weight 512 0
module.conv_reg.2.norm0.bias 512 0
module.conv_reg.2.norm1.weight 512 0
module.conv_reg.2.norm1.bias 512 0
module.conv_reg.2.seq.0.weight 262144 0
module.conv_reg.2.seq.0.bias 512 0
module.conv_reg.2.seq.2.weight 262144 0
module.conv_reg.2.seq.2.bias 512 0
module.nn_id.0.weight 270848 0
module.nn_id.0.bias 512 0
module.nn_id.2.weight 512 0
module.nn_id.2.bias 512 0
module.nn_id.4.weight 3072 0
module.nn_id.4.bias 6 0
module.nn_pt.nn.0.weight 273920 0
module.nn_pt.nn.0.bias 512 0
module.nn_pt.nn.2.weight 512 0
module.nn_pt.nn.2.bias 512 0
module.nn_pt.nn.4.weight 1024 0
module.nn_pt.nn.4.bias 2 0
module.nn_eta.nn.0.weight 273920 0
module.nn_eta.nn.0.bias 512 0
module.nn_eta.nn.2.weight 512 0
module.nn_eta.nn.2.bias 512 0
module.nn_eta.nn.4.weight 1024 0
module.nn_eta.nn.4.bias 2 0
module.nn_sin_phi.nn.0.weight 273920 0
module.nn_sin_phi.nn.0.bias 512 0
module.nn_sin_phi.nn.2.weight 512 0
module.nn_sin_phi.nn.2.bias 512 0
module.nn_sin_phi.nn.4.weight 1024 0
module.nn_sin_phi.nn.4.bias 2 0
module.nn_cos_phi.nn.0.weight 273920 0
module.nn_cos_phi.nn.0.bias 512 0
module.nn_cos_phi.nn.2.weight 512 0
module.nn_cos_phi.nn.2.bias 512 0
module.nn_cos_phi.nn.4.weight 1024 0
module.nn_cos_phi.nn.4.bias 2 0
module.nn_energy.nn.0.weight 273920 0
module.nn_energy.nn.0.bias 512 0
module.nn_energy.nn.2.weight 512 0
module.nn_energy.nn.2.bias 512 0
module.nn_energy.nn.4.weight 1024 0
module.nn_energy.nn.4.bias 2 0
[2024-06-19 09:33:15,325] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX_pyg-clic_20240619_093302_150176
[2024-06-19 09:33:15,325] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX_pyg-clic_20240619_093302_150176
[2024-06-19 09:33:15,365] INFO: train_dataset: clic_edm_qq_pf, 1589912
[2024-06-19 09:33:15,391] INFO: train_dataset: clic_edm_ttbar_pf, 800800
[2024-06-19 09:33:15,432] INFO: train_dataset: clic_edm_ttbar_pu10_pf, 562200
[2024-06-19 09:33:15,455] INFO: train_dataset: clic_edm_ww_fullhad_pf, 800800
[2024-06-19 09:33:15,472] INFO: train_dataset: clic_edm_zh_tautau_pf, 800799
[2024-06-19 09:33:31,272] INFO: valid_dataset: clic_edm_qq_pf, 397514
[2024-06-19 09:33:31,354] INFO: Initiating epoch #1 train run on device rank=0
[2024-06-19 11:29:35,974] INFO: Initiating epoch #1 valid run on device rank=0
[2024-06-19 11:30:49,454] INFO: Rank 0: epoch=1 / 200 train_loss=15.7855 valid_loss=12.8759 stale=0 time=117.3m eta=23343.0m
[2024-06-19 11:30:49,513] INFO: Initiating epoch #2 train run on device rank=0
[2024-06-19 13:26:46,593] INFO: Initiating epoch #2 valid run on device rank=0
[2024-06-19 13:27:59,862] INFO: Rank 0: epoch=2 / 200 train_loss=12.3855 valid_loss=11.6398 stale=0 time=117.17m eta=23213.0m
[2024-06-19 13:28:00,249] INFO: Initiating epoch #3 train run on device rank=0
[2024-06-19 15:23:57,245] INFO: Initiating epoch #3 valid run on device rank=0
[2024-06-19 15:25:12,202] INFO: Rank 0: epoch=3 / 200 train_loss=11.5072 valid_loss=11.0686 stale=0 time=117.2m eta=23093.7m
[2024-06-19 15:25:12,686] INFO: Initiating epoch #4 train run on device rank=0
[2024-06-19 17:21:09,254] INFO: Initiating epoch #4 valid run on device rank=0
[2024-06-19 17:22:25,964] INFO: Rank 0: epoch=4 / 200 train_loss=11.0092 valid_loss=10.6815 stale=0 time=117.22m eta=22976.6m
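The SelfAttentionLayer blocks in the module dump above pair a MultiheadAttention (model dim 512, 32 heads of dim 16 per the model_kwargs logged in the later runs) with a two-layer feed-forward stack and two LayerNorms. A minimal sketch of such a block follows; the submodule names and shapes come from the dump, but the residual wiring and norm placement are assumptions, not the repository's actual forward pass.

import torch
import torch.nn as nn

class SelfAttentionLayer(nn.Module):
    """Sketch reconstructed from the printed module repr; wiring is assumed."""

    def __init__(self, dim: int = 512, num_heads: int = 32, dropout: float = 0.0):
        super().__init__()
        # 32 heads x head_dim 16 = 512, matching the logged model_kwargs
        self.mha = nn.MultiheadAttention(dim, num_heads, dropout=dropout, batch_first=True)
        self.norm0 = nn.LayerNorm(dim)
        self.norm1 = nn.LayerNorm(dim)
        self.seq = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(),
            nn.Linear(dim, dim), nn.ReLU(),
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        # residual attention block; this ordering is hypothetical
        x = x + self.mha(x, x, x, key_padding_mask=mask, need_weights=False)[0]
        x = self.norm0(x)
        x = x + self.seq(x)
        x = self.norm1(x)
        return self.dropout(x)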
[2024-06-19 17:22:26,901] INFO: Initiating epoch #5 train run on device rank=0
[2024-06-19 19:22:25,466] INFO: Initiating epoch #5 valid run on device rank=0
[2024-06-19 19:24:00,030] INFO: Rank 0: epoch=5 / 200 train_loss=10.6763 valid_loss=10.4501 stale=0 time=121.55m eta=23028.6m
[2024-06-19 19:24:01,249] INFO: Initiating epoch #6 train run on device rank=0
[2024-06-19 21:40:23,232] INFO: Initiating epoch #6 valid run on device rank=0
[2024-06-19 21:42:09,113] INFO: Rank 0: epoch=6 / 200 train_loss=10.4217 valid_loss=10.1847 stale=0 time=138.13m eta=23559.0m
[2024-06-19 21:42:21,590] INFO: Initiating epoch #7 train run on device rank=0
[2024-06-19 23:39:15,553] INFO: Initiating epoch #7 valid run on device rank=0
[2024-06-19 23:45:34,440] INFO: Rank 0: epoch=7 / 200 train_loss=10.2247 valid_loss=10.0564 stale=0 time=123.21m eta=23492.3m
[2024-06-19 23:45:52,934] INFO: Initiating epoch #8 train run on device rank=0
[2024-06-20 01:48:02,402] INFO: Initiating epoch #8 valid run on device rank=0
[2024-06-20 01:49:20,085] INFO: Rank 0: epoch=8 / 200 train_loss=10.0717 valid_loss=9.9078 stale=0 time=123.45m eta=23419.5m
[2024-06-20 01:49:21,185] INFO: Initiating epoch #9 train run on device rank=0
[2024-06-20 03:45:04,102] INFO: Initiating epoch #9 valid run on device rank=0
[2024-06-20 03:46:18,484] INFO: Rank 0: epoch=9 / 200 train_loss=9.9383 valid_loss=9.8114 stale=0 time=116.95m eta=23191.3m
[2024-06-20 03:46:19,069] INFO: Initiating epoch #10 train run on device rank=0
[2024-06-20 05:42:02,904] INFO: Initiating epoch #10 valid run on device rank=0
[2024-06-20 05:43:16,054] INFO: Rank 0: epoch=10 / 200 train_loss=9.8239 valid_loss=9.6990 stale=0 time=116.95m eta=22985.2m
[2024-06-20 05:43:16,321] INFO: Initiating epoch #11 train run on device rank=0
[2024-06-20 07:38:59,595] INFO: Initiating epoch #11 valid run on device rank=0
[2024-06-20 07:40:11,342] INFO: Rank 0: epoch=11 / 200 train_loss=9.7176 valid_loss=9.5818 stale=0 time=116.92m eta=22794.5m
[2024-06-20 07:40:11,389] INFO: Initiating epoch #12 train run on device rank=0
[2024-06-20 09:35:54,075] INFO: Initiating epoch #12 valid run on device rank=0
[2024-06-20 09:37:06,527] INFO: Rank 0: epoch=12 / 200 train_loss=9.6157 valid_loss=9.5124 stale=0 time=116.92m eta=22616.2m
[2024-06-20 09:37:06,677] INFO: Initiating epoch #13 train run on device rank=0
[2024-06-20 11:32:49,982] INFO: Initiating epoch #13 valid run on device rank=0
[2024-06-20 11:34:01,956] INFO: Rank 0: epoch=13 / 200 train_loss=9.5157 valid_loss=9.4733 stale=0 time=116.92m eta=22447.3m
[2024-06-20 11:34:02,004] INFO: Initiating epoch #14 train run on device rank=0
[2024-06-20 13:29:44,705] INFO: Initiating epoch #14 valid run on device rank=0
[2024-06-20 13:31:02,024] INFO: Rank 0: epoch=14 / 200 train_loss=9.4276 valid_loss=9.3803 stale=0 time=117.0m eta=22286.9m
[2024-06-20 13:31:03,067] INFO: Initiating epoch #15 train run on device rank=0
[2024-06-20 15:26:46,191] INFO: Initiating epoch #15 valid run on device rank=0
[2024-06-20 15:27:58,261] INFO: Rank 0: epoch=15 / 200 train_loss=9.3524 valid_loss=9.3637 stale=0 time=116.92m eta=22131.5m
[2024-06-20 15:27:58,336] INFO: Initiating epoch #16 train run on device rank=0
[2024-06-20 17:23:42,279] INFO: Initiating epoch #16 valid run on device rank=0
[2024-06-20 17:24:54,065] INFO: Rank 0: epoch=16 / 200 train_loss=9.2867 valid_loss=9.3206 stale=0 time=116.93m eta=21980.9m
[2024-06-20 17:24:54,218] INFO: Initiating epoch #17 train run on device rank=0
[2024-06-20 19:20:37,514] INFO: Initiating epoch #17 valid run on device rank=0
[2024-06-20 19:21:49,382] INFO: Rank 0: epoch=17 / 200 train_loss=9.2283 valid_loss=9.2814 stale=0 time=116.92m eta=21834.1m
[2024-06-20 19:21:49,483] INFO: Initiating epoch #18 train run on device rank=0
[2024-06-20 21:17:29,862] INFO: Initiating epoch #18 valid run on device rank=0
[2024-06-20 21:18:41,740] INFO: Rank 0: epoch=18 / 200 train_loss=9.1761 valid_loss=9.2331 stale=0 time=116.87m eta=21690.1m
[2024-06-20 21:18:41,844] INFO: Initiating epoch #19 train run on device rank=0
[2024-06-20 23:14:22,423] INFO: Initiating epoch #19 valid run on device rank=0
[2024-06-20 23:15:34,708] INFO: Rank 0: epoch=19 / 200 train_loss=9.1240 valid_loss=9.1956 stale=0 time=116.88m eta=21549.1m
[2024-06-20 23:15:34,820] INFO: Initiating epoch #20 train run on device rank=0
[2024-06-21 01:11:16,212] INFO: Initiating epoch #20 valid run on device rank=0
[2024-06-21 01:12:28,052] INFO: Rank 0: epoch=20 / 200 train_loss=9.0790 valid_loss=9.1380 stale=0 time=116.89m eta=21410.5m
[2024-06-21 01:12:28,175] INFO: Initiating epoch #21 train run on device rank=0
[2024-06-21 03:08:07,727] INFO: Initiating epoch #21 valid run on device rank=0
[2024-06-21 03:09:19,602] INFO: Rank 0: epoch=21 / 200 train_loss=9.0371 valid_loss=9.1057 stale=0 time=116.86m eta=21273.8m
[2024-06-21 03:09:19,753] INFO: Initiating epoch #22 train run on device rank=0
[2024-06-21 05:04:59,624] INFO: Initiating epoch #22 valid run on device rank=0
[2024-06-21 05:06:11,273] INFO: Rank 0: epoch=22 / 200 train_loss=8.9971 valid_loss=9.0757 stale=0 time=116.86m eta=21138.8m
[2024-06-21 05:06:11,333] INFO: Initiating epoch #23 train run on device rank=0
[2024-06-21 07:01:53,908] INFO: Initiating epoch #23 valid run on device rank=0
[2024-06-21 07:03:08,575] INFO: Rank 0: epoch=23 / 200 train_loss=8.9622 valid_loss=9.0516 stale=0 time=116.95m eta=21006.2m
[2024-06-21 07:03:08,889] INFO: Initiating epoch #24 train run on device rank=0
[2024-06-21 08:58:50,486] INFO: Initiating epoch #24 valid run on device rank=0
[2024-06-21 09:00:02,856] INFO: Rank 0: epoch=24 / 200 train_loss=8.9270 valid_loss=8.9930 stale=0 time=116.9m eta=20874.5m
[2024-06-21 09:00:02,981] INFO: Initiating epoch #25 train run on device rank=0
[2024-06-21 10:55:44,826] INFO: Initiating epoch #25 valid run on device rank=0
[2024-06-21 10:56:57,050] INFO: Rank 0: epoch=25 / 200 train_loss=8.8968 valid_loss=8.9906 stale=0 time=116.9m eta=20744.0m
[2024-06-21 10:56:57,225] INFO: Initiating epoch #26 train run on device rank=0
[2024-06-21 12:53:07,673] INFO: Initiating epoch #26 valid run on device rank=0
[2024-06-21 12:54:23,194] INFO: Rank 0: epoch=26 / 200 train_loss=8.8681 valid_loss=8.9685 stale=0 time=117.43m eta=20618.1m
[2024-06-21 12:54:23,982] INFO: Initiating epoch #27 train run on device rank=0
[2024-06-21 14:50:35,906] INFO: Initiating epoch #27 valid run on device rank=0
[2024-06-21 14:51:50,529] INFO: Rank 0: epoch=27 / 200 train_loss=8.8392 valid_loss=8.9579 stale=0 time=117.44m eta=20492.9m
[2024-06-21 14:51:51,361] INFO: Initiating epoch #28 train run on device rank=0
[2024-06-21 16:48:17,431] INFO: Initiating epoch #28 valid run on device rank=0
[2024-06-21 16:49:29,761] INFO: Rank 0: epoch=28 / 200 train_loss=8.8131 valid_loss=8.9254 stale=0 time=117.64m eta=20369.6m
[2024-06-21 16:49:30,009] INFO: Initiating epoch #29 train run on device rank=0
[2024-06-21 18:47:53,323] INFO: Initiating epoch #29 valid run on device rank=0
[2024-06-21 18:49:05,310] INFO: Rank 0: epoch=29 / 200 train_loss=8.7886 valid_loss=8.8990 stale=0 time=119.59m eta=20258.0m
[2024-06-21 18:49:05,501] INFO: Initiating epoch #30 train run on device rank=0
[2024-06-21 20:47:58,029] INFO: Initiating epoch #30 valid run on device rank=0
[2024-06-21 20:49:09,537] INFO: Rank 0: epoch=30 / 200 train_loss=8.7666 valid_loss=8.9031 stale=1 time=120.07m eta=20148.6m
[2024-06-21 20:49:09,846] INFO: Initiating epoch #31 train run on device rank=0
[2024-06-21 22:46:05,916] INFO: Initiating epoch #31 valid run on device rank=0
[2024-06-21 22:47:17,834] INFO: Rank 0: epoch=31 / 200 train_loss=8.7461 valid_loss=8.8859 stale=0 time=118.13m eta=20028.0m
[2024-06-21 22:47:18,415] INFO: Initiating epoch #32 train run on device rank=0
[2024-06-22 00:44:34,505] INFO: Initiating epoch #32 valid run on device rank=0
[2024-06-22 00:45:46,148] INFO: Rank 0: epoch=32 / 200 train_loss=8.7261 valid_loss=8.8806 stale=0 time=118.46m eta=19909.3m
[2024-06-22 00:45:46,621] INFO: Initiating epoch #33 train run on device rank=0
[2024-06-22 02:42:42,704] INFO: Initiating epoch #33 valid run on device rank=0
[2024-06-22 02:43:54,353] INFO: Rank 0: epoch=33 / 200 train_loss=8.7069 valid_loss=8.8685 stale=0 time=118.13m eta=19788.9m
[2024-06-22 02:43:54,537] INFO: Initiating epoch #34 train run on device rank=0
[2024-06-22 04:40:21,728] INFO: Initiating epoch #34 valid run on device rank=0
[2024-06-22 04:41:33,570] INFO: Rank 0: epoch=34 / 200 train_loss=8.6904 valid_loss=8.8565 stale=0 time=117.65m eta=19666.3m
[2024-06-22 04:41:33,787] INFO: Initiating epoch #35 train run on device rank=0
[2024-06-22 06:40:35,675] INFO: Initiating epoch #35 valid run on device rank=0
[2024-06-22 06:41:47,631] INFO: Rank 0: epoch=35 / 200 train_loss=8.6728 valid_loss=8.8614 stale=1 time=120.23m eta=19556.1m
[2024-06-22 06:41:48,057] INFO: Initiating epoch #36 train run on device rank=0
[2024-06-22 08:39:12,805] INFO: Initiating epoch #36 valid run on device rank=0
[2024-06-22 08:40:24,338] INFO: Rank 0: epoch=36 / 200 train_loss=8.6567 valid_loss=8.8436 stale=0 time=118.6m eta=19438.0m
[2024-06-22 08:40:24,597] INFO: Initiating epoch #37 train run on device rank=0
[2024-06-22 10:37:39,371] INFO: Initiating epoch #37 valid run on device rank=0
[2024-06-22 10:38:50,547] INFO: Rank 0: epoch=37 / 200 train_loss=8.6409 valid_loss=8.8488 stale=1 time=118.43m eta=19319.1m
[2024-06-22 10:38:50,771] INFO: Initiating epoch #38 train run on device rank=0
[2024-06-22 12:35:19,425] INFO: Initiating epoch #38 valid run on device rank=0
[2024-06-22 12:36:31,253] INFO: Rank 0: epoch=38 / 200 train_loss=8.6268 valid_loss=8.8223 stale=0 time=117.67m eta=19197.0m
[2024-06-22 12:36:31,448] INFO: Initiating epoch #39 train run on device rank=0
[2024-06-22 14:32:42,356] INFO: Initiating epoch #39 valid run on device rank=0
[2024-06-22 14:33:54,180] INFO: Rank 0: epoch=39 / 200 train_loss=8.6135 valid_loss=8.8220 stale=0 time=117.38m eta=19073.9m
[2024-06-22 14:33:54,390] INFO: Initiating epoch #40 train run on device rank=0
[2024-06-22 16:30:42,807] INFO: Initiating epoch #40 valid run on device rank=0
[2024-06-22 16:31:54,451] INFO: Rank 0: epoch=40 / 200 train_loss=8.5995 valid_loss=8.7969 stale=0 time=118.0m eta=18953.5m
[2024-06-22 16:31:54,589] INFO: Initiating epoch #41 train run on device rank=0
[2024-06-22 18:29:14,175] INFO: Initiating epoch #41 valid run on device rank=0
[2024-06-22 18:30:26,059] INFO: Rank 0: epoch=41 / 200 train_loss=8.5859 valid_loss=8.7855 stale=0 time=118.52m eta=18835.3m
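Each epoch above produces three entries: a train run, a valid run, and a Rank 0 summary with the running losses, a staleness counter, the per-epoch time, and a naive ETA toward the configured 200 epochs. A sketch of a driver loop that would emit lines in this format; run_epoch and the scalar-loss model interface are hypothetical, not the repository's actual trainer.

import logging
import time

logger = logging.getLogger(__name__)

def run_epoch(model, loader, optimizer=None):
    # hypothetical helper: average a scalar loss over one pass of the loader
    total, n = 0.0, 0
    for batch in loader:
        loss = model(batch)  # assumes the model returns a scalar loss
        if optimizer is not None:
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        total, n = total + float(loss), n + 1
    return total / max(n, 1)

def fit(model, train_loader, valid_loader, optimizer, num_epochs=200, rank=0):
    start = time.time()
    for epoch in range(1, num_epochs + 1):
        logger.info("Initiating epoch #%d train run on device rank=%d", epoch, rank)
        train_loss = run_epoch(model, train_loader, optimizer)
        logger.info("Initiating epoch #%d valid run on device rank=%d", epoch, rank)
        valid_loss = run_epoch(model, valid_loader)
        per_epoch_min = (time.time() - start) / 60.0 / epoch
        eta_min = per_epoch_min * (num_epochs - epoch)  # matches the slowly shrinking eta above
        logger.info("Rank %d: epoch=%d / %d train_loss=%.4f valid_loss=%.4f time=%.2fm eta=%.1fm",
                    rank, epoch, num_epochs, train_loss, valid_loss, per_epoch_min, eta_min)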
[2024-06-22 18:30:26,253] INFO: Initiating epoch #42 train run on device rank=0
[2024-06-22 20:27:13,457] INFO: Initiating epoch #42 valid run on device rank=0
[2024-06-22 20:28:24,913] INFO: Rank 0: epoch=42 / 200 train_loss=8.5700 valid_loss=8.7730 stale=0 time=117.98m eta=18715.1m
[2024-06-22 20:28:25,078] INFO: Initiating epoch #43 train run on device rank=0
[2024-06-22 22:25:32,088] INFO: Initiating epoch #43 valid run on device rank=0
[2024-06-22 22:26:44,002] INFO: Rank 0: epoch=43 / 200 train_loss=8.5581 valid_loss=8.7647 stale=0 time=118.32m eta=18596.1m
[2024-06-22 22:26:44,252] INFO: Initiating epoch #44 train run on device rank=0
[2024-06-23 00:24:18,401] INFO: Initiating epoch #44 valid run on device rank=0
[2024-06-23 00:25:30,196] INFO: Rank 0: epoch=44 / 200 train_loss=8.5466 valid_loss=8.7470 stale=0 time=118.77m eta=18478.8m
[2024-06-23 00:25:30,337] INFO: Initiating epoch #45 train run on device rank=0
[2024-06-23 02:22:54,417] INFO: Initiating epoch #45 valid run on device rank=0
[2024-06-23 02:24:05,965] INFO: Rank 0: epoch=45 / 200 train_loss=8.5344 valid_loss=8.7441 stale=0 time=118.59m eta=18360.9m
[2024-06-23 02:24:06,117] INFO: Initiating epoch #46 train run on device rank=0
[2024-06-23 04:21:45,029] INFO: Initiating epoch #46 valid run on device rank=0
[2024-06-23 04:22:56,995] INFO: Rank 0: epoch=46 / 200 train_loss=8.5242 valid_loss=8.7330 stale=0 time=118.85m eta=18243.7m
[2024-06-23 04:22:57,176] INFO: Initiating epoch #47 train run on device rank=0
[2024-06-23 06:19:33,272] INFO: Initiating epoch #47 valid run on device rank=0
[2024-06-23 06:20:44,825] INFO: Rank 0: epoch=47 / 200 train_loss=8.5136 valid_loss=8.7415 stale=1 time=117.79m eta=18123.1m
[2024-06-23 06:20:45,053] INFO: Initiating epoch #48 train run on device rank=0
[2024-06-23 08:18:42,231] INFO: Initiating epoch #48 valid run on device rank=0
[2024-06-23 08:19:53,716] INFO: Rank 0: epoch=48 / 200 train_loss=8.5032 valid_loss=8.7403 stale=2 time=119.14m eta=18006.8m
[2024-06-23 08:19:54,083] INFO: Initiating epoch #49 train run on device rank=0
[2024-06-23 10:17:26,653] INFO: Initiating epoch #49 valid run on device rank=0
[2024-06-23 10:18:38,708] INFO: Rank 0: epoch=49 / 200 train_loss=8.4895 valid_loss=8.7261 stale=0 time=118.74m eta=17889.3m
[2024-06-23 10:18:39,049] INFO: Initiating epoch #50 train run on device rank=0
[2024-06-23 12:15:23,652] INFO: Initiating epoch #50 valid run on device rank=0
[2024-06-23 12:16:34,833] INFO: Rank 0: epoch=50 / 200 train_loss=8.4773 valid_loss=8.7267 stale=1 time=117.93m eta=17769.2m
[2024-06-23 12:16:35,129] INFO: Initiating epoch #51 train run on device rank=0
[2024-06-23 14:13:14,001] INFO: Initiating epoch #51 valid run on device rank=0
[2024-06-23 14:14:25,756] INFO: Rank 0: epoch=51 / 200 train_loss=8.4658 valid_loss=8.6994 stale=0 time=117.84m eta=17648.9m
[2024-06-23 14:14:26,129] INFO: Initiating epoch #52 train run on device rank=0
[2024-06-23 16:11:10,593] INFO: Initiating epoch #52 valid run on device rank=0
[2024-06-23 16:12:21,844] INFO: Rank 0: epoch=52 / 200 train_loss=8.4560 valid_loss=8.7038 stale=1 time=117.93m eta=17529.0m
[2024-06-23 16:12:22,036] INFO: Initiating epoch #53 train run on device rank=0
[2024-06-23 18:08:55,082] INFO: Initiating epoch #53 valid run on device rank=0
[2024-06-23 18:10:06,780] INFO: Rank 0: epoch=53 / 200 train_loss=8.4455 valid_loss=8.7057 stale=2 time=117.75m eta=17408.7m
[2024-06-23 18:10:07,221] INFO: Initiating epoch #54 train run on device rank=0
[2024-06-23 20:07:06,542] INFO: Initiating epoch #54 valid run on device rank=0
[2024-06-23 20:08:18,817] INFO: Rank 0: epoch=54 / 200 train_loss=8.4367 valid_loss=8.6979 stale=0 time=118.19m eta=17289.6m
[2024-06-23 20:08:19,112] INFO: Initiating epoch #55 train run on device rank=0
[2024-06-23 22:04:54,359] INFO: Initiating epoch #55 valid run on device rank=0
[2024-06-23 22:06:06,794] INFO: Rank 0: epoch=55 / 200 train_loss=8.4276 valid_loss=8.6902 stale=0 time=117.79m eta=17169.6m
[2024-06-23 22:06:07,024] INFO: Initiating epoch #56 train run on device rank=0
[2024-06-24 00:02:39,464] INFO: Initiating epoch #56 valid run on device rank=0
[2024-06-24 00:03:50,504] INFO: Rank 0: epoch=56 / 200 train_loss=8.4190 valid_loss=8.7000 stale=1 time=117.72m eta=17049.4m
[2024-06-24 00:03:50,644] INFO: Initiating epoch #57 train run on device rank=0
[2024-06-24 02:00:26,212] INFO: Initiating epoch #57 valid run on device rank=0
[2024-06-24 02:01:37,995] INFO: Rank 0: epoch=57 / 200 train_loss=8.4097 valid_loss=8.6749 stale=0 time=117.79m eta=16929.5m
[2024-06-24 02:01:38,203] INFO: Initiating epoch #58 train run on device rank=0
[2024-06-24 03:58:03,660] INFO: Initiating epoch #58 valid run on device rank=0
[2024-06-24 03:59:15,994] INFO: Rank 0: epoch=58 / 200 train_loss=8.4019 valid_loss=8.6664 stale=0 time=117.63m eta=16809.2m
[2024-06-24 03:59:16,265] INFO: Initiating epoch #59 train run on device rank=0
[2024-06-24 05:55:37,639] INFO: Initiating epoch #59 valid run on device rank=0
[2024-06-24 05:56:48,844] INFO: Rank 0: epoch=59 / 200 train_loss=8.3937 valid_loss=8.6834 stale=1 time=117.54m eta=16688.9m
[2024-06-24 05:56:49,002] INFO: Initiating epoch #60 train run on device rank=0
[2024-06-24 07:52:52,818] INFO: Initiating epoch #60 valid run on device rank=0
[2024-06-24 07:54:04,473] INFO: Rank 0: epoch=60 / 200 train_loss=8.3866 valid_loss=8.6618 stale=0 time=117.26m eta=16568.0m
[2024-06-24 07:54:04,711] INFO: Initiating epoch #61 train run on device rank=0
[2024-06-24 09:50:19,156] INFO: Initiating epoch #61 valid run on device rank=0
[2024-06-24 09:51:31,150] INFO: Rank 0: epoch=61 / 200 train_loss=8.3799 valid_loss=8.6821 stale=1 time=117.44m eta=16447.6m
[2024-06-24 09:51:31,375] INFO: Initiating epoch #62 train run on device rank=0
[2024-06-24 11:48:16,513] INFO: Initiating epoch #62 valid run on device rank=0
[2024-06-24 11:49:28,085] INFO: Rank 0: epoch=62 / 200 train_loss=8.3727 valid_loss=8.6690 stale=2 time=117.95m eta=16328.4m
[2024-06-24 11:49:28,309] INFO: Initiating epoch #63 train run on device rank=0
[2024-06-24 13:45:39,212] INFO: Initiating epoch #63 valid run on device rank=0
[2024-06-24 13:46:50,301] INFO: Rank 0: epoch=63 / 200 train_loss=8.3675 valid_loss=8.6735 stale=3 time=117.37m eta=16208.0m
[2024-06-24 13:46:50,353] INFO: Initiating epoch #64 train run on device rank=0
[2024-06-24 15:42:27,952] INFO: Initiating epoch #64 valid run on device rank=0
[2024-06-24 15:43:39,131] INFO: Rank 0: epoch=64 / 200 train_loss=8.3616 valid_loss=8.6673 stale=4 time=116.81m eta=16086.5m
[2024-06-24 15:43:39,240] INFO: Initiating epoch #65 train run on device rank=0
[2024-06-24 17:39:18,291] INFO: Initiating epoch #65 valid run on device rank=0
[2024-06-24 17:40:29,866] INFO: Rank 0: epoch=65 / 200 train_loss=8.3542 valid_loss=8.6470 stale=0 time=116.84m eta=15965.3m
[2024-06-24 17:40:30,043] INFO: Initiating epoch #66 train run on device rank=0
[2024-06-24 19:36:11,110] INFO: Initiating epoch #66 valid run on device rank=0
[2024-06-24 19:37:22,912] INFO: Rank 0: epoch=66 / 200 train_loss=8.3489 valid_loss=8.6450 stale=0 time=116.88m eta=15844.2m
[2024-06-24 19:37:23,061] INFO: Initiating epoch #67 train run on device rank=0
[2024-06-24 21:33:04,228] INFO: Initiating epoch #67 valid run on device rank=0
[2024-06-24 21:34:15,355] INFO: Rank 0: epoch=67 / 200 train_loss=8.3435 valid_loss=8.6575 stale=1 time=116.87m eta=15723.2m
[2024-06-24 21:34:15,525] INFO: Initiating epoch #68 train run on device rank=0
[2024-06-24 23:29:57,302] INFO: Initiating epoch #68 valid run on device rank=0
[2024-06-24 23:31:10,799] INFO: Rank 0: epoch=68 / 200 train_loss=8.3387 valid_loss=8.6359 stale=0 time=116.92m eta=15602.5m
[2024-06-24 23:31:11,058] INFO: Initiating epoch #69 train run on device rank=0
[2024-06-25 01:26:51,689] INFO: Initiating epoch #69 valid run on device rank=0
[2024-06-25 01:28:03,042] INFO: Rank 0: epoch=69 / 200 train_loss=8.3336 valid_loss=8.6386 stale=1 time=116.87m eta=15481.8m
[2024-06-25 01:28:03,185] INFO: Initiating epoch #70 train run on device rank=0
[2024-06-25 03:23:43,146] INFO: Initiating epoch #70 valid run on device rank=0
[2024-06-25 03:24:55,016] INFO: Rank 0: epoch=70 / 200 train_loss=8.3293 valid_loss=8.6387 stale=2 time=116.86m eta=15361.2m
[2024-06-25 03:24:55,461] INFO: Initiating epoch #71 train run on device rank=0
[2024-06-25 05:20:34,833] INFO: Initiating epoch #71 valid run on device rank=0
[2024-06-25 05:21:46,298] INFO: Rank 0: epoch=71 / 200 train_loss=8.3238 valid_loss=8.6571 stale=3 time=116.85m eta=15240.6m
[2024-06-25 05:21:46,421] INFO: Initiating epoch #72 train run on device rank=0
[2024-06-25 07:17:26,949] INFO: Initiating epoch #72 valid run on device rank=0
[2024-06-25 07:18:37,978] INFO: Rank 0: epoch=72 / 200 train_loss=8.3192 valid_loss=8.6523 stale=4 time=116.86m eta=15120.2m
[2024-06-25 07:18:38,167] INFO: Initiating epoch #73 train run on device rank=0
[2024-06-25 09:14:18,876] INFO: Initiating epoch #73 valid run on device rank=0
[2024-06-25 09:15:31,040] INFO: Rank 0: epoch=73 / 200 train_loss=8.3154 valid_loss=8.6593 stale=5 time=116.88m eta=14999.9m
[2024-06-25 09:15:31,525] INFO: Initiating epoch #74 train run on device rank=0
[2024-06-25 11:11:12,973] INFO: Initiating epoch #74 valid run on device rank=0
[2024-06-25 11:12:24,263] INFO: Rank 0: epoch=74 / 200 train_loss=8.3119 valid_loss=8.6623 stale=6 time=116.88m eta=14879.7m
[2024-06-25 11:12:24,385] INFO: Initiating epoch #75 train run on device rank=0
[2024-06-25 13:08:04,921] INFO: Initiating epoch #75 valid run on device rank=0
[2024-06-25 13:09:16,202] INFO: Rank 0: epoch=75 / 200 train_loss=8.3075 valid_loss=8.6480 stale=7 time=116.86m eta=14759.6m
[2024-06-25 13:09:16,301] INFO: Initiating epoch #76 train run on device rank=0
[2024-06-25 15:04:57,021] INFO: Initiating epoch #76 valid run on device rank=0
[2024-06-25 15:06:08,453] INFO: Rank 0: epoch=76 / 200 train_loss=8.3033 valid_loss=8.6379 stale=8 time=116.87m eta=14639.5m
[2024-06-25 15:06:08,793] INFO: Initiating epoch #77 train run on device rank=0
[2024-06-25 17:01:47,934] INFO: Initiating epoch #77 valid run on device rank=0
[2024-06-25 17:02:58,980] INFO: Rank 0: epoch=77 / 200 train_loss=8.3003 valid_loss=8.6368 stale=9 time=116.84m eta=14519.5m
[2024-06-25 17:02:59,086] INFO: Initiating epoch #78 train run on device rank=0
[2024-06-25 18:58:37,550] INFO: Initiating epoch #78 valid run on device rank=0
[2024-06-25 18:59:48,748] INFO: Rank 0: epoch=78 / 200 train_loss=8.2963 valid_loss=8.6605 stale=10 time=116.83m eta=14399.6m
[2024-06-25 18:59:48,821] INFO: Initiating epoch #79 train run on device rank=0
[2024-06-25 20:55:26,964] INFO: Initiating epoch #79 valid run on device rank=0
[2024-06-25 20:56:38,213] INFO: Rank 0: epoch=79 / 200 train_loss=8.2919 valid_loss=8.6629 stale=11 time=116.82m eta=14279.7m
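The stale field above counts epochs since the best valid_loss: it resets to 0 on each improvement and climbs otherwise (see epochs 65-79, where it resets at 8.6470 and 8.6450 and then grows). A minimal sketch of that bookkeeping; the patience threshold is an assumption, since the log does not show an early-stop trigger.

def stale_counter(valid_losses, patience=25):
    """Track epochs since the best valid_loss, as in the 'stale=' field above."""
    best = float("inf")
    stale = 0
    for epoch, loss in enumerate(valid_losses, start=1):
        if loss < best:
            best, stale = loss, 0
        else:
            stale += 1
        yield epoch, stale
        if stale >= patience:  # patience value is hypothetical
            break

# Feeding the epoch 65-70 valid losses above (8.6470, 8.6450, 8.6575, 8.6359,
# 8.6386, 8.6387) reproduces the logged sequence stale = 0, 0, 1, 0, 1, 2.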
[2024-06-25 20:56:38,285] INFO: Initiating epoch #80 train run on device rank=0
[2024-06-25 22:52:17,157] INFO: Initiating epoch #80 valid run on device rank=0
[2024-06-25 22:53:28,440] INFO: Rank 0: epoch=80 / 200 train_loss=8.2897 valid_loss=8.6654 stale=12 time=116.84m eta=14159.9m
[2024-06-25 22:53:28,579] INFO: Initiating epoch #81 train run on device rank=0
[2024-06-26 00:49:09,613] INFO: Initiating epoch #81 valid run on device rank=0
[2024-06-26 00:50:20,750] INFO: Rank 0: epoch=81 / 200 train_loss=8.2862 valid_loss=8.6410 stale=13 time=116.87m eta=14040.3m
[2024-06-26 00:50:20,813] INFO: Initiating epoch #82 train run on device rank=0
[2024-06-26 02:46:02,382] INFO: Initiating epoch #82 valid run on device rank=0
[2024-06-26 02:47:13,730] INFO: Rank 0: epoch=82 / 200 train_loss=8.2831 valid_loss=8.6765 stale=14 time=116.88m eta=13920.7m
[2024-06-26 02:47:13,814] INFO: Initiating epoch #83 train run on device rank=0
[2024-08-26 08:48:30,861] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-08-26 08:48:31,219] INFO: using dtype=torch.float32
[2024-08-26 08:48:31,385] INFO: model_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'math'}
[2024-08-26 08:48:31,446] INFO: using attention_type=efficient
[2024-08-26 08:48:31,473] INFO: using attention_type=efficient
[2024-08-26 08:48:31,503] INFO: using attention_type=efficient
[2024-08-26 08:48:31,532] INFO: using attention_type=efficient
[2024-08-26 08:48:31,563] INFO: using attention_type=efficient
[2024-08-26 08:48:31,593] INFO: using attention_type=efficient
[2024-08-26 08:48:38,347] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth
[2024-08-26 08:48:38,479] INFO: MLPF(
  (nn0_id): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (nn0_reg): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (conv_id): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (conv_reg): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (nn_id): Sequential(
    (0): Linear(in_features=529, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=6, bias=True)
  )
  (nn_pt): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_eta): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_sin_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_cos_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_energy): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
)
[2024-08-26 08:48:38,481] INFO: Trainable parameters: 11671568
[2024-08-26 08:48:38,481] INFO: Non-trainable parameters: 0
[2024-08-26 08:48:38,482] INFO: Total parameters: 11671568
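The model_kwargs dictionary above is evidently enough to rebuild the network before loading weights. A sketch of that step; the import path is an assumption, and note that the runtime attention type can differ from the stored kwargs (the dict here says 'math' while this run logs 'efficient', presumably a command-line override).

from mlpf.model.mlpf import MLPF  # hypothetical import path

model_kwargs = {
    "input_dim": 17, "num_classes": 6, "input_encoding": "joint",
    "pt_mode": "linear", "eta_mode": "linear", "sin_phi_mode": "linear",
    "cos_phi_mode": "linear", "energy_mode": "linear",
    "elemtypes_nonzero": [1, 2], "learned_representation_mode": "last",
    "conv_type": "attention", "num_convs": 3,
    "dropout_ff": 0.0, "dropout_conv_id_mha": 0.0, "dropout_conv_id_ff": 0.0,
    "dropout_conv_reg_mha": 0.0, "dropout_conv_reg_ff": 0.0,
    "activation": "relu", "head_dim": 16, "num_heads": 32,
    "attention_type": "math",
}
model_kwargs["attention_type"] = "efficient"  # assumed override, as seen in this run's log
model = MLPF(**model_kwargs)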
[2024-08-26 08:48:38,486] INFO: Modules Trainable parameters Non-trainable parameters
nn0_id.0.weight 8704 0
nn0_id.0.bias 512 0
nn0_id.2.weight 512 0
nn0_id.2.bias 512 0
nn0_id.4.weight 262144 0
nn0_id.4.bias 512 0
nn0_reg.0.weight 8704 0
nn0_reg.0.bias 512 0
nn0_reg.2.weight 512 0
nn0_reg.2.bias 512 0
nn0_reg.4.weight 262144 0
nn0_reg.4.bias 512 0
conv_id.0.mha.in_proj_weight 786432 0
conv_id.0.mha.in_proj_bias 1536 0
conv_id.0.mha.out_proj.weight 262144 0
conv_id.0.mha.out_proj.bias 512 0
conv_id.0.norm0.weight 512 0
conv_id.0.norm0.bias 512 0
conv_id.0.norm1.weight 512 0
conv_id.0.norm1.bias 512 0
conv_id.0.seq.0.weight 262144 0
conv_id.0.seq.0.bias 512 0
conv_id.0.seq.2.weight 262144 0
conv_id.0.seq.2.bias 512 0
conv_id.1.mha.in_proj_weight 786432 0
conv_id.1.mha.in_proj_bias 1536 0
conv_id.1.mha.out_proj.weight 262144 0
conv_id.1.mha.out_proj.bias 512 0
conv_id.1.norm0.weight 512 0
conv_id.1.norm0.bias 512 0
conv_id.1.norm1.weight 512 0
conv_id.1.norm1.bias 512 0
conv_id.1.seq.0.weight 262144 0
conv_id.1.seq.0.bias 512 0
conv_id.1.seq.2.weight 262144 0
conv_id.1.seq.2.bias 512 0
conv_id.2.mha.in_proj_weight 786432 0
conv_id.2.mha.in_proj_bias 1536 0
conv_id.2.mha.out_proj.weight 262144 0
conv_id.2.mha.out_proj.bias 512 0
conv_id.2.norm0.weight 512 0
conv_id.2.norm0.bias 512 0
conv_id.2.norm1.weight 512 0
conv_id.2.norm1.bias 512 0
conv_id.2.seq.0.weight 262144 0
conv_id.2.seq.0.bias 512 0
conv_id.2.seq.2.weight 262144 0
conv_id.2.seq.2.bias 512 0
conv_reg.0.mha.in_proj_weight 786432 0
conv_reg.0.mha.in_proj_bias 1536 0
conv_reg.0.mha.out_proj.weight 262144 0
conv_reg.0.mha.out_proj.bias 512 0
conv_reg.0.norm0.weight 512 0
conv_reg.0.norm0.bias 512 0
conv_reg.0.norm1.weight 512 0
conv_reg.0.norm1.bias 512 0
conv_reg.0.seq.0.weight 262144 0
conv_reg.0.seq.0.bias 512 0
conv_reg.0.seq.2.weight 262144 0
conv_reg.0.seq.2.bias 512 0
conv_reg.1.mha.in_proj_weight 786432 0
conv_reg.1.mha.in_proj_bias 1536 0
conv_reg.1.mha.out_proj.weight 262144 0
conv_reg.1.mha.out_proj.bias 512 0
conv_reg.1.norm0.weight 512 0
conv_reg.1.norm0.bias 512 0
conv_reg.1.norm1.weight 512 0
conv_reg.1.norm1.bias 512 0
conv_reg.1.seq.0.weight 262144 0
conv_reg.1.seq.0.bias 512 0
conv_reg.1.seq.2.weight 262144 0
conv_reg.1.seq.2.bias 512 0
conv_reg.2.mha.in_proj_weight 786432 0
conv_reg.2.mha.in_proj_bias 1536 0
conv_reg.2.mha.out_proj.weight 262144 0
conv_reg.2.mha.out_proj.bias 512 0
conv_reg.2.norm0.weight 512 0
conv_reg.2.norm0.bias 512 0
conv_reg.2.norm1.weight 512 0
conv_reg.2.norm1.bias 512 0
conv_reg.2.seq.0.weight 262144 0
conv_reg.2.seq.0.bias 512 0
conv_reg.2.seq.2.weight 262144 0
conv_reg.2.seq.2.bias 512 0
nn_id.0.weight 270848 0
nn_id.0.bias 512 0
nn_id.2.weight 512 0
nn_id.2.bias 512 0
nn_id.4.weight 3072 0
nn_id.4.bias 6 0
nn_pt.nn.0.weight 273920 0
nn_pt.nn.0.bias 512 0
nn_pt.nn.2.weight 512 0
nn_pt.nn.2.bias 512 0
nn_pt.nn.4.weight 1024 0
nn_pt.nn.4.bias 2 0
nn_eta.nn.0.weight 273920 0
nn_eta.nn.0.bias 512 0
nn_eta.nn.2.weight 512 0
nn_eta.nn.2.bias 512 0
nn_eta.nn.4.weight 1024 0
nn_eta.nn.4.bias 2 0
nn_sin_phi.nn.0.weight 273920 0
nn_sin_phi.nn.0.bias 512 0
nn_sin_phi.nn.2.weight 512 0
nn_sin_phi.nn.2.bias 512 0
nn_sin_phi.nn.4.weight 1024 0
nn_sin_phi.nn.4.bias 2 0
nn_cos_phi.nn.0.weight 273920 0
nn_cos_phi.nn.0.bias 512 0
nn_cos_phi.nn.2.weight 512 0
nn_cos_phi.nn.2.bias 512 0
nn_cos_phi.nn.4.weight 1024 0
nn_cos_phi.nn.4.bias 2 0
nn_energy.nn.0.weight 273920 0
nn_energy.nn.0.bias 512 0
nn_energy.nn.2.weight 512 0
nn_energy.nn.2.bias 512 0
nn_energy.nn.4.weight 1024 0
nn_energy.nn.4.bias 2 0
[2024-08-26 08:48:39,800] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:48:39,800] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:48:39,826] INFO: train_dataset: cld_edm_ttbar_pf, 80700
[2024-08-26 08:48:40,058] INFO: valid_dataset: cld_edm_ttbar_pf, 20200
[2024-08-26 08:48:40,685] INFO: Done with training. Total training time on device 0 is 0.0min
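Both August runs load checkpoint-82-8.676534.pth (epoch 82 of the June run, valid_loss 8.6765) and immediately report "Done with training", presumably because the resumed epoch counter already satisfies the configured number of epochs. A sketch of the weight-loading step; the 'model_state_dict' key and the stripping of the DDP 'module.' prefix are assumptions inferred from the parameter tables, not taken from the repository code.

import torch

ckpt_path = "/pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth"
checkpoint = torch.load(ckpt_path, map_location="cpu")
state_dict = checkpoint.get("model_state_dict", checkpoint)  # key name is an assumption
# the June run trained under DistributedDataParallel, so its parameter names
# carry a "module." prefix that a single-GPU MLPF instance does not have
state_dict = {k.removeprefix("module."): v for k, v in state_dict.items()}
model.load_state_dict(state_dict)  # model built from model_kwargs as sketched above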
[2024-08-26 08:49:24,891] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-08-26 08:49:25,051] INFO: using dtype=torch.float32
[2024-08-26 08:49:25,092] INFO: model_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'efficient'}
[2024-08-26 08:49:25,121] INFO: using attention_type=efficient
[2024-08-26 08:49:25,140] INFO: using attention_type=efficient
[2024-08-26 08:49:25,159] INFO: using attention_type=efficient
[2024-08-26 08:49:25,178] INFO: using attention_type=efficient
[2024-08-26 08:49:25,198] INFO: using attention_type=efficient
[2024-08-26 08:49:25,217] INFO: using attention_type=efficient
[2024-08-26 08:49:27,145] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth
[2024-08-26 08:49:27,276] INFO: MLPF(
  (nn0_id): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (nn0_reg): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (conv_id): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (conv_reg): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (nn_id): Sequential(
    (0): Linear(in_features=529, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=6, bias=True)
  )
  (nn_pt): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_eta): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_sin_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_cos_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_energy): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
)
[2024-08-26 08:49:27,279] INFO: Trainable parameters: 11671568
[2024-08-26 08:49:27,279] INFO: Non-trainable parameters: 0
[2024-08-26 08:49:27,279] INFO: Total parameters: 11671568
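The "using attention_type=math" / "=efficient" messages (one per SelfAttentionLayer, six in total) plausibly select a scaled-dot-product-attention backend; flash attention is not supported on a GTX 1080 Ti, which may be why only the math and memory-efficient kernels appear in these logs. A sketch of backend selection with the PyTorch 2.x context manager; the mapping from the logged string to these flags is an assumption.

import torch
import torch.nn.functional as F

q = k = v = torch.randn(1, 32, 256, 16)  # (batch, heads, elements, head_dim); shapes for illustration

# "math": the reference implementation, available everywhere
with torch.backends.cuda.sdp_kernel(enable_flash=False, enable_math=True, enable_mem_efficient=False):
    out_math = F.scaled_dot_product_attention(q, k, v)

# "efficient": the memory-efficient kernel, as logged by the August runs
with torch.backends.cuda.sdp_kernel(enable_flash=False, enable_math=False, enable_mem_efficient=True):
    out_eff = F.scaled_dot_product_attention(q, k, v)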
conv_reg.0.mha.in_proj_bias 1536 0 conv_reg.0.mha.out_proj.weight 262144 0 conv_reg.0.mha.out_proj.bias 512 0 conv_reg.0.norm0.weight 512 0 conv_reg.0.norm0.bias 512 0 conv_reg.0.norm1.weight 512 0 conv_reg.0.norm1.bias 512 0 conv_reg.0.seq.0.weight 262144 0 conv_reg.0.seq.0.bias 512 0 conv_reg.0.seq.2.weight 262144 0 conv_reg.0.seq.2.bias 512 0 conv_reg.1.mha.in_proj_weight 786432 0 conv_reg.1.mha.in_proj_bias 1536 0 conv_reg.1.mha.out_proj.weight 262144 0 conv_reg.1.mha.out_proj.bias 512 0 conv_reg.1.norm0.weight 512 0 conv_reg.1.norm0.bias 512 0 conv_reg.1.norm1.weight 512 0 conv_reg.1.norm1.bias 512 0 conv_reg.1.seq.0.weight 262144 0 conv_reg.1.seq.0.bias 512 0 conv_reg.1.seq.2.weight 262144 0 conv_reg.1.seq.2.bias 512 0 conv_reg.2.mha.in_proj_weight 786432 0 conv_reg.2.mha.in_proj_bias 1536 0 conv_reg.2.mha.out_proj.weight 262144 0 conv_reg.2.mha.out_proj.bias 512 0 conv_reg.2.norm0.weight 512 0 conv_reg.2.norm0.bias 512 0 conv_reg.2.norm1.weight 512 0 conv_reg.2.norm1.bias 512 0 conv_reg.2.seq.0.weight 262144 0 conv_reg.2.seq.0.bias 512 0 conv_reg.2.seq.2.weight 262144 0 conv_reg.2.seq.2.bias 512 0 nn_id.0.weight 270848 0 nn_id.0.bias 512 0 nn_id.2.weight 512 0 nn_id.2.bias 512 0 nn_id.4.weight 3072 0 nn_id.4.bias 6 0 nn_pt.nn.0.weight 273920 0 nn_pt.nn.0.bias 512 0 nn_pt.nn.2.weight 512 0 nn_pt.nn.2.bias 512 0 nn_pt.nn.4.weight 1024 0 nn_pt.nn.4.bias 2 0 nn_eta.nn.0.weight 273920 0 nn_eta.nn.0.bias 512 0 nn_eta.nn.2.weight 512 0 nn_eta.nn.2.bias 512 0 nn_eta.nn.4.weight 1024 0 nn_eta.nn.4.bias 2 0 nn_sin_phi.nn.0.weight 273920 0 nn_sin_phi.nn.0.bias 512 0 nn_sin_phi.nn.2.weight 512 0 nn_sin_phi.nn.2.bias 512 0 nn_sin_phi.nn.4.weight 1024 0 nn_sin_phi.nn.4.bias 2 0 nn_cos_phi.nn.0.weight 273920 0 nn_cos_phi.nn.0.bias 512 0 nn_cos_phi.nn.2.weight 512 0 nn_cos_phi.nn.2.bias 512 0 nn_cos_phi.nn.4.weight 1024 0 nn_cos_phi.nn.4.bias 2 0 nn_energy.nn.0.weight 273920 0 nn_energy.nn.0.bias 512 0 nn_energy.nn.2.weight 512 0 nn_energy.nn.2.bias 512 0 nn_energy.nn.4.weight 1024 0 nn_energy.nn.4.bias 2 0 [2024-08-26 08:49:27,284] INFO: Modules Trainable parameters Non-tranable parameters nn0_id.0.weight 8704 0 nn0_id.0.bias 512 0 nn0_id.2.weight 512 0 nn0_id.2.bias 512 0 nn0_id.4.weight 262144 0 nn0_id.4.bias 512 0 nn0_reg.0.weight 8704 0 nn0_reg.0.bias 512 0 nn0_reg.2.weight 512 0 nn0_reg.2.bias 512 0 nn0_reg.4.weight 262144 0 nn0_reg.4.bias 512 0 conv_id.0.mha.in_proj_weight 786432 0 conv_id.0.mha.in_proj_bias 1536 0 conv_id.0.mha.out_proj.weight 262144 0 conv_id.0.mha.out_proj.bias 512 0 conv_id.0.norm0.weight 512 0 conv_id.0.norm0.bias 512 0 conv_id.0.norm1.weight 512 0 conv_id.0.norm1.bias 512 0 conv_id.0.seq.0.weight 262144 0 conv_id.0.seq.0.bias 512 0 conv_id.0.seq.2.weight 262144 0 conv_id.0.seq.2.bias 512 0 conv_id.1.mha.in_proj_weight 786432 0 conv_id.1.mha.in_proj_bias 1536 0 conv_id.1.mha.out_proj.weight 262144 0 conv_id.1.mha.out_proj.bias 512 0 conv_id.1.norm0.weight 512 0 conv_id.1.norm0.bias 512 0 conv_id.1.norm1.weight 512 0 conv_id.1.norm1.bias 512 0 conv_id.1.seq.0.weight 262144 0 conv_id.1.seq.0.bias 512 0 conv_id.1.seq.2.weight 262144 0 conv_id.1.seq.2.bias 512 0 conv_id.2.mha.in_proj_weight 786432 0 conv_id.2.mha.in_proj_bias 1536 0 conv_id.2.mha.out_proj.weight 262144 0 conv_id.2.mha.out_proj.bias 512 0 conv_id.2.norm0.weight 512 0 conv_id.2.norm0.bias 512 0 conv_id.2.norm1.weight 512 0 conv_id.2.norm1.bias 512 0 conv_id.2.seq.0.weight 262144 0 conv_id.2.seq.0.bias 512 0 conv_id.2.seq.2.weight 262144 0 conv_id.2.seq.2.bias 512 0 conv_reg.0.mha.in_proj_weight 786432 0 
conv_reg.0.mha.in_proj_bias 1536 0 conv_reg.0.mha.out_proj.weight 262144 0 conv_reg.0.mha.out_proj.bias 512 0 conv_reg.0.norm0.weight 512 0 conv_reg.0.norm0.bias 512 0 conv_reg.0.norm1.weight 512 0 conv_reg.0.norm1.bias 512 0 conv_reg.0.seq.0.weight 262144 0 conv_reg.0.seq.0.bias 512 0 conv_reg.0.seq.2.weight 262144 0 conv_reg.0.seq.2.bias 512 0 conv_reg.1.mha.in_proj_weight 786432 0 conv_reg.1.mha.in_proj_bias 1536 0 conv_reg.1.mha.out_proj.weight 262144 0 conv_reg.1.mha.out_proj.bias 512 0 conv_reg.1.norm0.weight 512 0 conv_reg.1.norm0.bias 512 0 conv_reg.1.norm1.weight 512 0 conv_reg.1.norm1.bias 512 0 conv_reg.1.seq.0.weight 262144 0 conv_reg.1.seq.0.bias 512 0 conv_reg.1.seq.2.weight 262144 0 conv_reg.1.seq.2.bias 512 0 conv_reg.2.mha.in_proj_weight 786432 0 conv_reg.2.mha.in_proj_bias 1536 0 conv_reg.2.mha.out_proj.weight 262144 0 conv_reg.2.mha.out_proj.bias 512 0 conv_reg.2.norm0.weight 512 0 conv_reg.2.norm0.bias 512 0 conv_reg.2.norm1.weight 512 0 conv_reg.2.norm1.bias 512 0 conv_reg.2.seq.0.weight 262144 0 conv_reg.2.seq.0.bias 512 0 conv_reg.2.seq.2.weight 262144 0 conv_reg.2.seq.2.bias 512 0 nn_id.0.weight 270848 0 nn_id.0.bias 512 0 nn_id.2.weight 512 0 nn_id.2.bias 512 0 nn_id.4.weight 3072 0 nn_id.4.bias 6 0 nn_pt.nn.0.weight 273920 0 nn_pt.nn.0.bias 512 0 nn_pt.nn.2.weight 512 0 nn_pt.nn.2.bias 512 0 nn_pt.nn.4.weight 1024 0 nn_pt.nn.4.bias 2 0 nn_eta.nn.0.weight 273920 0 nn_eta.nn.0.bias 512 0 nn_eta.nn.2.weight 512 0 nn_eta.nn.2.bias 512 0 nn_eta.nn.4.weight 1024 0 nn_eta.nn.4.bias 2 0 nn_sin_phi.nn.0.weight 273920 0 nn_sin_phi.nn.0.bias 512 0 nn_sin_phi.nn.2.weight 512 0 nn_sin_phi.nn.2.bias 512 0 nn_sin_phi.nn.4.weight 1024 0 nn_sin_phi.nn.4.bias 2 0 nn_cos_phi.nn.0.weight 273920 0 nn_cos_phi.nn.0.bias 512 0 nn_cos_phi.nn.2.weight 512 0 nn_cos_phi.nn.2.bias 512 0 nn_cos_phi.nn.4.weight 1024 0 nn_cos_phi.nn.4.bias 2 0 nn_energy.nn.0.weight 273920 0 nn_energy.nn.0.bias 512 0 nn_energy.nn.2.weight 512 0 nn_energy.nn.2.bias 512 0 nn_energy.nn.4.weight 1024 0 nn_energy.nn.4.bias 2 0 [2024-08-26 08:49:29,194] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX [2024-08-26 08:49:29,194] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX [2024-08-26 08:49:29,194] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX [2024-08-26 08:49:29,194] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX [2024-08-26 08:49:29,217] INFO: train_dataset: cld_edm_ttbar_pf, 80700 [2024-08-26 08:49:29,217] INFO: train_dataset: cld_edm_ttbar_pf, 80700 [2024-08-26 08:49:29,408] INFO: valid_dataset: cld_edm_ttbar_pf, 20200 [2024-08-26 08:49:29,408] INFO: valid_dataset: cld_edm_ttbar_pf, 20200 [2024-08-26 08:49:29,520] INFO: Done with training. Total training time on device 0 is 0.0min [2024-08-26 08:49:29,520] INFO: Done with training. 
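The per-module table above is a straightforward dump of the model's parameters. A minimal sketch of how such a table can be reproduced for any torch.nn.Module (the function name parameter_table and the column widths are illustrative, not the training script's actual implementation):

    import torch

    def parameter_table(model: torch.nn.Module) -> None:
        # One row per named parameter; the element count goes in the trainable
        # or non-trainable column depending on requires_grad.
        trainable = nontrainable = 0
        print(f"{'Modules':<36}{'Trainable parameters':<24}Non-trainable parameters")
        for name, param in model.named_parameters():
            n = param.numel()
            if param.requires_grad:
                trainable += n
                print(f"{name:<36}{n:<24}0")
            else:
                nontrainable += n
                print(f"{name:<36}{0:<24}{n}")
        print(f"Trainable parameters: {trainable}")
        print(f"Non-trainable parameters: {nontrainable}")
        print(f"Total parameters: {trainable + nontrainable}")

Summing the rows of the table reproduces the logged totals: 11671568 trainable and 0 non-trainable parameters.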
[2024-08-26 08:54:00,511] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-08-26 08:54:00,638] INFO: using dtype=torch.float32
[2024-08-26 08:54:00,638] INFO: model_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'efficient'}
[2024-08-26 08:54:00,691] INFO: using attention_type=efficient
[2024-08-26 08:54:00,710] INFO: using attention_type=efficient
[2024-08-26 08:54:00,729] INFO: using attention_type=efficient
[2024-08-26 08:54:00,748] INFO: using attention_type=efficient
[2024-08-26 08:54:00,767] INFO: using attention_type=efficient
[2024-08-26 08:54:00,786] INFO: using attention_type=efficient
[2024-08-26 08:54:02,711] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth
[2024-08-26 08:54:02,844] INFO: Trainable parameters: 11671568
[2024-08-26 08:54:02,844] INFO: Non-trainable parameters: 0
[2024-08-26 08:54:02,845] INFO: Total parameters: 11671568
[2024-08-26 08:54:04,495] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:54:04,495] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:54:04,519] INFO: train_dataset: cld_edm_ttbar_pf, 80700
[2024-08-26 08:54:04,715] INFO: valid_dataset: cld_edm_ttbar_pf, 20200
[2024-08-26 08:54:04,831] INFO: Done with training. Total training time on device 0 is 0.0min
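For reference, a minimal sketch of the restart pattern this run follows: rebuild the model from the logged model_kwargs, then load the checkpoint named above. The import path and the checkpoint's key layout are assumptions; the module.-prefix stripping applies only if the checkpoint was written from a DistributedDataParallel-wrapped model, which prefixes every state_dict key with "module.":

    import torch
    from mlpf.model import MLPF  # hypothetical import path for the MLPF class shown in this log

    model_kwargs = {
        "input_dim": 17, "num_classes": 6, "input_encoding": "joint",
        "pt_mode": "linear", "eta_mode": "linear", "sin_phi_mode": "linear",
        "cos_phi_mode": "linear", "energy_mode": "linear", "elemtypes_nonzero": [1, 2],
        "learned_representation_mode": "last", "conv_type": "attention", "num_convs": 3,
        "dropout_ff": 0.0, "dropout_conv_id_mha": 0.0, "dropout_conv_id_ff": 0.0,
        "dropout_conv_reg_mha": 0.0, "dropout_conv_reg_ff": 0.0, "activation": "relu",
        "head_dim": 16, "num_heads": 32, "attention_type": "efficient",
    }  # exactly the dict logged above

    model = MLPF(**model_kwargs)
    ckpt = torch.load(
        "/pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth",
        map_location="cpu",
    )
    # The checkpoint may be a bare state_dict or a dict with extra keys;
    # "model_state_dict" is an assumed key name.
    state_dict = ckpt.get("model_state_dict", ckpt)
    # Strip the DDP wrapper prefix, if present, before loading into a bare MLPF.
    state_dict = {k.removeprefix("module."): v for k, v in state_dict.items()}
    model.load_state_dict(state_dict)
    model.eval()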
[2024-08-26 08:55:29,847] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-08-26 08:55:29,969] INFO: using dtype=torch.float32
[2024-08-26 08:55:30,014] INFO: model_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'efficient'}
[2024-08-26 08:55:30,067] INFO: using attention_type=efficient
[2024-08-26 08:55:30,087] INFO: using attention_type=efficient
[2024-08-26 08:55:30,106] INFO: using attention_type=efficient
[2024-08-26 08:55:30,126] INFO: using attention_type=efficient
[2024-08-26 08:55:30,151] INFO: using attention_type=efficient
[2024-08-26 08:55:30,171] INFO: using attention_type=efficient
[2024-08-26 08:55:32,118] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth
[2024-08-26 08:55:32,257] INFO: Trainable parameters: 11671568
[2024-08-26 08:55:32,257] INFO: Non-trainable parameters: 0
[2024-08-26 08:55:32,258] INFO: Total parameters: 11671568
[2024-08-26 08:55:34,214] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:55:34,214] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:55:34,249] INFO: train_dataset: cld_edm_ttbar_pf, 80700
[2024-08-26 08:55:34,455] INFO: valid_dataset: cld_edm_ttbar_pf, 20200
[2024-08-26 08:55:34,613] INFO: Done with training. Total training time on device 0 is 0.0min
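The repeated using attention_type=efficient lines record the scaled-dot-product-attention backend chosen for each of the six SelfAttentionLayer instances (3 x conv_id + 3 x conv_reg). On a GTX 1080 Ti the memory-efficient kernel is generally available while flash attention is not, which is consistent with the logged choice. A minimal sketch of pinning that backend with the PyTorch >= 2.3 API (earlier releases exposed torch.backends.cuda.sdp_kernel instead); the tensor shapes follow model_kwargs, num_heads=32 and head_dim=16, i.e. 32 * 16 = 512 hidden units:

    import torch
    import torch.nn.functional as F
    from torch.nn.attention import SDPBackend, sdpa_kernel

    # (batch, heads, tokens, head_dim); the token count 256 is arbitrary here.
    q = torch.randn(1, 32, 256, 16, device="cuda")
    k = torch.randn_like(q)
    v = torch.randn_like(q)

    # Restrict SDPA to the memory-efficient kernel, i.e. attention_type=efficient.
    with sdpa_kernel(SDPBackend.EFFICIENT_ATTENTION):
        out = F.scaled_dot_product_attention(q, k, v)
    print(out.shape)  # torch.Size([1, 32, 256, 16])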
[2024-08-26 08:56:42,896] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-08-26 08:56:43,318] INFO: using dtype=torch.float32
[2024-08-26 08:56:43,362] INFO: model_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'efficient'}
[2024-08-26 08:56:43,412] INFO: using attention_type=efficient
[2024-08-26 08:56:43,431] INFO: using attention_type=efficient
[2024-08-26 08:56:43,450] INFO: using attention_type=efficient
[2024-08-26 08:56:43,469] INFO: using attention_type=efficient
[2024-08-26 08:56:43,488] INFO: using attention_type=efficient
[2024-08-26 08:56:43,507] INFO: using attention_type=efficient
[2024-08-26 08:56:45,426] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints/checkpoint-82-8.676534.pth
[2024-08-26 08:56:45,565] INFO: Trainable parameters: 11671568
[2024-08-26 08:56:45,565] INFO: Non-trainable parameters: 0
[2024-08-26 08:56:45,565] INFO: Total parameters: 11671568
[2024-08-26 08:56:47,902] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:56:47,902] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_8GTX
[2024-08-26 08:56:47,927] INFO: train_dataset: cld_edm_ttbar_pf, 80700
[2024-08-26 08:56:48,149] INFO: valid_dataset: cld_edm_ttbar_pf, 20200
[2024-08-26 08:56:48,261] INFO: Initiating epoch #83 train run on device rank=0
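Unlike the preceding attempts, which exited immediately with "Done with training. Total training time on device 0 is 0.0min", this run proceeds into epoch #83, one past the loaded checkpoint-82. A plausible reading is that the earlier attempts resumed with the epoch counter already at or beyond the configured number of epochs, so the training loop body never executed. A minimal sketch of that resume logic, with the checkpoint-naming scheme taken from the path above and num_epochs as an assumed config value:

    import glob
    import os
    import re

    def next_epoch(ckpt_dir: str) -> int:
        # Checkpoints are named checkpoint-<epoch>-<loss>.pth, e.g. checkpoint-82-8.676534.pth;
        # resume one epoch past the newest one.
        epochs = [int(re.match(r"checkpoint-(\d+)-", os.path.basename(p)).group(1))
                  for p in glob.glob(os.path.join(ckpt_dir, "checkpoint-*.pth"))]
        return max(epochs, default=0) + 1

    start_epoch = next_epoch("/pfvol/experiments/MLPF_clic_backbone_8GTX/checkpoints")  # -> 83
    num_epochs = 100  # assumed; with num_epochs=82, range(83, 83) is empty -> the "0.0min" runs
    for epoch in range(start_epoch, num_epochs + 1):
        print(f"Initiating epoch #{epoch} train run on device rank=0")
        # ... per-epoch train and validation steps would go here ...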