[2024-07-01 14:20:05,101] INFO: Will use single-gpu: NVIDIA GeForce GTX 1080 Ti
[2024-07-01 14:20:05,102] INFO: using dtype=torch.float32
[2024-07-01 14:20:05,293] INFO: using attention_type=math
[2024-07-01 14:20:05,311] INFO: using attention_type=math
[2024-07-01 14:20:05,328] INFO: using attention_type=math
[2024-07-01 14:20:05,345] INFO: using attention_type=math
[2024-07-01 14:20:05,362] INFO: using attention_type=math
[2024-07-01 14:20:05,379] INFO: using attention_type=math
[2024-07-01 14:20:06,817] INFO: mlpf_kwargs: {'input_dim': 17, 'num_classes': 6, 'input_encoding': 'joint', 'pt_mode': 'linear', 'eta_mode': 'linear', 'sin_phi_mode': 'linear', 'cos_phi_mode': 'linear', 'energy_mode': 'linear', 'elemtypes_nonzero': [1, 2], 'learned_representation_mode': 'last', 'conv_type': 'attention', 'num_convs': 3, 'dropout_ff': 0.0, 'dropout_conv_id_mha': 0.0, 'dropout_conv_id_ff': 0.0, 'dropout_conv_reg_mha': 0.0, 'dropout_conv_reg_ff': 0.0, 'activation': 'relu', 'head_dim': 16, 'num_heads': 32, 'attention_type': 'math'}
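As a quick sanity check on the configuration above (a plain-Python sketch; nothing from the MLPF source is assumed): `head_dim=16` with `num_heads=32` implies a 512-wide embedding, and a standard multi-head attention block of that width, with a packed Q/K/V projection, carries exactly the per-tensor parameter counts reported later in this log.

```python
# Attention geometry implied by mlpf_kwargs (head_dim=16, num_heads=32).
head_dim, num_heads = 16, 32
embed_dim = head_dim * num_heads              # 512

# Standard multi-head attention with a packed Q/K/V projection:
in_proj_weight = 3 * embed_dim * embed_dim    # 786432, as in the module table below
in_proj_bias = 3 * embed_dim                  # 1536
out_proj_weight = embed_dim * embed_dim       # 262144
out_proj_bias = embed_dim                     # 512

mha_params = in_proj_weight + in_proj_bias + out_proj_weight + out_proj_bias
print(embed_dim, mha_params)                  # 512 1050624
```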
[2024-07-01 14:20:06,817] INFO: Loaded model weights from /pfvol/experiments/MLPF_clic_backbone_pyg-clic_20240429_101112_971749/best_weights.pth
[2024-07-01 14:20:06,909] INFO: MLPF(
  (nn0_id): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (nn0_reg): Sequential(
    (0): Linear(in_features=17, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=512, bias=True)
  )
  (conv_id): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (conv_reg): ModuleList(
    (0-2): 3 x SelfAttentionLayer(
      (mha): MultiheadAttention(
        (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
      )
      (norm0): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (seq): Sequential(
        (0): Linear(in_features=512, out_features=512, bias=True)
        (1): ReLU()
        (2): Linear(in_features=512, out_features=512, bias=True)
        (3): ReLU()
      )
      (dropout): Dropout(p=0.0, inplace=False)
    )
  )
  (nn_id): Sequential(
    (0): Linear(in_features=529, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0.0, inplace=False)
    (4): Linear(in_features=512, out_features=6, bias=True)
  )
  (nn_pt): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_eta): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_sin_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_cos_phi): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
  (nn_energy): RegressionOutput(
    (nn): Sequential(
      (0): Linear(in_features=535, out_features=512, bias=True)
      (1): ReLU()
      (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
      (3): Dropout(p=0.0, inplace=False)
      (4): Linear(in_features=512, out_features=2, bias=True)
    )
  )
)
[2024-07-01 14:20:06,912] INFO: Backbone Trainable parameters: 11671568
[2024-07-01 14:20:06,912] INFO: Backbone Non-trainable parameters: 0
[2024-07-01 14:20:06,912] INFO: Backbone Total parameters: 11671568
[2024-07-01 14:20:06,917] INFO: Modules                        Trainable parameters  Non-trainable parameters
nn0_id.0.weight                8704    0
nn0_id.0.bias                  512     0
nn0_id.2.weight                512     0
nn0_id.2.bias                  512     0
nn0_id.4.weight                262144  0
nn0_id.4.bias                  512     0
nn0_reg.0.weight               8704    0
nn0_reg.0.bias                 512     0
nn0_reg.2.weight               512     0
nn0_reg.2.bias                 512     0
nn0_reg.4.weight               262144  0
nn0_reg.4.bias                 512     0
conv_id.0.mha.in_proj_weight   786432  0
conv_id.0.mha.in_proj_bias     1536    0
conv_id.0.mha.out_proj.weight  262144  0
conv_id.0.mha.out_proj.bias    512     0
conv_id.0.norm0.weight         512     0
conv_id.0.norm0.bias           512     0
conv_id.0.norm1.weight         512     0
conv_id.0.norm1.bias           512     0
conv_id.0.seq.0.weight         262144  0
conv_id.0.seq.0.bias           512     0
conv_id.0.seq.2.weight         262144  0
conv_id.0.seq.2.bias           512     0
conv_id.1.mha.in_proj_weight   786432  0
conv_id.1.mha.in_proj_bias     1536    0
conv_id.1.mha.out_proj.weight  262144  0
conv_id.1.mha.out_proj.bias    512     0
conv_id.1.norm0.weight         512     0
conv_id.1.norm0.bias           512     0
conv_id.1.norm1.weight         512     0
conv_id.1.norm1.bias           512     0
conv_id.1.seq.0.weight         262144  0
conv_id.1.seq.0.bias           512     0
conv_id.1.seq.2.weight         262144  0
conv_id.1.seq.2.bias           512     0
conv_id.2.mha.in_proj_weight   786432  0
conv_id.2.mha.in_proj_bias     1536    0
conv_id.2.mha.out_proj.weight  262144  0
conv_id.2.mha.out_proj.bias    512     0
conv_id.2.norm0.weight         512     0
conv_id.2.norm0.bias           512     0
conv_id.2.norm1.weight         512     0
conv_id.2.norm1.bias           512     0
conv_id.2.seq.0.weight         262144  0
conv_id.2.seq.0.bias           512     0
conv_id.2.seq.2.weight         262144  0
conv_id.2.seq.2.bias           512     0
conv_reg.0.mha.in_proj_weight  786432  0
conv_reg.0.mha.in_proj_bias    1536    0
conv_reg.0.mha.out_proj.weight 262144  0
conv_reg.0.mha.out_proj.bias   512     0
conv_reg.0.norm0.weight        512     0
conv_reg.0.norm0.bias          512     0
conv_reg.0.norm1.weight        512     0
conv_reg.0.norm1.bias          512     0
conv_reg.0.seq.0.weight        262144  0
conv_reg.0.seq.0.bias          512     0
conv_reg.0.seq.2.weight        262144  0
conv_reg.0.seq.2.bias          512     0
conv_reg.1.mha.in_proj_weight  786432  0
conv_reg.1.mha.in_proj_bias    1536    0
conv_reg.1.mha.out_proj.weight 262144  0
conv_reg.1.mha.out_proj.bias   512     0
conv_reg.1.norm0.weight        512     0
conv_reg.1.norm0.bias          512     0
conv_reg.1.norm1.weight        512     0
conv_reg.1.norm1.bias          512     0
conv_reg.1.seq.0.weight        262144  0
conv_reg.1.seq.0.bias          512     0
conv_reg.1.seq.2.weight        262144  0
conv_reg.1.seq.2.bias          512     0
conv_reg.2.mha.in_proj_weight  786432  0
conv_reg.2.mha.in_proj_bias    1536    0
conv_reg.2.mha.out_proj.weight 262144  0
conv_reg.2.mha.out_proj.bias   512     0
conv_reg.2.norm0.weight        512     0
conv_reg.2.norm0.bias          512     0
conv_reg.2.norm1.weight        512     0
conv_reg.2.norm1.bias          512     0
conv_reg.2.seq.0.weight        262144  0
conv_reg.2.seq.0.bias          512     0
conv_reg.2.seq.2.weight        262144  0
conv_reg.2.seq.2.bias          512     0
nn_id.0.weight                 270848  0
nn_id.0.bias                   512     0
nn_id.2.weight                 512     0
nn_id.2.bias                   512     0
nn_id.4.weight                 3072    0
nn_id.4.bias                   6       0
nn_pt.nn.0.weight              273920  0
nn_pt.nn.0.bias                512     0
nn_pt.nn.2.weight              512     0
nn_pt.nn.2.bias                512     0
nn_pt.nn.4.weight              1024    0
nn_pt.nn.4.bias                2       0
nn_eta.nn.0.weight             273920  0
nn_eta.nn.0.bias               512     0
nn_eta.nn.2.weight             512     0
nn_eta.nn.2.bias               512     0
nn_eta.nn.4.weight             1024    0
nn_eta.nn.4.bias               2       0
nn_sin_phi.nn.0.weight         273920  0
nn_sin_phi.nn.0.bias           512     0
nn_sin_phi.nn.2.weight         512     0
nn_sin_phi.nn.2.bias           512     0
nn_sin_phi.nn.4.weight         1024    0
nn_sin_phi.nn.4.bias           2       0
nn_cos_phi.nn.0.weight         273920  0
nn_cos_phi.nn.0.bias           512     0
nn_cos_phi.nn.2.weight         512     0
nn_cos_phi.nn.2.bias           512     0
nn_cos_phi.nn.4.weight         1024    0
nn_cos_phi.nn.4.bias           2       0
nn_energy.nn.0.weight          273920  0
nn_energy.nn.0.bias            512     0
nn_energy.nn.2.weight          512     0
nn_energy.nn.2.bias            512     0
nn_energy.nn.4.weight          1024    0
nn_energy.nn.4.bias            2       0
[2024-07-01 14:20:06,928] INFO: DeepMET(
  (nn): Sequential(
    (0): Linear(in_features=11, out_features=512, bias=True)
    (1): ReLU()
    (2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
    (3): Dropout(p=0, inplace=False)
    (4): Linear(in_features=512, out_features=1, bias=True)
  )
)
[2024-07-01 14:20:06,928] INFO: DeepMET Trainable parameters: 7681
[2024-07-01 14:20:06,928] INFO: DeepMET Non-trainable parameters: 0
[2024-07-01 14:20:06,928] INFO: DeepMET Total parameters: 7681
[2024-07-01 14:20:06,930] INFO: Modules     Trainable parameters  Non-trainable parameters
nn.0.weight  5632  0
nn.0.bias    512   0
nn.2.weight  512   0
nn.2.bias    512   0
nn.4.weight  512   0
nn.4.bias    1     0
[2024-07-01 14:20:06,968] INFO: Creating experiment dir /pfvol/experiments/MLPF_clic_backbone_pyg-clic_20240429_101112_971749/mlpf_20240701_142004_930865
[2024-07-01 14:20:06,968] INFO: Model directory /pfvol/experiments/MLPF_clic_backbone_pyg-clic_20240429_101112_971749/mlpf_20240701_142004_930865
[2024-07-01 14:20:07,018] INFO: train_dataset: clic_edm_ttbar_pf, 800800
[2024-07-01 14:20:07,393] INFO: valid_dataset: clic_edm_ttbar_pf, 200200
[2024-07-01 14:20:07,653] INFO: Initiating epoch #1 train run on device rank=0
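The reported totals are internally consistent with the per-module tables above: summing the printed tensor shapes reproduces both the backbone's 11671568 trainable parameters and DeepMET's 7681. A plain-arithmetic sketch (shapes read off this log, not off any source code):

```python
D = 512  # embedding width used throughout the printed model

# nn0_id / nn0_reg: Linear(17->512) + LayerNorm(512) + Linear(512->512)
nn0 = (17 * D + D) + 2 * D + (D * D + D)            # 272896

# One SelfAttentionLayer: MHA (packed QKV + out_proj), two LayerNorms,
# and a two-layer 512->512 feed-forward block
mha = (3 * D * D + 3 * D) + (D * D + D)             # 1050624
layer = mha + 2 * 2 * D + 2 * (D * D + D)           # 1577984

# nn_id head: Linear(529->512) + LayerNorm + Linear(512->6)
nn_id = (529 * D + D) + 2 * D + (6 * D + 6)         # 275462

# Each regression head: Linear(535->512) + LayerNorm + Linear(512->2)
reg_head = (535 * D + D) + 2 * D + (2 * D + 2)      # 276482

# 2 input encoders, 3 conv_id + 3 conv_reg layers, 1 id head, 5 regression heads
backbone = 2 * nn0 + 6 * layer + nn_id + 5 * reg_head
print(backbone)                                     # 11671568

# DeepMET: Linear(11->512) + LayerNorm(512) + Linear(512->1)
deepmet = (11 * D + D) + 2 * D + (D + 1)            # 7681
print(deepmet)                                      # 7681
```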