gemma 2b sae resid post 12. fix ghost grad print
jbloom-md committed May 21, 2024
1 parent a10283d commit 2a676b2
Showing 2 changed files with 3 additions and 1 deletion.
2 changes: 2 additions & 0 deletions sae_lens/pretrained_saes.yaml
@@ -68,3 +68,5 @@ SAE_LOOKUP:
      path: "gemma_2b_blocks.0.hook_resid_post_16384_anthropic"
    - id: "blocks.6.hook_resid_post"
      path: "gemma_2b_blocks.6.hook_resid_post_16384_anthropic_fast_lr"
+   - id: "blocks.12.hook_resid_post"
+     path: "gemma_2b_blocks.12.hook_resid_post_16384"
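The hunk above registers a new SAE for layer 12's residual stream by adding an `id`/`path` pair to the `SAE_LOOKUP` table. As a minimal, hypothetical sketch of how such a table resolves an id to a path (the release key `"gemma-2b"` and the helper `resolve_sae_path` are illustrative assumptions, not SAELens code):

```python
# Hypothetical in-memory stand-in for the SAE_LOOKUP table that
# pretrained_saes.yaml defines (not the actual SAELens loader).
SAE_LOOKUP = {
    "gemma-2b": [  # release key assumed for illustration
        {"id": "blocks.0.hook_resid_post",
         "path": "gemma_2b_blocks.0.hook_resid_post_16384_anthropic"},
        {"id": "blocks.6.hook_resid_post",
         "path": "gemma_2b_blocks.6.hook_resid_post_16384_anthropic_fast_lr"},
        # Entry added by this commit:
        {"id": "blocks.12.hook_resid_post",
         "path": "gemma_2b_blocks.12.hook_resid_post_16384"},
    ],
}

def resolve_sae_path(release: str, sae_id: str) -> str:
    """Return the artifact path registered for a release/id pair."""
    for entry in SAE_LOOKUP[release]:
        if entry["id"] == sae_id:
            return entry["path"]
    raise KeyError(f"No SAE with id {sae_id!r} in release {release!r}")
```

Lookups for ids not in the table fail loudly rather than returning a stale default, which matches the append-only style of the YAML registry.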
2 changes: 1 addition & 1 deletion sae_lens/training/config.py
@@ -243,7 +243,7 @@ def __post_init__(self):
f"Number tokens in sparsity calculation window: {self.feature_sampling_window * self.train_batch_size_tokens:.2e}"
)

-        if not self.use_ghost_grads:
+        if self.use_ghost_grads:
print("Using Ghost Grads.")

def get_checkpoints_by_step(self) -> tuple[dict[int, str], bool]:
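The `config.py` change fixes an inverted condition: before this commit, "Using Ghost Grads." was printed precisely when ghost grads were *disabled*. A minimal sketch of the corrected behavior, using a hypothetical stripped-down config class (only the flag touched by this diff is modeled):

```python
from dataclasses import dataclass


@dataclass
class TrainingConfig:
    """Hypothetical stand-in for the SAELens training config."""

    use_ghost_grads: bool = False

    def __post_init__(self):
        # Fixed logic: announce ghost grads only when they are enabled.
        if self.use_ghost_grads:
            print("Using Ghost Grads.")
```

With the old `if not self.use_ghost_grads:` guard, constructing the config with the default `False` would have printed the message; after the fix, the message appears only for `TrainingConfig(use_ghost_grads=True)`.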
