See our [landing page](https://huggingface.co/google/gemma-scope) for details on the whole suite. This repository contains one specific set of SAEs:

# 2. What Is `gemma-scope-2b-pt-att`?

- `gemma-scope-`: See 1.
- `2b-pt-`: These SAEs were trained on the Gemma v2 2B base model.
- `att`: These SAEs were trained on the attention layer outputs, before the final linear projection.
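To illustrate what one of these SAEs does with the attention-output activations it was trained on, here is a minimal NumPy sketch of a JumpReLU SAE forward pass (Gemma Scope SAEs are JumpReLU SAEs). The function name, parameter names, and shapes below are illustrative assumptions, not the exact checkpoint layout:

```python
import numpy as np

def sae_forward(x, W_enc, b_enc, W_dec, b_dec, threshold):
    """Encode activations x into sparse latents and reconstruct them.

    x: (batch, d_model) attention-output activations (hypothetical shapes).
    threshold: JumpReLU threshold (scalar or per-latent vector).
    """
    pre_acts = x @ W_enc + b_enc               # (batch, d_sae)
    # JumpReLU: pass a latent through unchanged only if it exceeds
    # its threshold; otherwise zero it out.
    acts = pre_acts * (pre_acts > threshold)   # (batch, d_sae), sparse
    recon = acts @ W_dec + b_dec               # (batch, d_model)
    return acts, recon

# Toy usage with random weights, purely to show the shapes involved.
rng = np.random.default_rng(0)
d_model, d_sae = 8, 32
x = rng.normal(size=(4, d_model))
W_enc = rng.normal(size=(d_model, d_sae))
W_dec = rng.normal(size=(d_sae, d_model))
acts, recon = sae_forward(x, W_enc, np.zeros(d_sae), W_dec, np.zeros(d_model), 1.0)
```

Each latent is either exactly zero or above its threshold, which is what makes the latent vector sparse and (hopefully) interpretable.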