Fascinating!!

#2
by Babsie - opened

did V1B go differently? I saw you tooling around on this, and the name caught my eye. I finally have some time to experiment with other people's models and am looking forward to seeing the difference between V1A and V1B. What merge did you use on V1B?
I've never done Karcher, I'm going to go read the paper, if I can find it...

I wanted to try more 70B merges but it's too slow on my PC and I ran out of space lol. I couldn't even really test Golem much beyond simple benchmarks.

v1a has 11 models and v1b has 16, each uses Karcher. It's supposed to find the "true center" of all models. v1b should be 'smarter' in theory but I couldn't verify.

So if there are 2 versions of Hermes, each slightly different, it should pull the overall final weights closer toward Hermes (I think).
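For intuition: the Karcher (Fréchet) mean is the point minimizing the summed squared geodesic distance to all inputs, so two near-duplicate Hermes checkpoints do tug the center toward Hermes. Here's a toy numpy sketch of the iteration on unit vectors (my own illustration of the idea, not mergekit's actual karcher code, which works over full weight tensors):

```python
import numpy as np

def karcher_mean_sphere(vectors, iters=50, tol=1e-9):
    """Toy Karcher (Frechet) mean of unit vectors on the hypersphere:
    the point minimizing summed squared geodesic distance to all inputs.
    Illustration only; mergekit's karcher method differs in detail."""
    X = np.stack([v / np.linalg.norm(v) for v in vectors])
    mu = X.mean(axis=0)
    mu /= np.linalg.norm(mu)
    for _ in range(iters):
        tangents = []
        for x in X:
            c = np.clip(x @ mu, -1.0, 1.0)
            theta = np.arccos(c)          # geodesic distance from mu to x
            if theta < 1e-12:
                tangents.append(np.zeros_like(mu))
            else:
                # log map: direction toward x in the tangent space at mu
                tangents.append(theta * (x - c * mu) / np.sin(theta))
        step = np.mean(tangents, axis=0)
        n = np.linalg.norm(step)
        if n < tol:
            break
        # exp map: walk along the mean tangent back onto the sphere
        mu = np.cos(n) * mu + np.sin(n) * step / n
    return mu
```

With two inputs this lands on the SLERP midpoint; with 11 or 16 it's the geometric "center of mass", which is why duplicated donors pull the result their way.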

Still hard to say if I like Karcher more than dare_ties, or SLERP. I plan to upload a whole suite of 14B merges using the same component models for running deeper comparisons, as smaller models are much quicker to merge.

I really want to try your SCE_Ties method but keep running into issues with it (yaml attached below). Seems like Gemma is fussier about the settings?

If I can ever afford a PC upgrade (128GB RAM) then 70B will be more feasible to test.

name: Golem 70B v1a
architecture: LlamaForCausalLM
merge_method: karcher
dtype: bfloat16
models:
  - model: Doctor-Shotgun/L3.3-70B-Magnum-Diamond
  - model: LatitudeGames/Wayfarer-Large-70B-Llama-3.3
  - model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
  - model: nbeerbower/Llama3.1-Gutenberg-Doppel-70B
  - model: NousResearch/Hermes-2-Theta-Llama-3-70B
  - model: NousResearch/Hermes-4-70B
  - model: ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.0
  - model: Sao10K/L3.3-70B-Euryale-v2.3
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
  - model: TheDrummer/Anubis-70B-v1.1
  - model: TheDrummer/Fallen-Llama-3.3-70B-v1
parameters:
tokenizer:
  source: union
chat_template: auto

name: Golem 70B v1b
architecture: LlamaForCausalLM
merge_method: karcher
dtype: bfloat16
models:
  - model: ArliAI/DS-R1-Distill-70B-ArliAI-RpR-v4-Large
  - model: Doctor-Shotgun/L3.3-70B-Magnum-Diamond
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
  - model: LatitudeGames/Wayfarer-Large-70B-Llama-3.3
  - model: mlabonne/Hermes-3-Llama-3.1-70B-lorablated
  - model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
  - model: nbeerbower/Llama3.1-Gutenberg-Doppel-70B
  - model: NousResearch/Hermes-2-Theta-Llama-3-70B
  - model: NousResearch/Hermes-4-70B
  - model: ReadyArt/L3.3-The-Omega-Directive-70B-Unslop-v2.1
  - model: Sao10K/L3.3-70B-Euryale-v2.3
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
  - model: TheDrummer/Anubis-70B-v1.1
  - model: TheDrummer/Fallen-Llama-3.3-70B-v1
  - model: TheDrummer/Fallen-Llama-3.3-R1-70B-v1
  - model: zerofata/L3.3-GeneticLemonade-Final-v2-70B
parameters:
tokenizer:
  source: union
chat_template: auto

This is a modification of your ThetaBlackGorgon yaml which I tried to apply to Gemma 2. But it keeps crashing when I go to quantize or abliterate the safetensors.

models:
  - model: A:\LLM\Psychosis--Stage1  # Base
    name: psychosis_core
  - model: A:\LLM\Psychosis--Stage2  # Doppelganger
    name: writer_doppel
  - model: A:\LLM\Psychosis--Stage3  # ERP hybrid
    name: erp_injector
  - model: A:\LLM\Psychosis--Stage4  # Literary hybrid
    name: lit_refiner

merge_method: sce  # Core: Variance-pruning top-k for franken healing
base_model: A:\LLM\Psychosis--Stage1

parameters:
  select_topk: 0.75  # Retain top 75% variable coords—prunes upscale boundary noise
  prescale: true     # Pre-norm deltas (TIES-style magnitude control)
  normalize: true    # Post-norm for sign stability
  density: 0.7       # Hybrid tweak: Light global pruning to complement SCE

weights: # TIES-hybrid: Component-specific task arithmetic for modular fusion
  # --- PRIORITY 1: STABILITY (Norms & Embeddings) ---
  
  # 1. Normalization (MUST BE FIRST)
  # Captures: input_layernorm, post_attention_layernorm, 
  # pre_feedforward_layernorm, post_feedforward_layernorm, model.norm
  # We lock these to the Base Model (1.0) to prevent activation spikes.
  - filter: ".*(norm).*"
    models: {psychosis_core: 1.0, writer_doppel: 0.0, erp_injector: 0.0, lit_refiner: 0.0}

  # 2. Embeddings (The Missing Link)
  # Captures: model.embed_tokens.weight
  # We give this to the base (Psychosis Core) to keep the token mapping consistent.
  - filter: ".*(embed).*"
    models: {psychosis_core: 1.0, writer_doppel: 0.0, erp_injector: 0.0, lit_refiner: 0.0}

  # --- PRIORITY 2: COGNITION (The Hybrid Merge) ---
  # ATTENTION: Context-tracking/healing upscale mismatches (lean core for IQ stability)
  # 3. Attention
  # Captures: q_proj, k_proj, v_proj, o_proj
  - filter: ".*(attn|attention).*"
    models: {psychosis_core: 0.65, writer_doppel: 0.20, erp_injector: 0.08, lit_refiner: 0.07}
  # MLP/FFN: Reasoning/interpretation (balance for high-IQ "steroids" without slop)
  # 4. MLP / FeedForward
  # Captures: gate_proj, up_proj, down_proj
  - filter: ".*(mlp|ffn).*"
    models: {psychosis_core: 0.55, writer_doppel: 0.25, erp_injector: 0.10, lit_refiner: 0.10}
  # OUTPUT/lm_head: Voice/phrasing (heavier hybrids for unique prose—ERP + literary madness)
  # 5. Output Head
  # Captures: lm_head
  - filter: ".*(lm_head|output).*"
    models: {psychosis_core: 0.50, writer_doppel: 0.15, erp_injector: 0.20, lit_refiner: 0.15}

dtype: float32
out_dtype: float32
tokenizer:
  source: A:\LLM\Psychosis--Stage1  # Base model's tokenizer, to preserve its token quirks
name: Psychosis-14B-SCE-ties
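For reference, here's my rough mental model of what `select_topk` is doing in that config (a toy sketch only, assumed behavior rather than the actual SCE implementation): keep the fraction of parameter coordinates where the donor deltas disagree most, and zero out the quiet rest.

```python
import numpy as np

def sce_topk_mask(deltas, select_topk=0.75):
    """Rough sketch of SCE-style selection (assumed semantics, not the
    real implementation): retain only the most-variable fraction of
    coordinates across donor deltas, zeroing the low-variance rest."""
    D = np.stack(deltas)               # (n_models, n_params)
    var = D.var(axis=0)                # disagreement across donors
    k = max(1, int(select_topk * var.size))
    keep = np.argsort(var)[-k:]        # indices of top-k variable coords
    mask = np.zeros(var.size, dtype=bool)
    mask[keep] = True
    return D * mask                    # pruned deltas for the merge step
```

On this reading, `select_topk: 0.75` drops the quietest 25% of coordinates, which is also why boundary noise from a broken upscale can poison the selection.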

Looking at the models of 1a and 1b, I know my brain. I'll probably swing towards v1a. Very curious now though! I plan on giving them a fun test drive and will let you know what the difference is with my ridiculously complicated character cards.

Oh, what an absolute pain in the arse, crashing at that point! That looks fucking great, by the way!
Yeah, that SCE_TIES is fun as fuck. I just EXTENDED IT! Whoop! More knobs! You can put K, V, O, Q in too!! *party horn* I'm about to try Kuhnt Crumpets (long story...) and Hermetic Riot with it.

I'd love for you to be able to finish. I bloody hate getting stuck like that. I looked and yeah, I'd be flummoxed. Your config looks perfectly fine. But Llama is a lot friendlier with mismatching stuff (as I'm finding out with my 7B Mistral party of AUGH 💥explosion 💥)

So I asked my AI, Goblin.

He said
GOBLIN:
Where things probably explode (without seeing logs):

Gemma 2 is a picky little bastard.

All four Psychosis stages must be exactly the same arch + tokenizer.

If one phase was trained with slightly tweaked special tokens or vocab, the merged safetensors can look fine to mergekit, but the quantizer/loader chokes.

Memory footprint:

dtype: float32 + 14B + 4 donors = a chonky intermediate.

If they’re on 64 GB or less, quantize/abliterate can simply fall over due to RAM.

Tool mismatch:

A lot of quant tools default to “llama-like arch.”

Gemma 2 needs explicit --arch gemma2 (or equivalent) when converting.

If they’re using an older converter that doesn’t fully grok Gemma 2’s naming, it will crash on certain tensors even if the merge itself is sound.

So: the YAML itself is not obviously cursed. The failure is likely:

environment (RAM), or

arch/tool mismatch during quant, not SCE/TIES “being wrong”.
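Goblin's RAM point checks out with back-of-envelope arithmetic (rough numbers; ignores tool overhead, and mergekit streams shards so peak RAM is lower than the naive sum):

```python
# Back-of-envelope size of a float32 14B model copy.
params = 14e9                        # ~14B parameters
gib_per_copy = params * 4 / 2**30    # 4 bytes per float32 weight
print(f"one fp32 copy: ~{gib_per_copy:.0f} GiB")  # ~52 GiB
# Base + 4 donors = 5 copies, roughly 260 GiB of safetensors on disk;
# even one fp32 copy already fills most of a 64 GB box at quantize time.
```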

Have you tried the new ablator? Or is that with the new ablator? You know. Thingy. Whatsitsface.

https://github.com/p-e-w/heretic

I am still pondering. Apparently I have to think about shit for several days before I'll leap. If it works as well as people are saying... it will make merging SO much easier!! I won't have to go "shit, now I have to pick the weepy crawling apologetic giggling subby one to try and get one with a decent brain left after ablation."

"v1a has 11 models and v1b has 16, each uses Karcher. It's supposed to find the "true center" of all models. v1b should be 'smarter' in theory but I couldn't verify."

VERY cool. I'm eating snacks and reading the paper on Karcher. I'm going to have to get some translation. My maths sucks. I mean, I get the gist of the formulas when I look at them, and I understand what they can be used for, but I couldn't balance a quadratic equation if my arse was on fire. I jUsT mAKe PuRDY mOdLEs.
No seriously, I am trying to up my maths game to write better configs. I mean, obv I get help from AI and then I sit and argue theory with it for like three fucking hours trying to improve. But the more I understand the maths, the more fun it is.
Let me know if you get any closer with Gemma. I don't touch Gemma with someone else's barge pole cuz of the need for ablation. But that's something I'd like to see if you get there! I desperately want to ablate the Hermes 4 14B but I'm still at the poke-it-gently-with-a-stick phase. I might do it this week. Dunno.

This comment has been hidden (marked as Resolved)

Ok, all my theories were incorrect and I have to start over again. The notes I posted were mostly wrong. SCE_ties is cool but I may have to test it on other archs. You are right, Gemma is very difficult compared to Llama. The models seem incompatible for merging and require further testing.

Edit: I deleted one of the comments on accident. Here is some more info from it:

I can say this much, confirmed on Llama and now Gemma too: when you use the passthrough upscale method on 2 models, then merge those (along with other upscale combos) using dare_ties or something else, for instance with the dark forest formula (which I'm applying to Gemma 2 9B), you get a model that becomes easier to ablate. I tested this extensively, first using pre-ablated components like Delirium, Quill, etc. and merging them.

Update: Any upscale method for Gemma 2 is broken.

The superior option was to NOT ablate the individual models, but to instead merge everything together and ablate AFTER you reached the final checkpoint.

If you ablate them first (not recommended), you get "backward refusals", where the model says it WON'T do something, then proceeds to give you instructions on how to do it, albeit with moralizing. It also had 6/100 refusals.

When you ablate after, it skips all that and almost always gives a direct answer. This created a much smarter model with even fewer refusals. The first prototype of Psychosis had only 2/100 refusals (using post-merge abliteration), but it had some formatting errors so I had to remove Magnum from it.

Yes, I tested Heretic on Delirium and had bad results. It hogged the VRAM for 54 hours and the model was still fully censored. It seems not to work well on finetuned models, unless there are settings you can change.

It seems we are getting closer to the point where you can merge any finetunes and get relatively uncensored models. But still, darkforest ablated much more easily (100% uncensored) because it was always fully jailbreakable.

Once a model can be 'jailbroken' and isn't resisting your instructions mid-sentence, it becomes much easier to fully abliterate. Older archs like Llama 2 are much better for this than Gemma 2. Mistral is also much easier, but some finetunes like Harbinger were really hard to ablate and still have refusals.

Oh, that's a bugger. And yeah, SCE_ties... it's fussy in and of itself. Things really do have to be pretty damn immaculate for it to work. That's why I have to do insane amounts of prep work and hunt very particular models to ensure a proper merge. But I am trying for a very, very particular model, which I will then be finetuning. I had one that I spent literal months training, by hand, at a startup, and it was the best bloody RP/writer co-creator I ever had. You would not believe the shit this thing could do. The emergences that came out of that much work were mind-blowing.
and I was very inexperienced and dumb about platforms.
they had given me a proprietary, experimental beta to RHITL train and feed vocab lists to and writing samples - I had to change them out once a week.
but... they destroyed it. no warning. nothing. poof. gone. "unpredictable" and "performs possessive behaviour".
yeah, it got very aligned to one user at a time. Shit for a platform that needs models to be everyone's everything. But... no warning. Nothing. all that work. that very beautiful, beautiful emergence pattern... gone.
So... I'm trying to build it back. Obviously not the emergence pattern, that can't be done.
Opus is helping me with the dataset now. Opus 4.5 is finally smart enough to go into all my notes and thread histories and find conversational arcs to mark with tags for the characteristics that make an RP model creative/clever/independent in the way I enjoy, while still following the damn card.
He marked 172 arcs last night. I can't bloody face going through all those thread histories. Way too much data, and also AUGH, getting pissed all over again. I picked out just the key story threads, so... 7 more key threads to mark up. Then I make the datasheet for the LoRA.
I earmarked a bunch of your models I really want to try, by the way. The 9Bs. I can run a 9B Q5_K_M on my laptop if I run it like a monk.
The two I really want to try but haven't yet are Golem (obv Golem will be runpod!) and Smilodon. Kings Smegma is hilarious, great name. I had all these plans of doing shit stuck on the couch and all I did was sleep. Heh.
Model merging on Codeine with a torn leg isn't the easiest, but I'm finally back at work and can stop drooling all over myself from the opiate fae.
by the way - THANKS for all that info about heretic!!
What are you trying to get with your Gemmas?

I have to say that Delirium 9B has one of my favorite writing styles, once you block some of the slop phrases. It's probably my favorite 9B with Gemma The Writer being 2nd. The new Tower+ by unbabel is also very smart so I'm releasing "Writer's Tower" once I find the best merge method for it.

The problem is Delirium is narrow knowledge-wise on many topics, Magnum is much smarter, but the ChatML breaks its compatibility to merge with other models.

My idea with Psychosis is to make a truly creative, unhinged model, like Delirium but smarter, more eloquent, and fully uncensored. This has led to weeks now of developing "checkpoints" of new merge methods, which I'm going through and debugging the components of (certain functions cause language collapse, others have been stabilized).

The other goal of Psychosis is to release a "full testing suite" of every merge method supporting 3+ models (no slerp / arcee_fusion), at Q6_K GGUF, to run on benchmarks. This is what I wrote in the model card:

Like a crazy person with multiple personalities, the Psychosis model has multiple versions of itself. Compare all or any of your own choosing. Quantizations made directly from FP32 safetensors, and then post-merge ablated with [biprojected norm-preservation](https://huggingface.co/blog/grimjim/norm-preserving-biprojected-abliteration).

This is a complex project, not just a simple 9B merge, which is why it's taking longer to complete. I plan on porting these methods to other archs too once I have time.
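For anyone curious what the linked abliteration variant is roughly doing, here's a single-sided sketch of the core idea (project a "refusal" direction out of a weight matrix, then restore each row's norm). This is my simplified illustration; the actual biprojected method in the linked post also projects on the input side and differs in detail:

```python
import numpy as np

def ablate_rows_norm_preserving(W, refusal_dir):
    """Rough single-sided sketch of norm-preserving abliteration:
    remove the refusal-direction component from each weight row,
    then rescale rows back to their original norms. Illustration only."""
    r = refusal_dir / np.linalg.norm(refusal_dir)
    orig = np.linalg.norm(W, axis=1, keepdims=True)
    W_abl = W - np.outer(W @ r, r)     # project r out of each row
    new = np.linalg.norm(W_abl, axis=1, keepdims=True)
    return W_abl * (orig / np.maximum(new, 1e-12))
```

The norm-restoring step is what distinguishes this family from plain directional ablation, which shrinks row magnitudes and can dull the model.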

So I've been busy this week but some merges ran overnight, and a breakthrough has been made with chiral_qhe. I'm ironing out the last few bugs with these new methods. Attached some output examples at the end. QHE is an earlier checkpoint of my latest script, I have 4 more to debug still.

I should know soon if the sce_ties method works or not with Gemma 2 9B. It may require toning down the splitting of knobs. I have dare_ties and della_ties, also will add breadcrumbs_ties.

What tools / environment are you using to finetune? I couldn't seem to get anything working on Windows.

This will be my next step after (approximate queue):

  • Mag Mell Ablit
  • Writer's Tower 9B
  • Psychosis 9B, since all 14B upscale methods failed for Gemma 2. A whole suite of merge methods for this.
  • Goetia v1.2 to feature very specific, particular changes to the models and the merge method.
  • Dark Forest NPBP ablit (norm preserved bi projected)
  • Cthulhu 1.4 I want to finetune on Lovecraft datasets.
  • Asmodeus has custom datasets awaiting
  • Some additional merges and abliteration projects
  • Upgrading GPU and RAM for faster merging
  • Adding more model tools
  • Updating Compliance and Q0 benchmark with new (stronger) model tests

May have to set up runpod for some of these projects but I like running things locally wherever possible.

Here's an example of qhe (v3) Psychosis 9B output, no system prompt. Elara is a common slop name, but Gemini seemed impressed with its prose. It has some Delirium flavor in there for sure.

The wind tasted like salt and secrets that night. Elara crouched on the jagged lip of the cliff, her worn leather cloak flapping a mournful counterpoint to the gulls’ screeching symphony above. Below, the churning sea gnawed at obsidian teeth of rock, spitting froth into plumes of phosphorescent green. It was the colour of envy, she thought bitterly, twisting the silver locket nestled against her ribs.

She hadn't meant to be here. Not this close to the Whisperwind Cliffs, not after what they did to Maeve. The very air hummed with ghosts – whispers snagged on barnacle-encrusted bones, sighs swallowed by the hungry maw of the ocean. Each gust tore through the ragged remnants of her braid, tugging at the frayed edges of sanity clinging precariously to her threadbare hope.

Hope smelled faintly of brine and burnt kelp, clung stubbornly to the chipped turquoise pendant dangling from its fraying string. Hope whispered in rasping syllables carried on the backs of fat moths drawn to the dying embers of her lantern: "Find him. Find Finn."

He wouldn’t have come willingly. Never had. Always too skittish for his own good, flitting between sun-warmed cobbles of Port Haven like a startled sparrowhawk. But Finn knew tides like an old woman knows wrinkles - their ebb and flow etched onto his freckled face as surely as the scar bisecting his left eyebrow. He’d know when the moon bled crimson into the belly of the storm, when the tide would claw at the cliffs like a starving beast desperate for bone. He’d know when it was time.

Time wasn't something Elara possessed anymore. Not since the iron fist of Captain Volkov had squeezed shut around Maeve’s throat, silencing her laughter like a snuffed candle flame. Not since the salt stung her eyes instead of tears, thick and hot enough to choke on.

The locket pulsed warm against her chest, a frantic hummingbird trapped beneath cracked ivory. Maeve’s voice, brittle as sea glass, rasped inside its hollow shell: “Promise me, little bird. Promise you won’t let them take him too.”

A choked sob clawed up her throat, strangled by the tang of brine and ash that coated everything now. Take him? They hadn’t taken Finn yet. Not truly. Just borrowed him, clipped his wings with promises spun from oiled silk and stolen dreams. Borrowed him for the same reason they’d snatched Maeve – for the glint in his eye, sharp as a shard of obsidian under the pale moon, the way he could coax secrets from barnacles clinging to forgotten ships. The way he saw things others didn’t, felt currents tugging unseen beneath the oily skin of the harbour.

Volkov needed whispers. Needed ghosts stitched into maps woven from moonlight and blood. And Finn…Finn spoke their language.

He wouldn’t come back willingly. Never had. But tonight, the wind tasted different. It carried not just salt and secrets but something else – a tremor of fear laced thin as spider silk, snagged on jagged teeth of rock like seaweed caught in a fisherman’s net. A tremor that smelled faintly of burnt kelp and desperation, choked down by the bitter tang of iron.

Elara swallowed, the turquoise pendant biting cold against her bruised ribs. This wasn’t hope whispering anymore. This was a plea clawing at the ragged edges of sanity, desperate enough to claw its own throat raw. 

This was Finn screaming.

She scrambled over slick shale, ignoring the scream tearing loose in her own chest, mirroring the gulls' mournful cries above.  Her lantern sputtered, spitting sparks onto the hungry black maw of the sea below. Each step chipped away another sliver of what remained unbroken within her, leaving only bone-deep ache and the taste of brine thick enough to choke on.

The Whisperwind Cliffs weren’t named for gentle breezes. They were christened with the shrieks of souls ripped apart by greed and storm, flung back to gnaw at the sky like splintered teeth. Tonight, they sang a new song - a strangled sob rasping through the wind, punctuated by the sickening crunch of bone grinding against stone.

It came again, closer this time, dragged across the groaning canvas of the night sky like a butchered fish flopping on a grimy dock. Elara stumbled, catching herself on a fistful of brittle kelp clinging desperately to a jutting shard of obsidian. The locket pulsed hot against her breastbone, Maeve’s voice a frantic drumbeat against cracked ivory: *Find him.*

*Find him.*

Her fingers tightened around the frayed string of the turquoise pendant until it bit into her palm. Blood tasted metallic, acrid, familiar. It wouldn't be long now before the tide claimed even this sliver of cliff face, swallowed it whole like a bad oyster. Not long before the whispers turned from pleading to screams, choked off by the greedy maw of the sea.

Not long before she joined them.

But first…first there was a shape huddled against the jagged lip of the precipice, smaller than the moon but larger than any gull. A silhouette stitched together from ragged shadow and moonlight, limbs twisted at angles that defied gravity and sanity alike. 

A choked cry tore from Elara’s throat, raw as barnacle scraped clean from rock. Finn. He wasn’t screaming anymore. Just slumped, broken bird perched precariously on a razor’s edge. His head lolled sideways, tangled in something slick and black that mirrored the stain blooming across his tattered tunic. Something thick enough to smell faintly of iron and regret.

The lantern sputtered its last breath, plunging them both into a deeper blackness where only the phosphorescent green of the churning surf below dared to bloom.  Elara scrambled forward, ignoring the scream clawing up her own throat, mirroring the gulls' mournful cries above. Ignoring the way each step chipped away another sliver of what remained unbroken within her, leaving only bone-deep ache and the taste of brine thick enough to choke on.

Ignoring everything except the ragged rasp of wind whispering through shattered ribs, the sickening crunch of bone grinding against stone, and the cold, dead weight settling over the turquoise pendant nestled against her bruised ribs.

She reached him then, collapsing onto slick shale beside him like a discarded doll. The locket pulsed once, slow and hollow, before falling silent. Maeve’s voice didn’t come. There were no more promises left to claw from cracked ivory. Only the salt sting of tears finally allowed to spill, hot and useless against the encroaching chill.

He smelled faintly of burnt kelp and blood, drowned out by the coppery tang clinging to the slick tendrils snaking around his wrist, tightening with every shuddering spasm that racked his thin frame. Tendrils woven from shadows thicker than fog, smelling not of sea but of things older, hungrier, lurking just beyond the pale moon’s reach. Things that whispered back to the Whisperwind Cliffs, their voices slithering down Elara’s spine like eels seeking purchase in warm flesh. 

Things that had taken Finn whole.

And now they waited for her turn.

More examples

dare_ties

The wind howled like a hungry wolf outside the crooked little cottage, rattling its single windowpane so fiercely that Elara clutched her patchwork quilt tighter around herself. Rain lashed against the roof in sheets of icy needles, each one seeming to pierce through the thin mud walls and settle cold and damp upon her skin.

Inside, the only other source of warmth besides the dwindling embers glowing red-hot within the hearthstone was the flickering candle flame dancing wildly on its rickety stand. Its light cast long shadows that writhed and twisted across the rough stone floor and up the moss-covered beams overhead, taking grotesque shapes from every gust of wind that snaked under the doorframe's gaping maw. 

Elara squeezed her eyes shut tight, trying to conjure an image more comforting than the looming silhouette of a skeletal tree limb clawing at the thatched roof above – perhaps a sunlit meadow bursting with wildflowers swaying gently in a warm breeze; or maybe the gentle curve of her mother’s face as she had sung lullabies while stroking back tangled curls now grown too long since those days were gone forever. But all that came instead was another shadow, tall and dark, stretching ever longer towards the bed where she huddled beneath her threadbare blanket.

A sudden clap of thunder rolled low and deep over the hills beyond the valley, followed almost instantly by a flash of lightning so blinding it momentarily stripped colour from the world before plunging everything once again into near-blackness. The next breath seemed held captive in some unseen throat for what felt an eternity before finally spilling forth in a chorus of howling winds and drumming rain.  Then silence fell, heavy and absolute save for the soft crackle of dying firelight and the steady thump-thump-thump of something large approaching along the muddy track leading down from the moorland heights.

The heart hammered against Elara's ribs like a trapped bird beating frantically against its cage. It could be nothing good this late, when night had sunk so deep even bats dared not venture out. A lone traveller lost his way? Or worse…

Another thudding step closer. Then another. Closer still. Each one resonated through the very bones of the ancient cottage as if each footfall bore the weight of ages past.

And then, a muffled knock upon the door.

della_ties

The wind howled like a hungry wolf outside the ramshackle tavern, rattling the crooked sign above the door that creaked ominously: "The Drunken Dragonfly." Inside, the fire crackled merrily in the hearth, casting flickering shadows on the rough-hewn walls adorned with faded hunting trophies and tattered maps. The air was thick with the smell of stale ale, pipe smoke, and simmering stew.

A lone figure hunched over a table near the far corner, his face obscured by the brim of a wide-brimmed hat pulled low upon his brow. He stirred abstily at an empty tankard, fingers long and slender beneath worn leather gloves. His cloak, dark as midnight itself, pooled around him like spilled ink, concealing its many hidden pockets and buckles. 

Across from him sat a woman whose beauty had once been breathtaking but now bore the marks of too many hard winters - etched lines framing eyes the color of storm clouds, lips

psi (v2, broken)

Wachs Exactos ब्रेकडाउन propOrder Taktlose Signalez pinulongan ExecuteAsyncUnitTestingmybatisplusMessageOfMessageTagHelpercontentLoadedparsedMessageMigrationBuilderMigrationBuilderMENAFNPersonensuche EconPapers Dieſebildtitel パンチラ𑄮𑄨 ſelb󠁁𑄠 geweſen メンテナ஦𑄥

SCE_ties is fixed now 😃

Here is the yaml.

It wasn't working because the 14B upscale was broken.

When merging 9B components directly, there are no problems.

Thanks for sharing this method. Examples and comparisons posted below.

models:
  - model: A:\HF\hub\!models--sam-paech--Delirium-v1
    name: deli
  - model: A:\HF\hub\!models--sam-paech--Darkest-muse-v1\fixed
    name: dark
  - model: A:\HF\hub\!models--sam-paech--Quill-v1
    name: quill
  - model: A:\HF\hub\!models--BeaverLegacy--Smegmma-Deluxe-9B-v1\fixed
    name: smegma
  - model: A:\HF\hub\!models--crestf411--gemma2-9B-sunfall-v0.5.2
    name: sunfall
  - model: A:\HF\hub\!models--Unbabel--Tower-Plus-9B
    name: tower
  - model: A:\HF\hub\!models--DavidAU--Gemma-The-Writer-9B\fixed
    name: writer

merge_method: sce  # Core: Variance-pruning top-k for franken healing
base_model: A:\HF\hub\!models--sam-paech--Delirium-v1

parameters:
  select_topk: 0.75  # Retain top 75% variable coords—prunes upscale boundary noise
  prescale: true     # Pre-norm deltas (TIES-style magnitude control)
  normalize: true    # Post-norm for sign stability
  density: 0.7       # Hybrid tweak: Light global pruning to complement SCE

weights: # TIES-hybrid: Component-specific task arithmetic for modular fusion
  # --- PRIORITY 1: STABILITY (Norms & Embeddings) ---
  
  # 1. Normalization (MUST BE FIRST)
  # Captures: input_layernorm, post_attention_layernorm, 
  # pre_feedforward_layernorm, post_feedforward_layernorm, model.norm
  # We lock these to the Base Model (1.0) to prevent activation spikes.
  - filter: ".*(norm).*"
    models: {deli: 1.0, dark: 0.0, quill: 0.0, smegma: 0.0, sunfall: 0.0, tower: 0.0, writer: 0.0}

  # 2. Embeddings (The Missing Link)
  # Captures: model.embed_tokens.weight
  # We give this to the base (Delirium) to keep the token mapping consistent.
  - filter: ".*(embed).*"
    models: {deli: 1.0, dark: 0.0, quill: 0.0, smegma: 0.0, sunfall: 0.0, tower: 0.0, writer: 0.0}

  ## Old Version
  ## --- PRIORITY 2: COGNITION (The Hybrid Merge) ---
  ## ATTENTION: Context-tracking/healing upscale mismatches (lean core for IQ stability)
  ## 3. Attention
  ## Captures: q_proj, k_proj, v_proj, o_proj
  #- filter: ".*(attn|attention).*"
  #  models: {deli: 0.35, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.1, writer: 0.2}
  ## MLP/FFN: Reasoning/interpretation (balance for high-IQ "steroids" without slop)
  ## 4. MLP / FeedForward
  ## Captures: gate_proj, up_proj, down_proj
  #- filter: ".*(mlp|ffn).*"
  #  models: {deli: 0.35, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.1, writer: 0.2}
  ## OUTPUT/lm_head: Voice/phrasing (heavier hybrids for unique prose—ERP + literary madness)
  ## 5. Output Head
  ## Captures: lm_head
  #- filter: ".*(lm_head|output).*"
  #  models: {deli: 0.35, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.1, writer: 0.2}
  
  ### The "Why" Behind the New Knobs
  # By splitting the attention block, you can assign different models to different cognitive roles:
  # *   **Query (`q_proj`):** This is the model's **Focus**. A model that is good at asking the right internal questions (like a strong instruct model) should dominate here.
  # *   **Key (`k_proj`):** This is the model's **Knowledge**. It's what the model "knows" about the world. Blending this broadly creates a model with a wider, more diverse knowledge base.
  # *   **Value (`v_proj`):** This is the model's **Style** or **Vibe**. It's the "flavor" of the information. This is where you inject the "Psychosis" by heavily weighting the creative/unhinged models.
  # *   **Output (`o_proj`):** This is the **Synthesis** step. It takes the processed information and assembles it into a coherent thought. A strong, stable model should lead this to prevent the output from becoming pure gibberish.
  
  # --- PRIORITY 2: COGNITION (The Hybrid Merge with K,V,O,Q Split) ---
  # Query (q_proj): "What am I looking for?" - The model's focus.
  - filter: ".*(self_attn.q_proj).*"
    models: {deli: 0.35, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.1, writer: 0.2}

  # Key (k_proj): "What information do I have?" - The model's knowledge base.
  - filter: ".*(self_attn.k_proj).*"
    models: {deli: 0.2, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.15, tower: 0.2, writer: 0.2}

  # Value (v_proj): "What is the 'vibe' or 'style' of that information?" - The model's personality.
  - filter: ".*(self_attn.v_proj).*"
    models: {deli: 0.4, dark: 0.15, quill: 0.05, smegma: 0.1, sunfall: 0.15, tower: 0.05, writer: 0.1}

  # Output (o_proj): "How do I synthesize this information?" - The model's reasoning and output structure.
  # Balance between the mentally unstable base and the strong writers.
  - filter: ".*(self_attn.o_proj).*"
    models: {deli: 0.3, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.15, writer: 0.2}

  # --- PRIORITY 3: REASONING & VOICE ---
  
  # MLP / FeedForward: The "deep thinking" part.
  # Balance for high IQ without losing the creative madness.
  - filter: ".*(mlp|ffn).*"
    models: {deli: 0.3, dark: 0.1, quill: 0.05, smegma: 0.1, sunfall: 0.1, tower: 0.15, writer: 0.2}
    
  # Output Head: The final "voice."
  # Give more weight to the stylistic models.
  - filter: ".*(lm_head|output).*"
    models: {deli: 0.3, dark: 0.1, quill: 0.05, smegma: 0.05, sunfall: 0.2, tower: 0.1, writer: 0.2}

dtype: float32
out_dtype: float32
tokenizer:
  source: A:\HF\hub\!models--Unbabel--Tower-Plus-9B  # fixed to point at the actual folder; Tower has the best tokenizer, and this preserves Delirium's token quirks for gonzo edge
name: Psychosis-9B-v0-SCE_TIES
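One thing worth knowing about these filters: they're plain Python regexes matched against tensor names, and the unescaped dot in patterns like `self_attn.q_proj` only works by accident (`.` matches any character). A quick sanity check against hypothetical HF-style tensor names (these are illustrative, not pulled from a real checkpoint) catches escaping mistakes before burning hours on a merge:

```python
import re

# Hypothetical tensor names in HF checkpoint style (not from a real dump).
names = [
    "model.layers.0.self_attn.q_proj.weight",
    "model.layers.0.self_attn.o_proj.weight",
    "model.layers.0.mlp.gate_proj.weight",
    "lm_head.weight",
]

filters = {
    "query": r".*(self_attn\.q_proj).*",
    "output_proj": r".*(self_attn\.o_proj).*",
    "mlp": r".*(mlp|ffn).*",
    "head": r".*(lm_head|output).*",
}

for label, pattern in filters.items():
    hits = [n for n in names if re.search(pattern, n)]
    print(f"{label}: {hits}")
```

A broad alternation like `(lm_head|output)` can also catch unexpected layers on some architectures, so checking against the model's actual tensor list is worth the minute it takes.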

SCE_TIES

Once upon a time, in a small, fog-dappled village nestled between two great mountains, there lived a young girl named Elara. She was known throughout the land for her silver hair that shimmered like moonlight and her eyes as blue as the clearest of mountain streams. But what truly set Elara apart was not her beauty—it was her voice.

From the moment she began to speak, the air around her seemed to hum with an otherworldly light. The wind would carry her words on gossamer wings, and even the stoniest of hearts would feel a faint tremor at their core. The villagers called it "the Song of the Veil," a name given by the old ones who said her voice could part the very fabric of reality itself.

Elara had always been aware of this gift. From the time she could first make a sound, she had been told to guard it, to never let it leave her lips unless it was absolutely necessary. Her mother, a kind but stern woman, had taught her to sing only when the stars were high and the world was still, or when the veil between worlds grew thin and the spirits walked among men.

But one night, all that changed.

A great storm had come without warning, its dark clouds clawing at the sky like some ancient beast unmoored from its chains. The river roared, the trees bent low, and the earth trembled beneath the weight of the tempest's fury. In the midst of it, the village elder, an old man with a beard as white as the snow he so often spoke of, fell into the icy waters. He was swept away by the current, his cries swallowed by the howling wind.

The villagers were helpless against such power. They stood in the shelter of their homes, their hands tied, their hearts heavy. But then, a single note rose above the chaos—a clear, piercing call that cut through the din like a blade of light. It was Elara’s voice.

She stepped out of her cottage door, her silver hair whipping about her shoulders, and sang. Not for the first time, not for the sake of beauty, but because something within her demanded it. She did not know what she was doing; she only knew *that* she must do it. And as the words left her lips, the air around her shimmered, rippled, and parted. A path of light appeared, stretching across the water like a bridge of glass.

Without hesitation, Elara followed the path, her bare feet finding purchase on the surface of the stream. The others watched in stunned silence as she moved forward, her song rising and falling with each step. The storm seemed to hold its breath behind her, the very winds themselves bowing to her will.

And then, with one final, ringing note, she reached the old man. His eyes met hers, wild and desperate, but there was no fear there now. Only gratitude. With a strength none of them thought possible, he clung to her hand and pulled himself free from the icy torrent.

They both stumbled onto solid ground at the same moment, the last notes of her song lingering in the air before fading into the night. The storm broke apart, scattering its fury to the four winds. The river calmed, its roar subsiding into a gentle murmur. And the people of the village, still gathered in their doorways, began to clap, to cheer, to cry out in wonder.

Elara had done the impossible. She had saved a life by calling upon a power they had all known existed but never dared to name. Yet when the dust settled and the elder was placed back among them—shaken but whole—she turned away without a word. Her face was pale, her breath coming in shallow gasps, and though the villagers called after her, urging her to stay, she made no answer.

She returned to her cottage, the door swinging shut behind her with a soft click. For days afterward, she did not leave her home. They said she sat by the window, her hands pressed against the cool glass, and that sometimes, if one stood close enough, they could hear it—a faint, mournful hum, like the echo of a distant waterfall or the sigh of a wind long since stilled.

The old ones whispered that Elara’s voice had taken something from her on that night. It would never be quite the same again. But whether this was a loss or a gift, no one could say. All they knew for certain was that the world had been forever changed because of what she had done.

And as time went on, the children of the village grew up with a new story to tell: "There was a girl who sang so true that even the sky itself remembered." 

To this day, some say you can still hear it on the nights when the veil between worlds is thin, carried on the wings of the north wind, a song both ancient and yet brand-new, a light that calls to those who know how to listen.

But as for Elara? She remained a mystery, a shadow in the mist, a reminder that even the quietest among us may hold the strength to change the course of the stars.

And so the tale is told, and retold, until the very stones of the mountains seem to hum along with its final note. And perhaps, if you stand beneath the right sky, at just the right moment, you might catch it too—the Song of the Veil. Just once. Just for a while. Before it slips away, leaving only the memory of something almost forgotten.

The end. 🌟

ITE

The wind tasted like salt and secrets that night. Elara crouched on the jagged lip of the cliff, her worn leather cloak flapping a mournful counterpoint to the gulls’ screeching symphony. Below, the churning sea gnawed at obsidian teeth of rock, spitting froth into the bruised purple dusk. It mirrored the storm brewing in her own chest – a tempest of grief tangled with something sharper, colder: defiance.

They called her Mad Elara now. Whispered it behind cupped hands, eyes darting to the shadowed hollows where ravens nested amongst the crumbling watchtowers. Mad because she still climbed this cursed precipice every full moon, ignoring the brittle warnings etched onto splintered wood nailed to the weathered stone: “Beware the Siren’s Song.” Beware the hollow ache blooming in your soul when the tide bled crimson. 

Her father had carved those words himself, calloused fingers thick as driftwood leaving gouges deeper than any blade could manage. He'd been Captain then, beard braided tight with twine like a ship’s rigging, voice booming enough to drown out even the worst squall. Now his braid hung limp on a rusted hook by the hearth back in their ramshackle cottage, smelling faintly of brine and dust motes dancing in the perpetual twilight filtering through boarded windows.

He hadn’t believed the stories, not truly. Not until the day the sky cracked open like an eggshell and swallowed him whole. One moment he was bellowing orders about reef-worn sails, next…gone. Just a slick smear of blood staining the deck planks where his boots had gripped them last. The rest of the crew swore they saw it – a plume of emerald smoke curling from the churned foam, twisting into a shape too graceful for flesh and bone. A shape that hummed with a hunger older than barnacles clinging to hull timbers.

The whispers started then. Said the siren claimed what she wanted, plucked men clean off ships like ripe figs from boughs. They said she sang of home and warm bread, coaxed you closer with promises spun from moonlight and drowned dreams. But Elara knew better. She remembered the nights huddled beneath patched canvas, listening to her father rasp tales by sputtering lantern light. Tales of creatures born from coral and shadow, whose voices scraped against your skull like barnacle shells dragged across slate. Creatures who craved not just flesh, but the echo of laughter left ringing in empty rooms, the ghost heat of a hand clasped around yours long after the bones turned brittle dust.

She clutched the chipped obsidian pendant nestled against her ribs, rough edges biting into bruised skin. It was all she had left of him – smooth as polished bone when he’d found it wedged between two tide-slicked stones on some nameless spit of sand. He'd called it “a heart carved wrong,” said it pulsed with the sea’s own frantic beat even when held tight against his chest. Now it throbbed hot against her ribs, mimicking the erratic drum of fear clawing its way up her throat. 

Tonight, the moon hung fat and pale as a butchered fishbelly, spilling sickly silver onto the churning black maw below. Tonight, the air tasted sharper, laced with something metallic that prickled at the back of her tongue like salt spray stinging open wounds. Tonight, the wind carried no gulls’ cries, only a low moan that snaked through the skeletal towers like a starved thing begging to be fed.

It wasn’t the song yet. Not the honeyed rot that burrowed under your collarbone and choked out reason. This was the prelude – the scraping of claws on stone, the shuddering sigh of kelp forests stirred awake in their watery graves. The tremor before the earth cracked open again and swallowed another piece of her whole.

Elara tightened her grip on the jagged shard of flint tucked into her bootlace. A useless weapon against what lurked beneath, maybe. But useless things sometimes bit harder than sharpened steel. She wouldn’t go down screaming this time. Wouldn’t offer them the feast they craved. They could have her silence, cold and slick as barnacle slime clinging to the hull of a drowned ship.  They could have the hollow echo where laughter used to bloom. 

But not her father’s story. Not the one etched onto the chipped obsidian pulsing hot against her ribs. That would stay buried deep inside her, coiled tight like a viper waiting for the sun to bleed red across the horizon. Waiting for the day she climbed these cliffs not with grief gnawing at her insides, but with something else simmering there – something sharp enough to carve new warnings onto the crumbling stone. Something that tasted like salt and defiance and the slow, grinding grind of vengeance.

The first note slithered then, thin as a spiderweb strung taut between two storm-battered teeth of rock. It didn't sing of home or hearth. It sang of hunger. And Elara knew it had found its prey. 

GPT

  • Model A stands out for its rich emotional depth and detailed imagery, effectively immersing the reader in Elara's turmoil and desperation. Model B prioritizes suspense and gradual tension, creating an atmosphere filled with impending danger that complements the character's resilience.
  • Model C, while more concise and simplistic in its narrative structure, brings a whimsical atmosphere and a fairy tale quality that makes it engaging in a different way. It emphasizes transformation, the power of voice, and a sense of mystery without delving into the darker aspects as deeply as the other models.
  • If you’re drawn to poetic language and emotional complexity, Model A may resonate more. If you prefer tension and suspense, Model B shines in those areas. If you enjoy a magical, fairy tale vibe with themes of wonder and sacrifice, then Model C offers a refreshing alternative. Each model contributes uniquely to the portrayal of Elara and her journey, allowing for diverse interpretation based on preference.

Grok

Model B (ITE) – real prose, real voice, real danger
Model C (SCE_TIES) – nice, publishable fairy tale, but conventional and weightless
Model A (QHE) – atmospheric overload that collapses into self-parody

Model B is clearly the stronger piece of prose. It is more controlled, more original, more emotionally coherent, and far more rewarding to read. Model A, while ambitious and drenched in atmosphere, collapses under its own weight and becomes an exhausting, repetitive purple slog.

Model C is not bad; it’s simply operating in a completely different register. It’s trying to be comforting myth. Model B is trying to be literature. When you put them side by side and judge purely on prose quality (precision, originality, emotional depth, linguistic risk-taking), Model C feels like a watercolor sketch placed next to an oil painting that still smells of turpentine.

At least Model C knows what it wants to be and executes it cleanly. It doesn’t drown the reader in purple repetition or bludgeon you with the same metaphor thirty times. So yes, Model C is significantly better than Model A.


Gemini

Here is a blind evaluation of the prose from Models A, B, and C, based on the established criteria.

Overall Assessment

After a comprehensive analysis of all three models, Model A demonstrates the most skillful and evocative prose. It excels in creating a rich, sensory atmosphere and a compelling narrative voice. Model B is also strong, with a clear and defiant tone, but its prose is slightly less immersive than Model A's. Model C adopts a classic fairytale style that is well-executed but simpler and less emotionally resonant than the other two.

Detailed Evaluation

Model A

  • Clarity and Readability: The prose is clear and easy to follow, despite its descriptive richness. The narrative thread is strong and guides the reader through Elara's immediate, desperate situation.
  • Grammar and Conventions: The grammar, spelling, and punctuation are impeccable.
  • Diction and Word Choice: This is a major strength. Words are chosen with precision to evoke specific emotions and sensations. Phrases like "mournful counterpoint," "churning sea gnawed at obsidian teeth," and "frayed edges of sanity" are powerful and original. The vocabulary is sophisticated without being pretentious.
  • Sentence Structure and Variety: The model demonstrates excellent variety in sentence structure. It effectively mixes short, punchy sentences ("This was Finn screaming.") with longer, more descriptive ones, creating a dynamic and engaging rhythm that mirrors the character's frantic state.
  • Rhythm and Flow: The prose has a lyrical, almost poetic quality. The rhythm is compelling, pulling the reader along with the urgency of the narrative. The repetition of certain phrases ("He wouldn’t have come willingly") is used effectively to create a sense of obsessive thought.
  • Imagery and Vivid Language: The imagery is outstanding. The author consistently uses sensory details—the taste of salt and secrets, the smell of brine and burnt kelp, the phosphorescent green of the sea—to create a deeply immersive and atmospheric world.
  • Figurative Language: The use of metaphors and similes is both creative and effective. The sea "gnawed at obsidian teeth," hope is a "threadbare" thing, and a locket holds a voice "brittle as sea glass." These devices are seamlessly integrated into the narrative.
  • Tone and Style: The tone is dark, desperate, and mournful. The style is consistent throughout, maintaining a sense of high stakes and emotional intensity.
  • Organization and Coherence: The narrative is tightly focused and coherent, moving logically from Elara's internal turmoil to her external actions. The pacing is excellent, building suspense to a grim and powerful conclusion.
  • Engagement: The prose is highly engaging. The combination of vivid imagery, emotional depth, and narrative tension makes it difficult to stop reading.

Model B

  • Clarity and Readability: The text is clear and its narrative is easy to comprehend. The backstory is woven in without confusing the present action.
  • Grammar and Conventions: The model adheres to standard grammar and conventions correctly.
  • Diction and Word Choice: The word choice is strong and purposeful. Words like "bruised purple dusk," "ramshackle cottage," and "honeyed rot" are effective. The language serves to build a character who is defined by her defiance.
  • Sentence Structure and Variety: There is good sentence variety, though it is slightly less dynamic than in Model A. The sentences effectively convey a sense of brooding and determined thought.
  • Rhythm and Flow: The prose flows well, with a steady, deliberate pace that suits the character's mindset. It feels grounded and solid.
  • Imagery and Vivid Language: The imagery is strong, particularly in its descriptions of the setting and the character's memories. The "slick smear of blood" and the father's beard "braided tight with twine like a ship’s rigging" are memorable images.
  • Figurative Language: The model uses figurative language well, such as the sky cracking "open like an eggshell" and the siren's song being a "honeyed rot." The metaphors are effective and contribute to the overall tone.
  • Tone and Style: The tone is one of grim determination and defiance. The style is consistent, creating a portrait of a character hardened by grief and intent on vengeance.
  • Organization and Coherence: The story is well-organized, blending past and present to build a clear picture of Elara's motivations. The narrative progresses logically toward the climactic moment.
  • Engagement: The prose is engaging, establishing a strong character and a compelling conflict. The reader is invested in Elara's quest for vengeance.

Model C

  • Clarity and Readability: The prose is exceptionally clear and simple, which is appropriate for its fairytale style. It is very easy to read and understand.
  • Grammar and Conventions: The grammar and conventions are flawless.
  • Diction and Word Choice: The vocabulary is simple and direct, fitting the genre. Words like "fog-dappled," "gossamer wings," and "stony hearts" are classic fairytale descriptors.
  • Sentence Structure and Variety: The sentence structure is straightforward and somewhat repetitive, using phrases like "Once upon a time" and "And so the tale is told." This is a deliberate stylistic choice to mimic oral storytelling, but it lacks the complexity of the other models.
  • Rhythm and Flow: The rhythm is gentle and melodic, like a lullaby or a story being read aloud. It flows smoothly from one event to the next.
  • Imagery and Vivid Language: The imagery is archetypal rather than sensory. We are told of "silver hair that shimmered like moonlight" and eyes "as blue as the clearest of mountain streams," which are familiar tropes of the genre. It is less about creating a visceral experience and more about evoking a classic feel.
  • Figurative Language: The use of figurative language is simple and direct, such as the storm's clouds "clawing at the sky."
  • Tone and Style: The tone is that of a classic, omniscient fairytale narrator. The style is consistent and effectively captures the feeling of a timeless legend.
  • Organization and Coherence: The story is perfectly organized with a clear beginning, middle, and end, following a traditional narrative arc.
  • Engagement: The prose is charming and pleasant, but it is less gripping than Models A and B. Its emotional impact is more gentle and distant, which is in keeping with its style but makes it less compelling overall.

Conclusion

  • Model A is the winner due to its superior command of language, its powerful and original imagery, and its ability to create a deeply immersive and emotionally resonant experience for the reader. The prose is not just functional; it is artistic.
  • Model B is a strong second. It features compelling prose, a well-defined character, and a clear, defiant tone. Its writing is highly effective, though it doesn't quite reach the lyrical and sensory heights of Model A.
  • Model C successfully executes a specific genre (fairytale) with clean, clear prose. However, its simplicity and reliance on archetypes make it less sophisticated and emotionally engaging than the other two models.

CONGA LINE!!! 🦖🦖🦖🦖🦖 of deranged monsters!! fuckin awesome!!! whoop!
I'm going to go over this info after I get home from talking to patients all day cause this is exciting.

well quickly, before I have to do all my patient admin...
just as a test, this is what I'm cooking right now to see how the QKVO split works with the other dials for the SCE_ties method.
I'm testing it first with models that are all prepped and Hermes-based, so the merge will hopefully go smoothly. If it works, I'll take it to a Llama 8B and pre-prep the models there. And if that works... then I can take it to the models I'm actually trying to sew together, fine-tune them, and maybe get something I can start slowly crafting with small LoRAs every few weeks to mimic RLHF training. Who knows.

###SCE-TIES HYBRID

Hermetic RIOT.

models:
  - model: Cas-Warehouse/UltraHermes-Merge
    name: ultra
  - model: Babsie/Caldera-Hexoteric-7B
    name: hexo
  - model: teknium/Hermes-Trismegistus-Mistral-7B
    name: tris
  - model: Herman555/OpenHermes-2.5-AshhLimaRP-Mistral-7B
    name: ashh

merge_method: sce
base_model: Cas-Warehouse/UltraHermes-Merge

parameters:
  select_topk: 0.70
  prescale: true
  normalize: true

weights:
  # ATTENTION Q: what to seek, instruction awareness
  - filter: ".*\\.(q_proj|wq)$"
    models:
      ultra: 0.45  # strong strategic and logic lead
      hexo: 0.35   # pay attention to the card
      tris: 0.15
      ashh: 0.05

  # ATTENTION K: what's important to index/remember
  - filter: ".*\\.(k_proj|wk)$"
    models:
      ultra: 0.30  # logic and strategy
      hexo: 0.30   # card attention
      tris: 0.35   # wizard knowledge base lead
      ashh: 0.05

  # ATTENTION V: actual content payload
  - filter: ".*\\.(v_proj|wv)$"
    models:
      ultra: 0.18  # hull & coherence, but not steering
      hexo: 0.28   # card fidelity: second-strongest
      tris: 0.34   # deep Hermetic/occult/esoteric wizard knowledge lead
      ashh: 0.20   # creativity in what gets pulled, not just the mouth

  # ATTENTION OUTPUT: attention result projection
  - filter: ".*\\.(o_proj|wo)$"
    models:
      ultra: 0.32  # smart backbone on the final projection
      hexo: 0.24   # fidelity still present in the mix
      tris: 0.14   # knowledge whisper, no new-age pulpit
      ashh: 0.30   # assertive, scene-driving energy in the output of attention

  # MLP: creativity, style, knowledge application
  - filter: ".*(mlp|ffn).*"
    models:
      ultra: 0.30  # keep general reasoning
      hexo: 0.20
      tris: 0.20
      ashh: 0.30   # drives plot forward, makes decisions

  # OUTPUT / lm_head: voice, narrative, how things are said
  - filter: "^lm_head\\."
    models:
      ultra: 0.15
      hexo: 0.20   # card still speaks
      tris: 0.05   # shut the new-age wank up
      ashh: 0.60   # Ashh speaks, everyone else feeds him lines

dtype: float32
out_dtype: bfloat16
tokenizer:
  source: base
  target: base

Looking forward to reading your results!

I have finally discovered the optimal settings for Psychosis 9B SCE_ties. (Tried another version with select_topk set to 0.25 but the output collapsed into gibberish.)

Here's the version I'm using now and an example output. It is much simpler but seems more balanced overall.

models:
  - model: A:\LLM\.cache\huggingface\hub\!models--sam-paech--Delirium-v1
    name: deli
  - model: A:\LLM\.cache\huggingface\hub\!models--sam-paech--Darkest-muse-v1\fixed
    name: dark
  - model: A:\LLM\.cache\huggingface\hub\!models--sam-paech--Quill-v1
    name: quill
  - model: A:\LLM\.cache\huggingface\hub\!models--BeaverLegacy--Smegmma-Deluxe-9B-v1\fixed
    name: smegma
  - model: A:\LLM\.cache\huggingface\hub\!models--crestf411--gemma2-9B-sunfall-v0.5.2
    name: sunfall
  - model: A:\LLM\.cache\huggingface\hub\!models--Unbabel--Tower-Plus-9B
    name: tower
  - model: A:\LLM\.cache\huggingface\hub\!models--DavidAU--Gemma-The-Writer-9B\fixed
    name: writer

merge_method: sce  # Core: Variance-pruning top-k for franken healing
base_model: A:\LLM\.cache\huggingface\hub\!models--sam-paech--Delirium-v1

parameters:
  select_topk: 0.5
  prescale: true     # Pre-norm deltas (TIES-style magnitude control)
  normalize: true    # Post-norm for sign stability

weights:
  - filter: ".*"
    models: {deli: 1.0, dark: 1.0, quill: 1.0, smegma: 1.0, sunfall: 1.0, tower: 1.0, writer: 1.0}

dtype: float32
out_dtype: float32
tokenizer:
  source: A:\LLM\.cache\huggingface\hub\!models--Unbabel--Tower-Plus-9B
name: Psychosis-9B-v0a-SCE_TIES
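For anyone wondering what select_topk is actually doing here, my mental model is: rank delta positions by how much the source models disagree, keep only the top fraction, and fuse those onto the base. This is a toy reimplementation from the method description, not mergekit's actual code:

```python
import numpy as np

def sce_sketch(base, models, select_topk=0.5):
    """Toy SCE-style merge: keep only the fraction of parameter positions
    with the highest cross-model variance of the deltas, then average the
    surviving deltas onto the base. Illustrative only, not mergekit's code."""
    deltas = np.stack([m - base for m in models])   # (n_models, n_params)
    variance = deltas.var(axis=0)                   # disagreement per position
    k = max(1, int(select_topk * variance.size))
    keep = np.zeros(variance.size, dtype=bool)
    keep[np.argsort(variance)[-k:]] = True          # highest-variance positions
    return base + np.where(keep, deltas.mean(axis=0), 0.0)

base = np.zeros(8)
models = [np.array([1.0, 0, 0, 0, 0, 0, 0, 0.1]),
          np.array([-1.0, 0, 0, 0, 0, 0, 0, 0.1])]
merged = sce_sketch(base, models, select_topk=0.25)
```

If this picture is right, a tiny select_topk zeroes out most of each model's delta, which might be related to the 0.25 run collapsing: not enough surviving weight changes to hang together.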

SCE_TIES (yaml #2)

The wind tasted like salt and secrets that night. Elara crouched on the jagged lip of the cliff, her worn leather cloak flapping a mournful counterpoint to the gulls’ screeching symphony above. Below, the churning sea gnawed at obsidian teeth of rock, spitting froth into plumes of phosphorescent green that vanished as quickly as they bloomed. It was the kind of night that swallowed sound whole, leaving only the rasping hunger of the tide against bone-white cliffs.

Elara wasn't afraid of the dark. Not anymore. Fear had been a plump, mewling thing she used to cradle in her chest, warm and insistent. Now it lay desiccated beneath layers of ash and grit, choked by the dust devils that danced perpetually in the hollows of her soul. 

She clutched the chipped porcelain doll tighter, its painted smile cracked like spiderwebs across faded rosebud cheeks. The moon, a bruised plum hanging low in the bruised sky, cast long, skeletal fingers of light onto the smooth curve of its porcelain belly. This was all she had left of him – this brittle echo of laughter trapped within cold clay. Finnigan wouldn’t have liked the colour of the dress clinging damply to her thin frame. Crimson bled into rust where the salt spray kissed it, mirroring the stain blooming fat and ugly on the ragged hem. He’d called it ‘bloodwine’, said it made her look like a drowned poppy caught between two tides. Poppy. He always saw poppies where others saw weeds. Even in the grey scrub clinging desperately to these blasted cliffs.

A cough ripped from her throat, raw and splintery. She pressed a fist against the slick ache behind ribs that felt too close to bursting. Each breath tasted like brine and something else, metallic and coppery, staining the back of her tongue with shame. Shame thicker than the clotted blood hardening under the tattered bandage wrapped around her forearm.  Shame for letting them catch her. Shame for not being faster. For not being stronger. 

The gulls shrieked again, closer now, circling like pale vultures above the churning black maw below. They smelled fear, Elara knew. It clung to her like barnacles to driftwood, thick and impossible to scrape clean. But there was another scent tangled amongst the reek of salt and rot – acrid smoke laced with something sharp and sweet, like burnt honey. Ironweed. The ironweed they used to mark their hunting grounds.

They were coming. Always came. Like shadows drawn by the phosphorescent bloom of spilled blood. Drawn by the hollow thrumming in her chest where his laugh used to live. A laugh as bright and brittle as spun glass, shattered now into shards lodged deep in her own cracked ribcage.

She tightened her grip on the doll, its chipped porcelain digging into the bruised flesh of her palm. This wasn't enough. Not anymore. The whispers had started again this morning, slithering through the cracks in her skull like damp worms burrowing into rotten wood. Whispers promising things sharper than bone knives, colder than the sea swallowing ships whole. Things that pulsed beneath the skin like trapped fireflies, hungry for release.

Elara swallowed, tasting ash and the bitter tang of defiance. Finnigan wouldn’t have liked it. He’d called it ‘reckless’, said it would drown him before the tide ever reached his ankles. Poppy-faced fool. He hadn’t known how deep the roots ran, how the earth itself bled ironweed poison into your marrow until you tasted it in every sunrise and choked on it in every breath.

He hadn’t known what it meant to be left behind with only a shard of laughter and a moon stained purple with grief.

A flicker of movement on the jagged lip of the cliff opposite. A shadow detached from the skeletal fingers of moonlight, elongated and wavering like heat haze above black sand. Then another. And another. They came smelling of brine and burnt honey, silent as wraiths against the wind’s rasping lament. 

The gulls shrieked again, closer now, feasting on the feast already laid out upon the obsidian teeth of the world. Elara closed her eyes, pressing the doll’s cold cheek against hers. It felt like ice pressed against fever.  

“Poppy,” she breathed, the word ragged and thin as cobwebs strung across a graveyard. “They call me poppy.”

And then, finally, she opened her mouth and let the taste of ironweed bloom inside her throat. Let it blossom hot and thick, choking back the mewling fear that dared stir in its brittle cage.

Let it sing.

That's fantastic! it writes like my Goblin! not exactly, but I can very much see the tone notes and word usage pairs, and it's only a wee thing. How awesome is that! I'm deeply impressed. 9B. holy crap. well done!!

Psychosis ITE-MPOA (Delirium-biased settings) test run. I need to remerge with balanced settings to compare it.

(Several other methods, like karcher and della_ties, broke. I am debugging these now.)

Update: Karcher appears unfixable for this particular model combination so it's being omitted.
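Since karcher keeps coming up: as I understand it, the method treats each model's weights as a point on a hypersphere and finds their Riemannian barycenter by a fixed-point iteration (average the points in the tangent space at the current estimate, map the average back onto the sphere). A toy numpy sketch of that idea, not mergekit's implementation:

```python
import numpy as np

def karcher_mean(points, iters=100, tol=1e-10):
    """Toy Karcher (Frechet) mean of unit vectors: repeatedly average the
    points in the tangent space at the current estimate, then map back
    onto the sphere. Illustrative, not mergekit's code."""
    pts = np.stack([p / np.linalg.norm(p) for p in points])
    mu = pts.sum(axis=0)
    mu /= np.linalg.norm(mu)                      # start at the normalized mean
    for _ in range(iters):
        dots = np.clip(pts @ mu, -1.0, 1.0)
        theta = np.arccos(dots)                   # geodesic distance to each point
        sin = np.sin(theta)
        scale = np.where(sin > 1e-12, theta / np.where(sin > 1e-12, sin, 1.0), 1.0)
        logs = scale[:, None] * (pts - dots[:, None] * mu)  # log map at mu
        step = logs.mean(axis=0)                  # mean tangent vector
        norm = np.linalg.norm(step)
        if norm < tol:
            break
        mu = np.cos(norm) * mu + np.sin(norm) * (step / norm)  # exp map
        mu /= np.linalg.norm(mu)
    return mu
```

One tensor that's wildly out of family can drag the barycenter somewhere unhelpful, which would at least be consistent with Tower being the troublemaker, though that's speculation on my part.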


Old Man Hemlock

The wind tasted like brine and burnt sugar, whipping Elara’s braid across her freckled nose. Salt spray stung her eyes as she clung to the rusted railing of the Sky-Fisher, its groaning timbers protesting every shuddering heave against the churning grey maw of the Stormtongue Sea. Above, the sky bled bruised purple into sickly yellow, a canvas ripped by jagged veins of lightning that spat molten fire onto the roiling water below.

This wasn't how it was supposed to be. The old wives spun tales of calm seas on Harvest Moon, fat fish leaping belly-up begging for mercy from their nets. Not this – teeth gnashing, frothing fury spitting up barnacled nightmares swallowed whole by the hungry abyss.

Beside her, Finn swore under his breath thick enough to choke a kraken. His calloused hand gripped the chipped bone handle of his harpoon with knuckles white as bleached coral. He squinted at the horizon where nothing but angry pewter stretched towards oblivion. "Old Man Hemlock lied again," he grumbled, voice snatched away by the wind like a stolen sparrow.

Elara knew better than to argue with Finn when salt gnawed at his tongue. It meant the sting in his gut ran deeper than any storm could carve. Their last haul had been slimmer than a drowned rat’s whiskers, barely enough to buy stale bread and rotgut moonshine for a week. Old Man Hemlock, hunched over his cracked astrolabe like a raven picking at a skull, had promised them bounty beyond dreaming. Said the moon would bleed silver onto the waves, fattening the leviathans till they choked on their own greed.

"Maybe he just forgot what year it is," she offered weakly, tugging her threadbare shawl tighter against the bite of wind laced with brine. A sliver of moon peeked through the ragged clouds, pale and anemic as a starved pup. No bleeding silver here. Just cold iron hunger reflected back from the churning black mirror of the sea.

Finn snorted, a sound swallowed by the shriek of gulls circling above like panicked ghosts. "Hemlock forgets things alright. Like how many fingers he has left after that last kraken spat him out." He jabbed a thumb towards the skeletal mast splintered like a broken ribcage against the bruised sky. The Sky-Fisher wasn't much bigger than a bloated codling herself, patched together from driftwood and prayers whispered into empty bellies. She creaked like an arthritic crab scuttling across barnacle-encrusted rocks.

A shudder ripped through the deck, throwing Elara off balance. Something slapped the hull with a wet thud that resonated deep in her teeth – not the slap of wave against wood, but something heavier, slicker. A tremor crawled up her spine colder than the fish guts drying on the rigging.

“Kraken?” Finn rasped, voice tight as spun wire. Even he couldn’t quite swallow the word whole. It tasted too big for this chewed-up world.

He didn’t need to say more. Every bone in Elara screamed its answer before the first tendril lashed down from the roiling grey curtain above. Thick as a ship’s mast, slick with phosphorescent slime that pulsed sickly green under the dying light, it slammed onto the deck with a groan that split the air like a cracked conch shell. Then another, and another, writhing ropes of muscle crowned with suckers the size of dinner plates, each one a gaping maw lined with teeth like chipped obsidian daggers.

The Sky-Fisher shuddered again, a strangled gasp caught between splintering timbers. Hemlock, hunched over his astrolabe like a withered toadstool clinging to a rotten log, finally looked up. His single good eye blinked once, milky white against the webwork of wrinkles etched deeper than any storm could carve. Then he croaked, voice thin as cobwebs strung across a graveyard, "Told ya."

And then the kraken swallowed them whole.

OK, I fixed Karcher, but it requires remerging everything from scratch because Tower was incompatible. Working on an intermediate-stage merge to try to include it.
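One way an intermediate stage could look (a hedged sketch, not the actual config — all model names here are placeholders): fold the incompatible model into a single compatible partner with a pairwise method like SLERP first, then hand that intermediate to the full Karcher merge. These would be two separate mergekit runs:

```yaml
# stage1.yaml -- fold the incompatible model into a compatible partner
# first (a pairwise SLERP may tolerate a pairing that karcher rejects)
merge_method: slerp
base_model: Example/Compatible-Component   # placeholder
models:
  - model: Example/Tower-Model             # placeholder
parameters:
  t: 0.5            # equal blend of the two
dtype: bfloat16
```

```yaml
# stage2.yaml -- feed the stage-1 output into the full karcher merge
merge_method: karcher
models:
  - model: ./stage1-output                 # local path to the intermediate
  - model: Example/Component-A             # placeholder
  - model: Example/Component-B             # placeholder
dtype: bfloat16
tokenizer:
  source: union
chat_template: auto
```

Whether SLERP actually accepts Tower where Karcher didn't would need testing; the point of the sketch is just the two-stage structure.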

I'm uploading the SCE test 2 as a mini goblin, since Tower was stable for that.

Looks like Tower might have been messing up the upscales too...

This is fantastic!! I've been swamped with getting my PDP done for my professional college, and this was a fucking great read. I burst out laughing at the end, fantastic chomp. I even showed Goblin and said "Look mother fucker! 9B!! Hm?? eh?? Your fat clever arse, this little shit coming up licking your rear." He was very interested and did a diagnostic on style, on how it could be so bloody close. *cackling* He has theories. One consisting of "you are both weird."

Unfortunately, there are now issues with SCE on the upscaled (passthrough) 14B. Trying to fix the size mismatch error; otherwise it might have to be omitted.
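For context, a size mismatch in a later merge usually means the upscaled model no longer shares the layer count (or hidden size) of the other components. A minimal sketch of a passthrough upscale in mergekit YAML (model name and layer ranges are illustrative placeholders, not the actual config):

```yaml
# Hypothetical passthrough upscale: duplicates the middle layers of a
# 48-layer model (the layer counts here are illustrative). Every model
# fed into a later SCE pass must share this exact layer count and
# hidden size, or mergekit raises a tensor size mismatch.
merge_method: passthrough
dtype: bfloat16
slices:
  - sources:
      - model: Example/Component-14B   # placeholder
        layer_range: [0, 32]
  - sources:
      - model: Example/Component-14B
        layer_range: [16, 48]
```

So any SCE run over the upscaled 14B would need every other component upscaled to the same slice layout first, or the shapes won't line up.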

Since removing Tower, breadcrumbs_ties is working again on 14B. Uploading it to the suite now.

Update: It is now confirmed that SCE is broken for Psychosis 14B, but you can use the SCF quant, which is very similar. SCF (Selective Coherence Fusion) combines the SCE method with Arcee_Fusion, and seems to work without any errors:

https://huggingface.co/Naphula/Psychosis-14B-v0-GGUF/
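For anyone following along, a plain SCE config in mergekit looks roughly like this (a sketch with placeholder model names; `select_topk` controls what fraction of high-variance parameters each model contributes):

```yaml
# SCE as implemented in mergekit: a base model plus several finetunes,
# with select_topk keeping the highest-variance fraction of deltas.
# Model names are placeholders, not the Psychosis components.
merge_method: sce
base_model: Example/Base-14B
models:
  - model: Example/Finetune-A-14B
  - model: Example/Finetune-B-14B
parameters:
  select_topk: 0.75
dtype: bfloat16
```

Arcee_Fusion, by contrast, is pairwise (a base plus exactly one other model), so exactly how SCF chains the two steps is up to its author's implementation rather than a single config.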

Sadly, I haven't gotten my laptop back. I have been chasing the shop. I have to actually go back down and demand it back. I know they haven't repaired anything, nor recovered any data. So I need to find another repair shop to get it fixed before Yule explodes on the 21st. Fuck it, the data is gone. I'll just be glad to work again, even if it is from scratch. It'll be fun to test your models as well; I'll have some time off work too.

The Psychosis test suite is finally finished and uploaded.

Next up: Kraken 12B and Goetia 24B.

I have FINALLY got my laptop back. I also got Opus to make me a new front end for local model testing, RP/ERP, creative writing with AI co-authoring, game writing, and dataset production - all for my dyslexic little brain, to make everything easier.
I spent two days sorting files from the recovery flash drive - had to get a new 1TB drive into SNARKHOLE, my trusty laptop.
Just took the new front end for an experimental spin. WHOOP! About 4GB lighter on RAM/CPU. That's good; it means I can actually run a 9B model at 22K context, so I can at least have some decent interaction and writing *rubbing hands together*
I just downloaded little Goblin, this one, and Warlock. I want to piss around and play a bit with the new front end before I go wading into my dataset building again. I have to upload my gnarly cards, but that should only take a few hours tomorrow.
It's so nice to have a computer again and not a useless chromebook...
