Boris Knyazev
  • MetaMerge: Model Merging with Meta Networks

    Merging ViTs and LLMs using a pretrained (graph) neural net.

    15 min read   ·   February 19, 2026

    2026   ·   nino   merging   gnn   neural-graphs   empirical-study   research

  • REAM: Compressing Mixture-of-Experts LLMs

    Merging experts in Mixture-of-Experts (MoE) LLMs to compress a 235B LLM.

    35 min read   ·   January 15, 2026

    2026   ·   llm   merging   mixture-of-experts   moe   compression   empirical-study   research

  • NiNo: Learning to Accelerate Training of Neural Networks

    Explaining our ICLR 2025 paper and visualizing neuron permutation symmetry.

    15 min read   ·   September 30, 2025

    2025   ·   nino   gnn   neural-graphs   llms   transformers   optimization   paper   visualization

  • Can we do better than Convolutional Neural Networks?

    11 min read   ·   September 30, 2019   ·   medium.com

    2019   ·   medium

  • Spectral Graph Convolution Explained and Implemented Step By Step

    16 min read   ·   August 15, 2019   ·   medium.com

    2019   ·   medium
