Boris Knyazev
Tags: llm · merging · moe · empirical-study · research · medium

MetaMerge: Model Merging with Meta Networks

Merging ViTs and LLMs using a pretrained (graph) neural net.

15 min read   ·   2026

REAM: Compressing Mixture-of-Experts LLMs

Merging experts in Mixture-of-Experts (MoE) LLMs to compress a 235B LLM.

35 min read   ·   2026

NiNo: Learning to Accelerate Training of Neural Networks

Explaining our ICLR 2025 paper and visualizing neuron permutation symmetry.

15 min read   ·   2025


  • Anisotropic, Dynamic, Spectral and Multiscale Filters Defined on Graphs

    22 min read   ·   August 12, 2019   ·   medium.com


  • Tutorial on Graph Neural Networks for Computer Vision and Beyond (Part 1)

    24 min read   ·   August 04, 2019   ·   medium.com

