Boris Knyazev
blog
  • REAM: Compressing Mixture-of-Experts LLMs

    Merging experts in Mixture-of-Experts (MoE) LLMs to compress a 235B LLM.

    34 min read   ·   January 15, 2026

    2026   ·   llm · merging · mixture-of-experts · moe · compression · empirical-study · research

  • Training LLMs Faster by Learning Neuron Interaction

    6 min read   ·   September 30, 2025   ·   medium.com

    2025   ·   medium

  • Can we do better than Convolutional Neural Networks?

    11 min read   ·   September 30, 2019   ·   medium.com

    2019   ·   medium

  • Spectral Graph Convolution Explained and Implemented Step By Step

    16 min read   ·   August 15, 2019   ·   medium.com

    2019   ·   medium

  • Anisotropic, Dynamic, Spectral and Multiscale Filters Defined on Graphs

    22 min read   ·   August 12, 2019   ·   medium.com

    2019   ·   medium
