Pseudo-Invertible Neural Networks

Yamit Ehrlich, Nimrod Berman, Assaf Shocher
Technion    Ben-Gurion University
SPNN block architecture diagram

SPNN architecture. A surjective coupling block with an explicit pseudo-inverse $g^\dagger$ implemented via an auxiliary network $r$.

Abstract

The Moore-Penrose pseudo-inverse (PInv) is the fundamental solution for linear systems. In this paper, we propose a natural generalization of the PInv to the nonlinear regime in general and to neural networks in particular. We introduce Surjective Pseudo-invertible Neural Networks (SPNN), a class of architectures explicitly designed to admit a tractable nonlinear PInv. The proposed nonlinear PInv and its implementation in SPNN satisfy fundamental geometric properties. One such property is null-space projection, or "back-projection", $x' = x + A^\dagger (y - Ax)$, which moves a sample $x$ to its closest consistent state $x'$ satisfying $Ax' = y$. We formalize Non-Linear Back-Projection (NLBP), a method that guarantees the same consistency constraint for nonlinear mappings $f(x) = y$ via our defined PInv. We leverage SPNNs to expand the scope of zero-shot inverse problems: diffusion-based null-space projection has revolutionized zero-shot solving of linear inverse problems by exploiting closed-form back-projection, and we extend this approach to nonlinear degradations. Here, "degradation" is broadly generalized to any nonlinear loss of information, spanning from optical distortions to semantic abstractions such as classification. This enables zero-shot inversion of complex degradations and precise semantic control over generative outputs without retraining the diffusion prior.
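The linear back-projection identity quoted in the abstract can be verified directly with NumPy. The sketch below (my own illustration, not from the paper's code) constructs a random surjective $A$, applies $x' = x + A^\dagger(y - Ax)$, and checks the two properties claimed: the result satisfies $Ax' = y$, and the correction lies entirely in the row space of $A$, which is what makes $x'$ the closest consistent point to $x$.

```python
import numpy as np

# Linear back-projection: x' = x + A^+(y - A x) moves x to the closest
# point (in Euclidean norm) satisfying A x' = y.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 8))   # surjective linear "degradation" (3 < 8)
x = rng.standard_normal(8)        # arbitrary starting sample
y = rng.standard_normal(3)        # target measurement

A_pinv = np.linalg.pinv(A)        # Moore-Penrose pseudo-inverse
x_proj = x + A_pinv @ (y - A @ x)

# x' is consistent with the measurement:
assert np.allclose(A @ x_proj, y)

# The correction is orthogonal to null(A) (it has no component under the
# null-space projector I - A^+ A), so x' is the nearest consistent point:
null_proj = np.eye(8) - A_pinv @ A
assert np.allclose(null_proj @ (x_proj - x), 0)
```

Since the random $3\times 8$ matrix has full row rank with probability one, $AA^\dagger = I$ and consistency holds exactly.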

Key Results

Reconstruction from Semantics

Reconstruction from semantics (CelebA-HQ)

Using only a low-dimensional semantic measurement (e.g., 40 attribute logits), NLBP-guided diffusion reconstructs photorealistic faces that match the target semantics without retraining.
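The paper's NLBP-guided sampler is not reproduced here; as a hedged sketch, one plausible form of a guidance step replaces $A, A^\dagger$ in the linear back-projection formula with the nonlinear degradation $f$ and its SPNN pseudo-inverse $f^\dagger$. The function name `nlbp_step` and the `scale` parameter are illustrative assumptions, not the authors' API.

```python
import numpy as np

def nlbp_step(x0_hat, y, f, f_pinv, scale=1.0):
    """One hypothetical NLBP guidance step (a sketch, not the authors' code).

    Nonlinear analogue of x' = x + A^+ (y - A x): pull the current denoised
    estimate toward consistency with the semantic measurement y. Here `f`
    is the nonlinear degradation (e.g. an attribute classifier) and
    `f_pinv` its SPNN pseudo-inverse; scale < 1 gives gentler guidance.
    """
    return x0_hat + scale * (f_pinv(y) - f_pinv(f(x0_hat)))

# Sanity check: with a linear f and its Moore-Penrose pseudo-inverse,
# a full-strength step reduces to exact linear back-projection.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 8))
x, y = rng.standard_normal(8), rng.standard_normal(3)
x_proj = nlbp_step(x, y, lambda v: A @ v, lambda v: np.linalg.pinv(A) @ v)
assert np.allclose(A @ x_proj, y)
```

For a genuinely nonlinear $f$, exact consistency after one step is a property of the paper's specific PInv construction, not of this generic update rule.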

Attribute-Controlled Generation

Single-attribute editing via NLBP

We enforce a single attribute (e.g., “Eyeglasses”) by modifying one coordinate in the semantic space and applying gentle NLBP guidance, while leaving all other attributes free for the diffusion prior to hallucinate.

Multi-Attribute Conditional Generation

Multi-attribute conditional generation via NLBP

We simultaneously enforce multiple semantic constraints (e.g., Male + Smiling + Eyeglasses) by fixing several coordinates in the target logit vector and applying NLBP guidance during sampling.
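The multi-attribute setup above can be sketched as plain array manipulation: pin a few coordinates of a 40-dimensional target logit vector and record which coordinates are constrained, so guidance acts only on those. The attribute indices below follow the standard CelebA ordering, but the concrete values and the mask convention are my assumptions, not the paper's exact setup.

```python
import numpy as np

NUM_ATTRS = 40                        # CelebA's 40 binary attributes
current_logits = np.zeros(NUM_ATTRS)  # placeholder for f(x) of the sample

# Attributes to enforce, by CelebA index (hypothetical logit magnitudes):
enforced = {15: +5.0,   # "Eyeglasses"
            20: +5.0,   # "Male"
            31: +5.0}   # "Smiling"

target = current_logits.copy()
mask = np.zeros(NUM_ATTRS, dtype=bool)
for idx, logit in enforced.items():
    target[idx] = logit
    mask[idx] = True

# During sampling, guidance would only constrain the masked coordinates,
# leaving the remaining 37 attributes free for the diffusion prior.
```

The same construction with a single entry in `enforced` covers the single-attribute editing case described earlier.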

Citation

@misc{spnn2026pseudo_invertible,
  title        = {Pseudo-Invertible Neural Networks},
  author       = {Yamit Ehrlich and Nimrod Berman and Assaf Shocher},
  year         = {2026},
  note         = {Preprint, under review},
  howpublished = {\url{https://github.com/yamitehr/SPNN}},
  url          = {https://github.com/yamitehr/SPNN}
}

Acknowledgements

A.S. is supported by the Chaya Career Advancement Chair.