Published July 18, 2025 | Version v1
Preprint | Restricted

L-Shape Embedding Architecture: Efficient Multi-Relational Representations

Description

A geometric packing approach to knowledge representation that reduces embedding size by 60% while enhancing relationship modeling. By decomposing embeddings into orthogonal subspaces with specialized properties, L-Shape preserves hierarchical, similarity, and cross-domain relationships more faithfully in a fraction of the space.
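
As a rough illustration of the storage claim, a minimal sketch: the baseline dimension and subspace sizes below are assumptions chosen to reproduce an approximate 60% reduction, not figures from the paper.

FULL_DIM = 1024        # baseline flat embedding size (assumed)
VERTICAL_DIM = 128     # hierarchy-oriented subspace size (assumed)
HORIZONTAL_DIM = 282   # similarity-oriented subspace size (assumed)

packed_dim = VERTICAL_DIM + HORIZONTAL_DIM
reduction = 1 - packed_dim / FULL_DIM
print(f"stored dims per item: {packed_dim} vs {FULL_DIM}, reduction: {reduction:.0%}")  # ~60%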

The architecture leverages the insight that different relationship types have natural geometric affinities: hierarchical relationships align with vertical dimensions, while similarity relationships distribute horizontally. This orthogonal decomposition sharpens the contrast between relationship types, enabling more precise semantic reasoning.
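
The sketch below shows one way such a decomposition could be used to score different relationship types from different subspaces. The class name, the random projections, and the scoring rules are illustrative assumptions based only on the description above, not the paper's implementation.

import numpy as np

class LShapeEmbedding:
    def __init__(self, full_dim=1024, vert_dim=128, horiz_dim=282, seed=0):
        rng = np.random.default_rng(seed)
        # Random projections stand in for learned, purpose-specific maps.
        self.vertical = rng.standard_normal((full_dim, vert_dim)) / np.sqrt(full_dim)
        self.horizontal = rng.standard_normal((full_dim, horiz_dim)) / np.sqrt(full_dim)

    def encode(self, x):
        # Pack one flat vector into two specialized subvectors.
        return x @ self.vertical, x @ self.horizontal

    def hierarchy_score(self, parent, child):
        # Hierarchical relations are read from the "vertical" subspace only.
        pv, _ = self.encode(parent)
        cv, _ = self.encode(child)
        return float(pv @ cv)

    def similarity_score(self, a, b):
        # Similarity relations are read from the "horizontal" subspace only.
        _, ah = self.encode(a)
        _, bh = self.encode(b)
        return float(ah @ bh / (np.linalg.norm(ah) * np.linalg.norm(bh) + 1e-9))

# Usage: compare two random "concept" vectors under each relationship type.
model = LShapeEmbedding()
x, y = np.random.default_rng(1).standard_normal((2, 1024))
print(model.hierarchy_score(x, y), model.similarity_score(x, y))

Keeping the two scores in separate subspaces is what gives the improved discrimination between relationship types described above: a strong hierarchical signal cannot leak into, or be diluted by, the similarity signal.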

Key achievements:
- 60% reduction in embedding size
- Enhanced relationship type discrimination
- Improved cross-domain transfer
- Principled geometric decomposition

This work challenges the assumption that all relationships must share the same geometric space, demonstrating that specialized geometric structures lead to both efficiency and quality improvements.

Repository: https://github.com/jamestexas/papers

Files

Restricted

The record is publicly accessible, but files are restricted to users with access.