Hyperdimensional Connection Method: A Lossless Framework Preserving Meaning, Structure, and Semantic Relationships across Modalities (a MatrixTransformer subsidiary)
Contemporary AI systems often prioritize performance over structure, favouring probabilistic output over interpretability. In contrast, this work proposes a **lossless, structure-preserving approach** to connection discovery in high-dimensional data. Rather than relying on stochastic models, the framework explicitly encodes and normalizes matrix relationships using deterministic projections (e.g., Frobenius norm-based hypersphere alignment), ensuring full retention of semantic and structural properties.
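As a minimal, standalone sketch of the Frobenius norm-based hypersphere alignment mentioned above (this is illustrative code, not the library's API; `project_to_hypersphere` is a hypothetical helper): a matrix is rescaled so its Frobenius norm lies on a hypersphere of a chosen radius, and the scale factor is kept so the step is exactly reversible.

```python
import numpy as np

def project_to_hypersphere(M, radius=1.0):
    """Scale a matrix so its Frobenius norm equals `radius`.

    Returns the projected matrix and the scale factor needed
    to invert the projection exactly.
    """
    norm = np.linalg.norm(M, 'fro')
    if norm == 0:
        return M.copy(), 0.0          # zero matrix stays at the origin
    scale = norm / radius
    return M / scale, scale

M = np.random.randn(6, 6)
P, scale = project_to_hypersphere(M)
print(np.linalg.norm(P, 'fro'))       # ~1.0: M now lies on the unit hypersphere
print(np.allclose(P * scale, M))      # True: the projection is lossless
```

Because only a single scalar is factored out, all relative structure within the matrix is untouched, which is what makes the normalization reversible.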
This project is not about simulating intelligence through prediction; it’s about engineering intelligence through grounded, reversible transformations.
“Not an AI to fear, but one you define.”
The goal is to build failure-resistant, explainable, and human-aligned systems, ones that do not exceed their scope but evolve meaningfully within it. Applications span high-dimensional matrix space navigation, semantic clustering, reversible embeddings, and knowledge graph formation across domains.
All code, methods, and test cases are made public for full transparency and reproducibility.
This repository contains the complete implementation and experimental validation of a revolutionary dimensionality reduction method that achieves perfect reconstruction (1.000 accuracy) while discovering semantic patterns invisible to traditional approaches. Built upon the MatrixTransformer framework's 16-dimensional decision hypercube, this method represents a paradigm shift from lossy compression to lossless, interpretable, and universally applicable feature extraction.
Key Features
- Perfect Information Preservation: Zero reconstruction error across all domains (biological, textual, visual) vs. 0.1% loss in traditional methods
- Cross-Modal Pattern Discovery: Unique ability to identify relationships across different feature representation types (3,015 connections in MNIST vs. 0 for traditional methods)
- Semantic Coherence Quantification: Achieves 94.7% semantic coherence in text analysis with queryable connection structures
- Domain-Agnostic Performance: Consistent advantages across 784-dimensional visual data, high-dimensional biological matrices, and multi-modal text representations
- 100% Sparsity Preservation: Maintains complete matrix sparsity while traditional dense methods achieve 0%
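To make the sparsity-preservation contrast concrete, here is a standalone sketch (independent of the library) of why dense lossy methods destroy sparsity: a truncated SVD stands in for a "traditional dense method", and `sparsity` is a hypothetical helper measuring the fraction of zero entries.

```python
import numpy as np

def sparsity(M, tol=1e-12):
    """Fraction of entries that are numerically zero."""
    return float(np.mean(np.abs(M) < tol))

# A sparse test matrix: roughly 90% of entries are exactly zero
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50)) * (rng.random((50, 50)) < 0.1)
print(f"original sparsity: {sparsity(M):.2f}")

# Dense low-rank reconstruction (truncated SVD) fills in the zeros
U, s, Vt = np.linalg.svd(M)
k = 10
M_lossy = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(f"after rank-{k} SVD reconstruction: {sparsity(M_lossy):.2f}")
```

The reconstructed matrix has generically nonzero entries everywhere, so any downstream analysis that relied on the zero pattern (e.g., absent interactions in a biological network) loses that information.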
Experimental Validation
Comprehensive benchmarking across three diverse domains:
- Biological Data: Drug-gene interaction networks preserving clinically relevant patterns (NFE2L2, AR, CYP3A4)
- Textual Data: NewsGroups dataset with 23 cross-matrix links enabling multi-modal semantic analysis
- Visual Data: MNIST digit recognition with cross-digit relationship discovery and geometric pattern analysis
Technical Innovation
- Hyperdimensional Connection Discovery: Identifies meaningful relationships in 8-dimensional hyperdimensional space
- Hypersphere Projection: Constrains matrices to hypersphere surfaces while preserving structural properties
- Bidirectional Matrix Conversion: Enables lossless round-trip transformation between connection and matrix representations
- Query-Ready Architecture: Supports unlimited post-hoc analysis including similarity searches, anomaly detection, and relationship discovery
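The idea behind bidirectional matrix conversion can be shown with a toy, self-contained sketch (the library's actual encoding also carries coordinates and metadata; the structure below is a simplified stand-in): connections are packed into a strength matrix and then recovered exactly.

```python
import numpy as np

# Toy connection structure: source index -> list of (target, strength)
connections = {0: [(1, 0.9), (2, 0.4)], 1: [(2, 0.7)], 2: []}
n = 3

# Forward: pack connections into a (sparse) strength matrix
A = np.zeros((n, n))
for src, targets in connections.items():
    for tgt, w in targets:
        A[src, tgt] = w

# Inverse: recover the exact connection structure from the matrix
recovered = {i: [(j, A[i, j]) for j in range(n) if A[i, j] != 0.0]
             for i in range(n)}
print(recovered == connections)   # True: the round trip is exact
```

Because the matrix form is an exact re-encoding rather than a compression, queries (similarity search, anomaly detection) can run on either representation without information loss.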
Applications
- Bioinformatics: Drug discovery with preserved biological network structure
- Natural Language Processing: Multi-modal text analysis with cross-representation relationship discovery
- Computer Vision: Visual pattern analysis with cross-pattern relationship discovery
- Financial Analysis: Anomaly detection preserving sparse transaction patterns
- Scientific Computing: Simulation embeddings maintaining physical constraints
Repository Contents
- Complete MatrixTransformer implementation with hyperdimensional extensions
- Experimental benchmarking code and datasets
- Comprehensive visualizations and analysis tools
- Domain-specific applications and examples
- Full reproducibility documentation
This work establishes a new standard for analytical methods that refuse to sacrifice information for computational convenience, opening new possibilities for scientific discovery where perfect information preservation enables insights impossible with traditional lossy approaches.
```python
from matrixtransformer import MatrixTransformer
import numpy as np

# Initialize the transformer
transformer = MatrixTransformer(dimensions=256)

# Add some sample matrices to the transformer's storage
sample_matrices = [
    np.random.randn(28, 28),       # Image-like matrix
    np.eye(10),                    # Identity matrix
    np.random.randn(15, 15),       # Random square matrix
    np.random.randn(20, 30),       # Rectangular matrix
    np.diag(np.random.randn(12)),  # Diagonal matrix
]

# Store matrices in the transformer
transformer.matrices = sample_matrices

# Optional: add some metadata about the matrices
transformer.layer_info = [
    {'type': 'image', 'source': 'synthetic'},
    {'type': 'identity', 'source': 'standard'},
    {'type': 'random', 'source': 'synthetic'},
    {'type': 'rectangular', 'source': 'synthetic'},
    {'type': 'diagonal', 'source': 'synthetic'},
]

# Find hyperdimensional connections
print("Finding hyperdimensional connections...")
connections = transformer.find_hyperdimensional_connections(num_dims=8)

# Access stored matrices
print("\nAccessing stored matrices:")
print(f"Number of matrices stored: {len(transformer.matrices)}")
for i, matrix in enumerate(transformer.matrices):
    print(f"Matrix {i}: shape {matrix.shape}, "
          f"type: {transformer._detect_matrix_type(matrix)}")

# Convert connections to matrix representation
print("\nConverting connections to matrix format...")
coords3d = []
for i, matrix in enumerate(transformer.matrices):
    coords = transformer._generate_matrix_coordinates(matrix, i)
    coords3d.append(coords)
coords3d = np.array(coords3d)
indices = list(range(len(transformer.matrices)))

# Create connection matrix with metadata
conn_matrix, metadata = transformer.connections_to_matrix(
    connections, coords3d, indices, matrix_type='general'
)
print(f"Connection matrix shape: {conn_matrix.shape}")
print(f"Matrix sparsity: {metadata.get('matrix_sparsity', 'N/A')}")
print(f"Total connections found: {metadata.get('connection_count', 'N/A')}")

# Reconstruct connections from matrix
print("\nReconstructing connections from matrix...")
reconstructed_connections = transformer.matrix_to_connections(conn_matrix, metadata)

# Compare original vs. reconstructed
print(f"Original connections: {len(connections)} matrices")
print(f"Reconstructed connections: {len(reconstructed_connections)} matrices")

# Access a specific matrix and its connections
matrix_idx = 0
if matrix_idx in connections:
    print(f"\nMatrix {matrix_idx} connections:")
    print(f"Original matrix shape: {transformer.matrices[matrix_idx].shape}")
    print(f"Number of connections: {len(connections[matrix_idx])}")

    # Show the first few connections
    for conn in connections[matrix_idx][:3]:
        target_idx = conn['target_idx']
        strength = conn.get('strength', 'N/A')
        print(f"  -> Connected to matrix {target_idx} "
              f"(shape: {transformer.matrices[target_idx].shape}) "
              f"with strength: {strength}")

# Example: process a specific matrix through the transformer
print("\nProcessing a matrix through the transformer:")
test_matrix = transformer.matrices[0]
matrix_type = transformer._detect_matrix_type(test_matrix)
print(f"Detected matrix type: {matrix_type}")

# Transform the matrix
transformed = transformer.process_rectangular_matrix(test_matrix, matrix_type)
print(f"Transformed matrix shape: {transformed.shape}")
```
This snippet shows the complete workflow from storing matrices to finding their hyperdimensional relationships and accessing the results.
Citation
GitHub: https://github.com/fikayoAy/matrixTransformer
Ayodele, F. (2025). MatrixTransformer. Zenodo. https://doi.org/10.5281/zenodo.15867279
Files
- Hyperdimensional_Connection_Method__Experimental_Evaluation.pdf (16.3 MB, md5: 8029e90c14ef335604856373fd456eb4)
Additional details
Related works
- Is part of: Preprint, DOI 10.5281/zenodo.15867279
Software
- Repository URL: https://github.com/fikayoAy/MatrixTransformer
- Development Status: Active