install
source · Clone the upstream repo
git clone https://github.com/plurigrid/asi
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/plurigrid/asi "$T" && mkdir -p ~/.claude/skills && cp -r "$T/ies/music-topos/.codex/skills/feedforward-learning-local" ~/.claude/skills/plurigrid-asi-feedforward-learning-local && rm -rf "$T"
manifest:
ies/music-topos/.codex/skills/feedforward-learning-local/SKILL.md
source content
Feedforward Learning Local
Category: Phase 3 Core - Alternative Learning Paradigms
Status: Skeleton Implementation
Dependencies: None (standalone learning framework)
Overview
Implements the forward-forward (FF) learning algorithm and variants that replace backpropagation with local, layer-wise contrastive objectives: each layer independently learns to distinguish positive data from negative data.
Capabilities
- Forward-Forward Algorithm: Hinton's layer-local learning
- Contrastive Objectives: Positive/negative data discrimination
- No Backprop: no backward pass; each layer's weight updates use only locally available activations
- Statistical Communication: Inter-layer coordination via activity statistics
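The layer-local contrastive objective can be sketched as follows. The `FFLayer` struct, `goodness`, and `local_loss` names here are illustrative assumptions, not this package's API; the goodness-as-sum-of-squared-activations form and the logistic loss against a threshold follow Hinton (2022).

```julia
# Minimal sketch of one FF layer's local objective (illustrative names,
# not this package's API).
struct FFLayer
    W::Matrix{Float64}
    b::Vector{Float64}
    threshold::Float64
end

relu(x) = max.(x, 0.0)

# Forward pass: layer activations for one input vector.
forward(l::FFLayer, x::Vector{Float64}) = relu(l.W * x .+ l.b)

# Goodness = sum of squared activations (Hinton 2022).
goodness(l::FFLayer, x) = sum(abs2, forward(l, x))

# Layer-local loss: push goodness above the threshold for positive data
# and below it for negative data. σ is the logistic function.
σ(z) = 1 / (1 + exp(-z))
local_loss(l::FFLayer, x_pos, x_neg) =
    -log(σ(goodness(l, x_pos) - l.threshold)) -
     log(σ(l.threshold - goodness(l, x_neg)))
```

Because the loss depends only on this layer's own activations, its gradient can be computed without any signal propagating back from later layers.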
Core Components
- FF Layer (ff_layer.jl)
  - Local goodness function per layer
  - Positive/negative data generation
  - Layer-wise gradient updates
- Contrastive Learning (contrastive_learning.jl)
  - Contrastive divergence variants
  - Energy-based formulations
  - Hybrid supervised/unsupervised objectives
- Statistical Coordination (statistical_coordination.jl)
  - Activity normalization between layers
  - Whitening and decorrelation
  - Predictive coding integration
- FF Network (ff_network.jl)
  - Multi-layer FF architecture
  - Inference and training loops
  - Comparison with backprop baselines
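One form of inter-layer statistical coordination is length normalization: each layer's activity vector is normalized to unit length before feeding the next layer, so only the pattern of activity (not its magnitude, which encodes goodness) is passed on, as in Hinton (2022). A minimal sketch, with an assumed function name:

```julia
using LinearAlgebra

# Normalize a layer's activity vector to unit length before it feeds the
# next layer, so downstream layers see the activity pattern but not the
# goodness-carrying magnitude. Illustrative sketch, not this package's API.
function normalize_activity(h::Vector{Float64}; eps::Float64=1e-8)
    n = norm(h)
    return n < eps ? h : h ./ n   # leave (near-)zero vectors untouched
end
```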
Integration Points
- Input from: Raw data (no dependencies on other skills)
- Output to: emergent-role-assignment (decentralized learning signals)
- Coordinates with: categorical-composition (compositional learning)
Usage
```julia
using FeedforwardLearningLocal

# Create FF network
network = FFNetwork([
    FFLayer(input_dim=784, hidden_dim=500, threshold=2.0),
    FFLayer(input_dim=500, hidden_dim=500, threshold=2.0),
    FFLayer(input_dim=500, hidden_dim=10, threshold=1.0)
])

# Train on MNIST
for (x_pos, y) in train_data
    # Generate negative data by corrupting the label
    x_neg = overlay_wrong_label(x_pos, y)
    # Local learning at each layer
    train_step!(network, x_pos, x_neg)
end

# Inference
predictions = predict(network, test_data)
```
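Since the package is a skeleton, the negative-data generator used above is not yet specified; a minimal sketch of `overlay_wrong_label` under Hinton's supervised-FF assumption (a one-hot label overlaid on the first 10 components of a flattened MNIST image, labels 0-based) could look like:

```julia
# Overlay a one-hot label onto the first `nclasses` input components
# (Hinton's supervised FF recipe). Illustrative sketch, not this
# package's API; labels are assumed to be 0-based.
function overlay_label(x::Vector{Float64}, label::Int; nclasses::Int=10)
    x2 = copy(x)
    x2[1:nclasses] .= 0.0
    x2[label + 1] = 1.0
    return x2
end

# Negative sample: overlay a randomly chosen *wrong* label, so the layer
# must learn that image + mismatched label has low goodness.
function overlay_wrong_label(x::Vector{Float64}, label::Int; nclasses::Int=10)
    wrong = rand(setdiff(0:nclasses-1, label))
    return overlay_label(x, wrong; nclasses)
end
```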
References
- Hinton, "The Forward-Forward Algorithm: Some Preliminary Investigations" (arXiv, 2022)
- LeCun et al., "A Tutorial on Energy-Based Learning" (2006)
- Nøkland & Eidnes, "Training Neural Networks with Local Error Signals" (ICML 2019)
Implementation Status
- Basic FF layer implementation
- Positive/negative data generation
- Multiple variants (supervised, unsupervised)
- Benchmark against backprop
- Integration with predictive coding