Void Dynamics Model (VDM)

A real-time, event-driven, reaction–diffusion substrate with void walker ecology & memory steering.
No dense scans. Zero training. Divergent reasoning. Constraint satisfaction in the moment.

DOI · ORCID · AMD/ROCm only · Dual Academic/Commercial License



🔭 What I’m building

  • VDM — reaction–diffusion field intelligence + walker ecology + scoreboard/GDSP gating
  • Memory Steering — dynamic knowledge graph with event-driven updates
  • Real-time control — swap massive pretraining for fast constraint satisfaction

Reproducibility: baselines + QA artifacts are archived on Zenodo. Code lives in public GitHub repos with tests and docs.
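
For intuition on the reaction–diffusion framing, here is a generic 1-D Fisher-KPP update step. This is a textbook illustration under assumed parameters, not the VDM field equations.

# Generic 1-D Fisher-KPP step (explicit Euler) on a periodic grid.
# Purely illustrative of reaction-diffusion dynamics; not the VDM substrate.
import numpy as np

def rd_step(u, D=1.0, r=1.0, dx=0.1, dt=0.001):
    """One explicit-Euler update of du/dt = D * u_xx + r * u * (1 - u)."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return u + dt * (D * lap + r * u * (1.0 - u))

# A localized seed grows into traveling fronts; for Fisher-KPP the pulled-front
# speed is c = 2 * sqrt(D * r), the kind of analytic target a front-speed gate
# can be checked against.
u = np.zeros(512)
u[250:262] = 1.0
for _ in range(2000):
    u = rd_step(u)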


🧪 Reproducible records (Zenodo)

  • Baselines + QA artifacts: https://zenodo.org/records/17220869

⚙️ Quickstart

# clone
git clone https://github.com/Neuroca-Inc/Prometheus_Void-Dynamics_Model.git
cd Prometheus_Void-Dynamics_Model

# create env (exact commands in repo README)
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt

# run the reaction–diffusion + walkers + Memory Steering demo (coming soon)

# in the meantime, check out the src/ folder for the available public sims

🗺️ Roadmap (Oct–Dec 2025)

  • Notebook mirrors of each test ✅
  • 100% reproducible, falsifiable claims ✅
  • Working real-time model ✅
  • Zero training ✅
  • Online causality-based learning
  • Physics-driven intelligence model ✅

CURRENT Execution model: all tests live in src/{domain name}/. Passing runs write figures to figures/{domain name}/ and failing runs to figures/{domain name}/failed_runs/; logs follow the same layout with logs/ in place of figures/.
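
A minimal sketch of that layout convention in Python; the helper name and signature are illustrative, not part of the repo.

# Sketch of the CURRENT output-layout convention described above.
# The helper name and arguments are hypothetical, not repo API.
from pathlib import Path

def run_output_dirs(domain: str, passed: bool, root: str = ".") -> dict:
    """Resolve where figures and logs for one test run should land."""
    figures = Path(root) / "figures" / domain
    logs = Path(root) / "logs" / domain
    if not passed:
        figures = figures / "failed_runs"
        logs = logs / "failed_runs"
    return {"figures": figures, "logs": logs}

# Example: a failing reaction-diffusion run lands in figures/rd/failed_runs/ and logs/rd/failed_runs/.
dirs = run_output_dirs("rd", passed=False)
for d in dirs.values():
    d.mkdir(parents=True, exist_ok=True)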

GOAL Execution model: All tests live in notebooks/, render figures + filepaths inline, and write results under artifacts/. A master notebook runs the full suite with clear explanations and emits a manifest.

# Run-all (executes the full suite and saves executed copy)
jupyter nbconvert --to notebook --execute notebooks/00_RUN_ALL.ipynb --output notebooks/00_RUN_ALL.out.ipynb
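
For scripting, an equivalent run-all driver could shell out to nbconvert once per notebook; a minimal sketch, assuming the planned notebooks/ layout below (the driver itself is illustrative, not repo code).

# Minimal run-all driver sketch; notebook paths follow the planned layout,
# everything else here is illustrative rather than repo API.
import subprocess
from pathlib import Path

notebooks = sorted(p for p in Path("notebooks").glob("*.ipynb")
                   if not p.name.endswith(".out.ipynb"))
for nb in notebooks:
    subprocess.run(
        ["jupyter", "nbconvert", "--to", "notebook", "--execute", str(nb),
         "--output", nb.stem + ".out.ipynb", "--output-dir", "notebooks"],
        check=True,  # stop at the first notebook that errors
    )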

Future Layout

notebooks/
  00_RUN_ALL.ipynb
  01_rd_front_speed.ipynb
  02_rd_dispersion.ipynb
  03_invariant_drift.ipynb
  10_walkers_min_demo.ipynb
  11_locality_bounds.ipynb
  12_gdsp_budget_sweeps.ipynb
  20_memsteer_acceptance.ipynb
  30_rt_control_slice.ipynb
artifacts/
  rd/ walkers/ memsteer/ control/ meta/

Status legend: PROVEN (axiom gates passed) · PLAUSIBLE (design + prelim data) · NEEDS_DATA (tests pending)

Hardware: AMD/ROCm only (MI100, 7900 XTX); CPU fallbacks use smaller grids. I ran 100,000 neurons comfortably on battery power on an Acer Aspire notebook and plan to reach 1 billion neurons on the bigger machine.

October 2025 (Weeks 1–4)

  • [DONE] RD Baselines v0.2: 01_rd_front_speed, 02_rd_dispersion, 03_invariant_drift → artifacts/rd/ (PROVEN). Gates: front-speed R² ≥ 0.9999, rel-err ≤ 5%; dispersion median rel-err ≤ 1e-1, array R² ≥ 0.98; on-site invariant drift ≤ 1e-8/step. (A gate-check sketch follows this list.)
  • [DONE] Minimal Walkers + GDSP: 10_walkers_min_demo, 11_locality_bounds, 12_gdsp_budget_sweeps → artifacts/walkers/, artifacts/meta/telemetry.json (PROVEN). Gates: finite-support growth within bound; no dense scans; GDSP budget never oversubscribed; event telemetry saved.
  • [STARTED] Memory Steering v0.1: 20_memsteer_acceptance → artifacts/memsteer/ (PLAUSIBLE). Gates: retention half-life = setpoint ± 10%; steering latency < baseline RD horizon; interference curve monotone with steer strength.
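
A minimal sketch of how the front-speed gate above could be scored; the thresholds are the ones stated in the gate, while the fitting procedure and field names are assumptions rather than the repo's implementation.

# Sketch of a pass/fail gate for the RD front-speed baseline.
# Thresholds (R^2 >= 0.9999, rel-err <= 5%) come from the gate above;
# the fit and metric names are illustrative.
import numpy as np

def front_speed_gate(t, front_position, c_theory):
    """Fit front position vs. time and compare the slope to the predicted speed."""
    slope, intercept = np.polyfit(t, front_position, 1)
    pred = slope * t + intercept
    ss_res = float(np.sum((front_position - pred) ** 2))
    ss_tot = float(np.sum((front_position - np.mean(front_position)) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    rel_err = abs(slope - c_theory) / abs(c_theory)
    return {"r2": r2, "rel_err": rel_err,
            "passed": bool(r2 >= 0.9999 and rel_err <= 0.05)}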

November 2025 (Weeks 1–4) — v1.0 “model finished”

  • [STARTED] Real-Time Control Slice: 30_rt_control_slice → artifacts/control/ (PLAUSIBLE→PROVEN). Gates: goal attainment ≥ 90% (N seeds); control energy ≤ baseline; perturbation recovery ≤ unperturbed horizon. The model is currently capable of post-graduate-level reasoning on human-readable causality exams.
  • [STARTED] Release v1.0 Priority Pack: produced by 00_RUN_ALL.ipynb. Outputs: executed 00_RUN_ALL.out.ipynb, figures, CSV/JSON metrics, seeds, and artifacts/meta/manifest.json, bundled into artifacts/v1.0_priority_pack/.

Provenance & checks (written by 00_RUN_ALL.ipynb)

  • Manifest: artifacts/meta/manifest.json (paths to all outputs).
  • Contradiction report on failure: artifacts/meta/CONTRADICTION_REPORT.json (which gate failed + notebook cell refs).
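
A minimal sketch of what writing those two files might look like; only the artifacts/meta/ paths come from the plan above, the field names and gate-result structure are assumptions.

# Hypothetical provenance writer; field names are illustrative, only the
# artifacts/meta/ paths follow the plan above.
import json
from pathlib import Path

def write_provenance(gate_results, output_paths, meta_dir="artifacts/meta"):
    meta = Path(meta_dir)
    meta.mkdir(parents=True, exist_ok=True)
    # Manifest: paths to every output produced by the run.
    (meta / "manifest.json").write_text(json.dumps({"outputs": output_paths}, indent=2))
    # Contradiction report only when some gate failed, with notebook cell references.
    failed = [g for g in gate_results if not g["passed"]]
    if failed:
        (meta / "CONTRADICTION_REPORT.json").write_text(
            json.dumps({"failed_gates": failed}, indent=2))

write_provenance(
    gate_results=[{"gate": "front_speed", "passed": True, "cell": "01_rd_front_speed:cell-12"}],
    output_paths=["artifacts/rd/front_speed.csv"],
)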

Pinned repositories

  1. Neuroca-Inc/Prometheus_Void-Dynamics_Model (Python): Neuroca, Inc.'s official public authorized repository for a physics-grounded approach to modeling real-time divergent/convergent intelligence. It contains works in the multi-objective real-time ze…
  2. Perfect_Prompts (Python): A library and semi-framework for autonomous agent prompting that supports Go, Rust, Python, and TypeScript.
  3. agent_tools (Python): Agent tools CLI platform.
  4. Neuroca-Inc/Cogito (Python): A stale placeholder for the Cogito codebase. The current version is not complete and may not work; a stable, enhanced variant will replace it soon.
  5. breakthrough_generator (Python): Generates breakthrough ideas from a single prompt through an 8-stage walkthrough, with an optional research-proposal paper.
  6. Physics-Taxonomies: A repository for learning, exploration, and cross-domain thinking across areas of physics; it will likely become a semantic-search docs website / knowledge graph.