Logos of Aether: A Measurement-Theoretic Physics Engine in Rust/WGPU (v2.2.1)

I’ve been “winging it” for a year trying to formalize a theory that has been bouncing around my head since high school. After years of post-graduation struggle and feeling like I was failing at life, I finally funneled that obsession into 566 pages of theory and a working simulation engine. This is Logos-Core.

The goal: To prove the universe doesn’t compute with Zero or Infinity, but with Ratios in a resource-constrained substrate called the Monadic Plenum.

🛠 The Stack

  • Language: Rust (strictly leveraging zero-cost abstractions for the PlenumReal type).
  • Compute: WGPU / WGSL (the universe as an rgba32float state texture).
  • Architecture: Ping-Pong Buffer Causality (strict temporal separation between frames; sketched below).
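
For anyone unfamiliar with the pattern, here is a minimal CPU-side sketch of what ping-pong causality means in practice: two buffers, one read-only (frame N) and one write-only (frame N+1), swapped after every step so no cell can ever observe a partially updated neighbour. The `PingPong` struct and the toy diffusion rule are illustrative assumptions of mine, not the engine's actual GPU implementation.

```rust
/// Minimal CPU-side illustration of ping-pong causality:
/// frame N is read-only while frame N+1 is written, then the roles swap.
struct PingPong {
    read: Vec<f32>,  // state at frame N (never written during a step)
    write: Vec<f32>, // state at frame N+1 (never read during a step)
}

impl PingPong {
    fn new(cells: usize) -> Self {
        Self { read: vec![0.0; cells], write: vec![0.0; cells] }
    }

    /// One causal step: every cell sees only the previous frame.
    fn step(&mut self, rule: impl Fn(&[f32], usize) -> f32) {
        for i in 0..self.read.len() {
            self.write[i] = rule(&self.read, i);
        }
        std::mem::swap(&mut self.read, &mut self.write);
    }
}

fn main() {
    let mut field = PingPong::new(8);
    field.read[4] = 1.0; // seed a disturbance
    // Toy diffusion rule with wrap-around neighbours.
    let rule = |prev: &[f32], i: usize| {
        let n = prev.len();
        0.5 * prev[i] + 0.25 * (prev[(i + n - 1) % n] + prev[(i + 1) % n])
    };
    for _ in 0..4 {
        field.step(&rule);
    }
    println!("{:?}", field.read);
}
```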

The Core Axiom: $\pi$-Saturation

Standard physics breaks at high densities because linear addition ($1+1=2$) leads to singularities. My engine implements Plenum Addition ($\oplus$). Just as $c$ is the speed limit, $\pi$ is the saturation limit for information density.

$$x \oplus y = \frac{x + y}{1 + \frac{xy}{\pi^2}}$$

In v2.2.1, I’ve verified this on hardware: small values add almost linearly, while repeated self-addition ($x \oplus x$) saturates hard at $\pi$, rendering singularities mathematically impossible at the architectural level.
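
As a quick sanity check you can run outside the engine, here is a standalone sketch of $\oplus$ and the self-addition experiment. The function name `plenum_add` and the plain `f64` representation are my assumptions; the engine's `PlenumReal` type presumably wraps something equivalent.

```rust
use std::f64::consts::PI;

/// Plenum addition: x ⊕ y = (x + y) / (1 + xy / π²).
/// Like relativistic velocity addition, with π playing the role of c.
fn plenum_add(x: f64, y: f64) -> f64 {
    (x + y) / (1.0 + (x * y) / (PI * PI))
}

fn main() {
    // Small values behave almost linearly: 0.01 ⊕ 0.01 ≈ 0.02.
    println!("0.01 ⊕ 0.01 = {:.6}", plenum_add(0.01, 0.01));

    // Repeated self-addition converges on π instead of diverging.
    let mut x = 1.0;
    for step in 1..=8 {
        x = plenum_add(x, x);
        println!("step {step}: {x:.12}");
    }
    println!("limit:  {:.12}", PI); // the saturation ceiling
}
```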

The Teeter-Totter (Causal Budgeting)

We replace “Uncertainty” with Resource Constraints. The engine enforces a zero-sum trade-off between Temporal Resolution ($d_{self}$) and Spatial Resolution ($d_{world}$):

$$d_{self} \cdot d_{world} \le \pi c$$

If a Monad (pixel) requires high spatial precision, its internal update frequency (Time) must drop. In this architecture, Inertia is literally processing latency.
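
To make the trade-off concrete, here is a hedged sketch of how a per-monad budget could be enforced: grant the requested spatial resolution, then cap the temporal resolution so the product never exceeds $\pi c$. The `MonadBudget` struct, the `allocate` function, and the choice of $c = 1$ in natural units are illustrative assumptions, not the engine's actual types.

```rust
use std::f64::consts::PI;

/// Speed of light in the engine's natural units; 1.0 is an assumption here.
const C: f64 = 1.0;

/// Illustrative per-monad resolution budget: d_self * d_world <= π * c.
#[derive(Debug)]
struct MonadBudget {
    d_world: f64, // spatial resolution granted this frame
    d_self: f64,  // temporal resolution (internal update rate) granted this frame
}

/// Grant the requested spatial resolution, then cap the temporal resolution
/// so the product never exceeds the budget; the shortfall shows up as latency (inertia).
fn allocate(requested_d_world: f64, requested_d_self: f64) -> MonadBudget {
    let budget = PI * C;
    let d_world = requested_d_world;
    let d_self = requested_d_self.min(budget / d_world);
    MonadBudget { d_world, d_self }
}

fn main() {
    let sharp = allocate(10.0, 5.0); // high spatial precision: time gets throttled
    let coarse = allocate(0.5, 5.0); // coarse monad keeps its full update rate
    println!("{sharp:?}  product = {:.3}", sharp.d_world * sharp.d_self);
    println!("{coarse:?} product = {:.3}", coarse.d_world * coarse.d_self);
}
```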

v2.2.1 “Soliton” Features:

  • Self-Propagating Solitons: High-pressure waves that wrap and interfere without dissipation.
  • The Chronoscope: A real-time GPU visualizer (cargo run --example chronoscope) for watching the Aether ripple through the ping-pong buffers.
  • Singularity Prevention: No renormalization required; the math is self-braking.

Strategic Commons Licensing

This project is built to be a common possession of humanity.

  • Code (AGPL-3.0): Open source.
  • Theory (CC BY-NC-ND 4.0): The 573-page manuscript.
  • Math (Public Domain): Truth cannot be enclosed.

Repo: https://github.com/chrisnchips42-blip/logos-core

I’ve spent a long time fucking around in life, doing shit like trying to die and wasting a marriage. But this has bounced around my head for 15 years, and I think what I have is solid. I’m looking for feedback on the WGSL compute kernels and the Plenum_pingpong implementation. If the universe is a computation, it has a hardware limit—this is my attempt to map it.

  • BusyBoredom@lemmy.ml · 2 days ago

    Yep that’s right!

    I was using “grid” to be more easily understood, but what we really have is quantization. We get the Planck length from the uncertainty principle, as well as the Planck time, which is reminiscent of common strategies for reducing the computational requirements of simulations (reduced-precision calculations). This pattern is repeated in the quantization of charge, etc.

    So you’re right, it’s not a grid, just a continuous cap on precision. But the point stands: it looks familiar to programmers and is a fair reason to suspect we’re in a simulation. Not proof of course, just a neat little hint.

    • balsoft@lemmy.ml · 1 day ago

      Hm, you should clue me in. I thought the Planck length being the “minimum” unit of length had more to do with the fact that a particle with a wavelength short enough to be comparable to the Planck length would need so much energy that it would become a black hole, which means it’s not really feasible to investigate anything “happening” at scales smaller than that. So to me it doesn’t feel particularly relevant to the uncertainty principle. But it’s been a long time since my physics courses, so I might be missing some obvious connection.

      • BusyBoredom@lemmy.ml · 1 day ago

        Yep that’s right, below the Planck length you can’t make position measurements without destroying what you’re trying to measure.

        And you are right that it can be fully explained without needing to be in a simulation; that is how these things were discovered, after all. The simulation angle is pretty far outside the math of respectable physics.

        The reason the simulation hypothesis bleeds into the discussion is that it’s natural to ask “why” things break down at that specific size. Humans don’t like vague answers like “because god likes that number”; we prefer to tell ourselves stories that fit the numbers onto physical things in our minds. Just as the Bohr model of the atom was a useful story about how atoms are structured for decades despite never being rigorously proven (and being firmly disproven nowadays), one story we can tell ourselves to make sense of quantization is to view it as an attempt at limiting precision for computation.

        It is only a story at the end of the day, though. We don’t really know why physics was set up exactly this way any more than we know why the Big Bang happened in the first place. Just lots of different people’s guesses, telling plausible stories about the math.