Reality Cache

Consider a system where nearby devices collaborate to share both data and computing power without relying on the internet. When one user loads a webpage, it is cached locally and automatically becomes accessible to other nearby devices through peer-to-peer discovery (e.g., Bluetooth or WebRTC), creating a temporary offline web network. At the same time, devices can contribute their processing power to run distributed tasks, such as training a small machine-learning model or processing data in parallel. For example, in a classroom, if one student opens a tutorial website, everyone in the room can access it instantly from their phones, and all devices together can act like a mini local cloud to perform computations faster than a single device.

Description

Reality Cache is a peer-to-peer system where devices share cached web content locally, allowing people nearby to access websites without needing the internet. When one person loads a webpage, it gets stored on their device and becomes discoverable to other nearby devices through technologies like Bluetooth or WebRTC. This effectively turns a room full of phones into a local offline internet.
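The cache-sharing idea can be sketched in a few lines. This is a minimal illustration, not an implementation: an in-memory registry stands in for real Bluetooth/WebRTC discovery, and all names (`Peer`, `load_page`, `discover`) are hypothetical.

```python
# Sketch of peer-to-peer cache sharing. A shared in-memory registry
# stands in for real Bluetooth/WebRTC discovery; names are illustrative.

class Peer:
    def __init__(self, name):
        self.name = name
        self.cache = {}       # url -> page content cached on this device
        self.registry = None  # shared discovery registry (set on join)

    def join(self, registry):
        self.registry = registry
        registry.append(self)

    def load_page(self, url, content):
        # Loading a page over the internet also caches it locally,
        # which makes it discoverable to nearby peers.
        self.cache[url] = content

    def discover(self, url):
        # Ask nearby peers whether anyone already has the page cached.
        for peer in self.registry:
            if peer is not self and url in peer.cache:
                return peer.cache[url]  # served peer-to-peer, no internet
        return None

room = []
alice, bob = Peer("alice"), Peer("bob")
alice.join(room)
bob.join(room)

alice.load_page("https://react.dev/learn", "<html>React tutorial</html>")
print(bob.discover("https://react.dev/learn"))  # prints the cached page
```

A real system would replace the registry with an actual discovery protocol and stream the cached response over a data channel, but the flow is the same: load once, then serve nearby.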

Mirror Computing extends this idea by allowing nearby devices to share their processing power. Instead of one device performing all computations, tasks can be split across multiple phones or laptops and executed in parallel, creating a temporary distributed compute cluster.
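The split-and-merge pattern behind Mirror Computing can be sketched as follows. This is an illustrative outline under the assumption that chunks are shipped to devices and partial results merged; in this sketch each "device" is just a local function call, and all names are hypothetical.

```python
# Sketch of splitting one task across several devices and merging results.
# A real system would ship chunks over WebRTC data channels; here each
# "device" is simulated by a function call. Names are illustrative.

def split(data, n_devices):
    # Divide the input into roughly equal chunks, one per device.
    size = max(1, len(data) // n_devices)
    return [data[i:i + size] for i in range(0, len(data), size)]

def run_on_device(chunk):
    # Stand-in for work done on one phone, e.g. a partial sum of squares.
    return sum(x * x for x in chunk)

def distributed_compute(data, n_devices):
    partials = [run_on_device(c) for c in split(data, n_devices)]
    return sum(partials)  # merge partial results into the final answer

data = list(range(100))
assert distributed_compute(data, 4) == sum(x * x for x in data)
```

The key design choice is that the task must decompose into independent chunks whose results can be merged cheaply; tasks with heavy cross-chunk dependencies gain little from this kind of cluster.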

Example Scenario

Imagine a classroom with 30 students:

  1. One student opens a React tutorial webpage while connected to the internet.

  2. The page is cached on their phone.

  3. Other students open the Reality Cache app and instantly see the tutorial listed as available nearby.

  4. They can access the webpage directly from the student’s device without using the internet.
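Steps 1-4 above hinge on the "available nearby" listing. A minimal sketch of how the app could aggregate that view, assuming a shared registry in place of real discovery (all names here are hypothetical):

```python
# Sketch of the "available nearby" listing from steps 1-4. The dict
# stands in for Bluetooth/WebRTC discovery; names are illustrative.

caches = {}  # device name -> set of URLs cached on that device

def cache_page(device, url):
    # Steps 1-2: loading a page caches it on the loading device.
    caches.setdefault(device, set()).add(url)

def list_nearby(me):
    # Step 3: pages cached on every *other* device appear as available.
    return sorted(url for dev, urls in caches.items()
                  if dev != me for url in urls)

cache_page("student-1", "https://react.dev/learn")
print(list_nearby("student-2"))  # ['https://react.dev/learn']
```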

Later, the class runs a machine-learning demo:

  • Each student’s phone contributes a small part of the computation.

  • The task is split into pieces and processed simultaneously across all devices.

  • The combined system trains a small model much faster than a single phone.
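The machine-learning demo above can be sketched as synchronous data-parallel training: each "phone" computes a gradient on its own shard of the data, and the gradients are averaged before each update. This is an illustrative stand-in, not the project's actual demo; the model (fitting `y = w * x` by gradient descent) and all names are assumptions.

```python
# Sketch of the data-parallel training demo: each "phone" computes a
# gradient on its shard, and shard gradients are averaged each step,
# as in synchronous data-parallel SGD. Purely illustrative.

def gradient(w, shard):
    # Gradient of mean squared error for the model y = w * x
    # over one device's shard of (x, y) pairs.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train(shards, steps=100, lr=0.01):
    w = 0.0
    for _ in range(steps):
        grads = [gradient(w, s) for s in shards]  # one per device
        w -= lr * sum(grads) / len(grads)         # merge: average gradients
    return w

# Data generated from y = 3x, split round-robin across 3 "phones".
data = [(x, 3 * x) for x in range(1, 13)]
shards = [data[i::3] for i in range(3)]
w = train(shards)
print(round(w, 2))  # converges toward 3.0
```

Wall-clock speedup comes from the per-shard gradients running concurrently on different devices; here they run sequentially only because a single process simulates the cluster.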

In effect, the classroom temporarily becomes a local cloud made entirely of nearby devices.
