The Physical Index
Every physical space on earth holds data no application can read. Physixs is building the index that changes that, so any AI or system can ask questions of physical reality the way it already asks questions of the web.
The internet indexed documents. Search made them queryable. Every application that followed was built on top of that layer.
Physical space has never had an equivalent. Scans exist. Digital twins exist. But neither is queryable. You can walk through a 3D model, but you cannot interrogate it.
We are building the indexing layer for physical reality. Not a visualisation tool. Not a digital twin. The index that every application plugs into when it needs to understand the physical world.
Physixs is to physical space what the search index was to the web.
Any 3D scan input: LiDAR, photogrammetry, Gaussian splatting. iPhone, drone, or professional scanner. Any environment.
Our pipeline cleans, registers, and semantically labels messy real-world data. Occlusions, lighting variations, multi-vintage scans — handled.
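As an illustration only, the clean, register, and label flow can be pictured as three composable stages over raw scan points. The stage boundaries, data shapes, and labeling rule below are assumptions for the sketch, not the actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    z: float
    label: str = "unknown"  # semantic label, assigned in the final stage

def clean(points, max_range=100.0):
    """Drop obvious outliers, e.g. points far beyond plausible scanner range."""
    return [p for p in points if (p.x**2 + p.y**2 + p.z**2) ** 0.5 <= max_range]

def register(points, offset=(0.0, 0.0, 0.0)):
    """Shift one scan into a shared site coordinate frame. A real pipeline
    would estimate this transform from overlap; here it is supplied."""
    dx, dy, dz = offset
    return [Point(p.x + dx, p.y + dy, p.z + dz, p.label) for p in points]

def label(points):
    """Toy semantic labeling: tag near-floor points. Real labeling would use
    learned models over geometry and imagery."""
    return [Point(p.x, p.y, p.z, "floor" if p.z < 0.1 else "structure")
            for p in points]

def pipeline(raw, offset=(0.0, 0.0, 0.0)):
    return label(register(clean(raw), offset))
```

Each stage takes and returns the same point representation, so messy inputs from any scanner can flow through the same chain.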
A structured spatial object graph. Every physical entity has persistent identity, precise coordinates, and state. The index grows smarter with every site.
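To make the three properties concrete, here is a minimal sketch of what one node in such a graph could hold. The field names and the example entities are hypothetical, chosen only to show persistent identity, coordinates, state, and edges to related entities:

```python
from dataclasses import dataclass, field

@dataclass
class SpatialEntity:
    # Persistent identity: stable across re-scans of the same site
    entity_id: str
    # Coordinates in the site's shared frame (x, y, z in metres)
    position: tuple
    # Current state, updated as new scans arrive
    state: dict = field(default_factory=dict)
    # Edges to related entities, e.g. "mounted_on", "connected_to"
    relations: dict = field(default_factory=dict)

# A tiny two-node graph: a pump and a valve mounted on it (made-up data)
pump = SpatialEntity("pump-01", (3.2, 7.5, 0.0), {"status": "running"})
valve = SpatialEntity("valve-07", (3.4, 7.5, 1.1), {"open": True},
                      relations={"mounted_on": "pump-01"})
```

Because identity persists across scans, a new scan updates `state` and `position` on the same node rather than creating a duplicate.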
Natural language API. Any application, any AI agent, any user interrogates physical reality as simply as querying a database.

Type in plain language. Physixs returns precise, spatially grounded answers from the physical index of your sites.
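The shape of such a query can be sketched with a toy in-memory index. Everything here is invented for illustration: the site layout, the entity IDs, and the helper name. A real natural-language API would parse the question first; this mock receives the already-parsed intent ("which valves are within 2 m of pump-01?") directly:

```python
import math

# Toy in-memory "index": entity id -> (x, y, z) position in metres.
INDEX = {
    "pump-01":  (3.2, 7.5, 0.0),
    "valve-07": (3.4, 7.5, 1.1),
    "valve-12": (9.0, 2.0, 0.5),
    "tank-03":  (3.0, 8.0, 0.0),
}

def within(index, kind, anchor_id, radius_m):
    """Return entities of a given kind within radius_m of an anchor entity."""
    anchor = index[anchor_id]
    hits = []
    for eid, pos in index.items():
        if eid == anchor_id or not eid.startswith(kind):
            continue
        if math.dist(anchor, pos) <= radius_m:
            hits.append(eid)
    return hits

# "Which valves are within 2 metres of pump-01?"
print(within(INDEX, "valve", "pump-01", 2.0))  # -> ['valve-07']
```

The answer is spatially grounded in the sense that every result carries back to a concrete entity with coordinates, not to a passage of text.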
The infrastructure layer
Every application that needs to understand physical reality will plug into Physixs, the way applications on the web came to be built on top of the search index.
We are onboarding our first industrial sites now. Join the waitlist for early access.