
I think it's a fun thought to play with from a technical perspective. How would you build an ethical simulation with limited processing power 1,000 years from now?

- Dark energy would make the simulation less compute-intensive towards the end: accelerating expansion pushes distant regions beyond the observable horizon, so if you knew where your observers were, you could simulate less and less over time, while also giving the simulation an ethical endpoint.

- And with the double-slit experiment, we kinda know that the universe "knows" when things are observed (measuring which path a particle takes changes the outcome).

- The cosmic microwave background would hide a lot of signals from further away, making it possible to aggregate or drop signals far from observers.

- The diffraction limit caps the resolution of observations we can make of distant objects, which in turn caps the resolution the simulation needs out there.

- The quantum uncertainty principle would be an easy way to make the simulation non-deterministic: just add some jitter / variance (a toy sketch combining this with observer-based level of detail follows below).

I might have gotten some things wrong. It also doesn't really answer your question, since none of this leads me to believe we surely live in a simulation; it's just something I like to think about sometimes.
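To make the jitter and observer-distance ideas concrete, here's a minimal Python sketch of a 1-D toy universe with a single observer at the origin. Everything in it (JITTER, LOD_RADIUS, the 0.99 decay rule) is invented for illustration, not real physics: cells near the observer get the full noisy update, far ones get a cheap deterministic one.

    import random

    JITTER = 0.01        # stand-in for quantum uncertainty
    LOD_RADIUS = 10.0    # full-detail radius around the observer

    def step_cell(value):
        # Full-resolution update: a deterministic decay rule plus a
        # little Gaussian jitter, so repeated runs diverge the way a
        # non-deterministic universe would.
        return 0.99 * value + random.gauss(0.0, JITTER)

    def step_universe(cells, observer_pos=0.0):
        out = []
        for pos, value in cells:
            if abs(pos - observer_pos) <= LOD_RADIUS:
                out.append((pos, step_cell(value)))   # near: full detail
            else:
                out.append((pos, 0.99 * value))       # far: cheap aggregate
        return out

    cells = [(float(p), 1.0) for p in range(-50, 51)]
    for _ in range(100):
        cells = step_universe(cells)

Shrinking LOD_RADIUS over time would be the dark-energy trick from the first bullet: the detailed region keeps contracting, so each step gets cheaper.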



Pull the ol' entity system and only simulate things "close by" (or coarsen the granularity of the simulation for things further away). What do I care about a methane molecule in Alpha Centauri beyond its contribution to a bigger, planet-spanning or galaxy-spanning effect? Same idea.

Edit: this has an obvious issue: if (when?) humanity (or rather the most ubiquitous "observer" species, whatever it is) "saturates" the cosmos, these optimization opportunities will get rarer and rarer (the sketch below puts a toy number on this). If that leads to localized "slowdowns" of physical phenomena, then we'd have positive evidence for the simulation hypothesis.
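To put a toy number on that saturation worry, here's a quick sketch (all parameters invented) counting how much of a 1-D universe of 1001 cells still falls outside the cheap far-away path as evenly spaced observers multiply:

    def full_detail_fraction(observers, cells, radius):
        # Fraction of cells within `radius` of at least one observer,
        # i.e. the share that still needs full-resolution simulation.
        near = sum(1 for c in cells
                   if any(abs(c - o) <= radius for o in observers))
        return near / len(cells)

    cells = range(-500, 501)
    for n in (1, 5, 25, 125):
        observers = [i * 1000 / n - 500 for i in range(n)]
        frac = full_detail_fraction(observers, cells, radius=10)
        print(f"{n:3d} observers -> {frac:.0%} of cells at full detail")

With these made-up numbers the full-detail share climbs from about 1% to 100% as observers spread out, which is the "optimization opportunities get rarer" effect in miniature.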



