A Tale of Personal Solomonoff Induction
2030 has a place for you as a well-adjusted HIV- looksmaxxed male SF socialite with implanted mirrorshades and a positive attitude.
You are Homo Oeconomicus - you chase incentives by hill-climbing on the value function plugged into your reptilian brain - many things, all upstream of status, sex, money, you name it.
You scoff at the Homo Religiosus - the old type of guy who measures the goodness of his deeds by a formal moral code, hundreds of years old. You clearly see how much this guy misses out on.
You wake up and meet an interdimensional traveller. He makes a quick introduction, the usual stuff to the effect of 'I am you, from the future, having travelled many Everett branches of this universe.'
Then your conversation takes on a more object-level flow.
> Of course I haven't been to all the branches, 10ish right now, but I keep an update function on how all the copies turned out. Now I'm doing backprop, making corrections. Here is the new value function so that you are the happiest. Hope it makes your life easier.
It's a holographic n-dimensional chart. You have no idea how, but you understand how it works and can navigate it intuitively.
You take a look at the value function, and it's quite similar to your current one. There are some random deviations. And in some places where yours is flat, the new one provides a subtle gradient.
You start to form a question, but the guy vanishes.
You spend the day thinking about it. In the evening you try to relax. A portal opens in your room and a visibly older copy of you appears. He says:
> Hear me out, there are 1,000,000 branches I have visited now. Not quite personal Solomonoff induction, but closer than it's ever been. The updated value function should arrive at your device about now!
You take a look at the n-dimensional chart. It's much more abstract than before. There are nearly no flat surfaces here; everything is either high or low. There are huge divergences from your current one across some domains.
> Hey, why is it so different?
> You see, these get more and more abstract. This means they are less specific.
> So I need to apply the abstract ones by default, right?
> Yes
> But where I can consider my situation unique - low probability - I should adjust and deviate from the abstract rule?
> Yes, quite so.
> What about the missed utils?
> You need to put some faith in the value function - trust that your situation is not unique enough. You need to trust that in most universes, in the end, it's alright.
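The traveller's rule can be put in toy form - a sketch of my own framing, not anything from the story itself: treat the abstract value function as a default, and let the weight on your situation-specific estimate grow only with how low-probability you judge the situation to be. All names here (`blended_value`, `p_unique`) are invented for illustration.

```python
def blended_value(abstract_value: float, local_value: float, p_unique: float) -> float:
    """Apply the abstract rule by default; deviate toward the local
    estimate in proportion to how unusual the situation is.

    p_unique in [0, 1]: your credence that this situation is genuinely
    unique, i.e. that the abstract rule doesn't cover it.
    """
    assert 0.0 <= p_unique <= 1.0
    return (1.0 - p_unique) * abstract_value + p_unique * local_value

# A typical situation: stay close to the abstract default.
print(blended_value(abstract_value=1.0, local_value=5.0, p_unique=0.05))

# A situation you judge genuinely unusual: deviate much further.
print(blended_value(abstract_value=1.0, local_value=5.0, p_unique=0.8))
```

The "missed utils" question is then the gap between `local_value` and the blend - the price of trusting the default when your situation only looked unique.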
P.S.
Sorry for the inactivity throughout the last month or so. Will do more posts in the near future, and more regularly.