You didn't read, did you? Typical.
In simple English, everything winds down to zero - death.
Where does your sort come from? Can't argue theism, cannot recognise science.
No, that is false.
Again, you seem to be talking about, and giving meaning to, things you don't understand.
Entropy isn't death, and it isn't zero either.
It's simply a statement about the microstates that correspond to some macrostate. To be precise, it's the logarithm of the number of possible microstates of the system.
[Macrostate = Your big observable, your system which can be anything from a bunch of coins to the universe.]
[Microstate = A specific microscopic configuration that the system may occupy with some probability; equivalently, one of the different ways the system can realise a particular macrostate.]
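To put numbers on the coin example from above: here's a quick Python sketch (my own illustration, not anything from this thread; I'm using natural-log units, i.e. Boltzmann's constant set to 1):

```python
import math

def entropy(n_coins, n_heads):
    # omega = number of microstates (distinct orderings of the coins)
    # that realise the macrostate "n_heads heads out of n_coins"
    omega = math.comb(n_coins, n_heads)
    # entropy = log of the microstate count (Boltzmann's constant set to 1)
    return math.log(omega)

# "all 100 coins heads" has exactly one microstate -> entropy 0
print(entropy(100, 100))  # 0.0

# "50 heads out of 100" has ~1e29 microstates -> much higher entropy
print(entropy(100, 50))   # ~66.8
```

The half-heads macrostate has astronomically more microstates than the all-heads one, which is exactly why it's the one you actually observe when you shake a box of coins.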
Let's take Feynman's example; it's quite intuitive. Imagine two fluids, red and blue, separated by a partition. Now remove the partition very slowly, so that we don't disturb the liquids, and let them mix. It will be a slow process, but gradually they will mix together and spread over the whole container. Now try separating them again into red and blue. That's what entropy is about.
The number of possible arrangements of molecules (microstates) that give rise to the mixed state (macrostate) is vastly larger than the number that give rise to the perfectly separated one. There are countless ways to get the mixed system, very few that give you the separated one.
Which is why you will almost certainly never see the system go back to that initial (separated) state. As the fluids are mixing, you can say the system is maximizing entropy.
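You can even count it. Here's a toy version of the two-fluid setup in Python (my own simplification: N red and N blue molecules on 2N fixed sites, molecules of the same colour treated as indistinguishable):

```python
import math

N = 50                                # 50 red + 50 blue molecules, 100 sites

# a microstate = a choice of which sites hold the red molecules
omega_total = math.comb(2 * N, N)     # every possible arrangement (the "mixed" pool)
omega_separated = 1                   # red entirely on the left half: just one arrangement

print(math.log(omega_total))          # entropy of the mixed pool, ~66.8
print(math.log(omega_separated))      # entropy of the separated state: 0.0
print(1 / omega_total)                # odds of stumbling back into separation, ~1e-29
```

With just 100 molecules the separated state is a one-in-10^29 fluke; with realistic molecule counts the odds are so small that "never" is the only honest word.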
And just to be clear, the second law doesn't say that you cannot create order. You can; the universe does this all the time - case in point: galaxies, planets, life and so on. It just says that in such a case the net entropy of the universe will still increase overall, even while the local entropy goes down.
So entropy doesn't mean 'death' or 'zero', and it isn't even disorder. Disorder is one aspect of entropy, not entropy itself. Entropy is just a measure of how likely a certain macrostate is, i.e. probabilities. It goes deeper still, but let's stop here.
Now that you know what entropy is, I ask again: what does it have to do with the discussion we are having?