For some reason, talking about how your body changes after giving birth still seems to be a taboo topic, and there is no reason for it to be. After your body has performed the miracle of giving life, you will experience many changes that are not exactly pleasant. The good news is that everything will eventually fall back into place.