While it's true that *some* Deists have come to believe this, I don't think all do. Deism doesn't seem to specify *why* the Creator never gave us a revealed religion, only that he/she/it didn't.
After all, to believe that a Creator made the earth and then walked away is to ignore the complete lack of evidence for even knowing that much. Do we know that God walked away? Do we know God is ignoring us? Do we have pictures of any of this? Direct writings? Anything at all to prove the willful absence of divine beings?
So why do people ascribe this belief to all of us?