Good approach, Jake. Graceful failover is so important to good UI design and
this is a sensible pattern for achieving it.
As for crawlers following the link, most crawlers drop everything after the ?
to avoid falsely submitting data. You just need a graceful way to respond to
GETs that arrive without the ?CreateDocument&vote=yes query string.
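As a minimal sketch of that fallback (the handler name and responses are hypothetical, not Jake's actual code), the idea is simply: if the expected parameter is missing, show the form again instead of erroring out:

```python
from urllib.parse import urlparse, parse_qs

def handle_get(url):
    """Respond to a GET, falling back gracefully when the
    expected query string has been stripped by a crawler."""
    query = urlparse(url).query
    params = parse_qs(query, keep_blank_values=True)
    if "vote" in params:
        # Normal case: the link arrived with its query string intact
        return "recorded vote: " + params["vote"][0]
    # Crawler (or hand-typed URL) dropped the query string: re-show the form
    return "show voting form"

print(handle_get("/page?CreateDocument&vote=yes"))  # recorded vote: yes
print(handle_get("/page"))                          # show voting form
```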
It's worth noting that what I'm describing is GOOD practice only for the more
well-behaved crawlers. Once upon a time a poorly written crawler trawled my
blog and created about 300 bogus empty comments; input validation eventually
solved that problem for me. So, in hindsight, even if we assume crawlers will
drop the query params, there may need to be some other countermeasure such as
a bot rule or meta tag.
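For example, the bot-rule countermeasure could look something like this (the /vote path is just an illustration, not the actual URL):

```
# robots.txt: ask well-behaved crawlers to skip the voting URLs entirely
User-agent: *
Disallow: /vote
```

Alternatively, a `<meta name="robots" content="nofollow">` tag on the page tells compliant crawlers not to follow any of its links. Neither stops a badly written bot, of course, which is why the input validation still matters.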