Hi! I’m TurnTrout, but the United States government insists on calling me “Alexander Matt Turner.” I like writing about lots of stuff and learning about lots of stuff. For more about my professional life, check out my research. My dating doc resides here.

- Why the name “TurnTrout”?

  Once upon a time, I was unpacking my viola when some dude walked by and said “hey, turntrout!” Then he never said it again, no one talked about it again, and everyone forgot about it. At least, everyone forgot until I wanted a Reddit username. And thus I chose TurnTrout.

  Over the years, more people came to know me for the research I posted as TurnTrout. Now, people often recognize me by that name. Some people just say “hey, ’Trout!”

- Why do you love geese so much?

  In 2020, I had just started a new relationship. My girlfriend was being quite silly, so I informed her that she was a silly goose. She liked the title.

  We soon happened upon Untitled Goose Game, a delightful cooperative experience in which two players pilot two geese to troll and terrorize the residents of a sleepy town. There’s even a dedicated button for honking! We loved the game. We loved each other. We loved the cute geese. Our strong feelings splashed onto the geese. Now, when we thought of geese, we thought of each other; when we thought of each other, we thought of geese.

- How was this website designed?

  Refer to The design of this website.
My email is alex@turntrout.com. While I’m not guaranteed to reply, I always appreciate a good-faith message. Please feel free to write to me! I feel happy to know if and how my writing makes a difference for folks. I also like to hear about new alignment research ideas.
If my writing means something to you or has brightened your life, consider:
- Donating to my ko-fi,1 or
- Leaving me a nice email or an anonymous compliment!
I prefer messages over donations.
Find out when I post more content: newsletter & rss
alex@turntrout.com (pgp)

Footnotes

1. Ko-fi donations will not affect my ability to do AI alignment research. To support alignment research, I suggest the Long-Term Future Fund.