writer, speaker, content creator


2011: In Which I Freelanced More Than Ever

In Writing, Year in Review on December 31, 2011 at 12:27 pm

Earlier this year, I was at a writing workshop, and someone asked me a fairly innocuous question. They asked me “Are you a writer?”

“Sort of,” I said, “it’s not my main job. It only supplements my income.”

The questioner was kind of surprised. “Wait,” they said, “you make money?”

Well, yes. Not very much of it and not frequently enough, but yes, that does happen sometimes. I do not like calling myself a “writer.” For some reason, the word seems loaded and uncomfortable, and I have this weird feeling that if I were to say to people “I’m a writer,” people would instantly think of some Hemingway wannabe staring intently at a keyboard, not actually producing anything. You know, Ewan McGregor’s character in Moulin Rouge. A guy with vague, lofty ambitions who is unable to actually translate them into anything at all, and wants to have written more than he wants to write.

There is also the tendency to think of “writer” as being in the same category as “rock star,” “astronaut,” or “ballerina”: dreamy jobs that technically do exist, but that do not abound in any significant numbers.

So, I’ve been looking for lots of Plan Bs. Something else. A “real” job. However, I feel most satisfied when I can sit down, pound out an article, and actually call it real work. While I have looked into grad school, this past year I’ve been most excited by the writing jobs that I’ve gotten. This spring I wrote some news for a local publication called About Face Magazine, ever-so-briefly worked for Portland Picks for Men before they went under, did regular work for Metromix, an arts and leisure site, blogged for the Daily Journal of Commerce, and wrote a feature and began blogging for the Portland Mercury. The day job, Portland Walking Tours, has also picked me up to work as a researcher and content provider. Not a bad collection of bylines, and I have a few other projects on deck.

I like this. I like this a great deal. If I knew I could make a living at it, I would make a living as a writer and journalist. There are real, actual people who do this, who research and report and write full time. I would like to be one of them. Given the poor economy, the state of the newspaper industry, and the general non-scarcity of information, though, I still don’t know if this is wholly and completely possible. I really, really want it to be, though. A local newspaper has now interviewed me twice about being a full-time reporter, and, despite knowing that any newspaper in the country could keel over dead and bankrupt at any moment, I’m ready to say yes if they offer me the job.

2011 has made me all the more want to discard backup plans, and just dive into trying to be a full time wordsmith. Even as I type this, there’s an uncashed freelancing check sitting on the table next to me, and that small amount of professional success only makes me fantasize more about pursuing my dream job. I want to be able to say “I’m a writer” without any kind of reservations, asterisks, or caveats, but I’ll only do that when it is, in fact, my full-time job.

It’s not impossible. Not probable, certainly, but being a professional would not violate the laws of reality. Let’s see what happens in 2012.

Shut Up and Show Me

In Movies on December 27, 2011 at 10:19 am

One of my favorite movies of all time is the director’s cut of Blade Runner. The director’s cut is a moody, dark tone-poem that lets the stark beauty of its setting and characters speak for itself. The theatrical version, though, is terrible. It’s a far different movie for one very important reason- the theatrical version of Blade Runner features voice-overs that explain precisely what is happening on screen, and it utterly kills the mood that the sets, music, costumes, and the rest of it try so hard to create. The theatrical version tells, and fails. The director’s cut shows, and succeeds.

Walk into any writing class and one of the first things that the instructor will tell you is show, don’t tell. That’s repeated over and over again to the point where it’s become something of a hackneyed phrase that everyone says, and few people actually think about. However, it is utterly and totally true.

Yesterday I saw The Artist, a film about a movie actor in the late 1920s and early 1930s who weathers the transition from silent films to talkies. The central self-aware gimmick of The Artist is that it itself is a silent movie (mostly). Most of the characterization is told with the exaggerated, broad body language of the silent era. The lead does a heroic amount of acting with his eyebrows and mustache, affecting the sort of big, visible facial expressions that can convey emotions without saying a word. The film is gimmicky, but it’s deft and charming enough that it works. Despite the leaden-sounding title and potentially highfalutin’ concept (being a modern silent film) it’s actually extraordinarily light on its feet. It’s a movie that revels in the action thrillers, swashbuckling hijinks, and undiluted showmanship of old Hollywood. Give it a watch. (Bonus: One of the characters is probably the best movie dog I’ve ever seen. Normally I can’t stand animals or children in movies, as they are not presented as developed characters, but rather as emotional cheap-shots. The dog in The Artist, though, is an adorable little micro-badass with awesome comic timing.)

Coming out of The Artist, I immediately thought of Wall-E and Rise of the Planet of the Apes. Neither is a silent movie with title cards and whatnot, but, like The Artist, both are films that tell a substantial part of their stories through body language, facial expressions, and action. Wall-E features nonvocal robot characters, and Apes features, well, lots of apes. Like The Artist, both films are also excellent. They show, they don’t tell.

Another two examples I’ll mention: I’m currently playing through Shadow of the Colossus, an extraordinarily empathy-ridden, sad video game that has practically no dialogue. I also recently re-watched Toy Story 3, and was brought to tears by the scene at the end where the main characters bravely stare death in the face, but don’t say a word. The dialogue in the Toy Story movies is extremely well done, but at that emotionally pivotal moment the screenwriters and director knew that the right thing to do was to not say a word, and let Woody and Buzz’s encounter with death speak for itself.

What makes all of these media good is that they know that they have multiple ways of communicating with the audience. Everything I mentioned above uses character design, movement, a cocked eyebrow here, an agape mouth there, to say something. What’s more, the media in question trust in that communication style. They don’t show a smiling guy and then have someone say “he looks happy” or add any kind of intelligence-insulting exposition. They don’t attempt to validate nonverbal communication with wordy explanation, or intrude on the emotional environment by spelling out what’s going on.

Media that show rather than tell are confident, witty, and have a diversity of ways in which they connect with the audience. Media that do that are better, and more interesting, be they films about movie stars or stories of sentient robots. Whatever the genre, they take the advice of every college writing instructor ever, and it turns out to be not a tired truism, but utterly worthwhile.

Thoughts on End Notes vs. Foot Notes

In Books, Rants, Writing on December 12, 2011 at 8:11 pm

Right now I’m reading a book that I quite enjoy. It has end notes. The end notes contain citations, so you can see where the author got his information. I’m fine with that. In fact, that’s something I want in pretty much any nonfiction book.

However, the end notes also contain asides and parenthetical remarks on the part of the author. This drives me utterly mad. When I see a very small number in the text, there is no way for me to tell whether or not following it to the back of the book will lead to additional thoughts from the author, or just a citation. Nine times out of ten it’s just a citation that I can ignore for the moment, but every so often it’s additional authorial remarks that I actually want to read. Looking at the main text, though, I have no idea what I’ll find at the back of the book. I just have to look.

I really, really, really, really hate this. It’s annoying, it’s lazy, and (worst of all) it’s an inconvenience to the reader that can be very easily remedied. Citations should be at the back of the book, and marked with end notes. They should be there for the reader, but shoved away into a different clump of pages, not intruding into the main body of text. Authorial asides, however, should be marked with an asterisk or dagger and kept on the same page as the main text. That way, the reader can easily glance down at them, and not have to futz around in the citation section for other stuff the author might have to say.

It boggles my mind that any book would intermingle authorial asides in with citations. It’s stupid, it’s aggravating, it has an easy solution, and any editor that sends the reader scampering back to the end of the book every half page is an awful human, and should be slapped in the face with a frozen tuna until they recant their various sins against reading.

A special exception can be made for Infinite Jest, though. Infinite Jest is cool.

In Which I Fail Spectacularly at NaNoWriMo

In Writing on December 6, 2011 at 1:38 pm

National Novel Writing Month has destroyed me. I set out at the beginning of November with the intent of writing a 50,000-word novel. In the back of my mind, I knew I would fail. And then, giving in to subconscious worries and fears, I did precisely that. When November ended, I had less than half of my novel finished.

I went into the whole thing with a certain lack of commitment. While I do have some aspirations when it comes to fiction (someone once said that “aspiring novelist” is a synonym for “human”) I’ve often thought that if I do ever write anything long-form, it will be nonfiction. (For example, the travel memoir that I’ve been unsuccessfully trying to sell/finish for the past two years.) At present, I’ve gotten some nice gigs writing about architecture and built industry in Portland, and I’ve done the odd article about things blowing up. I feel comfortable with nonfiction- after all, with nonfiction the fascinating story is already there. The only thing that a writer has to do is find a way to overlay their own fascination onto the pre-existing facts, and there you go. It comes naturally to me, especially when I’m writing about something that I really enjoy.

For NaNoWriMo, I knew that I would have to create a whole lot of written content very quickly. I chose to do what I thought was a straightforward genre story- a murder mystery, but with vampires. I figured that I’d be able to put together a plot fairly quickly, and could have a lot of fun with the exposition of how my vampires worked. Coming up with the story was pretty easy- I had a murder at the beginning, a twist at the end, and a sleuth trying to figure it all out. There was mystery at the beginning and a big fight at the end. The only problem- I didn’t have nearly enough of a middle.

Writing the story, I realized that mystery novels need red herrings. Lots of them. They need lots of little avenues for the sleuth to look down and for the readers to speculate about. While I thought the big twist at the end was pretty satisfying, I found myself struggling to construct blind alleys in the middle of the story that weren’t obviously dead ends. I ended up struggling far more than I thought I would, got distracted by several other projects, and ended NaNoWriMo soundly defeated. I had less than 25,000 words, and I have no idea what I’m going to do with a semi-completed vampire mystery.

For whatever reason, though, I have decided to take it on next year. Now that binge-writing has defeated me once, I (for irrational reasons of pride and insanity) have decided that I need to take it up again until I’m finally successful at it. Come next November, I’m going to be, yet again, attempting to generate vast quantities of bad fiction. Next time, NaNoWriMo. Next time.

Hate Twilight? Good! Hate Ariel, Too.

In Movies on November 19, 2011 at 8:47 am

I’ve never consumed any of the Twilight media except for the first movie, but since I don’t live in a dilapidated shack at the bottom of the ocean, I now know a good deal of the plot and character details because so much of it has bled into the pop culture effluvia. And, if the first movie is any indication, then I doubt that the series becomes anything other than hackneyed, misogynistic virginity-porn that is chiefly driven by a fierce hatred and fear of female sexuality. Here’s the thing though- if you hate Twilight then, to be consistent, you should probably loathe The Little Mermaid as well.

When I was growing up my sister loved this movie, so I had it shoved into my brain multiple times as a kid via VHS. I didn’t really like it that much, but from sheer repetition and exposure it does occupy something of a nostalgic niche in my heart- particularly the number Poor Unfortunate Souls, which is probably the best of the Disney villain songs. The Little Mermaid, though, is cut from the same cloth as Twilight. (Or rather, Twilight is cut from the same cloth as it.) It’s about a teenage girl who, instead of getting hobbies or doing anything interesting with herself, wraps up her identity in finding and marrying some dude.

Ariel is a sixteen-year-old girl in this movie who presumably does not have much or anything in the way of sexual experience (just like Bella), encounters a handsome guy from a world not her own (just like Bella), gets obsessed with him and his world (just like Bella), throws away her established life filled with the things and people that are meaningful to her (just like Bella) and gets married at an age where most people aren’t old enough to have graduated college (just like Bella). Again- she’s sixteen. There are moderately priced bottles of whiskey older than her. If anything Ariel’s transformation is even more dramatic. She transforms herself into a human so she can get hitched to this dude she just met. They’ve never even met up for coffee, and Ariel abandons her entire life in the sea so she can get with Eric. That would be like an Earth-person permanently moving to Alpha Centauri so they could possibly get all matrimonial with an extra-terrestrial that they met for five minutes.

Ariel’s beau, a square-jawed piece of nautical beefcake named Eric, is a neutered cipher who functions as an unthreatening object of barely pubescent desire. Eric and Edward are handsome blank Ken dolls of vacant non-masculinity, and the female protagonist’s consummation with them is not any kind of moment of spontaneous sexual passion, but a wedding. (Yes, I know Bella and Edward have sex after they get married, but it’s sex bookended by marriage and pregnancy. It is not sex for pleasure, passion, or any kind of romantic bonding. The intercourse is only there so it can be placed within its proper confines of an exchange of rings and a propagation of the species.)

Lots of people justifiably hate Twilight, but Twilight only says explicitly what one of the more beloved Disney movies says implicitly- that for teenage girls to feel fulfilled, it’s primarily important to find a nice boy and then get married. This is utter balderdash, of course. Teenage girls should cultivate hobbies, do activities, study, date lots of boys (or girls, or both), go to school, find a job they’re interested in, have a broad and supportive social circle, do interesting things like rock climbing or calligraphy or kung-fu or clarinet-playing, perhaps get in a long-term relationship with a guy (or girl, or whomever) and then, after years of love and trust and sexual compatibility and mutual support, get married. Or not. Whichever. But you know that already, because you’re not an idiot.

Twilight and The Little Mermaid both hate women and women’s sexuality because they eject that whole middle part (the bit with the rock climbing and sexual compatibility) and just get to the bit with the pretty white princess dress. While it’s great that lots of people call out Twilight for being execrable filth, it’s worth noting that this kind of misogyny suffuses many other parts of popular culture as well. Twilight is not unique in its sex-hating sexism.

Whew! For a moment there I felt like I was back in a college sociology class, getting all angry at the patriarchy. That felt good. The point is, if I were King Triton, or Bella’s parent, or any daughter’s parent I’d tell her this: No daughter of mine is getting married without first finding out who she is, and what she really wants. If I were to catch her watching Twilight, I’d probably say “Sweetie, you know that’s not how relationships work, right? Don’t listen to Ariel or Bella. Here’s a better role model for you- her name is Buffy.”

In Which I Finally Watch Grimm

In Fantasy, Portland, Television on November 18, 2011 at 12:00 pm

It was pretty much inevitable that I was going to start watching NBC’s Grimm, but I put it off for a healthy period of time. It was inevitable because they’ve filmed right outside my home, my work, and during my commute. I’ve seen the trailers, sets, cranes, cameras, boom mics, and port-a-potties strewn throughout the city, and all of it has done a lot to pique my curiosity. So, last night during an attack of Crippling Introversion I snuggled up with a mug of green tea and Hulu, and decided to finally watch this thing that has been filming in my live/work/commute space. I also kind of expected to hate it.

Many people (at least the nerds that I tend to hang out with) were comparing it to Bill Willingham’s Vertigo series Fables, a comic that has never really grabbed me. As much as I like Sandman, Transmetropolitan, and Swamp Thing, Fables has always struck me as the contrived and sillier younger sibling of the big kid comics. None of the Fables characters were nearly as well done as Spider Jerusalem, John Constantine or the Swamp Thing. While it’s clever at times, it always seemed like it was skipping the character development step by saying “Hey, look guys! It’s the Big Bad Wolf! You remember him, right? Well, he’s a detective now! Check it out!” It’s fun, but not something as mind-blowing as Sandman or as joyfully profane as Preacher.

So, when I heard that there was a series that was basically Fables (except not) filming in Portland, I kind of went “meh” and thought that I’d never watch it. Last night, though, I was surprised by my reaction: Grimm is certainly not good, but it is also surprisingly not unwatchable. At least from my vantage point as a Portland resident.

Sure, there are a lot of things wrong with it. The main character detective guy is a bland cipher, the writing is stilted (at one point someone says “this is no fairy tale” and I wanted to kidney punch whoever put that in there) and the plot of the first episode is stupid and direct in the way that I imagine James Patterson novels are. (I don’t know- I’ve never read a James Patterson novel, but I assume that his books have all the subtlety and plotting of a chunk of boiled mutton.)

The look of the show, though, is pretty good. Not the CGI and makeup- that’s totally average. I mean the trees and the dark clouds and craftsman style houses that are all over Portland. The show really looks like Portland, and given that my various jobs tend to all add up to “professional Portland nerd,” I got no end of joy in seeing real, live things that I recognized in the show.

(That said, I was annoyed that the addresses in the show were all fictitious, and, worse than that, did not adhere to Portland’s pretty intuitive numbering conventions. But, apparently all of Law & Order’s NYC locales are made up, so I’ll just have to deal with that.)

The one other thing I enjoyed, besides seeing my fair city on screen, was Eddie the werewolf. While the protagonist, Nick, is fairly bland, the guy who plays his werewolf sidekick actually seems to be enjoying the part and brings a certain amount of levity to the performance. That, and seeing people fight with swords in a modern setting kind of reminded me of Buffy and Angel, and triggered some of my Whedon-based nostalgia buttons.

Grimm certainly isn’t good, but it could become something good. There is potential for it to be much more than just a police procedural with werewolves. It might not be the next Buffy, but it’s by no means a failure. I’d be happy to see it renewed, and continue to plaster my city all over the teevee.

A Small Observation Re: Spinal Tap

In Movies, Music on November 12, 2011 at 12:09 pm

Yesterday was 11/11/11, and to celebrate the august palindromic occasion, a local theater pub was playing This is Spinal Tap at 11:11 last night. I and several of my friends went to see Nigel and company’s rocking misadventures, and a fun time was had by all. Spinal Tap is an utterly intelligent and hilarious film. While watching it last night, though, I noticed something that I hadn’t before: Nigel and his rocking compatriots spend the entire film sober.

Watch for it the next time you fire up Spinal Tap. There are no shots of them doing lines of cocaine, shooting heroin, smoking joints, or even swigging on bottles of whiskey. There are a few oblique references to drugs, but there is nothing explicit. For a band that’s supposed to be bombastic and over the top, they spend a remarkable amount of time not getting high, drunk, or both.

This actually works in the movie’s favor- had the guys in the band been constantly inebriated, they wouldn’t be nearly as likable. As much as it’s about a heavy-metal band, all of the musicians in the movie are, as a friend of mine put it, big softies. Their slack-jawed expressions and general doofiness become endearing personality quirks, rather than a side effect of rock ‘n roll excess. It also liberates Spinal Tap from having to confront any issues regarding, say, heroin or alcoholism, and allows the film to retain its light-hearted tone. It certainly would be a nastier movie if we were to see Nigel drowning his sorrows in a bottle or sticking a needle in his arm.

As much as I don’t like whitewashing issues or self-censorship, I thought that this was a very deft choice on the part of the filmmakers. Spinal Tap, after all, isn’t really about sex (though there’s a hint of that) or drugs. It is, first and foremost, a hilarious movie about rock ‘n roll.

Against Leaf Blowers: An Invective

In Rants on November 8, 2011 at 4:42 pm

Leaf blowers are horrible machines, and I hate them.

Earlier today I exited my apartment and stepped out onto the sunny pavement of Portland’s normally mostly-pleasant North Mississippi. I was on a mission of coffee acquisition. I quite like living on Mississippi- today, like many days, there were many nicely-dressed people ambulating about and enjoying themselves. Cool sunlight illuminated everything, and all was well except for one hideous piece of aural pollution that cut through the daylight like some mixture of a coughing walrus and an unskilled oboist. The sound was coming from directly in front of me- there, dominating the avenue with a stream of sound, was a man with a leaf blower. He was wearing protective ear-wear, so presumably the sound was less horrid to him, and before his machine a pile of leaves retreated from the sidewalk and into the street.

Let’s add up the costs and benefits, shall we?

On one side of the equation, leaf blowers make leaves go away. Okay, fine, but we also have rakes and brooms that can do that, so it’s not like they are the only anti-leaf technology that we have. The real benefit of a leaf blower is time. You can make more leaves go away quicker than you can with a broom or rake. So, let’s concede them that benefit.

They also have drawbacks- they are much more expensive than brooms or rakes, so there’s a much higher initial investment. They use gas, which costs money, and presumably there are repair and upkeep costs for the owner. They emit exhaust, which harms the environment and presumably the health of whoever is using them. Leaf blowers also inflict noise on everything around them, and quiet is something of real value. Commercial and residential real estate prices are affected by proximity to noise. A store right next to a loud highway will have a much different value than a store next to a nice street like Mississippi.

So, they’re machines that are more expensive than similar technology (brooms), they consume an expensive, finite resource (gas), and they have external negative consequences for the environment around them when it comes to health, environmental impact, and noise. I am willing to bet that the vast majority of leaf blowers do not operate on a scale where the benefit that they confer (time) outweighs their various costs. They are irrational, ugly things, and I want them to go away. When I see someone using one I can’t help but think “why?” and I want to tell them to stop getting their stupid externalities all over me. In other words, leaf blowers not only blow- they also really, really suck.

What I Think of Occupy Wall Street

In Jobs, Politics on November 4, 2011 at 11:47 am

I have started and stopped this post several times. Unlike having opinions about holidays or science fiction movies, articulating my opinions about Occupy Wall Street has been far more difficult.

As I’ve gotten older, my political passion has cooled greatly, and for the most part I’ve regarded that as a mark of maturity. When I think about what it means to be mature and rational, I’ve oftentimes equated that with having a certain lack of passion. This is a common thing. It is generally cool, after all, to be above it all, or at least seem as such.

Also, I’ve become increasingly more hesitant to categorize myself politically. While I do self-identify as a liberal, I recognize that as an extremely broad category with multiple and oftentimes contradictory definitions. I liked this, as I generally distrust idealism of any sort, and I generally don’t believe in aligning myself with a philosophy that could be outlined as a set of principles. For the most part I’ve envisioned my politics as enlightened and rational, but also passionless and distant. More than once I’ve thought that I would support a regime of benevolent, technocratic philosopher-kings who make dispassionate decisions based on rational analysis of data. I will admit that I would also probably support a regime of well-meaning computer overlords.

When I went to the original Occupy Portland march on October sixth, I thought of myself as mostly an observer satisfying my curiosity. I did not have a sign or anything to say, and envisioned myself being there mainly to take pictures and then come home and write about it. I was amused to see that there was a crazy-quilt of political ideologies at play: There were anarchists, people opposed to the very existence of the Federal Reserve, and a guy who brandished a sign that said Al QAEDA IS CIA. They were exactly the sort of people that turn me off to any kind of political activism- loud, irrational, and unwilling to engage in any sort of debate because their conclusions must adhere to their pre-existing ideologies.

While I was at the event, though, it became increasingly apparent that the lunatic fringe was precisely that- the fringe. The gathering was not one of crazed ideologues at all, nor was it completely dominated by a single demographic. I was surprised at the number of older people, union members, and clean-cut folks I saw. I know this sounds superficial, but I was immensely happy to see folks who were not black-clad anarchy wackos.

Public events like this have two big things that they can do- the first is that they can galvanize the base, and inflame the passions of those who agree with the basic message of the event, and get them mobilized. The other thing that they can do, though, is change the broader political conversation and persuade people who have differing opinions or no opinions about a given issue.

In this regard, Occupy Wall Street (and Occupy Portland) has done both of those things remarkably well, and as a movement it has earned my respect and support. Starting with October sixth, I personally have been increasingly less jaded, less hopeless, and less removed from American politics and economics. The whole event did remind me that yes, I do have a sense of justice, yes, the financial industry has been hugely irresponsible, and yes, economic inequality is absolutely appalling. Not only did the Occupy movement remind me that I dearly believe these things, but that I should be angry about them. I am very much part of that core liberal demographic that popular demonstrations can awaken and galvanize. I was complacent, now I’m not. I was resigned to the right setting the agenda, now I know that does not have to be the case. I felt like things were unchangeable, now I know that’s not true. I am the galvanized base.

Which brings me to the second point- the Occupy movement has successfully gotten a good amount of people, media outlets, and politicians talking about issues of inequity and injustice. Prior to this there was mainly a lot of unproductive ideological talk about debt and deficits that wasn’t really about debts or deficits. The economic well-being of ordinary Americans was not really being discussed at all. Now, attention is not only on the problems of the middle and lower classes, but also the financial institutions who helped create them.

I’m hopeful. That’s hard for me to admit because it seems like such an idealistic thing to say or think (I can’t stand idealism) and it seems like such an immature feeling to have. Hope and hysteria seem to be twin poles of irrationality. Nevertheless, though, I’m hopeful that the conversation really will change, that America can become more equal, and that problems are not intrinsic or entrenched, and we really can fix them.

There are a lot of dumb things going on with Occupy Wall Street (like those idiotic Guy Fawkes masks, or drum circles) but positive change (like, say, separating consumer banking from speculation) only comes if it is demanded. Financial institutions have demanded plenty over the years, and have gotten it. Upper-income Americans have demanded tax cuts, and gotten those. Occupy Wall Street is finally pushing the demands in the other direction- it’s finally pushing back against the small portion of the population that did much to damage the lives of countless people. We have massive unemployment largely because of rampant speculation on the part of a very specific class. It is right and proper to be angry at them.

Because of that, I support Occupy Wall Street, and sincerely hope that the signs and chants and marches and anger get translated into very real policy, because change isn’t coming by itself.

This is a Dumb Post About Chickens

In Animals, Awesome Things on November 2, 2011 at 4:39 pm

Today I was out and about and saw something amazing. Unfortunately, by the time I got out my camera, the moment had passed and I was only able to get the aftermath of the laudable and fabulous event. Here is what I was able to capture:

CHICKENS!

Those chickens, you may notice, have their backs towards the street and are proceeding to the yard in front of them. The amazing thing that I saw today was a flock (do chickens come in flocks?) of chickens actually crossing the road. I saw chickens cross the road! And they got to the other side! It made me stupidly happy.

Now I want to see cows actually coming home, and a dog eating homework.