I put the (ADD WORD LATER) in Procrastination! by Bonnie Burton, guest blogging over at Wil Wheaton’s blog.
A friendly reminder that we all procrastinate sometimes.
Maunderings of a Digital Self
And sometimes that’s hugely painful or difficult, especially when we’ve been socialized to believe that who we are, deep down, is somehow immoral and incorrect. Because the first thing you have to figure out is who you are. And what you want. And that it’s all right for you to want and be those things, even if somebody else told you it was wrong. Even if it’s risky. Even if your family might not understand. (Of course, it’s also risky because it might involve important relationships changing drastically, giving up things that are precious to you, and re-assessing your investments or renegotiating your life path.)
That can be a tremendously painful process, this letting go of what you thought you ought to be, what you were invested in being–and just being what you are. Feeling your feelings. Writing your words. Making your art, which involves telling your truths.
Read the rest of the post (and really, a lot of her posts lately). Worth it.
I wish I had time to pause. It may be the one real regret about life that I have — there’s always one more thing, or five more things, that need to be tended, to keep the wolf from the door, to keep the roof from falling in. I have silent, solitary winters, like this one, but I don’t get to pause.
I was talking with a friend the other day about what I call “intellectual fappery.” This is in reference to the sort of academic papers (in particular in art criticism or art theory) that are so wrapped up in jargon and linguistic flourishes that they’re unreadable to a lot of people. It never made much sense to me that they’d do this (why make it harder to win people over to your point of view?), and I sort of presumed that it was some sort of smug self-aggrandizement, speaking opaquely to keep out the riff-raff. (Another alternative is that they’re simply too inept at communicating their ideas, and so hide them behind linguistic flourishes.)
I had a minor epiphany, though: while some may be putting on airs, a lot may simply be in love with language. Not in terms of communication, but as expression. Flowery turns of phrase, using ornate, overly complex language not because they want to obscure their message, but simply because they think ornate, overly complex language is pretty. In short, an entire body of writing (looking at you, art theory and art criticism) that prioritizes form over function.
This doesn’t really make it any more fun to read (especially since you’re usually subjected to it as part of studies, rather than reading it by choice), but it does at least seem like a kinder way to look at things.
I’ve been debating doing NaNoWriMo this year. I’ve participated on and off for years, though I’ve never finished. At this point, it’s been a long time since I’ve written fiction (or told any sort of story, fictional or not), and I miss it. It’s a little weird to say that, since a) there’s technically nothing stopping me from doing it now, and b) I was never all that amazing at it. (I’m trying really hard not to just completely bag on my writing ability, since people seemed to generally respond favorably to what they read, and bear in mind Ira Glass’s quote on creative work, but it’s hard. Even with the stories I was moderately pleased with, there was SO MUCH room for improvement.)
I do miss it, though. It’s weird — I’ve felt blocked to the point of frustration for years now, unable to bring myself to get past it, even though I know the answer is simply to keep at it until I get through the brambles. I’ve been thinking about it a lot for a while now — the dearth of creative outlets and making in my life — and it really struck home a little while ago. I was having a conversation with someone who is a maker and doer (and a just generally awesome person), and we were talking about hanging out sometime, and they said they looked forward to hearing/seeing what I make. I was instantly filled with embarrassment, because I felt like I had nothing to offer to that conversation. I love creative people — it’s what I’m attracted to, both in friends and otherwise — and when given this opportunity to make a more solid connection with someone I already liked and wanted to get to know better, I felt like I had nothing to contribute.
Note, it wasn’t anxiety, it was embarrassment. I was embarrassed — I felt like a poser who’d been called out on their facade. I realize that isn’t really fair to myself or entirely accurate — there’s room for people who celebrate art and creativity, who are supportive and the first to cheer others on, and that doesn’t somehow make them a sham. But feelings aren’t rational, and it doesn’t feel like enough to validate the role creativity plays in my personal identity.
So, it’s time to wade into the brambles again. It’s been so long that I don’t even remember what telling a story feels like on my tongue, the heft and shape of a narrative in my fingers. It’s time to correct that. I’m debating doing NaNoWriMo this year, and it almost doesn’t matter if I finish, as long as I actually begin.
Last week, I was at a family reunion filled with fabulous, intelligent, talented people whom I’m glad to call family. One thing I noticed: as people pulled out laptops and iPads and smartphones, or discussed some of the current technological hurdles they’re facing in their day-to-day lives, there was still a lot of frustration and implied distrust of the hardware or software being used. It really hammered home to me that there’s still a long distance between usable and intuitive. They were adding complexity and hurdles that didn’t need to be there, because they were used to a previous mental model that was more complex.
I work with software and computers every day, and have for years. Even a lot of my hobbies end up taking place on computers. It’s easy to take for granted the human-computer interactions I do on a daily basis, because I do them regularly, and generally even if it’s a new piece of software or hardware, it still behaves similarly enough to other software that I can get the hang of it pretty quickly. The thing is, even with the pervasiveness of technology these days, I am an anomaly, not the norm. Many people — highly skilled, capable people — simply don’t have that background and context for understanding, nor the time or interest to gain it. As far as I see it, this is a lot of what user experience design is all about: finding that line between simplicity and complexity, where people have enough detail to understand what is happening (at least at a high level), but the experience is still simple enough that they don’t have to invest cognitive energy to grasp how to use it.
Aiming for clarity is hard on its own, but what I was noticing is that it faces an additional hurdle: overcoming the complexities or mental models of previous designs. It seemed like a big problem in particular for older generations, who had fallen out of sync with what experiences are designed to be now, and were burdened with the expectation of complexity or failure from experiences in the past. It’s easy to say “oh, well they just need to retrain themselves,” but that implies they have the cognitive energy, time, and interest to do so.
That’s not to say we shouldn’t keep working on improving the user experience, but it is something to bear in mind when developing software or hardware. I have a few ideas on how to accommodate this, some of which may be more palatable than others:
Obviously, it’s impossible to please all of the people, and maybe more of this is already in progress than I’m aware of, but it does feel like we’ve got a distance left to go on learning to effectively clear out the cobwebs of past experiences.
[Note: Giant conference notes info-dump behind the link. These are raw notes, I’ve not really cleaned them up at all, but wanted to share in case others find it useful.]
Continue reading “Write the Docs 2014 (Day 2)”
[Note: Giant conference notes info-dump behind the link. These are raw notes, I’ve not really cleaned them up at all, but wanted to share in case others find it useful.]
Continue reading “Write the Docs 2014 (Day 1)”
Back in 2010, I ended up having a really rewarding Twitter conversation with some very smart people, talking about hypersigils and how they apply to the internet. I’ve been thinking more about the topic lately, and wanted to expand on what was said before.
Let’s start with the term hypersigil. The term was coined by Grant Morrison, but the concept has been around for a lot longer than that. The term has a certain magickal [sic] connotation because of its origins, and I know that some folks get squicked out about that. If it makes you feel any better, just think of it as a psychological focus used to effect personal change, in the form of creating a narrative. If that’s still not enough, come up with a better term, one that does an even better job of wrapping a complex concept into a compact word, of packing loads of exformation into a single term, and then popularize that instead. I’d love to hear it.
Continue reading “Hypersigils, Identity, and the Internet”