Transcendent Reality Gaming?

We’re huge fans of AR here at SMXL (both flavors–augmented reality and alternate reality gaming), but we’re always wondering: what’s next?

How about AC–augmented consciousness? Or TRG–transcendent reality gaming?

This fabulous vision video by Michaël Harboun (Transcendenz on Vimeo) gives a taste of where trends such as augmented reality and BCI (Brain-Computer Interface) could lead.

Pretty cool, no? Are you ready to interface with your very own digital guru?

Here’s what Michaël has to say about his project:

In a world in which we are constantly bombarded with injunctions to react or to distract ourselves, it becomes scarcely possible in our everyday life to dwell upon the essential, the existential, the metaphysical…

Transcendenz offers to connect our everyday life to an invisible reality: that of ideas, concepts and philosophical questionings, which the world is full of but which our eyes can’t see. By bringing together the concepts of augmented/altered reality, Brain Computer Interface (BCI) and social networks, Transcendenz lets users live immersive philosophical experiences.

Through meditation, users access the world of Interconsciousness, a white and silent universe in which they can connect to metaphysical experiences. These experiences transform our perception of the world and reveal an invisible philosophy.

By making us aware of things as they are and not as they seem, the concept invites us to transcend our world.

Transcendenz also enables us to access the knowledge of history’s great philosophers, who, since antiquity, have tried to answer the question: “Why is there something, if there could be nothing?”

I WANT Mr. Tappy!

Or how I came to realize, and embrace, what an enormous UX nerd I am.

Mr. Tappy is a camera rig that makes it easy to film users interacting with their mobile devices.

Here’s what the folks behind Mr. Tappy have to say about their product:

Mr Tappy came from humble beginnings – in fact, he originally was made from plastic cut with a hacksaw and bent over a household toaster. This solution worked fine as a usability filming rig a few years back when ‘a phone was a phone’ …but when touch screens and tablet devices arrived, a more flexible and stable solution was required.

Through a series of prototypes and testing with some of Europe’s largest technology design and research agencies, Mr Tappy was born.

Mr Tappy was designed by Nick Bowmast, a UX researcher helping companies develop better products and services through customer insights. Often these products involve mobile devices, and the insights come from watching people use them.

I’m giggling like a schoolgirl… Prototyped? Specifically for UX research? Swoon. Now if only I had an excuse to rush out and buy one.
Anyone out there have an app they need tested? 

Tears in a Bottle – Unpacking the science of the musical tearjerker.

If a recent Wall Street Journal article is to be believed, Adele’s song “Someone Like You” is bound to elicit strong emotions in listeners, because it contains certain structural elements that make human beings more likely to react to it emotionally.

According to the study cited in the article, “Music is most likely to tingle the spine, in short, when it includes surprises in volume, timbre and harmonic pattern.”  In her performance of “Someone Like You,” Adele frequently employs a vocal technique called an appoggiatura, a sort of tonal hiccup that the article describes as “a type of ornamental note that clashes with the melody just enough to create a dissonant sound.”

To sum up, the article states that a song that uses musical techniques to surprise you, or set you slightly on edge, then release that tension over and over in “mini-roller coasters of tension and resolution” is more likely to elicit a strong emotional response, be it chills or tears.  This, in turn, causes the body to release dopamine, which rewards you for the experience and makes you want to repeat it.  Therefore, if you successfully include these elements in a song – along with salient, emotional lyrics – you’re guaranteed to have a hit tearjerker, right?

Now, if you’re like me, here’s the point where your bullshit meter starts going off.  In fact, my first response to this article was to write an angry, fist-shaking diatribe against the wrong-headedness of trying to pin down art with science.  A pretty strong emotional response – maybe the article contained lots of appoggiaturas.

But, calmer heads have prevailed, because when you really think about it, there’s probably something to all the elements mentioned in the article.  Maybe a large percentage of really high-emotional-impact music does contain those elements.  I can think of quite a few other examples that seem to fit.  And yet the analysis doesn’t quite ring true to me.

What’s missing is art.

The reason I find Adele’s song so emotional is that her emotions come across through her singing, and they feel real and genuine to me.  Whether you call it appoggiatura or just plain soul, I feel that her voice is on the verge of breaking into tears at key points in the song (as when she almost squeaks “don’t forget me, I begged…”), and that generates a sympathetic response in me, and so I feel very emotional, too.

Ding!  Lightbulbs!  This is the thing that art — and great storytelling — does so well.  Artistic expression hits home with an audience in that moment where the teller takes messy, raw — even ugly — human emotion, and distills it into something beautiful.

Adele’s performance of the song is loaded with emotion, and her vocals feel like they are about to fly out of control — but then she pulls it together, and out come these moments of real beauty.  A tearjerker is born in the moment when symmetry, beauty, and something I can only describe as a bright pure light shine out of a messy, real, imperfect jumble of human emotions.  This is the roller coaster I find myself on when I listen to Adele sing.

So is it just art, or is there something technical going on as well — something science can point to and find in common with other emotional pieces of music?

I won’t presume to be able to tell you whether Adele’s performance on this recording is as pure an emotional expression as it seems to be, or whether it’s all honed musical technique and artifice.  It’s probably somewhere in-between.

Performance is all about using technique to arrive at something like the true state of emotions required by the given circumstances of a story — and then bringing the performer and the audience along on a journey together.  And Adele shows a mastery in her performance that makes the listener stop in their tracks and experience this song in a visceral way.

However, I strongly believe that you could study the musical formula for a tearjerker until you were blue in the face, and master every single element identified, and still make a song that isn’t half as powerful as “Someone Like You.”  Underneath all Adele’s technique there is a light of beauty that simply cannot be captured by a scientific description of her performance.

And now it’s time for a Pepsi challenge.

First, here’s a cover of the song:
http://open.spotify.com/track/3oDjy7gnCoXTAuVJOu0RsC

The singer has a very good voice, and I believe she utilizes the technique of appoggiatura at the moments specified in the article — in fact, I think she uses it much more, if jazzy blue notes count.

It’s technically decent, and theoretically contains many of the same variations of dynamic, tone, and pattern as the original.

And yet it completely fails to land with me on an emotional level.

Now, here’s the video for Adele’s original recording of the song:
http://www.adele.tv/videos/360/someone-like-you

Close your eyes, and just listen to her voice (the video aspect doesn’t do the song justice).

I don’t hear a formula, and I don’t see a man behind the curtain pulling any levers.  I feel something that seems pure, real, and salient enough to elicit a strong response in me over and over again.

Can someone please pass me a tissue?

How I learned to stop worrying and love FCP X

How most editors felt the day FCP X came out

I know it’s almost totally uncool to come out and like Apple’s Final Cut Pro X.  I’m finally going to risk it.

It’s not a monogamous affair — I have both FCP X and FCP 7 installed on my computer, and will continue to do so for the foreseeable future.  Ask anyone who has been a huge fan of Sony Vegas or even Adobe Premiere until quite recently, and they will tell you how lonely it is to use a non-linear editor that most of your peers haven’t adopted.

However, I think FCP X is well on its way to becoming the most widely used pro video non-linear editing program, just like FCP 7 was.  And here’s why.

It’s the DSLR of postproduction.

At 300 bucks (really $400 with Motion and Compressor) and with an available 30-day free trial, FCP X has seriously bent the cost curve of pro video editing, while at the same time adding features and offering a different take on creating video than previous NLEs.

Adobe has dropped the price of its CS 5.5 Production suite — by half — to $850, which is still twice the price of FCP X.  (To be fair, you get other programs like Photoshop and Illustrator in that bundle.)  Just like DSLRs did for video cameras, FCP X is driving category costs down.  If you’re a student or first-time indie filmmaker who wants to edit video, $400 is an easier pill to swallow than $850.  Apple can count on a fast-growing customer base in the next few years, from the bottom up.

And, when you add FCP X to the just-starting Thunderbolt revolution you get something interesting.  Thunderbolt provides super-fast, daisy-chained audiovisual and storage components that can be plugged into any Mac.  This will replace the Mac Pro edit suite with a modular, edit-anywhere model.  When I look at this, I can see an upgrade path for an independent filmmaker from a $300 program on a Mac Mini to a professional editing system in small, affordable steps.  Kind of like gradually tricking out a Canon 7D to become a full-fledged cine rig.

It fixes a lot of stuff in FCP 7 that was really annoying, but that we had all gotten used to.

FCP X keyframe editor

The color correction tools in X are far superior to 7’s, making for much cleaner adjustments in RGB color space (I do miss the old color wheel, though).  It’s not quite “Apple Color built in,” but it takes you pretty far down that road for most commonly used color corrections, including secondary corrections, vignettes, color-range corrections, and excellent color keying tools.  Best of all, FCP X gets rid of the weird gamma shift in FCP 7 that all but guaranteed your video looked different after export than it did while you edited it.

You can animate video and stills right in the viewer window; the experience is pretty much like using Motion right in FCP (very tactile and visual).  Also inherited from Motion, you can see keyframes right in the timeline, and apply them not just to video transformations, but also to color correction, filters, compositing, and audio.  And once you do use Motion for more advanced graphics work, you can publish many of the effects and filters back to FCP, to use like any other effect, right in the timeline.

Finally, you don’t have to transcode most video formats at all, if you don’t want to.  FCP X works with a huge variety of codecs, and renders full-quality versions of the video in your timeline only as you make changes that require it (like color corrections).  Rendering happens in the background, and is quite fast, assuming you have a recent-model Mac with enough memory and a good graphics processor.  Your projects also take up much less space on your hard drives, which can save a lot of money over time.

It tastes like Apple

FCP X uses a sleek interface that’s easier to learn than FCP 7’s — or any other pro-level non-linear editor’s.  This was one of the most-maligned aspects of the release, since X shares a lot of interface design DNA with iMovie.  But why not?  iMovie was designed by Apple from the ground up, while previous versions of FCP were always made to look and work like other NLEs.

FCP X interface

There’s a lot to be said for the two-window, three-point-editing paradigm that has been with FCP, Avid, and their ilk since the early days of computer-driven editing.  But, it’s really short-sighted to believe that this is the ONLY way to professionally edit video.  Apple very smartly recognized that a system based on editing online and offline edits with tape-based media was out of sync (pun intended) with nearly everything being filmed today.

So, FCP X replaces that paradigm with one based on digital media and metadata.  It is designed around the ability to produce stunning results on a stand-alone computer, instead of requiring an edit suite with lots of boxes.  Which, frankly, has been the state of affairs for most independent and web-based filmmakers for years — now the tools reflect the reality.

I’m still getting used to the idea of using keywords and tags to arrange media, but the possibilities for organizing complex projects are remarkable.  Look for future cameras to include more on-set metadata to take advantage of this functionality in FCP X.  Imagine being able to sort your project by scene, take, and notes such as “best take” or “bad sound” from the moment you import the footage.  This is the next logical step (and the functionality is already starting to appear in high-end cameras).
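To see why that’s so powerful, here’s a toy sketch of the idea in Python. To be clear, this has nothing to do with FCP X’s internals or any real camera metadata format; the clip names, scene numbers, and keywords are all invented for illustration:

```python
# Toy model of keyword-tagged clips: once every clip carries metadata,
# "bins" stop being folders and become live queries over your footage.

clips = [
    {"name": "A001_T1", "scene": 12, "take": 1, "keywords": {"bad sound"}},
    {"name": "A001_T2", "scene": 12, "take": 2, "keywords": {"best take"}},
    {"name": "A002_T1", "scene": 13, "take": 1, "keywords": set()},
]

def find(clips, scene=None, keyword=None):
    """Filter clips by scene number and/or keyword tag."""
    return [c for c in clips
            if (scene is None or c["scene"] == scene)
            and (keyword is None or keyword in c["keywords"])]

# "Show me the best takes from scene 12":
for clip in find(clips, scene=12, keyword="best take"):
    print(clip["name"])  # prints: A001_T2
```

The same footage can live in as many of these saved searches as you like, which is something a folder hierarchy can never do.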

More interestingly right now, FCP X has the ability to sync multiple sources of video and audio into compound clips or multicam clips almost effortlessly.  This makes it much easier to work with dual-system sound and multi-camera shoots.  I haven’t used the multicam feature yet, but from the demos I’ve seen, it looks to be the best multicam editor in any NLE.

One amazing feature I have started using frequently is making an Audition clip, where multiple takes live inside a single clip on the timeline.  I can easily audition each take in the context of the story I’m editing.  The timeline adjusts dynamically to differences in timing, and I finalize my choice with a single click.  Take that, 3-point editing!

Finally, I’ve got to add that the new interface is just more fun to use.  I’ve always found this to be true of Apple software design — the user experience tends to be one of exploration and fun.  When I first taught myself editing on FCP 4, it felt like I was learning to fly a plane by instruments.  FCP X feels like jumping into an X-Wing fighter and just using The Force.  Way more fun and intuitive!

What happens to user experience design…

when humans become more integrated with machines?

What new considerations will we designers have to take into account when our users interact with devices that are plugged directly into their nervous systems, or installed in their bodies?

The good folks at Superflux have been thinking about this in the context of curing blindness. Here’s what they have to say:

What if we could change our view of the world with the flick of a switch? The emerging field of optogenetics combines genetic engineering and electronics to manipulate individual nerve cells with light. With this technology, scientists are developing a new form of retinal prostheses. Using a virus to infect the degenerate eye with a light-sensitive protein, wearable optoelectronics can establish a direct optical link with the brain. Song of the Machine explores the possibilities of this new, modified – even enhanced – vision, where wearers adjust for a reduced resolution by tuning into streams of information and electromagnetic vistas, all inaccessible to the ‘normally’ sighted.

I’ve been fascinated with this concept–the idea of enhancing human experience through designed objects that integrate with our bodies–ever since I watched this TED talk by Aimee Mullins, who sees her prosthetic legs as a desirable enhancement rather than simply a replacement:

Aimee doesn’t want her old legs back. She considers herself better off being able to change her body to suit her mood, and her desires.

It got me thinking: how does the job of a UX designer change when we can help users meet their needs not just by designing digital interfaces for computers and handheld devices that are easy to understand, learn, and use, but by changing a user’s body? What new needs can we help people meet? What ethics should we follow? What should we do, for example, when a perfectly sighted person wants to blind himself to take advantage of an enhanced visual system?

What do you think?

A taste of “going viral”…

On Pinterest?

I’ve been a member of the design-inspiration-sharing social network for about a year now. I use it as a giant inspiration board: a place to save all the things I see throughout the day, on blogs or shared by friends on Twitter, that make me happy and that I like.

Up until about a week ago, that is.

Before then, Pinterest was simply a place for me to save the things I liked. Oh, sure, I’d get a few people liking my pins, repinning them to their own boards, or even following me. I thought that was kind of fun. I had notifications set to send me an email as soon as there was site activity, and I’d get about 10 a week. Maximum. And then, in the space of a day, all that changed. My inbox was filling up with individual email notifications. Hundreds of them.

The first thing I did was turn off individual email notifications. And then I noticed something interesting. All these new likes and repins were for only a handful of my pins, and there was no uptick in the number of folks who followed me. What was going on?

I know that Pinterest has been exploding of late. There are blog posts about how it drives more referral traffic than Google+ (and almost as much as Twitter), and how incredibly successful it is with women (80% of Pinterest users are women; I guess I’m just in touch with my feminine side).

Could all these new users account for my pins’ 15 seconds of fame? Or did Pinterest release some new feature that suddenly boosted my pins’ popularity?

I may never know. But I’d sure like to find out.

Gamification?

I want a badge for everything I do. Who do I talk to about that?

Increasingly, everything I do seems to want to pat me on the shoulder and say ‘Hey, guy, good job!’ I admit: I like it. The group over at Gamify defines the concept this way: “Gamification is the concept that you can apply the basic elements that make games fun and engaging to things that typically aren’t considered a game.” They say it’s going to completely change the way we interact with the world. I’m not sure I buy that, but there are definitely some things to pick up and use in marketing – specifically online marketing.

Rewarding users and providing positive feedback for doing what we want them to do can be the difference between someone staying and someone going. Taken a step further, it can lead them to come back for more, which, in turn, allows you another opportunity to talk to them (let’s be honest, to sell to them).
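To make that concrete, here’s a minimal sketch of the badge-awarding idea in Python. Everything in it (the event names, the badge rules, the thresholds) is made up for illustration; it’s not any particular platform’s API:

```python
# A toy badge engine: award a badge the first time a user's
# activity count for some event crosses a rule's threshold.
from collections import defaultdict

# Hypothetical rules: badge name -> (event type, count needed)
BADGE_RULES = {
    "First Post": ("post", 1),
    "Regular": ("visit", 10),
    "Power Pinner": ("pin", 50),
}

activity = defaultdict(lambda: defaultdict(int))  # user -> event -> count
earned = defaultdict(set)                         # user -> badges awarded

def record_event(user, event):
    """Count an event and return any newly earned badges."""
    activity[user][event] += 1
    new_badges = []
    for badge, (needed_event, threshold) in BADGE_RULES.items():
        if (event == needed_event
                and activity[user][event] >= threshold
                and badge not in earned[user]):
            earned[user].add(badge)
            new_badges.append(badge)  # the pat on the shoulder
    return new_badges

# The tenth visit is the one that triggers the "Regular" badge.
for _ in range(10):
    badges = record_event("jason", "visit")
print(badges)  # prints: ['Regular']
```

The point isn’t the code, it’s the timing: the reward fires at the exact moment of the desired behavior, which is what makes people want to come back and do it again.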

It’s not the solution to every problem, but it might be a cool solution to some of them.

The Ghost Club Storyscape: Designing for transmedia storytelling

SMXL partner Jason Nunes is one of the co-authors of The Ghost Club Storyscape: Designing for transmedia storytelling, a paper by Hank Blumenthal, the producer/director of the feature film The Ghost Club. Jason co-wrote and acted in the film.

Here’s the abstract:

One of the key questions about transmedia storytelling is how to design a participant’s experience across different media so that it is connected and perceived as a whole. We extract four components for building such connections from current work in media studies and production literature and practice. These proposed design components are mythology, canon, character and genre. To test this approach we have designed and developed a group of connected digital media expressions, The Ghost Club Storyscape, to experiment with these four ingredients on multiple media.

The Power of Sticky Notes: Strategies for Identifying and Prioritizing User Experience Goals

Check out this excerpt (currently the lead story on the RockPaperInk blog) by SMXL partner Jason Nunes, from Interactive Design: An Introduction to the Theory and Application of User-Centered Design, published by Rockport Publishers and due out in September.

The Power of Sticky Notes: Strategies for Identifying and Prioritizing User Experience Goals – RockPaperInk.com.

Roots and Leaves: Collaboration: Two minds come together to write and design a book

SMXL partner Jason Nunes and Funny Garbage Creative Director Andy Pratt are co-authoring Interactive Design: An Introduction to the Theory and Application of User-Centered Design for Rockport Publishers, which will be out in September. Andy wrote a great blog post about the team’s writing process, and collaboration in general. He makes some great points. Here are Andy’s key ingredients for successful collaboration:
  • Trust and respect: Everyone on the team must trust each other and know that the other team members will deliver. Don’t focus on what others are doing. Don’t micromanage. Do what you do and do it well.
  • Egoless team members: Confidence is important. Be confident in your skill set, your opinions, and your voice. But listen to what others have to say and let others talk. Collaboration is about dialogue. Your contribution is not measured by how much you talk. It’s measured both by what you say and how you listen.
  • Clear responsibilities: Everyone needs to be clear on who owns what. Other team members should be able to critique. After all, you’re working with professionals who bring their own experience and opinions. A project manager should be able to give their opinions to a designer, a developer to a user experience designer, and so forth. However, in the end, the owner of that decision or task needs to make the final call. And because there is mutual respect and trust within the team, everyone should be comfortable with that.

What do you think? When you collaborate what are your ingredients?