Alasdair Shepherd's Blog


Touché: Disney laughs at your smartphone’s touchscreen, shows Apple and friends how it’s done.

Posted in Tech by Alasdair Shepherd on 11th May, 2012

First things first, watch this: [embedded video: Disney Research’s Touché demo]

Done? Holy crap, right? Here I am thinking about making a chorded keyboard out of a glove, and Disney of all people can turn your arm into touch controls with nothing but a little bracelet.

There are so many applications for this technology; the video barely scratched the surface. Imagine if it were built into all the floors and furniture of your house. The house would always know where everyone is, and could control lighting and heating appropriately, saving on electricity and gas and adding a bit of safety too. It’s 3am, and everyone’s asleep. Your bladder wakes you up. As soon as you set foot outside your bedroom, low-level lighting comes on to keep you from tumbling down the stairs, without blinding you or waking up the house. How about car seats that judge your posture and suggest you take a break when you need one?
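
Just for fun, that 3am rule is simple enough to sketch in a few lines of Python. To be clear, the house, the sensors and the exact light levels here are all imaginary; this is only the shape of the logic, not anyone’s actual smart-home API.

```python
from datetime import time

# A toy sketch of the 3am hallway rule from the post. Everything here
# (sensor inputs, light levels) is invented for illustration.

def hallway_light_level(now, everyone_asleep, footstep_detected):
    """Return a dim level (0.0-1.0) for the hallway lights."""
    night = now >= time(23, 0) or now <= time(6, 0)
    if not footstep_detected:
        return 0.0    # nobody there: lights stay off
    if night and everyone_asleep:
        return 0.15   # low-level glow: safe, but not blinding
    return 1.0        # daytime, or the house is awake: full brightness

print(hallway_light_level(time(3, 0), True, True))   # 0.15
```

The point is just how little logic is needed once the floor itself can tell you where people are.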

But all of that is near-future, simple stuff. This technology, if it works, paves the way for a world where every object you come into contact with is “smart” (in the smartphone/smart-TV sense of the word). Tap two fingers on your coffee table to pause the movie. Hug your teddy bear to let your parents know you’re well. Touch fingertips with the guy/girl you just met to exchange phone numbers/Facebook links/etc.
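
If you squint, every one of those examples is the same pattern: a (object, gesture) pair mapped to an action. Here’s a minimal sketch of that dispatch table; the object names, gesture names and actions are all made up for illustration.

```python
# Every "smart object" example in the post boils down to a lookup:
# which gesture, on which object, triggers which action.

GESTURE_ACTIONS = {
    ("coffee_table", "two_finger_tap"): "pause_movie",
    ("teddy_bear", "hug"): "notify_parents",
    ("fingertips", "touch"): "exchange_contacts",
}

def handle(obj, gesture):
    """Look up the action bound to a gesture on a given object."""
    return GESTURE_ACTIONS.get((obj, gesture), "ignore")

print(handle("coffee_table", "two_finger_tap"))   # pause_movie
print(handle("coffee_table", "hug"))              # ignore
```

The hard part, of course, is the sensing that Touché does; once a gesture is recognised, wiring it to an action is trivial.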

Bit by bit, cyberpunk is actually coming true. This is all kinds of awesome. I’ve said it before and I’ll say it again: I love living in the future!

Time for a little open-sourcing on this idea? Let me put it another way. WANT.

When I grow up, I want to be a cyborg!

Posted in Tech by Alasdair Shepherd on 27th May, 2009

You know, one of the most unsung technologies of the sci-fi of the past two-and-a-half decades, and my personal favourite, is the neural computer. You don’t see it too much in movies (although Johnny Mnemonic and The Matrix had it) but it’s pretty common in books. In scientific parlance, it’s a Brain-Computer Interface (BCI) or neural prosthetic. It was William Gibson’s Neuromancer that put this stuff in the public eye in ’84, but it has never since been far from sci-fi writers’ minds.

In John Scalzi’s Old Man’s War and its two sequels, it’s the BrainPal; Peter F. Hamilton called it neural nanonics in his Night’s Dawn trilogy and OCtattoos (with an AI “e-butler” built in) in the Commonwealth Saga; Ghost in the Shell talks of the CyberBrain; and in Cory Doctorow’s Down and Out In the Magic Kingdom it’s never really named, but it is, if anything, even more obligatory than a mobile phone is today (and what sort of person doesn’t have a mobile in this day and age?!).

The reason I bring it up is that we’re probably, at the very most, twenty years away from having this stuff in our own heads, and I for one relish the prospect.

Of course, the idea of having a computer in your head seems disturbing at first. Think about it: do you want your brain running on Windows? And what of computer viruses? “I can’t come into work today – I’ve come down with a case of Conficker!” Egads, the horror. But that’s not really how the technology would work. It’s simple enough really: you have a computer somewhere about your person (it doesn’t have to be inside your head), you control it with your thoughts, and it throws its display up directly on your retina. No actual connection with your mind: that’s still sacrosanct. It could and will get more {invasive/integrated – delete to taste} as the years go by and the march of progress goes ever onward, but I’m just pointing out that it doesn’t have to be.

Anyway, there are three major categories into which BCIs (fictional, real and coming soon) could be placed. The invasive BCI is the sort usually seen in sci-fi. It’s actually implanted in the brain and integrates with the neurons to, essentially, become part of the brain. It has the advantage of being the most integrated, but has a significant problem to overcome: scar tissue. It is a foreign object, after all, and as a natural reaction to its presence the tissue surrounding it scars, ultimately cutting it off from the neural pathways it was trying to connect to in the first place. In reality, this is unlikely to appear on the market any time soon as an answer to the BrainPal, although the technology has been used in medicine; Dr William Dobelle has been using it to treat non-congenital blindness since 1978!

Partially-invasive BCIs sit inside the skull, but not actually in amongst the grey matter. They basically act like EEGs (the technology is actually called ECoG, for electrocorticography), but without a skull in the way they can get a better signal. How’s this for futurism: “ECoG technologies were first trialed in humans in 2004 by Eric Leuthardt and Daniel Moran from Washington University in St Louis. In a later trial, the researchers enabled a teenage boy to play Space Invaders using his ECoG implant.” (from Wikipedia’s article on BCIs)

Then there’s non-invasive BCIs – the version most likely to become our first-gen e-butlers. Electroencephalography (EEG) is already fast approaching a commercial release as a leisure item: Emotiv and NeuroSky are getting ready to release their products to the market as we speak (well, NeuroSky will be selling to other companies for integration into future products – Sega Toys and Square Enix have announced they’re working with them, for instance), while OCZ’s nia (neural impulse actuator) and Interactive Productline’s Mindball game are already available (although the latter is too big and expensive for your average home user – it’s actually a table-sized unit). The interface in all these products is a headset with several electrodes that rest in various strategic spots on your head and read the electrical impulses that are zipping about in there. That’s it. You don’t even need to shave your head for it. The software, of course, is what does the heavy lifting in terms of actually providing an interface between brain and computer (or console, or table-sized coin-op game, or…).

So that’s how you control it with your mind. As time goes by the software will naturally get ever smarter, and soon it won’t even need any amount of training (it takes a little practice to get going, by all accounts – but then, so does Wii Sports). Another clever little bit of technology that might be part of this imagined setup is something called the Audeo, by Ambient Technologies. It’s a wireless neck-band that picks up the impulses sent from your brain to your vocal cords and interprets the words that these impulses would result in. It’s aimed at sufferers of ALS (Amyotrophic Lateral Sclerosis, the neurodegenerative disease that Stephen Hawking suffers from) and is intended to give these patients a voice when theirs is gone. Of course, technologies of any type only ever get better and cheaper, and some day soon someone is bound to take this concept and adapt it so that it can be sold to mobile phone users for their texting needs. With a little practice, anyone can learn to fire those impulses without actually giving voice to them, and this is frequently referred to as “subvocalising”.

On the output (from the computer to the user) side of things, again, no brainhacking required. It would be as simple as a pair of glasses (or sunglasses) with a heads-up display (HUD) projected onto them. Several companies, such as Lumus, Rodenstock and others, are working on this very technology. Of course, the concept models they have aren’t exactly fashionable, but they’ll get there. In essence you can see through to the real world, but you can also follow a GPS overlay, watch media, navigate menus by shifting the focus of your eyes, or (probably, eventually) even follow your Twitter stream! In my setup, though, the glasses aren’t a gadget in and of themselves as these first devices will be, but the display unit of your mobile computer; you remember, the one that’s controlled by thought?

As we speak, patients around the world are taking part in a trial of a device called the Argus II, by Second Sight, which is, in essence, a bionic eye. It’s a camera attached to glasses, which wirelessly transmits to an implant in the eye that stimulates the retina. Patients in the trial are reporting that they can make out outlines and bright lights. From there it’s only a hop, skip and jump to the sort of eyes they have in Ghost in the Shell (that was a very memorable image: an extreme closeup of an eye, seemingly human – except for the lens specs printed around the iris!). How long will it be until bionic eyes outperform organic ones? Then they become an upgrade for even the most sighted of us rather than a prosthetic for the visually impaired. The reason I mention this is that they could easily be adapted to take input from our mobile computers as well as their own camera apertures. Your HUD goes straight to the optic nerve without first having to be transmitted as light. Yes, this would be surgery, but this is a while away yet.

Wow, I have gone on, haven’t I? This post started as simply “Won’t it be cool when..?” I just wanted to say that the idea of being your own interface with the Internet doesn’t need to involve invasive surgery, as sci-fi would suggest, and that it really isn’t that far away. I think I’ve certainly said as much now, but I just want to add (as Jerry would say, “And now for my final thought”): I am, as you may have guessed, a sci-fi geek and a dreamer. When any of this technology hits the high street, you can guarantee I’ll be first in line. Hmm, maybe not the invasive varieties, though. Imagine being the first real-life sufferer of cyberbrain sclerosis!

Look out Twitter – It’s Flutter!

Posted in Humour by Alasdair Shepherd on 5th April, 2009

There’s a video out there about Flutter, the next logical step after Twitter. This is very funny stuff, but then again… I must confess, I want a pair of Flutter-eyes.

Twitter has been abuzz about it, and odds are you’ve already seen the video. If not, you can find it here, here or even where I saw it (thanks to Twitter, of course!).

It was a funny video, but here’s what really made me laugh: flutter.com is a real website, and it’s down! It’s an online casino, according to the whois. All the untold hordes who saw that video today (after The Next Web blogged about it) must have thought the same thing I did – “I wonder if anyone’s nabbed that URL yet.” Either that or they thought Flutter might actually be real… Regardless, that video caused what amounted to an unplanned DDoS attack! This is the scary side of the power of social networking. Hilarious.

First!

Posted in General by Alasdair Shepherd on 31st March, 2009

Yes, I have a blog now. Anyone would think I believed my opinion mattered.

Okay, I confess: I have nothing much to say right now. No, really. It won’t be long until something comes along that I feel the need to blog about, but in the meantime you’ll just have to settle for a general “Hello world”-style post. Sorry.

On the plus side, at least now my home page isn’t just a failed search message. I count that as a victory. Over whom? Lethargy. Procrastination. Laziness. The demon goes by many names, and I am its slave. Case in point: I should be writing a lab report (or two) just now. Instead, I’m creating a blog and posting… well, nothing really.

If you’re reading this several years and hundreds of blog posts after it was written, go ahead and laugh. Humble beginnings and all of that. By the way, how is the future? Is everything shinier?

Okay, that’s it. Sorry I just wasted your time. Come back soon!