2462: I Don’t Need Any More Tutorials or Updates


I was out today, making heavy use of my phone to assist with some part-time courier work I’ve picked up recently. At some point during the day, the Google Maps app updated, at which point it felt the need to give me a tutorial.

Nothing, so far as I can tell, has changed in the Google Maps app since its last iteration, so quite why it felt the need to deliver an irritating and persistent tutorial is beyond me.

Google Maps isn’t the only app to do this. Pretty much any “productivity” app on mobile these days feels the need to bore you with a pointless (and often non-skippable) slideshow before you can start using it — even in the most simplistic apps.

I get why these tutorials are put in there — it’s to cater to stupid people and/or the technologically disinclined, who might not be familiar with the conventions of interface design. But they should be skippable. And if the app has clearly been on the device — and used heavily — prior to the latest update, it should automatically skip the tutorial by default.

And while we’re on the subject, I can do without pointless, unnecessary updates, too, even though App Store, Google Play and Steam reviewers seem to think that they’re essential to an app or game remaining useful and/or fun. (These people never lived through an age where your word processor came on a floppy disk, and that was it, no more updates unless you shelled out for a new version.) These people are the reason why we get stupid, idiotic revamps to things that worked perfectly well the way they did before, like Twitter and Google Hangouts.

The latter is one I find particularly irritating, particularly in its Chrome extension incarnation. Previously, the Chrome extension was a discreet little affair that took the pop-up Google Hangouts interface from GMail and rendered it in an “always on top” version that could sit on your desktop — tucked away when you didn’t need it, yet just a mouseover away when you did.

Now, however, it’s in its own separate window for no apparent reason — a window that opens up every time you start Chrome, whether or not you have new messages to read — and, presumably in an attempt to “look like Android”, it has one of those annoying mobile-style “drawer” menus on the left. These are fine on mobile as they’re built to be usable with a touch interface, but on the big screen they’re clumsy and unnecessary. I honestly don’t know why we don’t use drop-down menus any more; they may look boring, but they work. At least Mac OS still uses drop-down menus for most apps, though Office for Mac still has that horrible “Ribbon” thing at the top instead of the old-school toolbar from early versions of Office.

Updates are fine when they add something meaningful: look at something like Final Fantasy XIV, which adds meaningful new content with every major version number update. But when they’re change for change’s sake — like Hangouts’ new format, and Twitter’s insistence on reordering your timeline even when you have repeatedly asked it not to — they’re just annoying. And, moreover, that inexperienced audience the developers were hoping to capture with their tutorials will likely end up being turned off by having to “re-learn” their favourite app every few weeks.

And don’t even get me started on the three system restarts I did the other day, with a notification that there were new Windows updates available every time. At least I managed to excise the cancer that is the Windows 10 nag prompt, so I should be grateful for small victories, I guess.

1696: Side Effects

One of the side-effects of 1) having a job that doesn’t involve staring glassy-eyed at the Internet all day and 2) being in the middle of a self-enforced social media blackout (it’s going great, by the way) is that your priorities and even interests change.

Oh, don’t worry, I’m not about to stop boring you with tales of obscure video games any time soon, but what I have found is that I’m in no hurry to keep up with the latest news in gaming and related spheres such as technology.

This was really driven home to me today when someone asked what I thought of Apple’s new announcements.

Eh? I thought. I haven’t heard anything about those.

Apparently Apple announced a new iPhone and a smartwatch, whatever the fuck one of those is. And I was surprised to find how little of a shit I gave about either of them. My current phone is a functional workhorse at best, though without Facebook and Twitter demanding my attention every few minutes it stays in my pocket or drawer a lot more than it used to, and is largely being used for a bit of lunchtime Web browsing and playing music in the car. As such, I find it hard to get excited about the latest piece of shiny, pretty and overpriced tech that Apple is coming out with. My honeymoon period with “smartphones” is well and truly over: I’m not interested in playing games on them, I’m rapidly discovering the value of not having social media in your pocket, and for organisation, frankly I’d rather use a paper notebook and calendar. Get off my lawn.

It was the watch that particularly bewildered me, though. Before I left the games press, tech writers were just starting to get excited about “wearables”, and I couldn’t fathom why. I still can’t. It just sounds like an unnecessary step in the process of consuming digital content, and a way for the ever-present menace of notifications to be even more intrusive to your daily life than a constantly beeping phone already is. A little computer on your wrist is something straight out of sci-fi and a few years ago I’d have been all over it, but on reflection, now? That’s not what I want. Not at all.

I’m not writing about this to be one of those smug “well, I don’t care about those things you’re excited about” people — though I’m well aware it may well come across that way. Rather, I’m more surprised at myself; I always had myself pegged as a lifelong gadget junkie, and the trail of defunct-but-useful-at-the-time technology (Hi, Palm!) my life has left in its wake would seem to back that up.

But I guess at some stage there’s a saturation point. You see something, and see no way for it to possibly fit into your life; no reason to own one. I already felt this way about tablets — I barely use our iPad even today — and I certainly feel it about Apple’s new watch. Smartphones still have something of a place in my life — if nothing else, it’s useful and convenient to have things like maps and a means of people contacting you (or indeed contacting others) in your pocket — but their role is much diminished from what it was, and I’m in no hurry to upgrade to the latest and greatest.

It’s another case of, as we discussed the other day, solutions to problems you don’t have. All this technology is great, but it convinces us that our lives would be an absolute chaotic mess without it — when, in fact, it’s entirely possible that the opposite could be true. After all, the human race survived pretty well before we discovered the ability to photograph your dinner and post it on the Internet, didn’t we? While I’m not ready to completely let go of my smartphone — not yet? — I’m certainly nowhere near as reliant on technology as I once was, and I’m certainly not obsessively checking news feeds to find out the latest and greatest news about it.

And you know what? It’s pretty nice and peaceful. I could get used to this.

1542: Terebi Desu

Our new TV arrived today at some ungodly hour in the morning — which felt all the more ungodly for the fact that excellent Vita dungeon crawler Demon Gaze had kept me enraptured until 3am — and I’ve been having a bit of a play with it. (For the curious, it’s a Samsung Series 6 55-inch LED TV; it has a catchy three-thousand-digit model number but I have no idea what it is.)

When Andie suggested we grab a new TV, I was a little concerned that it might not be a significant upgrade over what we already had — a 40-inch Samsung, albeit one that is now about four or five years old. After all, despite the fact that my previous TV was an end-of-line model when I bought it — making it much cheaper — it was pretty good. Three HDMI ports, built-in Freeview tuner, full 1080p support — it had pretty much everything I needed, though it would have been nice to have an optical output port. Everything I connected to it worked just fine, though, ranging from the PlayStation 2 through the SCART port (yummy, blurry standard-def picture) to the various games consoles and PC through the HDMI ports.

With the previous TV working just fine, why buy a new one, you might ask? Well, having spent this evening playing some Final Fantasy XIV on it and having watched some anime and TV on it earlier… yes, it was a good investment. The increase in size is extremely noticeable — it’s big enough to have a touch of “peripheral vision” now, giving a much more immersive feel to both video and games — and the LED screen is lovely, bright and clear. I have no idea if I’ve optimised its settings appropriately — I’ve put the PC input into Game mode, because prior to that there was noticeable input lag, but haven’t really fiddled with much else — but it certainly seems to look very nice, although as Andie pointed out, the bigger the screen you get, the more of a dog’s dinner standard-definition footage and TV broadcasts look. Oh well.

It’s a Smart TV, too, which means it has two remotes, one of which has a trackpad rather than, you know, just being normal, plus “apps” for doing shit old, dumb TVs don’t do. There’s stuff like BBC iPlayer and Netflix built into it, for example, and even apps for things like Spotify and the like. (There are also games to download, but somehow I don’t see them being particularly worthwhile, and as such I will be giving them a wide berth.) I’m not entirely convinced how much I will use the “smart” features over time, but it’s nice to have them there, I guess — not to mention the fact it is seemingly now impossible to buy a new TV that isn’t 1) “smart” and 2) 3D.

The 3D thing surprises me somewhat, I must confess. I thought 3D TV and gaming had been a colossal failure, and yet all the televisions we looked at over the weekend were 3D in one form or another. The TV we ended up getting is “active 3D”, which is supposedly better for various reasons (though it does mean you have to turn the glasses on before they work properly), and sure, it’s quite fun — we watched a couple of trailers in 3D earlier and it was quite cool — but it’s not something I can see myself using a lot of, and certainly not for protracted periods of time. It will almost certainly be something to show off to people who come and visit, but little else.

Anyway, I’m very pleased with it. It fits nicely on our TV stand and doesn’t look too big or too small, and it’s a noticeable upgrade over what we had before — plus the almost bezel-free design, with the picture going right the way to the edges of the front of the unit, looks absolutely smashing.

I’m sure I’ll be taking it for granted before long — and I’m not looking forward to moving it when our new house is sorted — but yes; I’m glad we got it. And now I’m off to bed because I’ve been staring at it all evening and I think my eyes could probably do with a rest!

1155: The Tablet Revolution

I’ve come to the conclusion that I’m a dusty old bastard who is set in his ways like an old man. That or everyone else is just plain wrong. Or perhaps a combination of the two.

I’m specifically referring to the “tablet revolution” — that futuristic gubbins that supposes everyone is going to replace their computer/console/handheld/everything with a tablet such as an iPad or whateverthefuck the bajillion Android tablets are called these days. I even read an article earlier where someone from Zynga said that tablets are “becoming the ultimate game platform”.

I must respectfully disagree — at least for my needs and wants, anyway.

Our house has three tablets — an iPad 2, a Motorola Xoom and a Nexus 7. The Nexus 7 is currently in for repair, but got a fair amount of use by Andie, largely for free-to-play mobile games and Kairosoft titles. The iPad 2 also gets a fair amount of use by Andie for the same reasons. My Xoom gets barely any use, though the fact I have SNES, Mega Drive and various other emulators on there ready to go at a moment’s notice is pretty cool.

But yeah. The fact stands: I hardly use these devices at all. Why? Because for my purposes, they don’t offer a superior experience to other bits of kit. For gaming, I have consoles, dedicated handhelds, a laptop PC and a desktop PC. For work, I have my Mac, the aforementioned laptop PC and the desktop PC at a pinch. For browsing the Internet, I have… you know how this goes by now. For me, all of these devices offer a considerably superior experience to all of the tablets we have in this house.

Oh, sure, tablets can ably perform several of these functions, but they don’t do any of them as well as the pre-existing devices. About all they do offer, really, is the fact that they’re incredibly quick to turn on (assuming they have some charge left in them, which my Xoom in particular rarely does) and are a lot more portable and lightweight than many other devices.

But personally speaking, the fact that, say, the iPad is thin and lightweight isn’t enough to make up for the fact that it’s a lot more difficult to type on than an actual physical keyboard. And yes, I know, you can pay through the nose and get an iPad-compatible wireless keyboard (or a generic one for Android), but not only does that remove one of the main benefits of a tablet — its all-in-one portability — there are other issues too: the pain in the arse it is to access the file system (on iOS, anyway; this is one area where Android is marginally better); the fact that proprietary iOS and Android apps rarely play nicely with established formats (just try getting a Microsoft Word file with any formatting or layout whatsoever to look even a little bit right in Pages for iOS); the fact that some of the work I do requires the precision of a mouse rather than the cack-handedness of a touchscreen; and the fact that some websites I want to use are designed for use on a computer with a keyboard and mouse rather than a touchscreen and a virtual keyboard.

And don’t get me started on the games. “The ultimate gaming platform”? Don’t make me laugh, Zynga. While mobile and tablet games have been enormously successful in getting more and more new people into video games, and that’s a good thing for the industry as a whole, there is no way you can say in good conscience that tablets are an adequate replacement for more established systems — and better-designed control schemes in particular. Have you ever tried to play a first-person shooter on a touchscreen tablet with no buttons? It is one of the most bewildering experiences you’ll ever encounter: why would anyone want to put themselves through that? There are certain genres that work well, of course: strategy games, board game adaptations, word games and adventure games are all good uses of a touchscreen interface… as are the never-ending throng of isometric-perspective building/farming/dragon-raising games that are little more than vehicles for monetisation. In short, there are very few tablet-based games that hold my attention for more than a couple of minutes — the last was Ghost Trick, which doesn’t really count as it was a conversion of a Nintendo DS game.

I guess that’s sort of the point, though. The main benefit of tablet devices (and smartphones, for that matter) is their immediacy — you turn them on, you tap a button and you’re (almost) straight into a game, and you can be out of it again within a matter of minutes if you just needed to fill an awkward silence or wait for someone to come back from the toilet. And that’s good, in a way; it just doesn’t really fit with how I play games. As I noted in a reply to Anne on yesterday’s post, I play games as my main form of entertainment. I don’t watch much TV, I don’t watch movies, listening to music is something I tend to do while engaged in some other activity, and so games are my main “relaxing time” activity. I want to sit and play something for an hour or two (or more) at a time, and between freemium energy please-insert-credit-card-to-continue bullshit and the “bite-size”, disposable, forgettable nature of most mobile/tablet games, I just don’t get a satisfying experience from them.

Meanwhile, the laptop I bought a short while back is easily my favourite piece of kit in this house. It’s powerful enough to play pretty-looking games like TrackMania, yet portable enough to carry around in a bag. Its battery life is decent (though not a patch on a tablet) and it has a nice screen. It’s a good means of playing visual novels without having to tie up the TV, and it copes well with anything I might want to throw at it while working on the go. In short, it’s an all-in-one device that does absolutely everything I want it to without making any compromises or dumbing the experience down at all. Sure, it takes a bit longer to turn on than the iPad, but it’s also infinitely more useful and fun to me.

Fuck the tablet revolution, basically. Long live the laptop. And the games console. And the desktop PC. And the dedicated handheld. And, you know, sometimes, just a piece of paper.

#oneaday Day 932: Take Control

I’m generally a pretty disciplined sort of person. I’m good at prioritising, and if I have something that I have to do I’ll make sure that I complete it before I do things that I want to do.

It’s when it comes to prioritising the things that I want to do that things go a bit pear-shaped.

It’s easy to stumble through your days as normal and just let things happen. But if you do that it’s easy to fall into routines and patterns and then wonder where the minutes, hours, days go. Those things that you want to do sometimes get forgotten amid your default activities, your comfort zone, the things that you do without thinking.

In order to fit in all the things that you want to do, sometimes you have to take drastic steps. Steps like scheduling your time.

This approach doesn’t work for everyone. Some people are terrible at sticking to schedules, others simply don’t like the lack of flexibility. But I’ve discovered (and rediscovered) several times over the years that I actually seem to work better and be rather more efficient if I plan out my time carefully rather than simply taking things as they come. It’s a hangover from quite enjoying the sense of “structure” from school and university (even if — ssshhhh… I didn’t always show up to my university lectures and seminars) and it’s something that I should really start doing more of in my daily life if I want to fit everything in. Because even with scheduling, it’s sometimes tricky to squeeze all your desired activities in, and that’s when you have to decide how to make compromises and sacrifices. Thankfully, with the things that I want to do at the moment, I haven’t had to make too many of the latter.

The ironic thing about people not wanting to organise themselves these days is that it’s so easy to do so now thanks to technology. You can make your phone remind you to do things, set email-based nags to pop up in your inbox, create task lists that synchronise between devices, take snapshots of things and store them “in the cloud” (urgh) for future reference. You can even get social and be public about the things that you want to do, making use of your friends as a means of browbeating… sorry, “encouraging” you to actually get on and do stuff.

I use a few simple tools to sort myself out. Firstly and most simply is Google Calendar. I use this in favour of iCal on my Mac because it’s easier to sync between devices, is stored online rather than tied to a single device and works with iCal and iOS anyway. Google Calendar is a decent tool with enough features for what I need to do — multiple colour-coded calendars, email reminders, the ability to invite people, time zone support — and it proves valuable when I have taken on lots of things and only have a limited time in which to do them. It was especially valuable this time last year when I was going to Gamescom in Germany and every developer and publisher in the world suddenly wanted a bit of my time. (Apart from EA. They ballsed up my appointment — their fault, not mine — and wouldn’t let me in to their stupid high-security compound. Fuck them. I went to go and see Larian Studios instead, which was much more fun.)

Alongside Google Calendar, I’ve tried several other tools over the years. Evernote is pretty neat, for example. Epic Win was a cool idea that gamified your own productivity, but development seemed to stop quite a while back and it’s still lacking a few features that many other task manager apps offer. Most recently, I’ve been playing with Springpad, which I like a lot, despite a few rough edges.

Springpad is quite a bit like Evernote, but with a few interesting twists. It’s based around the concept of “notebooks”, which are ways of grouping related content together. Within a notebook, you can create a wide variety of different notes, ranging from simple text notes to checklists (mini to-do lists, essentially) via tasks, recipes, books, product information (scannable via the RedLaser barcode-scanning interface on the mobile apps) and all manner of other stuff. A webclipper bookmark allows you to easily clip things into your notebooks, and the interface generally does a pretty good job of figuring out what kind of content you’re trying to store — I tried it with a recipe from BBC Good Food earlier and it successfully recognised it as a recipe, though failed to import the ingredients list correctly.

Springpad also features a “social” component which allows users to make their notebooks public, too. While I’m not entirely sure that this has been particularly well thought out, it does provide an interesting alternative use for the service, effectively turning it into a kind of blogging platform. Notes can be used as entries, the more specific types of notes used to provide specific information, and the site’s in-built commenting facility allows users to build up a community. It’s a neat idea; I’m just not entirely sure how useful it is.

So anyway. Armed with these simple (and free) tools, I’m attempting to organise myself a bit better. After two days, I’ve already managed to do a bit more than I would have done otherwise, which is pleasing. I shall continue with this system for a little while and see if it’s something that I want to make stick. It will be an interesting experiment if nothing else, and it might actually spur me on to get some things done that I’ve been meaning to get done for a while.

Further updates on exactly what that is when I have something to share.

#oneaday Day 834: RUMOUR: Rumours ‘Rumoured’, Says Rumour-Monger


If you’ve ever started a conversation with “I heard that…” and then gone on to explain exactly how you heard somewhere/from some guy in the pub/from “The Internet” that something awesome/awful is going to happen, then I urge you to think before you speak in future. Because if you continue with that sentence, you’re simply feeding the rumour mill, and the rumour mill doesn’t produce good things and help us make the Bread of Truth. It produces garbage and poo, and then squishes it all out into the world’s most unpleasant pâté.

Tortured (and gross) metaphors aside, it’s a fact that I wish more people — particularly in the press — would cotton on to.

Today, for example, saw news that Liberty X “might be” reforming for a new album and a tour. Firstly, I don’t think anyone wants that, and secondly, the only evidence that such a reunion “might be” happening is the fact that they were photographed together outside the ITV studios and — get this — they were smiling. Stop the fucking presses.

There are a ton of journalism sectors that are particularly prone to this. Showbiz columns report who might be sleeping with whom. Sports columns report who might be moving to some other club for a disproportionately enormous amount of money. Music and arts columns report who might be working on what. And then, of course, there are the tech-related industries.

Anything related to Apple is accompanied by an inordinately huge amount of rumourmongering, for example. In the run-up to the company’s announcement of the third-generation iPad, all sorts of nonsense was flying around. This ranged from suggestions that it might not have a Home button to the frankly astonishing assertion that the reason iOS apps had started having textures like leather and the like in the background was because the new iPad would have a haptic display — i.e. one where you could feel textures as well as see them.

The video games industry is far from immune, either. Rarely a week goes by without one outlet reporting on some rumour from a mysterious, anonymous source and the “story” then being picked up by every other news site on the Web as if it were fact. This particular rumour mill goes into overdrive as a hardware generation starts to wind down and people start wondering what the next generation of consoles might look like. Inevitably, the vast majority of stories turn out to be absolute bollocks, and on the rare occasions when an outlet or reporter writes something that turns out to be true, there’s at least a day’s worth of smug, self-satisfied cries of “Called it.”

No you didn’t. You were throwing darts blindfolded, and you happened to hit a lucky bulls-eye. Your other fifteen darts are embedded in the barman’s testicles, the barmaid’s left boob, the right ear of that hard-looking dude who drinks absinthe by the pint and the TV that was showing the Bolton v Wigan match. (Everyone is angry. I’d run, if I were you.)

So why do we persist in reporting on these festering sores on the very arse of journalism? Because they attract attention, particularly if they’re controversial. If one site prints a story that Liberty X is reforming, or that the next Xbox will feature a system to prevent used games from working on it, or that the iPhone 5 really, totally, absolutely positively is coming out this time, then that will attract commenters like flies around shit. And that means page hits, advertising revenue and the little graphs that make the men in suits happy moving in an upward direction. Who cares if it’s absolute nonsense dreamed up by someone who cleans the toilets at Microsoft? Print it!

I make a point of not reading any stories that start with the prefix “RUMOUR:” now. And should I ever find myself back on the news desk for a popular gaming website, I will most certainly do everything in my power to avoid reporting on such nonsense — unless some actual investigation turns up something interesting, of course. But blindly parroting another site’s “anonymous source”? No. Just no.

So, then, I reiterate: think before you speak/write/publish. Because rumours are rarely helpful. Remember that time it spread around the whole school year that you’d shat your pants when in fact you’d just sat in some mud?

Yeah. That.

#oneaday Day 779: Snark Pit


I’ve kind of had it with snark. The whole “let’s piss on everything” parade that shows up any time something vaguely interesting or cool happens is getting really rather tiresome, and over the last few weeks and months I’ve actually been taking steps to minimise my exposure to it by simply unfollowing people on Twitter who prove to be irritants in this fashion. (British game journos, you don’t come off well in this poll, by the way, naming no specific names.)

Unfortunately, on a day like today, which held among other things the promise of a hotly-anticipated iPad-related announcement from Apple (which turned out to be “The New iPad” with its shiny retina display and quad-core processor… yum), it’s difficult to avoid said snark. It seems that for a lot of people nowadays, if something isn’t to your own personal preference, then no-one should enjoy it.

At this point I’ll say that I’m well aware I’ve been guilty of this in the past, and for that I apologise. (The X-Factor is still unquestionably shit, though. There is no valid argument in favour of a show that gave the world Jedward. I’m just not going to rant at length about the subject any more.) I am trying my best these days to see arguments from both sides, but unless you’re some sort of level 99 mediator, you’re always going to come down on one side or another. So long as you don’t force your views on others and expect everyone to agree with you, everyone should be free to do that. (Unless it’s about something dickish. I think we can pretty much universally agree that those who judge people based on skin colour or sexual orientation can all pretty much just bugger off and sit on a spike.)

I digress. I was talking about snark, and specifically relating to today’s Apple announcements. The new iPad is, by all accounts, a lovely-looking device, and the Retina display is sure to raise some eyebrows. As per usual for an Apple event, the company came out with its usual stuff about how it believed we were entering a “post-PC” era and about how people supposedly “preferred gaming on their iPad” to consoles and computers.

Contentious comments, for sure, but firstly, they’re marketing hyperbole — Apple announcements are press events, after all, and a company as big as Apple is never going to be humble about its achievements or lofty ambitions — and secondly, it might not be quite so unreasonable as you think. Already many households are making use of iPads for simple tasks such as browsing the Web, checking email, watching TV and movies, playing games, keeping themselves organised and all manner of other things. And the sheer number of people who have downloaded Angry Birds, whatever you may think of it (I hate it) should give you pause when considering the gaming-related comments.

But instead of thinking these points through rationally and considering the perspective that Apple might have been coming from, in it was with the snark about how wrong Apple was and how much bullshit they were talking. Up went the defensive walls, and a veritable barrage of snark was fired over the parapets towards anyone who dared to say “hmm, hang on, that’s actually quite interesting, and possibly plausible”. (I’m not saying their comments were true, rather that they deserved greater consideration than immediate outright dismissal.)

It only continued when, as usually happens in Apple announcement events, software started to be shown. The new versions of iMovie and GarageBand for iOS drew particular ire, with various Twitter users making acidic comments about how awful the music people make with GarageBand supposedly is, and how terrible the “movie trailers” facility of iMovie is.

Once again, no consideration was given to the audiences that these features might be directed at. As a former employee of the Apple Store, let me assure you there is absolutely no love lost between me and the tech giant of Cupertino, so I have no “need” or contractual obligation to defend them — and also, a company the size of Apple certainly doesn’t need my defence either. But as a former employee, I know that Apple customers aren’t just high-falutin’ creative types, gadget freaks, tech snobs and people with more money than sense. I know that people who walk through the front door of that store range from very young to very old; from experienced computer user to complete beginner. I know that there’s a considerable proportion of that audience who came to Apple because of its products’ reputation of ease of use. I’ve even taught plenty of those people how to achieve simple tasks in products such as iMovie and GarageBand, and to see the looks on their faces when they realised that yes, they could be creative with their computers despite their lack of technological knowhow was, to use a word Apple itself is very fond of, magical.

As such, I feel it’s grossly unfair and downright blinkered for people (including professional commentators in some situations) to completely dismiss a considerable proportion of Apple’s audience and declare a feature to be “awful” or “crap” simply on the grounds that they don’t see the appeal, or think that its results are cheesy. (They are, but imagine if you had no idea how to edit a video and suddenly discovered you could put together a slick-looking movie trailer from your holiday footage and upload it to the Internet. You’d be pretty stoked, and you wouldn’t care that it was a bit cheesy. If you were inspired by this ease of use, you might even look into the subject further to find out how to take more control over the stuff you were creating.)

I’m using Apple as an example today since the announcement is still pretty fresh in everyone’s mind. But the presence of snark can be found pretty much any time something interesting is announced or discussed, especially in the tech or gaming industries. You can count on there being an unfunny hashtag pun game mocking the story within a matter of minutes; endlessly-retweeted “jokes” trying to look clever; and, of course, protracted slanging matches any time someone calls these people out on it.

And, you know, I’ve had enough. If you have a valid criticism of something, by all means share it and back up your point. But if you have nothing to say other than “I think this is crap, therefore everyone else should too” then kindly keep it to yourself. Because, frankly, your opinion isn’t anywhere near as important as you think it is.

#oneaday Day 735: Enough with the Period Jokes


I’ve been using our new toy, the iPad 2, for a little while now, and I have to say it is a most wonderful device of much majesty. Like many others, when the original iPad was first announced, I was skeptical as to whether such a device could be useful when we already had smartphones. No one seemed quite sure who needed a tablet device, and it didn’t look like Apple did either.

That’s because, as it turns out, pretty much anyone can get something out of a tablet device. My experience with this particular breed of tech is, at this time, limited to Apple’s entry to the market along with my Kindle (not exactly the same breed of device, but does what it does very well and is making me read more — always a good thing) but I can imagine there are similar benefits to Android tablets, albeit without the robust infrastructure that is the App Store.

Let’s consider what I have used this device for today. I have browsed the Internet on it. I have looked at Twitter on it. I have shared images using it. I have played games on it — both five-minute diversions and deep RPG experiences. And right now I am writing a blog post on it, the cack-handed image you see at the top of which was also created on the iPad.

In short, I’m rather in love with it. In fact, the only thing I can’t completely do with it is my job, since the sites I currently write for use self-hosted WordPress that isn’t set up to work with the iOS app, and this means I can’t upload images via the Web interface. A bit of a pain, sure, but at least I can write the posts on the go and put the images in later should I need to.

I’ve been impressed with what an all-round entertainment device it is, particularly now we finally have Netflix in the UK. Should I find myself wanting to watch Twin Peaks while on the toilet, I can. We really are living in the future.

I’m sure the novelty will wear off soon, but the fact that since I’ve come home from America I haven’t played a single PC or console game as yet is somewhat telling, and the videos I’ve watched on the big TV were simply to have them on the big screen — if I wanted a more personal experience, it’d be no big deal to transfer them to the iPad, particularly now you can do it over Wi-Fi.

A sound purchase, then, and not a hint of buyer’s remorse. I may be done with Apple as an employer, but it’s hard to deny that they make damn good products through that gradual process of refinement they go through over the years. I’m intrigued to see what the third iPad may have to offer, should the rumours of its release in March of this year turn out to have any validity whatsoever.