2009/11/01

Digital Tag-dentity

I butted into a conversation on Twitter (the iPhone insists on the capital T) this evening about digital identity and the issue of choice when it comes to images being posted online without consent, and particularly the tagging of people in them.

The view being expressed, that it should be the choice of the person in the images whether to be tagged or not, would have been my view a year ago, but since then I've begun to embrace the idea that 'digital identity' is something you can only influence and not control.

People will have a view about you and they will express it - that's a fact of life, unless you hide in a cave, try really hard to make sure no one ever notices you, and ensure your life has no impact on the rest of the world. But that's the definition of a life wasted in my view, and the lack of hermits these days would suggest that few people would choose to adopt that policy for their lives either.

So, before the web, we influenced how the world saw us by choosing what to wear, how to act and what to say/write, but for the most part, we relied on a few close critical friends to reflect back for us how the world really saw us.

Now though, the web is an extension of our consciousness and our memory, and much of it leaks into being public. It strikes me that this is rather like the curse inflicted on that hyper-intelligent, smug race of aliens in The Hitchhiker's Guide to the Galaxy, who were all made telepathic as a punishment by the rest of the galaxy and so transmit their every thought to those around them - with the result that they now spend all their time talking about inane things like the weather to block out their real thoughts... actually, that sounds just like Twitter, doesn't it? :)

Anyhow; people's thoughts, opinions and memories of us now leak from their minds onto the web, and hence into the collective consciousness, instead of remaining private or shared with just a few people close to them.

As a result, we can ALL tap into those thoughts and memories - which can be great. We can now enjoy seeing ourselves as others see us (!?), from which we can then moderate and modify our behaviour if we don't like how we are being perceived, or we can engage in a dialogue to revise the perception... It's all about knowing where the conversation is happening so we can join in and have an influence on it.

So, tagging - is it rude/wrong to tag someone in a photo without their consent? In terms of the law (IANAL!), the Data Protection Act clearly applies, so the law does imply a degree of responsibility and a requirement to respond accordingly, but it doesn't say it's inherently 'wrong'. Perhaps the question is one of etiquette, not rules - one of culture and acceptability, not of absolutes. As such, it's something that may change over time as society changes its view.

At the moment, on a very difficult balance, my own view is that the benefits edge it, just: being able to easily find for myself where my image is being shared and what is being said about me, and then being able to join in the conversation and influence things either directly (getting an image removed or comments moderated) or indirectly (adding my own version of things, or changing my behaviour in future). So I would prefer people to tag me in any images online rather than not, even without my prior consent.

I already share 'official' images of myself online through work and personally, which are tagged or otherwise associated with me, but until every photo of me online is tagged in an easily searchable way, I have no way of knowing what other people may have placed online.

In addition, I like to imagine the invisible audience of future generations able to look back and get a real feel for life today - able to see the lives of their ancestors (us). Imagine how much richer our history is thanks to all the photos and film of the early 20th century. Now imagine having only the written words and none of those images; how much poorer our understanding of those times would be. Now imagine instead if you could easily search all, ALL, those old b/w photos and film clips that exist anywhere to find your great-grandparents, to see a glimpse into their lives and times. Wow, need I say more?

Yes, having your likeness online and tagged carries the potential for abuse, by peers, by governments and by black hats, but I suggest it's those abuses which are the problem, not the images or the tagging themselves, and that we need to focus on the abuses as the issue rather than on moderating the tagging behaviour.

Of course, this all relies on everyone taking an active role in moderating their online digital footprint and engaging to influence it in a positive way - something which I understand is still seen as lacking in much of the general population, and perhaps the young in particular. My own (very limited) experience is that the 'google generation' is adopting a different approach entirely: not caring.

We (the older generation) are hung up on how we (and they) are being recorded and indexed and therefore losing all privacy, while they (big stereotype) are resigned to that as an unavoidable part of life and therefore just accept it. It's often said that any tech which exists before we're about 5 we just take for granted, anything new before we're about 20 is 'cool', and anything new after that is 'rubbish' or 'dangerous' - perhaps online digital identity management is like that. We may not win in our efforts to educate the young about the dangers of sharing compromising images online because, to them, that's the norm, defined by new rules we don't yet understand. While they know future employers could see the images, they also expect to be able to see images of their employers, so no one would care: it just ceases to be noteworthy!

So what does this post conclude? I'm not sure, as my own views are quite fluid, but for right now:

* tagging of images is unavoidable
* don't fight change, work with it
* tagging can actually make managing one's digital identity easier, especially in the new reality

I'm waiting to read what others think of this too in their blogs now though :)


-- Posted from my phone

2009/10/25

Micro Blogging and Justice

Just read this article http://bit.ly/4Cslkv about jurors using Twitter during a trial, and started pondering...

Historically, trials have relied on a myth of an idealised jury that exists in isolation from the rest of the world, merely absorbing information presented by the two sides within a courtroom and reaching a decision based only on that.

Within that, the defence and prosecution advocates have needed to become adept at reading a jury's mood/thoughts to pitch their case accordingly. The public are also often left bemused by the decisions juries reach - and as 'justice must be seen to be done', that's not ideal.

So, what if juries were _required_ to record their thoughts publicly during a trial? Sure, there would be issues of honesty, some people would feel intimidated by it, and it would open up many issues around jury tampering, re-trials etc., so it's not simple, but I can see advantages too.

Both sides of the case would be able to ascertain openly whether their points have been convincing, and the public could see the reasoning behind jury decisions (assuming there is reasoning, I know).

OK, probably not the right answer on balance :), but I do think the Internet and ubiquitous access is a game changer for the mechanisms we have historically relied on for 'justice', and we need to recognize that rather than ignore it. It might mean that every jury needs locking away and stripping of all modern devices, but I think there's a way to embrace the new landscape rather than deny it - I just can't see quite how yet. Any ideas anyone? :)

-- Post From My iPhone

2009/10/19

What your e-mail address really tells people about you

Last week I was taking photos of Smeaton's Tower lit up with candles to mark its 250th anniversary, and a complete stranger came over to ask if I could share the pictures I took as their camera had failed.

As a generous sort, and because the friend I was with volunteered me to, I offered to upload them later to Flickr and e-mail them a link. Grateful, the stranger then gave me their e-mail address without a second thought.

Later, when I was about to e-mail the link to them, I was struck by a memory of the session on digital ID I attended at ALT-C, which could have been subtitled 'stalking 101', and wondered just what an e-mail address alone might reveal about someone. The exercise in that session had been too simple, as the one thing we knew about each other to start with was that we were all at ALT-C, so I wondered just what would be possible with literally no additional info.

Scarily, with a simple Google search I found their presence on a forum where they used a different nickname, which led to finding their Twitter stream, and from that I could see where they work, where they study, their hobbies, who their friends are, etc.
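(As an aside, and nothing like what I actually did above - a Google search was enough - here's a toy sketch in Python of just one small part of the general idea: the bit before the @ in an e-mail address is often reused as a username elsewhere, so simply checking a few public profile URLs can be revealing. The site list and example address are purely illustrative, not a real tool.)

```python
# Toy illustration only: given an e-mail address, check whether its local part
# (the bit before the @) exists as a username on a few public sites.
# The site list below is purely illustrative.
import requests

def guess_profiles(email):
    username = email.split("@")[0]
    candidates = [
        f"https://twitter.com/{username}",
        f"https://www.flickr.com/people/{username}",
    ]
    found = []
    for url in candidates:
        try:
            # A 200 response suggests a profile with that username exists.
            if requests.get(url, timeout=5).status_code == 200:
                found.append(url)
        except requests.RequestException:
            pass  # site unreachable; skip it
    return found

print(guess_profiles("someone@example.com"))
```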

Now, this was just an academic exercise and I have no wish to know anything about this person in particular, but I suspect they had no idea that by simply giving a stranger their e-mail address they were effectively giving away such 'personal' info without a second thought. This is something I've struggled with myself since starting with Twitter, as effectively I'm saying "hello world, this is everything about me, please don't kill/rob/scam me!".

But did you realise just how much an e-mail address alone can tell someone about you?

-- Post From My iPhone

2009/10/04

Innovation prevention or professional excellence?

I feel prompted to write something today about the frustration I often feel when my profession is disparaged for simply doing its job, especially by people who often display an incredible naivety about the real world of IT.

I know, opinions are free and we all have them, and I know that people prefer to make their own choices in life rather than be told what to do, but let's face it, work is not the same as hobbies, and when people express a really naive view about IT and then wonder why that view isn't being followed in a corporate or institutional context, I sometimes just want to scream.

I was watching The X Factor the other week (yes, I know, very sad, and this might sound like a bit of a tangent, but bear with me) and as usual in the early programmes I found myself pondering why on earth some of the really bad singers seem completely surprised to be told that actually they have no talent at all. I was struck by the thought that the reason for their delusion is that they don't respect the profession of singing. They don't appreciate the years of work and the effort (or sometimes just raw talent) that singing to an international standard entails.

This seems to be a common tendency in society today. When we go to the doctor we'll check the diagnosis and any prescription on the Internet to arrive at our own opinion; if we need legal help we'll again look to the Internet to arrive at our own opinion of the law; and if we were having building work done then we'd probably all have a view on the right way a builder should be working, thanks to all the DIY programmes filling the TV schedule.

But are there grounds to think that an opinion arrived at from just a few hours' research into these professions has any real merit?

This lack of respect for what a profession or trade really takes seems to be the norm.

There are always people expressing views about IT based on their experience 'playing' with different operating systems or software, from reading a few articles about the latest technologies, or from just being frustrated with the systems they see at work, but do these views have any real merit in the workplace? What experience and skills do IT professionals have which give us a different perspective and lead us to make different (difficult) decisions which can seem to fly in the face of these naive views?

Well, one of the key things people naively ignore is that an IT professional's role is not to provide the latest whizzy shiny thing to the users, but to provide systems that meet the business needs. That's 'needs' and not 'wants' - and two of the key things any business/institution NEEDS are:

* value for money
* business security

These both lead to all sorts of compromises, often complex ones, which can be difficult and frustrating for everyone, but if they're not met then the business won't be around too long to worry about it.

One of the oft-repeated mistakes is to think that something being 'free' means it is automatically the best value for money. If only life were so simple!

Very often, things that appear free up front have much greater total costs. Take for instance Linux for the desktop. Now, I like Linux. I've used it on and off since just about its inception, compiling my own kernels back then, watching the development of the GUI and the very many open source apps that now get bundled with it, and it's great fun and pretty much usable these days. But what does it actually cost? Well, I know it fairly well (and have a very good theoretical understanding of it, so I can quickly upskill when I need to), so when I play with it I can pretty much solve any issues that might arise. But how common are those skills in the marketplace? What training and certification exist for employers to judge whether the people they employ know much about any particular distribution they might choose to adopt? Critically, how much more do you need to pay your IT staff to keep them once they have those skills? And how much more does it cost to provide additional training to the vast majority of non-IT staff to help them transfer their often already limited ability to use MS Windows onto a different desktop, let alone teaching them to use a different Office package, plus all the resulting additional support costs from incompatibilities with other people? I'll give you a clue: it's not free!
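To make that 'free isn't free' point concrete, here's a toy back-of-an-envelope sum. It's only a minimal sketch; every figure in it is hypothetical and purely illustrative, not real costing data for any product or institution.

```python
# Toy three-year cost comparison for a hypothetical 1,000-seat desktop estate.
# Every number below is made up purely for illustration.
SEATS = 1000
YEARS = 3

def total_cost(licence_per_seat_per_year, training_per_user, extra_support_staff, support_salary):
    """Crude total cost: licences + one-off retraining + extra specialist support staff."""
    licences = licence_per_seat_per_year * SEATS * YEARS
    training = training_per_user * SEATS
    support = extra_support_staff * support_salary * YEARS
    return licences + training + support

# Incumbent commercial desktop: licence fees, but no retraining and no extra support staff.
incumbent = total_cost(50, training_per_user=0, extra_support_staff=0, support_salary=0)

# 'Free' alternative: no licence fees, but retraining every user and two extra specialists.
free_os = total_cost(0, training_per_user=150, extra_support_staff=2, support_salary=35000)

print(f"Incumbent desktop over {YEARS} years:  £{incumbent:,}")
print(f"'Free' alternative over {YEARS} years: £{free_os:,}")
```

You can argue with every one of those made-up numbers, which is exactly the point: the licence fee is only one line in the sum.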

Let's look at the choice of web browser. Yes, Firefox is fine, with a lot of very useful add-ons and (perhaps) still slightly better standards adherence than IE 8, and it's free too, right? True, but in global market share Firefox is still only about a third as widely used as IE (going by last month's figures - and yes, I know IE is still drifting down, but very slowly, at a rate that won't see a significant change for another few years).

That means most websites are still designed to work with IE first. They have to be, or else they alienate the majority of their users. That also means your IT support staff HAVE to support IE, even though most of them would probably much rather see the back of it and prefer using FF themselves. Since they have to support IE as a platform, adding another one adds complexity to the infrastructure systems (and their development), decreases support efficiency and hence adds significant cost to the business. So, unless Firefox can deliver a very clear business benefit or is absolutely NEEDED for something, the prudent thing is to just not use it. Why did VHS beat Betamax? Not because it was a better technology, but because of market share. It's chicken and egg, but your IT dept can only react to the world, not re-invent financial reality.

Now, with Firefox for instance, where I work we take a pragmatic view to minimize the impact and allow staff to install and run it themselves if they choose. Since the vast majority don't care and so don't bother, the support overhead is minimized, while those who do have a requirement to use it can do so without barriers being put in the way of getting their job done. The hidden costs to the business of those staff having to support themselves, or getting a reduced level of support centrally, still exist, but are hopefully minimized.

But let's look at something more clear cut: antivirus software. Sometimes the choice of AV software is criticised by users who might have had a good experience with a consumer product at home and feel the choice in the workplace is therefore wrong, because they perceive it to be causing them problems. Installing and maintaining an AV program on one or two machines is a very different thing to doing so on a few thousand machines, so the features which are NEEDED in a large-scale deployment are quite different. Active Directory integration, update bandwidth use, the frequency of updates requiring a restart (very bad for servers), compatibility with specialist applications, remote monitoring and repair etc. are all very important features for a large installation which wouldn't affect a home user at all. In addition, the ability to 'lock down' the software to prevent users putting the business at risk by disabling it is key for business continuity planning and assessment.

Do IT staff take delight in preventing people doing their jobs? No, that's a silly stereotype that I would expect people to recognize as such. Are IT departments staffed with idiots who don't know how to do their job? Well, I'm bound to disagree with that! Do we HAVE to sometimes make difficult compromises between allowing freedom and meeting business needs? Definitely, and making those choices often requires careful balancing of conflicting requirements and an understanding of the business. Is that something you can do naively? No.

So, next time your IT department seems to be out to get you, give them a little more credit - you need to trust that they are professionals making very difficult compromises.

...but don't stop telling them what you NEED; just please don't mistake that for what you WANT.

-- Post From My iPhone

2009/10/02

A silly toon (with me in it)

I created this little toon some time ago while messing with http://toonlet.com/ as a way to make learning fun. It's much easier to create a toon than it is to think of something profound to say, so this isn't that interesting in itself, but it does point out a nice way to make a potentially stuffy, boring subject just that little bit more interesting - as long as it isn't overplayed:


Getting Creative, by sputuk

(Tulip is the name of the Teaching and Learning portal/VLE at the University of Plymouth.) For a more profound example, see this toon created by someone else.

Anyhow, here's another one I created (much) earlier in true Blue Peter style:





Help! I've become a toon!

Let me know what you think - do I have a future as a newspaper cartoonist? Probably not :-)

A Metaphysical Moment

Just a silly little movie created to experiment with the resource Flea Palmer made me aware of by using it herself in her blog post:

Fun learning


Since being shown it, I've been pondering the educational benefits of this game for teaching physics, logical problem solving, maybe even team working if a group worked on solutions together:

Possible uses could be:

  • to introduce physics concepts such as momentum, levers, conservation of momentum, etc.
  • Approaches to problem solving
  • Team working (e.g. have a team split into roles, where one group analyses the problem and produces a concept for a solution, another group has to take that concept and design a particular implementation, and another group is tasked with taking that design and actually creating it in the game - the same or another group then feeds back on the outcome to the first group, who repeat the process until it's solved)

I particularly like the last idea - which might also work for team building events, not just teaching students :-)

Regardless, it's a very fun game, especially in the iPhone version I treated myself to :-)

Note: I'm just adding a few items to this publicly visible blog that I previously posted on an internal, UoP-visible-only blog.

2009/09/25

What is/could be a PLE

Just looked through this paper on a conceptual view of what could make up a PLE. I was struck by how similar this is to what we provide to students at UoP as their "MySite" within SharePoint. Close, but not quite - as that doesn't conform to some useful standards.

Personal Learning Environment - A Conceptual Study

2009/09/20

Response to blog Open letter to blackboard

I just posted this response (with a few corrections) to an open letter to Blackboard (http://bit.ly/A878U) calling on them to embrace current technology and make significant improvements (related to the 'VLE is dead' debate):

Worth a go, and I admire the sentiment to engage with the producer to seek improvements (as I argued for at the ALT-C debate on the issue) - but there are 3 things I'd say you need to consider further:

1) Recoding a large application like Blackboard is not a trivial exercise. Even with complete commitment and deep pockets, it will take quite some time to re-engineer
2) There are potential issues with software patents to consider - where 'best of breed' sites out there may already have protection in place, which may require licence arrangements or very creative re-thinking to avoid issues (a topic Blackboard's legal people are all too familiar with!)
3) You haven't yet clearly articulated the real issues, beyond saying that "frames are bad m'kay?" ;)
I envisaged the community working together to produce a VLE design spec which would really meet the needs of the next few years, against which any existing or future VLE offering could be measured when procuring. This would need to refer to relevant standards for interworking with other systems, detailed feature sets with priorities, etc.

...the other aspect is that of course the Moodle option exists, allowing you to actually contribute directly to the development/improvement cycle too. I also envisage scope within and across institutions to collaborate on projects to produce legal and technical frameworks allowing the easy use and integration of those 'best of breed' services without waiting for existing VLE systems to catch up, as they are probably better focused on being the glue between institutional information systems and the cloud-based 3rd party tools rather than constantly playing catch-up.

But who knows, the revolution might be starting here? :)

-- Post From My iPhone

2009/09/09

VLE is (not) dead

Just watched the video of the "VLE is Dead" debate from #altc2009 and realised that James has added this blog address under my name, so thought I'd best mention it in case anyone looks at this blog and wonders what on earth this has to do with anything :-)

While this is one of my blogs, it's not one I actively maintain, and it has mostly been a "place holder" until very recently.

Most of my work-related blogging is actually done on an internal University of Plymouth blog (for various reasons), so please don't be too disappointed not to find anything related to the debate here, although I will share my notes for my bit of the presentation here if anyone is interested. :-) These include slightly more of the points I'd have made if time and the direction of the debate had allowed - still culled down from my initial list of points.



But feel free to follow me @sputuk on Twitter for a mix of work, personal and often inane thoughts and conversations.

2009/09/06

My view on music downloads as 'theft'

Watching The Big Questions on the BBC this morning, I felt prompted to put the following on the forum for the programme (slightly edited to correct some typos I spotted too late):

The music industry persists with the fallacy that copying music is theft. It is not.

Theft denies the owner of an item any access or use of that item. Copying something creates a separate new item without affecting the rights of the owner. It is not theft.

The music industry and artists also mistakenly equate a copy made with a 'lost sale'. This is not correct. Most copies are made by people with either only a passing interest or little money; if they did not copy the music, they would live without it.

People copy music for the choice it gives them. Music is available 'freely', as far as the end consumer is concerned, via many routes - the radio, music TV channels, pubs/clubs (people do not perceive that they are paying for the music), etc. - but usually they don't have a choice in when or where to hear the tracks they want to hear.

When you buy music, you are conferred a limited right to listen to that track where and when you choose. Downloading music without paying for it is only about gaining that 'any time' access without rewarding the artist. It is not theft, and most people, for most tracks they download, would rather live without that access than pay for it.

Music has ALWAYS been copied - from minstrels travelling around and having their songs re-sung by others, to modern flawless reproduction. Artists cannot stop people re-singing their songs or even remembering the songs in their own minds - all that is gained with a download is greater fidelity to a particular performance. Again, it is not theft.

Artists have historically made money from the scarcity of the copies of their works conferring some value on those copies. This ceases to be a valid business model in the digital age, and the sooner the music industry and artists wake up to that and leverage the market in other ways, the sooner we can stop wasting effort and money as a society on this pointless 'crime'.

Downloading music is not theft.

----

Additional comment: I do, however, personally believe it is immoral not to reward artists for their works, for the things I enjoy, and currently the only way to do that is to pay for music through traditional routes; the sooner alternative business models exist to reward artists, the better.

Germaine Greer made almost exactly the same points in the programme as I'd made, but much more eloquently ;) She made the point that copyright law historically exists to protect authors from publishers, so publishers had to pay authors for each copy they produced. By extension, each individual making a copy of music could pay an artist directly - thus marking the end of the traditional publishing industries; which is of course why they are so vehemently defending the old, dated business model - they cease to have the same significance in the digital world, just as newspapers are also struggling.

The cultural shift needed to recognize that things have changed will be even more important when we all have 3D printers in our homes and can download models to build physical things too. A lot of other industries (spare car parts etc.) will also need to be reconsidered.

2009/05/08

Seale-Hayne End of Session ball 2000

This image is looking at two separate tests - even three:

1) an experiment with coloursplash on the iPhone to selectively remove colour
2) use of Flickit app on iPhone to upload the image to Flickr
3) linking Flickr to Blogger to write about images and add text, which is what this is

I see no reason why this won't work. I can't recall if I've already set Blogger up to tweet and update Facebook when I add a blog entry - if not, those will be on the list to look at at some point, possibly via FriendFeed.

2009/03/28

Using app on iPhone

Although the email method worked, using this app on the iPhone will probably be a nicer way to use this blog.

I have just noticed that tapping on the screen is beginning to make my finger numb though!

Snap of this app:



Testing publishing by email from iPhone

Well, this should work - apparently

Sent from my iPhone, please forgive brevity and spelling errors