The Tree of Liberty may need watering!

It may be prophetic that it was the American Thomas Jefferson who, in 1787, is quoted as saying “The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants. It is its natural manure”. Read the full context at ‘The tree of liberty… (Quotation)‘.

Prophetic because it is the Government of the United States that is testing the patience and tolerance of more than its own populace as it continues its fight for control over online open expression and orchestration in the information wars. Or rather its voice through the judiciary, in the latest instance in the form of U.S. Magistrate Judge James Francis, who last week ruled that U.S. search warrants can compel providers to hand over customer data stored on servers located outside the U.S.

In simple terms the US is completely losing the plot. It is proposing that it can reach into other sovereign territories outside the established protocols of ‘mutual legal assistance’. In other words the US is now so brash it is simply marching into other countries without asking and seizing property, simply because it says it can! Property in this instance takes the form of electronic data, but it is property nonetheless. Just ask Hollywood and the recording industry whether they deem their digital data (movies, records) property or not.

Even the argument that Microsoft is an American corporation based in the US does not hold, as Microsoft is the Data Processor, NOT the Data Controller. The Data Controller remains the individual whose data is being sought. A basic tenet of the digital world is the separation of Processor and Controller, where legal ownership and ultimate responsibility reside with the CONTROLLER, not the PROCESSOR.

Credit so far to Microsoft, which has pushed back on this, stating the simple basis globally accepted in the new age of Cloud Computing (and supported by traditional sovereign state and international laws): the data resides OUTSIDE the jurisdiction of the US and Microsoft is not obliged to hand it over.

The problem is the US government has a reputation as a bully, a reputation backed up by far too many corporate and personal cases where it has coerced through an abuse of its powers. The tactics used usually take the form of indirect as well as direct threats of operational burdens on business, such as IRS investigations, or restrictions on individuals’ freedom of movement, amongst others. The rest of us in the world have long looked on at our poor US cousins as they have become subject to losses of liberty and freedom, and to impositions that few other parts of the world, with the exception of those that may be classed as regimes, are subject to.

The issue now is that the US is breaking out of its own playground and throwing its weight around in an embarrassingly arrogant manner that crosses many moral and rational boundaries. In the context of the Iraq and Afghanistan wars it was on a huge scale, dressed up in the guise of ‘Global Policeman’. Latterly it has come in the form of its stimulation of the Arab Spring across North Africa and its involvement in the destabilisation of Syria. For many Americans the impact is off the radar; their introspective predispositions leave them desperately ignorant of what the rest of the world actually thinks of their country, and of what their politicians are doing in their name.

Even in the latest international situation its political and bureaucratic administration acts in an arrogant and duplicitous way. Just look at recent comments by US Secretary of State John Kerry against Russian broadcaster RT. As for President Obama, he continues racking up George Bush-scale incompetency marks in his performances. The US lost credibility many years ago; now its longstanding, trusting partners are being contaminated by the hegemony themselves as the US engages in increasingly more foul means than fair.

Another perspective is to question whether this is the desperate act of an administration trying to distract from its own internal fiscal horror show or just incompetence?

IF the US does not start addressing its delusions it may be staring a more seismic challenge in the face. The Dollar’s position as a reserve currency has never been weaker. China’s and Russia’s tripling of gold acquisitions in the last year, and the threat of sanctions on Russia, could see these economic powerhouses questioning the Dollar’s historic position and looking at alternatives such as the EURO. Set against the failure of the US Federal Government to independently audit its gold reserves for over 75 years, this raises serious questions of confidence. The 1950s audit does not count in many people’s eyes as it did not permit external auditors and covered only 5% of the gold apparently held. Read more in The Biggest Government Lie in History?

The virtual world is evolving at an exponential rate. Information sharing and transparency abound, governments can hide little, and innovation is presenting new ways of addressing entrenched practices. The current poster boy in this regard is the cryptocurrency Bitcoin, which continues to pose challenging questions that go to the heart of our financial systems. Solutions such as Bitcoin represent a decentralisation of control that would start to tear down the political and fiscal establishment and regain a balance long lost.

The US and its camp followers need to heed the health warnings. Or I fear there will be a reckoning.

Cloud Vendors – Great Power, Much Responsibility

Headlines abound as the race to the bottom continues in the price wars that are benefiting Cloud Computing users but challenging the three principals in Cloud Computing – Microsoft and Amazon as clear leaders, with the rapidly maturing but still lagging Google – as they struggle with a new class of business model economics that can no longer rely solely on operational efficiency backed by cost competitiveness. As cost becomes a level playing field, interesting new shifts are in play that will move the emphasis into other areas, such as end user experience, brand relationship and trust, in the battle for market share.

Microsoft Azure marches relentlessly on with its delivery of a rich end user functionality experience, driving home its ease of use credentials, a clear advantage when coupled with its mature partner ecosystem and, by association, rich customer relationship market surface area. Scott Guthrie has announced the preview of the next generation Microsoft Azure Management Console with a whole raft of features, features that make it easier and faster to tap more of the great Cloud Computing economics of scale and utility that are spawning a new era of business agility. I do not intend to address the feature benefits coming to market, but rather the trending impact of such a user-friendly experience, which may prove to be a double-edged sword for Microsoft.

It is not the first time we have seen technology emerge from the control of the technically adept magicians into the consumer space of everyday end users. To draw a parallel, Object-Oriented (OO) programming back in the ’90s heralded, for many purists, the end of program development as a skilled profession. The point being made was that OO programming models would render developers little more than Lego builders, as OO programming modularised many common functions and methods into program objects and libraries that developers could stitch together instead of having to write from the ground up each time. The reality was that it did simplify development, but it shifted the focus: developers were able to spend more time building more complex and advanced software rather than hacking out repetitive tasks that could now be pulled off the shelf in the form of vendor-produced ‘libraries’ of programmable objects. Critically it has made programming more accessible to the hobbyist, lowering the bar of entry considerably, as rich development environments such as Microsoft Visual Studio and Adobe’s Dreamweaver provide the ultimate canvas on which to create software. Microsoft even offers a starter version of Visual Studio for FREE; head over and get Visual Studio Express.
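The ‘Lego building’ shift can be sketched in a few lines of Python. The class names below are entirely hypothetical, a toy illustration of composing pre-built library objects rather than any real vendor library:

```python
from dataclasses import dataclass, field

# Two hypothetical vendor-supplied "library objects" a developer
# can reuse instead of hand-coding sockets, routing and logging.
@dataclass
class HttpServer:
    port: int = 8080
    routes: dict = field(default_factory=dict)

    def route(self, path, handler):
        self.routes[path] = handler

@dataclass
class Logger:
    prefix: str = "[app]"

    def log(self, msg):
        print(f"{self.prefix} {msg}")

# The developer's job becomes stitching objects together - the
# "Lego builder" role the purists feared.
class App:
    def __init__(self):
        self.server = HttpServer(port=80)
        self.logger = Logger(prefix="[shop]")
        self.server.route("/", lambda request: "hello")

app = App()
app.logger.log("started")   # prints "[shop] started"
```

The purists were half right: less code is written from scratch, but the developer's time moves up the stack to application design rather than disappearing.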

Microsoft is driving a similar evolution with Cloud Computing. The menial scripted grunt work is being replaced with autonomics dressed up in rich, point-and-click, user-friendly interfaces and template-based frameworks. The benefit the vendor heralds is that systems administrators and developers get more time to focus on the higher return activities that can materially benefit and advance their organisations.

All good so far. But such convenience, as with OO programming, the automation and abstraction of complexity, does come with a health warning for vendors and users alike. With OO programming, accessibility was still constrained to the technically adept who could architect, develop and code; cloud computing, however, has lowered the bar to a point where any computer-literate individual can now start deploying significant computing capability with almost no programming or systems administration training. This includes:

  • Websites complete with content management, file sharing or blogging capabilities.
  • Computing infrastructure with enterprise grade application servers such as SharePoint, MS SQL Server and BizTalk.
  • Virtual Network & Virtual Private Networks, point and click secure connections between physical office networks and the cloud.

It’s becoming akin to a Server and Services Supermart!

This now presents significant challenges, as this speed of evolution exposes new vendor risks to be managed by the technical departments as well as the marketing machines, as marketing becomes the new driving force behind product release cycles and investment.

At a high level these can be viewed in terms of:


Efficiency

As we saw with OO programming, default ‘objects’ or libraries tend to be compromises: jack-of-all-trades solutions but masters of none. Whilst they may help accelerate development, they invariably contain more code and less optimised process than would be desirable if a programmer wrote a dedicated ‘object’ specific to the demands of the program being written. For many scenarios this has little perceptible impact; the march of time and the evolution of processing power, available memory and storage in computing systems have masked the overhead of this inefficiency. With Cloud Computing, though, this has ramifications for Cloud users who are just lifting workloads into virtual machines hosted in the Cloud, as the inefficiencies of such a model soon scale up and risk eating away at the very economies of scale that make Cloud such an attractive proposition. The technical competency needed to architect and then build or evolve systems to truly leverage the unique characteristics of Cloud Computing is underestimated by many.
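A back-of-envelope sketch shows how quickly a lifted-and-shifted workload can erode the economics. All rates and utilisation figures below are invented for illustration, not any vendor’s actual pricing:

```python
# Illustrative cost comparison: "lift and shift" vs re-architected.
# Rates ($/hour) and schedules are assumptions for the sketch.
HOURS_PER_MONTH = 730

def monthly_cost(rate_per_hour, hours_running):
    return rate_per_hour * hours_running

# Lifted workload: a large VM left running 24/7, mostly idle.
lifted = monthly_cost(0.40, HOURS_PER_MONTH)

# Re-architected: a smaller instance auto-scaled to business hours
# (12 hours a day, 22 working days a month).
rearchitected = monthly_cost(0.10, 12 * 22)

print(f"lifted:         ${lifted:7.2f}/month")   # $ 292.00
print(f"re-architected: ${rearchitected:7.2f}/month")   # $  26.40
print(f"overhead factor: {lifted / rearchitected:.1f}x")  # 11.1x
```

Even with invented numbers the shape of the problem is clear: an unoptimised migration can pay an order of magnitude more for the same work, which is exactly the disillusionment risk described above.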

This has its challenges for the cloud vendors: the fine performance margins demanded by the price race to the bottom may get a boost from the inefficient use of their resources by first generation cloud users, but the risk is end user disillusionment with the cost of running, combined with the exponential complexity and overhead of maintaining inefficiently migrated systems.


Security

The release of ever more end user and non-technically friendly Cloud toolsets to the market is a minefield of vendor responsibility that requires great diligence: a somewhat conflicting concept when considered alongside the rapid release cycles of Cloud technologies and the demands of marketing departments to parry competitor feature sets as they appear in flurries on the marketplace. With on premise solutions this was largely contained within private network boundaries and a slower software release cadence, but with the Cloud these solutions iterate at a fragmented level and are ALL connected in some form to a public network, subject to the gauntlet of hackers who scan almost all registered Internet addresses every second looking for weaknesses. This means that vendors need to raise the quality of their development, test and release practices, the security context they provide their offerings, and finally the mechanisms for keeping Cloud systems secure and up to date.

The downside for any vendor that falls foul is huge in terms of brand fall-out and loss of trust and credibility. As the rich user interfaces invite less skilled users, who are less aware of the collateral issues their actions may have, the fingers are bound to get pointed at vendors when security is compromised. I do mean ‘IS’ compromised, because it is only a matter of time. Furthermore, the maintenance and support services from these Cloud vendors introduce what I regard as the final mile in Cloud Service provision, and their absence presents additional ramifications for security. How do these end users know when to patch their solutions, or is this something the vendor does for them? If it is the vendor, then whose fault is it if the user’s solution breaks when the vendor applies a security patch automatically? The complexity of the maintenance and support variable in the mix abounds.

For end users it is the ability to replicate and deploy solutions faster than ever before that is both the attraction and the challenge, placing a greater burden on systems support and maintenance which, if not taken seriously, exposes organisations and individuals to compliance breaches and data compromises as solutions fail for lack of basic maintenance. Again, fingers will get pointed at the vendor, BUT the responsibility will be far from crystal clear as far as end users are concerned, as reflection on the Heartbleed flaw in the OpenSSL libraries, used extensively across the Internet on mobile devices, PCs and servers alike, attests. Many organisations lack the basic support and maintenance discipline to identify, let alone know how to deploy, fixes across their on premise IT estate; Cloud systems don’t fix that but raise the stakes. Where you could get away with lax, retrospective maintenance and support on premise, with Cloud it is a fool’s errand. The Heartbleed debacle is not a one-off but heralds a new generation of challenges to Cloud vendors and users alike. It will almost certainly occur again in some shape or form; whether on the same scale or not is still to be seen. Use of commercial software may seem an insurance against this versus Open Source, but much of today’s commercial software leverages Open Source components.


Reliability

Ultimately it is the end user experience that will drive this metric. At the pointed end of this is the unanswered question of just how reliable the Cloud is. We have seen all the main vendors experience significant outages, and when your IT is in the Cloud and it goes down, you are at the mercy of your Cloud vendor to rectify it.

Microsoft is perhaps the most at risk of the big Cloud vendors. Microsoft’s traditional on premise business has been fundamental to the rapid adoption of its Cloud services by those same customers, as a consistent trust relationship pulls through, complemented greatly by the rapid maturing of the end user experience of its Cloud offerings to match its on premise experiences, making cross-skilling a key selling point. Google and Amazon are well behind the curve on solution usability, their attempts to build enterprise relationships have been largely unsuccessful outside some discrete workloads, and their developer/partner ecosystems are negligible in comparison with Microsoft’s. Despite its rich partner ecosystem, Microsoft continues to struggle with the quality of support services across its Cloud offerings, and this is only just starting to raise the spectre of the support and maintenance challenge for end users as they commit to the cloud. Amazon and Google, in turn, are hardly in motion across support and maintenance.

Support and Maintenance

This is undoubtedly becoming the new surface area across which security and end user experience get governed. As yet there is only one vendor addressing this space across the whole stack: The Arcadia Support Group, with its strategic tie-up with Gartner’s No.1 infrastructure autonomics leader IPSoft, delivering a unique full support and maintenance service that encompasses the whole IT stack, extends deep into the custom application layer where no other vendors dare to tread, and spans both Cloud and on premise. This is the last mile in Cloud Computing, Support & Maintenance, where even Microsoft with its partner ecosystem is struggling to produce a consistent deliverable.

Intellectual Property Challenges for SME’s, A Big Data Opportunity?

This is a copy of a guest blog recently posted to the Ideas Matter website – Intellectual Property Challenges for SME’s, A Big Data Opportunity?

IP is a challenge for many SMEs, regardless of industry. Establishing protection for original works, whether patents for inventions or trade marks for logos and unique statements, is not particularly straightforward, and has been further complicated by the fragmentation of the data resources that enable comparisons against existing filings. This places a reliance on expensive patent lawyers, and not always with any guarantee of success.

As the number of existing patents grows, there is a need for more transparency and more resources to help businesses understand the patents that already exist and negotiate the best way forward for themselves. Consortiums such as Ideas Matter can help, but much more needs to be available.

One possible solution is Big Data, particularly the new algorithms of the kind used for grammar checking and correction. Their development has heralded a new age of being able to see and understand relationships in information and data that were previously beyond our grasp. Now, SMEs should have the ability to query disparate data resources via a common machine learning algorithm that can compare a prospective patent application against all those that currently exist within seconds, AND report conflicts or matches with an accuracy rivalling even the best patent lawyers.
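A minimal sketch of the matching idea, assuming a simple bag-of-words cosine similarity; a real system would use far richer models, and the patent texts and IDs below are invented:

```python
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical existing filings.
existing = {
    "P-001": "method for wireless charging of portable devices",
    "P-002": "compression of video streams over mobile networks",
}
application = "apparatus for wireless charging portable devices"

# Rank filings by similarity to the prospective application.
ranked = sorted(existing.items(),
                key=lambda kv: cosine(application, kv[1]),
                reverse=True)
best_id, best_text = ranked[0]
print(best_id, round(cosine(application, best_text), 2))  # P-001 0.77
```

A query of this shape, scaled across full patent databases and fed by machine-learned rather than hand-built similarity measures, is the cheap, near-real-time conflict check envisaged above.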

The challenge now becomes working with incumbent IT and business; a familiar meeting of old and new technologies. Evidence shows that it can be done – one example is online learning in the United States. The capabilities developed there by ‘massive open online courses’ (MOOCs) such as Udacity have resulted in automated assessment systems that could, with a small leap of the imagination, fulfil a similar function and allow SMEs to query patent databases cheaply and in almost real-time.

Ultimately, this is where the whole patent system needs to move. With increased globalisation and the online nature of business, we are seeing a levelling of the playing field between large enterprises and SMEs. SMEs enjoy the same toolsets, reach and market potential as their larger competitors, but unfortunately do not always have the same deep pockets to protect their innovation.

Big Data Oversight or Persecution by Algorithm?

We are at the event horizon of yet another seismic shift in technology’s progression and its impact on our everyday lives. A shift that on the surface is all but invisible and little understood by many, but is already resonating deep into the core fabric of our freedom and liberty as individuals and as a society at large.

In the over-simplistic throwaway tone adopted by Zuckerberg, aka Facebook, challenges to data privacy are brushed off with the statement that users with nothing to hide have nothing to fear, a myth long debunked! The sad reality of these words is their naivety: we live in the shadow of multiple examples, going back across many centuries, where at varying scales this attitude has undermined social freedoms.

The exponential rise of Cloud Computing, the utility model of computing resources, with its lowering of the cost bar to data storage and its ease of access to cheap computational systems, has opened up a veritable Pandora’s Box. This comes in many facets, for example:

Data Volume

Where less than a decade ago organisations and individuals would diligently prune great volumes of data to retain just the bare essentials, today data storage is so cheap that such prudence has been swept aside and ALL data and information is being stored, leading to huge data warehouses of information being retained indefinitely. Local jurisdictional laws are being sidelined as technological nuances outstrip regulators’ ability to adapt and protect. The retention and use of Personally Identifiable Information (PII) beyond its original use case is now the norm. The anonymity mechanisms bandied about by companies as protection are facile, as new database techniques allow data sets to be re-attached with over 90% accuracy, rendering ANY data stored or held at best ‘pseudo-anonymous’, that is, anonymous at the discretion of the Data Controller (the holder of said data). Then there is the invidious class of corporation that attempts, through its terms and conditions, to contractually acquire and retain FOREVER, and for its own use, ANY data supplied; Facebook, Google and Amazon are the principal protagonists.
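A toy sketch shows why stripped identifiers leave data only ‘pseudo-anonymous’: the quasi-identifiers left behind can be joined against a public register to re-attach names. All records and names below are invented:

```python
# "Anonymised" records: names removed, but quasi-identifiers kept.
anonymised_health = [
    {"postcode": "EH1 2AB", "birth_year": 1975, "sex": "F",
     "diagnosis": "asthma"},
]

# A separate, public dataset containing the same quasi-identifiers.
public_register = [
    {"name": "J. Smith", "postcode": "EH1 2AB",
     "birth_year": 1975, "sex": "F"},
    {"name": "A. Jones", "postcode": "G12 8QQ",
     "birth_year": 1980, "sex": "M"},
]

QUASI = ("postcode", "birth_year", "sex")

def reidentify(record, register):
    """Re-attach a name when the quasi-identifiers match uniquely."""
    key = tuple(record[k] for k in QUASI)
    matches = [p for p in register
               if tuple(p[k] for k in QUASI) == key]
    return matches[0]["name"] if len(matches) == 1 else None

for r in anonymised_health:
    print(reidentify(r, public_register), "->", r["diagnosis"])
    # prints "J. Smith -> asthma"
```

With a handful of attributes such as postcode, birth date and sex, most individuals match uniquely, which is why the re-attachment accuracy figures quoted above are plausible.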

The conclusion is that ANY data an individual, organisation or third party may supply is almost guaranteed to be retained somewhere AND to be amalgamated and used outside the original scope or purpose for which it was given up. Examples, by no means exhaustive in detail or exclusivity, include:

  • UK National Health data – On the pretext of better diagnosis, this has already been demonstrated to be a commercial venture, and the data is openly shared and poorly secured.
  • ANY online advertising entity that reads your web browser cookies – Aggregation of browsing behaviour occurs in real-time and is pervasive, as browsing habits are shared by backend marketing companies, retailers and search entities driving their relentless advert targeting at users.
  • Google search engine, email and Google Apps usage and content scanning – Under the pretext of service improvements no data is sacrosanct; emails and documents are scanned, and whilst these are professed to be anonymous machine activities, not human, they expose more than is commonly confessed to.
  • Facebook, where everything you submit, or anything another individual may post that relates to you, is forever at risk of exposure – Facebook has repeatedly adjusted its security policies, rendering content previously ‘Private’ public. Facebook also reserves the right to use your images for its own advertising purposes.
  • Vodafone mobile tracking data, which is sold to advertising agencies – If you own a mobile phone you are one of the millions who have volunteered to participate in the biggest monitoring exercise the world has ever seen. Every move you make, every step you take, every connection is being monitored, recorded and made available.
  • The Experian credit rating agency selling its database to marketing companies – Dictators of who can and who cannot, the credit rating agencies rule our lives in hidden ways that risk severe fallout on individuals’ liberty as data errors multiply and impact credit scores, errors which are very hard to get corrected as these behemoths profess themselves beyond question in judging our credibility.

In the UK the politicians are now getting in on the act, as government bureaucrats have floated the idea of selling individuals’ tax information, albeit anonymised! Yes, you read that correctly: if you are a UK citizen, YOUR tax returns could be readily available in the public domain. As I have stated above and in earlier blogs, it is not hard for this data to be re-attached to your identity.

Quality & Accuracy

Data and statistical analysis research, for example, was historically confined by the real world practical economics of compiling data sets. Data collection and storage were costly, which led to the practice of statistical analysis of small data sets as representative of a larger body of data. Quality was preserved by the diligence applied in striving for accuracy and credibility of the data, as well as a representative spread across whatever criteria were appropriate for the scope of enquiry. Achieving the same levels of accuracy for statistical purposes across today’s petabytes of data is an almost impossible exercise, and therefore data is becoming ‘dirty’ and subject to inaccuracy.
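The classical approach can be sketched in a few lines: a small, carefully drawn random sample estimates a population statistic remarkably well, which is why diligent small-sample analysis served statisticians for so long. The synthetic population below stands in for the larger body of data:

```python
import random

random.seed(42)

# A synthetic "population": one million observations, mean ~50.
population = [random.gauss(50, 10) for _ in range(1_000_000)]

# A carefully drawn random sample of just 0.1% of the data.
sample = random.sample(population, 1000)

pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)

# The sample mean lands within a fraction of a unit of the truth.
print(round(pop_mean, 2), round(sample_mean, 2))
```

The catch, as the section argues, is that this guarantee rests on the sample being representative and the underlying records clean, disciplines that petabyte-scale amalgamation quietly abandons.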

Today datasets are stupendously huge and so conveniently amalgamated that a new approach is demanded, which has coined the term ‘Big Data’, where data quality has gone out of the window in favour of quantity. The principle being adopted is that completeness of data across all records relevant to a subject is no longer necessary because of the sheer volume of records that can now be referenced. Analysis of these huge volumes is possible thanks to the cheap and conveniently available storage and computing power supplied as a result of Cloud Computing and the development of dedicated ‘Big Data’ computational systems. Data supplied for one purpose now ends up being influenced by records from disparate sources, with questionable outcomes.

Facebook is the outstanding villain in this regard, continuing to flout any regard for Personally Identifiable Information as it harvests user data and markets it to advertising companies, as well as reserving the right to use this data WITHOUT users’ direct consent.

Then there is the Google Flu predictor, the fallen poster boy. Google, in its adolescent rush for recognition in disciplines outside its search capability, professed to be able to predict the annual flu outbreaks in the US, and fell afoul of its own hype. On the face of it, being able to predict the spread of flu was a fantastic proposition of huge value and benefit to society and to the health organisations that annually struggle to respond to flu outbreaks. Google’s programmers asked questions of its huge data resources – compiled from years of monitoring users online through its own search engine as well as any website that subscribed to its ‘free’ analytics service, reading all emails that touch the Gmail service, monitoring Google Apps usage and scanning associated documents, and tracking and recording mobile activity on the Android platform – and professed to be able to draw what were assumed to be consistent insights from that data that paralleled the annual flu outbreaks. Heralding vindication for their voracious data appetite, they claimed a breakthrough, only to have their self-adorned laurels cast asunder as subsequent years’ flu outbreaks failed to reflect the Google predictions. The Google programmers, with their marketing-hype-inflated egos, were found to be human after all. They did not foresee the unpredictable impact of unrelated trending data within large data sets materially distorting their analysis.

Large data sets are like oceans: they have hidden depths. To extend the ocean analogy, there are big currents, a la the Gulf Stream, and there are localised currents and tides, which in turn are influenced in unpredictable ways by wind, temperature and of course man. Google, in its human, fallible haste, was in essence looking at something akin to a local tidal pattern when it thought it was tapped into the certainty of a data ‘Gulf Stream’. The Google Flu predictor is little more than an exercise in why data quality is still relevant, and why ‘Big Data’ is still in its infancy and requires careful governance.

Transparency & Accountability

Data analysis is no longer dependent on man-made, diligently audited and qualified algorithms, but on algorithms that evolve dynamically as they become abstracted through machine learning and AI (Artificial Intelligence) programming techniques. Today, algorithms running against large data sets are no longer fully understood even by their developers and supervisors.

The aforementioned Google Flu predictor is an early example of this: advanced machine learning and AI algorithm programming techniques were deployed, and they evolved beyond the controlled understanding of their creators. Like a boy racer irresponsibly let loose in a Formula 1 car, accelerating off purposefully only to wake up in A&E (Accident and Emergency), thinking he was the driver only to find he was little more than a passenger. That is assuming he could even control the fickle balance of an F1 clutch and accelerator to get off the mark; OK then, he had automated launch control … enough of the F1 digression. The point is that even the big boys, Google, Amazon, Facebook et al, are still driving with L plates (learner plates) when it comes to Big Data, so be warned: those corporates thinking they have it sussed have some rude awakenings ahead.

Now let’s combine this algorithmic alchemy with the ballooning volumes of data available to organisations, extrapolate this into the nth dimension with mergers and acquisitions and operational memoranda of understanding that allow organisations to share and combine data, and the picture takes on an all too Orwellian perspective. A prospect too tempting to ignore for governments, amongst others; the NSA (US National Security Agency) springs to mind for some reason!

Don’t get me wrong, Predictive Analysis has been around for as long as data has been compiled and analysed, with great corporate and social success and benefit. Logistics is a frequently quoted market sector that uses it with great accuracy to efficiently route deliveries, saving fuel and increasing efficiency. The key point here is that they are working within a controlled data scope, albeit with huge data volumes.

The water starts to muddy as we move into the realms of Correlation Analysis. In summary, Correlation Analysis finds unplanned relationships in data. That by itself is nothing earth-shattering, but when those relationships are multiplied up across huge volumes of mixed data, they start to reveal occurrences of factors or attributes that do not directly relate to the original query, and this starts to get into the realms of probability theory: if A, B and C occur together, and there then appears an associated relationship with, say, P, then X, Y and Z are likely. These associated factors or attributes become a ‘Proxy’ which, when a particular variable appears, would dictate a high probability of a certain outcome. This ‘Proxy’, or associated relationship, takes on an altogether different class of data insight extrapolation, and has all kinds of implications.
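The A, B, C reasoning above can be sketched as a conditional probability over co-occurring attributes. The records below are invented purely for illustration:

```python
# Toy records: each has a set of attributes and an observed outcome.
records = [
    {"attrs": {"A", "B", "C"}, "outcome": True},
    {"attrs": {"A", "B", "C"}, "outcome": True},
    {"attrs": {"A", "B"},      "outcome": False},
    {"attrs": {"B", "C"},      "outcome": False},
    {"attrs": {"A", "B", "C"}, "outcome": False},
]

def p_outcome_given(proxy, records):
    """Estimate P(outcome | all proxy attributes present)."""
    hits = [r for r in records if proxy <= r["attrs"]]
    if not hits:
        return 0.0
    return sum(r["outcome"] for r in hits) / len(hits)

# When A, B and C occur together, how likely is the outcome?
# Here 2 of the 3 matching records show it.
print(p_outcome_given({"A", "B", "C"}, records))
```

Scaled to billions of records and machine-evolved attribute sets, this is how a ‘Proxy’ emerges: the combination {A, B, C} comes to stand in for the outcome itself, whether or not any causal link exists.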

Applied within diligently defined data scopes, these Proxies can be hugely insightful. Airlines, for example, monitor vibrational and operational outputs from indirectly and imperceptibly associated parts of an airliner to predict part failures and optimise preventative maintenance procedures, saving millions and raising safety standards in the process. This is achievable because the data sets are controlled in scope. The Correlation Analysis allows Proxies to become consistent enough to flag up the probability of an occurrence, such as a part failure, that would have been impossible to extrapolate with more rigid, traditional algorithm programming techniques. Machine learning and AI techniques allow the man-made algorithms to ‘evolve’ and produce correlations that extend in scope and complexity beyond the capability of the original programmer or programming team, the algorithms themselves no longer recognisable as they become exponentially complex and interwoven.

As these algorithms grow exponentially in scope and complexity, their output becomes almost impossible to validate. This drives an interpretative behavioural change from questioning WHY an outcome has occurred to simply accepting WHAT is produced. In the context of the airline, the data is confined to the aircraft, albeit down to the most innocuous vibration. But in the context of our earlier Google Flu predictor example, the data knows no bounds. The consequences are therefore unpredictable and subject to unknown ‘currents’ of influence, which means blindly accepting WHAT is produced while being unable to answer the WHY, a worrying regression as data influences more and more of our lives.

For example: if a middle-class individual, employed locally, of A religion, who shops at B supermarket and frequents C websites, suddenly starts travelling internationally to X, Y or Z locations, then there is a high probability that he is a terrorist. An extreme example, where the travel ‘Proxy’ matched with the A, B and C factors = a high probability that the individual could be a terrorist.
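
To make the mechanics concrete, here is a hypothetical sketch (all counts, labels and outcomes invented) of how such an engine would estimate the probability of an outcome given that proxy attributes A, B and C co-occur:

```python
# Hypothetical sketch of turning co-occurring attributes into a
# probability 'proxy'. Every record and outcome here is invented.

# Each record: (set of observed attributes, outcome-of-interest flag)
records = [
    (frozenset("ABC"), True),
    (frozenset("ABC"), True),
    (frozenset("AB"), False),
    (frozenset("ABC"), False),
    (frozenset("AC"), False),
    (frozenset("BC"), False),
]

def proxy_probability(records, attrs):
    """P(outcome | all attrs present), estimated from co-occurrence counts."""
    matching = [flag for seen, flag in records if attrs <= seen]
    return sum(matching) / len(matching) if matching else 0.0

p = proxy_probability(records, frozenset("ABC"))
print(p)  # 2 of the 3 records containing A, B and C have the outcome
```

Note that the engine outputs a probability, not evidence; the individual has done nothing except match a pattern.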

This is called the ‘Minority Report’ syndrome, where individuals are pre-judged on probability outputs from Big Data Correlation Analytics, and not on their actual actions. Such scenarios warn of a future where individuals are judged and found guilty NOT on their actual intent and actions but on probability. A frightening prospect, and real risk to freedom and liberty.

This is not far removed from what is already going on in reality. The Memphis, Tennessee police use an algorithm called CRUSH (Criminal Reduction Utilising Statistical History) to extrapolate from criminal statistical history data the ‘probability’ of anti-social flare-ups in certain parts of the city.

Then there is the pernicious Google PageRank, the closely guarded secret sauce of the Google search engine that impacts commercial destinies every time Google chooses to tweak it.
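
Ironically, the published core of the PageRank idea is public and simple; the pernicious part is the unending secret tweaking layered on top. A minimal power-iteration sketch over an invented four-page link graph (emphatically not Google's production algorithm):

```python
# Minimal power-iteration sketch of the published PageRank formula.
# The link graph is invented; Google's production tweaks are secret.

def pagerank(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}. Returns page -> rank score."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs) if outs else rank[p] / n
            targets = outs if outs else pages  # dangling pages spread evenly
            for t in targets:
                new[t] += damping * share
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "C": it attracts the most inbound weight
```

Twenty lines capture the textbook version; the commercial destinies are decided by everything Google layers on top of it.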

If Big Data is to grow up it will need to subject itself to checks and balances like any other facet of our lives. Organisations need to be accountable for their decisions, and Correlation Analysis of the type articulated above will require its algorithms to be transparent and the organisations behind them held accountable.

A good start here would be the Google PageRank algorithm. This, I believe, has now reached a point in its maturity curve, combined with the anti-trust practices of its owner (Google), where it requires independent auditing. In an ideal world I would hope to see Google adopt an Open Source approach and allow the IT community to vet the algorithm; after all, RSA, amongst many other organisations, have done so with their encryption technology without too much loss of market share. In fact their openness has enhanced their credibility. I suspect not in this case. After all, Google is little more than a search engine company, and their only hold on value is the control they wield as the signpost and advertising hub of the Internet.

This is as you will be able to deduce not going to be straightforward, but then I suggest neither were many of the compliance and independent auditing practices we now regard as the norm when they were first postulated.

Windows 8.1 Update 1 – Are you serious!

Having now had Windows 8.1 Update 1 installed since it was inadvertently leaked last week, I can only hope that the version leaked is a Beta, a work in progress, in fact a VERY early work in progress, if not Alpha.

Headlining the attention Microsoft was giving to this Update was what was meant to be an acknowledgment of Desktop + Keyboard + Mouse users over the touch-centric new User Interface (UI).

For those of you who live on a tablet, where the Touch UI is wholly suitable, I completely agree, but you will probably not understand what a misfit the touch UI is on the large 21+ inch and multi-screen non-touch rigs many users have, especially in corporate environments. These are not going away in a hurry, and for many touch is simply not a practical option, unless you want to live with your nose pressed up to your screen, with all the short focal length damage that will do over time, not to mention the mess fingers make on the reflective surfaces of large screens. By way of litmus testing, whenever I go into a Microsoft office, which is at least once a week somewhere in the world, I ask how the receptionists are getting on with their touch all-in-one desktop PCs. 95% of them seem to opt for mouse and keyboard; touch is seen as a second-class citizen. This reinforces the general consensus we see from Corporates and Enterprises, who continue to favour the desktop-receptive Windows 7, seeing Windows 8 as an ill-fitting desktop choice. If you require further proof, just reflect on HP’s decision to reverse back into preinstalling Windows 7 across certain product lines, albeit using Windows 8 downgrade licensing.

So for you touchy people out there with your tablets, just pause before reeling off the ‘resistance to change’ mantra like a broken record. Desktop users have a real need, and Microsoft is being unusually reluctant in their support. So hopes were high that with Update 1 certain features would come to the fore, such as:

  1. Prioritised boot to desktop where users are either not running touch-enabled devices or simply don’t want to use the Touch UI.
  2. Setting desktop apps as the default, NOT the screen-hogging dual-tasking Windows 8 affairs.
  3. A proper Start Menu, NOT the sad nod to Desktop users of a Start Button that bounces them into the Touch UI.
  4. Windowing of Windows 8 Store apps, so these can co-exist alongside traditional desktop apps in an efficient, multi-tasking use of screen real-estate, instead of the totally inefficient split-screen mode they currently adopt.

Numbers 1 and 2 seem to have been addressed, albeit with a complete lack of clear admin settings where users can manually tweak or adapt to suit.

Point 3 is simply ridiculous when you look at the 3rd-party aftermarket demand for this feature. Microsoft is showing a complete lack of acknowledgment of end-user requests, and this continues to be a BIG issue for users, who are self-remediating, so I do not understand what Microsoft’s issue is with not delivering this as an option.

Point 4 is a complete mess; whoever signed this off as an acceptable solution must have been in engineering, not user interface design, which is why I sincerely hope the version of Update 1 I have is not RTM (Release to Manufacturing) code. Why?

  • Putting a top window frame on a Windows 8 Store app with options to shrink to the taskbar or close is clunky at best and does not constitute Windowing; it’s an embarrassing kludge. It offers the capability to shrink to the taskbar or close the app, but no scalable Windowing capability, and I struggle to see the benefit. The implementation is clunky, looks appalling, like a retrofit, and does little to add usability apart from making a problem even more glaringly obvious and frustrating. Users with large desktop screens and/or high resolutions require the ability to float app windows, so they are not forced to stretch an app across more screen than it has functionality to fill. This implementation does nothing to support desktop-orientated users: Windows 8 apps still go full screen, assaulting users’ retinas with an expanse of colour that makes a mockery of the real desktop multi-tasking capabilities of the Windows 8 OS.
  • As with the token window top bar, there is now a pop-up Desktop taskbar along the bottom screen edge of a Windows 8 Store app. This again is little more than a throwaway token gesture to desktop users, which when clicked on bounces you into the Desktop.
  • The ability for Windows 8 apps to be pinned to the Desktop and/or taskbar in Desktop mode looks promising, but looks is all you get. Once clicked they open to consume the whole screen, a completely alien desktop user behavioural experience, not the floating window expected of a Desktop environment. This delivers a shortcut-launch benefit only, and probably creates more disorientation by adding to the application User Interface behavioural fragmentation.

Of serious concern is a memory issue that may be ‘by design’, but I would suggest could lead to problems. Windows 8 Store apps seem to sit hogging memory even when apparently closed with the X in the top right of the new token window frame. You would expect the applications to close fully. But no, it appears that they drop back to 0% CPU usage BUT hang onto memory. See the image below: the Health app and Settings app were both apparently closed by clicking the X in the app frame, just as you would traditionally with a desktop app, BUT you can clearly see them sitting hogging memory in the screen grab from the Task Manager utility:

So much for this Update 1 being designed to support lower-powered tablets. This memory issue alone will turn it into an OS that will simply grind to a halt and require frequent re-boots, as users lack the technical knowledge to know any better.

Aside from the desktop user hopes that have been dashed, there are some nice tweaks under the bonnet, with improved OneDrive management and some additional Admin settings. But these are overshadowed by the compounding mess being made of the UI and further behavioural inconsistencies.

In summary, the desktop-supporting changes are just adding to the OS’s identity crisis, and I struggle to see how this will address corporate/enterprise adoption of Windows 8 as a desktop platform. Why Microsoft chose not to implement these desktop changes consistently with the Desktop application behaviours users are familiar with, we can only guess. Instead the half-way house (not even that, the 10%-way house) clunky execution simply exacerbates and highlights the poor integration of the touch UI and desktop, making Windows 8 Apps look and feel even more alien and poorly suited to large, multi-screen and/or high-resolution systems.

It is unbelievable how an experienced OS vendor, in the face of so much constructive feedback pointing consistently to clear remediation requirements, can ignore its consumers’ volunteered goodwill and compound matters, undermining what is, under the bonnet, some great tech.

My only conclusion is that we will look back on Windows 8 as a brave example of open/public software testing for what we can only hope will be a kick-ass Windows 9. For now, Windows 8.1 Update 1 may just have taken Windows 8 one step closer to being a truly Vista-grade legacy.

Social Media, the Identity Thief’s best Friend

Are you a squeaky hinge?

Do you seep PII (Personally Identifiable Information)?

Are you the weakest link?

When I do reviews of individuals’ online visibility and exposure to Identity Theft, or their raised profile for physical burglary, it is not always the individuals themselves who are the cause of their downfall but third parties. A compromised eCommerce site is the most commonly volunteered cause of such a loss of PII, and whilst poor eCommerce site security is often the culprit when it comes to Credit or Bank card exposure, it is almost always much closer to home when it comes to everything else.

With social media platforms abounding, whether you are active or passive in your participation, simply having a profile puts you at the centre of your own spinning web of growing visibility. By association, everyone who engages with you online is blowing your cover, like a dripping tap adding to the visibility and insights that can be drawn about you. Home and work addresses, complete with telephone numbers, come far too easily. When subject to a little additional traditional PI (Private Investigator) endeavour, the gaps are quickly filled with DOB and a rich blend of immediate family members, rendering most online media socialites an open book, ripe for a rip-off. It is therefore not a matter of if but when and how your identity will be harvested. The uses abound, from the social media platforms data-mining to focus advertising at you, to a level of exposure that makes a mockery of the usual bank checks, such as:

  • Date of Birth
  • Mother’s maiden name
  • Post Code
  • Telephone number

This use of information for validation is a vaporous fig leaf in terms of protection, but fertile material for a complete stranger to start applying for Credit in your name and/or trying to access your Bank through some form of social engineering. After all, when was the last time you had a regular contact in your bank who would recognise your voice, or face even?

The issue here is invariably not you. It is your immediate peer group of ‘friends’ and acquaintances. Individuals who disclose in complete innocence (you hope) snippets of your life and identity which in isolation come over as completely innocuous, but when collated systematically combine to form a veritable treasure trove of information.

Take the scenario of the jet-setting playboy with the multimillion pound Mews house in New York, complete with Ferrari in the garage. Due to his infrequent visits his property agent was instructed to check the house over once a month and to turn the engine over on the shiny yellow beast in the basement. All very run of the mill, and something that occurs the world over. But in an idle moment sitting behind the wheel of said yellow Ferrari with smartphone in hand, a snapshot of the famous logo and dashboard seemed completely innocent at the time, apparently completely anonymous, so what was the harm in posting it as yet another piece of somebody’s Facebook trivia and ego boost as it popped up visible to all those ‘friends’?

Two months later, following a police investigation of a housebreaking and the recovery of one stolen yellow Ferrari (thanks to a chassis-embedded tracking device), it transpired that a gang of high-end car thieves were using Facebook, amongst other social media sites, to compile a shopping list of high-end cars for their lucrative steal-to-order business.

In this case they identified the car through the agent’s Facebook page and had little challenge finding the agent’s place of work, to then follow said agent to locate the car. The rest, as they say, is in the Police records.

This was not some high-tech, top-of-the-market exclusive group of intellectual thieves, but some very basic everyday people using the Internet to harvest information that can target individuals in their own homes almost in real time.

Techniques used by burglars include:

  • Google Street View – enables thieves to case properties in advance without any exposure.
  • Techniques common to stalkers, to ID vulnerable properties or assets.
  • Searches on Facebook and Twitter can be tailored to ID target user groups when they leave local jurisdictions and are therefore likely to be out of the country.
  • Foursquare broadcasting when individuals are at transit locations such as airports and stations. Combine these to build individual movement profiles.
  • Facebook’s ‘Open Graph’ search is a superhighway to targeting user profiles with publicly exposed data.
  • Location tagging – Mobile phone apps increasingly defaulting to broadcast location.
  • Document and image metadata, such as ‘EXIF’ data embedded into image files.
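
The aggregation point is worth making concrete. A hypothetical sketch (all posts invented) of how trivially timestamped snippets from different platforms merge into a single movement timeline:

```python
# Illustrative sketch with invented data: individually harmless posts
# from different platforms, merged into one ordered movement timeline.

posts = [
    ("2014-03-05 09:00", "Facebook", "Photo: Mews house, car in garage"),
    ("2014-03-01 08:10", "Foursquare", "Check-in: Heathrow Terminal 5"),
    ("2014-03-12 07:30", "Foursquare", "Check-in: JFK Airport"),
    ("2014-03-01 21:40", "Twitter", "Landed in New York!"),
]

def movement_profile(posts):
    """Sort timestamped posts from all sources into one timeline."""
    return [f"{when}  {what}  (via {source})"
            for when, source, what in sorted(posts)]

for entry in movement_profile(posts):
    print(entry)
# The first and last entries alone suggest a home left empty for days.
```

A few lines of code and no special skills: exactly the sort of collation the gang above performed by hand.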

What you can do to stop this is not as easy as you think. Trying to rein in this data once it is out is almost impossible. Facebook is merciless in holding onto your data and exposing it at their leisure when they elect to change their privacy rules, as they have done many times.

Peer pressure makes it hard to get friends to respect your privacy and not post photos and other revealing data about you. Anonymity is not as simple as people think. A photo today often gives away more than it reveals on the surface; dig into an image file and you will find all manner of digital delights, such as GPS location etc.

It’s better late than never, so maybe try some of the following:

  • Stop using Facebook, this is probably a tall order but it represents the world’s largest surveillance platform. At least lock down your privacy settings and be VERY selective over whom you ‘friend’. Maybe it’s time to cull some of those less familiar connections and start going for trusted quality rather than ego pumping quantity. Also stop sharing with friends of ‘friends’.
  • Stop using Google GMail, or its associated free document collaboration services. Google scans all your communications, and by using them you also impose exposure on those you choose to communicate with, as you are inviting them into this surveillance trap.
  • Do not share movement information such as vacations, or if you must do so retrospectively NOT real time.
  • Proactively search for and disable location sharing functionality on your mobile apps and social media platforms.
  • If a mobile app or online social media platform does not allow you to disable location sharing, delete it.
  • Use a VPN proxy to obfuscate your location. There are plenty of free ones out there; see a good article on the subject, ‘Why you should start using a VPN’.
  • Improve your home security. Use technology in your favour, such as WiFi cameras that can monitor your home and capture video. If you do get burgled then at least you have evidence. But PLEASE make sure you are relaying the data to a secure file share, NOT saving it to your home server, which is likely to get stolen!!
  • Remember the biggest deterrent to Burglars is VISIBLE alarm systems. These will deflect 90% according to industry insights.
  • Create clear mental demarcation lines so you become disciplined about what you do online, and apply the same discipline to how you relate to other people’s data.
  • Stop posting information just for the sake of it. It all adds up, whether it is about you or someone else.
  • Anonymity DOES NOT EXIST in a Big Data world. Nothing you put up will be anonymous for long.

The biggest thing you can do is start leading by example: start turning off the firehose of your own PII flooding into the public domain.

Cloud Success means ‘Hearts and Minds’ Microsoft

The success potential in the cloud for Microsoft was laid down many years ago, well before cloud was even on the horizon: a unique differentiator that no Amazon or Google can challenge… yet.

What I am talking about are customer ‘Relationships’ and the trust that those relationships have fostered, especially in the Corporate and Enterprise markets.

The reality is that Microsoft is the only provider who can deliver to both on-premise and the Cloud, and for the foreseeable future Hybrid is the overriding model organisations are using. Whilst a feature shoot-out would see Microsoft struggling in some discrete areas versus the likes of Amazon, the reality is that what Microsoft Cloud offers addresses the majority of organisations’ needs today without them having to stray into untrusted waters, and the offerings from Microsoft are just going to get better and narrow the gap in time, to be sure.

As for the cost argument, that is a no brainer, Microsoft has made the commitment on price matching the competition. You can bank that one.

So back to Hearts and Minds. The key to this is the Partner Ecosystem and how Microsoft engages it and brings its awesome capability to focus on the task in hand. The Partners represent the contact area through which the majority of Microsoft business is generated. Through its Partners, Microsoft has the Corporate and Enterprise customer credibility that key competitors like Amazon and Google are struggling to gain. The competition are realising how slowly trust and confidence get built; Microsoft is decades ahead of them, but Cloud is threatening to be a bit of a leveller if they don’t watch out.

Success in the cloud, as Kevin Turner made very clear, is non-negotiable for Microsoft. His famous slide of a tunnel with a bright light at the end sums this up. Whether we are in for a train wreck or a sunny day is still too early to judge, but one factor that would make the outcome more certain is decisive action to reverse the disillusionment Partners are feeling with their Microsoft relationship as they adapt to new cloud business models and the Microsoft dimension that now exists in service delivery. The relationship with the channel is an up-front and central priority if the current wave of ambivalence is to be stemmed before it develops a momentum that will be hard to reverse.

The elephant in the room is that Microsoft is perceived as moving into traditional Partner territory with its own services, and to be blunt, it is more than a perception, it’s a reality. Let’s not argue that point. Another point that always frustrates me, and is more visible than ever with Cloud Computing rampaging across the marketplace, is the illusion of role security in IT (Information Technology). Be that an IT business offering or an individual IT pro in a customer’s IT department, the hard reality is that anyone in IT who thinks their current technology skill set will last them a career is delusional. That goes for Microsoft Partners with their service offerings and products, AND Microsoft to boot, with their attitude to how they define their Partner engagement.

In the medium to long term Cloud Computing will change the face of IT as a workplace and marketplace. People who don’t like change will be some of the laggards holding back their businesses from capitalising on this new computing paradigm; they should NOT be working in IT. They might as well stand on the beach and try to hold the tide back.

Just take a look at what is already happening, visible for those with their eyes open. Start-ups don’t build out datacentres, they launch in the cloud. Why? They are not encumbered and can do the most cost-effective and logical thing with the choices available.

Datacentre capacity and scale today is accessible to anyone and can be deployed and run by 2 men and a credit card in less than a week. OK, the dog’s still there, just under the desk; after all, the 2 men can do this now from home. They don’t need to build or maintain datacentres anymore, nor the infrastructure or maintenance for that matter ;-)

So why don’t established businesses? Because Cloud is encumbered by the conditioning of entrenched IT and a lack of Trust. It is the Trust factor, which is where the relationships are critical, and Microsoft has the key to the kingdom.

People buy from people = RELATIONSHIPS = people partnering with people.

Key to any sustained relationship is a positive experience. Yup, that can have some esoteric dimensions to it, but we are talking IT here, OK, so back on theme… The magic word in services is EXPERIENCE, and specifically quality of experience. It’s not product feature sets, which is the hard cultural shift Microsoft is having to make; yes, you have to have the goods, but that is not the end game anymore.

Microsoft Partners deliver that personable face of the Microsoft ecosystem and the valued customer experience. The partnering structure Microsoft imposes on its Partners, though, has fallen behind by comparison, as I wrote in my last blog, ‘Microsoft Partner Network (MPN) in a Modern World’. The main point being that a significantly revamped Partner Program is required to reflect the commercial prerogatives that drive the new Cloud world.

For Microsoft the Holy Grail is to re-engage its Partners in a new way. The symbiotic relationship has never been more important than it is with the shift to a Cloud world. Microsoft needs those relationships to be transitioned to the cloud, and they are not in a position to do that themselves.

In the old world, Microsoft, you were the factory that sat on your hill in the American North West, with a marketing engine that would fire off great salvoes of promotional air cover under which your Partners would get up close and personal with the customers, refining the messaging for local consumption and ringing the till. Packaged product would flow from the factory to the Partners, and Partners would do and own the magic. The trust and relationships with customers were forged, fostered and cherished for mutual benefit by your Partners, Partners who rise and fall on the quality of the customer experience they offer and the profitability they can nurture from that engagement. The importance of the profitability message cannot be overstated; it’s not that it never existed between Microsoft and its Partners, it is just that with Cloud Computing the model is changing, and Microsoft is now directly impacting it in real time.

Microsoft is now in the field with Partners, expecting adoption as a direct extension of the Partners’ service delivery and, by extension, their IT teams, as Partners are now beholden to Microsoft as an extension of their support service chain. The established Partnership Trust Microsoft enjoys is getting them in the door, and Partners are putting their goodwill and name on the line in blind trust. But Partners are starting to find their newly adopted IT extension(s) are not fully aligned to what impacts their business. Their new IT team dependency (Microsoft) is not delivering as expected, or rather as Partners would expect from their own IT teams. Consequently Partners feel they are no longer 100% in charge of their own end-to-end solution delivery, and there is the pinch: where control or coordination is missing, costs can quickly spiral and profitability evaporate. Quick on the heels of which go quality of service and customer experience, that hallowed trusted customer relationship, a veritable meltdown like a China Syndrome. In the wings awaits a Book seller and a Search engine vendor hungry for custom.

When I am speaking to Partners I call this ‘A New Age of Trust in the Cloud’. For example, ISVs (Independent Software Vendors) experience this when their maintenance and support contracts don’t get renewed. Why? Because a born-in-the-cloud start-up has just mined their customer base with a cloud offering that blows away the incumbent legacy ISV offering. By the time the ISV realises what is going on it is too late. Service Partners see the same thing happening when the phone stops ringing from established customers with whom they have neglected to develop their Cloud credentials.

In Microsoft’s case its Partner Program, MPN (Microsoft Partner Network), is hitting up against this like a ‘Trust Event Horizon’, testing that Partner faith just enough to be dangerous. A question mark over a relationship can be a gnawingly dangerous thing. Trust is slowly won but can also be frighteningly ephemeral.

Microsoft, you have it in your power to rewrite the rulebook and engage Partners at new levels of marketing and operational intimacy, and in so doing do credit to the trust and confidence of your Partners. Let’s see some of that Blue Ocean thinking; as Cloud is re-writing the IT rulebook, it is time the masters of the Partner ecosystem did the same with Partner Channels.

Microsoft and Partner success will be found in the teamwork required to deliver end-to-end value plus quality, experience-rich, scenario-focused propositions into the market. That is not something Partners can do independently or in a semi-detached way. Partners have the experience at the coal face and know how best to engage the customers, and Microsoft controls the engines.

Some pointers as to where to start:

  • Service integration – Microsoft, you are now an extension of your Partners’ IT departments; they need that same visibility and accountability from you. Extend your service help desk to your Partners and allow your Partners to become an extension of your own support teams, with access to escalate to product groups. This has to happen to deliver to the market expectation of a single point of support.
  • Sales integration – Just as with services, there are now questions that only Microsoft can answer, so when we are in the field and preparing for a client engagement we need the connection and joined-up account planning. Yes, we like the name change from PAMs (Partner Account Managers) to PSEs (Partner Sales Executives); now let’s see the action: sharing visibility of managed (and breadth) accounts and working with the internal account teams, akin to an extension of the Partner’s sales team. Customer relationship management success will be maximised with a consistent single touch point, the Partner.

This is something Microsoft has been doing internally already; just look at how GFS (Global Foundation Services) integrates and works with the rest of Microsoft to deliver the great suite of Microsoft Online Services. My suggestion is to follow the same principles, extending this type of service engagement model into the Partner base.

Sounds great, doesn’t it, Partners? But the give-back is that this will only work with fully committed and engaged Partners, the ones Microsoft can trust to engage at a level of professionalism, operating within NDA (Non-Disclosure Agreement), to share this level of operational integration. It will be no good anymore just signing up as a Partner and expecting an open door; it will require a new level of proactive engagement with Microsoft.

Challenging when working on Internet Time which doesn’t wait for anyone.
