Cloud Vendors – Great Power, Much Responsibility


Headlines abound as the race to the bottom continues in the price wars that are benefiting Cloud Computing users but challenging the three principals in Cloud Computing – Microsoft and Amazon as clear leaders, with the rapidly maturing but still lagging Google – as they struggle with a new class of business model economics that can no longer rely solely on operational efficiency backed by cost competitiveness. As cost becomes a level playing field there are interesting new shifts in play that will move the emphasis into other areas such as end user experience, brand relationship and trust in the battle for market share.

Microsoft Azure marches relentlessly on with its delivery of a rich end user functionality experience, driving home its ease of use credentials – a clear advantage when coupled with its mature partner ecosystem and, by association, rich customer relationship market surface area. Scott Guthrie has announced the preview of the next generation Microsoft Azure Management Console with a whole raft of features that make it easier and faster to tap more of the great Cloud Computing economics of scale and utility that are spawning a new era of business agility. I do not intend to address the feature benefits coming to market, but rather the trending impact of such a user friendly experience, which may prove to be a double-edged sword for Microsoft.

It is not the first time we have seen technology emerge from the control of just the technically adept magicians into the consumer space of everyday end users. To draw a parallel, Object Oriented (OO) programming back in the ’90s heralded for many purists the end of program development as a skilled profession. The point being made was that OO programming models would render developers little more than Lego builders, as OO programming modularised many common functions and methods in the form of program objects and libraries that developers could stitch together instead of having to write from the ground up each time. The reality was that it did simplify development, but it shifted the focus: developers were able to spend more time building more complex and advanced software rather than hacking out repetitive tasks that could now be pulled off the shelf in the form of vendor produced programmable object ‘libraries’ of functionality. Critically it has made programming more accessible to the hobbyist, lowering the bar of entry considerably as rich development environments such as Microsoft Visual Studio and Adobe’s Dreamweaver provide the ultimate canvas on which to create software. Microsoft even does a starter version of Visual Studio for FREE – head over and get Visual Studio Express.
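To make the ‘Lego builder’ point concrete, here is a minimal, hypothetical illustration in Python (not tied to any of the tools mentioned above): serving a folder of files over HTTP by stitching together two off-the-shelf library objects, a task that would once have meant hand-writing a socket server.

    # Stock library objects from Python's standard library do the heavy lifting;
    # the 'developer' merely wires them together.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    def serve_current_directory(port: int = 8080) -> None:
        """Serve the current working directory over HTTP using off-the-shelf objects."""
        server = HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)
        print(f"Serving on http://localhost:{port} - press Ctrl+C to stop")
        server.serve_forever()

    if __name__ == "__main__":
        serve_current_directory()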

Microsoft is driving a similar evolution with Cloud Computing. The menial scripted grunt work is being replaced with autonomics dressed up in rich, point and click, user friendly interfaces and template based frameworks. The benefit heralded by the vendor is that systems administrators and developers get more time to focus on the higher return activities that can materially benefit and advance their organisations.

All good so far. But with such convenience as with OO Programming, the automation and abstraction of complexity does come with a health warning for vendors and users alike. With OO Programming the accessibility was still constrained to the technically adept who could architect, develop and code, but cloud computing has lowered the bar to a point where any computer literate individual can now start deploying significant computing capability with almost no programming or systems administrative training. This includes:

  • Websites complete with content management, file sharing or blogging capabilities.
  • Computing infrastructure with enterprise grade application servers such as SharePoint, MS SQL Server and BizTalk.
  • Virtual Networks & Virtual Private Networks – point and click secure connections between physical office networks and the cloud.

It’s becoming akin to a Server and Services Supermart!

This now presents significant challenges, as this speed of evolution exposes new vendor risks to be managed by the technical departments as well as the marketing machines, with marketing becoming the new driving force behind product release cycles and investment.

At a high level these can be viewed in terms of:

Efficiency

As we see in OO programming, default ‘objects’ or libraries tend to be compromises – jack of all trades solutions but masters of none. Whilst they may help accelerate development, they invariably contain more code and less optimised processing than would be desirable if a programmer wrote a dedicated ‘object’ specific to the demands of the program they were writing. For many scenarios this has little perceptible impact; the march of time and the evolution of processing power, available memory and storage in computing systems have masked the overhead of this inefficiency. With Cloud Computing, though, this has ramifications for Cloud users just lifting workloads into virtual machines hosted in the Cloud, as the inefficiencies of such a model soon scale up and risk eating away at the very economies of scale that make Cloud such an attractive proposition. The technical competency needed to architect and then build or evolve systems to truly leverage the unique characteristics of Cloud Computing is underestimated by many.

This has its challenges for the cloud vendors, as the fine performance margins demanded by the price race to the bottom get a boost from the inefficient use of their resources by first generation cloud users. The risk is end user disillusionment with the cost of running, combined with the exponential complexity and overhead of maintaining inefficiently migrated systems.

Security

The release of ever more end user and non-technical friendly Cloud toolsets to the market is a minefield of vendor responsibility that requires great diligence – a somewhat conflicting concept when considered alongside the rapid release cycles of Cloud technologies and the demands of marketing departments to parry competitor feature sets as they appear in flurries on the marketplace. With on premise solutions this was largely contained within private network boundaries and a slower software release cadence, but with the Cloud these solutions are iterating at a fragmented level and are ALL connected in some form to a public network, subject to the gauntlet of hackers that are scanning almost all registered Internet addresses every second looking for weaknesses. This means that vendors need to raise the quality of their development, test and release practices, the unique security context they provide their offerings, and finally the mechanisms for maintaining Cloud systems securely and up to date.

The downside for any vendor that falls foul is huge in terms of brand fall-out, loss of trust and credibility. As the rich user interfaces invite less skilled users who are less aware of the collateral issues their actions may have, the fingers are bound to get pointed at vendors when security is compromised. I do mean ‘IS’ compromised, because it is only a matter of time. Furthermore the maintenance and support services from these Cloud vendors introduce what I regard as the final mile in Cloud Service provision, and their absence presents additional ramifications for security. How do these end users know when to patch their solutions, or is this something the vendor does for them? If it’s the vendor, then whose fault is it if the user’s solution breaks when a vendor applies a security patch automatically? The complexity of the maintenance and support variable in the mix abounds.

For end users it is the ability to replicate and deploy solutions faster than ever before which is both the attraction and the challenge. This places a greater burden on systems support and maintenance, which if not taken seriously exposes organisations and individuals to compliance breaches and data compromises as solutions fail due to lack of basic maintenance. Again fingers will get pointed at the vendor BUT the responsibility will be far from crystal clear as far as end users are concerned, as reflection on the Heartbleed flaw in the OpenSSL libraries, used extensively across the Internet on mobile devices, PCs and servers alike, attests. Many organisations lack the basic support and maintenance discipline to identify, let alone know how to deploy, fixes across their on premise IT estate; Cloud systems don’t fix that but raise the stakes. Where you could once get away with lax, retrospective maintenance and support, with Cloud it is a fool’s errand. The Heartbleed debacle is not a one off, but heralds a new generation of challenges to Cloud vendors and users alike. It will almost certainly occur again in some shape or form; whether on the same scale or not is still to be seen. Use of commercial software may look like an insurance policy versus Open Source, but much of the commercial software today leverages Open Source components.

Reliability

Ultimately it is the end user experience that will drive this metric. At the pointed end of this is the unanswered question of just how reliable the Cloud is. We have seen all the main vendors experience significant outages. When your IT is in the Cloud and it goes down, you are at the mercy of your Cloud vendor to rectify that.

Microsoft is perhaps the most at risk of the big Cloud vendors. Microsoft’s traditional on premise business has been fundamental to the rapid adoption rates of their Cloud services by these same customers, as a consistent trust relationship pulls through. This is complemented greatly by the rapid maturing of the end user experience of its Cloud offerings to match its on premise experiences, making cross skilling a key selling point. Google and Amazon are well behind the curve on their solution usability, their attempts to build enterprise relationships have been largely unsuccessful outside of some discrete workloads, and their developer/partner ecosystems are negligible in comparison with Microsoft’s. Despite its rich Partner Ecosystem, Microsoft continues to struggle with the quality of support service across its Cloud offerings, and this is only just starting to raise the spectre of the support and maintenance challenge for end users as they commit to cloud. Amazon and Google in turn are hardly in motion across support and maintenance.

Support and Maintenance

Support and maintenance is undoubtedly becoming the new surface area across which security and end user experience get governed. As yet there is only one vendor addressing this space across the whole stack: The Arcadia Support Group, with their strategic tie up with the Gartner No.1 Infrastructure autonomics leader IPSoft, delivering a unique full support and maintenance service that encompasses the whole IT stack, extending deep into the custom application layer where no other vendors dare to tread, spanning both Cloud and on premise. This is the last mile in Cloud Computing, Support & Maintenance, where even Microsoft with its Partner Ecosystem is struggling to produce a consistent deliverable.

Intellectual Property Challenges for SMEs, A Big Data Opportunity?


This is a copy of a guest blog recently posted to the ‘Ideas Matter’ website – Intellectual Property Challenges for SMEs, A Big Data Opportunity?

IP is a challenge for many SMEs, regardless of the industry. Establishing patents for original works, such as logos or unique statements, is not particularly straightforward and has been further complicated by the fragmentation of the data resources that enable comparisons against existing patents. This places a reliance on expensive patent lawyers, and not always with any guarantee of success.

As the number of existing patents grows, there is a need for more transparency and resources to help businesses understand the patents that already exist and negotiate the best way forward for themselves. Consortiums such as Ideas Matter can help, but there needs to be much more available.

One possible solution is Big Data, particularly the new algorithms used for grammar checking and correction. Their development has heralded a new age of being able to see and understand relations in information and data that were previously beyond our grasp. Now, SMEs should have the ability to query disparate data resources via a common machine learning algorithm that can compare a prospective patent application against all those that currently exist within seconds AND provide conflict or matching accuracy as well as even the best patent lawyers can.
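As a purely illustrative sketch of the kind of comparison described above (the corpus, threshold and scoring approach here are invented, and a real patent search would be far more sophisticated), a simple text similarity model can already score a draft application against existing abstracts:

    # A minimal sketch of scoring a draft patent abstract against existing ones
    # using TF-IDF text similarity. Corpus and threshold are illustrative only.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    existing_abstracts = [
        "A method for wireless charging of portable devices using resonant coils.",
        "An apparatus for filtering particulates from diesel exhaust gases.",
        "A system for encrypting messages between mobile handsets.",
    ]
    draft_abstract = "A resonant coil arrangement for charging mobile devices without cables."

    # Vectorise the corpus plus the draft, then score the draft against each entry.
    vectoriser = TfidfVectorizer(stop_words="english")
    matrix = vectoriser.fit_transform(existing_abstracts + [draft_abstract])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

    for abstract, score in sorted(zip(existing_abstracts, scores), key=lambda x: -x[1]):
        flag = "POSSIBLE CONFLICT" if score > 0.2 else "ok"
        print(f"{score:.2f}  {flag}  {abstract}")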

The challenge now becomes working with incumbent IT and business; a familiar meeting of old and new technologies. Evidence shows that it can be done – one example is online learning in the United States. The capabilities developed there by ‘massive open online courses’ (MOOCs), such as Udacity (https://www.udacity.com), have resulted in automated assessment systems that could with a small leap of the imagination fulfil a similar function to allow SME’s to query patent databases cheaply and in almost real-time.

Ultimately, this is where the whole patent system needs to move. With increased globalisation and the online nature of business, we are seeing a levelling of the playing field between large Enterprises and SMEs. SMEs enjoy the same toolsets, reach and market potential as their larger competitors but unfortunately do not always have the same deep pockets to protect their innovation.

Big Data Oversight or Persecution by Algorithm?


We are at the event horizon of yet another seismic shift in technology’s progression and impact on our everyday lives. It is a shift that on the surface is all but invisible and little understood by many, but it is already resonating deep into the core fabric of our freedom and liberty as individuals and as society at large.

In the over simplistic throwaway tone adopted by Zuckerberg, aka Facebook, challenges to data privacy are brushed off with the statement that users with nothing to hide have nothing to fear – a myth debunked! The sad reality of these words is their naivety, as we live in the shadow of multiple examples going back across many centuries where, at varying scales, this attitude has undermined social freedoms.

The exponential rise of Cloud Computing, the utilisation of computing resources, with its lowering of the cost bar to data storage and ease of access to cheap computational systems, has opened up a veritable Pandora’s Box. This comes in many facets, for example:

Data Volume

Where less than a decade ago organisations and individuals would diligently prune great volumes of data to retain just the bare essentials, today data storage is so cheap such prudency has been swept aside and ALL data and information is being stored, leading to huge data warehouses of information being retained indefinitely. Local jurisdictional laws are being sidelined as technological nuances outstrip regulators’ ability to adapt and protect. The retention and use of Personally Identifiable Information (PII) beyond its original use case is now the norm. Anonymity mechanisms bandied about by companies as protection mechanisms are facile, as new database techniques allow data sets to be re-attached with over 90% accuracy, rendering ANY data stored or held at best ‘pseudo-anonymous’ – that being anonymous at the discretion of the Data Controller (the holder of said data). Then there is the invidious class of corporation that attempts, through its terms and conditions, to contractually acquire and retain FOREVER and for its own use ANY data supplied – Facebook, Google and Amazon being the principal protagonists.
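To show why ‘pseudo-anonymous’ is the right term, here is a deliberately tiny, fabricated example (not drawn from any real data set) of a linkage attack: a handful of quasi-identifiers is all it takes to re-attach names to an ‘anonymised’ extract.

    # Fabricated illustration of re-identification by joining quasi-identifiers.
    import pandas as pd

    # 'Anonymised' health extract: names removed, quasi-identifiers retained.
    health = pd.DataFrame({
        "postcode": ["EH1 2AB", "G12 8QQ", "EH1 2AB"],
        "birth_year": [1971, 1985, 1971],
        "sex": ["F", "M", "M"],
        "diagnosis": ["asthma", "diabetes", "hypertension"],
    })

    # A publicly available list (electoral roll, social media, marketing data...).
    public = pd.DataFrame({
        "name": ["Alice Smith", "Bob Jones", "Carol White"],
        "postcode": ["EH1 2AB", "G12 8QQ", "EH1 2AB"],
        "birth_year": [1971, 1985, 1971],
        "sex": ["F", "M", "F"],
    })

    # Where the combination of quasi-identifiers is unique, the 'anonymous'
    # diagnosis is now attached to a named individual.
    reidentified = health.merge(public, on=["postcode", "birth_year", "sex"])
    print(reidentified[["name", "postcode", "birth_year", "diagnosis"]])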

The conclusion is that ANY data ANY individual, organisation or third party may supply is almost guaranteed to be retained somewhere AND be amalgamated and used outwith the original scope or purpose for which it was given up. Examples, which are by no means exhaustive in detail or exclusivity, include:

  • UK National Health data – On the pretext of better diagnosis, this has already been demonstrated to be a commercial venture and data is openly shared and poorly secured.
  • ANY online advertising entity that reads your Web Browser cookies – Aggregation of browsing behaviour occurs real-time and is pervasive as browsing habits are shared by backend marketing companies, retailers and search entities as they drive their relentless advert targeting at users.
  • Google Search engine, email and Google Apps usage and content scanning – Under the pretext of service improvements no data is sacrosanct; emails and documents are scanned, and whilst these are professed to be anonymous machine activities, not human ones, they expose more than is commonly confessed to.
  • Facebook, everything you submit or anything another individual may post that relates to you is forever at risk of exposure – Facebook have repeatedly adjusted their security policies rendering content previously ‘Private’ as public. Facebook also reserve the right to use your images for their own advertising purposes as well.
  • Vodafone mobile tracking data, which is sold to advertising agencies – If you own a mobile phone you are one of the millions who have volunteered to participate in the biggest monitoring exercise the world has ever seen. Every move you make, every step you take, every connection is being monitored, recorded and made available.
  • Experian credit rating agency selling its database to marketing companies – Dictators of who can and who cannot, the credit rating agencies dictate our lives in hidden ways that risk severe fallout on individuals’ liberty as data errors multiply and impact credit scores, which are very hard to get corrected as these behemoths profess to be greater than thou at judging our credibility.

In the UK the politicians are now getting in on the act, as government bureaucrats have floated the idea of selling individuals’ tax information, albeit anonymised! Yes, you read that correctly: if you are a UK citizen, YOUR tax returns could be readily available in the public domain. As I have stated above and in earlier blogs, it is not hard for this data to be re-attached to your identity.

Quality & Accuracy

Data and statistical analysis research, for example, was historically confined by the real world practical economics of compiling data sets. Data was costly to collect, store and analyse, which led to the practice of statistical analysis of small data sets as representative of a larger body of data. The assurance of quality was preserved by the diligence applied in striving for accuracy and credibility of the data, as well as a representative spread across whatever criteria were appropriate for the scope of enquiry. Achieving the same levels of accuracy for statistical purposes across today’s petabytes of data is almost an impossible exercise; therefore data is becoming ‘dirty’ and subject to inaccuracy.

Today datasets are stupendously huge and so conveniently amalgamated that they demand a new approach, which has coined the term ‘Big Data’, where data quality has gone out of the window in favour of quantity. The principle being adopted is that completeness of data across all records relevant to a subject is no longer necessary because of the sheer volume of records that can now be referenced. Analysis of these huge volumes is possible due to the cheap and conveniently available storage and computing power supplied as a result of Cloud Computing and the development of dedicated ‘Big Data’ computational systems. Data supplied for one purpose today now ends up being influenced by records from disparate sources, with questionable outcomes.

Facebook is the outstanding villain in this regard, continuing to flout any regard for Personally Identifiable Information as it harvests user data and markets this to advertising companies, as well as reserving the right to use this data WITHOUT users’ direct consent.

Then there is the Google Flu predictor, the fallen poster boy. Google, in its adolescent rush for recognition in disciplines outwith its search capability, professed to be able to predict the annual Flu outbreaks in the US, and fell afoul of its own hype. On the face of it, to be able to predict the spread of Flu was a fantastic proposition of huge value and benefit to society and to the health organisations that annually struggle to respond to Flu outbreaks. Google programmers asked questions of its huge data resources – compiled from years of monitoring users online through its own search engine as well as any website that subscribed to its ‘Free’ analytics service, reading all emails that touch the Gmail service, monitoring Google Apps usage and scanning associated documents, as well as the tracking and recording of mobile activity on their Android platform – and professed to be able to draw what were assumed to be consistent insights from their data that paralleled the annual Flu outbreaks. Heralding vindication for their voracious data appetite, they claimed a breakthrough, only to have their self-adorned laurels cast asunder as subsequent years’ Flu outbreaks failed to reflect the Google predictions. The Google programmers, with their marketing hype inflated egos, were found to be human after all. They did not see the unpredictable impact of unrelated trending data within large data sets materially distorting their analysis.

Large data sets are like oceans: they have hidden depths. To extend the ocean analogy, there are big currents à la the Gulf Stream, and there are localised currents and tides, which in turn are influenced in unpredictable ways by wind, temperature and, of course, man. Google, in their human fallible haste, were in essence looking at something akin to a local tidal pattern when they thought they were tapped into the certainty of a data ‘Gulf Stream’. The Google Flu predictor is little more than an exercise in why data quality is still relevant and why ‘Big Data’ is still in its infancy and requires careful governance.

Transparency & Accountability

Data analysis is no longer dependent on man-made diligently audited and qualified algorithms but algorithms that evolve dynamically as they become abstracted through machine learning and AI (Artificial Intelligence) programming techniques. Today algorithms running against large data sets are no longer fully understood even by their developers and supervisors.

The aforementioned example of the Google Flu predictor is an early example of this, where advanced machine learning and AI algorithm programming techniques were deployed and evolved outwith the controlled understanding of their creators. Like a boy racer irresponsibly let loose in a Formula 1 car, accelerating off purposefully only to find themselves waking up in A&E (Accident and Emergency), thinking they were the driver only to find they were little more than a passenger. That is assuming they could even control the fickle balance of an F1 clutch and accelerator to get off the mark – OK then, they had automated launch control … enough of the F1 digression. The point being that even the big boys, Google, Amazon, Facebook et al, are still driving with L plates (learners) when it comes to Big Data, so be warned: those corporates thinking they have it sussed have some rude awakenings ahead.

Now let’s combine this algorithmic alchemy with the blooming volumes of data available to organisations, extrapolate this into the n’th dimension with mergers and acquisitions and operational memorandums of understanding that allow organisations to share and combine data, and the picture takes on an all too Orwellian perspective. A prospect too tempting to ignore for Governments amongst others – the NSA (US National Security Agency) springs to mind for some reason!

Don’t get me wrong, Predictive Analysis has been around for as long as data has been compiled and analysed, with great corporate and social success and benefit. Logistics is a frequently quoted market sector that uses it with great accuracy to efficiently route deliveries, saving fuel and increasing efficiency. The key point here being that they are working within a controlled data scope, albeit with huge data volumes.

The water starts to muddy as we move into the realms of Correlation Analysis. In summary, Correlation Analysis finds random relationships in data. That by itself is nothing earth shattering, but when those relationships are multiplied up across huge volumes of mixed data they start to reveal occurrences of factors or attributes that do not directly relate to the original query, and this starts to get into the realms of probability theory. That being: if A, B and C occur together, and there then appears an associated relationship with, say, P, then X, Y and Z are likely. These associated factors or attributes become a ‘Proxy’ which, when a particular variable appears, would dictate a high probability of a certain outcome. This ‘Proxy’ or associated relationship takes on an altogether different class of data insight extrapolation and has all kinds of implications.

Applied within diligently defined data scopes these Proxies can be hugely insightful. Airlines, for example, use the monitoring of vibrational and operational outputs from indirect and imperceptibly associated parts of an airliner to predict part failures and optimise preventative maintenance procedures, saving millions and raising safety standards in the process. This is achievable because the data sets are controlled in scope. The Correlation Analysis allows Proxies to become consistent enough to flag up the probability of an occurrence, such as a part failure, that would have been impossible to extrapolate from more rigid traditional algorithm programming techniques. Machine learning and AI techniques allow the man-made algorithms to ‘evolve’ and produce correlations that extend in scope and complexity beyond the capability of the original programmer or programming team, to the point where the algorithms themselves are no longer recognisable as they become exponentially complex and interwoven.
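A toy sketch of the proxy idea (fabricated numbers, and vastly simpler than a real predictive maintenance system) shows how a signal that was never part of the original question can emerge as the best predictor of an outcome:

    # Fabricated data: find which measured signal best 'proxies' part failure.
    import pandas as pd

    data = pd.DataFrame({
        "bearing_vibration": [0.2, 0.3, 0.9, 1.1, 0.25, 1.3, 0.4, 1.0],
        "cabin_temperature": [21, 22, 21, 23, 22, 21, 23, 22],
        "part_failed":       [0,   0,   1,   1,   0,    1,   0,   1],
    })

    # Correlate every measured signal with the outcome of interest.
    correlations = data.corr()["part_failed"].drop("part_failed")
    print(correlations.sort_values(ascending=False))

    # The strongest correlate becomes a proxy for the outcome, even though
    # nothing in the original question mentioned vibration at all.
    proxy = correlations.abs().idxmax()
    print(f"Best proxy for failure in this toy data set: {proxy}")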

As these algorithms become more exponential in their scope and complexity, their output becomes almost impossible to validate. This drives an interpretative behavioural change from questioning WHY an outcome has occurred to simply accepting WHAT is produced. In the context of the airline, the data is confined to the aircraft, albeit down to the most innocuous vibration. But in the context of our Google Flu predictor example earlier, the data knows no bounds. The consequences are therefore unpredictable and subject to unknown ‘currents’ of influence, which means blindly accepting ‘what is produced’ while being unable to answer the why – a worrying regression as data influences more and more of our lives.

For example, if a middle class individual, employed locally, of religion A, who shops at supermarket B and frequents websites C, suddenly starts travelling internationally to locations X, Y or Z, then there is deemed to be a high probability that he is a terrorist. An extreme example, where the travel ‘Proxy’ matched with the A, B and C factors equals a high probability that the individual could be a terrorist.

This is called the ‘Minority Report’ syndrome, where individuals are pre-judged on probability outputs from Big Data Correlation Analytics, and not on their actual actions. Such scenarios warn of a future where individuals are judged and found guilty NOT on their actual intent and actions but on probability. A frightening prospect, and real risk to freedom and liberty.

This is not far removed from what is already going on in reality. The Memphis Tennessee Police use an algorithm called CRUSH (Criminal Reduction Utilising Statistical History), to extrapolate from that Criminal Statistical History data the ‘probability’ of anti-social flare ups in certain parts of the city.

Then there is the pernicious Google PageRank, the closely guarded secret sauce of the Google search engine that impacts commercial destinies every time Google chooses to tweak it.

If Big Data is to grow up it will need to subject itself to checks and balances like any other facet of our lives. Organisations need to be accountable for their decisions, and Correlation Analysis of the type articulated above will require algorithms to be transparent and the organisations behind them to be held accountable.

A good start here would be the Google PageRank algorithm. This, I believe, has now reached a point in its maturity curve – combined with the anti-trust practices of its owner (Google) – where it requires independent auditing. In an ideal world I would hope to see Google adopt an Open Source approach and allow the IT community to vet the algorithm; after all, RSA have done so with their encryption technology, amongst many other organisations, without too much loss of market share. In fact their openness has enhanced their credibility. I suspect not in this case. After all, Google is little more than a search engine company and its only hold on value is the control it wields as the signpost and advertising hub of the Internet.

This is as you will be able to deduce not going to be straightforward, but then I suggest neither were many of the compliance and independent auditing practices we now regard as the norm when they were first postulated.

Cloud Success means ‘Hearts and Minds’ Microsoft


The success potential in the cloud for Microsoft was laid down many years ago, well before cloud was even on the horizon – a unique differentiator that neither Amazon nor Google can challenge … yet.

What I am talking about are customer ‘Relationships’ and the trust that those relationships have fostered, especially in the Corporate and Enterprise markets.

The reality is that Microsoft is the only provider who can deliver to both on premise and the Cloud, and for the foreseeable future Hybrid is the overriding model organisations are using. Whilst a feature shoot out would see Microsoft struggling in some discrete areas versus the likes of Amazon, the reality is that what Microsoft Cloud offers addresses the majority of organisations’ needs today without them having to stray into untrusted waters, and the offerings from Microsoft are only going to get better and narrow the gap in time, to be sure.

As for the cost argument, that is a no brainer, Microsoft has made the commitment on price matching the competition. You can bank that one.

So back to the Hearts and Minds. The key to this is the Partner Ecosystem and how Microsoft engages it and brings its awesome capability to focus on the task in hand. The Partners represent the contact area through which the majority of Microsoft business is generated. Through its Partners, Microsoft has the Corporate and Enterprise customer credibility that key competitors like Amazon and Google are struggling to gain. The competition are realising how slowly trust and confidence get built; Microsoft is decades ahead of them, but Cloud is threatening to be a bit of a leveller if they don’t watch out.

Success in the cloud, as Kevin Turner made very clear, is non-negotiable for Microsoft. His famous slide of a tunnel with a bright light at the end sums this up. Whether we are in for a train wreck or a sunny day is still too early to judge, but one factor that will make this more certain is decisive action to reverse the disillusionment Partners are feeling with their Microsoft relationship as they adapt to new cloud business models and the Microsoft dimension that now exists in service delivery. The relationship with the channel is an up front and central priority if the current wave of ambivalence is to be stemmed before it develops a momentum that will be hard to reverse.

The elephant in the room is that Microsoft is perceived as moving into traditional Partner territory with its own services, and to be blunt, it is more than a perception, it’s a reality. Let’s not argue that point. Another thing that always frustrates me, and is more visible than ever with Cloud Computing rampaging across the marketplace, is the illusion of role security in IT (Information Technology). Be that an IT business offering or an individual IT pro in a customer’s IT department, the hard reality is that anyone in IT who thinks their current technology skill set will last him or her a career is delusional. That goes for Microsoft Partners with their service offerings and products AND Microsoft to boot, with its attitude to how it defines its Partner engagement.

In the medium to long term Cloud Computing will change the face of IT as a workplace and marketplace. People who don’t like change will be among the laggards holding back their businesses from capitalising on this new computing paradigm; they should NOT be working in IT. They might as well stand on the beach and try to hold the tide back.

Just take a look at what is already happening and visible for those with their eyes open. Start-ups don’t build out Datacentres, they launch in the cloud. Why? They are not encumbered and can do the most cost effective and logical thing with the choices available.

Datacentre capacity and scale today is accessible to anyone and can be deployed and run by 2 men and a credit card in less than a week. OK, the dog’s still there, just under the desk – after all, the 2 men can do this now from home; they don’t need to build or maintain datacentres anymore, nor the infrastructure or its maintenance for that matter ;-)

So why don’t established businesses do the same? Their Cloud adoption is encumbered by the conditioning of entrenched IT and a lack of Trust. It is the Trust factor that is key, which is where the relationships are critical, and Microsoft has the key to the kingdom.

People buy from people = RELATIONSHIPS = people partnering with people.

Key to any sustained relationship is a positive experience. Yup, that can have some esoteric dimensions to it, but we are talking IT here, OK, so back on theme… The magic word in services is EXPERIENCE, and specifically quality of experience. It’s not product feature sets, which is the hard cultural shift Microsoft is having to make; yes, you have to have the goods, but that is not the end game anymore.

Microsoft Partners deliver that personable face of the Microsoft ecosystem and the valued customer experience. The partnering structure Microsoft imposes on its partners, though, has plummeted in comparison, as I wrote in my last blog ‘Microsoft Partner Network (MPN) in a Modern World’. The main point being that a significantly revamped Partner Program is required to reflect the commercial prerogatives that drive the new Cloud world.

For Microsoft the Holy Grail is to re-engage its Partners in a new way. The symbiotic relationship has never been more important than it is with the shift to a Cloud world. Microsoft needs those relationships to be transitioned to the cloud, and it is not in a position to do that itself.

In the old world, Microsoft, you were the factory that sat on your hill in the American North West, with a marketing engine that would fire off great salvoes of promotional air cover under which your partners would get up close and personal with the customers, refining the messaging for local consumption and ringing the till. Packaged product would flow from the factory to the Partners, and Partners would do and own the magic. The trust and relationships with customers were forged, fostered and cherished for mutual benefit by your Partners – Partners who rise and fall on the quality of the customer experience they offer and the profitability they can nurture from that engagement. The importance of the profitability message cannot be overstated; it’s not that it never existed between Microsoft and its Partners, it is just that with Cloud Computing the model is changing, and Microsoft is now directly impacting it in real time.

Microsoft is now in the field with Partners, expecting adoption as a direct extension of the Partners’ service delivery and, by extension, their IT teams, as Partners are now beholden to Microsoft as an extension of their support service chain. The established Partnership Trust Microsoft enjoys is getting them in the door, and Partners are putting their goodwill and name on the line in blind trust. But Partners are starting to find their newly adopted IT extension(s) are not fully aligned to what impacts their business. Their new IT team dependency (Microsoft) is not delivering as expected, or rather as Partners would expect from their own IT teams. Consequently Partners are feeling they are no longer 100% in charge of their own end to end solution delivery, and therein lies the pinch: where control or coordination is missing, costs can quickly spiral and profitability evaporate. Quick on the heels of which go quality of service and customer experience, and that hallowed trusted customer relationship – a veritable meltdown, like a China Syndrome. In the wings await a bookseller and a search engine vendor hungry for custom.

When I am speaking to Partners I call this ‘A New Age of Trust in the Cloud’. For example, ISVs (Independent Software Vendors) experience this when their maintenance and support contracts don’t get renewed. Why? Because a born in the cloud start-up has just mined their customer base with a cloud offering that blows away the incumbent legacy ISV offering. By the time the ISV realises what is going on it is too late. Service Partners see the same thing happening when the phone stops ringing from established customers with whom they have neglected to develop their Cloud credentials.

In Microsoft’s case its Partner Program, MPN (Microsoft Partner Network), is hitting up against this like a ‘Trust Event Horizon’, testing that Partner faith just enough to be dangerous. A question mark over a relationship can be a gnawingly dangerous thing. Trust is slow won but can also be frighteningly ephemeral.

Microsoft you have it in your power to rewrite the rulebook and engage Partners at new levels of marketing and operational intimacy and in so doing do credit to the trust and confidence of your Partners. Let’s see some of that Blue Ocean thinking, as Cloud is re-writing the IT rulebook it is time the masters of the Partner ecosystem did the same with Partner Channels.

Microsoft and Partner success will be found in the teamwork that is required to deliver end-to-end value plus quality, experience rich, scenario focused propositions into the market. That is not something Partners can do independently or in a semi-detached way. Partners have the experience at the coal face and know how best to engage the customers, and Microsoft controls the engines.

Some pointers as to where to start:

  • Service integration – Microsoft, you are now an extension of your Partners’ IT departments; they need that same visibility and accountability from you. Extend your service help desk to your Partners and allow your Partners to become an extension of your own support teams, with access to escalate to product groups. This has to happen to deliver to the market expectation of a single point of support.
  • Sales integration – Just as with the services, there are now questions that only Microsoft can answer, so when we are in the field and preparing for a client engagement we need the connection and joined up account planning. Yes, we like the name change from PAMs (Partner Account Managers) to PSEs (Partner Sales Executives); now let’s see the action – sharing visibility of managed (and breadth) accounts and working with the internal account teams, akin to an extension of the Partner’s sales team. Customer relationship management success will be maximised with a consistent single touch point, the Partner.

This is something Microsoft has been doing internally already, just look at how GFS (Global Foundation Services) integrates and works with the rest of Microsoft to deliver the great suite of Microsoft Online Services. My suggestion is to follow the same principles, an extension of this type of service engagement model, into the Partner base.

Sounds great, doesn’t it, Partners? But the give back is that this will only work with fully committed and engaged Partners, the ones Microsoft can trust to engage at a level of professionalism and operate within an NDA (Non-Disclosure Agreement) to share this level of operational integration. It will be no good anymore just signing up as a Partner and expecting an open door; it will require a new level of proactive engagement with Microsoft.

Challenging when working on Internet Time which doesn’t wait for anyone.

Microsoft Partner Network (MPN) in a Modern World


If you are in the Microsoft Partner ecosystem you will be familiar with this time of year; it’s that mad scramble to tick the boxes to ensure your continued membership is renewed and, critically, at your chosen level of Certification – Gold or Silver being the two classes of real value.

Some of you will probably have also received email notices of some Competencies having just expired. In our case it transpires that our Microsoft Certified Professionals for one of our Gold Competencies no longer qualify with their certifications for that competency. No notice, no details – not a partner friendly experience, forcing on us a time consuming detective process to find out why. It is one of many inconsistencies endemic in the system; why can we not have visibility of this as we do with Customer Reference expiry? Customer References are easy to spin up – we are all active in the market and have great customer relationships – but Certification takes time, with billable resources side-lined into training and exams.

This is not the first time someone has flipped a bit at the backend of the portal and rippled some premature or undesirable changes across a whole group of Partners if not the whole network.

It is time this stopped. Partners have better things to do, the world has moved on, and the Microsoft Partner Program ‘Microsoft Partner Network’ (MPN) is proving to be ill suited to it. Yes, it can muddle on, but it will lose traction and impact the effectiveness of the Partner ecosystem.

The current version of the Microsoft Partner Program ‘Microsoft Partner Network’ (MPN) was launched at the Microsoft Worldwide Partner Conference (WPC) in Washington DC in 2010. So it is somewhat prophetic that I find myself sitting in ‘The Walter E. Washington Convention Center’ in that very same capital of US and global politics, Washington DC. And for a short week in July this year it will once again be the epicentre of the Microsoft Partner world as Microsoft revisits this landmark venue for the 2014 Worldwide Partner Conference (#WPC14).

Each year since the seminal 2010 MPN launch, MPN has gone through significant retrofitting and adaptation exercises, with at least one significant identity rebrand. This costs Partners and impacts profitability no matter what value Microsoft may attest to the program. At WPC 2014 in July we are being promised once again a rehash, as MPN struggles to keep pace with a fast moving IT world and reflect Cloud in the program. Change that will impose further cost and burden on those Partners fit and willing enough to jump through MPN’s hoops. For that is where it’s got to for many Partners: a necessary exercise that delivers questionable direct value and imposes onerous costs if you aspire to the highest Gold standard. The $5,280 (£3,900) membership fee pales into insignificance when the cost of the certification burden on companies is calculated. For the average Systems Integration Partner the cost of certification is over $100,000, and for some of the other Partner types, such as Dynamics, it can be as much as twice this PER ANNUM.

The reality is that for all the investment in MPN, Partners struggle to articulate its value when questioned. Many customers do not look beyond the Partner status, and those that do look for Gold and then rarely go any deeper into the actual competencies that Partners are Gold in. MPN is regarded more as an internal tool for Microsoft to manage its partner ecosystem, prioritise its own internal resources, talk strategy with partners around its products and drive technical competency, amongst other things such as qualifying for incentives.

Yes, there is a fistful of product licences that in commercial terms would come in at over £50k, but the reality is that with the majority of Microsoft partners sitting in the 15-30 employee company size bracket, the Internal Use Rights (IUR) are seldom fully tapped, and let’s face it, there is a quid pro quo in Microsoft ensuring its Partners have every reason to use its products.

The point is MPN is not servicing its Partner audience and is proving to be ill fitted to the modern IT world that it is trying to support. I hear this across both the managed and unmanaged Partner categories: the Microsoft Partner relationship is drifting apart. There are so many other options out there in the market today, and MPN is not delivering.

The effort to maintain Microsoft Partner competencies is disproportionate to the profitability contribution. MPN activities do not directly point to profitability for Partners. This needs to flip, each requirement on a Partner should be judged against how that requirement drives profitability.

I have great respect for the effort, diligence and time taken by the Microsoft MPN team in defining this program all those years ago. It was not rushed; it was deeply researched and widely consulted on within the Partner ecosystem, paying great respect to the impact that it would have and the need for it to work well into the future. The problem was that the market outside was at an inflection point, in fact on three fronts:

  1. Cloud was bursting onto the scene, with all its service oriented operational demands and subscription based financials.
  2. Marketplaces were becoming the de facto focus for application retail, reviewing and marketing.
  3. Social, the buzz word (excuse the pun), was changing the whole cadence of how consumers and businesses alike evaluated purchases and developed new trust models, and start-up businesses started carving fat slices out of the traditional big vendors with highly visible and agile social media ‘community’ based marketing.

MPN, for all its polish and nurturing into life, sadly did not, out of the box, perform against any of these new classes that continue to define our industry. One it maybe could have adjusted to, but all of them hitting together in Internet Time has proven in retrospect too much to absorb. The retrofitting has seen some degree of Cloud acknowledged through the program, but that has not instilled much confidence – akin to slapping a GTI 16 Valve badge on the back of a 1.0 Litre and swapping out for some wider rubber, it does not really address the inadequacies under the bonnet.

I could go on but I would prefer to look forward at an opportunity this offers.

How then does MPN service the Partner ecosystem and Microsoft across these key criteria and the needs of a modern IT marketplace? My thesis is that the orientation has to be on Partner Profitability and customer/market experience. This focuses on building a marketplace as the hub of a program.

There is opportunity for a bold decision to be taken, a decision that would show some blue ocean thinking and send a clear message across the industry, an industry that is hard set in its ways of old style vendor certification and accreditation. Microsoft could once again put some day light between it and the competition who are coming at these self-same challenges from different directions.

A decision to set the pace and not follow the crowd, in the new real time Cloud, subscription based experience driven market place.

But first, like any good strategist, as the old saying goes, ‘Make sure you can feed the bear’. This would require a commitment and tenacity that cannot fall foul of cultural contamination, brow beating or budgetary cuts. Commit, execute, deliver and reap the rich harvest.

  1. Vet and acquire (or license) third party platforms:
    1. Marketplace.
    2. Payment systems. There are plenty of them out there servicing banks that are up to date, compliant and fit for banking so would service Microsoft Online Billing requirements in their sleep.
    3. IT Community platform (e.g. Spiceworks).
  2. Integrate the platforms into what becomes the go-to resource for partners, customers and Microsoft for anything Microsoft, replacing the MPN Portal, the current fragmented marketplaces, Pinpoint and the Microsoft Account platform.
  3. To be a Partner you would need to transact your Microsoft business through the Microsoft marketplace – yes, ALL classes of partners.
  4. Partner accreditation is attached to real time transactional and end user experience review criteria. Partners would be ‘invited’ to join as Gold or Silver as their criteria meet set thresholds; the marketplace categories, the marketplace reviews and their service/product offerings define their competencies de facto (see the sketch after this list).
  5. The marketplace will identify Partners in the most honest way: market forces. By all means grandfather over Partners for 6 months to keep the peace.
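A hypothetical sketch of point 4 (the thresholds, field names and partner records below are invented purely for illustration): Gold or Silver invitations driven by live marketplace transactions and review scores rather than exam-based competencies.

    # Invented thresholds and data: accreditation from marketplace performance.
    from dataclasses import dataclass

    @dataclass
    class PartnerStats:
        name: str
        annual_marketplace_revenue: float  # business transacted through the marketplace
        average_review_score: float        # 1.0 - 5.0 from customer reviews
        review_count: int

    def accreditation(p: PartnerStats) -> str:
        """Invite a partner to Gold or Silver when marketplace criteria meet set thresholds."""
        if (p.annual_marketplace_revenue >= 500_000
                and p.average_review_score >= 4.5 and p.review_count >= 50):
            return "Gold"
        if (p.annual_marketplace_revenue >= 100_000
                and p.average_review_score >= 4.0 and p.review_count >= 10):
            return "Silver"
        return "Member"

    print(accreditation(PartnerStats("Contoso Consulting", 750_000, 4.7, 120)))  # Gold
    print(accreditation(PartnerStats("Fabrikam Services", 150_000, 4.2, 35)))    # Silver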

A high risk approach would be for Microsoft to reinvent the wheel and write its own. If there is something in-house that is off the radar and is tried, tested and proven in anger, as many of these third party solutions are, then fine, don’t ignore it. Otherwise this would be a challenge for Microsoft – too many egos and techies – so go for the path of least resistance, kick out the blockers and promote the doers. You need this yesterday.

The exciting dimension to this formula is the reality that I believe it could occur in a single fiscal cycle (if not faster). We have seen banks set up in shorter periods of time, so don’t tell me this could not happen for a Partner Channel system.

Amongst many other benefits it could deliver:

  • One Microsoft marketplace.
  • One Microsoft resource directory (Pinpoint, the current Partner directory, can be retired).
  • One Microsoft view on its Partner pipeline.
  • A modern agile approach to the market that is familiar and accepted by consumers and business, making reviews and customer satisfaction as simple as it is on eBay or any other ecommerce site.
  • Pain goes away for partners. For the majority of small partners it will also mature the final mile of their marketing, sales and billing management, and for the bigger partners an API, in good old Cloud fashion, to interoperate with their ERP systems.

It needs something. The above may be a pipe dream, but something has to happen, the current MPN is struggling and as Einstein said, ‘keep doing the same thing and expect a different result = insanity’ or something to that effect….

I have a suspicion that if MPN was pulled tomorrow, 90% of Partner activity would continue unaffected. That is the degree of detachment the program has from reality. Now put a marketplace in the mix, and tools that help partners drive their profitability, and that becomes core and influential.

For now Partners it’s just more of the same. See you at WPC 2014!

Microsoft OneDrive, What’s in a name?


‘Microsoft launches OneDrive globally, with some freebies’.

More on the freebie ‘Microsoft to give 100GB free OneDrive storage to early birds’.

So it’s out with a big fanfare and more obfuscation for the humble consumer and business user alike as they get their heads around a name change and a few new bells and whistles but still need to struggle on with some core issues.

As with all things tech today, by the time you read this things may well have moved on, but at the time of writing and probably a while thereafter this will still reflect the state of play.

The naming of OneDrive (Formerly SkyDrive) and OneDrive for Business (Formerly SkyDrive Pro) represents a continued naming convention aberration that Microsoft suffers from in a long history of such faux pas:

  • Explorer – Do you mean Internet Explorer or File Explorer?
  • Exchange Hosted Services or did you mean Hosted Services for Exchange?
  • IIS or IIS? – Was that ‘Internet Information Server’ or ‘Identity Integration Server’?

And don’t forget the mother of all confusions with Windows 8 and Windows 8 RT. As if they did not get enough from that then they go and do the Surface and Surface Pro thing!

The renaming to OneDrive, whether a quaint echo of the home of Microsoft at ‘One Microsoft Way’ or the more obvious ‘Only Drive you will need’, was a missed opportunity to correct the confusion, but no, in blinkered mode Microsoft crashes on regardless.

Personally I think OneDrive would have worked nicely for the consumer application, but something more akin to the former ‘SharePoint Workspaces’ comes across as more meaningful for the business iteration. After all that’s exactly what it is, SharePoint Workspaces on your local PC.

If the problems were all in the name then that would be the end of the issue, but it’s not. The functionality of both iterations is well below standard for Microsoft. If you get the right people in a corner they will confirm that SkyDrive Pro (OneDrive for Business) is broken. Yes you read that right, and if our experience with clients and our own internal use has anything to go by we cannot agree more.

OneDrive for Business may be a fresh name but the authentication, sync and reliability issues all remain.

Remember ‘MESH’? It worked, it was rock solid, so what happens? It gets dumped in favour of an inferior ‘Live Mesh’, soon to be ‘SkyDrive’ and now OneDrive. See my blog on that at the time for some history.

OneDrive still maintains a long list of poor end user experience and feedback characteristics including:

  • Online / Offline indicators could be clearly identified on file Glyphs, not hidden in a distant column only visible in detail view.
  • Users should have a choice on set-up about what is held offline or online so they start from a known baseline.
  • The toolbar indicator of activity on the desktop is coming back at last – why would such a fundamental visual cue have been removed??
  • The Windows 8 App is an exercise in screen inefficiency and lacks a detailed view.
  • Sync’ing needs to be more network savvy. It still visibly impacts network activity and cannot be easily managed from the desktop short of switching it off. Some form of ‘Pause’ feature with a visible flag to identify outstanding sync items is needed. It would also be useful to prioritise uploads.
  • With privacy and security so important there should be greater identification of file sharing, again with some form of icon at folder level, so inadvertent sharing does not happen.

OneDrive for Business is simply broken, for example:

  • Hours lost to corrupted files that seem to occur due to poor sync management.
  • Offline clash management seems not to work most of the time, defaulting to a red cross and a failed sync.
  • Login demands (made worse when you are running Single Sign On). One trick here is to flush your cached credentials in the ‘Windows Credential Manager’, but this should not be necessary.
  • Running multiple O365 accounts, there is no clarity as to which account it is asking you to log in to! Take a look at the dialogue box below and guess which resource it is accessing. Easy if you only have one Office 365 resource you’re syncing, but if you have multiple, this user experience needs enhancing.

  • The Sync Manager is basic, like a developer’s prototype.
  • In Windows RT it can only sync with your default My Site – no SharePoint library connectivity – and then only online! This gives little advantage over accessing the SharePoint site directly.
  • Why can I not sync at a folder level within a SharePoint Library? This renders the tool somewhat useless in many scenarios, as the sync volume is so large when taking a whole library offline and I just want a folder.

I can go on, but I think you get the point.

As for a call to Microsoft Support well that seems to end with the suggestion of removing a sync’d Library then re-connecting it. Not well received when you are working in the real world of mobile broadband, or even on a small business network, being asked to pull down gigabytes of data all over again does not win friends. It would be nice to be able to re-attach volumes to a new OneDrive for Business instance to save re-syncing everything again.

That, I regret, is not the end of the tale. The interaction with SkyDrive in either of its iterations is having fall-out on the great MS Office applications’ boot up times, with the applications sometimes failing to boot up full stop! We are still analysing the Office impact, where Office Pro has become so Sloooow to open anything in these drives even when offline, so the findings shared here are without a clear solution.

The responsiveness of MS Office 2013 is a real end user issue when using SkyDrive/OneDrive. Users are killing the apps thinking they have hung and then trying to start them again, causing a chain reaction of corruption when opening apps by clicking on a file in a sync’d folder. Our advice is to start the app first, then open a document using the ‘Open’ menu option from within the app. Not ideal, and hardly an optimised user experience, but it resolves the issue of apps like Word and Excel hanging and getting corrupted. When using Office Pro with Office 365, the repeated opening and closing of an app can cause a corruption requiring the Office install to repair itself. Sometimes a quick repair does it, but often a full repair is required which demands an Internet connection – OK on a fixed network connection, but costly if not impossible on a mobile or public WiFi link, and a complete nightmare when you’re on an 8 hour flight and Office refuses to do anything without an Internet connection!

This goes back to the core issue of Microsoft as a service company. They are falling foul of their heritage as a product company in their attempt to be a services entity. Services are about end user experience, and the OneDrive products come across as a collection of features first and a functional experience second. The danger for Microsoft, now that ownership no longer ties users in, is that a poor experience will see users up and running with a competing solution in less than 15 minutes. Welcome to the world of services…

Is it any wonder Dropbox and Box are so sticky in the enterprise? Microsoft should thank its 'Supreme Principle' of choice that Apple are so inept with iCloud, or there would be more heat from that goliath. As an aside, the fact that Apple's iCloud uses Windows Azure is an interesting bit of trivia that would probably have Microsoft seeing money in its back pocket if iCloud were more successful!

Telco plays fast and loose with Customer Login data


My desk had the morning's pile of junk mail on it, opened by my PA, which, off the back of a late Burns night entertaining customers, got short shrift as I cascaded it into the waste paper recycling bin by my desk. My mind was still working at getting itself clear of the night's excesses, grateful I had been sober enough at 02:30 to break out of the slipstream of my compadres, who were hell bent on making a morning of what had already been a great night.

As I coaxed my mind out of its crawler gear to focus on the day's priorities ahead, something like a retinal echo bounced an image off some grey matter deep in my cranium that made me glance back into the bin by my desk, now engorged with junk mail. No, I did not have that much to drink last night; my stomach was rock solid, and whilst tired I was by no means hung-over. What had caught my lagging eye was an unsolicited promotional mailing booklet from one of our suppliers with both my name and our company name clearly printed on the front. My intent was simply to redirect the junk mail to the shredding bag, as we have a company policy that anything with company identifiable information on it gets shredded. Thus I found myself taking the short stroll to the printer room's shredding bag, winning a few more moments' grace and delaying the inevitable of anything more mentally taxing than shuffling paper around.

What happened on the way to the shredder bag was one of those things you wish you could bottle and sell; my fortune would have been made, and I would have stolen the hangover cure market overnight! Alas these things are elusive, and whilst that wistful dream accelerated away down one thought process, my complete and undivided attention was captured by the back page of the booklet from our telco call handling service provider, Windsor Telecom. I was now fully focused and firing on all neurons.

Read that one again: WINDSOR TELECOM…. a bunch of Muppets you may wish to avoid after reading this.

After the usual double and triple checks that all faculties were functioning and reporting correctly along their respective neural pathways, that feedback was within usual tolerances, and no extremities had suffered any unexpected impact or interference, apart from a growing heat under the collar, I returned to my desk with the aforementioned junk mail.

A reprieve, I hear you say; some enlightening, must-have, cannot-resist, business-differentiating, competitive-edging new service had caught my eye? Am I going to share with you the intimate details of one of those elusive 1% pieces of junk mail that actually delivers?

Well, let me put it this way: it provided me with the material for this blog. Far from being a valuable piece of promotional mail, this was perhaps the most blatant and ignorant breach of even the most basic rules of privacy and data protection I have seen for a long time. Against the backdrop of the last few months' escalation of data breaches across retail and business, it is all the more shocking that Windsor Telecom remained so unaware; after all, they are meant to be an IT company, albeit a telco derivative.

What shocked me out of my stupor was the sight of our service login details printed in large font on the back page of said booklet! Combined with the company details and principal contact clearly printed on the front page, the exposure could hardly have been more complete. I do not know how many customers were targeted with this same mailshot, but I doubt we were alone.

With these login details any individual would have full access to an organisation's virtual telephone service. This would include control over the numbers, the routing and mapping of those numbers, and additional data services such as call recording and logging, to name just two of the most obvious high-risk exposures.

OK, I hear you saying it is hard copy not digital, unlikely to have a high risk of being compromised: a red herring. How do you think these details got onto the booklet? I doubt the cost of a secure printing facility was incurred, this is junk mail after all; more than likely the data was emailed or FTP'd (File Transfer Protocol), or even worse USB'd, to the printing agency, possibly via a third party marketing agency first. So what we have is thousands of secure customer login details in some form of digital file circulating between three organisations, a digital file which I doubt was encrypted or subject to a secure chain of accountability. They now quite possibly reside on any number of machines across multiple organisations and individuals. This has worrying ramifications across other areas of security for a telco service.

All of that is somewhat academic. The crux of the issue is that here is one of the new generation of Software as a Service telecoms providers lacking a complete grasp of their responsibilities in the high risk world of online services. Software as a Service (SaaS) providers are an aggregation point for valuable data and a soft target for hackers. This is not new and has been broadcast in mainstream messaging for over 10 years, so there is no excuse for not being aware; see 'Hidden risks of software-as-a-service'!

Read the UK Government's '2013 Information Security Breaches Survey' conducted by PwC. Some frightening highlights include:

  • 93% of large organisations had a security breach in 2013.
  • 87% of small businesses had a security breach.
  • 113 is the median number of breaches suffered by each large organisation (up from 71 in the previous period).
  • 17 is the median number of breaches suffered by each small business (up from 11 in the previous period).
  • 57% suffered staff-related security breaches (up from 45%).

Read the executive summary of the report; it is frightening. Despite the raised awareness of security, the 2012 to 2013 period shows increases in BREACHES, not just attacks; these are actual compromises. Failures by organisational personnel stand out as one of the largest risk areas.

We live in an age of Advanced Persistent Threats. This is not going to go away, and organisations who aspire to provide our services and earn our trust must go further and invest more than ever before; there are no short cuts, only ignominy.

I have written this article because it is time users were made aware of the responsibilities they are placing on SaaS providers, and ultimately for the providers to know that the honeymoon was over long ago: play fast and loose with customer data or the integrity of their trust in your services, and expect to be named and shamed.

Windsor Telecom, regard yourselves as having been put on notice: you have failed, and you have breached a trust that will be hard to win back.

SharePoint 2013 Apps – Inspirational


SharePoint 2013 Apps: what's new? We had them in SharePoint 2010, didn't we? Yes, and a lot has changed, but for those with a significant investment in SharePoint 2010 there is nothing you should have to worry about. Let me explain…

  1. Your investment in SharePoint 2010's full or partial trust sandbox app models will still work and is still supported, with the single caveat that you (or your developers/partner) know how to code for SharePoint and are not doing anything knowingly coded outside the functional support envelope of SharePoint 2010.
  2. In SharePoint 2013 Microsoft has opened up just about every SharePoint operation through the Client-Side Object Model (CSOM). Yup, you read that right: from your client (workstation or other unattached compute platform) you can leverage SharePoint operations without installing anything on the server.
  3. Following hot on the heels of the security concerns that will be spawning in your mind from Point 2 above, Microsoft has implemented a robust security layer around the new App model that effectively constrains access. It is a very neat solution, perhaps a subject for a future blog.
  4. Solutions built on the new SharePoint App model run with NO SharePoint server side code. Magic? No, a very modern approach to server architectural design that is very much in the Cloud paradigm.

Let’s face it if this was not the case then Microsoft would be building a road to perdition for itself. Instead Microsoft has taken a very creative approach to supporting the past whilst fully embracing the future, and fully underwriting their Cloud credentials and intent.

The future is flexible, inclusive and interoperable. You no longer need to be a .NET programmer to write fully featured apps for SharePoint AND deploy them commercially. With core web development skills SharePoint 2013 is now a portal for all.
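
As a flavour of what 'core web development skills' means in practice, here is a minimal sketch of reading list metadata from a SharePoint 2013 site over its REST API, with nothing deployed to the SharePoint server itself. It assumes an on-premises farm using NTLM authentication and the third-party Python packages requests and requests_ntlm; the site URL and credentials are placeholders, and SharePoint Online would need a different authentication flow.

  import requests
  from requests_ntlm import HttpNtlmAuth  # pip install requests requests_ntlm

  SITE = "https://sharepoint.example.com/sites/team"  # placeholder site URL

  # Ask the SharePoint 2013 REST endpoint for the site's lists.
  response = requests.get(
      f"{SITE}/_api/web/lists",
      auth=HttpNtlmAuth("DOMAIN\\someuser", "password"),  # placeholder credentials
      headers={"Accept": "application/json;odata=verbose"},
  )
  response.raise_for_status()

  # Print the title of each list, all executed client side.
  for sp_list in response.json()["d"]["results"]:
      print(sp_list["Title"])

The same endpoint is just as reachable from JavaScript running in the browser, which is the point: any competent web developer can build against it.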

So the answer is YES, you can continue developing in the old models, but the question to ask yourself is why you would: to protect a legacy investment maybe, but not beyond the next refresh of said legacy app, I would postulate!

Pre SharePoint 2013, organisations struggled to open up functionality to their business users and third party developers because of the impact on the SharePoint server itself; those organisations will benefit hugely. Previously it was a necessity to deploy code on the server, even with the sandbox options. This meant lengthy, protracted development cycles, high-friction IT involvement and multiple instances of development and staging environments, all of which = EXPENSIVE. No longer. Well, not strictly true: you can still do it the old way, as I say, but if you have a new app requirement or an old app past its sell-by date then why would you invest in such pain!

The SharePoint 2013 App model changes all of this, bringing great benefits to many user groups:

  • ISVs (Independent Software Vendors), as just one user group, should see it as a gateway to the billion-plus SharePoint/Office user audience, and a reason to start leveraging SharePoint and Office 365/SharePoint Online for that matter.
  • External software houses should see it as a new way of selling functionality into the biggest document management and collaboration ecosystem in the world.
  • For end user organisations and internal dev teams, the old world reasoning has been stripped away in favour of a new world approach: time-efficient, rapid-delivery, rich-experience, low or zero ownership cost commissioning and consumption of business IT functionality.

For example:

  • Leverage existing web app backend assets and develop new or additional interfaces in standard web technology. The CSOM functionality can be accessed using .NET/Silverlight for the power developers, and/or using the JavaScript API and REST APIs for any competent web developer.
  • Self-Hosted SharePoint 2013 Apps are hosted on a separate server and can therefore run on ANY operating system, which means supporting the widest scope of application server choices known to developers. This empowers the solution with whatever server-side functionality it wants to run on the self-hosted platform.
  • Azure Provisioned. This is the real 'secret sauce' (which isn't very secret actually, as it is very well documented with plenty of sample code on TechNet): an application designed to be deployed in Windows Azure in its own discrete Azure instance. When a user requests the app for their SharePoint instance, the app informs the user that it needs to be deployed in Azure, and that resource is then automatically billed to that user/organisation's SharePoint Online account. This can be used on premises as well as with SharePoint Online/Office 365.

The magic of the Azure Provisioned SharePoint 2013 application for any ISV is that there is NO cost associated with hosting it themselves. This is pure Cloud economics and dynamics coming into its own. For organisations and individual business units purchasing through an EA, similar economic benefits apply, not to mention the operational efficiency gains:

  • Instant access to functionality
  • No lengthy deployment process friction with IT departments
  • No procurement negotiation headaches with infrastructure (assuming the EA is in place, the rates are fixed and discounts already won).
  • Built-in upgrade process that saves IT departments' maintenance time.
  • SharePoint organisation app stores that can filter access to the SharePoint Public Store so that only vetted apps or trusted vendor applications are accessible.
  • Leverage of existing application investments: data and business logic layers can be reused, requiring only a front-end re-write to integrate with organisational SharePoint environments and authentication.

I repeat: there is the old way, or the future play…

For more information I would suggest a visit to the Microsoft Office blog site, 'Introducing apps for the new Office and SharePoint and the Office Store', before heading into TechNet and MSDN.

When it comes to PCs, 2013 says it takes two to Tango.


2013 was a year that resounded, as never before, with harbingers postulating the demise of tech (the PC being the whipping boy), the rise of tech (fondle'slabs, with the Apple iPad predictably leading the charge) and a totally compromised digital ecosystem. The latter point came as no surprise to those in the security industry, and the general consensus was that it was about time the message went mainstream!

The venerable PC, that stalwart, desk-hogging, passive piece of plastic and tin, is what the marketers would have us believe received a head shot from the Tablet brigade in 2014. Woe for Microsoft was the follow-up story on the theme, along with predictions of a Phablet'ous Fondle'slab future for all, led by none other than the Apple fanbois and the freebie brigade of the Google Spyware platform 'Android' community.

How refreshing the turn of a New Year can be: articles heralding a bright future for PCs when reflecting on the real world insights from 2013. 'Why tablets aren't replacing the PC anytime soon' echoes what we have been seeing in the business space. PCs have not disappeared; they have simply been overshadowed by the preferential spending of consumers on tablets and smartphones, and by those devices' reverse entry through the back and side doors into enterprise IT. As enterprises have held onto budgets off the back of some hard economic times, the desktops have simply not enjoyed a customary refresh, partly stimulated by the market hype around tablets, which may have had IT procurement sitting on the fence sweating PC assets while waiting to see what tablets actually meant for the enterprise.

The outcome, I believe, is business as usual, albeit there is now a new kid on the block: the touch tablet class of computing interface, which will remain largely an information consumption and email/text/video communication device, with information creation value limited to the kind historically familiar from kiosk-based terminals. Yes, there will always be the minority who, to prove a point (and an obsession), will incur RSI (repetitive strain injury) in their lengths to prove tablets can replace a PC.

CONSTRAINED TABLET PLATFORMS WILL NOT ERADICATE THE MULTITASKING POWER REQUIREMENTS OF THE PC DESKTOP!

The PC will exist in a tango with tablet devices (ranging from the iPad class through to the slab phones). Users demonstrated this over and over during 2013. Tablets are purchased to complement the PC; they are not replacing it wholesale as some headlines would have us believe.

The hybrid device is still to escape repeated birth pangs. Having test driven a few myself, I find myself back with a PC/notebook and a tablet/smartphone. Hmm… yup, that is four principal compute devices. The combination provides well for my principal work modes:

  • Road Warrior / Hot desk Office worker – Notebook & Smartphone.
  • Home Worker – Desktop PC & Tablet.
  • Day trip client site visits – Tablet & Smartphone.
  • Recreation – Games console.

OK, the last one is not work, but it does reinforce a serious point across this whole debate: 'horses for courses'. I have given up gaming on PCs because of the fallout games have had on PC performance and stability when I then need to rely on the machine for productivity work. Tablets, being closed systems, offer greater recreational stability and convenience, but they will remain platforms for utility games as they do not get near the horsepower needed for the class and quality of serious gaming experiences.

To summarise, therefore, I see the majority of business people still demanding a PC experience. Windows 8.1 starts to get close, but its lack of a fully featured, independent desktop configuration keeps it chained to the uncertain future of the hybrid device. If I were to take a guess I would say the hybrid device will remain a niche prospect, so roll on Windows 8.2 with the prospect of a proper return to desktop computing for Windows users.

On the device side, serious IT users will demand the power and dexterity of a PC desktop PLUS the kick back ease of information consumption and recreation of a Tablet class device.

Modern IT requires the two to tango, and it is that harmony of experience across them that I think Microsoft has the best chance of getting right. As for the rest, Google has a pumped-up smartphone platform, and Apple continues with its proprietary control freakery that will keep frustrating users who are becoming more IT savvy and demanding greater freedom with their systems.

‘Data Weights’ threaten Net Neutrality


‘Is obesity contagious?‘ I will leave you to ponder that one.

My segue being that, whatever the reality in the human realm, clearly something has contaminated our attitude to keeping our data trim. Just as airlines are now catering for, and considering charging for, 'large seats', so the Internet's aspirations of 'Net Neutrality' (the principle that all Internet traffic should be treated the same) will go the way of the dodo.

The race to put everything online, downloadable and updatable apparently has little regard for how bloated software has become and the 'Data Waits' (excuse the play on words) that ensue. Whilst the main high speed backbone and fibre arteries of the Internet are currently sustaining this deluge, the finer and more regional capillaries are showing signs of strain and, in the extremities, are incapable of servicing such volumes. Just like humans who over indulge, our domestic broadband arteries are furring up fast, and if we are to avoid the equivalent of a digital aneurysm we need to reflect on these indulgences, and quickly, because the 'Internet of Things' is going to get more demanding with alliances such as the one between Qualcomm, LG and the Linux Foundation.

I cannot dispute the value of the online delivery and update model for both vendor and end user; access anytime, anywhere you can get an Internet feed, plus the ability to push updates, is a highly viable model. Almost…

Now look at a sample of what vendors are pushing down your internet connections:

  • Apple's 6GB operating system download plus the App Store.
  • Microsoft with its Windows 8 update, a comparatively lightweight 1.5GB.
  • Google YouTube.
  • BT, Sky and Virgin demanding a minimum of 1Mb of your bandwidth.
  • Then the crowd of online gaming and virtual world companies vying for mindshare, with games that range from the veritable lightweights at 500MB to the grotesque at 64GB!

The increase in download volumes is pushing the envelope on even the fastest of home broadband connections; at the slower end of the market it is fostering veritable frustration and resentment as services take themselves out of reach. To use another analogy, you can only get so much water down a pipe of a certain diameter before things just back up and start failing. Or, if you prefer the car analogy, it is the experience of sitting on the motorway in a 10 mile tailback having just passed the last service station or junction, with the sands of time passing inexorably as you become overly familiar with the same view (the back of a vehicle not of your choosing), inhaling its waste as you burn your own ever more expensive fuel.

The broadband divide that exists is extreme. At the top end you have BT Infinity and cable users enjoying 40+Mbps of bandwidth, whilst at the other end of the scale you have users who can get no more than 2Mbps on a good day. The commercial vendors are clearly working to the top end of this spectrum and neglecting the rest.

I live 5 miles from a city, in a wonderful part of the South West of the United Kingdom, but the maximum bandwidth I can get is 3Mbps, which rarely averages out at more than 2Mbps. As a result I have to install and pay for two broadband connections and use a small enterprise grade router to bond these into a single connection to try to make this bearable. It helps, a bit, but completely voids any support from my broadband vendor: they wash their hands of any problems as soon as they learn that I am using my own router. A pity they do not recognise the increasing need, even from home users, for more advanced router functionality and supply small enterprise options that they will support.

My broadband experience is not unique in the regions of the UK: a usable 3Mbps on a good day. The reality, with a couple of PCs connected, is that any media console streaming, gaming or viewing becomes a futile exercise that just compounds the problem. With the new generation of operating systems, be it Apple's Mac OS or Windows 8, their increased dependency on Internet 'Cloud' services means the burden on domestic broadband is greater than ever. My connection quickly becomes saturated and useless for both the PCs and the media consoles as they contend for the meagre bandwidth resources.

So the great fanfare and splurge of media around Microsoft's new Xbox One and Sony's PlayStation is completely wasted on me: my current Xbox 360 has largely sat redundant since the increased move to streamed content, so the thought of investing in an even more gluttonous media device is absurd. Attempts to watch on demand films or catch-up TV have been a vain attempt to join the streaming masses, doing little more than spoiling evenings with delayed buffering and unwatchable quality. As for the spontaneity of gaming, this has become a strategically planned event, as the frequency and size of console and game updates dominate the system every time it is booted up or a game is spun up, spoiling any fun.

As Xbox One and PlayStation take download sizes into a new realm with 60GB+ game downloads, I can do little but laugh. OK, they say you only need to download 30% of this to get up and running, but that is still 20GB, and on my home network that would take over a week to just get up and running on a game. Furthermore it appears that these consoles also require a 'day 1 update' before you can even use them; for the Xbox One that day 1 update is apparently around 750MB, which means it can never realistically be a day 1 update, but more like day 2 or even day 3 for many. Having enquired, it appears these online games and their updates cannot be supplied offline, i.e. via a USB stick. When I challenged someone involved at Microsoft on this theme, I was informed that games will be available on DVD, so it is not all online, which is small comfort when you learn that these games demand instant multi-gigabyte updates and that much of the added value you pay for depends on online services: a short term win for long term pain.
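
To put some rough numbers on that, here is a back-of-envelope sketch of the raw transfer time for that initial 20GB chunk over the sort of 2Mbps line described above. It is an idealised floor: it assumes the line is dedicated to the download with no contention, retries or protocol overhead, none of which holds in a real household.

  # Idealised transfer time for a 20GB partial game download over a 2Mbps line.
  size_gb = 20
  line_mbps = 2                        # optimistic sustained rate for the rural line above

  size_megabits = size_gb * 8 * 1000   # GB -> megabits (decimal units for simplicity)
  hours = size_megabits / line_mbps / 3600
  print(f"{hours:.0f} hours of exclusive, uninterrupted use of the line")
  # Roughly 22 hours flat out; share the line with the rest of the house and download
  # only in the evenings and the elapsed time quickly stretches into days.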

The blinkered approach of vendors is inadvertently engineering in obsolescence from day one and alienating audiences, as the burden of remediation falls on end users' broadband. Yes, I would like to have an Xbox One, but I see no value in such an investment when I cannot use it to its full, or even 75% of its, potential.

What is at risk here is the very essence of Net Neutrality. Rules such as the Federal Communications Commission's 'Open Internet Order', and similar regulations or agreements banning service providers from blocking access to competitors or to certain web sites or content across their networks, are likely to be challenged.

With the continued eat-all-you-can attitude of vendors, in total disregard for the capital investments made by the telcos and ISPs in the network, commercial reality will see Net Neutrality disappearing and the equivalent of 'Internet tolls' appearing as network owners strive to realise value from their investments, or simply to preserve some capacity for their own use over the freeloaders. The impact will be wide as many of the big names in social media and other online services currently freeloading start to get throttled.

There is no single answer, but a blend including but not limited to:

  • More efficient developer coding practices. For example, how can Apple's '12 Days of Gifts' mobile app (sadly renamed this year to remove the reference to Christmas!) warrant a 65MB download when all it is doing is channelling a link to some freebie?
  • Better quality control and greater end user testing. The current attitude of vendors seems to be to subversively enrol the general public to do their final testing in the first few weeks of launch!
  • Provide end user replicable upgrade media, either through central retail outlets where users can go with a USB stick, or by supplying the latest patches at the same time as the DVD is bought.
  • ‘Data Weight’ bandwidth gluttons like YouTube and other freeloading video services need to start contributing towards the cost of the delivery network they use or get throttled during business hours.

Most importantly, though, this requires a change of attitude. Internet bandwidth has to be paid for by someone, and vendors need to start respecting that or we will all end up losers. As for the long term, I believe it is inevitable that data charges and/or prioritisation will start being legitimately applied by network owners, if only to preserve the commercial investments made in Cloud/utility computing, which depends on the availability and reliability of bandwidth. Just like our physical highways and roads: as they evolved they reached the point where costs had to be recovered from users to maintain their usability, and rules to prioritise certain grades of traffic are now not uncommon.
