Is this Bespoke Spoken For?

It is a truism to say that organizations, in particular governments, are relying more and more on technology to operate and deliver services.  In a 2014 blog, IM/IT Inventory, I explored which assets an organization should actively manage and track.  The following graphic was introduced as part of a larger concept known as the IM/IT Lifecycle.

Figure: IM/IT Inventory Model with sample mappings

To Include or Not to Include – that is the Catalog?

The purpose of the blog series was to try to answer the question of which bits of technology should, or should not, be included in things like an inventory of applications.  The conclusion was pretty much: if it has value, include it; and if it will take a material amount of effort to recreate, definitely include it.
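That conclusion can be read as a simple inclusion test. The sketch below is purely illustrative; the function and parameter names are mine, not something from the original blog series:

```python
# A minimal sketch of the catalog-inclusion rule described above.
# Names are illustrative, not from the source.

def include_in_catalog(has_value: bool, material_effort_to_recreate: bool) -> bool:
    """Include an item if it has value; definitely include it if
    recreating it would take a material amount of effort."""
    return has_value or material_effort_to_recreate
```

In other words, the only things left out of the catalog are those with neither value nor a material recreation cost.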

Alas, there is a definitional challenge when it comes to building application catalogs, namely: WHAT IS an application?  Some available definitions include:

  1. ITIL – Application: Software that provides functions which are required by an IT service. Each application may be part of more than one IT service. An application runs on one or more servers or clients. See also application management; application portfolio. (Axelos.com; glossary, accessed 2018-01-27).
  2. COBIT – Application: A computer program or set of programs that performs the processing of records for a specific function.  Scope Notes: Contrasts with systems programs, such as an operating system or network control program, and with utility programs, such as copy or sort. (www.isaca.org; glossary, accessed 2018-01-27).
  3. Techopedia – Application software is a program or group of programs designed for end users. These programs are divided into two classes: system software and application software. While system software consists of low-level programs that interact with computers at a basic level, application software resides above system software and includes applications such as database programs, word processors and spreadsheets. Application software may be grouped along with system software or published alone. Application software may simply be referred to as an application. (www.techopedia.org, accessed 2018-01-27).

Why Ownership Matters

All three definitions try to get their arms around the question of how long is a piece of string.  Unfortunately, the string is only going to get longer as applications pop up on smartphones, are meshed with AI, or are used by an organization but inhabit the cloud.  Nevertheless, the definition is important because no matter where, how or by whom the application is run, an organization must still keep internal accountability for its creation, maintenance, usage, information management and ultimate disposal.

Accountability is important because if service is increasingly delivered by technical means (including robots, AI, cloud and the Borg) then human accountability for its proper functioning and adherence to business objectives becomes more, not less, important. 

Another important reason to define an application is to provide ownership, both technical and business.  This concept will become both more difficult and more important as applications become completely embedded in business processes and wink in and out of existence because of accelerated development time frames.

But is ‘IT’ Spoken For?

One of the challenges of trying to assign ownership is that busy business users don’t want to be bothered with owning technical resources such as a flux capacitor or an anti-matter chamber (these exist, right?).  They want to focus on their widget production system without being bothered with tech-ese.  However, all the technical bits need to be ‘owned’ by someone, for no other reason than to be able to ask: ‘can we turn it off?’.  To assist with this, I am proposing a three-tier ownership structure.

1. Strategic.  Technology in this group is often at the system level, meaning that it is composed of two or more applications.  A strategic system is often long-lived and may directly or indirectly support multiple areas of the organization and product lines.

Typical owner: the most senior person in the organization who relies on the technology for their immediate accountability.  Resist the urge to give everything to the President/CEO – this is the day-to-day senior owner, e.g. the VP of Operations.  The owner may sometimes be known as the information controller.  I like this term as it focuses on the value of a system (information) rather than the means of a system (technical).

Examples:

  • ERP system including manufacturing, inventory, sales, etc.
  • Production or product support system

2. Tactical Application.  A bit of software that may support one or more business/technical needs.  Each tactical application produces a single output or a few outputs.  It may exist on its own and/or it may be part of a larger system.

Typical owner: because most entries in an organization’s application catalog fall into this category, it is not surprising that most owners are middle-level managers.  The biggest challenge in this category is deciding what is in or out; see the above blog for some thoughts on this.

Examples:

  • Specialty reporting tools for finance/marketing.
  • Stand-alone database to track a single business function.

3. System/Operations.  A bit of software that the techies are primarily aware of… until it does not work, of course.  This includes ITIL system programs but may also include utilities that business users rely on to operate a tactical application.

Typical owner: although these applications support business users, their owners are the technical folks.  In this way, the technical areas provide a service to the business/tactical applications.

Examples:

  • Print utility, payment control system integrated with a bank, network or firewall software.

Like most things in life, there are definite grey areas between the above three categories.  This is where professional judgement comes in: assigning a bit of code to the right category and therefore to the correct accountable person.  In making the assignment, I would suggest that an organization use the following rubric.

  1. Start High – Work Low: there is an inclination to make all technical things belong to the techies.  Therefore, all systems start out owned at the highest possible level and are only pushed down based on the following guidelines.
  2. Who Gets to Change or Turn IT Off: this guideline is the contrary of the above: who gets to make changes to the bit of software?  Ideally this should be pushed down as low as possible.  When in doubt, ask who is responsible for turning the dang thing off when the application has outlived its usefulness.
  3. The User/Customer, the Disenfranchised and the Veto: because the first two purposely try to move ownership either up or down an organization, the last consideration is the outcome.
    1. Who is nearest to the user/customer of the bit of software and can therefore make the best determination of whether cinnamon red on a burnt orange background is a good colour scheme for a web page?  Move ownership down when this factor is important.
    2. Who are the internal and external parties who are affected but are disconnected from the decision-making?  These may be users, or a few degrees separated from the user.  Move ownership up when this factor is important.
    3. Who has a veto, either explicitly or implicitly?  What are the internal and external politics of the bit of software?  If there are few such considerations, move ownership down; if there are many, move it up.
  4. The Times They are a-Changin’: use professional judgement to make a determination and then plan to revisit the decision periodically or ad hoc.  Hopefully ownership generally moves down as a system matures and becomes stable; a change in the environment may require it to go up for a bit.
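The rubric can be caricatured as a scoring routine. This is a toy sketch only: the 0-10 scale, the weights and the function name are my own illustration, and the blog explicitly calls for professional judgement rather than a formula.

```python
# A toy sketch of the ownership rubric above. Scale, weights and names are
# hypothetical illustrations; the rubric itself prescribes judgement.

LEVELS = ["Strategic", "Tactical Application", "System/Operations"]

def assign_owner_level(user_proximity: int,
                       disconnected_stakeholders: int,
                       veto_considerations: int) -> str:
    """Suggest an ownership tier for a bit of software.

    All inputs are rated 0-10:
      user_proximity            - closeness to the end user/customer (pushes DOWN)
      disconnected_stakeholders - affected parties removed from the decision (pushes UP)
      veto_considerations       - internal/external politics (pushes UP)
    """
    level = 0  # guideline 1: start high (Strategic) and work low
    level += user_proximity // 4  # guideline 3a: user proximity pushes down
    # guidelines 3b/3c: disconnected stakeholders and vetoes pull ownership back up
    level -= (disconnected_stakeholders + veto_considerations) // 8
    return LEVELS[max(0, min(level, len(LEVELS) - 1))]
```

For example, a user-facing utility with no politics lands in System/Operations, while a politically charged system with many affected outsiders stays Strategic.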

Conclusion and Other Ownership Constructs

The above takes a somewhat traditional view of ownership and purposely avoids such knotty issues as open source software or software as a service.  My expectation is that an organization with lots of applications to document (in particular governments) should spend time on these assets first and then turn its attention to the more esoteric.  If an organization really does not own any applications, then this discussion is a bit moot and academic anyway.

Let me know if the above model is useful and how you would improve it!


PRMM – How is That Planning Thing Working Out for You?

This is the second in a series of blogs on a Practical Risk Management Method or PRMM.  At the bottom of this blog is a refresher of the other steps.  This step’s premise is don’t separate your planning activities from your risk management activities.  In other words:

Planning = Risk Management. Planning is ultimately about managing uncertainty, which is a fancy name for Risk.  At this point you may be saying:

  1. Of Course: we already do this. Good on you, see you at the next blog!
  2. Great Idea: this may be incrementally more work during the planning process but ultimately less effort overall for the organization.
  3. What is This Planning Thing you Speak Of: hmmm, we may have identified your top risk.

I am afraid I can’t help you if you fall into the last category, but hopefully these blogs can help you with the first two.


Practical Risk Management Model

Is traditional risk management practical?  If so, why do so many organizations struggle to do it well?  As a quick refresher here are the three steps of virtually all risk management methods:

  1. Establish business objectives.
  2. Identify and quantify some or all of the risks that may prevent the organization from achieving these objectives.
  3. Figure out what you are going to do with the resulting risks (e.g. ignore, manage, transfer, assign owners, etc.).

A Practical Risk Management Method (PRMM)

What makes risk management impractical is that it is often a bolt-on and/or a parallel activity.  In addition, risk management often gets bogged down in too many risks and not enough value-add (see my blog “Guns, Telephone Books and Risk?” for more on this).  PRMM recommends the following steps:

  1. Planning = Risk Management. Incorporate risk management into existing operational, tactical and strategic planning; don’t separate the two.  Why?  Because planning is how organizations manage uncertainty, which is a fancy name for Risk.
  2. Are You Any Good at Change? Evaluate how well your organization responds to change (e.g. when uncertainty becomes certain).  When the unexpected happens, was your response chaotic and uncoordinated, or did it go more or less to plan?
  3. How Strong is your ARM? ARM, or Antifragile Risk Management, is a system that focuses on building robust and resilient organizations.  While step 2 above measures the organization in action, this step anticipates your organization’s resilience to uncertainty.
  4. A Certain Test of Uncertainty.  The organization’s risk/opportunity log is used to stress test the work done above.  Testing measures the robustness of the organization and the scope and reasonableness of the collected risks.  This is the traditional risk management step in PRMM.
  5. Don’t Stop. Modify/improve your plans and keep going.  All of the above activities are meant to be both periodic (e.g. the annual planning process) and continuous.

My next blog offers some thoughts on step 1 above: integrating risk management into the planning processes of the organization.

Citizen Centric Experience – PwC Event 2017-11-24

In my ongoing effort to remember the key notes from events and conferences I have attended, here are some thoughts on Rethinking the Customer/Citizen Experience (2017-11-24).  The overview blurb was:

We will look at transformation through the lens of both the ultimate end-user experience, and the internal employee perspective which inherently must be connected to successfully implement change.

Personas and Small Things Create Big Results

Two key themes came out of the event.  The first was the use of personas to aid in developing a good customer experience.  The second was the importance of implementing big things through a series of small steps.

Personas

Developing a persona is an attempt to understand the behaviours of customers/clients.  Personas are created to help frame development and make changes.  The recommendation is to limit their number, ideally to between three and six.  A single persona is then used to track a collective journey through a process.  One description of a persona is as follows (adapted from Agile Modeling):

A persona defines an archetypical user of a system.  The idea is that if you want to design effective software, then it needs to be designed for a specific person. Personas represent fictitious people which are based on your knowledge of real users. Unlike actors, personas are not roles which people play. In use case modeling actors represent the roles that users, and even other systems, can take with respect to your system. Actors are often documented by a sentence or two describing the role. Personas are different because they describe an archetypical instance of an actor. In a use case model we would have a Customer actor, yet with personas we would instead describe several different types of customers to help bring the idea to life.

It is quite common to see a page or two of documentation written for each persona. The goal is to bring your users to life by developing personas with real names, personalities, motivations, and often even a photo. In other words, a good persona is highly personalized. 

Personas and the Public Sector

According to PwC, personas have been used successfully in various public sector organizations, including the Canadian federal government.  My Spidey-risk-senses, however, tingled over two aspects:

  1. The volume of personas.  Governments do things that no one else wants to do; given the myriad of our product lines, can we realistically develop personas for the breadth of services provided?
  2. Personas as a Cause Célèbre. What is the risk of personas becoming a political nightmare? Our society has become increasingly sensitive and intolerant about labels. What are the risks of not having the right personas to meet a group’s demands, or of having to remove a persona because it does not match an external group’s political objectives?

Use Personas, But Tread Carefully

The answer is to use personas but create them through engagement with those they represent. As well, some political mettle is likely required to explain the role of a generic persona that provides a model or analogue of society at large (heck, is this not the description of a representative democracy!). Nevertheless, have an emergency risk mitigation plan for either the creation of politically mandated personas or the suppression/modification of personas for similar political imperatives.

Other than these risks, a customer-experience-focused technology methodology can be highly applicable to the public sector. Like most things though, the proof is in the execution and delivery. This leads us to the second part of the morning’s presentation.

Small Steps to Implement Big Change

I am a big fan of the Agile method (e.g. small successes building over a few weeks to a larger objective) versus waterfall.  My observation for governments, though, is that the larger organization has a hard time with Agile.  It is easier to understand and support a multi-year, multi-million dollar project (e.g. put a man on the moon by the end of the decade) than to approve the same objective as a series of short sprints (e.g. what do you mean you plan to have 520 sprints to get a man on the moon!).

Of course I am not being entirely fair to governments in saying this.  After all, it was Apollo ELEVEN that landed on the moon, Apollos ONE through TEN were examples of very LARGE sprints. Nevertheless, here is my thinking about any project:

  1. Large objectives are fine (moon, replacing an aging system, etc.)
  2. The objective must be broken into a series of steps (phases, projects, etc.)
  3. Each step in turn should not exceed the following:
    1. Six months in length.
    2. $500,000 in expenditure.
    3. 25 people for the entire project team.
    4. In addition, a step should only start upon the successful completion or closure of the prior step.
    5. Turnover is limited but also encouraged, e.g. no more than ~90% of the team stays the same from project to project, but no less than ~50% does.
  4. The above measures can be averaged across a system’s steps, thus:
    1. Subsequent phases can get larger, but only after smaller projects have successfully concluded.
    2. Professional judgement and risk tolerance are encouraged, so the above is a strong guideline and not a set of absolutes.
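The step limits above can be sketched as a checklist. The dollar, duration and head-count caps come straight from the list; the turnover band is my reading of the ‘limited but also encouraged’ guideline, and per point 4 these are guidelines, not absolutes:

```python
# A checklist sketch of the step-size guidelines above. The turnover band
# (50-90% of the team unchanged between steps) is my interpretation.

def check_step(months: float, cost_dollars: float,
               team_size: int, pct_team_unchanged: float) -> list[str]:
    """Return a list of guideline violations for a proposed project step."""
    issues = []
    if months > 6:
        issues.append("step exceeds six months")
    if cost_dollars > 500_000:
        issues.append("step exceeds $500,000")
    if team_size > 25:
        issues.append("team larger than 25 people")
    if pct_team_unchanged > 90:
        issues.append("too little turnover between steps")
    elif pct_team_unchanged < 50:
        issues.append("too much turnover between steps")
    return issues
```

A step that returns an empty list fits within the guidelines; anything else is a prompt for professional judgement, not an automatic rejection.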


Day 3 – Greying Population, AI and Extremism

This is the third list of potential disruptive factors that could influence the Canadian Public Service over the next decade or so.  See the previous blog for the first set of three and the Seven Days of Disruption blog for the entire set.  These are in support of the November 22, 2017 FMI Conference – Disruptive Writers.

  • Depopulation Waves (2015)
  • Evolving Artificial Intelligence (2015)
  • Geopolitical Realignment (2015) and Continued Global Violent Extremism (2015)

Depopulation Waves

Adapted from A.T. Kearney 2015: As global population growth slows, some countries’ populations are already shrinking. Global population growth is decelerating from 1.8 percent in the 1980–2000 period to just 1.1 percent in the 2000–2025 period. The three main drivers of depopulation are aging, international migration, and high mortality and morbidity rates.  Depopulation presents a range of challenges including labor shortages, weaker consumer demand, lower tax revenue and higher health care costs as the greying population lives longer.

Editor Note: An additional impact of the above is a massive transfer of wealth from the baby boomers to their children.  Of course, this wealth is only of value if the economic and social structures continue to exist to support it.

Evolving Artificial Intelligence

Adapted from A.T. Kearney 2015: Artificial intelligence (AI) is already used in sectors as distinct as finance, journalism, and engineering, and it continues to find new applications. For instance, AI is used in security trading dark pools, writes breaking news articles, and dominates humans in many games (such as chess, backgammon, Scrabble, and even Jeopardy!). It is also being leveraged in an attempt to cure cancer (as part of the Big Mechanism project being run by the Pentagon’s Defense Advanced Research Projects Agency [DARPA]) and make lethal decisions on the battlefield through its integration into the weapons systems of several countries. Increasing investment in deep learning technologies will enable AI to expand to even more sectors.

Editor Note: This topic has been explored in detail both in the business press and in fiction (anyone remember HAL from the movie 2001: A Space Odyssey?).

Geopolitical Realignment and Continued Global Violent Extremism

Adapted from A.T. Kearney 2015: Global economic and political power is increasingly diffuse, thus complicating leadership efforts within the international system. In the years since the Global Financial Crisis, the United States and other Western powers have receded from the global stage while rising regional powers have increased their political influence.  These changing power dynamics are decreasing the effectiveness of global political institutions. These institutions have transformed little in the past 60+ years and are failing to accommodate shifting power dynamics.  Global arms spending has grown in recent years after decades of decline following the conclusion of the Cold War.

Today’s most pressing issues, including security concerns, are global in nature, but cooperation has proved increasingly difficult in the current international environment. The international security architecture has been slow to address global terrorism and transnational organized crime. Moreover, lack of trust in governments and businesses complicates international efforts to prevent cyber threats.

Editor Note: Canada has been a direct participant in, and beneficiary of, the international movements of the second half of the 20th century.  From being a founding member of the United Nations and NATO to conceiving the concept of peacekeepers, Canada has been described as ‘punching above its weight’ in international affairs.

Day 2: Power, Cyber-Security and Renewables

This is the second list of potential disruptive factors that could influence the Canadian Public Service over the next decade or so.  See the previous blog for the first set of three and the Seven Days of Disruption blog for the entire set.  These are in support of the November 22, 2017 FMI Conference – Disruptive Writers.

  • Changing Nature of Power (2015)
  • Cyber Insecurity (2015)
  • Dawning of a new urban transportation age and the Canadian City (2017 and editor)

Changing Nature of Power (2015)

Adapted from A.T. Kearney 2015: In today’s world, power is increasingly fleeting and diffuse.  It is disseminated across individuals empowered by new technologies such as search engines and social media; to lower levels of government, including cities; and to start-ups and user-driven networked organizations. The rise of the global middle class is leading to greater individualism and expectations for service from their governments and from businesses, with consumers having never had a broader freedom of choice. Global trust in most institutions reached an all-time low in 2015, with governments continuing to be the least-trusted institution.

Editor note: So what?  The answer is that trust is the foundation of a society and an economy.  It profoundly reduces transaction costs in both.  Public institutions support this trust by enforcing social norms. In a strange twist, public institutions are sometimes powerless at the hands of a small but vocal group of individuals.

For example, an August 2017 discussion on free speech at Ryerson University was cancelled because the University was concerned about safety and security.  Described as domestic terrorism by one of the panelists, this is an example of a public institution (Ryerson) self-censoring thought and discussion, with a resulting degradation of its own power and trust.  While the individuals involved may congratulate themselves on forcing their viewpoints onto an entire institution, they should also recognize that they share a common heritage with the black, brown or red shirts who dominated politics a century ago.

Cyber Insecurity (2015)

Adapted from A.T. Kearney 2015: While the upside of the Internet is enormous, cyber threats continue to multiply. Estimates put global cyber crime losses at somewhere between $375 billion and $575 billion annually.  All connected devices and systems are vulnerable to attack. Computer systems, for example, are vulnerable to ransomware. The growing IoT also lacks strong security systems and is highly vulnerable to data theft.  To make matters more complex, the cyber arena is a growing domain of warfare between countries, in which businesses can be caught in the crossfire. The “Darknet” – parts of the “Deep Web” that are not discoverable by traditional search engines – remains a serious criminal threat, especially with the rise of crypto-currencies. It is cloaked with encryption software that provides anonymity to users. The Darknet is used as a source of cyber attacks, as well as a place to buy and sell ransomware and other cyber weapons. Another business risk of the Darknet is that it provides a marketplace for stolen data collected through cyber attacks, augmenting hackers’ motivation to continue conducting such attacks.

Editor note: Governments see online services as a way to provide better government with fewer resources.  Singapore, Scandinavia and the United Kingdom are acknowledged leaders in this effort, although the Canada Revenue Agency has also made great strides in allowing for a digital experience.  Nevertheless, governments face a number of challenges, including the Facebook effect, the Shiny-Bauble problem and resource asymmetry.

Facebook Effect – You are the Product

The Facebook effect is the problem of comparing government services to a for-profit service such as Facebook.  If Facebook can provide service xyz or make its offerings free, why can’t a government?  There are of course a number of answers to this.  Firstly, Facebook is not constrained by the same legal, moral and democratic frameworks.  Secondly, Facebook has an entirely different revenue model: for social media, the user is the product.  Your likes, shares and contributions build up a profile of you as a person which can then be monetized.  For governments, such monetization of a citizen would be outrageous.  Finally, Facebook can fail while governments are expected to be enduring.  If Facebook ceased to exist tomorrow it would be inconvenient, but a new social media product would take its place (anyone still using MySpace?).

The Shiny-Bauble Problem

Governments like to implement new things.  Ribbon cutting and shovel turning are good press and lead to the primary objective of any government – staying in power.  As a result, governments get distracted by Shiny Baubles: short-term effects or solutions that have little enduring value and may cause long-term harm to a society.  The worst thing about Shiny Baubles is that they may become entrenched in a society by a small group who benefit from the government largess.  In other words, the only thing worse than a Shiny Bauble is trying to turn one off.

Resource Asymmetry

Governments often have fewer and less capable resources to deliver digital services or fight cyber threats than their legitimate and illegitimate competitors.  The above Facebook discussion is one aspect of this resource asymmetry; consuming valuable government resources pursuing a Shiny Bauble is another.  A darker example of asymmetry is that the bad guys only need to find and exploit a single weakness in a government’s cyber environment, while a government must fight all threats while still trying to provide services.

Dawning of a new urban transportation age and age of renewables (2017 and editor)

Adapted from A.T. Kearney 2015: The global urban population has risen steadily over the past two decades. According to the United Nations (UN), there were about 2.9 billion urbanites in 2000, but that number has increased to 4.1 billion and will hit 4.5 billion in 2022. The number of megacities, defined as cities with 10 million or more inhabitants, rose from just 17 in 2000 to 29 in 2015, and the total is projected to rise to 36 by 2025. Hyper-urbanization is heightening congestion levels in cities around the world. The age of the automobile may be ending as cities adopt innovative new technologies and use more traditional mass and individual transit methods to enable smarter and more sustainable urban transportation and growth.

(Editor) Public transit and electric vehicles are two ways that urbanization will change the face of a city; telecommuting and promoting walk/bike-able cities are others.  This raises a challenge for Canadians, as much of our housing stock has been built since the mid-20th century, and the economics and logistics of all of these mitigating solutions will require significant government investment and coordination.  It will also require a change in cultural norms and expectations, as owning a home has become central to financial and personal-security well-being.

Seven Days of Disruption

On November 22, 2017, the Edmonton Chapter of the Financial Management Institute is running an event entitled ‘Disruptive Writers’.  In addition to hearing 3 great speakers discuss their books on either future disruptions or managing change, we will be playing a game called ‘Pin the Tale on the Disruption’: sort of a mini-Delphi of what participants at the conference think will be the biggest challenge to the Canadian Public Service between now and …. ohhhh, say …. 2025 (e.g. about 7 years hence).

The Source of Disruption

There is a variety of sources for the disruptions, but they are primarily based on the excellent work of A.T. Kearney, who have produced three Global Trends documents.

It’s tough to make predictions, especially about the future (Yogi Berra)

A word of caution about the difficulty of making predictions.  Inevitably something better or worse will have muscled all of the excellent possible futures out of the picture.  In addition, Black Swans and the unpredictable are a near certainty.  So, to my future self, I profusely apologize/acknowledge for being so absolutely wrong/right in naming the following future disruptions.

A Laundry List of Disruption (in alphabetical order)

  1. Accelerating Global Climate Change and the cost to mitigate (2015 and editor)
  2. Biotechnology: Frankenstein, Super-bugs and Super-cures (adapted from 2016 and editor)
  3. Canadian Competitiveness and Productivity (editor)
  4. Changing Nature of Power (2015)
  5. Cyber Insecurity (2015)
  6. Dawning of a new urban transportation age and the Canadian City (2017 and editor)
  7. Depopulation Waves (2015)
  8. Evolving Artificial Intelligence (2015)
  9. Geopolitical Realignment and Continued Global Violent Extremism (2015)
  10. Growing debt overhang (2017)
  11. Immigration and Changes to the Canadian Values and Characters (editor)
  12. Indigenous Power (editor)
  13. “Islandization” of the global economy (2017), NAFTA Negotiations and the rise of protectionism (editor)
  14. IT Revolution 2.0 and the Rise of the Machines (adapted from 2015)
  15. Post Consumerism (adapted from 2016)
  16. Quebec and Regional Tensions (editor)
  17. Resource and Commodity Supply, Demand and Price (adapted from 2015)
  18. Rising storm of populism; Canada and Cultural War in the Age of Trump and the Progressives (adapted from 2016 and editor)

Can We Monetize Government Services?

On November 7, I attended a session put on by the Canadian Institute called “Government Connects”. All levels of government spoke about the digital transformation of their services.  One of the speakers was the boss of all Alberta Public Servants, Marcia Nelson.  Marcia did a great job discussing what the Government of Alberta is doing to move its services online.  Certainly, Digital Government is nirvana for most governments, as they see cyberspace as a cheaper, faster and more effective way to deliver more services to citizens.

The User as the Product

Marcia, and many of the speakers, talked about the expectations of citizens relative to their other digital experiences: for example, the ease of creating a Facebook account, the functionality available via a GMail account, or how a LinkedIn profile is now almost as important as a resume or a business card.  The question from Marcia and others was: ‘how can governments compete with these products?’

The other side of these services is a profit motive.  Facebook makes it easy to set up a profile so it can target you with advertisements. Gmail wants you as an email client so it can scan your email and target its advertisements.  LinkedIn wants you to buy a premium membership, or at least get your eyeballs on its advertisements.  All of the above are examples of monetizing you as a user: you become their product.  Assuming informed consent, there is nothing wrong with monetization.  It is an economic transaction in which a slice of your privacy is exchanged for some really good services (like watching cat videos on Facebook, just saying).

The Digital Government Disadvantage

So where does government fit into this?  Firstly, there is the challenge of resources.  A quick scan of the September 2016 quarterly results of Facebook shows they have about $10.6USD Billion in physical and intangible assets*.  Included in this number is $5.1USD Billion of network and computer software assets (physical) in addition to $1.7USD Billion in technologies and patents (intangible).  In other words, Facebook has excellent technical infrastructure with which to offer a premium product for free to users.  And if they don’t have a good product now, their $30.3USD Billion in current assets (e.g. cash, securities, etc.) can be used to buy one.

* Note, for those accounting weenies out there, an interesting item on their balance sheet is ‘Acquired users’.  I could not readily find a definition for this term, but it appears that the users really are the Product!

Pity someone like the Government of Alberta (GoA): a $50 billion a year organization in which an estimated 2.5%, over $1 billion, is spent annually on Information Management and Technology (IMT) (adapted from: GoA IMT Plan, 2016 – 2021, p. 4). From the GoA’s most recent financial statements, they have $4.4CAD Billion (about $3USD Billion) of computer assets – hey, not bad – of which 78% is fully depreciated (i.e. over 5 years old) – YIKES! (adapted from GoA 2015-16 Financial Statements, p. 63).

Beyond relying on old technology, the GoA has to do a lot more than Facebook.  While Facebook can focus on social media, the GoA needs to run registry systems (e.g. vital statistics, land titles or driver’s licences), health systems (e.g. immunization, medical records), education (K-12, student finance, apprenticeship certificates), business functions (collecting taxes/royalties/fines) and human social functions (tracking children in foster care, supporting seniors, addressing homelessness).

The above is not a new story, but it is worth repeating every now and then that governments do things that no one else wants to do, with a tiny fraction of the resources of private industry.  Governments must also build and run systems that have almost no tolerance for failure.

Risk and Skin in the Game

To the last point, risk, this is where government is at a further disadvantage.  The original investors in Facebook backed a winner.  Those who put money into Myspace, Friendster or DIGG did not fare so well (huh, never heard of some of these? Check out the graveyard of failed social media infographic from the Search Engine Journal, January 25, 2013).  Nassim Nicholas Taleb calls investors (win or lose) people with ‘Skin in the Game‘ in his book Antifragile.  In contrast, public servants never have skin in the game.  We are always spending other people’s money, and our worst case for abject failure is forced retirement or perhaps being fired – maybe.

In other words, governments have both an advantage and a disadvantage around risk. The individuals involved do not have personal risk (advantage), but the organizations also lack the mind-focusing benefit of the ‘terror of failure’ (disadvantage).

The Monetization Continuum and How Can Governments ‘Compete’

The reality is that governments can’t and shouldn’t compete with the Facebooks of the world.  Creating a bleeding-edge user experience would be an inexcusable use of public funds and, without the terror of failure, would not likely succeed anyway.

But because thought exercises can lead to innovation, I am proposing the ‘Monetization Continuum‘ for governments: a government simply needs to pick a point on a line.  At one end (generally the status quo) is ‘Mind and Accept the Gap‘; at the other is ‘Full Monetization‘, with other options falling between these two.  Definitions and waypoints are provided below, but generally, if you are Singapore, you may be more comfortable having McDonald’s ads on your obesity website.  If you are at the other extreme – well, this is where Minding the Gap comes in.

Monetization Continuum

| End Points | Definition | Examples |
| --- | --- | --- |
| Mind and Accept the Gap | Governments acknowledge that they will lag and explain why to their citizens. Periodically, governments leap-frog into a stronger position. | Status Quo |
| Monetize | Fund digital government through advertising, premium membership or sponsorship revenue. Premium services could even be tax-deductible! | Faster border crossing via Nexus |

On the Subject of Not Likely

The reality is that governments will not, and should never, monetize their services.  There is a slippery slope of what is reasonable and in good taste.  Governments have something that Facebook or Google do not have – the coercive powers of taxation and legislation. Perhaps governments do not need to build systems when they can require organizations operating in their jurisdictions to offer the services.  There is a long tradition of this in the telecommunications world, for example.  This would not be monetizing users as products; this would be monetizing providers as servants for the public good.  Just a thought.

90 or 99 – That is the Strategic Question

Nassim Nicholas Taleb would have us believe that strategic planning is ‘superstitious babble’ (see Anti-fragile strategic planning).  In contrast, Kaplan and Norton make strategic planning a cornerstone of the Balanced Scorecard.  The reality is probably somewhere in the middle.

This blog, however, considers the question: how much time should an organization spend on planning?  When do you cut your losses for the year, and when should you conclude you are not doing enough?

How Much Is Enough?

On the one hand, strategic planning can become its own self-sustaining cottage industry.  Endless meetings are held and navels are closely examined with little to show for it.  On the other hand, the organization is so tied up in operations and ‘crisis du jour‘ that they wake up and discover the world (and even their organization) has completely changed around them.

What rule of thumb or heuristic can be used to know that you are doing enough Strategic Planning without decorating cottages?  My proposed answer is somewhere between 0.1% and 1.0% of an organization’s total productive hours. Although a full order of magnitude separates these values, a range is important because of the volatility of the environment an organization finds itself in.  Governments are likely at the low end (closer to 0.1%) and tech start-ups at the high end (1.0%).

For more on the basis for these heuristics, take a read of ‘A Ruling on 80, 90 and 99‘ for my thoughts and a review of such things as Vilfredo Pareto’s legacy and internet lurkers. A recap from this blog is as follows:

  • Pareto: 20% of an organization’s actions account for 80% of its results.
  • 90 Rule: 1% of the operational decisions are enacted by 9% of the organization affecting the remaining 90%.
  • 99 Rule: 0.1% of the strategic decisions are enacted by 0.9% of the organization which impacts the remaining 99%.

Thus the 99 Rule provides a minimum amount of time for an organization to consider strategic questions while the 90 rule provides a maximum amount of time.

Who Does What and What to Do with Your Time?

Consider a fictional organization of 1,000 people.  This is a medium-sized business, a typical government Ministry, or the workforce of a large town or a small city.  Assuming there are about 1,700 productive hours on average per year per employee (i.e. after vacation, training, sick time, etc.; see below for my guesstimate), the organization has 1,700,000 hours in total to allocate.  How much of this precious resource should be spent doing strategic planning?

I am recommending no less than 1,700 hours and no more than 17,000 hours in total.  ‘In total’ means counting all people involved in all aspects of the process.  Thus, if there is a one-hour planning meeting with 20 people in the room, that is 20 hours.  If 3 people each spent 2 full days preparing for that meeting, that is another 3 x 2 x 8 = 48 hours against the above budget.
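As a sketch of the arithmetic above (the headcount, hour figures and meeting sizes are the illustrative numbers from this blog, not from any real audit), the planning-hours budget and the running tally might look like this:

```python
# Illustrative sketch of the planning-hours budget described above.
# All figures (headcount, productive hours, meeting sizes) are hypothetical.

HEADCOUNT = 1_000
PRODUCTIVE_HOURS_PER_EMPLOYEE = 1_700

total_hours = HEADCOUNT * PRODUCTIVE_HOURS_PER_EMPLOYEE  # 1,700,000

# 99 Rule (0.1%) sets the floor; 90 Rule (1.0%) sets the ceiling.
floor_hours = total_hours * 0.001    # 1,700
ceiling_hours = total_hours * 0.01   # 17,000

def meeting_hours(attendees: int, duration_hours: float) -> float:
    """Person-hours consumed by a single planning activity."""
    return attendees * duration_hours

# Count every person-hour of planning against the budget.
spent = (
    meeting_hours(20, 1)       # one-hour meeting, 20 people = 20 hours
    + meeting_hours(3, 2 * 8)  # 3 people x 2 full days of prep = 48 hours
)

print(f"Budget: {floor_hours:,.0f} to {ceiling_hours:,.0f} hours")
print(f"Spent so far: {spent:,.0f} hours")
```

The point of the sketch is simply that the tally is additive across people: a meeting’s cost is its attendee count times its length, plus all the preparation around it.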

Measuring what Matters

The point of completing these measurements is to answer four fundamental questions:

  1. Is the organization doing enough strategic planning relative to the environment?
  2. Is the organization doing too much planning?
  3. Are we getting value for the investment of resources?
  4. How do we get better at the activities to reduce this total?

Is the organization doing enough strategic planning relative to the environment?

What happens if you discover you are not doing enough?  For example, your 1,000-person organization is spending only 100 hours per year on planning.  You may be very good and efficient – if so, bravo to you and your planning folks!  On the other hand, you may be missing opportunities, blindsided by challenges and mired in the current day’s crisis – in which case maybe a bit more effort is needed.

Is the organization doing too much planning?

The 1,000-person organization may also be in a Groundhog Day-esque hell of constant planning with not much to show for it.  Perhaps you have a full-time planning unit of five people who host dozens of senior management sessions, and the best they can do is produce an anemic planning document that is quickly forgotten.  In this case, measuring the 10 to 20 thousand hours of effort consumed for naught can lead to better approaches.

Are we getting value for the investment of resources?

The above two examples demonstrate how a bit of measurement may help you decide that 100 hours is more than sufficient or that 20,000 hours was money well spent.  The output of the planning process is… well, a plan.  More importantly, it is a culture of monitoring, planning and adapting to changing organizational and environmental circumstances.  Thus, setting an input target for planning, then measuring the quality of the output and the impact of the outcomes, can answer the question of whether the planning effort was resources well spent.

How do we get better at the activities to reduce this total?

The advantage of measuring, evaluating and reflecting on the planning effort is to get better at it.  Setting a target (be it 1.0% or 0.1%) is the first step of this activity; measuring against this target is the next.

Good luck with your planning efforts and let me know how much time your organization spends on its planning initiatives.

* How much Time Do You Have?

How much time does an organization have per annum to do things?  The answer is… it depends.  Here are two typical organizations.  The first is a medium-sized enterprise that works an 8-hour day and offers 3 weeks of vacation per year, in addition to sick days and training (e.g. for safety, regulatory compliance, etc.).  The second is a government Ministry that offers a 7.25-hour day and 5 weeks of vacation, plus sick and training days.

| Organization | Medium Size Company | Government Ministry |
| --- | --- | --- |
| Hours/day (1) | 8 hours | 7.25 hours |
| Work days per year (2) | 254 | 250 |
| Work hours per year | 2,032 | 1,812.5 |
| Avg vacation days x hours/day (3) | 120 (3 weeks) | 181.25 (5 weeks) |
| Avg sick days/year x hours/day (4) | 60 (7.5 days) | 54 (7.5 days) |
| Avg hours of learning/year (5) | 42 | 29 |
| Total productive hours/employee | 1,810 | 1,548.25 |
  1. Few professionals work only an 8-hour day, let alone a 7.25-hour one.  Nevertheless, everyone has non-productive time such as bathroom breaks, coffee refills and walking between buildings.  So I am leaving the average productive hours at 8 and 7.25 respectively.
  2. For a handy site for this calculation, see: www.workingdays.ca.  Note this includes 3 days of Christmas closure.
  3. 10 days is the minimum number of vacation days required to be given to an employee.  The average is a surprisingly difficult number to find (at least to a casual searcher).  15 days is based on an Expedia 2015 survey.
  4. Reference Statistics Canada: Days lost per worker by reason, by provinces.
  5. Sources vary.  I have chosen the high value for the for-profit organization as they often have stringent regulatory requirements for health and safety training.  For government I have chosen a medium value.  Sources:
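The table’s arithmetic can be checked with a short script.  The inputs are the table’s own figures (the underlying averages are the estimates discussed in notes 1 through 5); the only wrinkle is that the table rounds sick-day hours to the nearest hour.

```python
# Recomputes the "Total productive hours/employee" row from the table above.
# Inputs are the table's own figures; the averages themselves are estimates.

def productive_hours(hours_per_day, work_days, vacation_days,
                     sick_days, learning_hours):
    """Gross work hours minus vacation, sick and training time."""
    gross = hours_per_day * work_days
    vacation = vacation_days * hours_per_day
    # The table rounds sick-day hours to the nearest whole hour (54, not 54.375).
    sick = round(sick_days * hours_per_day)
    return gross - vacation - sick - learning_hours

company = productive_hours(8, 254, 15, 7.5, 42)      # medium size company
ministry = productive_hours(7.25, 250, 25, 7.5, 29)  # government Ministry

print(company, ministry)  # 1810 1548.25
```

Run it and both totals match the table, which is a useful sanity check whenever you swap in your own organization’s figures.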

Other Thoughts on Strategic Planning