Sunday, November 1, 2009

Globalize, Localize, Glocalize your KM

The human genome, all genomes: accumulated knowledge assets culled from the great evolutionary experience of being here for eons.

The genome is a master set of instructions at the molecular level directing the creation of an organism. Cabbage, whale, lichen, eagle, woman: every living thing on this earth has one.

The genome is a series of instruction sequences, each of which directs the creation of proteins from four chemical building blocks, read three at a time, mixed and recombined endlessly and packaged through an infinitely complex series of folding instructions to create the trillions of cells of which we are composed.

A revealing modern understanding of the genome is that it contains large sequences identical to those found in ancient (and so modern) bacteria and viruses. How can this be?

The genius of evolution has been to steal shamelessly from all and any organisms that can be conjugated with, dissected, plundered and subsumed to serve our own selfish ends: individual adaptation and so the improved potential for survival in a complex and changing world.

After all, a living being is simply a mechanism for a genetic sequence to perpetuate itself.

As a living knowledge document, our own genome represents the currently understood best practices in the building of human beings, along with a complex set of alternate versions of the current model that can be chosen depending on the context in which that being will need to survive.

Interpretation or expression of the most suitable alternate attributes (the accident of the gene if you will) is a gradual process that takes a number of generations to get underway and many more before the benefits begin to accrue to the organism. One example of this in the realm of our own recent genetic history is evident in our ability to derive nutritional value from the milk of the cow.

Until recently this milk was largely indigestible to adult humans, digestion requiring the ability to break down the sugar component of milk, lactose. To do this, we need an enzyme that splits the milk sugar into more easily absorbed compounds.

That enzyme is lactase, and the trait of continuing to produce it into adulthood is now common in northern European populations.

Lactase persistence spread through the human genome in the recent past in response to our growing practice of animal husbandry and the subsequent widespread availability of cow’s milk, a rich food source with great potential nutritional benefit if it could be metabolized. The trait is much less common in southern European and many African populations, giving rise to what we call lactose intolerance when these peoples consume milk and milk products.

This is a case study in localized adaptation of the genome and a good example of how the master plan can be adapted to meet the needs of the organism on a local basis.

Our own knowledge systems must also offer this kind of localization facility if they are to be adaptable, and so we come to the term Glocalization, a lumpy word popularized almost a decade ago by Thomas Friedman. It refers to the positive side of globalization: the ability of a culture or country to absorb the enriching influences of other cultures without being overwhelmed by them.

We can learn from this concept and extend it to inform how we design our KM systems, so that they reflect the difference between local and global needs found in the diverse knowledge ecosystems within a large organization.

That diversity may be geographic or merely functional; the response is the same: a native mechanism that lets each locality render the application’s core value proposition in its own terms while remaining conjoined with the global mission, contributing to, and drawing benefit from, the globalized mission as a whole.

In other words, our knowledge management platform strategy should be defined in holistic terms, reflecting the enterprise mission as articulated throughout the various levels of the organization. The practical realization of that strategy, however, must reflect in a systemic way the functional variance and the local, regional and top-level (HQ) needs for knowledge capture, distillation, storage, dissemination and reuse.

Let’s come back down to earth here and ask, “How do we glocalize a KM application?”

In a past experience at a large law firm we used a product-customization approach to achieve this with a good degree of success.

Here are some of the learnings from that experience, shared knowledge for you to reuse as you need to. Note that this is not an exhaustive list, merely an indicative one.

1) Functional localization considerations:

a) Our experience with different cultures and styles of local leadership / organization led us to offer several ways to navigate the knowledge stores.

One was purely keyword search, a simple enough concept initially but one that rapidly got complex as we considered all the synonyms that emerge when you cross different versions of English with non-English speakers’ understanding and use of English. Don’t forget support for the 5 Asian character sets in your application navigation.
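To make the idea concrete, here is a minimal Python sketch of query-time synonym expansion with simple Unicode folding. The SYNONYMS table and normalize() helper are invented for illustration, not the implementation we actually used, and proper CJK support would need a real tokenizer rather than the folding shown here.

```python
import unicodedata

# Illustrative synonym table; a real system would load per-locale
# tables from configuration rather than hard-coding them.
SYNONYMS = {
    "attorney": {"lawyer", "counsel", "solicitor"},
    "labour": {"labor"},   # UK vs US spelling
    "labor": {"labour"},
}

def normalize(term: str) -> str:
    # Fold case and accents so "Café" and "cafe" match; CJK text would
    # need a proper tokenizer, which is beyond this sketch.
    decomposed = unicodedata.normalize("NFKD", term.lower())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def expand_query(terms):
    # Expand each query term with its known synonyms before searching.
    expanded = set()
    for term in terms:
        base = normalize(term)
        expanded.add(base)
        expanded.update(normalize(s) for s in SYNONYMS.get(base, ()))
    return expanded

print(expand_query(["Attorney", "Labour"]))
# -> {'attorney', 'lawyer', 'counsel', 'solicitor', 'labour', 'labor'}
```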

b) Infrastructure smart enough to know who you are and where you be. Once those two things have been ascertained, the right data store on the right local server can be connected to the user. This should be factored into the architecture right at the beginning; otherwise you risk connecting the travelling user to a store thousands of frustrating miles away, or to a local store of no relevance whatsoever to that traveller.
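As a rough illustration of what “factored into the architecture” might look like, here is a small Python sketch of routing a user to a store based on identity and current location. The offices, regions and server names are entirely hypothetical; a real deployment would draw them from the directory service and network metadata.

```python
from dataclasses import dataclass

# Hypothetical regional servers and office-to-region mapping.
REGION_SERVERS = {
    "EMEA": "km-emea.example.internal",
    "APAC": "km-apac.example.internal",
    "AMER": "km-amer.example.internal",
}
HOME_REGION = {"London": "EMEA", "Hong Kong": "APAC", "New York": "AMER"}

@dataclass
class User:
    name: str
    home_office: str     # where the user's practice content lives
    current_region: str  # where the user is connecting from today

def pick_store(user: User) -> str:
    home = REGION_SERVERS[HOME_REGION[user.home_office]]
    nearby = REGION_SERVERS[user.current_region]
    # A travelling user should hit a nearby replica of their home store,
    # not a distant server and not an irrelevant local one.
    if nearby != home:
        return f"{nearby} (replica of {home})"
    return home

print(pick_store(User("A. Lawyer", "London", "APAC")))
# -> km-apac.example.internal (replica of km-emea.example.internal)
```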

c) Don’t forget the administrative functions. Remembering the power users who will actually do the heavy lifting of keeping the system valuable is key: they are the champions you will depend on to keep the engine running, and they must be consulted.

2) Content Localization:

a) A key element at the top of the content consideration list is the value of a local taxonomy. You can be 100% sure of two things: that it will not work well outside its locality, and that no other taxonomy will work well within it.

Be prepared to offer guidance on key nodes within the master taxonomy for inclusion in the local one, but support the initiative to grow local nodes beneath those top levels, particularly when the usage will be downwards and not rolled up.

This is an important recognition of how localized the currency of knowledge really is, as distinct from, say, financial reporting or key performance indicators. Always keep the following questions uppermost in the discussions: ‘Who will use this most?’ and ‘Who might use this and find it of benefit?’ A local taxonomy is intrinsically more valuable if it is also reasonably easy for a non-local to quickly assimilate and traverse.
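As a concrete illustration of growing local nodes beneath shared top levels, here is a minimal Python sketch of such a taxonomy tree. The node names are invented for the example and are not from the firm’s actual taxonomy.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)

    def add(self, name: str) -> "Node":
        child = Node(name)
        self.children.append(child)
        return child

# Master taxonomy: top-level nodes agreed globally.
master = Node("Knowledge")
corporate = master.add("Corporate")
master.add("Disputes")

# Local offices graft their own nodes beneath the agreed top levels;
# these are used downwards, locally, and are not rolled up globally.
corporate.add("GmbH formation (Germany)")
corporate.add("Schemes of arrangement (UK)")

def walk(node: Node, depth: int = 0) -> None:
    # Print the tree with indentation to show the local grafts.
    print("  " * depth + node.name)
    for child in node.children:
        walk(child, depth + 1)

walk(master)
```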

b) Local synonym support. This speaks for itself, but be prepared to face a fairly complex set of variations that need to be accommodated. Also anticipate that, even in a perfect implementation of localization, the locals themselves must be trained on the taxonomy and the synonyms, otherwise frustration will result.
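One way to accommodate those variations is to layer a local synonym table over a global default, with the local entries taking precedence. The short Python sketch below shows the idea with invented terms; it is not the mechanism we actually shipped.

```python
from collections import ChainMap

# Invented terms for illustration; real tables would be maintained by
# the local KM owners discussed later in this post.
GLOBAL_SYNONYMS = {
    "memorandum": ["memo", "note"],
    "acquisition": ["takeover", "purchase"],
}
UK_SYNONYMS = {
    "acquisition": ["takeover", "scheme of arrangement"],  # local override
    "counsel": ["barrister"],                               # local addition
}

def synonyms_for(local_table: dict, term: str) -> list:
    # Local entries shadow the global ones; unknown terms expand to nothing.
    return ChainMap(local_table, GLOBAL_SYNONYMS).get(term, [])

print(synonyms_for(UK_SYNONYMS, "acquisition"))  # local override wins
print(synonyms_for(UK_SYNONYMS, "memorandum"))   # falls back to the global table
```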

3) Security

Never underestimate the complexity and difficulty you may encounter trying to establish any sort of consistent application-level security solution for knowledge. There are so many views on this subject that the only consensus is the lack of one.

To illustrate this a little further: within a legal practice there are those who believe that the implicit confidentiality of the relationship between attorney and client explicitly forbids the sharing of work product resulting from that relationship.

In practical terms, this means that all legal work product in this paradigm is by default subject to an almost zero-sharing policy: it is locked down to all but the intimates of the matter. This is a stance often found in mainland European practices, or in practices where a high degree of long-term confidentiality is required, for example large arbitration or litigation practices. To some degree it reflects the current opt-in approach to personal data in the EC.

Contrary to this is the approach rooted in the idea that ‘information just wants to be free’, an approach that in legal KM tends to lead knowledge architects and KM practitioners (in the US and UK, for example) to push for open access to all documents that are not explicitly excluded. This is like the opt-out approach used in the US with regard to personal data.

This is the opposite approach, and right away you can see the enormous potential for inter- and intra-continental and inter-practice conflict.

Tread carefully here: your best ally is a fully articulated localization policy for security matters that puts local concerns ahead of global ones. Nothing else will allow you to keep the peace and keep moving forward.
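To make the opt-in versus opt-out distinction concrete, here is a small Python sketch of resolving document access under per-locale defaults, with the local stance taking precedence over a global default. The locales and policies shown are invented for illustration, not any firm’s actual rules.

```python
from enum import Enum

class Default(Enum):
    DENY = "opt-in"    # locked down unless explicitly shared
    ALLOW = "opt-out"  # open unless explicitly excluded

# Illustrative local stances; local policy is consulted before the
# global default, reflecting "local concerns ahead of global ones".
LOCAL_POLICY = {
    "DE": Default.DENY,   # mainland European stance: confidential by default
    "US": Default.ALLOW,  # open-access stance: shared unless excluded
    "UK": Default.ALLOW,
}
GLOBAL_DEFAULT = Default.DENY

def can_view(locale: str, explicitly_shared: bool, explicitly_excluded: bool) -> bool:
    policy = LOCAL_POLICY.get(locale, GLOBAL_DEFAULT)
    if policy is Default.DENY:
        return explicitly_shared        # opt-in: visible only if shared
    return not explicitly_excluded      # opt-out: visible unless excluded

print(can_view("DE", explicitly_shared=False, explicitly_excluded=False))  # False
print(can_view("US", explicitly_shared=False, explicitly_excluded=False))  # True
```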

You will also need to make sure you carefully articulate who should own these interpretations in each locale. Effective local syndication of the process and its results, plus full local representation in it, is very important.

KM focal points, KM specialists, paralegals, PSLs, or however your organization recognizes its local KM roles: these people will need to be the owners of this aspect of the system design.

Be sure they engage enthusiastically by providing them with the means for local expression in the product you will deliver.

Further discussion of this would probably be better suited to another forum, but some critical early thinking within your own organization around glocalization, ahead of any initiative taking shape, will do wonders to improve the potential for survival and success of the initiative as a whole.

Put another way, your initiative will need to be able to accurately reflect local concerns and requirements as a part of the global perspective in order to have a chance at survival in a complex and changing organization.
