Sunday 29 January 2017

Reading Snippet - In Defence of History (2)

The debate over whether history is a science

The author devotes a whole chapter purely to the question "is history a science?" without first explaining why the topic is so important. A hint was given in the previous chapter's walkthrough of the history of historical studies: a "scientific" approach to history has never escaped practitioners' purview.

The author raised a number of common reasons for thinking that historical studies is not a science, and then countered them. The first reason is that science demands a build-up of theories and knowledge, enriched from generation to generation, proceeding towards the "ultimate truth" about the natural world. Conventional observation suggests that historical studies is instead about new theories replacing older ones, and any claim of "building on top" amounts to the proposer arrogantly standing "at the end of history" and delivering a definitive verdict on his/her past and present colleagues.

The author simply suggests that historical theories and understanding do improve from generation to generation, and that previous generations' contributions are definitely picked up and utilised in creating new theories or enriching the narrative. In this way, the reason doesn't stand.

Another reason raised is that scientific observations are impartial, made within the strict confines of laboratories without contamination (or bias) from human emotions, whereas historical studies are necessarily observations of humans conducted by humans, in which personal judgements inevitably infuse any conclusion.

The author suggested that a historical study in which the researcher's emotions and perceptions are manifestly infused would have little academic value and make a bad read for the audience. Impartiality and evidence-based conclusions are just as important in historical studies, and forcibly applying present-day moral values to historical evaluation would only render the study ridiculous as soon as those moral values shift.

To the author, the strongest reason for "history is not a science" lies in the inability to generate "universal laws" from historical findings. Science can - the laws of gravity, the laws of thermodynamics etc. are repeatable, and exceptions can be explained. By contrast, historical findings can at best be generalised into approximately accurate observations, which are full of exceptions.

To this, the author's suggestion is that human beings are likely to study history and act on their learnings, such that the sequence of events is altered and the conditions for the generalised observations to play out are deconstructed. In this way an exception is created, but the pattern would otherwise have repeated had history not been looked into. A bit of a time-machine situation.

The purpose of this chapter seems to become clearer as the discussion continues beyond these arguments and counter-arguments, for the author went on to discuss how much literary effort should be expected of historical writing. Should it make a beautiful read like poetry and literature, or focus on "getting the facts accurate and right" and become science-journal-like documentation?

The author's discussion indicated that there is too much emphasis on scientific training - compiling evidence and putting forward new hypotheses - with insufficient effort in presenting history as a reader-friendly piece of work. There are writers who have attempted to insert metaphors and analogies to liven things up, but these were so deliberate that the meanings were lost and the literary value was negative. To the author, this is definitely a deficiency that needs to be addressed.

To this end, is historical studies a science or an art? It's a craft. To become a good historian, the scientific training that ensures accuracy and robustness of theories and understanding is important, as the myths behind "historical studies is not a science" have been rebutted by the author. However, it is not a pure science, given the differences identified and the importance of literary value in making a piece approachable. These characteristics mean a historical-studies apprentice must learn through practice, not just the absorption of theories and skills as in science - they need to absorb through osmosis, playing it out in a PhD programme and observing how the grand masters (the professors and great historians) practise their trade. Then they can make good decisions when the question of "what approach to take and which skills to utilise" arises in their next piece of study.


Wednesday 25 January 2017

Reading Snippet - In Defence of History

The never-ending fad of "The True View of History"

The author spent the introduction and Chapter I giving a view on how historians have seen history, in terms of its function and best-practice approach. It makes fun reading, as it gives a sense of the variation in how history as a subject defines itself.

Starting from the Middle Ages in Western civilisation, history was seen as a chronicle documenting God's deeds on Earth and how those deeds illustrated God's power. By the Age of Enlightenment, religious thought had given way to philosophy and morals, and history became real-life stories illustrating the good, the bad and the desirable traits. When revolutions broke out in the Age of Reason, history became nostalgic episodes looking back at times when the world was stable and rosy for the well-to-do, in contrast with a contemporary world that was uncertain, brutal and upside-down.

Throughout these three early stages, history was proverbial and subordinate to other subjects - be it divinity, political theory or philosophy - and facts only had to be right-ish to be sufficiently indicative. However, as the natural sciences developed, there was a greater call for history to "get the facts right", in the same way that natural science gradually improved its accuracy in quantifying and de-mystifying the world. This called for poring through primary documents and critiquing sources and previous writings to identify the sole truth and write it out. Such strenuous work could not be conducted as part of another subject - history earned its place as an independent subject requiring its own skills and expertise.

This 'scientific view' of history evolved from the 18th century onwards, alongside shifts in scientific philosophy. When raw materials were abundant but under-utilised in the 18th and 19th centuries, the emphasis was on combing through the sources and writing definitive tomes. When relativity emerged and everything became 'relative' to the perspective of the researcher, the scientific view adopted relativity and emphasised relativistic interpretation of historical events. When computers emerged and quantitative data became easily analysable, researchers shunned relative views in favour of 'big data analysis' to derive historical conclusions - let the algorithms and formulae tell the truth, and stop individual researchers' preferences from muddling it.

Apart from this fact-centric evolution, the implications of historical studies (and ultimately its purpose) also developed: as people started to get the historical facts "right", more value could be derived from the more solid base. Moving away from religious/moral/nostalgic purposes, the rise of nationalism in the late 19th and early 20th centuries organised history along national boundaries and used it to stir patriotism and justify borders. Between the two world wars, history was used to reflect on victories and defeats - to justify victories, or to explain why lands lost through defeat should be recovered; taken to the extreme in Nazi Germany, historical views were heavily doctored to justify the regime.

After the Second World War, the credibility of history as a discoverer and impartial evaluator of past events was shattered. There were calls for it to be once again subordinated to the social sciences, or to focus on "laying out the impartial and correct facts" while avoiding judgements or the advocacy of excessive interpretation.

In recent times, after the fall of Communism, history has been challenged by postmodernism on whether there can be any such thing as impartial history, since any theory or account of events is subject to the author's own preferences and values.

What the author presented, and is summarised above, is not the only interpretation of how historical studies have shifted over the past three to four centuries. Different authors would surely place the trends differently or raise other angles for observing such paradigm shifts. One thing that comes out of this history of history is how fads came and went.

At every stage and every age, there would be some theory or paradigm that became fashionable, and any doubter would be shot down by its numerous supporters. The paradigm would be heralded as the "one and only correct way" to do things, and the future could only be bright under its hegemony; all previous or alternative paradigms were deemed utterly wrong and fit to be discarded. There would be people raising questions, but they would be ignored; cracks and misfits would appear, but people were happy to overlook or whitewash them; addenda and modifications were regarded as corruptions of the way and would be defended against. When the fad shifted, all of a sudden the paradigm would look silly, and be abandoned.

What this episode reminds us is that at any time there will be countries, cultures, theories and political views that look invincible - the "true way forward" - whose weak points are hard to observe and whose opponents are all dismissed as "backward and stupid". But we should keep challenging, stay sceptical, and never stop suggesting alternatives or modifications. Otherwise we will be trapped by endless fads.



Monday 23 January 2017

Life snippet - of "sanctuary" and the concept of "co"

Two recent thoughts - "sanctuary" and "co"

Two thoughts have popped up in my mind quite a few times over the past week.

The first is "sanctuary", which is almost a response to the huge amount of noise generated by Prime Minister May's Hard Brexit declaration, the run-up to the inauguration of President Trump, and the farcical election campaign for the Chief Executive position of Hong Kong.

As these controversial events become a common occurrence globally - driven mainly by previous decades' development of libertarian politics and economics and the associated complacency towards the "less prosperous half" of society - commentaries, alternative views, analyses of root causes, 'magic bullet' solutions and the like will emerge and fill up our social media pages. Some may even become inputs to next-generation political and economic theories and frameworks, by virtue of mass propagation through various platforms leading to popular adoption that cannot be ignored or corrected.

With lots of social media and lots of voices on each, we are overwhelmed by information, snippets and opinions. We have to be selective, 'subscribing' (filtering) to channels that match our prevalent world view and mindset while deliberately ignoring ideas that run counter to our preferences. We spend a lot of time updating ourselves with information and patching up our existing views, but what about the enlightening act of being challenged with new frameworks, world views, mindsets and subject areas?

The social media environment lets us get broader and deeper within fixed dimensions, but doesn't really allow us to open up new ones. It also encourages us to satisfy ourselves with frequent but short feeds of easy-to-understand information packets, instead of less frequent but prolonged sessions with self-challenging narrations. Information packets are important and their value should surely be recognised, but perhaps not to the detriment of prolonged narrations? We all enjoy short YouTube videos, but a good movie or TV drama series has its place as well?

If we have created digital outlets that enable the creation and dissemination of information packets, should there not also be a balancing act for the creation and dissemination of long narrations? Electronic books have opened the way for digital narration materials, but should there be a 'sanctuary-like' digital space where we can take a break from ceaseless info packets and instead interact with narrations and work on them? If we treat digital books as the ingredients and a certain 'sanctuary' as the kitchen, can there be digital tools to work on those ingredients and create outputs as satisfying as writing a tweet or sharing an article and getting numerous likes and comments? Can this be a digital space that encourages us to take our time and exercise our minds without much noise, but one that also encourages us to 'take', 'give', 'build (our own capabilities and profile)' and 'contribute (to the wider community)' just like the prevailing social media? Just a thought.

Another interesting thought came from a BBC article. It reported that, given the double act of an ageing society and expensive housing in France, some schemes have paired old people with the young: the young rent from the old at a much reduced rate, and in turn provide 'assisted care' by keeping a watchful eye and performing simple tasks (without becoming de-facto carers, even in a part-time capacity).

For a long time, the market economy has served as an efficient clearing house between different demands and supplies, allowing sought-after goods & services to command a high price and stimulate supply; entrepreneurs and corporates are encouraged, through the lure of profits, to identify goods & services that can fetch a high price and/or high volume. A good mechanism that brings benefits to both customers and suppliers.

But as with all good mechanisms, the more we rely on it to solve the world's problems, the less it delivers the benefits we intended. A lot of non-market solutions, such as housewife (or househusband) services, have given way to both parents working to generate income/GDP, then pouring that income back into the economy to procure childcare or domestic services. National income has increased as the services are now routed through the market, but are the parents really better off? And as this market mindset prevails, other services face similar market-going pressure.

The result is a society that puts ever greater emphasis on the market, prices and wealth. In the past, with a range of non-market solutions available - such as a community helping each other - wealth and income were not the be-all and end-all: you didn't need a high income to live a meaningful life. But as non-market solutions disappear and everything costs money, people providing services through the non-market route come to be seen negatively - not generating income, not able to bring a wider range of goods & services into their homes (you can't bring beef to the table unless you are in a farming household), not saving up to extend goods & services from now into the future, and with no one left to exchange non-market services with (apart from you, few people in the community still offer non-market services to trade).

We end up giving more power to wealth and income, and we chase them by putting a price tag on everything: which subject you study at university has less to do with your interest and capability than with job prospects; where you buy your house has less to do with comfort than with the ability to sell at a profit in the future; people with a bit more income become the absolute 'elite', and income & wealth become the only measurements of a person's worth in society. Our human nature becomes second to our money-making nature, and that seems wrong.

Going back to the non-market economy may not be the best solution at this point in time, as the market's benefits should not be overlooked - earning now for the future is very attractive, as is the ability to attract goods & services that cannot be produced within a local community, not to mention the freeing of certain community constituents (e.g. wives) from the almost-mandatory provision of non-market services.

Non-market solutions emphasise balance within a local community, while market-based solutions emphasise a 'global solution' which, in theory, uses money to clear goods & services regardless of geographical distance - if the price is high enough, the goods & services will be provided. The market solution therefore discourages the formation of a community whose members have meaning to each other beyond the pure demand & supply of a particular good or service (or range of goods and services). You are either a provider in search of profit, or a customer ready to pay.

This latest BBC article - and indeed some social enterprises that try to get old or disabled people into employment - is refreshing not least because, despite adopting a market-based solution in principle, it builds the non-market "co" concept into the deal. The supplier of housing is also the demander of assisted-living services, and vice versa. The housing supplier clearly has the upper hand and can therefore charge money, but the distinctive value introduced by this mode of market-based solution is equally clear: without this deal, the housing supplier (and assisted-living customer) might have to contract the service from yet another party with a distant, single-purpose relationship, which would be expensive (they would need to familiarise themselves with the customer out of any other context, and carve out time for potential out-of-hours services) and less satisfying. This market-based solution clears two deals in one go, and the associated nature of the two deals means it cannot be replaced by a pure market solution.

This associated nature - the "co" concept - creates a super-local community relationship between the two parties, while retaining the benefit of a global market solution that brings suppliers and customers together from afar.

Could this type of modified market-based solution be expanded, so that we are less money-centric amid a market-dominated society? Just a thought.

Sunday 22 January 2017

Reading Snippet - Art History - a Very Short Introduction (3)

How to read into art - the concept of iconography


In the previous chapters, the author introduced us to a range of schools of thought that derive cultural and social observations, the psychological state of the artist, and concepts of beauty from artworks. In some ways these approaches may over-complicate the appreciation of the history of art, distracting us from the primary focus - appreciating a piece of artwork in its own right.

The pure aesthetics of an artwork are as important as the 'beyond-the-artwork' learnings that can be uncovered. An artwork contains messages the artist wished to visualise, and it requires viewers to see the symbols and signals that allow such messages to be identified, unpicked and understood without the artist's written or spoken annotation. This is where "iconography" joins the fold.

We can appreciate the aesthetics or historical context of an artwork without any knowledge of iconography, and that is how most of us appreciate 'popular' or unfamiliar artwork - who made it, the style/school, when it was made, why it is historically significant (commissioned by someone famous or for a specific occasion, by a famous artist, fetching a high price etc.). But if we start looking deeper into the artwork, we need to start thinking about why certain items were drawn into the picture, why some were omitted, why some were given distinctive colouring, and so on. This kind of analysis seeks to reverse-engineer why the artist composed the scene and rendered it in a particular way, and what messages he was trying to put in.

A very simple example is a Greek sculpture of a naked man with a diadem - the diadem itself signals that the man being carved was Apollo. Another example is the Dutch painting "Maid with a Milk Jug", where analysis revealed graffiti painted into the back wall and a stove heater painted in that was not in the draft sketch - through these items the artist wanted to convey a message of love and warmth; maybe the maid was somehow in love?

By putting on this "iconography" hat when appreciating artwork, we immediately add new dimensions to art appreciation and can start describing the artwork beyond who/when/what/how. The act connects our visual observation with art history (what signals and modes of presentation were prevalent in the artist's time?), social history (why were those signals used to denote those meanings? what issues were at the top of people's minds that required signalling?) and the artist's biography (what affected his choice and distribution of signals?). This is a skill that adds a lot of fun and richness to a simple exercise of art appreciation.

Art appreciation can be very simple, but by understanding the history of art and utilising its knowledge and skills, we can get a lot more out of it, for both leisure and professional purposes.

Monday 16 January 2017

Reading Snippet - Art History - a Very Short Introduction (2)

The historiography of history of art

This book expends page after page not just on the different schools of thought in the history of art, but also on how museums' settings and positioning contribute to the expression of the history of art.

A museum typically organises exhibits by period or style, which helps to reinforce the orthodox model of art history - the demarcation from period to period, and the boundaries between styles. Increasingly, however, museums arrange exhibitions around specific themes, to help explore certain topics in depth, e.g. the role of women in art (patrons, subjects, nudity, feminism etc.). In some cases, carefully set exhibitions can even give rise to new art periods - for example, at the turn of the 20th century a number of different art styles were exhibited under the theme of 'post-impressionism', and they all became known by that umbrella term despite vastly differing methodologies and philosophies.

Furthermore, museums help to update the orthodox model through their curation. When major museums purchase new artworks, they send a signal about which new styles may come to be viewed as classics and join their collections. In this way a very public adjustment is made to the model - in terms of new styles, new artists, new themes, or even new ways of appreciating and thinking about art that bring certain artworks into prominence.

A very ancient school of thought in art history seeks rational thinking and wisdom through artwork, meaning that work displaying intelligence through its topic and skill is superior. This subsumes artistry beneath thinking, and a 'beautiful' artwork per se would not be appreciated as much as a 'thoughtful' one. Towards modernity, however, two opposing schools of thought challenged this view. The first is Kant's view that art merits its own set of theories and methods of appreciation, without any need for thought and rationality to be expressed. This places the emphasis on studying what makes an artwork 'beautiful', putting artistry on a par with intelligence. The other is the Hegelian school of Zeitgeist, which treats the artwork as a manifestation of the age's spirit; the works should therefore be studied to understand that spirit and the cultures and society associated with it. A typical example is the Marxist argument that different social classes prefer different art styles and themes.

These opposing schools are not static; they have evolved over time to create new generations of critical theory, such as the incorporation of Freudian psychoanalysis. This allows an artwork to be analysed in association with its creator, understanding the artist's subconscious motivations and mentality through the work's subtle symbols. Another advance is the classification of an artwork into 'inside' elements and 'outside' elements such as the frame, the artist's fame and the price - with the 'outside' elements coming back to affect how the 'inside' is evaluated.

There are many ways in which the history of art can be presented, expressed and displayed. It is important to bear in mind, however, that 'history' as a subject has two sides: the historical events that took place, and how we, as present-day people, look at what happened in the past. This is why different schools of thought emerge - to respond to the present age's needs, anxieties and curiosities; why museums have changed their display method from time- or style-based to theme-based, as themes such as feminism need to be explored given today's political atmosphere; and why new periods, styles and artwork preferences emerge, as they make the narrative of art history smoother in the eyes of modern patrons, or fill its gaps.

History is therefore a living subject - not just because forgotten events are re-discovered and new events emerge, but because how the events should be studied and aligned also shifts as the world moves forward. History, in other words, changes with the current world.

Sunday 15 January 2017

Reading Snippet - Art History - a Very Short Introduction (1)

The art history orthodoxy and why it has become a problem

The author of this short guide to art history has no intention of enforcing the 'orthodox' trail of art history upon us. On the contrary, her wish is to inform us of this orthodox way, alongside its criticisms and the practical issues the model faces, so that readers walk away more educated, alert and interested than otherwise.

The orthodox model aims to transform an art lover into an art connoisseur - someone with an acquired taste for 'high art' who can instantly understand the value (artistic, social and economic) of an artwork and where that value comes from. To support this training, studies are centred on 'star artists' (geniuses) and time-period-bound styles (e.g. classical, impressionist etc.). Through these two basic methods, orthodox art history is presented as a linear progression from primitive to advanced styles, led or signified by geniuses. There are variations within this orthodox narrative, such as the concept of 'zeitgeist' ('the spirit of the age'), which emphasises artwork as a manifestation of its period's thinking. Nonetheless, the narrative reinforces the concept of 'classics' - the representative artists and artworks for each period.

The most obvious problem brought about by this model is its narrowness - within each time period there can only be one 'high art' style, manifested through a number of classic artists and artworks. Even where a fair number of styles exist, they may be grouped under an umbrella style such as post-impressionism. Other styles, thoughts and explorations are simply deemed secondary, not high enough, and fade into the background.

Another trouble is its linearity - this time-period approach faces pressure to portray art as ever-advancing. As we move into the next period, we expect improvements in technique and wisdom - going from flat 2D to perspective, from expressing realistic concepts to abstract ones, and so on. However, some period shifts may simply reflect a change in how people see art and in the anxieties that art should express, rather than a technical or conceptual improvement per se. This emphasis on improvement masks some genuine discussion and research.

The narrow and linear natures combined are tolerable when art history only has to deal with European art - a lineage can be 'forced' from Graeco-Roman art through the Middle Ages to the Renaissance and on to impressionism and modernism. But what about the art that co-exists with this classic textbook narrative, in South America, Asia and Australasia? It is ignored, sidelined or deemed 'primitive' and not worthy of consideration. Where it is considered, it is in terms of its influence on the mainstream narrative (e.g. Chinese porcelain entering Europe, or the impact of Chinese concepts on Romanticism). A wealth of art is thus pushed into the background, and when the orthodox model picks it up again, the mainstream artists or artworks take the glory for re-discovering styles, art narratives, or improving global high art. The focus of such re-discovery (or re-attribution, to be more precise) is then misplaced.

One final trouble: by emphasising the orthodox-sanctioned artists and styles, the corresponding artworks become sought after and their monetary value sky-rockets, and a positive-feedback loop then puts even more weight on these styles and artefacts in the next iteration of the orthodox narrative. The Mona Lisa attracts a bigger and bigger crowd, while other magnificent artworks of the same period become niche or ignored.

Such is the issue with the orthodox narrative in art history - but how can we be sure it isn't also an issue in other subject areas, or indeed in the many world models that somehow support a 'mainstream narrative'?

Saturday 7 January 2017

Work snippet - Why digital transformations fail (2)

Why digital transformations fail - Planning all at once, spending all at once

Digital transformations within traditional companies - especially the bigger ones, or those with some kind of monopolistic power over the market - are usually kick-started when start-ups attracting market attention and growth give them a wake-up call.

The kicks are usually quite brutal, and these companies are likely to panic. They start believing that the future is here, and that the start-ups will grab it all unless the company acquires similar digital capabilities today, or as close to today as possible. The 'next step' is predictable: a big announcement to the press that a lot of money will be invested in technology, and that a 'digital division' will be responsible for all these initiatives.

When you give someone a large sum of money to spend with little direction or control, they will squander it, and the same goes for digital transformation. The new division has the board's blessing to be bold, experimental and start-up-like, yet it lacks a start-up's natural constraints of limited funding and pressure to generate revenue (or traction) soon. A lot of projects get scoped: ideas so bold and wild that start-ups dare not touch them, ideas that overlap with each other, and ideas requiring so much work on both the front end and the platform that only deep-pocketed companies could afford them. If a start-up launches a product, it will be copied; if a new technology is emerging and praised by some tech blogger, a product will be scoped around it to make the division sound bleeding-edge; if the directors have pet ideas, they will be turned into projects; if someone within the company has an idea or a problem statement, yet another project will be established. With healthy funding, there is little need to prioritise, rationalise and stay focused.

With so many projects starting at the same time, a large team is planned and hired. With lots of seats to fill in a short time, wages get inflated or contractors are brought in. A hierarchy riddled with senior positions emerges - each project has multiple engineers with a lead, UX and UI designers with a head of design, and so on; the leads are led by heads-of, the heads-of by directors, and so on. And when a lot of people join without common processes and culture in place, you need a lot of consultants and coaches on top of the already huge team.


The cost of this transformation approach is heavy. With lots of projects running in parallel, the division roadmap becomes very complex to manage and keep on track: different projects may require the same platform to be configured differently, so whose view should prevail? Overlapping projects naturally create tension; several projects about to launch at the same time may stretch the supporting departments; and if one project is delayed, it can have a knock-on effect on all the others, suddenly paralysing the whole division.

With lots of people hired externally within a short time, it is equally tricky to keep everyone happy and productive. Without a strong culture and framework to galvanise team members, people stick to old practices from previous workplaces, and collaboration between and within teams becomes hard; when the directors try to harmonise culture and processes, those entrenched practices resist change; a new division is virgin territory in which everyone fights to maximise their power, so any minor issue quickly escalates into a political incident; and every team and technical function tries to prove its value and gain overall control of a project, or even the division. The directors may want to stamp out all these troubles, but with so many players they are overwhelmed by the variety, scale and complexity.

One more thing to mention: lots of projects plus lots of people can only mean programme management is next to impossible, with each project's progress easily delayed by the others' minor hiccups, and each team member's utilisation hard to keep high. Delivery costs go up, and project timelines become wildly inaccurate.

Sounds like chaos? Even with the best senior management at the helm and a hiring process rigorously controlled to get only the most capable hands, it will still happen, guaranteed. The vast funds are quickly spent, results are not delivered, team members grow unhappy and start churning, and the board quickly realises that the value-for-money is poor.

The hellish situation doesn't end there. With funds spent and projects late, the board gets angry and wants to scale back or get the shop in order. People have to be let go to 'start again', projects have to be pored over and prioritised, and this introduces either fear or anger: people either pack their bags or try to demonstrate their projects' value (against the other teams). With no funding, half-built products and severe in-fighting, it is hard even to call a 'time out' and re-group - not to mention getting the inflated wages back down to market rate.

Monday 2 January 2017

Work snippet - why digital transformations fail (1)

Why digital transformations fail - the superficial transformation


A lot of traditional companies are undergoing digital transformation at present, and most will produce disappointing results in time. I have personally lived through a number of these, and have also heard or witnessed accounts from colleagues associated with other companies' transformations.

These disappointing results are not just 'failures' in the sense of projects that had to be abandoned. Most of these companies do not intend to give up, but they are forced to truncate their original vision, lengthen the transformation timeline, or kill some projects as the vision changes. These are bread-and-butter changes, but given the high costs of digital transformation and the expectations attached to it, the experience is rather traumatic from top to bottom.

In summary: there are many routes to massive failure, but they can be avoided. Let's discuss one such route here.

A traditional company often becomes anxious about digital transformation when it sees some start-up rising through the ranks, praised and sung about in the press and among industry players. It sees the websites, the apps, the nice user experiences (UX) and new-generation graphics, and believes that this is the future and the start-up will be taking all the business soon enough.

The CXOs' response is to buy that start-up, and failing that, to launch a digital transformation - if we can't buy it, we will replicate it. This type of transformation is characterised by a digital division heavily weighted towards front-end resources: UX designers, web and app developers, online and social media marketing agents. If they see a nice website or a nice app, they copy it; if they see several nice websites and apps, they quickly analyse the features and decide which ones to build into their own - though including them all is the typical decision.

This kind of transformation is fast and exciting - creating a new website or app is comparatively simple, and in no time there is a digital answer to the start-ups. But the downside is immense: the company does not know what it is doing, nor what should come next. You have an app and you have put up some social media marketing, but are the behind-the-scenes operations able to help the app perform? Is the backend digital infrastructure ready to support the app through future iterations? Do the CXOs understand what metrics to track and what counts as success or failure? Does the company understand how much autonomy the app team should be given to grow it, and how many resources should go into its R&D? These are the things that MUST come with a real digital transformation; merely having a customer-facing product is not digital transformation, it's digital pretension. The company has an app or website, but it has not acquired a complete business unit like its other divisions.

This reminds me of a historical comparison. Back in the mid-19th century, both China and Japan jumped onto the 'reform' boat to become competitive on the world stage, and both decided to modernise their militaries from swords-and-arrows to guns-and-cannons. The Chinese delegation went to Europe and spent a lot of money on the latest munitions machinery and designs; the Japanese delegation was much poorer but wiser. It spent a small portion of its thin budget on machinery, then spent time studying the latest military strategies, the resourcing and training methodologies for fighting personnel, and the logistics and supply chains needed to support the weapons and soldiers. Half a century later, the two countries came head-to-head in a naval campaign, and Japan wiped out the Chinese navy.

If a company focuses only on the superficial - what the customers see - instead of looking behind the scenes at all the mechanisms and expertise that make the superficial shine, then it is not too far off being China in this historical metaphor.

Sunday 1 January 2017

Work Snippet - Build capabilities, not skills & knowledge

Build capabilities, not skills & knowledge

To help workers upskill and fulfil their career ambitions, more and more companies are setting up "learning & development" teams within their HR divisions. There is such a team in my current workplace, but I haven't found them very effective.

Make no mistake, they are very good people who probe you to reflect on your deeper desires & incentives, thereby helping you find out what exactly you want (or are suited to); they then suggest courses, learn-on-the-job opportunities, or changes to work practices to start you developing. But behind all these exercises, their minds revolve around two themes: what skills do you need to develop, and what knowledge should you acquire. And this is the problem - answering these questions will not get us to where we want to be, career- and life-wise.

We now live in an age in which skills training and knowledge acquisition are easy to access - you can search online, subscribe to specialist sites, or download specially designed & gamified apps. Merely learning new skills & knowledge is a hygiene factor, and won't get you anywhere in your next job interview.

If you have been an interviewer for your company, you will find that every CV is now clogged with skills and knowledge - the days when people learned nothing beyond what was prescribed in their university courses are gone. But you will also find that the people short-listed for the next round are usually not the candidates with the most skills & knowledge; they are instead the candidates who, upon interrogation, you find the team will be happy to work with, who can handle the scenarios and challenges that will arise in the particular office/company, and who can contribute to a team or project's success by utilising their skills & knowledge and amalgamating them with those of other team members.

As the knowledge economy moves up a level through the introduction of web services, artificial intelligence and robotics, processing-based jobs are being replaced - general-ledger clerks reconciling receipts, solicitors processing standard divorces or house purchases, banking analysts processing mortgage applications. These jobs require professional skills & knowledge, and once you have acquired them you know you can do the job after some company-specific adjustment. In work environments filled with such jobs, and with management roles that organise & monitor them, asking 'what skills and knowledge do you need' is the model question, triggering the most succinct and useful answers.

As processing-based jobs are killed off, a new type of job surfaces: "movie-making-type" jobs, which have no set script for when to engage which skills & knowledge and how they should be utilised. Instead, as a crew of workers is set upon a fluid project that must adjust to ever-changing external situations and emerging understanding of the customers and the market, each person needs to identify their role within the team and the results they need to deliver in each phase, then pick or acquire the skills & knowledge that will enable them to play that role and deliver their portion of the results. If the requisite skills & knowledge are not to hand, go and get them through those convenient channels.

What this means is that skills & knowledge per se are no longer the primary distinguishers separating the top candidates from the bulk. What matters is the candidates' capabilities: what can they get done, or contribute, when working in a team under the circumstances that might arise during a corporate or project lifecycle? They may lack certain skills & knowledge, but if they can acquire, assimilate & utilise them quickly, or have other skills & knowledge that can bring about the same results, or can enable other team members to deliver those results, then not having those skills is no issue at all.

These changes have been subtly happening in schools and universities for quite some time - instead of asking students to sit a test or write a dissertation, they now do projects and group work; internships and work experience are expected of university students and a good-to-have for secondary school students. Under the old 'skills & knowledge' model, these changes are about equipping students with 'communication' and 'collaboration' skills that cannot be attained or practised through sit-and-write exams; but viewed from our new angle, they are better interpreted as a recognition that acquiring skills & knowledge is insufficient - ensuring that students acquire the capability to utilise skills & knowledge and achieve the end result with their fellow team members is the crux of education.

The "learning & development" teams in companies should be asking "what capabilities do you need to acquire", and allow them to participate in different projects and take up new roles within those projects to acquire these capabilities. A new workplace situation calls for a new set of questions to be asked and reflected upon. These teams have their own learning & development to do, it seems.