On 8/31/20, the Dow Jones Industrial Average went through a change in composition. Out went Exxon, Pfizer and Raytheon. In came Salesforce.com, Amgen and Honeywell. This is the 8th time the Index components have changed this decade, the 13th time since 2000 and the 55th change since the index was created in 1896. So changes are not uncommon. But are they meaningful? Ask any academic and you’ll get a resounding “NO.” There are no stated criteria for selection, no metrics for inclusion, no breadth to the number of companies (which has changed significantly over time), and not even a weighting for market capitalization! The DJIA has no relationship to “the market,” which could well be measured better by the S&P 500 or the Russell 3000. And it doesn’t even link to any specific industry! To academics, “the Dow” is just a random number that reflects nothing worth measuring.
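The weighting complaint is concrete: the DJIA is price-weighted (each component contributes its share price, divided by an adjusted divisor), while indexes like the S&P 500 weight by market capitalization. A toy sketch with entirely hypothetical companies and numbers shows how differently the two methods treat the same stocks:

```python
# Toy comparison: price-weighted index (like the DJIA) vs. cap-weighted
# index (like the S&P 500). All companies and figures are hypothetical.
stocks = [
    # (name, price per share, shares outstanding)
    ("BigCheap",  50.0, 4_000_000_000),  # low price, enormous company
    ("SmallDear", 400.0,   50_000_000),  # high price, small company
]

# Price-weighted: sum of share prices over a divisor. The high-priced
# stock dominates, regardless of company size.
DIVISOR = 2  # the real DJIA uses a much smaller, continually adjusted divisor
price_weighted = sum(price for _, price, _ in stocks) / DIVISOR

# Cap-weighted: each company's weight is its share of total market cap.
caps = {name: price * shares for name, price, shares in stocks}
total_cap = sum(caps.values())
weights = {name: cap / total_cap for name, cap in caps.items()}

print(price_weighted)                 # 225.0
print(round(weights["BigCheap"], 3))  # 0.909 -- the huge company dominates
```

Under price weighting the small, expensive stock drives most of the index's moves; under cap weighting the large company does. This is the mechanical reason academics dismiss the DJIA as unrepresentative.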
The DJIA is (currently) a group of 30 stocks selected by the editors of Dow Jones (publisher of the Wall Street Journal, owned by News Corp – which also owns Fox News – and controlled by Rupert Murdoch). Despite the lack of respect from academics and money managers, because of its age – and the prestige of being selected by these editors – inclusion in the DJIA has been somewhat revered. Think of it as an “editorial award of achievement” for size, profitability and perceived stability. For these reasons, over time many investors have believed the index represents a low-risk way to invest in corporations and grow their wealth.
So the daily value of the DJIA is pretty much meaningless. And being on the DJIA is also pretty much meaningless. But investors have followed this index every trading day for 124 years. So it is at least interesting. And that’s because it tracks what these editors think are the important very long-term economic trends.
The original Index composition looks NOTHING like 2020: American Cotton Oil Company, American Spirits Manufacturing Company, American Sugar Refining Company, American Tobacco Company, Chicago Gas Light and Coke, General Electric, Laclede Gas, National Lead, Pacific Mail Steamship Company, Tennessee Coal Iron and Railroad Company, United States Cordage Company and United States Leather Company. Familiar household names? This initial list represents the era in 1896 – an agrarian economy just on the cusp of coming into the industrial age. Not forward looking, but rather reflective of what had historically been the biggest parts of the economy.
Over 124 years lots of companies left the DJIA and were replaced – and many replacements themselves later left. Some came on, went off, and came back on again – such as AT&T, Exxon (formerly Standard Oil of New Jersey) and Chevron (formerly Standard Oil of California). Even the vaunted GE was inducted in 1899, only to be removed in 1901 – then added back in 1907, where it stayed until CEO Jeff Immelt imploded the company and it was removed for good in 2018.
But there has been a theme to the changes. Originally, the index was largely agricultural companies. As the economy changed, the Index rotated into commodity companies like gas, coal, copper and nickel – the materials leading to a new era of tools. This gave way to component manufacturers, dominated by the big steel companies, which created the industrial era. Which, of course, led to big manufacturing companies like 3M and IBM. And, along the way, there was recognition for growth in new parts of the economy, by adding consumer goods companies like P&G, Coca-Cola, McDonald’s, Kraft (since removed) and Nike, along with retailers like Sears (later removed), Walmart, Home Depot and Walgreens. The massively important role of financial services in the economy was reflected by including Travelers, JPMorgan Chase, American Express, Visa and Goldman Sachs. And as health care advanced, the Index added pharmaceutical companies like Pfizer, Johnson & Johnson and Merck.
Obviously, the word “industrial” no longer has any meaning in the Dow Jones Industrial Average.
Reading across the long history of the DJIA, one recognizes the editors’ willingness to try to reflect what was growing in the American economy – but in a lagging way: not selecting companies too early, preferring instead to see that they made a big difference and remained important for many years. And a tendency to keep them on the index long after the bloom is off the rose – like retaining Kraft until 2008, and still holding onto P&G and Coke today.
The bias has always been to be careful about adding companies, lest they not be sustainable, and to not judge too hastily the demise of once-great companies. Disney wasn’t added until 1991, long after it was an established entertainment leader. Boeing was added in 1987, after pioneering aviation for 30 years. Microsoft was added in 1999, well after it had won the PC war. Thus, the index is a “lagging index.” It reflects a big chunk of what was great, while slowly adding what has recently been great – and never moving too quickly to add companies that just might be tomorrow’s leaders.
Sears, added in 1924, wasn’t removed until 1999, by which time its viability was already questionable. Philip Morris (which became Altria) was added in 1985, and hung around until 2008 – long after we knew cigarettes were deadly and leadership didn’t know how to do anything else. Even today we see that United Aircraft, added in 1939, became United Technologies in 1976 and then, via merger, Raytheon in 2020 – and now it has been removed, as all things aircraft screech to a pandemic halt. And Boeing is still on the Index despite the 737 fiasco and plunging sales. IBM was added in 1939, and through the 1970s it was a leader in office equipment, creating the computer industry. But after years of declining sales and profits IBM isn’t really relevant any longer, yet it is still on the Index.
As for adding growing stars, GM stayed on the Index until it went bankrupt, yet Tesla has yet to merit consideration (largely due to its lack of profit history). Likewise, Walmart remains even though the “big gun” in retail is obviously Amazon.com (another company lacking the size and longevity of profits the editors like). McDonald’s stays on the list, despite no growth for years and even as the Board investigates its HR department for hiding abhorrent leadership behavior – while Starbucks is eschewed. And Cisco is there, while we all use Zoom for pandemic-driven virtual meetings.
So what can we take away from today’s changes? First, the Index has changed dramatically over 20 years to reflect electronic technology. IBM, Microsoft and Apple are now joined by Salesforce. Pharma company Pfizer is being replaced by bio-pharma company Amgen in a nod to the future, although almost 40 years after Genentech went public. Exxon disappears as oil prices briefly fall below zero, demand declines globally and electric cars are on the cusp of taking market leadership. And conglomerate Honeywell is added just to show the editors still think conglomerates matter – even if GE has nearly disintegrated.
Is any of this meaningful? I don’t really think so. As an award for past performance, it’s a nice token to make the list. As business leaders, however, we need to be a LOT more concerned about developing businesses for the future, based on trends, than is indicated by the components of the DJIA. Driving revenue growth and higher margins comes from doing the next big thing, not the last big thing. And, as investors, if you want to make outsized returns you have to know that a basket of largely laggards (Apple, Microsoft and Salesforce excepted) is not the way to build your retirement nest egg. Instead, you have to invest in companies that are creating the future, making the trends a reality for businesses and consumers. Think FAANG.
Nonetheless, after 124 years it is still sort of interesting. I guess most of us do still care what the editors of big news companies think.
TRENDS MATTER. If you align with trends your business can do GREAT! Are you aligned with trends? What are the threats and opportunities in your strategy and markets? Do you need an outsider to assess what you don’t know you don’t know? You’ll be surprised how valuable an inexpensive assessment can be for your future business (https://adamhartung.com/assessments/)
Give us a call or send an email. Adam@sparkpartners.com
Americans take it for granted that all currencies are measured against the US Dollar. It’s been that way since WWII, so they just expect it will always be that way. But, things have a way of changing.
In this pandemic the US Federal Reserve is printing money as fast as possible to help prop up the economy. That’s better than the alternative, which would be another Great Depression. But, eventually we have to create value via goods and services to put value in those dollars, or they will be worth a whole lot less. In other words, if we don’t change our fiscal policy to improve production of goods and services, the US Dollar will fall in value – maybe a lot – and it could even lose its status as the world’s “reserve currency.”
Back in 2008, I wrote that there was no inherent reason the US Dollar would be the benchmark for all currencies. It gained that position as the dominant economy after WWII. Americans like to assume superiority, and therefore that the US Dollar will always reign supreme. But as I also said in 2008, that’s an assumption that can easily be upended – especially regarding currencies. Lots of factors could cause the US Dollar to suddenly lose a whole lot of value – creating inflation rates that make the 1980s (>18%/year) seem tame.
Since WWII, a lot has happened. Economies in Europe grouped into the European Union (EU), making the Euro more powerful. And the economy of China has grown enormously. (China’s economy will be bigger than the USA’s sometime in 2020 or 2021.) Simultaneously, isolationism has hurt growth in America, and caused the EU to lose the UK. What’s rapidly happening is a shift in economic power away from the US and Europe toward China.
Additionally, the largest holder of US debt is China. As I pointed out in 2009, this policy of supporting US debt has aided China’s desire to grow. But, as China becomes larger it will no longer need to prop up the US Dollar by purchasing Treasuries. Once bigger than the USA, China could demand that its trade be in Yuan and the value of the dollar could fall very far, very fast.
China has developed enormous inroads into the global economy, across dozens of countries, with its “Belt and Road Initiative,” created in 2013. China has quietly become more important than the USA to the economies of 70 countries. Instead of supplying countries guns, China gave them infrastructure and facilities – and jobs – and economic growth. In most of these countries, the USA is more feared than adored, while the Chinese are seen as a very good friend. Meanwhile, the USA’s “put America first” policies, including trade wars and disputes over social justice, have isolated the USA from not only rivals but also its global friends – including Europe (threats to kill NATO, for example).
Now, we are in a pandemic. The Chinese are very determined to control its impact. Meanwhile the USA, UK and many other democracies are being far less careful. If this plays out with a full pandemic recession in the USA, China could stop buying American bonds and the value of the dollar could disintegrate in weeks. Disintegrate as in $1 could be worth 1 penny. It would take bushels of dollars to buy imported goods in stores.
In this election year, the biggest concern is, do those leading the USA realize the peril? Do business leaders? Do you?
What Business Leaders Can Learn From Bitcoin Fanatics
On August 15, as Bitcoin rose to $4,000, I wrote a column about the crypto-currency. At the time, I thought Bitcoin was reasonably obscure, and I doubted there would be many readers. I was amazed when the column went semi-viral; it has had almost 350,000 reads. But even more amazing was that the column generated an enormous amount of feedback. From email responses to Facebook remarks and Tweets, I was inundated with people who, largely, wanted my head.
I found this confounding and fascinating. Why would an article that simply said a crypto-currency was speculative draw such an enormous response? And why such hostility? Just as I had not anticipated much readership, I certainly did not anticipate the reaction. These factors led me to research Bitcoin owners, and develop some theories on why Bitcoin is such a big deal to its enthusiasts.
1 – Bitcoin owners want the value to increase
I made the mistake of thinking of Bitcoins as a form of cash. Something to be spent. But I discovered most owners are holding Bitcoins as an asset. Because there are technical limits on how many Bitcoins can be created, and how quickly, these owners see the possibility of Bitcoin value increasing. As “investors” in Bitcoins, they don’t want anything (like a negative column) to put a damper on Bitcoin’s ability to rise.
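The “technical limits” on supply are well defined: Bitcoin’s protocol starts the per-block subsidy at 50 BTC and halves it every 210,000 blocks, which caps total issuance near the well-known 21 million coins. A minimal sketch of that arithmetic (an illustration of the published schedule, not protocol code):

```python
# Approximate Bitcoin's hard supply cap by summing block subsidies.
# The subsidy starts at 50 BTC and halves every 210,000 blocks.
INITIAL_SUBSIDY = 50.0
HALVING_INTERVAL = 210_000

def total_supply(halvings: int = 64) -> float:
    """Sum the coins issued across successive halving eras."""
    supply = 0.0
    subsidy = INITIAL_SUBSIDY
    for _ in range(halvings):
        supply += subsidy * HALVING_INTERVAL
        subsidy /= 2
    return supply

print(round(total_supply()))  # 21000000 -- the geometric series converges
```

Because each era issues half as many coins as the last, the series converges to 2 × 50 × 210,000 = 21 million; scarcity is built into the schedule, which is exactly what holders are betting on.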
Such speculation is not uncommon. Many people buy land, gold, silver and diamonds because they expect limited supply, and growing demand, to cause the value to rise. Other people buy Andy Warhol prints, vintage automobiles, signatures of historical people or baseball cards for the same reason. I prefer to call this speculation, but these people refer to themselves as investors in rare assets. Bitcoin investors see themselves in this camp, only they think Bitcoins are less risky than the other assets.
Regardless of the nomenclature, anyone who is buying and holding Bitcoins would be unhappy to hear that the asset is risky, or potentially a bad holding. But unlike all those other items I mentioned, Bitcoins are not physical. To some extent, merely owning the other assets carries a certain amount of its own reward. One can enjoy a diamond ring, or a Warhol print on the wall, while waiting to learn if its value goes up or down. But Bitcoins are just computer 1s and 0s – really a new kind of asset (a crypto-currency). These investors are considerably younger on average, a bit more skittish, and considerably more outspoken regarding the future of their investment – and toward those who would be negative on Bitcoins.
While wanting their asset value to rise makes sense, it is rare for speculators to be as passionate as those who responded on Bitcoin. I’ve written about many companies I feared would lose value, and thus were speculative, but those columns did not create fervor like the responses regarding Bitcoin.
2 – Confusion between Bitcoin and Blockchain technology
Blockchain is the underlying technology upon which Bitcoins are created. I have now read a few hundred articles on Bitcoins and Blockchain.
I was struck at just how confusing authors on these topics can be. They will say the two are very different, but then go on at great length that if you believe in Blockchain you should believe in Bitcoins. Few columns on Blockchain don’t talk about Bitcoins. And all Bitcoin authors talk about the wonders of Blockchain.
There is no doubt that Blockchain technology is new to the scene, and shows dramatic promise. Many large organizations are investigating using Blockchain for uses from financial transaction clearing to medical record retention. This is serious technology, and as it matures there are a great many people working to make it as trustworthy as (no, more trustworthy than) the internet. Just as the web requires some rules about URLs, domain naming, page serving, data accumulation, site direction, etc. there are serious people thinking about how to make Blockchain consistent in its application and use – which could open the door for many opportunities to streamline the digital world and make our lives better, and possibly more secure.
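The core idea behind that promise is a simple data structure: each block carries a cryptographic hash of its predecessor, so altering any earlier record invalidates every block after it. A minimal illustrative sketch (a toy hash chain, not Bitcoin’s actual protocol, with no mining or consensus):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """A block records some data plus the hash of the previous block."""
    return {"data": data, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    """Each block must reference the hash of the block before it."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("tx: A pays B", block_hash(genesis))]
print(chain_is_valid(chain))   # True

chain[0]["data"] = "tampered"  # rewriting history breaks every later link
print(chain_is_valid(chain))   # False
```

Tamper-evidence, not any particular currency, is what the technology provides – which is why record-keeping applications like transaction clearing and medical records are plausible uses even if Bitcoin itself fails.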
There were many, many people who disliked my skeptical view of Bitcoins, and based their entire argument for Bitcoin value on their belief in Blockchain. I was schooled over and again on the strength of Blockchain and its many future applications. And I was told that Blockchain technology inherently meant that Bitcoins have to go up in value. Buying Bitcoins was frequently referred to as investing in “Internet 2.0” due to the Blockchain technology.
It is clear that without Blockchain you could not have Bitcoins. But the case that one must own Bitcoin because it is built on Blockchain (“the technology of the future,” as many call it) is still being developed. To them, I was the one who was confused, unable to see the future they saw built on Blockchain. There were hordes of people who were almost religious in their Bitcoin faith, indicating that there was still more underlying their passion.
3 – As trust in government declines, there is growing trust in technology
More than ever in modern history, people have little faith in their government. In the USA, favorable opinions of Congress and its leaders are nearly non-existent. And favorable opinions for the current President started out below normal, and have gotten considerably worse. It is reported now with some regularity that Americans have little trust in the President, Congress, Courts – and the Federal Reserve.
There were, literally, hundreds of people who sent messages talking about the failure of government-based currencies. Most of these examples were South American, but still these people made the point, loudly and clearly, that governments can affect the value of their currency. Thus, these Bitcoin investors had lost faith in all government-backed currencies, including the US dollar, euro, yen, etc. They believed, fervently, that only a currency based on technology, without any government involvement, could ever maintain its value.
Today if someone is asked to give personal information for a census on their city, county, state or country they will often refuse. They want nothing to do with giving additional information to their government.
But these same people allow Facebook, Google and Amazon to watch their most private communications. Facebook records their emotions, their personal interactions, friends, complaints and a million other things going on in users’ lives to develop profiles of what is interesting to them in order to send along newsfeeds, information and ads. Google has recorded every search everyone has ever done, and analyzes those to develop profiles of each person’s interests, concerns, desires and hundreds of other categories to match each with the right ad. Amazon watches every product search made, and everything purchased to profile each person in order to push them the right products, entertainment, news and ads. And they all sell these profiles, and a lot of other personal information, to a host of other companies who do credit ratings, develop credit card offerings and push their own items for sale.
People who have no faith at all in government, and don’t believe government entities can make their lives better, leave their cookies on because they trust these tech companies to use technology to make their lives better. They believe in technology. Are these folks losing privacy? Maybe, but they see a direct benefit to what the technology operated by these tech companies can do for their lives.
For them, Bitcoin represents a future without government. And that clearly drives passion. Blockchain is a bias-free, regulator-free technology platform. Bitcoin is a government-free form of currency, unable to be manipulated by the Federal Reserve, the Exchequer, the European Central Bank, Congress, Presidents, the G7, or anyone else. The vocal Bitcoin owners see in Bitcoin a new future with far less government involvement, built on Blockchain technology. And they trust technology far more than they trust the current systems. They claim to be not anarchists, but rather believers in technology over human government – and in some instances even religion.
Leadership lessons from my Bitcoin journey
Often we try to explain away feedback, especially negative feedback or feedback that is hard to interpret, with easy answers. Such as, “they just want their asset to go up in value.” That is a big mistake. If the feedback is strong, it is really worth digging harder to understand why there is passion. Never forget that every piece of negative feedback is a chance to learn and grow. It is almost always worth taking the time to really understand not only what is being said, but why it is being said. There could be a lot more to the issue than face value.
If things are confusing, it is important to sort out the source of the confusion. If I’m talking about a currency, why do they keep talking about the technology? Saying “they don’t get it” misses the point that maybe “you don’t get it.” It is worth digging into the confusion to try and really understand what motivates someone. Only by listening again and again and again, and trying to really see their point of view, can you come to understand that what you think is confusing, to them is not. They aren’t confused, they see you as confused. Until you resolve this issue, both parties will keep talking right past each other.
You cannot lead if you do not understand what other people value. Their belief system may not match yours, and thus they are reaching very different conclusions when looking at the same “facts.” While I may trust the Fed and the ECB, and even banks, if others don’t then they may well have a very different view of the future.
When leaders lose the faith of those they are supposedly leading, unexpected outcomes will occur. Leaders cannot lead people who don’t trust them. Using the power of their office to force their will on others, and forcing conformance to existing processes, methods and systems can often lead to strong negative reactions. People may have no choice short-term but to do as instructed, but they may well be plotting (investing) longer term in a very different future. Failing to see the passion with which they are seeking that different future will only cause the leadership gap to widen, and shorten the time to a disruptive event.
I’m a believer in Disruptive Innovation. For almost 100 years economists have written about “Creative Destruction,” in which new technologies come along making old technologies — and the companies built on them — obsolete. In the last 20 years, largely thanks to the insights of faculty at the Harvard Business School, we’ve seen a dramatic increase in understanding how new companies use new technologies to disrupt markets and wipe out the profitability of companies that were once clearly successful. In a large way, we’ve come to accept that Disruptive Innovation is good, and the concomitant Creative Destruction of the old players leads to more rapid growth for the economy, increasing jobs and the wherewithal of everyone. Creative Destruction, in the pursuit of progress, is good because it helps economies to grow.
But, not really everyone benefits from Creative Destruction. The trickle down benefits to lots of people can be a long time coming. When market shifts happen, and people lose jobs to new competitors — domestic or offshore — they only know that their life, at least short term, is a lot worse. As they struggle to pay a mortgage, and find a new job, they often learn their skills are outdated. There are new jobs, but these folks are often not qualified. As they take lesser jobs, their incomes dwindle, and they may well lose their homes. And their healthcare.
Economists call this workplace transition “temporary economic dislocation.” Fancy term. They claim that eventually folks do enter the workplace who are properly trained, and those folks make more money than the workers associated with the previous, now inferior, technology. And, eventually, everyone finds new work – at something.
That’s great for economists. But terrible for the folks who lost their jobs. As someone once said “a recession is when your neighbor loses his job. A depression is when you lose your job.” And for a lot of people, the market shift from an industrial economy to an information economy has created severe economic depression in their lives.
A person learns to be a printer, or a printing plate maker, in the 1970s when they are 20-something. Good job, makes a great wage. Secure work, since printing demand just keeps rising. But then along comes the internet with PDF and JPEG documents that people read on a screen, and folks simply quit needing, or wanting, printed documents. In 2016, now age 50-something, this printer or plate-maker no longer has a job. Demand is down, and it’s really easy to send the printing to some offshore market like Thailand, Brazil or India where printing is cheaper.
What’s he or she to do now? Go back to school, you may say. But to learn how to do what? Say it’s online (or digital) document production. OK, but since everyone in their 20s has been practicing this for over a decade, it takes years to actually become skilled enough to be competitive. And then, what’s the pay for a starting digital graphic artist? A lot less than what they made as a printer. And who’s going to hire the 58-62 year old digital graphic artist, when there are millions of well-trained 20-somethings who seem to be quicker and more attuned to what the publishers want (especially when the boss ordering the work is 35-42, and really uncomfortable giving orders and feedback to someone her parents’ age)? Oh, and when you look around, there are millions of immigrants who are able to do the work, and willing to do it for a whole lot less than anyone native-born to your country.
In England last week these disaffected people made it a point to show their country’s leadership that their livelihoods were being “creatively destroyed.” How were they to keep up their standard of living with the flood of immigrants? And with the wealth of the country constantly shifting from the middle class to the wealthy business leaders and bankers? And with work going offshore to less developed countries? While folks who have done well the last 25 years voted overwhelmingly to remain in the EU (such as those who live in what’s called “The City”), those in the suburbs and outlying regions voted overwhelmingly to leave the EU. Sort of like their income gains, and jobs, left them.
A whole lot of anger. To paraphrase the famous line from the movie Network, they were mad as Hell and they weren’t going to take it any longer. Simply put, if they couldn’t participate in the wonderful economic growth of EU participation, they would take it away from those who did. The point wasn’t whether or not the currency might fall 10% or more, or whether stocks on the UK exchange would be routed. After all, these folks largely don’t go to Europe or America, so they don’t care that much what the Euro or dollar costs. And they don’t own stocks, because they aren’t rich enough to do so, so what does it hurt them if equities fall? If this all puts a lot of pain on the wealthy – well just maybe that is what they really wanted.
America is seeing this as well. It’s called the Donald Trump for president campaign. While unemployment is a remarkably low 5%, there are a lot of folks who are working for less money, or simply out of work entirely, because they don’t know how to get a job. They may laugh at Robert De Niro as a retired businessman now working for free in The Intern. But they really don’t think it’s funny. They can’t afford to work for free. They need more income to pay higher property taxes, sales taxes, health care and the costs of just about everything else. And mostly they know they are rapidly being priced out of their lifestyle, and their homes, and figuring they’ll be working well into their 70s just to keep from falling into poverty.
These people hate President Obama. They don’t care if the stock market has soared during his presidency – they don’t own stocks (and if they do in a 401K or similar program they don’t care because it does them no good today). They don’t care that he’s created more jobs than anyone since Reagan or Roosevelt, because they see their jobs gone, and they blame him if their recent college graduate doesn’t have a well-paying job. They don’t care if America is closing in on universal health care, because all they see is that health care is becoming ever more expensive – and often beyond their ability to pay. For them, their personal America is not as good as they expected it to be – and they are very, very angry. And the President is a very identifiable symbol they can blame.
Creative Destruction, and disruptive innovations, are great for the winners. But they can be wildly painful to the losers. And when the disruptive innovations are as big, and frequent, as what’s happened the last 30 years – globalized economy, nationwide and international super banks, outsourcing, offshoring and the entirety of the Internet of Things – it has left a lot of people really concerned about their future. As they see the top 1% live opulent lifestyles, they struggle to keep a 12 year old car running and pay the higher license plate fees. They really don’t care if the economy is growing, or the dollar is strong, or if unemployment is at near-record lows. They feel they are on the losing end of the stick. For them, well, America really isn’t all that great anymore.
So, hungry for revenge, they are happy to kill the goose that laid the golden eggs, and have it for dinner. They will take what they can, right now, and they don’t care if the costs are astronomical. They will let tomorrow sort itself out, in a bit of hyper-ignorance of the likely outcome of their own actions.
Despite their hard times, does this not sound at the least petty, and short-sighted? Doesn’t it seem rather selfish to damn everyone just because your situation isn’t so good? Is it really in the interest of your fellow man to create bad outcomes just because you’ve not done well?
Wells Fargo’s CEO John Stumpf resigned last week. This week he also resigned from the boards of directors at Chevron and Target. For those two roles he was being paid something like $650,000 per year. The interesting question is, why was he on those boards at all? Wasn’t being the CEO and on the board at one of America’s biggest banks a full-time job? After all, he was paid $19.3 million in both 2015 and 2014. You would not have thought he needed a side job to make ends meet.
Which leads to the question: are America’s boards of directors actually staffed with the right people? Ostensibly the board is responsible for governing the corporation. Directors are responsible for ensuring management makes the right decisions for the long-term best interests of shareholders. And legislators have passed multiple laws, such as Sarbanes-Oxley and Dodd-Frank, to allow the regulators, primarily at the SEC (Securities and Exchange Commission), to put real teeth (and enforcement) into directors’ responsibilities.
According to the National Association of Corporate Directors (NACD) a sitting director should do a minimum of 200 hours of work on a board every year. For larger companies committee requirements on top of general board work could easily push this to nearly 300 hours. Thus, Mr. Stumpf should have been doing at least 500 hours of work for Chevron and Target – about 12.5 weeks, or three months. Do you think he actually spent this much time on these roles, given his full time job at Wells Fargo?
This also means that Mr. Stumpf only had nine months to actually work as CEO of Wells Fargo. Maybe that was why he was so unaware of the unethical behavior at the company he led? Why would a board think it is acceptable for a CEO to work only three-fourths of the year? Not many employees have the opportunity to draw full compensation yet take off so much time.
Either Mr. Stumpf wasn’t paying enough attention to Wells Fargo, or he wasn’t paying enough attention to Chevron and Target. Yet, he was being paid very, very handsomely for all those roles. How is that good governance for any one of the three companies?
A CEO serving on additional boards is a bit like electing a governor, who is paid to run the state, and then learning that the governor is simultaneously doing part-time work for a company, or perhaps for an agency of a different state. Would any state accept that its governor, effectively the state’s CEO, spend three months of every year on side jobs that have nothing to do with being governor? Yet corporate CEOs regularly take on director roles at other corporations – which in no way benefits their own company’s employees or shareholders. Why?
Further, boards are dominated by sitting or former CEOs. Why? The world moves fast today, and there are a wealth of skills boards need to effectively govern – far beyond having a room full of CEOs. IT skills, cyber security skills, social media skills, marketing and advertising skills, branding skills, global market skills, intellectual property skills – there is a long list of skills which would greatly improve board diversity, and thereby a board’s ability to govern effectively. So why is hiring so biased toward CEOs? NACD has been asking the same question as it promotes diversity in the boardroom.
Yet, there is one group that is making hay with all that board pay. Former regulators and members of Congress. These people are required to register if they become lobbyists, and they are forced to wait a year, or more, before they can do work for government contractors. But there is nothing which stops them from joining a board of directors.
There is nothing about being a Congressman or Senator which prepares these people for corporate governance, yet this is common practice as corporations seek ways to find influence without breaking the law. But is it worthwhile to investors to have directors that were prominent in government, but perhaps lacking competency for today’s fast-paced business world? Should a directorship and the compensation be a reward for previous government work – or should it be a position of great importance looking out for the interest of the corporation?
There are currently 64 former members of Congress serving on corporate boards. According to a Harvard and Boston University study, 44% of Senators, and 11% of Congress members have landed corporate board directorships since 1992. Their average compensation, per board, is $350,000. Much better than being in Congress. Especially for a part-time job.
Former Speaker John Boehner, a famous cigarette smoker, just joined the board of tobacco company Reynolds American – although that may be short-lived, as British American Tobacco has offered to acquire Reynolds. Former Majority Leader Eric Cantor, who was in line for the Speaker job before losing his last election, is now on the board of a Wall Street firm, where he earned $2 million in 2015 for bringing in new business – making him the highest-paid director in this group. Former Majority Leader Dick Gephardt has accumulated $10.8 million in director compensation since retiring from Congress in 2005.
Tom Ridge, who was a governor, house member and secretary of Homeland Security – but never a businessperson – raked in $1.4 million in director compensation last year. Even former Congressman and subsequently Secretary of Defense and director of the CIA Leon Panetta made almost $600,000 in director comp last year. These fellows are obviously well connected to government leaders, but do they have a clue about how to effectively implement regulations for corporate audit, compensation or nominating and governance committee roles? Are they hired to apply good governance for investors, or to be rainmakers for the company? Or just to give them a good retirement plan?
Boards exist to protect the rights of shareholders. But do they? The issues at Wells Fargo are an example of how ineffective a board can be at oversight, given that serious problems lasted there for at least five years, and whistle-blowers were terminated for specious reasons. And the Wells board paid the CEO almost $20 million per year, while letting him work a quarter or more of each year as a director for other companies. Hard to see how those directors were doing their job.
When companies do poorly, employees, investors and analysts will ask “where was the board?” Increasingly it is clear that more should be asking “who is on the board?” Boards should not be stacked with folks whose lofty titles from previous positions are irrelevant to the needs of the corporation, and who frequently lack the qualifications to govern effectively. Target’s investors, for example, probably would have benefited far more from a director who understood networks and cyber crime than from paying Mr. Stumpf for his part-time assistance away from Wells Fargo. And with oil prices at generational lows, how did Mr. Stumpf help Chevron prepare for a new world of lower oil demand and greater supplier anxiety in the Middle East?
Sarbanes-Oxley was passed after the outrage that occurred at Enron, where the company completely failed and yet the board said it had no idea of the company’s problems. When America’s financial services industry nearly melted down Dodd-Frank was passed to put more onus on directors to understand the financials and compensation practices of their companies. But, it will most likely take yet more legislation, and more regulation, if investors are to be protected by truly independent directors that are the right people, in the right job, and feel accountable for management oversight and company outcomes.
Everyone knows what happened at Wells Fargo. For many years, possibly as far back as 2005, Wells Fargo leaders pushed employees to “cross-sell” products, like high profit credit cards, to customers. Eventually the company bragged it had an industry leading 6.7 products sold to every customer household. However, we now know that some two million of these accounts were fakes – created by employees to meet aggressive sales goals. And, unfortunately, costing unsuspecting customers quite a lot of fees.
We also know that Wells Fargo leadership knew about this practice for at least five years – and agreed to a $190 million fine. And the company apparently fired 5,300 employees over these practices.
Which begs the obvious question – if management knew this was happening, why did it continue for at least five years?
Let’s face it, if you owned a restaurant and you knew waiters were adding extras onto the bill, or tip, you would not only fire those waiters, but put in place procedures to stop the practice. But in this case we know that management at Wells Fargo was receiving big bonuses based upon this employee behavior. So they allowed it to continue, perhaps with a gloss of disdain, in order for the execs to make more money.
This is the modern, high-tech financial services industry version of putting employees in known dangerous jobs, like picking coal, in order to make more profit. A lot less bloody, for sure, but no less condemnable. Management was pushing employees to skirt the law, while wearing a fig-leaf of protection.
Ignorance is no excuse – especially for a well-paid CEO.
CEO Stumpf testified to Congress that he didn’t know the details of what was happening at the lower levels of his bank. He didn’t know bankers were expected to make 100 sales calls per day. When asked about how sales goals were implemented, he responded to Representative Keith Ellison “Congressman, I don’t know that level of detail.”
Really? Sounds amazingly like Bernie Ebbers at Worldcom. Or Jeff Skilling and Ken Lay at Enron. Men making millions of dollars from illegal activities, but claiming they were ignorant of what their own companies were doing. And if they didn’t know, there was no way the board of directors could know, so don’t blame them either.
Does anyone remember how Congress reacted to those pleas of ignorance? “No more.” Quickly the Sarbanes-Oxley act was passed, making not only top executives but Boards, and in particular audit chairs, responsible for knowing what happened in their companies. And later Dodd-Frank was passed strengthening these laws – particularly for financial services companies. Ignorance would no longer be an excuse.
Where was Wells Fargo’s compliance department?
Based on these laws, every Board of Directors is required to appoint a compliance officer to make sure procedures are in place to ensure proper behavior by management. This compliance officer is required to report to the board that procedures exist, and that there are metrics in place to make sure laws, and ethics policies, are followed.
Additionally, every company is required to implement a whistle-blower hotline so that employees can report violations of laws, regulations, or company policies. These reports are to go either to the audit chair or to the company’s external legal counsel – or, at a small company, possibly the general counsel – who is bound by law to keep reports confidential and to report to the board. This was implemented, as law, to make sure employees who observed illegal and unethical management behavior, as happened at Worldcom, Enron and Tyco, could report on management and inform the board so Directors could take corrective action.
Which begs the first question “where the heck was Wells Fargo’s compliance office the last five years?” These were not one-off events. They were standard practice at Wells Fargo. Any competent Chief Compliance Officer had to know, after five-plus years of firings, that the practices violated multiple banking practice laws. He must have informed the CEO. He was, by law, supposed to inform the board. Who was the Chief Compliance Officer? What did he report? To whom? When? Why wasn’t action taken, by the board and CEO, to stop these banking practices?
Should regulators allow executives to fire whistle-blowers?
And about that whistle-blower hotline – apparently employees took advantage of it. In 2010, 2011, 2013 and more recently employees called the hotline, even wrote the Human Resources Department and the office of CEO John Stumpf to report unethical practices. Were their warnings held in anonymity? Were they rewarded for coming forward?
Quite to the contrary, one employee, eight days after logging a hotline call, was fired for tardiness. Another was fired days after sending an email to CEO Stumpf alerting him of aberrant, unethical practices. A Wells Fargo HR employee confirmed that it was common practice to find fault with employees who complained, and fire them. Employees who learned from Enron, and tried to do the right thing, were harassed and fired. Exactly 180 degrees contrary to what Congress ordered when passing recent laws.
None of this was a mystery to Wells Fargo leadership, or CEO Stumpf. CNNMoney reported the names of employees, actions they took and the decisively negative reactions taken by Wells Fargo on September 21. There is no way the Wells Fargo folks who prepared CEO Stumpf for his September 29 testimony were unaware. Yet, he replied to questions from Congress that he didn’t know, or didn’t remember, these events – or these people. In eight days these staffers could have unearthed any information – if it had been exculpatory. That Stumpf’s answer was another plea of ignorance only points to leadership’s plan of hiding behind fig leaves.
CEO Stumpf obviously knew the practices at Wells Fargo. So did all his direct reports. And likely two or three levels down, at a minimum. Clearly, all the way to branch managers. Additionally, the compliance function was surely fully aware, as was HR, of these practices and chose not to solve the issues – but rather hide them and fire employees in an effort to eliminate credible witnesses from reporting wrongdoing by top leadership.
Where was the board of directors? Why didn’t the audit chair intervene?
It is the explicit job of the audit chair to know that the company is in compliance with all applicable laws. It is the audit chair’s job to implement the Sarbanes-Oxley and Dodd-Frank regulations, and report any variations from regulations to the company auditors, general counsel, lead outside director and chairperson. Where was proper governance of Wells Fargo? Were the Directors doing their jobs, as required by law, in the post Enron, WorldCom, Tyco, Lehman, AIG world?
Should CEO Stumpf be gone? Without a doubt. He should have been gone years ago, for failing to properly implement and enforce compliance. But he is not alone. The officers who condoned these behaviors should also be gone, as should all HR and other managers who failed to implement the regulations as Congress intended.
Additionally, the board of Wells Fargo has plenty of responsibility to shoulder. The board was not effective, and did not do its job. The directors, who were well paid, did not do enough to recognize improper behavior, implement and monitor compliance or take action.
There is a lot more blame here, and if Wells Fargo is to regain the public trust there need to be many more changes in leadership, and Board composition. It is time for the SEC to dig much deeper into the situation at Wells Fargo, and the leaders complicit in failing to follow the intent of Congress.
Last weekend, the Federal Reserve Board’s leadership met to discuss the future of America’s monetary policy. Reports out of that meeting, like reports from all Fed meetings, are long, tedious, and pretty much say nothing. Every analyst tries to interpret from the governors’ statements what might happen next. And because the Fed leadership is so vague, and so academic, the analysts inevitably never guess right.
This bothers a lot of people. There are those who want a lot more “transparency” from the Fed – meaning they want much clearer signals as to what is intended, and usually specifics as to intended actions and a timeline. Because the Fed’s meetings are so cloaked and opaque, some congress members actually want to do away with the Fed, or regulate it a lot more closely.
But for most of us, most of the time, the Fed is pretty much immaterial. When the Fed matters is when there are big swings in the economy, which happen quickly. Then its action is crucial.
Why Small Changes In Interest Rates Don’t Really Matter To Most Of Us
Take the debate right now over a quarter point rise in interest rates. How does this affect most people? Not much. If you have credit card debt, or a car loan, your interest rate is set by the financial institution. And you may hear people talk about zero interest rates, but you know your rate is a whopping amount higher than that. And you know that a quarter point change in the Federal Funds rate will not affect the interest on those loans.
Where you’ll see a difference is in a mortgage. But here, is a quarter point really important?
When I graduated from business school in 1982 and wanted to buy my first home, the interest rate on an annual, variable rate loan was 18.5%. My first house cost just about $100,000, so the interest was $18,500/year. Today, mortgages are around 3.5%, fixed for anywhere from 3 to 7 years. $18,500 in interest now funds a roughly $525,000 mortgage! If interest rates go to 3.75% – which has many analysts so concerned for the economy – the home value associated with interest of $18,500 is roughly $500,000. Probably within the negotiating range of the buyer.
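As a sanity check on those figures, here is the underlying arithmetic as a sketch. It uses a simple interest-only approximation (annual budget divided by rate), ignoring amortization and fees; the dollar amounts in the text are rounded relative to these exact quotients:

```python
# Interest-only approximation: the principal supportable by a fixed annual
# interest budget at a given rate. Ignores amortization, fees, and taxes.
def principal_for_budget(annual_interest_budget: float, rate: float) -> int:
    return round(annual_interest_budget / rate)

budget = 18_500  # the annual interest bill on a ~$100,000 mortgage at 18.5%

print(principal_for_budget(budget, 0.185))   # 1982, at 18.5%: 100000
print(principal_for_budget(budget, 0.035))   # today, at 3.5%: 528571
print(principal_for_budget(budget, 0.0375))  # at 3.75%:       493333
```

The quarter-point move shifts the supportable principal by about $35,000 on a half-million-dollar loan – meaningful at that scale, which is the article’s point that only large borrowers feel it.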
So you have to borrow a LOT of money for this quarter point to matter. And it does matter to CEOs and CFOs of companies that lead corporations on the S&P 500, or those running huge REITs (Real Estate Investment Trusts) that have enormous debts. But that is not most of us. For most of us, that quarter point difference will not have any impact on our lives.
So Why Do People Pay So Much Attention To The Fed?
The Fed was originally created barely 100 years ago (1913) to try to create a more stable monetary system. But this didn’t work too well in the beginning, which led to the Great Depression. And then, to make matters worse, the conservative bent of the Fed coupled with its fixation on stable interest rates led it to actually cut the money supply as the economy was tanking. This led to a collapse in the value of goods and services, particularly real estate, and the loss of millions of jobs greatly worsening the Great Depression.
It was the depression which really caused economists to focus on studying Fed actions and the economic repercussions. A group of economists, most notably Milton Friedman at the University of Chicago, started saying that the Fed shouldn’t focus on interest rates, but rather on the supply of money. These folks were called “monetarists” and they said interest rates should float, and economists should focus on stable prices.
The 1970s – “Easy Money” Inflation
As we moved into the 1970s, and as Fed Governors kept trying to control interest rates, they found themselves creating more and more money to keep rates low, and in return prices skyrocketed. “Easy money” as they called it allowed ratcheting upward incomes, big pay raises, higher prices for commodities and inflation. Another monetarist leader, Paul Volcker, was named head of the Fed. He rapidly moved to contract the money supply, allowing those 18.5% mortgage rates to develop. Yet, this did stabilize prices and eventually rates lowered, moving down constantly from 1980 to the near zero rates of today for Treasury Bonds and other very large, low risk borrowers.
When the Great Recession hit the Fed leadership, led by Ben Bernanke, remembered the lessons of the Great Depression. As they saw real estate values tumble they were aware of the domino effect this would have on bank failures, and then business failures, just as they had occurred in the 1930s. So they flooded the market with additional currency to keep failures to a minimum, and ease the real estate collapse. This sent interest rates plummeting to the record low levels of the last few years.
Policy Must Address The Current Situation, Not Be Biased By Historical Memories
Yet, people keep worrying about inflation. Those who lived through the 1970s and saw the damage done by inflation are still fearful of it. So they scream loudly about their fear that the last 8 years of monetary ease will create massive future inflation. They want the Fed to be much tighter with money saying that all this cash will someday create inflation down the road. Their view of history is guiding their analysis. Their bias is a fear that “easy money” once caused a problem, so surely it will cause a problem again.
But economists who study prices keep saying that there are currently no signs of price escalation – that wages have not moved up appreciably in a decade, home values are barely where they were a decade ago. Commodity prices are not escalating, in fact many (like oil) are at historically low prices. The dollar is stronger because, relatively speaking, the USA economy is doing better than the rest of the developed world. As long as prices and wages remain without high gains, there is little reason to tighten money, and little reason to feel a higher interest rate is needed.
Further, past monetary increases will not cause future inflation, because monetary policy only affects what is happening now. “Easy money” today can only create inflation today, not in 3 years. And inflation is almost nowhere to be seen.
Ignore Fed “Fine Tuning.” Pay Attention When A Crisis Hits. Otherwise, It’s Up To The Politicians
The big thing to remember is that small changes in policy, such as those that move rates a quarter point, are “fine tuning” of the money supply. And that has almost no effect on most of us. Where we as citizens should care about the Fed is when big changes happen. We don’t want mistakes like those of the 1930s, because they hurt everyone. But we do want fast action to deal with a crisis like the falling real estate values and bank collapses that were happening a decade ago.
Remember, it was when the Fed targeted interest rates that the USA economy got into so much trouble. First in the Great Depression, and then in the inflationary 1970s. But when the Fed targeted prices, such as in the 1980s and the mid-2000s, it did exactly what it was created to do, maintain a stable money supply.
So don’t worry about whether analysts think interest rates are going to change a quarter point, or even a half point, in the next year. The big economic question facing us is not a Fed question, but rather “what will it take to increase investment so that we can create more jobs, and provide higher wages leading to a higher standard of living for everyone?” And that is not a question for the Fed to answer. That is up to the economic policy makers in the legislature and the White House.
I’m a believer in Disruptive Innovation. For almost 100 years economists have been writing about “Creative Destruction,” in which new technologies come along making old technologies — and the companies built on them — obsolete. In the last 20 years, largely thanks to the initial insights of the faculty at Harvard Business School, we’ve seen a dramatic increase in understanding how new companies use new technologies to disrupt markets and wipe out the profitability of companies that were once clearly successful. In a large way, we’ve come to accept that Disruptive Innovation is good, and the concomitant creative destruction of the old players leads to more rapid growth for the economy, increasing jobs and the wherewithal of everyone.
But, not really everyone. The trickle to lots of people can be a long time coming. When market shifts happen, and people lose jobs to new competitors — domestic or offshore — they only know that their life, at least short term, is a lot worse. As they struggle to pay a mortgage, and find a new job, they often learn their skills are outdated. There are jobs, but these folks are not qualified. As they take lesser jobs, their incomes dwindle, and they may well lose their homes. And their healthcare.
Economists call this workplace transition “temporary economic dislocation.” Fancy term. They claim that eventually folks do enter the workplace who are properly trained, and those folks make more money than the workers associated with the previous, now inferior, technology.
That’s great for economists. But terrible for the folks who lost their jobs. As someone once said “a recession is when your neighbor loses his job. A depression is when you lose your job.” And for a lot of people, the market shift from an industrial economy to an information economy has created severe economic depression in their lives.
A person learns to be a printer, or a printing plate maker, in the 1970s when they are 20-something. Good job, makes a great wage. Secure, as printing demand just keeps rising. But then along comes the internet with PDF and JPEG documents that people read on a screen, and folks simply quit needing, or wanting, printed documents. In 2016, now age 50-something, this printer or plate-maker no longer has a job. Demand is down, and it’s really easy to send the printing to some offshore market like Thailand, Brazil or India where printing is cheaper.
What’s he or she to do now? Go back to school, you may say. But to learn how to do what? Say it’s online (or digital) document production. OK, but since everyone in their 20s has been practicing this for over a decade, it takes years to actually become competitive. And then, what’s the pay for a starting digital graphic artist? A lot less than what they made as a printer. And who’s going to hire the 58-to-62-year-old digital graphic artist, when there are millions of well-trained 20-somethings who seem quicker, and more attuned to what the publishers want (especially when the boss ordering the work is 35-42, and really uncomfortable giving orders and feedback to someone her parents’ age)? Oh, and when you look around, there are millions of immigrants who are able to do the work, and willing to do it for a whole lot less than anyone native-born to your country.
In England last week these disaffected people made it a point to show their country’s leadership that their livelihoods were being “creatively destroyed.” How were they to keep up their standard of living with the flood of immigrants? And with the wealth of the country constantly shifting from the middle class to the wealthy business leaders and bankers? While folks who have done well the last 25 years voted overwhelmingly to remain in the EU (such as those who live in what’s called “The City”), those in the suburbs and outlying regions voted overwhelmingly to leave. Sort of like their income gains, and jobs, left.
To paraphrase the famous line from the movie Network, “they were mad as Hell and they weren’t going to take it any longer.” Simply put, if they couldn’t participate in the wonderful economic growth of EU participation, they would take it away from those who did. The point wasn’t whether or not the currency might fall 10% or more, or whether stocks on the UK exchange would be routed. After all, these folks largely don’t go to Europe or America, so they don’t care that much what the Euro or dollar costs. And they don’t own stocks, because they aren’t rich enough to do so, so what does it hurt them if equities fall? If this all puts a lot of pain on the wealthy – well just maybe that is what they really wanted.
America is seeing this in droves. It’s called the Donald Trump for President campaign. While unemployment is a remarkably low 5%, there are a lot of folks who are working for less money, or simply out of work entirely, because they don’t know how to get a job. They may laugh at Robert De Niro as a retired businessman now working for free in “The Intern.” But they really don’t think it’s funny. They can’t afford to work for free. They need more income to pay higher property taxes, sales taxes, health care and the costs of just about everything else. And mostly they know they are rapidly being priced out of their lifestyle, and their homes, and figuring they’ll be working well into their 70s just to keep from falling into poverty.
These people hate President Obama. They don’t care if the stock market has soared during his Presidency – they don’t own stocks (and if they do in a 401K or similar program they don’t care because it does them no good today.) They don’t care that he’s created more jobs than anyone since Reagan or Roosevelt, because they see their jobs gone, and they blame him if their recent college graduate doesn’t have a well-paying job. They don’t care if we are closing in on universal health care, because all they see is that health care is becoming ever more expensive – and often beyond their ability to pay. For them, their personal America is not as good as they expected it to be – and they are very, very angry. And the President is a very identifiable symbol they can blame.
Creative Destruction, and disruptive innovations, are great for the winners. But they can be wildly painful to the losers. And when the disruptions are as big, and frequent, as what’s happened the last 30 years – globalized economy, nationwide and international super banks, outsourcing, offshoring and the entirety of the Internet of Things – it has left a lot of people really concerned about their future. As they see the top 1% live opulent lifestyles, they struggle to keep a 12 year old car running and pay the higher license plate fees. They really don’t care if the economy is growing, or the dollar is strong, or if unemployment is at near-record lows. They know they are on the losing end of the stick. For them, well, America really isn’t all that great anymore.
So, hungry for revenge, they are happy to kill the goose that laid the golden eggs and have it for dinner. They will take what they can, right now, and they don’t care if the costs are astronomical.
Despite their hard times, does this not sound at the least petty, and short-sighted? Doesn’t it seem rather selfish to damn everyone just because your situation isn’t so good?
Some folks will think that government policy really doesn’t matter that much. That actions like Brexit won’t stop the improvements coming from innovation. They think all will work out OK. They so long for a return to a previous time, when they perceived things were much better, that they are ready to stop the merry-go-round for everyone.
And they can do this. Brexit will create a Depression for the UK. The economy not only won’t grow, it will shrink. Probably for another decade. There will be fewer jobs, meaning less wealth for everyone. Those with assets will ride it out. Those who already struggled will struggle more. Lacking investment funds, private and public, there will be less investment in infrastructure and the means of production, making life harder for everyone. There will be less, if any, money to invest in innovation, so there will be fewer new products to enjoy, and fewer improvements in lifestyle and productivity. The currency will be lower, so there will be fewer imports, making the cost of everything go up. In short, it will be very painful, and costly, for everyone now that those who felt left out of the economic expansion have had their day at the polls.
Growth is a great thing. Growth creates jobs, a better lifestyle, higher productivity, more income and more wealth for everyone. But growth is NOT a given. Policies, and government actions, can stop growth dead in its tracks. The innovations we’ve all been fascinated by, and made part of our everyday lives, from smartphones to autos that last hundreds of thousands of miles and tens of years, to low cost air conditioning, to electricity nearly everywhere, to miracle drugs and gene-based bio-pharmaceuticals that have extended our lives by 30%+ in just one generation — all of these things can come to a screeching, terrible halt.
All it takes to stop the gains of innovation are policies that try to return us to a previous time. In the process of making our countries great again, we can absolutely destroy them. It is impossible to go back in time. It is not impossible to kill the means of economic growth. All we have to do is focus on constructing walls instead of creating jobs, protecting industries instead of free trade, and wrapping ourselves in the flag of sovereignty instead of collaborating globally to maximize growth. By focusing on ourselves we can absolutely hurt a lot of other people.
The politicians getting attention now are those espousing a return to some bygone era – those who denounce the gains of innovation, and offer pity to those who’ve struggled with market disruptions. But these are not the leaders who will help people improve their lives. Those who focus on promoting innovation, attacking the concentration of wealth at the top, providing more incentives to infrastructure development, and mobilizing resources for training and new job creation can make life better for everyone.
Let’s hope everyone who watches what happened in the UK last week, this week and going forward learns the long-term lesson of short-term thinking. Let’s hope that we can return to favoring growth, even at the cost of disruption and creative destruction.