
2 - A History of the Firm That Incorporates Social Norms

from Part I - The Foundation

Published online by Cambridge University Press: 11 October 2018

Douglas E. Stevens
Affiliation:
Georgia State University

Social Norms and the Theory of the Firm: A Foundational Approach, pp. 21–44
Publisher: Cambridge University Press
Print publication year: 2018

Introduction

Alfred D. Chandler Jr. wrote an award-winning history of the firm in 1977 entitled The Visible Hand: The Managerial Revolution in American Business. At the time, economists and historians had begun to turn their attention to the role of large commercial firms in the economy and in society. However, most theorists still viewed the firm as a unit of production, and the most popular economic rationale for the formation of firms was market failure or the accumulation of monopoly power (Coase 1937). Chandler’s history of the firm provided a theoretical rationale for the formation and development of firms based on advances in technology, accounting, and financial control. According to his narrative, these advances made it more efficient for independent contractors, who had formerly sold raw materials and labor in simple exchange markets, to come together under the umbrella of the firm to conduct commerce. Chandler concluded that Adam Smith’s “invisible hand” of the market had been supplanted by the “visible hand” of managerial control. Similarly, Johnson and Kaplan (1987) provided an insightful history of the firm that emphasized the importance of innovations in management accounting for the formation and development of firms.

More recently, institutional economists and historians have incorporated culture and social norms in narratives explaining the development of free-market economies and capitalist institutions (North 1991; Soll 2014). These narratives emphasize advances in accounting and financial control as a catalyst for such development, as in prior narratives, but they also suggest that a merchant culture that promoted social norms such as transparency and accountability played an important role. To expand the narrative of the firm and lay the foundation for future chapters, therefore, I provide a history of the firm that incorporates social norms. I begin my history of the firm by describing the birth of free-market capitalism in medieval Italy and the Netherlands. The creation of double-entry accounting in Italy around 1300 helped generate a culture of transparency and accountability, which allowed merchants to invest capital in the pursuit of trading opportunities. This culture led to large merchant cities with capitalist institutions that supported national and international trade. When French and Spanish armies invaded Italy in the late 1400s, however, this merchant culture faded in favor of a monarchy culture. At about this same time, a merchant culture developed in the Netherlands that also led to large cities with capitalist institutions in support of international trade. Again, French and Spanish monarchies posed a continual threat to this culture. I then describe how free-market capitalism spread to Great Britain and the United States. In the late 1600s, England chose the Dutch merchant culture over the French monarchy culture and invited William of Orange to assume the English throne. While the two cultures continued to fight for dominance in what became Great Britain, the merchant culture found its ultimate expression in the English colonies in America.

I next describe how the merchant cultures in Britain and the United States generated two industrial revolutions that led to further developments in the size and complexity of the firm. From textiles and steel factories to railroads and large manufacturers, the expanding firm relied increasingly on the visible hand of managerial control. This generated a demand for advanced systems of accounting and financial control as well as increased transparency and accountability. However, the increasing complexity of big business in nineteenth-century US industries led to the development of another culture. Opportunistic entrepreneurs in the railroad industry made huge fortunes based on a culture of opaque financial reporting, stock manipulation, and insider trading. This opportunism culture required increased regulation and oversight, which was provided by both government and private institutions such as accounting firms and bond rating agencies. The full threat of the opportunism culture became apparent in the twentieth century when a lack of transparency enabled excessive speculation leading to stock market crashes after the Roaring Twenties (1929) and the dot-com bubble (2000). In each case, further regulation and oversight were imposed by Congress. However, the opportunism culture would eventually infect the regulators, including accounting firms and bond rating agencies. This led to another stock market bubble based on mortgage-backed securities that almost took down the entire financial system in 2008.

The Birth of Free-Market Capitalism in Italy and the Netherlands

Economic historians have applied economic theory and quantitative methods to explain economic development and institutional change. Douglass North (1991, 97) defines institutions as “humanly devised constraints that structure political, economic and social interactions.” Such institutions consist of both formal rules (e.g., property rights, laws, and constitutions) and informal rules and norms (e.g., customs, traditions, and codes of conduct). North describes economic development as occurring in stages. First, local trade occurs within small villages and is facilitated largely by informal rules. As trade expands and occurs between villages, terms of exchange must be made more explicit by formal rules. As trade expands further and occurs over greater physical distances, the risk of fraud and unfair practices increases the importance of formal rules. Thus, trade relies increasingly on formal contracting and recordkeeping as it expands from intravillage, to intervillage, to international trade. The final stage of economic development is the urbanization of society as more resources and people become involved in the activities of international trade. While formal rules increase in importance as economic development progresses, however, informal rules and norms continue to play an important role.

In his book, The Reckoning: Financial Accountability and the Rise and Fall of Nations, Jacob Soll (2014) argues that the advent of double-entry accounting in northern Italy around 1300 marks the beginning of the history of free-market capitalism. Without Arabic numerals, the ancient Greek and Roman empires struggled to develop financial management systems for commerce and government. Such systems had to await new numbers with decimal points and a new method of financial accounting. The new numerical system, which originated in India, was introduced to Europe by Arab mathematicians around the twelfth century. Soon a new double-entry accounting system appeared in northern Italy. By the turn of the fourteenth century, city republics emerged in Florence, Genoa, and Venice that were run by patrician merchants whose wealth came from international trade. These merchants used the new accounting system to coordinate long-distance trade and banking. According to Soll, double-entry accounting and a merchant culture of transparency and accountability allowed free-market capitalism to arise first in northern Italy.

The advanced method of accounting that developed in northern Italy was described by Luca Pacioli in his 1494 treatise on arithmetic, geometry, and proportion (Summa de Arithmetica, Geometria, Proportioni, et Proportionalita). In his chapter on accounting, Pacioli explained how double-entry accounting systems allowed merchants to develop profit and loss statements to measure performance and encourage accountability. In contrast to single-entry systems, which also tracked periodic profit and loss, double-entry accounting systems allowed merchants to develop balance sheets that tracked changes in asset values and net worth over time. Called the “Venetian System” by Pacioli, double-entry accounting offered merchants the essential tool of capitalism: the capacity to keep a running total of asset and liability accounts along with contributed capital and retained earnings (Soll 2014). As all students of accounting know, the double-entry system of debits and credits allows managers to track changes in these accounts over time that are due to the operating and investing activities of the firm.
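The core discipline of the Venetian System can be sketched in modern terms: every transaction posts equal debits and credits, so the ledger always balances and profit falls out of the revenue and expense accounts. The minimal Python sketch below illustrates this logic; the account names and ducat amounts are illustrative inventions, not figures from Pacioli or Soll.

```python
from collections import defaultdict

# Minimal double-entry ledger: each entry posts equal debits and
# credits, so total (debit-positive) balances always net to zero.
class Ledger:
    def __init__(self):
        self.balances = defaultdict(int)  # account -> debit-positive balance

    def post(self, debits, credits):
        """debits/credits: dicts mapping account -> amount; totals must match."""
        if sum(debits.values()) != sum(credits.values()):
            raise ValueError("unbalanced entry")
        for acct, amt in debits.items():
            self.balances[acct] += amt
        for acct, amt in credits.items():
            self.balances[acct] -= amt

    def trial_balance(self):
        # Zero in a well-formed ledger -- the built-in error check
        # that made double-entry books trustworthy.
        return sum(self.balances.values())

ledger = Ledger()
# A merchant contributes 1,000 ducats of capital ...
ledger.post({"Cash": 1000}, {"Contributed Capital": 1000})
# ... buys 600 ducats of trade goods ...
ledger.post({"Inventory": 600}, {"Cash": 600})
# ... and sells them all for 900 ducats cash.
ledger.post({"Cash": 900}, {"Sales Revenue": 900})
ledger.post({"Cost of Goods Sold": 600}, {"Inventory": 600})

# Profit = revenue (a credit balance) minus expenses (a debit balance).
profit = -ledger.balances["Sales Revenue"] - ledger.balances["Cost of Goods Sold"]
```

The trial balance is the point: because every posting is two-sided, an arithmetic slip anywhere in the books surfaces as a nonzero total, which is what let merchants (and their investors) audit the record.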

Pacioli’s treatise not only set out the fundamentals of double-entry accounting, which has changed very little in over 500 years, but also disclosed the values and norms that were essential to good financial management and good government. Quoting passages and characters from the Bible, Pacioli argued that hard work and profits were virtues and that God would take care of those who worked hard and knew how to count. He also argued that faithfully keeping a timely and truthful account of financial transactions would put merchants in good standing with God. Well before the Protestant Reformation and its strong work ethic, therefore, a Roman Catholic monk emphasized the virtues of hard work, transparency, and accountability (Soll 2014). Raised within the merchant culture of medieval Italy, Pacioli understood that commerce required good accounting and a strong set of social norms. The merchant culture that emerged in northern Italy generated great wealth that financed unparalleled advances in art, architecture, music, science, and literature. Coming as it did after the stagnation of the Middle Ages, this period of renewed prosperity and discovery became known as the Italian Renaissance.

A popular myth is that Pacioli invented double-entry accounting at the time of his treatise in 1494. As is commonly the case, the truth is much more interesting than the myth. Rather than invent double-entry accounting, Pacioli’s treatise simply presented the system of accounting taught in accounting and abacus schools throughout northern Italy after about 1300 (Soll 2014). Pacioli’s treatise did not achieve either critical or popular acclaim at the time of its publication. This is attributable in part to forces that weakened the merchant culture in Italy. The same year his treatise was published, France and Spain separately invaded the Italian peninsula and turned it into a battlefield for sixty years. Rather than use religion to argue for transparency and accountability, the French and Spanish monarchies used religion to argue for privileged information and the divine rights of kings. The aristocratic humanism that appealed to the new elites of the Italian Renaissance scorned merchant knowledge, and the Roman Catholic Church increasingly associated financial management and banking with immorality. Given these forces, the merchant culture that made northern Italy the center of international trade and banking in Europe began to fade.

The Netherlands soon became the new center of banking and international trade in Europe. As in Italy, accounting schools sprang up in Antwerp and throughout the Netherlands to teach double-entry accounting and financial control. These schools helped nurture a culture of transparency and accountability in both commerce and government. In contrast to Italy, the merchant culture that evolved in the Netherlands was supported by the spread of the Protestant Reformation. To the Dutch Calvinists, wealth and success in commerce became signs of one’s membership in God’s elect. All work was a special calling, which supported a high view of work, commerce, and banking. By 1567, Antwerp was the richest city in the world, with plentiful shops, grand houses, forty-two churches, and a stock exchange. As the strong merchant culture of the Netherlands penetrated society and politics, the northern provinces eventually threw off the rule of their Habsburg overlords in 1581. In response, King Philip sent the Spanish Imperial Army to lay siege to the southern city of Antwerp, reducing its population from 100,000 to 40,000 as its artisans and merchants fled to the free north. Soon, Amsterdam became the new center of banking and international trade in Europe. With their accounting schools and well-developed stock exchange, Dutch knowledge of finance and markets soon surpassed that of their Italian predecessors (Soll 2014). It is no coincidence, therefore, that the first publicly traded limited liability company was formed and traded in Amsterdam.

In 1602, the Dutch East India Company issued bonds and shares of stock to the public. Dutch citizens had faith in the value of the publicly traded shares because of their strong merchant culture and its social norms of transparency and accountability. The story of the Dutch East India Company, however, is a case study in the difficulty of maintaining transparency and accountability in capital markets (Soll 2014). The company’s managers maintained closed books throughout its nearly 200-year history. Although the company’s board had promised both, by 1620 no dividends had been paid and no external audits of the company’s books had been conducted. This lack of transparency caused investors to suspect that managers and large investors were embezzling funds, and shares were bought and sold on the basis of marketplace rumors rather than public financial information. As a result, stock prices experienced large fluctuations. Facing a shareholder revolt in 1622, the company’s seventeen directors finally agreed to an internal audit conducted by the state in private. Thus, transparency and accountability were never achieved at the first publicly traded corporation.

In 1662, Pieter de la Court published a book explaining the merchant culture that had evolved in the Netherlands, entitled True Interest and Political Maxims of the Republic of Holland. His book was an instant best seller throughout the Netherlands and was translated and read in other countries as an explanation for the Dutch economic miracle that was unfolding as a result of free-market capitalism and republican government. De la Court argued that republican government generated greater economic prosperity for its citizens than absolute monarchy. Further, he claimed that successful industry and commerce were not possible without political and economic freedom, accountability, and religious tolerance (Soll 2014). His views were in direct opposition to the power and ambition of the Habsburg overlords. Despite growing hostilities with France and Spain, as well as new hostilities with Sweden and England, the Netherlands entered its golden age. Similar to the Italian Renaissance, this golden age yielded great advances in art, architecture, music, science, and literature. The merchant culture of the Netherlands, however, also began to fade in 1672 when Louis XIV’s troops invaded Holland and William of Orange was made the Dutch stadtholder (ruler).

Free-Market Capitalism Spreads to Britain and the United States

Free-market capitalism eventually spread to Great Britain and the United States. Estimates of per capita gross domestic product (GDP) (Maddison 1995, 2001) suggest that the average person experienced zero income growth from 1 to 1000 AD. In contrast, there was meager but steady growth in per capita GDP from 1000 to 1800. As expected, this growth in income was concentrated primarily in Europe and was driven by the merchant economy in Italy, followed by the Netherlands. In the early 1800s, however, worldwide growth in per capita GDP exploded to more than two percent per year and continued at that rate for the following two centuries. This explosion in wealth and prosperity was driven primarily by the free-market economies in Britain and the United States. The dramatic economic growth that occurred in these two countries far surpassed that of Italy and the Netherlands, and had significant implications for the development of the firm.
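The magnitude implied by a sustained two percent annual growth rate is easy to understate. A quick compounding check, using only the rate and horizon stated in the text (the arithmetic itself is merely illustrative), shows why economists describe it as an explosion:

```python
# Compound growth at 2% per year sustained over two centuries,
# the rate and horizon cited from the Maddison GDP estimates.
rate, years = 0.02, 200
growth_factor = (1 + rate) ** years

# (1.02)^200 is roughly 52: per capita income multiplying
# fifty-fold, versus essentially zero growth over the
# preceding millennium.
print(round(growth_factor))
```

By contrast, even a full millennium of the earlier "meager but steady" growth could not produce anything close to a fifty-fold rise, which is why the post-1800 break in the series is so striking.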

As with Italy and the Netherlands before them, Britain and the United States developed merchant cultures that generated capitalist institutions that supported international trade. These institutions included both formal rules of law and informal behavioral rules and norms. In contrast to the two previous free-market economies, however, Britain and the United States benefitted from a new theory of natural law that controlled the power of government and provided its citizens with unparalleled civil liberties. The economic, political, and religious freedoms afforded their citizens generated an explosion of innovation in science and technology, and led to two industrial revolutions. The economic growth experienced by Britain and the United States was further fueled by advances in communication and transportation (Bernstein 2004). Early on, for example, the economy in Britain benefitted from a vastly expanded post office, a nationwide network of coffeehouses, an improved system of roads, an expanding system of navigable rivers, and a growing industry in shipbuilding and international travel (Pincus 2009).

The first half of the seventeenth century found England locked in a violent conflict between Parliament and two Stuart kings, James I and Charles I. This conflict led to a bloody civil war and the eventual beheading of Charles in 1649. The Stuart Monarchy was replaced by the Commonwealth of England (1649–1653) followed by the Protectorate under the rule of Oliver Cromwell (1653–1658) and his son (1658–1659). During the years of the Commonwealth and the Protectorate, England was increasingly influenced by the merchant culture of the Netherlands. Over this time period, unquestioned loyalty to the nobility faded, and public policy became increasingly influenced by Parliament, the courts, and public opinion. Further, public opinion became increasingly shaped by a populace that made its living by buying and selling. By the time the Stuart monarchy was restored with the reign of Charles II in 1660, England had become a trading nation with a vibrant merchant culture (Pincus 2009).

Charles II was king of England, Scotland, and Ireland from the restoration of the monarchy in 1660 until his death in 1685. Charles initially sided with France and his first cousin Louis XIV against the rising economic and military power of the Netherlands. This led to two Anglo-Dutch wars during Charles’s reign (1665–1667 and 1672–1674). Although Charles attempted to introduce religious freedom for Catholics in Anglican England, an exclusion controversy arose when it was discovered that Charles’s brother and heir James was a Catholic. This controversy gave birth to two dominant political parties in England: the pro-exclusion Whig party and the anti-exclusion Tory party. While unsuccessful in excluding James from becoming king upon Charles’s death, the Whigs repeatedly blocked attempts by James to impose a French-style monarchy on the English people. The commonality of James’s cultural vision with that of the French monarchy is explained by Steve Pincus (2009, 484):

While merchants yearned for a world in which economic, political, and social information was freely available, James saw informational transparency as politically dangerous. His cultural vision, not surprisingly, had much more in common with that of Louis XIV and Jean-Baptiste Colbert than it shared with that of James’s father.

Under the reign of James II, both Whigs and Tories grew increasingly suspicious of France’s intentions. Jean-Baptiste Colbert’s system of accounting and financial control had made Louis XIV the richest prince in Christendom, and Louis was intent on using his great wealth to spread his influence across Europe. The increasing French influence was devastating for England, which soon ran up a large trade deficit with France. In contrast to the Whigs, however, the Tories felt that the Dutch were a greater threat to England than were the French. The Whigs praised the Dutch as defenders of liberty and industrious self-improvement, while the Tories criticized the Dutch as overly individualistic, ambitious, and economically Machiavellian (Pincus 2009). It soon became clear to both political parties, however, that England had to resist the influence of the French monarchy upon the nation. After consolidating political and financial support, and at the invitation of leading English political figures, William III, Prince of Orange, invaded England in 1688. In 1689, he took the English throne jointly with his wife Mary II, the daughter of James II.

Upon returning to England from his self-imposed exile in the Netherlands in 1689, the Whig philosopher John Locke published his classic work on natural law and government, Two Treatises of Government. In his first treatise, Locke challenged the divine right of kings by referencing Biblical texts and historical examples. In his second treatise, he established a foundation for law and government based on the concept of natural law. According to Locke, humans were entirely free and equal in their natural state. The law of nature, therefore, required that they be free, equal, and subject only to the will of the “infinitely wise maker.” The purpose of civil government, according to Locke, was to enforce the law of nature and thereby protect the freedom and property of its citizens. Locke’s views regarding natural law were so radical at the time that he originally published his Treatises anonymously. Yet, his views became the dominant perspective in Whig England and heavily influenced the English colonists in America. Locke’s views on civil government largely paralleled the views of de la Court in the Netherlands over two decades earlier. Locke’s views on natural law, however, were instrumental in establishing a new system of common law that supported free-market capitalism in Britain and the United States.

North and Weingast (1989) argue that the Glorious Revolution of 1688–1689 brought in institutions that put Britain on the path toward free-market capitalism and the first Industrial Revolution. In contrast, Pincus (2009) argues that the Glorious Revolution continued the process of modernization that had begun at the beginning of the seventeenth century with debates over the power of the English Church and continued with the English Commonwealth and Protectorate in the 1650s. Thus, formal and informal institutions in support of free-market capitalism were already in place by 1688. Rather than emerge rapidly in the wake of the Glorious Revolution, therefore, capitalist institutions emerged over the seventeenth century as a merchant culture slowly took hold. As in Italy and the Netherlands before it, this culture generated large cities with a populace dedicated to commerce and trade. By the end of the seventeenth century, for example, London had a population of around 575,000 people, which was 65,000 greater than Paris and more than twice the size of Amsterdam. The growing city of London soon became the new center for banking and international trade in Europe (Pincus 2009).

As in Italy and the Netherlands before it, Britain’s merchant culture was supported by an education system that taught double-entry accounting and financial management. Accounting schools became common in English society, and by the end of the eighteenth century, they had increased to more than 200 (Soll 2014). Heavily influenced by the Protestant Reformation, these schools advanced the ideals of self-discipline, industry, scientific progress, and professional norms. The social norms taught in these schools provided a disciplined and hard-working populace, which further supported free-market capitalism. For example, by most accounts the average Englishman worked three times as many hours in a year as any other European (Burk 2007). The result was the wealth of England, which was the foundation for its imperial power and influence around the world. The strong merchant culture that evolved in Britain fueled great advances in science and technology, and led to the first Industrial Revolution from 1760 to 1840.

The influence of the monarchy culture in Britain, however, had not completely vanished. By the publication of Adam Smith’s Wealth of Nations in 1776, Britain needed to be reminded of the aspects of the merchant culture that created the country’s great wealth. In fact, Adam Smith used the British colonies in America rather than his own home country for his prime examples of free-market capitalism (Phillipson 2010). The establishment of the new nation also reflected the global struggle between the merchant culture and the monarchy culture. By the middle of the eighteenth century, there were approximately 1.5 million British colonists on the continent compared to only 70,000 French colonists (Burk 2007). Due to continuous fighting over territory and trading rights, England declared war on France in 1756. The French and Indian War lasted seven years, and the victory of the British solidified the influence of the merchant culture in the eyes of what were increasingly becoming American colonists. American leaders such as Benjamin Franklin, John Adams, and Thomas Jefferson were heavily influenced by John Locke and Scottish Enlightenment philosophers such as David Hume and Adam Smith. American colonists were also heavily influenced by the Protestant Reformation and a series of religious revivals on the continent called the “Great Awakening” and led by British Protestants such as Jonathan Edwards, John Wesley, and George Whitefield.

Kathleen Burk (2007) describes how the New World of the American colonies became increasingly similar to the Old World in Britain. In addition to sharing a common language and religion (predominantly Protestant), the two countries shared a common culture tied to Lockean principles of equality before the law, private property, and representative government. By the end of the French and Indian War in 1763, George III was king of Great Britain, and the influence of the Whig party in the Parliament was on the wane. By that time, however, the merchant culture of the Whigs had crossed the Atlantic and was firmly planted in American soil. Thus, it is not surprising that American colonists would seek their independence when King George and the Tory-dominated British Parliament began to infringe on their civil liberties. It is also not surprising that Americans who sought independence from Britain associated themselves with the Whig party while Loyalists associated themselves primarily with the Tory party. Some prominent British Whigs, including Adam Smith, opposed the heavy hand of Parliament and supported the American colonists’ quest for greater liberties and economic freedom (Phillipson 2010).

After the American War of Independence (1775–1783), however, the new nation formed a strong bond with the mother country (Burk 2007). By 1790, Britain received almost one-half of America’s exports and supplied more than four-fifths of her imports. In addition to trade, Britain contributed considerable financial resources to help build the new nation, accounting for more than one-half of US bonds held abroad. British investors could be confident in US bonds because of the culture of transparency and accountability that had been planted in American soil. This culture was not always beneficial to the mother country, as when the United States kept a careful accounting of losses and claims relating to a warship named the Alabama, which Britain had built for the Confederate side during the US Civil War. Relations between the two countries were not fully normalized until Britain settled these claims in 1872, paying the United States $15.5 million in gold (Burk 2007). At first, the United States exported primarily raw materials and foodstuffs to Britain, including raw cotton, wheat, and tobacco. The second industrial revolution (1870–1914), which involved primarily US industries, allowed the new country to export manufactured goods to the mother country.

German sociologist and political economist Max Weber visited the United States in 1904 and provided a description of the merchant culture that had evolved in the new nation in his classic work, The Protestant Ethic and the Spirit of Capitalism, translated into English in 1930 (Scaff 2011). Weber associated the spirit of capitalism with the Protestant Reformation’s view that secular work and profit were morally good, and used Benjamin Franklin and the American colonists as his prime examples. As with Britain, free-market capitalism was cultivated in the United States by a merchant culture that valued transparency and accountability. For example, Franklin learned double-entry accounting as a printer’s apprentice and kept meticulous books throughout his life. He used his accounting training in his position as Postmaster General of the American colonies and later in his diplomatic missions representing the fledgling republic. This training helped Franklin manage America’s international finances and loans. It is not surprising, therefore, that he viewed accounting as essential to good government and recommended that families teach it to their children (Soll 2014). George Washington also kept meticulous records of his spending during the Revolutionary War and promptly produced them when his enemies accused him of profiting from the war.

Industrial Revolutions in Britain and the United States

A number of factors came together in Britain between the late Middle Ages and the mid-eighteenth century to fuel the first industrial revolution of 1760–1840. Cultural factors included the replacement of medieval Catholicism with enlightenment Protestantism, the adoption by the upper classes of a mechanical worldview inspired by Isaac Newton and the Scientific Revolution, and the spread of that worldview to the working classes through an expansion of literacy in reading and mathematics (Allen 2009). Noncultural factors included the emergence of plentiful coal supplies, a continuous supply of productive labor, an unmatched banking system, manufacturing industries, and the strongest navy in the world to keep international trade routes open and safe. Britain had also built a strong transportation infrastructure of roads, canals, railroads, and naval shipping. These factors gave Britain a large advantage in manufacturing and shipping, and helped the small island nation become the first industrial nation, maintaining a lead of about a century over other European countries (Burk 2007).

Soll (2014) emphasizes the role of Britain’s merchant culture in fueling the first industrial revolution. Supported by representative government, the Protestant Reformation, and an education system that taught double-entry accounting and financial management, Britain’s merchant culture was unique up until that time. Almost two centuries after Francis Bacon had invented the scientific method, and nearly a century after Isaac Newton had discovered the laws of the heavens, English merchants married scientific rationalism with good accounting to generate great advances in technology and commerce. For example, Josiah Wedgwood used advanced accounting and financial control to increase productivity and profitability in his china company, and James Watt used advanced accounting methods to gain a competitive advantage in his industrial production of the steam engine. Watt’s innovations in accounting and steam engine technology at the University of Glasgow were well known by Adam Smith, whose own knowledge of accounting and finance served him well in his role as administrator at the university and later as Commissioner of Customs in Scotland (Phillipson 2010).

At the beginning of the first industrial revolution, the United States was a developing country in need of both capital and labor. The first of Britain’s new industries to migrate to the United States was the textile industry. The first water-powered cotton mill was constructed in Lancashire, England in 1764. By 1812, managers and workers from Lancashire, Yorkshire, and Scotland had crossed the Atlantic to establish textile mills and factories in Massachusetts (Burk Reference Burk2007). During this time, the firm underwent significant change (Chandler Reference Chandler1977, Johnson and Kaplan Reference Johnson and Kaplan1987). Before the first industrial revolution, raw materials and intermediate inputs going into the production of clothing were priced by markets formed by independent contractors. For example, spinners were paid per pound for yarn, weavers were paid per yard for cloth, and assemblers were paid per unit of clothing. During the first industrial revolution, spinners, weavers, and assemblers were hired by the firm and brought into the factory. Internalizing these former market transactions required advances in managerial accounting and control, as the firm now had to determine prices for all the factors of production. In particular, the accounting system now needed to keep track of the efficiency with which a textile factory used cotton and labor time.
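The shift the paragraph above describes, from market piece rates to internal cost accounting, can be illustrated with a small sketch. All figures below are invented for illustration; the point is the mechanics, not the numbers. Before the factory system, the market's piece rate priced each stage of production; inside the firm, the accounting system itself must compute a cost per pound of yarn from the cotton and labor actually consumed.

```python
# Hypothetical piece rates once paid to independent contractors ($, invented figures)
spinning_rate_per_lb = 0.30   # $/lb of yarn spun
weaving_rate_per_yd = 0.12    # $/yd of cloth woven

# Inside the factory, the same work is costed from the inputs actually used
cotton_cost_per_lb = 0.20     # $/lb of raw cotton
wage_per_hour = 0.10          # $/hour of factory labor

def internal_cost_per_lb_yarn(lbs_cotton_used, hours_worked, lbs_yarn_out):
    """Cost per pound of yarn produced inside the factory,
    built up from material and labor actually consumed."""
    total_cost = lbs_cotton_used * cotton_cost_per_lb + hours_worked * wage_per_hour
    return total_cost / lbs_yarn_out

# A week's production: 1,100 lbs of cotton and 500 labor hours yield 1,000 lbs of yarn
cost = internal_cost_per_lb_yarn(1100, 500, 1000)
print(f"internal cost ${cost:.3f}/lb vs market piece rate ${spinning_rate_per_lb:.2f}/lb")
```

Tracking this internal cost week by week is what let a mill judge the efficiency with which it used cotton and labor time, a judgment the piece-rate market had previously made for it.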

With the end of the US Civil War in 1865, the young country could now dedicate its energy and resources to the development of its own industrial power. This led to a second industrial revolution in the United States from 1870 to 1920. The completion of the transcontinental railways meant that the entire country was open for economic development. As the population grew, so did the demand for new products and means of shipping those products. As a result, hundreds of new industries were established. Further, developing capital markets in New York provided the funds to support the expanding economy. In 1871, America’s manufacturing output was one-quarter that of Britain’s, US coal production was one-eleventh that of the mother country, and US steel production was almost nonexistent. By the end of the century, however, America’s manufacturing output had increased to seventy percent of Britain’s, US coal production had almost pulled even, and US steel production was twice that of Britain. Thereafter, the automobile industry, advances in manufacturing techniques such as the assembly line, and the industrial buildup to World War I created the conditions for further industrial growth. The result of the second industrial revolution was that the United States surpassed Britain as the strongest economic power in the world, and New York surpassed London as the world’s center of banking and international trade (Burk Reference Burk2007).

Industries in the United States at the beginning of the second industrial revolution included iron and steel, foodstuffs, petroleum, chemicals, and machinery. These industries required bigger and more complex organizations, necessitating further developments in the firm. The story of the US steel industry is illustrative. Andrew Carnegie migrated with his family to the United States from Scotland in 1848. Beginning as a laborer changing spools of thread in a cotton mill in Pittsburgh, he eventually worked his way up in the Pennsylvania Railroad Company as a telegraph operator and manager and was instrumental in helping the Union transport troops and supplies during the US Civil War. After the war, Carnegie amassed a huge fortune investing in steel mills before donating almost ninety percent of it to charity. Similar to Josiah Wedgwood in his British china factories, Carnegie used accounting and financial control as a competitive advantage. In particular, he used detailed cost information to push his direct costs below those of his competitors so that he could ensure sufficient demand to keep his plants running at full capacity, even during economic recessions. Weekly data on direct material and conversion costs for each process in his mills were sufficient for Carnegie to invest more capital and earn higher returns than any other steelmaker in the nineteenth century (Johnson and Kaplan Reference Johnson and Kaplan1987).

Another US industry that expanded rapidly during the second industrial revolution was the railroad industry. Fueled by new iron and steam-power technologies, railroad mileage tripled from 1849 to 1856 (Burk Reference Burk2007). This rapid expansion required a massive influx of capital, and led to $350 million of railroad stock being listed on the New York Stock Exchange by 1869 (Soll Reference Soll2014). By the 1870s, the United States had 51,000 miles of track, as much as Britain, the rest of Europe, and the rest of the world combined. To manage the large volume of transactions over an increasing expanse of miles, railroad companies had to develop advanced accounting systems and financial controls. Tracks, landholdings, coal supplies, stations, ticket sales, freight, and personnel costs had to be carefully recorded and tracked. Teams of accountants from each division of the railroad were required to record transactions and then send the results to the central office. Importantly, advanced accounting procedures were needed to record and control cash at hundreds of locations throughout the country. Thus, the rapid growth of the railroad industry required advances not only in accounting and control, but also in communication technologies such as the telegraph.

A New Opportunism Culture Emerges

The massive scale of railroads in America marked the advent of “big business” (Johnson and Kaplan Reference Johnson and Kaplan1987). It also laid bare, however, the difficulty of maintaining transparency and accountability with large, complex firms (Soll Reference Soll2014). As capital poured into US railroads from around the globe, vast fortunes were made by “robber barons” who created their wealth through opaque public reporting, insider trading, and stock price manipulation. Jay Gould, for example, used secrecy and stock manipulation to gain control of both the Atlantic and Great Western Railroad and the Erie Railroad. While great fortunes were being made by opaque public reporting and stock manipulation, fortunes were also being lost. The US government soon took steps to correct these and other potential weaknesses that the second industrial revolution had uncovered in the US capitalist system. First, Congress created the Interstate Commerce Commission in 1887 to regulate the railroad industry. Next, Congress passed the Sherman Antitrust Act in 1890 to break up the great monopolies and trusts that had put too much power into too few hands. Private institutions also took an active role in protecting investors. New York began requiring financial audits for publicly traded firms in 1849, and the American Association of Public Accountants was established in 1887. Motivated by the lack of financial information on railroad securities, John Moody began publishing his Manual of Industrial Statistics in 1900, which eventually led to the largest bond rating agency in the world.

The industrial revolutions in Britain and the United States increased the size and complexity of the firm, and some entrepreneurs used this complexity to profit at the expense of less informed investors. This spawned a new culture of opportunism based on a lack of transparency and accountability. The opportunism culture of this era is epitomized by J. P. Morgan’s quip in 1901, “I owe the public nothing” (Marchand Reference Marchand1998). As the influence of large firms in society grew, this culture of social and moral indifference imposed itself on the whole nation (Khurana Reference Khurana2007). The consequences of the new opportunism culture would soon become apparent. In 1926, William Ripley, a Harvard economics professor, wrote a prophetic article in the Atlantic Monthly entitled, “Stop, Look, Listen! The Shareholder’s Right to Adequate Information.” In the article, Ripley predicted that the current lack of financial transparency would undermine the US economy. At the time, there were no rules to govern corporate financial reporting or public disclosures. Publicly traded companies frequently did not report their earnings or provide balance sheets with accrued liabilities. As a result, the public did not have the information needed to make sound financial investments. Similar to the Dutch East India Company three centuries before, stock prices were based primarily on speculation and marketplace rumors.

Despite Ripley’s warning, trading on the New York Stock Exchange (NYSE) continued unabated through the end of the Roaring Twenties. On October 24, 1929, however, the NYSE experienced a large sell-off that quickly spread across all world markets. Over the next five days, the NYSE lost more than thirty percent of its value, which contributed to the onset of the Great Depression. By 1933, stock prices had dropped eighty-nine percent, wiping out millions of investors. The lack of financial transparency in the stock market did end up undermining the US economy, which did not fully recover until 1940. Despite Franklin Roosevelt’s New Deal programs, unemployment remained in double digits throughout the 1930s. At the economy’s lowest point, gross domestic product had dropped thirty percent, 9,000 banks had failed, and unemployment had reached twenty-five percent. As Soll (Reference Soll2014, 192) explains:

Never had an economy so rich and so sophisticated, based largely on publicly traded stock, been allowed to remain so opaque.

The US government went into action to increase regulation and oversight. In 1933, Congress enacted the Glass-Steagall Act, which separated the activities of investment and commercial banking to prevent investment banks from putting depositors’ funds at risk. In 1934, the Roosevelt administration established the Securities and Exchange Commission (SEC) to standardize accounting and reporting for publicly traded companies. The SEC was charged with increasing financial transparency, limiting insider trading, and banning preferred stock lists that had given some investors special privileges over other, less-connected investors. The task of standardizing accounting and reporting for publicly traded companies, however, was largely delegated to members of the accounting profession. Thus, while the SEC required all publicly held companies to provide financial reports that were audited for their conformance to Generally Accepted Accounting Principles (GAAP), the responsibility for creating and updating those accounting principles was left to independent boards made up primarily of Certified Public Accountants.

The beginning of the US accounting profession can be traced to British accountants in the early nineteenth century. By the 1840s, accounting firms had appeared throughout Britain in response to the growth of large-scale organizations and the introduction of income taxation (Parker Reference Parker1986). Deloitte, Price Waterhouse, Whinney Smith & Whinney (now Ernst & Young), Touche, and many other accounting firms appeared in London, the Midlands, and Edinburgh. British accountants first migrated to the United States to protect British investments that had been made in the young nation. Sometime after the 1870s, these accounting firms established offices in the United States to handle the emergence of large corporations during the second industrial revolution. In 1900, Arthur Dickinson moved from his home in London to head the New York office of Price Waterhouse. He found US business fast-paced, unpredictable, and largely unregulated. In particular, he discovered that annual audits – always the backbone of business in England – were few in number and dubious in quality (Soll Reference Soll2014).

In 1913, Arthur Andersen began a uniquely American accounting firm in Chicago. The son of Norwegian immigrants and trained by Price Waterhouse & Company, Andersen was meticulous about audit quality and independence, insisting that auditors answer first and foremost to investors. To teach auditors his high standards of accuracy, discipline, and moral rectitude, Andersen created his own accounting university on a 55,000-acre campus in St. Charles, Illinois. His firm, however, had a more modern view of the role of the accounting profession in the economy. In contrast to British-based accounting firms, Andersen believed that accountants could also act as consultants for their audit clients (Soll Reference Soll2014). By 1970, with competition growing between the Big Eight accounting firms (Price Waterhouse; Deloitte, Haskins & Sells; Peat Marwick Mitchell; Arthur Andersen; Touche Ross; Coopers & Lybrand; Ernst & Whinney; and Arthur Young), other firms joined Andersen in pursuing lucrative consulting contracts with their audit clients.

The accounting profession soon came under the influence of the opportunism culture. Rather than serving as impartial referees with the highest standards of audit quality and independence, audit firms became ineffective in exposing misleading financial statements and even acted as skilled enablers of financial fraud (Soll Reference Soll2014). Under a hail of lawsuits against Big Eight accounting firms in 1976, Congress finally stepped in with an investigative committee led by Democratic senator Lee Metcalf. The Metcalf Committee Report criticized the practice of accounting firms providing consulting services for their audit clients, calling it incompatible with the responsibility of auditors to be independent. Given that consulting yielded fees ranging in the tens of millions of dollars, however, accounting firms used their influence to continue the practice. The pursuit of consulting contracts on audit engagements continued unabated, with consulting fees growing in relation to audit fees. In 1989, a consolidation frenzy hit the US accounting profession as Deloitte, Haskins & Sells merged with Touche Ross to form Deloitte & Touche, and Ernst & Whinney merged with Arthur Young to form Ernst & Young.

At about this same time, advances in technology generated the longest bull market in US history, lasting from the end of 1987 to the beginning of 2000. In what would later be called the “dot-com” bubble, this stock market boom fueled an explosion in mergers and acquisitions and created a new generation of high-tech mega-firms. Because of the strong reputation of its consulting arm, Andersen provided both audit and consulting services to many of these large firms, including Enron and WorldCom. As a result, Andersen’s consulting fees soon eclipsed its auditing fees. The old, staid culture of transparency and accountability was soon replaced by a high-flying, performance-oriented culture as consulting partners began to assert their influence on the firm. Andersen Consulting demanded plush, well-furnished offices and lucrative pay packages based on financial incentives for sales performance. Increasingly, Andersen’s corporate culture reflected the opportunism culture of the early railroad industry rather than the merchant culture of the accounting profession (Squires, Smith, McDougall, and Yeack Reference Squires, Smith, McDougall and Yeack2003; Toffler Reference Toffler2003).

The consequences of this opportunism culture soon became apparent. Under the Energy Policy Act of 1992, Congress allowed states to deregulate their electricity utilities, opening them up to greater competition. Enron quickly took advantage of the new deregulatory environment, purchasing utility and energy firms on the strength of its rising stock price. In 1997, Jeffrey Skilling was promoted to chief operating officer, second only to CEO Kenneth Lay. Skilling set out to hire the best talent, recruiting only from top MBA programs and investment banks, and rewarded his new talent with merit-based bonuses that had no cap (Thomas Reference Thomas2002). As stock prices fell in 2000, however, it became increasingly difficult for Enron to meet analysts’ earnings projections and the required leverage ratios of rating agencies. In response, Enron began to inflate its income and keep liabilities off of its balance sheet through the use of special purpose entities (SPEs), which essentially were divisions of the firm reported as independent companies. As predicted in an e-mail to Lay by whistle-blower Sherron Watkins, Enron soon imploded in a wave of accounting scandals (Morse and Bower Reference Morse and Bower2002). Named by Fortune as “America’s Most Innovative Company” for six years in a row, Enron filed for bankruptcy in December 2001. The fall of Enron cost investors $11 billion. Skilling’s disdain for accounting and accountability was conveyed by his testimony before Congress in February 2002. He stated multiple times that he was not an accountant and was therefore unaware of the accounting scandals perpetrated on his watch. This defense was ineffective, however, as Skilling was convicted of federal felony charges and sentenced to twenty-four years in prison.

Enron’s bankruptcy, with assets of $63 billion, was the largest in US history until WorldCom’s $104 billion bankruptcy in July 2002. WorldCom’s accounting scandal, however, was much simpler. Rather than expensing the monthly line costs it paid to lease capacity on other carriers’ networks, WorldCom capitalized and depreciated those costs over future periods, in violation of GAAP. In effect, the company treated monthly utility expenses like long-term investments. An internal auditor, Cynthia Cooper, uncovered the creative accounting and approached Andersen, but was told matter-of-factly that it was not a problem (Ripley Reference Ripley2002). Both Enron and WorldCom were audited by Andersen, and both firms had lucrative consulting contracts with the accounting firm. The accounting scandals and related audit failures at these large corporations quickly took down Andersen. After the Justice Department secured an obstruction of justice confession from the lead partner on the Enron audit in 2002, Andersen told the SEC that it would no longer audit the books of public companies. Once an accounting and consulting giant with 85,000 staff worldwide, Andersen currently employs only a skeleton staff to manage its continuing litigation (Squires et al. Reference Squires, Smith, McDougall and Yeack2003, Toffler Reference Toffler2003).
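The mechanics of this manipulation are simple enough to sketch with hypothetical numbers. Capitalizing a recurring operating cost pushes most of it into future years as depreciation, so only a fraction of the cost hits current income:

```python
# Hypothetical figures illustrating capitalizing vs. expensing a period cost.
# Capitalizing defers most of the charge to future years as depreciation,
# inflating income in the current period.

def reported_income(revenue, line_costs, capitalize=False, life_years=10):
    """One year's reported income under the two treatments
    (straight-line depreciation assumed for the capitalized case)."""
    if capitalize:
        # Only one year's depreciation of the capitalized cost hits income now
        expense = line_costs / life_years
    else:
        # Proper treatment: a period cost is expensed in full when incurred
        expense = line_costs
    return revenue - expense

honest = reported_income(1000, 400, capitalize=False)
inflated = reported_income(1000, 400, capitalize=True)
print(honest, inflated)  # the deferred cost still exists; it has only been hidden
```

With $1,000 of revenue and $400 of line costs, expensing reports $600 of income while capitalizing over ten years reports $960, deferring the remaining $360 of cost into future periods where it quietly depresses later earnings.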

Similar to the stock market crash that ended the Roaring Twenties, the stock market crash that ended the dot-com bubble led to increased government regulation and oversight. In 2002, Congress passed the Sarbanes-Oxley Act (SOX) to strengthen corporate governance and enact sweeping reforms in the accounting industry. SOX requires all CEOs and CFOs of US-registered companies to certify the honesty of their financial reports; Congress clearly remembered Jeffrey Skilling’s attempts to deny knowledge of Enron’s accounting manipulations. The Act also makes it unlawful for accounting firms to provide certain nonaudit services, including tax services, unless the activities are preapproved by the audit committee of the client company’s board, and bans outright the provision of eight other categories of nonaudit services. The legislation further requires the rotation of the lead audit partner and any reviewing partner off a client account after five years, and requires auditors to audit and provide assurance on the effectiveness of the client’s internal controls. The centerpiece of the Act, however, is the creation of the Public Company Accounting Oversight Board (PCAOB). Although the SEC retains oversight and enforcement authority, the new regulatory body is empowered to oversee the quality of audits and initiate its own inspections of the audit firms (Squires et al. Reference Squires, Smith, McDougall and Yeack2003).

The Opportunism Culture Nearly Takes Down the Financial System

Another Act of Congress, however, made investors increasingly vulnerable by removing an important safeguard that had been in place since the Great Depression. In 1999, Congress repealed the Glass-Steagall Act and replaced it with the Gramm-Leach-Bliley Act. The new Act consolidated the activities of investment and commercial banking, allowing commercial banks to not only take deposits and make loans, but also underwrite and sell securities. This included the underwriting and selling of mortgage-backed securities. As part of the New Deal in 1938, Congress established the Federal National Mortgage Association (Fannie Mae) to purchase certain mortgages from originators and securitize them for resale. Fannie Mae went public in 1970, the same year that the federal government allowed it to purchase conventional mortgages, and issued its first mortgage-backed security in 1981. The Gramm-Leach-Bliley Act, along with mounting pressure from Congress to expand mortgage lending to low- and moderate-income borrowers, increased the mortgage-backed security activities of Fannie Mae and the more recently formed Federal Home Loan Mortgage Corporation (Freddie Mac). These policy changes also increased the mortgage lending activities of banks and loan originators, who were also encouraged to make loans that had a higher risk of default. This gave birth to the “subprime” mortgage with teaser rates and low underwriting standards.

The repeal of Glass-Steagall, combined with other economic and political factors, created a perfect storm of opportunism that nearly took down the world financial system (Financial Crisis Inquiry Commission 2011, FCIC hereafter). At the turn of the new century, the Federal Reserve had cut interest rates to stimulate economic growth after the dot-com crash. Low interest rates and the highly liquid market in mortgage-backed securities, combined with an influx of foreign capital, caused money to wash through the economy “like water rushing through a broken dam” (FCIC 2011, 5). Mortgage lenders such as New Century Financial, Ameriquest, and American Home Mortgage paid their loan originators huge bonuses for loan volume that included subprime mortgages. Meanwhile, bond brokers on Wall Street earned huge bonuses packaging and selling new kinds of mortgage loans in new kinds of investment products that were deemed to be safe but possessed complex and hidden risks. While Fannie and Freddie initially focused on securitizing prime mortgages, Wall Street focused on the higher-yield loans that were either “subprime” or “Alt-A” (between prime and subprime). By 2005 and 2006, Wall Street was securitizing one-third more loans than Fannie and Freddie, and more than seventy percent of them were subprime or Alt-A loans (FCIC 2011). Losing market share to Wall Street, Fannie and Freddie soon loosened their underwriting standards, purchasing and guaranteeing riskier loans. From 2005 to 2008, Fannie and Freddie also began to purchase subprime and Alt-A mortgage-backed securities from Wall Street.

The torrent of liquidity now rushing into the mortgage market generated a moral hazard all along the mortgage securitization chain. Mortgage brokers were compensated based on the volume of loans generated rather than on the performance and quality of the loans made. Meanwhile, mortgage securitizers were compensated on the basis of the number of new mortgage-backed securities created and sold to investors. Thus, agents all along the mortgage chain were paid enormous sums for making risky investments, even if these same investments could result in significant losses for investors and taxpayers. For the long-term health of the financial system, every participant along the securitization chain should have been motivated to maintain transparency and accountability. Instead, they were motivated to hide their private information from participants further down the chain. Normal safeguards proved ineffective at protecting investors from this moral hazard, as government policy at the time promoted market deregulation and mortgage lending to previously disadvantaged groups. Accounting firms were unable to stem the tide of subprime mortgage lending and securitization. The bond rating agencies not only failed to conduct diligent reviews of the quality of the mortgages in the mortgage-backed securities; they also frequently failed to verify that the mortgages were what the securitizers claimed they were (FCIC 2011).

Why didn’t Wall Street investment firms step in to protect the interests of investors and taxpayers against this moral hazard? The answer lies, at least in part, in the aggressive actions of the Federal Reserve in bailing out Long Term Capital Management (LTCM) in 1998. LTCM was founded in 1994 by John Meriwether, a University of Chicago MBA and former vice chair and head of bond trading at Salomon Brothers (Lowenstein Reference Lowenstein2000). Well trained in the modern finance theory emanating from the Chicago school, Meriwether recruited leading scholars in finance to help guide his hedge fund, including Robert Merton of Harvard University and Myron Scholes of Stanford University. Merton and Scholes shared the 1997 Nobel Prize in Economic Sciences for creating a new method of determining the value of derivatives. Implementing leveraged bond strategies, LTCM earned spectacular returns in its first four years: 19.9 percent, 42.8 percent, 40.8 percent, and 17.1 percent, respectively. In 1998, however, LTCM lost $4.6 billion in less than four months following the Russian bond crisis. This loss represented more than eighty percent of its nearly $5 billion in capital, but the firm had further leveraged itself by entering into $1 trillion in derivatives contracts with many of the largest commercial and investment banks. To avoid a market collapse, the Federal Reserve negotiated an orderly liquidation of LTCM’s securities and derivatives with many of the same banks and securities firms. The clear message from LTCM was that even modern finance theory could not avoid the downside of bond market risk. The message that Wall Street received, however, was that financial firms deemed “too big to fail” could expect a bailout from the Fed (FCIC 2011).

Wall Street firms either failed to understand or chose to ignore how dangerously exposed the banking system had become to contagion effects of risky mortgage-backed securities. By the third quarter of 2006, home prices were falling and mortgage delinquencies were on the rise. Despite these warning signs, Wall Street kept ordering up mortgage loans, packaging them into securities, taking profits, and earning bonuses. By the end of 2007, however, most of the subprime mortgage lenders had failed or been acquired. In January 2008, Bank of America announced that it would be acquiring the ailing subprime lender Countrywide. Bear Stearns was bought by J.P. Morgan with government assistance in the spring. By the end of the summer, Fannie and Freddie were in conservatorship. Instead of getting better, the financial system continued to deteriorate. In September, Lehman Brothers failed, and the remaining investment banks, Merrill Lynch, Goldman Sachs, and Morgan Stanley, struggled to survive. AIG, with its massive credit default swap portfolio, was bailed out by the federal government. Large commercial banks were not spared the carnage. Washington Mutual became the largest bank failure in US history in September, and Wachovia struck a deal to be acquired by Wells Fargo in October while Citigroup and Bank of America fought to stay afloat. Before it was over, the federal government was on the hook for trillions of dollars through more than two dozen programs aimed at stabilizing the financial system and propping up the nation’s largest financial institutions (FCIC 2011, 23).

As with the former stock market crashes after the Roaring Twenties and the dot-com bubble, the market crash of 2008 was fueled by an opportunism culture that did not value transparency and accountability. In addition to the opportunism of actors up and down the mortgage securitization chain, important players who were supposed to protect investors and taxpayers were corrupted by the culture. For example, the bond rating agencies could have stopped the disaster in its tracks. Bond rating agencies emerged in response to the lack of transparency in the railroad industry during the second industrial revolution. In the late nineteenth century, John Moody saw that there was a lack of useful information on the emerging railroad giants, and that a high percentage of the securities of such corporations “had to be bought on faith rather than knowledge” (Moody Reference Moody1946, 90). He began publishing his Manual of Industrial Statistics in 1900. The 1907 financial crisis was so severe that it forced Moody to sell the company that published his manual, John Moody & Company. In 1909, however, Moody founded a new company that issued independent credit ratings for bonds. Poor’s issued its first credit rating in 1916, followed by the Standard Statistics Company in 1922. Following the merger of these last two companies, there are now only two major bond rating agencies – Moody’s Investor Services (Moody’s) and its competitor, Standard & Poor’s (S&P). Smaller bond rating agencies include Fitch Ratings (Fitch) and a multitude of minor domestic rating agencies around the globe.

Similar to the accounting profession, bond rating agencies are paid by the very institutions whose bonds they rate. The purpose of bond rating agencies is to monitor who is violating the prevailing norm of creditworthiness, based on the institutionalized belief that “repaying debt is morally right and obligatory” (Sinclair Reference Sinclair2005, 66). Auditors and bond raters serve the same function in that they support and preserve the social norm of creditworthiness through their reputation for neutrality, objectivity, and expertise. Both, however, are subject to corruption and failure to uphold normal standards of professional conduct. For example, similar to Andersen, the rating agencies were held liable for their role in enabling the Enron accounting scandal. The agencies were central to that scandal for two reasons: (1) an investment-grade rating allowed Enron to go on raising new funds, and (2) it allowed Enron to avoid loan covenant “triggers” that would have caused its loans to come due (Partnoy Reference Partnoy2003). When Moody’s, Standard & Poor’s, and Fitch continued to rate Enron’s bonds investment grade until four days before it declared bankruptcy, therefore, they played as significant a role as Andersen in enabling the scandal.

Similar to their role in keeping Enron afloat, the rating agencies played a central role in enabling the opportunism up and down the mortgage securitization chain. High bond ratings allowed bond issuers to receive approval for their creatively structured mortgage-backed securities, allowed banks and other financial institutions to hold less capital for the bonds, and expanded the type of investors who could hold the bonds in their portfolio. Rather than supporting the norm of creditworthiness by serving as impartial referees with the highest standards of quality and independence, the bond rating agencies became enablers of the opportunistic players along the mortgage securitization chain (FCIC 2011). First, the rating agencies used antiquated models based, in part, on periods of relatively strong credit performance. Second, the rating agencies did not adequately account for the deterioration of underwriting standards or the possibility of housing prices going down. Third, the rating agencies did not adequately adjust their models to take into account the layered risks of the subprime mortgage securities. Thus, the high credit ratings of mortgage-backed securities by Moody’s, Standard & Poor’s, and Fitch perpetuated the housing bubble, and their downgrades through 2007 and 2008 magnified the subsequent market crash.

This history of the firm yields the following insights. First, the social norms of transparency and accountability that arise from the merchant culture are essential to free-market economies: they ensure the flow of investment capital and the proper functioning of organizations and capital markets. This merchant culture, however, is difficult to maintain and has frequently been abandoned in favor of an opportunism culture of opaqueness and self-interest. When the resulting greed and corruption become apparent and threaten the economy, government regulation and oversight reduce flexibility and freedom in a manner consistent with a monarchy culture. Private capitalist institutions, such as external auditing and the bond rating industry, play an important role in maintaining transparency and accountability in free-market economies, but they too are subject to corruption by the opportunism culture. Thus, government regulation plays an important role in preserving the merchant culture. The appropriate size of that role, given the temptation of opportunism on the one hand and the threat of monarchy on the other, is a constant challenge for policy makers in free-market economies. Unfortunately, the theory of the firm that emanated from the Chicago school provides little support for the merchant culture and few insights regarding the proper role of government in maintaining it. The following chapter addresses this theory of the firm, including its history and emerging characteristics.

Footnotes

1 In sociology, Richard Scott (Reference Scott1995, 33) defines institutions as “cognitive, normative, and regulative structures and activities that provide stability and meaning to social behavior.”
